From 20c5fc60fba3df38ef8d25bdc377792ac557f426 Mon Sep 17 00:00:00 2001
From: Melissa Vagi
Date: Fri, 1 Dec 2023 13:51:03 -0700
Subject: [PATCH 01/11] Add drop processor doc to address content gap

Signed-off-by: Melissa Vagi
---
 _ingest-pipelines/processors/date.md |   2 +-
 _ingest-pipelines/processors/drop.md | 128 +++++++++++++++++++++++++++
 2 files changed, 129 insertions(+), 1 deletion(-)
 create mode 100644 _ingest-pipelines/processors/drop.md

diff --git a/_ingest-pipelines/processors/date.md b/_ingest-pipelines/processors/date.md
index c8ba7ba863..3ce6602563 100644
--- a/_ingest-pipelines/processors/date.md
+++ b/_ingest-pipelines/processors/date.md
@@ -13,7 +13,7 @@ redirect_from:
 
 The `date` processor is used to parse dates from document fields and to add the parsed data to a new field. By default, the parsed data is stored in the `@timestamp` field.
 
-## Example
+## Syntax example
 The following is the syntax for the `date` processor:
 
 ```json
diff --git a/_ingest-pipelines/processors/drop.md b/_ingest-pipelines/processors/drop.md
new file mode 100644
index 0000000000..d0388e3246
--- /dev/null
+++ b/_ingest-pipelines/processors/drop.md
@@ -0,0 +1,128 @@
+---
+layout: default
+title: Drop
+parent: Ingest processors
+nav_order: 70
+---
+
+# Drop processor
+
+The `drop` processor is used to discard documents without indexing them. This is useful for preventing documents from being indexed based on certain conditions. For example, you might use a `drop` processor to prevent documents that are missing important fields or contain sensitive information from being indexed.
+
+The `drop` processor does not raise any errors when it discards documents, making it useful for preventing indexing problems without cluttering your OpenSearch logs with error messages.
+
+## Syntax example
+
+The following is the syntax for the `drop` processor:
+
+```json
+{
+  "drop": {
+    "if": "date_field",
+    "field": "field-to-be-dropped"]"
+  }
+}
+```
+{% include copy-curl.html %}
+
+## Configuration parameters
+
+The following table lists the required and optional parameters for the `date` processor.
+
+Parameter | Required | Description |
+|-----------|-----------|-----------|
+`description` | Optional | A brief description of the processor. |
+`if` | Optional | A condition for running this processor. |
+`ignore_failure` | Optional | If set to `true`, failures are ignored. Default is `false`. See [Handling pipeline failures]({{site.url}}{{site.baseurl}}/ingest-pipelines/pipeline-failures/) for more information. |
+`on_failure` | Optional | A list of processors to run if the processor fails. See [Handling pipeline failures]({{site.url}}{{site.baseurl}}/ingest-pipelines/pipeline-failures/) for more information. |
+`tag` | Optional | An identifier tag for the processor. Useful for debugging to distinguish between processors of the same type. |
+
+## Using the processor
+
+Follow these steps to use the processor in a pipeline.
+
+**Step 1: Create a pipeline.**
+
+The following query creates a pipeline, named `drop-user`, that uses the `drop` processor to prevent a document containing personally identifiable information (PII) from being indexed:
+
+```json
+PUT /_ingest/pipeline/drop-pii
+{
+  "description": "Pipeline that prevents PII from being indexed",
+  "processors": [
+    {
+      "drop": {
+        "if" : "ctx.user_info.contains('password') || ctx.user_info.contains('credit_card')"
+      }
+    }
+  ]
+}
+```
+{% include copy-curl.html %}
+
+**Step 2 (Optional): Test the pipeline.**
+
+It is recommended that you test your pipeline before you ingest documents.
+{: .tip}
+
+To test the pipeline, run the following query:
+
+```json
+POST _ingest/pipeline/drop-pii/_simulate
+{
+  "docs": [
+    {
+      "_index": "testindex1",
+      "_id": "1",
+      "_source": {
+        "user_info": "Sensitive information including credit card"
+      }
+    }
+  ]
+}
+```
+{% include copy-curl.html %}
+
+#### Response
+
+The following example response confirms that the pipeline is working as expected:
+
+```json
+{
+  "docs": [
+    {
+      "doc": {
+        "_index": "testindex1",
+        "_id": "1",
+        "_source": {
+          "user_info": "Sensitive information including credit card"
+        },
+        "_ingest": {
+          "timestamp": "2023-12-01T20:49:52.476308925Z"
+        }
+      }
+    }
+  ]
+}
+```
+
+**Step 3: Ingest a document.**
+
+The following query ingests a document into an index named `testindex1`:
+
+```json
+PUT testindex1/_doc/1?pipeline=drop-pii
+{
+  "user_info": "Sensitive information including credit card"
+}
+```
+{% include copy-curl.html %}
+
+**Step 4 (Optional): Retrieve the document.**
+
+To retrieve the document, run the following query:
+
+```json
+GET testindex1/_doc/1
+```
+{% include copy-curl.html %}

From 6d0febd821a4fe9d08b049eb53ae2e5d4f024060 Mon Sep 17 00:00:00 2001
From: Melissa Vagi
Date: Thu, 4 Jan 2024 13:14:56 -0700
Subject: [PATCH 02/11] Address tech review feedback

Signed-off-by: Melissa Vagi
---
 _ingest-pipelines/processors/drop.md | 30 +++++-----------------------
 1 file changed, 5 insertions(+), 25 deletions(-)

diff --git a/_ingest-pipelines/processors/drop.md b/_ingest-pipelines/processors/drop.md
index d0388e3246..dfffac2c0e 100644
--- a/_ingest-pipelines/processors/drop.md
+++ b/_ingest-pipelines/processors/drop.md
@@ -27,7 +27,7 @@ The following is the syntax for the `drop` processor:
 
 ## Configuration parameters
 
-The following table lists the required and optional parameters for the `date` processor.
+The following table lists the required and optional parameters for the `drop` processor.
 
 Parameter | Required | Description |
 |-----------|-----------|-----------|
@@ -43,7 +43,7 @@ Follow these steps to use the processor in a pipeline.
 
 **Step 1: Create a pipeline.**
 
-The following query creates a pipeline, named `drop-user`, that uses the `drop` processor to prevent a document containing personally identifiable information (PII) from being indexed:
+The following query creates a pipeline, named `drop-pii`, that uses the `drop` processor to prevent a document containing personally identifiable information (PII) from being indexed:
 
 ```json
 PUT /_ingest/pipeline/drop-pii
@@ -52,7 +52,7 @@ PUT /_ingest/pipeline/drop-pii
   "processors": [
     {
       "drop": {
-        "if" : "ctx.user_info.contains('password') || ctx.user_info.contains('credit_card')"
+        "if" : "ctx.user_info.contains('password') || ctx.user_info.contains('credit card')"
       }
     }
   ]
@@ -85,23 +85,12 @@ POST _ingest/pipeline/drop-pii/_simulate
 
 #### Response
 
-The following example response confirms that the pipeline is working as expected:
+The following example response confirms that the pipeline is working as expected, that is, the document has been dropped:
 
 ```json
 {
   "docs": [
-    {
-      "doc": {
-        "_index": "testindex1",
-        "_id": "1",
-        "_source": {
-          "user_info": "Sensitive information including credit card"
-        },
-        "_ingest": {
-          "timestamp": "2023-12-01T20:49:52.476308925Z"
-        }
-      }
-    }
+    null
   ]
 }
 ```
@@ -117,12 +106,3 @@ PUT testindex1/_doc/1?pipeline=drop-pii
 }
 ```
 {% include copy-curl.html %}
-
-**Step 4 (Optional): Retrieve the document.**
-
-To retrieve the document, run the following query:
-
-```json
-GET testindex1/_doc/1
-```
-{% include copy-curl.html %}

From 35a20537956901b50c7306861a601f4bc32cb0e7 Mon Sep 17 00:00:00 2001
From: Melissa Vagi
Date: Mon, 26 Feb 2024 14:29:33 -0700
Subject: [PATCH 03/11] Address tech review changes

Signed-off-by: Melissa Vagi
---
 _ingest-pipelines/processors/drop.md | 19 +++++++++++++++++--
 1 file changed, 17 insertions(+), 2 deletions(-)

diff --git a/_ingest-pipelines/processors/drop.md b/_ingest-pipelines/processors/drop.md
index dfffac2c0e..395551471b 100644
--- a/_ingest-pipelines/processors/drop.md
+++ b/_ingest-pipelines/processors/drop.md
@@ -18,8 +18,7 @@ The following is the syntax for the `drop` processor:
 ```json
 {
   "drop": {
-    "if": "date_field",
-    "field": "field-to-be-dropped"]"
+    "if": "ctx.foo == 'bar'"
   }
 }
 ```
@@ -94,6 +93,7 @@ The following example response confirms that the pipeline is working as expected
   ]
 }
 ```
+{% include copy-curl.html %}
 
 **Step 3: Ingest a document.**
 
@@ -106,3 +106,18 @@ PUT testindex1/_doc/1?pipeline=drop-pii
 }
 ```
 {% include copy-curl.html %}
+
+The following response confirms that the document with ID `1` was not indexed:
+
+{
+  "_index": "testindex1",
+  "_id": "1",
+  "_version": -3,
+  "result": "noop",
+  "_shards": {
+    "total": 0,
+    "successful": 0,
+    "failed": 0
+  }
+}
+{% include copy-curl.html %}
\ No newline at end of file

From 73296f59d9e6e1721aefcdcc49f7dcb3c6f735ad Mon Sep 17 00:00:00 2001
From: Melissa Vagi
Date: Mon, 26 Feb 2024 14:34:04 -0700
Subject: [PATCH 04/11] Delete _ingest-pipelines/processors/date.md

Signed-off-by: Melissa Vagi
Signed-off-by: Melissa Vagi
---
 _ingest-pipelines/processors/date.md | 141 ---------------------------
 1 file changed, 141 deletions(-)
 delete mode 100644 _ingest-pipelines/processors/date.md

diff --git a/_ingest-pipelines/processors/date.md b/_ingest-pipelines/processors/date.md
deleted file mode 100644
index 3ce6602563..0000000000
--- a/_ingest-pipelines/processors/date.md
+++ /dev/null
@@ -1,141 +0,0 @@
----
-layout: default
-title: Date
-parent: Ingest processors
-nav_order: 50
-redirect_from:
-  - /api-reference/ingest-apis/processors/date/
----
-
-# Date
-**Introduced 1.0** -{: .label .label-purple } - -The `date` processor is used to parse dates from document fields and to add the parsed data to a new field. By default, the parsed data is stored in the `@timestamp` field. - -## Syntax example -The following is the syntax for the `date` processor: - -```json -{ - "date": { - "field": "date_field", - "formats": ["yyyy-MM-dd'T'HH:mm:ss.SSSZZ"] - } -} -``` -{% include copy-curl.html %} - -## Configuration parameters - -The following table lists the required and optional parameters for the `date` processor. - -Parameter | Required | Description | -|-----------|-----------|-----------| -`field` | Required | The name of the field to which the data should be converted. Supports template snippets. | -`formats` | Required | An array of the expected date formats. Can be a [date format]({{site.url}}{{site.baseurl}}/field-types/supported-field-types/date/#formats) or one of the following formats: ISO8601, UNIX, UNIX_MS, or TAI64N. | -`description` | Optional | A brief description of the processor. | -`if` | Optional | A condition for running this processor. | -`ignore_failure` | Optional | If set to `true`, failures are ignored. Default is `false`. | -`locale` | Optional | The locale to use when parsing the date. Default is `ENGLISH`. Supports template snippets. | -`on_failure` | Optional | A list of processors to run if the processor fails. | -`output_format` | Optional | The [date format]({{site.url}}{{site.baseurl}}/field-types/supported-field-types/date/#formats) to use for the target field. Default is `yyyy-MM-dd'T'HH:mm:ss.SSSZZ`. | -`tag` | Optional | An identifier tag for the processor. Useful for debugging to distinguish between processors of the same type. | -`target_field` | Optional | The name of the field in which to store the parsed data. Default target field is `@timestamp`. | -`timezone` | Optional | The time zone to use when parsing the date. Default is `UTC`. Supports template snippets. | - -## Using the processor - -Follow these steps to use the processor in a pipeline. - -**Step 1: Create a pipeline.** - -The following query creates a pipeline, named `date-output-format`, that uses the `date` processor to convert from European date format to US date format, adding the new field `date_us` with the desired `output_format`: - -```json -PUT /_ingest/pipeline/date-output-format -{ - "description": "Pipeline that converts European date format to US date format", - "processors": [ - { - "date": { - "field" : "date_european", - "formats" : ["dd/MM/yyyy", "UNIX"], - "target_field": "date_us", - "output_format": "MM/dd/yyy", - "timezone" : "UTC" - } - } - ] -} -``` -{% include copy-curl.html %} - -**Step 2 (Optional): Test the pipeline.** - -It is recommended that you test your pipeline before you ingest documents. 
-{: .tip} - -To test the pipeline, run the following query: - -```json -POST _ingest/pipeline/date-output-format/_simulate -{ - "docs": [ - { - "_index": "testindex1", - "_id": "1", - "_source": { - "date_us": "06/30/2023", - "date_european": "30/06/2023" - } - } - ] -} -``` -{% include copy-curl.html %} - -#### Response - -The following example response confirms that the pipeline is working as expected: - -```json -{ - "docs": [ - { - "doc": { - "_index": "testindex1", - "_id": "1", - "_source": { - "date_us": "06/30/2023", - "date_european": "30/06/2023" - }, - "_ingest": { - "timestamp": "2023-08-22T17:08:46.275195504Z" - } - } - } - ] -} -``` - -**Step 3: Ingest a document.** - -The following query ingests a document into an index named `testindex1`: - -```json -PUT testindex1/_doc/1?pipeline=date-output-format -{ - "date_european": "30/06/2023" -} -``` -{% include copy-curl.html %} - -**Step 4 (Optional): Retrieve the document.** - -To retrieve the document, run the following query: - -```json -GET testindex1/_doc/1 -``` -{% include copy-curl.html %} From b572c4d046e651c41cd0d38296bba6cbfb566284 Mon Sep 17 00:00:00 2001 From: Melissa Vagi Date: Tue, 27 Feb 2024 12:45:20 -0700 Subject: [PATCH 05/11] Revert "Delete _ingest-pipelines/processors/date.md" This reverts commit 73296f59d9e6e1721aefcdcc49f7dcb3c6f735ad. --- _ingest-pipelines/processors/date.md | 141 +++++++++++++++++++++++++++ 1 file changed, 141 insertions(+) create mode 100644 _ingest-pipelines/processors/date.md diff --git a/_ingest-pipelines/processors/date.md b/_ingest-pipelines/processors/date.md new file mode 100644 index 0000000000..3ce6602563 --- /dev/null +++ b/_ingest-pipelines/processors/date.md @@ -0,0 +1,141 @@ +--- +layout: default +title: Date +parent: Ingest processors +nav_order: 50 +redirect_from: + - /api-reference/ingest-apis/processors/date/ +--- + +# Date +**Introduced 1.0** +{: .label .label-purple } + +The `date` processor is used to parse dates from document fields and to add the parsed data to a new field. By default, the parsed data is stored in the `@timestamp` field. + +## Syntax example +The following is the syntax for the `date` processor: + +```json +{ + "date": { + "field": "date_field", + "formats": ["yyyy-MM-dd'T'HH:mm:ss.SSSZZ"] + } +} +``` +{% include copy-curl.html %} + +## Configuration parameters + +The following table lists the required and optional parameters for the `date` processor. + +Parameter | Required | Description | +|-----------|-----------|-----------| +`field` | Required | The name of the field to which the data should be converted. Supports template snippets. | +`formats` | Required | An array of the expected date formats. Can be a [date format]({{site.url}}{{site.baseurl}}/field-types/supported-field-types/date/#formats) or one of the following formats: ISO8601, UNIX, UNIX_MS, or TAI64N. | +`description` | Optional | A brief description of the processor. | +`if` | Optional | A condition for running this processor. | +`ignore_failure` | Optional | If set to `true`, failures are ignored. Default is `false`. | +`locale` | Optional | The locale to use when parsing the date. Default is `ENGLISH`. Supports template snippets. | +`on_failure` | Optional | A list of processors to run if the processor fails. | +`output_format` | Optional | The [date format]({{site.url}}{{site.baseurl}}/field-types/supported-field-types/date/#formats) to use for the target field. Default is `yyyy-MM-dd'T'HH:mm:ss.SSSZZ`. | +`tag` | Optional | An identifier tag for the processor. 
Useful for debugging to distinguish between processors of the same type. | +`target_field` | Optional | The name of the field in which to store the parsed data. Default target field is `@timestamp`. | +`timezone` | Optional | The time zone to use when parsing the date. Default is `UTC`. Supports template snippets. | + +## Using the processor + +Follow these steps to use the processor in a pipeline. + +**Step 1: Create a pipeline.** + +The following query creates a pipeline, named `date-output-format`, that uses the `date` processor to convert from European date format to US date format, adding the new field `date_us` with the desired `output_format`: + +```json +PUT /_ingest/pipeline/date-output-format +{ + "description": "Pipeline that converts European date format to US date format", + "processors": [ + { + "date": { + "field" : "date_european", + "formats" : ["dd/MM/yyyy", "UNIX"], + "target_field": "date_us", + "output_format": "MM/dd/yyy", + "timezone" : "UTC" + } + } + ] +} +``` +{% include copy-curl.html %} + +**Step 2 (Optional): Test the pipeline.** + +It is recommended that you test your pipeline before you ingest documents. +{: .tip} + +To test the pipeline, run the following query: + +```json +POST _ingest/pipeline/date-output-format/_simulate +{ + "docs": [ + { + "_index": "testindex1", + "_id": "1", + "_source": { + "date_us": "06/30/2023", + "date_european": "30/06/2023" + } + } + ] +} +``` +{% include copy-curl.html %} + +#### Response + +The following example response confirms that the pipeline is working as expected: + +```json +{ + "docs": [ + { + "doc": { + "_index": "testindex1", + "_id": "1", + "_source": { + "date_us": "06/30/2023", + "date_european": "30/06/2023" + }, + "_ingest": { + "timestamp": "2023-08-22T17:08:46.275195504Z" + } + } + } + ] +} +``` + +**Step 3: Ingest a document.** + +The following query ingests a document into an index named `testindex1`: + +```json +PUT testindex1/_doc/1?pipeline=date-output-format +{ + "date_european": "30/06/2023" +} +``` +{% include copy-curl.html %} + +**Step 4 (Optional): Retrieve the document.** + +To retrieve the document, run the following query: + +```json +GET testindex1/_doc/1 +``` +{% include copy-curl.html %} From b6f790272f2acaaeab72b3d464e87f3a8f4cfba6 Mon Sep 17 00:00:00 2001 From: Melissa Vagi Date: Thu, 7 Mar 2024 09:36:04 -0700 Subject: [PATCH 06/11] Update _ingest-pipelines/processors/drop.md Co-authored-by: Nathan Bower Signed-off-by: Melissa Vagi --- _ingest-pipelines/processors/drop.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/_ingest-pipelines/processors/drop.md b/_ingest-pipelines/processors/drop.md index 395551471b..8d163decb8 100644 --- a/_ingest-pipelines/processors/drop.md +++ b/_ingest-pipelines/processors/drop.md @@ -7,7 +7,7 @@ nav_order: 70 # Drop processor -The `drop` processor is used to discard documents without indexing them. This is useful for preventing documents from being indexed based on certain conditions. For example, you might use a `drop` processor to prevent documents that are missing important fields or contain sensitive information from being indexed. +The `drop` processor is used to discard documents without indexing them. This can be useful for preventing documents from being indexed based on certain conditions. For example, you might use a `drop` processor to prevent documents that are missing important fields or contain sensitive information from being indexed. 
 
 The `drop` processor does not raise any errors when it discards documents, making it useful for preventing indexing problems without cluttering your OpenSearch logs with error messages.
 
 ## Syntax example
 
From 8ee3c951405fc4ba3509b04dc17846f275ae179b Mon Sep 17 00:00:00 2001
From: Melissa Vagi
Date: Thu, 7 Mar 2024 09:36:12 -0700
Subject: [PATCH 07/11] Update _ingest-pipelines/processors/drop.md

Co-authored-by: Nathan Bower
Signed-off-by: Melissa Vagi
---
 _ingest-pipelines/processors/drop.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/_ingest-pipelines/processors/drop.md b/_ingest-pipelines/processors/drop.md
index 265fc57390..58c2b109d3 100644
--- a/_ingest-pipelines/processors/drop.md
+++ b/_ingest-pipelines/processors/drop.md
@@ -31,7 +31,7 @@ The following table lists the required and optional parameters for the `drop` pr
 Parameter | Required | Description |
 |-----------|-----------|-----------|
 `description` | Optional | A brief description of the processor. |
-`if` | Optional | A condition for running this processor. |
+`if` | Optional | A condition for running the processor. |
 `ignore_failure` | Optional | If set to `true`, failures are ignored. Default is `false`. See [Handling pipeline failures]({{site.url}}{{site.baseurl}}/ingest-pipelines/pipeline-failures/) for more information. |
 `on_failure` | Optional | A list of processors to run if the processor fails. See [Handling pipeline failures]({{site.url}}{{site.baseurl}}/ingest-pipelines/pipeline-failures/) for more information. |
 `tag` | Optional | An identifier tag for the processor. Useful for debugging to distinguish between processors of the same type. |

From 00466449720bd04176ed4d61c344c749f89fdf1b Mon Sep 17 00:00:00 2001
From: Melissa Vagi
Date: Thu, 7 Mar 2024 09:36:26 -0700
Subject: [PATCH 08/11] Update _ingest-pipelines/processors/drop.md

Co-authored-by: Nathan Bower
Signed-off-by: Melissa Vagi
---
 _ingest-pipelines/processors/drop.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/_ingest-pipelines/processors/drop.md b/_ingest-pipelines/processors/drop.md
index 58c2b109d3..42600cd38a 100644
--- a/_ingest-pipelines/processors/drop.md
+++ b/_ingest-pipelines/processors/drop.md
@@ -34,7 +34,7 @@ Parameter | Required | Description |
 `if` | Optional | A condition for running the processor. |
 `ignore_failure` | Optional | If set to `true`, failures are ignored. Default is `false`. See [Handling pipeline failures]({{site.url}}{{site.baseurl}}/ingest-pipelines/pipeline-failures/) for more information. |
 `on_failure` | Optional | A list of processors to run if the processor fails. See [Handling pipeline failures]({{site.url}}{{site.baseurl}}/ingest-pipelines/pipeline-failures/) for more information. |
-`tag` | Optional | An identifier tag for the processor. Useful for debugging to distinguish between processors of the same type. |
+`tag` | Optional | An identifier tag for the processor. Useful for distinguishing between processors of the same type when debugging. |
 
 ## Using the processor
 
From 0ac9c1d9b26fcce2001bf800f7a210a1f35c60f3 Mon Sep 17 00:00:00 2001
From: Melissa Vagi
Date: Thu, 7 Mar 2024 09:36:35 -0700
Subject: [PATCH 09/11] Update _ingest-pipelines/processors/drop.md

Co-authored-by: Nathan Bower
Signed-off-by: Melissa Vagi
---
 _ingest-pipelines/processors/drop.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/_ingest-pipelines/processors/drop.md b/_ingest-pipelines/processors/drop.md
index 42600cd38a..3a75845d93 100644
--- a/_ingest-pipelines/processors/drop.md
+++ b/_ingest-pipelines/processors/drop.md
@@ -61,7 +61,7 @@ PUT /_ingest/pipeline/drop-pii
 
 **Step 2 (Optional): Test the pipeline.**
 
-It is recommended that you test your pipeline before you ingest documents.
+It is recommended that you test your pipeline before ingesting documents.
 {: .tip}
 
 To test the pipeline, run the following query:

From 8081f65ba2184671d696ca6b1e1a8e2b5a2258b9 Mon Sep 17 00:00:00 2001
From: Melissa Vagi
Date: Thu, 7 Mar 2024 09:36:47 -0700
Subject: [PATCH 10/11] Update _ingest-pipelines/processors/drop.md

Co-authored-by: Nathan Bower
Signed-off-by: Melissa Vagi
---
 _ingest-pipelines/processors/drop.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/_ingest-pipelines/processors/drop.md b/_ingest-pipelines/processors/drop.md
index 42600cd38a..3a75845d93 100644
--- a/_ingest-pipelines/processors/drop.md
+++ b/_ingest-pipelines/processors/drop.md
@@ -84,7 +84,7 @@ POST _ingest/pipeline/drop-pii/_simulate
 
 #### Response
 
-The following example response confirms that the pipeline is working as expected, that is, the document has been dropped:
+The following example response confirms that the pipeline is working as expected (the document has been dropped):
 
 ```json
 {
   "docs": [

From 44776669ce1410114ee1c73b6a22c126a4309d53 Mon Sep 17 00:00:00 2001
From: Melissa Vagi
Date: Thu, 7 Mar 2024 09:39:12 -0700
Subject: [PATCH 11/11] Update drop.md

Signed-off-by: Melissa Vagi
Signed-off-by: Melissa Vagi
---
 _ingest-pipelines/processors/drop.md | 10 +++++-----
 1 file changed, 5 insertions(+), 5 deletions(-)

diff --git a/_ingest-pipelines/processors/drop.md b/_ingest-pipelines/processors/drop.md
index 3a75845d93..1dd5fdb9d6 100644
--- a/_ingest-pipelines/processors/drop.md
+++ b/_ingest-pipelines/processors/drop.md
@@ -40,7 +40,7 @@ Parameter | Required | Description |
 
 Follow these steps to use the processor in a pipeline.
 
-**Step 1: Create a pipeline.**
+**Step 1: Create a pipeline**
 
 The following query creates a pipeline, named `drop-pii`, that uses the `drop` processor to prevent a document containing personally identifiable information (PII) from being indexed:
 
@@ -59,7 +59,7 @@ PUT /_ingest/pipeline/drop-pii
 ```
 {% include copy-curl.html %}
 
-**Step 2 (Optional): Test the pipeline.**
+**Step 2 (Optional): Test the pipeline**
 
 It is recommended that you test your pipeline before ingesting documents.
 {: .tip}
@@ -95,7 +95,7 @@ The following example response confirms that the pipeline is working as expected
 ```
 {% include copy-curl.html %}
 
-**Step 3: Ingest a document.**
+**Step 3: Ingest a document**
 
 The following query ingests a document into an index named `testindex1`:
 
@@ -107,7 +107,7 @@ PUT testindex1/_doc/1?pipeline=drop-pii
 ```
 {% include copy-curl.html %}
 
-The following response confirms that the document with ID `1` was not indexed:
+The following response confirms that the document with the ID of `1` was not indexed:
 
 {
   "_index": "testindex1",
   "_id": "1",
   "_version": -3,
   "result": "noop",
   "_shards": {
     "total": 0,
     "successful": 0,
     "failed": 0
   }
 }
-{% include copy-curl.html %}
\ No newline at end of file
+{% include copy-curl.html %}
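
The `if` conditions used throughout these commits are Painless scripts evaluated against the incoming document (`ctx`). As an illustrative sketch only (the pipeline name `drop-missing-user-info` and the `user_info` null check are assumptions for this example, not taken from the commits above), a similar condition could drop documents that are missing an expected field rather than documents containing sensitive values:

```json
PUT /_ingest/pipeline/drop-missing-user-info
{
  "description": "Drops documents that do not contain a user_info field",
  "processors": [
    {
      "drop": {
        "if": "ctx.user_info == null"
      }
    }
  ]
}
```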