diff --git a/docs/sources/tutorials/get-started.md b/docs/sources/tutorials/get-started.md
index c8fdbfa096..78b129bcec 100644
--- a/docs/sources/tutorials/get-started.md
+++ b/docs/sources/tutorials/get-started.md
@@ -7,7 +7,8 @@ weight: 10
 
 ## Get started with {{% param "PRODUCT_NAME" %}}
 
-This tutorial shows you how to configure Alloy to collect logs from your local machine and send them to Loki, running in a local Grafana stack.
+This tutorial shows you how to configure Alloy to collect logs from your local machine, filter non-essential log lines, and send them to Loki, running in a local Grafana stack.
+
 This process will enable you to query and visualize the logs sent to Loki using the Grafana dashboard.
 
 To follow this tutorial, you must have a basic understanding of Alloy and telemetry collection in general.
@@ -129,19 +130,52 @@ Paste this component next in the `config.alloy` file:
 
 ```alloy
 loki.source.file "log_scrape" {
-  targets       = local.file_match.local_files.targets
-  forward_to    = [loki.write.grafana_loki.receiver]
-  tail_from_end = true
+  targets       = local.file_match.local_files.targets
+  forward_to    = [loki.process.filter_logs.receiver]
+  tail_from_end = true
 }
 ```
 
 This configuration creates a [loki.source.file][] component named `log_scrape`, and shows the pipeline concept of {{< param "PRODUCT_NAME" >}} in action.
 The `log_scrape` component does the following:
 
 1. It connects to the `local_files` component (its "source" or target).
-1. It forwards the logs it scrapes to the "receiver" of another component called `grafana_loki` that you will define next.
+1. It forwards the logs it scrapes to the "receiver" of another component called `filter_logs` that you will define next.
 1. It provides extra attributes and options, in this case, you will tail log files from the end and not ingest the entire past history.
 
-### Third component: Write logs to Loki
+### Third component: Filter non-essential logs
+
+Filtering non-essential logs before sending them to a data source can help you manage log volumes and reduce costs. Each organization's filtering strategy differs based on its monitoring needs and setup.
+
+The following example demonstrates how to drop logs before sending them to Loki.
+
+Paste this component next in the `config.alloy` file:
+```alloy
+loki.process "filter_logs" {
+  stage.drop {
+    source              = ""
+    expression          = ".*Connection closed by authenticating user root"
+    drop_counter_reason = "noisy"
+  }
+  forward_to = [loki.write.grafana_loki.receiver]
+}
+```
+
+1. `loki.process` is a component that allows you to transform, filter, parse, and enrich log data.
+   Within this component, you can define one or more processing stages to specify how to process log entries before they are stored or forwarded.
+1. In this example, you create a `loki.process` component named `filter_logs`.
+   This component receives scraped log entries from the `log_scrape` component you created in the previous step.
+1. There are many ways to [transform, filter, parse, and enrich log data][parse]. In this example, you use the `stage.drop` block to drop log entries based on specified criteria.
+1. You set the `source` parameter equal to an empty string to denote that scraped logs from the default source, the `log_scrape` component, are processed.
+1. You set the `expression` parameter to a regular expression that matches the log messages that aren't relevant to the use case.
+   The expression `.*Connection closed by authenticating user root` was chosen to demonstrate how to use the `stage.drop` block.
+1. You can include an optional string label `drop_counter_reason` to show the rationale for dropping log entries.
+   You can use this label to categorize and count dropped entries, which helps you track and analyze why logs are dropped.
+1. You use the `forward_to` parameter to specify where to send the processed logs.
+   In this case, you send the processed logs to a component called `grafana_loki`, which you create next.
+
+Refer to the [tutorial][] on processing logs and the [`loki.process` documentation][loki.process] for more comprehensive information.
+
+### Fourth component: Write logs to Loki
 
 Paste this component last in your configuration file:
@@ -222,7 +256,7 @@ sudo systemctl reload alloy
 Open [http://localhost:12345] and click the Graph tab at the top.
 The graph should look similar to the following:
 
-{{< figure src="/media/docs/alloy/tutorial/healthy-config.png" alt="Logs reported by Alloy in Grafana" >}}
+{{< figure src="/media/docs/alloy/tutorial/Inspect-your-config-in-the-Alloy-UI-image.png" alt="Your configuration in the Alloy UI" >}}
 
 The UI allows you to see a visual representation of the pipeline you built with your {{< param "PRODUCT_NAME" >}} component configuration.
 We can see that the components are healthy, and you are ready to go.
@@ -239,7 +273,7 @@ Here you can see that logs are flowing through to Loki as expected, and the end-
 ## Conclusion
 
 Congratulations, you have fully installed and configured {{< param "PRODUCT_NAME" >}}, and sent logs from your local host to a Grafana stack.
-In the following tutorials, you will learn more about configuration concepts, metrics, and more advanced log scraping.
+In the following tutorials, you will learn more about configuration concepts and metrics.
 
 [http://localhost:3000/explore]: http://localhost:3000/explore
 [http://localhost:12345]: http://localhost:12345
@@ -253,4 +287,8 @@ In the following tutorials, you will learn more about configuration concepts, me
 [alloy]: https://grafana.com/docs/alloy/latest/
 [configuration]: ../../concepts/configuration-syntax/
 [install]: ../../get-started/install/binary/#install-alloy-as-a-standalone-binary
-[debugging your configuration]: ../../tasks/debug/
\ No newline at end of file
+[debugging your configuration]: ../../tasks/debug/
+[parse]: ../../reference/components/loki.process/
+[tutorial]: ./processing-logs/
+[loki.process]: ../../reference/components/loki.process/
+
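Beyond matching a regular expression, the `stage.drop` block also supports size and age criteria, and you can chain several `stage.drop` blocks in a single `loki.process` component. The following sketch illustrates that pattern, reusing the tutorial's `filter_logs` and `grafana_loki` names; the `8KB` and `24h` thresholds and the `too_long` and `too_old` reason labels are illustrative assumptions, not values from the tutorial.

```alloy
loki.process "filter_logs" {
  // Same rule as the tutorial: drop the noisy authentication messages.
  stage.drop {
    expression          = ".*Connection closed by authenticating user root"
    drop_counter_reason = "noisy"
  }

  // Illustrative: drop lines longer than 8KB to keep ingest volume predictable.
  stage.drop {
    longer_than         = "8KB"
    drop_counter_reason = "too_long"
  }

  // Illustrative: drop entries with timestamps older than 24 hours.
  stage.drop {
    older_than          = "24h"
    drop_counter_reason = "too_old"
  }

  forward_to = [loki.write.grafana_loki.receiver]
}
```

Stages run in order, so a line dropped by an earlier block never reaches the later ones, and each `drop_counter_reason` gives you a distinct label for tracking what was dropped and why.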