diff --git a/docs/sources/concepts/components.md b/docs/sources/concepts/components.md
index f6dafbd785..8dca24cba5 100644
--- a/docs/sources/concepts/components.md
+++ b/docs/sources/concepts/components.md
@@ -62,7 +62,7 @@ An example pipeline may look like this:
 1. A `discovery.kubernetes` component discovers and exports Kubernetes Pods where metrics can be collected.
 1. A `prometheus.scrape` component references the exports of the previous component, and sends collected metrics to the `prometheus.remote_write` component.
 
-![Flow of example pipeline](/media/docs/agent/concepts_example_pipeline.svg)
+{{< figure src="/media/docs/alloy/diagram-concepts-example-pipeline.png" width="600" alt="Example of a pipeline" >}}
 
 The following configuration file represents the pipeline.
diff --git a/docs/sources/concepts/configuration-syntax/expressions/referencing_exports.md b/docs/sources/concepts/configuration-syntax/expressions/referencing_exports.md
index 5d090f2bc5..d4f41389a4 100644
--- a/docs/sources/concepts/configuration-syntax/expressions/referencing_exports.md
+++ b/docs/sources/concepts/configuration-syntax/expressions/referencing_exports.md
@@ -41,7 +41,7 @@ prometheus.remote_write "onprem" {
 
 In the preceding example, you wired together a very simple pipeline by writing a few {{< param "PRODUCT_NAME" >}} expressions.
 
-![Flow of example pipeline](/media/docs/agent/flow_referencing_exports_diagram.svg)
+{{< figure src="/media/docs/alloy/diagram-referencing-exports.png" alt="Example of a pipeline" >}}
 
 After the value is resolved, it must match the [type][] of the attribute it is assigned to.
 While you can only configure attributes using the basic {{< param "PRODUCT_NAME" >}} types,
diff --git a/docs/sources/reference/components/prometheus.exporter.gcp.md b/docs/sources/reference/components/prometheus.exporter.gcp.md
index 4ca55beb42..71782507d4 100644
--- a/docs/sources/reference/components/prometheus.exporter.gcp.md
+++ b/docs/sources/reference/components/prometheus.exporter.gcp.md
@@ -14,7 +14,7 @@ Metric names follow the template `stackdriver_<monitored_resource>_<metric_type_prefix>_<metric_type>`
 
 The following list shows its attributes:\
 monitored_resource = `https_lb_rule`\
diff --git a/docs/sources/tasks/opentelemetry-to-lgtm-stack.md b/docs/sources/tasks/opentelemetry-to-lgtm-stack.md
index e8eeff63d3..a7faf5cbcb 100644
--- a/docs/sources/tasks/opentelemetry-to-lgtm-stack.md
+++ b/docs/sources/tasks/opentelemetry-to-lgtm-stack.md
@@ -101,9 +101,7 @@ loki.write "default" {
 ```
 
 To use Loki with basic-auth, which is required with Grafana Cloud Loki, you must configure the [loki.write][] component.
-You can get the Loki configuration from the Loki **Details** page in the [Grafana Cloud Portal][]:
-
-![](/media/docs/agent/loki-config.png)
+You can get the Loki configuration from the Loki **Details** page in the [Grafana Cloud Portal][].
 
 ```alloy
 otelcol.exporter.loki "grafana_cloud_loki" {
@@ -136,9 +134,7 @@ otelcol.exporter.otlp "default" {
 ```
 
 To use Tempo with basic-auth, which is required with Grafana Cloud Tempo, you must use the [otelcol.auth.basic][] component.
-You can get the Tempo configuration from the Tempo **Details** page in the [Grafana Cloud Portal][]:
-
-![](/media/docs/agent/tempo-config.png)
+You can get the Tempo configuration from the Tempo **Details** page in the [Grafana Cloud Portal][].
 
 ```alloy
 otelcol.exporter.otlp "grafana_cloud_tempo" {
@@ -173,9 +169,7 @@ prometheus.remote_write "default" {
 ```
 
 To use Prometheus with basic-auth, which is required with Grafana Cloud Prometheus, you must configure the [prometheus.remote_write][] component.
-You can get the Prometheus configuration from the Prometheus **Details** page in the [Grafana Cloud Portal][]:
-
-![](/media/docs/agent/prometheus-config.png)
+You can get the Prometheus configuration from the Prometheus **Details** page in the [Grafana Cloud Portal][].
 
 ```alloy
 otelcol.exporter.prometheus "grafana_cloud_prometheus" {
@@ -301,7 +295,7 @@ ts=2023-05-09T09:37:15.304234Z component=otelcol.receiver.otlp.default level=inf
 
 You can check the pipeline graphically by visiting [http://localhost:12345/graph][]
 
-![](/media/docs/agent/otlp-lgtm-graph.png)
+{{< figure src="/media/docs/alloy/otlp-lgtm-graph.png" alt="Graphical representation of a healthy pipeline" >}}
 
 [OpenTelemetry]: https://opentelemetry.io
 [Grafana Loki]: https://grafana.com/oss/loki/
diff --git a/docs/sources/tutorials/first-components-and-stdlib/index.md b/docs/sources/tutorials/first-components-and-stdlib/index.md
index afd5aa4d36..a8a16d4e9e 100644
--- a/docs/sources/tutorials/first-components-and-stdlib/index.md
+++ b/docs/sources/tutorials/first-components-and-stdlib/index.md
@@ -109,9 +109,7 @@ The `url` attribute is set to the URL of the Prometheus remote write endpoint.
 The `basic_auth` block contains the `username` and `password` attributes, which are set to the string `"admin"` and the `content` export of the `local.file` component, respectively.
 The `content` export is referenced by using the syntax `local.file.example.content`, where `local.file.example` is the fully qualified name of the component (the component's type + its label) and `content` is the name of the export.
 
-
-Example pipeline with local.file and prometheus.remote_write components
-
+{{< figure src="/media/docs/alloy/diagram-example-basic-alloy.png" width="600" alt="Example pipeline with local.file and prometheus.remote_write components" >}}
 
 {{< admonition type="note" >}}
 The `local.file` component's label is set to `"example"`, so the fully qualified name of the component is `local.file.example`.
@@ -178,9 +176,7 @@ Try querying for `node_memory_Active_bytes` to see the active memory of your hos
 
 The following diagram is an example pipeline:
 
-

-Example pipeline with a prometheus.scrape, prometheus.exporter.unix, and prometheus.remote_write components
-
+{{< figure src="/media/docs/alloy/diagram-example-pipeline-prometheus.scrape-alloy.png" width="600" alt="Example pipeline with a prometheus.scrape, prometheus.exporter.unix, and prometheus.remote_write components" >}}
 
 Your pipeline configuration defines three components:
@@ -214,9 +210,7 @@ You can refer to the [prometheus.exporter.redis][] component documentation for m
 
 To give a visual hint, you want to create a pipeline that looks like this:
 
-

-Exercise pipeline, with a scrape, unix_exporter, redis_exporter, and remote_write component
-
+{{< figure src="/media/docs/alloy/diagram-example-pipeline-exercise-alloy.png" alt="Exercise pipeline, with a scrape, unix_exporter, redis_exporter, and remote_write component" >}}
 
 {{< admonition type="tip" >}}
 Refer to the [concat][] standard library function for information about combining lists of values into a single list.
diff --git a/docs/sources/tutorials/logs-and-relabeling-basics/index.md b/docs/sources/tutorials/logs-and-relabeling-basics/index.md
index 1e8e769c36..b61d63f971 100644
--- a/docs/sources/tutorials/logs-and-relabeling-basics/index.md
+++ b/docs/sources/tutorials/logs-and-relabeling-basics/index.md
@@ -60,7 +60,7 @@ prometheus.remote_write "local_prom" {
 
 You have created the following pipeline:
 
-![Diagram of pipeline that scrapes prometheus.exporter.unix, relabels the metrics, and remote_writes them](/media/docs/agent/diagram-flow-by-example-relabel-0.svg)
+{{< figure src="/media/docs/alloy/diagram-example-relabel-alloy.png" alt="Diagram of pipeline that scrapes prometheus.exporter.unix, relabels the metrics, and remote_writes them" >}}
 
 This pipeline has a `prometheus.relabel` component that has a single rule.
 This rule has the `replace` action, which will replace the value of the `os` label with a special value: `constants.os`.
@@ -129,7 +129,7 @@ loki.write "local_loki" {
 
 The rough flow of this pipeline is:
 
-![Diagram of pipeline that collects logs from /tmp/alloy-logs and writes them to a local Loki instance](/media/docs/agent/diagram-flow-by-example-logs-0.svg)
+{{< figure src="/media/docs/alloy/diagram-example-logs-loki-alloy.png" width="500" alt="Diagram of pipeline that collects logs from /tmp/alloy-logs and writes them to a local Loki instance" >}}
 
 If you navigate to [localhost:3000/explore][] and switch the Datasource to `Loki`, you can query for `{filename="/tmp/alloy-logs/log.log"}` and see the log line we created earlier.
 Try running the following command to add more logs to the file.
@@ -140,7 +140,7 @@ echo "This is another log line!" >> /tmp/alloy-logs/log.log
 
 If you re-execute the query, you can see the new log lines.
 
-![Grafana Explore view of example log lines](/media/docs/agent/screenshot-flow-by-example-log-lines.png)
+{{< figure src="/media/docs/alloy/screenshot-log-lines.png" alt="Grafana Explore view of example log lines" >}}
 
 If you are curious how {{< param "PRODUCT_NAME" >}} keeps track of where it's in a log file, you can look at `data-alloy/loki.source.file.local_files/positions.yml`.
 If you delete this file, {{< param "PRODUCT_NAME" >}} starts reading from the beginning of the file again, which is why keeping the {{< param "PRODUCT_NAME" >}}'s data directory in a persistent location is desirable.
@@ -262,7 +262,7 @@ echo 'level=debug msg="DEBUG: This is a debug level log!"' >> /tmp/alloy-logs/lo
 
 Navigate to [localhost:3000/explore][] and switch the Datasource to `Loki`.
 Try querying for `{level!=""}` to see the new labels in action.
 
-![Grafana Explore view of example log lines, now with the extracted 'level' label](/media/docs/agent/screenshot-flow-by-example-log-line-levels.png)
+{{< figure src="/media/docs/alloy/screenshot-log-line-levels.png" alt="Grafana Explore view of example log lines, now with the extracted 'level' label" >}}
 
 {{< collapse title="Solution" >}}
diff --git a/docs/sources/tutorials/processing-logs/index.md b/docs/sources/tutorials/processing-logs/index.md
index bc03127f9b..41c9cb4d39 100644
--- a/docs/sources/tutorials/processing-logs/index.md
+++ b/docs/sources/tutorials/processing-logs/index.md
@@ -27,7 +27,7 @@ It can be useful for receiving logs from other {{< param "PRODUCT_NAME" >}}s or
 
 Your pipeline is going to look like this:
 
-![Loki Source API Pipeline](/media/docs/agent/diagram-flow-by-example-logs-pipeline.svg)
+{{< figure src="/media/docs/alloy/example-logs-pipeline-alloy.png" alt="An example logs pipeline" >}}
 
 Start by setting up the `loki.source.api` component:
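The tutorial text above ends as it introduces `loki.source.api`. For context, a minimal configuration for that component might look like the following sketch. The listen address, port, component labels, and the Loki push URL are illustrative assumptions, not values taken from the diff:

```alloy
// Hypothetical sketch: receive log entries over HTTP and forward them
// to a loki.write component.
loki.source.api "listener" {
  http {
    listen_address = "127.0.0.1" // assumed address
    listen_port    = 9999        // assumed port
  }

  // Forward received log entries to the writer defined below.
  forward_to = [loki.write.local_loki.receiver]
}

loki.write "local_loki" {
  endpoint {
    url = "http://localhost:3100/loki/api/v1/push" // assumed local Loki instance
  }
}
```

With a block like this, other collectors can push logs to the listener's `/loki/api/v1/push` endpoint, and the component relays them onward via `forward_to`.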