
Collecting Metrics with Prometheus Components

Welcome to the metrics lesson. Here you'll learn how Alloy collects Prometheus metrics using exporters and prometheus.scrape, then forwards them to your configured outputs. The hands-on examples are split into focused sublessons so you can explore them one at a time.

Learning Goals

By the end of this lesson, you'll be able to:

  • Configure Prometheus components in Alloy to scrape metrics
  • Set up scrape jobs for common services and exporters
  • Understand how exporter targets flow into scrapes and relabeling
  • Validate and test your metric collection configuration

How Metrics Collection Works in Alloy

Alloy uses a pipeline model:

  1. Exporters expose metrics in Prometheus format (prometheus.exporter.*).
  2. Scrapes collect metrics (prometheus.scrape).
  3. Relabeling normalizes labels (prometheus.relabel).
  4. Remote write ships metrics (prometheus.remote_write).

The course examples use a shared relabeler defined in 01-global.alloy that forwards metrics to remote write endpoints.

01-global.alloy
livedebugging {
  enabled = true
}



remotecfg {
  url = "https://fleet-management-prod-008.grafana.net"
  id  = "vikasjha001"

  attributes = {
    "os"       = "linux",
    "env"      = "production",
    "hostname" = constants.hostname,
  }

  basic_auth {
    username = "1064633"
    password = sys.env("GCLOUD_RW_API_KEY")
  }
}

prometheus.remote_write "grafana_cloud" {
  endpoint {
    url = "http://localhost:9090/api/v1/write"
  }

  endpoint {
    url = "https://prometheus-prod-13-prod-us-east-0.grafana.net/api/prom/push"

    basic_auth {
      username = "1846252"
      // Use the API Key you provided
      password = sys.env("GCLOUD_RW_API_KEY")
    }
  }
}

// Update your relabeler to forward to the Cloud destination
prometheus.relabel "common_labels" {
  forward_to = [prometheus.remote_write.grafana_cloud.receiver]

  rule {
    target_label = "env"
    replacement  = "production"
  }
}

loki.write "grafana_cloud_loki" {
  endpoint {
    url = "https://logs-prod-006.grafana.net/loki/api/v1/push"

    basic_auth {
      username = "1022411"
      password = sys.env("GCLOUD_RW_API_KEY")
    }
  }
}


otelcol.auth.basic "grafana_cloud" {
  username = "1016726"
  password = sys.env("GCLOUD_RW_API_KEY")
}

otelcol.exporter.otlp "grafana_cloud" {
  client {
    endpoint = "tempo-prod-04-prod-us-east-0.grafana.net:443"
    auth     = otelcol.auth.basic.grafana_cloud.handler
  }
}

pyroscope.write "production_backend" {
  endpoint {
    url = "https://profiles-prod-001.grafana.net"

    basic_auth {
      username = "1064633"
      password = sys.env("GCLOUD_RW_API_KEY")
    }
  }
}
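With the shared relabeler from 01-global.alloy in place, a scrape job in any other config file only needs to forward to its receiver. As a hypothetical example (the filename and the `alloy` / `alloy_self` labels are illustrative), this scrapes Alloy's own metrics and hands them to `common_labels`:

```
// 02-example.alloy (illustrative filename)

// Expose Alloy's own metrics in Prometheus format.
prometheus.exporter.self "alloy" { }

// Scrape them and forward the samples to the shared
// relabeler defined in 01-global.alloy.
prometheus.scrape "alloy_self" {
  targets    = prometheus.exporter.self.alloy.targets
  forward_to = [prometheus.relabel.common_labels.receiver]
}
```

Because every pipeline funnels through `common_labels`, the `env="production"` label and the remote-write credentials are defined once and applied everywhere.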

Lesson Map

Pick the example set you want to focus on:

Validating and Running Your Config

Validate a single config file:

alloy validate config.alloy

Validate a directory of configs:

alloy validate /path/to/configs

Run Alloy with your configuration:

alloy run /path/to/configs

Verifying Metric Collection

  • Alloy metrics endpoint: curl http://localhost:12345/metrics shows Alloy's own metrics and scrape stats.
  • Remote write backend: verify metrics in Grafana Cloud or Prometheus.
  • Exporter logs: check Alloy logs if a target is unreachable.
Warning: Always test your configuration in a non-production environment first. A misconfigured scrape job with a very short interval could overwhelm your applications with requests.

Common Pitfalls

  • Missing forward_to: Scraped metrics are discarded if they aren't forwarded to a receiver.
  • Wrong target format: Targets must be objects with at least __address__.
  • Exporter dependencies missing: Redis, Docker, or GitHub exporters require the corresponding service or token.
  • Permission issues: Docker and journald scrapes require access to sockets or files.
  • Network problems: Firewalls or TLS misconfigurations prevent scrapes.
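To illustrate the target-format pitfall above: a hand-written target is a map whose `__address__` key holds the host:port to scrape, and every other key becomes a label on the collected samples. A minimal sketch, where the `my_app` label, the address, and the `service` label are placeholders:

```
prometheus.scrape "my_app" {
  targets = [
    // __address__ is required; other keys become labels.
    { "__address__" = "localhost:8080", "service" = "my-app" },
  ]
  forward_to = [prometheus.relabel.common_labels.receiver]
}
```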

Summary

In this lesson, you learned the core Prometheus scrape pipeline and how to validate and verify Alloy metric collection. Use the sublessons to dive into exporter examples, container and proxy metrics, synthetic probes, log sources, and profiling pipelines.

Remember that prometheus.scrape is the entry point; downstream components do the processing and delivery.

Quiz

Prometheus Scraping in Alloy - Quick Check

What is the purpose of the `forward_to` attribute in a `prometheus.scrape` component?
