Setting Up Celery Prometheus Exporter

This guide explains how to install and run the celery-exporter utility, expose its metrics on the server, and scrape them using Grafana Alloy (or Prometheus). This is useful when you want Celery task metrics in Grafana Cloud.


1. Install the Exporter Binary

cd /tmp
curl -L https://github.com/danihodovic/celery-exporter/releases/latest/download/celery-exporter -o ./celery-exporter
chmod +x ./celery-exporter
mv ./celery-exporter /usr/local/bin/celery-exporter

This downloads the latest exporter and makes it available system-wide.
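To confirm the download worked and the binary runs on this host, print its help text:

celery-exporter --help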


2. Create a Systemd Service

Create the unit file:

/etc/systemd/system/celery-exporter.service

Use this content:

[Unit]
Description=Celery Exporter
After=network.target

[Service]
ExecStart=/usr/local/bin/celery-exporter --broker-url=redis://:<redis-password>@<redis-host>:6379/1
Restart=always
RestartSec=5

[Install]
WantedBy=multi-user.target
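Replace <redis-password> and <redis-host> with your values, and make sure the database index (/1 here) matches the one your Celery broker actually uses. If you'd rather not keep the password inline in the unit file, one option is a systemd EnvironmentFile; a minimal sketch, where the path /etc/celery-exporter/env and the variable name CE_BROKER_URL are examples, not anything the exporter requires:

# Create an environment file holding the broker URL (restrict permissions
# since it contains a password).
install -d -m 750 /etc/celery-exporter
cat > /etc/celery-exporter/env <<'EOF'
CE_BROKER_URL=redis://:<redis-password>@<redis-host>:6379/1
EOF
chmod 640 /etc/celery-exporter/env

Then, in the [Service] section, reference it with EnvironmentFile=/etc/celery-exporter/env and ExecStart=/usr/local/bin/celery-exporter --broker-url=${CE_BROKER_URL}.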

3. Reload, Enable, and Start

systemctl daemon-reload
systemctl enable celery-exporter
systemctl start celery-exporter

4. Check Service Status

systemctl status celery-exporter

Make sure it’s active (running).
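If it isn't running, the journal usually shows why (for example, a malformed broker URL or an unreachable Redis):

journalctl -u celery-exporter -n 50 --no-pager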


5. Verify Metrics Locally

By default the exporter listens on port 9808. Test it:

curl http://localhost:9808/metrics

You should see Prometheus-style output like task counts, latency, worker info, etc.
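To spot-check individual series (this exporter's metric names are prefixed with celery_, e.g. celery_task_sent_total):

curl -s http://localhost:9808/metrics | grep '^celery_' | head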


6. Scrape with Grafana Alloy

If you’re using Grafana Alloy on the same VM, add a scrape block so Alloy can pick up the Celery exporter metrics and forward them to Grafana Cloud.

Edit:

/etc/alloy/config.alloy

Add this block:

prometheus.scrape "celery_exporter" {
  job_name        = "celery_exporter"
  scrape_interval = "15s"
  metrics_path    = "/metrics"

  targets = [
    { __address__ = "127.0.0.1:9808" },
  ]

  forward_to = [prometheus.remote_write.metrics_service.receiver]
}

Then reload or restart Alloy (the exact method depends on how you installed it; on systemd installs, systemctl restart alloy). Note that the forward_to line assumes your existing configuration already defines a prometheus.remote_write component named metrics_service, as in Grafana Cloud's default Alloy setup; adjust the reference if yours is named differently.

This tells Alloy: “scrape Celery exporter at 127.0.0.1:9808 and forward it to the existing Prometheus pipeline.”
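On a systemd install, a restart followed by a quick look at the journal confirms the new block parsed cleanly:

systemctl restart alloy
journalctl -u alloy -n 50 --no-pager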

Additionally, add these two variables to Django's .env file and load them in settings.py, so that task event data is also exported (the exporter relies on Celery's task events to produce per-task metrics).

CELERY_WORKER_SEND_TASK_EVENTS=True
CELERY_TASK_SEND_SENT_EVENT=True

Add this in config/settings.py where the other CELERY settings are defined (config here is the same env-reading helper already used for those settings, e.g. python-decouple's config).

CELERY_WORKER_SEND_TASK_EVENTS = config(
    "CELERY_WORKER_SEND_TASK_EVENTS", default=False, cast=bool
)
CELERY_TASK_SEND_SENT_EVENT = config(
    "CELERY_TASK_SEND_SENT_EVENT", default=False, cast=bool
)
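As a quick check before redeploying settings, a worker can also be started with task events enabled directly from the command line via Celery's -E/--task-events flag (the app name config here mirrors this project's layout; substitute your own):

celery -A config worker -E --loglevel=info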

Restart the Celery worker services (and beat, if you run it) and the celery-exporter service after this so the new settings take effect.
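For example, on a systemd deployment (the unit names celery and celery-beat are placeholders; use whatever units your deployment defines):

systemctl restart celery celery-beat
systemctl restart celery-exporter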