
nginx_otel

This project provides support for OpenTelemetry distributed tracing in Nginx, offering:

  • Lightweight and high-performance incoming HTTP request tracing
  • W3C trace context propagation
  • OTLP/gRPC trace export
  • Fully dynamic, variable-based sampling

Building

Install build tools and dependencies:

  $ sudo apt install cmake build-essential libssl-dev zlib1g-dev libpcre3-dev
  $ sudo apt install pkg-config libc-ares-dev libre2-dev # for gRPC

Configure Nginx:

  $ ./configure --with-compat

Configure and build Nginx OTel module:

  $ mkdir build
  $ cd build
  $ cmake -DNGX_OTEL_NGINX_BUILD_DIR=/path/to/configured/nginx/objs ..
  $ make
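The build produces a dynamic module that Nginx must load at startup. As a sketch (the path below is illustrative; use the actual location of the module file your build produced), add a load_module directive at the top of nginx.conf:

  load_module /path/to/build/ngx_otel_module.so;

  events {}
  http {
      # ... tracing configuration goes here ...
  }

load_module must appear in the main (top-level) context, before any http block.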

Getting Started

Simple Tracing

Exporting traces for all requests can be useful even in a non-distributed environment.

  http {
      otel_trace on;
      server {
          location / {
              proxy_pass http://backend;
          }
      }
  }

How to Use

Directives

Available in http/server/location contexts

otel_trace on | off | "$var";

The argument is a "complex value", which must evaluate to on/off or 1/0. Default is off.
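Because the argument can be a variable, sampling decisions can be made per request. As a sketch of ratio-based sampling (assuming the standard split_clients module; the $otel_sampled variable name and the 10% ratio are illustrative):

  http {
      # Hash $request_id and trace roughly 10% of requests.
      split_clients "$request_id" $otel_sampled {
          10%  on;
          *    off;
      }
      server {
          location / {
              otel_trace $otel_sampled;
              proxy_pass http://backend;
          }
      }
  }

Any variable whose value resolves to on/off or 1/0 can drive the decision in the same way.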

Available in http context

otel_exporter;

Defines how to export tracing data. There can only be one otel_exporter directive in a given http context.

otel_exporter {
    endpoint "host:port";
    interval 5s;         # max interval between two exports
    batch_size 512;      # max number of spans to be sent in one batch per worker
    batch_count 4;       # max number of pending batches per worker; spans over the limit are dropped
}

otel_service_name name;

Sets the service.name attribute of the OTel resource. By default, it is set to unknown_service:nginx.
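Putting the two http-context directives together, a minimal export setup might look like this (the endpoint and service name are illustrative; 4317 is the conventional OTLP/gRPC port of an OpenTelemetry collector):

  http {
      otel_exporter {
          endpoint localhost:4317;
      }
      otel_service_name my-gateway;

      server {
          location / {
              otel_trace on;
              proxy_pass http://backend;
          }
      }
  }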

Available in otel_exporter context

endpoint "host:port";

Defines exporter endpoint host and port. Only one endpoint per otel_exporter can be specified.

interval 5s;

Maximum interval between two exports. Default is 5s.

batch_size 512;

Maximum number of spans to be sent in one batch per worker. Default is 512.

batch_count 4;

Maximum number of pending batches per worker; spans over the limit are dropped. Default is 4.

License

Apache License, Version 2.0

© F5, Inc. 2023