I have OpenTelemetry exporting traces to Jaeger for many Node.js services, but with Python it's not working:
opentelemetry-instrument \
--traces_exporter console,otlp \
--metrics_exporter console \
--service_name llama-indexer \
--exporter_otlp_protocol http/protobuf \
--exporter_otlp_endpoint http://localhost:4318/v1/traces \
start_api
The traces log to the console with no issue, but they do not show up in Jaeger at all.
I have the same endpoint for the JS traces and it works just fine. Here's my Docker container in case it helps:
83bce0bb7a12 jaegertracing/all-in-one "/go/bin/all-in-one-…" About an hour ago Up About an hour 5775/udp, 5778/tcp, 14250/tcp, 0.0.0.0:4317-4318->4317-4318/tcp, 0.0.0.0:16686->16686/tcp, 6831-6832/udp, 14268/tcp jaeger
Is there any way to get better observability into what's going wrong with the exporter, or is there anything obvious I'm missing?
I have tried many different parameters, tried sending traces directly to Datadog (didn't work), and tried manually instrumenting the FastAPI app.
I have also tried the gRPC exporter, which works, but I need HTTP.
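By "better observability" I mean something like the sketch below: turning up logging inside the app that opentelemetry-instrument wraps, on the assumption that the OTLP exporter reports failed exports through Python's standard logging module (the logger name here is my guess):

import logging

# Assumption: export failures (connection errors, non-2xx responses) are
# logged by the SDK, so configuring a handler and raising the level should
# surface them alongside the console spans.
logging.basicConfig(level=logging.DEBUG)
logging.getLogger("opentelemetry").setLevel(logging.DEBUG)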
The issue was with my --exporter_otlp_endpoint value. I used http://localhost:4318/v1/traces, which was incorrect (although it worked with gRPC, presumably because the gRPC exporter only uses the host and port and ignores the URL path). With http/protobuf, the generic OTLP endpoint gets the signal path appended for you, so the exporter ended up posting to http://localhost:4318/v1/traces/v1/traces.
Switching to:
--exporter_otlp_endpoint http://localhost:4318 \
solved the issue.
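For anyone configuring the exporter in code rather than through opentelemetry-instrument: an endpoint passed directly to the HTTP span exporter is used as-is, so there you do spell out the /v1/traces path. A minimal sketch, assuming the opentelemetry-sdk and opentelemetry-exporter-otlp-proto-http packages:

from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# The endpoint given directly to the exporter is not modified,
# so the signal path is included here (unlike the CLI/env setting above).
exporter = OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces")

provider = TracerProvider(resource=Resource.create({"service.name": "llama-indexer"}))
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

# Quick smoke test: this span should appear in Jaeger under llama-indexer.
with trace.get_tracer(__name__).start_as_current_span("smoke-test"):
    pass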