Redis spans not being shown in Jaeger if volume and velocity is high #1620
-
Setup:
1 - Jaeger
docker run -p 16686:16686 -p 6831:6831/udp jaegertracing/all-in-one --log-level=debug
2 - Django application with Redis instrumentation
manage.py
...
from opentelemetry import trace
from opentelemetry.exporter.jaeger import JaegerSpanExporter
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchExportSpanProcessor
from opentelemetry.instrumentation.redis import RedisInstrumentor

if __name__ == "__main__":
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "app.settings")
    trace.set_tracer_provider(TracerProvider())
    trace.get_tracer_provider().add_span_processor(
        BatchExportSpanProcessor(JaegerSpanExporter(
            service_name='test',
            agent_host_name='localhost',
        ))
    )
    redis_instrumentor = RedisInstrumentor()
    redis_instrumentor.instrument()
    ...
3 - load_test.py script with Redis SET operations
import redis
import time

client = redis.StrictRedis(host="localhost", port=6379)

for i in range(1000):
    client.set('a' * 10, 'b' * 20)
    # time.sleep(1)
Running the script: python manage.py shell < load_test.py
It may be an issue on the Jaeger end, since _traced_execute_command is being called and span attributes are being set for each operation.
Currently trying to find out whether opentelemetry is dropping/throttling spans or whether Jaeger is not able to process the large volume.
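As a sanity check (a minimal sketch I haven't run against this exact setup), swapping in a console exporter rules out Jaeger entirely and shows whether the SDK generates one span per SET. ConsoleSpanExporter and SimpleExportSpanProcessor are the pre-1.0 SDK names matching BatchExportSpanProcessor above; adjust if the installed version differs:

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleExportSpanProcessor
from opentelemetry.instrumentation.redis import RedisInstrumentor

trace.set_tracer_provider(TracerProvider())
# Print every span to stdout as soon as it ends, bypassing UDP/Jaeger entirely
trace.get_tracer_provider().add_span_processor(
    SimpleExportSpanProcessor(ConsoleSpanExporter())
)
RedisInstrumentor().instrument()

If all 1000 spans show up on stdout, the drop is happening in the exporter/agent path rather than in the instrumentation.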
Replies: 1 comment 6 replies
-
Thanks for opening the discussion, can you also please share the dependencies along with their versions?
-
Thanks. From what you shared in the Gitter channel, and seeing the commented-out time.sleep(1), I believe the Thrift/UDP transport (the default Jaeger exporter protocol) is the reason. There is a packet size limitation, and, well, it's also UDP. I haven't diagnosed this myself but will try it sometime later. Do you see any warning messages? Also, would you mind checking the same with the gRPC collector option?
docker run -d --name jaeger \
  -e COLLECTOR_ZIPKIN_HTTP_PORT=9411 \
  -p 5775:5775/udp \
  -p 6831:6831/udp \
  -p 6832:6832/udp \
  -p 5778:5778 \
  -p 16686:16686 \
  -p 14268:14268 \
  -p 14250:14250 \
  -p 9411:9411 \
  jaegertracing/all-in-one:1.21
# For a simple local test
...
BatchExportSpanProcessor(JaegerSpanExporter(
    service_name='test',
    insecure=True,
    transport_format="protobuf"
))
...
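If no warning appears, it may simply be swallowed by the app's logging setup. A quick way to surface exporter warnings while reproducing the load (assuming nothing else reconfigures the root logger) is:

import logging

# Let WARNING-level messages from the exporter reach the console
logging.basicConfig(level=logging.WARNING)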
-
It is indeed a UDP packet issue, but due to the existing logger configuration of the Django app the warning was not being printed to the console.
Will try the gRPC collector and update.
Thank you so much :)
-
Thanks, the suggested configuration worked 👍
version: '3'
services:
  jaeger:
    image: jaegertracing/all-in-one:latest
    container_name: jaeger
    restart: on-failure
    ports:
      - 14250:14250
      - 16686:16686
    env_file: .env
trace.get_tracer_provider().add_span_processor(
    BatchExportSpanProcessor(JaegerSpanExporter(
        service_name=service_name,
        insecure=True,
        transport_format="protobuf"
    ))
)
If anyone is facing a logging issue: in my case the culprit was the 'disable_existing_loggers': True flag in the log configuration.
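For reference, a minimal Django LOGGING sketch (handler names are illustrative) that avoids silencing third-party loggers such as the exporter's:

# settings.py
LOGGING = {
    'version': 1,
    # Keep loggers created before this config is applied, e.g. the exporter's
    'disable_existing_loggers': False,
    'handlers': {
        'console': {'class': 'logging.StreamHandler'},
    },
    'root': {
        'handlers': ['console'],
        'level': 'WARNING',
    },
}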
-
Please keep in mind that this config was just to test our hypothesis. You may want to use the collector with HTTP basic authentication, or gRPC with channel credentials, in a real application.
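As a rough sketch only (keyword arguments vary between exporter versions, so check the signature of the installed opentelemetry-exporter-jaeger; the hostname and credentials below are placeholders):

# Thrift over HTTP to the collector with basic auth
JaegerSpanExporter(
    service_name='test',
    collector_endpoint='http://jaeger-collector:14268/api/traces?format=jaeger.thrift',
    username='user',
    password='password',
)

# or gRPC with TLS channel credentials
import grpc
JaegerSpanExporter(
    service_name='test',
    transport_format="protobuf",
    credentials=grpc.ssl_channel_credentials(),
)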
-
Thanks for the heads up. For now it is just a local setup for testing out OpenTelemetry instrumentation for Python packages.