
Exporting failed. Will retry the request after interval. #868

Open
kadhamecha-conga opened this issue Aug 25, 2023 · 4 comments
Labels
bug Something isn't working

Comments

@kadhamecha-conga

Hi team,

We are facing an issue where traces are dropped during Lambda execution.

Error from CloudWatch:

{ "level": "error", "ts": 1692968962.3226757, "caller": "exporterhelper/queued_retry.go:296", "msg": "Exporting failed. Dropping data. Try enabling sending_queue to survive temporary failures.", "kind": "exporter", "data_type": "traces", "name": "otlp", "dropped_items": 4, "stacktrace": "go.opentelemetry.io/collector/exporter/exporterhelper.(*queuedRetrySender).send\n\tgo.opentelemetry.io/[email protected]/exporter/exporterhelper/queued_retry.go:296\ngo.opentelemetry.io/collector/exporter/exporterhelper.NewTracesExporter.func2\n\tgo.opentelemetry.io/[email protected]/exporter/exporterhelper/traces.go:116\ngo.opentelemetry.io/collector/consumer.ConsumeTracesFunc.ConsumeTraces\n\tgo.opentelemetry.io/collector/[email protected]/traces.go:36\ngo.opentelemetry.io/collector/service/internal/fanoutconsumer.(*tracesConsumer).ConsumeTraces\n\tgo.opentelemetry.io/[email protected]/service/internal/fanoutconsumer/traces.go:77\ngo.opentelemetry.io/collector/receiver/otlpreceiver/internal/trace.(*Receiver).Export\n\tgo.opentelemetry.io/collector/receiver/[email protected]/internal/trace/otlp.go:54\ngo.opentelemetry.io/collector/pdata/ptrace/ptraceotlp.rawTracesServer.Export\n\tgo.opentelemetry.io/collector/[email protected]/ptrace/ptraceotlp/grpc.go:72\ngo.opentelemetry.io/collector/pdata/internal/data/protogen/collector/trace/v1._TraceService_Export_Handler.func1\n\tgo.opentelemetry.io/collector/[email protected]/internal/data/protogen/collector/trace/v1/trace_service.pb.go:310\ngo.opentelemetry.io/collector/config/configgrpc.enhanceWithClientInformation.func1\n\tgo.opentelemetry.io/[email protected]/config/configgrpc/configgrpc.go:410\ngoogle.golang.org/grpc.chainUnaryInterceptors.func1.1\n\tgoogle.golang.org/[email protected]/server.go:1162\ngo.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc.UnaryServerInterceptor.func1\n\tgo.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/[email protected]/interceptor.go:349\ngoogle.golang.org/grpc.chainUnaryInterceptors.func1.1\n\tgoogle.golang.org/[email protected]/server.go:1165\ngoogle.golang.org/grpc.chainUnaryInterceptors.func1\n\tgoogle.golang.org/[email protected]/server.go:1167\ngo.opentelemetry.io/collector/pdata/internal/data/protogen/collector/trace/v1._TraceService_Export_Handler\n\tgo.opentelemetry.io/collector/[email protected]/internal/data/protogen/collector/trace/v1/trace_service.pb.go:312\ngoogle.golang.org/grpc.(*Server).processUnaryRPC\n\tgoogle.golang.org/[email protected]/server.go:1340\ngoogle.golang.org/grpc.(*Server).handleStream\n\tgoogle.golang.org/[email protected]/server.go:1713\ngoogle.golang.org/grpc.(*Server).serveStreams.func1.2\n\tgoogle.golang.org/[email protected]/server.go:965" }

Because of these dropped items we see gaps in our traces.

Package details:
`<PackageReference Include="OpenTelemetry.Instrumentation.AWSLambda" Version="1.1.0-beta.2" />`

OTel Collector config:

```yaml
receivers:
  otlp:
    protocols:
      grpc:
      http:

exporters:
  logging:
    loglevel: debug
  otlp:
    endpoint: "grpc endpoint"
    retry_on_failure:
      initial_interval: 1s
      max_interval: 5s
    sending_queue:
      queue_size: 2000
    timeout: 5s

# enables output for traces to xray
service:
  pipelines:
    traces:
      receivers: [otlp]
      exporters: [logging, otlp]
```

Let me know if you need more info.

Thanks.

@kadhamecha-conga kadhamecha-conga added the bug Something isn't working label Aug 25, 2023

This issue was marked stale. It will be closed in 30 days without additional activity.

@github-actions github-actions bot added the Stale label Aug 25, 2024
@serkan-ozal
Contributor

@kadhamecha-conga It seems that the problem is related to the gRPC endpoint that the collector exports to.

@kadhamecha-conga
Author

@serkan-ozal The gRPC endpoint is:

```yaml
otlp/1:
  endpoint: signals-grpc.demo.congacloud.app:443
  tls:
    insecure: true
```

Can you please help?

@serkan-ozal
Contributor

@kadhamecha-conga
You are using port 443 but disabling secure communication with the `insecure: true` parameter. Can you try removing the `insecure` parameter?
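
A minimal sketch of what that exporter block could look like with the `insecure` flag removed (the `otlp/1` name and endpoint are taken from the earlier comment; this assumes the endpoint actually terminates TLS on port 443):

```yaml
exporters:
  otlp/1:
    endpoint: signals-grpc.demo.congacloud.app:443
    # with no tls.insecure override, the gRPC exporter defaults to TLS
    # using the system certificate pool
```

If the server presents a certificate signed by a private CA, a `tls.ca_file` entry pointing at that CA certificate would also be needed.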

@github-actions github-actions bot removed the Stale label Sep 7, 2024