
Different behavior with different OpenTelemetry Collector images when using the prometheusremotewrite exporter #37600

Closed
jpadmin opened this issue Jan 30, 2025 · 3 comments
jpadmin commented Jan 30, 2025

Component(s)

exporter/prometheusremotewrite

What happened?

Description

I have been running the OpenTelemetry Collector in a Kubernetes cluster via the OpenTelemetry Operator, but the collector pods fail to start, reporting that prometheusremotewrite is an unknown exporter type.

Steps to Reproduce

  1. Install the OpenTelemetry Operator using the documentation provided in this link.
  2. Try to set up a collector using the following manifest, after setting a valid Prometheus push URL.
apiVersion: opentelemetry.io/v1beta1
kind: OpenTelemetryCollector
metadata:
  name: collector
  namespace: default
spec:
  mode: daemonset
  config:
    receivers:
      otlp:
        protocols:
          grpc:
          http:

    processors:
      batch: {}

    exporters:
      prometheusremotewrite:
        endpoint: "<PROMETHEUS OR MIMIR PUSH URL>"
        tls:
          insecure: true

    service:
      pipelines:
        metrics:
          receivers: [otlp]
          processors: [batch]
          exporters: [prometheusremotewrite]
  3. Observe the error in the pod logs.
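The steps above can be run with kubectl; the label selector below is an assumption based on the operator's default naming conventions, so adjust it to match your cluster:

```shell
# Apply the manifest above, then follow the logs of the generated collector pods.
kubectl apply -f collector.yaml
kubectl -n default logs -l app.kubernetes.io/component=opentelemetry-collector --follow
```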

2025/01/30 18:45:21 collector server run finished with error: failed to get config: cannot unmarshal the configuration: decoding failed due to the following error(s):
error decoding 'exporters': unknown type: "prometheusremotewrite" for id: "prometheusremotewrite" (valid values: [otlphttp file loadbalancing otelarrow debug nop otlp])

Additional Information

The collector fails when we use the Docker image otel/opentelemetry-collector-k8s:0.117.0 or otel/opentelemetry-collector-k8s:0.118.0.
It succeeds when we change the image to otel/opentelemetry-collector-contrib:0.117.0 or otel/opentelemetry-collector-contrib:0.118.0.

Is this an issue with the images being out of sync, or is this configuration simply not valid for the otel/opentelemetry-collector-k8s images?
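One way to check which components a given image bundles, without deploying it, is the collector binary's `components` subcommand (a sketch, assuming the image tags from this issue; the subcommand is available in recent collector releases):

```shell
# List the exporters compiled into each distribution and look for prometheusremotewrite.
docker run --rm otel/opentelemetry-collector-k8s:0.118.0 components | grep -i prometheusremotewrite
docker run --rm otel/opentelemetry-collector-contrib:0.118.0 components | grep -i prometheusremotewrite
```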

Expected Result

The collector pods would come up

Actual Result

The collector pods failed with the above errors

Collector version

0.117.0, 0.118.0

Environment information

Environment

OS: macOS 15.1 (24B83) running Kubernetes; Debian latest
Compiler (if manually compiled): NA — I am using the prebuilt images from the OpenTelemetry Docker registry, so this does not apply.

OpenTelemetry Collector configuration

collector.yaml: |
    receivers:
      otlp:
        protocols:
          grpc:
            endpoint: 0.0.0.0:4317
          http:
            endpoint: 0.0.0.0:4318
    exporters:
      prometheusremotewrite:
        endpoint: http://mimir-nginx.monitoring.svc.cluster.local/api/v1/push
        tls:
          insecure: true
    processors:
      batch: {}
    service:
      telemetry:
        metrics:
          address: 0.0.0.0:8888
      pipelines:
        metrics:
          exporters:
            - prometheusremotewrite
          processors:
            - batch
          receivers:
            - otlp

Log output

2025/01/30 18:45:21 collector server run finished with error: failed to get config: cannot unmarshal the configuration: decoding failed due to the following error(s):
error decoding 'exporters': unknown type: "prometheusremotewrite" for id: "prometheusremotewrite" (valid values: [otlphttp file loadbalancing otelarrow debug nop otlp])

Additional context

No response

@jpadmin jpadmin added bug Something isn't working needs triage New item requiring triage labels Jan 30, 2025
Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@dashpole

The prometheusremotewrite exporter is not supported in the otel/opentelemetry-collector-k8s distribution.

See https://github.com/open-telemetry/opentelemetry-collector-releases/blob/a09bde337f38bdb11fde140e1ef740ff4ee34932/distributions/otelcol-k8s/manifest.yaml.
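A workaround consistent with this answer is to point the operator at the contrib distribution via the CRD's `spec.image` field, keeping the rest of the original manifest unchanged (a sketch, not an official recommendation):

```yaml
apiVersion: opentelemetry.io/v1beta1
kind: OpenTelemetryCollector
metadata:
  name: collector
  namespace: default
spec:
  mode: daemonset
  # Override the default image with the contrib distribution,
  # which includes the prometheusremotewrite exporter.
  image: otel/opentelemetry-collector-contrib:0.118.0
  config:
    # ... same receivers/processors/exporters/service block as above ...
```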

@dashpole dashpole removed the needs triage New item requiring triage label Jan 30, 2025
@dashpole dashpole self-assigned this Jan 30, 2025

jpadmin commented Jan 31, 2025

Thank you for the clarification; I switched to otel/opentelemetry-collector-contrib:0.117.0.
You can close this issue.
