Celery worker stops working #3888

@mertcangokgoz

Description

Self-Hosted Version

25.8.0

CPU Architecture

x86_64

Docker Version

28.3.3

Docker Compose Version

v2.39.1

Machine Specification

  • My system meets the minimum system requirements of Sentry

We are running Sentry on a dedicated server.

Specs

CPU: Intel Core i9
RAM: 64 GB

Steps to Reproduce

  • N/A

Expected Result

We run a self-hosted Sentry server for 40 projects. Because of the high volume of incoming events, Redis memory usage grows until the machine running Sentry becomes unreachable. To prevent this, we added a memory limit to Redis.

However, Celery occasionally stops working with the following error, which persists even after the workers are restarted.

ResponseError("OOM command not allowed when used memory > 'maxmemory'.")

As a temporary workaround, it starts working again when I run the following:

docker compose exec redis redis-cli flushall && docker compose restart worker
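Before resorting to FLUSHALL, it can help to see how close Redis actually is to its limit. A minimal sketch using the standard `redis-cli INFO memory` fields (the `redis` service name matches the command above; the 90% threshold is my own arbitrary choice, not a Sentry default):

```shell
#!/bin/sh
# Sketch: report Redis memory pressure in the self-hosted Sentry stack.
info=$(docker compose exec redis redis-cli info memory)

# Extract the raw byte counters; `+0` strips the trailing \r from RESP output.
used=$(printf '%s' "$info" | awk -F: '/^used_memory:/{print $2+0}')
max=$(printf '%s' "$info" | awk -F: '/^maxmemory:/{print $2+0}')

echo "used_memory=$used maxmemory=$max"

# Warn when usage crosses 90% of maxmemory -- an arbitrary threshold.
if [ "$max" -gt 0 ] && [ $((used * 100 / max)) -ge 90 ]; then
    echo "Redis is near its maxmemory limit; OOM write errors are likely."
fi
```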

My redis.conf:

# redis.conf

# The 'maxmemory' directive controls the maximum amount of memory Redis is allowed to use.
# Setting 'maxmemory 0' means there is no limit on memory usage, allowing Redis to use as much
# memory as the operating system allows. This is suitable for environments where memory
# constraints are not a concern.
#
# Alternatively, you can specify a limit, such as 'maxmemory 15gb', to restrict Redis to
# using a maximum of 15 gigabytes of memory.
#
# Example:
# maxmemory 0         # Unlimited memory usage
maxmemory 40gb


# This setting determines how Redis evicts keys when it reaches the memory limit.
# `allkeys-lru` evicts the least recently used keys from all keys stored in Redis,
# allowing frequently accessed data to remain in memory while older data is removed.
# That said we use `volatile-lru` as Redis is used both as a cache and processing
# queue in self-hosted Sentry.
# > The volatile-lru and volatile-random policies are mainly useful when you want to
# > use a single Redis instance for both caching and for a set of persistent keys.
# > However, you should consider running two separate Redis instances in a case like
# > this, if possible.

maxmemory-policy volatile-random
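To confirm the running instance actually picked up these values, the standard `CONFIG GET` commands can be used (same `redis` service name as above). Note that with any `volatile-*` policy, only keys carrying a TTL are eligible for eviction, so keys without an expiry can still push `used_memory` past the limit and trigger the OOM write error quoted above.

```shell
# Sketch: check the live configuration against redis.conf.
docker compose exec redis redis-cli config get maxmemory
docker compose exec redis redis-cli config get maxmemory-policy
```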

Actual Result

  • N/A

Event ID

No response
