Conversation

@epmog epmog commented Nov 4, 2025

Fixes: #230

What was the problem/requirement? (What/Why)

We haven't seen this in the wild, and it's a fairly minimal risk for our usage (path-mapping rules run on machines that already require lots of memory), but unbounded lru_caches aren't generally great and will increase memory usage over time.

What was the solution? (How)

Add a maxsize to at least cap the unbounded growth. 100,000 is a LOT of path-mapping rule applications: the first entry is only evicted after the 100,001st unique path to map. As far as I understand, the actual memory footprint will vary based on the paths being returned from the function, since the args are hashed into a key. Back-of-napkin math for something like a 256-character path (i.e. reasonable) ends up being around 30 MB (Python doesn't make this easy to find out).
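As a minimal sketch (the function names and replacement rule here are hypothetical stand-ins, not the adaptor's actual API), capping an lru_cache just means passing maxsize; a second, tiny cache makes the LRU eviction behavior easy to see:

```python
from functools import lru_cache

# Hypothetical stand-in for the adaptor's path-mapping function;
# the real change just caps its cache at maxsize=100_000.
@lru_cache(maxsize=100_000)
def map_path(source_path: str) -> str:
    # Placeholder rule application, for illustration only.
    return source_path.replace("/mnt/source", "/mnt/dest")

# With a tiny cache the eviction is visible: the oldest entry is
# evicted once maxsize + 1 unique keys have been seen.
@lru_cache(maxsize=2)
def tiny_map(p: str) -> str:
    return p.upper()

tiny_map("a"); tiny_map("b"); tiny_map("c")  # "c" evicts "a"
tiny_map("a")                                # miss again: recomputed
info = tiny_map.cache_info()
print(info)  # misses=4, currsize=2 (only "c" and "a" remain)
```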

path_mapping_rules could probably just be capped at something like 128 (the default when maxsize is not specified), since there shouldn't be a reason to continually request that info.

An ideal follow-up would track cache usage to see how much is actually being used and adjust accordingly. But it's super easy to add this large upper bound now and iterate from there.
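For that follow-up, functools already exposes the counters needed to track usage; a sketch (map_path here is a hypothetical stand-in for the cached function):

```python
from functools import lru_cache

@lru_cache(maxsize=100_000)
def map_path(source_path: str) -> str:
    # Hypothetical stand-in for the cached path-mapping function.
    return source_path

for p in ("/proj/a.png", "/proj/b.png", "/proj/a.png"):
    map_path(p)

# cache_info() reports hits, misses, and current size — enough to
# decide whether 100,000 is over- or under-provisioned in practice.
info = map_path.cache_info()
print(info)  # CacheInfo(hits=1, misses=2, maxsize=100000, currsize=2)
```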

What is the impact of this change?

The memory footprint of the adaptor will no longer grow indefinitely due to the lru_cache.

How was this change tested?

hatch run test

Was this change documented?

N/A

Is this a breaking change?

I'd consider it not to be. Everything continues to work. If a user saw a massive increase in their cache-miss rate from this cap, they'd see some slowdown as the client re-requests the path from the background adaptor via IPC. But at that point they were already missing the cache a lot, unless the workload then went back to files they had previously mapped.

Does this change impact security?

Nope


By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.

@epmog epmog requested a review from a team as a code owner November 4, 2025 22:43
sonarqubecloud bot commented Nov 4, 2025

@epmog epmog merged commit 8937f21 into OpenJobDescription:mainline Nov 4, 2025
20 checks passed
@epmog epmog deleted the cap_lru_cache branch November 4, 2025 22:48
Successfully merging this pull request may close: Maintenance: Unbounded lru_cache