Memory Leak when using top-level await before defining computed translations #2629
Thanks for reporting and providing a reproduction! This sounds similar to #2034, I thought I fixed that 😥 hopefully I'll have time to debug this soon.
Same issue, I have stores before computed properties with t().
Hmm, while working on fixing the other memory leak in PR #2616 I also tested it on this reproduction https://github.com/danielroe/i18n-memory-leak, and at that time it did seem to work. @kazupon @danielroe It's clear I need to read up on exactly what
@BobbieGoede I tried your method with my repo. I cloned it. However, the memory leak still persists, both in the
@azhirov
@BobbieGoede
I often build nuxt i18n as gzip with
@kazupon @azhirov What's interesting is that this memory leak doesn't seem to be present if the translations are in
For some reason the memory leak is also gone when moving the computed translation inside a component. I forked your reproduction and added that workaround if you would like to try: https://github.com/BobbieGoede/nuxt-i18n-2629-memory-leak.

page:

```vue
<script setup lang="ts">
import useFooStore from "~/store/foo";
import TranslationComponent from "~/components/TranslationComponent.vue";

const fooStore = useFooStore();
await fooStore.loadData();
</script>

<template>
  <div>
    <TranslationComponent />
  </div>
</template>
```

component:

```vue
<script setup lang="ts">
const text = computed(() => {
  const i18n = useNuxtApp().$i18n;
  return i18n.t("seventeen");
});
</script>

<template>
  <div>
    {{ JSON.parse(JSON.stringify(text)) }}
  </div>
</template>
```

Perhaps this could work as a workaround, besides the known workaround of moving the `await` below the computed variables. I will continue searching for the source of the issue, but have lowered the priority label due to the issue's specificity.
@BobbieGoede Certainly, this is an interesting approach, and it does address some issues. However, I disagree that it can be a full-fledged workaround, as the result of

I believe that the bug's priority remains "important", considering the widespread use of this module, including in production.
I'm not sure I fully understand your use case; as I understand it, you can use derived values in

I have changed my fork to demonstrate this shouldn't cause SSR issues, and changed the Pinia store to fetch from a server route; see https://github.com/BobbieGoede/nuxt-i18n-2629-memory-leak/blob/master/pages/index.vue for an example.

I lowered the label priority as I think the usage is specific enough that it shouldn't block the v8 release, preventing users from migrating from v7 to v8. I'm not sure how widespread the usage is; if more projects are running into this I would like to know. I'll raise the priority again either way, as it does impact performance as the label describes.
Hi, two things I observed. First, messages loaded from a function are not cached at all; maybe this should be configurable somehow. If the lazy-loaded messages are static, why not cache them? Second, following the "Allocation instrumentation on timeline" with the Chrome inspector, I observed that the leaked object is coming from here:

Line 59 in 4293f31
So far I can't really tell why the objects are still referenced from somewhere, but on each reload of the page a new object with all messages for a locale is spawned on the heap and keeps being referenced, so GC is never triggered. For us in production this leads to a crash of the application every X requests, once the heap is filled. What I can say for sure is that we have no async Pinia function executed on the server side, as all of our async Pinia logic is client-side only (onMounted).
It seems I managed to localize and fix the issue: https://github.com/s00d/i18n/tree/fix-leak

In the Nuxt documentation, they recommend using JSON instead of a class. I rewrote your code to import JSON, and the problem disappeared. Therefore, the memory leak is definitely related to generateTemplateNuxtI18nOptions. Here is the documentation link: https://nuxt.com/docs/api/kit/templates#examples

Here are my changes. Please note that this is just a code example; it's not very functional and triggers warnings due to JSON imports, but I believe it will help in identifying the issue.
It seems the problem is here.

Line 74 in b9c250e

If we refactor it like this:

```ts
importer.load = JSON.stringify(
  genDynamicImport(
    importSpecifier,
    !isServer
      ? { comment: `webpackChunkName: "${meta.key}"`, assert: { type: 'json' } }
      : { assert: { type: 'json' } }
  )
)
// ...
const loadFunction = new Function('return ' + load)()
const getter = await loadFunction().then(r => r.default || r)
```

and also fix a few other places, for example the path to the locale file, then the memory leak disappears. In my example above there are some changes that still need to be made, but when switching to JSON a lot of other problems arise. We need to find a different solution.
After testing a lot of things and trying to dig into the bug, I actually found out that for me it was related to the Nuxt experimental feature called asyncContext. When running with asyncContext enabled I can reproduce an OOM with the leaking object (the loaded translations) relatively reliably; when it is deactivated, everything works fine. For reference, the description of the asyncContext feature
I made a reproduction here.

I think one workaround, which is most probably not that clean, would be: when the messages are loaded for the first time for a language, store them in a global cache, and on subsequent loads just reference them from there instead of deep-cloning them on every request. I'm not 100% sure what the intention behind the deep-cloning is/was. I would volunteer to do a pull request if nothing speaks against leaving out the deep-cloning every time and just doing it once on the initial load for a locale. This would also save some GC time, as it results in far fewer allocations, especially when you have big translation files.

Maybe it also makes sense to keep the deep-cloning as default behavior but add an option which enables this caching and circumvents the deep-cloning of all messages every time.
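The cache-once idea described above can be sketched roughly like this. This is an illustrative sketch, not the module's actual code: `loadLocaleMessages`, `messageCache`, and the loader are hypothetical names, and a synchronous loader is used to keep the example minimal (the module's real loaders are async).

```javascript
// Sketch of the proposed workaround: deep-clone locale messages once on
// first load and serve the cached copy afterwards, instead of deep-cloning
// on every request. All names here are illustrative.
const messageCache = new Map();

function loadLocaleMessages(locale, loader) {
  if (!messageCache.has(locale)) {
    // Clone once so later mutations of the loader's object can't leak in.
    messageCache.set(locale, structuredClone(loader(locale)));
  }
  return messageCache.get(locale); // same shared object on later calls
}

// A fake loader standing in for a lazy-load function.
const fakeLoader = () => ({ seventeen: "Seventeen" });
const first = loadLocaleMessages("en", fakeLoader);
const second = loadLocaleMessages("en", fakeLoader);
console.log(first === second); // true: cached, not re-cloned per request
```

Beyond avoiding the leak, this trades one allocation per locale for one per request, which is where the GC savings mentioned above would come from.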
@s00d @agracia-foticos @szwenni
For initially creating and loading the locale messages this seems okay, but the deep cloning is happening on every route navigation right now (I checked it with the debugger). I will try to implement my suggestion and you can check whether it makes sense or not. In my repro I'm using remote fetching with a loader. I did not check with local files, which don't go through a loader and a remote API.
Okay, I need to investigate whether I can find the object which is getting leaked, but it's most probably some configuration object of the Nuxt internals.
Basically, it seems that I found where the leak is. If you do as in my pull request, the leak disappears. It means that a reference is created in 'messages' and can't be destroyed. I spent a lot of time searching, but after the fix everything started working fine. Unfortunately, this solution made the framework slower, but I think you can figure it out from here.
I've added new information here, and it's all much simpler than I thought. The issue lies with deepCopy, and my fix worked because the references in messages were removed.
@s00d Also, do you still experience a small memory leak with the fixes so far? I suspect the leaking messages are a symptom of the context or i18n options leaking somewhere later in the code, but since the messages take up the largest amount of memory/space in the options, they end up being the most impactful.
Sorry, I didn't check which pull request you were writing about; I'll check now. Locale retrieval is not working.

Regarding structuredClone: I tested it a couple of weeks ago; it operates slower than JSON.parse(JSON.stringify()).
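For readers comparing the two deep-clone approaches mentioned here, a quick sanity check of their behavior (relative speed varies by runtime and payload shape, so timings are deliberately left out; the `messages` object is a made-up example):

```javascript
// Both approaches produce fully detached copies of a plain-object
// messages tree; mutating a clone leaves the original intact.
const messages = {
  en: { seventeen: "Seventeen", nested: { greeting: "Hello" } },
};

const viaJson = JSON.parse(JSON.stringify(messages));
const viaStructured = structuredClone(messages);

viaJson.en.nested.greeting = "changed";
viaStructured.en.seventeen = "changed";
console.log(messages.en.nested.greeting); // "Hello"
console.log(messages.en.seventeen); // "Seventeen"
```

One behavioral difference worth noting: the JSON round-trip silently drops functions and `undefined` values and stringifies Dates, while `structuredClone` preserves more built-in types but throws on functions, so for plain-JSON locale messages either works.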
Can you check if you're still experiencing the leak using the latest edge release? I made some changes in
I've checked on my own setup with the same problem: no more memory leaks! Total heap size increases when there are many active requests, but then quickly returns to a normal value. Really looking forward to the release of this fix! Thank you!
@wizzout @azhirov @s00d
Checked on one of the test repositories; everything is fine now. Unfortunately, it will take significantly more time for the others. For now, we have worked around the issue with temporary solutions. I will try to do more testing.
I have checked several versions of the module for memory leaks. The repository https://github.com/azhirov/nuxt-i18n-pinia-memory-leak was used for testing. Before installing each new version, I deleted the previously installed dependencies. For load testing, the same command as in the reproduction was used. Here are the results:

- 8.0.0-rc.5 ✅ No leaks, no errors in console output
- 8.0.0-rc.6 ❌ There is a leak, and there is an error
- 8.0.0-rc.7 ❌ There is a leak, and there is an error
- 8.0.0-rc.8 ❌ There is a leak, and there is an error
- 8.0.0-rc.9 ❌ There is a leak, and there is an error
- 8.0.0-rc.10 ❗️ No leaks, but there is an error
- 8.0.0-rc.11 ✅ No leaks. No errors. I was surprised here because this version had the issue.
- 8.0.0-rc.11 and [email protected] ❌ There is a leak. To check, I deliberately installed an old version of vue-i18n.
- [email protected] ✅ No leaks. Probably because the minimum version of the package is specified in the dependencies:

```json
"@nuxtjs/i18n": "8.0.0",
"vue-i18n": "^9.9.0"
```

I might be mistaken, but it seems that the leak was in the `vue-i18n` package. No need to install it separately.
@azhirov You're right, most if not all of the memory leak was fixed in `vue-i18n`.

Will soon publish a patch release with the recommendation to refresh lockfiles when updating, to ensure the latest
Hello all. For context: we have around 40 different languages, and after some time (around 20 minutes) the app has filled up all the available heap, crashes, and restarts. Only then is the heap freed again; until then it keeps filling up. A heap dump created after 10 minutes in our production environment shows around 50 copies of the 'de' messages object (which is the default language) in 50 different vueI18nOptions objects, and this piles up. So somehow this problem is still not fixed. From a configuration point of view I changed nothing beyond the things already described above. I'm not sure how to make a reproduction of this, or whether I should create a new issue or this one can be reopened.
@szwenni
@BobbieGoede
Hello all, we had a problem with a Nuxt 3 app newly pushed to production. This modification to package.json did the trick locally. We will soon deploy a new release of our app in the prod container. For context:
So, to conclude, @azhirov: thank you so much, mate.
@szwenni Hey, have you found the solution to your leak? It seems I have the same problem.
@vhovorun
So actually there are two sides, I think.
These two things together, plus Node being under pressure (a lot of requests, e.g. Google traversing your site), lead to Node not having the time to GC the big objects fast enough, and finally to an OOM. I fixed this for myself by using someone's pull request where they adapted the route matcher to use not only regexes but also a tree. This actually sped up the whole application by 1000%, and so far I have not had any issues.
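The regex-vs-tree difference mentioned above comes down to lookup cost: a flat matcher tests one regex per registered route (linear in route count), while a tree walks path segments (proportional to path depth). A toy illustration, not vue-router's or the PR's actual code:

```javascript
// Toy comparison of flat regex matching vs. a segment trie.
// Only static paths are handled; real matchers also support params.
function makeTrie(routes) {
  const root = { children: {} };
  for (const path of routes) {
    let node = root;
    for (const seg of path.split("/").filter(Boolean)) {
      node.children[seg] ??= { children: {} };
      node = node.children[seg];
    }
    node.route = path; // mark terminal node with its route
  }
  return root;
}

function matchTrie(root, path) {
  let node = root;
  for (const seg of path.split("/").filter(Boolean)) {
    node = node.children[seg];
    if (!node) return undefined; // dead end: no route
  }
  return node.route;
}

// Flat matcher: one regex test per registered route (linear scan).
function matchFlat(routes, path) {
  return routes.find((p) => new RegExp(`^${p}/?$`).test(path));
}

const routes = ["/en/about", "/en/blog", "/de/about", "/de/blog"];
const trie = makeTrie(routes);
console.log(matchTrie(trie, "/de/blog")); // "/de/blog"
console.log(matchFlat(routes, "/de/blog")); // "/de/blog"
```

With i18n, localized route tables multiply the route count by the number of locales (40 languages here), which is exactly when the linear scan starts to hurt.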
@szwenni Of course, if vuejs/router#2148 works well that's great; there's also room for the i18n module to improve routing performance (and build size) by changing the matching approach, as in #2937. It's all an ongoing effort 😅
Hello. After careful consideration, I've decided to preserve all my work on this project. I've invested a significant amount of time and effort into it, and I'm quite proud of the results: https://github.com/s00d/nuxt-i18n-micro

Perhaps others have encountered similar challenges. I've detailed the issues and solutions in the README. Currently, my module offers a 2-3x performance improvement in most cases, and up to a 10x speedup in some scenarios. It's also smaller, simpler, and easier to understand.
@s00d Having more modules explore i18n should make for a healthier ecosystem 💪
It's difficult to say what the future holds right now. My current focus is on understanding how Vite and Nuxt work together, and I'm not just optimizing the build process but also enhancing the module's runtime performance. I chose not to use Vue i18n due to the numerous issues I've personally encountered, which has brought both advantages and challenges.

At the moment, my priority is catching bugs and getting the module ready for production. After that, we'll see what direction things take. I'm also exploring how to properly implement type definitions for the Nuxt plugin. So far, I haven't managed to create a fully functional solution; I've only been able to achieve some level of typing in the IDE through composables. Unfortunately, neither the documentation nor GitHub searches have provided clear answers. There are still a lot of hurdles to overcome before I can start thinking about expanding the module's features.
@BobbieGoede Hey, I found the problem, and it wasn't obvious at all 😅 but maybe it will save someone a lot of time: in my case the fix was switching from Node version 22.5.1 to 22.0.0. After that, memory still goes high, but it resets back to normal.
### Environment

### Reproduction

I've created a minimal reproduction repository to illustrate the problem. To reproduce the issue:

```shell
git clone [email protected]:azhirov/nuxt-i18n-pinia-memory-leak.git
npm install
npm run build
node .output/server/index.mjs
```

Use `ab`, `siege`, or any other utility to send multiple requests. For example:

```shell
siege -c10 -t30s http://127.0.0.1:3000
```

or:

```shell
ab -n 5000 -c 100 http://localhost:3000/
```

(Memory usage can be monitored with the `top` command.)

### Describe the bug
I've encountered a peculiar memory leak issue related to the simultaneous usage of the `@nuxtjs/i18n` and `@pinia/nuxt` modules in a Nuxt 3 project. The problem manifests when an asynchronous method from the Pinia store is called (using `await`) before a `computed` property that includes `useNuxtApp().$i18n.t()`. This results in a memory leak. Interestingly, moving the asynchronous method below the `computed` property prevents the leak. Additionally, if `$i18n.t()` is not used in the computed property, the leak does not occur.

Code with memory leak:
After running the command `node .output/server/index.mjs`, the server consumes approximately 20 MB of RAM. However, after executing the command `siege -c10 -t30s http://127.0.0.1:3000`, the memory usage increases to 1.4 GB and remains at approximately this level even after stopping the load.

If you move the `await` call to the asynchronous method of the store below the computed property, the memory leak is significantly reduced, and memory consumption resets to almost the initial level.

I've noticed an interesting pattern: the memory leak seems to intensify with an increase in the number of translation files (in the `lang` directory).

### Additional context

A similar issue was mentioned earlier in this comment: #2034 (comment). After updating to version `8.0.0-rc.11`, the problem reproduces only when using Pinia. A simple asynchronous function defined within the same component does not trigger the memory leak.

### Logs

No response