Macro interpreter memory leak, up to 24GB RAM usage observed #11642
Comments
I'll be happy to assist you in this quest, but it will be quite difficult for me to reliably infer much from here. Most of the relevant data lives in evalContext.ml. First we should make sure that we're actually looking at the right thing, though. Something you could try is doing …
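As a starting point for such measurements, here is a minimal standalone sketch of measuring the memory retained by a single value, assuming plain stdlib OCaml (4.04+); the compiler's own mem_size helper in memory.ml may be implemented differently:

```ocaml
(* Hypothetical standalone helper, not the compiler's mem_size:
   Obj.reachable_words counts all heap words reachable from a value,
   which we convert to bytes using the platform word size. *)
let mem_size_bytes (v : 'a) : int =
  Obj.reachable_words (Obj.repr v) * (Sys.word_size / 8)

let () =
  (* Example: measure the retained size of a throwaway array. *)
  let xs = Array.init 10_000 (fun i -> string_of_int i) in
  Printf.printf "array retains ~%d bytes\n" (mem_size_bytes xs)
```

Note that this traverses everything reachable from the value, so measuring a large structure (like the whole interpreter) is itself an O(size) operation.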
Sounds like those are separate issues to address regardless of what we end up finding here. GC heap words is interesting because … Just to be sure: the sub-items under …
Yes, I think those are separate issues (but very likely related to the language server).
I think what you mean is …
I used the code below for tracing in memory.ml:

```ocaml
(* Extra entries for the "display memory details" report: one entry per
   field of the cached macro interpreter, measured with mem_size.
   `stat` and `size` are bindings from the surrounding get_memory_json
   code (the Gc statistics record and the computed GC heap size). *)
"additionalSizes",jarray (
  (match !MacroContext.macro_interp_cache with
  | Some interp ->
    [
      jobject ["name",jstring "macro interpreter";"size",jint (mem_size (MacroContext.macro_interp_cache))];
      jobject ["name",jstring "macro builtins";"size",jint (mem_size (interp.builtins))];
      jobject ["name",jstring "macro debug";"size",jint (mem_size (interp.debug))];
      jobject ["name",jstring "macro curapi";"size",jint (mem_size (interp.curapi))];
      jobject ["name",jstring "macro type_cache";"size",jint (mem_size (interp.type_cache))];
      jobject ["name",jstring "macro overrides";"size",jint (mem_size (interp.overrides))];
      jobject ["name",jstring "macro array_prototype";"size",jint (mem_size (interp.array_prototype))];
      jobject ["name",jstring "macro string_prototype";"size",jint (mem_size (interp.string_prototype))];
      jobject ["name",jstring "macro vector_prototype";"size",jint (mem_size (interp.vector_prototype))];
      jobject ["name",jstring "macro instance_prototypes";"size",jint (mem_size (interp.instance_prototypes))];
      jobject ["name",jstring "macro static_prototypes";"size",jint (mem_size (interp.static_prototypes))];
      jobject ["name",jstring "macro constructors";"size",jint (mem_size (interp.constructors))];
      jobject ["name",jstring "macro file_keys";"size",jint (mem_size (interp.file_keys))];
      jobject ["name",jstring "macro toplevel";"size",jint (mem_size (interp.toplevel))];
      jobject ["name",jstring "macro eval";"size",jint (mem_size (interp.eval))];
      jobject ["name",jstring "macro evals";"size",jint (mem_size (interp.evals))];
      jobject ["name",jstring "macro exception_stack";"size",jint (mem_size (interp.exception_stack))];
      jobject ["name",jstring "gc live_words";"size",jint (stat.live_words)];
    ]
  | None ->
    [jobject ["name",jstring "macro interpreter";"size",jint (mem_size (MacroContext.macro_interp_cache))];]
  )
  @
  [
    (* jobject ["name",jstring "macro stdlib";"size",jint (mem_size (EvalContext.GlobalState.stdlib))];
    jobject ["name",jstring "macro macro_lib";"size",jint (mem_size (EvalContext.GlobalState.macro_lib))]; *)
    jobject ["name",jstring "last completion result";"size",jint (mem_size (DisplayException.last_completion_result))];
    jobject ["name",jstring "Lexer file cache";"size",jint (mem_size (Lexer.all_files))];
    jobject ["name",jstring "GC heap words";"size",jint (int_of_float size)];
  ]
);
```
I just saw that Gc.quick_stat() does not compute live_words; I will check whether the full Gc.stat() gives more information.
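For reference, this matches the stdlib documentation: Gc.quick_stat returns the same record as Gc.stat but leaves live_words (and the other fields that require a heap traversal) at 0, whereas Gc.stat actually walks the heap, at a cost proportional to its size. A minimal standalone check:

```ocaml
(* Compare the two Gc calls: quick_stat is cheap but reports 0 for
   traversal-dependent fields; stat computes them for real. *)
let () =
  let q = Gc.quick_stat () in
  let s = Gc.stat () in
  Printf.printf "quick_stat live_words: %d\n" q.Gc.live_words; (* 0 by design *)
  Printf.printf "stat live_words:       %d\n" s.Gc.live_words  (* actually computed *)
```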
Maybe I have to RTFM again; I don't know what's going on with the live words... (Edit: ah, just saw your next comment, makes sense!) But this is useful information, so the problem appears to be in the …
Let's keep this open; I want to investigate why the stack frames aren't popped in the first place.
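To illustrate the suspected bug pattern (a hypothetical sketch, not the actual eval code): if the interpreter pushes a frame on call and only pops it on the normal return path, every call that exits via an exception leaks its frame, and everything the frame references stays reachable. Wrapping the pop in Fun.protect (OCaml 4.08+) guarantees it runs on every exit path:

```ocaml
(* Hypothetical frame stack; the real eval environment handling differs. *)
type frame = { name : string }

let frames : frame list ref = ref []

(* Buggy version: the pop is skipped entirely if f raises,
   so the frame (and everything it references) is never released. *)
let call_leaky name f =
  frames := { name } :: !frames;
  let r = f () in
  frames := List.tl !frames;
  r

(* Fixed version: Fun.protect runs the pop even when f raises. *)
let call_safe name f =
  frames := { name } :: !frames;
  Fun.protect ~finally:(fun () -> frames := List.tl !frames) f
```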
When compiling a macro-heavy project with VS Code, we observed that haxe.exe's memory usage grows quickly on each compile (and diagnostics run) when the source is modified. It easily grows up to 24 GB before "display memory details" and 10 GB after it (there is an OCaml GC call inside get_memory_json), which freezes the system. Restarting the language server resets the cache size, but the relief does not last long (devs are restarting very regularly now 😭).

We suspect a memory leak in the macro interpreter. With some additional tracing on each field of "macro interpreter" (see the code above), we measured the per-field sizes; for comparison, after an OCaml GC run the total cache size after the first build is less than 400 MB, and the average usage of our other projects is around 1.5 GB.
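For context on why the figure drops after "display memory details", here is a standalone sketch, independent of the compiler's get_memory_json: Gc.compact () performs a full major collection plus heap compaction, so heap_words shrinks toward the live set afterwards, consistent with the 24 GB → 10 GB drop observed here.

```ocaml
(* Report the GC-managed heap size before and after compaction.
   heap_words counts all words the GC has obtained from the OS;
   after Gc.compact () it approaches the size of the live data. *)
let words_to_mb w = float_of_int (w * (Sys.word_size / 8)) /. 1024. /. 1024.

let () =
  Printf.printf "before compact: %.1f MB\n" (words_to_mb (Gc.quick_stat ()).Gc.heap_words);
  Gc.compact ();
  Printf.printf "after compact:  %.1f MB\n" (words_to_mb (Gc.quick_stat ()).Gc.heap_words)
```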