
[Search] Memory leaks caused by AbortController #65051

Closed
Dosant opened this issue May 4, 2020 · 3 comments · Fixed by #81996
Assignees: lizozom
Labels: bug (Fixes for quality problems that affect the customer experience), Feature:Search (Querying infrastructure in Kibana), impact:high (Addressing this issue will have a high level of impact on the quality/strength of our product), loe:medium (Medium Level of Effort), v7.9.0

Comments

@Dosant
Contributor

Dosant commented May 4, 2020

While digging into the search code I noticed potential improvements around request cancellation:

  1. When firing search requests we subscribe to the abort signal:

abort.signal.addEventListener('abort', () => {
  cancel();
});

But it could happen that the client has already asked to abort the request, and it seems we have to handle that case separately:

e.g.:

const { abortSignal = null } = requestOptionsMap.get(request) || {};
if (abortSignal) {
  if (abortSignal.aborted) {
    // The caller aborted before we subscribed, so cancel immediately.
    abort();
  } else {
    // Otherwise cancel whenever the caller aborts later.
    abortSignal.addEventListener('abort', abort);
  }
}

Such a scenario happened to me in a dashboard.

  2. I guess that every time we manually subscribe to the abort event via abortSignal.addEventListener('abort', abort); we also have to unsubscribe, otherwise it is a memory leak. Note: RxJS fromEvent handles the unsubscribe for us when we unsubscribe from the observable (see the sketch after this comment).

To confirm this is needed: the fetch polyfill, for example, does this: https://github.com/github/fetch/blob/master/fetch.js#L534
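
A minimal sketch of the fromEvent approach mentioned above, assuming an abortSignal is in scope and a hypothetical cancel() helper (not the actual Kibana search code):

import { fromEvent } from 'rxjs';
import { first } from 'rxjs/operators';

// fromEvent adds the 'abort' listener on subscribe and removes it on unsubscribe,
// so the listener cannot outlive the subscription.
const abortSubscription = fromEvent(abortSignal, 'abort')
  .pipe(first())
  .subscribe(() => cancel());

// Once the request settles (success, error or abort), tear the subscription down
// so the AbortSignal no longer references our callback:
abortSubscription.unsubscribe();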

@Dosant Dosant added Feature:Search Querying infrastructure in Kibana Team:AppArch labels May 4, 2020
@elasticmachine
Contributor

Pinging @elastic/kibana-app-arch (Team:AppArch)

@lizozom lizozom added the v7.9.0 label Jun 9, 2020
@lizozom lizozom self-assigned this Jun 9, 2020
@lukasolson lukasolson added bug Fixes for quality problems that affect the customer experience loe:small Small Level of Effort impact:medium Addressing this issue will have a medium level of impact on the quality/strength of our product. labels Jun 23, 2020
@Dosant Dosant mentioned this issue Jul 3, 2020
@Dosant
Contributor Author

Dosant commented Jul 6, 2020

Please note:
Removing the event listener inside the 'abort' event callback won't be enough to fix the memory leak.
The event listener should also be removed when the request finishes: see
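
For illustration, a minimal sketch of that cleanup, assuming a hypothetical async runSearch() and a cancel() helper (not the actual Kibana API), running inside an async function:

const abort = () => cancel();
abortSignal.addEventListener('abort', abort);

try {
  return await runSearch(request);
} finally {
  // Remove the listener whether the search succeeded, failed or was aborted;
  // otherwise the AbortSignal keeps the closure (and everything it captures) alive.
  abortSignal.removeEventListener('abort', abort);
}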

@Dosant Dosant changed the title [Search] requests cancelations improvements [Search] Memory leaks caused by AbortController Oct 26, 2020
@Dosant Dosant added impact:high Addressing this issue will have a high level of impact on the quality/strength of our product. loe:medium Medium Level of Effort and removed loe:small Small Level of Effort labels Oct 26, 2020
@Dosant
Contributor Author

Dosant commented Oct 26, 2020

While I was looking into #79498, I noticed that this memory leak issue is more significant than it might have seemed. (This issue is not related to #79498.)

Every single search slowly leaks memory because of AbortController usage.

Most of the leaks are closure leaks from the abort utils.

This is what I saw for a dashboard with a bunch of panels and a frequent refresh interval, open for a couple of minutes:

[Screenshot: heap snapshot, 2020-10-23 00:29:44]

  • >8500 AbortSignals left in memory
  • >3263 AbortControllers left in memory

In total they retained 43 MB of memory.
This grows with every single search request, so I imagine it could crash large Kibana dashboards with frequent refresh intervals that are left open for a long time.

@lizozom @lukasolson, I think it makes sense to prioritise this and tackle it in this cycle.

@Dosant Dosant removed the impact:medium Addressing this issue will have a medium level of impact on the quality/strength of our product. label Oct 26, 2020