
Redactors in v0.71.1 consume too much memory #1331

Closed
banjoh opened this issue Sep 8, 2023 · 3 comments · Fixed by #1332
Assignees
Labels
bug::regression type::bug Something isn't working

Comments

@banjoh
Member

banjoh commented Sep 8, 2023

Bug Description

In certain scenarios, running redactors while collecting a support bundle can consume a lot of memory, leading to OOM signals killing pods.

Here are memory profiles of this happening in KOTS.

With v0.71.1:
[screenshot: memory profile, Sep 8 2023, 10:26 AM]

With v0.70.3:
[screenshot: memory profile, Sep 8 2023, 10:23 AM]

NOTE: This has not been reproduced when using the CLI. Reproduction steps from the comments below:

  • Check out v0.71.1 of troubleshoot
  • Build the binaries: make build
  • Collect and redact a support bundle: ./bin/support-bundle ./support-bundle.yaml --memprofile=mem.prof --redactors ./redactor.yaml with support-bundle.yaml and redactor.yaml
  • Analyse memory allocations: go tool pprof -http :8888 -alloc_space mem.prof will start an HTTP server on port 8888

Expected Behavior

Redaction code should not consume excess memory.

Steps To Reproduce

Include the commands to reproduce the issue, including any output. Any information that will help us understand the problem is useful. Feel free to paste long output into a GitHub gist and include the link here.

To be fleshed out.

Additional Context

Include the following information.

  • Troubleshoot version. If you built from source, note that including the version of Go you used to build with.
  • Operating system
  • Operating system version
  • Other details that might be helpful in diagnosing the problem
@banjoh added the type::bug and bug::regression labels on Sep 8, 2023
@arcolife
Contributor

arcolife commented Sep 8, 2023

For context, this was first seen in the test suite while bumping troubleshoot to v0.71.1 in KOTS: replicatedhq/kots#4031

And PR relevant to investigation #1291

Potentially helpful tooling #1301

Discussion (internal - all info is summarized above) https://replicated.slack.com/archives/C016DU6CXNE/p1694185342225949

@chris-sanders
Member

@banjoh did you get the images backwards?

@cbodonnell
Contributor

The higher memory consumption can also be reproduced from the CLI:

  1. Modify cmd/troubleshoot/main.go so that it generates a memory profile (there might be a better way to do this without rebuilding the binary, but this is how I got it working):
package main

import (
	"os"
	"runtime/pprof"

	"github.com/replicatedhq/troubleshoot/cmd/troubleshoot/cli"
)

func main() {
	cli.InitAndExecute()

	// Dump a heap profile after the CLI finishes so its allocations
	// can be inspected with `go tool pprof`.
	memProfileFile, err := os.Create("mem.prof")
	if err != nil {
		panic(err)
	}
	defer memProfileFile.Close()

	if err := pprof.WriteHeapProfile(memProfileFile); err != nil {
		panic(err)
	}
}
  2. make support-bundle
  3. ./bin/support-bundle ./support-bundle.yaml --redactors ./redactor.yaml with support-bundle.yaml and redactor.yaml
  4. go tool pprof -http :8888 -alloc_space mem.prof will start an HTTP server on port 8888 where you can view the details
