NewRelic Scaler Crashes on Logging #3945

Closed
lkishalmi opened this issue Dec 2, 2022 · 0 comments · Fixed by #3946
Labels
bug Something isn't working

Comments

@lkishalmi
Contributor

Report

The New Relic scaler crashes when it attempts to log an error; providing an invalid query to the scaler is enough to trigger it.

That results in:

panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x20 pc=0x585a40]

goroutine 350 [running]:
github.com/go-logr/logr.Logger.Error({{0x0, 0x0}, 0xc000f9b520}, {0x3c9bf40, 0xc001eab5e0}, {0x35fbabd, 0x14}, {0x0, 0x0, 0x0})
        /go/pkg/mod/github.com/go-logr/logr@v1.2.3/logr.go:279 +0x80
github.com/kedacore/keda/v2/pkg/scalers.(*newrelicScaler).IsActive(0xc0002dc380, {0x3cf1f28, 0xc00070eec0})
        /workspace/pkg/scalers/newrelic_scaler.go:147 +0x7b
github.com/kedacore/keda/v2/pkg/scaling/cache.(*ScalersCache).IsScaledObjectActive(0xc0006980a0, {0x3cf1f28, 0xc00070eec0}, 0xc000b13400)
        /workspace/pkg/scaling/cache/scalers_cache.go:89 +0xef
github.com/kedacore/keda/v2/pkg/scaling.(*scaleHandler).checkScalers(0xc0003e89a0, {0x3cf1f28, 0xc00070eec0}, {0x3516240, 0xc000b13400}, {0x3ccea98, 0xc001907ea0})
        /workspace/pkg/scaling/scale_handler.go:278 +0x4b2
github.com/kedacore/keda/v2/pkg/scaling.(*scaleHandler).startScaleLoop(0xc0003e89a0, {0x3cf1f28, 0xc00070eec0}, 0xc0019cd180, {0x3516240, 0xc000b13400}, {0x3ccea98, 0xc001907ea0})
        /workspace/pkg/scaling/scale_handler.go:149 +0x31c
created by github.com/kedacore/keda/v2/pkg/scaling.(*scaleHandler).HandleScalableObject
        /workspace/pkg/scaling/scale_handler.go:105 +0x6ef
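
The receiver printed in the first frame ({{0x0, 0x0}, ...}) suggests the scaler calls Error on a zero-value logr.Logger, whose nil LogSink is then dereferenced inside logr v1.2.3. The sketch below reproduces that mechanism in isolation; hypotheticalScaler and reportQueryError are made-up names, logr.Discard() merely stands in for a properly initialized logger, and the actual fix is in #3946 and may differ.

```go
// Minimal sketch (not KEDA's code, nor the fix from #3946) of the crash
// mechanism suggested by the trace: a scaler holding a zero-value
// logr.Logger panics when Error dereferences the nil LogSink (logr v1.2.x).
package main

import (
	"fmt"

	"github.com/go-logr/logr"
)

// hypotheticalScaler is an illustrative stand-in for the New Relic scaler;
// the type and field names are not taken from KEDA.
type hypotheticalScaler struct {
	logger logr.Logger
}

// reportQueryError mimics the error-logging call in IsActive that the trace
// points at (newrelic_scaler.go:147).
func (s *hypotheticalScaler) reportQueryError(err error) {
	// Panics under logr v1.2.x if s.logger was never initialized.
	s.logger.Error(err, "error inside new relic scaler")
}

func main() {
	queryErr := fmt.Errorf("invalid NRQL query")

	// Unsafe: the logger field is left at its zero value.
	broken := &hypotheticalScaler{}
	func() {
		defer func() {
			if r := recover(); r != nil {
				fmt.Println("zero-value logger panicked:", r)
			} else {
				fmt.Println("no panic on this logr version")
			}
		}()
		broken.reportQueryError(queryErr)
	}()

	// Safe: give the scaler an initialized logger (a no-op sink here) so the
	// error is handled instead of crashing the operator.
	fixed := &hypotheticalScaler{logger: logr.Discard()}
	fixed.reportQueryError(queryErr)
	fmt.Println("initialized logger handled the error without crashing")
}
```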

Expected Behavior

The error is logged to the console.

Actual Behavior

KEDA crashes with a segmentation fault.

Steps to Reproduce the Problem

  1. Set up a new-relic type scaler
  2. Provide an invalid query

Logs from KEDA operator

Same stack trace as shown in the report above.

KEDA Version

2.8.1

Kubernetes Version

< 1.23

Platform

Any

Scaler Details

New Relic

Anything else?

No response
