
azure-pipelines scaler: exact match demands (don't match jobs without capability) #4138

Closed
patst opened this issue Jan 18, 2023 · 6 comments · Fixed by kedacore/keda-docs#1063 or #4203
Comments

@patst
Contributor

patst commented Jan 18, 2023

Proposal

We are using the azure-pipelines scaler and configuring the capability support introduced with #2328.

Example trigger config:

  triggers:
    - type: azure-pipelines
      metadata:
        targetPipelinesQueueLength: "{{ $.Values.targetPipelinesQueueLength }}"
        poolName: "{{ $.Values.poolName }}"
        organizationURLFromEnv: "AZP_URL"
        demands: "cap-4gb"
      authenticationRef:
        name: pipeline-trigger-auth

I would like to have an attribute like requireDemands to configure that a ScaledObject for the above trigger is only scaled if the Azure DevOps job requests the capability named "cap-4gb".

Use-Case

We have different types of build agents running inside a Kubernetes cluster.
Our internal developers use these build agents, and they are autoscaled by KEDA's azure-pipelines scaler.

All build agents are in the same node pool (e.g. DEV) but have different configurations regarding CPU and RAM limits.

At the moment everything works fine when a demand is configured in the Azure DevOps pipeline:

# from a pipeline.yaml
demands: [ "cap-4gb" ]

If the pipelines do not specify a demand, all build agent types are scaled and one of them randomly picks up the waiting job. The others are scaled even though there is no work left for them.

This behaviour is documented in the docs (Note: If more than one scaling definition is able to fulfill the demands of the job then they will both spin up an agent.).

Wouldn't it be great to add a configuration option to only scale up if the demands configured in a trigger are all requested by the job?

I think the logic could be implemented here: https://github.com/kedacore/keda/blob/main/pkg/scalers/azure_pipelines_scaler.go#L367

Just check if every demand from the trigger is requested by the job.
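To make that concrete, here is a minimal, self-contained Go sketch of the check. This is not the actual scaler code: the function names, the demand slices, and the handling of an optional " -equals " suffix are assumptions for illustration.

package main

import (
    "fmt"
    "strings"
)

// demandName strips an optional " -equals <value>" suffix so demands are
// compared by name only, e.g. "cap-4gb -equals true" -> "cap-4gb".
func demandName(demand string) string {
    return strings.TrimSpace(strings.Split(demand, " -equals ")[0])
}

// allDemandsMatched reports whether every demand configured on the trigger
// is also requested by the job (the proposed strict matching behaviour).
func allDemandsMatched(triggerDemands, jobDemands []string) bool {
    requested := make(map[string]struct{}, len(jobDemands))
    for _, d := range jobDemands {
        requested[demandName(d)] = struct{}{}
    }
    for _, d := range triggerDemands {
        if _, ok := requested[demandName(d)]; !ok {
            return false
        }
    }
    return true
}

func main() {
    fmt.Println(allDemandsMatched([]string{"cap-4gb"}, []string{"cap-4gb"})) // true
    fmt.Println(allDemandsMatched([]string{"cap-4gb"}, nil))                 // false: the job requests no demands
}

With a strict check like this, a trigger configured with demands: "cap-4gb" would only count jobs whose demands include cap-4gb, so demand-less jobs would no longer scale it up.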

Is this a feature you are interested in implementing yourself?

Yes

Anything else?

An alternative to this would be a configuration flag like matchEmptyDemands: false (with a default value of true), which only checks whether the job requests anything as a demand. GitLab runners allow this with a flag called --run-untagged=false.

This way we would be able to provide a "default" ScaledObject without any demands.

Our use case would be fulfilled by this behaviour, and it is probably even easier to implement.
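As a rough illustration of that alternative (matchEmptyDemands is just the name proposed here, not an existing KEDA option), the check would only look at whether the job declares any demand at all:

package main

import "fmt"

// shouldConsiderJob decides whether a trigger counts a waiting job. With the
// proposed matchEmptyDemands=false, only jobs that request at least one
// demand are considered, similar to GitLab's --run-untagged=false.
func shouldConsiderJob(matchEmptyDemands bool, jobDemands []string) bool {
    if matchEmptyDemands {
        return true // default behaviour: also count jobs without any demands
    }
    return len(jobDemands) > 0 // only count jobs that declare at least one demand
}

func main() {
    fmt.Println(shouldConsiderJob(false, nil))                 // false: left for the "default" ScaledObject
    fmt.Println(shouldConsiderJob(false, []string{"cap-4gb"})) // true
}

The idea being that the specialized triggers set the flag to false, while the "default" ScaledObject without demands keeps the default and picks up the demand-less jobs.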

WDYT?

@patst added the feature-request and needs-discussion labels Jan 18, 2023
@JorTurFer
Member

Hi @patst,
recapping: you would like to have another parameter like requireDemands and use it to scale ONLY if the pipeline has that requirement, to avoid cases where this agent is scaled when a "regular" job is scheduled, right?
This totally makes sense to me, but IDK if I'd do it with another parameter or just with a boolean like forDemandsOnly or similar, to reuse the same values given for demands (to reduce the possibilities of misconfiguration).
WDYT?
I see you are open to contributing this improvement; I can assign this issue to you if you are still open to it.

@patst
Contributor Author

patst commented Jan 24, 2023

@JorTurFer thanks for your reply.
Your summary is correct. To keep it simple:

  • I would go for another boolean parameter (named requireAllDemands)
  • if true: check not only that the trigger fulfills the job's demands, but also that the job requests all demands configured for the trigger
  • change the docs to reflect that behaviour

I will try to come up with a pull request in the next few days.

@JorTurFer
Member

Nice!
I'll assign this issue to you. Thanks a lot! ❤️

@EugeneLugovtsov
Contributor

@patst @JorTurFer thank you. This feature is very important and useful for us as well.

patst added a commit to patst/keda that referenced this issue Jan 30, 2023
…core#4138

Signed-off-by: Patrick Steinig <patrick.steinig@googlemail.com>
patst added a commit to patst/keda that referenced this issue Feb 6, 2023
…core#4138

Signed-off-by: Patrick Steinig <patrick.steinig@googlemail.com>
patst added a commit to patst/keda that referenced this issue Feb 6, 2023
…core#4138

Signed-off-by: Patrick Steinig <patrick.steinig@googlemail.com>
Signed-off-by: patst <patrick.steinig@googlemail.com>
patst added a commit to patst/keda that referenced this issue Feb 6, 2023
…core#4138

Signed-off-by: Patrick Steinig <patrick.steinig@googlemail.com>
Signed-off-by: patst <patrick.steinig@googlemail.com>
patst added a commit to patst/keda that referenced this issue Feb 7, 2023
…core#4138

Signed-off-by: Patrick Steinig <patrick.steinig@googlemail.com>
Signed-off-by: patst <patrick.steinig@googlemail.com>
@Eldarrin
Contributor

Eldarrin commented Feb 8, 2023

Hi @patst,

Could you look at how this intersects with #4195?

Cheers, Andy

@patst
Contributor Author

patst commented Feb 8, 2023

@Eldarrin I tried to add my understanding to #4195. Hope that helps. If there are any additions to the proposed solution here, feel free to add them.

JorTurFer pushed a commit that referenced this issue Feb 8, 2023
#4203)

Co-authored-by: Jorge Turrado Ferrero <Jorge_turrado@hotmail.es>
Co-authored-by: Patrick <patrick.steinig@hdi.de>
Fixes #4138