
SubnetManager should use the main context #1867

Merged
merged 1 commit from useContext into flannel-io:master on Feb 13, 2024

Conversation

@manuelbuil (Collaborator)

Description

For some unknown reason, subnetManager was using a different context from the main one, so the cancel() method that cleans up flannel resources would never take effect in this case.

This PR fixes that: subnetManager will now use the main context.
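For illustration, here is a minimal sketch of the intended wiring, with hypothetical names (subnetManager and newSubnetManager are stand-ins, not flannel's actual API): the cancellable context created in main must be the one handed down, otherwise cancel() fires on a context nobody is watching.

```go
package main

import (
	"context"
	"os"
	"os/signal"
	"syscall"
)

// subnetManager is a stand-in for flannel's subnet manager type.
type subnetManager struct{}

func newSubnetManager() *subnetManager { return &subnetManager{} }

// Run blocks until the main context is cancelled, at which point
// deferred teardown of flannel resources can actually happen.
func (s *subnetManager) Run(ctx context.Context) {
	<-ctx.Done()
}

func main() {
	// The single main context: cancel() must reach everything that
	// holds flannel resources, including the subnet manager.
	ctx, cancel := context.WithCancel(context.Background())
	defer cancel()

	// Cancel on SIGINT/SIGTERM so cleanup actually runs.
	sigs := make(chan os.Signal, 1)
	signal.Notify(sigs, syscall.SIGINT, syscall.SIGTERM)
	go func() {
		<-sigs
		cancel()
	}()

	// The bug: creating a second context here instead of passing
	// ctx down meant cancel() never reached the subnet manager.
	sm := newSubnetManager()
	sm.Run(ctx)
}
```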

Todos

  • Tests
  • Documentation
  • Release note

Release Note

None required

@thomasferrandiz (Contributor)

I just noticed that this context is wrong as well:

node, err := ksm.client.CoreV1().Nodes().Get(context.TODO(), ksm.nodeName, metav1.GetOptions{})

context.TODO is only meant as a placeholder during development.
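For reference, the Go standard library draws exactly this distinction between the two constructors; a small illustration of the stdlib semantics (this is generic Go, not flannel code):

```go
package main

import "context"

func main() {
	// context.TODO: per the Go docs, a placeholder for code that is
	// not yet plumbed to receive a Context; it flags unfinished wiring.
	todo := context.TODO()

	// context.Background: the root of a context tree, intended for
	// top-level use (main, init, tests), not as a stand-in deeper down.
	root := context.Background()

	_, _ = todo, root
}
```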

@manuelbuil (Collaborator, Author)

> I just noticed that this context is wrong as well:
>
> node, err := ksm.client.CoreV1().Nodes().Get(context.TODO(), ksm.nodeName, metav1.GetOptions{})
>
> context.TODO is only meant as a placeholder during development.

Should it be context.Background()? I wonder if we could do something to pipe the main context through.

@thomasferrandiz (Contributor)

I think we can pass the main context through, because GetStoredMacAddress is called in RegisterNetwork, where the context is available.
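A sketch of the suggested plumbing, with illustrative signatures (kubeSubnetManager, GetStoredMacAddress, and RegisterNetwork here are simplified stand-ins for flannel's actual types, and the annotation key is an assumption): since RegisterNetwork already receives the main ctx, it can thread it into GetStoredMacAddress, replacing context.TODO() in the client call.

```go
package subnet

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
)

// kubeSubnetManager is a simplified stand-in for flannel's ksm type.
type kubeSubnetManager struct {
	client   kubernetes.Interface
	nodeName string
}

// GetStoredMacAddress takes the caller's ctx instead of manufacturing
// its own with context.TODO(), so cancellation reaches the API call.
func (ksm *kubeSubnetManager) GetStoredMacAddress(ctx context.Context) (string, error) {
	node, err := ksm.client.CoreV1().Nodes().Get(ctx, ksm.nodeName, metav1.GetOptions{})
	if err != nil {
		return "", fmt.Errorf("failed to get node %q: %w", ksm.nodeName, err)
	}
	// Annotation key is illustrative; flannel stores backend data in a
	// node annotation and derives the MAC address from it.
	return node.Annotations["flannel.alpha.coreos.com/backend-data"], nil
}

// RegisterNetwork already holds the main ctx and simply passes it down.
func (ksm *kubeSubnetManager) RegisterNetwork(ctx context.Context) error {
	if _, err := ksm.GetStoredMacAddress(ctx); err != nil {
		return err
	}
	return nil
}
```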

Signed-off-by: Manuel Buil <mbuil@suse.com>
@manuelbuil (Collaborator, Author)

> I think we can pass the main context through, because GetStoredMacAddress is called in RegisterNetwork, where the context is available.

done!

@manuelbuil manuelbuil merged commit 6febb7e into flannel-io:master Feb 13, 2024
8 checks passed
@manuelbuil manuelbuil deleted the useContext branch February 13, 2024 17:57