
K8s 1.22 compatibility : v1beta1 CRD no longer understood #882

Open
fondemen opened this issue Aug 24, 2021 · 5 comments

@fondemen

Is this a BUG REPORT or FEATURE REQUEST?:
BUG REPORT

What happened:
After upgrading Kubernetes from 1.21 to 1.22, stork fails to start with the following messages:

time="2021-08-24T09:01:08Z" level=info msg="Creating default CSI SnapshotClasses"
time="2021-08-24T09:01:10Z" level=info msg="Starting stork version 2.6.4-945b41e2"
time="2021-08-24T09:01:10Z" level=info msg="Using driver linstor"
time="2021-08-24T09:01:10Z" level=error msg="failed to retrive applicationbackups crds: the server could not find the requested resource"
time="2021-08-24T09:01:10Z" level=error msg="failed to retrive applicationrestores crds: the server could not find the requested resource"
time="2021-08-24T09:01:10Z" level=error msg="failed to retrive applicationclones crds: the server could not find the requested resource"
I0824 09:01:10.927313 1 leaderelection.go:243] attempting to acquire leader lease linstor/linstor-stork...
I0824 09:01:10.931875 1 leaderelection.go:253] successfully acquired lease linstor/linstor-stork
[DEBUG] curl -X 'GET' -H 'Accept: application/json' 'http://linstor-controller:3370/v1/resource-definitions/pvc-ec2d4744-122b-4566-a721-1a8ca1211679'
time="2021-08-24T09:01:10Z" level=fatal msg="Error initializing rule: failed to create CRD due to: the server could not find the requested resource"

Looking at the code, I suspect this is because the v1beta1 version of the CRD API has now been removed, while the code still uses it:

crd, err := client.ApiextensionsV1beta1().CustomResourceDefinitions().Get(context.TODO(), crdName, metav1.GetOptions{})

What you expected to happen:
Check the K8s version before attempting to manipulate CRDs, or try the v1 client first and only fall back to the v1beta1 client on failure, so that stork can start.
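
As an illustration only (this is not stork's actual code; the package and function names are made up for the example), a minimal sketch of such a v1-first lookup with a v1beta1 fallback, assuming an already constructed apiextensions clientset:

package crdcheck

import (
    "context"

    apiextensionsclient "k8s.io/apiextensions-apiserver/pkg/client/clientset/clientset"
    metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
)

// getCRD looks up a CRD via the apiextensions.k8s.io/v1 API first and only
// falls back to the deprecated v1beta1 API if v1 is not served.
func getCRD(client apiextensionsclient.Interface, crdName string) error {
    _, err := client.ApiextensionsV1().CustomResourceDefinitions().Get(context.TODO(), crdName, metav1.GetOptions{})
    if err == nil {
        return nil
    }
    // v1 lookup failed; retry with v1beta1 for older clusters (pre-1.16) that do not serve v1.
    if _, errBeta := client.ApiextensionsV1beta1().CustomResourceDefinitions().Get(context.TODO(), crdName, metav1.GetOptions{}); errBeta == nil {
        return nil
    }
    return err
}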

How to reproduce it (as minimally and precisely as possible):
Try on a 1.22 K8s cluster

Anything else we need to know?:

Environment:

  • Kubernetes version (use kubectl version): 1.22.1
  • Cloud provider or hardware configuration: kubeadm (5 nodes, non-ha)
  • OS (e.g. from /etc/os-release): debian buster
  • Kernel (e.g. uname -a): 4.19.0-17-amd64
  • Install tools: apt
  • Others:
@adityadani (Contributor) commented Aug 24, 2021

Thanks for reporting this issue. We have already started the effort to support STORK on k8s 1.22.
PR: #881

@nabiltntn

@adityadani Could you please explain how to get the CRD manifests in the stork code base? I was not able to find them.
Thanks

@ram-infrac (Contributor) commented Jun 7, 2022

Hi @nabiltntn, you will need to use stork 2.7.0+, which has k8s 1.22+ support: https://github.com/libopenstorage/stork/releases/tag/v2.7.0
We have also started work on the transition to defined CRD schemas for 1.22+.

@nabiltntn

@ram-infrac thank you for this reply.
Using version 2.7.0, the generated CRDs are created with a basic schema:

  - name: v1alpha1
    schema:
      openAPIV3Schema:
        x-kubernetes-preserve-unknown-fields: true
    served: true
    storage: true

Is this the actual definition?

@ram-infrac (Contributor) commented Jun 20, 2022

We register the CRDs when stork gets deployed. If you need the spec details for the CRDs, you can browse the packages at https://github.com/libopenstorage/stork/blob/master/pkg/apis/stork/v1alpha1. For example, the application backup spec can be found at https://github.com/libopenstorage/stork/blob/master/pkg/apis/stork/v1alpha1/applicationbackup.go#L25.
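
For illustration only (this is not stork's actual registration code; the group, kind, and plural names below are assumptions for the example), a sketch of creating a v1 CRD whose schema simply preserves unknown fields, like the generated definition quoted above:

package crdcheck

import (
    "context"

    apiextensionsv1 "k8s.io/apiextensions-apiserver/pkg/apis/apiextensions/v1"
    apiextensionsclient "k8s.io/apiextensions-apiserver/pkg/client/clientset/clientset"
    metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
)

// registerCRD creates an apiextensions.k8s.io/v1 CRD whose v1alpha1 version
// carries only x-kubernetes-preserve-unknown-fields: true.
// Group, kind, and plural names are illustrative, not stork's exact values.
func registerCRD(client apiextensionsclient.Interface) error {
    preserveUnknown := true
    crd := &apiextensionsv1.CustomResourceDefinition{
        ObjectMeta: metav1.ObjectMeta{Name: "applicationbackups.stork.libopenstorage.org"},
        Spec: apiextensionsv1.CustomResourceDefinitionSpec{
            Group: "stork.libopenstorage.org",
            Scope: apiextensionsv1.NamespaceScoped,
            Names: apiextensionsv1.CustomResourceDefinitionNames{
                Plural:   "applicationbackups",
                Singular: "applicationbackup",
                Kind:     "ApplicationBackup",
            },
            Versions: []apiextensionsv1.CustomResourceDefinitionVersion{{
                Name:    "v1alpha1",
                Served:  true,
                Storage: true,
                Schema: &apiextensionsv1.CustomResourceValidation{
                    OpenAPIV3Schema: &apiextensionsv1.JSONSchemaProps{
                        XPreserveUnknownFields: &preserveUnknown,
                    },
                },
            }},
        },
    }
    _, err := client.ApiextensionsV1().CustomResourceDefinitions().Create(context.TODO(), crd, metav1.CreateOptions{})
    return err
}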
