
BUG: "system:serviceaccount:addon-manager-system:default" cannot list resource "workflows" in API group "argoproj.io" at the cluster scope #146

Closed
kevdowney opened this issue Jun 3, 2022 · 4 comments · Fixed by #147

@kevdowney
Collaborator

Is this a BUG REPORT or FEATURE REQUEST?:
BUG
What happened:

E0602 23:56:40.972218       1 reflector.go:138] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167: Failed to watch *v1alpha1.Workflow: failed to list *v1alpha1.Workflow: workflows.argoproj.io is forbidden: User "system:serviceaccount:addon-manager-system:default" cannot list resource "workflows" in API group "argoproj.io" at the cluster scope
E0602 23:57:21.057596       1 reflector.go:138] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167: Failed to watch *v1alpha1.Workflow: failed to list *v1alpha1.Workflow: workflows.argoproj.io is forbidden: User "system:serviceaccount:addon-manager-system:default" cannot list resource "workflows" in API group "argoproj.io" at the cluster scope
E0602 23:58:13.462508       1 reflector.go:138] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167: Failed to watch *v1alpha1.Workflow: failed to list *v1alpha1.Workflow: workflows.argoproj.io is forbidden: User "system:serviceaccount:addon-manager-system:default" cannot list resource "workflows" in API group "argoproj.io" at the cluster scope
{"level":"error","ts":1654214325.9899297,"logger":"controller-runtime.manager.controller.addon-manager-wf-controller","msg":"Could not wait for Cache to sync","error":"failed to wait for addon-manager-wf-controller caches to sync: timed out waiting for cache to be synced","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2\n\t/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:195\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start\n\t/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:221\nsigs.k8s.io/controller-runtime/pkg/manager.(*controllerManager).startRunnable.func1\n\t/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/manager/internal.go:696"}
{"level":"error","ts":1654214325.990036,"logger":"controller-runtime.manager.controller.addon-manager-controller","msg":"Could not wait for Cache to sync","error":"failed to wait for addon-manager-controller caches to sync: timed out waiting for cache to be synced","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2\n\t/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:195\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start\n\t/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:221\nsigs.k8s.io/controller-runtime/pkg/manager.(*controllerManager).startRunnable.func1\n\t/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/manager/internal.go:696"}
{"level":"error","ts":1654214325.9905198,"logger":"controller-runtime.manager","msg":"error received after stop sequence was engaged","error":"failed to wait for addon-manager-controller caches to sync: timed out waiting for cache to be synced"}
{"level":"error","ts":1654214325.9906044,"logger":"setup","msg":"problem running manager","error":"failed to wait for addon-manager-wf-controller caches to sync: timed out waiting for cache to be synced","stacktrace":"main.main\n\t/workspace/main.go:71\nruntime.main\n\t/usr/local/go/src/runtime/proc.go:255"}

What you expected to happen:
This is a regression introduced when we limited the Argo Workflow RBAC to namespace scope: https://github.com/keikoproj/addon-manager/pull/73/files#diff-2df6a56a33260adb76f704ff7c6737be9f43c13fe523521ee76390ff2a9ca4b8R151
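
For reference, restoring cluster-scope access amounts to granting the service account list/watch on workflows.argoproj.io via a ClusterRole and ClusterRoleBinding. The sketch below is illustrative only (object names are hypothetical); the actual change is in #147:

```yaml
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRole
metadata:
  name: addon-manager-workflow-view   # hypothetical name
rules:
  - apiGroups: ["argoproj.io"]
    resources: ["workflows"]
    verbs: ["get", "list", "watch"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding
metadata:
  name: addon-manager-workflow-view   # hypothetical name
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: ClusterRole
  name: addon-manager-workflow-view
subjects:
  - kind: ServiceAccount
    name: default                     # service account from the error above
    namespace: addon-manager-system
```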

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

Environment:

  • Addon Manager version
  • Kubernetes version:
$ kubectl version -o yaml

Other debugging information (if applicable):

  • Addon status:
$ kubectl describe addon <addon-name>
  • controller logs:
$ kubectl logs <addon-manager-pod>
@kevdowney kevdowney added the bug Something isn't working label Jun 3, 2022
@kevdowney kevdowney self-assigned this Jun 3, 2022
@ccfishk
Contributor

ccfishk commented Jun 3, 2022

This issue is a duplicate of #131.

@kevdowney
Collaborator Author

> This issue is a duplicate of #131.

That's not a duplicate; #131 proposes a different design that restricts watches to a single namespace, which would also solve this, but I'm not sure we want to wait for it since it's a breaking design change. We can fix this quickly now and do #131 later.
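
For context, the namespace-only design in #131 would roughly correspond to scoping the controller-runtime manager's cache to a single namespace, so that namespaced Role/RoleBinding permissions suffice. A minimal sketch, assuming controller-runtime v0.10.x and a hypothetical target namespace:

```go
package main

import (
	"os"

	ctrl "sigs.k8s.io/controller-runtime"
	"sigs.k8s.io/controller-runtime/pkg/log/zap"
)

func main() {
	ctrl.SetLogger(zap.New())

	// Restrict the manager's cache (and therefore all informer watches,
	// including the Workflow informer) to one namespace instead of cluster scope.
	mgr, err := ctrl.NewManager(ctrl.GetConfigOrDie(), ctrl.Options{
		Namespace: "addon-manager-system", // hypothetical namespace
	})
	if err != nil {
		ctrl.Log.Error(err, "unable to create manager")
		os.Exit(1)
	}

	// Controllers registered with mgr here would only watch the namespace above.

	if err := mgr.Start(ctrl.SetupSignalHandler()); err != nil {
		ctrl.Log.Error(err, "problem running manager")
		os.Exit(1)
	}
}
```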

@ccfishk
Contributor

ccfishk commented Jun 3, 2022

#131 directly addresses the failed-to-watch-namespace error seen during the operation stage.

The error message "Failed to watch *v1alpha1.Workflow ..." is expected based on the current implementation scope and BDD tests.

@kevdowney
Collaborator Author

The bug fix has been submitted in #147.

Since #131 is a significant design change, it should be done later as a separate version rather than as a patch fix.
