
[WIP] Manually bump to 1.36.0 #3060

Open · wants to merge 10 commits into main
Conversation

Member

@pierDipi pierDipi commented Nov 27, 2024

When we bump the metadata for a major version, images are not yet available for SO components on
the Konflux registry, so we use the :latest tag temporarily by passing "true" as an argument to
the various latest_* functions.

Eventually, once images are available, SHAs will be replaced/used.
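For context, the tag-or-digest resolution that this flag controls can be sketched like so. This is a hypothetical reconstruction of image_with_sha and its return_input_on_empty flag, pieced together from the traces later in this thread, not the repo's exact code:

```shell
#!/bin/sh
# Hypothetical sketch of image_with_sha as seen in the traces in this thread:
# resolve a tag to a digest-pinned reference, optionally returning the input
# unchanged when the tag cannot be resolved (return_input_on_empty=true).
image_with_sha() {
  image="${1:?image reference required}"
  return_input_on_empty="${2:-false}"
  digest=$(skopeo inspect --no-tags=true "docker://${image}" 2>/dev/null | jq -r .Digest)
  if [ -z "${digest}" ] || [ "${digest}" = "null" ]; then
    if [ "${return_input_on_empty}" = "true" ]; then
      # The ":latest" bump path: keep the floating tag until images exist.
      echo "${image}"
    else
      echo ""
    fi
    return 0
  fi
  # Strip the tag and pin the reference to the digest instead.
  echo "${image%:*}@${digest}"
}
```

Passing "true" is what the description means by using the :latest tag temporarily; once the Konflux images exist, the skopeo call succeeds and the SHA replaces the tag.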

Part of #2989

Signed-off-by: Pierangelo Di Pilato <[email protected]>
Contributor

openshift-ci bot commented Nov 27, 2024

Skipping CI for Draft Pull Request.
If you want CI signal for your change, please convert it to an actual PR.
You can still manually trigger a test run with /test all

Comment on lines 13 to 14
registry.ci.openshift.org/knative/release-1.35.0:serverless-bundle \
>> /configs/index.yaml
Member Author

@mgencur this is the issue: SERVERLESS_BUNDLE in dockerfile.sh is empty

Member Author

++++ skopeo inspect --no-tags=true docker://quay.io/redhat-user-workloads/ocp-serverless-tenant/serverless-operator-136/serverless-bundle:latest
++++ jq -r .Digest
FATA[0001] Error parsing image name "docker://quay.io/redhat-user-workloads/ocp-serverless-tenant/serverless-operator-136/serverless-bundle:latest": reading manifest latest in quay.io/redhat-user-workloads/ocp-serverless-tenant/serverless-operator-136/serverless-bundle: unauthorized: access to the requested resource is not authorized 
++++ echo ''
+++ digest=
+++ '[' '' = '' ']'
+++ '[' false = true ']'
+++ echo ''
+++ return
++ image=
++ [[ '' == '' ]]
+++ image_with_sha registry.ci.openshift.org/knative/serverless-bundle:release-1.36.0
+++ image=registry.ci.openshift.org/knative/serverless-bundle:release-1.36.0
+++ return_input_on_empty=false
++++ skopeo inspect --no-tags=true docker://registry.ci.openshift.org/knative/serverless-bundle:release-1.36.0
++++ jq -r .Digest
FATA[0000] Error parsing image name "docker://registry.ci.openshift.org/knative/serverless-bundle:release-1.36.0": reading manifest release-1.36.0 in registry.ci.openshift.org/knative/serverless-bundle: manifest unknown 

It can't find either of:
quay.io/redhat-user-workloads/ocp-serverless-tenant/serverless-operator-136/serverless-bundle
registry.ci.openshift.org/knative/serverless-bundle:release-1.36.0

(expected)

Member Author

I'm wondering if we also need to fall back to knative-main in get_bundle_for_version when not even the CI image registry.ci.openshift.org/knative/serverless-bundle:release-1.36.0 is available?

  # As a backup, try also CI registry.
  if [[ "${image}" == "" ]]; then
    image=$(image_with_sha "registry.ci.openshift.org/knative/serverless-bundle:release-${version}")
  fi
  if [[ "${image}" == "" ]]; then
    image=$(image_with_sha "registry.ci.openshift.org/knative/serverless-bundle:knative-main")
  fi
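The fallback chain could also be factored generically. A minimal sketch (resolve_first_available is a hypothetical helper, not the repo's code) that tries candidates in order and returns the first one a resolver can pin:

```shell
#!/bin/sh
# Hypothetical sketch: try candidate bundle references in order and print the
# first one the resolver (e.g. image_with_sha) can turn into a pinned image.
resolve_first_available() {
  resolver="${1:?resolver command required}"
  shift
  for candidate in "$@"; do
    image=$("${resolver}" "${candidate}") || image=""
    if [ -n "${image}" ]; then
      echo "${image}"
      return 0
    fi
  done
  # Nothing resolved; let the caller decide how to fail.
  return 1
}
```

In get_bundle_for_version this would be invoked with the Konflux :latest reference first, then registry.ci.openshift.org/knative/serverless-bundle:release-${version}, and finally the knative-main fallback proposed above.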

Contributor

This will fix it for the Dockerfile: https://gist.github.com/mgencur/5316c4655abcb92e28a3455ee02a0b5f
But I'm still checking whether the final catalog can be generated properly if the latest bundle doesn't exist yet.

Contributor

My solution can't build the catalog properly because it requires the image to exist:

INFO    16:41:10.114 Generating catalog for OCP 4.14
INFO[0000] rendering index "registry.redhat.io/redhat/redhat-operator-index:v4.14" as file-based catalog 
INFO[0404] wrote rendered file-based catalog to "/tmp/knative.JC1YvFt4/tmp.hVU6K2GqdC" 
FATA[0002] Error parsing image name "docker://quay.io/redhat-user-workloads/ocp-serverless-tenant/serverless-operator-136/serverless-bundle:latest": reading manifest latest in quay.io/redhat-user-workloads/ocp-serverless-tenant/serverless-operator-136/serverless-bundle: unauthorized: access to the requested resource is not authorized 
2024/11/27 16:49:45 render reference "quay.io/redhat-user-workloads/ocp-serverless-tenant/serverless-operator-136/serverless-bundle:latest": error resolving name for image ref quay.io/redhat-user-workloads/ocp-serverless-tenant/serverless-operator-136/serverless-bundle:latest: unexpected status from HEAD request to https://quay.io/v2/redhat-user-workloads/ocp-serverless-tenant/serverless-operator-136/serverless-bundle/manifests/latest: 401 UNAUTHORIZED
ERROR   16:49:45.652 🚨 Error (code: 1) occurred at ./hack/generate/catalog.sh:49, with command: opm alpha render-template basic --migrate-level="$level" "${catalog_template}" -oyaml > "${index_dir}/v${ocp_version}/catalog/serverless-operator/catalog.yaml"

Member Author

Maybe if it doesn't, we can keep it that way for a few days until the new bundle is in there; the new bundle will be available as soon as we merge PRs like this one.
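One way to keep it that way for a few days without breaking the build would be to guard the render step on bundle availability. A sketch only: render_catalog_if_bundle_exists is a hypothetical name, and the real invocation in hack/generate/catalog.sh also passes --migrate-level, omitted here:

```shell
#!/bin/sh
# Hypothetical guard: only run the opm render when the bundle image actually
# exists; otherwise skip and leave the previous catalog in place.
render_catalog_if_bundle_exists() {
  bundle="${1:?bundle image required}"
  catalog_template="${2:?catalog template required}"
  out="${3:?output path required}"
  if ! skopeo inspect --no-tags=true "docker://${bundle}" >/dev/null 2>&1; then
    echo "bundle ${bundle} not published yet, skipping catalog render" >&2
    return 0
  fi
  # Real script also passes --migrate-level="$level"; omitted in this sketch.
  opm alpha render-template basic "${catalog_template}" -oyaml > "${out}"
}
```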

Contributor

So, yeah. We could fall back to registry.ci.openshift.org/knative/serverless-bundle:knative-main

@pierDipi pierDipi requested review from mgencur and creydr November 27, 2024 16:21
Signed-off-by: Pierangelo Di Pilato <[email protected]>
hack/generate/catalog.sh (outdated review thread, resolved)
@pierDipi pierDipi changed the title [WIP] Manually bump to 1.36.0 Manually bump to 1.36.0 Nov 27, 2024
@pierDipi pierDipi marked this pull request as ready for review November 27, 2024 16:55
@openshift-ci openshift-ci bot requested a review from aliok November 27, 2024 16:55
# Also make sure to update values under `olm.previous` by copying from `olm.replaces` and `olm.skipRange`.
version: 1.35.0
name: serverless-operator
# When bumping the Operator to a new version (major and minor), make sure to also update
Contributor

Nit: not sure the formatting change is intentional?

Member Author

The command go run ./hack/cmd/bumpso/bumpso.go --branch release-1.35 decided the format for some reason.

@mgencur
Contributor

mgencur commented Nov 27, 2024

/lgtm

@openshift-ci-robot

/retest-required

Remaining retests: 0 against base HEAD 511168a and 2 for PR HEAD 2b9b69a in total

7 similar comments

@openshift-ci-robot

/retest-required

Remaining retests: 0 against base HEAD baf5e6a and 2 for PR HEAD 2b9b69a in total

@mgencur
Contributor

mgencur commented Nov 28, 2024

This needs rebase now.

@mgencur
Contributor

mgencur commented Nov 28, 2024

/lgtm

@openshift-ci openshift-ci bot added the lgtm label Nov 28, 2024
@openshift-ci openshift-ci bot removed the lgtm label Nov 28, 2024
@pierDipi pierDipi requested a review from mgencur November 28, 2024 11:49
@pierDipi
Member Author

Since the other bot PR conflicts, I'm adjusting the release merge order and removing the conflicting commits from bots.

@mgencur
Contributor

mgencur commented Nov 28, 2024

/lgtm

@openshift-ci openshift-ci bot added the lgtm label Nov 28, 2024
Contributor

openshift-ci bot commented Nov 28, 2024

[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: mgencur, pierDipi

The full list of commands accepted by this bot can be found here.

The pull request process is described here

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@pierDipi
Member Author

cp: cannot stat '/go/src/github.com/openshift-knative/serverless-operator//go/src/github.com/openshift-knative/serverless-operator/olm-catalog/serverless-operator-index/Dockerfile': No such file or directory
13:13:20.223 ERROR:   🚨 Error (code: 1) occurred at ./hack/lib/catalogsource.bash:217, with command: tmp_dockerfile=$(replace_images "${from_dir}/${dockerfile_path}")

Signed-off-by: Pierangelo Di Pilato <[email protected]>
Contributor

openshift-ci bot commented Nov 28, 2024

New changes are detected. LGTM label has been removed.

@openshift-ci openshift-ci bot removed the lgtm label Nov 28, 2024
…sting

Replacing as done before doesn't respect per-component versioning, and we
don't want to add the prerequisite that every component is ready to bump
operator metadata.

Signed-off-by: Pierangelo Di Pilato <[email protected]>
@pierDipi
Member Author

Upgrade test failed to install the previous version:

             "conditions": [
                    {
                        "lastTransitionTime": "2024-11-29T12:37:37Z",
                        "lastUpdateTime": "2024-11-29T12:37:37Z",
                        "message": "requirements not yet checked",
                        "phase": "Pending",
                        "reason": "RequirementsUnknown"
                    },
                    {
                        "lastTransitionTime": "2024-11-29T12:37:37Z",
                        "lastUpdateTime": "2024-11-29T12:37:37Z",
                        "message": "one or more requirements couldn't be found",
                        "phase": "Pending",
                        "reason": "RequirementsNotMet"
                    },
                    {
                        "lastTransitionTime": "2024-11-29T12:37:40Z",
                        "lastUpdateTime": "2024-11-29T12:37:40Z",
                        "message": "all requirements found, attempting install",
                        "phase": "InstallReady",
                        "reason": "AllRequirementsMet"
                    },
                    {
                        "lastTransitionTime": "2024-11-29T12:37:41Z",
                        "lastUpdateTime": "2024-11-29T12:37:41Z",
                        "message": "waiting for install components to report healthy",
                        "phase": "Installing",
                        "reason": "InstallSucceeded"
                    },
                    {
                        "lastTransitionTime": "2024-11-29T12:37:41Z",
                        "lastUpdateTime": "2024-11-29T12:37:42Z",
                        "message": "installing: waiting for deployment knative-operator-webhook to become ready: deployment \"knative-operator-webhook\" not available: Deployment does not have minimum availability.",
                        "phase": "Installing",
                        "reason": "InstallWaiting"
                    },
                    {
                        "lastTransitionTime": "2024-11-29T12:42:41Z",
                        "lastUpdateTime": "2024-11-29T12:42:41Z",
                        "message": "install timeout",
                        "phase": "Failed",
                        "reason": "InstallCheckFailed"
                    },
                    {
                        "lastTransitionTime": "2024-11-29T12:42:42Z",
                        "lastUpdateTime": "2024-11-29T12:42:42Z",
                        "message": "installing: waiting for deployment knative-operator-webhook to become ready: deployment \"knative-operator-webhook\" not available: Deployment does not have minimum availability.",
                        "phase": "Pending",
                        "reason": "NeedsReinstall"
                    },
                    {
                        "lastTransitionTime": "2024-11-29T12:42:42Z",
                        "lastUpdateTime": "2024-11-29T12:42:42Z",
                        "message": "all requirements found, attempting install",
                        "phase": "InstallReady",
                        "reason": "AllRequirementsMet"
                    },
                    {
                        "lastTransitionTime": "2024-11-29T12:42:43Z",
                        "lastUpdateTime": "2024-11-29T12:42:43Z",
                        "message": "waiting for install components to report healthy",
                        "phase": "Installing",
                        "reason": "InstallSucceeded"
                    },
                    {
                        "lastTransitionTime": "2024-11-29T12:42:43Z",
                        "lastUpdateTime": "2024-11-29T12:42:44Z",
                        "message": "installing: waiting for deployment knative-operator-webhook to become ready: deployment \"knative-operator-webhook\" not available: Deployment does not have minimum availability.",
                        "phase": "Installing",
                        "reason": "InstallWaiting"
                    },
                    {
                        "lastTransitionTime": "2024-11-29T12:47:42Z",
                        "lastUpdateTime": "2024-11-29T12:47:42Z",
                        "message": "install failed: deployment knative-operator-webhook not ready before timeout: deployment \"knative-operator-webhook\" exceeded its progress deadline",
                        "phase": "Failed",
                        "reason": "InstallCheckFailed"
                    }

@pierDipi
Member Author

                "containerStatuses": [
                    {
                        "image": "registry.redhat.io/openshift-serverless-1/serverless-openshift-kn-rhel8-operator@sha256:d246f92cd503a276324159252c4856c25ae84f156c9307da84cf683e52a64e9e",
                        "imageID": "",
                        "lastState": {},
                        "name": "knative-operator",
                        "ready": false,
                        "restartCount": 0,
                        "started": false,
                        "state": {
                            "waiting": {
                                "message": "Back-off pulling image \"registry.redhat.io/openshift-serverless-1/serverless-openshift-kn-rhel8-operator@sha256:d246f92cd503a276324159252c4856c25ae84f156c9307da84cf683e52a64e9e\"",
                                "reason": "ImagePullBackOff"
                            }
                        }
                    }
                ],

@creydr
Member

creydr commented Nov 29, 2024

ICSP seems to be OK:

- mirrors: ["quay.io/redhat-user-workloads/ocp-serverless-tenant/serverless-operator-135/serverless-openshift-kn-operator"]
source: "registry.redhat.io/openshift-serverless-1/serverless-openshift-kn-rhel8-operator"

But image is missing in quay:

docker pull quay.io/redhat-user-workloads/ocp-serverless-tenant/serverless-operator-135/serverless-openshift-kn-operator@sha256:d246f92cd503a276324159252c4856c25ae84f156c9307da84cf683e52a64e9e
Error response from daemon: manifest for quay.io/redhat-user-workloads/ocp-serverless-tenant/serverless-operator-135/serverless-openshift-kn-operator@sha256:d246f92cd503a276324159252c4856c25ae84f156c9307da84cf683e52a64e9e not found: manifest unknown: manifest unknown

@pierDipi
Member Author

The problem is: the previous version is using the official images and we don't apply the ImageContentSourcePolicy, and I'm not sure how we need to configure the ImageContentSourcePolicy to handle multiple versions for the same "source".

@creydr
Member

creydr commented Nov 29, 2024

The problem is: the previous version is using the official images and we don't apply the ImageContentSourcePolicy, and I'm not sure how we need to configure the ImageContentSourcePolicy to handle multiple versions for the same "source".

Not sure I get the first part, but regarding "handling multiple versions for a source": mirrors in the ICSP is an array.
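For illustration, a mirror entry covering both release streams might look like this. A sketch only: the policy name is hypothetical, and everything beyond the 1.35 source/mirror pair quoted above is an assumption:

```yaml
apiVersion: operator.openshift.io/v1alpha1
kind: ImageContentSourcePolicy
metadata:
  name: serverless-operator-mirrors   # hypothetical name
spec:
  repositoryDigestMirrors:
  - source: registry.redhat.io/openshift-serverless-1/serverless-openshift-kn-rhel8-operator
    mirrors:
    # Several mirrors can back one source; they are tried in order before
    # falling back to the source registry itself.
    - quay.io/redhat-user-workloads/ocp-serverless-tenant/serverless-operator-135/serverless-openshift-kn-operator
    - quay.io/redhat-user-workloads/ocp-serverless-tenant/serverless-operator-136/serverless-openshift-kn-operator
```

That way a digest published only under the 136 repository would still be found when pulling the registry.redhat.io reference.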

@openshift-merge-robot
Contributor

PR needs rebase.

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes-sigs/prow repository.

@pierDipi pierDipi changed the title Manually bump to 1.36.0 [WIP] Manually bump to 1.36.0 Dec 13, 2024
Signed-off-by: Pierangelo Di Pilato <[email protected]>
Contributor

openshift-ci bot commented Jan 15, 2025

@pierDipi: The following tests failed, say /retest to rerun all failed tests or /retest-required to rerun all mandatory failed tests:

Test name                          Commit   Details  Required  Rerun command
ci/prow/417-test-upgrade-aws-417   af4586a  link     true      /test 417-test-upgrade-aws-417
ci/prow/417-upstream-e2e           0c609d4  link     false     /test 417-upstream-e2e
ci/prow/417-upstream-e2e-kafka     0c609d4  link     false     /test 417-upstream-e2e-kafka
ci/prow/417-images                 0c609d4  link     true      /test 417-images
ci/prow/417-test-upgrade           0c609d4  link     true      /test 417-test-upgrade
ci/prow/417-operator-e2e           0c609d4  link     true      /test 417-operator-e2e

Full PR test history. Your PR dashboard.


5 participants