
Synchronize knative manifests v1.16.0 #2917

Merged
merged 6 commits into from
Nov 19, 2024

Conversation

juliusvonkohout
Copy link
Member

@tarekabouzeid lets see what my tests say :-D

@juliusvonkohout
Copy link
Member Author

I get `Error from server (InternalError): error when creating "STDIN": Internal error occurred: failed calling webhook "webhook.serving.knative.dev": failed to call webhook: Post "https://webhook.knative-serving.svc:443/?timeout=10s": dial tcp 10.96.226.205:443: connect: connection refused` in "Build & Apply KServe manifests in KinD / build (pull_request)". I could imagine that the pod is not ready yet, or we might have to apply the manifests twice. Can you try the failing tests locally?
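One way to rule out the "pod not ready" theory before applying the KServe manifests would be to wait on the webhook explicitly (a sketch; the deployment name `webhook` is an assumption based on the default knative-serving install):

```shell
# Block until the knative-serving webhook deployment reports Available,
# so the admission webhook can actually accept connections.
kubectl wait --namespace knative-serving \
  --for=condition=Available deployment/webhook \
  --timeout=120s
```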

I also see:

resource mapping not found for name: "queue-proxy" namespace: "knative-serving" from "STDIN": no matches for kind "Image" in version "caching.internal.knative.dev/v1alpha1" ensure CRDs are installed first
resource mapping not found for name: "routing-serving-certs" namespace: "knative-serving" from "STDIN": no matches for kind "Certificate" in version "networking.internal.knative.dev/v1alpha1" ensure CRDs are installed first

So we should modify https://github.com/kubeflow/manifests/blob/master/tests/gh-actions/install_knative.sh to first extract and install only the CRDs. Then attempt a full installation, which might fail while the pods are still coming up, wait 30 seconds, and try again.

#!/bin/bash
set -euo pipefail
echo "Installing KNative CRDs ..."
# Extract and apply the CRDs first, so the later full apply can resolve
# kinds like Image and Certificate.
kustomize build common/knative/knative-serving/base | yq 'select(.kind == "CustomResourceDefinition")' | kubectl apply -f -

echo "Installing KNative ..."
# The first full apply is allowed to fail while the webhook pods come up.
set +e
kustomize build common/knative/knative-serving/base | kubectl apply -f -
set -e
sleep 30
kustomize build common/knative/knative-serving/base | kubectl apply -f -

But maybe it is just easier to have a for loop with three iterations and a 30-second sleep in between.
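The three-iteration loop could look like this (a sketch; `retry_apply` is a hypothetical helper, and the real command would be the `kustomize build ... | kubectl apply -f -` pipeline with a 30-second sleep instead of 1):

```shell
# Run a command up to 3 times, sleeping between failed attempts.
# "$@" is the command to run.
retry_apply() {
  for i in 1 2 3; do
    if "$@"; then
      return 0
    fi
    echo "Attempt $i failed, sleeping before retry ..."
    sleep 1  # use 30 in the real workflow
  done
  return 1
}
```

Usage would be something like `retry_apply sh -c 'kustomize build common/knative/knative-serving/base | kubectl apply -f -'`.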

@juliusvonkohout
Copy link
Member Author

juliusvonkohout commented Nov 19, 2024

OK, we need to either install and use a YAML formatter in the synchronize-manifests script or add the knative upstream to the exceptions in the yamllint workflow.

Also, GitHub seems to be having issues pulling the jupyterlab image again, I guess ...

@juliusvonkohout
Copy link
Member Author

If it fails again due to GitHub image pull slowness, here is proof that it works and we should merge anyway: https://github.com/kubeflow/manifests/actions/runs/11916076734

Signed-off-by: juliusvonkohout <[email protected]>
@juliusvonkohout
Copy link
Member Author

/approve

Copy link

[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: juliusvonkohout

The full list of commands accepted by this bot can be found here.

The pull request process is described here

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@tarekabouzeid
Copy link
Member

great work @juliusvonkohout !

@tarekabouzeid
Copy link
Member

/lgtm

@google-oss-prow google-oss-prow bot added the lgtm label Nov 19, 2024
@google-oss-prow google-oss-prow bot merged commit 9026318 into master Nov 19, 2024
14 checks passed
@juliusvonkohout juliusvonkohout deleted the synchronize-knative-manifests-v1.16.0 branch November 19, 2024 17:05