Log: /mnt/jenkins/workspace/cloud-pxc-operator_PR-2473/e2e-tests/logs/monitoring-pmm3-8-0.log
Warning: version difference between client (1.36) and server (1.33) exceeds the supported minor version skew of +/-1
Warning: version difference between client (1.36) and server (1.33) exceeds the supported minor version skew of +/-1
No resources found
+ kubectl patch pxc -n sh --type=merge -p '{"metadata":{"finalizers":[]}}'
error: resource(s) were provided, but no name was specified
No resources found
No resources found
No resources found
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
-----------------------------------------------------------------------------------
cleaned up all old namespaces
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
cleaned up old namespaces pxc-operator
-----------------------------------------------------------------------------------
error: resource(s) were provided, but no name was specified
namespace "pxc-operator" deleted
waiting for namespace/pxc-operator to be deleted
Error from server (NotFound): namespaces "pxc-operator" not found
-----------------------------------------------------------------------------------
create namespace pxc-operator
-----------------------------------------------------------------------------------
namespace/pxc-operator created
Context "gke_cloud-dev-112233_us-central1-a_jen-pxc-2473-6d392bea-4-cluster9" modified.
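For context, the namespace reset seen above is normally produced by kubectl calls along these lines; a minimal sketch, where the --ignore-not-found/--wait flags and the context switch are assumptions rather than commands quoted from this log:

    # Hypothetical reconstruction of the namespace reset; flags are assumptions.
    kubectl delete namespace pxc-operator --ignore-not-found --wait=true
    kubectl create namespace pxc-operator
    # The "Context ... modified." line comes from pointing the kubeconfig context at the new namespace:
    kubectl config set-context --current --namespace=pxc-operator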
-----------------------------------------------------------------------------------
start PXC operator
-----------------------------------------------------------------------------------
customresourcedefinition.apiextensions.k8s.io/perconaxtradbclusterbackups.pxc.percona.com serverside-applied
customresourcedefinition.apiextensions.k8s.io/perconaxtradbclusterrestores.pxc.percona.com serverside-applied
customresourcedefinition.apiextensions.k8s.io/perconaxtradbclusters.pxc.percona.com serverside-applied
clusterrole.rbac.authorization.k8s.io/percona-xtradb-cluster-operator unchanged
serviceaccount/percona-xtradb-cluster-operator created
clusterrolebinding.rbac.authorization.k8s.io/service-account-percona-xtradb-cluster-operator unchanged
deployment.apps/percona-xtradb-cluster-operator created
service/percona-xtradb-cluster-operator created
pod/percona-xtradb-cluster-operator-55d95dc9d8-jxgrf condition met
E0516 23:57:22.610140 5762 reflector.go:227] "Failed to watch" err="Get \"https://34.45.173.175/api/v1/namespaces/pxc-operator/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dpercona-xtradb-cluster-operator-55d95dc9d8-jxgrf&resourceVersion=1778975842162379000&timeoutSeconds=477&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
pod/percona-xtradb-cluster-operator-55d95dc9d8-jxgrf condition met
E0516 23:57:27.295497 6574 reflector.go:227] "Failed to watch" err="Get \"https://34.45.173.175/api/v1/namespaces/pxc-operator/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dpercona-xtradb-cluster-operator-55d95dc9d8-jxgrf&resourceVersion=1778975846327071000&timeoutSeconds=464&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
waiting for pod/percona-xtradb-cluster-operator-55d95dc9d8-jxgrf to become Ready.Ok
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
-----------------------------------------------------------------------------------
cleaned up all old namespaces
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
cleaned up old namespaces monitoring-pmm3-21297
-----------------------------------------------------------------------------------
Error from server (NotFound): namespaces "monitoring-pmm3-21297" not found
waiting for namespace/monitoring-pmm3-21297 to be deleted
error: resource(s) were provided, but no name was specified
Error from server (NotFound): namespaces "monitoring-pmm3-21297" not found
-----------------------------------------------------------------------------------
create namespace monitoring-pmm3-21297
-----------------------------------------------------------------------------------
namespace/monitoring-pmm3-21297 created
Context "gke_cloud-dev-112233_us-central1-a_jen-pxc-2473-6d392bea-4-cluster9" modified.
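The "start PXC operator" block above maps to an apply-and-wait sequence roughly like the following sketch; the deploy/ file names and the pod label selector are assumptions, not taken from this log:

    # Hypothetical reconstruction of the operator bring-up; paths and labels are assumptions.
    kubectl apply --server-side -f deploy/crd.yaml   # CRDs report "serverside-applied" as above
    kubectl apply -f deploy/rbac.yaml                # clusterrole, serviceaccount, clusterrolebinding
    kubectl apply -f deploy/operator.yaml            # operator deployment and service
    kubectl -n pxc-operator wait --for=condition=Ready pod \
      -l app.kubernetes.io/name=percona-xtradb-cluster-operator --timeout=300s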
-----------------------------------------------------------------------------------
create secrets for cloud storages
-----------------------------------------------------------------------------------
secret/minio-secret created
secret/aws-s3-secret created
secret/do-spaces-secret created
secret/gcp-cs-secret created
secret/azure-secret created
"hashicorp" already exists with the same configuration, skipping
"minio" already exists with the same configuration, skipping
Hang tight while we grab the latest from your chart repositories...
...Successfully got an update from the "minio" chart repository
...Successfully got an update from the "chaos-mesh" chart repository
...Successfully got an update from the "percona" chart repository
...Successfully got an update from the "hashicorp" chart repository
Update Complete. ⎈Happy Helming!⎈
-----------------------------------------------------------------------------------
install PMM Server
-----------------------------------------------------------------------------------
Error: uninstall: Release not loaded: monitoring: release: not found
"percona" has been removed from your repositories
"percona" has been added to your repositories
Hang tight while we grab the latest from your chart repositories...
...Successfully got an update from the "minio" chart repository
...Successfully got an update from the "chaos-mesh" chart repository
...Successfully got an update from the "percona" chart repository
...Successfully got an update from the "hashicorp" chart repository
Update Complete. ⎈Happy Helming!⎈
NAME: monitoring
LAST DEPLOYED: Sat May 16 23:58:02 2026
NAMESPACE: monitoring-pmm3-21297
STATUS: deployed
REVISION: 1
TEST SUITE: None
NOTES:
Percona Monitoring and Management (PMM)
An open source database monitoring, observability and management tool
Check more info here: https://docs.percona.com/percona-monitoring-and-management/index.html
Get the application URL:
NOTE: It may take a few minutes for the LoadBalancer IP to be available.
You can watch the status of by running 'kubectl get --namespace monitoring-pmm3-21297 svc -w monitoring-service'
export SERVICE_IP=$(kubectl get svc --namespace monitoring-pmm3-21297 monitoring-service -o jsonpath="{.status.loadBalancer.ingress[0].ip}")
echo https://$SERVICE_IP:
Get password for the "admin" user:
export ADMIN_PASS=$(kubectl get secret pmm-secret --namespace monitoring-pmm3-21297 -o jsonpath='{.data.PMM_ADMIN_PASSWORD}' | base64 --decode)
echo $ADMIN_PASS
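A minimal sketch of how a PMM Server install like the one above is typically driven, assuming the public Percona chart repository and a LoadBalancer service; the chart value is an assumption, not taken from this log:

    # Hedged reconstruction of the "install PMM Server" step.
    helm repo add percona https://percona.github.io/percona-helm-charts/
    helm repo update
    helm uninstall monitoring -n monitoring-pmm3-21297 || true   # explains the "Release not loaded" error above
    helm install monitoring percona/pmm -n monitoring-pmm3-21297 \
      --set service.type=LoadBalancer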
statefulset.apps/monitoring condition met
-----------------------------------------------------------------------------------
create secret
-----------------------------------------------------------------------------------
secret/my-cluster-secrets created
-----------------------------------------------------------------------------------
add PMM3 token to secret
-----------------------------------------------------------------------------------
secret/my-cluster-secrets patched
-----------------------------------------------------------------------------------
create PXC cluster
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
create first PXC cluster
-----------------------------------------------------------------------------------
deployment.apps/pxc-client created
perconaxtradbcluster.pxc.percona.com/monitoring created
-----------------------------------------------------------------------------------
check if all 3 Pods started
-----------------------------------------------------------------------------------
error: no matching resources found
-----------------------------------------------------------------------------------
wait for running cluster
-----------------------------------------------------------------------------------
Error from server (NotFound): pods "monitoring-haproxy-0" not found
waiting for pod/monitoring-haproxy-0 to become Ready..........................................Ok
-----------------------------------------------------------------------------------
wait for running cluster
-----------------------------------------------------------------------------------
pod/monitoring-pxc-0 condition met
E0517 00:01:34.417637 29406 reflector.go:227] "Failed to watch" err="Get \"https://34.45.173.175/api/v1/namespaces/monitoring-pmm3-21297/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dmonitoring-pxc-0&resourceVersion=1778976094051471009&timeoutSeconds=583&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
waiting for pod/monitoring-pxc-0 to become Ready.Ok
pod/monitoring-pxc-1 condition met
waiting for pod/monitoring-pxc-1 to become Ready.Ok
pod/monitoring-pxc-2 condition met
waiting for pod/monitoring-pxc-2 to become Ready.Ok
-----------------------------------------------------------------------------------
write data
-----------------------------------------------------------------------------------
pod/pxc-client-67fc4995bb-chr8g condition met
E0517 00:06:15.585292 29255 reflector.go:227] "Failed to watch" err="Get \"https://34.45.173.175/api/v1/namespaces/monitoring-pmm3-21297/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dpxc-client-67fc4995bb-chr8g&resourceVersion=1778976373724974000&timeoutSeconds=456&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
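The "add PMM3 token to secret" step above presumably patches the operator's user secret with a freshly generated PMM service-account token. A hedged sketch, reusing the SERVICE_IP/ADMIN_PASS commands from the chart NOTES; the secret key name pmmservertoken and the $PMM_TOKEN variable are assumptions:

    # Hypothetical: $PMM_TOKEN is a PMM service-account token requested out of band
    # (logging in as admin with $ADMIN_PASS); that request is not shown here.
    SERVICE_IP=$(kubectl get svc --namespace monitoring-pmm3-21297 monitoring-service \
      -o jsonpath='{.status.loadBalancer.ingress[0].ip}')
    ADMIN_PASS=$(kubectl get secret pmm-secret --namespace monitoring-pmm3-21297 \
      -o jsonpath='{.data.PMM_ADMIN_PASSWORD}' | base64 --decode)
    kubectl patch secret my-cluster-secrets -n monitoring-pmm3-21297 --type merge \
      -p "{\"data\":{\"pmmservertoken\":\"$(echo -n "$PMM_TOKEN" | base64)\"}}"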
waiting for pod/pxc-client-67fc4995bb-chr8g to become Ready
Defaulted container "pxc-client" out of: pxc-client, backup
.Ok
pod/pxc-client-67fc4995bb-chr8g condition met
E0517 00:06:29.130129 31074 reflector.go:227] "Failed to watch" err="Get \"https://34.45.173.175/api/v1/namespaces/monitoring-pmm3-21297/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dpxc-client-67fc4995bb-chr8g&resourceVersion=1778976386154116000&timeoutSeconds=359&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
waiting for pod/pxc-client-67fc4995bb-chr8g to become Ready
Defaulted container "pxc-client" out of: pxc-client, backup
.Ok
pod/pxc-client-67fc4995bb-chr8g condition met
E0517 00:07:09.916069 4521 reflector.go:227] "Failed to watch" err="Get \"https://34.45.173.175/api/v1/namespaces/monitoring-pmm3-21297/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dpxc-client-67fc4995bb-chr8g&resourceVersion=1778976428920559000&timeoutSeconds=341&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
waiting for pod/pxc-client-67fc4995bb-chr8g to become Ready
Defaulted container "pxc-client" out of: pxc-client, backup
.Ok
pod/pxc-client-67fc4995bb-chr8g condition met
E0517 00:07:25.024878 6404 reflector.go:227] "Failed to watch" err="Get \"https://34.45.173.175/api/v1/namespaces/monitoring-pmm3-21297/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dpxc-client-67fc4995bb-chr8g&resourceVersion=1778976442577233000&timeoutSeconds=336&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
waiting for pod/pxc-client-67fc4995bb-chr8g to become Ready
Defaulted container "pxc-client" out of: pxc-client, backup
.Ok
pod/pxc-client-67fc4995bb-chr8g condition met
E0517 00:07:35.726973 8009 reflector.go:227] "Failed to watch" err="Get \"https://34.45.173.175/api/v1/namespaces/monitoring-pmm3-21297/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dpxc-client-67fc4995bb-chr8g&resourceVersion=1778976453797803000&timeoutSeconds=423&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
waiting for pod/pxc-client-67fc4995bb-chr8g to become Ready
Defaulted container "pxc-client" out of: pxc-client, backup
.Ok
Waiting for sts/monitoring-pxc to reach generation 1...
Resource sts/monitoring-pxc has reached generation 1.
Waiting for sts/monitoring-haproxy to reach generation 1...
Resource sts/monitoring-haproxy has reached generation 1.
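The "write data" step drives SQL through the pxc-client deployment, which is why kubectl exec keeps printing the "Defaulted container \"pxc-client\"" notices above. A hypothetical illustration; the database/table names, the credentials variable, and the haproxy host are assumptions:

    # Hypothetical write-data step; names and credentials are placeholders.
    kubectl -n monitoring-pmm3-21297 exec deploy/pxc-client -c pxc-client -- \
      mysql -h monitoring-haproxy -uroot -p"$ROOT_PASSWORD" \
      -e 'CREATE DATABASE IF NOT EXISTS myApp; CREATE TABLE IF NOT EXISTS myApp.t (id INT PRIMARY KEY); INSERT INTO myApp.t VALUES (1)'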
E0517 00:08:01.873936 11539 reflector.go:227] "Failed to watch" err="Get \"https://34.45.173.175/api/v1/namespaces/monitoring-pmm3-21297/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dmonitoring-haproxy-0&resourceVersion=1778976481150084000&timeoutSeconds=470&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
pod/monitoring-haproxy-0 condition met
E0517 00:08:02.176137 11539 reflector.go:227] "Failed to watch" err="Get \"https://34.45.173.175/api/v1/namespaces/monitoring-pmm3-21297/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dmonitoring-haproxy-1&resourceVersion=1778976481150084000&timeoutSeconds=357&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
pod/monitoring-haproxy-1 condition met
E0517 00:08:02.480936 11539 reflector.go:227] "Failed to watch" err="Get \"https://34.45.173.175/api/v1/namespaces/monitoring-pmm3-21297/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dmonitoring-pxc-0&resourceVersion=1778976481150084000&timeoutSeconds=326&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
pod/monitoring-pxc-0 condition met
E0517 00:08:02.782103 11539 reflector.go:227] "Failed to watch" err="Get \"https://34.45.173.175/api/v1/namespaces/monitoring-pmm3-21297/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dmonitoring-pxc-1&resourceVersion=1778976481150084000&timeoutSeconds=435&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
pod/monitoring-pxc-1 condition met
E0517 00:08:03.083729 11539 reflector.go:227] "Failed to watch" err="Get \"https://34.45.173.175/api/v1/namespaces/monitoring-pmm3-21297/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dmonitoring-pxc-2&resourceVersion=1778976481150084000&timeoutSeconds=428&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
pod/monitoring-pxc-2 condition met
-----------------------------------------------------------------------------------
wait cluster consistency
-----------------------------------------------------------------------------------
waiting for pxc/monitoring to be ready
-----------------------------------------------------------------------------------
compare statefulset/monitoring-pxc--no-prefix
-----------------------------------------------------------------------------------
[2026-05-17T00:08:15+0000] compare_kubectl: statefulset/monitoring-pxc OK
-----------------------------------------------------------------------------------
compare statefulset/monitoring-haproxy--no-prefix
-----------------------------------------------------------------------------------
[2026-05-17T00:08:16+0000] compare_kubectl: statefulset/monitoring-haproxy OK
-----------------------------------------------------------------------------------
apply my-env-var-secrets to add PMM_PREFIX
-----------------------------------------------------------------------------------
secret/my-env-var-secrets created
Waiting for sts/monitoring-pxc to reach generation 2...
Resource sts/monitoring-pxc is at generation 1. Waiting...
Resource sts/monitoring-pxc has reached generation 2.
Waiting for sts/monitoring-haproxy to reach generation 2...
Resource sts/monitoring-haproxy has reached generation 2.
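The PMM_PREFIX rollout above can be approximated as follows, assuming the custom resource already references an envVarsSecret named my-env-var-secrets; the prefix value and the polling loop are assumptions:

    # Hypothetical PMM_PREFIX step; the prefix value is a placeholder.
    kubectl -n monitoring-pmm3-21297 create secret generic my-env-var-secrets \
      --from-literal=PMM_PREFIX=prefix-
    # Generation waits like those in the log can be polled with jsonpath:
    until [ "$(kubectl -n monitoring-pmm3-21297 get sts monitoring-pxc -o jsonpath='{.metadata.generation}')" -ge 2 ]; do
      echo "Resource sts/monitoring-pxc is at generation 1. Waiting..."
      sleep 5
    done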
-----------------------------------------------------------------------------------
create new PMM token and add it to the secret
-----------------------------------------------------------------------------------
secret/my-cluster-secrets patched
-----------------------------------------------------------------------------------
delete old PMM token
-----------------------------------------------------------------------------------
Waiting for sts/monitoring-pxc to reach generation 3...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc has reached generation 3.
Waiting for sts/monitoring-haproxy to reach generation 3...
Resource sts/monitoring-haproxy has reached generation 3.
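Token rotation at this point presumably talks to PMM's embedded Grafana service-account API before patching the secret again; the /graph/api/serviceaccounts endpoints, payloads, and the $SA_ID/$OLD_TOKEN_ID values below are assumptions, not commands quoted from this log:

    # Hypothetical token rotation against PMM's embedded Grafana; endpoints and IDs are assumptions.
    NEW_TOKEN=$(curl -sk -u "admin:$ADMIN_PASS" -H 'Content-Type: application/json' \
      -d '{"name":"operator-rotated"}' \
      "https://$SERVICE_IP/graph/api/serviceaccounts/$SA_ID/tokens" | jq -r '.key')
    kubectl patch secret my-cluster-secrets -n monitoring-pmm3-21297 --type merge \
      -p "{\"data\":{\"pmmservertoken\":\"$(echo -n "$NEW_TOKEN" | base64)\"}}"
    curl -sk -u "admin:$ADMIN_PASS" -X DELETE \
      "https://$SERVICE_IP/graph/api/serviceaccounts/$SA_ID/tokens/$OLD_TOKEN_ID"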
E0517 00:11:33.600141 32733 reflector.go:227] "Failed to watch" err="Get \"https://34.45.173.175/api/v1/namespaces/monitoring-pmm3-21297/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dmonitoring-haproxy-0&resourceVersion=1778976693282159016&timeoutSeconds=480&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
pod/monitoring-haproxy-0 condition met
pod/monitoring-haproxy-1 condition met
E0517 00:11:37.652088 32733 reflector.go:227] "Failed to watch" err="Get \"https://34.45.173.175/api/v1/namespaces/monitoring-pmm3-21297/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dmonitoring-pxc-0&resourceVersion=1778976697389295010&timeoutSeconds=576&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
pod/monitoring-pxc-0 condition met
E0517 00:11:37.957086 32733 reflector.go:227] "Failed to watch" err="Get \"https://34.45.173.175/api/v1/namespaces/monitoring-pmm3-21297/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dmonitoring-pxc-1&resourceVersion=1778976697389295010&timeoutSeconds=480&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
pod/monitoring-pxc-1 condition met
pod/monitoring-pxc-2 condition met
-----------------------------------------------------------------------------------
check if pmm-client container enabled
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
compare statefulset/monitoring-pxc-
-----------------------------------------------------------------------------------
[2026-05-17T00:12:06+0000] compare_kubectl: statefulset/monitoring-pxc OK
-----------------------------------------------------------------------------------
compare statefulset/monitoring-haproxy-
-----------------------------------------------------------------------------------
[2026-05-17T00:12:08+0000] compare_kubectl: statefulset/monitoring-haproxy OK
-----------------------------------------------------------------------------------
check mysql metrics
-----------------------------------------------------------------------------------
jq: error (at :0): Cannot iterate over null (null)
jq: error (at :0): Cannot iterate over null (null)
jq: error (at :0): Cannot iterate over null (null)
jq: error (at :0): Cannot iterate over null (null)
jq: error (at :0): Cannot iterate over null (null)
jq: error (at :0): Cannot iterate over null (null)
jq: error (at :0): Cannot iterate over null (null)
jq: error (at :0): Cannot iterate over null (null)
jq: error (at :0): Cannot iterate over null (null)
jq: error (at :0): Cannot iterate over null (null)
jq: error (at :0): Cannot iterate over null (null)
jq: error (at :0): Cannot iterate over null (null)
jq: error (at :0): Cannot iterate over null (null)
jq: error (at :0): Cannot iterate over null (null)
jq: error (at :0): Cannot iterate over null (null)
-----------------------------------------------------------------------------------
check haproxy metrics
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
check QAN data
-----------------------------------------------------------------------------------
null
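The "Cannot iterate over null" errors above are what jq prints when a metrics query returns an empty .data.result. A hedged sketch of the kind of probe involved; the /prometheus/api/v1/query path on the PMM server and the metric name are assumptions:

    # Hypothetical metrics probe; endpoint path and metric name are assumptions.
    curl -sk -H "Authorization: Bearer $PMM_TOKEN" \
      "https://$SERVICE_IP/prometheus/api/v1/query?query=mysql_global_status_uptime" \
      | jq '.data.result[].value'   # a null .data.result reproduces the jq errors seen above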
-----------------------------------------------------------------------------------
verify that the custom cluster name is configured
-----------------------------------------------------------------------------------
command terminated with exit code 1
command terminated with exit code 1
command terminated with exit code 1
perconaxtradbcluster.pxc.percona.com/monitoring patched
waiting for pod/monitoring-pxc-0 to be deleted.................Error from server (NotFound): pods "monitoring-pxc-0" not found
command terminated with exit code 1
command terminated with exit code 1
command terminated with exit code 1
release "monitoring" uninstalled
-----------------------------------------------------------------------------------
destroy cluster/operator and all other resources
-----------------------------------------------------------------------------------
+ kubectl patch pxc -n monitoring-pmm3-21297 monitoring --type=merge -p '{"metadata":{"finalizers":[]}}'
perconaxtradbcluster.pxc.percona.com/monitoring patched (no change)
perconaxtradbcluster.pxc.percona.com "monitoring" deleted from monitoring-pmm3-21297 namespace
No resources found
No resources found
validatingwebhookconfiguration.admissionregistration.k8s.io "percona-xtradbcluster-webhook" deleted
-----------------------------------------------------------------------------------
test passed
-----------------------------------------------------------------------------------
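The teardown recorded above, summarized as a sketch; apart from the finalizer patch (quoted verbatim in the log), the deletion order and resource list are assumptions:

    # Hedged reconstruction of the cleanup; order and flags are assumptions.
    kubectl patch pxc -n monitoring-pmm3-21297 monitoring --type=merge -p '{"metadata":{"finalizers":[]}}'
    kubectl delete pxc -n monitoring-pmm3-21297 monitoring
    helm uninstall monitoring -n monitoring-pmm3-21297
    kubectl delete validatingwebhookconfiguration percona-xtradbcluster-webhook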