Log: /mnt/jenkins/workspace/cloud-pxc-operator_PR-2476/e2e-tests/logs/monitoring-pmm3-8-4.log
Warning: version difference between client (1.36) and server (1.33) exceeds the supported minor version skew of +/-1
Warning: version difference between client (1.36) and server (1.33) exceeds the supported minor version skew of +/-1
+ kubectl patch pxc -n demand-backup-cloud-pxb-24839 demand-backup-cloud --type=merge -p '{"metadata":{"finalizers":[]}}'
perconaxtradbcluster.pxc.percona.com/demand-backup-cloud patched
perconaxtradbcluster.pxc.percona.com "demand-backup-cloud" deleted from demand-backup-cloud-pxb-24839 namespace
perconaxtradbclusterbackup.pxc.percona.com "on-demand-backup-aws-s3" deleted from demand-backup-cloud-pxb-24839 namespace
perconaxtradbclusterbackup.pxc.percona.com "on-demand-backup-azure-blob" deleted from demand-backup-cloud-pxb-24839 namespace
perconaxtradbclusterbackup.pxc.percona.com "on-demand-backup-gcp-cs" deleted from demand-backup-cloud-pxb-24839 namespace
perconaxtradbclusterrestore.pxc.percona.com "on-demand-backup-aws-s3" deleted from demand-backup-cloud-pxb-24839 namespace
perconaxtradbclusterrestore.pxc.percona.com "on-demand-backup-gcp-cs" deleted from demand-backup-cloud-pxb-24839 namespace
perconaxtradbclusterrestore.pxc.percona.com "on-demand-backup-s3" deleted from demand-backup-cloud-pxb-24839 namespace
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
-----------------------------------------------------------------------------------
cleaned up all old namespaces
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
cleaned up old namespaces pxc-operator
-----------------------------------------------------------------------------------
namespace "demand-backup-cloud-pxb-24839" deleted
namespace "pxc-operator" deleted
waiting for namespace/pxc-operator to be deleted
Error from server (NotFound): namespaces "pxc-operator" not found
-----------------------------------------------------------------------------------
create namespace pxc-operator
-----------------------------------------------------------------------------------
namespace/pxc-operator created
Context "gke_cloud-dev-112233_us-central1-a_jen-pxc-2476-a8b01a39-5-cluster11" modified.
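The finalizer patch at the top of the log is the standard way to unblock a custom resource whose deletion is held up by a finalizer. A minimal sketch of the same pattern, using the resource and namespace names from the log (the test harness wraps this in its own cleanup helper):

    # Strip all finalizers so the pending delete can complete.
    kubectl patch pxc demand-backup-cloud \
      -n demand-backup-cloud-pxb-24839 \
      --type=merge -p '{"metadata":{"finalizers":[]}}'
    # The delete that was blocked on the finalizer then goes through.
    kubectl delete pxc demand-backup-cloud \
      -n demand-backup-cloud-pxb-24839 --ignore-not-found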
-----------------------------------------------------------------------------------
start PXC operator
-----------------------------------------------------------------------------------
customresourcedefinition.apiextensions.k8s.io/perconaxtradbclusterbackups.pxc.percona.com serverside-applied
customresourcedefinition.apiextensions.k8s.io/perconaxtradbclusterrestores.pxc.percona.com serverside-applied
customresourcedefinition.apiextensions.k8s.io/perconaxtradbclusters.pxc.percona.com serverside-applied
clusterrole.rbac.authorization.k8s.io/percona-xtradb-cluster-operator unchanged
serviceaccount/percona-xtradb-cluster-operator created
clusterrolebinding.rbac.authorization.k8s.io/service-account-percona-xtradb-cluster-operator unchanged
deployment.apps/percona-xtradb-cluster-operator created
service/percona-xtradb-cluster-operator created
pod/percona-xtradb-cluster-operator-8548fd5788-xttzz condition met
E0516 19:24:47.475084 18344 reflector.go:227] "Failed to watch" err="Get \"https://136.112.216.179/api/v1/namespaces/pxc-operator/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dpercona-xtradb-cluster-operator-8548fd5788-xttzz&resourceVersion=1778959487114876000&timeoutSeconds=319&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
pod/percona-xtradb-cluster-operator-8548fd5788-xttzz condition met
E0516 19:24:52.609720 19029 reflector.go:227] "Failed to watch" err="Get \"https://136.112.216.179/api/v1/namespaces/pxc-operator/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dpercona-xtradb-cluster-operator-8548fd5788-xttzz&resourceVersion=1778959490357460000&timeoutSeconds=447&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
waiting for pod/percona-xtradb-cluster-operator-8548fd5788-xttzz to become Ready.Ok
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
-----------------------------------------------------------------------------------
cleaned up all old namespaces
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
cleaned up old namespaces monitoring-pmm3-13245
-----------------------------------------------------------------------------------
Error from server (NotFound): namespaces "monitoring-pmm3-13245" not found
waiting for namespace/monitoring-pmm3-13245 to be deleted
error: resource(s) were provided, but no name was specified
Error from server (NotFound): namespaces "monitoring-pmm3-13245" not found
-----------------------------------------------------------------------------------
create namespace monitoring-pmm3-13245
-----------------------------------------------------------------------------------
namespace/monitoring-pmm3-13245 created
Context "gke_cloud-dev-112233_us-central1-a_jen-pxc-2476-a8b01a39-5-cluster11" modified.
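The recurring E0516 "Failed to watch ... context canceled" messages appear to be client-go tearing down its watch once the awaited condition is met, not actual failures. A minimal sketch of an equivalent readiness wait with plain kubectl, using the operator pod name from the log (the harness uses its own wait helper, which may differ):

    # Block until the operator pod reports the Ready condition,
    # or give up after 5 minutes.
    kubectl wait --for=condition=Ready \
      pod/percona-xtradb-cluster-operator-8548fd5788-xttzz \
      -n pxc-operator --timeout=300s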
-----------------------------------------------------------------------------------
create secrets for cloud storages
-----------------------------------------------------------------------------------
secret/minio-secret created
secret/aws-s3-secret created
secret/do-spaces-secret created
secret/gcp-cs-secret created
secret/azure-secret created
"hashicorp" already exists with the same configuration, skipping
"minio" already exists with the same configuration, skipping
Hang tight while we grab the latest from your chart repositories...
...Successfully got an update from the "minio" chart repository
...Successfully got an update from the "chaos-mesh" chart repository
...Successfully got an update from the "hashicorp" chart repository
...Successfully got an update from the "percona" chart repository
Update Complete. ⎈Happy Helming!⎈
-----------------------------------------------------------------------------------
install PMM Server
-----------------------------------------------------------------------------------
Error: uninstall: Release not loaded: monitoring: release: not found
"percona" has been removed from your repositories
"percona" has been added to your repositories
Hang tight while we grab the latest from your chart repositories...
...Successfully got an update from the "minio" chart repository
...Successfully got an update from the "chaos-mesh" chart repository
...Successfully got an update from the "hashicorp" chart repository
...Successfully got an update from the "percona" chart repository
Update Complete. ⎈Happy Helming!⎈
NAME: monitoring
LAST DEPLOYED: Sat May 16 19:25:39 2026
NAMESPACE: monitoring-pmm3-13245
STATUS: deployed
REVISION: 1
TEST SUITE: None
NOTES:
Percona Monitoring and Management (PMM)

An open source database monitoring, observability and management tool
Check more info here: https://docs.percona.com/percona-monitoring-and-management/index.html

Get the application URL:
NOTE: It may take a few minutes for the LoadBalancer IP to be available.
You can watch the status of it by running 'kubectl get --namespace monitoring-pmm3-13245 svc -w monitoring-service'

export SERVICE_IP=$(kubectl get svc --namespace monitoring-pmm3-13245 monitoring-service -o jsonpath="{.status.loadBalancer.ingress[0].ip}")
echo https://$SERVICE_IP:

Get password for the "admin" user:

export ADMIN_PASS=$(kubectl get secret pmm-secret --namespace monitoring-pmm3-13245 -o jsonpath='{.data.PMM_ADMIN_PASSWORD}' | base64 --decode)
echo $ADMIN_PASS

statefulset.apps/monitoring condition met
-----------------------------------------------------------------------------------
create secret
-----------------------------------------------------------------------------------
secret/my-cluster-secrets created
-----------------------------------------------------------------------------------
add PMM3 token to secret
-----------------------------------------------------------------------------------
secret/my-cluster-secrets patched
-----------------------------------------------------------------------------------
create PXC cluster
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
create first PXC cluster
-----------------------------------------------------------------------------------
deployment.apps/pxc-client created
perconaxtradbcluster.pxc.percona.com/monitoring created
-----------------------------------------------------------------------------------
check if all 3 Pods started
-----------------------------------------------------------------------------------
pod/monitoring-haproxy-0 condition met
pod/monitoring-pxc-0 condition met
E0516 19:29:30.623362 10258 reflector.go:227] "Failed to watch" err="Get \"https://136.112.216.179/api/v1/namespaces/monitoring-pmm3-13245/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dmonitoring-pxc-0&resourceVersion=1778959770445007017&timeoutSeconds=309&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
-----------------------------------------------------------------------------------
wait for running cluster
-----------------------------------------------------------------------------------
pod/monitoring-haproxy-0 condition met
E0516 19:29:32.422360 22125 reflector.go:227] "Failed to watch" err="Get \"https://136.112.216.179/api/v1/namespaces/monitoring-pmm3-13245/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dmonitoring-haproxy-0&resourceVersion=1778959771908335017&timeoutSeconds=551&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
waiting for pod/monitoring-haproxy-0 to become Ready.Ok
-----------------------------------------------------------------------------------
wait for running cluster
-----------------------------------------------------------------------------------
pod/monitoring-pxc-0 condition met
E0516 19:29:40.169858 23231 reflector.go:227] "Failed to watch" err="Get \"https://136.112.216.179/api/v1/namespaces/monitoring-pmm3-13245/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dmonitoring-pxc-0&resourceVersion=1778959777930015017&timeoutSeconds=438&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
waiting for pod/monitoring-pxc-0 to become Ready.Ok
pod/monitoring-pxc-1 condition met
waiting for pod/monitoring-pxc-1 to become Ready.Ok
pod/monitoring-pxc-2 condition met
waiting for pod/monitoring-pxc-2 to become Ready.Ok
-----------------------------------------------------------------------------------
write data
-----------------------------------------------------------------------------------
pod/pxc-client-56fd5498cd-4v4q5 condition met
E0516 19:34:25.175807 28072 reflector.go:227] "Failed to watch" err="Get \"https://136.112.216.179/api/v1/namespaces/monitoring-pmm3-13245/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dpxc-client-56fd5498cd-4v4q5&resourceVersion=1778960064401270000&timeoutSeconds=347&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
waiting for pod/pxc-client-56fd5498cd-4v4q5 to become Ready
Defaulted container "pxc-client" out of: pxc-client, backup
.Ok
pod/pxc-client-56fd5498cd-4v4q5 condition met
E0516 19:34:35.435561 29234 reflector.go:227] "Failed to watch" err="Get \"https://136.112.216.179/api/v1/namespaces/monitoring-pmm3-13245/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dpxc-client-56fd5498cd-4v4q5&resourceVersion=1778960072930980000&timeoutSeconds=466&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
waiting for pod/pxc-client-56fd5498cd-4v4q5 to become Ready
Defaulted container "pxc-client" out of: pxc-client, backup
.Ok
pod/pxc-client-56fd5498cd-4v4q5 condition met
E0516 19:35:15.501044 2387 reflector.go:227] "Failed to watch" err="Get \"https://136.112.216.179/api/v1/namespaces/monitoring-pmm3-13245/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dpxc-client-56fd5498cd-4v4q5&resourceVersion=1778960113504103000&timeoutSeconds=393&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
waiting for pod/pxc-client-56fd5498cd-4v4q5 to become Ready
Defaulted container "pxc-client" out of: pxc-client, backup
.Ok
pod/pxc-client-56fd5498cd-4v4q5 condition met
E0516 19:35:25.001364 3944 reflector.go:227] "Failed to watch" err="Get \"https://136.112.216.179/api/v1/namespaces/monitoring-pmm3-13245/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dpxc-client-56fd5498cd-4v4q5&resourceVersion=1778960124254828000&timeoutSeconds=385&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
waiting for pod/pxc-client-56fd5498cd-4v4q5 to become Ready
Defaulted container "pxc-client" out of: pxc-client, backup
.Ok
pod/pxc-client-56fd5498cd-4v4q5 condition met
E0516 19:35:34.532092 5251 reflector.go:227] "Failed to watch" err="Get \"https://136.112.216.179/api/v1/namespaces/monitoring-pmm3-13245/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dpxc-client-56fd5498cd-4v4q5&resourceVersion=1778960132667036000&timeoutSeconds=503&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
waiting for pod/pxc-client-56fd5498cd-4v4q5 to become Ready
Defaulted container "pxc-client" out of: pxc-client, backup
.Ok
Waiting for sts/monitoring-pxc to reach generation 1...
Resource sts/monitoring-pxc has reached generation 1.
Waiting for sts/monitoring-haproxy to reach generation 1...
Resource sts/monitoring-haproxy has reached generation 1.
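The "Waiting for sts/... to reach generation N" lines track .metadata.generation, which the API server increments every time the StatefulSet spec changes. A minimal sketch of that polling loop with plain kubectl, using the names from the log (the harness's helper may differ in detail):

    # Poll until the StatefulSet spec generation reaches the expected value.
    want=1
    while true; do
      got=$(kubectl get sts monitoring-pxc -n monitoring-pmm3-13245 \
        -o jsonpath='{.metadata.generation}')
      if [ "$got" -ge "$want" ]; then
        echo "Resource sts/monitoring-pxc has reached generation $got."
        break
      fi
      echo "Resource sts/monitoring-pxc is at generation $got. Waiting..."
      sleep 5
    done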
E0516 19:35:57.502336 8520 reflector.go:227] "Failed to watch" err="Get \"https://136.112.216.179/api/v1/namespaces/monitoring-pmm3-13245/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dmonitoring-haproxy-0&resourceVersion=1778960156816425000&timeoutSeconds=555&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
pod/monitoring-haproxy-0 condition met
E0516 19:35:57.803463 8520 reflector.go:227] "Failed to watch" err="Get \"https://136.112.216.179/api/v1/namespaces/monitoring-pmm3-13245/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dmonitoring-haproxy-1&resourceVersion=1778960156816425000&timeoutSeconds=572&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
pod/monitoring-haproxy-1 condition met
E0516 19:35:58.104495 8520 reflector.go:227] "Failed to watch" err="Get \"https://136.112.216.179/api/v1/namespaces/monitoring-pmm3-13245/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dmonitoring-pxc-0&resourceVersion=1778960156816425000&timeoutSeconds=353&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
pod/monitoring-pxc-0 condition met
E0516 19:35:58.405358 8520 reflector.go:227] "Failed to watch" err="Get \"https://136.112.216.179/api/v1/namespaces/monitoring-pmm3-13245/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dmonitoring-pxc-1&resourceVersion=1778960156816425000&timeoutSeconds=332&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
pod/monitoring-pxc-1 condition met
E0516 19:35:58.706045 8520 reflector.go:227] "Failed to watch" err="Get \"https://136.112.216.179/api/v1/namespaces/monitoring-pmm3-13245/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dmonitoring-pxc-2&resourceVersion=1778960156816425000&timeoutSeconds=410&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
pod/monitoring-pxc-2 condition met
-----------------------------------------------------------------------------------
wait cluster consistency
-----------------------------------------------------------------------------------
waiting for pxc/monitoring to be ready
-----------------------------------------------------------------------------------
compare statefulset/monitoring-pxc--no-prefix
-----------------------------------------------------------------------------------
[2026-05-16T19:36:13+0000] compare_kubectl: statefulset/monitoring-pxc OK
-----------------------------------------------------------------------------------
compare statefulset/monitoring-haproxy--no-prefix
-----------------------------------------------------------------------------------
[2026-05-16T19:36:15+0000] compare_kubectl: statefulset/monitoring-haproxy OK
-----------------------------------------------------------------------------------
apply my-env-var-secrets to add PMM_PREFIX
-----------------------------------------------------------------------------------
secret/my-env-var-secrets created
Waiting for sts/monitoring-pxc to reach generation 2...
Resource sts/monitoring-pxc is at generation 1. Waiting...
Resource sts/monitoring-pxc has reached generation 2.
Waiting for sts/monitoring-haproxy to reach generation 2...
Resource sts/monitoring-haproxy has reached generation 2.
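The my-env-var-secrets secret injects extra environment variables into the cluster pods; adding PMM_PREFIX is what forces both StatefulSets to roll to generation 2. A minimal sketch of creating such a secret, assuming the operator reads it by that name and with the prefix value as a placeholder:

    # Extra env vars for the pods; PMM_PREFIX is assumed to be read by the
    # pmm-client sidecar and prepended to the node names it registers in PMM.
    kubectl create secret generic my-env-var-secrets \
      -n monitoring-pmm3-13245 \
      --from-literal=PMM_PREFIX=some-prefix-   # placeholder value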
-----------------------------------------------------------------------------------
create new PMM token and add it to the secret
-----------------------------------------------------------------------------------
secret/my-cluster-secrets patched
-----------------------------------------------------------------------------------
delete old PMM token
-----------------------------------------------------------------------------------
Waiting for sts/monitoring-pxc to reach generation 3...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc has reached generation 3.
Waiting for sts/monitoring-haproxy to reach generation 3...
Resource sts/monitoring-haproxy has reached generation 3.
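Rotating the token means writing a fresh PMM service token into my-cluster-secrets; the operator notices the change and rolls both StatefulSets to generation 3. A minimal sketch of the patch half of that step; the key name pmmservertoken and the token value are assumptions (the real test first obtains a token from the PMM server API):

    # Base64-encode the new token and patch it into the cluster secret.
    NEW_TOKEN='glsa_placeholder_token'   # placeholder, not a real token
    kubectl patch secret my-cluster-secrets -n monitoring-pmm3-13245 \
      --type=merge \
      -p "{\"data\":{\"pmmservertoken\":\"$(echo -n "$NEW_TOKEN" | base64)\"}}"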
E0516 19:39:21.268890 3022 reflector.go:227] "Failed to watch" err="Get \"https://136.112.216.179/api/v1/namespaces/monitoring-pmm3-13245/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dmonitoring-haproxy-0&resourceVersion=1778960360580236000&timeoutSeconds=529&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
pod/monitoring-haproxy-0 condition met
pod/monitoring-haproxy-1 condition met
E0516 19:39:26.397591 3022 reflector.go:227] "Failed to watch" err="Get \"https://136.112.216.179/api/v1/namespaces/monitoring-pmm3-13245/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dmonitoring-pxc-0&resourceVersion=1778960366122271023&timeoutSeconds=313&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
pod/monitoring-pxc-0 condition met
E0516 19:39:26.703384 3022 reflector.go:227] "Failed to watch" err="Get \"https://136.112.216.179/api/v1/namespaces/monitoring-pmm3-13245/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dmonitoring-pxc-1&resourceVersion=1778960366381999023&timeoutSeconds=510&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
pod/monitoring-pxc-1 condition met
pod/monitoring-pxc-2 condition met
-----------------------------------------------------------------------------------
check if pmm-client container enabled
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
compare statefulset/monitoring-pxc-
-----------------------------------------------------------------------------------
[2026-05-16T19:39:59+0000] compare_kubectl: statefulset/monitoring-pxc OK
-----------------------------------------------------------------------------------
compare statefulset/monitoring-haproxy-
-----------------------------------------------------------------------------------
[2026-05-16T19:40:01+0000] compare_kubectl: statefulset/monitoring-haproxy OK
-----------------------------------------------------------------------------------
check mysql metrics
-----------------------------------------------------------------------------------
jq: error (at <stdin>:0): Cannot iterate over null (null)
jq: error (at <stdin>:0): Cannot iterate over null (null)
jq: error (at <stdin>:0): Cannot iterate over null (null)
jq: error (at <stdin>:0): Cannot iterate over null (null)
jq: error (at <stdin>:0): Cannot iterate over null (null)
jq: error (at <stdin>:0): Cannot iterate over null (null)
jq: error (at <stdin>:0): Cannot iterate over null (null)
jq: error (at <stdin>:0): Cannot iterate over null (null)
jq: error (at <stdin>:0): Cannot iterate over null (null)
jq: error (at <stdin>:0): Cannot iterate over null (null)
jq: error (at <stdin>:0): Cannot iterate over null (null)
jq: error (at <stdin>:0): Cannot iterate over null (null)
jq: error (at <stdin>:0): Cannot iterate over null (null)
jq: error (at <stdin>:0): Cannot iterate over null (null)
jq: error (at <stdin>:0): Cannot iterate over null (null)
jq: error (at <stdin>:0): Cannot iterate over null (null)
-----------------------------------------------------------------------------------
check haproxy metrics
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
check QAN data
-----------------------------------------------------------------------------------
null
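The sixteen "Cannot iterate over null" errors are jq iterating over .data.result[] while the metrics query still returns an empty payload; the check simply retries until samples appear. A minimal sketch of such a probe; the endpoint path, metric name, and bearer-token variable are assumptions (SERVICE_IP comes from the helm NOTES above):

    # Ask the PMM server for a MySQL metric and list the sample values.
    # jq prints "Cannot iterate over null" for as long as .data.result
    # is absent, i.e. no samples have been scraped yet.
    curl -sk -H "Authorization: Bearer $PMM_TOKEN" \
      "https://$SERVICE_IP/prometheus/api/v1/query?query=mysql_global_status_uptime" \
      | jq '.data.result[].value'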
-----------------------------------------------------------------------------------
verify that the custom cluster name is configured
-----------------------------------------------------------------------------------
command terminated with exit code 1
command terminated with exit code 1
command terminated with exit code 1
perconaxtradbcluster.pxc.percona.com/monitoring patched
waiting for pod/monitoring-pxc-0 to be deleted.................
Error from server (NotFound): pods "monitoring-pxc-0" not found
command terminated with exit code 1
command terminated with exit code 1
command terminated with exit code 1
release "monitoring" uninstalled
-----------------------------------------------------------------------------------
destroy cluster/operator and all other resources
-----------------------------------------------------------------------------------
+ kubectl patch pxc -n monitoring-pmm3-13245 monitoring --type=merge -p '{"metadata":{"finalizers":[]}}'
perconaxtradbcluster.pxc.percona.com/monitoring patched (no change)
perconaxtradbcluster.pxc.percona.com "monitoring" deleted from monitoring-pmm3-13245 namespace
No resources found
No resources found
validatingwebhookconfiguration.admissionregistration.k8s.io "percona-xtradbcluster-webhook" deleted
-----------------------------------------------------------------------------------
test passed
-----------------------------------------------------------------------------------