Log: /mnt/jenkins/workspace/cloud-pxc-operator_PR-2349/e2e-tests/logs/monitoring-2-0-8-0.log
Warning: version difference between client (1.35) and server (1.31) exceeds the supported minor version skew of +/-1
Warning: version difference between client (1.35) and server (1.31) exceeds the supported minor version skew of +/-1
No resources found
+ kubectl patch pxc -n sh --type=merge -p '{"metadata":{"finalizers":[]}}'
error: resource(s) were provided, but no name was specified
No resources found
No resources found
No resources found
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
-----------------------------------------------------------------------------------
cleaned up all old namespaces
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
cleaned up old namespaces pxc-operator
-----------------------------------------------------------------------------------
Error from server (NotFound): namespaces "pxc-operator" not found
waiting for namespace/pxc-operator to be deletederror: resource(s) were provided, but no name was specified
Error from server (NotFound): namespaces "pxc-operator" not found
-----------------------------------------------------------------------------------
create namespace pxc-operator
-----------------------------------------------------------------------------------
namespace/pxc-operator created
Context "gke_cloud-dev-112233_us-central1-a_jen-pxc-2349-b5e2b8a7-1-cluster2" modified.
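For reference, the finalizer cleanup above can be reproduced with a sketch like the one below; the loop structure and JSONPath usage are assumptions, not the test's actual script.

# Hypothetical cleanup sketch (bash): strip finalizers from leftover PXC custom resources
# so their namespaces can be deleted. When "kubectl get ... -o name" expands to nothing,
# kubectl patch runs without a resource name and prints
# "error: resource(s) were provided, but no name was specified", as seen in the log above.
for ns in $(kubectl get pxc --all-namespaces -o jsonpath='{.items[*].metadata.namespace}'); do
    kubectl patch pxc -n "$ns" $(kubectl get pxc -n "$ns" -o name) \
        --type=merge -p '{"metadata":{"finalizers":[]}}' || true
done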
-----------------------------------------------------------------------------------
start PXC operator
-----------------------------------------------------------------------------------
customresourcedefinition.apiextensions.k8s.io/perconaxtradbclusterbackups.pxc.percona.com serverside-applied
customresourcedefinition.apiextensions.k8s.io/perconaxtradbclusterrestores.pxc.percona.com serverside-applied
customresourcedefinition.apiextensions.k8s.io/perconaxtradbclusters.pxc.percona.com serverside-applied
clusterrole.rbac.authorization.k8s.io/percona-xtradb-cluster-operator unchanged
serviceaccount/percona-xtradb-cluster-operator created
clusterrolebinding.rbac.authorization.k8s.io/service-account-percona-xtradb-cluster-operator unchanged
deployment.apps/percona-xtradb-cluster-operator created
service/percona-xtradb-cluster-operator created
pod/percona-xtradb-cluster-operator-7c94dbdc94-pvnms condition met
pod/percona-xtradb-cluster-operator-7c94dbdc94-pvnms condition met
waiting for pod/percona-xtradb-cluster-operator-7c94dbdc94-pvnms to become Ready.Ok
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
-----------------------------------------------------------------------------------
cleaned up all old namespaces
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
cleaned up old namespaces monitoring-2-0-20869
-----------------------------------------------------------------------------------
Error from server (NotFound): namespaces "monitoring-2-0-20869" not found
waiting for namespace/monitoring-2-0-20869 to be deletederror: resource(s) were provided, but no name was specified
Error from server (NotFound): namespaces "monitoring-2-0-20869" not found
-----------------------------------------------------------------------------------
create namespace monitoring-2-0-20869
-----------------------------------------------------------------------------------
namespace/monitoring-2-0-20869 created
Context "gke_cloud-dev-112233_us-central1-a_jen-pxc-2349-b5e2b8a7-1-cluster2" modified.
-----------------------------------------------------------------------------------
create secrets for cloud storages
-----------------------------------------------------------------------------------
secret/minio-secret created
secret/aws-s3-secret created
secret/gcp-cs-secret created
secret/azure-secret created
"hashicorp" already exists with the same configuration, skipping
"minio" already exists with the same configuration, skipping
Hang tight while we grab the latest from your chart repositories...
...Successfully got an update from the "minio" chart repository
...Successfully got an update from the "chaos-mesh" chart repository
...Successfully got an update from the "hashicorp" chart repository
Update Complete. ⎈Happy Helming!⎈
-----------------------------------------------------------------------------------
install PMM Server
-----------------------------------------------------------------------------------
"percona" has been added to your repositories
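The PMM Server install that begins here is driven by Helm; a minimal sketch of the step might look as follows (chart values, the uninstall-first pattern, and the rollout wait are assumptions, not the test's actual code).

# Hypothetical PMM Server install sketch (bash).
helm repo add percona https://percona.github.io/percona-helm-charts/
helm repo update
# A prior uninstall attempt on a missing release yields "Error: uninstall: Release not loaded".
helm uninstall monitoring -n monitoring-2-0-20869 || true
helm install monitoring percona/pmm -n monitoring-2-0-20869
# The chart creates a StatefulSet named "monitoring"; wait for it to roll out.
kubectl -n monitoring-2-0-20869 rollout status statefulset/monitoring --timeout=600s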
Hang tight while we grab the latest from your chart repositories...
...Successfully got an update from the "minio" chart repository
...Successfully got an update from the "chaos-mesh" chart repository
...Successfully got an update from the "percona" chart repository
...Successfully got an update from the "hashicorp" chart repository
Update Complete. ⎈Happy Helming!⎈
Error: uninstall: Release not loaded: monitoring: release: not found
NAME: monitoring
LAST DEPLOYED: Thu Jan 22 02:47:40 2026
NAMESPACE: monitoring-2-0-20869
STATUS: deployed
REVISION: 1
TEST SUITE: None
NOTES:
PMM server can be accessed via HTTPS (port 443) on the following DNS name from within your cluster:

endpoint: https://monitoring-service.monitoring-2-0-20869.svc.cluster.local:443
login: admin
password: admin
statefulset.apps/monitoring condition met
logger=settings t=2026-01-22T02:48:28.831016886Z level=info msg="Starting Grafana" version= commit= branch= compiled=1970-01-01T00:00:00Z
logger=settings t=2026-01-22T02:48:28.831162446Z level=info msg="Config loaded from" file=/usr/share/grafana/conf/defaults.ini
logger=settings t=2026-01-22T02:48:28.831175546Z level=info msg="Config loaded from" file=/etc/grafana/grafana.ini
logger=settings t=2026-01-22T02:48:28.831180946Z level=info msg="Path Home" path=/usr/share/grafana
logger=settings t=2026-01-22T02:48:28.831185586Z level=info msg="Path Data" path=/srv/grafana
logger=settings t=2026-01-22T02:48:28.831190886Z level=info msg="Path Logs" path=/srv/logs
logger=settings t=2026-01-22T02:48:28.831195956Z level=info msg="Path Plugins" path=/srv/grafana/plugins
logger=settings t=2026-01-22T02:48:28.831200726Z level=info msg="Path Provisioning" path=/usr/share/grafana/conf/provisioning
logger=settings t=2026-01-22T02:48:28.831205796Z level=info msg="App mode production"
logger=sqlstore t=2026-01-22T02:48:28.831275196Z level=info msg="Connecting to DB" dbtype=postgres
logger=migrator t=2026-01-22T02:48:28.848356153Z level=info msg="Starting DB migrations"
logger=migrator t=2026-01-22T02:48:28.851962242Z level=info msg="migrations completed" performed=0 skipped=452 duration=410.91µs
logger=secrets t=2026-01-22T02:48:28.853427221Z level=info msg="Envelope encryption state" enabled=true currentprovider=secretKey.v1
logger=plugin.finder t=2026-01-22T02:48:28.886397314Z level=warn msg="Skipping finding plugins as directory does not exist" path=/usr/share/grafana/plugins-bundled
logger=plugin.signature.validator t=2026-01-22T02:48:29.015923877Z level=warn msg="Permitting unsigned plugin. This is not recommended" pluginID=pmm-pt-summary-datasource pluginDir=/srv/grafana/plugins/pmm-app/dist/pmm-pt-summary/datasource
logger=plugin.signature.validator t=2026-01-22T02:48:29.015964867Z level=warn msg="Permitting unsigned plugin. This is not recommended" pluginID=pmm-pt-summary-panel pluginDir=/srv/grafana/plugins/pmm-app/dist/pmm-pt-summary/panel
logger=plugin.signature.validator t=2026-01-22T02:48:29.016012797Z level=warn msg="Permitting unsigned plugin. This is not recommended" pluginID=grafana-polystat-panel pluginDir=/srv/grafana/plugins/grafana-polystat-panel
logger=plugin.signature.validator t=2026-01-22T02:48:29.016031417Z level=warn msg="Permitting unsigned plugin. This is not recommended" pluginID=pmm-app pluginDir=/srv/grafana/plugins/pmm-app/dist
logger=plugin.pmm-app t=2026-01-22T02:48:29.023600655Z level=warn msg="Included dashboard is missing a UID field"
logger=plugin.signature.validator t=2026-01-22T02:48:29.023847295Z level=warn msg="Permitting unsigned plugin. This is not recommended" pluginID=pmm-qan-app-panel pluginDir=/srv/grafana/plugins/pmm-app/dist/pmm-qan
logger=plugin.loader t=2026-01-22T02:48:29.023932235Z level=info msg="Plugin registered" pluginID=pmm-pt-summary-datasource
logger=plugin.loader t=2026-01-22T02:48:29.023952915Z level=info msg="Plugin registered" pluginID=pmm-pt-summary-panel
logger=plugin.loader t=2026-01-22T02:48:29.023959715Z level=info msg="Plugin registered" pluginID=grafana-worldmap-panel
logger=plugin.loader t=2026-01-22T02:48:29.023965525Z level=info msg="Plugin registered" pluginID=grafana-clickhouse-datasource
logger=plugin.grafana-clickhouse-datasource t=2026-01-22T02:48:29.035629433Z level=warn msg="Plugin process is running with elevated privileges. This is not recommended"
logger=plugin.loader t=2026-01-22T02:48:29.035651742Z level=info msg="Plugin registered" pluginID=grafana-piechart-panel
logger=plugin.loader t=2026-01-22T02:48:29.035660882Z level=info msg="Plugin registered" pluginID=grafana-polystat-panel
logger=plugin.loader t=2026-01-22T02:48:29.035718173Z level=info msg="Plugin registered" pluginID=pmm-app
logger=plugin.loader t=2026-01-22T02:48:29.035735522Z level=info msg="Plugin registered" pluginID=jdbranham-diagram-panel
logger=plugin.loader t=2026-01-22T02:48:29.035741742Z level=info msg="Plugin registered" pluginID=natel-discrete-panel
logger=plugin.loader t=2026-01-22T02:48:29.035749993Z level=info msg="Plugin registered" pluginID=pmm-qan-app-panel
logger=plugin.loader t=2026-01-22T02:48:29.035956542Z level=info msg="Plugin registered" pluginID=camptocamp-prometheus-alertmanager-datasource
logger=plugin.loader t=2026-01-22T02:48:29.035971922Z level=info msg="Plugin registered" pluginID=petrslavotinek-carpetplot-panel
Admin password changed successfully ✔
-----------------------------------------------------------------------------------
create PXC cluster
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
create first PXC cluster
-----------------------------------------------------------------------------------
secret/my-cluster-secrets created
deployment.apps/pxc-client created
perconaxtradbcluster.pxc.percona.com/monitoring created
-----------------------------------------------------------------------------------
check if all 3 Pods started
-----------------------------------------------------------------------------------
pod/monitoring-haproxy-0 condition met
pod/monitoring-pxc-0 condition met
-----------------------------------------------------------------------------------
wait for running cluster
-----------------------------------------------------------------------------------
pod/monitoring-haproxy-0 condition met
waiting for pod/monitoring-haproxy-0 to become Ready.Ok
-----------------------------------------------------------------------------------
wait for running cluster
-----------------------------------------------------------------------------------
pod/monitoring-pxc-0 condition met
waiting for pod/monitoring-pxc-0 to become Ready.Ok
pod/monitoring-pxc-1 condition met
waiting for pod/monitoring-pxc-1 to become Ready.Ok
pod/monitoring-pxc-2 condition met
waiting for pod/monitoring-pxc-2 to become Ready.Ok
-----------------------------------------------------------------------------------
write data
-----------------------------------------------------------------------------------
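The "write data" step below execs into the pxc-client deployment; a minimal sketch, assuming illustrative SQL and the default secret layout, might look like this (only the client deployment, HAProxy service, and secret names come from the log).

# Hypothetical write-data sketch (bash).
ROOT_PASS=$(kubectl -n monitoring-2-0-20869 get secret my-cluster-secrets \
    -o jsonpath='{.data.root}' | base64 -d)
# The pod has two containers (pxc-client, backup), hence the "Defaulted container" notice below.
kubectl -n monitoring-2-0-20869 exec deploy/pxc-client -- \
    mysql -h monitoring-haproxy -uroot -p"$ROOT_PASS" \
    -e "CREATE DATABASE IF NOT EXISTS myApp; CREATE TABLE IF NOT EXISTS myApp.myApp (id int PRIMARY KEY); INSERT IGNORE INTO myApp.myApp VALUES (100500);"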
pod/pxc-client-c75dc5c46-gcfwx condition met
waiting for pod/pxc-client-c75dc5c46-gcfwx to become ReadyDefaulted container "pxc-client" out of: pxc-client, backup .Ok
pod/pxc-client-c75dc5c46-gcfwx condition met
waiting for pod/pxc-client-c75dc5c46-gcfwx to become ReadyDefaulted container "pxc-client" out of: pxc-client, backup .Ok
pod/pxc-client-c75dc5c46-gcfwx condition met
waiting for pod/pxc-client-c75dc5c46-gcfwx to become ReadyDefaulted container "pxc-client" out of: pxc-client, backup .Ok
pod/pxc-client-c75dc5c46-gcfwx condition met
waiting for pod/pxc-client-c75dc5c46-gcfwx to become ReadyDefaulted container "pxc-client" out of: pxc-client, backup .Ok
pod/pxc-client-c75dc5c46-gcfwx condition met
waiting for pod/pxc-client-c75dc5c46-gcfwx to become ReadyDefaulted container "pxc-client" out of: pxc-client, backup .Ok
Unable to use a TTY - input is not a terminal or the right kind of file
-----------------------------------------------------------------------------------
add PMM API key to secret
-----------------------------------------------------------------------------------
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   155  100   119  100    36    271     82 --:--:-- --:--:-- --:--:--   353
secret/my-cluster-secrets patched
Waiting for sts/monitoring-pxc to reach generation 2...
Resource sts/monitoring-pxc is at generation 1. Waiting...
Resource sts/monitoring-pxc has reached generation 2.
Waiting for sts/monitoring-haproxy to reach generation 2...
Resource sts/monitoring-haproxy has reached generation 2.
pod/monitoring-haproxy-0 condition met
pod/monitoring-haproxy-1 condition met
pod/monitoring-pxc-0 condition met
pod/monitoring-pxc-1 condition met
pod/monitoring-pxc-2 condition met
-----------------------------------------------------------------------------------
wait cluster consistency
-----------------------------------------------------------------------------------
waiting for pxc/monitoring to be ready...............
-----------------------------------------------------------------------------------
compare statefulset/monitoring-pxc--no-prefix
-----------------------------------------------------------------------------------
[2026-01-22T02:58:17+0000] compare_kubectl: statefulset/monitoring-pxc OK
-----------------------------------------------------------------------------------
compare statefulset/monitoring-haproxy--no-prefix
-----------------------------------------------------------------------------------
[2026-01-22T02:58:18+0000] compare_kubectl: statefulset/monitoring-haproxy OK
-----------------------------------------------------------------------------------
apply my-env-var-secrets to add PMM_PREFIX
-----------------------------------------------------------------------------------
secret/my-env-var-secrets created
Waiting for sts/monitoring-pxc to reach generation 3...
Resource sts/monitoring-pxc is at generation 2. Waiting...
Resource sts/monitoring-pxc has reached generation 3.
Waiting for sts/monitoring-haproxy to reach generation 3...
Resource sts/monitoring-haproxy has reached generation 3.
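The generation waits above can be expressed as a small polling helper; this is a sketch (helper name and interval are assumptions), not the test's own implementation.

# Hypothetical generation-wait sketch (bash): poll .metadata.generation until the target is reached.
wait_for_generation() {
    local sts="$1" target="$2" current
    echo "Waiting for sts/${sts} to reach generation ${target}..."
    while true; do
        current=$(kubectl get sts "${sts}" -o jsonpath='{.metadata.generation}')
        if [ "${current}" -ge "${target}" ]; then
            echo "Resource sts/${sts} has reached generation ${target}."
            return 0
        fi
        echo "Resource sts/${sts} is at generation ${current}. Waiting..."
        sleep 5
    done
}
wait_for_generation monitoring-pxc 3
wait_for_generation monitoring-haproxy 3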
-----------------------------------------------------------------------------------
add new PMM API key to secret
-----------------------------------------------------------------------------------
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   167  100   127  100    40    291     91 --:--:-- --:--:-- --:--:--   383
secret/my-cluster-secrets patched
-----------------------------------------------------------------------------------
delete old PMM key
-----------------------------------------------------------------------------------
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   600  100   600    0     0   1380      0 --:--:-- --:--:-- --:--:--  1379
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100    29  100    29    0     0     66      0 --:--:-- --:--:-- --:--:--    66
{"message":"API key deleted"}Waiting for sts/monitoring-pxc to reach generation 4...
Resource sts/monitoring-pxc is at generation 3. Waiting...
Resource sts/monitoring-pxc is at generation 3. Waiting...
Resource sts/monitoring-pxc is at generation 3. Waiting...
Resource sts/monitoring-pxc is at generation 3. Waiting...
Resource sts/monitoring-pxc is at generation 3. Waiting...
Resource sts/monitoring-pxc is at generation 3. Waiting...
Resource sts/monitoring-pxc is at generation 3. Waiting...
Resource sts/monitoring-pxc is at generation 3. Waiting...
Resource sts/monitoring-pxc is at generation 3. Waiting...
Resource sts/monitoring-pxc is at generation 3. Waiting...
Resource sts/monitoring-pxc is at generation 3. Waiting...
Resource sts/monitoring-pxc is at generation 3. Waiting...
Resource sts/monitoring-pxc is at generation 3. Waiting...
Resource sts/monitoring-pxc is at generation 3. Waiting...
Resource sts/monitoring-pxc is at generation 3. Waiting...
Resource sts/monitoring-pxc is at generation 3. Waiting...
Resource sts/monitoring-pxc is at generation 3. Waiting...
Resource sts/monitoring-pxc is at generation 3. Waiting...
Resource sts/monitoring-pxc is at generation 3. Waiting...
Resource sts/monitoring-pxc is at generation 3. Waiting...
Resource sts/monitoring-pxc is at generation 3. Waiting...
Resource sts/monitoring-pxc is at generation 3. Waiting...
Resource sts/monitoring-pxc is at generation 3. Waiting...
Resource sts/monitoring-pxc is at generation 3. Waiting...
Resource sts/monitoring-pxc has reached generation 4.
Waiting for sts/monitoring-haproxy to reach generation 4...
Resource sts/monitoring-haproxy has reached generation 4.
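The API-key rotation above (create a new admin key, patch it into the cluster secret, delete the old key) can be sketched as follows; the Grafana endpoints under /graph, the key names, and the secret field are assumptions rather than the test's actual code.

# Hypothetical PMM API key rotation sketch (bash).
PMM="https://admin:admin@monitoring-service.monitoring-2-0-20869.svc.cluster.local"
# Create a new Admin API key through the embedded Grafana API and store it in the secret.
NEW_KEY=$(curl -sk -X POST "${PMM}/graph/api/auth/keys" \
    -H 'Content-Type: application/json' \
    -d '{"name":"operator-new","role":"Admin"}' | jq -r '.key')
kubectl -n monitoring-2-0-20869 patch secret my-cluster-secrets --type=merge \
    -p "{\"stringData\":{\"pmmserverkey\":\"${NEW_KEY}\"}}"
# Look up the old key's id by name and delete it ("API key deleted" in the log above).
OLD_ID=$(curl -sk "${PMM}/graph/api/auth/keys" | jq -r '.[] | select(.name=="operator") | .id')
curl -sk -X DELETE "${PMM}/graph/api/auth/keys/${OLD_ID}"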
pod/monitoring-haproxy-0 condition met
pod/monitoring-haproxy-1 condition met
pod/monitoring-pxc-0 condition met
pod/monitoring-pxc-1 condition met
pod/monitoring-pxc-2 condition met
-----------------------------------------------------------------------------------
check if pmm-client container enabled
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
compare statefulset/monitoring-pxc-
-----------------------------------------------------------------------------------
[2026-01-22T03:01:52+0000] compare_kubectl: statefulset/monitoring-pxc OK
-----------------------------------------------------------------------------------
compare statefulset/monitoring-haproxy-
-----------------------------------------------------------------------------------
[2026-01-22T03:01:54+0000] compare_kubectl: statefulset/monitoring-haproxy OK
-----------------------------------------------------------------------------------
verify clients agents statuses
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
check mysql metrics
-----------------------------------------------------------------------------------
"1769044836"
"1769044836"
"155"
"215"
-----------------------------------------------------------------------------------
check haproxy metrics
-----------------------------------------------------------------------------------
"0"
"0"
"1"
"1"
-----------------------------------------------------------------------------------
check QAN data
-----------------------------------------------------------------------------------
null
perconaxtradbcluster.pxc.percona.com/monitoring patched
waiting for pod/monitoring-pxc-0 to be deleted...................Error from server (NotFound): pods "monitoring-pxc-0" not found
release "monitoring" uninstalled
-----------------------------------------------------------------------------------
destroy cluster/operator and all other resources
-----------------------------------------------------------------------------------
+ kubectl patch pxc -n monitoring-2-0-20869 monitoring --type=merge -p '{"metadata":{"finalizers":[]}}'
perconaxtradbcluster.pxc.percona.com/monitoring patched (no change)
perconaxtradbcluster.pxc.percona.com "monitoring" deleted from monitoring-2-0-20869 namespace
No resources found
No resources found
validatingwebhookconfiguration.admissionregistration.k8s.io "percona-xtradbcluster-webhook" deleted
-----------------------------------------------------------------------------------
test passed
-----------------------------------------------------------------------------------
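For completeness, the "destroy cluster/operator and all other resources" step above roughly corresponds to the sketch below; only the resource names visible in the log are real, the rest is an assumption.

# Hypothetical teardown sketch (bash).
kubectl patch pxc -n monitoring-2-0-20869 monitoring --type=merge -p '{"metadata":{"finalizers":[]}}'
kubectl delete pxc -n monitoring-2-0-20869 monitoring
kubectl delete pxc-backup,pxc-restore --all -n monitoring-2-0-20869   # "No resources found" when empty
kubectl delete validatingwebhookconfiguration percona-xtradbcluster-webhook
kubectl delete namespace monitoring-2-0-20869 pxc-operator --wait=false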