Log: /mnt/jenkins/workspace/cloud-pxc-operator_PR-2429/e2e-tests/logs/haproxy-5-7.log
Warning: version difference between client (1.35) and server (1.32) exceeds the supported minor version skew of +/-1
Warning: version difference between client (1.35) and server (1.32) exceeds the supported minor version skew of +/-1
No resources found
+ kubectl patch pxc -n sh --type=merge -p '{"metadata":{"finalizers":[]}}'
error: resource(s) were provided, but no name was specified
No resources found
No resources found
No resources found
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
-----------------------------------------------------------------------------------
cleaned up all old namespaces
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
cleaned up old namespaces pxc-operator
-----------------------------------------------------------------------------------
Error from server (NotFound): namespaces "pxc-operator" not found
waiting for namespace/pxc-operator to be deleted
error: resource(s) were provided, but no name was specified
Error from server (NotFound): namespaces "pxc-operator" not found
-----------------------------------------------------------------------------------
create namespace pxc-operator
-----------------------------------------------------------------------------------
namespace/pxc-operator created
Context "gke_cloud-dev-112233_us-central1-a_jen-pxc-2429-347d2f0b-1-cluster4" modified.
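The recurring "error: resource(s) were provided, but no name was specified" lines come from patch/delete calls that pass a resource type without a resource name when nothing matches. A minimal sketch of the same finalizer cleanup with the target named explicitly (the namespace value and loop are illustrative, not taken from this run):

# hypothetical cleanup sketch: strip finalizers from each PXC object actually found,
# skipping the patch entirely when the namespace has none
ns="pxc-operator"
for cr in $(kubectl get pxc -n "$ns" -o name 2>/dev/null); do
    kubectl patch "$cr" -n "$ns" --type=merge -p '{"metadata":{"finalizers":[]}}'
done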
-----------------------------------------------------------------------------------
start PXC operator
-----------------------------------------------------------------------------------
customresourcedefinition.apiextensions.k8s.io/perconaxtradbclusterbackups.pxc.percona.com serverside-applied
customresourcedefinition.apiextensions.k8s.io/perconaxtradbclusterrestores.pxc.percona.com serverside-applied
customresourcedefinition.apiextensions.k8s.io/perconaxtradbclusters.pxc.percona.com serverside-applied
clusterrole.rbac.authorization.k8s.io/percona-xtradb-cluster-operator unchanged
serviceaccount/percona-xtradb-cluster-operator created
clusterrolebinding.rbac.authorization.k8s.io/service-account-percona-xtradb-cluster-operator unchanged
deployment.apps/percona-xtradb-cluster-operator created
service/percona-xtradb-cluster-operator created
pod/percona-xtradb-cluster-operator-57d7f6977c-ncq7s condition met
pod/percona-xtradb-cluster-operator-57d7f6977c-ncq7s condition met
waiting for pod/percona-xtradb-cluster-operator-57d7f6977c-ncq7s to become Ready.Ok
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
-----------------------------------------------------------------------------------
cleaned up all old namespaces
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
cleaned up old namespaces haproxy-8631
-----------------------------------------------------------------------------------
Error from server (NotFound): namespaces "haproxy-8631" not found
waiting for namespace/haproxy-8631 to be deleted
error: resource(s) were provided, but no name was specified
Error from server (NotFound): namespaces "haproxy-8631" not found
-----------------------------------------------------------------------------------
create namespace haproxy-8631
-----------------------------------------------------------------------------------
namespace/haproxy-8631 created
Context "gke_cloud-dev-112233_us-central1-a_jen-pxc-2429-347d2f0b-1-cluster4" modified.
-----------------------------------------------------------------------------------
create secrets for cloud storages
-----------------------------------------------------------------------------------
secret/minio-secret created
secret/aws-s3-secret created
secret/do-spaces-secret created
secret/gcp-cs-secret created
secret/azure-secret created
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
-----------------------------------------------------------------------------------
install chaos-mesh
-----------------------------------------------------------------------------------
"chaos-mesh" has been added to your repositories
NAME: chaos-mesh
LAST DEPLOYED: Thu Apr 9 02:33:36 2026
NAMESPACE: haproxy-8631
STATUS: deployed
REVISION: 1
TEST SUITE: None
NOTES:
1. Make sure chaos-mesh components are running
   kubectl get pods --namespace haproxy-8631 -l app.kubernetes.io/instance=chaos-mesh
Waiting for DaemonSet chaos-daemon...
runtimeclass.node.k8s.io/docker-rc created
-----------------------------------------------------------------------------------
create first PXC cluster with HAProxy
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
create first PXC cluster
-----------------------------------------------------------------------------------
secret/my-cluster-secrets created
secret/some-name-ssl created
secret/some-name-ssl-internal created
deployment.apps/pxc-client created
perconaxtradbcluster.pxc.percona.com/haproxy created
-----------------------------------------------------------------------------------
check if all 3 Pods started
-----------------------------------------------------------------------------------
error: no matching resources found
-----------------------------------------------------------------------------------
wait for running cluster
-----------------------------------------------------------------------------------
pod/haproxy-haproxy-0 condition met
waiting for pod/haproxy-haproxy-0 to become Ready.Ok
-----------------------------------------------------------------------------------
wait for running cluster
-----------------------------------------------------------------------------------
pod/haproxy-pxc-0 condition met
waiting for pod/haproxy-pxc-0 to become Ready.Ok
pod/haproxy-pxc-1 condition met
waiting for pod/haproxy-pxc-1 to become Ready.Ok
pod/haproxy-pxc-2 condition met
waiting for pod/haproxy-pxc-2 to become Ready.Ok
-----------------------------------------------------------------------------------
write data
-----------------------------------------------------------------------------------
pod/pxc-client-97cb9c68b-tp5dw condition met
waiting for pod/pxc-client-97cb9c68b-tp5dw to become Ready
Defaulted container "pxc-client" out of: pxc-client, backup
.Ok
pod/pxc-client-97cb9c68b-tp5dw condition met
waiting for pod/pxc-client-97cb9c68b-tp5dw to become Ready
Defaulted container "pxc-client" out of: pxc-client, backup
.Ok
pod/pxc-client-97cb9c68b-tp5dw condition met
waiting for pod/pxc-client-97cb9c68b-tp5dw to become Ready
Defaulted container "pxc-client" out of: pxc-client, backup
.Ok
pod/pxc-client-97cb9c68b-tp5dw condition met
waiting for pod/pxc-client-97cb9c68b-tp5dw to become Ready
Defaulted container "pxc-client" out of: pxc-client, backup
.Ok
pod/pxc-client-97cb9c68b-tp5dw condition met
waiting for pod/pxc-client-97cb9c68b-tp5dw to become Ready
Defaulted container "pxc-client" out of: pxc-client, backup
.Ok
-----------------------------------------------------------------------------------
checking all haproxy pods point to the same writer
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
wait for running cluster
-----------------------------------------------------------------------------------
pod/haproxy-pxc-0 condition met
waiting for pod/haproxy-pxc-0 to become Ready.Ok
pod/haproxy-pxc-1 condition met
waiting for pod/haproxy-pxc-1 to become Ready.Ok
pod/haproxy-pxc-2 condition met
waiting for pod/haproxy-pxc-2 to become Ready.Ok
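The "checking all haproxy pods point to the same writer" step above boils down to asking each HAProxy pod which PXC member it forwards writes to. A rough sketch of that idea, assuming the pxc-client deployment shown earlier and a placeholder root password; the loop and variable names are illustrative, not the test's actual helper:

# sketch (assumed names/credentials): query the writer hostname through every HAProxy pod
for pod in haproxy-haproxy-0 haproxy-haproxy-1 haproxy-haproxy-2; do
    ip=$(kubectl -n haproxy-8631 get pod "$pod" -o jsonpath='{.status.podIP}')
    kubectl -n haproxy-8631 exec deploy/pxc-client -- \
        mysql -h "$ip" -P3306 -uroot -p"$ROOT_PASSWORD" -NBe 'SELECT @@hostname'
done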
-----------------------------------------------------------------------------------
wait cluster consistency
-----------------------------------------------------------------------------------
waiting for pxc/haproxy to be ready
pod/pxc-client-97cb9c68b-tp5dw condition met
waiting for pod/pxc-client-97cb9c68b-tp5dw to become Ready
Defaulted container "pxc-client" out of: pxc-client, backup
.Ok
pod/pxc-client-97cb9c68b-tp5dw condition met
waiting for pod/pxc-client-97cb9c68b-tp5dw to become Ready
Defaulted container "pxc-client" out of: pxc-client, backup
.Ok
pod/pxc-client-97cb9c68b-tp5dw condition met
waiting for pod/pxc-client-97cb9c68b-tp5dw to become Ready
Defaulted container "pxc-client" out of: pxc-client, backup
.Ok
-----------------------------------------------------------------------------------
delete active writer and checking all haproxy pods still point to the same writer
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
fail pxc-pod-0 pod for 60s
-----------------------------------------------------------------------------------
podchaos.chaos-mesh.org/chaos-pod-failure created
NAME                                        READY   STATUS             RESTARTS     AGE
chaos-controller-manager-647477cc95-9np4k   1/1     Running            0            7m21s
chaos-controller-manager-647477cc95-9w7lj   1/1     Running            0            7m21s
chaos-controller-manager-647477cc95-nthbx   1/1     Running            0            7m22s
chaos-daemon-2z74d                          1/1     Running            0            7m22s
chaos-daemon-cj7vb                          1/1     Running            0            7m22s
chaos-daemon-gtffq                          1/1     Running            0            7m21s
chaos-dns-server-6648bb7956-x9vrq           1/1     Running            0            7m22s
haproxy-haproxy-0                           3/3     Running            0            6m59s
haproxy-haproxy-1                           3/3     Running            0            5m29s
haproxy-haproxy-2                           3/3     Running            0            5m9s
haproxy-pxc-0                               0/1     CrashLoopBackOff   2 (6s ago)   6m59s
haproxy-pxc-1                               1/1     Running            0            5m34s
haproxy-pxc-2                               1/1     Running            0            4m19s
pxc-client-97cb9c68b-tp5dw                  2/2     Running            0            7m3s
pod/pxc-client-97cb9c68b-tp5dw condition met
waiting for pod/pxc-client-97cb9c68b-tp5dw to become Ready
Defaulted container "pxc-client" out of: pxc-client, backup
.Ok
pod/pxc-client-97cb9c68b-tp5dw condition met
waiting for pod/pxc-client-97cb9c68b-tp5dw to become Ready
Defaulted container "pxc-client" out of: pxc-client, backup
.Ok
pod/pxc-client-97cb9c68b-tp5dw condition met
waiting for pod/pxc-client-97cb9c68b-tp5dw to become Ready
Defaulted container "pxc-client" out of: pxc-client, backup
.Ok
-----------------------------------------------------------------------------------
wait for running cluster
-----------------------------------------------------------------------------------
pod/haproxy-pxc-0 condition met
waiting for pod/haproxy-pxc-0 to become Ready.Ok
pod/haproxy-pxc-1 condition met
waiting for pod/haproxy-pxc-1 to become Ready.Ok
pod/haproxy-pxc-2 condition met
waiting for pod/haproxy-pxc-2 to become Ready.Ok
-----------------------------------------------------------------------------------
check advanced options are enabled in haproxy statefulset
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
compare pdb/haproxy-haproxy-
-----------------------------------------------------------------------------------
[2026-04-09T02:42:37+0000] compare_kubectl: pdb/haproxy-haproxy OK
-----------------------------------------------------------------------------------
compare statefulset/haproxy-haproxy-
-----------------------------------------------------------------------------------
[2026-04-09T02:42:39+0000] compare_kubectl: statefulset/haproxy-haproxy OK
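The "fail pxc-pod-0 pod for 60s" step above is driven by the podchaos.chaos-mesh.org/chaos-pod-failure object. A sketch of what such a pod-failure PodChaos can look like; the selector values below are illustrative, not a copy of the test's manifest:

# illustrative PodChaos: take one selected pod down for 60 seconds
cat <<'EOF' | kubectl -n haproxy-8631 apply -f -
apiVersion: chaos-mesh.org/v1alpha1
kind: PodChaos
metadata:
  name: chaos-pod-failure
spec:
  action: pod-failure
  mode: one
  duration: "60s"
  selector:
    pods:
      haproxy-8631:
        - haproxy-pxc-0
EOF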
-----------------------------------------------------------------------------------
default haproxy-replicas service
-----------------------------------------------------------------------------------
The request is invalid: the server rejected our request due to an error in our request
-----------------------------------------------------------------------------------
compare service/haproxy-haproxy-replicas-
-----------------------------------------------------------------------------------
[2026-04-09T02:42:41+0000] compare_kubectl: service/haproxy-haproxy-replicas OK
-----------------------------------------------------------------------------------
disable haproxy-replicas service
-----------------------------------------------------------------------------------
waiting for svc/haproxy-haproxy-replicas to be deleted
Error from server (NotFound): services "haproxy-haproxy-replicas" not found
-----------------------------------------------------------------------------------
enable haproxy-replicas service
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
compare service/haproxy-haproxy-replicas-
-----------------------------------------------------------------------------------
[2026-04-09T02:43:23+0000] compare_kubectl: service/haproxy-haproxy-replicas OK
-----------------------------------------------------------------------------------
wait for running cluster
-----------------------------------------------------------------------------------
pod/haproxy-haproxy-0 condition met
waiting for pod/haproxy-haproxy-0 to become Ready.Ok
pod/haproxy-haproxy-1 condition met
waiting for pod/haproxy-haproxy-1 to become Ready.Ok
pod/haproxy-haproxy-2 condition met
waiting for pod/haproxy-haproxy-2 to become Ready.Ok
Unable to use a TTY - input is not a terminal or the right kind of file
secret/haproxy-haproxy created
-----------------------------------------------------------------------------------
wait cluster consistency
-----------------------------------------------------------------------------------
waiting for pxc/haproxy to be ready..........
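The "waiting for svc/haproxy-haproxy-replicas to be deleted" message above corresponds to a poll-until-gone loop. A minimal sketch of that pattern (the sleep interval is arbitrary); kubectl wait --for=delete is an alternative when the object still exists at the time of the call:

# sketch: poll until the replicas Service is gone
while kubectl -n haproxy-8631 get svc haproxy-haproxy-replicas >/dev/null 2>&1; do
    echo -n .
    sleep 1
done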
-----------------------------------------------------------------------------------
compare statefulset/haproxy-haproxy--secret
-----------------------------------------------------------------------------------
[2026-04-09T02:44:58+0000] compare_kubectl: statefulset/haproxy-haproxy OK
Unable to use a TTY - input is not a terminal or the right kind of file
-----------------------------------------------------------------------------------
clean up
-----------------------------------------------------------------------------------
release "chaos-mesh" uninstalled
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
No resources found
No resources found
No resources found
No resources found
No resources found
No resources found
No resources found
No resources found
No resources found
No resources found
No resources found
No resources found
podchaos.chaos-mesh.org/chaos-pod-failure patched
podchaos.chaos-mesh.org "chaos-pod-failure" deleted from haproxy-8631 namespace
No resources found
No resources found
No resources found
No resources found
No resources found
No resources found
No resources found
No resources found
No resources found
No resources found
customresourcedefinition.apiextensions.k8s.io "awschaos.chaos-mesh.org" deleted
customresourcedefinition.apiextensions.k8s.io "azurechaos.chaos-mesh.org" deleted
customresourcedefinition.apiextensions.k8s.io "blockchaos.chaos-mesh.org" deleted
customresourcedefinition.apiextensions.k8s.io "dnschaos.chaos-mesh.org" deleted
customresourcedefinition.apiextensions.k8s.io "gcpchaos.chaos-mesh.org" deleted
customresourcedefinition.apiextensions.k8s.io "httpchaos.chaos-mesh.org" deleted
customresourcedefinition.apiextensions.k8s.io "iochaos.chaos-mesh.org" deleted
customresourcedefinition.apiextensions.k8s.io "jvmchaos.chaos-mesh.org" deleted
customresourcedefinition.apiextensions.k8s.io "kernelchaos.chaos-mesh.org" deleted
customresourcedefinition.apiextensions.k8s.io "networkchaos.chaos-mesh.org" deleted
customresourcedefinition.apiextensions.k8s.io "physicalmachinechaos.chaos-mesh.org" deleted
customresourcedefinition.apiextensions.k8s.io "physicalmachines.chaos-mesh.org" deleted
customresourcedefinition.apiextensions.k8s.io "podchaos.chaos-mesh.org" deleted
customresourcedefinition.apiextensions.k8s.io "podhttpchaos.chaos-mesh.org" deleted
customresourcedefinition.apiextensions.k8s.io "podiochaos.chaos-mesh.org" deleted
customresourcedefinition.apiextensions.k8s.io "podnetworkchaos.chaos-mesh.org" deleted
customresourcedefinition.apiextensions.k8s.io "remoteclusters.chaos-mesh.org" deleted
customresourcedefinition.apiextensions.k8s.io "schedules.chaos-mesh.org" deleted
customresourcedefinition.apiextensions.k8s.io "statuschecks.chaos-mesh.org" deleted
customresourcedefinition.apiextensions.k8s.io "stresschaos.chaos-mesh.org" deleted
customresourcedefinition.apiextensions.k8s.io "timechaos.chaos-mesh.org" deleted
customresourcedefinition.apiextensions.k8s.io "workflownodes.chaos-mesh.org" deleted
customresourcedefinition.apiextensions.k8s.io "workflows.chaos-mesh.org" deleted
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
-----------------------------------------------------------------------------------
destroy cluster/operator and all other resources
-----------------------------------------------------------------------------------
+ kubectl patch pxc -n haproxy-8631 haproxy --type=merge -p '{"metadata":{"finalizers":[]}}'
perconaxtradbcluster.pxc.percona.com/haproxy patched
perconaxtradbcluster.pxc.percona.com "haproxy" deleted from haproxy-8631 namespace
No resources found
No resources found
validatingwebhookconfiguration.admissionregistration.k8s.io "percona-xtradbcluster-webhook" deleted
-----------------------------------------------------------------------------------
test passed
-----------------------------------------------------------------------------------
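For reference, the destroy sequence recorded above reduces to removing the custom resource's finalizers, deleting it, and dropping the operator's webhook configuration. A sketch using the names from this run:

# teardown sketch matching the log above
kubectl -n haproxy-8631 patch pxc haproxy --type=merge -p '{"metadata":{"finalizers":[]}}'
kubectl -n haproxy-8631 delete pxc haproxy
kubectl delete validatingwebhookconfiguration percona-xtradbcluster-webhook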