Log: /mnt/jenkins/workspace/cloud-pxc-operator_PR-2476/e2e-tests/logs/haproxy-8-4.log
Warning: version difference between client (1.36) and server (1.33) exceeds the supported minor version skew of +/-1
Warning: version difference between client (1.36) and server (1.33) exceeds the supported minor version skew of +/-1
No resources found
+ kubectl patch pxc -n sh --type=merge -p '{"metadata":{"finalizers":[]}}'
error: resource(s) were provided, but no name was specified
No resources found
No resources found
No resources found
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
-----------------------------------------------------------------------------------
cleaned up all old namespaces
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
cleaned up old namespaces pxc-operator
-----------------------------------------------------------------------------------
Error from server (NotFound): namespaces "pxc-operator" not found
waiting for namespace/pxc-operator to be deleted
error: resource(s) were provided, but no name was specified
Error from server (NotFound): namespaces "pxc-operator" not found
-----------------------------------------------------------------------------------
create namespace pxc-operator
-----------------------------------------------------------------------------------
namespace/pxc-operator created
Context "gke_cloud-dev-112233_us-central1-a_jen-pxc-2476-a8b01a39-5-cluster7" modified.
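The repeated "error: resource(s) were provided, but no name was specified" lines come from kubectl patch/delete calls issued without a resource name once nothing is left to clean up; the traced "+ kubectl patch pxc ..." above is the same call with an empty name list. A minimal sketch of the cleanup pattern this suggests, with the namespace and cluster name as illustrative placeholders:

# Strip finalizers from a leftover PXC resource so deletion cannot hang on them,
# then remove the namespace and poll until it is actually gone.
kubectl patch pxc -n "$NAMESPACE" "$CLUSTER" --type=merge -p '{"metadata":{"finalizers":[]}}'
kubectl delete namespace "$NAMESPACE" --ignore-not-found
until ! kubectl get namespace "$NAMESPACE" >/dev/null 2>&1; do
  echo -n .
  sleep 1
done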
-----------------------------------------------------------------------------------
start PXC operator
-----------------------------------------------------------------------------------
customresourcedefinition.apiextensions.k8s.io/perconaxtradbclusterbackups.pxc.percona.com serverside-applied
customresourcedefinition.apiextensions.k8s.io/perconaxtradbclusterrestores.pxc.percona.com serverside-applied
customresourcedefinition.apiextensions.k8s.io/perconaxtradbclusters.pxc.percona.com serverside-applied
clusterrole.rbac.authorization.k8s.io/percona-xtradb-cluster-operator unchanged
serviceaccount/percona-xtradb-cluster-operator created
clusterrolebinding.rbac.authorization.k8s.io/service-account-percona-xtradb-cluster-operator unchanged
deployment.apps/percona-xtradb-cluster-operator created
service/percona-xtradb-cluster-operator created
pod/percona-xtradb-cluster-operator-8548fd5788-mt5hq condition met
E0516 19:10:36.188449 31106 reflector.go:227] "Failed to watch" err="Get \"https://35.254.24.141/api/v1/namespaces/pxc-operator/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dpercona-xtradb-cluster-operator-8548fd5788-mt5hq&resourceVersion=1778958635825019000&timeoutSeconds=400&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
pod/percona-xtradb-cluster-operator-8548fd5788-mt5hq condition met
E0516 19:10:41.648918 31976 reflector.go:227] "Failed to watch" err="Get \"https://35.254.24.141/api/v1/namespaces/pxc-operator/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dpercona-xtradb-cluster-operator-8548fd5788-mt5hq&resourceVersion=1778958639417823000&timeoutSeconds=558&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
waiting for pod/percona-xtradb-cluster-operator-8548fd5788-mt5hq to become Ready.Ok
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
-----------------------------------------------------------------------------------
cleaned up all old namespaces
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
cleaned up old namespaces haproxy-6119
-----------------------------------------------------------------------------------
Error from server (NotFound): namespaces "haproxy-6119" not found
waiting for namespace/haproxy-6119 to be deleted
error: resource(s) were provided, but no name was specified
Error from server (NotFound): namespaces "haproxy-6119" not found
-----------------------------------------------------------------------------------
create namespace haproxy-6119
-----------------------------------------------------------------------------------
namespace/haproxy-6119 created
Context "gke_cloud-dev-112233_us-central1-a_jen-pxc-2476-a8b01a39-5-cluster7" modified.
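The "serverside-applied" results show the CRDs going in via server-side apply, and each "condition met" line paired with an "E... Failed to watch ... context canceled" record is consistent with kubectl wait tearing down its watch once the condition is satisfied; the E-lines are shutdown noise, not failures. A hedged sketch of the deploy-and-wait step (bundle path and timeout are illustrative):

# Apply the operator bundle server-side, then wait for the operator pod by name.
kubectl -n pxc-operator apply --server-side --force-conflicts -f deploy/bundle.yaml
kubectl -n pxc-operator wait --for=condition=Ready \
  pod/percona-xtradb-cluster-operator-8548fd5788-mt5hq --timeout=300s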
-----------------------------------------------------------------------------------
create secrets for cloud storages
-----------------------------------------------------------------------------------
secret/minio-secret created
secret/aws-s3-secret created
secret/do-spaces-secret created
secret/gcp-cs-secret created
secret/azure-secret created
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
-----------------------------------------------------------------------------------
install chaos-mesh
-----------------------------------------------------------------------------------
"chaos-mesh" already exists with the same configuration, skipping
NAME: chaos-mesh
LAST DEPLOYED: Sat May 16 19:11:57 2026
NAMESPACE: haproxy-6119
STATUS: deployed
REVISION: 1
TEST SUITE: None
NOTES:
1. Make sure chaos-mesh components are running
   kubectl get pods --namespace haproxy-6119 -l app.kubernetes.io/instance=chaos-mesh
Waiting for DaemonSet chaos-daemon...
runtimeclass.node.k8s.io/docker-rc created
-----------------------------------------------------------------------------------
create first PXC cluster with HAProxy
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
create first PXC cluster
-----------------------------------------------------------------------------------
secret/my-cluster-secrets created
secret/some-name-ssl created
secret/some-name-ssl-internal created
deployment.apps/pxc-client created
perconaxtradbcluster.pxc.percona.com/haproxy created
-----------------------------------------------------------------------------------
check if all 3 Pods started
-----------------------------------------------------------------------------------
error: no matching resources found
-----------------------------------------------------------------------------------
wait for running cluster
-----------------------------------------------------------------------------------
pod/haproxy-haproxy-0 condition met
waiting for pod/haproxy-haproxy-0 to become Ready.Ok
-----------------------------------------------------------------------------------
wait for running cluster
-----------------------------------------------------------------------------------
pod/haproxy-pxc-0 condition met
E0516 19:13:35.435909 24469 reflector.go:227] "Failed to watch" err="Get \"https://35.254.24.141/api/v1/namespaces/haproxy-6119/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dhaproxy-pxc-0&resourceVersion=1778958815140323000&timeoutSeconds=577&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
waiting for pod/haproxy-pxc-0 to become Ready.Ok
pod/haproxy-pxc-1 condition met
waiting for pod/haproxy-pxc-1 to become Ready.Ok
pod/haproxy-pxc-2 condition met
waiting for pod/haproxy-pxc-2 to become Ready.Ok
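The "error: no matching resources found" under "check if all 3 Pods started" is what a label-selector wait prints when it fires before any matching pod exists; the per-pod waits that follow succeed as the StatefulSet brings the pods up. A sketch of that second phase (loop and timeout are illustrative):

# Wait for each PXC pod by name once the StatefulSet has created it.
for pod in haproxy-pxc-0 haproxy-pxc-1 haproxy-pxc-2; do
  kubectl -n haproxy-6119 wait --for=condition=Ready "pod/$pod" --timeout=600s
done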
\"https://35.254.24.141/api/v1/namespaces/haproxy-6119/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dpxc-client-56fd5498cd-dvkh7&resourceVersion=1778958980652218000&timeoutSeconds=399&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured" waiting for pod/pxc-client-56fd5498cd-dvkh7 to become ReadyDefaulted container "pxc-client" out of: pxc-client, backup .Ok pod/pxc-client-56fd5498cd-dvkh7 condition met E0516 19:16:35.644716 19162 reflector.go:227] "Failed to watch" err="Get \"https://35.254.24.141/api/v1/namespaces/haproxy-6119/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dpxc-client-56fd5498cd-dvkh7&resourceVersion=1778958993660624000&timeoutSeconds=506&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured" waiting for pod/pxc-client-56fd5498cd-dvkh7 to become ReadyDefaulted container "pxc-client" out of: pxc-client, backup .Ok pod/pxc-client-56fd5498cd-dvkh7 condition met E0516 19:17:19.819721 25373 reflector.go:227] "Failed to watch" err="Get \"https://35.254.24.141/api/v1/namespaces/haproxy-6119/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dpxc-client-56fd5498cd-dvkh7&resourceVersion=1778959036503798000&timeoutSeconds=346&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured" waiting for pod/pxc-client-56fd5498cd-dvkh7 to become ReadyDefaulted container "pxc-client" out of: pxc-client, backup .Ok pod/pxc-client-56fd5498cd-dvkh7 condition met E0516 19:17:36.684328 27776 reflector.go:227] "Failed to watch" err="Get \"https://35.254.24.141/api/v1/namespaces/haproxy-6119/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dpxc-client-56fd5498cd-dvkh7&resourceVersion=1778959053698495000&timeoutSeconds=327&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured" waiting for pod/pxc-client-56fd5498cd-dvkh7 to become ReadyDefaulted container "pxc-client" out of: pxc-client, backup .Ok pod/pxc-client-56fd5498cd-dvkh7 condition met E0516 19:17:51.447635 30157 reflector.go:227] "Failed to watch" err="Get \"https://35.254.24.141/api/v1/namespaces/haproxy-6119/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dpxc-client-56fd5498cd-dvkh7&resourceVersion=1778959070454881000&timeoutSeconds=446&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured" waiting for pod/pxc-client-56fd5498cd-dvkh7 to become ReadyDefaulted container "pxc-client" out of: pxc-client, backup .Ok ----------------------------------------------------------------------------------- checking all haproxy pods point to the same writer ----------------------------------------------------------------------------------- ----------------------------------------------------------------------------------- wait for running cluster ----------------------------------------------------------------------------------- pod/haproxy-pxc-0 condition met E0516 19:18:06.636821 32068 reflector.go:227] "Failed to watch" err="Get \"https://35.254.24.141/api/v1/namespaces/haproxy-6119/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dhaproxy-pxc-0&resourceVersion=1778959085455145000&timeoutSeconds=553&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" 
type="*unstructured.Unstructured" waiting for pod/haproxy-pxc-0 to become Ready.Ok pod/haproxy-pxc-1 condition met E0516 19:18:16.610706 977 reflector.go:227] "Failed to watch" err="Get \"https://35.254.24.141/api/v1/namespaces/haproxy-6119/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dhaproxy-pxc-1&resourceVersion=1778959095455844000&timeoutSeconds=394&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured" waiting for pod/haproxy-pxc-1 to become Ready.Ok pod/haproxy-pxc-2 condition met E0516 19:18:26.991111 2520 reflector.go:227] "Failed to watch" err="Get \"https://35.254.24.141/api/v1/namespaces/haproxy-6119/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dhaproxy-pxc-2&resourceVersion=1778959105455923000&timeoutSeconds=449&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured" waiting for pod/haproxy-pxc-2 to become Ready.Ok ----------------------------------------------------------------------------------- wait cluster consistency ----------------------------------------------------------------------------------- waiting for pxc/haproxy to be ready pod/pxc-client-56fd5498cd-dvkh7 condition met E0516 19:19:02.086340 7514 reflector.go:227] "Failed to watch" err="Get \"https://35.254.24.141/api/v1/namespaces/haproxy-6119/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dpxc-client-56fd5498cd-dvkh7&resourceVersion=1778959139437986000&timeoutSeconds=503&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured" waiting for pod/pxc-client-56fd5498cd-dvkh7 to become ReadyDefaulted container "pxc-client" out of: pxc-client, backup .Ok pod/pxc-client-56fd5498cd-dvkh7 condition met E0516 19:19:18.550628 9808 reflector.go:227] "Failed to watch" err="Get \"https://35.254.24.141/api/v1/namespaces/haproxy-6119/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dpxc-client-56fd5498cd-dvkh7&resourceVersion=1778959155930305000&timeoutSeconds=340&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured" waiting for pod/pxc-client-56fd5498cd-dvkh7 to become ReadyDefaulted container "pxc-client" out of: pxc-client, backup .Ok pod/pxc-client-56fd5498cd-dvkh7 condition met E0516 19:19:35.949217 12326 reflector.go:227] "Failed to watch" err="Get \"https://35.254.24.141/api/v1/namespaces/haproxy-6119/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dpxc-client-56fd5498cd-dvkh7&resourceVersion=1778959172858459000&timeoutSeconds=546&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured" waiting for pod/pxc-client-56fd5498cd-dvkh7 to become ReadyDefaulted container "pxc-client" out of: pxc-client, backup .Ok ----------------------------------------------------------------------------------- delete active writer and checking all haproxy pods still point to the same writer ----------------------------------------------------------------------------------- ----------------------------------------------------------------------------------- fail pxc-pod-0 pod for 60s ----------------------------------------------------------------------------------- podchaos.chaos-mesh.org/chaos-pod-failure created NAME READY STATUS RESTARTS AGE chaos-controller-manager-6b9897c8df-4t24b 1/1 Running 0 7m58s 
-----------------------------------------------------------------------------------
delete active writer and checking all haproxy pods still point to the same writer
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
fail pxc-pod-0 pod for 60s
-----------------------------------------------------------------------------------
podchaos.chaos-mesh.org/chaos-pod-failure created
NAME                                        READY   STATUS             RESTARTS     AGE
chaos-controller-manager-6b9897c8df-4t24b   1/1     Running            0            7m58s
chaos-controller-manager-6b9897c8df-9vd78   1/1     Running            0            7m58s
chaos-controller-manager-6b9897c8df-nlfq5   1/1     Running            0            7m58s
chaos-daemon-8cdcb                          1/1     Running            0            7m58s
chaos-daemon-qpjdm                          1/1     Running            0            7m58s
chaos-daemon-rl6xs                          1/1     Running            0            7m58s
chaos-dns-server-6648bb7956-22sqv           1/1     Running            0            7m58s
haproxy-haproxy-0                           3/3     Running            0            7m32s
haproxy-haproxy-1                           3/3     Running            0            6m41s
haproxy-haproxy-2                           3/3     Running            0            6m21s
haproxy-pxc-0                               0/1     CrashLoopBackOff   2 (8s ago)   7m32s
haproxy-pxc-1                               1/1     Running            0            6m45s
haproxy-pxc-2                               1/1     Running            0            5m28s
pxc-client-56fd5498cd-dvkh7                 2/2     Running            0            7m37s
pod/pxc-client-56fd5498cd-dvkh7 condition met
E0516 19:20:14.251729 17754 reflector.go:227] "Failed to watch" err="Get \"https://35.254.24.141/api/v1/namespaces/haproxy-6119/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dpxc-client-56fd5498cd-dvkh7&resourceVersion=1778959212063562000&timeoutSeconds=461&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
waiting for pod/pxc-client-56fd5498cd-dvkh7 to become Ready
Defaulted container "pxc-client" out of: pxc-client, backup
.Ok
pod/pxc-client-56fd5498cd-dvkh7 condition met
E0516 19:20:29.452556 19921 reflector.go:227] "Failed to watch" err="Get \"https://35.254.24.141/api/v1/namespaces/haproxy-6119/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dpxc-client-56fd5498cd-dvkh7&resourceVersion=1778959227119354000&timeoutSeconds=533&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
waiting for pod/pxc-client-56fd5498cd-dvkh7 to become Ready
Defaulted container "pxc-client" out of: pxc-client, backup
.Ok
pod/pxc-client-56fd5498cd-dvkh7 condition met
E0516 19:20:45.367211 22222 reflector.go:227] "Failed to watch" err="Get \"https://35.254.24.141/api/v1/namespaces/haproxy-6119/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dpxc-client-56fd5498cd-dvkh7&resourceVersion=1778959243129252000&timeoutSeconds=586&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
waiting for pod/pxc-client-56fd5498cd-dvkh7 to become Ready
Defaulted container "pxc-client" out of: pxc-client, backup
.Ok
-----------------------------------------------------------------------------------
wait for running cluster
-----------------------------------------------------------------------------------
pod/haproxy-pxc-0 condition met
waiting for pod/haproxy-pxc-0 to become Ready.Ok
pod/haproxy-pxc-1 condition met
E0516 19:21:24.876277 27167 reflector.go:227] "Failed to watch" err="Get \"https://35.254.24.141/api/v1/namespaces/haproxy-6119/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dhaproxy-pxc-1&resourceVersion=1778959283010114000&timeoutSeconds=552&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
waiting for pod/haproxy-pxc-1 to become Ready.Ok
pod/haproxy-pxc-2 condition met
E0516 19:21:32.132829 28295 reflector.go:227] "Failed to watch" err="Get \"https://35.254.24.141/api/v1/namespaces/haproxy-6119/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dhaproxy-pxc-2&resourceVersion=1778959288010159000&timeoutSeconds=363&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
waiting for pod/haproxy-pxc-2 to become Ready.Ok
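The "fail pxc-pod-0 pod for 60s" step created podchaos.chaos-mesh.org/chaos-pod-failure, and the CrashLoopBackOff on haproxy-pxc-0 in the pod listing above is the intended effect of that experiment. A hedged sketch of such an object using the chaos-mesh v1alpha1 PodChaos API (selector values illustrative):

cat <<EOF | kubectl apply -f -
apiVersion: chaos-mesh.org/v1alpha1
kind: PodChaos
metadata:
  name: chaos-pod-failure
  namespace: haproxy-6119
spec:
  action: pod-failure
  mode: one
  duration: "60s"
  selector:
    pods:
      haproxy-6119:
        - haproxy-pxc-0
EOF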
-----------------------------------------------------------------------------------
check advanced options are enabled in haproxy statefulset
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
compare pdb/haproxy-haproxy-
-----------------------------------------------------------------------------------
[2026-05-16T19:21:40+0000] compare_kubectl: pdb/haproxy-haproxy OK
-----------------------------------------------------------------------------------
compare statefulset/haproxy-haproxy-
-----------------------------------------------------------------------------------
[2026-05-16T19:21:43+0000] compare_kubectl: statefulset/haproxy-haproxy OK
-----------------------------------------------------------------------------------
default haproxy-replicas service
-----------------------------------------------------------------------------------
The request is invalid: the server rejected our request due to an error in our request
-----------------------------------------------------------------------------------
compare service/haproxy-haproxy-replicas-
-----------------------------------------------------------------------------------
[2026-05-16T19:21:48+0000] compare_kubectl: service/haproxy-haproxy-replicas OK
-----------------------------------------------------------------------------------
disable haproxy-replicas service
-----------------------------------------------------------------------------------
waiting for svc/haproxy-haproxy-replicas to be deleted
Error from server (NotFound): services "haproxy-haproxy-replicas" not found
-----------------------------------------------------------------------------------
enable haproxy-replicas service
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
compare service/haproxy-haproxy-replicas-
-----------------------------------------------------------------------------------
[2026-05-16T19:22:29+0000] compare_kubectl: service/haproxy-haproxy-replicas OK
-----------------------------------------------------------------------------------
wait for running cluster
-----------------------------------------------------------------------------------
pod/haproxy-haproxy-0 condition met
E0516 19:22:31.204250 3720 reflector.go:227] "Failed to watch" err="Get \"https://35.254.24.141/api/v1/namespaces/haproxy-6119/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dhaproxy-haproxy-0&resourceVersion=1778959348010652000&timeoutSeconds=476&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
waiting for pod/haproxy-haproxy-0 to become Ready.Ok
pod/haproxy-haproxy-1 condition met
E0516 19:22:39.565604 4961 reflector.go:227] "Failed to watch" err="Get \"https://35.254.24.141/api/v1/namespaces/haproxy-6119/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dhaproxy-haproxy-1&resourceVersion=1778959358010684000&timeoutSeconds=562&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
waiting for pod/haproxy-haproxy-1 to become Ready.Ok
pod/haproxy-haproxy-2 condition met
E0516 19:22:47.315872 6138 reflector.go:227] "Failed to watch" err="Get \"https://35.254.24.141/api/v1/namespaces/haproxy-6119/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dhaproxy-haproxy-2&resourceVersion=1778959363010708000&timeoutSeconds=414&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured"
waiting for pod/haproxy-haproxy-2 to become Ready.Ok
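The disable/enable steps above toggle the haproxy-replicas Service through the custom resource and then wait for it to disappear or reappear. A sketch of the disable path; the exact spec field (haproxy.exposeReplicas.enabled here) is an assumption about this operator version, while the delete-wait loop mirrors the "waiting for svc/... to be deleted" output:

# Turn the replicas service off via the CR (field name assumed), then wait it out.
kubectl -n haproxy-6119 patch pxc haproxy --type=merge \
  -p '{"spec":{"haproxy":{"exposeReplicas":{"enabled":false}}}}'
while kubectl -n haproxy-6119 get svc haproxy-haproxy-replicas >/dev/null 2>&1; do
  echo -n .
  sleep 1
done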
\"https://35.254.24.141/api/v1/namespaces/haproxy-6119/pods?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dhaproxy-haproxy-2&resourceVersion=1778959363010708000&timeoutSeconds=414&watch=true\": context canceled" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" type="*unstructured.Unstructured" waiting for pod/haproxy-haproxy-2 to become Ready.Ok Unable to use a TTY - input is not a terminal or the right kind of file secret/haproxy-haproxy created ----------------------------------------------------------------------------------- wait cluster consistency ----------------------------------------------------------------------------------- waiting for pxc/haproxy to be ready.......... ----------------------------------------------------------------------------------- compare statefulset/haproxy-haproxy--secret ----------------------------------------------------------------------------------- [2026-05-16T19:24:11+0000] compare_kubectl: statefulset/haproxy-haproxy OK Unable to use a TTY - input is not a terminal or the right kind of file ----------------------------------------------------------------------------------- clean up ----------------------------------------------------------------------------------- release "chaos-mesh" uninstalled error: resource(s) were provided, but no name was specified error: resource(s) were provided, but no name was specified error: resource(s) were provided, but no name was specified No resources found No resources found No resources found No resources found No resources found No resources found No resources found No resources found No resources found No resources found No resources found No resources found podchaos.chaos-mesh.org/chaos-pod-failure patched podchaos.chaos-mesh.org "chaos-pod-failure" deleted from haproxy-6119 namespace No resources found No resources found No resources found No resources found No resources found No resources found No resources found No resources found No resources found No resources found customresourcedefinition.apiextensions.k8s.io "awschaos.chaos-mesh.org" deleted customresourcedefinition.apiextensions.k8s.io "azurechaos.chaos-mesh.org" deleted customresourcedefinition.apiextensions.k8s.io "blockchaos.chaos-mesh.org" deleted customresourcedefinition.apiextensions.k8s.io "dnschaos.chaos-mesh.org" deleted customresourcedefinition.apiextensions.k8s.io "gcpchaos.chaos-mesh.org" deleted customresourcedefinition.apiextensions.k8s.io "httpchaos.chaos-mesh.org" deleted customresourcedefinition.apiextensions.k8s.io "iochaos.chaos-mesh.org" deleted customresourcedefinition.apiextensions.k8s.io "jvmchaos.chaos-mesh.org" deleted customresourcedefinition.apiextensions.k8s.io "kernelchaos.chaos-mesh.org" deleted customresourcedefinition.apiextensions.k8s.io "networkchaos.chaos-mesh.org" deleted customresourcedefinition.apiextensions.k8s.io "physicalmachinechaos.chaos-mesh.org" deleted customresourcedefinition.apiextensions.k8s.io "physicalmachines.chaos-mesh.org" deleted customresourcedefinition.apiextensions.k8s.io "podchaos.chaos-mesh.org" deleted customresourcedefinition.apiextensions.k8s.io "podhttpchaos.chaos-mesh.org" deleted customresourcedefinition.apiextensions.k8s.io "podiochaos.chaos-mesh.org" deleted customresourcedefinition.apiextensions.k8s.io "podnetworkchaos.chaos-mesh.org" deleted customresourcedefinition.apiextensions.k8s.io "remoteclusters.chaos-mesh.org" deleted customresourcedefinition.apiextensions.k8s.io "schedules.chaos-mesh.org" deleted customresourcedefinition.apiextensions.k8s.io 
"statuschecks.chaos-mesh.org" deleted customresourcedefinition.apiextensions.k8s.io "stresschaos.chaos-mesh.org" deleted customresourcedefinition.apiextensions.k8s.io "timechaos.chaos-mesh.org" deleted customresourcedefinition.apiextensions.k8s.io "workflownodes.chaos-mesh.org" deleted customresourcedefinition.apiextensions.k8s.io "workflows.chaos-mesh.org" deleted error: resource(s) were provided, but no name was specified error: resource(s) were provided, but no name was specified ----------------------------------------------------------------------------------- destroy cluster/operator and all other resources ----------------------------------------------------------------------------------- + kubectl patch pxc -n haproxy-6119 haproxy --type=merge -p '{"metadata":{"finalizers":[]}}' perconaxtradbcluster.pxc.percona.com/haproxy patched perconaxtradbcluster.pxc.percona.com "haproxy" deleted from haproxy-6119 namespace No resources found No resources found validatingwebhookconfiguration.admissionregistration.k8s.io "percona-xtradbcluster-webhook" deleted ----------------------------------------------------------------------------------- test passed -----------------------------------------------------------------------------------