Log: /mnt/jenkins/workspace/cloud-pxc-operator_PR-1710/e2e-tests/logs/cross-site-8-0.log
WARNING: version difference between client (1.30) and server (1.26) exceeds the supported minor version skew of +/-1
WARNING: version difference between client (1.30) and server (1.26) exceeds the supported minor version skew of +/-1
-----------------------------------------------------------------------------------
Create source cluster
-----------------------------------------------------------------------------------
E0514 15:41:50.653815 2295 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:41:50.874550 2295 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:41:50.980318 2295 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:41:51.086328 2295 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:41:51.414309 2295 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:41:51.615533 2295 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:41:51.724040 2295 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
error: the server doesn't have a resource type "pxc"
+ kubectl patch pxc -n sh --type=merge -p '{"metadata":{"finalizers":[]}}'
E0514 15:41:52.926002 2692 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:41:53.230359 2692 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:41:53.335198 2692 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:41:53.440248 2692 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:41:53.758783 2692 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:41:53.999261 2692 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:41:54.108176 2692 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
error: the server doesn't have a resource type "pxc"
E0514 15:41:55.516983 3013 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:41:55.629014 3013 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:41:55.735811 3013 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:41:55.844586 3013 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:41:56.174421 3013 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:41:56.382639 3013 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:41:56.490562 3013 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
error: the server doesn't have a resource type "pxc"
E0514 15:41:58.091634 3266 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:41:58.305261 3266 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:41:58.414069 3266 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:41:58.522317 3266 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:41:58.849565 3266 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:41:59.078155 3266 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:41:59.204756 3266 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
error: the server doesn't have a resource type "pxc-backup"
E0514 15:42:00.724307 3654 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:01.043448 3654 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:01.149789 3654 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:01.256117 3654 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:01.580564 3654 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:01.794689 3654 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:01.902850 3654 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
error: the server doesn't have a resource type "pxc-restore"
E0514 15:42:04.432730 4225 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:04.543917 4225 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:04.649811 4225 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:04.755985 4225 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:06.413835 4467 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:06.638380 4467 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:06.745176 4467 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:06.852452 4467 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
error: resource(s) were provided, but no name was specified
E0514 15:42:08.367135 4724 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:08.473434 4724 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:08.579855 4724 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:08.686380 4724 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:10.058469 5062 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:10.316448 5062 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:10.422261 5062 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:10.528131 5062 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
error: resource(s) were provided, but no name was specified
E0514 15:42:11.900192 5333 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:12.492371 5333 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:12.669491 5333 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:12.799312 5333 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:14.366332 5692 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:14.679129 5692 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:14.786719 5692 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:14.894345 5692 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
error: resource(s) were provided, but no name was specified
E0514 15:42:16.183931 5956 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:16.289709 5956 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
error: unable to retrieve the complete list of server APIs: metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:17.951684 6204 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:18.168798 6204 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:18.274888 6204 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:18.381316 6204 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:20.506358 6599 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:20.727528 6599 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:20.833995 6599 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:20.939769 6599 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
error: resource(s) were provided, but no name was specified
E0514 15:42:22.961588 6930 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:23.271032 6930 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:23.378023 6930 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:23.484757 6930 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:25.095076 7299 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:25.318438 7299 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:25.424597 7299 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 15:42:25.530468 7299 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
-----------------------------------------------------------------------------------
cleaned up all old namespaces
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
cleaned up old namespaces pxc-operator
-----------------------------------------------------------------------------------
Error from server (NotFound): namespaces "pxc-operator" not found
namespace/pxc-operator - 
Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
Error from server (NotFound): namespaces "pxc-operator" not found
-----------------------------------------------------------------------------------
create namespace pxc-operator
-----------------------------------------------------------------------------------
namespace/pxc-operator created
Context "gke_cloud-dev-112233_us-central1-a_jen-pxc-1710-fb797906-3-cluster3" modified.
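For reference, the namespace preparation above amounts to a handful of kubectl calls; a minimal sketch, assuming plain kubectl (the e2e helper that actually runs this is not shown in the log, so the exact flags are illustrative):

    kubectl delete namespace pxc-operator --ignore-not-found --wait=false
    kubectl create namespace pxc-operator
    kubectl config set-context --current --namespace=pxc-operator   # prints: Context "..." modified.

The NotFound/Forbidden messages above appear to be expected noise from the bulk cleanup, which also touches namespaces (such as default) that cannot or need not be deleted.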
-----------------------------------------------------------------------------------
start PXC operator
-----------------------------------------------------------------------------------
customresourcedefinition.apiextensions.k8s.io/perconaxtradbclusterbackups.pxc.percona.com serverside-applied
customresourcedefinition.apiextensions.k8s.io/perconaxtradbclusterrestores.pxc.percona.com serverside-applied
customresourcedefinition.apiextensions.k8s.io/perconaxtradbclusters.pxc.percona.com serverside-applied
clusterrole.rbac.authorization.k8s.io/percona-xtradb-cluster-operator created
serviceaccount/percona-xtradb-cluster-operator created
clusterrolebinding.rbac.authorization.k8s.io/service-account-percona-xtradb-cluster-operator created
deployment.apps/percona-xtradb-cluster-operator created
service/percona-xtradb-cluster-operator created
pod/percona-xtradb-cluster-operator-8448ddf579-qbczk condition met
pod/percona-xtradb-cluster-operator-8448ddf579-qbczk condition met
percona-xtradb-cluster-operator-8448ddf579-qbczk.Ok
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
-----------------------------------------------------------------------------------
cleaned up all old namespaces
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
cleaned up old namespaces cross-site-18126
-----------------------------------------------------------------------------------
Error from server (NotFound): namespaces "cross-site-18126" not found
namespace/cross-site-18126 - 
Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
Error from server (NotFound): namespaces "cross-site-18126" not found
-----------------------------------------------------------------------------------
create namespace cross-site-18126
-----------------------------------------------------------------------------------
namespace/cross-site-18126 created
Context "gke_cloud-dev-112233_us-central1-a_jen-pxc-1710-fb797906-3-cluster3" modified.
-----------------------------------------------------------------------------------
create secrets for cloud storages
-----------------------------------------------------------------------------------
secret/minio-secret created
secret/aws-s3-secret created
secret/gcp-cs-secret created
secret/azure-secret created
"hashicorp" already exists with the same configuration, skipping
"minio" already exists with the same configuration, skipping
Hang tight while we grab the latest from your chart repositories...
...Successfully got an update from the "minio" chart repository
...Successfully got an update from the "chaos-mesh" chart repository
...Successfully got an update from the "hashicorp" chart repository
...Successfully got an update from the "stable" chart repository
Update Complete. ⎈Happy Helming!⎈
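The "start PXC operator" step above amounts to applying the operator bundle and waiting for its pod to become Ready; a sketch assuming the stock deploy/ manifests from the operator repository (the file paths and label selector below are assumptions, not visible in this log):

    kubectl apply --server-side -f deploy/crd.yaml   # the three pxc.percona.com CRDs ("serverside-applied")
    kubectl apply -f deploy/cw-rbac.yaml             # clusterrole, serviceaccount, clusterrolebinding
    kubectl apply -f deploy/cw-operator.yaml         # operator Deployment and Service
    kubectl wait --for=condition=Ready pod -l app.kubernetes.io/name=percona-xtradb-cluster-operator --timeout=300s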
-----------------------------------------------------------------------------------
install Minio
-----------------------------------------------------------------------------------
Error: uninstall: Release not loaded: minio-service: release: not found
NAME: minio-service
LAST DEPLOYED: Tue May 14 15:43:29 2024
NAMESPACE: cross-site-18126
STATUS: deployed
REVISION: 1
TEST SUITE: None
NOTES:
MinIO can be accessed via port 9000 on the following DNS name from within your cluster:
minio-service.cross-site-18126.svc.cluster.local

To access MinIO from localhost, run the below commands:

  1. export POD_NAME=$(kubectl get pods --namespace cross-site-18126 -l "release=minio-service" -o jsonpath="{.items[0].metadata.name}")
  2. kubectl port-forward $POD_NAME 9000 --namespace cross-site-18126

Read more about port forwarding here: http://kubernetes.io/docs/user-guide/kubectl/kubectl_port-forward/

You can now access MinIO server on http://localhost:9000. Follow the below steps to connect to MinIO server with mc client:

  1. Download the MinIO mc client - https://min.io/docs/minio/linux/reference/minio-mc.html#quickstart
  2. export MC_HOST_minio-service-local=http://$(kubectl get secret --namespace cross-site-18126 minio-service -o jsonpath="{.data.rootUser}" | base64 --decode):$(kubectl get secret --namespace cross-site-18126 minio-service -o jsonpath="{.data.rootPassword}" | base64 --decode)@localhost:9000
  3. mc ls minio-service-local

pod/minio-service-76ffcfd45-kxgmz condition met
minio-service-76ffcfd45-kxgmz.Ok
make_bucket: operator-testing
pod "aws-cli" deleted
If you don't see a command prompt, try pressing enter.
warning: couldn't attach to pod/aws-cli, falling back to streaming logs: Internal error occurred: error attaching to container: container is in CONTAINER_EXITED state
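The make_bucket / aws-cli lines above come from a short-lived helper pod that creates the operator-testing bucket on the in-cluster MinIO. A sketch of an equivalent invocation, assuming the MinIO root credentials and the amazon/aws-cli image (the test's actual image and credential plumbing are not shown here):

    kubectl -n cross-site-18126 run aws-cli --rm -i --restart=Never \
      --image=amazon/aws-cli \
      --env=AWS_ACCESS_KEY_ID="<minio root user>" \
      --env=AWS_SECRET_ACCESS_KEY="<minio root password>" \
      -- s3 mb s3://operator-testing --endpoint-url http://minio-service:9000

The "couldn't attach" warning is harmless: the pod exits as soon as the bucket is created, so kubectl falls back to streaming its logs.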
-----------------------------------------------------------------------------------
create secrets for cloud storages
-----------------------------------------------------------------------------------
secret/minio-secret unchanged
secret/aws-s3-secret unchanged
secret/gcp-cs-secret unchanged
secret/azure-secret unchanged
-----------------------------------------------------------------------------------
create first PXC cluster
-----------------------------------------------------------------------------------
secret/my-cluster-secrets created
secret/some-name-ssl created
secret/some-name-ssl-internal created
deployment.apps/pxc-client created
perconaxtradbcluster.pxc.percona.com/cross-site-source created
-----------------------------------------------------------------------------------
check if all 3 Pods started
-----------------------------------------------------------------------------------
error: no matching resources found
-----------------------------------------------------------------------------------
wait for running cluster
-----------------------------------------------------------------------------------
Error from server (NotFound): pods "cross-site-source-haproxy-0" not found
cross-site-source-haproxy-0...........................................Defaulted container "haproxy" out of: haproxy, pxc-monit, pxc-init (init)
.Ok
-----------------------------------------------------------------------------------
wait for running cluster
-----------------------------------------------------------------------------------
pod/cross-site-source-pxc-0 condition met
cross-site-source-pxc-0.Ok
pod/cross-site-source-pxc-1 condition met
cross-site-source-pxc-1.Ok
pod/cross-site-source-pxc-2 condition met
cross-site-source-pxc-2.Ok
-----------------------------------------------------------------------------------
write data
-----------------------------------------------------------------------------------
pod/pxc-client-6644d8898f-h4db4 condition met
pxc-client-6644d8898f-h4db4.Ok
pod/pxc-client-6644d8898f-h4db4 condition met
pxc-client-6644d8898f-h4db4.Ok
pod/pxc-client-6644d8898f-h4db4 condition met
pxc-client-6644d8898f-h4db4.Ok
pod/pxc-client-6644d8898f-h4db4 condition met
pxc-client-6644d8898f-h4db4.Ok
pod/pxc-client-6644d8898f-h4db4 condition met
pxc-client-6644d8898f-h4db4.Ok
Unable to use a TTY - input is not a terminal or the right kind of file
-----------------------------------------------------------------------------------
get main cluster services endpoints
-----------------------------------------------------------------------------------
pod/pxc-client-6644d8898f-h4db4 condition met
pxc-client-6644d8898f-h4db4.Ok
-----------------------------------------------------------------------------------
patch source cluster with replicationChannels settings
-----------------------------------------------------------------------------------
perconaxtradbcluster.pxc.percona.com/cross-site-source patched
-----------------------------------------------------------------------------------
patch main cluster secrets with replication user
-----------------------------------------------------------------------------------
secret/my-cluster-secrets patched
-----------------------------------------------------------------------------------
wait cluster consistency
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
write data to source cluster
-----------------------------------------------------------------------------------
pod/pxc-client-6644d8898f-h4db4 condition met
pxc-client-6644d8898f-h4db4.Ok
pod/pxc-client-6644d8898f-h4db4 condition met
pxc-client-6644d8898f-h4db4.Ok
-----------------------------------------------------------------------------------
take backup of source cluster
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
make backup backup-minio-source
-----------------------------------------------------------------------------------
perconaxtradbclusterbackup.pxc.percona.com/backup-minio-source created
backup-minio-source...................Succeeded
-----------------------------------------------------------------------------------
create replica cluster
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
cleaned up old namespaces cross-site-replica-2753
-----------------------------------------------------------------------------------
Error from server (NotFound): namespaces "cross-site-replica-2753" not found
namespace/cross-site-replica-2753 - 
Error from server (NotFound): namespaces "cross-site-replica-2753" not found
-----------------------------------------------------------------------------------
create namespace cross-site-replica-2753
-----------------------------------------------------------------------------------
namespace/cross-site-replica-2753 created
Context "gke_cloud-dev-112233_us-central1-a_jen-pxc-1710-fb797906-3-cluster3" modified.
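The "patch source cluster with replicationChannels settings" and "patch main cluster secrets with replication user" steps above are merge patches against the PerconaXtraDBCluster CR and its users secret. A sketch of their likely shape (the channel name and password are placeholders; the real values are not visible in this log):

    kubectl -n cross-site-18126 patch pxc cross-site-source --type=merge -p '{
      "spec": {"pxc": {"replicationChannels": [
        {"name": "source_to_replica", "isSource": true}
      ]}}}'
    kubectl -n cross-site-18126 patch secret my-cluster-secrets --type=merge \
      -p '{"stringData": {"replication": "<replication user password>"}}'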
-----------------------------------------------------------------------------------
start PXC operator
-----------------------------------------------------------------------------------
customresourcedefinition.apiextensions.k8s.io/perconaxtradbclusterbackups.pxc.percona.com serverside-applied
customresourcedefinition.apiextensions.k8s.io/perconaxtradbclusterrestores.pxc.percona.com serverside-applied
customresourcedefinition.apiextensions.k8s.io/perconaxtradbclusters.pxc.percona.com serverside-applied
clusterrole.rbac.authorization.k8s.io/percona-xtradb-cluster-operator unchanged
serviceaccount/percona-xtradb-cluster-operator created
clusterrolebinding.rbac.authorization.k8s.io/service-account-percona-xtradb-cluster-operator unchanged
deployment.apps/percona-xtradb-cluster-operator created
service/percona-xtradb-cluster-operator created
error: timed out waiting for the condition on pods/percona-xtradb-cluster-operator-8448ddf579-88rcd
pod/percona-xtradb-cluster-operator-8448ddf579-qbczk condition met
percona-xtradb-cluster-operator-8448ddf579-qbczk.Ok
secret/cross-site-replica-ssl-internal created
-----------------------------------------------------------------------------------
create secrets for cloud storages
-----------------------------------------------------------------------------------
secret/minio-secret created
secret/aws-s3-secret created
secret/gcp-cs-secret created
secret/azure-secret created
-----------------------------------------------------------------------------------
create first PXC cluster
-----------------------------------------------------------------------------------
secret/my-cluster-secrets created
secret/some-name-ssl created
secret/some-name-ssl-internal created
deployment.apps/pxc-client created
perconaxtradbcluster.pxc.percona.com/cross-site-replica created
-----------------------------------------------------------------------------------
check if all 3 Pods started
-----------------------------------------------------------------------------------
error: no matching resources found
-----------------------------------------------------------------------------------
wait for running cluster
-----------------------------------------------------------------------------------
Error from server (NotFound): pods "cross-site-replica-haproxy-0" not found
cross-site-replica-haproxy-0......................................Defaulted container "haproxy" out of: haproxy, pxc-monit, pxc-init (init)
.Ok
-----------------------------------------------------------------------------------
wait for running cluster
-----------------------------------------------------------------------------------
pod/cross-site-replica-pxc-0 condition met
cross-site-replica-pxc-0.Ok
pod/cross-site-replica-pxc-1 condition met
cross-site-replica-pxc-1.Ok
pod/cross-site-replica-pxc-2 condition met
cross-site-replica-pxc-2.Ok
-----------------------------------------------------------------------------------
write data
-----------------------------------------------------------------------------------
pod/pxc-client-6644d8898f-676l5 condition met
pxc-client-6644d8898f-676l5.Ok
pod/pxc-client-6644d8898f-676l5 condition met
pxc-client-6644d8898f-676l5.Ok
pod/pxc-client-6644d8898f-676l5 condition met
pxc-client-6644d8898f-676l5.Ok
pod/pxc-client-6644d8898f-676l5 condition met
pxc-client-6644d8898f-676l5.Ok
pod/pxc-client-6644d8898f-676l5 condition met
pxc-client-6644d8898f-676l5.Ok
Unable to use a TTY - input is not a terminal or the right kind of file
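The secret/cross-site-replica-ssl-internal created line near the top of this step is consistent with the source site's internal TLS secret being copied into the replica namespace under a new name, so both clusters present the same certificates to each other. One way to do that (the test's actual helper is not shown, so treat this as illustrative):

    kubectl -n cross-site-18126 get secret some-name-ssl-internal -o yaml \
      | sed -e 's/name: some-name-ssl-internal/name: cross-site-replica-ssl-internal/' \
            -e '/namespace:/d' -e '/resourceVersion:/d' -e '/uid:/d' -e '/creationTimestamp:/d' \
      | kubectl -n cross-site-replica-2753 apply -f -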
-----------------------------------------------------------------------------------
restore backup from source cluster
-----------------------------------------------------------------------------------
perconaxtradbclusterrestore.pxc.percona.com/backup-minio created
-----------------------------------------------------------------------------------
wait cluster consistency
-----------------------------------------------------------------------------------
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
-----------------------------------------------------------------------------------
get replica cluster services endpoints
-----------------------------------------------------------------------------------
pod/pxc-client-6644d8898f-676l5 condition met
pxc-client-6644d8898f-676l5.Ok
pod/pxc-client-6644d8898f-676l5 condition met
pxc-client-6644d8898f-676l5.Ok
-----------------------------------------------------------------------------------
patch replica cluster with replicationChannels settings
-----------------------------------------------------------------------------------
perconaxtradbcluster.pxc.percona.com/cross-site-replica patched
-----------------------------------------------------------------------------------
patch replica cluster secrets with replication user
-----------------------------------------------------------------------------------
secret/my-cluster-secrets patched
-----------------------------------------------------------------------------------
wait cluster consistency
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
Check replication works between source -> replica
-----------------------------------------------------------------------------------
pod/pxc-client-6644d8898f-676l5 condition met
pxc-client-6644d8898f-676l5.Ok
pod/pxc-client-6644d8898f-676l5 condition met
pxc-client-6644d8898f-676l5.Ok
pod/pxc-client-6644d8898f-676l5 condition met
pxc-client-6644d8898f-676l5.Ok
-----------------------------------------------------------------------------------
make backup backup-minio-replica
-----------------------------------------------------------------------------------
perconaxtradbclusterbackup.pxc.percona.com/backup-minio-replica created
backup-minio-replica................Succeeded
-----------------------------------------------------------------------------------
Switch clusters over
-----------------------------------------------------------------------------------
Context "gke_cloud-dev-112233_us-central1-a_jen-pxc-1710-fb797906-3-cluster3" modified.
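On the replica side, the "patch replica cluster with replicationChannels settings" step above points the same channel at the source cluster's endpoints (the ones gathered in the "get main cluster services endpoints" step), this time without isSource. A sketch with placeholder hosts:

    kubectl -n cross-site-replica-2753 patch pxc cross-site-replica --type=merge -p '{
      "spec": {"pxc": {"replicationChannels": [
        {"name": "source_to_replica", "isSource": false,
         "sourcesList": [
           {"host": "<source pxc-0 endpoint>", "port": 3306, "weight": 100},
           {"host": "<source pxc-1 endpoint>", "port": 3306, "weight": 100},
           {"host": "<source pxc-2 endpoint>", "port": 3306, "weight": 100}
         ]}
      ]}}}'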
-----------------------------------------------------------------------------------
rebuild source cluster
-----------------------------------------------------------------------------------
perconaxtradbclusterrestore.pxc.percona.com/backup-minio created
-----------------------------------------------------------------------------------
wait cluster consistency
-----------------------------------------------------------------------------------
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
pod/pxc-client-6644d8898f-h4db4 condition met
pxc-client-6644d8898f-h4db4.Ok
pod/pxc-client-6644d8898f-h4db4 condition met
pxc-client-6644d8898f-h4db4.Ok
-----------------------------------------------------------------------------------
configure old replica as source
-----------------------------------------------------------------------------------
perconaxtradbcluster.pxc.percona.com/cross-site-replica patched
perconaxtradbcluster.pxc.percona.com/cross-site-replica patched
-----------------------------------------------------------------------------------
configure old source as replica
-----------------------------------------------------------------------------------
perconaxtradbcluster.pxc.percona.com/cross-site-source patched
perconaxtradbcluster.pxc.percona.com/cross-site-source patched
Context "gke_cloud-dev-112233_us-central1-a_jen-pxc-1710-fb797906-3-cluster3" modified.
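The two pairs of patches above swap the channel direction: the old replica drops its sourcesList and becomes the source, while the old source gains a sourcesList pointing at the new source. How the change is split across the two patches per cluster is not visible in this log; the net effect is roughly:

    # old replica becomes the source
    kubectl -n cross-site-replica-2753 patch pxc cross-site-replica --type=merge -p '{
      "spec": {"pxc": {"replicationChannels": [
        {"name": "source_to_replica", "isSource": true}]}}}'
    # old source becomes the replica
    kubectl -n cross-site-18126 patch pxc cross-site-source --type=merge -p '{
      "spec": {"pxc": {"replicationChannels": [
        {"name": "source_to_replica", "isSource": false,
         "sourcesList": [{"host": "<new source endpoint>", "port": 3306, "weight": 100}]}]}}}'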
-----------------------------------------------------------------------------------
Write data to replica cluster
-----------------------------------------------------------------------------------
pod/pxc-client-6644d8898f-676l5 condition met
pxc-client-6644d8898f-676l5.Ok
pod/pxc-client-6644d8898f-676l5 condition met
pxc-client-6644d8898f-676l5.Ok
-----------------------------------------------------------------------------------
Check replication works between replica -> source
-----------------------------------------------------------------------------------
Context "gke_cloud-dev-112233_us-central1-a_jen-pxc-1710-fb797906-3-cluster3" modified.
pod/pxc-client-6644d8898f-h4db4 condition met
pxc-client-6644d8898f-h4db4.Ok
pod/pxc-client-6644d8898f-h4db4 condition met
pxc-client-6644d8898f-h4db4.Ok
pod/pxc-client-6644d8898f-h4db4 condition met
pxc-client-6644d8898f-h4db4.Ok
-----------------------------------------------------------------------------------
destroy cluster/operator and all other resources
-----------------------------------------------------------------------------------
+ kubectl patch pxc -n cross-site-18126 cross-site-source --type=merge -p '{"metadata":{"finalizers":[]}}'
perconaxtradbcluster.pxc.percona.com/cross-site-source patched
+ kubectl patch pxc -n cross-site-replica-2753 cross-site-replica --type=merge -p '{"metadata":{"finalizers":[]}}'
perconaxtradbcluster.pxc.percona.com/cross-site-replica patched
perconaxtradbcluster.pxc.percona.com "cross-site-source" deleted
perconaxtradbcluster.pxc.percona.com "cross-site-replica" deleted
perconaxtradbclusterbackup.pxc.percona.com "backup-minio-source" deleted
perconaxtradbclusterbackup.pxc.percona.com "backup-minio-replica" deleted
perconaxtradbclusterrestore.pxc.percona.com "backup-minio" deleted
perconaxtradbclusterrestore.pxc.percona.com "backup-minio" deleted
validatingwebhookconfiguration.admissionregistration.k8s.io "percona-xtradbcluster-webhook" deleted
-----------------------------------------------------------------------------------
destroy cluster/operator and all other resources
-----------------------------------------------------------------------------------
No resources found
+ kubectl patch pxc -n sh --type=merge -p '{"metadata":{"finalizers":[]}}'
error: resource(s) were provided, but no name was specified
No resources found
No resources found
No resources found
Error from server (NotFound): validatingwebhookconfigurations.admissionregistration.k8s.io "percona-xtradbcluster-webhook" not found
-----------------------------------------------------------------------------------
test passed
-----------------------------------------------------------------------------------
namespace "pxc-operator" force deleted
Warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
Warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
Error from server (NotFound): namespaces "pxc-operator" not found
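The finalizer patches in the destroy step above exist so that deleting the pxc custom resources does not hang on operator-managed finalizers (e.g. delete-pxc-pods-in-order) while the operator itself is being torn down; clearing metadata.finalizers lets the subsequent delete return promptly. The pattern, per cluster:

    kubectl -n cross-site-18126 patch pxc cross-site-source --type=merge -p '{"metadata":{"finalizers":[]}}'
    kubectl -n cross-site-18126 delete pxc cross-site-source --wait=false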