Log: /mnt/jenkins/workspace/cloud-pxc-operator_PR-1709/e2e-tests/logs/cross-site-8-0.log
WARNING: version difference between client (1.30) and server (1.26) exceeds the supported minor version skew of +/-1
WARNING: version difference between client (1.30) and server (1.26) exceeds the supported minor version skew of +/-1
-----------------------------------------------------------------------------------
Create source cluster
-----------------------------------------------------------------------------------
E0514 17:01:09.140520 3168 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:09.449901 3168 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:09.559055 3168 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:09.665465 3168 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:09.771953 3168 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
error: the server doesn't have a resource type "pxc"
+ kubectl patch pxc -n sh --type=merge -p '{"metadata":{"finalizers":[]}}'
E0514 17:01:11.323198 3524 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:11.542637 3524 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:11.647955 3524 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:11.752702 3524 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:12.071809 3524 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:12.285254 3524 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:12.398318 3524 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
error: the server doesn't have a resource type "pxc"
E0514 17:01:14.437905 3986 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:14.634581 3986 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:14.740437 3986 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:14.846027 3986 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:15.178063 3986 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:15.379838 3986 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:15.489557 3986 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
error: the server doesn't have a resource type "pxc"
E0514 17:01:16.907568 4389 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:17.217583 4389 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:17.333136 4389 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:17.439598 4389 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:17.757310 4389 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:17.985932 4389 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:18.095734 4389 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
error: the server doesn't have a resource type "pxc-backup"
E0514 17:01:19.403822 4664 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:19.626698 4664 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:19.733983 4664 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:19.840671 4664 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:20.172063 4664 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:20.374812 4664 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:20.484025 4664 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
error: the server doesn't have a resource type "pxc-restore"
E0514 17:01:22.206694 5114 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:22.524424 5114 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:22.631197 5114 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:22.738313 5114 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:24.085176 5266 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:24.306507 5266 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:24.412821 5266 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:24.521912 5266 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
error: resource(s) were provided, but no name was specified
E0514 17:01:26.166376 5572 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:26.474829 5572 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:26.583063 5572 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:26.688012 5572 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:28.562550 5843 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:28.824076 5843 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:28.930981 5843 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:29.038766 5843 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
error: resource(s) were provided, but no name was specified
E0514 17:01:30.757920 6138 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:30.981918 6138 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:31.089501 6138 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:31.197557 6138 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:32.741048 6427 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:33.054737 6427 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:33.163259 6427 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:33.272401 6427 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
error: resource(s) were provided, but no name was specified
E0514 17:01:34.875297 6698 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:35.186185 6698 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
error: unable to retrieve the complete list of server APIs: metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:36.836799 6962 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:37.150827 6962 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:37.258074 6962 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:37.363690 6962 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:39.374540 7323 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:39.688192 7323 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:39.796663 7323 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:39.903395 7323 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
error: resource(s) were provided, but no name was specified
E0514 17:01:41.797998 7733 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:42.105198 7733 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:42.210423 7733 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:42.315742 7733 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:44.462738 8103 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:44.776145 8103 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:44.883775 8103 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:44.991412 8103 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
error: resource(s) were provided, but no name was specified
E0514 17:01:46.256895 8382 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:46.482377 8382 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:46.588446 8382 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:46.701836 8382 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:48.293530 8646 memcache.go:287] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:48.519941 8646 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:48.625331 8646 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
E0514 17:01:48.730868 8646 memcache.go:121] couldn't get resource list for metrics.k8s.io/v1beta1: the server is currently unable to handle the request
error: resource(s) were provided, but no name was specified
-----------------------------------------------------------------------------------
cleaned up all old namespaces
-----------------------------------------------------------------------------------
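Editor's note: the block above is the pre-test cleanup. The literal `-n sh` in the traced patch command appears to come from an xargs/`sh -c` wrapper running with an empty resource list, i.e. there were no leftover clusters to patch. For reference, a minimal sketch of the finalizer-clearing pattern the harness uses before deleting a cluster (namespace and cluster name below are placeholders, not taken from this run):

# Sketch only: clear finalizers on a leftover PerconaXtraDBCluster so that the
# subsequent delete cannot hang on finalizer processing.
NAMESPACE=cross-site-30055   # placeholder
CLUSTER=cross-site-source    # placeholder

kubectl patch pxc "$CLUSTER" -n "$NAMESPACE" --type=merge \
  -p '{"metadata":{"finalizers":[]}}'
kubectl delete pxc "$CLUSTER" -n "$NAMESPACE" --ignore-not-found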
-----------------------------------------------------------------------------------
cleaned up old namespaces pxc-operator
-----------------------------------------------------------------------------------
Error from server (NotFound): namespaces "pxc-operator" not found
namespace/pxc-operator - Error from server (NotFound): namespaces "pxc-operator" not found
-----------------------------------------------------------------------------------
create namespace pxc-operator
-----------------------------------------------------------------------------------
Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
namespace/pxc-operator created
Context "gke_cloud-dev-112233_us-central1-a_jen-pxc-1709-788cbf69-3-cluster3" modified.
-----------------------------------------------------------------------------------
start PXC operator
-----------------------------------------------------------------------------------
customresourcedefinition.apiextensions.k8s.io/perconaxtradbclusterbackups.pxc.percona.com serverside-applied
customresourcedefinition.apiextensions.k8s.io/perconaxtradbclusterrestores.pxc.percona.com serverside-applied
customresourcedefinition.apiextensions.k8s.io/perconaxtradbclusters.pxc.percona.com serverside-applied
clusterrole.rbac.authorization.k8s.io/percona-xtradb-cluster-operator created
serviceaccount/percona-xtradb-cluster-operator created
clusterrolebinding.rbac.authorization.k8s.io/service-account-percona-xtradb-cluster-operator created
deployment.apps/percona-xtradb-cluster-operator created
service/percona-xtradb-cluster-operator created
pod/percona-xtradb-cluster-operator-5f94b588fb-dc2gs condition met
pod/percona-xtradb-cluster-operator-5f94b588fb-dc2gs condition met
percona-xtradb-cluster-operator-5f94b588fb-dc2gs.Ok
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
-----------------------------------------------------------------------------------
cleaned up all old namespaces
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
cleaned up old namespaces cross-site-30055
-----------------------------------------------------------------------------------
Error from server (NotFound): namespaces "cross-site-30055" not found
namespace/cross-site-30055 - Error from server (NotFound): namespaces "cross-site-30055" not found
-----------------------------------------------------------------------------------
create namespace cross-site-30055
-----------------------------------------------------------------------------------
Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
namespace/cross-site-30055 created
Context "gke_cloud-dev-112233_us-central1-a_jen-pxc-1709-788cbf69-3-cluster3" modified.
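Editor's note: the "start PXC operator" step above corresponds roughly to applying the operator bundle and waiting for its pod to become Ready. The file names and label selector in this sketch are assumptions based on the operator's usual deploy/ layout, not taken from this log:

# Rough sketch of the operator bootstrap (paths and label are assumed):
kubectl apply --server-side -f deploy/crd.yaml        # CRDs ("serverside-applied" above)
kubectl apply -n pxc-operator -f deploy/rbac.yaml     # ClusterRole, ServiceAccount, binding
kubectl apply -n pxc-operator -f deploy/operator.yaml # Deployment + Service
# Wait for the operator pod; this is what produces the "condition met" lines:
kubectl wait --for=condition=Ready pod \
  -l app.kubernetes.io/name=percona-xtradb-cluster-operator \
  -n pxc-operator --timeout=300s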
-----------------------------------------------------------------------------------
create secrets for cloud storages
-----------------------------------------------------------------------------------
secret/minio-secret created
secret/aws-s3-secret created
secret/gcp-cs-secret created
secret/azure-secret created
"hashicorp" has been added to your repositories
"minio" has been added to your repositories
Hang tight while we grab the latest from your chart repositories...
...Successfully got an update from the "minio" chart repository
...Successfully got an update from the "hashicorp" chart repository
Update Complete. ⎈Happy Helming!⎈
-----------------------------------------------------------------------------------
install Minio
-----------------------------------------------------------------------------------
Error: uninstall: Release not loaded: minio-service: release: not found
NAME: minio-service
LAST DEPLOYED: Tue May 14 17:03:01 2024
NAMESPACE: cross-site-30055
STATUS: deployed
REVISION: 1
TEST SUITE: None
NOTES:
MinIO can be accessed via port 9000 on the following DNS name from within your cluster:
minio-service.cross-site-30055.svc.cluster.local

To access MinIO from localhost, run the below commands:

  1. export POD_NAME=$(kubectl get pods --namespace cross-site-30055 -l "release=minio-service" -o jsonpath="{.items[0].metadata.name}")
  2. kubectl port-forward $POD_NAME 9000 --namespace cross-site-30055

Read more about port forwarding here: http://kubernetes.io/docs/user-guide/kubectl/kubectl_port-forward/

You can now access MinIO server on http://localhost:9000. Follow the below steps to connect to MinIO server with mc client:

  1. Download the MinIO mc client - https://min.io/docs/minio/linux/reference/minio-mc.html#quickstart
  2. export MC_HOST_minio-service-local=http://$(kubectl get secret --namespace cross-site-30055 minio-service -o jsonpath="{.data.rootUser}" | base64 --decode):$(kubectl get secret --namespace cross-site-30055 minio-service -o jsonpath="{.data.rootPassword}" | base64 --decode)@localhost:9000
  3. mc ls minio-service-local

pod/minio-service-76ffcfd45-kq57m condition met
minio-service-76ffcfd45-kq57m.Ok
make_bucket: operator-testing
pod "aws-cli" deleted
If you don't see a command prompt, try pressing enter.
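Editor's note: the `make_bucket: operator-testing` line comes from a short-lived aws-cli pod pointed at the in-cluster MinIO service. A rough equivalent is sketched below; the image name and credentials are placeholders, while the endpoint and bucket name come from the output above:

# Sketch: create the test bucket on the in-cluster MinIO (port 9000).
kubectl run -n cross-site-30055 aws-cli --rm -i --restart=Never \
  --image=amazon/aws-cli \
  --env=AWS_ACCESS_KEY_ID=some-access-key \
  --env=AWS_SECRET_ACCESS_KEY=some-access-secret \
  --command -- aws --endpoint-url http://minio-service:9000 s3 mb s3://operator-testing
# With --rm -i, kubectl tries to attach to the container and falls back to
# streaming logs if it cannot attach, which explains the warning just below.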
warning: couldn't attach to pod/aws-cli, falling back to streaming logs: unable to upgrade connection: container aws-cli not found in pod aws-cli_cross-site-30055
-----------------------------------------------------------------------------------
create secrets for cloud storages
-----------------------------------------------------------------------------------
secret/minio-secret unchanged
secret/aws-s3-secret unchanged
secret/gcp-cs-secret unchanged
secret/azure-secret unchanged
-----------------------------------------------------------------------------------
create first PXC cluster
-----------------------------------------------------------------------------------
secret/my-cluster-secrets created
secret/some-name-ssl created
secret/some-name-ssl-internal created
deployment.apps/pxc-client created
perconaxtradbcluster.pxc.percona.com/cross-site-source created
-----------------------------------------------------------------------------------
check if all 3 Pods started
-----------------------------------------------------------------------------------
error: no matching resources found
-----------------------------------------------------------------------------------
wait for running cluster
-----------------------------------------------------------------------------------
Error from server (NotFound): pods "cross-site-source-haproxy-0" not found
cross-site-source-haproxy-0...........................................Defaulted container "haproxy" out of: haproxy, pxc-monit, pxc-init (init)
.Ok
-----------------------------------------------------------------------------------
wait for running cluster
-----------------------------------------------------------------------------------
pod/cross-site-source-pxc-0 condition met
cross-site-source-pxc-0.Ok
pod/cross-site-source-pxc-1 condition met
cross-site-source-pxc-1.Ok
pod/cross-site-source-pxc-2 condition met
cross-site-source-pxc-2.Ok
-----------------------------------------------------------------------------------
write data
-----------------------------------------------------------------------------------
pod/pxc-client-6644d8898f-6v5kh condition met
pxc-client-6644d8898f-6v5kh.Ok
pod/pxc-client-6644d8898f-6v5kh condition met
pxc-client-6644d8898f-6v5kh.Ok
pod/pxc-client-6644d8898f-6v5kh condition met
pxc-client-6644d8898f-6v5kh.Ok
pod/pxc-client-6644d8898f-6v5kh condition met
pxc-client-6644d8898f-6v5kh.Ok
pod/pxc-client-6644d8898f-6v5kh condition met
pxc-client-6644d8898f-6v5kh.Ok
Unable to use a TTY - input is not a terminal or the right kind of file
-----------------------------------------------------------------------------------
get main cluster services endpoints
-----------------------------------------------------------------------------------
pod/pxc-client-6644d8898f-6v5kh condition met
pxc-client-6644d8898f-6v5kh.Ok
-----------------------------------------------------------------------------------
patch source cluster with replicationChannels settings
-----------------------------------------------------------------------------------
perconaxtradbcluster.pxc.percona.com/cross-site-source patched
-----------------------------------------------------------------------------------
patch main cluster secrets with replication user
-----------------------------------------------------------------------------------
secret/my-cluster-secrets patched
-----------------------------------------------------------------------------------
wait cluster consistency
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
write data to source cluster
-----------------------------------------------------------------------------------
pod/pxc-client-6644d8898f-6v5kh condition met
pxc-client-6644d8898f-6v5kh.Ok
pod/pxc-client-6644d8898f-6v5kh condition met
pxc-client-6644d8898f-6v5kh.Ok
-----------------------------------------------------------------------------------
take backup of source cluster
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
make backup backup-minio-source
-----------------------------------------------------------------------------------
perconaxtradbclusterbackup.pxc.percona.com/backup-minio-source created
backup-minio-source................Succeeded
-----------------------------------------------------------------------------------
create replica cluster
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
cleaned up old namespaces cross-site-replica-22650
-----------------------------------------------------------------------------------
Error from server (NotFound): namespaces "cross-site-replica-22650" not found
namespace/cross-site-replica-22650 - Error from server (NotFound): namespaces "cross-site-replica-22650" not found
-----------------------------------------------------------------------------------
create namespace cross-site-replica-22650
-----------------------------------------------------------------------------------
namespace/cross-site-replica-22650 created
Context "gke_cloud-dev-112233_us-central1-a_jen-pxc-1709-788cbf69-3-cluster3" modified.
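Editor's note: backup-minio-source above is an on-demand backup object. A minimal sketch of such a request and of polling its state follows; the storage name is assumed to match a `minio` entry under the cluster's spec.backup.storages:

kubectl apply -n cross-site-30055 -f - <<'EOF'
apiVersion: pxc.percona.com/v1
kind: PerconaXtraDBClusterBackup
metadata:
  name: backup-minio-source
spec:
  pxcCluster: cross-site-source
  storageName: minio   # assumed storage name
EOF
# Poll until the backup reports Succeeded (the "Succeeded" line above):
kubectl get pxc-backup backup-minio-source -n cross-site-30055 \
  -o jsonpath='{.status.state}'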
-----------------------------------------------------------------------------------
start PXC operator
-----------------------------------------------------------------------------------
customresourcedefinition.apiextensions.k8s.io/perconaxtradbclusterbackups.pxc.percona.com serverside-applied
customresourcedefinition.apiextensions.k8s.io/perconaxtradbclusterrestores.pxc.percona.com serverside-applied
customresourcedefinition.apiextensions.k8s.io/perconaxtradbclusters.pxc.percona.com serverside-applied
clusterrole.rbac.authorization.k8s.io/percona-xtradb-cluster-operator unchanged
serviceaccount/percona-xtradb-cluster-operator created
clusterrolebinding.rbac.authorization.k8s.io/service-account-percona-xtradb-cluster-operator unchanged
deployment.apps/percona-xtradb-cluster-operator created
service/percona-xtradb-cluster-operator created
pod/percona-xtradb-cluster-operator-5f94b588fb-5plxr condition met
pod/percona-xtradb-cluster-operator-5f94b588fb-dc2gs condition met
percona-xtradb-cluster-operator-5f94b588fb-dc2gs.Ok
secret/cross-site-replica-ssl-internal created
-----------------------------------------------------------------------------------
create secrets for cloud storages
-----------------------------------------------------------------------------------
secret/minio-secret created
secret/aws-s3-secret created
secret/gcp-cs-secret created
secret/azure-secret created
-----------------------------------------------------------------------------------
create first PXC cluster
-----------------------------------------------------------------------------------
secret/my-cluster-secrets created
secret/some-name-ssl created
secret/some-name-ssl-internal created
deployment.apps/pxc-client created
perconaxtradbcluster.pxc.percona.com/cross-site-replica created
-----------------------------------------------------------------------------------
check if all 3 Pods started
-----------------------------------------------------------------------------------
error: no matching resources found
-----------------------------------------------------------------------------------
wait for running cluster
-----------------------------------------------------------------------------------
Error from server (NotFound): pods "cross-site-replica-haproxy-0" not found
cross-site-replica-haproxy-0.......................................Defaulted container "haproxy" out of: haproxy, pxc-monit, pxc-init (init)
.Ok
-----------------------------------------------------------------------------------
wait for running cluster
-----------------------------------------------------------------------------------
pod/cross-site-replica-pxc-0 condition met
cross-site-replica-pxc-0.Ok
pod/cross-site-replica-pxc-1 condition met
cross-site-replica-pxc-1.Ok
pod/cross-site-replica-pxc-2 condition met
cross-site-replica-pxc-2.Ok
-----------------------------------------------------------------------------------
write data
-----------------------------------------------------------------------------------
pod/pxc-client-6644d8898f-7klqm condition met
pxc-client-6644d8898f-7klqm.Ok
pod/pxc-client-6644d8898f-7klqm condition met
pxc-client-6644d8898f-7klqm.Ok
pod/pxc-client-6644d8898f-7klqm condition met
pxc-client-6644d8898f-7klqm.Ok
pod/pxc-client-6644d8898f-7klqm condition met
pxc-client-6644d8898f-7klqm.Ok
pod/pxc-client-6644d8898f-7klqm condition met
pxc-client-6644d8898f-7klqm.Ok
Unable to use a TTY - input is not a terminal or the right kind of file
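Editor's note: the various "wait for running cluster" / "wait cluster consistency" steps boil down to polling the custom resource status until the operator reports the cluster as ready. A small sketch of such a loop, with the status field path as an assumption:

NAMESPACE=cross-site-replica-22650
CLUSTER=cross-site-replica
until [ "$(kubectl get pxc "$CLUSTER" -n "$NAMESPACE" -o jsonpath='{.status.state}')" = "ready" ]; do
  echo "waiting for cluster readyness"   # same message the harness prints
  sleep 15
done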
-----------------------------------------------------------------------------------
restore backup from source cluster
-----------------------------------------------------------------------------------
perconaxtradbclusterrestore.pxc.percona.com/backup-minio created
-----------------------------------------------------------------------------------
wait cluster consistency
-----------------------------------------------------------------------------------
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
-----------------------------------------------------------------------------------
get replica cluster services endpoints
-----------------------------------------------------------------------------------
pod/pxc-client-6644d8898f-7klqm condition met
pxc-client-6644d8898f-7klqm.Ok
pod/pxc-client-6644d8898f-7klqm condition met
pxc-client-6644d8898f-7klqm.Ok
-----------------------------------------------------------------------------------
patch replica cluster with replicationChannels settings
-----------------------------------------------------------------------------------
perconaxtradbcluster.pxc.percona.com/cross-site-replica patched
-----------------------------------------------------------------------------------
patch replica cluster secrets with replication user
-----------------------------------------------------------------------------------
secret/my-cluster-secrets patched
-----------------------------------------------------------------------------------
wait cluster consistency
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
Check replication works between source -> replica
-----------------------------------------------------------------------------------
pod/pxc-client-6644d8898f-7klqm condition met
pxc-client-6644d8898f-7klqm.Ok
pod/pxc-client-6644d8898f-7klqm condition met
pxc-client-6644d8898f-7klqm.Ok
pod/pxc-client-6644d8898f-7klqm condition met
pxc-client-6644d8898f-7klqm.Ok
-----------------------------------------------------------------------------------
make backup backup-minio-replica
-----------------------------------------------------------------------------------
perconaxtradbclusterbackup.pxc.percona.com/backup-minio-replica created
backup-minio-replica..............Succeeded
-----------------------------------------------------------------------------------
Switch clusters over
-----------------------------------------------------------------------------------
Context "gke_cloud-dev-112233_us-central1-a_jen-pxc-1709-788cbf69-3-cluster3" modified.
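Editor's note: the replicationChannels patches are what actually wire the two clusters together. A minimal sketch of both sides follows; the channel name, host address and port are placeholders, whereas in the real test the hosts come from the endpoints collected in the "services endpoints" steps:

# Source side: publish the channel.
kubectl patch pxc cross-site-source -n cross-site-30055 --type=merge -p '{
  "spec":{"pxc":{"replicationChannels":[
    {"name":"source_to_replica","isSource":true}
  ]}}}'
# Replica side: subscribe to the same channel via the source cluster's endpoints.
kubectl patch pxc cross-site-replica -n cross-site-replica-22650 --type=merge -p '{
  "spec":{"pxc":{"replicationChannels":[
    {"name":"source_to_replica","isSource":false,
     "sourcesList":[{"host":"203.0.113.10","port":3306,"weight":100}]}
  ]}}}'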
-----------------------------------------------------------------------------------
rebuild source cluster
-----------------------------------------------------------------------------------
perconaxtradbclusterrestore.pxc.percona.com/backup-minio created
-----------------------------------------------------------------------------------
wait cluster consistency
-----------------------------------------------------------------------------------
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
waiting for cluster readyness
pod/pxc-client-6644d8898f-6v5kh condition met
pxc-client-6644d8898f-6v5kh.Ok
pod/pxc-client-6644d8898f-6v5kh condition met
pxc-client-6644d8898f-6v5kh.Ok
-----------------------------------------------------------------------------------
configure old replica as source
-----------------------------------------------------------------------------------
perconaxtradbcluster.pxc.percona.com/cross-site-replica patched
perconaxtradbcluster.pxc.percona.com/cross-site-replica patched
-----------------------------------------------------------------------------------
configure old source as replica
-----------------------------------------------------------------------------------
perconaxtradbcluster.pxc.percona.com/cross-site-source patched
perconaxtradbcluster.pxc.percona.com/cross-site-source patched
Context "gke_cloud-dev-112233_us-central1-a_jen-pxc-1709-788cbf69-3-cluster3" modified.
-----------------------------------------------------------------------------------
Write data to replica cluster
-----------------------------------------------------------------------------------
pod/pxc-client-6644d8898f-7klqm condition met
pxc-client-6644d8898f-7klqm.Ok
pod/pxc-client-6644d8898f-7klqm condition met
pxc-client-6644d8898f-7klqm.Ok
-----------------------------------------------------------------------------------
Check replication works between replica -> source
-----------------------------------------------------------------------------------
Context "gke_cloud-dev-112233_us-central1-a_jen-pxc-1709-788cbf69-3-cluster3" modified.
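Editor's note: after the switchover the roles are flipped, so the replication check amounts to writing on the new source and reading the row back on the new replica. A sketch using the pxc-client deployment from the log; the database, table, value and password variable are placeholders:

# Write on the current source (the old replica) ...
kubectl exec -n cross-site-replica-22650 deploy/pxc-client -- \
  mysql -h cross-site-replica-haproxy -uroot -p"$ROOT_PASSWORD" \
  -e "INSERT INTO myApp.myApp (id) VALUES (100503)"
# ... and confirm the row arrives on the current replica (the old source).
kubectl exec -n cross-site-30055 deploy/pxc-client -- \
  mysql -h cross-site-source-haproxy -uroot -p"$ROOT_PASSWORD" \
  -e "SELECT id FROM myApp.myApp"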
pod/pxc-client-6644d8898f-6v5kh condition met
pxc-client-6644d8898f-6v5kh.Ok
pod/pxc-client-6644d8898f-6v5kh condition met
pxc-client-6644d8898f-6v5kh.Ok
pod/pxc-client-6644d8898f-6v5kh condition met
pxc-client-6644d8898f-6v5kh.Ok
-----------------------------------------------------------------------------------
destroy cluster/operator and all other resources
-----------------------------------------------------------------------------------
+ kubectl patch pxc -n cross-site-30055 cross-site-source --type=merge -p '{"metadata":{"finalizers":[]}}'
perconaxtradbcluster.pxc.percona.com/cross-site-source patched
+ kubectl patch pxc -n cross-site-replica-22650 cross-site-replica --type=merge -p '{"metadata":{"finalizers":[]}}'
perconaxtradbcluster.pxc.percona.com/cross-site-replica patched
perconaxtradbcluster.pxc.percona.com "cross-site-source" deleted
perconaxtradbcluster.pxc.percona.com "cross-site-replica" deleted
perconaxtradbclusterbackup.pxc.percona.com "backup-minio-source" deleted
perconaxtradbclusterbackup.pxc.percona.com "backup-minio-replica" deleted
perconaxtradbclusterrestore.pxc.percona.com "backup-minio" deleted
perconaxtradbclusterrestore.pxc.percona.com "backup-minio" deleted
validatingwebhookconfiguration.admissionregistration.k8s.io "percona-xtradbcluster-webhook" deleted
-----------------------------------------------------------------------------------
destroy cluster/operator and all other resources
-----------------------------------------------------------------------------------
No resources found
+ kubectl patch pxc -n sh --type=merge -p '{"metadata":{"finalizers":[]}}'
error: resource(s) were provided, but no name was specified
No resources found
No resources found
No resources found
Error from server (NotFound): validatingwebhookconfigurations.admissionregistration.k8s.io "percona-xtradbcluster-webhook" not found
-----------------------------------------------------------------------------------
test passed
-----------------------------------------------------------------------------------
namespace "pxc-operator" force deleted
namespace "pxc-operator" force deleted
Warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
Warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
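Editor's note: the two Warning lines at the end are standard kubectl output for a forced deletion; the namespace teardown is roughly:

# Forced namespace removal skips graceful termination, hence the warning above.
kubectl delete namespace pxc-operator --force --grace-period=0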