Log: /mnt/jenkins/workspace/cloud-psmdb-operator_PR-1938/e2e-tests/logs/monitoring-2-0.log
WARNING: version difference between client (1.33) and server (1.30) exceeds the supported minor version skew of +/-1
WARNING: version difference between client (1.33) and server (1.30) exceeds the supported minor version skew of +/-1
WARNING: version difference between client (1.33) and server (1.30) exceeds the supported minor version skew of +/-1
-----------------------------------------------------------------------------------
get and delete old CRDs and RBAC
-----------------------------------------------------------------------------------
error: the server doesn't have a resource type "perconaservermongodbbackups"
+ kubectl patch perconaservermongodbbackups.psmdb.percona.com -n sh --type=merge -p '{"metadata":{"finalizers":[]}}'
error: the server doesn't have a resource type "perconaservermongodbbackups"
error: the server doesn't have a resource type "perconaservermongodbrestores"
+ kubectl patch perconaservermongodbrestores.psmdb.percona.com -n sh --type=merge -p '{"metadata":{"finalizers":[]}}'
error: the server doesn't have a resource type "perconaservermongodbrestores"
error: the server doesn't have a resource type "perconaservermongodbs"
+ kubectl patch perconaservermongodbs.psmdb.percona.com -n sh --type=merge -p '{"metadata":{"finalizers":[]}}'
error: the server doesn't have a resource type "perconaservermongodbs"
Error from server (NotFound): customresourcedefinitions.apiextensions.k8s.io "null" not found
Error from server (NotFound): customresourcedefinitions.apiextensions.k8s.io "null" not found
Error from server (NotFound): customresourcedefinitions.apiextensions.k8s.io "null" not found
Error from server (NotFound): customresourcedefinitions.apiextensions.k8s.io "null" not found
-----------------------------------------------------------------------------------
destroy chaos-mesh
-----------------------------------------------------------------------------------
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
-----------------------------------------------------------------------------------
cleaned up all old namespaces
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
cleaned up old namespaces psmdb-operator
-----------------------------------------------------------------------------------
namespace "gke-managed-cim" deleted
namespace "gke-managed-system" deleted
-----------------------------------------------------------------------------------
create namespace psmdb-operator
-----------------------------------------------------------------------------------
namespace "gmp-public" deleted
namespace "gmp-system" deleted
namespace/psmdb-operator created
Context "gke_cloud-dev-112233_us-central1-a_jen-psmdb-1938-23826c20-3-cluster5" modified.
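For reference, the "get and delete old CRDs and RBAC" step above most likely boils down to clearing finalizers on any leftover PSMDB custom resources and dropping the CRDs. The sketch below is an assumption reconstructed from the traced kubectl patch calls, not the literal e2e script:

  # assumed cleanup loop: strip finalizers from leftover objects, then delete the CRDs
  for crd in perconaservermongodbbackups perconaservermongodbrestores perconaservermongodbs; do
    kubectl get "${crd}.psmdb.percona.com" --all-namespaces --no-headers \
      -o custom-columns=NS:.metadata.namespace,NAME:.metadata.name 2>/dev/null |
      while read -r ns name; do
        kubectl patch "${crd}.psmdb.percona.com" "$name" -n "$ns" \
          --type=merge -p '{"metadata":{"finalizers":[]}}'
      done
    kubectl delete crd "${crd}.psmdb.percona.com" --ignore-not-found
  done

The "the server doesn't have a resource type" errors are benign at this point: on a clean cluster the CRDs were never installed, so there is nothing to patch or delete.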
-----------------------------------------------------------------------------------
start PSMDB operator
-----------------------------------------------------------------------------------
customresourcedefinition.apiextensions.k8s.io/perconaservermongodbbackups.psmdb.percona.com serverside-applied
customresourcedefinition.apiextensions.k8s.io/perconaservermongodbrestores.psmdb.percona.com serverside-applied
customresourcedefinition.apiextensions.k8s.io/perconaservermongodbs.psmdb.percona.com serverside-applied
clusterrole.rbac.authorization.k8s.io/percona-server-mongodb-operator created
serviceaccount/percona-server-mongodb-operator created
clusterrolebinding.rbac.authorization.k8s.io/service-account-percona-server-mongodb-operator created
deployment.apps/percona-server-mongodb-operator created
waiting for pod/percona-server-mongodb-operator-688cb5cf6c-mjcpk to be ready.OK
Print operator info from log
2025-05-21T22:59:10.305Z INFO setup Manager starting up {"gitCommit": "23826c2013e23b0dc83348b69100d3c7104a88f2", "gitBranch": "PR-1938-23826c20", "buildTime": "", "goVersion": "go1.24.3", "os": "linux", "arch": "amd64"}
-----------------------------------------------------------------------------------
destroy chaos-mesh
-----------------------------------------------------------------------------------
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
error: resource(s) were provided, but no name was specified
-----------------------------------------------------------------------------------
cleaned up all old namespaces
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
cleaned up old namespaces monitoring-2-0-28682
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
create namespace monitoring-2-0-28682
-----------------------------------------------------------------------------------
namespace "gke-managed-cim" deleted
namespace "gke-managed-system" deleted
namespace "gmp-public" deleted
namespace "gmp-system" deleted
namespace/monitoring-2-0-28682 created
Context "gke_cloud-dev-112233_us-central1-a_jen-psmdb-1938-23826c20-3-cluster5" modified.
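The operator bring-up above (CRDs server-side applied, RBAC, Deployment, readiness wait) likely corresponds to something along these lines; the manifest paths and deployment name are assumptions and are not taken from this log:

  # assumed operator deployment sequence
  kubectl apply --server-side --force-conflicts -f deploy/crd.yaml    # CRDs report "serverside-applied"
  kubectl apply -n psmdb-operator -f deploy/rbac.yaml                 # ClusterRole, ServiceAccount, binding
  kubectl apply -n psmdb-operator -f deploy/operator.yaml             # operator Deployment
  kubectl wait -n psmdb-operator deployment/percona-server-mongodb-operator \
    --for=condition=Available --timeout=300s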
-----------------------------------------------------------------------------------
deploy cert manager
-----------------------------------------------------------------------------------
namespace/cert-manager created
namespace/cert-manager labeled
namespace/cert-manager configured
customresourcedefinition.apiextensions.k8s.io/certificaterequests.cert-manager.io created
customresourcedefinition.apiextensions.k8s.io/certificates.cert-manager.io created
customresourcedefinition.apiextensions.k8s.io/challenges.acme.cert-manager.io created
customresourcedefinition.apiextensions.k8s.io/clusterissuers.cert-manager.io created
customresourcedefinition.apiextensions.k8s.io/issuers.cert-manager.io created
customresourcedefinition.apiextensions.k8s.io/orders.acme.cert-manager.io created
serviceaccount/cert-manager-cainjector created
serviceaccount/cert-manager created
serviceaccount/cert-manager-webhook created
clusterrole.rbac.authorization.k8s.io/cert-manager-cainjector created
clusterrole.rbac.authorization.k8s.io/cert-manager-controller-issuers created
clusterrole.rbac.authorization.k8s.io/cert-manager-controller-clusterissuers created
clusterrole.rbac.authorization.k8s.io/cert-manager-controller-certificates created
clusterrole.rbac.authorization.k8s.io/cert-manager-controller-orders created
clusterrole.rbac.authorization.k8s.io/cert-manager-controller-challenges created
clusterrole.rbac.authorization.k8s.io/cert-manager-controller-ingress-shim created
clusterrole.rbac.authorization.k8s.io/cert-manager-cluster-view created
clusterrole.rbac.authorization.k8s.io/cert-manager-view created
clusterrole.rbac.authorization.k8s.io/cert-manager-edit created
clusterrole.rbac.authorization.k8s.io/cert-manager-controller-approve:cert-manager-io created
clusterrole.rbac.authorization.k8s.io/cert-manager-controller-certificatesigningrequests created
clusterrole.rbac.authorization.k8s.io/cert-manager-webhook:subjectaccessreviews created
clusterrolebinding.rbac.authorization.k8s.io/cert-manager-cainjector created
clusterrolebinding.rbac.authorization.k8s.io/cert-manager-controller-issuers created
clusterrolebinding.rbac.authorization.k8s.io/cert-manager-controller-clusterissuers created
clusterrolebinding.rbac.authorization.k8s.io/cert-manager-controller-certificates created
clusterrolebinding.rbac.authorization.k8s.io/cert-manager-controller-orders created
clusterrolebinding.rbac.authorization.k8s.io/cert-manager-controller-challenges created
clusterrolebinding.rbac.authorization.k8s.io/cert-manager-controller-ingress-shim created
clusterrolebinding.rbac.authorization.k8s.io/cert-manager-controller-approve:cert-manager-io created
clusterrolebinding.rbac.authorization.k8s.io/cert-manager-controller-certificatesigningrequests created
clusterrolebinding.rbac.authorization.k8s.io/cert-manager-webhook:subjectaccessreviews created
role.rbac.authorization.k8s.io/cert-manager-cainjector:leaderelection created
role.rbac.authorization.k8s.io/cert-manager:leaderelection created
role.rbac.authorization.k8s.io/cert-manager-tokenrequest created
role.rbac.authorization.k8s.io/cert-manager-webhook:dynamic-serving created
rolebinding.rbac.authorization.k8s.io/cert-manager-cainjector:leaderelection created
rolebinding.rbac.authorization.k8s.io/cert-manager:leaderelection created
rolebinding.rbac.authorization.k8s.io/cert-manager-cert-manager-tokenrequest created
rolebinding.rbac.authorization.k8s.io/cert-manager-webhook:dynamic-serving created
service/cert-manager-cainjector created
service/cert-manager created
service/cert-manager-webhook created
deployment.apps/cert-manager-cainjector created
deployment.apps/cert-manager created
deployment.apps/cert-manager-webhook created
mutatingwebhookconfiguration.admissionregistration.k8s.io/cert-manager-webhook created
validatingwebhookconfiguration.admissionregistration.k8s.io/cert-manager-webhook created
Warning: resource namespaces/cert-manager is missing the kubectl.kubernetes.io/last-applied-configuration annotation which is required by kubectl apply. kubectl apply should only be used on resources created declaratively by either kubectl create --save-config or kubectl apply. The missing annotation will be patched automatically.
pod/cert-manager-6687d8765c-zb58b condition met
pod/cert-manager-cainjector-764498cfc8-c6pbs condition met
pod/cert-manager-webhook-74c74b87d7-dh9q7 condition met
-----------------------------------------------------------------------------------
install PMM Server
-----------------------------------------------------------------------------------
Error: uninstall: Release not loaded: monitoring: release: not found
Error: no repo named "stable" found
"stable" has been added to your repositories
NAME: monitoring
LAST DEPLOYED: Wed May 21 23:02:09 2025
NAMESPACE: monitoring-2-0-28682
STATUS: deployed
REVISION: 1
TEST SUITE: None
NOTES:
PMM server can be accessed via HTTPS (port 443) on the following DNS name from within your cluster:
endpoint: https://monitoring-service.monitoring-2-0-28682.svc.cluster.local:443
login: admin
password: admin
error: Internal error occurred: unable to upgrade connection: container not found ("monitoring")
error: Internal error occurred: unable to upgrade connection: container not found ("monitoring")
error: Internal error occurred: unable to upgrade connection: container not found ("monitoring")
error: Internal error occurred: unable to upgrade connection: container not found ("monitoring")
Retry 0
error: Internal error occurred: unable to upgrade connection: container not found ("monitoring")
error: Internal error occurred: unable to upgrade connection: container not found ("monitoring")
error: Internal error occurred: unable to upgrade connection: container not found ("monitoring")
error: Internal error occurred: unable to upgrade connection: container not found ("monitoring")
Retry 1
-----------------------------------------------------------------------------------
create secrets and start client
-----------------------------------------------------------------------------------
secret/some-users created
secret/some-users unchanged
deployment.apps/psmdb-client created
-----------------------------------------------------------------------------------
create first PSMDB cluster monitoring
-----------------------------------------------------------------------------------
perconaservermongodb.psmdb.percona.com/monitoring created
waiting for pod/monitoring-rs0-0 to be ready............OK
waiting for pod/monitoring-rs0-1 to be ready..........OK
waiting for pod/monitoring-rs0-2 to be ready...........OK
Waiting for cluster readyness..................
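The "install PMM Server" step is a Helm deployment. A rough, assumed equivalent of what the log shows is sketched below; only the release name and namespace come from the output above, while the repo URL, chart name and values are guesses:

  # assumed PMM Server install
  helm uninstall monitoring -n monitoring-2-0-28682 || true    # "Release not loaded" is expected on a fresh run
  helm repo add stable https://percona.github.io/percona-helm-charts/
  helm install monitoring stable/pmm -n monitoring-2-0-28682

The repeated 'container not found ("monitoring")' errors simply mean the PMM pod had not started its container yet when the harness tried to exec into it; the harness retries ("Retry 0", "Retry 1") until the exec succeeds.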
-----------------------------------------------------------------------------------
check if pmm-client container is not enabled
-----------------------------------------------------------------------------------
Percona Server for MongoDB shell version v4.4.29-28
connecting to: mongodb://monitoring-mongos.monitoring-2-0-28682.svc.cluster.local:27019/admin?compressors=disabled&gssapiServiceName=mongodb
{"t":{"$date":"2025-05-21T23:07:28.889Z"},"s":"I", "c":"NETWORK", "id":5490002, "ctx":"thread1","msg":"Started a new thread for the timer service"}
Implicit session: session { "id" : UUID("48692669-c405-4fec-85f5-c79092b847d0") }
Percona Server for MongoDB server version: v7.0.18-11
WARNING: shell and server versions do not match
Successfully added user: { "user" : "myApp", "roles" : [ { "db" : "myApp", "role" : "readWrite" } ] }
bye
Percona Server for MongoDB shell version v4.4.29-28
connecting to: mongodb://monitoring-mongos.monitoring-2-0-28682.svc.cluster.local:27019/admin?compressors=disabled&gssapiServiceName=mongodb
{"t":{"$date":"2025-05-21T23:07:31.848Z"},"s":"I", "c":"NETWORK", "id":5490002, "ctx":"thread1","msg":"Started a new thread for the timer service"}
Implicit session: session { "id" : UUID("291f1391-68cf-4ec5-b142-a17c508d2c17") }
Percona Server for MongoDB server version: v7.0.18-11
WARNING: shell and server versions do not match
{ "ok" : 1, "$clusterTime" : { "clusterTime" : Timestamp(1747868852, 8), "signature" : { "hash" : BinData(0,"Tfm3f+5hIk52PbPnEb678CatM9s="), "keyId" : NumberLong("7507039170490007556") } }, "operationTime" : Timestamp(1747868852, 2) }
bye
Percona Server for MongoDB shell version v4.4.29-28
connecting to: mongodb://monitoring-mongos.monitoring-2-0-28682.svc.cluster.local:27019/admin?compressors=disabled&gssapiServiceName=mongodb
{"t":{"$date":"2025-05-21T23:07:34.111Z"},"s":"I", "c":"NETWORK", "id":5490002, "ctx":"thread1","msg":"Started a new thread for the timer service"}
Implicit session: session { "id" : UUID("ff1cc782-ce53-490a-81cf-ffe561965803") }
Percona Server for MongoDB server version: v7.0.18-11
WARNING: shell and server versions do not match
switched to db myApp
WriteResult({ "nInserted" : 1 })
bye
Percona Server for MongoDB shell version v4.4.29-28
connecting to: mongodb://monitoring-mongos.monitoring-2-0-28682.svc.cluster.local:27019/admin?compressors=disabled&gssapiServiceName=mongodb
{"t":{"$date":"2025-05-21T23:07:37.127Z"},"s":"I", "c":"NETWORK", "id":5490002, "ctx":"thread1","msg":"Started a new thread for the timer service"}
Implicit session: session { "id" : UUID("2c7791eb-b6e3-46dc-a4af-43d7305a09ff") }
Percona Server for MongoDB server version: v7.0.18-11
WARNING: shell and server versions do not match
switched to db myApp
WriteResult({ "nInserted" : 1 })
bye
Percona Server for MongoDB shell version v4.4.29-28
connecting to: mongodb://monitoring-mongos.monitoring-2-0-28682.svc.cluster.local:27019/admin?compressors=disabled&gssapiServiceName=mongodb
{"t":{"$date":"2025-05-21T23:07:39.977Z"},"s":"I", "c":"NETWORK", "id":5490002, "ctx":"thread1","msg":"Started a new thread for the timer service"}
Implicit session: session { "id" : UUID("39793af5-9005-44ce-827b-990b33a54d87") }
Percona Server for MongoDB server version: v7.0.18-11
WARNING: shell and server versions do not match
switched to db myApp
WriteResult({ "nInserted" : 1 })
bye
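The mongo shell sessions above (create the myApp user, then a few inserts) are presumably driven through the psmdb-client Deployment. A sketch of the likely invocation follows; the selector label, the admin user and the application password are assumptions based on the secret dump later in this log:

  # assumed helper: run a mongo command from the client pod against the mongos service
  client_pod=$(kubectl -n monitoring-2-0-28682 get pods -l name=psmdb-client \
    -o jsonpath='{.items[0].metadata.name}')
  kubectl -n monitoring-2-0-28682 exec "$client_pod" -- mongo \
    "mongodb://userAdmin:userAdmin123456@monitoring-mongos.monitoring-2-0-28682.svc.cluster.local:27019/admin" \
    --eval 'db.createUser({user: "myApp", pwd: "myPass", roles: [{db: "myApp", role: "readWrite"}]})'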
-----------------------------------------------------------------------------------
add PMM_SERVER_API_KEY for secret some-users
-----------------------------------------------------------------------------------
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100    36    0     0  100    36      0     86 --:--:-- --:--:-- --:--:--    86
100   155  100   119  100    36    260     78 --:--:-- --:--:-- --:--:--   339
secret/some-users patched
-----------------------------------------------------------------------------------
check if all 3 Pods started
-----------------------------------------------------------------------------------
waiting for pod/monitoring-rs0-0 to be ready.OK
waiting for pod/monitoring-rs0-1 to be ready.OK
waiting for pod/monitoring-rs0-2 to be ready.OK
Waiting for cluster readyness..................................................................................................................................
-----------------------------------------------------------------------------------
check if pmm-client container enabled
-----------------------------------------------------------------------------------
-----------------------------------------------------------------------------------
check mongod metrics
-----------------------------------------------------------------------------------
"1747865548"
"1747865548"
"0"
"0"
-----------------------------------------------------------------------------------
check mongo config metrics
-----------------------------------------------------------------------------------
"1747865548"
"1747865548"
"0"
"0"
-----------------------------------------------------------------------------------
check mongos metrics
-----------------------------------------------------------------------------------
"1747865553"
"1747865553"
-----------------------------------------------------------------------------------
check QAN data
-----------------------------------------------------------------------------------
perconaservermongodb.psmdb.percona.com/monitoring patched
waiting for pod/monitoring-mongos-0 to be deleted............................Error from server (NotFound): pods "monitoring-mongos-0" not found
Error from server (NotFound): pods "monitoring-mongos-0" not found
Error from server (NotFound): pods "monitoring-mongos-0" not found
Error from server (NotFound): pods "monitoring-mongos-0" not found
waiting for pod/monitoring-rs0-0 to be deleted.......Error from server (NotFound): pods "monitoring-rs0-0" not found
Error from server (NotFound): pods "monitoring-rs0-0" not found
Error from server (NotFound): pods "monitoring-rs0-0" not found
Error from server (NotFound): pods "monitoring-rs0-0" not found
waiting for pod/monitoring-cfg-0 to be deleted.....Error from server (NotFound): pods "monitoring-cfg-0" not found
Error from server (NotFound): pods "monitoring-cfg-0" not found
Error from server (NotFound): pods "monitoring-cfg-0" not found
Error from server (NotFound): pods "monitoring-cfg-0" not found
-----------------------------------------------------------------------------------
check if services are not deleted
-----------------------------------------------------------------------------------
NAME             TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)     AGE
monitoring-rs0   ClusterIP   None         <none>        27019/TCP   14m
NAME             TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)     AGE
monitoring-cfg   ClusterIP   None         <none>        27019/TCP   14m
NAME                TYPE        CLUSTER-IP       EXTERNAL-IP   PORT(S)     AGE
monitoring-mongos   ClusterIP   34.118.235.109   <none>        27019/TCP   14m
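For the "add PMM_SERVER_API_KEY for secret some-users" step above (the curl transfer table followed by "secret/some-users patched"), the flow is presumably: create a Grafana API key on the PMM server and store it in the users secret so the pmm-client sidecars can authenticate. A sketch, with the endpoint, payload and key name as assumptions:

  # assumed API-key provisioning for pmm-client
  api_key=$(curl -sk -u admin:admin -H 'Content-Type: application/json' \
    -d '{"name":"operator", "role":"Admin"}' \
    "https://monitoring-service.monitoring-2-0-28682.svc.cluster.local/graph/api/auth/keys" | jq -r .key)
  kubectl -n monitoring-2-0-28682 patch secret some-users --type=merge \
    -p "{\"stringData\":{\"PMM_SERVER_API_KEY\":\"$api_key\"}}"

The subsequent "check if pmm-client container enabled" and metrics sections then verify that the sidecar came up and that PMM is receiving mongod, config server and mongos samples (the quoted timestamps are the returned metric values).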
-----------------------------------------------------------------------------------
check customClusterName for pmm
-----------------------------------------------------------------------------------
perconaservermongodb.psmdb.percona.com/monitoring patched
waiting for pod/monitoring-rs0-0 to be ready............OK
waiting for pod/monitoring-rs0-1 to be ready..........OK
waiting for pod/monitoring-rs0-2 to be ready...........OK
Waiting for cluster readyness....
Checking monitoring-2-0-28682-monitoring-mongos-0
Checking monitoring-2-0-28682-monitoring-rs0-0
Checking monitoring-2-0-28682-monitoring-cfg-0
-----------------------------------------------------------------------------------
check for passwords leak
-----------------------------------------------------------------------------------
secrets=YmFja3VwMTIzNDU2 YmFja3VwMTIzNDU2 Y2x1c3RlckFkbWluMTIzNDU2 Y2x1c3RlckFkbWluMTIzNDU2 Y2x1c3Rlck1vbml0b3IxMjM0NTY= Y2x1c3Rlck1vbml0b3IxMjM0NTY= ZGF0YWJhc2VBZG1pbjEyMzQ1Ng== ZGF0YWJhc2VBZG1pbjEyMzQ1Ng== dXNlckFkbWluMTIzNDU2 dXNlckFkbWluMTIzNDU2 YmFja3VwMTIzNDU2 Y2x1c3RlckFkbWluMTIzNDU2 Y2x1c3Rlck1vbml0b3IxMjM0NTY= ZGF0YWJhc2VBZG1pbjEyMzQ1Ng== dXNlckFkbWluMTIzNDU2
passwords=backup123456 backup123456 clusterAdmin123456 clusterAdmin123456 clusterMonitor123456 clusterMonitor123456 databaseAdmin123456 databaseAdmin123456 userAdmin123456 userAdmin123456 backup123456 clusterAdmin123456 clusterMonitor123456 databaseAdmin123456 userAdmin123456
pods=monitoring-0 monitoring-cfg-0 monitoring-cfg-1 monitoring-cfg-2 monitoring-mongos-0 monitoring-mongos-1 monitoring-mongos-2 monitoring-rs0-0 monitoring-rs0-1 monitoring-rs0-2 psmdb-client-6fdf7b7479-8c75k
logs saved in: /tmp/tmp.oTr39CAbrD/logs_output-monitoring-0-monitoring.txt
logs saved in: /tmp/tmp.oTr39CAbrD/logs_output-monitoring-cfg-0-mongod.txt
logs saved in: /tmp/tmp.oTr39CAbrD/logs_output-monitoring-cfg-1-mongod.txt
logs saved in: /tmp/tmp.oTr39CAbrD/logs_output-monitoring-cfg-2-mongod.txt
logs saved in: /tmp/tmp.oTr39CAbrD/logs_output-monitoring-mongos-0-mongos.txt
logs saved in: /tmp/tmp.oTr39CAbrD/logs_output-monitoring-mongos-1-mongos.txt
logs saved in: /tmp/tmp.oTr39CAbrD/logs_output-monitoring-mongos-2-mongos.txt
logs saved in: /tmp/tmp.oTr39CAbrD/logs_output-monitoring-rs0-0-mongod.txt
logs saved in: /tmp/tmp.oTr39CAbrD/logs_output-monitoring-rs0-1-mongod.txt
logs saved in: /tmp/tmp.oTr39CAbrD/logs_output-monitoring-rs0-2-mongod.txt
logs saved in: /tmp/tmp.oTr39CAbrD/logs_output-psmdb-client-6fdf7b7479-8c75k-psmdb-client.txt
Error from server: Get "https://10.212.0.126:10250/containerLogs/psmdb-operator/percona-server-mongodb-operator-688cb5cf6c-ckj2t/percona-server-mongodb-operator": No agent available
Error from server: Get "https://10.212.0.126:10250/containerLogs/psmdb-operator/percona-server-mongodb-operator-688cb5cf6c-ckj2t/percona-server-mongodb-operator": No agent available
logs saved in: /tmp/tmp.oTr39CAbrD/logs_output-percona-server-mongodb-operator-688cb5cf6c-ckj2t-percona-server-mongodb-operator.txt
Error from server: Get "https://10.212.15.192:10250/containerLogs/psmdb-operator/percona-server-mongodb-operator-688cb5cf6c-mjcpk/percona-server-mongodb-operator": http2: client connection lost
Error from server: Get "https://10.212.15.192:10250/containerLogs/psmdb-operator/percona-server-mongodb-operator-688cb5cf6c-mjcpk/percona-server-mongodb-operator": dial tcp 10.212.15.192:10250: i/o timeout
Error from server: Get "https://10.212.15.192:10250/containerLogs/psmdb-operator/percona-server-mongodb-operator-688cb5cf6c-mjcpk/percona-server-mongodb-operator": dial tcp 10.212.15.192:10250: i/o timeout
Error from server: Get "https://10.212.15.192:10250/containerLogs/psmdb-operator/percona-server-mongodb-operator-688cb5cf6c-mjcpk/percona-server-mongodb-operator": dial tcp 10.212.15.192:10250: i/o timeout