k8s ServiceAccount: certificate verification error when specifying the apiserver from inside the cluster

On the host, with the service account TOKEN in hand, the apiserver can be targeted directly with --server and the call succeeds:

root@ubuntu-server:/home# kubectl auth can-i --list --server https://192.168.85.198:6443 --token="eyJhbGciOiJSUzI1NiIsImtpZCI6IlFlMHQ3TzhpcGw1SnRqbkYtOC1NUWlWNUpWdGo5SGRXeTBvZU9ib25iZDQifQ.eyJhdWQiOlsiaHR0cHM6Ly9rdWJlcm5ldGVzLmRlZmF1bHQuc3ZjLmNsdXN0ZXIubG9jYWwiXSwiZXhwIjoxNzczNjIzNDc1LCJpYXQiOjE3NDIwODc0NzUsImlzcyI6Imh0dHBzOi8va3ViZXJuZXRlcy5kZWZhdWx0LnN2Yy5jbHVzdGVyLmxvY2FsIiwia3ViZXJuZXRlcy5pbyI6eyJuYW1lc3BhY2UiOiJkZWZhdWx0IiwicG9kIjp7Im5hbWUiOiJob25leXBvZC01ZDZiY2RiY2RiLXM3NjRoIiwidWlkIjoiOTY1ZjY2ZDEtNmNhNC00NjBiLTg0NDQtOTRmOTUwOTZkMDk4In0sInNlcnZpY2VhY2NvdW50Ijp7Im5hbWUiOiJkZWZhdWx0IiwidWlkIjoiYzYwYTk4ODctMDlhMS00NWRlLTg2OWItNzhhNjdkMGRkNWZiIn0sIndhcm5hZnRlciI6MTc0MjA5MTA4Mn0sIm5iZiI6MTc0MjA4NzQ3NSwic3ViIjoic3lzdGVtOnNlcnZpY2VhY2NvdW50OmRlZmF1bHQ6ZGVmYXVsdCJ9.ct6yPMHVLnO3k51x7soOO2iV1dboXU2G2rTTAMuxboOMBkjruX_El7tXOuWEhuSGEECpSTxU9iXbPYV5KWCsUwCSrZd1yqlXle6u6h2KAlrZAqKHOe2PCGau6SUhnGwlGNz1z_98mV58cQbHCFzlHhN-2AkLbxYCgtFnD_ggIDQlNb_3uiTneFr7DBSDEp874vGdkh_E7fWSNPFKJQTxQuPZWuOIyz-LaHWGvCmLc5zJgxc_68Zluy-3jyvGIJdgcCXp9Wz51DQRp0NOeuqmZ0bhsNDwqBxZ6DM6dOtzGWaB8k8npd1QxxTmShYdwoOgube-x0-JnUiY5Z3Mf7Sd-A" -v=9
I0316 01:39:25.883868  907134 loader.go:372] Config loaded from file:  /root/.kube/config
I0316 01:39:25.884864  907134 request.go:1181] Request Body: {"kind":"SelfSubjectRulesReview","apiVersion":"authorization.k8s.io/v1","metadata":{"creationTimestamp":null},"spec":{"namespace":"default"},"status":{"resourceRules":null,"nonResourceRules":null,"incomplete":false}}
I0316 01:39:25.885088  907134 round_trippers.go:466] curl -v -XPOST  -H "Accept: application/json, */*" -H "Content-Type: application/json" -H "User-Agent: kubectl/v1.23.6 (linux/amd64) kubernetes/ad33385" -H "Authorization: Bearer <masked>" 'https://192.168.85.198:6443/apis/authorization.k8s.io/v1/selfsubjectrulesreviews'
I0316 01:39:25.885791  907134 round_trippers.go:510] HTTP Trace: Dial to tcp:192.168.85.198:6443 succeed
I0316 01:39:25.890543  907134 round_trippers.go:570] HTTP Statistics: DNSLookup 0 ms Dial 0 ms TLSHandshake 3 ms ServerProcessing 0 ms Duration 5 ms
I0316 01:39:25.890575  907134 round_trippers.go:577] Response Headers:
I0316 01:39:25.890579  907134 round_trippers.go:580]     Content-Type: application/json
I0316 01:39:25.890590  907134 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: 30268247-c126-47b4-a157-a1758fc4219c
I0316 01:39:25.890595  907134 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: cfdb4739-b1dc-4192-9ab4-2ad063a0e27d
I0316 01:39:25.890600  907134 round_trippers.go:580]     Content-Length: 647
I0316 01:39:25.890604  907134 round_trippers.go:580]     Date: Sun, 16 Mar 2025 01:39:25 GMT
I0316 01:39:25.890609  907134 round_trippers.go:580]     Audit-Id: 396e9b4f-70c1-46b3-a6dd-9bceef4b6fd7
I0316 01:39:25.890613  907134 round_trippers.go:580]     Cache-Control: no-cache, private
I0316 01:39:25.890636  907134 request.go:1181] Response Body: {"kind":"SelfSubjectRulesReview","apiVersion":"authorization.k8s.io/v1","metadata":{"creationTimestamp":null},"spec":{},"status":{"resourceRules":[{"verbs":["*"],"apiGroups":["*"],"resources":["*"]},{"verbs":["create"],"apiGroups":["authorization.k8s.io"],"resources":["selfsubjectaccessreviews","selfsubjectrulesreviews"]}],"nonResourceRules":[{"verbs":["get"],"nonResourceURLs":["/api","/api/*","/apis","/apis/*","/healthz","/livez","/openapi","/openapi/*","/readyz","/version","/version/"]},{"verbs":["*"],"nonResourceURLs":["*"]},{"verbs":["get"],"nonResourceURLs":["/healthz","/livez","/readyz","/version","/version/"]}],"incomplete":false}}
Resources                                       Non-Resource URLs   Resource Names   Verbs
*.*                                             []                  []               [*]
                                                [*]                 []               [*]
selfsubjectaccessreviews.authorization.k8s.io   []                  []               [create]
selfsubjectrulesreviews.authorization.k8s.io    []                  []               [create]
                                                [/api/*]            []               [get]
                                                [/api]              []               [get]
                                                [/apis/*]           []               [get]
                                                [/apis]             []               [get]
                                                [/healthz]          []               [get]
                                                [/healthz]          []               [get]
                                                [/livez]            []               [get]
                                                [/livez]            []               [get]
                                                [/openapi/*]        []               [get]
                                                [/openapi]          []               [get]
                                                [/readyz]           []               [get]
                                                [/readyz]           []               [get]
                                                [/version/]         []               [get]
                                                [/version/]         []               [get]
                                                [/version]          []               [get]
                                                [/version]          []               [get]
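Why does the same token work from the host? The verbose output above shows that kubectl still loads /root/.kube/config, and that kubeconfig's cluster entry presumably carries the cluster CA data, so the TLS handshake can be verified even with --server given explicitly. A quick way to confirm what the host kubeconfig trusts (a sketch; the jsonpath assumes a single, standard cluster entry):

# Show the cluster entry (server + certificate-authority-data) from the host kubeconfig
kubectl config view --minify --raw -o jsonpath='{.clusters[0].cluster}'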

Inside the container, kubectl also works as long as the --server flag is not given; it falls back to the in-cluster configuration.
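
For context, the in-cluster configuration is assembled from the service account files that Kubernetes mounts into every pod. The listing below is a sketch of the standard contents (the file names are the Kubernetes defaults, shown here only for orientation):

# Standard service account mount inside the pod
ls /run/secrets/kubernetes.io/serviceaccount
# ca.crt  namespace  token
#   ca.crt    - the cluster CA used to verify the apiserver certificate
#   namespace - the pod's namespace ("default" here)
#   token     - the bearer token used in the examples above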

But as soon as --server is specified explicitly, the request fails:

root@honeypod-5d6bcdbcdb-s764h:/run/secrets/kubernetes.io/serviceaccount# /home/kubectl auth can-i --list --server https://10.96.0.1:443 --token="eyJhbGciOiJSUzI1NiIsImtpZCI6IlFlMHQ3TzhpcGw1SnRqbkYtOC1NUWlWNUpWdGo5SGRXeTBvZU9ib25iZDQifQ.eyJhdWQiOlsiaHR0cHM6Ly9rdWJlcm5ldGVzLmRlZmF1bHQuc3ZjLmNsdXN0ZXIubG9jYWwiXSwiZXhwIjoxNzczNjIzNDc1LCJpYXQiOjE3NDIwODc0NzUsImlzcyI6Imh0dHBzOi8va3ViZXJuZXRlcy5kZWZhdWx0LnN2Yy5jbHVzdGVyLmxvY2FsIiwia3ViZXJuZXRlcy5pbyI6eyJuYW1lc3BhY2UiOiJkZWZhdWx0IiwicG9kIjp7Im5hbWUiOiJob25leXBvZC01ZDZiY2RiY2RiLXM3NjRoIiwidWlkIjoiOTY1ZjY2ZDEtNmNhNC00NjBiLTg0NDQtOTRmOTUwOTZkMDk4In0sInNlcnZpY2VhY2NvdW50Ijp7Im5hbWUiOiJkZWZhdWx0IiwidWlkIjoiYzYwYTk4ODctMDlhMS00NWRlLTg2OWItNzhhNjdkMGRkNWZiIn0sIndhcm5hZnRlciI6MTc0MjA5MTA4Mn0sIm5iZiI6MTc0MjA4NzQ3NSwic3ViIjoic3lzdGVtOnNlcnZpY2VhY2NvdW50OmRlZmF1bHQ6ZGVmYXVsdCJ9.ct6yPMHVLnO3k51x7soOO2iV1dboXU2G2rTTAMuxboOMBkjruX_El7tXOuWEhuSGEECpSTxU9iXbPYV5KWCsUwCSrZd1yqlXle6u6h2KAlrZAqKHOe2PCGau6SUhnGwlGNz1z_98mV58cQbHCFzlHhN-2AkLbxYCgtFnD_ggIDQlNb_3uiTneFr7DBSDEp874vGdkh_E7fWSNPFKJQTxQuPZWuOIyz-LaHWGvCmLc5zJgxc_68Zluy-3jyvGIJdgcCXp9Wz51DQRp0NOeuqmZ0bhsNDwqBxZ6DM6dOtzGWaB8k8npd1QxxTmShYdwoOgube-x0-JnUiY5Z3Mf7Sd-A" -v=9
I0316 01:38:45.999930     257 merged_client_builder.go:163] Using in-cluster namespace
I0316 01:38:46.000634     257 request.go:1181] Request Body: {"kind":"SelfSubjectRulesReview","apiVersion":"authorization.k8s.io/v1","metadata":{"creationTimestamp":null},"spec":{"namespace":"default"},"status":{"resourceRules":null,"nonResourceRules":null,"incomplete":false}}
I0316 01:38:46.000985     257 round_trippers.go:466] curl -v -XPOST  -H "Authorization: Bearer <masked>" -H "Accept: application/json, */*" -H "Content-Type: application/json" -H "User-Agent: kubectl/v1.23.6 (linux/amd64) kubernetes/ad33385" 'https://10.96.0.1:443/apis/authorization.k8s.io/v1/selfsubjectrulesreviews'
I0316 01:38:46.001798     257 round_trippers.go:510] HTTP Trace: Dial to tcp:10.96.0.1:443 succeed
I0316 01:38:46.004178     257 round_trippers.go:570] HTTP Statistics: DNSLookup 0 ms Dial 0 ms TLSHandshake 2 ms Duration 3 ms
I0316 01:38:46.004211     257 round_trippers.go:577] Response Headers:
I0316 01:38:46.004273     257 helpers.go:237] Connection error: Post https://10.96.0.1:443/apis/authorization.k8s.io/v1/selfsubjectrulesreviews: x509: certificate signed by unknown authority
F0316 01:38:46.004301     257 helpers.go:118] Unable to connect to the server: x509: certificate signed by unknown authority
goroutine 1 [running]:
k8s.io/kubernetes/vendor/k8s.io/klog/v2.stacks(0x1)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:1038 +0x8a
k8s.io/kubernetes/vendor/k8s.io/klog/v2.(*loggingT).output(0x3080040, 0x3, 0x0, 0xc00013e070, 0x2, {0x25f2ec7, 0x10}, 0xc00005c800, 0x0)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:987 +0x5fd
k8s.io/kubernetes/vendor/k8s.io/klog/v2.(*loggingT).printDepth(0xc00051f400, 0x4e, 0x0, {0x0, 0x0}, 0x55, {0xc000049450, 0x1, 0x1})/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:735 +0x1ae
k8s.io/kubernetes/vendor/k8s.io/klog/v2.FatalDepth(...)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:1518
k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util.fatal({0xc00051f400, 0x4e}, 0xc0004f7ce0)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util/helpers.go:96 +0xc5
k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util.checkErr({0x1fed760, 0xc0004f7ce0}, 0x1e797d0)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util/helpers.go:191 +0x7d7
k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util.CheckErr(...)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util/helpers.go:118
k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/auth.NewCmdCanI.func1(0xc0003e8000, {0xc000104460, 0x0, 0x5})/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/auth/cani.go:133 +0xc5
k8s.io/kubernetes/vendor/github.com/spf13/cobra.(*Command).execute(0xc0003e8000, {0xc000104410, 0x5, 0x5})/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/spf13/cobra/command.go:860 +0x5f8
k8s.io/kubernetes/vendor/github.com/spf13/cobra.(*Command).ExecuteC(0xc00057a000)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/spf13/cobra/command.go:974 +0x3bc
k8s.io/kubernetes/vendor/github.com/spf13/cobra.(*Command).Execute(...)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/spf13/cobra/command.go:902
k8s.io/kubernetes/vendor/k8s.io/component-base/cli.run(0xc00057a000)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/component-base/cli/run.go:146 +0x325
k8s.io/kubernetes/vendor/k8s.io/component-base/cli.RunNoErrOutput(...)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/component-base/cli/run.go:84
main.main()_output/dockerized/go/src/k8s.io/kubernetes/cmd/kubectl/kubectl.go:30 +0x1e
goroutine 6 [chan receive]:
k8s.io/kubernetes/vendor/k8s.io/klog/v2.(*loggingT).flushDaemon(0x0)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:1181 +0x6a
created by k8s.io/kubernetes/vendor/k8s.io/klog/v2.init.0/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:420 +0xfb
goroutine 18 [select]:
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x0, {0x1febb40, 0xc000204000}, 0x1, 0xc00007e360)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:167 +0x13b
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x0, 0x12a05f200, 0x0, 0x0, 0x0)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:133 +0x89
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.Until(...)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:90
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.Forever(0x0, 0x0)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:81 +0x28
created by k8s.io/kubernetes/vendor/k8s.io/component-base/logs.InitLogs/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/component-base/logs/logs.go:179 +0x85
root@honeypod-5d6bcdbcdb-s764h:/run/secrets/kubernetes.io/serviceaccount# exit

Specifying the TOKEN (this time against the node address) also fails:

root@honeypod-5d6bcdbcdb-s764h:/# /home/kubectl auth can-i --list --server https://192.168.85.198:443 --token="eyJhbGciOiJSUzI1NiIsImtpZCI6IlFlMHQ3TzhpcGw1SnRqbkYtOC1NUWlWNUpWdGo5SGRXeTBvZU9ib25iZDQifQ.eyJhdWQiOlsiaHR0cHM6Ly9rdWJlcm5ldGVzLmRlZmF1bHQuc3ZjLmNsdXN0ZXIubG9jYWwiXSwiZXhwIjoxNzczNjIzNDc1LCJpYXQiOjE3NDIwODc0NzUsImlzcyI6Imh0dHBzOi8va3ViZXJuZXRlcy5kZWZhdWx0LnN2Yy5jbHVzdGVyLmxvY2FsIiwia3ViZXJuZXRlcy5pbyI6eyJuYW1lc3BhY2UiOiJkZWZhdWx0IiwicG9kIjp7Im5hbWUiOiJob25leXBvZC01ZDZiY2RiY2RiLXM3NjRoIiwidWlkIjoiOTY1ZjY2ZDEtNmNhNC00NjBiLTg0NDQtOTRmOTUwOTZkMDk4In0sInNlcnZpY2VhY2NvdW50Ijp7Im5hbWUiOiJkZWZhdWx0IiwidWlkIjoiYzYwYTk4ODctMDlhMS00NWRlLTg2OWItNzhhNjdkMGRkNWZiIn0sIndhcm5hZnRlciI6MTc0MjA5MTA4Mn0sIm5iZiI6MTc0MjA4NzQ3NSwic3ViIjoic3lzdGVtOnNlcnZpY2VhY2NvdW50OmRlZmF1bHQ6ZGVmYXVsdCJ9.ct6yPMHVLnO3k51x7soOO2iV1dboXU2G2rTTAMuxboOMBkjruX_El7tXOuWEhuSGEECpSTxU9iXbPYV5KWCsUwCSrZd1yqlXle6u6h2KAlrZAqKHOe2PCGau6SUhnGwlGNz1z_98mV58cQbHCFzlHhN-2AkLbxYCgtFnD_ggIDQlNb_3uiTneFr7DBSDEp874vGdkh_E7fWSNPFKJQTxQuPZWuOIyz-LaHWGvCmLc5zJgxc_68Zluy-3jyvGIJdgcCXp9Wz51DQRp0NOeuqmZ0bhsNDwqBxZ6DM6dOtzGWaB8k8npd1QxxTmShYdwoOgube-x0-JnUiY5Z3Mf7Sd-A" -v=9
I0316 01:44:43.689347     299 merged_client_builder.go:163] Using in-cluster namespace
I0316 01:44:43.689883     299 request.go:1181] Request Body: {"kind":"SelfSubjectRulesReview","apiVersion":"authorization.k8s.io/v1","metadata":{"creationTimestamp":null},"spec":{"namespace":"default"},"status":{"resourceRules":null,"nonResourceRules":null,"incomplete":false}}
I0316 01:44:43.690104     299 round_trippers.go:466] curl -v -XPOST  -H "Accept: application/json, */*" -H "Authorization: Bearer <masked>" -H "Content-Type: application/json" -H "User-Agent: kubectl/v1.23.6 (linux/amd64) kubernetes/ad33385" 'https://192.168.85.198:443/apis/authorization.k8s.io/v1/selfsubjectrulesreviews'
I0316 01:44:43.690883     299 round_trippers.go:510] HTTP Trace: Dial to tcp:192.168.85.198:443 succeed
I0316 01:44:43.693382     299 round_trippers.go:570] HTTP Statistics: DNSLookup 0 ms Dial 0 ms TLSHandshake 2 ms Duration 3 ms
I0316 01:44:43.693520     299 round_trippers.go:577] Response Headers:
I0316 01:44:43.693576     299 helpers.go:237] Connection error: Post https://192.168.85.198:443/apis/authorization.k8s.io/v1/selfsubjectrulesreviews: x509: certificate signed by unknown authority
F0316 01:44:43.693620     299 helpers.go:118] Unable to connect to the server: x509: certificate signed by unknown authority
goroutine 1 [running]:
k8s.io/kubernetes/vendor/k8s.io/klog/v2.stacks(0x1)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:1038 +0x8a
k8s.io/kubernetes/vendor/k8s.io/klog/v2.(*loggingT).output(0x3080040, 0x3, 0x0, 0xc0003e6000, 0x2, {0x25f2ec7, 0x10}, 0xc00005c800, 0x0)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:987 +0x5fd
k8s.io/kubernetes/vendor/k8s.io/klog/v2.(*loggingT).printDepth(0xc00039e0a0, 0x4e, 0x0, {0x0, 0x0}, 0x5a, {0xc0003efc10, 0x1, 0x1})/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:735 +0x1ae
k8s.io/kubernetes/vendor/k8s.io/klog/v2.FatalDepth(...)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:1518
k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util.fatal({0xc00039e0a0, 0x4e}, 0xc000543ce0)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util/helpers.go:96 +0xc5
k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util.checkErr({0x1fed760, 0xc000543ce0}, 0x1e797d0)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util/helpers.go:191 +0x7d7
k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util.CheckErr(...)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util/helpers.go:118
k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/auth.NewCmdCanI.func1(0xc00045b180, {0xc000492410, 0x0, 0x5})/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/auth/cani.go:133 +0xc5
k8s.io/kubernetes/vendor/github.com/spf13/cobra.(*Command).execute(0xc00045b180, {0xc0004923c0, 0x5, 0x5})/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/spf13/cobra/command.go:860 +0x5f8
k8s.io/kubernetes/vendor/github.com/spf13/cobra.(*Command).ExecuteC(0xc0000a6f00)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/spf13/cobra/command.go:974 +0x3bc
k8s.io/kubernetes/vendor/github.com/spf13/cobra.(*Command).Execute(...)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/spf13/cobra/command.go:902
k8s.io/kubernetes/vendor/k8s.io/component-base/cli.run(0xc0000a6f00)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/component-base/cli/run.go:146 +0x325
k8s.io/kubernetes/vendor/k8s.io/component-base/cli.RunNoErrOutput(...)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/component-base/cli/run.go:84
main.main()_output/dockerized/go/src/k8s.io/kubernetes/cmd/kubectl/kubectl.go:30 +0x1e
goroutine 6 [chan receive]:
k8s.io/kubernetes/vendor/k8s.io/klog/v2.(*loggingT).flushDaemon(0x0)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:1181 +0x6a
created by k8s.io/kubernetes/vendor/k8s.io/klog/v2.init.0/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:420 +0xfb
goroutine 18 [select]:
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x0, {0x1febb40, 0xc0005c0000}, 0x1, 0xc00007e360)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:167 +0x13b
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x0, 0x12a05f200, 0x0, 0x0, 0x0)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:133 +0x89
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.Until(...)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:90
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.Forever(0x0, 0x0)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:81 +0x28
created by k8s.io/kubernetes/vendor/k8s.io/component-base/logs.InitLogs/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/component-base/logs/logs.go:179 +0x85

Or, without a TOKEN at all, running the command directly fails the same way (kubectl falls back to prompting for basic-auth credentials):

root@honeypod-5d6bcdbcdb-s764h:/# /home//kubectl auth can-i --list --server https://192.168.85.198:6443 -v=9
Please enter Username: a
Please enter Password: I0316 01:42:49.432361     282 merged_client_builder.go:163] Using in-cluster namespace
I0316 01:42:49.432601     282 request.go:1181] Request Body: {"kind":"SelfSubjectRulesReview","apiVersion":"authorization.k8s.io/v1","metadata":{"creationTimestamp":null},"spec":{"namespace":"default"},"status":{"resourceRules":null,"nonResourceRules":null,"incomplete":false}}
I0316 01:42:49.432653     282 round_trippers.go:466] curl -v -XPOST  -H "Accept: application/json, */*" -H "Authorization: Basic <masked>" -H "Content-Type: application/json" -H "User-Agent: kubectl/v1.23.6 (linux/amd64) kubernetes/ad33385" 'https://192.168.85.198:6443/apis/authorization.k8s.io/v1/selfsubjectrulesreviews'
I0316 01:42:49.433134     282 round_trippers.go:510] HTTP Trace: Dial to tcp:192.168.85.198:6443 succeed
I0316 01:42:49.435363     282 round_trippers.go:570] HTTP Statistics: DNSLookup 0 ms Dial 0 ms TLSHandshake 2 ms Duration 2 ms
I0316 01:42:49.435372     282 round_trippers.go:577] Response Headers:
I0316 01:42:49.435400     282 helpers.go:237] Connection error: Post https://192.168.85.198:6443/apis/authorization.k8s.io/v1/selfsubjectrulesreviews: x509: certificate signed by unknown authority
F0316 01:42:49.435413     282 helpers.go:118] Unable to connect to the server: x509: certificate signed by unknown authority
goroutine 1 [running]:
k8s.io/kubernetes/vendor/k8s.io/klog/v2.stacks(0x1)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:1038 +0x8a
k8s.io/kubernetes/vendor/k8s.io/klog/v2.(*loggingT).output(0x3080040, 0x3, 0x0, 0xc0005d4000, 0x2, {0x25f2ec7, 0x10}, 0xc00005cc00, 0x0)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:987 +0x5fd
k8s.io/kubernetes/vendor/k8s.io/klog/v2.(*loggingT).printDepth(0xc000047630, 0x4e, 0x0, {0x0, 0x0}, 0x5b, {0xc000048e50, 0x1, 0x1})/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:735 +0x1ae
k8s.io/kubernetes/vendor/k8s.io/klog/v2.FatalDepth(...)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:1518
k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util.fatal({0xc000047630, 0x4e}, 0xc0005383f0)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util/helpers.go:96 +0xc5
k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util.checkErr({0x1fed760, 0xc0005383f0}, 0x1e797d0)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util/helpers.go:191 +0x7d7
k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util.CheckErr(...)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util/helpers.go:118
k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/auth.NewCmdCanI.func1(0xc000672f00, {0xc000491140, 0x0, 0x4})/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/auth/cani.go:133 +0xc5
k8s.io/kubernetes/vendor/github.com/spf13/cobra.(*Command).execute(0xc000672f00, {0xc000491100, 0x4, 0x4})/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/spf13/cobra/command.go:860 +0x5f8
k8s.io/kubernetes/vendor/github.com/spf13/cobra.(*Command).ExecuteC(0xc0005d2f00)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/spf13/cobra/command.go:974 +0x3bc
k8s.io/kubernetes/vendor/github.com/spf13/cobra.(*Command).Execute(...)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/spf13/cobra/command.go:902
k8s.io/kubernetes/vendor/k8s.io/component-base/cli.run(0xc0005d2f00)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/component-base/cli/run.go:146 +0x325
k8s.io/kubernetes/vendor/k8s.io/component-base/cli.RunNoErrOutput(...)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/component-base/cli/run.go:84
main.main()_output/dockerized/go/src/k8s.io/kubernetes/cmd/kubectl/kubectl.go:30 +0x1e
goroutine 18 [chan receive]:
k8s.io/kubernetes/vendor/k8s.io/klog/v2.(*loggingT).flushDaemon(0x0)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:1181 +0x6a
created by k8s.io/kubernetes/vendor/k8s.io/klog/v2.init.0/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:420 +0xfb
goroutine 5 [select]:
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x0, {0x1febb40, 0xc000386a50}, 0x1, 0xc000090360)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:167 +0x13b
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x0, 0x12a05f200, 0x0, 0x0, 0x0)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:133 +0x89
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.Until(...)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:90
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.Forever(0x0, 0x0)/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:81 +0x28
created by k8s.io/kubernetes/vendor/k8s.io/component-base/logs.InitLogs/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/component-base/logs/logs.go:179 +0x85

Solution

Adding the --insecure-skip-tls-verify flag to the kubectl command is enough to make it work. The underlying cause appears to be that once --server (and/or --token) is passed on the command line, kubectl builds its client configuration from those flags alone rather than from the in-cluster configuration, so the cluster CA bundle mounted at /run/secrets/kubernetes.io/serviceaccount/ca.crt is never loaded and the apiserver certificate, signed by the cluster's own CA, is rejected as "signed by unknown authority".
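
For reference, a sketch of both ways to run the command from inside the pod. The server address and kubectl path simply mirror the examples above; the second option assumes the chosen address is listed in the apiserver certificate's SANs (typically true for node and service IPs on kubeadm clusters):

# Read the service account token mounted into the pod
TOKEN=$(cat /run/secrets/kubernetes.io/serviceaccount/token)

# Option 1: skip TLS verification (quick, but the server is no longer authenticated)
/home/kubectl auth can-i --list \
  --server https://192.168.85.198:6443 \
  --token "$TOKEN" \
  --insecure-skip-tls-verify

# Option 2: keep TLS verification by pointing kubectl at the CA bundle
# that is already mounted into the pod
/home/kubectl auth can-i --list \
  --server https://192.168.85.198:6443 \
  --token "$TOKEN" \
  --certificate-authority /run/secrets/kubernetes.io/serviceaccount/ca.crt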
