
kube-apiserver high CPU on one control plane node

PeterWegner90

I want to know how I can find out why the kube-apiserver on one of my control plane nodes consumes much more CPU than on the others.

I have a cluster with 3 control plane nodes and 4 worker nodes.

I have an nginx load balancer using the least_conn algorithm to distribute requests to the control plane nodes.
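
Roughly what my load balancer looks like (the IP addresses here are just placeholders for my control plane nodes):

    stream {
        upstream kube_apiserver {
            least_conn;
            server 10.0.0.11:6443;   # control plane node 1
            server 10.0.0.12:6443;   # control plane node 2
            server 10.0.0.13:6443;   # control plane node 3
        }
        server {
            listen 6443;
            proxy_pass kube_apiserver;
        }
    }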

Monitoring resources with the top command, I see that on the first control plane node the kube-apiserver process is consistently above 100% CPU, while on the other two control plane nodes kube-apiserver stays below 20%.
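
This is roughly what I run on each control plane node to check (the exact ps flags are just what I happen to use):

    # CPU usage of the kube-apiserver process on this node
    ps -o pid,%cpu,etime,args -p "$(pgrep -f kube-apiserver | head -n 1)"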

I want to know why this happens.

And how can I see the same breakdown of consumption (per pod, container, and node) in Grafana?

Heinrich supports Monica

After finding out what is happening in your cluster with kubectl top node and kubectl top pod, you can dig further with kubectl logs $pod -c $container on the affected pod.
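
For reference, a minimal set of commands along those lines (this assumes metrics-server is installed so that kubectl top works; $pod, $namespace, and $container are placeholders):

    # nodes and pods ranked by CPU usage
    kubectl top node --sort-by=cpu
    kubectl top pod --all-namespaces --sort-by=cpu

    # then look at the logs of a suspicious pod
    kubectl logs $pod -n $namespace -c $container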

At this point, it is up to the container to provide information on what it is doing, so ideally you would collect metrics from the pods to get a quick insight into what is happening on your cluster, using e.g. Grafana. You can also have a look at the resources assigned to your pod using kubectl get pod $pod -o jsonpath='{.spec.containers[].resources}'.
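
If you do not have a metrics pipeline yet, one common way to get that kind of per-node/per-pod view in Grafana is the kube-prometheus-stack Helm chart; this is only a sketch, and the release name "monitoring" is an arbitrary example:

    helm repo add prometheus-community https://prometheus-community.github.io/helm-charts
    helm repo update
    # installs Prometheus, Grafana, and a set of Kubernetes dashboards
    helm install monitoring prometheus-community/kube-prometheus-stack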

In your case, the log messages of the kube-apiserver should give you a hint. Probably something (another container or pod, perhaps) is hammering your API server with requests.
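
To see which verbs and resources are generating the load, one option (a sketch, assuming you are allowed to read the API server's /metrics endpoint) is to look at its request counters:

    # request counts by verb/resource/response code, highest first
    kubectl get --raw /metrics | grep '^apiserver_request_total{' | sort -k2 -nr | head -n 20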
