Score:0

VMware VMs using private network can't communicate with each other when on different ESXi hosts


I have VMs spread across multiple ESXi hosts (managed by vCenter/vSphere). They have private addresses, like 192.168.1.x. When the VMs are on the same host they can communicate with each other, but when they are on different hosts they can't. The interfaces are on identically labeled vSwitches on each host. (Note: the VMs do have multiple NICs: a private 192.x.x.x network used only for VM-to-VM traffic, and routable 10.x.x.x addresses going out to the rest of our network.)
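For reference, one way to confirm that the hosts agree on the vSwitch configuration is to compare the output of these commands from each host's ESXi shell (the commands are standard `esxcli` calls; any port group or vmnic names in your output are specific to your environment):

```shell
# Run on EACH ESXi host and compare the output side by side.

# List standard vSwitches, their uplinks, and attached port groups.
esxcli network vswitch standard list

# List port groups with their VLAN IDs; the private port group must
# use the same VLAN ID on every host, and that VLAN must be allowed
# end-to-end on the physical switching between the hosts.
esxcli network vswitch standard portgroup list

# Confirm the uplink NICs are physically up and linked.
esxcli network nic list
```

If the port group names, VLAN IDs, and uplink state all match and VM-to-VM traffic still only works within a single host, the problem is usually in the physical path between the hosts rather than in the vSphere configuration.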

Is that supported? I inherited this setup from another admin, and I'm not sure whether he had pinned the VMs to the same host to make it work; it's been a while since I worked directly with VMware. One host went down and DRS moved the VMs around, so I don't know whether they were all on the same host before. I assume they were, since things were working. Before I move all these VMs onto one host, I'd like to know whether cross-host communication should work at all. Also, with DRS enabled, I assume pinning them to one host would just break again.

I have moved them back and forth between hosts, but they only communicate when they are on the same host.

Also, these hosts are all in the same Dell chassis.

Score:0

Finally got this working. I ended up having to restart the Dell I/O Aggregator card in the Dell chassis. Looking at the card, everything appeared fine, but for some reason it was not passing the private traffic between hosts.
