Kyber and other systems based on learning with errors (LWE) are built around a public matrix $A$ and a public vector $\mathbf b$, together with private vectors $\mathbf s$ and $\mathbf e$ whose entries are all small. These values are related by the equation
$$\mathbf b=A\mathbf s+\mathbf e\mod q.$$
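As a concrete illustration, here is a minimal Python sketch that builds a toy LWE instance. The parameters $n=4$, $m=8$, $q=97$ are assumptions chosen purely so the numbers are easy to inspect; they are far too small for any security (Kyber, for comparison, uses $q=3329$ and works over a module of polynomial rings rather than plain integers).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy parameters only -- far too small for real security.
q, n, m = 97, 4, 8
A = rng.integers(0, q, size=(m, n))   # public matrix, uniform mod q
s = rng.integers(-1, 2, size=n)       # private secret, small entries in {-1, 0, 1}
e = rng.integers(-1, 2, size=m)       # private error, small entries in {-1, 0, 1}
b = (A @ s + e) % q                   # public vector
```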
Their security rests on the difficulty of recovering $\mathbf s$ from $A$ and $\mathbf b$. This question can be reformulated as a closest vector problem as follows. Consider the lattice generated by the rows of the matrix
$$\begin{pmatrix}A^T&|&I\\ qI&|&0\end{pmatrix}$$
then this lattice contains a vector close to the target vector $(\mathbf b^T|0)$. Specifically, taking the combination whose coefficients on the top rows are the entries of $\mathbf s$, together with suitable multiples of the rows $(qI|0)$ to reduce modulo $q$, yields the lattice vector $(\mathbf b^T-\mathbf e^T|\mathbf s^T)$, which differs from the target by $(-\mathbf e|\mathbf s)$; both $\mathbf e$ and $\mathbf s$ have small entries, so this difference is short.
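Continuing the toy example above, this sketch builds that basis and checks the claim directly: with $\mathbf k$ defined by $q\mathbf k=\mathbf b-A\mathbf s-\mathbf e$ (an exact multiple of $q$ by the LWE equation), the integer combination with coefficients $(\mathbf s^T,\mathbf k^T)$ lands at distance exactly $(-\mathbf e|\mathbf s)$ from the target.

```python
# Basis rows: (A^T | I) on top, (qI | 0) below.
B = np.block([
    [A.T, np.eye(n, dtype=int)],
    [q * np.eye(m, dtype=int), np.zeros((m, n), dtype=int)],
])
t = np.concatenate([b, np.zeros(n, dtype=int)])   # target vector (b^T | 0)

# b = A s + e (mod q), so b - A s - e is an exact multiple of q.
k = (b - A @ s - e) // q
v = np.concatenate([s, k]) @ B                    # a vector in the lattice

assert np.array_equal(v - t, np.concatenate([-e, s]))  # difference is (-e | s)
```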
Solving short/closest vector problems is often facilitated by lattice basis reduction, which changes the basis of a lattice to a "good" representation in which the basis vectors are close to orthogonal. Here "good" might mean* that the pairwise dot products of the basis vectors are smaller than $dq^{2/d}$, where $d$ is the dimension of the lattice. If we look at the basis given by the rows of the matrix above, we see that there are pairs of basis vectors whose dot product can be of size $dq^2$, which is very "bad".
Thus the "bad" basis is the one that can be directly written down form the public values and the "good" basis might be one where the dot products are significantly smaller.
(*) - This is not the best technical description of a good basis and is only intended to be illustrative.
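To see reduction in action on the toy instance, one option is the LLL implementation in fpylll (an assumption: that the library is installed, e.g. via pip install fpylll). For a lattice this small, reduction shrinks the pairwise dot products of the basis dramatically, turning the "bad" basis above into a much "better" one.

```python
from fpylll import IntegerMatrix, LLL

M = IntegerMatrix.from_matrix(B.tolist())   # the "bad" basis from above
LLL.reduction(M)                            # LLL-reduce in place

R = np.array([[M[i, j] for j in range(M.ncols)] for i in range(M.nrows)])
print("max |dot product| before:", np.abs(B @ B.T).max())  # on the order of d*q^2
print("max |dot product| after: ", np.abs(R @ R.T).max())  # far smaller
```

In an actual attack one would then use the reduced basis to solve the closest vector problem for the target $(\mathbf b^T|0)$, for example with Babai's nearest-plane algorithm, and read off $\mathbf s$ from the difference vector.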