For $2$, it's easy to see that these variables are independent, because none of the elementary variables appearing in one of the expressions appears in any of the others.
For example
$\mathbb{P}(a_4 b_4 = 0 \mid a_1 b_1 = 0,\ b_2(a_0 + a_2 + 1) = 0) = \mathbb{P}(a_4 b_4 = 0)$.
A good way to see this, I think, is to look at the entropy of one of these variables:
$H(a_4 b_4 \mid a_1 b_1,\ b_2(a_0 + a_2 + 1)) \geq
H(a_4 b_4 \mid a_0, a_1, a_2, b_1, b_2) = H(a_4 b_4)$.
The inequality holds because $a_1 b_1$ and $b_2(a_0 + a_2 + 1)$ are functions of $a_0, a_1, a_2, b_1, b_2$, and the last equality comes from the independence of the elementary variables.
Since conditioning never increases entropy, we can deduce $H(a_4 b_4 \mid a_1 b_1,\ b_2(a_0 + a_2 + 1)) = H(a_4 b_4)$.
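If you want to double-check this numerically, here is a quick brute-force sketch in Python; it assumes (as the argument above does) that $a_0, \dots, a_4, b_0, \dots, b_4$ are i.i.d. uniform bits, enumerates all $2^{10}$ assignments, and compares the conditional probability with the unconditional one.

```python
from itertools import product

# Assumption: a_0..a_4, b_0..b_4 are i.i.d. uniform bits, so every one of the
# 2^10 assignments is equally likely and probabilities reduce to counting.
joint = cond = uncond = total = 0
for bits in product((0, 1), repeat=10):
    a, b = bits[:5], bits[5:]
    total += 1
    target = a[4] * b[4]                    # a_4 b_4
    c1 = a[1] * b[1]                        # a_1 b_1
    c2 = (b[2] * (a[0] + a[2] + 1)) % 2     # b_2 (a_0 + a_2 + 1) over F_2
    if target == 0:
        uncond += 1
    if c1 == 0 and c2 == 0:
        cond += 1
        if target == 0:
            joint += 1

# P(a_4 b_4 = 0 | a_1 b_1 = 0, b_2(a_0 + a_2 + 1) = 0)  vs  P(a_4 b_4 = 0)
print(joint / cond, uncond / total)   # both should print 0.75
```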
For $1$, it's more complicated because $b_4$, $b_2$, $a_2$ and $a_4$ each appear in more than one expression.
Then you have to show that $a_4, b_4, a_3, (b_0 + b_2), b_3, (a_0 + a_2), a_1, (b_2 + b_4), b_1, (a_2 + a_4)$ are linearly independent over $\mathbb{F}_2$ (by doing some linear algebra); from this you can deduce that these variables (possibly up to an additive constant) are independent from a probabilistic point of view, since the corresponding linear map is a bijection of $\mathbb{F}_2^{10}$ and the forms are therefore jointly uniform.
And then you use the argument: if $X, Y, Z, T$ are independent, then $XY$ and $ZT$ are independent.
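Here is a small Python sketch of both steps, assuming again that the elementary variables are i.i.d. uniform bits and ordering the coordinates as $(a_0, \dots, a_4, b_0, \dots, b_4)$ (my reading of the column order of the matrix below). It checks that the ten linear forms, viewed as a map $\mathbb{F}_2^{10} \to \mathbb{F}_2^{10}$, hit every output tuple exactly once, i.e. that they are jointly uniform and therefore independent; the independence of products built from disjoint groups of forms then follows from the $XY$/$ZT$ argument.

```python
from itertools import product
from collections import Counter

# Each form is given by the coordinate indices it sums over
# (coordinates ordered a_0..a_4, b_0..b_4).
FORMS = [
    [4],      # a_4
    [9],      # b_4
    [3],      # a_3
    [5, 7],   # b_0 + b_2
    [8],      # b_3
    [0, 2],   # a_0 + a_2
    [1],      # a_1
    [7, 9],   # b_2 + b_4
    [6],      # b_1
    [2, 4],   # a_2 + a_4
]

counts = Counter()
for bits in product((0, 1), repeat=10):
    values = tuple(sum(bits[i] for i in idx) % 2 for idx in FORMS)
    counts[values] += 1

# Jointly uniform iff every one of the 2^10 output tuples occurs exactly once.
print(len(counts) == 2**10 and set(counts.values()) == {1})   # True
```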
Edit: A good way to see the linear independence of these linear forms is to compute the determinant of the following matrix over $\mathbb{F}_2$ (the first row corresponds to $a_4$, the second to $b_4$, the fourth to $b_0 + b_2$, etc.). Alternatively, you can show that this family of vectors generates $\mathbb{F}_2^{10}$ and deduce, by counting, that it is a basis, hence that the vectors are linearly independent:
$\begin{pmatrix} 0 & 0 & 0 & 0 & 1 & 0 & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 1 \\
0 & 0 & 0 & 1 & 0 & 0 & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 & 1 & 0 & 1 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 1 & 0 \\
1 & 0 & 1 & 0 & 0 & 0 & 0 & 0 & 0 & 0 \\
0 & 1 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 & 0 & 0 & 1 & 0 & 1 \\
0 & 0 & 0 & 0 & 0 & 0 & 1 & 0 & 0 & 0 \\
0 & 0 & 1 & 0 & 1 & 0 & 0 & 0 & 0 & 0
\end{pmatrix}$
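For completeness, here is a sketch of the same check in Python via Gaussian elimination over $\mathbb{F}_2$ (again with my assumed column order $a_0, \dots, a_4, b_0, \dots, b_4$): a full rank of $10$ means the determinant is $1$ over $\mathbb{F}_2$, so the rows form a basis and the linear forms are linearly independent.

```python
# Rows in the same order as above: a_4, b_4, a_3, b_0+b_2, b_3, a_0+a_2, a_1, b_2+b_4, b_1, a_2+a_4.
M = [
    [0, 0, 0, 0, 1, 0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0, 0, 0, 0, 0, 1],
    [0, 0, 0, 1, 0, 0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0, 1, 0, 1, 0, 0],
    [0, 0, 0, 0, 0, 0, 0, 0, 1, 0],
    [1, 0, 1, 0, 0, 0, 0, 0, 0, 0],
    [0, 1, 0, 0, 0, 0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0, 0, 0, 1, 0, 1],
    [0, 0, 0, 0, 0, 0, 1, 0, 0, 0],
    [0, 0, 1, 0, 1, 0, 0, 0, 0, 0],
]

def rank_mod2(rows):
    """Rank of a 0/1 matrix over F_2 by Gaussian elimination."""
    rows = [r[:] for r in rows]
    rank = 0
    for col in range(len(rows[0])):
        pivot = next((i for i in range(rank, len(rows)) if rows[i][col]), None)
        if pivot is None:
            continue
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        for i in range(len(rows)):
            if i != rank and rows[i][col]:
                rows[i] = [(x + y) % 2 for x, y in zip(rows[i], rows[rank])]
        rank += 1
    return rank

print(rank_mod2(M))   # 10: full rank, so the determinant is 1 over F_2
```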