My intuition is that a particular sum is effectively equivalent to a random 64-bit integer (assuming a good hash function)
If all the values $a, b, c$ are distinct, then your intuition is correct; if the sum is done modulo $2^{64}$, then there is a probability of $2^{-64}$ that, at any one step (other than the last one), the running sum just happens to be zero (and thus falsely signals an early end).
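As an illustration, here is a minimal sketch (in Python) of that running-sum check; the names `closes_early` and `G`, and the representation of the entries as $(a, b)$ pairs, are assumptions made for the sketch rather than anything from your question:

```python
MASK = (1 << 64) - 1  # arithmetic modulo 2**64

def closes_early(G, pairs):
    """Fold each (G(a) - G(b)) term into a running 64-bit sum and report
    whether any intermediate sum (i.e. before the final term) is zero."""
    total = 0
    for step, (a, b) in enumerate(pairs):
        total = (total + G(a) - G(b)) & MASK
        if total == 0 and step < len(pairs) - 1:
            return True  # spurious "closure" signal; ~2^-64 per step for a random-oracle G
    return False
```

Each of those intermediate zero checks is exactly the event whose probability is argued to be $2^{-64}$ below.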
The caveat that $a, b$ be distinct is necessary; otherwise, consider this simple case: suppose $a = b$ and you include $G_i(a) - G_i(b)$ as your initial entry; if you then test for "closure", the algorithm will signal it immediately (and it doesn't matter how many $G_i$ functions you have).
Outline of how to show that the probability is $2^{-64}$ if all values are distinct: suppose we have the sum $(G_i(a) - G_i(b)) + (G_i(c) - G_i(d)) + \cdots + (G_i(y) - G_i(z))$. If we assume that we haven't actually achieved closure and that all the values $a, b, c, d, \ldots, y, z$ are distinct, then there must be some value $k$ for which $G_i(k)$ appears only once in the sum (and the argument still goes through if the same $G_i(\cdot)$ term happens to occur in two different subexpressions, as long as some term appears exactly once). If we assume $G_i$ acts like a random oracle, the expression may be reduced to $\pm G_i(k) + \text{otherstuff}$, where $\text{otherstuff}$ is independent of $G_i(k)$. Because $G_i(k)$ is effectively a uniformly random 64-bit value, the entire expression is as well, and that random value is $0$ with probability $2^{-64}$.
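Spelled out, that last step is just conditioning on the rest of the sum; writing $S$ for the whole sum and $R$ for $\text{otherstuff}$ (symbols introduced here, with $R$ determined only by $G_i$ values other than $G_i(k)$), we have

$$
S \equiv \pm G_i(k) + R \pmod{2^{64}},
$$
$$
\Pr[S \equiv 0] = \sum_{r} \Pr[R = r] \cdot \Pr\!\left[G_i(k) \equiv \mp r \pmod{2^{64}}\right] = \sum_{r} \Pr[R = r] \cdot 2^{-64} = 2^{-64}.
$$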
And, if $G_i, G_j$ (for $i \ne j$) are independent functions (for example, $G_i(a) = Hash( i || a )$ for a strong hash function $Hash$), then multiplying the per-function probabilities is appropriate.
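For concreteness, a family like that could be built as follows (a sketch, assuming SHA-256 as the strong hash and truncation to the first 8 bytes to get 64-bit outputs; neither detail comes from the question):

```python
import hashlib

def G(i: int, a: bytes) -> int:
    """G_i(a) = Hash(i || a), truncated to a 64-bit integer.
    The index i is encoded as 8 fixed-width bytes so that distinct
    (i, a) pairs can never produce the same hash input."""
    data = i.to_bytes(8, "big") + a
    digest = hashlib.sha256(data).digest()
    return int.from_bytes(digest[:8], "big")
```

Because the index $i$ is encoded with a fixed width, different indices never feed the same input into the underlying hash, so (modeling the hash as a random oracle) the $G_i$ behave as independent 64-bit functions, which is what justifies multiplying the per-index probabilities.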