CLT
Consider a fixed (signed) graph \(G = (V,E)\) with adjacency matrix \(A\). One way to interpret this is that we have an unsigned graph with adjacency \(B = |A|\), to which we apply a map \(X\) that signs the edges (so \(X_{ij} \in \left\{ -1, 1 \right\}\)). That is, \(A = B \circ X\). Instead of working with \(X\), we'll want to work with \(Y\), the indicator for a negative edge (\(Y = \frac{1}{2}(1 - X)\), so \(Y = 1\) exactly when \(X = -1\)). What \(X\) and \(Y\) do outside \(\operatorname{supp}(B)\) doesn't matter to us.
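As a concrete sanity check, here is a minimal sketch of this decomposition; the 4-node signed graph below (negative edges \((0,2)\) and \((1,3)\)) is made up for illustration.

```python
import numpy as np

# A hypothetical 4-node signed graph (entries of A in {-1, 0, +1});
# the specific signs are arbitrary choices for illustration.
A = np.array([
    [ 0,  1, -1,  0],
    [ 1,  0,  1, -1],
    [-1,  1,  0,  1],
    [ 0, -1,  1,  0],
])

B = np.abs(A)                        # unsigned adjacency; supp(B) is the edge set
X = np.where(B == 1, np.sign(A), 1)  # edge signs; values off supp(B) are irrelevant
Y = ((1 - X) // 2) * B               # 1 exactly on the negative edges

print(int(Y.sum()) // 2)             # each negative edge counted twice -> 2
```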
The statistic we care about is the number of unbalanced triads in the graph. Previously, we have shown that it is given by \[ \begin{align*} \mathcal{U} = \sum_{i \in E} \varepsilon_{i} Y_i - \sum_{(i,j,k) \in E^{3}} Y_i Y_j (1- Y_k) - \frac{1}{3}\sum_{(i,j,k) \in E^{3}} Y_i Y_j Y_k \end{align*} \] Because the summands are not symmetric in the arguments, we instead adopt the alternate form \[ \begin{align} \mathcal{U} = \sum_{i \in E} \varepsilon_{i} Y_i - 2\sum_{(i,j,k) \in \triangle} \underbrace{\left[ Y_i Y_j (1-Y_k) + Y_i (1-Y_j) Y_k + (1-Y_i) Y_j Y_k + Y_i Y_j Y_k\right]}_{W_{ijk}} \label{eqn:unbal} \end{align} \]
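To confirm that \(\eqref{eqn:unbal}\) really counts unbalanced triads (a triad is unbalanced exactly when it has an odd number of negative edges), here is a brute-force check on \(K_4\); the edge signs are arbitrary choices for illustration.

```python
import itertools

# Brute-force check of the closed form on K4; negative edges: (0,2), (1,3).
n = 4
edges = list(itertools.combinations(range(n), 2))
sign = {(0,1): 1, (0,2): -1, (0,3): 1, (1,2): 1, (1,3): -1, (2,3): 1}
Y = {e: (1 - s) // 2 for e, s in sign.items()}   # 1 iff the edge is negative

triads = list(itertools.combinations(range(n), 3))

def tri_edges(t):
    a, b, c = t
    return [(a, b), (a, c), (b, c)]

# Direct count: a triad is unbalanced iff it has an odd number of negatives
direct = sum(sum(Y[e] for e in tri_edges(t)) % 2 for t in triads)

# Closed form: sum_i eps_i Y_i - 2 sum_triads W_ijk
eps = {e: sum(e in tri_edges(t) for t in triads) for e in edges}
U = sum(eps[e] * Y[e] for e in edges)
for t in triads:
    yi, yj, yk = (Y[e] for e in tri_edges(t))
    W = yi*yj*(1-yk) + yi*(1-yj)*yk + (1-yi)*yj*yk + yi*yj*yk
    U -= 2 * W

assert U == direct   # both count the unbalanced triads of this graph
print(U)             # -> 4
```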
Moments
Now, suppose \(Y_i \sim \operatorname{Bern}(p)\). Then, \[ \begin{align*} \mathbb{E}(\mathcal{U}) &= p\sum_{i \in E} \varepsilon_{i} - 2 |\triangle| \left( 3p^2(1-p) + p^{3} \right) \\&= 3p|\triangle| - 2 |\triangle| \left( 3p^2(1-p) + p^{3} \right) \\&= |\triangle|\left(3p - 2q \right), \end{align*} \] where \(q = \mathbb{E}(W_{ijk}) = 3p^2(1-p) + p^{3} = 3p^2 - 2p^{3}\) and I’ve used the identity \(\sum_{i \in E} \varepsilon_i = 3 |\triangle|\).
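The mean can be verified exactly on a small graph by enumerating every sign pattern. Below is such a check on \(K_4\) (so \(|\triangle| = 4\)); the value \(p = 0.3\) is an arbitrary choice.

```python
import itertools

# Exact check of E[U] on K4: enumerate all 2^6 sign patterns, weighting
# each by its Bernoulli(p) probability. p = 0.3 is an arbitrary choice.
p = 0.3
edges = list(itertools.combinations(range(4), 2))
triads = list(itertools.combinations(range(4), 3))

EU = 0.0
for ys in itertools.product([0, 1], repeat=len(edges)):
    Y = dict(zip(edges, ys))
    prob = 1.0
    for y in ys:
        prob *= p if y else 1 - p
    # U = number of triads with an odd number of negative edges
    U = sum((Y[(a, b)] + Y[(a, c)] + Y[(b, c)]) % 2 for a, b, c in triads)
    EU += prob * U

q = 3*p**2 - 2*p**3
predicted = len(triads) * (3*p - 2*q)   # |triangle| * (3p - 2q)
assert abs(EU - predicted) < 1e-12      # both equal 1.872 at p = 0.3
```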
Variance
The variance calculations, unfortunately, are a little more involved. Let's look at it step by step. From \(\eqref{eqn:unbal}\), and since \(\mathbb{V}(S - 2T) = \mathbb{V}(S) + 4\mathbb{V}(T) - 4\,\mathbb{C}(S,T)\), we have \[ \begin{align} \mathbb{V}(\mathcal{U}) = \mathbb{V}\left[ \sum_{i \in E} \varepsilon_{i} Y_i \right] + 4 \mathbb{V}\left[ \sum_{(i,j,k) \in \triangle} W_{ijk} \right] - 4\, \mathbb{C}\left[ \sum_{i \in E} \varepsilon_{i} Y_i , \sum_{(i,j,k) \in \triangle} W_{ijk} \right] \label{eqn:var} \end{align} \] The first term in \(\eqref{eqn:var}\) is easy, since the \(Y_i\) are independent: \[ \begin{align*} \mathbb{V}\left[ \sum_{i \in E} \varepsilon_{i} Y_i \right] = p(1-p) \sum_{i \in E} \varepsilon_{i}^2 \end{align*} \] The second term is trickier, \[ \begin{align} \mathbb{V}\left[ \sum_{(i,j,k) \in \triangle} W_{ijk} \right] &= \sum_{(i,j,k) \in \triangle} \mathbb{V}(W_{ijk}) + \sum_{(i,j,k) \neq (i',j',k')} \mathbb{C}(W_{ijk}, W_{i'j'k'}), \label{eqn:var_int} \end{align} \] where the covariance sum runs over ordered pairs of distinct triads, so each unordered pair is counted twice. Since the four terms in \(W_{ijk}\) are indicators of disjoint events, \(W_{ijk}\) is itself a \(\operatorname{Bern}(q)\) variable, and hence \[ \begin{align*} \mathbb{V}(W_{ijk}) = q(1-q), \end{align*} \] where \(q\) was defined previously, as the value of \(\mathbb{E}(W_{ijk})\). That leaves the covariance term in \(\eqref{eqn:var_int}\). Notice the only pairs that matter are triads that share a single edge: triads sharing two edges are the same triad, since we're only dealing with simple graphs, and triads sharing no edge involve disjoint, hence independent, sets of \(Y\)'s.
Thus, for an ordered pair of triads sharing the edge \(i\), we need \[ \begin{align*} \mathbb{C}\left( W_{ijk}, W_{ilm} \right) &= \mathbb{E}(W_{ijk}W_{ilm}) - \mathbb{E}^2(W) \\&= \mathbb{E}(W_{ijk}W_{ilm}) - q^2 \end{align*} \] Let us rewrite \(W_{ijk}\) in terms of \(Y_i\): \[ \begin{align} W_{ijk} = Y_i\left( Y_j + Y_k - 2 Y_jY_k \right) + Y_j Y_k \label{eqn:w_i} \end{align} \] Thus, \[ \begin{align*} \mathbb{E}(W_{ijk}W_{ilm}) &= \mathbb{E}(Y_i^2) \mathbb{E}^2(Y_j + Y_k - 2 Y_jY_k) + \mathbb{E}^2(Y_j Y_k) \\&+ 2 \mathbb{E}(Y_i) \mathbb{E}(Y_j + Y_k - 2 Y_jY_k) \mathbb{E}(Y_jY_k) \\&= p(2p - 2p^2)^2 + p^{4} + 2p(2p - 2p^2)p^2 \\&= 4p^{3} - 3p^{4} \end{align*} \] Each edge \(i\) lies in \(\varepsilon_i\) triads, so it is shared by \(\binom{\varepsilon_i}{2}\) unordered pairs of triads, each of which appears twice in the ordered sum. So let's aggregate all that we've learnt. The second term of \(\eqref{eqn:var_int}\) gives \[ \begin{align*} \mathbb{V}\left[ \sum_{(i,j,k) \in \triangle} W_{ijk} \right] = q(1-q) |\triangle| + 2(4p^{3} - 3p^{4} - q^2) \sum_{i \in E} \binom{\varepsilon_{i}}{2} \end{align*} \] Almost there. The last term to consider is the covariance in \(\eqref{eqn:var}\). It only matters when the single edge on the left is an edge of the triad on the right. Without loss of generality, assume the match is with argument \(i\). Then, \[ \begin{align*} \mathbb{C}\left( \varepsilon_{i} Y_i , W_{ijk}\right) &= \mathbb{E} \left( \varepsilon_{i} Y_i W_{ijk} \right) - \varepsilon_{i} p q \\&= \varepsilon_i (p(2p - 2p^2) + p^{3}) - \varepsilon_{i} pq \\&= \varepsilon_i \left( 2p^2 - p^{3} - pq \right) \end{align*} \] where I have used the identity given in \(\eqref{eqn:w_i}\). For a fixed edge \(i\) there are \(\varepsilon_i\) such triads, so \[ \begin{align*} \mathbb{C}\left[ \sum_{i \in E} \varepsilon_{i} Y_i , \sum_{(i,j,k) \in \triangle} W_{ijk} \right] = \left( 2p^2 - p^{3} - pq \right) \sum_{i \in E} \varepsilon_{i}^2. \end{align*} \]
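The key intermediate fact, \(\mathbb{E}(W_{ijk}W_{ilm}) = 4p^3 - 3p^4\), can be checked exactly by enumerating the five independent edge indicators of two triads sharing the single edge \(i\); \(p = 0.3\) below is an arbitrary choice.

```python
import itertools

# Exact check that E[W_ijk * W_ilm] = 4p^3 - 3p^4 for two triads sharing
# edge i, by enumeration over the 5 independent Bernoulli(p) indicators.
p = 0.3

def W(yi, yj, yk):
    return yi*yj*(1-yk) + yi*(1-yj)*yk + (1-yi)*yj*yk + yi*yj*yk

E = 0.0
for ys in itertools.product([0, 1], repeat=5):
    yi, yj, yk, yl, ym = ys
    prob = 1.0
    for y in ys:
        prob *= p if y else 1 - p
    E += prob * W(yi, yj, yk) * W(yi, yl, ym)

assert abs(E - (4*p**3 - 3*p**4)) < 1e-12   # 0.0837 at p = 0.3
```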
Thus, collecting the pieces with the coefficients from \(\eqref{eqn:var}\), we can finally report the variance of our statistic: \[ \begin{align*} \mathbb{V}(\mathcal{U}) = p(1-p) \sum_{i \in E} \varepsilon^2_{i} + 4q(1-q) |\triangle| + 8(4p^{3} - 3p^{4} - q^2) \sum_{i \in E} \binom{\varepsilon_{i}}{2} - 4\left( 2p^2 - p^{3} - pq \right) \sum_{i \in E} \varepsilon^2_{i}, \end{align*} \] where \(q = 3p^2 - 2p^{3}\).
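As an end-to-end check, the variance formula \(p(1-p)\sum\varepsilon_i^2 + 4q(1-q)|\triangle| + 8(4p^3 - 3p^4 - q^2)\sum\binom{\varepsilon_i}{2} - 4(2p^2 - p^3 - pq)\sum\varepsilon_i^2\) can be compared against the exact variance on \(K_4\), computed by full enumeration; \(p = 0.3\) is again an arbitrary choice.

```python
import itertools
from math import comb

# Exact check of the variance formula on K4: every edge of K4 lies in
# eps_i = 2 triads, so sum eps^2 = 24 and sum C(eps_i, 2) = 6.
p = 0.3
edges = list(itertools.combinations(range(4), 2))
triads = list(itertools.combinations(range(4), 3))

EU = EU2 = 0.0
for ys in itertools.product([0, 1], repeat=len(edges)):
    Y = dict(zip(edges, ys))
    prob = 1.0
    for y in ys:
        prob *= p if y else 1 - p
    U = sum((Y[(a, b)] + Y[(a, c)] + Y[(b, c)]) % 2 for a, b, c in triads)
    EU += prob * U
    EU2 += prob * U * U
var_true = EU2 - EU**2

q = 3*p**2 - 2*p**3
eps = [2] * len(edges)
var_formula = (p*(1 - p) * sum(e*e for e in eps)
               + 4*q*(1 - q) * len(triads)
               + 8*(4*p**3 - 3*p**4 - q**2) * sum(comb(e, 2) for e in eps)
               - 4*(2*p**2 - p**3 - p*q) * sum(e*e for e in eps))
assert abs(var_true - var_formula) < 1e-12   # both equal 1.060416
```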
Summary
In summary, we have the following results: for \[ \begin{align*} \mathcal{U} = \sum_{i \in E} \varepsilon_{i} Y_i - 2\sum_{(i,j,k) \in \triangle} \underbrace{\left[ Y_i Y_j (1-Y_k) + Y_i (1-Y_j) Y_k + (1-Y_i) Y_j Y_k + Y_i Y_j Y_k\right]}_{W_{ijk}}, \end{align*} \] we know that its mean and variance are given by \[ \begin{align*} \mathbb{E}(\mathcal{U}) &= |\triangle|\left(3p - 2q \right) \\ &= \left( p - 2p^2 + \frac{4}{3} p^{3} \right) \sum_{i \in E} \varepsilon_{i} \\ \mathbb{V}(\mathcal{U})&= p(1-p) \sum_{i \in E} \varepsilon^2_{i} + 4q(1-q) |\triangle| + 8(4p^{3} - 3p^{4} - q^2) \sum_{i \in E} \binom{\varepsilon_{i}}{2} - 4\left( 2p^2 - p^{3} - pq \right) \sum_{i \in E} \varepsilon^2_{i} \\ &= \left( p - p^2 - 8p^2 + 16p^{3} + 4p^{3} + 12p^{3} + o(p^{3}) \right)\sum_{i \in E} \varepsilon_{i}^2 + \left( 4p^2 - \frac{8}{3} p^{3} - 16p^{3} + o(p^{3}) \right) \sum_{i \in E} \varepsilon_{i} \\ &= \left( p - 9p^2 + 32 p^{3} + o(p^{3}) \right)\sum_{i \in E} \varepsilon_{i}^2 + \left( 4p^2 - \frac{56}{3} p^{3} + o(p^{3}) \right) \sum_{i \in E} \varepsilon_{i}, \end{align*} \] where we have used \(|\triangle| = \frac{1}{3}\sum_{i \in E} \varepsilon_i\) and \(\binom{\varepsilon_i}{2} = \frac{1}{2}\left(\varepsilon_i^2 - \varepsilon_i\right)\).