
I am reading "Using Gossips to Spread Information: Theory and Evidence from Two Randomized Controlled Trials" by Banerjee, Chandrasekhar, Duflo and Jackson, where they discuss the efficacy of gossips in spreading information about policies and business offers.

In this paper, on page 21, they introduce a generalised version of the network parameter "Diffusion Centrality", which the same authors defined earlier in a 2013 paper.

If $w$ is a directed and weighted adjacency matrix for a social network, then they define a 'hearing matrix' for $w$ and a time horizon $T \in \mathbb{N}$ as:

$$H(w, T) = \sum_{t=1}^T (w)^t$$

The $ij$-th entry of $H$, $H(w, T)_{ij}$ , is the expected number of times, within $T$ periods, that $j$ hears about a piece of information originating from $i$.

Then Diffusion Centrality is defined as:

$$DC(w, T) = H(w, T) \cdot \mathbf{1},$$ where $\mathbf{1}$ is the column vector of ones.

The paper says, "$DC(w, T)_i$ is the expected total number of times that some piece of information that originates from $i$ is heard by any of the members of the society during a $T$-period time interval".

I am not very familiar with network economics. I would like to understand why $H$ and $DC$ match their verbal descriptions, that is, why the $ij$-th entry of $H$ is the expected number of times $j$ hears something from $i$, and similarly for $DC$. A proof written directly in the answer, or a reference to good material to study this, would help.

1 Answer


Let me use the capital letter $W$ instead of the lower-case $w$ (as $W$ is actually a matrix).

So, let $W$ be the matrix whose entry in row $i$ and column $j$ is the probability that $j$ hears from $i$ in a single period. Then the expected number of pieces of information that $j$ gets from $i$ in one period is given by $W(i,j)$.

The expected amount of information that $j$ gets from $i$ that takes 2 periods to travel is the information that goes first from $i$ to some intermediate person $k$ and then from $k$ to $j$, i.e. along a path $i \to k \to j$. The amount of information transmitted along such a path is $W(i,k) \cdot W(k,j)$. To see this, notice that $W(i,k)$ is the amount of info sent to $k$, and of this a fraction $W(k,j)$ is passed on to $j$. Summing over all intermediate nodes $k$ (i.e. over all paths of length 2) gives: $$ \sum_{k} W(i,k) \cdot W(k,j) = (W \times W)_{i,j} = (W^2)_{i,j} $$ Here $\times$ denotes matrix multiplication.

Let's now look at the information that takes 3 periods. This is the information that goes from $i$ to some intermediate $k$, then from $k$ to some intermediate $\ell$, and finally to $j$, i.e. along a path $i \to k \to \ell \to j$. The amount of information along this path is $W(i,k) \cdot W(k,\ell) \cdot W(\ell, j)$. Summing over all such paths of length 3 gives: $$ \sum_k \sum_\ell W(i,k) \cdot W(k,\ell) \cdot W(\ell, j) = (W \times W \times W)_{i,j} = (W^3)_{i,j}. $$ Continuing in this way, we can generalize to paths of any length $t$. The information sent from $i$ to $j$ through all paths of length $t$ is given by: $$ (\underbrace{W \times W \times \cdots \times W}_{t \text{ times}})_{i,j} = (W^t)_{i,j}. $$
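To see this identity concretely, here is a minimal numerical check (a sketch assuming Python with numpy; the matrix is the same $3 \times 3$ example used further below): the explicit sum over intermediate nodes $k, \ell$ equals the corresponding entry of $W^3$.

```python
import numpy as np

# Same 3x3 matrix as in the example below
W = np.array([[0.0, 0.3, 0.2],
              [0.2, 0.0, 0.6],
              [0.0, 0.4, 0.0]])

n = W.shape[0]
i, j = 0, 2  # nodes 1 and 3 (0-based indexing)

# Explicit sum over all length-3 paths i -> k -> l -> j
path_sum = sum(W[i, k] * W[k, l] * W[l, j]
               for k in range(n) for l in range(n))

# The same quantity read off from the matrix power
power_entry = np.linalg.matrix_power(W, 3)[i, j]

print(path_sum, power_entry)  # both are (approximately) 0.06
```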

Now, information that goes from $i$ to $j$ takes either 1 period, 2 periods, 3 periods, ..., or $T$ periods to travel. So the total expected number of pieces of info that $j$ receives from $i$ within $T$ periods is the sum over all these travel times: $$ H(W, T)_{i,j} = \sum_{t = 1}^T (W^t)_{i,j}. $$ The total amount of info originating from $i$ and heard by someone is then obtained by summing over all recipients $j$: $$ DC(W,T)_i = \sum_{j} H(W,T)_{i,j} = (H(W,T)\times \mathbf{1})_i. $$ This amounts to adding up the entries of row $i$ of the matrix $H(W,T)$.
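In code, this construction is just a loop over matrix powers. Here is a minimal sketch (assuming Python with numpy; the function names `hearing_matrix` and `diffusion_centrality` are mine, not from the paper):

```python
import numpy as np

def hearing_matrix(W, T):
    """H(W, T) = W + W^2 + ... + W^T, built by repeated multiplication."""
    n = W.shape[0]
    H = np.zeros((n, n))
    Wt = np.eye(n)
    for _ in range(T):
        Wt = Wt @ W   # after t iterations, Wt equals W^t
        H += Wt
    return H

def diffusion_centrality(W, T):
    """DC(W, T) = H(W, T) . 1, i.e. the row sums of the hearing matrix."""
    return hearing_matrix(W, T) @ np.ones(W.shape[0])
```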

An example

Let's take the example of three individuals $1,2$ and $3$. Assume that $W$ is given by: $$ W = \begin{bmatrix} 0 & 0.3 & 0.2\\ 0.2 & 0 & 0.6\\ 0 & 0.4 & 0\end{bmatrix} $$ This implies that (for example) the probability that 2 hears from 1 in one period equals 0.3.

Then $$ W^2 = \begin{bmatrix} 0.06 & 0.08 & 0.18\\ 0 & 0.3 & 0.04\\ 0.08 & 0 & 0.24\end{bmatrix} $$ Here, for example, the amount received by 2 from 1 in two periods is $0.08 = 0.2 \times 0.4$ (via the path $1 \to 3 \to 2$).

Next, multiplying $W^2$ once more by $W$ gives: $$ W^3 = \begin{bmatrix} 0.016 & 0.09 & 0.06\\ 0.06 & 0.016 & 0.18\\ 0 & 0.12 & 0.016\end{bmatrix} $$ Then: $$ H(W, 3) = W + W^2 + W^3 = \begin{bmatrix} 0.076 & 0.47 & 0.44\\ 0.26 & 0.316 & 0.82\\ 0.08 & 0.52 & 0.256\end{bmatrix} $$ So in three periods, $2$ receives on average $0.47$ pieces of info from $1$.

The total information sent from $1$ to someone is computed by taking the sum of the elements in row $1$, which gives: $$ 0.076 + 0.47 + 0.44 = 0.986 $$ In general: $$ DC(W, 3) = \begin{bmatrix} 0.986\\1.396\\0.856\end{bmatrix} $$
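For completeness, the whole example can be reproduced numerically (again a minimal sketch, assuming Python with numpy):

```python
import numpy as np

W = np.array([[0.0, 0.3, 0.2],
              [0.2, 0.0, 0.6],
              [0.0, 0.4, 0.0]])

# H(W, 3) = W + W^2 + W^3
H = W + np.linalg.matrix_power(W, 2) + np.linalg.matrix_power(W, 3)

# DC(W, 3): row sums of the hearing matrix
DC = H @ np.ones(3)

print(np.round(H, 3))   # [[0.076 0.47  0.44 ] [0.26  0.316 0.82 ] [0.08  0.52  0.256]]
print(np.round(DC, 3))  # [0.986 1.396 0.856]
```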

tdm