# Network analysis abuses of null-models

Gorka Zamora-López

Analysing and interpreting data can be a complicated procedure, a maze of interlinked steps and traps. There are no official procedures for how one should analyse a network. As in many scientific fields, the “standard” approach consists of a set of habits that have been popularised in the literature – repeated over and over again – without it always being clear why we analyse networks the way we do.

Imagine we wanted to study an anatomical brain connectivity network made of $$N = 214$$ cortical regions (nodes) interconnected by $$L = 4,593$$ white matter fibres (a density of $$\rho = 0.201$$). Following the typical workflow in the literature, we would start the analysis by measuring a few basic graph metrics such as the degree of each node $$k_i$$ and the degree distribution $$P(k)$$, the clustering coefficient $$C$$ and the average pathlength $$l$$ of the network. Imagine we obtain the empirical values $$C_{emp} = 0.497$$ for the clustering and $$l_{emp} = 1.918$$ for the average pathlength.
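As a minimal sketch of that first step, the snippet below computes these basic metrics with `networkx`, assuming the connectome is available as a binary, symmetric adjacency matrix saved in a hypothetical file `connectome.npy`; the file name and the use of `networkx` are illustrative choices, not part of the original analysis.

```python
import numpy as np
import networkx as nx
from collections import Counter

# Hypothetical input: binary, symmetric 214 x 214 adjacency matrix.
adj = np.load("connectome.npy")
G = nx.from_numpy_array(adj)

N = G.number_of_nodes()                 # number of cortical regions (nodes)
L = G.number_of_edges()                 # number of white-matter fibres (links)
rho = 2 * L / (N * (N - 1))             # density of the network

degrees = dict(G.degree())              # degree k_i of every node
Pk = Counter(degrees.values())          # empirical degree distribution P(k)

C_emp = nx.average_clustering(G)        # clustering coefficient C
l_emp = nx.average_shortest_path_length(G)  # average pathlength l

print(f"N = {N}, L = {L}, density = {rho:.3f}")
print(f"C_emp = {C_emp:.3f}, l_emp = {l_emp:.3f}")
```

For the numbers quoted above (N = 214, L = 4,593) the density line reproduces $$\rho \approx 0.201$$; the clustering and pathlength values depend, of course, on the particular connectome used.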