Today I read a paper titled “Walking on a Graph with a Magnifying Glass: Stratified Sampling via Weighted Random Walks”.
The abstract is:
Our objective is to sample the node set of a large unknown graph via crawling, to accurately estimate a given metric of interest.
We design a random walk on an appropriately defined weighted graph that achieves high efficiency by preferentially crawling those nodes and edges that convey greater information regarding the target metric.
Our approach begins by employing the theory of stratification to find the optimal node weights for a given estimation problem under an independence sampler.
While optimal under independence sampling, these weights may be impractical under graph crawling due to constraints arising from the structure of the graph.
Therefore, the edge weights for our random walk should be chosen so as to lead to an equilibrium distribution that strikes a balance between approximating the optimal weights under an independence sampler and achieving fast convergence.
We propose a heuristic approach (stratified weighted random walk, or S-WRW) that achieves this goal, while using only limited information about the graph structure and the node properties.
We evaluate our technique both in simulation and experimentally, by collecting a sample of Facebook college users.
We show that S-WRW requires 13–15 times fewer samples than the simple re-weighted random walk (RW) to achieve the same estimation accuracy for a range of metrics.
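A few notes to make the abstract concrete. The stratification step builds on a classic result: under independent sampling, Neyman allocation minimizes the variance of an estimated mean by allocating samples to each stratum (here, a node category) in proportion to the stratum's size times its within-stratum standard deviation. This is the textbook result, not necessarily the paper's exact formulation; the function and variable names below are my own sketch, assuming stratum sizes and standard deviations are known:

```python
def neyman_allocation(sizes, stds, total_samples):
    """Neyman allocation: samples per stratum proportional to N_c * sigma_c.

    sizes -- list of stratum sizes N_c
    stds  -- list of within-stratum standard deviations sigma_c
    """
    weights = [n * s for n, s in zip(sizes, stds)]
    total = sum(weights)
    return [total_samples * w / total for w in weights]

# Example: three node categories; the small, high-variance stratum
# gets disproportionately many samples (the "magnifying glass" effect).
sizes = [9000, 900, 100]
stds = [0.1, 0.5, 2.0]
print(neyman_allocation(sizes, stds, total_samples=1000))
# -> approximately [580.6, 290.3, 129.0]
```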
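The link between edge weights and the equilibrium distribution rests on a standard Markov-chain fact: a walk that moves to each neighbor with probability proportional to the edge weight has stationary distribution π(v) = w(v)/2W, where w(v) is the total weight of edges incident to v and W is the total edge weight. So choosing edge weights is an indirect way of choosing how often each node is visited, which is what lets the walk approximate the stratification weights. A quick simulation to illustrate (toy graph of my own, not from the paper):

```python
import random
from collections import Counter

# Toy undirected weighted graph: adjacency dict of {neighbor: weight}.
graph = {
    "a": {"b": 1.0, "c": 4.0},
    "b": {"a": 1.0, "c": 1.0},
    "c": {"a": 4.0, "b": 1.0, "d": 2.0},
    "d": {"c": 2.0},
}

def weighted_walk(graph, start, steps, rng=random):
    """Walk `steps` steps, picking each neighbor w.p. proportional to edge weight."""
    node, visits = start, Counter()
    for _ in range(steps):
        nbrs = list(graph[node])
        wts = [graph[node][u] for u in nbrs]
        node = rng.choices(nbrs, weights=wts)[0]
        visits[node] += 1
    return visits

steps = 200_000
visits = weighted_walk(graph, "a", steps)
total_w = sum(sum(nbrs.values()) for nbrs in graph.values())  # equals 2W
for v in graph:
    w_v = sum(graph[v].values())
    print(v, visits[v] / steps, "vs", w_v / total_w)
# Empirical visit frequencies converge to w(v) / 2W.
```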
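Finally, the baseline in the comparison, the re-weighted random walk (RW), corrects the bias of a simple random walk, which visits nodes in proportion to their degree, by weighting each sample with 1/degree (a Hansen–Hurwitz style estimator). A minimal sketch of that correction for estimating the mean of a node attribute, again illustrative code rather than the authors' implementation:

```python
def reweighted_mean(samples):
    """Estimate a population mean of a node attribute from a simple
    random walk, which samples node v with probability ~ deg(v).

    samples -- list of (attribute_value, degree) pairs along the walk
    """
    num = sum(x / d for x, d in samples)
    den = sum(1 / d for _, d in samples)
    return num / den

# Example: high-degree nodes are over-represented along the walk,
# so their values are down-weighted by 1/degree.
samples = [(1.0, 10), (1.0, 10), (0.0, 1), (1.0, 10)]
print(reweighted_mean(samples))  # 0.3 / 1.3 ~= 0.231, vs naive mean 0.75
```

The paper's headline number (13–15x fewer samples) is relative to this RW baseline, with the gain coming from steering the walk toward the informative strata rather than correcting for degree bias alone.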