Today I read a paper titled “Mutual Search”.
The abstract is:
We introduce a search problem called “mutual search” where $k$ agents, arbitrarily distributed over $n$ sites, are required to locate one another by posing queries of the form “Anybody at site $i$?”.
We ask for the least number of queries that is necessary and sufficient.
For the case of two agents using deterministic protocols we obtain the following worst-case results: in an oblivious setting (where all pre-planned queries are executed) there are no savings: $n-1$ queries are necessary and sufficient.
In a nonoblivious setting we can exploit the paradigm of “no news is also news” to obtain significant savings: in the synchronous case $0.586n$ queries suffice and $0.536n$ queries are required; in the asynchronous case $0.896n$ queries suffice and a fortiori $0.536n$ queries are required; for $o(\sqrt{n})$ agents using a deterministic protocol fewer than $n$ queries suffice; there is a simple randomized protocol for two agents with worst-case expected $0.5n$ queries, and all randomized protocols require at least $0.125n$ worst-case expected queries.
The graph-theoretic framework we formulate for expressing and analyzing algorithms for this problem may be of independent interest.
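To make the query model concrete, here is a minimal simulation of one simple symmetric protocol for two agents: each agent queries the sites below its own, from high to low, and stops once a query is answered “yes”. This is my own sketch, not a protocol from the paper, and it assumes (beyond what the abstract states) that a “yes” at site $i$ also reveals the querier's location to the agent at $i$, so both agents end up locating each other. Its worst-case total of $n-1$ queries matches the $n-1$ figure quoted for the oblivious setting.

```python
# Sketch of a simple symmetric mutual-search protocol for two agents
# (my illustration; not taken from the paper). Each agent at site s
# queries sites s-1, s-2, ..., 0 until a query is answered "yes".
# Assumption: a "yes" at site i also reveals the querier's site to the
# agent at i, so both agents locate each other.

def scan_down(n, a, b):
    """Return the total number of queries posed by both agents."""
    occupied = {a, b}
    total = 0
    for agent in (a, b):
        for site in range(agent - 1, -1, -1):
            total += 1            # pose "Anybody at site `site`?"
            if site in occupied:  # answered "yes": this agent stops
                break
    return total

# The lower-sited agent poses a_low queries (all "no"); the higher one
# poses a_high - a_low queries, so the total is a_high <= n - 1.
n = 16
worst = max(scan_down(n, a, b) for a in range(n) for b in range(n) if a != b)
print(worst)  # 15, i.e. n - 1
```

Of course, the interesting protocols in the paper beat $n-1$ by reasoning from silence (“no news is also news”); this sketch only fixes intuition for what a query and a worst-case count mean.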