Let a communication network be modelled by a graph <i>G</i>=(<i>V</i>,<i>E</i>) with <i>n</i> nodes and <i>m</i> edges. Classic communication operations among the nodes of the graph (e.g., broadcasting, multicasting, and gossiping) usually take place on a subgraph of <i>G</i>, built so as to optimize a given objective function. Among the many possible topologies for such a subgraph, the <i>minimum radius spanning tree</i> (MRST) is a rooted spanning tree that minimizes the distance from the root to a farthest node. The MRST finds a natural application in several facility location problems, since it minimizes the maximum delay for reaching the center from the periphery of the tree. In this paper, we consider the problem of computing an MRST in a non-cooperative setting in which each edge of <i>G</i> is controlled by a selfish agent, and the cost she asks for the use of her edge depends only on her private information. Under these assumptions, we provide a <i>mechanism</i> that computes a true MRST of <i>G</i> in <i>O</i>(<i>mn</i>√<i>n</i>+<i>n</i><sup>3</sup> log <i>n</i>) time and <i>O</i>(<i>n</i><sup>2</sup>√<i>n</i>) space. Interestingly, this is at most a linear factor away from the time needed to compute an MRST in a canonical centralized framework.
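As background to the centralized problem mentioned above: an MRST rooted at a node <i>r</i> is a shortest-path tree from <i>r</i>, so its radius equals the eccentricity of <i>r</i>, and the optimal root is a graph center, i.e., a node of minimum eccentricity. The following is a minimal sketch of this reduction (function and variable names are our own, not from the paper), running Dijkstra from every node:

```python
import heapq

def dijkstra(adj, src):
    # Standard Dijkstra over a weighted adjacency dict:
    # adj maps each node to a list of (neighbour, edge weight) pairs.
    dist = {v: float('inf') for v in adj}
    dist[src] = 0
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist[u]:
            continue  # stale queue entry
        for v, w in adj[u]:
            nd = d + w
            if nd < dist[v]:
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

def mrst_root(adj):
    # A shortest-path tree from r has radius equal to r's eccentricity
    # (the maximum shortest-path distance from r), and no spanning tree
    # can do better, since tree distances are at least graph distances.
    # Hence the MRST root is a graph center.
    best_root, best_radius = None, float('inf')
    for r in adj:
        ecc = max(dijkstra(adj, r).values())
        if ecc < best_radius:
            best_root, best_radius = r, ecc
    return best_root, best_radius
```

This naive center computation costs one Dijkstra run per node; the mechanism of the paper must additionally compute the payments for the agents, which is where its extra (at most linear) overhead arises.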