The problem of signal approximation has been treated extensively by the signal processing and mathematics communities. It is usually posed as minimizing an appropriate metric between the given signal and the signals described by a certain model. In this paper we consider the general signal model described by multirate systems, allowing multiple channels with nonuniform interpolation ratios. We show that the solution minimizing the ℓ2 norm of the approximation error involves filtering that is in general time-varying. We also identify the conditions under which LTI filters suffice, and we consider a special uniform case that provides further insight.
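The ℓ2-optimal approximation within a linear signal model is an orthogonal projection onto the model subspace. A minimal single-channel sketch of this idea (the setup, kernel, and names here are illustrative assumptions, not taken from the paper): model signals as the output of upsampling by L followed by a fixed FIR kernel, and solve for the coefficients that minimize the ℓ2 error.

```python
import numpy as np

# Illustrative sketch (not the paper's construction): approximate a
# length-N signal by a single-channel multirate model, i.e. upsample
# the coefficients by L and filter with a fixed FIR kernel. The model
# class is the column space of the synthesis matrix S, and the
# l2-optimal approximation is the orthogonal projection onto it.

rng = np.random.default_rng(0)
N, L = 12, 3                                  # signal length, interpolation ratio
taps = np.array([0.25, 0.5, 1.0, 0.5, 0.25])  # illustrative FIR kernel

# Build the synthesis matrix: column k holds the kernel centered at k*L.
K = N // L                                    # number of model coefficients
S = np.zeros((N, K))
for k in range(K):
    for t, h in enumerate(taps):
        idx = k * L + t - len(taps) // 2
        if 0 <= idx < N:
            S[idx, k] += h

x = rng.standard_normal(N)                    # signal to approximate
c, *_ = np.linalg.lstsq(S, x, rcond=None)     # l2-optimal coefficients
x_hat = S @ c                                 # best approximation in the model

# Optimality check: the residual is orthogonal to the model subspace.
residual = x - x_hat
print(np.max(np.abs(S.T @ residual)))         # numerically ~ 0
```

Note that the overall signal-to-approximation map here, x ↦ S S⁺ x, is generally not a Toeplitz (convolution) operator, which is consistent with the paper's point that the optimal solution involves time-varying rather than LTI filtering.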