In physics-based engineering modeling and uncertainty quantification, it is challenging to distinguish the effects of two main sources of uncertainty: calibration parameter uncertainty and model discrepancy. Previous research has shown that identifiability, quantified by the posterior covariance of the calibration parameters, can sometimes be improved by experimentally measuring multiple responses of the system that share a mutual dependence on a common set of calibration parameters. In this paper, we address how to select the subset of responses to measure experimentally that best enhances identifiability. We use a preposterior analysis approach that, after conducting computer simulations but prior to conducting physical experiments, predicts the degree of identifiability that will result from measuring different subsets of responses. Identifiability is predicted via the preposterior covariance from a modular Bayesian Monte Carlo analysis of a multi-response spatial random process (SRP) model. To reduce the computational cost of the preposterior analysis, we further propose a surrogate preposterior analysis based on the Fisher information of the calibration parameters. The proposed methods are applied to a simply supported beam example, selecting two of six responses to best improve identifiability. The estimated preposterior covariance is compared to the actual posterior covariance to demonstrate the effectiveness of the methods.
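As a rough illustration of the Fisher-information surrogate idea, the sketch below ranks candidate response subsets by the information they carry about the calibration parameters. It is a minimal sketch under our own simplifying assumptions (a Gaussian likelihood with known sensitivities and noise variances, and a D-optimality score), not the paper's actual SRP-based formulation; all names are hypothetical.

```python
import numpy as np
from itertools import combinations

def fisher_information(J, noise_var):
    """Fisher information of the calibration parameters under an assumed
    Gaussian likelihood: I = J^T Sigma^{-1} J, where J is the sensitivity
    (Jacobian) of the selected responses w.r.t. the parameters."""
    Sigma_inv = np.diag(1.0 / noise_var)
    return J.T @ Sigma_inv @ J

def rank_subsets(J_full, noise_var, subsets):
    """Score each candidate subset by the log-determinant of its Fisher
    information (D-optimality); a larger score suggests a smaller
    (pre)posterior covariance, i.e. better identifiability."""
    scores = {}
    for s in subsets:
        idx = list(s)
        I = fisher_information(J_full[idx, :], noise_var[idx])
        sign, logdet = np.linalg.slogdet(I)
        scores[s] = logdet if sign > 0 else -np.inf
    best = max(scores, key=scores.get)
    return best, scores

# Toy setting loosely mirroring the beam example: six candidate responses,
# two calibration parameters, choose two responses to measure.
rng = np.random.default_rng(0)
J_full = rng.normal(size=(6, 2))   # illustrative sensitivities (assumed)
noise_var = np.full(6, 0.1)        # illustrative noise variances (assumed)
best, scores = rank_subsets(J_full, noise_var,
                            list(combinations(range(6), 2)))
print("best pair of responses:", best)
```

In this surrogate view, inverting the Fisher information of a subset approximates the posterior covariance that full preposterior Monte Carlo analysis would estimate, so subsets can be screened cheaply before any physical experiment.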