This paper presents a novel method for synthesizing an arbitrary view from two sets of differently focused images captured by a sparse camera array, for a scene consisting of two approximately constant depth layers. The proposed method consists of two steps. The first step is a view interpolation that reconstructs an all-in-focus dense light field of the scene. The second step synthesizes a novel view from the reconstructed dense light field using a light-field rendering technique. The view interpolation is achieved simply by linear filters that are designed to convert the defocus effects into parallax effects, without estimating a depth map of the scene. The proposed method effectively creates a dense array of pin-hole cameras (i.e., all-in-focus images), so that the synthesized novel view is of higher quality than that produced by the traditional method using a sparse array of cameras. Experimental results on real images from four aligned cameras are shown.
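To illustrate the second step, the following is a minimal sketch of light-field rendering along a one-dimensional camera array: a novel view at a fractional camera position is synthesized by linearly blending the two nearest views in the dense array. This is only an illustrative simplification (the function name and interface are hypothetical); the full technique also shifts pixels according to the two-plane parameterization of the light field.

```python
import numpy as np

def render_novel_view(views, s):
    """Synthesize a view at fractional camera position s by linearly
    blending the two nearest views of a dense 1-D light field.

    views: list/array of images with identical shape, ordered along
           the camera baseline.
    s:     real-valued position, 0 <= s <= len(views) - 1.
    """
    i = int(np.floor(s))
    i = min(i, len(views) - 2)  # clamp so i+1 stays in range
    w = s - i                   # blending weight toward view i+1
    return (1.0 - w) * views[i] + w * views[i + 1]
```

The denser the reconstructed array of all-in-focus views, the smaller the blending baseline, and hence the less ghosting such interpolation introduces, which is why reconstructing a dense light field first improves the final rendered view.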