We seldom misperceive a closer object as larger, even though its retinal image is bigger. One possible underlying mechanism is to compute the size of the retinal image relative to that of another nearby object. Here we set out to investigate whether single neurons in the monkey inferotemporal cortex (IT) are sensitive to the relative size of parts in a display. Each neuron was tested on shapes containing two parts that could be conjoined or spatially separated. Each shape was presented in four versions, created by combining the two parts at each of two possible sizes. In this design, neurons sensitive to the absolute size of parts would show the greatest response modulation when both parts are scaled up, whereas neurons encoding relative size would respond similarly to proportionately scaled versions. Our main findings are that 1) IT neurons responded similarly to all four versions of a shape, but tuning tended to be more consistent between versions with proportionately scaled parts; 2) in a subpopulation of cells, we observed interactions that produced similar responses to proportionately scaled parts; 3) these interactions developed together with sensitivity to absolute size for objects with conjoined parts but slightly later for objects with spatially separated parts. Taken together, our results demonstrate for the first time that a subpopulation of neurons in IT encodes the relative size of parts in a display, forming a potential neural substrate for size constancy.