A couple of years ago, one of us wrote a ProGuestus article entitled "How Far Did That Fly Ball Travel?". The article posed the question: How well does the initial velocity vector (speed and angles) determine the landing point of a fly ball? Using HITf/x data for the initial velocity and ESPN Home Run Tracker for the landing point and hang time, the article found that the initial velocity vector only poorly determines the landing location. Specifically, even within a narrow range of initial speed and launch angle, distances spanned 370-440 ft, with a mean of 405 ft and a standard deviation of 16 ft.

Much of the rest of the article was devoted to speculation about why that is the case. Variations in air density due to temperature, elevation, and related effects were eliminated by considering only home runs hit in a narrow range of air density. A similar range of distances was observed in covered stadiums, thereby eliminating wind as the primary factor. Two other possible reasons were identified and investigated: variation in backspin and variation in the air drag properties of the baseball. The latter is a very intriguing possibility, since variation in the seam height and/or surface roughness of the ball might have a significant effect on the air resistance experienced by the ball.
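The sensitivity of carry distance to the drag properties of the ball can be sketched with a minimal 2-D trajectory integration. This is a hedged illustration, not the model used in the article: the constants are nominal, spin-induced (Magnus) lift and wind are ignored, and the function `carry_distance` and its parameter values are assumptions chosen only to show how a modest change in drag coefficient moves the landing point by tens of feet.

```python
import math

def carry_distance(v0_mph, launch_deg, cd, dt=0.001):
    """Integrate a simple 2-D trajectory with quadratic air drag.

    Illustrative constants only; spin-induced lift and wind are ignored,
    so absolute distances will differ from real batted balls.
    """
    rho = 1.194      # air density, kg/m^3 (near sea level)
    mass = 0.145     # baseball mass, kg
    radius = 0.0366  # baseball radius, m
    area = math.pi * radius**2
    g = 9.81         # gravitational acceleration, m/s^2

    v0 = v0_mph * 0.44704            # mph -> m/s
    theta = math.radians(launch_deg)
    x, y = 0.0, 1.0                  # launch point ~1 m off the ground
    vx, vy = v0 * math.cos(theta), v0 * math.sin(theta)

    # Simple Euler stepping until the ball returns to ground level
    while y > 0.0:
        v = math.hypot(vx, vy)
        # drag deceleration, directed opposite the velocity vector
        a_drag = 0.5 * rho * cd * area * v * v / mass
        ax = -a_drag * vx / v
        ay = -g - a_drag * vy / v
        vx += ax * dt
        vy += ay * dt
        x += vx * dt
        y += vy * dt

    return x / 0.3048  # meters -> feet

# Same initial velocity vector, three hypothetical drag coefficients:
for cd in (0.30, 0.35, 0.40):
    print(f"Cd = {cd:.2f}: {carry_distance(103.0, 28.0, cd):.0f} ft")
```

Even in this stripped-down model, holding the initial speed and launch angle fixed while letting the drag coefficient vary produces a spread in carry distance, which is why ball-to-ball variation in drag is such an attractive candidate explanation.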