We explore properties of the entropy, the Kullback-Leibler information, and the mutual information for order statistics. The probability integral transformation plays a pivotal role in developing our results. We provide bounds for the entropy of order statistics, along with results that relate the entropy ordering of order statistics to other well-known orderings of random variables. We show that the discrimination information between order statistics and the parent distribution of the data, the discrimination information among order statistics, and the mutual information between order statistics are all distribution-free and are computable using the distributions of order statistics of samples from the uniform distribution. We also discuss information properties of spacings for uniform and exponential samples and provide a large-sample, distribution-free result on the entropy of spacings. The results reveal interesting symmetries in the information orderings among order statistics.
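The distribution-free character of these quantities can be illustrated numerically: the $i$-th order statistic of a uniform$(0,1)$ sample of size $n$ follows a Beta$(i, n-i+1)$ distribution, so its entropy has a closed form in terms of the digamma function. The following sketch (assuming SciPy is available; the helper name `beta_entropy` is ours, not from the paper) checks the closed form against SciPy's built-in entropy and exhibits one of the symmetries, $H(U_{(i)}) = H(U_{(n-i+1)})$:

```python
from scipy.special import betaln, digamma
from scipy.stats import beta


def beta_entropy(a: float, b: float) -> float:
    """Differential entropy (in nats) of Beta(a, b), using the standard
    closed form: ln B(a,b) - (a-1)psi(a) - (b-1)psi(b) + (a+b-2)psi(a+b)."""
    return (betaln(a, b)
            - (a - 1) * digamma(a)
            - (b - 1) * digamma(b)
            + (a + b - 2) * digamma(a + b))


# The i-th order statistic of a uniform(0,1) sample of size n is
# Beta(i, n - i + 1), so its entropy depends only on (i, n).
n, i = 10, 3
h_closed = beta_entropy(i, n - i + 1)
h_scipy = beta(i, n - i + 1).entropy()

# Symmetry: the i-th and (n - i + 1)-th uniform order statistics
# have equal entropy, since Beta(a, b) and Beta(b, a) are reflections.
h_mirror = beta_entropy(n - i + 1, i)
```

This is only a check of the uniform building block; the paper's results use such Beta entropies, via the probability integral transformation, to compute information measures for order statistics from arbitrary parent distributions.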