When listening to music, humans often focus on melodic and rhythmic elements to identify specific songs or genres. While these representations may be quite simple, they still capture and differentiate higher-level aspects of music such as expressive intent and musical style. In this work we seek to extract and represent rhythmic patterns from a polyphonic audio corpus spanning a number of styles. We design a compact feature that probabilistically models rhythmic activations within musical beat divisions through histograms of inter-onset intervals (IOIs). Onset detection functions are calculated from multiple frequency bands of a perceptually motivated filter bank, allowing patterns of lower-pitched and higher-pitched onsets to be described separately. Through a set of supervised and unsupervised experiments, we show that this feature is well suited to a variety of tasks in which quantifying rhythmic style is necessary.
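To make the core idea concrete, the following is a minimal sketch of how an IOI histogram quantized to beat divisions might be computed. It is not the paper's actual feature pipeline: the function name, parameters, and the single-band input are illustrative assumptions (the full feature operates on per-band onset detection functions from a filter bank, which is omitted here).

```python
import numpy as np

def ioi_beat_histogram(onsets, beat_period, bins_per_beat=4, max_beats=4):
    """Sketch of a rhythmic-style feature: a histogram of inter-onset
    intervals (IOIs) expressed in beat subdivisions.

    onsets        -- onset times in seconds, sorted ascending
    beat_period   -- estimated beat duration in seconds
    bins_per_beat -- subdivision resolution (4 = sixteenth-note grid)
    max_beats     -- IOIs longer than this many beats are discarded
    """
    iois = np.diff(np.asarray(onsets, dtype=float))
    # Express each IOI as a (possibly fractional) number of beats,
    # then quantize to the nearest beat subdivision.
    subdivision = np.round((iois / beat_period) * bins_per_beat).astype(int)
    n_bins = max_beats * bins_per_beat
    subdivision = subdivision[(subdivision > 0) & (subdivision <= n_bins)]
    hist = np.bincount(subdivision, minlength=n_bins + 1)[1:].astype(float)
    total = hist.sum()
    # Normalize so the histogram reads as a probability distribution
    # over beat subdivisions, matching the probabilistic framing above.
    return hist / total if total > 0 else hist

# Toy example: straight eighth notes at 120 BPM (beat period 0.5 s).
onsets = np.arange(0.0, 4.0, 0.25)
h = ioi_beat_histogram(onsets, beat_period=0.5)
```

In this toy case every IOI is half a beat, so all of the probability mass lands in the half-beat bin. In the two-band setting described above, one such histogram would be computed per band, so low-pitched (e.g. kick drum) and high-pitched (e.g. hi-hat) rhythmic patterns are modeled separately.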