Julián Villegas

HRIR~, a new software audio filter for Head-Related Impulse Response (HRIR) convolution, is presented. The filter, implemented as a Pure-Data object, allows dynamic modification of a sound source's apparent location by modulating its virtual azimuth, elevation, and range in real time; the last attribute is missing from the similar applications surveyed. With …
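The core operation behind an HRIR filter of this kind is convolving a mono source with a left/right impulse-response pair. The sketch below is illustrative only, not the HRIR~ implementation; the function name and the toy 3-tap "HRIRs" are invented for the example.

```python
import numpy as np

def binauralize(mono, hrir_left, hrir_right):
    """Convolve a mono signal with a left/right HRIR pair to place
    the source at the direction where the HRIRs were measured."""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    return np.stack([left, right], axis=0)  # shape: (2, N + L - 1)

# Toy example: a unit impulse filtered by made-up 3-tap HRIRs.
src = np.array([1.0, 0.0, 0.0])
out = binauralize(src, np.array([0.5, 0.25, 0.0]),
                       np.array([0.25, 0.5, 0.0]))
print(out.shape)  # (2, 5)
```

A real-time object would instead run this as block-wise (partitioned) convolution, cross-fading between HRIR pairs as the virtual source moves.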
We investigated the effect on objective speech intelligibility of scaling the fundamental frequency (f0) of voiced regions in a set of utterances. The frequency scaling was driven by maximising the glimpse proportion in voiced epochs, inspired by musical consonance maximisation techniques. Results show that, depending on the energetic masker and the signal …
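The glimpse proportion referred to here is, in essence, the fraction of time-frequency regions where the target's level exceeds the masker's by some threshold. A minimal sketch of that metric, with an invented function name and a toy 3 dB threshold chosen for illustration:

```python
import numpy as np

def glimpse_proportion(target_db, masker_db, threshold_db=3.0):
    """Fraction of time-frequency cells where the target level exceeds
    the masker level by at least `threshold_db` (a "glimpse")."""
    glimpses = (target_db - masker_db) >= threshold_db
    return glimpses.mean()

# Toy 2x2 time-frequency grid of levels in dB.
target = np.array([[60.0, 50.0], [55.0, 40.0]])
masker = np.array([[50.0, 49.0], [54.0, 45.0]])
gp = glimpse_proportion(target, masker)
print(gp)  # 0.25 — only one of the four cells is glimpsed
```

An f0-scaling optimiser of the kind described would evaluate a measure like this over candidate scale factors in voiced epochs and keep the one maximising it.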
How does a background conversation affect a foreground conversation? In this scenario, and unlike traditional studies of noise-induced speech modification (Lombard speech), listeners have to cope with the additional challenge of competing speech material. In the current study, pairs of talkers engaged in natural dialogs in the absence or presence of …
Speech produced in the presence of noise (Lombard speech) is typically more intelligible than speech produced in quiet (plain speech) when presented at the same signal-to-noise ratio, but the factors responsible for the Lombard intelligibility benefit remain poorly understood. Previous studies have demonstrated a clear effect of spectral differences between …
Modern smartphones and tablets have magnetometers that can detect yaw, whose data can be distributed to adjust ambient media. Either static (pointing) or dynamic (twirling) modes can be used to modulate multimodal displays, including 360° imagery and virtual environments. Azimuthal tracking especially allows control of horizontal planar …
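Deriving yaw from a magnetometer reduces to taking the angle of the horizontal field components. The sketch below assumes the device is held flat (no tilt compensation) and uses an invented function name:

```python
import math

def yaw_from_magnetometer(mx, my):
    """Estimate heading (yaw) in degrees from the horizontal magnetometer
    components, assuming the device lies flat. 0 deg = magnetic +x axis."""
    heading = math.degrees(math.atan2(my, mx))
    return heading % 360.0  # fold into [0, 360)

print(yaw_from_magnetometer(0.0, 1.0))  # 90.0
```

In practice, a tilt-compensated heading would fuse accelerometer data to project the field onto the horizontal plane before taking the arctangent; this heading stream is what a pointing or twirling interface would distribute to the ambient display.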
The rapid advances in the technology and science of presenting spatial sound in virtual, augmented, and mixed-reality environments seem to be underrepresented in recent literature. The goal of this special issue of the Virtual Reality Journal is twofold: to provide a state-of-the-art review of progress in spatial sound as applied to virtual reality (VR) and …
A real-time system for sound spatialization via headphones is presented. Conventional headphone spatialization techniques effectively place sources on the surface of a virtual sphere around the listener. In the new system, sources can be spatialized at different distances from a listener by interpolating head-related impulse responses (HRIRs) measured …
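One simple way to approximate an HRIR at an intermediate distance is to interpolate between HRIRs measured at a nearer and a farther range. The sketch below shows plain linear interpolation; it is an assumption for illustration, not the interpolation scheme of the system described, and the function name and toy 2-tap HRIRs are invented.

```python
import numpy as np

def interpolate_hrir(hrir_near, hrir_far, r, r_near, r_far):
    """Linearly interpolate between HRIRs measured at ranges r_near and
    r_far to approximate the HRIR at an intermediate range r."""
    w = (r - r_near) / (r_far - r_near)
    w = min(max(w, 0.0), 1.0)  # clamp outside the measured interval
    return (1.0 - w) * hrir_near + w * hrir_far

# Toy 2-tap HRIRs measured at 1 m and 2 m; query the midpoint.
h = interpolate_hrir(np.array([1.0, 0.0]), np.array([0.0, 1.0]),
                     r=1.5, r_near=1.0, r_far=2.0)
print(h)  # [0.5 0.5]
```

Real systems often interpolate in a time-aligned or minimum-phase domain rather than directly on raw impulse responses, to avoid comb-filtering artifacts from mismatched onset delays.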