Luca Turchet

A system to synthesize the sound of footsteps on different materials in real time is presented. The system is based on microphones, which allow users to interact with their own footwear. This solution distinguishes our system from previous efforts that require specific shoes enhanced with sensors. The microphones detect real footstep sounds from users, …
We describe a system which simulates in real time the auditory and haptic sensations of walking on different surfaces. The system is based on a pair of sandals enhanced with pressure sensors and actuators. The pressure sensors detect the interaction force during walking and control several physically based synthesis algorithms, which drive both the auditory …
We describe a system that can provide combined auditory and haptic sensations that arise while walking on different grounds. The simulation is based on a physical model that drives both haptic transducers embedded in sandals and headphones. The model is able to represent walking interactions with solid surfaces that can creak, be covered with crumpling …
We propose a system that affords real-time sound synthesis of footsteps on different materials. The system is based on microphones, which detect real footstep sounds from subjects, from which the ground reaction force (GRF) is estimated. The estimated GRF is used to control a sound synthesis engine based on physical models. Two experiments were conducted. In the …
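The GRF-from-audio idea described above can be approximated as a short-time energy envelope of the microphone signal. The function below is a minimal illustrative sketch of that approach; the function name, window size, and the synthetic test signal are assumptions for the example, not the paper's actual estimator.

```python
import numpy as np

def grf_proxy_envelope(signal, sample_rate, window_ms=10.0):
    """Estimate a ground-reaction-force (GRF) proxy from a footstep
    audio signal as a short-time RMS amplitude envelope.
    Hypothetical helper for illustration, not the published method."""
    window = max(1, int(sample_rate * window_ms / 1000.0))
    # Pad the squared signal so every sample gets a full averaging window.
    padded = np.pad(np.asarray(signal, dtype=float) ** 2,
                    (window // 2, window - window // 2 - 1), mode="edge")
    kernel = np.ones(window) / window
    # Moving average of the energy, then square root -> RMS envelope.
    envelope = np.sqrt(np.convolve(padded, kernel, mode="valid"))
    return envelope

# Example: a short decaying burst stands in for a recorded footstep.
sr = 8000
t = np.arange(sr // 10) / sr
step = np.exp(-30 * t) * np.sin(2 * np.pi * 180 * t)
env = grf_proxy_envelope(step, sr)
```

The resulting envelope is a slowly varying, nonnegative signal of the same length as the input, which could then be mapped to control parameters of a physically based synthesis engine.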
We describe a multimodal system that exploits the use of footwear-based interaction in virtual environments. We developed a pair of shoes enhanced with pressure sensors, actuators, and markers. These shoes control a multichannel surround sound system and drive a physically based audio-haptic synthesis engine that simulates the act of walking on different …
In this paper we present an experiment whose goal is to investigate subjects' ability to match pairs of synthetic auditory and haptic stimuli which simulate the sensation of walking on different surfaces. In three non-interactive conditions the audio–haptic stimuli were passively presented through a desktop system, while in three interactive conditions …
This article investigates whether auditory feedback affects natural locomotion patterns. Individuals were provided with footstep sounds simulating different surface materials. The sounds were interactively generated using shoes with pressure sensors. Results showed that subjects' walking speed changed as a function of the type of simulated ground material.
In this paper, we describe several experiments whose goal is to evaluate the role of plantar vibrotactile feedback in enhancing the realism of walking experiences in multimodal virtual environments. To achieve this goal we built an interactive and a noninteractive multimodal feedback system. While during the use of the interactive system subjects physically …
This paper introduces the design of SoleSound, a wearable system designed to deliver ecological audio-tactile underfoot feedback. The device, which primarily targets clinical applications, uses an audio-tactile footstep synthesis engine informed by the readings of pressure and inertial sensors embedded in the footwear to integrate enhanced feedback …
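Several of the systems above map a pressure-sensor reading onto parameters of a physically based synthesis engine. As a rough sketch of that mapping, the example below scales a two-mode resonator by a force value; the mode frequencies, decay rates, and the linear force-to-gain mapping are invented for illustration and are not taken from SoleSound or the other published systems.

```python
import numpy as np

def modal_footstep(force, sample_rate=16000, duration=0.25):
    """Render a short impact sound with a two-mode resonator, scaled by a
    pressure-sensor force reading (hypothetical mapping for illustration)."""
    t = np.arange(int(sample_rate * duration)) / sample_rate
    # Each mode: (frequency in Hz, exponential decay in 1/s, gain).
    modes = [(220.0, 18.0, 1.0), (560.0, 30.0, 0.5)]
    out = sum(g * np.exp(-d * t) * np.sin(2 * np.pi * f * t)
              for f, d, g in modes)
    # A harder step (larger force) yields a proportionally louder strike.
    return force * out

sample = modal_footstep(force=0.8)
```

In a real wearable system the force value would come from the embedded pressure sensors at each step onset, and richer models (creaking solids, crumpling aggregates) would replace the fixed two-mode resonator.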
In this study, we investigated the role of interactive auditory feedback in modulating the inadvertent forward drift experienced while attempting to walk in place with closed eyes following a few minutes of treadmill walking. Simulations of footstep sounds upon surface materials such as concrete and snow were provided by means of a system composed of …