In a modern world where people are busier than ever, family members are geographically dispersed by the globalization of companies, and individuals are inundated with more information than they can process, ambient communication through mobile media or Internet-based channels can provide rich social connections to friends and family. People can stay connected to the loved ones they care about by passively sharing awareness information, and can even simulate real-world living in virtual worlds. Users who wish to maintain a persistent presence in a virtual world, whether to let friends know about their current activity or to inform their caretakers, need new technology. Research that bridges real life and virtual worlds to simulate virtual living, while challenging and promising, remains rare. Most existing work focuses on the dynamic representation of inanimate, passive objects (e.g., buildings, cars) inside virtual worlds. Only very recently has the mapping of real-world activities to virtual worlds been attempted, by processing data from multiple sensors together with inference logic about real-world activities. Detecting or inferring human activity from such simple sensor data is often inaccurate and insufficient. This paper therefore proposes to infer human activity from environmental sound cues and common-sense knowledge, an inexpensive alternative to approaches based on other sensors (e.g., accelerometers). Because of their ubiquity, we believe mobile phones and handheld devices (HHDs) are ideal channels for achieving seamless integration between the physical and virtual worlds. We therefore present a prototype that integrates mobile-phone-based computing with Second Life by inferring activities from environmental sound cues.
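To make the idea concrete, the following is a minimal sketch, not the paper's actual implementation, of how environmental sound cues could be mapped to activity labels on a handheld device: simple audio features are computed per frame and matched to the nearest activity profile. The feature choices, the `PROFILES` table, and all numeric values are hypothetical placeholders, not values from this work.

```python
# Hypothetical sketch: classify an audio frame into an activity label
# using two simple features and nearest-centroid matching.
import math

def features(samples):
    """Return (RMS energy, zero-crossing rate) of an audio frame."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    zcr = sum(
        1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0)
    ) / (len(samples) - 1)
    return rms, zcr

# Hypothetical activity profiles: (rms, zcr) centroids that might be
# learned from labeled environmental recordings.
PROFILES = {
    "quiet room (resting)": (0.01, 0.05),
    "conversation":         (0.10, 0.20),
    "street traffic":       (0.30, 0.40),
}

def infer_activity(samples):
    """Return the activity whose feature centroid is closest to the frame."""
    rms, zcr = features(samples)
    return min(
        PROFILES,
        key=lambda a: (PROFILES[a][0] - rms) ** 2 + (PROFILES[a][1] - zcr) ** 2,
    )

# A low-energy, slowly varying frame lands near the quiet profile.
frame = [0.01 * math.sin(2 * math.pi * 2 * t / 100) for t in range(100)]
print(infer_activity(frame))
```

A deployed system would of course use richer features (e.g., spectral ones) and a trained classifier, combined with common-sense knowledge to disambiguate activities, but the frame-to-label pipeline has this overall shape.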
To the best of our knowledge, this system pioneers the use of environmental-sound-based activity recognition in mobile computing to reflect a user's real-world activity in virtual worlds.