Botanicus Interacticus: interactive plants technology

Ivan Poupyrev, Philipp Schoessler, J. Loh, Munehiko Sato
In: International Conference on Computer Graphics and Interactive Techniques
Botanicus Interacticus is a technology for designing highly expressive interactive plants, both living and artificial. We are motivated by the rapid fusion of computing and our dwelling spaces, as well as the increasingly tactile and gestural nature of our interactions with digital devices. Today, however, this interaction happens either on the touch screens of tablet computers and smart phones, or in free air, captured by camera-based devices, such as the Kinect. What if, instead of this… 


InfoPlant: Multimodal augmentation of plants for enhanced human-computer interaction

A subsequent study showed that the InfoPlant was indeed perceived as unobtrusive by the large majority of participants and that it was easily accepted as a possible new entity in a living-room context.

Pudica: A Framework For Designing Augmented Human-Flora Interaction

A framework for designing human-flora interaction in plant-based interfaces is introduced; such interfaces could play a prominent role in a world where HCI strives to be less polluting, more power-saving, and more humane.

Multi-sensory EmotiPlant: multimodal interaction with augmented plants

A framework for multisensory augmented plants is proposed, and the designs of three different augmented plants that can communicate with humans through different modalities are presented, aimed at bridging the communication gap between plants and humans.

Interactive Plants: Multisensory Visual-Tactile Interaction Enhances Emotional Experience

Using a multisensory interface system, we examined how people’s emotional experiences change as their tactile sense (touching a plant) was augmented with visual sense (“seeing” their touch).

Animals, plants, people and digital technology: exploring and understanding multispecies-computer interaction

Since its establishment in the early 1980s, the field of human-computer interaction (HCI) has been founded on understanding interactions between humans and computers, and its development has been permeated by anthropocentrism.

Growth, Change and Decay: Plants and Interaction Possibilities

Early findings indicate that using a plant-based interface triggered emotive connections, making interactions more enjoyable; the future potential of plants as an interaction medium is also considered.

Plant interaction

This work focuses on treating plants on the same level as human beings, enabling an interactive experience between humans and living plants; the prototype can, at moments, visualize a plant's biological reactions, presenting its inner vitality artistically.

Cyborg Botany: Exploring In-Planta Cybernetic Systems for Interaction

Merging synthetic circuitry with a plant's own physiology could pave the way to making these lifeforms responsive to our interactions and enable their ubiquitous, sustainable deployment.

Capacitive sensing and communication for ubiquitous interaction and environmental perception

This work presents a human-centric approach to perceiving the environment with quasi-electrostatic fields by making use of capacitive coupling between devices and objects, and contributes to the domains of context-aware devices and explicit gesture-recognition systems.

NatureCHI: Unobtrusive User Experiences with Technology in Nature

This workshop addresses the challenges of interacting with technology in nature, including interaction design and prototyping, social and cultural issues, and user experiences that aim for unobtrusive interaction with technology, with nature as the use context.

Touché: enhancing touch interaction on humans, screens, liquids, and everyday objects

The rich capabilities of Touché are demonstrated with five example setups from different application domains and experimental studies that show gesture classification accuracies of 99% are achievable with the technology.
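The gesture classification described above works by comparing the shape of a capacitive response measured across a sweep of excitation frequencies against per-gesture templates. The following is a minimal illustrative sketch of that idea, not Touché's actual implementation: the template profiles, gesture names, and nearest-neighbor matching are all assumptions chosen for clarity.

```python
import math

def normalize(profile):
    """Scale a swept-frequency capacitive profile to unit peak so that
    classification depends on the profile's shape, not signal amplitude."""
    peak = max(profile)
    return [v / peak for v in profile]

def classify(profile, templates):
    """Nearest-neighbor match of a normalized sweep profile against
    per-gesture template profiles, using Euclidean distance."""
    p = normalize(profile)
    best, best_dist = None, math.inf
    for gesture, template in templates.items():
        t = normalize(template)
        dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(p, t)))
        if dist < best_dist:
            best, best_dist = gesture, dist
    return best

# Hypothetical template profiles: one response value per swept frequency.
templates = {
    "one_finger": [0.2, 0.8, 1.0, 0.6, 0.3],
    "full_grasp": [0.9, 1.0, 0.7, 0.4, 0.2],
}

print(classify([0.19, 0.82, 0.97, 0.61, 0.28], templates))  # -> one_finger
```

In practice a trained classifier (the paper reports accuracies up to 99%) would replace the simple nearest-neighbor match, but the core signal representation, a per-frequency response profile, is the same.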