FlexCase: Enhancing Mobile Interaction with a Flexible Sensing and Display Cover

@inproceedings{Rendl2016FlexCaseEM,
  title={FlexCase: Enhancing Mobile Interaction with a Flexible Sensing and Display Cover},
  author={Christian Rendl and David Kim and Patrick Parzer and S. Fanello and Martin Zirkl and Gregor Scheipl and Michael Haller and Shahram Izadi},
  booktitle={Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems},
  year={2016}
}
FlexCase is a novel flip cover for smartphones, which brings flexible input and output capabilities to existing mobile phones. It combines an e-paper display with a pressure- and bend-sensitive input sensor to augment the capabilities of a phone. Due to the form factor, FlexCase can be easily transformed into several different configurations, each with different interaction possibilities. Users can use FlexCase to perform a variety of touch, pressure, grip and bend gestures in a natural manner… 
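The abstract describes a cover that senses touch, pressure, grip and bend. As a purely illustrative aside, and not the paper's actual sensing pipeline, the short Python sketch below shows one way a pair of pre-normalized pressure and bend readings could be mapped to coarse gesture labels with simple thresholds; the function, the threshold values, and the assumption of two scalar inputs are all hypothetical.

# Hypothetical sketch: mapping two normalized readings from a sensing cover
# to coarse gesture labels. Thresholds and labels are illustrative assumptions.

def classify_gesture(pressure: float, bend: float) -> str:
    """Classify one (pressure, bend) sample.

    pressure: normalized force on the cover, 0.0 (none) to 1.0 (hard press)
    bend:     signed curvature, -1.0 (bent back) to 1.0 (bent forward)
    """
    BEND_THRESHOLD = 0.3   # assumed dead zone for incidental flexing
    PRESS_THRESHOLD = 0.6  # assumed force separating a touch from a press

    if abs(bend) >= BEND_THRESHOLD:
        return "bend_forward" if bend > 0 else "bend_backward"
    if pressure >= PRESS_THRESHOLD:
        return "press"
    if pressure > 0.0:
        return "touch"
    return "idle"


if __name__ == "__main__":
    # A few synthetic samples to show the mapping.
    for p, b in [(0.2, 0.05), (0.8, 0.1), (0.1, 0.7), (0.0, -0.5)]:
        print(f"pressure={p:.2f} bend={b:+.2f} -> {classify_gesture(p, b)}")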
Paper for E-Paper: Towards Paper Like Tangible Experience using E-Paper
TLDR
The goal in this work is to replicate the feedback and affordances one would receive from a printed book on a mobile device, where, to fully replicate the reading experience, the user would need to turn pages as they would naturally with a printed book.
Skin-On Interfaces: A Bio-Driven Approach for Artificial Skin Design to Cover Interactive Devices
TLDR
A paradigm called Skin-On interfaces is proposed, in which interactive devices have their own (artificial) skin, thus enabling new forms of input gestures for end-users (e.g. twist, scratch) and a toolkit is provided that enables easy reproduction and fabrication.
InformationSense: Trade-offs for the Design and the Implementation of a Large Highly Deformable Cloth Display
TLDR
InformationSense, a large, highly deformable cloth display, is presented, which suggests that deformable displays are already suitable if high hedonic qualities are important but need to be enhanced with additional digital power if high pragmatic qualities are required.
Multimodal feedback in HCI: haptics, non-speech audio, and their applications
TLDR
A range of applications where multimodal feedback that involves haptics or non-speech audio can provide usability benefits is presented, motivated by Wickens' Multiple Resources Theory.
HoloFlex: A Flexible Light-Field Smartphone with a Microlens Array and a P-OLED Touchscreen
We present HoloFlex, a 3D flexible smartphone featuring a light-field display consisting of a high-resolution P-OLED display and an array of 16,640 microlenses. HoloFlex allows mobile users to…
ShearSheet: Low-Cost Shear Force Input with Elastic Feedback for Augmenting Touch Interaction
TLDR
The effectiveness of ShearSheet's smooth transition between position- and rate-based control through a controlled user study using simple scrolling tasks is demonstrated, suggesting its substantial benefits and potential applications.
Multimodal Feedback in HCI: Haptics, Non-Speech Audio, and Their Applications
TLDR
This chapter provides an overview of research in the use of these non-visual modalities for interaction, showing how new output modalities can be used in the user interface to different devices.
iSoft: A Customizable Soft Sensor with Real-time Continuous Contact and Stretching Sensing
TLDR
iSoft, a single volume soft sensor capable of sensing real-time continuous contact and unidirectional stretching, is presented and a software toolkit for users to design and deploy personalized interfaces with customized sensors is provided.
Flexy: Shape-Customizable, Single-Layer, Inkjet Printable Patterns for 1D and 2D Flex Sensing
We contribute a new technique for fabricating highly customized 1D and 2D flex sensing surfaces on thin and flexible substrates. It enables designers and makers to easily, quickly and inexpensively…

References

Showing 1-10 of 57 references
PaperPhone: understanding the use of bend gestures in mobile devices with flexible electronic paper displays
TLDR
An evaluation of the effectiveness of various bend gestures in executing a set of tasks with a flexible display found strong consensus on the polarity of the bend gestures, implying that bend gestures that take directional cues into account are likely more natural to users.
Lucid touch: a see-through mobile device
TLDR
Initial study results indicate that many users found touching on the back of the device to be preferable to touching on the front, due to reduced occlusion, higher precision, and the ability to make multi-finger input.
SideSight: multi-"touch" interaction around small devices
TLDR
A prototype device with infra-red proximity sensors embedded along each side and capable of detecting the presence and position of fingers in the adjacent regions is described, which gives a larger input space than would otherwise be possible which may be used in conjunction with or instead of on-display touch input.
Grips and gestures on a multi-touch pen
TLDR
This study shows that both swipe and double tap gestures are comparable in performance to commonly employed barrel buttons without its disadvantages, and can be successfully and comfortably performed regardless of the rotation of the pen or how the user grips it.
Sensing techniques for tablet+stylus interaction
TLDR
Grip and motion sensing is explored to afford new techniques that leverage how users naturally manipulate tablet and stylus devices during pen + touch interaction, and can be used to impart new, previously unanticipated subtleties to pen + touch interaction on tablets.
FlexView: an evaluation of depth navigation on deformable mobile devices
TLDR
FlexView allows users to easily browse depth-arranged information spaces without sacrificing traditional touch interactions, and demonstrates that bend interaction is comparable to touch input for navigation through stacked content.
Duet: exploring joint interactions on a smart phone and a smart watch
TLDR
Duet is an interactive system that explores a design space of interactions between a smart phone and a smart watch, and transforms the watch into an active element that enhances a wide range of phone-based interactive tasks, and enables a new class of multi-device gestures and sensing techniques.
Back-of-device interaction allows creating very small touch devices
TLDR
This paper argues that the key to touch-enabling very small devices is to use touch on the device backside, and presents four form factor concepts based on back-of-device interaction and design guidelines extracted from a second user study.
HandSense: discriminating different ways of grasping and holding a tangible user interface
TLDR
The HandSense prototype employs capacitive sensors for detecting when it is touched or held against a body part, and is able to correctly classify over 80 percent of all touches, discriminating six different ways of touching the device.
Can you handle it?: bimanual techniques for browsing media collections on touchscreen tablets
TLDR
This paper presents a bimanual scrolling technique that splits the control of scrolling speed and scrolling direction across two hands using a combination of pressure, physical dial and touch input.