Abdelkareem Bedri

We explore using the Outer Ear Interface (OEI) to recognize eating activities. OEI contains a 3D gyroscope and a set of proximity sensors encapsulated in an off-the-shelf earpiece to monitor jaw movement by measuring ear canal deformation. In a laboratory setting with 20 participants, OEI could distinguish eating from other activities, such as walking, …
The touchscreen has been the dominant input surface for smartphones and smartwatches. However, a smartwatch screen's small size compared to a phone's limits the richness of the input gestures that can be supported. We present TapSkin, an interaction technique that recognizes up to 11 distinct tap gestures on the skin around the watch using only the inertial sensors and …
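As an illustration of the general approach rather than the authors' implementation, the sketch below windows 6-axis inertial samples around a tap and classifies each window with a standard classifier. The feature set, window length, number of tap classes, and the synthetic training data are all placeholders.

```python
# Minimal sketch of IMU-based tap classification (illustrative only).
# Assumes 6-axis samples (ax, ay, az, gx, gy, gz) per window; the synthetic
# data below stands in for real labeled tap windows.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(win):
    """Simple per-axis statistics for one tap window (n_samples x 6)."""
    return np.concatenate([win.mean(axis=0),
                           win.std(axis=0),
                           win.max(axis=0) - win.min(axis=0)])

rng = np.random.default_rng(0)
n_classes, n_per_class, win_len = 11, 40, 50   # 11 tap locations (placeholder sizes)

X, y = [], []
for label in range(n_classes):
    for _ in range(n_per_class):
        # Placeholder: each class gets a slightly different impulse pattern.
        win = rng.normal(0, 0.1, size=(win_len, 6))
        win[10 + label, :3] += 1.0              # fake tap transient
        X.append(window_features(win))
        y.append(label)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```

In practice the classifier would be trained and evaluated on held-out user data rather than scored on its own training set as in this toy example.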
This paper presents an approach for automatically detecting eating activities by measuring deformations in the ear canal walls due to mastication activity. These deformations are measured with three infrared proximity sensors encapsulated in an off-the-shelf earpiece. To evaluate our method, we conducted a user study in a lab setting where 20 participants …
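The paper's actual features and classifiers aren't reproduced here; as a rough sketch of the sensing idea, the code below checks whether the dominant frequency of a proximity-sensor window falls in a chewing-like 1-2 Hz band. The sampling rate, window length, band limits, and synthetic signal are assumptions.

```python
# Rough sketch: flag chewing-like periodicity in an ear-canal proximity signal.
# The sampling rate, window length, band limits, and synthetic data are placeholders.
import numpy as np

FS = 20.0            # assumed proximity sampling rate (Hz)
WIN = int(5 * FS)    # 5-second analysis window

def looks_like_chewing(window, fs=FS):
    """True if the window's dominant frequency sits in a ~1-2 Hz chewing band."""
    x = window - window.mean()
    spec = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    dominant = freqs[np.argmax(spec[1:]) + 1]                 # skip the DC bin
    band = (freqs >= 1.0) & (freqs <= 2.0)
    return 1.0 <= dominant <= 2.0 and spec[band].sum() > 0.5 * spec[1:].sum()

# Synthetic example: 1.5 Hz "chewing" deformation vs. sensor noise only.
t = np.arange(WIN) / FS
chewing = np.sin(2 * np.pi * 1.5 * t) + 0.2 * np.random.randn(WIN)
idle = 0.2 * np.random.randn(WIN)
print(looks_like_chewing(chewing), looks_like_chewing(idle))
```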
We address the problem of performing silent speech recognition where vocalized audio is not available (e.g., due to a user's medical condition) or is highly noisy (e.g., during firefighting or combat). We describe our wearable system to capture tongue and jaw movements during silent speech. The system has two components: the Tongue Magnet Interface (TMI), …
The human ear seems to be a rigid anatomical part with no apparent activity, yet many facial and body activities can be measured from it. Research apparatuses and commercial products have demonstrated the capability of monitoring heart rate, tongue activity, jaw motion, and eye blinking from the ear. In this paper, we describe the design and implementation …
In Automatic Sign Language Recognition (ASLR), robust hand tracking and detection are key to good recognition accuracy. We introduce a new dataset of depth data from continuously signed American Sign Language (ASL) sentences. We present analysis showing numerous errors of the Microsoft Kinect Skeleton Tracker (MKST) in cases where hands are close to the …
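The dataset and error analysis themselves are not reproduced here; as a small illustration of the kind of check involved, the sketch below flags frames in which a tracked hand joint comes within a threshold distance of the head joint, the regime where skeleton trackers such as MKST tend to fail. The joint layout, the distance threshold, and the sample positions are assumptions.

```python
# Illustrative check: flag frames where a hand joint is near the head joint,
# the regime where skeleton hand tracking is most error-prone.
# Joint layout, threshold, and sample positions are placeholders.
import numpy as np

NEAR_FACE_M = 0.15   # assumed "close to face" threshold in meters

def near_face_frames(head_xyz, hand_xyz, threshold=NEAR_FACE_M):
    """Return indices of frames where the hand is within `threshold` of the head."""
    head_xyz = np.asarray(head_xyz, dtype=float)   # shape (n_frames, 3)
    hand_xyz = np.asarray(hand_xyz, dtype=float)   # shape (n_frames, 3)
    dist = np.linalg.norm(hand_xyz - head_xyz, axis=1)
    return np.flatnonzero(dist < threshold)

# Tiny example: three frames, the last one with the hand raised to the face.
head = [[0.0, 1.6, 2.0], [0.0, 1.6, 2.0], [0.0, 1.6, 2.0]]
hand = [[0.3, 1.0, 2.0], [0.2, 1.3, 2.0], [0.05, 1.55, 1.95]]
print(near_face_frames(head, hand))   # -> [2]
```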
The implementation of single-electron tunneling (SET) simulators based on the master-equation (ME) formalism requires the efficient and accurate identification of an exhaustive list of active states and related tunnel events. Dynamic simulations also require controlling the emerging states and guaranteeing the safe elimination of decaying states. This paper …
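The paper's enumeration scheme isn't reproduced here, but as a reminder of the quantity such simulators evaluate, the sketch below computes the standard orthodox-theory tunnel rate for a candidate event from its free-energy change and prunes events whose rate falls below a cutoff. The junction parameters and the cutoff value are illustrative assumptions.

```python
# Sketch of the orthodox-theory tunnel rate used by master-equation SET
# simulators, plus a simple cutoff for pruning negligible events.
# Junction parameters and the rate cutoff below are illustrative values.
import math

E = 1.602176634e-19      # elementary charge (C)
KB = 1.380649e-23        # Boltzmann constant (J/K)

def tunnel_rate(delta_f, r_t, temperature):
    """Orthodox-theory rate for an event with free-energy change delta_f (J).

    delta_f = F_final - F_initial; negative values are energetically favorable.
    """
    if temperature == 0.0:
        return -delta_f / (E**2 * r_t) if delta_f < 0 else 0.0
    x = delta_f / (KB * temperature)
    if abs(x) < 1e-12:                       # avoid 0/0 at delta_f ~ 0
        return KB * temperature / (E**2 * r_t)
    return (-delta_f / (E**2 * r_t)) / (1.0 - math.exp(x))

def active_events(candidates, r_t, temperature, cutoff=1.0):
    """Keep candidate events (name, delta_f) whose rate exceeds `cutoff` (Hz)."""
    rates = {name: tunnel_rate(df, r_t, temperature) for name, df in candidates}
    return {name: rate for name, rate in rates.items() if rate > cutoff}

# Example: two candidate tunnel events across a 100 kOhm junction at 1 K.
events = [("n -> n+1", -1.0e-22), ("n -> n-1", +5.0e-22)]
print(active_events(events, r_t=1e5, temperature=1.0))
```

The energetically unfavorable event in the example is suppressed by many orders of magnitude, which is the property an active-state list exploits when it discards decaying states.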
Motivated by health applications, eating detection with off-the-shelf devices has been an active area of research. A common approach has been to recognize and model individual intake gestures with wrist-mounted inertial sensors. Despite promising results, this approach is limiting as it requires the sensing device to be worn on the hand performing the …
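The recognition pipelines used in that line of work aren't reproduced here; as a toy illustration of the wrist-mounted approach it builds on, the sketch below counts candidate hand-to-mouth gestures by looking for sustained rotation on the gyroscope's roll axis. The axis assignment, thresholds, and synthetic trace are assumptions.

```python
# Toy sketch: count candidate hand-to-mouth (intake) gestures from the
# wrist gyroscope's roll axis. Axis choice, thresholds, and data are placeholders.
import numpy as np

FS = 50.0                    # assumed gyroscope sampling rate (Hz)
ROLL_THRESH = 1.0            # rad/s: "forearm rotating" threshold
MIN_DURATION = int(0.4 * FS) # gesture must persist for at least 0.4 s

def count_intake_candidates(roll_rate):
    """Count runs where |roll rate| stays above threshold long enough."""
    above = np.abs(roll_rate) > ROLL_THRESH
    count, run = 0, 0
    for flag in above:
        run = run + 1 if flag else 0
        if run == MIN_DURATION:      # count each qualifying run exactly once
            count += 1
    return count

# Synthetic 10 s trace with two ~1 s rotation bursts standing in for intakes.
t = np.arange(int(10 * FS)) / FS
roll = 0.1 * np.random.randn(len(t))
for start in (2.0, 6.0):
    burst = (t >= start) & (t < start + 1.0)
    roll[burst] += 2.0
print(count_intake_candidates(roll))   # expect 2
```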
Many single-electron tunneling (SET) devices have been proposed to function as voltage- or current-dependent oscillators. Modeling the behavior of such devices in the time and frequency domains is therefore needed in order to understand and predict the coherence and stability of the resulting oscillations. This paper presents a model that captures the statistics …
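As a loose illustration of why such statistics matter, and not the model from the paper, the sketch below draws the waiting time of each tunnel event from an exponential distribution, so the oscillation period of a current-biased junction jitters from cycle to cycle; it then reports the mean frequency and the relative spread of the period. The charging time, tunnel rate, and cycle count are placeholders.

```python
# Loose illustration: cycle-to-cycle jitter of a SET oscillation when each
# tunnel event waits an exponentially distributed time after becoming allowed.
# The charging time, tunnel rate, and cycle count below are placeholders.
import numpy as np

rng = np.random.default_rng(0)

T_CHARGE = 1.0e-9    # deterministic recharge time per cycle (s), placeholder
GAMMA = 5.0e9        # tunnel rate once the event is allowed (1/s), placeholder
N_CYCLES = 10000

# Each period = deterministic charging + random exponential waiting time.
periods = T_CHARGE + rng.exponential(1.0 / GAMMA, size=N_CYCLES)

mean_t = periods.mean()
jitter = periods.std() / mean_t
print(f"mean frequency ~ {1.0 / mean_t / 1e9:.2f} GHz, "
      f"relative period jitter ~ {jitter:.1%}")
```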