Towards Measuring and Inferring User Interest from Gaze

@inproceedings{Li2017TowardsMA,
  title={Towards Measuring and Inferring User Interest from Gaze},
  author={Yixuan Li and Pingmei Xu and Dmitry Lagun and Vidhya Navalpakkam},
  booktitle={Proceedings of the 26th International Conference on World Wide Web Companion},
  year={2017}
}
How can we reliably infer web users' interest and evaluate content relevance when active user interactions such as clicks are absent? In this paper, we investigate the relationship between mobile users' implicit interest, inferred from attention metrics such as eye gaze or viewport time, and the explicit interest users report. We present the first quantitative gaze-tracking study that uses the front-facing camera of mobile devices instead of specialized, expensive eye-tracking hardware. We…

Citations

Inferring Human Knowledgeability from Eye Gaze in Mobile Learning Environments

This is the first attempt to predict a user’s knowledgeability from eye gaze using a noninvasive eye-tracking method on mobile devices: the authors perform gaze estimation with the front-facing camera of mobile devices rather than specialised eye-tracking equipment.

Evaluation of Appearance-Based Methods and Implications for Gaze-Based Applications

This work evaluates the performance of state-of-the-art appearance-based gaze estimation for interaction scenarios with and without personal calibration, indoors and outdoors, for different sensing distances, as well as for users with and without glasses.

GazeGraph: graph-based few-shot cognitive context sensing from human visual behavior

This work introduces the spatial-temporal gaze graphs and the deep learning-based representation learning method to extract powerful and generalized features from the eye movements for context sensing and develops a few-shot gaze graph learning module that adapts the 'learning to learn' concept from meta-learning to enable quick system adaptation in a data-efficient manner.

Is This Really Relevant? A Guide to Best Practice Gaze-based Relevance Prediction Research

This paper reviews approaches to predict relevance from gaze with regard to five design issues: extracting features, defining the algorithm, setting a prediction scope, eliminating visual distractors, and evaluating the system.

What Does Your Gaze Reveal About You? On the Privacy Implications of Eye Tracking

This paper provides a structured overview of personal data that can be inferred from recorded eye activities and shows that eye tracking data may implicitly contain information about a user’s biometric identity, gender, age, ethnicity, body weight, personality traits, drug consumption habits, emotional state, skills and abilities, fears, interests, and sexual preferences.

Optimizing Visual Element Placement via Visual Attention Analysis

This work proposes a novel data-driven optimization approach for automatically analyzing visual attention and placing visual elements in 3D virtual environments using an eye-tracking virtual reality headset, and uses the predicted gaze-duration output of the regressors to optimize the placement of visual elements with respect to certain visual attention and design goals.

Exploring Pointer Assisted Reading (PAR): Using Mouse Movements to Analyze Web Users' Reading Behaviors and Patterns

This work advocates PAR tracking as a feasible alternative to eye tracking on websites, since tracking the eye gaze of ordinary web users is usually impractical.

EyeSyn: Psychology-inspired Eye Movement Synthesis for Gaze-based Activity Recognition

EyeSyn is presented, a novel suite of psychology-inspired generative models that leverages only publicly available images and videos to synthesize a realistic and arbitrarily large eye movement dataset that can replicate the distinct patterns in the actual gaze signals captured by an eye tracking device.

Post-Click Behaviors Enhanced Recommendation System

Fine-grained post-click behaviors are integrated to alleviate the data sparsity problem of explicit feedback and the data accuracy problem of macro implicit feedback in the deployed article recommendation pipeline.

References

SHOWING 1-10 OF 32 REFERENCES

Gaze Prediction for Recommender Systems

This work shows that it is possible to predict gaze by combining easily-collected user browsing data with eye tracking data from a small number of users in a grid-based recommender interface, and demonstrates that Hidden Markov Models (HMMs) can be applied in this setting.
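
As a rough sketch of how an HMM could score a browsing-derived observation sequence, the forward algorithm below computes the likelihood of a sequence of discretized cursor positions. The two hidden states, the observation alphabet, and every probability here are invented for illustration; none of them come from that paper.

```python
import numpy as np

# Hypothetical 2-state HMM over discretized cursor positions.
# States: 0 = "skimming", 1 = "reading"; observations: 0 = cursor in
# the top half of the grid, 1 = bottom half. All numbers are illustrative.
A = np.array([[0.7, 0.3],    # state transition probabilities
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],    # emission probabilities P(obs | state)
              [0.2, 0.8]])
pi = np.array([0.5, 0.5])    # initial state distribution

def forward(obs):
    """Likelihood of an observation sequence via the forward algorithm."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return float(alpha.sum())

likelihood = forward([0, 0, 1, 1])
```

Comparing such likelihoods across candidate models is the basic mechanism by which an HMM fitted on a small eye-tracked cohort can be applied to browsing data alone.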

Measurement and modeling of eye-mouse behavior in the presence of nonlinear page layouts

These findings show that mouse tracking can be used to infer user attention and information flow patterns on search pages, and develop models to predict user attention (eye gaze) from mouse activity.

What do you see when you're surfing?: using eye tracking to predict salient regions of web pages

An eye-tracking study is presented in which 20 users viewed 361 Web pages while engaged in information foraging and page recognition tasks, and the concept of fixation impact is introduced, a new method for mapping gaze data to visual scenes that is motivated by findings in vision research.

Segment-level display time as implicit feedback: a comparison to eye tracking

The study shows that segment-level display time yields results comparable to eye-tracking-based feedback, and should be considered in future personalization systems as an inexpensive but precise method for implicit feedback.

Predicting Search User Examination with Visual Saliency

A new prediction model based on visual saliency maps and page content features, designed to measure the likelihood of a given area attracting human visual attention, is used to predict users' attention distribution over heterogeneous search components.

Understanding Mobile Searcher Attention with Rich Ad Formats

The findings indicate that showing rich ad formats improves the search experience by drawing more attention to the information-rich ad and allowing users to interact to view more offers, which increases user satisfaction with search.

Web User Interaction Mining from Touch-Enabled Mobile Devices

Some of the challenges faced in mining interaction data for new modes of interaction are discussed, and future research directions in this field are discussed.

Eye‐tracking analysis of user behavior and performance in web search on large and small screens

It is found that users have more difficulty extracting information from search results pages on the smaller screens, although they exhibit less eye movement owing to infrequent use of the scroll function, and that there is no significant difference between the two screens in time spent on search results pages or in the accuracy of finding answers.

Towards better measurement of attention and satisfaction in mobile search

This paper studied whether tracking the browser viewport on mobile phones could enable accurate measurement of user attention at scale, and provide good measurement of search satisfaction in the absence of clicks, and found strong correlations between gaze duration and viewport duration on a per result basis.
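
The per-result relationship reported above is essentially a Pearson correlation between two duration signals. The toy computation below shows the idea; the numbers are made-up placeholders, not data from the study.

```python
import numpy as np

# Hypothetical per-result durations in seconds (illustrative values only).
gaze_s     = np.array([2.1, 0.4, 3.5, 1.2, 0.8])   # eye-tracker gaze time
viewport_s = np.array([4.0, 1.0, 6.5, 2.5, 1.8])   # time result was in viewport

# Pearson correlation between gaze duration and viewport duration.
r = float(np.corrcoef(gaze_s, viewport_s)[0, 1])
```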

Implicit interest indicators

It was found that the time spent on a page, the amount of scrolling on a page, and the combination of time and scrolling had a strong correlation with explicit interest, while individual scrolling methods and mouse clicks were ineffective in predicting explicit interest.
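
One simple way to combine the two implicit signals discussed above is to z-score each and sum them before correlating with explicit ratings. The sketch below uses invented session logs, not data from the paper.

```python
import numpy as np

# Hypothetical session logs (illustrative values, not from the study).
time_on_page = np.array([5.0, 40.0, 12.0, 90.0, 25.0])  # seconds
scroll_depth = np.array([0.1, 0.8, 0.3, 1.0, 0.5])      # fraction of page seen
explicit     = np.array([1.0, 4.0, 2.0, 5.0, 3.0])      # 1-5 interest rating

def z(x):
    """Standardize a feature to zero mean and unit variance."""
    return (x - x.mean()) / x.std()

# Combined implicit-interest score: sum of standardized time and scrolling.
combined = z(time_on_page) + z(scroll_depth)
r_combined = float(np.corrcoef(combined, explicit)[0, 1])
```

Standardizing first keeps a signal with large raw values (seconds) from drowning out one with small values (scroll fraction).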