Towards Measuring and Inferring User Interest from Gaze

@inproceedings{Li2017TowardsMA,
  title={Towards Measuring and Inferring User Interest from Gaze},
  author={Yixuan Li and Pingmei Xu and Dmitry Lagun and Vidhya Navalpakkam},
  booktitle={Proceedings of the 26th International Conference on World Wide Web Companion},
  year={2017}
}
How can we reliably infer web users' interest and evaluate content relevance when active user interactions such as clicks are lacking? In this paper, we investigate the relationship between mobile users' implicit interest, inferred from attention metrics such as eye gaze or viewport time, and explicit interest expressed by users. We present the first quantitative gaze tracking study using the front-facing camera of mobile devices instead of specialized, expensive eye-tracking devices. We…
Inferring Human Knowledgeability from Eye Gaze in Mobile Learning Environments
What people look at during a visual task reflects an interplay between ocular motor functions and cognitive processes. This is the first attempt to predict a user's knowledgeability from eye gaze using a non-invasive eye tracking method on mobile devices: the authors perform gaze estimation using the front-facing camera of mobile devices, in contrast to using specialised eye-tracking devices.
Evaluation of Appearance-Based Methods and Implications for Gaze-Based Applications
This work evaluates the performance of state-of-the-art appearance-based gaze estimation for interaction scenarios with and without personal calibration, indoors and outdoors, for different sensing distances, as well as for users with and without glasses.
GazeGraph: graph-based few-shot cognitive context sensing from human visual behavior
This work introduces spatial-temporal gaze graphs and a deep learning-based representation learning method to extract powerful, generalized features from eye movements for context sensing, and develops a few-shot gaze graph learning module that adapts the 'learning to learn' concept from meta-learning to enable quick system adaptation in a data-efficient manner.
Is This Really Relevant? A Guide to Best Practice Gaze-based Relevance Prediction Research
This paper reviews approaches to predict relevance from gaze with regard to five design issues: extracting features, defining the algorithm, setting a prediction scope, eliminating visual distractors, and evaluating the system.
What Does Your Gaze Reveal About You? On the Privacy Implications of Eye Tracking
This paper provides a structured overview of personal data that can be inferred from recorded eye activities and shows that eye tracking data may implicitly contain information about a user's biometric identity, gender, age, ethnicity, body weight, personality traits, drug consumption habits, emotional state, skills and abilities, fears, interests, and sexual preferences.
Optimizing Visual Element Placement via Visual Attention Analysis
This work proposes a novel data-driven optimization approach for automatically analyzing visual attention and placing visual elements in 3D virtual environments using an eye-tracking virtual reality headset, and uses the predicted gaze duration output of the regressors to optimize the placement of visual elements with respect to certain visual attention and design goals.
Between Clicks and Satisfaction: Study on Multi-Phase User Preferences and Satisfaction for Online News Reading
This work sheds light on the understanding of user click behaviors, provides a method for better estimating user interest and satisfaction, and builds an effective model to predict whether the user actually likes the clicked news.
Post-Click Behaviors Enhanced Recommendation System
Fine-grained post-click behaviors are integrated to alleviate the data sparsity problem of explicit feedback and the data accuracy problem of macro implicit feedback in the deployed article recommendation pipeline.
Saliency Prediction for Mobile User Interfaces
A novel autoencoder-based multi-scale deep learning model that provides saliency prediction at the mobile interface element level is developed, and it is shown that this approach performs significantly better on a range of established metrics.

References

Showing 1–10 of 32 references
Gaze Prediction for Recommender Systems
This work shows that it is possible to predict gaze by combining easily-collected user browsing data with eye tracking data from a small number of users in a grid-based recommender interface, and demonstrates that Hidden Markov Models (HMMs) can be applied in this setting.
Measurement and modeling of eye-mouse behavior in the presence of nonlinear page layouts
These findings show that mouse tracking can be used to infer user attention and information flow patterns on search pages, and models are developed to predict user attention (eye gaze) from mouse activity.
What do you see when you're surfing?: using eye tracking to predict salient regions of web pages
An eye-tracking study is presented in which 20 users viewed 361 web pages while engaged in information foraging and page recognition tasks, and the concept of fixation impact is introduced, a new method for mapping gaze data to visual scenes that is motivated by findings in vision research.
Segment-level display time as implicit feedback: a comparison to eye tracking
The study shows that segment-level display time yields results comparable to eye-tracking-based feedback, and should be considered in future personalization systems as an inexpensive but precise method for implicit feedback.
Predicting Search User Examination with Visual Saliency
A new prediction model based on visual saliency maps and page content features, designed to measure the likelihood of a given area to attract human visual attention, is used to predict users' attention distribution on heterogeneous search components.
Understanding Mobile Searcher Attention with Rich Ad Formats
The findings indicate that showing rich ad formats improves the search experience by drawing more attention to the information-rich ad and allowing users to interact to view more offers, which increases user satisfaction with search.
Web User Interaction Mining from Touch-Enabled Mobile Devices
Web services that thrive on mining user interaction data, such as search engines, can currently track clicks and mouse cursor activity on their web pages. Cursor interaction mining has been shown to…
Eye-tracking analysis of user behavior and performance in web search on large and small screens
It is found that users have more difficulty extracting information from search results pages on the smaller screens, although they exhibit less eye movement as a result of infrequent use of the scroll function, and that there is no significant difference between the two screens in time spent on search results pages or in the accuracy of finding answers.
Towards better measurement of attention and satisfaction in mobile search
This paper studied whether tracking the browser viewport on mobile phones could enable accurate measurement of user attention at scale and provide a good measure of search satisfaction in the absence of clicks, and found strong correlations between gaze duration and viewport duration on a per-result basis.
Implicit interest indicators
It was found that the time spent on a page, the amount of scrolling on a page, and the combination of time and scrolling had a strong correlation with explicit interest, while individual scrolling methods and mouse clicks were ineffective in predicting explicit interest.