Translating Video Recordings of Mobile App Usages into Replayable Scenarios

@inproceedings{BernalCardenas2020TranslatingVR,
  title={Translating Video Recordings of Mobile App Usages into Replayable Scenarios},
  author={Carlos Bernal-C{\'a}rdenas and Nathan Cooper and Kevin Moran and Oscar Chaparro and Andrian Marcus and Denys Poshyvanyk},
  booktitle={2020 IEEE/ACM 42nd International Conference on Software Engineering (ICSE)},
  year={2020},
  pages={309-321}
}
Screen recordings of mobile applications are easy to obtain and capture a wealth of information pertinent to software developers (e.g., bugs or feature requests), making them a popular mechanism for crowdsourced app feedback. Thus, these videos are becoming a common artifact that developers must manage. In light of unique mobile development constraints, including swift release cycles and rapidly evolving platforms, automated techniques for analyzing all types of rich software artifacts provide… 
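The core idea of the paper's pipeline, detecting on-screen touch indicators frame by frame and converting them into replayable input events, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the detections input stands in for the paper's neural object-detection and gesture-classification stages, the frame rate and thresholds are assumptions, and the replay here uses high-level adb shell input commands for simplicity.

import subprocess

FPS = 30                  # assumed recording frame rate
LONG_TAP_MS = 500         # assumed threshold separating taps from long taps
SWIPE_MIN_PIXELS = 20     # assumed minimum travel distance for a swipe

def group_actions(detections, max_gap_frames=3):
    """Split per-frame (frame, x, y) touch detections into gesture groups."""
    groups, current = [], []
    for frame, x, y in sorted(detections):
        if current and frame - current[-1][0] > max_gap_frames:
            groups.append(current)
            current = []
        current.append((frame, x, y))
    if current:
        groups.append(current)
    return groups

def to_adb_command(group):
    """Translate one gesture group into an adb shell input command."""
    (f0, x0, y0), (f1, x1, y1) = group[0], group[-1]
    duration_ms = int((f1 - f0) / FPS * 1000)
    if abs(x1 - x0) + abs(y1 - y0) > SWIPE_MIN_PIXELS:
        return ["adb", "shell", "input", "swipe",
                str(x0), str(y0), str(x1), str(y1), str(duration_ms)]
    if duration_ms >= LONG_TAP_MS:   # long tap emulated as a zero-distance swipe
        return ["adb", "shell", "input", "swipe",
                str(x0), str(y0), str(x0), str(y0), str(duration_ms)]
    return ["adb", "shell", "input", "tap", str(x0), str(y0)]

def replay(detections):
    """Replay the recognized gestures on a connected device or emulator."""
    for group in group_actions(detections):
        subprocess.run(to_adb_command(group), check=True)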

Citations

Translating Video Recordings of Mobile App UI Gestures into Replayable Scenarios for Native and Hybrid Apps
TLDR
V2S+ is an automated approach, based primarily on computer vision techniques, for translating video recordings of Android app usages into replayable scenarios; it adapts recent solutions for object detection and image classification to detect and classify user gestures captured in a video and converts these into a replayable test scenario.
V2S: A Tool for Translating Video Recordings of Mobile App Usages into Replayable Scenarios
TLDR
This paper presents Video2Scenario (V2S), an automated tool that processes video recordings of Android app usages, utilizes neural object detection and image classification techniques to classify the depicted user actions, and translates these actions into a replayable scenario.
Layout and Image Recognition Driving Cross-Platform Automated Mobile Testing
TLDR
This paper uses computer vision techniques to perform UI feature comparison and layout hierarchy extraction on mobile app screenshots, obtaining UI structures that contain rich contextual information about app widgets (e.g., coordinates and relative relationships); these structures form a platform-independent test script that is then used to locate the target widgets under test.
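As a rough illustration of locating a widget from pixels alone (a simple stand-in for the vision-based widget location alluded to above, not the paper's technique), template matching against a screenshot can recover a widget's coordinates. File names and the confidence threshold below are placeholders.

import cv2

screenshot = cv2.imread("screen.png", cv2.IMREAD_GRAYSCALE)
widget = cv2.imread("widget.png", cv2.IMREAD_GRAYSCALE)

# Slide the widget template over the screenshot and keep the best match.
result = cv2.matchTemplate(screenshot, widget, cv2.TM_CCOEFF_NORMED)
_, score, _, top_left = cv2.minMaxLoc(result)

if score > 0.8:  # assumed confidence threshold
    h, w = widget.shape
    center = (top_left[0] + w // 2, top_left[1] + h // 2)
    print(f"widget found at {center} (score {score:.2f})")
else:
    print("widget not found")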
GIFdroid: Automated Replay of Visual Bug Reports for Android Apps
  • Sidong Feng, Chunyang Chen
  • Computer Science
    2022 IEEE/ACM 44th International Conference on Software Engineering (ICSE)
  • 2022
TLDR
GIFdroid, a lightweight approach to automatically replay the execution trace from visual bug reports, adopts image-processing techniques to extract keyframes from the recording, map them to states in the GUI transition graph, and generate the execution trace of those states to trigger the bug.
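A bare-bones version of the keyframe-extraction step mentioned above can be sketched with simple frame differencing; GIFdroid's actual algorithm is more involved, and the threshold here is an assumption.

import cv2

def extract_keyframes(video_path, diff_threshold=12.0):
    """Keep frames whose pixels differ noticeably from the previous frame."""
    cap = cv2.VideoCapture(video_path)
    keyframes, prev_gray, index = [], None, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev_gray is None or cv2.absdiff(gray, prev_gray).mean() > diff_threshold:
            keyframes.append((index, frame))  # the UI likely changed here
        prev_gray, index = gray, index + 1
    cap.release()
    return keyframes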
It Takes Two to Tango: Combining Visual and Textual Information for Detecting Duplicate Video-Based Bug Reports
TLDR
This paper presents Tango, a duplicate-detection technique that operates purely on video-based bug reports by leveraging both visual and textual information, combining tailored computer vision techniques, optical character recognition, and text retrieval.
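For intuition only, a toy fusion of a visual similarity score with a textual one (e.g., computed over OCR'd screen text) might rank duplicate candidates as below; the equal weighting and example scores are assumptions, not Tango's actual model.

def combined_similarity(visual_sim, textual_sim, weight=0.5):
    """Linear combination of two similarity scores in [0, 1]."""
    return weight * visual_sim + (1 - weight) * textual_sim

# Rank existing reports against a newly submitted video-based report.
candidates = {"report-12": (0.91, 0.40), "report-37": (0.55, 0.88)}
ranked = sorted(candidates.items(),
                key=lambda kv: combined_similarity(*kv[1]),
                reverse=True)
print(ranked)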
GIFdroid: An Automated Light-weight Tool for Replaying Visual Bug Reports
  • Sidong Feng, Chunyang Chen
  • Computer Science
    2022 IEEE/ACM 44th International Conference on Software Engineering: Companion Proceedings (ICSE-Companion)
  • 2022
TLDR
GIFdroid is proposed as a lightweight approach to automatically replay the execution trace from visual bug reports; it adopts image-processing techniques to extract keyframes from the recording, maps them to states in the GUI transition graph, and generates the execution trace of those states to trigger the bug.
AndroR2: A Dataset of Manually-Reproduced Bug Reports for Android Apps
TLDR
This paper presents AndroR2: a dataset of 90 manually reproduced bug reports for Android apps listed on Google Play and hosted on GitHub, systematically collected via an in-depth analysis of 459 reports extracted from the GitHub issue tracker.
Efficient Search of Live-Coding Screencasts from Online Videos
TLDR
A tool named PSFinder is developed that leverages a classifier to identify whether a video frame contains an IDE window and determines whether a video is a live-coding screencast based on the frames classified as containing one.
Automated Recording and Semantics-Aware Replaying of High-Speed Eye Tracking and Interaction Data to Support Cognitive Studies of Software Engineering Tasks
TLDR
Results show that Déjà Vu can play back 100% of the data recordings, correctly mapping the gaze to corresponding elements, making it a well-founded and suitable post-processing step for future eye-tracking studies in software engineering.
...

References

SHOWING 1-10 OF 62 REFERENCES
V2S: A Tool for Translating Video Recordings of Mobile App Usages into Replayable Scenarios
TLDR
This paper presents Video2Scenario (V2S), an automated tool that processes video recordings of Android app usages, utilizes neural object detection and image classification techniques to classify the depicted user actions, and translates these actions into a replayable scenario.
Record and replay for Android: are we there yet in industrial cases?
TLDR
A comparison of popular record-and-replay tools from researchers and practitioners is presented by applying these tools to test three popular industrial apps downloaded from the Google Play store, in order to better understand the strengths and weaknesses of these tools.
Analyzing mobile application usage: generating log files from mobile screen recordings
TLDR
This paper combines long-term log-file analysis and short-term screen-recording analysis by utilizing existing computer vision and machine learning methods, and shows that the method provides detailed data about application use and can work with low-quality video under certain circumstances.
Versatile yet lightweight record-and-replay for Android
TLDR
This work proposes a novel, stream-oriented record-and-replay approach which achieves high accuracy and low overhead by aiming at a sweet spot: recording and replaying sensor and network input, event schedules, and inter-app communication via intents.
Crowdsourcing user reviews to support the evolution of mobile apps
Efficiently, effectively detecting mobile app bugs with AppDoctor
TLDR
AppDoctor is a system for efficiently and effectively testing apps against many system and user actions and for helping developers diagnose the resulting bug reports; it quickly screens for potential bugs using approximate execution, which runs much faster than real execution and exposes bugs, but may cause false positives.
RERAN: Timing- and touch-sensitive record and replay for Android
TLDR
This work presents an approach and tool named RERAN that permits record-and-replay for the Android smartphone platform, and demonstrates its applicability in a variety of scenarios, including replaying 86 out of the top 100 Android apps on Google Play.
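The underlying record-and-replay mechanism, capturing kernel input events and re-injecting them with their original timing, can be sketched as below. This is a simplified illustration rather than RERAN itself, using the output format of adb shell getevent -t for recording and adb shell sendevent for replay.

import re
import subprocess
import time

# Matches lines like: [  12345.678901] /dev/input/event2: 0003 0035 000001e4
EVENT_RE = re.compile(
    r"\[\s*(?P<ts>[\d.]+)\]\s+(?P<dev>/dev/input/event\d+):"
    r"\s+(?P<type>[0-9a-f]+)\s+(?P<code>[0-9a-f]+)\s+(?P<value>[0-9a-f]+)")

def parse_trace(getevent_output):
    """Parse getevent -t output into (timestamp, device, type, code, value)."""
    events = []
    for line in getevent_output.splitlines():
        m = EVENT_RE.search(line)
        if m:
            events.append((float(m["ts"]), m["dev"],
                           int(m["type"], 16), int(m["code"], 16),
                           int(m["value"], 16)))
    return events

def replay(events):
    """Re-inject events via sendevent, sleeping to preserve inter-event timing."""
    prev_ts = None
    for ts, dev, etype, code, value in events:
        if prev_ts is not None:
            time.sleep(max(0.0, ts - prev_ts))
        subprocess.run(["adb", "shell", "sendevent", dev,
                        str(etype), str(code), str(value)], check=True)
        prev_ts = ts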
Auto-completing bug reports for Android applications
TLDR
The results demonstrate that, compared to traditional issue-tracking systems, FUSION both effectively facilitates reporting and allows for more reliable reproduction of bugs from reports by presenting more detailed contextual app information.
scvRipper: Video Scraping Tool for Modeling Developers' Behavior Using Interaction Data
TLDR
A computer-vision-based video-scraping tool that can automatically transcribe a screen-captured video into time-series interaction data according to the analyst's needs, addressing the increasing demand for automatic behavioral data collection methods in studies of human aspects of software engineering.
Mosaic: cross-platform user-interaction record and replay for the fragmented android ecosystem
TLDR
This paper presents Mosaic, a cross-platform, timing-accurate record-and-replay tool for Android-based mobile devices that overcomes device fragmentation through a novel virtual screen abstraction, allowing user interaction traces to be recorded on emulators, smartphones, tablets, and development boards and replayed on other devices.
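The virtual-screen idea, decoupling recorded coordinates from any particular device resolution, can be illustrated with simple normalization; this is a sketch of the general concept, not Mosaic's implementation.

def to_virtual(x, y, width, height):
    """Map device pixels to resolution-independent [0, 1] coordinates."""
    return x / width, y / height

def from_virtual(vx, vy, width, height):
    """Map virtual coordinates back to pixels on the target device."""
    return round(vx * width), round(vy * height)

# e.g., a tap recorded at (540, 960) on a 1080x1920 phone replays
# at (360, 640) on a 720x1280 device.
vx, vy = to_virtual(540, 960, 1080, 1920)
print(from_virtual(vx, vy, 720, 1280))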
...