Action Detection with Improved Dense Trajectories and Sliding Window

Abstract

In this paper we describe an action/interaction detection system based on improved dense trajectories [20], multiple visual descriptors, and a bag-of-features representation. Because the actions/interactions are not mutually exclusive, we train a binary classifier for each predefined action/interaction. We rely on a non-overlapping temporal sliding window for temporal localization. We tested our system on the ChaLearn Looking at People Challenge 2014 Track 2 dataset [1, 2], obtaining an average overlap of 0.4226, which placed 3rd in that track of the challenge. Finally, we provide an extensive analysis of the system's performance on different actions and suggest possible ways to improve a general action detection system.
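The combination of per-action binary classifiers and a non-overlapping temporal sliding window can be illustrated with a minimal sketch. This is not the authors' code: the window size, threshold, and per-label scoring functions are hypothetical stand-ins for the trajectory-feature extraction and trained classifiers described in the paper.

```python
def sliding_window_detect(num_frames, window_size, score_fns, threshold=0.5):
    """Split [0, num_frames) into non-overlapping windows and return, for
    each action label, the windows whose classifier score exceeds `threshold`.

    score_fns: dict mapping action label -> function(start, end) -> score,
    a stand-in for a bag-of-features + binary classifier pipeline.
    """
    detections = {label: [] for label in score_fns}
    for start in range(0, num_frames, window_size):  # windows do not overlap
        end = min(start + window_size, num_frames)
        # Every label is scored independently, since actions may co-occur.
        for label, score in score_fns.items():
            if score(start, end) >= threshold:
                detections[label].append((start, end))
    return detections

# Toy usage with dummy scorers: "wave" fires in frames [0, 60), "hug" in [60, 120).
scorers = {
    "wave": lambda s, e: 1.0 if e <= 60 else 0.0,
    "hug":  lambda s, e: 1.0 if s >= 60 else 0.0,
}
print(sliding_window_detect(120, 30, scorers))
# → {'wave': [(0, 30), (30, 60)], 'hug': [(60, 90), (90, 120)]}
```

Because each window is scored once per label, overlapping activities in the same time span simply produce detections under multiple labels, which matches the non-mutually-exclusive setting above.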

DOI: 10.1007/978-3-319-16178-5_38

Cite this paper

@inproceedings{Shu2014ActionDW,
  title     = {Action Detection with Improved Dense Trajectories and Sliding Window},
  author    = {Zhixin Shu and Kiwon Yun and Dimitris Samaras},
  booktitle = {ECCV Workshops},
  year      = {2014}
}