Hollywood in Homes: Crowdsourcing Data Collection for Activity Understanding

@article{Sigurdsson2016HollywoodIH,
  title={Hollywood in Homes: Crowdsourcing Data Collection for Activity Understanding},
  author={Gunnar A. Sigurdsson and G{\"u}l Varol and Xiaolong Wang and Ali Farhadi and Ivan Laptev and Abhinav Gupta},
  journal={ArXiv},
  year={2016},
  volume={abs/1604.01753}
}
Computer vision has great potential to help in our daily lives: searching for lost keys, watering flowers, or reminding us to take a pill. To succeed at such tasks, computer vision methods need to be trained on real and diverse examples of our daily dynamic scenes. Because most such scenes are not particularly exciting, they typically do not appear on YouTube, in movies, or in TV broadcasts. So how do we collect sufficiently many diverse but boring samples representing our lives? We propose a…

Citations

Publications citing this paper.
SHOWING 1-8 OF 250 CITATIONS

Forecasting Future Sequence of Actions to Complete an Activity

VIEW 10 EXCERPTS
CITES BACKGROUND & METHODS
HIGHLY INFLUENCED

Multi-Scale Based Context-Aware Net for Action Detection

VIEW 7 EXCERPTS
CITES METHODS & BACKGROUND
HIGHLY INFLUENCED

PCPCAD: Proposal Complementary Action Detector

VIEW 9 EXCERPTS
CITES METHODS
HIGHLY INFLUENCED

Weakly Supervised Gaussian Networks for Action Detection

VIEW 10 EXCERPTS
CITES BACKGROUND & METHODS
HIGHLY INFLUENCED

Interpretable representation learning for visual intelligence

VIEW 9 EXCERPTS
CITES METHODS & BACKGROUND
HIGHLY INFLUENCED

From Lifestyle Vlogs to Everyday Interactions

VIEW 13 EXCERPTS
CITES METHODS & BACKGROUND
HIGHLY INFLUENCED

Predictive-Corrective Networks for Action Detection

VIEW 6 EXCERPTS
CITES METHODS
HIGHLY INFLUENCED

TALL: Temporal Activity Localization via Language Query

VIEW 13 EXCERPTS
CITES METHODS & BACKGROUND
HIGHLY INFLUENCED

CITATION STATISTICS

  • 66 Highly Influenced Citations

  • Averaged 78 citations per year from 2017 through 2019

  • 32% increase in citations per year in 2019 over 2018

References

Publications referenced by this paper.
SHOWING 1-9 OF 42 REFERENCES

ActivityNet: A large-scale video benchmark for human activity understanding

VIEW 8 EXCERPTS
HIGHLY INFLUENTIAL

THUMOS challenge: Action recognition with a large number of classes

  • A. Gorban, H. Idrees, +4 authors R. Sukthankar
  • http://www.thumos.info/
  • 2015
VIEW 9 EXCERPTS
HIGHLY INFLUENTIAL

Large-Scale Video Classification with Convolutional Neural Networks

VIEW 10 EXCERPTS
HIGHLY INFLUENTIAL

Action Recognition with Improved Trajectories

VIEW 3 EXCERPTS
HIGHLY INFLUENTIAL

HMDB: A large video database for human motion recognition

VIEW 9 EXCERPTS
HIGHLY INFLUENTIAL

Recognizing realistic actions from videos “in the wild”

VIEW 8 EXCERPTS
HIGHLY INFLUENTIAL

A database for fine grained activity detection of cooking activities

VIEW 6 EXCERPTS
HIGHLY INFLUENTIAL

Anonymous submission under review

  • Anonymous
  • Anonymous conference
  • 2016
VIEW 2 EXCERPTS

A Dataset for Movie Description

VIEW 3 EXCERPTS