Corpus ID: 43117699

Action Search: Learning to Search for Human Activities in Untrimmed Videos

@article{Alwassel2017ActionSL,
  title={Action Search: Learning to Search for Human Activities in Untrimmed Videos},
  author={Humam Alwassel and Fabian Caba Heilbron and Bernard Ghanem},
  journal={ArXiv},
  year={2017},
  volume={abs/1706.04269}
}
  • Traditional approaches for action detection use trimmed data to learn sophisticated action detector models. Although these methods have achieved great success at detecting human actions, we argue that a large amount of information is discarded by ignoring the process through which this trimmed data is obtained. In this paper, we propose Action Search, a novel approach that mimics the way people annotate activities in video sequences. Using a Recurrent Neural Network, Action Search can efficiently explore…
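The abstract describes an RNN that iteratively decides which temporal location of an untrimmed video to observe next, rather than scanning every frame. A minimal sketch of that search loop is below; the network sizes, random weights, and the `step`/`action_search` helpers are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: per-frame feature size and recurrent hidden-state size.
FEAT_DIM, HID_DIM = 8, 16

# Randomly initialized weights stand in for a trained recurrent model.
W_in = rng.normal(0, 0.1, (HID_DIM, FEAT_DIM + HID_DIM))
W_out = rng.normal(0, 0.1, (1, HID_DIM))

def step(feature, hidden):
    """One recurrent step: fold the observed frame feature into the hidden
    state, then predict the next normalized location in [0, 1] to visit."""
    hidden = np.tanh(W_in @ np.concatenate([feature, hidden]))
    next_loc = 1.0 / (1.0 + np.exp(-(W_out @ hidden)[0]))  # sigmoid
    return next_loc, hidden

def action_search(video_feats, start=0.5, max_steps=10):
    """Hop through an untrimmed video, observing one frame per step,
    instead of exhaustively sliding a window over all frames."""
    n = len(video_feats)
    loc, hidden = start, np.zeros(HID_DIM)
    visited = []
    for _ in range(max_steps):
        idx = min(int(loc * n), n - 1)   # map [0, 1] location to a frame index
        visited.append(idx)
        loc, hidden = step(video_feats[idx], hidden)
    return visited

# A fake untrimmed video: 300 frames of random features.
video = rng.normal(size=(300, FEAT_DIM))
print(action_search(video))
```

The key efficiency claim is visible in the loop structure: the cost is the number of search steps (here 10), not the video length (here 300 frames).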
