In this work, we present a general compositional vector framework for transition-based dependency parsing. The use of transition-based algorithms extends vector composition to the large set of languages for which only dependency treebanks are available, and handles linguistic phenomena, such as non-projectivity, that pose problems for previously proposed methods. We introduce the concept of a Transition Directed Acyclic Graph, which allows us to apply Recursive Neural Networks for parsing with existing transition-based algorithms. Our framework captures semantic relatedness between phrases similarly to a constituency-based counterpart from the literature, for example predicting that “a financial crisis”, “a cash crunch” and “a bear market” are semantically similar. Currently, a parser based on our framework achieves an Unlabelled Attachment Score of 86.25% on a well-established dependency dataset using only word representations as input, falling less than 2 percentage points short of a previously proposed comparable feature-based model.