The objective of this research is to measure how much syntactic information (in the form of word order) and coreference resolution affect the results of Automated Essay Scoring (AES) based on Latent Semantic Analysis (LSA). To incorporate syntactic information, we use Syntactically Enhanced LSA (SELSA), while the Stanford CoreNLP Natural Language Processing Toolkit performs coreference resolution. To evaluate the results, we calculate the average absolute difference between the system score and the human score for each essay. The results show that syntactic information combined with coreference resolution does not agree with human scores more closely than plain LSA (an average absolute difference of 0.15748 as opposed to LSA's 0.12597). Interestingly, however, the two techniques perform better when used together than when used separately. We also develop a new scoring algorithm that achieves a better average absolute difference, as low as 0.08969.
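The evaluation metric named above can be sketched as follows. This is a minimal illustration, not the paper's implementation; it assumes system and human scores have already been placed on the same scale, and the example scores are hypothetical.

```python
def average_absolute_difference(system_scores, human_scores):
    """Mean of |system score - human score| over all essays."""
    assert len(system_scores) == len(human_scores), "one score pair per essay"
    n = len(system_scores)
    return sum(abs(s - h) for s, h in zip(system_scores, human_scores)) / n

# Hypothetical scores for three essays:
diff = average_absolute_difference([3.0, 4.5, 2.0], [3.5, 4.0, 2.0])
print(diff)  # prints 0.3333333333333333
```

A lower value indicates closer agreement with the human raters, which is why 0.08969 is reported as an improvement over 0.12597.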