The appeal of Simulation-Based Training for unmanned Intelligence, Surveillance, and Reconnaissance tasks has prompted the testing of instructional strategies for a kinesic cue detection task. Early evidence of training effectiveness for this task comes from performance and self-report measures. The surveys collected cover users' technology acceptance, immersion, intrinsic motivation, stress, workload, and demographics. This paper reviews these detection task measures in light of one instructional strategy, Kim's Game. A cross-scale analysis of the collected measures reveals strong correlations among several subscales. An investigation of potential predictors of performance shows that weekly computer use is a statistically significant predictor of a user's Posttest Median Response Time for behavior cue detection. Recommendations for future initiatives include adding feedback, reconsidering the emphasis on increasing immersion, and comparing results against other instructional strategies.