Craig M. Vineyard

Encoding sensor observations across time is a critical component in the ability to model cognitive processes. All biological cognitive systems receive sensory stimuli as continuous streams of observed data over time. Therefore, the perceptual grounding of all biological cognitive processing is in temporal semantic encodings, where the particular grounding …
We propose a neurologically plausible computational architecture to model human episodic memory and recall based on cortical-hippocampal structure and function. The model design is inspired by neuroscience findings and categorical neural semantic theory. Fuzzy Adaptive Resonance Theory (ART) and temporal integration are used to form episodic …
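The abstract above names Fuzzy Adaptive Resonance Theory as the clustering substrate for forming episodic representations. As context for readers unfamiliar with the method, here is a minimal sketch of the standard Fuzzy ART category-learning loop (complement coding, choice function, vigilance test, fast learning); it follows the textbook formulation, not the specific architecture in the paper, and the class and parameter names are illustrative.

```python
import numpy as np

def complement_code(x):
    """Fuzzy ART inputs are complement-coded: [x, 1 - x], so |I| is constant."""
    x = np.asarray(x, dtype=float)
    return np.concatenate([x, 1.0 - x])

class FuzzyART:
    """Minimal Fuzzy ART sketch.

    rho   -- vigilance (match threshold)
    alpha -- choice parameter
    beta  -- learning rate (beta = 1.0 is "fast learning")
    """
    def __init__(self, rho=0.75, alpha=0.001, beta=1.0):
        self.rho, self.alpha, self.beta = rho, alpha, beta
        self.w = []  # one weight vector per committed category

    def learn(self, x):
        i = complement_code(x)
        # Choice function: score every committed category
        scores = [np.minimum(i, w).sum() / (self.alpha + w.sum())
                  for w in self.w]
        # Search categories in order of decreasing choice score
        for j in sorted(range(len(self.w)), key=lambda k: -scores[k]):
            match = np.minimum(i, self.w[j]).sum() / i.sum()
            if match >= self.rho:
                # Resonance: update the winning category toward i ∧ w_j
                self.w[j] = (self.beta * np.minimum(i, self.w[j])
                             + (1.0 - self.beta) * self.w[j])
                return j
        # No category passed the vigilance test: commit a new one
        self.w.append(i.copy())
        return len(self.w) - 1
```

With high vigilance the network commits many narrow categories; lowering `rho` yields fewer, broader ones — the stability/plasticity trade-off that makes ART attractive for online episodic encoding.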
Neural machine learning methods, such as deep neural networks (DNN), have achieved remarkable success in a number of complex data processing tasks. These methods have arguably had their strongest impact on tasks such as image and audio processing – data processing domains in which humans have long held clear advantages over conventional algorithms. In …
The field of machine learning strives to develop algorithms that, through learning, lead to generalization; that is, the ability of a machine to perform a task that it was not explicitly trained for. An added challenge arises when the problem domain is dynamic or non-stationary, with the data distributions or categorizations changing over time. This …
The field of machine learning strives to develop algorithms that, through learning, lead to generalization; that is, the ability of a machine to perform a task that it was not explicitly trained for. Numerous approaches have been developed, ranging from neural network models striving to replicate neurophysiology to more abstract mathematical manipulations …
Through various means of structural and synaptic plasticity enabling online learning, neural networks are constantly reconfiguring their computational functionality. Neural information content is embodied within the configurations, representations, and computations of neural networks. To explore neural information content, we have developed metrics and …
Amidst the rising impact of machine learning and the popularity of deep neural networks, learning theory is not a solved problem. With the emergence of neuromorphic computing as a means of addressing the von Neumann bottleneck, it is not simply a matter of employing existing algorithms on new hardware technology; rather, richer theory is needed to guide …