Corpus ID: 219636235

Deep Learning-based Stress Determinator for Mouse Psychiatric Analysis using Hippocampus Activity

@article{Liu2020DeepLS,
  title={Deep Learning-based Stress Determinator for Mouse Psychiatric Analysis using Hippocampus Activity},
  author={Donghan Liu and Benjamin C. M. Fung and Tak Pan Wong},
  journal={ArXiv},
  year={2020},
  volume={abs/2006.06862}
}
Decoding neurons, i.e., extracting information from their transmissions and putting it to further use, is a central goal of neuroscience. Because the field still relies largely on traditional analysis methods, we combine state-of-the-art deep learning techniques with the theory of neuron decoding to explore what such a combination can accomplish. In addition, the stress level associated with neuronal activity in the hippocampus is examined statistically. The experiments suggest that our…
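
The abstract describes classifying stress from hippocampal neuron activity with a deep network. The following is a minimal sketch of that kind of pipeline under stated assumptions, not the authors' released code: the `StressClassifier` name, input shapes, and layer sizes are all illustrative.

```python
# Minimal sketch (not the authors' code): an LSTM mapping a window of hippocampal
# neuron activity traces to a binary stress label. Shapes and sizes are assumptions.
import torch
import torch.nn as nn

class StressClassifier(nn.Module):
    def __init__(self, n_neurons: int, hidden: int = 64):
        super().__init__()
        # Each time step is a vector of activity values, one per recorded neuron.
        self.lstm = nn.LSTM(input_size=n_neurons, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)  # stressed vs. non-stressed

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, n_neurons) window of activity traces
        _, (h, _) = self.lstm(x)
        return self.head(h[-1])  # logits over the two stress classes

if __name__ == "__main__":
    model = StressClassifier(n_neurons=100)
    traces = torch.randn(8, 200, 100)          # 8 windows, 200 frames, 100 neurons
    labels = torch.randint(0, 2, (8,))
    loss = nn.CrossEntropyLoss()(model(traces), labels)
    loss.backward()                             # standard supervised training step
```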


References

Showing 1-10 of 44 references

Efficient and accurate extraction of in vivo calcium signals from microendoscopic video data

A new constrained matrix factorization approach is described that accurately separates the background and then demixes and denoises the neuronal signals of interest; it substantially improves the quality of extracted cellular signals and detects more well-isolated neural signals, especially in noisy data regimes.
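
As an illustration of the demixing idea only, the sketch below runs plain non-negative matrix factorization on a pixels-by-frames movie matrix. The cited method (CNMF-E) additionally models a structured background and imposes spatial/temporal constraints; the synthetic data and sizes here are assumptions.

```python
# Illustrative stand-in, not CNMF-E: factor a movie Y ~ A @ C, where A holds
# spatial footprints (pixels x cells) and C holds temporal traces (cells x frames).
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
n_pixels, n_frames, n_cells = 1024, 500, 10

# Synthetic movie: sparse nonnegative footprints times nonnegative traces, plus noise.
A_true = rng.random((n_pixels, n_cells)) * (rng.random((n_pixels, n_cells)) > 0.9)
C_true = np.abs(rng.standard_normal((n_cells, n_frames)))
Y = A_true @ C_true + 0.05 * np.abs(rng.standard_normal((n_pixels, n_frames)))

model = NMF(n_components=n_cells, init="nndsvda", max_iter=300)
A_hat = model.fit_transform(Y)      # estimated spatial footprints
C_hat = model.components_           # estimated temporal traces
print(A_hat.shape, C_hat.shape)
```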

A training algorithm for optimal margin classifiers

A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented. The technique is applicable to a wide variety of classification functions.
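
A small, hedged example of the margin-maximization idea, using scikit-learn's SVC as a later implementation of the same principle rather than the paper's original algorithm; the toy data are illustrative.

```python
# Margin-maximizing classifier on a toy 2-D problem (SVC, not the paper's algorithm).
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Two linearly separable clusters.
X = np.vstack([rng.normal(-2, 0.5, (50, 2)), rng.normal(2, 0.5, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

clf = SVC(kernel="linear", C=1.0).fit(X, y)
# The boundary w.x + b = 0 is placed at maximal distance (margin = 2/||w||)
# from the closest training points, which become the support vectors.
w = clf.coef_[0]
print("margin width:", 2 / np.linalg.norm(w), "support vectors:", len(clf.support_))
```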

A Comparative Analysis of Forecasting Financial Time Series Using ARIMA, LSTM, and BiLSTM

The results show that BiLSTM-based models, which perform additional training passes over the data, offer better predictions than regular LSTM-based models.
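
To make the LSTM/BiLSTM distinction concrete, here is a minimal PyTorch sketch of a one-step-ahead forecaster where the only change is `bidirectional=True`; the layer sizes and window setup are assumptions, not the models from the cited comparison.

```python
# Minimal sketch of the LSTM vs. BiLSTM distinction; sizes are illustrative.
import torch
import torch.nn as nn

class Forecaster(nn.Module):
    def __init__(self, bidirectional: bool):
        super().__init__()
        self.rnn = nn.LSTM(input_size=1, hidden_size=32,
                           batch_first=True, bidirectional=bidirectional)
        # A BiLSTM processes each window forward and backward, doubling the features.
        self.out = nn.Linear(32 * (2 if bidirectional else 1), 1)

    def forward(self, x):                 # x: (batch, window, 1)
        h, _ = self.rnn(x)
        return self.out(h[:, -1])         # one-step-ahead prediction

window = torch.randn(16, 30, 1)           # 16 series windows of length 30
print(Forecaster(False)(window).shape, Forecaster(True)(window).shape)
```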

PyTorch: An Imperative Style, High-Performance Deep Learning Library

This paper details the principles that drove the implementation of PyTorch and how they are reflected in its architecture, and explains how the careful and pragmatic implementation of the key components of its runtime enables them to work together to achieve compelling performance.
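
A short example of the imperative, define-by-run style the PyTorch paper describes: ordinary Python control flow runs eagerly, and autograd records operations as they execute.

```python
# Define-by-run in practice: a data-dependent loop that a static graph could not express.
import torch

x = torch.randn(3, requires_grad=True)
y = x * 2
while y.norm() < 100:        # number of iterations depends on the data
    y = y * 2
loss = y.sum()
loss.backward()              # gradients flow through however many iterations ran
print(x.grad)
```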

I-MAD: A Novel Interpretable Malware Detector Using Hierarchical Transformer

This work proposes an Interpretable Malware Detector (I-MAD), which achieves state-of-the-art performance on static malware detection with excellent interpretability and integrates a hierarchical Transformer network that can understand assembly code at the basic block, function, and executable levels.
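
A hedged sketch of the hierarchical idea only, not the I-MAD implementation: a lower Transformer encodes instruction tokens within each basic block, and an upper Transformer combines the block embeddings into an executable-level decision. Vocabulary size, dimensions, and pooling choices are assumptions.

```python
# Hierarchical Transformer sketch: block-level encoder feeding an executable-level encoder.
import torch
import torch.nn as nn

class HierarchicalEncoder(nn.Module):
    def __init__(self, vocab: int = 1000, d: int = 64):
        super().__init__()
        self.embed = nn.Embedding(vocab, d)
        layer = lambda: nn.TransformerEncoderLayer(d_model=d, nhead=4, batch_first=True)
        self.block_enc = nn.TransformerEncoder(layer(), num_layers=2)   # within a basic block
        self.exe_enc = nn.TransformerEncoder(layer(), num_layers=2)     # across blocks
        self.head = nn.Linear(d, 2)                                     # benign vs. malware

    def forward(self, tokens):            # tokens: (n_blocks, tokens_per_block) ids
        block_vecs = self.block_enc(self.embed(tokens)).mean(dim=1)     # one vector per block
        exe_vec = self.exe_enc(block_vecs.unsqueeze(0)).mean(dim=1)     # one vector per executable
        return self.head(exe_vec)

tokens = torch.randint(0, 1000, (12, 20))    # 12 basic blocks, 20 instruction tokens each
print(HierarchicalEncoder()(tokens).shape)   # torch.Size([1, 2])
```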

Application of a long short-term memory neural network: a burgeoning method of deep learning in forecasting HIV incidence in Guangxi, China

The LSTM model was more effective than other time-series models and is valuable for monitoring and controlling local HIV epidemics, whereas the GRNN and ES models forecast HIV incidence in Guangxi with relatively poor accuracy.

Incorporating Nesterov Momentum into Adam

Recurrent Neural Networks for Time Series Forecasting

A recurrent neural network based time series forecasting framework covering feature engineering, feature importances, point and interval predictions, and forecast evaluation is presented.
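
As a sketch of how point and interval predictions can come from one recurrent model, the example below trains three quantile outputs with the pinball (quantile) loss; this is a generic construction under assumed quantile levels and sizes, not the specific framework from the cited paper.

```python
# Point + interval forecasting via quantile outputs (generic sketch, illustrative sizes).
import torch
import torch.nn as nn

QUANTILES = torch.tensor([0.1, 0.5, 0.9])   # lower bound, point forecast, upper bound

class QuantileGRU(nn.Module):
    def __init__(self, hidden: int = 32):
        super().__init__()
        self.rnn = nn.GRU(input_size=1, hidden_size=hidden, batch_first=True)
        self.out = nn.Linear(hidden, len(QUANTILES))

    def forward(self, x):                    # x: (batch, window, 1)
        h, _ = self.rnn(x)
        return self.out(h[:, -1])            # (batch, 3) -> [q10, q50, q90]

def pinball_loss(pred, target):
    # Penalize under- and over-prediction asymmetrically for each quantile level.
    err = target.unsqueeze(1) - pred
    return torch.maximum(QUANTILES * err, (QUANTILES - 1) * err).mean()

model = QuantileGRU()
x, y = torch.randn(16, 30, 1), torch.randn(16)
loss = pinball_loss(model(x), y)
loss.backward()
```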

EA-LSTM: Evolutionary Attention-based LSTM for Time Series Prediction