We propose a modular neural-network structure for implementing the Bayesian framework for learning and inference. Our design has three main components: two for computing the priors and likelihoods based on observations, and one for applying Bayes' rule. Through comprehensive simulations, we show that the proposed model succeeds in implementing Bayesian …
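As a rough illustration of this three-component layout, the hypothetical sketch below uses a discrete hypothesis space: one placeholder module supplies a prior, another supplies the likelihood of an observation under each hypothesis, and a third applies Bayes' rule. The module contents (uniform prior, Gaussian likelihood, the `hypothesis_means` array) are assumptions made for illustration, not the paper's trained networks.

```python
import numpy as np

def prior_module(n_hypotheses):
    # Placeholder prior: uniform over hypotheses (a trained module would output this).
    return np.full(n_hypotheses, 1.0 / n_hypotheses)

def likelihood_module(observation, hypothesis_means, sigma=1.0):
    # Placeholder likelihood: Gaussian likelihood of the observation under each hypothesis.
    return np.exp(-0.5 * ((observation - hypothesis_means) / sigma) ** 2)

def bayes_module(prior, likelihood):
    # Bayes' rule: posterior is proportional to prior times likelihood, then normalized.
    unnormalized = prior * likelihood
    return unnormalized / unnormalized.sum()

hypothesis_means = np.array([-1.0, 0.0, 2.0])
posterior = bayes_module(prior_module(3), likelihood_module(0.5, hypothesis_means))
print(posterior)  # posterior probability of each hypothesis given the observation
```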
We use data extracted from many weblogs to identify the underlying relations among a set of companies in the Standard & Poor's (S&P) 500 index. We define a pairwise similarity measure for the companies based on the weblog articles and then apply a graph clustering procedure. We show that it is possible to capture some interesting relations between companies …
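A minimal sketch of this kind of pipeline, under simplifying assumptions: company similarity is taken as the cosine similarity of each company's mention pattern across weblog articles, and clustering is done with spectral clustering on the resulting affinity matrix. The toy `mentions` matrix and the specific similarity and clustering choices are illustrative, not necessarily those used in the paper.

```python
import numpy as np
from sklearn.metrics.pairwise import cosine_similarity
from sklearn.cluster import SpectralClustering

companies = ["AAPL", "MSFT", "XOM", "CVX"]
# Rows: companies, columns: weblog articles; entries: mention counts (toy data).
mentions = np.array([
    [3, 0, 2, 0, 1],
    [2, 1, 3, 0, 0],
    [0, 4, 0, 3, 2],
    [0, 3, 0, 4, 1],
])

similarity = cosine_similarity(mentions)  # pairwise company similarity
labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                            random_state=0).fit_predict(similarity)
for name, label in zip(companies, labels):
    print(name, label)  # companies with similar mention patterns share a cluster
```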
Bayesian models of cognition hypothesize that human brains make sense of data by representing probability distributions and applying Bayes' rule to find the best explanation for available data. Understanding the neural mechanisms underlying probabilistic models remains important because Bayesian models provide a computational framework, rather than …
Finding the sparse solution of an underdetermined system of linear equations (the so-called sparse recovery problem) has been extensively studied in the last decade because of its applications in many different areas. Consequently, many sparse recovery algorithms (and program codes) are now available. However, most of these algorithms have been developed for …
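For concreteness, the sketch below sets up an underdetermined system and recovers a sparse solution with one standard method, iterative soft thresholding (ISTA) for the l1-regularized least-squares formulation; it illustrates the problem setting rather than any particular algorithm surveyed here.

```python
import numpy as np

def ista(A, b, lam=0.1, n_iter=500):
    # Step size from the squared spectral norm of A (Lipschitz constant of the gradient).
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)
        z = x - grad / L
        # Soft thresholding promotes sparsity in the solution.
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))  # underdetermined: 20 equations, 50 unknowns
x_true = np.zeros(50)
x_true[[3, 17, 41]] = [1.5, -2.0, 0.8]
b = A @ x_true
x_hat = ista(A, b, lam=0.05)
print(np.nonzero(np.abs(x_hat) > 0.1)[0])  # indices of recovered nonzero entries
```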
In this paper, we consider a generalized multivariate regression problem where the responses are monotonic functions of linear transformations of predictors. We propose a semi-parametric algorithm based on the ordering of the responses which is invariant to the functional form of the transformation function. We prove that our algorithm, which maximizes the …
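The ordering-based idea can be illustrated as follows: because the link is monotonic, a candidate direction `beta` can be scored by how well `X @ beta` ranks the responses (here via Kendall's tau), independently of the link's functional form. The crude random search over unit-norm directions below is purely illustrative and is not the paper's estimator or its exact criterion.

```python
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(0)
n, p = 200, 3
X = rng.standard_normal((n, p))
beta_true = np.array([1.0, -2.0, 0.5])
y = np.tanh(X @ beta_true) + 0.1 * rng.standard_normal(n)  # unknown monotone link + noise

best_tau, best_beta = -np.inf, None
for _ in range(2000):
    beta = rng.standard_normal(p)
    beta /= np.linalg.norm(beta)          # direction is identifiable only up to scale
    tau, _ = kendalltau(X @ beta, y)      # rank agreement between index and responses
    if tau > best_tau:
        best_tau, best_beta = tau, beta

print(best_beta, best_tau)  # recovered direction, up to scale and sign
```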
We introduce a sparse multivariate regression algorithm which simultaneously performs dimensionality reduction and parameter estimation. We decompose the coefficient matrix into two sparse matrices: a long matrix mapping the predictors to a set of factors and a wide matrix estimating the responses from the factors. We impose an elastic net penalty on the …
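A minimal sketch of the decomposition, under illustrative assumptions: the p-by-q coefficient matrix is factored as U (p-by-k, the "long" factor) times V (k-by-q, the "wide" factor), and the two factors are updated alternately with elastic-net regressions after an SVD warm start. The update scheme and penalty placement are simplified stand-ins for the paper's algorithm.

```python
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(0)
n, p, q, k = 200, 10, 4, 2
U_true = rng.standard_normal((p, k)) * (rng.random((p, k)) < 0.4)  # sparse long factor
V_true = rng.standard_normal((k, q))
X = rng.standard_normal((n, p))
Y = X @ U_true @ V_true + 0.1 * rng.standard_normal((n, q))

# Warm start: rank-k SVD of the unpenalized least-squares coefficient matrix.
B0 = np.linalg.lstsq(X, Y, rcond=None)[0]              # shape (p, q)
Uu, s, Vt = np.linalg.svd(B0, full_matrices=False)
U, V = Uu[:, :k] * s[:k], Vt[:k, :]

def enet():
    return ElasticNet(alpha=0.01, l1_ratio=0.5, fit_intercept=False, max_iter=5000)

for _ in range(10):
    # V-step: elastic-net regression of each response on the factors F = X @ U.
    F = X @ U
    for j in range(q):
        V[:, j] = enet().fit(F, Y[:, j]).coef_
    # U-step: vec(X U V) = (V^T kron X) vec(U), so fit vec(U) on the Kronecker design.
    Z = np.kron(V.T, X)                                # shape (n*q, p*k)
    u_vec = enet().fit(Z, Y.flatten(order="F")).coef_
    U = u_vec.reshape((p, k), order="F")

print(np.round(U, 2))  # sparse long factor; all-zero rows drop the corresponding predictors
```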
We introduce an online, time-dependent clustering algorithm that employs a dynamic probabilistic topic model. The proposed algorithm can handle data that evolves over time and strives to capture the evolution of clusters in the dataset. It addresses the case where the entire dataset is not available at once (e.g., data streams) but an up-to-date …
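A simplified sketch of online, time-aware clustering by topic: documents arrive in batches, an online LDA model (scikit-learn's LatentDirichletAllocation with partial_fit) is updated per batch, and each document is assigned to its dominant topic as its current cluster label. This stands in for, and is much simpler than, the dynamic probabilistic topic model described above.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

batches = [
    ["stocks rally as markets rise", "tech stocks lead market gains"],
    ["heavy rain floods the city", "storm warning issued for the coast"],
    ["markets fall on rate fears", "flood damage estimates rise after storm"],
]

# A fixed vocabulary is needed for streaming; here we build it from all batches for
# simplicity, whereas a real stream would use a predefined or hashed vocabulary.
vectorizer = CountVectorizer().fit([doc for batch in batches for doc in batch])
lda = LatentDirichletAllocation(n_components=2, learning_method="online", random_state=0)

for t, batch in enumerate(batches):
    X = vectorizer.transform(batch)
    lda.partial_fit(X)                        # update the topic model with the new batch
    labels = lda.transform(X).argmax(axis=1)  # dominant topic = current cluster label
    print(f"time {t}: {list(labels)}")
```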
We propose a constructive neural-network model composed of deterministic units that estimates and represents probability distributions from observable events, a phenomenon related to the concept of probability matching. We use a form of operant learning, where the underlying probabilities are learned from positive and negative reinforcements of the …
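One hypothetical way to realize learning from reinforcement alone is sketched below: on each trial the model predicts an outcome by sampling from its current weights, receives positive reinforcement if the prediction matches the observed event and negative reinforcement otherwise, and nudges the chosen weight toward the received reward. Each weight then tracks its outcome's probability, so the normalized weights approximately reproduce the environment's distribution (probability matching). The units and update rule are illustrative, not the paper's network.

```python
import numpy as np

rng = np.random.default_rng(0)
true_probs = np.array([0.6, 0.3, 0.1])  # hidden environment distribution
weights = np.full(3, 1.0 / 3)           # model's current outcome weights
eta = 0.01

for _ in range(20000):
    event = rng.choice(3, p=true_probs)                    # observed event
    prediction = rng.choice(3, p=weights / weights.sum())  # model's guess
    reward = 1.0 if prediction == event else 0.0           # operant feedback
    # Delta-rule update on the chosen outcome only: its weight tracks P(event == prediction).
    weights[prediction] += eta * (reward - weights[prediction])

print(np.round(weights / weights.sum(), 2))  # approximately matches true_probs
```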