We consider the problem of scoring Bayesian Network Classifiers (BNCs) on the basis of the conditional log-likelihood (CLL). Currently, optimization is usually performed in BN parameter space, but for perfect graphs (such as Naive Bayes, TANs and FANs) a mapping to an equivalent Logistic Regression (LR) model is possible, and optimization can be performed in …
In this paper we consider the first passage process of a spectrally negative Markov additive process (MAP). The law of this process is uniquely characterized by a certain matrix function, which plays a crucial role in fluctuation theory. We show how to identify this matrix using the theory of Jordan chains associated with analytic matrix functions. This …
In this note we identify a simple setup from which one may easily infer various decomposition results for queues with interruptions as well as càdlàg processes with certain secondary jump inputs. In particular, this can be done for processes with stationary or stationary and independent increments. It resulted from an attempt to understand these kinds of …
We study the first passage process of a spectrally negative Markov additive process (MAP). The focus is on the background Markov chain at the times of the first passage. This process is a Markov chain itself with a transition rate matrix Λ. Assuming time-reversibility we show that all the eigenvalues of Λ are real with algebraic and geometric multiplicities …
We present a new mapping from Bayesian Network Classifiers (BNCs) to Logistic Regression (LR) models. It associates with each BNC structure an LR specification with unconstrained parameter space. We prove that a BNC structure and its associated LR specification index exactly the same set of conditional distributions if and only if the BNC structure has a …