Corpus ID: 235266105

Kernel based Dirichlet sequences

Patrizia Berti, Emanuela Dreassi, Fabrizio Leisen, Luca Pratelli and Pietro Rigo
Let X = (X1, X2, …) be a sequence of random variables with values in a standard space (S, B). Suppose X1 ∼ ν and

P(X_{n+1} ∈ · | X1, …, Xn) = (θ ν(·) + Σ_{i=1}^n K(Xi)(·)) / (n + θ)  a.s.,

where θ > 0 is a constant, ν a probability measure on B, and K a random probability measure on B. Then, X is exchangeable whenever K is a regular conditional distribution for ν given any sub-σ-field of B. Under this assumption, X enjoys all the main properties of classical…
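As a rough illustrative sketch (not code from the paper), the predictive rule above can be sampled sequentially: at step n, draw from the base measure ν with probability θ/(n + θ), otherwise draw from K(Xi) for a uniformly chosen past observation Xi. The function names are mine; taking K(x) = δ_x (the identity kernel) recovers the classical Blackwell–MacQueen urn for a Dirichlet sequence.

```python
import random

def kernel_dirichlet_sequence(n, theta, sample_nu, sample_kernel):
    """Simulate X_1,...,X_n from the predictive scheme
    P(X_{k+1} in . | X_1..X_k) = (theta*nu(.) + sum_i K(X_i)(.)) / (k + theta).

    sample_nu():       draws one point from the base measure nu.
    sample_kernel(x):  draws one point from the kernel K(x).
    """
    xs = [sample_nu()]                     # X_1 ~ nu
    for k in range(1, n):
        if random.random() < theta / (k + theta):
            xs.append(sample_nu())         # fresh draw from nu, prob theta/(k+theta)
        else:
            xs.append(sample_kernel(random.choice(xs)))  # K(X_i), i uniform in 1..k
    return xs

# Special case K(x) = delta_x: the classical Polya-urn / Dirichlet sequence.
random.seed(0)
xs = kernel_dirichlet_sequence(200, theta=2.0,
                               sample_nu=lambda: random.gauss(0, 1),
                               sample_kernel=lambda x: x)
```

With the point-mass kernel, past values recur with positive probability, so the simulated sequence exhibits the ties characteristic of Dirichlet-process samples.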
1 Citation
A Central Limit Theorem for Predictive Distributions
Let S be a Borel subset of a Polish space and F the set of bounded Borel functions f : S → R. Let a_n(·) = P(X_{n+1} ∈ · | X1, …, Xn) be the n-th predictive distribution corresponding to a sequence (Xn) of S-valued…


Exchangeable Sequences Driven by an Absolutely Continuous Random Measure
Let S be a Polish space and (Xn : n ≥ 1) an exchangeable sequence of S-valued random variables. Let a_n(·) = P(X_{n+1} ∈ · | X1, …, Xn) be the predictive measure and a a random probability measure…
Ferguson Distributions Via Polya Urn Schemes
Let p be any finite positive measure on (the Borel sets of) a complete separable metric space X. We shall say that a random probability measure P* on X has a Ferguson distribution with parameter p if…
A class of models for Bayesian predictive inference
In a Bayesian framework, to make predictions on a sequence X1, X2, … of random observations, the inferrer needs to assign the predictive distributions σ_n(·) = P(X_{n+1} ∈ · | X1, …, Xn). In…
The two-parameter Poisson-Dirichlet distribution derived from a stable subordinator
The two-parameter Poisson-Dirichlet distribution, denoted PD(α, θ), is a probability distribution on the set of decreasing positive sequences with sum 1. The usual Poisson-Dirichlet distribution with…
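A minimal sketch, under standard assumptions (0 ≤ α < 1, θ > −α): the size-biased weights of PD(α, θ) can be generated by stick-breaking, with V_k ~ Beta(1 − α, θ + kα) and W_k = V_k ∏_{j<k}(1 − V_j). The function name is mine; note the raw stick-breaking weights are not sorted, whereas PD(α, θ) itself is their decreasing rearrangement.

```python
import random

def gem_weights(alpha, theta, n_atoms):
    """First n_atoms stick-breaking (GEM) weights associated with PD(alpha, theta):
    V_k ~ Beta(1 - alpha, theta + k*alpha),  W_k = V_k * prod_{j<k} (1 - V_j).

    Requires 0 <= alpha < 1 and theta > -alpha.
    """
    weights, stick = [], 1.0
    for k in range(n_atoms):
        v = random.betavariate(1 - alpha, theta + (k + 1) * alpha)
        weights.append(stick * v)          # break off fraction v of the remaining stick
        stick *= 1 - v                     # length of stick left over
    return weights

random.seed(1)
w = gem_weights(alpha=0.5, theta=1.0, n_atoms=100)
```

Setting α = 0 recovers the one-parameter GEM(θ) weights underlying the usual Poisson-Dirichlet distribution.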
The parameter in a Bayesian nonparametric problem is the unknown distribution P of the observation X. A Bayesian uses a prior distribution for P, and after observing X, solves the…
0–1 laws for regular conditional distributions
…is c.g. See [2, 3] and Theorem 1 of [4]. In other terms, not only everywhere properness is to be weakened into condition (2), but the latter holds if and only if A is c.g. under P. If A fails to be c.g.…
Mixtures of Dirichlet Processes with Applications to Bayesian Nonparametric Problems
…process. This paper extends Ferguson's result to cases where the random measure is a mixing distribution for a parameter which determines the distribution from which observations are made. The…
Limit theorems for a class of identically distributed random variables
A new type of stochastic dependence for a sequence of random variables is introduced and studied. Precisely, (Xn)n≥1 is said to be conditionally identically distributed (c.i.d.), with respect to a…