Distance Covariance for Stochastic Processes

Abstract

The distance covariance of two random vectors is a measure of their dependence. The empirical distance covariance and correlation can be used as statistical tools for testing whether two random vectors are independent. We propose analogs of the distance covariance for two stochastic processes defined on some interval. Their empirical analogs can be used to test the independence of the two processes.

The authors of this paper would like to congratulate Tomasz Rolski on his 70th birthday. We would like to express our gratitude for his longstanding contributions to applied probability theory as an author, editor, and organizer. Tomasz kept applied probability going in Poland and beyond even in difficult historical times. The applied probability community, including ourselves, has benefited greatly from his enthusiastic, energetic, and reliable work. Sto lat! Niech żyje nam! Zdrowia, szczęścia, pomyślności! (A hundred years! Long may he live! Health, happiness, prosperity!)

1991 Mathematics Subject Classification. Primary 62E20; Secondary 62G20, 62M99, 60F05, 60F25.

1. Distance covariance for processes on [0, 1]

We consider a real-valued stochastic process X = (X(t))_{t ∈ [0,1]} with sample paths in a measurable space S such that X is measurable as a map from its probability space into S. We assume that the probability measure P_X generated by X on S is uniquely determined by its finite-dimensional distributions. Examples include processes with continuous or càdlàg sample paths on [0, 1]. The probability measure P_X is then determined by the totality of the characteristic functions

\[
\varphi_X(x_k; s_k) = \varphi_X^{(k)}(x_k; s_k)
  = \int_S e^{\,i\,(s_1 f(x_1) + \cdots + s_k f(x_k))}\, P_X(df)\,, \qquad k \ge 1\,,
\]

where x_k = (x_1, ..., x_k)' ∈ [0, 1]^k and s_k = (s_1, ..., s_k) ∈ R^k. In particular, for two such processes X and Y, the measures P_X and P_Y coincide if and only if φ_X(x_k; s_k) = φ_Y(x_k; s_k) for all x_k ∈ [0, 1]^k, s_k ∈ R^k, k ≥ 1.

We now turn from the general question of identifying the distributions of X and Y to a more specific but related one: given two processes X and Y on [0, 1] with values in S as above and defined on the same probability space, we seek a means of verifying whether X and Y are independent. Motivated by the discussion above, we need to show that the joint law of (X, Y) on S × S, denoted by P_{X,Y}, coincides with the product measure P_X ⊗ P_Y. Assuming, once again, that a probability measure on S × S is determined by its finite-dimensional distributions (as is the case in the aforementioned examples), we need to show that the joint characteristic functions of (X, Y) factorize, i.e.,

\[
\varphi_{X,Y}(x_k; s_k, t_k)
  = \int_{S^2} e^{\,i \sum_{j=1}^{k} (s_j f(x_j) + t_j h(x_j))}\, P_{X,Y}(df, dh)
  = \varphi_X(x_k; s_k)\, \varphi_Y(x_k; t_k)\,,
  \qquad x_k \in [0, 1]^k,\ s_k, t_k \in R^k,\ k \ge 1. \tag{1.1}
\]
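Before the paper's process-level construction, it may help to recall the classical empirical distance covariance of Székely, Rizzo, and Bakirov for random vectors, on which the process analog builds. The following Python/NumPy sketch is illustrative only and is not taken from the paper; the function names are our own. It computes the biased (V-statistic) estimate of the squared distance covariance and the corresponding distance correlation from n paired observations via double-centered pairwise distance matrices.

import numpy as np


def _double_center(d):
    # Subtract row means and column means of a pairwise-distance matrix,
    # then add back the grand mean.
    return d - d.mean(axis=0, keepdims=True) - d.mean(axis=1, keepdims=True) + d.mean()


def distance_covariance(x, y):
    """Biased (V-statistic) estimate of the squared distance covariance.

    x, y : arrays with n rows holding n paired observations; 1-D inputs are
    treated as n scalar observations.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    if x.ndim == 1:
        x = x[:, None]
    if y.ndim == 1:
        y = y[:, None]
    if x.shape[0] != y.shape[0]:
        raise ValueError("x and y must contain the same number of observations")
    # Pairwise Euclidean distance matrices of the two samples.
    a = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    b = np.linalg.norm(y[:, None, :] - y[None, :, :], axis=-1)
    # Squared distance covariance = average entrywise product of the
    # double-centered distance matrices.
    return (_double_center(a) * _double_center(b)).mean()


def distance_correlation(x, y):
    """Empirical distance correlation, a number in [0, 1]."""
    dcov2 = distance_covariance(x, y)
    denom = np.sqrt(distance_covariance(x, x) * distance_covariance(y, y))
    return np.sqrt(max(dcov2, 0.0) / denom) if denom > 0 else 0.0

For random vectors with finite first moments the population distance covariance vanishes exactly when the vectors are independent, which is what makes its empirical version usable as a test statistic.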

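Equation (1.1) also suggests a practical way to probe independence of two processes observed on a common finite grid of points in [0, 1]: treat each discretized path as a vector in R^k and compare the empirical distance covariance of the paired paths with its permutation distribution. The sketch below is again illustrative rather than the paper's procedure; it reuses the distance_covariance function from the previous sketch, and the sample size, grid size, and random-walk example paths are arbitrary choices.

import numpy as np


def independence_permutation_test(x_paths, y_paths, n_perm=999, rng=None):
    """Permutation p-value for independence of two samples of discretized paths.

    x_paths, y_paths : arrays of shape (n, k) containing n paired sample paths,
    each observed on the same grid of k points in [0, 1].
    """
    rng = np.random.default_rng() if rng is None else rng
    observed = distance_covariance(x_paths, y_paths)
    n = x_paths.shape[0]
    exceed = 0
    for _ in range(n_perm):
        # Randomly re-pairing the paths destroys any dependence between X and Y.
        shuffled = y_paths[rng.permutation(n)]
        if distance_covariance(x_paths, shuffled) >= observed:
            exceed += 1
    p_value = (exceed + 1) / (n_perm + 1)
    return observed, p_value


# Illustration on two independent random-walk-style paths sampled at k grid points.
rng = np.random.default_rng(0)
n, k = 100, 50
x_paths = np.cumsum(rng.normal(size=(n, k)), axis=1) / np.sqrt(k)
y_paths = np.cumsum(rng.normal(size=(n, k)), axis=1) / np.sqrt(k)
print(independence_permutation_test(x_paths, y_paths, rng=rng))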
