Jeremy E. Cohen

In signal processing, tensor decompositions have gained popularity over the last decade. Meanwhile, the volume of data to be processed has drastically increased, which calls for novel methods to handle Big Data tensors. Since most of these large datasets come from physical measurements, which are intrinsically real and nonnegative, being able to compress […]
The Canonical Polyadic tensor Decomposition (CPD), also known as Candecomp/Parafac, is very useful in numerous scientific disciplines. Structured CPDs, i.e. those with Toeplitz, circulant, or Hankel factor matrices, are often encountered in signal processing applications. As pointed out below, specialized algorithms were recently proposed for estimating […]
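As a minimal illustration of the CP model this abstract refers to (not code from the paper), the sketch below builds a rank-R tensor from three factor matrices, one of which is Hankel-structured, and checks the standard unfolding identity T_(1) = A (B ⊙ C)^T, where ⊙ is the Khatri-Rao product. The sizes and random factors are arbitrary assumptions for the demo.

```python
import numpy as np
from scipy.linalg import hankel, khatri_rao

rng = np.random.default_rng(1)
I, J, K, R = 4, 5, 6, 2            # arbitrary dimensions and CP rank

A = rng.standard_normal((I, R))
B = rng.standard_normal((J, R))
g = rng.standard_normal(K + R - 1)
C = hankel(g[:K], g[K - 1:])       # Hankel-structured factor, shape (K, R)

# Rank-R CP tensor: T[i,j,k] = sum_r A[i,r] * B[j,r] * C[k,r]
T = np.einsum('ir,jr,kr->ijk', A, B, C)

# Mode-1 unfolding (C-order reshape, last index fastest) equals A (B ⊙ C)^T
ok = np.allclose(T.reshape(I, -1), A @ khatri_rao(B, C).T)
```

Structured algorithms exploit the fact that a factor such as C is fully determined by its generator vector g, so far fewer parameters need to be estimated.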
HAL is a multidisciplinary open access archive for the deposit and dissemination of scientific research documents, whether they are published or not. The documents may come from teaching and research institutions in France or abroad, or from public or private research centers.
A Bayesian framework is proposed to define flexible coupling models for joint tensor decompositions of multiple datasets. Under this framework, a natural formulation of the data fusion problem is to cast it as a joint maximum a posteriori (MAP) estimator. Data-driven scenarios of joint posterior distributions are provided, including general […]
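To make the MAP idea concrete (a generic textbook sketch, not the paper's coupled-tensor estimator): with a Gaussian likelihood and a Gaussian prior, the MAP estimate reduces to a ridge-regularized least-squares problem. The variances below are assumed known for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((20, 5))
x_true = rng.standard_normal(5)
y = A @ x_true + 0.1 * rng.standard_normal(20)   # noisy observations

sigma2, tau2 = 0.01, 1.0          # noise and prior variances (assumed known)
lam = sigma2 / tau2

# Gaussian likelihood + Gaussian prior => MAP solves (A^T A + lam*I) x = A^T y
x_map = np.linalg.solve(A.T @ A + lam * np.eye(5), A.T @ y)
```

In the coupled-decomposition setting, the same mechanism applies per factor: the coupling prior between datasets contributes a penalty term to each factor update.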
This paper gives an overview of the notations used in multiway array processing. We redefine the vectorization and matricization operators to comply with properties of the Kronecker product. The tensor product and Kronecker product are also represented by two different symbols, and it is shown how these notations lead to clearer expressions for multiway […]
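The kind of compatibility the abstract alludes to can be sketched with the classical identity vec(A X Bᵀ) = (B ⊗ A) vec(X), which holds when vec stacks columns (column-major order). A small NumPy check, with arbitrary sizes:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
X = rng.standard_normal((4, 5))
B = rng.standard_normal((6, 5))

# Column-major vectorization: stacks the columns of M into one vector
vec = lambda M: M.flatten(order="F")

lhs = vec(A @ X @ B.T)
rhs = np.kron(B, A) @ vec(X)
ok = np.allclose(lhs, rhs)
```

Note that NumPy's default row-major flatten would break this identity; the `order="F"` argument is what makes vec consistent with the Kronecker product.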
Taking subject variability into account in data mining is one of the great challenges of modern biomedical engineering. In EEG recordings, the assumption that time sources are exactly shared by multiple subjects, by multiple recordings of the same subject, or even by multiple instances of the sources in one recording is especially inaccurate. In this paper, we […]
To deal with large multimodal datasets, coupled canonical polyadic decompositions are used as an approximation model. In this paper, a joint compression scheme is introduced to reduce the dimensions of the dataset. Joint compression makes it possible to solve the approximation problem in a compressed domain using standard coupled decomposition algorithms. […]
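A minimal sketch of the joint-compression idea, in a simplified matrix setting rather than the paper's tensor setting: two datasets share a factor on a large mode, a single orthonormal basis for that mode is computed from the stacked data, and both datasets are projected onto it before any decomposition is run. Sizes and ranks are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
R = 3
A = rng.standard_normal((100, R))       # shared factor on the large mode
B1 = rng.standard_normal((40, R))
B2 = rng.standard_normal((50, R))
Y1, Y2 = A @ B1.T, A @ B2.T             # two coupled datasets

# Joint compression: one rank-R basis for the shared mode, from the stacked data
U, _, _ = np.linalg.svd(np.hstack([Y1, Y2]), full_matrices=False)
U = U[:, :R]
Z1, Z2 = U.T @ Y1, U.T @ Y2             # compressed datasets (R x 40, R x 50)
```

The coupled decomposition is then computed on the small matrices Z1 and Z2, and the shared factor is decompressed by left-multiplying with U; in the exact low-rank case no information is lost.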