Corpus ID: 231582876

Protecting Big Data Privacy Using Randomized Tensor Network Decomposition and Dispersed Tensor Computation

@article{Ong2021ProtectingBD,
  title={Protecting Big Data Privacy Using Randomized Tensor Network Decomposition and Dispersed Tensor Computation},
  author={Jenn-Bing Ong and Wee Keong Ng and Ivan Tjuawinata and Chao Li and Jielin Yang and Sai None Myne and Huaxiong Wang and Kwok-Yan Lam and C. C. Jay Kuo},
  journal={ArXiv},
  year={2021},
  volume={abs/2101.04194}
}
Data privacy is an important issue for organizations and enterprises to securely outsource data storage, sharing, and computation on clouds / fogs. However, data encryption is complicated in terms of the key management and distribution; existing secure computation techniques are expensive in terms of computational / communication cost and therefore do not scale to big data computation. Tensor network decomposition and distributed tensor computation have been widely used in signal processing and…
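The abstract's central primitive, tensor network decomposition, can be illustrated with a minimal tensor-train (TT) sketch. This is an assumption-laden illustration of the general TT-SVD technique, not the paper's randomized algorithm: each 3-way core could in principle be stored or computed by a different party, which is the intuition behind dispersed tensor computation.

```python
# Minimal sketch of tensor-train (TT) decomposition via sequential
# truncated SVDs (generic TT-SVD, NOT the paper's randomized variant).
import numpy as np

def tt_decompose(tensor, max_rank):
    """Split an N-way tensor into a train of 3-way cores by repeated SVD."""
    shape = tensor.shape
    cores, rank = [], 1
    mat = np.asarray(tensor, dtype=float)
    for k in range(len(shape) - 1):
        # Fold the current remainder into a matrix and truncate its SVD.
        mat = mat.reshape(rank * shape[k], -1)
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r = min(max_rank, len(s))
        cores.append(u[:, :r].reshape(rank, shape[k], r))
        mat = np.diag(s[:r]) @ vt[:r]   # carry the remainder forward
        rank = r
    cores.append(mat.reshape(rank, shape[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract the cores back into the full tensor."""
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=([-1], [0]))
    return out.reshape([c.shape[1] for c in cores])
```

With `max_rank` large enough to avoid truncation, `tt_reconstruct(tt_decompose(T, max_rank))` recovers `T` up to floating-point error; smaller ranks trade accuracy for compression.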

References

Showing 1-10 of 77 references
Privacy-Preserving Tensor Decomposition Over Encrypted Data in a Federated Cloud Environment
TLDR
This paper presents a novel privacy-preserving tensor decomposition approach over semantically secure encrypted big data, together with the first secure integer division and integer square root schemes over encrypted data.
Distributed Differentially Private Algorithms for Matrix and Tensor Factorization
TLDR
New and improved distributed and differentially private algorithms for two popular matrix and tensor factorization methods: principal component analysis and orthogonal tensor decomposition are designed.
Privacy-Preserving Tensor Factorization for Collaborative Health Data Analysis
TLDR
DPFact is proposed, a privacy-preserving collaborative tensor factorization method for computational phenotyping using EHR that embeds advanced privacy-preserving mechanisms with collaborative learning and is more accurate and communication-efficient than state-of-the-art baseline methods.
Secure Tensor Decomposition Using Fully Homomorphic Encryption Scheme
TLDR
Experimental results reveal that this approach can securely decompose tensor models, the mathematical model widely used in data-intensive applications, into a core tensor and truncated orthogonal bases.
Online and Differentially-Private Tensor Decomposition
TLDR
A careful perturbation analysis is derived in this paper, which significantly improves upon existing results and presents the first guarantees for an online tensor power method with a linear memory requirement.
Practical Secure Aggregation for Privacy-Preserving Machine Learning
TLDR
This protocol allows a server to compute the sum of large, user-held data vectors from mobile devices in a secure manner, and can be used, for example, in a federated learning setting, to aggregate user-provided model updates for a deep neural network.
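The secure-aggregation summary above can be sketched with pairwise additive masking, the core idea behind such protocols. This is a deliberately simplified illustration (no dropout handling, key agreement, or integer arithmetic as in the full protocol): paired users add and subtract a shared random mask, so the masks cancel in the server's sum while hiding each individual vector.

```python
# Minimal sketch of pairwise additive masking for secure aggregation.
# Simplified illustration only: real protocols use key agreement,
# finite-field arithmetic, and dropout recovery.
import numpy as np

def mask_updates(updates, seed=0):
    """Return masked copies of user vectors whose sum equals the true sum."""
    rng = np.random.default_rng(seed)
    masked = [np.asarray(u, dtype=float).copy() for u in updates]
    n = len(masked)
    for i in range(n):
        for j in range(i + 1, n):
            # Shared mask between users i and j: i adds it, j subtracts it,
            # so it vanishes from the aggregate but obscures both vectors.
            mask = rng.standard_normal(masked[i].shape)
            masked[i] += mask
            masked[j] -= mask
    return masked
```

The server, seeing only the masked vectors, can compute `sum(masked)` and obtain the true aggregate without learning any single user's update.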
Tensor Networks for Dimensionality Reduction and Large-scale Optimization: Part 2 Applications and Future Perspectives
TLDR
This monograph builds on Tensor Networks for Dimensionality Reduction and Large-scale Optimization by discussing tensor network models for super-compressed higher-order representation of data/parameters and cost functions, together with an outline of their applications in machine learning and data analytics.
Privacy-preserving data outsourcing in the cloud via semantic data splitting
Tensor Networks for Dimensionality Reduction and Large-scale Optimization: Part 1 Low-Rank Tensor Decompositions
TLDR
A focus is on the Tucker and tensor train (TT) decompositions and their extensions, and on demonstrating the ability of tensor networks to provide linearly or even super-linearly (e.g., logarithmically) scalable solutions, as illustrated in detail in Part 2 of this monograph.