Corpus ID: 237503590

On Decentralized Multi-Transmitter Coded Caching

@article{Mahmoudi2021OnDM,
  title={On Decentralized Multi-Transmitter Coded Caching},
  author={Mohammad Mahmoudi and Mohammad Javad Sojdeh and Seyed Pooya Shariatpanahi},
  journal={ArXiv},
  year={2021},
  volume={abs/2109.06867}
}
This paper investigates a setup consisting of multiple transmitters serving multiple cache-enabled clients through a linear network, which covers both wired and wireless transmission settings. We investigate decentralized coded caching scenarios in which there is either no cooperation or limited cooperation between the clients at the cache content placement phase. For the fully decentralized caching case (i.e., no cooperation), we analyze the performance of the system in terms of the Coding…
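For background on the fully decentralized setting, the classic Maddah-Ali–Niesen decentralized scheme (referenced below) has each user independently cache a random M/N fraction of every file, and coded multicasting in the delivery phase then reduces the load on the shared link. The following is a minimal sketch of that baseline rate expression for a single shared link; it is illustrative background only, not the multi-transmitter scheme analyzed in this paper:

```python
# Sketch of the Maddah-Ali--Niesen *decentralized* coded caching rate:
# K users, N files, cache size M files per user, one shared link.

def uncoded_rate(K: int, N: int, M: float) -> float:
    """Local caching only: each user still needs a (1 - M/N) fraction
    of its requested file delivered over the shared link."""
    return K * (1 - M / N)

def decentralized_rate(K: int, N: int, M: float) -> float:
    """Decentralized coded caching delivery rate (in file units):
    R(M) = (N/M - 1) * (1 - (1 - M/N)^K) for 0 < M <= N."""
    if M == 0:
        return float(K)  # no caches: every request is sent in full
    p = M / N  # probability a given bit is cached at a given user
    return (N / M - 1) * (1 - (1 - p) ** K)

if __name__ == "__main__":
    K, N, M = 10, 20, 5
    print(uncoded_rate(K, N, M))        # 7.5 file transmissions
    print(decentralized_rate(K, N, M))  # ~2.83: coded multicasting gain
```

For these example numbers (K = 10 users, N = 20 files, M = 5 files of cache per user), coding cuts the delivery load from 7.5 to roughly 2.83 file transmissions.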


References

Showing 1–10 of 15 references
Decentralized Multi-Antenna Coded Caching with Cyclic Exchanges
A cyclic-exchange protocol for efficient content delivery is proposed and shown to perform almost as well as the original multi-user broadcast scheme.
Multi-Server Coded Caching
The results suggest that, in networks with multiple servers, the type of network topology can be exploited to reduce service delay.
Physical-Layer Schemes for Wireless Coded Caching
The results convey the important message that although directly translating schemes from network coding ideas to wireless networks may work well at high SNR values, careful modifications are needed for acceptable finite-SNR performance.
Decentralized coded caching attains order-optimal memory-rate tradeoff
  • M. Maddah-Ali, Urs Niesen
  • Computer Science, Mathematics
  • 2013 51st Annual Allerton Conference on Communication, Control, and Computing (Allerton)
  • 2013
This paper proposes an efficient caching scheme, in which the content placement is performed in a decentralized manner, and hence achieves a rate close to the centralized scheme.
Finite-Length Analysis of Caching-Aided Coded Multicasting
This paper designs a new random placement and an efficient clique-cover-based delivery scheme that approximately achieves this lower bound, and provides tight concentration results showing that the average number of transmissions concentrates well, requiring a number of packets that is only polynomial in the remaining system parameters.
Order-Optimal Decentralized Coded Caching Schemes with Good Performance in Finite File Size Regime
Analytical results indicate that, as the file size grows to infinity, the proposed schemes achieve the same memory-load tradeoff as Maddah-Ali–Niesen's decentralized scheme, and hence are also order-optimal.
Multi-Antenna Interference Management for Coded Caching
The proposed schemes are shown to provide the same degrees of freedom at high signal-to-noise ratio (SNR) as the state-of-the-art methods, and to perform significantly better than several baseline schemes, especially in the finite-SNR regime.
Fundamental limits of caching
This paper proposes a novel caching approach that achieves a significantly larger reduction in peak rate than previously known caching schemes, and argues that the performance of the proposed scheme is within a constant factor of the information-theoretic optimum for all values of the problem parameters.
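For reference, the centralized scheme in this work achieves, in its standard setup with $K$ users, $N \ge K$ files, and a cache of $M$ files per user (assuming $KM/N$ is an integer), the well-known delivery rate

$$
R(M) \;=\; K\left(1-\frac{M}{N}\right)\cdot\frac{1}{1+KM/N},
$$

where the factor $1/(1+KM/N)$ is the global coded-caching gain over the uncoded rate $K(1-M/N)$.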
Coding for Caching in 5G Networks
Experimental results show that coding overhead does not significantly affect the promising performance gains of coded multicasting in small-scale real-world scenarios, practically validating its potential to become a key next-generation 5G technology.
Fundamental limits of cache-aided interference management
For a system comprising a library of files and a wireless network with an arbitrary number of transmitters and receivers, where each node is equipped with a local cache memory, the paper demonstrates that in this setting caches at the transmitters' side are as valuable as caches at the receivers' side, and shows that caching can offer a throughput gain that scales linearly with the size of the network.