Yair Toaff

Large backup and restore systems may have a petabyte or more of data in their repository. Such systems are often compressed by means of deduplication techniques that partition the input text into chunks and store recurring chunks only once. One approach is to use hashing methods to store a fingerprint for each data chunk, detecting identical chunks …
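A minimal sketch of this fingerprint-based deduplication idea, assuming fixed-size chunks and SHA-256 fingerprints (the chunking scheme and hash function in the paper itself may differ):

```python
import hashlib

def dedup_store(data: bytes, chunk_size: int = 4096):
    """Split data into fixed-size chunks and keep each distinct chunk once,
    keyed by its SHA-256 fingerprint. Returns the chunk store and the
    ordered list of fingerprints needed to restore the input."""
    store = {}   # fingerprint -> chunk bytes, each stored only once
    recipe = []  # ordered fingerprints for restore
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        fp = hashlib.sha256(chunk).hexdigest()
        store.setdefault(fp, chunk)  # a recurring chunk is not stored again
        recipe.append(fp)
    return store, recipe

def restore(store, recipe) -> bytes:
    """Reassemble the original data from the fingerprint recipe."""
    return b"".join(store[fp] for fp in recipe)
```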
The time efficiency of many storage systems relies critically on the ability to perform a large number of evaluations of certain hashing functions quickly. The remainder function B mod P, generally applied with a large prime number P, is often used as a building block of such hashing functions, which leads to the need to accelerate remainder …
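To illustrate the remainder function as a hashing building block, here is a small sketch that hashes a byte string B by interpreting it as a number in base 256 and reducing it modulo a large prime P via Horner's rule; the particular prime and radix below are placeholder choices, not values from the paper:

```python
# Horner's rule keeps every intermediate value below P, so the
# reduction B mod P needs no big-integer arithmetic.
P = (1 << 61) - 1  # a large Mersenne prime (placeholder choice)

def remainder_hash(b: bytes, p: int = P) -> int:
    h = 0
    for byte in b:
        h = (h * 256 + byte) % p
    return h

print(remainder_hash(b"example chunk"))
```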
New layouts for the assignment of a set of n parallel processors to perform certain tasks in several hierarchically connected layers are suggested, leading, after some initialization phase, to the full exploitation of all of the processing power all of the time. This framework is useful for a variety of string-theoretic problems, ranging from modular …
A special case of data compression in which repeated chunks of data are stored only once is known as deduplication. The input data is cut into chunks, and a cryptographically strong hash value of each (different) chunk is stored. To restrict the influence of small inserts and deletes to local perturbations, the chunk boundaries are usually defined in a data …
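A hedged sketch of data-dependent chunk boundaries of the kind described here, using a simple Karp-Rabin style rolling hash and declaring a boundary whenever the hash of the current window satisfies a mask condition; the window size, mask, and hash parameters are illustrative assumptions, not the paper's own choices:

```python
def cdc_boundaries(data: bytes, window: int = 48, mask: int = 0x1FFF):
    """Return cut points chosen by content rather than position: a boundary
    is declared wherever the rolling hash of the last `window` bytes has
    its low bits all zero, so local edits shift only nearby boundaries."""
    MOD = (1 << 61) - 1
    BASE = 256
    # Precompute BASE^(window-1) mod MOD to remove the outgoing byte.
    pow_out = pow(BASE, window - 1, MOD)
    cuts, h = [], 0
    for i, byte in enumerate(data):
        if i >= window:                        # slide the window forward
            h = (h - data[i - window] * pow_out) % MOD
        h = (h * BASE + byte) % MOD
        if i + 1 >= window and (h & mask) == 0:
            cuts.append(i + 1)                 # boundary after position i
    return cuts
```

With the placeholder mask of 13 low bits, boundaries occur on average every 8 KiB of input, which is one common way such parameters are tuned.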
Many Web clients today are connected to the Internet via low-speed links such as cellular connections. In order to use the cellular connection efficiently for Web access, the connection must be accelerated using a Performance Enhancing Proxy (PEP) as a gateway to the Web. In this paper we investigate the challenges created by the use of a PEP. In …