Peer-to-peer (P2P) networks can significantly reduce the distribution cost of large media files for the original provider of the data. The BitTorrent protocol is widely used on the Internet today for this purpose. Most research work studies the protocol analytically, by simulations at the flow level, or in real-world experiments. For flow-level simulations, the …
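Flow-level simulation abstracts from individual packets and models each transfer only by its current rate. A minimal sketch of this idea (illustrative only, not the model from the abstract above) allocates each uploader's capacity equally among its active flows:

```python
def flow_level_rates(uploaders):
    """Compute per-flow rates at the flow level.

    uploaders: {name: (capacity_bps, [flow_ids])}.
    Each flow receives an equal share of its uploader's capacity;
    download time would then be flow_size / rate.
    """
    rates = {}
    for name, (capacity, flows) in uploaders.items():
        if flows:
            share = capacity / len(flows)  # equal sharing of the uplink
            for flow_id in flows:
                rates[flow_id] = share
    return rates

# Two uploaders, three flows (names are hypothetical).
rates = flow_level_rates({
    "seed":  (1_000_000, ["f1", "f2"]),  # 1 Mbit/s shared by two flows
    "leech": (500_000, ["f3"]),
})
```

Real flow-level models additionally account for download-side bottlenecks (max-min fair sharing); this sketch covers only the uplink constraint.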
Video streaming is considered one of the most important and challenging applications for next-generation cellular networks. Current infrastructures are not prepared to deal with the increasing amount of video traffic. The current Internet, and in particular the mobile Internet, was not designed with video requirements in mind and, as a consequence, its …
Peer-to-Peer (P2P) networks and their applications are gaining importance in today's Internet, as the majority of IP traffic is already caused by P2P applications. Since the advent of Napster, a great deal of research has been done in this area, producing interesting and promising results. Still, growing demands such as lower data-rate consumption, faster …
Many researchers working on concepts for the next-generation Internet agree that the separation of identifier and locator is a very promising approach. Although this solution addresses the most critical issues in today's Internet, new challenges arise from the mapping between locator and identifier. Our proposal takes a look at novel DHT-based …
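A DHT-based mapping system stores identifier-to-locator bindings on whichever node is responsible for the identifier's hash. The following toy sketch, assuming a simple consistent-hashing ring (node names and helpers are illustrative, not from the abstract above), shows how a lookup finds the responsible node:

```python
import hashlib
from bisect import bisect_right

def ring_pos(key: str) -> int:
    """Hash a key onto a fixed-size identifier ring."""
    return int(hashlib.sha1(key.encode()).hexdigest(), 16) % 2**32

class MappingRing:
    """Toy DHT: each node is responsible for the ring segment
    ending at its position and would store the identifier -> locator
    bindings that hash into that segment."""

    def __init__(self, nodes):
        self.ring = sorted((ring_pos(n), n) for n in nodes)

    def responsible_node(self, identifier: str) -> str:
        # First node clockwise from the identifier's position.
        pos = ring_pos(identifier)
        points = [p for p, _ in self.ring]
        idx = bisect_right(points, pos) % len(self.ring)
        return self.ring[idx][1]

ring = MappingRing(["nodeA", "nodeB", "nodeC"])
node = ring.responsible_node("host-42")  # node storing this host's locator
```

The appeal for identifier/locator mapping is that the ring assignment is deterministic, so any node can compute where a binding lives without central coordination.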
Peer-to-peer (p2p) systems are a highly decentralized, fault-tolerant, and cost-effective alternative to the classic client-server architecture. Yet companies hesitate to use p2p algorithms to build new applications. Due to the decentralized nature of such a p2p system, the carrier does not know anything about the current size, performance, and stability of …
Gnutella is a classical peer-to-peer network designed for file sharing. The absence of dedicated servers is one of its main properties: every Gnutella host is client and server in one. It uses the resources of the participants to distribute content, e.g. MP3-compressed audio files, and shares their processing capacity to provide the routing and searching …
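Searching in Gnutella works by flooding: a peer forwards a query to all its neighbours, who forward it onward until a time-to-live counter expires, with duplicates suppressed. A minimal sketch of that mechanism (the graph layout and field names below are illustrative assumptions):

```python
from collections import deque

def flood_search(graph, start, target, ttl=4):
    """Flood a query through an unstructured overlay, Gnutella-style.

    graph: {peer: {"neighbours": [...], "files": [...]}}.
    Each peer forwards the query to its neighbours until the TTL
    expires; a seen-set drops duplicate copies of the query.
    Returns the peers that hold the target file.
    """
    seen = {start}
    queue = deque([(start, ttl)])
    hits = []
    while queue:
        peer, remaining_ttl = queue.popleft()
        if target in graph[peer].get("files", []):
            hits.append(peer)
        if remaining_ttl == 0:
            continue  # query expires here
        for neighbour in graph[peer].get("neighbours", []):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append((neighbour, remaining_ttl - 1))
    return hits

# A small example overlay of four peers.
overlay = {
    "A": {"neighbours": ["B", "C"], "files": []},
    "B": {"neighbours": ["A", "D"], "files": []},
    "C": {"neighbours": ["A"], "files": ["song.mp3"]},
    "D": {"neighbours": ["B"], "files": ["song.mp3"]},
}
hits = flood_search(overlay, "A", "song.mp3", ttl=2)
```

The TTL bounds the search horizon: it keeps flooding cost manageable but also means rare content beyond the horizon is never found, a well-known limitation of unstructured overlays.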
The Internet architecture of today is the result of a constant evolution over the past 25 years. However, this layering of add-ons, bug fixes, and extensions has grown into a tremendously complex and therefore increasingly static platform. At the same time, a multitude of new challenges related to issues never conceived of in the original design, such as …
Recent research efforts have shown that peer-to-peer (p2p) mechanisms carry a potential that goes well beyond simple file sharing. Compared to the classic client-server architecture, these systems do not suffer from a single point of failure. However, there is still the danger that an adversary is able to attack a specific subpart of the system. This …
One of the most important design goals of current peer-to-peer (P2P) technology is to be able to offer its service to an arbitrarily large number of users. Discrete event simulation is often applied to quantitatively and qualitatively evaluate the performance and scalability of such systems before they are deployed. However, the number of users, processes, and …
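The core of any discrete event simulation is an event queue ordered by timestamp: processing one event advances the simulation clock and may schedule further events. A minimal sketch of that core (class and method names are illustrative, not a specific simulator's API):

```python
import heapq

class Simulator:
    """Tiny discrete event simulation core.

    Events are (time, seq, action) tuples in a priority queue; the
    seq counter breaks ties so simultaneous events fire in FIFO order.
    Executing an action may schedule new events via schedule().
    """

    def __init__(self):
        self.now = 0.0
        self._seq = 0
        self._queue = []

    def schedule(self, delay, action):
        heapq.heappush(self._queue, (self.now + delay, self._seq, action))
        self._seq += 1

    def run(self, until=float("inf")):
        while self._queue and self._queue[0][0] <= until:
            self.now, _, action = heapq.heappop(self._queue)
            action()

sim = Simulator()
log = []
sim.schedule(2.0, lambda: log.append(("second", sim.now)))
sim.schedule(1.0, lambda: log.append(("first", sim.now)))
sim.run()
# Events fire in timestamp order, regardless of scheduling order.
```

The scalability problem the abstract hints at follows directly from this design: every peer action becomes an event object in this queue, so memory and queue-operation cost grow with the number of simulated users.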
The algorithms and methods of peer-to-peer (P2P) technology are often applied to networks and services with a demand for scalability. In contrast to traditional client/server architectures, an arbitrarily large number of users, called peers, may participate in the network and use the service without any loss of performance. In order to evaluate …