Corpus ID: 15642358

Web Replica Hosting Systems Design

Swaminathan Sivasubramanian, Guillaume Pierre, and Maarten van Steen
Replication is a well-known technique to improve the accessibility of Web sites. It generally offers reduced client latencies and increases a site's availability. However, applying replication techniques is not trivial, and various Content Delivery Networks (CDNs) have been created to facilitate replication for digital content providers. The success of these CDNs has triggered further research efforts into developing advanced Web replica hosting systems. These are systems that host the…


Roles of Agents in Data-Intensive Web Sites
Cases in which agents can be used to improve data management performance are discussed, with the aim of specifying tasks that may profit from the increasing development of agent technologies.


Content replication in Web++
A distributed, server-initiated approach for resource replication is described, in which all servers can decide autonomously whether to replicate resources and where the replicas should be allocated.
Efficient and adaptive Web replication using content clustering
This paper compares the uncooperative pulling of Web content used by commercial CDNs with cooperative pushing, describes three clustering techniques, and uses various topologies and several large Web server traces to evaluate their performance.
Replicated Web Services: A Comparative Analysis of Client-Based Content Delivery Policies
This paper classifies and contrasts, qualitatively and quantitatively (via simulation), different client-side techniques to find the pros and cons of each approach, with the aim of identifying the best solutions for content-delivery systems.
Clustering Web content for efficient replication
This work proposes to replicate content in units of clusters, each containing objects that are likely to be requested by clients that are topologically close. It describes three clustering techniques and uses various topologies and several large Web server traces to evaluate their performance.
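The clustering idea above can be sketched roughly as follows. This is an illustrative sketch, not the paper's actual algorithm: it groups objects whose requester sets overlap strongly, using a Jaccard-similarity threshold and a greedy merge, both of which are assumed choices made here for brevity.

```python
def jaccard(a, b):
    """Overlap between two sets of client identifiers (0.0 to 1.0)."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def cluster_objects(access, threshold=0.5):
    """Greedily cluster objects requested by similar client populations.

    access: dict mapping object name -> set of client identifiers
            (e.g. network prefixes) that requested it.
    Returns a list of clusters, each a list of object names; every
    cluster could then be replicated as a single unit.
    """
    clusters = []
    for obj, clients in access.items():
        for c in clusters:
            if jaccard(clients, c["clients"]) >= threshold:
                c["objects"].append(obj)
                c["clients"] |= clients  # widen the cluster's client set
                break
        else:
            clusters.append({"objects": [obj], "clients": set(clients)})
    return [c["objects"] for c in clusters]

access = {
    "a.html": {"as1", "as2"},
    "b.html": {"as1", "as2"},
    "c.html": {"as9"},
}
print(cluster_objects(access))  # [['a.html', 'b.html'], ['c.html']]
```

Replicating per cluster rather than per object keeps the number of placement decisions small while still letting topologically close clients share nearby replicas.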
Globule: A Platform for Self-Replicating Web Documents
The design of Globule is presented: a platform that automates all aspects of such replication, including server-to-server peering negotiation, creation and destruction of replicas, selection of the most appropriate replication strategy on a per-document basis, consistency management, and transparent redirection of clients to replicas.
A market-based architecture for management of geographically dispersed, replicated Web servers
Many popular Web sites employ a set of geographically dispersed, replicated servers to address the issues of overloaded servers and network congestion. Such distributed Web sites require allocation…
Dynamic Server Selection In The Internet
  • M. Crovella, R. Carter
  • Computer Science
  • Third IEEE Workshop on the Architecture and Implementation of High Performance Communication Subsystems
  • 1995
This paper reports on techniques for finding good service providers without a priori knowledge of server location or network topology, and considers the use of two principal metrics for measuring distance in the Internet: hops and round-trip latency.
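A minimal sketch of latency-based server selection in the spirit of the summary above: probe each candidate server a few times and pick the one with the lowest median round-trip time. The probe values and server names are hypothetical; a real client would measure RTTs over the network rather than take them as input.

```python
import statistics

def pick_server(probes):
    """Select the candidate with the lowest median RTT.

    probes: dict mapping server name -> list of measured RTT samples (ms).
    The median damps the effect of a single outlier probe.
    """
    return min(probes, key=lambda server: statistics.median(probes[server]))

probes = {
    "s1": [80, 90, 85],
    "s2": [30, 200, 35],   # one outlier; median is still low
    "s3": [50, 55, 60],
}
print(pick_server(probes))  # s2
```

Note how s2 wins despite one 200 ms outlier: median RTT reflects typical behavior better than the mean, which is one reason dynamic measurement can outperform static hop counts.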
The state of the art in locally distributed Web-server systems
This article classifies and describes the main mechanisms for splitting traffic load among the server nodes, discussing both the alternative architectures and the load-sharing policies.
Application specific data replication for edge services
This paper explores using a distributed object architecture to build an edge-service system for an e-commerce application (an online bookstore represented by the TPC-W benchmark), and finds that by slightly relaxing consistency within individual distributed objects, a system can be built that is highly available and efficient.
On the placement of Web server replicas
  • L. Qiu, V. Padmanabhan, G. Voelker
  • Computer Science
  • Proceedings IEEE INFOCOM 2001. Conference on Computer Communications. Twentieth Annual Joint Conference of the IEEE Computer and Communications Society (Cat. No.01CH37213)
  • 2001
This work develops several placement algorithms that use workload information, such as client latency and request rates, to make informed placement decisions, and evaluates them using both synthetic and real network topologies, as well as Web server traces.
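One common family of workload-informed placement algorithms is greedy placement: repeatedly add the replica site that most reduces total request-rate-weighted latency. The sketch below assumes a precomputed latency matrix and is an illustration of the general technique, not the specific algorithms evaluated in the paper.

```python
def cost(placed, clients, dist):
    """Total rate-weighted latency when each client uses its nearest replica.

    clients: dict mapping client -> request rate.
    dist:    dict of dicts, dist[client][site] = latency.
    """
    return sum(rate * min(dist[c][s] for s in placed)
               for c, rate in clients.items())

def greedy_placement(sites, clients, dist, k):
    """Pick k replica sites, one at a time, each minimizing total cost."""
    chosen = []
    for _ in range(k):
        best = min((s for s in sites if s not in chosen),
                   key=lambda s: cost(chosen + [s], clients, dist))
        chosen.append(best)
    return chosen

# Hypothetical workload: c1 dominates the request rate, so the first
# replica lands near c1; the second then covers c2.
sites = ["A", "B", "C"]
clients = {"c1": 10, "c2": 1}
dist = {
    "c1": {"A": 1, "B": 5, "C": 9},
    "c2": {"A": 9, "B": 5, "C": 1},
}
print(greedy_placement(sites, clients, dist, k=2))  # ['A', 'C']
```

Greedy placement is not optimal in general (the underlying problem is a facility-location variant), but it is a standard, cheap baseline when request rates and latencies are known.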