Wide-area data delivery requires timely propagation of up-to-date information to thousands of clients over a wide-area network. Applications include web caching, RSS source monitoring, and email access via a mobile network. Data sources vary widely in their update patterns and may experience different update rates at different times or unexpected changes to their update patterns. Traditional data delivery solutions are either push-based, which requires servers to push updates to clients, or pull-based, which requires clients to check for updates at servers. While push-based solutions ensure timely data delivery, they are not always feasible to implement and may not scale to a large number of clients. In this article, we present adaptive pull-based policies that explicitly aim to reduce the overhead of contacting remote servers, compared to existing pull-based policies, while meeting freshness requirements. We model updates to data sources using update histories, and we present two novel history-based policies that estimate when updates occur: one based on an object's individual history and one based on aggregate history. These policies are presented within an architectural framework that supports either client-side or server-side deployment. We further develop two adaptive policies to handle objects that initially have insufficient history or that experience changes in their update patterns. An extensive experimental evaluation using three data traces from diverse applications shows that history-based policies can reduce contact between clients and servers by up to 60% compared to existing pull-based policies while providing a comparable level of data freshness. Our experiments further demonstrate that the adaptive policies can select the policy that best matches the behavior of an object and can outperform any individual policy; they thus dominate standalone policies.
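To illustrate the flavor of an individual-history policy, the sketch below estimates when to next poll an object from its past update timestamps. This is a hypothetical simplification, not the paper's exact estimator: the function name, the mean-interval heuristic, and the fallback interval for objects with insufficient history are all assumptions introduced for illustration.

```python
# Hypothetical sketch of an individual-history pull policy: schedule the
# next poll of an object based on its observed update timestamps.
# (Simplified illustration; not the paper's exact method.)

def next_refresh_time(update_times, now):
    """Estimate when to next poll an object.

    update_times: timestamps of past observed updates, sorted ascending.
    now: the current time.

    Heuristic: poll one mean inter-update interval after the last known
    update, but never earlier than the current time.
    """
    if len(update_times) < 2:
        # Insufficient history: fall back to a fixed default interval
        # (an adaptive policy would handle this case explicitly).
        default_interval = 3600.0  # assumed default, in seconds
        return now + default_interval
    intervals = [b - a for a, b in zip(update_times, update_times[1:])]
    mean_interval = sum(intervals) / len(intervals)
    last_update = update_times[-1]
    return max(now, last_update + mean_interval)
```

An aggregate-history variant would compute the same kind of estimate from the pooled histories of similar objects, which is useful when an individual object's history is sparse.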