
Sitemaps

Known as: Sitemap.xml, Google sitemaps 0.90, Google Sitemaps 
The Sitemaps protocol allows a webmaster to inform search engines about URLs on a website that are available for crawling. A Sitemap is an XML file… 
Wikipedia
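
For illustration, a minimal sketch of how a Sitemap XML file of the kind described above might be generated with Python's standard library; the URL, date, and other field values are placeholders, not details taken from this page.

```python
# Minimal sketch: build a one-entry Sitemap XML file with the standard library.
# All values below (URL, lastmod, changefreq, priority) are placeholder examples.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

urlset = ET.Element("urlset", xmlns=SITEMAP_NS)          # root element of the protocol
url = ET.SubElement(urlset, "url")
ET.SubElement(url, "loc").text = "https://www.example.com/"   # page URL available for crawling
ET.SubElement(url, "lastmod").text = "2024-01-01"             # last modification date
ET.SubElement(url, "changefreq").text = "monthly"             # expected change frequency
ET.SubElement(url, "priority").text = "0.8"                   # relative crawl priority

# Write the file a crawler would fetch, typically at the site root as sitemap.xml.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```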

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.

2015
“White hat” search engine optimization refers to the practice of publishing web pages that are useful to humans, while enabling… 

2011
Web archives preserve the history of born-digital content and offer great potential for sociologists, business analysts, and… 

Review
2010
At the time of writing there exists no consensus about the approaches to detect, propagate and describe changes in resources and… 

2010
Provenance is a cornerstone element in the process of enabling quality assessment for the Web of Data. Applications consuming or… 

Highly Cited
2009
Comprehensive coverage of the public web is crucial to web search engines. Search engines use crawlers to retrieve pages and then… 

Highly Cited
2008
Increasing amounts of RDF data are available on the Web for consumption by Semantic Web browsers and indexing by Semantic Web… 

2007
In this paper, we present an exploratory study of the web navigation experiences of dyslexic users. Findings indicate that… 

Review
2003
Supporting the exploration of large web sites remains a challenge. Visitors require site overviews to guide their exploration. Text… 

Highly Cited
2000
Through a study of web site design practice, we observed that designers employ multiple representations of web sites as they…