Using full reference history for efficient document replacement in web caches

Hyokyung Bahn, Sam H. Noh, Sang Lyul Min, Kern Koh

Research output: Contribution to conference › Paper › peer-review

20 Scopus citations

Abstract

With the increase in popularity of the World Wide Web, the research community has recently seen a proliferation of Web caching algorithms. This paper presents a new algorithm, called Least Unified-Value (LUV), that is efficient and robust. LUV evaluates a Web document based on its cost normalized by the likelihood of it being re-referenced. This results in a normalized assessment of each document's contribution to the value of the cache, leading to a fair replacement policy. LUV can conform to arbitrary cost functions of Web documents, so it can optimize any particular performance measure of interest, such as the hit rate, the byte hit rate, or the delay-savings ratio. Unlike most existing algorithms, LUV exploits the complete reference history of documents, in terms of reference frequency and recency, to estimate the likelihood of re-reference. Nevertheless, LUV allows for an efficient implementation in both space and time. The space needed to maintain the reference history of a document is only a few bytes, and the time complexity of the algorithm is O(log₂ n), where n is the number of documents in the cache. Trace-driven simulations show that the LUV algorithm outperforms existing algorithms for various performance measures over a wide range of cache configurations.
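The abstract's claims can be illustrated with a sketch. The paper's exact value function is not reproduced here; this hypothetical Python version assumes an exponentially decaying reference weight (the `DECAY` parameter, the `LUVCache` class, and all method names are illustrative, not from the paper). Exponential decay is one way to realize the abstract's two efficiency claims: the weight can be updated from a constant amount of per-document state (a few bytes), and a min-heap over document values gives O(log₂ n) replacement decisions.

```python
import heapq

DECAY = 0.5  # assumed per-time-unit decay of past references (hypothetical)

class LUVCache:
    """Sketch of an LUV-style cache: evict the lowest-valued document,
    where value = (cost / size) * decayed reference weight."""

    def __init__(self, capacity):
        self.capacity = capacity  # total bytes available
        self.used = 0
        self.docs = {}            # url -> [size, cost, weight, last_t, version]
        self.heap = []            # (value, version, url); stale entries skipped

    @staticmethod
    def _value(size, cost, weight):
        # Cost normalized per byte, scaled by the reference weight.
        return (cost / size) * weight

    def _push(self, url):
        size, cost, weight, _, ver = self.docs[url]
        heapq.heappush(self.heap, (self._value(size, cost, weight), ver, url))

    def _evict_one(self):
        # Pop until a live (non-stale) entry is found; evict that document.
        while self.heap:
            _, ver, url = heapq.heappop(self.heap)
            d = self.docs.get(url)
            if d is not None and d[4] == ver:
                self.used -= d[0]
                del self.docs[url]
                return

    def reference(self, url, size, cost, now):
        """Record a reference to `url`; returns True on a cache hit."""
        if url in self.docs:
            d = self.docs[url]
            # Decay lets the full reference history collapse into one number:
            # new weight = old weight decayed by elapsed time, plus 1 for
            # the current reference. Only (weight, last_t) must be stored.
            d[2] = d[2] * (DECAY ** (now - d[3])) + 1.0
            d[3] = now
            d[4] += 1            # invalidate older heap entries for this url
            self._push(url)
            return True
        # Miss: make room by evicting lowest-value documents, then insert.
        while self.used + size > self.capacity and self.docs:
            self._evict_one()
        if size <= self.capacity:
            self.docs[url] = [size, cost, 1.0, now, 0]
            self.used += size
            self._push(url)
        return False
```

The lazy-deletion heap (version counters instead of in-place updates) keeps each reference at O(log n); whether the paper uses this particular data structure is an assumption.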

Original language: English
State: Published - 1999
Event: 2nd USENIX Symposium on Internet Technologies and Systems, USITS 1999 - Boulder, United States
Duration: 11 Oct 1999 - 14 Oct 1999

Conference

Conference: 2nd USENIX Symposium on Internet Technologies and Systems, USITS 1999
Country/Territory: United States
City: Boulder
Period: 11/10/99 - 14/10/99
