Online File Caching in Latency-Sensitive Systems with Delayed Hits and Bypassing

IEEE Conference on Computer Communications (INFOCOM), 2022

Cited by 6 | Views 45
Abstract
In latency-sensitive file caching systems such as Content Delivery Networks (CDNs) and Mobile Edge Computing (MEC), the latency of fetching a missing file into the local cache can be significant. Recent studies have revealed that successive requests for the same missing file, arriving before the fetch completes, still suffer latency (so-called delayed hits). Motivated by these practical scenarios, we study the online general file caching problem with delayed hits and bypassing, i.e., a request may be bypassed and processed directly at the remote data center. The objective is to minimize the total request latency. We show a general reduction that turns a traditional file caching algorithm into one that can handle delayed hits. Using this reduction, we give an O(Z^{3/2} log K)-competitive algorithm called CaLa, where Z is the maximum fetching latency of any file and K is the cache size, and we show a nearly tight lower bound of Ω(Z log K) for our ratio. Extensive simulations based on a production data trace from Google and the Yahoo benchmark illustrate that CaLa can reduce latency by up to 9.42% compared with the state-of-the-art scheme for delayed hits without bypassing, and this improvement increases to 32.01% when bypassing is allowed.
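To make the delayed-hits model concrete, the following is a minimal sketch (not the paper's CaLa algorithm) of how total latency is accounted when requests for a missing file arrive while its fetch is still in flight. The uniform fetch latency Z, the file names, and the timestamps are hypothetical, and the cache is assumed large enough that no eviction occurs.

```python
# Toy illustration of "delayed hits": a miss triggers a fetch that takes Z
# time units; later requests for the same file arriving before the fetch
# completes wait for the remaining fetch time instead of being free hits.

Z = 4  # assumed uniform fetch latency (time units), for illustration only

def total_latency(requests, z=Z):
    """requests: list of (arrival_time, file_id), sorted by arrival_time.
    The cache starts empty; no replacement policy is modeled."""
    ready_at = {}  # file_id -> time at which its fetch completes
    latency = 0
    for t, f in requests:
        if f not in ready_at:            # classical miss: start a fetch
            ready_at[f] = t + z
            latency += z
        elif t < ready_at[f]:            # delayed hit: fetch still in flight
            latency += ready_at[f] - t   # wait out the remaining fetch time
        # else: true hit once the file is cached, zero latency
    return latency

reqs = [(0, "a"), (1, "a"), (2, "a"), (10, "a")]
print(total_latency(reqs))  # miss (4) + delayed hits (3 + 2) + hit (0) = 9
```

A classical hit/miss model would charge only the first request, underestimating the latency of the two in-flight requests; bypassing adds the further option of serving a request remotely instead of waiting on (or starting) a fetch.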
Keywords
delayed hits,online general file caching problem,traditional file caching algorithm,maximum fetching latency,cache size,bypassing,online file caching,missing file,local cache,latency sensitive file caching systems,remote data center,production data trace,Google,Yahoo