Online Profiling and Adaptation of Quality Sensitivity for Internet Video

Proceedings of the 2023 ACM Symposium on Cloud Computing (SoCC 2023)

Abstract
A key to video streaming systems is knowing how sensitive quality of experience (QoE) is to quality metrics (e.g., buffering ratio and average bitrate). Conventional wisdom holds that such quality sensitivity should be profiled by offline user studies, on the assumption that QoE is equally sensitive to quality metrics everywhere within an entire genre of videos. However, recent studies show that quality sensitivity varies substantially both across videos and within a video, opening new potential for improving QoE and serving more users without using more bandwidth. Unfortunately, offline profiling cannot capture the variability of quality sensitivity within a new video (e.g., a new TV show episode or a live sports event) if users join to watch it within a short time window. This short paper makes a case for a new architecture that profiles the quality sensitivity of a video online, by gathering and analyzing QoE-related feedback (e.g., exit or skip) from actual users while the video is being streamed to them. The key component is a QoE-driven feedback loop, called SensitiFlow, run by video content providers to make adaptive-bitrate (ABR) decisions for concurrent and future video sessions. We evaluated QoE in terms of user engagement (view time) using real traces of 7.6 million video sessions from a content provider. Our preliminary results show that SensitiFlow can realize up to 80% of the improvement obtained by a hypothetical "oracle" system that knows quality sensitivity in advance. Admittedly, our evaluation is not a real deployment by a large-scale commercial content provider, but we hope our preliminary results will inspire follow-up efforts to test similar ideas at scale.
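To make the feedback-loop idea concrete, the sketch below (not the paper's implementation; the class names, thresholds, and the sensitivity estimator are all illustrative assumptions) shows how a provider could aggregate exit/skip feedback into a per-segment sensitivity estimate and let that estimate bias ABR decisions for later sessions toward caution at highly sensitive moments of the video.

```python
# Minimal sketch of a SensitiFlow-style feedback loop (assumed design, not the
# paper's code): sessions report QoE events (exit/skip) tagged with the video
# segment and whether buffering occurred there; the provider turns these into a
# per-segment sensitivity estimate that later ABR decisions consult.
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class SegmentStats:
    """Feedback aggregated for one video segment across sessions."""
    impaired_views: int = 0   # sessions that saw buffering at this segment
    impaired_exits: int = 0   # of those, sessions that exited or skipped
    clean_views: int = 0      # sessions that played this segment cleanly
    clean_exits: int = 0


class SensitivityProfiler:
    """Online estimate of how strongly engagement reacts to buffering per segment."""

    def __init__(self):
        self.stats = defaultdict(SegmentStats)

    def report(self, segment: int, buffered: bool, exited: bool) -> None:
        s = self.stats[segment]
        if buffered:
            s.impaired_views += 1
            s.impaired_exits += int(exited)
        else:
            s.clean_views += 1
            s.clean_exits += int(exited)

    def sensitivity(self, segment: int) -> float:
        """Excess exit rate attributable to buffering (0 = insensitive, 1 = very sensitive)."""
        s = self.stats[segment]
        if s.impaired_views == 0 or s.clean_views == 0:
            return 0.5  # assumed prior until enough feedback arrives
        excess = (s.impaired_exits / s.impaired_views
                  - s.clean_exits / s.clean_views)
        return max(0.0, min(1.0, excess))


def choose_bitrate(bitrates_kbps, throughput_kbps, buffer_s, sensitivity,
                   segment_s=4.0, safety=0.9):
    """Toy sensitivity-aware ABR rule: the more sensitive the upcoming segment,
    the larger the buffer margin required before picking a higher bitrate."""
    required_margin_s = segment_s * (1.0 + 2.0 * sensitivity)  # assumed scaling
    best = min(bitrates_kbps)
    for b in sorted(bitrates_kbps):
        download_s = b * segment_s / max(throughput_kbps * safety, 1e-6)
        if buffer_s - download_s >= required_margin_s:
            best = b
    return best


if __name__ == "__main__":
    profiler = SensitivityProfiler()
    # Earlier sessions: buffering at segment 10 usually caused exits.
    for _ in range(40):
        profiler.report(segment=10, buffered=True, exited=True)
    for _ in range(60):
        profiler.report(segment=10, buffered=False, exited=False)
    sens = profiler.sensitivity(10)
    pick = choose_bitrate([300, 1500, 4000, 8000],
                          throughput_kbps=5000, buffer_s=15.0, sensitivity=sens)
    print(f"sensitivity={sens:.2f} -> chosen bitrate {pick} kbps")
```

In this toy run the profiler learns that buffering at segment 10 drives users away, so the ABR rule holds back to a mid-tier bitrate there; a segment with low estimated sensitivity would clear the smaller buffer margin and get a higher bitrate under the same network conditions.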
Keywords
Video QoE, ABR