
Mosaic: Advancing User Quality of Experience in 360-Degree Video Streaming With Machine Learning



Abstract:

Conventional solutions for streaming 360-degree panoramic videos are inefficient in that they download the entire 360-degree panoramic scene, while the user views only a small sub-part of the scene called the viewport. This can waste over 80% of the network bandwidth. We develop a comprehensive approach called Mosaic that combines powerful neural network-based viewport prediction with a rate control mechanism that assigns rates to the different tiles of the 360-degree frame such that the video quality of experience is optimized subject to a given network capacity. We model the optimization as a multi-choice knapsack problem and solve it using a greedy approach. We also develop an end-to-end testbed using standards-compliant components and provide a comprehensive performance evaluation of Mosaic along with five other streaming techniques: two for conventional adaptive video streaming and three for 360-degree tile-based video streaming. Mosaic outperforms the best of the competing techniques by as much as 47-191% in terms of average video quality of experience. Simulation-based evaluations as well as subjective user studies further confirm the superiority of the proposed approach.
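To make the rate-control step concrete, the following Python sketch shows a standard greedy heuristic for the multi-choice knapsack formulation: every tile starts at its lowest rate, and the upgrade with the best utility gain per extra bit is applied repeatedly until the capacity budget is exhausted. The function name, the tile options, and the utility numbers are illustrative assumptions and are not taken from the paper.

# Minimal sketch of a greedy multiple-choice knapsack (MCKP) heuristic for
# tile rate selection. Each tile offers a list of (bitrate_kbps, utility)
# options sorted by bitrate; the budget is the estimated network capacity.
# All names and numbers below are illustrative assumptions.
def greedy_tile_rates(tile_options, capacity):
    """Return the chosen option index for every tile.

    tile_options: one list per tile of (bitrate_kbps, utility) tuples,
    sorted by bitrate. capacity: total bitrate budget in kbps.
    """
    # Start every tile at its cheapest option.
    choice = [0] * len(tile_options)
    used = sum(opts[0][0] for opts in tile_options)

    while True:
        best_tile, best_ratio = None, 0.0
        for t, opts in enumerate(tile_options):
            c = choice[t]
            if c + 1 >= len(opts):
                continue  # tile is already at its highest rate
            extra_bits = opts[c + 1][0] - opts[c][0]
            extra_util = opts[c + 1][1] - opts[c][1]
            if extra_bits <= 0 or used + extra_bits > capacity:
                continue  # upgrade does not fit in the remaining budget
            ratio = extra_util / extra_bits
            if ratio > best_ratio:
                best_tile, best_ratio = t, ratio
        if best_tile is None:
            break  # no affordable upgrade remains
        c = choice[best_tile]
        used += tile_options[best_tile][c + 1][0] - tile_options[best_tile][c][0]
        choice[best_tile] = c + 1
    return choice

# Example: three tiles with three rate options each and a 3000 kbps budget.
# A tile predicted to fall in the viewport carries higher utility values.
tiles = [
    [(200, 1.0), (800, 3.0), (2000, 4.0)],   # likely-viewport tile
    [(200, 0.5), (800, 1.0), (2000, 1.2)],   # peripheral tile
    [(200, 0.1), (800, 0.2), (2000, 0.25)],  # out-of-view tile
]
print(greedy_tile_rates(tiles, capacity=3000))  # e.g., [2, 1, 0]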
Published in: IEEE Transactions on Network and Service Management (Volume: 18, Issue: 1, March 2021)
Page(s): 1000 - 1015
Date of Publication: 21 January 2021


I. Introduction

With video streaming proliferating on the Internet [1], interest is growing in immersive video applications. An important application in this space is 360-degree video [2]. A 360-degree video is a panoramic video recorded using omni-directional cameras [3]. It is then projected onto a 2D plane using one of the available mapping techniques (e.g., equirectangular, cube, or pyramid). Typically, the user watches the 360-degree video using a head-mounted display (HMD) or a commodity mobile device (e.g., [4]).
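To make the projection step concrete, the Python sketch below maps a viewing direction onto an equirectangularly projected frame and then onto a tile in a fixed grid, which is the basic geometry that tile-based 360-degree streaming builds on. The 6x4 grid, the angle conventions, and the function name are illustrative assumptions, not the layout used in the paper.

# Minimal sketch of mapping a gaze direction onto an equirectangular frame
# and then onto a tile index in a fixed tile grid (illustrative assumptions).
def viewport_center_to_tile(yaw_deg, pitch_deg, width, height, tiles_x, tiles_y):
    """Map a gaze direction (yaw in [-180, 180), pitch in [-90, 90]) to the
    (column, row) of the tile containing the viewport center."""
    # Equirectangular projection: longitude maps linearly to x, latitude to y.
    x = (yaw_deg + 180.0) / 360.0 * width
    y = (90.0 - pitch_deg) / 180.0 * height
    col = min(int(x / (width / tiles_x)), tiles_x - 1)
    row = min(int(y / (height / tiles_y)), tiles_y - 1)
    return col, row

# Example: a 3840x1920 frame split into a 6x4 tile grid; the user looks
# slightly to the right of center and a little upward.
print(viewport_center_to_tile(30.0, 10.0, 3840, 1920, 6, 4))  # -> (3, 1)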

References
[1] The Global Internet Phenomena Report, Oct. 2018. [Online]. Available: https://www.sandvine.com/hubfs/downloads/phenomena/2018-phenomena-report.pdf
[2] Global Virtual Reality Market (Hardware and Software) and Forecast to 2020, Feb. 2017. [Online]. Available: http://www.orbisresearch.com/reports/index/global-virtual-reality-market-hardware-and-software-and-forecast-to-2020
[3] Samsung 360 Round VR Camera, 2017. [Online]. Available: https://www.samsung.com/us/business/products/mobile/virtual-reality/360-round-vr-camera/
[4] S. Hollister, "YouTube's ready to blow your mind with 360-degree videos," Mar. 2015. [Online]. Available: https://gizmodo.com/youtubes-ready-to-blow-your-mind-with-360-degree-videos-1690989402
[5] Akamai's 2018 State of the Internet/Connectivity Report, 2018.
[6] F. Qian, L. Ji, B. Han, and V. Gopalakrishnan, "Optimizing 360 video delivery over cellular networks," Proc. ATC Workshop, pp. 1-6, 2016.
[7] C.-L. Fan, J. Lee, W.-C. Lo, C.-Y. Huang, K.-T. Chen, and C.-H. Hsu, "Fixation prediction for 360 video streaming in head-mounted virtual reality," Proc. ACM NOSSDAV, pp. 67-72, 2017.
[8] L. Xie, Z. Xu, Y. Ban, X. Zhang, and Z. Guo, "360ProbDASH: Improving QoE of 360 video streaming using tile-based HTTP adaptive streaming," Proc. ACM Multimedia, pp. 315-323, 2017.
[9] Y. Bao, H. Wu, T. Zhang, A. A. Ramli, and X. Liu, "Shooting a moving target: Motion-prediction-based transmission for 360-degree videos," Proc. IEEE Big Data, pp. 1161-1170, 2016.
[10] S. Akhshabi, A. C. Begen, and C. Dovrolis, "An experimental evaluation of rate-adaptation algorithms in adaptive streaming over HTTP," Proc. ACM MMSys, pp. 157-168, 2011.
[11] J. Jiang, V. Sekar, and H. Zhang, "Improving fairness, efficiency, and stability in HTTP-based adaptive video streaming with FESTIVE," Proc. CoNEXT, pp. 97-108, 2012.
[12] T.-Y. Huang, R. Johari, N. McKeown, M. Trunnell, and M. Watson, "A buffer-based approach to rate adaptation: Evidence from a large video streaming service," ACM SIGCOMM Comput. Commun. Rev., vol. 44, no. 4, pp. 187-198, 2015.
[13] Y. Sun et al., "CS2P: Improving video bitrate selection and adaptation with data-driven throughput prediction," Proc. ACM SIGCOMM, pp. 272-285, 2016.
[14] K. Spiteri, R. Urgaonkar, and R. K. Sitaraman, "BOLA: Near-optimal bitrate adaptation for online videos," Proc. IEEE INFOCOM, pp. 1-9, 2016.
[15] H. Mao, R. Netravali, and M. Alizadeh, "Neural adaptive video streaming with Pensieve," Proc. ACM SIGCOMM, pp. 197-210, 2017.
[16] S. K. Park, A. Bhattacharya, M. Dasari, and S. R. Das, "Understanding user perceived video quality using multipath TCP over wireless network," Proc. IEEE 39th Sarnoff Symp., pp. 1-6, 2018.
[17] V. R. Gaddam, M. Riegler, R. Eg, C. Griwodz, and P. Halvorsen, "Tiling in interactive panoramic video: Approaches and evaluation," IEEE Trans. Multimedia, vol. 18, no. 9, pp. 1819-1831, Sep. 2016.
[18] M. Hosseini and V. Swaminathan, "Adaptive 360 VR video streaming: Divide and conquer," Proc. IEEE Int. Symp. Multimedia (ISM), pp. 107-110, 2016.
[19] X. Corbillon, G. Simon, A. Devlic, and J. Chakareski, "Viewport-adaptive navigable 360-degree video delivery," Proc. IEEE Int. Conf. Commun. (ICC), pp. 1-7, 2017.
[20] X. Corbillon, A. Devlic, G. Simon, and J. Chakareski, "Optimal set of 360-degree videos for viewport-adaptive streaming," Proc. ACM Multimedia, pp. 943-951, 2017.
[21] S. Petrangeli, F. De Turck, V. Swaminathan, and M. Hosseini, "Improving virtual reality streaming using HTTP/2," Proc. ACM MMSys, pp. 225-228, 2017.
[22] H. Xie and X. Zhang, "POI360: Panoramic mobile video telephony over LTE cellular networks," Proc. CoNEXT, pp. 336-349, 2017.
[23] M. Graf, C. Timmerer, and C. Mueller, "Towards bandwidth efficient adaptive streaming of omnidirectional video over HTTP: Design, implementation, and evaluation," Proc. ACM MMSys, pp. 261-271, 2017.
[24] R. Skupin, Y. Sanchez, C. Hellge, and T. Schierl, "Tile based HEVC video for head mounted displays," Proc. IEEE Int. Symp. Multimedia (ISM), pp. 399-400, 2016.
[25] F. Qian, B. Han, Q. Xiao, and V. Gopalakrishnan, "Flare: Practical viewport-adaptive 360-degree video streaming for mobile devices," Proc. ACM MobiCom, pp. 99-114, 2018.
[26] J. Carreira and A. Zisserman, "Quo vadis, action recognition? A new model and the Kinetics dataset," Proc. IEEE Conf. Comput. Vis. Pattern Recognit., pp. 4724-4733, 2017.
[27] O. A. Niamut, E. Thomas, L. D'Acunto, C. Concolato, F. Denoual, and S. Y. Lim, "MPEG DASH SRD: Spatial relationship description," Proc. ACM MMSys, pp. 1-8, 2016.
[28] S. Park, A. Bhattacharya, Z. Yang, M. Dasari, S. R. Das, and D. Samaras, "Advancing user quality of experience in 360-degree video streaming," Proc. IFIP Netw. Conf. (IFIP Networking), pp. 1-9, 2019.
[29] X. Yin, A. Jindal, V. Sekar, and B. Sinopoli, "A control-theoretic approach for dynamic adaptive video streaming over HTTP," ACM SIGCOMM Comput. Commun. Rev., vol. 45, no. 5, pp. 325-338, 2015.
[30] K. He, X. Zhang, S. Ren, and J. Sun, "Deep residual learning for image recognition," Proc. IEEE Conf. Comput. Vis. Pattern Recognit., pp. 770-778, 2016.