Viewport Prediction Method of 360 VR Video Using Sound Localization Information

Eunyoung Jeong, Dongho You, Changjong Hyun, Bong Seok Seo, Namtae Kim, Dong Ho Kim, Ye Hoon Lee

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Scopus citations

Abstract

In 360 VR video, the user's viewport covers only a portion of the sphere, yet the entire 360° video must be transmitted, so 360 VR streaming requires large bandwidth. In this paper, we propose a viewport prediction method to address this problem. The proposed method predicts the user's viewport by utilizing the location information of the sound sources in the 360 VR video. In particular, the proposed method is designed on the basis of MPEG-DASH, and its feasibility is demonstrated through our head-tracking simulation.
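The abstract's core idea (predicting where the user will look from where the sound comes from, in an MPEG-DASH tiled-streaming setting) can be illustrated with a minimal sketch. This is not the paper's implementation: the function name, the equirectangular tile grid, and the assumption that the viewer turns toward the dominant sound source are all hypothetical choices made for illustration.

```python
def predict_viewport_tiles(source_yaw_deg, source_pitch_deg,
                           fov_deg=90.0, tiles_yaw=8, tiles_pitch=4):
    """Hypothetical sketch: assume the viewer will turn toward the dominant
    sound source, so the predicted viewport is centered on that source.
    Returns the set of (yaw_index, pitch_index) tiles on an equirectangular
    tile grid that intersect the predicted field of view; those tiles would
    be requested at high quality in a tiled MPEG-DASH session."""
    tile_w = 360.0 / tiles_yaw     # tile width in degrees of yaw
    tile_h = 180.0 / tiles_pitch   # tile height in degrees of pitch
    half = fov_deg / 2.0
    tiles = set()
    for i in range(tiles_yaw):
        tile_yaw = -180.0 + (i + 0.5) * tile_w
        # yaw distance between tile center and source, wrapped to [0, 180]
        d_yaw = abs((tile_yaw - source_yaw_deg + 180.0) % 360.0 - 180.0)
        for j in range(tiles_pitch):
            tile_pitch = -90.0 + (j + 0.5) * tile_h
            d_pitch = abs(tile_pitch - source_pitch_deg)
            # keep the tile if any part of it can fall inside the FoV
            if d_yaw <= half + tile_w / 2 and d_pitch <= half + tile_h / 2:
                tiles.add((i, j))
    return tiles

# A sound source straight ahead (yaw 0°, pitch 0°) selects the central
# band of tiles; tiles behind the viewer are excluded.
front = predict_viewport_tiles(0.0, 0.0)
```

In a real system the predicted tile set would drive the DASH segment requests for the next buffering window, with the remaining tiles fetched at a lower bitrate as a fallback against prediction error.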

Original language: English
Title of host publication: ICUFN 2018 - 10th International Conference on Ubiquitous and Future Networks
Publisher: IEEE Computer Society
Pages: 679-681
Number of pages: 3
ISBN (Print): 9781538646465
DOIs
State: Published - 14 Aug 2018
Event: 10th International Conference on Ubiquitous and Future Networks, ICUFN 2018 - Prague, Czech Republic
Duration: 3 Jul 2018 - 6 Jul 2018

Publication series

Name: International Conference on Ubiquitous and Future Networks, ICUFN
Volume: 2018-July
ISSN (Print): 2165-8528
ISSN (Electronic): 2165-8536

Conference

Conference: 10th International Conference on Ubiquitous and Future Networks, ICUFN 2018
Country/Territory: Czech Republic
City: Prague
Period: 3/07/18 - 6/07/18
