TY - JOUR
T1 - On the road to the metaverse
T2 - Point cloud video streaming: Perspectives and enablers
AU - Enenche, Patrick
AU - Kim, Dong Ho
AU - You, Dongho
N1 - Publisher Copyright:
© 2024 The Author(s)
PY - 2025/2
Y1 - 2025/2
N2 - In the rapidly evolving metaverse ecosystem, video streaming plays a vital role, offering immersive, on-demand experiences and engaging users in dynamic 3D virtual environments. Point cloud video streaming enables detailed 3D representations for applications such as 3D scanning, augmented reality (AR), and virtual reality (VR). However, high data demands and complex encoding pose challenges, requiring efficient resource and latency management. User experience (UX) is therefore critical in the metaverse, where factors such as real-time feedback, Quality of Experience (QoE), and adaptive UX design are essential for sustaining engagement in immersive 3D environments. Accordingly, this paper closely examines the latest developments in point cloud video streaming, focusing on user-centric, AI-driven, and low-latency techniques that go beyond traditional video streaming methods. We investigate a variety of techniques, pinpoint obstacles, and chart prospective routes for elevating both the quality and depth of immersive experiences. Furthermore, our work includes a detailed qualitative comparison of these streaming paradigms. We also shed light on pivotal metaverse enablers, such as Mobile Edge Computing (MEC), Visible Light Communication (VLC), advanced network coding techniques, and the integration of the Data Plane Development Kit (DPDK). These enablers, particularly when combined with DPDK's capabilities in efficient data packet processing, have the potential to optimize resource allocation, strengthen reliability, reduce latency, and improve the sustainability of point cloud video streaming. Our study therefore offers a holistic overview of the advancing area of point cloud video streaming and outlines paths for future research and development.
KW - Metaverse
KW - Point cloud
KW - Sustainability
KW - Video streaming
UR - http://www.scopus.com/inward/record.url?scp=85209250368&partnerID=8YFLogxK
U2 - 10.1016/j.icte.2024.11.001
DO - 10.1016/j.icte.2024.11.001
M3 - Review article
AN - SCOPUS:85209250368
SN - 2405-9595
VL - 11
SP - 93
EP - 104
JO - ICT Express
JF - ICT Express
IS - 1
ER -