13.17 – MPEG-I


Coded Representation of Immersive Media (MPEG-I) is the current MPEG effort to develop a suite of standards supporting immersive media products, services and applications.

MPEG-I currently comprises 11 parts, and more are being added.

  1. Part 1 – Immersive Media Architectures outlines possible architectures for immersive media services.
  2. Part 2 – Omnidirectional MediA Format specifies an application format that enables consumption of omnidirectional video, also known as Video 360. Version 2 is under development.
  3. Part 3 – Immersive Video Coding will specify the emerging Versatile Video Coding standard.
  4. Part 4 – Immersive Audio Coding will specify metadata to enable enhanced immersive audio experiences beyond what is possible today with MPEG-H 3D Audio.
  5. Part 5 – Video-based Point Cloud Compression will specify a standard to compress dense static and dynamic point clouds (see the point cloud sketch after this list).
  6. Part 6 – Immersive Media Metrics will specify parameters useful for immersive media services and how they can be measured.
  7. Part 7 – Immersive Media Metadata will specify systems, video and audio metadata for immersive experiences. One example is the current 3DoF+ Video activity.
  8. Part 8 – Network-Based Media Processing will specify APIs to access remote media processing services.
  9. Part 9 – Geometry-based Point Cloud Compression will specify a standard to compress sparse static and dynamic point clouds.
  10. Part 10 – Carriage of Point Cloud Data will specify how to accommodate compressed point clouds in the MP4 File Format.
  11. Part 11 – Implementation Guidelines for Network-based Media Processing is the usual collection of guidelines.
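
Since Parts 5 and 9 both address point cloud compression, it may help to picture the uncompressed data they operate on. The sketch below is a minimal, illustrative Python model, not anything defined by MPEG-I: a point cloud frame is a set of XYZ points with per-point colour attributes, and a dynamic point cloud is a timed sequence of such frames. Dense clouds of this kind (e.g. captured people or objects) are the target of Video-based Point Cloud Compression, while sparser clouds such as LiDAR or mapping data are the target of Geometry-based Point Cloud Compression.

```python
# Illustrative model of point cloud data (an assumption for explanation only,
# not a structure specified by MPEG-I Parts 5 or 9).

from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class Point:
    # Geometry: 3D position of the point
    x: float
    y: float
    z: float
    # Attribute: per-point colour (RGB)
    rgb: Tuple[int, int, int] = (255, 255, 255)


@dataclass
class PointCloudFrame:
    """One frame of a (possibly dynamic) point cloud."""
    timestamp_ms: int
    points: List[Point] = field(default_factory=list)


def make_demo_sequence(num_frames: int = 3, points_per_frame: int = 4) -> List[PointCloudFrame]:
    """Build a tiny dynamic point cloud: a timed sequence of frames."""
    frames = []
    for t in range(num_frames):
        frame = PointCloudFrame(timestamp_ms=t * 33)  # roughly 30 frames per second
        for i in range(points_per_frame):
            frame.points.append(Point(x=float(i), y=float(t), z=0.0, rgb=(i * 60 % 256, 0, 0)))
        frames.append(frame)
    return frames


if __name__ == "__main__":
    seq = make_demo_sequence()
    print(f"{len(seq)} frames, {len(seq[0].points)} points in the first frame")
```

A real capture may contain millions of points per frame, which is why dedicated compression (video-based for dense content, geometry-based for sparse content) is needed before such data can be stored or streamed.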
