Stanford EE274: Data Compression I 2023 I Lecture 18 - Video Compression

Video Compression

  • Video compression converts raw video into a compact bitstream, reducing file sizes for storage and streaming.
  • It is computationally intensive, which has driven specialized hardware support (e.g., the media engines in Apple's M2 chip).
  • Key video parameters include frame size (resolution) and frame rate (frames per second, FPS).
  • Motion vectors describe how blocks of pixels move between frames.
  • Block matching algorithms search a reference frame for the block that best matches a given block in the current frame (see the sketch after this list).
  • Hierarchical (coarse-to-fine) search makes block matching more efficient by first searching downsampled frames.
  • The residual frame is the difference between the current frame and the motion-compensated prediction.
  • The residual frame is encoded using image compression techniques, possibly with different hyperparameters (e.g., quantization settings) than those used for full frames.
  • B-frames (bidirectionally predicted frames) improve compression by predicting from both past and future reference frames, complementing I-frames (independently coded frames) and P-frames (predicted from past frames).
  • All-I-frame encoding is useful in video editing software, since individual frames can be decoded and edited without decoding the rest of the video.
  • Traditional video coding methods like H.264 and H.265 use block-based motion estimation and compensation, which can result in blocky artifacts, especially at lower bit rates.
  • Learned (neural) video compression can achieve smoother motion and fewer artifacts than traditional video codecs.
  • Machine-learning-based video codecs have the potential to outperform traditional codecs, but they are computationally expensive and may not yet be suitable for real-time applications.
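
Below is a minimal sketch of exhaustive block matching and residual computation, assuming 8-bit grayscale frames stored as NumPy arrays; the block size, search range, and SAD cost are illustrative choices, not any particular codec's settings.

```python
import numpy as np

def best_match(ref, cur, top, left, block=16, search=8):
    """Exhaustive block matching: find the motion vector (dy, dx)
    minimizing the sum of absolute differences (SAD) between the
    current block and a candidate block in the reference frame."""
    h, w = ref.shape
    cur_block = cur[top:top + block, left:left + block].astype(np.int32)
    best_mv, best_sad = (0, 0), np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + block > h or x + block > w:
                continue  # candidate falls outside the reference frame
            cand = ref[y:y + block, x:x + block].astype(np.int32)
            sad = int(np.abs(cur_block - cand).sum())
            if sad < best_sad:
                best_sad, best_mv = sad, (dy, dx)
    return best_mv

def motion_compensate(ref, cur, block=16, search=8):
    """Build the motion-compensated prediction and the residual frame;
    the residual is what gets coded with image-style techniques."""
    pred = np.zeros_like(cur)
    for top in range(0, cur.shape[0] - block + 1, block):
        for left in range(0, cur.shape[1] - block + 1, block):
            dy, dx = best_match(ref, cur, top, left, block, search)
            pred[top:top + block, left:left + block] = \
                ref[top + dy:top + dy + block, left + dx:left + dx + block]
    residual = cur.astype(np.int32) - pred.astype(np.int32)
    return pred, residual

# Toy usage: a frame shifted by a few pixels should be predicted well.
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, (64, 64), dtype=np.uint8)
cur = np.roll(ref, (2, -3), axis=(0, 1))
pred, residual = motion_compensate(ref, cur)
```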

Lossless Compression

  • The course covered fundamental concepts such as entropy, prefix-free codes, and lossless compression techniques.
  • Entropy gives the theoretical limit for lossless compression, and understanding it guides practical compression scenarios (see the sketch after this list).
  • Non-IID (real-world) data was explored next, including entropy rate, conditional entropy, and the relationship between good predictors and good compressors.
  • Advanced predictors such as context tree weighting (CTW) and prediction by partial matching (PPM) were discussed, along with the use of language models as powerful predictors for compression.
  • Universal compressors asymptotically achieve the entropy rate of any stationary ergodic source.
  • Tips for using lossless compression in practice:
    • Evaluate the data and understand the standard compression methods.
    • Use existing tools and standard compressors.
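
To make the entropy limit concrete, here is a toy sketch (not course code) that computes the order-0 empirical entropy of a byte string; an ideal entropy coder driven by the empirical distribution spends about -log2 p(symbol) bits per symbol, i.e., roughly n * H bits in total.

```python
import math
from collections import Counter

def empirical_entropy_bits(data: bytes) -> float:
    """Order-0 empirical entropy in bits/symbol: H = -sum p * log2(p)."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

data = b"abracadabra" * 100
h = empirical_entropy_bits(data)
# An ideal coder spends about n * H bits, versus 8 bits/byte uncompressed.
print(f"H = {h:.3f} bits/symbol -> ~{len(data) * h / 8:.0f} bytes "
      f"vs {len(data)} bytes raw")
```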

Lossy Compression

  • Scalar quantization (see the sketch after this list), rate-distortion theory, vector quantization, and transform coding.
  • The theory of lossy compression is less directly applicable in practice than the theory of lossless compression.
  • Human perception plays a central role in the design of lossy compression algorithms.
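
A minimal sketch of uniform scalar quantization, the simplest lossy coding tool covered; the step size is an illustrative knob that trades rate (number of distinct levels) against distortion (MSE).

```python
import numpy as np

def quantize(x, step):
    """Map each sample to the integer index of its quantization bin."""
    return np.round(x / step).astype(int)

def dequantize(q, step):
    """Reconstruct each sample at the center of its bin."""
    return q * step

x = np.random.default_rng(0).standard_normal(10_000)  # toy Gaussian source
for step in (1.0, 0.5, 0.1):
    q = quantize(x, step)
    mse = np.mean((x - dequantize(q, step)) ** 2)
    # Smaller steps: more levels (higher rate), lower distortion;
    # for fine steps, MSE approaches step**2 / 12.
    print(f"step={step}: {len(np.unique(q))} levels, MSE={mse:.5f}")
```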

Image Compression

  • JPEG, DCT, and ML-based image compression.
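
To make the JPEG/DCT recap concrete, here is a toy sketch of transform coding on one 8x8 block: 2-D DCT, uniform quantization, and reconstruction. The flat quantization step is an illustrative stand-in for JPEG's perceptual quantization tables.

```python
import numpy as np
from scipy.fft import dctn, idctn

def transform_code_block(block, step=20.0):
    """DCT -> quantize -> dequantize -> inverse DCT on one 8x8 block."""
    coeffs = dctn(block - 128.0, norm="ortho")  # level shift + 2-D DCT-II
    q = np.round(coeffs / step)                 # most coefficients become 0
    recon = idctn(q * step, norm="ortho") + 128.0
    return q, recon

# A smooth toy block (gradient): the DCT compacts it into few coefficients.
block = np.add.outer(np.arange(8.0), np.arange(8.0)) * 10.0 + 50.0
q, recon = transform_code_block(block)
print("nonzero coefficients:", np.count_nonzero(q), "of 64")
print("max reconstruction error:", np.abs(block - recon).max())
```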

Video Compression

  • Residual coding, quantization, and lossless coding.

Other Topics in Information Theory

  • Distributed compression and error correction coding.
  • Succinct data structures for compressed data with random access.
  • Compression for hardware and neural networks.
  • Compression in specialized domains like AR/VR, genomics, and vision processing.

Stanford Compression Library

  • Easy to use and experiment with.
  • Helps in understanding arithmetic coding, ANS, and range coding (see the toy sketch after this list).
  • Resources on the website will be continuously updated.
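
For a taste of experimenting with entropy coders in this spirit, below is a self-contained toy rANS (range Asymmetric Numeral Systems) coder. It is explicitly not the Stanford Compression Library's API: just a minimal sketch that uses Python big integers in place of streaming renormalization.

```python
FREQS = {"a": 3, "b": 1}              # toy frequencies; must sum to M
M = sum(FREQS.values())
CUM, _total = {}, 0                   # cumulative frequency of each symbol
for _s, _f in FREQS.items():
    CUM[_s] = _total
    _total += _f

def encode(symbols):
    """Fold symbols into one integer state (encode in reverse so the
    decoder pops them back in forward order)."""
    x = 1
    for s in reversed(symbols):
        f, c = FREQS[s], CUM[s]
        x = (x // f) * M + c + (x % f)
    return x

def decode(x, n):
    """Recover n symbols by inverting the encoding step."""
    out = []
    for _ in range(n):
        slot = x % M
        s = next(t for t in FREQS if CUM[t] <= slot < CUM[t] + FREQS[t])
        out.append(s)
        x = FREQS[s] * (x // M) + slot - CUM[s]
    return "".join(out)

msg = "abaaabaa"
state = encode(msg)
assert decode(state, len(msg)) == msg
print(f"state uses {state.bit_length()} bits for {len(msg)} symbols")
```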

Other Stanford Classes

  • EE 276 (Information Theory): Covers more theory, especially for lossy compression.
  • EE 376 (Topics in Information Theory): Focuses on universal schemes and proofs that compressors achieve the entropy rate.
  • Music 422 (Perceptual Audio Coding): Explores audio coding, psychoacoustics, and human perception.
  • CS 236 (Deep Generative Models): Useful for understanding how to build good compressors through probabilistic and machine learning techniques.

Speaker's Journey in Compression

  • Started working on compression in 2016 in Tsachy Weissman's lab.
  • Will continue to work on compression-related research.
