This page documents MPEG Z/Alpha, an MPEG2 video file extension for the storage of depth and transparency information.

Presentation, Examination talk

MSc thesis

For an overview of the format's inner workings, please read Gernot Ziegler's diploma thesis on this subject.


These movies, created with the TexMPEG library, show possible applications of the file format.

Video grab of realtime MPEG Z/Alpha viewer

Video grab of realtime MPEG Z/Alpha viewer (shows standard OpenGL rendering limitations for depth maps)

Relief texture merger of several images with depth maps (movie shows only merging of still RPF images)

Overlaying of/Intersection with realtime graphics (movie shows only still images)

Background music by Lackluster (Can'O'lard: Jugglers and Can'o'Lard: Tardy) and Dennis Gustavsson aka Void Main (Artificial Intelligence). Thank you for the permission!

If you have any questions about MPEG Z/Alpha, feel free to mail to

Recently, there has been increased interest in the use of streaming video textures in 3D applications. Streaming video textures are textures that continuously change, mostly driven by media data stored in a common video format.

The demo movies below demonstrate how the media component framework GStreamer and OpenGL 1.5 for Nvidia cards can be used to create surprising real-time video effects in Linux.

Figure 1. Video rendered as 3D pixels, each pixel is offset from its 2D video position, based on its luminance value.
Video footage from Singularity by Satori and Lackluster.
Conceptual idea based on Justin Manor's Key Grip system. See also his diploma thesis on Realtime Video Performance.

Technical background

GStreamer provides the YUV planes of each video frame, which are uploaded as three luminance-only textures (using Pixel Buffer Objects, PBOs).
Instead of drawing a simple quad with these textures, an array of point primitives is generated, one for each video pixel.
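The point layout can be sketched as follows. This is a minimal illustration in Python (the function name and plain-list representation are for exposition only; in the actual player the array would be generated once and uploaded into GPU buffer memory):

```python
def make_point_grid(width, height):
    """Generate one 2D point per video pixel, in pixel coordinates.

    In the real application this array would live in a Vertex Buffer
    Object; here it is a plain Python list for illustration.
    """
    return [(x, y) for y in range(height) for x in range(width)]

# A PAL frame (720x576) yields one point primitive per pixel:
points = make_point_grid(720, 576)
assert len(points) == 720 * 576
```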

A Vertex Buffer Object (VBO), bound to the same memory area as the PBO, holds the original 2D positions of the points, and grants access to the Y (luminance) values.

A vertex shader reads the point array and displaces each point according to the provided Y value. The subsequent fragment shader reconstructs the RGB pixel colors from the three luminance textures, using the well-known YUV-to-RGB conversion formula.
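Both shader steps can be mirrored in scalar form. The sketch below uses the standard BT.601 YCbCr-to-RGB coefficients (full-range, for simplicity); the displacement scale factor is a hypothetical tuning constant, not taken from the original player:

```python
def displace_point(x, y, luma, scale=0.25):
    """Vertex-shader step: offset a point from its 2D video position
    along z, proportional to its luminance value."""
    return (x, y, luma * scale)

def yuv_to_rgb(y, u, v):
    """Fragment-shader step: reconstruct one RGB pixel from its
    Y, U, V samples (BT.601 coefficients, full range)."""
    d = u - 128
    e = v - 128
    r = y + 1.402 * e
    g = y - 0.344136 * d - 0.714136 * e
    b = y + 1.772 * d

    def clamp(c):
        return max(0, min(255, round(c)))

    return clamp(r), clamp(g), clamp(b)

# A neutral grey sample (U = V = 128) maps to equal RGB channels:
assert yuv_to_rgb(128, 128, 128) == (128, 128, 128)
```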

It is important to note that this video effect is only one of many achievable with this technique.
The software provides PAL-resolution (720x576) video textures at full frame rate (25 Hz) on a reasonably modern PC. Once the texture is available in the 3D environment, only GPU resources and the programmer's OpenGL skills limit what can be done.

The source code is available on request.

Comments and ideas are very welcome!

TexMPEG is an advanced MPEG2 software decoder for IRIX and Linux systems that can utilize multiple CPUs and play several MPEG files simultaneously (see picture).

A simple OpenGL demo, 150 lines

It fits perfectly as an MPEG2 player for all applications where it is necessary to retain control over the video output, such as: