Description
WebCodecs deals with a few kinds of buffers: encoded packets, video frames, and audio buffers. Video frames are at least an order of magnitude larger than packets and audio buffers, so I focus on them below.
Video frames are typically backed by GPU memory, using the native graphics buffer primitive (e.g. DMAbuf, IOSurface, etc.). These buffers can be mapped for reading but usually must be locked during access. Video frames may also be stored in CPU memory.
We assume that video frames are immutable due to lifetime and security restrictions imposed by the underlying codec implementations.
For each case below I list the most obvious solution as an example, but each of them has known technical problems.
WebCodecs <-> JS
We would like JS to be able to read planar image data from video frames, and to be able to construct new video frames from planar data. We can't use ArrayBuffer for this as ArrayBuffers are always mutable.
The current WebCodecs proposal provides a readInto() method that copies data from a video frame into a destination ArrayBuffer, and constructs new video frames by copying from a source ArrayBuffer.
The most obvious solution here is read-only ArrayBuffers.
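As a rough sketch of the copy-based flow described above (the exact shapes of readInto() and allocationSize() are assumptions here, not settled API):

```ts
// Hypothetical interface for the copy-based flow; declared here only so the
// sketch type-checks. The real proposal's shapes may differ.
interface FrameLike {
  readonly codedWidth: number;
  readonly codedHeight: number;
  allocationSize(): number;                    // bytes needed for all planes
  readInto(dest: BufferSource): Promise<void>; // copies planar data out
}

// Copy a frame's planar data into JS-visible memory. Because ArrayBuffers are
// always mutable, this must be a copy; a read-only ArrayBuffer could instead
// expose the frame's existing backing memory directly.
async function readPixels(frame: FrameLike): Promise<Uint8Array> {
  const dest = new Uint8Array(frame.allocationSize());
  await frame.readInto(dest);
  return dest;
}
```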
WebCodecs <-> WASM
The WASM situation is the same as JS, except that ArrayBuffers are not available (other than the one backing WASM linear memory).
The most obvious solution here is a memory mapping facility for WASM.
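Today the fallback is to copy through a view over WASM linear memory; a sketch, reusing the hypothetical FrameLike/readInto() shape from above and assuming the module exports a malloc-style allocator:

```ts
// Sketch of the copy path into WASM linear memory. A mapping facility would
// let WASM read the frame's pixels without this copy.
async function copyFrameIntoWasm(
  frame: FrameLike,
  memory: WebAssembly.Memory,
  malloc: (size: number) => number, // assumed export from the WASM module
): Promise<number> {
  const size = frame.allocationSize();
  const ptr = malloc(size);

  // A view over linear memory is an ordinary ArrayBufferView, so the
  // copy-based API can target it directly -- but it is still a copy.
  await frame.readInto(new Uint8Array(memory.buffer, ptr, size));
  return ptr; // offset the WASM side can use to read the pixels
}
```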
WebCodecs <-> <canvas>, WebGL
The web-native interop format for <canvas> and WebGL is ImageBitmap, and we currently believe that we can offer zero-copy ImageBitmap in most cases.
Using the "imagebitmap" context, it should be possible to render an ImageBitmap also with no additional copies (via transferFromImageBitmap()
). In WebGL however there is no way to make use of an ImageBitmap without copying it (via TexImage2D()
).
It is also possible to upload and read back data using pixel buffer objects (pack/unpack buffers), which may be efficient enough for video frames that are backed by CPU memory.
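A WebGL2 sketch of that staging approach, with upload through a PIXEL_UNPACK_BUFFER and readback through a PIXEL_PACK_BUFFER (RGBA8 layout is assumed for illustration):

```ts
function uploadViaPbo(gl: WebGL2RenderingContext, bytes: Uint8Array, width: number, height: number): WebGLTexture {
  const pbo = gl.createBuffer()!;
  gl.bindBuffer(gl.PIXEL_UNPACK_BUFFER, pbo);
  gl.bufferData(gl.PIXEL_UNPACK_BUFFER, bytes, gl.STREAM_DRAW);

  const tex = gl.createTexture()!;
  gl.bindTexture(gl.TEXTURE_2D, tex);
  // With a PIXEL_UNPACK_BUFFER bound, the last argument is a byte offset into it.
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0, gl.RGBA, gl.UNSIGNED_BYTE, 0);
  gl.bindBuffer(gl.PIXEL_UNPACK_BUFFER, null);
  return tex;
}

function readbackViaPbo(gl: WebGL2RenderingContext, width: number, height: number, dest: Uint8Array) {
  const pbo = gl.createBuffer()!;
  gl.bindBuffer(gl.PIXEL_PACK_BUFFER, pbo);
  gl.bufferData(gl.PIXEL_PACK_BUFFER, width * height * 4, gl.STATIC_READ);
  // With a PIXEL_PACK_BUFFER bound, readPixels() writes into the buffer.
  gl.readPixels(0, 0, width, height, gl.RGBA, gl.UNSIGNED_BYTE, 0);
  // In real code, wait on a fenceSync() before pulling the bytes back to the CPU.
  gl.getBufferSubData(gl.PIXEL_PACK_BUFFER, 0, dest);
  gl.bindBuffer(gl.PIXEL_PACK_BUFFER, null);
}
```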
The most obvious solution for WebGL is a way to bind video frames to textures, as in the WEBGL_video_texture extension.
WebCodecs <-> WebGPU
WebGPU has a buffer concept that maps onto platform graphics buffers. We are hopeful that it will be possible to interoperate at that level, but there are significant open questions and a few problematic corner cases.
Some discussion here: gpuweb/gpuweb#625, gpuweb/gpuweb#700, gpuweb/gpuweb#1154.
A potential solution here could be an enhanced ImageBitmap-like object that WebCodecs can export and WebGPU can import.
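A purely illustrative sketch of what that could look like from the application's point of view; every name here (SharedImage, exportSharedImage, importSharedImage) is hypothetical, not proposed API, and the WebGPU types (GPUDevice, GPUTexture) are assumed to be available:

```ts
// The exported object would carry enough metadata (size, pixel format, color
// space) plus a handle to the underlying platform graphics buffer for WebGPU
// to wrap it without copying.
interface SharedImage {
  readonly width: number;
  readonly height: number;
  readonly format: string; // pixel format of the underlying buffer
}

// Hypothetical export from a WebCodecs frame and import into a WebGPU device.
declare function exportSharedImage(frame: unknown): SharedImage;
declare function importSharedImage(device: GPUDevice, image: SharedImage): GPUTexture;

function frameToTexture(device: GPUDevice, frame: unknown): GPUTexture {
  // Ideally both steps are zero-copy references to the same graphics buffer,
  // with the frame's lifetime and locking rules enforced by the browser.
  return importSharedImage(device, exportSharedImage(frame));
}
```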