Thursday, October 8, 2009

texture streaming

IMG has an interesting texture streaming EGL extension for their SGX graphics core, which is supported on TI OMAP3 devices. It provides a way to stream YUV (or RGB) video to a 3d surface without incurring the expense of a texture upload or an extra memory copy, and without the need for colorspace conversion on the CPU. This opens up interesting possibilities, such as decoding video directly to a 3d surface.
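
To give a feel for the GL side, here is a minimal sketch of binding one of the streamed buffers to the streaming sampler, assuming the GL_IMG_texture_stream2 entry points (glTexBindStreamIMG, plus the samplerStreamIMG sampler in the fragment shader); double-check the exact names and tokens against the extension spec and SDK headers:

/*
 * Sketch: make buffer 'buf_idx' of buffer-class device 'device' the
 * current frame for the streaming sampler.  No glTexImage2D upload,
 * no memcpy, no CPU colorspace conversion.
 */
#include <EGL/egl.h>
#include <GLES2/gl2.h>

typedef void (GL_APIENTRY *texbindstream_fn)(GLint device, GLint deviceoffset);

static void bind_stream_buffer(GLint device, GLint buf_idx)
{
    /* the extension entry point is resolved at runtime */
    texbindstream_fn tex_bind_stream =
        (texbindstream_fn) eglGetProcAddress("glTexBindStreamIMG");

    if (tex_bind_stream)
        tex_bind_stream(device, buf_idx);
}

/*
 * Fragment shader side, for reference:
 *   #extension GL_IMG_texture_stream2 : enable
 *   uniform samplerStreamIMG sTexture;
 *   ...
 *   gl_FragColor = textureStreamIMG(sTexture, vTexCoord);
 */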

The texture streaming extension uses a kernel API to tell the SGX hardware when a new frame of video is ready to render. The catalog group at TI created a 'bc_cat' Linux kernel module that lets userspace allocate buffers for use in texture streaming.
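
For illustration, this is roughly what the userspace side of buffer allocation looks like; the ioctl names and the bc_buf_params_t layout below follow the bc_cat.h header as I remember it, so treat it as a sketch rather than a reference:

/* Sketch: ask bc_cat for a set of buffers that the decoder can write into
 * and the SGX can texture from.  Ioctl names, struct fields and pixel
 * format constants are assumed from bc_cat.h. */
#include <fcntl.h>
#include <stdio.h>
#include <sys/ioctl.h>
#include <unistd.h>

#include "bc_cat.h"

int main(void)
{
    int fd = open("/dev/bccat0", O_RDWR);
    if (fd < 0) {
        perror("open /dev/bccat0");
        return 1;
    }

    bc_buf_params_t params = {
        .count  = 4,                 /* number of buffers to allocate */
        .width  = 640,
        .height = 480,
        .fourcc = BC_PIX_FMT_UYVY,   /* assumed constant name */
        .type   = BC_MEMORY_MMAP,
    };
    if (ioctl(fd, BCIOREQ_BUFFERS, &params) < 0) {
        perror("BCIOREQ_BUFFERS");
        return 1;
    }

    /* query the physical address of buffer 0, so the decoder/DSP can write
     * directly into memory the SGX will texture from */
    BCIO_package buf = { .input = 0 };   /* struct name/fields assumed */
    if (ioctl(fd, BCIOGET_BUFFERPHYADDR, &buf) < 0) {
        perror("BCIOGET_BUFFERPHYADDR");
        return 1;
    }
    printf("buffer 0 phys addr: 0x%lx\n", (unsigned long) buf.output);

    close(fd);
    return 0;
}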

I've taken this driver, made a few tweaks, and created a GStreamer video sink plugin (see gst-plugin-bc, which includes the 'bc_cat' kernel module) that allows texture streaming to be used in a GStreamer pipeline. This lets our accelerated codecs decode video directly into the buffers used by the SGX core. The hw accelerators/DSP and the SGX core do all the work:

gst-launch filesrc location=/mnt/mmc/iron_man-tlr2_h640w.mpg ! avidemux name=d d.video_00 ! queue ! omx_mpeg4dec ! queue ! bcsink d.audio_00 ! queue ! omx_mp3dec ! alsasink
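
If you'd rather drive this from code than from gst-launch, the same kind of pipeline can be built with gst_parse_launch() (GStreamer 0.10 API); this sketch is trimmed to the video branch only, and the file path is just a placeholder:

/* Build and run a filesrc -> avidemux -> omx_mpeg4dec -> bcsink pipeline
 * programmatically.  Requires the OMX decoder plugins and gst-plugin-bc
 * to be in the GStreamer plugin path. */
#include <gst/gst.h>

int main(int argc, char *argv[])
{
    gst_init(&argc, &argv);

    GError *err = NULL;
    GstElement *pipeline = gst_parse_launch(
            "filesrc location=/mnt/mmc/video.avi ! avidemux ! queue ! "
            "omx_mpeg4dec ! queue ! bcsink", &err);
    if (!pipeline) {
        g_printerr("parse error: %s\n", err ? err->message : "unknown");
        return 1;
    }

    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    /* block until end-of-stream or an error */
    GstBus *bus = gst_element_get_bus(pipeline);
    GstMessage *msg = gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE,
            GST_MESSAGE_EOS | GST_MESSAGE_ERROR);
    if (msg)
        gst_message_unref(msg);
    gst_object_unref(bus);

    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
    return 0;
}

Build it against gstreamer-0.10 with pkg-config (pkg-config --cflags --libs gstreamer-0.10).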



Here is another example, this time using the camera to capture to a 3d surface:


gst-launch v4l2src ! "video/x-raw-yuv,width=640,height=480" ! bcsink sync=false




(sorry about the low-quality video and the glare, which make it a bit hard to see what's on the screen)

Touchbook!

I got my Touchbook last night! I haven't had much time to play with it yet, but it's a very cool little device.