GStreamer xvimage
Source: http://jetsonhacks.com/2014/10/
The NVIDIA Jetson TK1 uses GStreamer as its official multimedia interface. In previous entries we've installed two webcams, a Microsoft LifeCam Studio and a Logitech c920. There are several longer-range goals for the webcams, but first up is to show them on the screen. There's a video of what they look like over on the original post.
I’d like to tell you that it’s a straightforward process. Gstreamer tends to be a sprawling, “I can do everything” type of program with plugins galore. The architecture is that of a pipeline, starting at a “source” and ending at a “sink”.
To me, the nomenclature they use does not match anything in the naming conventions that computer programmers have used for the last 60 years. In the world of computers that I dwell in, you have "input" and "output". I can understand the concept of "source" and "sink". However, the idea that there are components in the pipeline that act as both a "sink" and a "source" is just strange. I translate it in my head as: a "source" is the output side, and a "sink" is the input side.
With that said, the Jetson is a little confusing on which version of GStreamer to use. In the initial release, the documentation indicated GStreamer 0.10; when L4T 19.3 was released, the release notes stated that GStreamer 1.0 was now supported. It's not quite clear what that means in either context, or what the full extent of "supported" is.
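If you're not sure which generation is actually installed on your system, each one ships its own inspection tool, so checking takes a second (both are standard GStreamer utilities, nothing Jetson-specific):

gst-inspect-1.0 --version
gst-inspect-0.10 --version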
So after a lot of trials, here are some of the first results. Both the Logitech and the Microsoft webcams deliver a variety of video resolutions and encodings. The higher resolutions come in a compressed format, MJPEG on both webcams; in addition, the Logitech also delivers H.264.
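As an aside, if you want to see exactly which formats, resolutions, and frame rates a given webcam offers, the v4l2-ctl tool from the v4l-utils package (my addition, not part of the original post) will enumerate them:

sudo apt-get install v4l-utils
v4l2-ctl --device=/dev/video0 --list-formats-ext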
I’ll use Gstreamer 1.0 for this example. It is straightforward to show one webcam using Gstreamer:
gst-launch-1.0 -v v4l2src device=/dev/video0 \
! image/jpeg, width=1920, height=1080, framerate=30/1 \
! jpegparse ! jpegdec \
! videoconvert ! videoscale \
! xvimagesink sync=false
Here’s a shell file, gstreamerPreviewWebcam.sh which will allow you to run this example.
A simplified explanation:
Basically, the source is Video 4 Linux version 2 (v4l2src) using video device 0. We ask the camera for MJPEG (image/jpeg) at 1920×1080 resolution and a frame rate of 30 frames per second; GStreamer calls this a capability. Because the video is encoded as MJPEG (a compressed image stream), you have to convert it to something that the display will understand. The display is represented by xvimagesink in this example.
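You can also check an element's capabilities without running a pipeline: gst-inspect-1.0 prints each element's pad templates, which is a quick way to confirm, for example, that v4l2src can produce image/jpeg:

gst-inspect-1.0 v4l2src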
The ! (exclamation point, some people call it a bang) acts as a separator between different pipeline elements.
The first thing to do is to take the video stream coming from the webcam and decode it into raw pixel data. jpegparse makes sure that the JPEG decoder (jpegdec) understands the flavor of MJPEG that the camera is speaking. jpegdec then decodes it into raw, uncompressed video frames.
videoconvert makes sure that the raw pixel format is suitable for xvimagesink; it may observe that the stream is already in the correct format and just pass it through. In this instance, I added videoscale so that the video will resize when the window is resized.
xvimagesink represents the window on the display. sync=false tells GStreamer that it's OK to drop frames if it gets behind. For a live display that's probably a good policy, since you don't want to fall behind. For storing video streams, not so much.
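As a contrast, here's a sketch (my addition, not from the original post) of storing the stream instead of displaying it. Since the camera already delivers JPEG frames, they can be muxed straight into an AVI file without re-encoding. Note there's no sync=false here, because when recording you generally want every frame; the -e flag makes gst-launch send an end-of-stream on Ctrl-C so the file gets finalized properly:

gst-launch-1.0 -e v4l2src device=/dev/video0 \
! image/jpeg, width=1920, height=1080, framerate=30/1 \
! jpegparse \
! avimux \
! filesink location=webcam.avi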
It wasn’t horrible, horrible. So I figured two webcams wouldn’t be difficult either. I figured wrong, but it’s because the core issues were tweeners between the Cameras, USB, and Gstreamer.
First, here’s the shell file for two webcams, sideBySide.sh.
There is only one trick that you need to know. There is a GStreamer pipeline element called a tee, which allows for parallel execution routes. So to start up two webcams, create a tee and send one of the webcams through it. Mark the end of the path with the name of the tee, in this case splitter, followed by a period, then describe the next step in the pipeline; there's a sketch of the syntax below. A tee may feed more than two elements. Having worked around physical pipelines for a long time, my brain made sure that it let me know how much I was making it hurt with terrible metaphors.
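The linked sideBySide.sh isn't reproduced in this copy of the post, so take this as a sketch of the tee syntax being described, not the original script. It duplicates a single webcam into two windows, with each branch behind its own queue so that a slow branch can't stall the other:

#!/bin/sh
# Sketch: send one webcam through a tee named "splitter" and show
# both branches in separate X video windows. "splitter." refers back
# to the tee so a second branch can be attached to it.
gst-launch-1.0 -v v4l2src device=/dev/video0 \
! image/jpeg, width=1280, height=720, framerate=30/1 \
! jpegparse ! jpegdec ! videoconvert \
! tee name=splitter \
! queue ! videoscale ! xvimagesink sync=false \
splitter. \
! queue ! videoscale ! xvimagesink sync=false

For two physically separate webcams, you can also simply describe two independent chains in a single gst-launch-1.0 invocation, one starting at v4l2src device=/dev/video0 and the other at v4l2src device=/dev/video1.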
In any case, you can wire this all up and still not know the secret ingredient to getting higher-resolution webcams running on USB. It turns out that these webcams reserve the amount of USB bandwidth that their maximum resolution will generate. You can do the math, but I'll just let you know that I couldn't put these two 1920×1080 30fps MJPEG USB 2.0 webcams on one USB 3.0 device. However, you can run them both at 1280×720 (720p).
If you try to run both at 1920×1080 you'll get a "cannot allocate memory" error in the terminal. Of course, you'll think that has something to do with the program (it doesn't); it's simply that GStreamer thinks all of the bandwidth of the USB device has been reserved.
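If you want to see how much periodic bandwidth the kernel has actually reserved, one place to look (an aside of mine, assuming debugfs is mounted, which most stock kernels do by default) is the USB devices file; the B: lines report the per-bus allocation alongside the T: topology lines:

sudo grep -E "^(T:|B:)" /sys/kernel/debug/usb/devices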
Anyway, at least we got a good demo out of it, and hopefully you’ll find some use for it too!