I want to use GStreamer (gst-launch-1.0) to stream a video signal from a camera connected to a Raspberry Pi (CM4) to a remote client over UDP. The GStreamer pipeline I use always reverts to the uncompressed YUYV pixel format, even after I set the format to MJPG with v4l2-ctl.
This is my pipeline:
pi@cm4:~ $ v4l2-ctl -d /dev/video0 --set-fmt-video=width=1920,height=1080,pixelformat=MJPG
pi@cm4:~ $ gst-launch-1.0 -v v4l2src device=/dev/video0 ! "video/x-raw, width=1920, height=1080, pixelformat=MJPG" ! rndbuffersize max=65000 ! udpsink host=127.0.0.1 port=1234
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw, width=(int)1920, height=(int)1080, pixelformat=(string)MJPG, format=(string)YUY2, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)5/1, colorimetry=(string)2:4:5:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw, width=(int)1920, height=(int)1080, pixelformat=(string)MJPG, format=(string)YUY2, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)5/1, colorimetry=(string)2:4:5:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstRndBufferSize:rndbuffersize0.GstPad:src: caps = video/x-raw, width=(int)1920, height=(int)1080, pixelformat=(string)MJPG, format=(string)YUY2, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)5/1, colorimetry=(string)2:4:5:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = video/x-raw, width=(int)1920, height=(int)1080, pixelformat=(string)MJPG, format=(string)YUY2, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)5/1, colorimetry=(string)2:4:5:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstRndBufferSize:rndbuffersize0.GstPad:sink: caps = video/x-raw, width=(int)1920, height=(int)1080, pixelformat=(string)MJPG, format=(string)YUY2, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)5/1, colorimetry=(string)2:4:5:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw, width=(int)1920, height=(int)1080, pixelformat=(string)MJPG, format=(string)YUY2, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)5/1, colorimetry=(string)2:4:5:1, interlace-mode=(string)progressive
Even though the pipeline seems to accept "pixelformat=(string)MJPG", the negotiated format is YUY2 and the maximum framerate is 5 fps. If I set the framerate to anything higher than 5/1, the pipeline fails with:
** (gst-launch-1.0:16205): CRITICAL **: 21:36:05.076: gst_adapter_take_buffer: assertion 'GST_IS_ADAPTER (adapter)' failed
** (gst-launch-1.0:16205): CRITICAL **: 21:36:05.076: gst_adapter_available: assertion 'GST_IS_ADAPTER (adapter)' failed
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
After the GStreamer pipeline runs, v4l2-ctl confirms that the video format has reverted to YUYV.
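For reference, this is roughly how I check the format before and after (same device path as above; adjust for your setup):

```shell
# Set MJPG, confirm it took, then check again after running the pipeline.
v4l2-ctl -d /dev/video0 --set-fmt-video=width=1920,height=1080,pixelformat=MJPG
v4l2-ctl -d /dev/video0 --get-fmt-video   # reports MJPG at this point
# ... run the gst-launch-1.0 pipeline above, then:
v4l2-ctl -d /dev/video0 --get-fmt-video   # reports YUYV again
```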
How can I force the gstreamer pipeline to use MJPG 1920x1080 and enable higher frame rates?
The camera is a Canon 5D Mark IV that produces a clean HDMI output up to Full HD at 60 fps. The camera's HDMI output is connected to an HDMI-to-USB video capture device (Mirabox) that supports 1920x1080 at 60 fps. The capture device is connected to the CM4 via a USB3-PCIe adapter.
This is the list of supported formats:
pi@cm4:~ $ v4l2-ctl -d 0 --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
Type: Video Capture
[0]: 'MJPG' (Motion-JPEG, compressed)
Size: Discrete 1920x1080
Interval: Discrete 0.017s (60.000 fps)
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.040s (25.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.100s (10.000 fps)
[..... deleted lower resolution formats...]
[1]: 'YUYV' (YUYV 4:2:2)
Size: Discrete 1920x1080
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 1600x1200
Interval: Discrete 0.200s (5.000 fps)
[..... deleted lower resolution formats...]
Setting a pixel format is actually the wrong approach here. MJPEG is compressed data, not a pixel format for "raw" video, so it can never appear under video/x-raw caps. Also, pixelformat is not a caps field GStreamer recognizes (the raw-video field is format), so v4l2src simply negotiated raw YUY2 and carried your pixelformat=MJPG along as an ignored extra field. To get compressed Motion-JPEG from the device, request image/jpeg caps instead.
Try
v4l2src device=/dev/video0 ! image/jpeg, width=1920, height=1080, framerate=30/1 ! ..
Note that the camera will return JPEG image data, so you will need a JPEG decoder (e.g. jpegdec) if you want to display the image.
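For the UDP streaming part, a common approach is to payload the JPEG frames as RTP rather than pushing arbitrarily sliced buffers into udpsink. A sketch, assuming the rtpjpegpay/rtpjpegdepay elements from gst-plugins-good are available on both ends (the host address and port are placeholders):

```shell
# Sender (on the CM4): request compressed MJPEG frames and packetize as RTP/JPEG.
gst-launch-1.0 -v v4l2src device=/dev/video0 \
  ! "image/jpeg, width=1920, height=1080, framerate=30/1" \
  ! rtpjpegpay ! udpsink host=192.168.1.100 port=5000

# Receiver (on the remote client): depayload, decode the JPEG, and display.
gst-launch-1.0 udpsrc port=5000 \
  ! "application/x-rtp, encoding-name=JPEG, payload=26" \
  ! rtpjpegdepay ! jpegdec ! autovideosink
```

This also removes the need for rndbuffersize: rtpjpegpay fragments each JPEG frame into MTU-sized RTP packets on its own.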