Tags: udp, video-streaming, gstreamer, rtp, tello-drone

GStreamer UDP stream from Tello drone into RTP


I am trying to use GStreamer to stream video from a Tello drone into RTP, so that I can use it further with jetson-inference. The computer receiving the UDP packets is a Jetson Nano. The most successful command so far was

gst-launch-1.0 -v udpsrc port=11111 caps="video/x-h264, stream-format=(string)byte-stream, width=(int)960, height=(int)720, framerate=(fraction)24/1, skip-first-bytes=2" ! queue ! decodebin ! videoconvert ! autovideosink sync=false

When I run it in a shell, a big window opens and the video starts playing. Now I want to forward this video over RTP. I have tried various combinations of x264enc, rtph264depay and rtph264pay, but every time the pipeline breaks with an internal data stream error. Result of the last attempt:

gst-launch-1.0 -v udpsrc port=11111 caps="video/x-h264, stream-format=(string)byte-stream, width=(int)300, height=(int)300, framerate=(fraction)24/1, skip-first-bytes=2" ! queue ! decodebin ! videoconvert ! x264enc ! rtph264pay ! udpsink

output:

/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstH264Parse:h264parse0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, width=(int)300, height=(int)300, framerate=(fraction)24/1, skip-first-bytes=(int)2
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, width=(int)300, height=(int)300, framerate=(fraction)24/1, skip-first-bytes=(int)2
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstGhostPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, width=(int)300, height=(int)300, framerate=(fraction)24/1, skip-first-bytes=(int)2
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, width=(int)300, height=(int)300, framerate=(fraction)24/1, skip-first-bytes=(int)2, interlace-mode=(string)progressive, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true, alignment=(string)au, profile=(string)main, level=(string)4
Opening in BLOCKING MODE 
NvMMLiteOpen : Block : BlockType = 261 
NVMEDIA: Reading vendor.tegra.display-size : status: 6 
NvMMLiteBlockCreate : Block : BlockType = 261 
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/nvv4l2decoder:nvv4l2decoder0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, width=(int)300, height=(int)300, framerate=(fraction)24/1, skip-first-bytes=(int)2, interlace-mode=(string)progressive, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true, alignment=(string)au, profile=(string)main, level=(string)4
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, width=(int)300, height=(int)300, framerate=(fraction)24/1, skip-first-bytes=(int)2, interlace-mode=(string)progressive, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true, alignment=(string)au, profile=(string)main, level=(string)4
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, width=(int)300, height=(int)300, framerate=(fraction)24/1, skip-first-bytes=(int)2, interlace-mode=(string)progressive, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true, alignment=(string)au, profile=(string)main, level=(string)4
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/nvv4l2decoder:nvv4l2decoder0.GstPad:src: caps = video/x-raw(memory:NVMM), format=(string)NV12, width=(int)960, height=(int)720, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt709, framerate=(fraction)24/1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstDecodePad:src_0.GstProxyPad:proxypad1: caps = video/x-raw(memory:NVMM), format=(string)NV12, width=(int)960, height=(int)720, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt709, framerate=(fraction)24/1
WARNING: from element /GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstCapsFilter:capsfilter0: not negotiated
Additional debug info:
gstbasetransform.c(1415): gst_base_transform_reconfigure (): /GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstCapsFilter:capsfilter0:
not negotiated
WARNING: from element /GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstCapsFilter:capsfilter0: not negotiated
Additional debug info:
gstbasetransform.c(1415): gst_base_transform_reconfigure (): /GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstCapsFilter:capsfilter0:
not negotiated
WARNING: from element /GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstCapsFilter:capsfilter0: not negotiated
Additional debug info:
gstbasetransform.c(1415): gst_base_transform_reconfigure (): /GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstCapsFilter:capsfilter0:
not negotiated
ERROR: from element /GstPipeline:pipeline0/GstUDPSrc:udpsrc0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:
streaming stopped, reason not-negotiated (-4)
Execution ended after 0:00:00.971564245
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

If someone has already done this, please share the correct GStreamer pipeline. Thanks.


Solution

  • Your problem is that decodebin selects nvv4l2decoder, which outputs into NVMM memory. videoconvert cannot read from NVMM memory. You would use nvvidconv instead, which can read from NVMM and output into system memory.

    However, it is not necessary to decode the H.264 stream just to re-encode it into H.264. This simple pipeline should do the job:

    gst-launch-1.0 -v udpsrc port=11111 caps="video/x-h264, stream-format=(string)byte-stream, width=(int)300, height=(int)300, framerate=(fraction)24/1, skip-first-bytes=2" ! queue ! h264parse ! rtph264pay config-interval=1 ! udpsink host=X.X.X.X port=YYYY auto-multicast=0
    

    where X.X.X.X is the address of the receiver (such as 127.0.0.1 for localhost) and YYYY is the port (such as 5000).
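
    To check that the RTP stream actually arrives, a receiver pipeline such as the following can be run on the destination side. This is a minimal sketch, assuming the sender uses port 5000 and rtph264pay's default payload type 96, and that the gst-libav avdec_h264 decoder is available (substitute your preferred H.264 decoder otherwise):

    gst-launch-1.0 -v udpsrc port=5000 caps="application/x-rtp, media=(string)video, encoding-name=(string)H264, clock-rate=(int)90000, payload=(int)96" ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink sync=false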

    For multicast (don't use this over Wi-Fi):

    gst-launch-1.0 -v udpsrc port=11111 caps="video/x-h264, stream-format=(string)byte-stream, width=(int)300, height=(int)300, framerate=(fraction)24/1, skip-first-bytes=2" ! queue ! h264parse ! rtph264pay config-interval=1 ! udpsink host=224.1.1.1 port=5000
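
    On the receiving side, the same depayload pipeline can join the multicast group by pointing udpsrc at the group address (again just a sketch, assuming port 5000 and payload type 96):

    gst-launch-1.0 -v udpsrc address=224.1.1.1 port=5000 caps="application/x-rtp, media=(string)video, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink sync=false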
    

    Or, if you want to decode and re-encode with hardware:

    gst-launch-1.0 -v udpsrc port=11111 caps="video/x-h264, stream-format=(string)byte-stream, width=(int)300, height=(int)300, framerate=(fraction)24/1, skip-first-bytes=2" ! queue ! h264parse ! nvv4l2decoder ! nvv4l2h264enc insert-sps-pps=1 ! rtph264pay ! udpsink host=X.X.X.X port=YYYY auto-multicast=0
    

    If you want to use your original pipeline:

    gst-launch-1.0 -v udpsrc port=11111 caps="video/x-h264, stream-format=(string)byte-stream, width=(int)300, height=(int)300, framerate=(fraction)24/1, skip-first-bytes=2" ! queue ! decodebin ! nvvidconv ! x264enc insert-vui=1 ! rtph264pay ! udpsink host=X.X.X.X port=YYYY auto-multicast=0
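
    Since the goal is to feed the stream into jetson-inference on the Nano: if I recall correctly, the jetson-inference/jetson-utils tools can consume such an RTP stream directly by listening on the chosen port, for example (assuming the stream was sent to port 5000 as H.264):

    video-viewer --input-codec=h264 rtp://@:5000

    The same rtp://@:5000 URI should also work as the input argument for detectnet or imagenet, since they share the same video input handling.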