rust, gstreamer, gstreamer-rs

Why doesn't the PTS retrieved from GStreamer match the PTS in the saved TS segments?


I have a pipeline that pulls an RTSP stream from a camera and runs some AI detections on it. I have implemented it in Rust, but my simplified pipeline looks like the following:

rtspsrc latency=100 location=rtsp://<rtspurl> protocols=0x4 name=basesrc basesrc. ! rtph264depay ! tee name=t \
t. ! queue ! vaapidecodebin name=video_decoder ! tee name=decode_t \
   decode_t. ! queue ! videorate ! video/x-raw,framerate=15/1 ! vaapipostproc format=nv12 ! video/x-raw,width=2880,height=1620 ! vaapih265enc ! h265parse config-interval=-1 ! hlssink2 playlist-length=5 max-files=0 target-duration=10 send-keyframe-requests=true program-date-time=true playlist-location=manifest.m3u8 location=video/%t.ts \
   decode_t. ! queue ! videorate ! video/x-raw,framerate=3/1 ! vaapipostproc format=nv12 ! video/x-raw,width=640,height=384 ! videoconvert ! objectdetector ! fakesink async=false 
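
For context, here is a minimal sketch of how a pipeline like this can be launched from Rust. It only covers the detection branch (the HLS branch and the custom objectdetector element are left out for brevity), the RTSP URL is a placeholder, and gst::parse_launch is assumed to be available under that name in the gstreamer-rs version in use:

use gstreamer as gst;
use gst::prelude::*;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    gst::init()?;

    // Simplified detection branch only; rtsp://example/stream is a placeholder.
    let pipeline = gst::parse_launch(
        "rtspsrc latency=100 location=rtsp://example/stream protocols=0x4 name=basesrc \
         basesrc. ! rtph264depay ! queue ! vaapidecodebin ! videorate \
         ! video/x-raw,framerate=3/1 ! vaapipostproc format=nv12 \
         ! video/x-raw,width=640,height=384 ! videoconvert ! fakesink async=false",
    )?;

    pipeline.set_state(gst::State::Playing)?;

    // Run until EOS or an error is posted on the bus.
    let bus = pipeline.bus().expect("pipeline has a bus");
    for msg in bus.iter_timed(gst::ClockTime::NONE) {
        match msg.view() {
            gst::MessageView::Eos(..) | gst::MessageView::Error(..) => break,
            _ => (),
        }
    }

    pipeline.set_state(gst::State::Null)?;
    Ok(())
}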

objectdetector is my custom plugin, which posts a message to the bus when an object is detected. I capture the PTS in its transform_frame_ip() implementation from the frame, which is of type

frame: &mut gst_video::VideoFrameRef<&mut gst::BufferRef>

and I get the PTS like below:

let pts = frame.buffer().pts().expect("Buffer must have a PTS value.");
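
For illustration, here is a sketch of the kind of helper the objectdetector could call from transform_frame_ip() to forward that PTS to the application. post_detection and the "object-detected" structure name are hypothetical, and the exact subclassing signatures depend on the gstreamer-rs version:

use gstreamer as gst;
use gstreamer_video as gst_video;
use gst::prelude::*;

// Hypothetical helper, assumed to be called from the element's transform_frame_ip():
// reads the buffer PTS and posts it on the bus inside a custom element message.
fn post_detection(
    element: &gst::Element,
    frame: &gst_video::VideoFrameRef<&mut gst::BufferRef>,
) {
    // PTS as set upstream (rtspsrc / depayloader / decoder), in nanoseconds.
    let pts = frame.buffer().pts().expect("Buffer must have a PTS value.");

    // "object-detected" is a made-up structure name for this sketch.
    let s = gst::Structure::builder("object-detected")
        .field("pts", pts.nseconds())
        .build();
    let _ = element.post_message(gst::message::Element::new(s));
}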

However, when I print the PTS of all the frames in a TS segment written by the hlssink2 element, the PTS values are completely different.

I use the following command to get the PTS values of a TS segment:

ffprobe -select_streams v -show_frames -of csv -show_entries frame=coded_picture_number,key_frame,pict_type,pts,pts_time -i <>
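
When comparing the two sides, keep the units in mind: GStreamer buffer PTS are in nanoseconds, while ffprobe's pts column is in the stream's time base, typically 1/90000 s for MPEG-TS (pts_time is already in seconds). A small conversion sketch, assuming that 90 kHz time base:

use gstreamer as gst;

// Convert a GStreamer PTS (nanoseconds) into 90 kHz MPEG-TS ticks for comparison
// with ffprobe's pts column (assumes the usual 1/90000 time base).
fn gst_pts_to_mpegts_ticks(pts: gst::ClockTime) -> u64 {
    pts.nseconds() * 90_000 / 1_000_000_000
}

fn main() {
    let pts = gst::ClockTime::from_mseconds(3_600); // e.g. a PTS of 3.6 s
    println!(
        "{} ns -> {} ticks (~{:.3} s)",
        pts.nseconds(),
        gst_pts_to_mpegts_ticks(pts),
        gst_pts_to_mpegts_ticks(pts) as f64 / 90_000.0
    );
}

Note that aligning the units alone does not guarantee the values line up; the possible reasons for the remaining offset are what the solution below is about.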

I am using these PTS values to try to match detections with the actual playback time, so that I can draw the bounding boxes in a web browser while playing the manifest.

What am I doing wrong? I'd appreciate any help with this.


Solution

  • There are multiple possible reasons for this: