python realsense rosbag

Frame didn't arrived within 5000 while reading .bag file - pyrealsense2


I'm trying to read the frames of a .bag file with pyrealsense2. I followed Intel's read_bag_example. Here is the full code that I'm using.

import numpy as np
import pyrealsense2 as rs
import os
import time
import cv2

i = 0
try:
    config = rs.config()
    rs.config.enable_device_from_file(config, "D:/TEST/test_4.bag", repeat_playback=False)
    pipeline = rs.pipeline()
    pipeline.start(config)

    while True:
        frames = pipeline.wait_for_frames()
        depth_frame = frames.get_depth_frame()
        if not depth_frame:
            continue
        depth_image = np.asanyarray(depth_frame.get_data())

        color_image = cv2.applyColorMap(cv2.convertScaleAbs(depth_image, alpha=0.03), cv2.COLORMAP_JET)

        cv2.imwrite("D:/TEST/image/" + str(i) + ".png", color_image)
        i += 1
finally:
    pass

The code works. However, when I check the number of frames via realsense-viewer it reports 890 frames, while the number of frames this code processes varies between roughly 500 and 770 before it raises the error:

RuntimeError: Frame didn't arrived within 5000

I searched for hours but was not able to find a solution to this problem.

I can add more information if you need it. Any help or other suggestions would be greatly appreciated!


Solution

  • The problem is the playback mode of pyrealsense2. By default, the playback device streams the .bag file in real time, as if it were a live camera, so frames are dropped whenever processing falls behind the recorded frame rate. Getting the playback device from the pipeline profile and disabling real-time playback resolved the problem. Sample code that works with an 848x480 @ 90 FPS recording is below.

    import numpy as np
    import pyrealsense2 as rs
    import cv2

    i = 0
    try:
        config = rs.config()
        rs.config.enable_device_from_file(config, "D:/TEST/test_4.bag", repeat_playback=False)
        pipeline = rs.pipeline()
        profile = pipeline.start(config)

        # Get the playback device and turn off real-time streaming,
        # so the reader is not forced to keep up with the recorded frame rate.
        playback = profile.get_device().as_playback()
        playback.set_real_time(False)

        while True:
            frames = pipeline.wait_for_frames()
            # Pause playback while the current frame is being processed.
            playback.pause()

            depth_frame = frames.get_depth_frame()
            if not depth_frame:
                playback.resume()
                continue
            depth_image = np.asanyarray(depth_frame.get_data())

            color_image = cv2.applyColorMap(cv2.convertScaleAbs(depth_image, alpha=0.03), cv2.COLORMAP_JET)
            cv2.imwrite("D:/TEST/image/" + str(i) + ".png", color_image)
            i += 1

            # Resume playback before waiting for the next frame.
            playback.resume()

    except RuntimeError:
        print("There are no more frames left in the .bag file!")

    finally:
        pass


    As can be seen above, the while loop was changed slightly: playback.pause() and playback.resume() ensure that the gathered frame is fully processed before the next frame is taken from the playback.

    TL;DR:

    You should call playback.set_real_time(False) if you are getting an inconsistent number of frames from a .bag file.
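
    If you prefer to end the loop without relying on the RuntimeError, pyrealsense2 also exposes pipeline.try_wait_for_frames(), which returns a success flag instead of raising when no frame arrives. Below is a minimal, self-contained sketch of that variant; it assumes the same test_4.bag path and output folder as above, and the exact return value of try_wait_for_frames() may vary between pyrealsense2 versions, so treat it as a starting point rather than a drop-in replacement.

    import numpy as np
    import pyrealsense2 as rs
    import cv2

    config = rs.config()
    rs.config.enable_device_from_file(config, "D:/TEST/test_4.bag", repeat_playback=False)
    pipeline = rs.pipeline()
    profile = pipeline.start(config)

    # Disable real-time playback so no frames are dropped while processing.
    playback = profile.get_device().as_playback()
    playback.set_real_time(False)

    i = 0
    try:
        while True:
            # Expected to return (False, frames) once the bag has been read to the end.
            success, frames = pipeline.try_wait_for_frames()
            if not success:
                break

            depth_frame = frames.get_depth_frame()
            if not depth_frame:
                continue

            depth_image = np.asanyarray(depth_frame.get_data())
            color_image = cv2.applyColorMap(cv2.convertScaleAbs(depth_image, alpha=0.03), cv2.COLORMAP_JET)
            cv2.imwrite("D:/TEST/image/" + str(i) + ".png", color_image)
            i += 1
    finally:
        pipeline.stop()

    print("Saved", i, "frames")

    With this version the script stops at the end of the recording without treating the timeout exception as the end-of-file signal.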