
Replay of bag file taking a long time #9585

Closed
naoya-kumagai opened this issue Aug 7, 2021 · 10 comments

Comments

@naoya-kumagai

Required Info
Camera Model: D435i
Firmware Version: (Open RealSense Viewer --> Click info)
Operating System & Version: Ubuntu 20.04
Kernel Version (Linux Only): 5.4.72-microsoft-standard-WSL2
Platform: PC
SDK Version: 2.48
Language: python
Segment: {Robot/Smartphone/VR/AR/others}

Issue Description

I am replaying a bag file and processing data from it. When I run the following code, it takes much longer to finish than the length of the bag file (code: 58 seconds, bag file: 47 seconds).
How do I get the data from the bag file "in real time"?
```python
import time
import pyrealsense2 as rs

def initialize_camera():
    # start the frames pipe from the recorded bag (bag_dir is the path to the .bag file)
    p = rs.pipeline()
    conf = rs.config()
    rs.config.enable_device_from_file(conf, bag_dir, repeat_playback=False)
    conf.enable_stream(rs.stream.accel)
    conf.enable_stream(rs.stream.gyro)
    conf.enable_stream(rs.stream.depth, 848, 480, rs.format.z16, 30)
    conf.enable_stream(rs.stream.color, 1280, 720, rs.format.rgb8, 30)
    prof = p.start(conf)
    return p

p = initialize_camera()
time_start = time.time()

while True:
    print(time.time() - time_start)
    try:
        frames = p.wait_for_frames()
    except:
        print('no more frames...')
        break
    print(frames.get_frame_number())
    accel_frame = frames.first_or_default(rs.stream.accel)
    gyro_frame = frames.first_or_default(rs.stream.gyro)
    depth = frames.get_depth_frame()
    color = frames.get_color_frame()
```

@MartyG-RealSense
Collaborator

Hi @naoya-kumagai When playing back a bag file, you will get more stable performance if you set set_real_time to False by adding two lines under your pipe start line. It should look something like this in your particular script:

```python
prof = p.start(conf)
playback = prof.get_device().as_playback()
playback.set_real_time(False)
```

@naoya-kumagai
Author

Hi @MartyG-RealSense, thank you for the response.

I've added the two lines to my code, and it actually takes even longer (67 seconds, while the bag file is 47 seconds).
What exactly do these two lines do? And what exactly do you mean by stable?

I am trying to implement a real-time vision system so doing things without lag is crucial to this task. If you could tell me how to simulate this in a bag file and how to do this using a camera stream, this would be great!

@MartyG-RealSense
Collaborator

The set_real_time function and the effects of setting it to True or False are described in the function's pyrealsense2 documentation entry in the link below.

https://intelrealsense.github.io/librealsense/python_docs/_generated/pyrealsense2.playback.html#pyrealsense2.playback.set_real_time

An example of what I mean by stability of bag playback is that when seeking to navigate to a specific frame number of the bag, there is a greater chance of missing the target when set_real_time is True.

I note that you are streaming depth, color and IMU frames at the same time. It is a known issue that there can be performance problems when doing so with these three stream types simultaneously. The IMU stream is the cause of it. A solution for this is to set up two separate pipelines, with depth & color on one pipeline and IMU on its own on the other pipeline. A Python script that demonstrates this two pipeline approach is linked below.

#5628 (comment)
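
A rough sketch of that two-pipeline layout, in case it helps (this is not the linked script; the stream settings below are assumptions and should be matched to your own configuration, and for bag playback each config would also need rs.config.enable_device_from_file):

```python
import pyrealsense2 as rs

# Pipeline 1: depth + color
pipe_dc = rs.pipeline()
conf_dc = rs.config()
conf_dc.enable_stream(rs.stream.depth, 848, 480, rs.format.z16, 30)
conf_dc.enable_stream(rs.stream.color, 1280, 720, rs.format.rgb8, 30)
pipe_dc.start(conf_dc)

# Pipeline 2: IMU only
pipe_imu = rs.pipeline()
conf_imu = rs.config()
conf_imu.enable_stream(rs.stream.accel, rs.format.motion_xyz32f, 63)
conf_imu.enable_stream(rs.stream.gyro, rs.format.motion_xyz32f, 200)
pipe_imu.start(conf_imu)

try:
    while True:
        # Each pipeline is polled independently, so the IMU no longer
        # holds back delivery of the depth/color frameset.
        dc_frames = pipe_dc.wait_for_frames()
        imu_frames = pipe_imu.wait_for_frames()
        depth = dc_frames.get_depth_frame()
        color = dc_frames.get_color_frame()
        accel = imu_frames.first_or_default(rs.stream.accel)
        gyro = imu_frames.first_or_default(rs.stream.gyro)
finally:
    pipe_dc.stop()
    pipe_imu.stop()
```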

@naoya-kumagai
Author

Thank you for the links, I have separated my pipelines.

However, the problem addressed in this issue is still not resolved.
My understanding of set_real_time is that setting it to True simulates a real-time stream and therefore some frames may be dropped. However, even after setting this to True, the playback is not 'in real time', as it takes longer than the bag file length to replay. (If I set it to False, it takes even longer)

@MartyG-RealSense
Collaborator

Are the stream configurations that you defined in your conf statements (848x480 depth at 30 FPS and 1280x720 color at 30 FPS) the same resolutions and frame rates that the data in the bag was recorded at? If not, it is recommended to edit your conf statements to match the recorded data in the bag.

You could also similarly match the gyro and accel configuration to the IMU streams recorded in the bag by using an expanded conf instruction that takes account of format and frequency. For example:

```python
conf.enable_stream(rs.stream.accel, rs.format.motion_xyz32f, 63)
conf.enable_stream(rs.stream.gyro, rs.format.motion_xyz32f, 200)
```

Edit the accel and gyro frequencies to whichever ones the recorded IMU streams use.

It may be worth drag-and-dropping your bag file into the center panel of the RealSense Viewer program to play the bag back in the Viewer and see how long it takes to play from start to finish. This will provide an indication of whether the delay is being caused by something in your script.
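
If it is useful, the profiles that were actually recorded in the bag can be listed by loading the bag as a playback device and printing its stream profiles. A small sketch (bag_dir is the same bag path used in your script):

```python
import pyrealsense2 as rs

ctx = rs.context()
dev = ctx.load_device(bag_dir)  # load the recorded .bag as a playback device
for sensor in dev.query_sensors():
    for profile in sensor.get_stream_profiles():
        if profile.is_video_stream_profile():
            vp = profile.as_video_stream_profile()
            print(profile.stream_name(), vp.width(), 'x', vp.height(), profile.format(), profile.fps(), 'FPS')
        else:
            print(profile.stream_name(), profile.format(), profile.fps(), 'FPS')
```

The printed resolutions, formats and frame rates can then be copied into the enable_stream calls.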

@MartyG-RealSense
Collaborator

Hi @naoya-kumagai Do you require further assistance with this case, please? Thanks!

@naoya-kumagai
Author

Sorry, yes!
I have done all of the above. The playback time with the script is indeed longer than in the RealSense Viewer.
I am struggling to find the cause of the mismatch, since my script is not complicated; it only streams gyro and accel.

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Aug 15, 2021

Does it make any difference if you move your stream data retrieval statements to the position directly after the wait_for_frames() instruction?

```python
while True:
    print(time.time() - time_start)
    try:
        frames = p.wait_for_frames()
        accel_frame = frames.first_or_default(rs.stream.accel)
        gyro_frame = frames.first_or_default(rs.stream.gyro)
        depth = frames.get_depth_frame()
        color = frames.get_color_frame()
    except:
        print('no more frames...')
        break
    print(frames.get_frame_number())
```


There was also a past Python case in #8481 that had problems with the IMU streams when using frames.first_or_default to access the data. A successful solution was to access them with as_motion_frame().get_motion_data(), like in the script in the link below.

#3409 (comment)
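
For reference, reading the IMU samples that way would look something like the sketch below (assuming accel_frame and gyro_frame were retrieved with first_or_default as in your script):

```python
if accel_frame:
    accel_data = accel_frame.as_motion_frame().get_motion_data()
    print('accel:', accel_data.x, accel_data.y, accel_data.z)
if gyro_frame:
    gyro_data = gyro_frame.as_motion_frame().get_motion_data()
    print('gyro:', gyro_data.x, gyro_data.y, gyro_data.z)
```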

@naoya-kumagai
Author

Thank you @MartyG-RealSense.
This problem still occurs occasionally, but for now it is not a crucial issue, so it is okay.
Please close this issue.

@MartyG-RealSense
Collaborator

Thanks very much @naoya-kumagai for the update. As you suggest, I will close the case. Thanks again!
