
Is get_timestamp() aware of serial communication latency? #7955

Closed · amm385 opened this issue Dec 9, 2020 · 3 comments


amm385 commented Dec 9, 2020

Required Info
Camera Model: D435
Firmware Version: don't have it on me :(
Operating System & Version: Ubuntu 18
Kernel Version (Linux Only): 4.9.140-tegra
Platform: NVIDIA Jetson
SDK Version: 2.36.0
Language: python
Segment: Robot

Issue Description

Imagine I have a very long USB cable, or very unpredictable latency, between my image being ready in the camera firmware and my pyrealsense SDK receiving the data. If I call get_timestamp(), I can imagine one of two values being returned, depending on how get_timestamp() syncs its clock with my system clock:

a) My host computer's timestamp when the frames were fully collected on the camera's firmware
b) My host computer's timestamp when the frames were received by the host computer

Which of those values is returned by get_timestamp?

If this doesn't make sense yet, here's an example:
The camera and my computer are clock-synced. The camera takes a picture at t=100ms.
It transmits this information back to the computer. This process takes 10ms.
So the SDK receives the full batch of data at t=110ms.

Does get_timestamp() return 100 or 110?

Or, alternatively, am I misunderstanding how this works? I did my best to parse the other questions on here but am still confused.


MartyG-RealSense commented Dec 10, 2020

Hi @amm385 - the documentation for get_timestamp() in the link below states that it retrieves the time at which the frame was captured. At runtime, the SDK selects the most accurate representation from the different types of timestamp available, based on both device and host capabilities.

https://intelrealsense.github.io/librealsense/doxygen/classrs2_1_1frame.html#a25f71d45193f2f4d77960320276b83f1
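For illustration, a minimal pyrealsense2 sketch along these lines can show both the timestamp and the clock domain it was taken from; the depth stream settings here are assumptions for a D435, not taken from this thread:

```python
import pyrealsense2 as rs

# Assumed stream configuration for a D435; any supported profile would do.
pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)

try:
    frames = pipeline.wait_for_frames()
    depth = frames.get_depth_frame()

    # get_timestamp() returns milliseconds. The domain reports which clock the
    # value came from: the camera's hardware clock, the host's system clock,
    # or the SDK's global (host-aligned) time.
    print("timestamp (ms):", depth.get_timestamp())
    print("timestamp domain:", depth.get_frame_timestamp_domain())
finally:
    pipeline.stop()
```

Roughly speaking, if the reported domain is system_time the value is stamped on the host when the frame arrives (closer to 110 in the question's example), while hardware_clock and global_time are based on the capture time reported by the firmware (closer to 100).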

The timestamps of RealSense camera sensors are exposed as frame metadata attributes. These can be produced either by the camera firmware or by the host clock. The links below describe how the metadata is generated.

https://dev.intelrealsense.com/docs/frame-metadata

#2188 (comment)
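As a sketch of what reading those metadata attributes looks like in pyrealsense2 (availability of each attribute depends on firmware, platform, and kernel patching, so each one is checked first):

```python
import pyrealsense2 as rs

def print_timestamp_metadata(frame):
    """Print the timestamp-related metadata attributes a frame carries."""
    attrs = [
        rs.frame_metadata_value.frame_timestamp,    # device/firmware clock
        rs.frame_metadata_value.sensor_timestamp,   # sensor capture time
        rs.frame_metadata_value.time_of_arrival,    # host clock when the frame arrived
        rs.frame_metadata_value.backend_timestamp,  # host backend (kernel) time
    ]
    for attr in attrs:
        if frame.supports_frame_metadata(attr):
            print(attr, "=", frame.get_frame_metadata(attr))
        else:
            print(attr, "not available on this platform/firmware")
```

Comparing the device-clock attributes against time_of_arrival is one way to estimate the transmission latency described in the question.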

Device time and system time are aligned when Global Time is enabled.

#3909
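A minimal sketch of toggling that option per sensor (the option name rs.option.global_time_enabled comes from the SDK; checking supports() first is a precaution, since not every sensor exposes it):

```python
import pyrealsense2 as rs

ctx = rs.context()
devices = ctx.query_devices()
if devices.size() == 0:
    raise RuntimeError("No RealSense device connected")

for sensor in devices[0].query_sensors():
    # When global time is enabled, frame timestamps are mapped from the
    # device clock onto the host clock, compensating for clock drift.
    if sensor.supports(rs.option.global_time_enabled):
        sensor.set_option(rs.option.global_time_enabled, 1)
        print(sensor.get_info(rs.camera_info.name), "-> global time enabled")
```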


amm385 commented Dec 10, 2020

Thanks for your answer @MartyG-RealSense. Certainly a bit confusing, but I'm glad there are a variety of options to choose from.

amm385 closed this as completed Dec 10, 2020
@MartyG-RealSense

You are very welcome @amm385 - good luck!
