Announcement: Introducing the RealSense Tracking Camera T265 #3129
Thanks. The specifications table is a little light on details. Any chance of some numbers like FPS and depth accuracy?
Thanks @MartyG-RealSense
Also, I'll try to answer some of the questions here. FAQ
@dorodnic Will the pose timestamps be able to sync with RealSense depth cameras? Or will there be some method of synchronizing the pose data with other hardware?
@sam598 - As far as I know there is no hardware sync, but you could query the T265 and D400 clocks with ~2-4ms accuracy each.
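Since there is no hardware sync, pairing has to happen in software: query both device clocks, then match each depth frame to the pose sample nearest in time. Here is a minimal sketch in plain Python (the function name and the millisecond convention are mine, not part of the SDK):

```python
from bisect import bisect_left

def nearest_pose(pose_ts, frame_ts, tolerance_ms=4.0):
    """Index of the pose sample closest in time to frame_ts, or None if
    no sample lies within the cross-clock tolerance (~2-4 ms per device).
    pose_ts must be a non-empty, sorted list of timestamps in milliseconds."""
    i = bisect_left(pose_ts, frame_ts)
    # The nearest sample is either just before or just after the insertion point.
    candidates = [j for j in (i - 1, i) if 0 <= j < len(pose_ts)]
    best = min(candidates, key=lambda j: abs(pose_ts[j] - frame_ts))
    return best if abs(pose_ts[best] - frame_ts) <= tolerance_ms else None
```

With pose samples at 200 Hz (5 ms apart) and depth frames at 30 Hz, the nearest pose sample is at most 2.5 ms away, which is within the stated clock accuracy.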
Fairly short (24sec) recording of T265 data:
Download sample *.bag file (1GB) and open with Intel RealSense Viewer 2.18+
This product looks amazing. Does it provide any support for persistence? Ideally, a way to fetch and save the map on a host and then restore it and re-localize to that map? Thanks.
@JBBee - right now
I would like to give it +100 likes if I could. I have been waiting so long for a proper tracking solution from RealSense. After reading the initial info on the T265, the first question in my mind was whether it will allow persistence - AND IT WILL :) So cool!! Other nice-to-have features could be
Thanks
Appreciate the feedback.
@dorodnic That will be awesome. Until then, is it possible for 3rd-party developers to use the raw data streams from the camera to implement these CV features on the host? Just want to make sure there is nothing in the SDK or the device to prevent it.
@nominator - the camera can provide:
You can ask for any or all of these. We can publish additional sample data (*.bag) if you'd like to take a closer look at the streams (auto-exposure performance, IMU stability, etc.)
I'm just curious about some basic questions, since I am new to 3D imaging:
Edit: Just saw this: "The T265 complements Intel’s RealSense D400 series cameras, and the data from both devices can be combined for advanced applications like occupancy mapping, improved 3D scanning and advanced navigation and collision avoidance in GPS-restricted environments." Will there be a camera like the T265 in the future that also provides depth data to the user like the D435i? |
@dorodnic That will be great!! Please do, so that until we get the camera shipped, we can start playing around with the data and think about how to best use this data to add more features required for AR applications. We are currently doing an AR MVP for a client and being able to show them advanced capabilities enabled by Realsense will be awesome. |
Cooool, a product I have waited a long time for! Does the T265 have the capability of realtime sharing of a relocalization map between two devices over ethernet?
I just pre-ordered one of these to play with and might buy more. This is exactly my question, and I think a lot of people will want/need this: basically allow 2+ sensors to work from the same shared internal map so they move in the same space. There are two use cases: one where two or more sensors are placed statically relative to each other and move together, increasing their accuracy, and another where they move independently in the same space. My high-level plan is to track multiple controllers (and maybe HMDs?), turning them into inside-out trackers. I'd need an API feature that allows the sensors to synchronize/merge their internal maps wirelessly. (So this isn't a case where two sensors are attached to the same computer.) @dorodnic The image you linked shows 30 fps for the video. That's not the sampling rate, is it?
Hi @sirisian @mohammedari |
@dorodnic If you could also create a bag file showing performance under the influence of vibrations, it'd be great. I mean vibrations from a drone, and vibrations from some kind of ground mobile robot/vehicle.... |
Hi @peci1 |
The video stream is 30 fps, but is 30 fps also what is used internally for the computation? I guess the accelerometer provides the 260Hz data and there is some interpolation between the processed images, but 30 fps seems a little low for fast movement (though with global shutter, low exposure time and hardware sync, it may be enough). What is the exposure time? Is it possible to modify it? Is there a timestamp for the video stream that we can match to the position? I saw it's not possible to add additional processing on top of the 6DOF tracking, but is it possible to completely replace the tracking with another task, as one would do on a normal Movidius compute stick, to use it as a standard stereo camera with processing capability? I'm thinking of tracking or object detection that could be done on-camera, streaming only the detections instead of the full image. Is it possible to synchronize the clocks of multiple cameras together, like the D400 series with a sync cable?
Hi @delmottea -
There is no external sync connector on this device.
T265 performs auto-exposure on board. Actual exposure and gain are reported with every frame via metadata. If you are implementing your own 6-DOF algorithm (and not enabling the POSE stream) you can also set a manual exposure value (this is not yet in the LRS API, but should be possible according to the device specifications)
IMU rates are 200 Hz for Gyro and 62 Hz for Accel. Every 33ms the device corrects IMU drift using visual data from fisheye cameras.
Yes, there are hardware timestamps on all streams, and a method to query the device's current time (the latter is not yet exposed in LRS, but will be)
Unlike the NCS, the T265 is not a general-purpose compute device, meaning we do not provide an SDK for developing custom firmware on top of its sensors, at least not at the moment. This device was designed and optimized for the problem of inside-out tracking. Low-power plug&play V-SLAM is a significant part of its value proposition. For general-purpose CV, an NCS + standard webcam would probably make more sense.
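As a rough illustration of how a 200 Hz gyro can bridge the 33 ms between visual corrections, here is a sketch of propagating an orientation quaternion from gyro samples. This is generic strapdown integration in plain Python, not the device's actual algorithm, and the function names are mine:

```python
import math

def quat_mul(a, b):
    """Hamilton product of quaternions given as (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def integrate_gyro(q, omega, dt):
    """Propagate orientation q by body angular rate omega (rad/s) over dt
    seconds, assuming the rate is constant during the step."""
    rate = math.sqrt(sum(w * w for w in omega))
    if rate * dt < 1e-12:
        return q  # effectively no rotation this step
    half = rate * dt / 2.0
    s = math.sin(half) / rate
    delta = (math.cos(half), omega[0]*s, omega[1]*s, omega[2]*s)
    return quat_mul(q, delta)
```

Between two visual corrections (33 ms) this would be applied to roughly six or seven 200 Hz gyro samples; the visual update then corrects the accumulated drift.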
@dorodnic Since the device is capable of saving and loading maps, is there going to be a limit on the area size that can be tracked and persisted with the T265? I mean, to relocalize, the T265 will need to search through its saved map for matching features etc. If the maps are too large to cache in the onboard memory, would it then stream them from the connected host computer? Similarly, while tracking and building the map, can the host receive the map data as a stream to save it during tracking?
Hi I am doing research on visual-inertial odometry and the raw data from T265 seem very useful. In the comments above I saw "Every 33ms the device corrects IMU drift using visual data from fisheye cameras." Is it possible to get raw IMU data without drift correction? |
@MartyG-RealSense Thanks for the response! You said in #3987 "If you are implementing your own 6-DOF algorithm (and not enabling the POSE stream)". Does this mean I will get raw IMU data (without being corrupted by drift correction) after disabling POSE stream? |
I'm looking to use a D435 and T265 together, as in the demos, on a mobile platform (so battery powered). Do people have a recommendation for either a single board computer or a NUC to use? I think I need two USB 3.0 ports, so would need at least an up-squared over just the UP board, correct? Would love to hear what you would do for this kind of project. Thanks! |
An Up Squared has been shown to be able to handle two D415 on the same board, so a D435 and T265 pairing should be fine in theory. There is a long discussion about using the Up Board or Up Squared with two cameras at this link. Click on the 'More Answers' link at the bottom of the page at this link to see the full length of the discussion. |
Thank you for the reply and link! |
Confirmed this works on the UP board's big brother, the AAEON Pico-APL3. The main issue is how to run a fully online SLAM system, including loop closure, in realtime on these entry-level Intel processors.
If you were to go with a NUC, which would you get? The expensive ones are really nice, but how about out of the lower cost ones? |
After Intel released the super-powerful but high-priced NUC 8 VR kit in Spring 2018 (multiple USB 3 ports and powerful graphics), they followed it up with the announcement of a budget-priced 2018 range with a more modest spec.
great info, thanks! |
@dorodnic and @MartyG-RealSense, has there been any development with realtime sharing of relocalization map between two devices? (As was discussed in #3129 (comment)). |
I do not have enough knowledge of T265 to comment on the practicality of a shared map - one of the Intel guys such as Dorodnic will be better equipped to comment on that. It was established recently though that multiple T265s can be put on separate threads. Please read downwards from the comment linked to below. |
Hey all! Amazing work, really appreciate it! Two questions: About relocalization in a known map - it is my understanding that this feature is not really accurate (the kidnapped-robot problem?). Is this true? And if so, would it be possible to use detection of AprilTags in known positions to re-initialize the current position of the robot, and maybe even correct the <1% drift of the T265 on long runs? My system requires travelling long distances with high accuracy. The other question is about wheel odometry. I think this was asked a few times above, but since quite some time has passed, I'd like to ask again. Currently I couldn't really find a documented API to provide wheel odometry to the T265, besides ROS. If there is no other way to provide this data, I would have to build ROS on my system only for that.
No, this feature works pretty well. If the robot is released again in an environment it has already seen and mapped, it can quite easily re-orient and determine the correct (relocated) position. But you need to help the T265 with set/get_static_node and re-project everything from the retrieved static node outside the camera, in your code. See https://neilyoung.serveblog.net for examples and working code. You don't even need AprilTags for this. I find this "AT help" scenario a bit unrealistic: you need to do the AT detection outside the camera, in your code, which immediately raises the requirements on the hosting hardware. The AT detection also costs time, and leaves you more or less without orientation while you are looking for ATs. Of course, once you have a correction in the form of a detected AT, you can use that information instead of get_static_node to re-project. I'm not sure where this 1% drift figure comes from. At small scale the drift can be pretty low; at large scale the drift can be way above 1%, even outdoors. My personal opinion on wheel odometry: I can't really imagine that a wheel - especially a sliding, spinning wheel - can improve the accuracy of the entire system much, so I never used it. But I might be wrong. However, I was always pretty satisfied with the accuracy indoors without wheel odometry, though I don't count in centimetres... IMHO that is overkill.
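To make the re-projection idea concrete, here is a minimal sketch of expressing the camera pose relative to a retrieved static node, so coordinates stay consistent across relocalizations. This is plain Python with quaternions as (w, x, y, z); the function names are mine, not the SDK's:

```python
def quat_mul(a, b):
    """Hamilton product of quaternions (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def quat_conj(q):
    """Conjugate; equals the inverse for a unit quaternion."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def rotate(q, v):
    """Rotate vector v by unit quaternion q via q * (0, v) * q^-1."""
    w, x, y, z = quat_mul(quat_mul(q, (0.0, v[0], v[1], v[2])), quat_conj(q))
    return (x, y, z)

def pose_in_node_frame(node_t, node_q, cam_t, cam_q):
    """Express the camera pose relative to a stored static node, so the
    result is meaningful regardless of where tracking (re)started."""
    inv = quat_conj(node_q)
    delta = tuple(c - n for c, n in zip(cam_t, node_t))
    return rotate(inv, delta), quat_mul(inv, cam_q)
```

After relocalizing into a saved map, you would retrieve the node's pose (e.g. via get_static_node), then feed each incoming camera pose through a transform like this one.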
Thanks for the quick reply! About the ATs: actually I think you could let the T265 take care of the computation as explained here, and then just take the pose data. Also, I didn't quite understand how I am supposed to set/get these static points. Is it like saying "You are at position x,y,z right now!"? Because if so, wouldn't that mean I have to actively put the sensor at a position I know (a reference position) and then run the get_static_node code from there? And how accurate would that be, since I won't put the sensor in the exact same position each time? But as I said, I haven't had that breakthrough in my head yet and I have some research to do, so don't try to explain the basic concepts here (if it's going to be too much trouble) :)
In my code it is used like so: Later, wherever you are in the real world and using the same map, you retrieve it again. AT is not my favourite, since it requires you to place printed markers at exact positions in your world. This might work for a PoC, but customers will not like it :) Promised. And AT is also a format which does not allow you to store generic data; it's not QR. You would also need a translation layer which converts a detected AT into the world coordinates you need. Give my solution a try, it's easy. There are a lot of videos explaining everything. You don't even necessarily need a floor plan: a photo of a true-to-scale hand-drawn rectangle on paper does it...
Thank you very much for your explanation, it really helped! I'll give it a try, once my camera arrives. 👍 |
Thanks. I really would appreciate some comments. I'm pretty convinced by the accuracy shown indoors, not having seen better yet. But I'm alone and have no feedback.
Yeah, sure, I'd totally try it out and give some feedback. Checked out your website btw - impressive accuracy, actually. I think I'm going to use something like ATs or ArUco codes anyway, since I'm trying to do some shelf picking with a robot arm and I really need millimetric accuracy. That's why I thought: why not use the same markers to kind of "fine-tune" my relocalization? But if it really is that good, maybe I won't need them after all. I'm definitely going to give feedback. Thanks again. :)
Hi, any news on the maximum map size of the T265? Did it increase? |
Hello, I am using a T265 with a Jetson NX. Every time I boot up the NX and T265, I get errors saying "Error booting T265" and "No RealSense devices were found!". I can unplug and replug the USB cable to resolve the issue, but it is annoying to do that every time I power cycle the drone. Is there any way to fix that? Also, I often see the error "SLAM_ERROR Speed". What causes it and how do I resolve it? Normally the drone flies well, but sometimes it flies away and loses control all of a sudden. The position data looks odd - why is that? Is it caused by the "SLAM_ERROR Speed" error? Thanks
It's a known issue that you have to unplug and replug the T265 after boot. I have not found a solution for this. I don't think I have seen the error "SLAM_ERROR Speed". What happens is that vibration causes the T265 to lose its tracking; I have seen this. One solution is to mount the T265 with vibration dampeners. Another solution is to monitor the T265 tracking quality: if it is bad, you can switch your drone to manual.
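One possible software workaround for the replug-after-boot issue (an untested, Linux-only sketch) is to unbind and rebind the device via sysfs instead of physically unplugging it. The USB ID 8087:0b37 is the commonly reported one for the T265 - verify yours with lsusb - and the rebind step needs root:

```python
import glob
import os
import time

# Commonly reported USB IDs for the T265 (assumption - check with lsusb).
VENDOR, PRODUCT = "8087", "0b37"

def find_t265_ports():
    """Return sysfs port names of attached devices matching the T265 IDs."""
    ports = []
    for dev in glob.glob("/sys/bus/usb/devices/*"):
        try:
            with open(os.path.join(dev, "idVendor")) as f:
                vendor = f.read().strip()
            with open(os.path.join(dev, "idProduct")) as f:
                product = f.read().strip()
        except OSError:
            continue  # interface entries have no ID files - skip them
        if (vendor, product) == (VENDOR, PRODUCT):
            ports.append(os.path.basename(dev))
    return ports

def reset_port(port):
    """Unbind then rebind the USB port - a software replug. Needs root."""
    for action in ("unbind", "bind"):
        with open("/sys/bus/usb/drivers/usb/" + action, "w") as f:
            f.write(port)
        time.sleep(2)  # give the device time to disappear / re-enumerate
```

Running `for p in find_t265_ports(): reset_port(p)` at the end of boot (e.g. from a systemd unit) might avoid the manual replug, but treat this as an experiment, not a confirmed fix.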
Thanks very much for your kind reply. Is there any good substitute for the T265? And how do I monitor the tracking quality?
Check out this video for an alternative. It is using wireless tracking. Tracking quality can be monitored with two different confidence variables: I use tracker_confidence. It should always be 3 in order to trust the data coming from the T265. If it drops below 3, then set an alarm and switch to manual control. |
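The monitoring idea above fits in a few lines. The confidence scale (0 failed, 1 low, 2 medium, 3 high) follows the T265's pose confidence values; the function and mode names are my own sketch, not an SDK API:

```python
HIGH_CONFIDENCE = 3  # T265 pose confidence: 0 failed, 1 low, 2 medium, 3 high

def flight_mode(tracker_confidence, alarm=None):
    """Return 'auto' only while tracking confidence is at maximum;
    otherwise fire the optional alarm callback and fall back to 'manual'."""
    if tracker_confidence >= HIGH_CONFIDENCE:
        return "auto"
    if alarm is not None:
        alarm(tracker_confidence)  # e.g. log, beep, or notify the pilot
    return "manual"
```

In a real flight stack this check would run on every pose sample, with some hysteresis so a single dropped sample does not bounce the drone between modes.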
Thanks very much Mike for your recommendation! |
Hi everyone,
A new RealSense camera product called the RealSense Tracking Camera T265 is now listed for pre-order on Intel's online Click store, with pre-orders due to ship in the week of March 4, 2019.
"Intel RealSense Tracking Camera T265 is a new class of stand-alone Simultaneous Localization and Mapping device, for use in robotics, drones and more".
https://click.intel.com/order-intel-realsense-tracking-camera-t265.html