WebXR: Add experimental Hand Input API support #19922
Conversation
Considering that the …
Wow that was fast :D
@fernandojsg What VR device is required to test the API? Does it work with a Quest?
Quest works, behind a Chrome flag.
Yep, Oculus Quest using a flag, and HoloLens 2 using Servo.
@@ -71,7 +71,7 @@ var VRButton = {
 // ('local' is always available for immersive sessions and doesn't need to
 // be requested separately.)

-var sessionInit = { optionalFeatures: [ 'local-floor', 'bounded-floor' ] };
+var sessionInit = { optionalFeatures: [ 'local-floor', 'bounded-floor', 'hand-tracking' ] };
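The diff above lists 'hand-tracking' as an *optional* feature, so the session can still start on devices without hand input. A minimal sketch of that idea; `buildSessionInit` is a hypothetical helper (not part of the PR) that assembles the object passed to `navigator.xr.requestSession( 'immersive-vr', … )`:

```javascript
// Hypothetical helper (not from the PR): build the session-init object,
// adding 'hand-tracking' as an optional feature only when requested.
function buildSessionInit( withHands ) {

	const features = [ 'local-floor', 'bounded-floor' ];
	if ( withHands ) features.push( 'hand-tracking' );
	return { optionalFeatures: features };

}

// In a browser this would be used as:
// navigator.xr.requestSession( 'immersive-vr', buildSessionInit( true ) );
```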
Well, we could bring options back. When setting the reference space was refactored, the object no longer had a purpose and was removed.
Let's wait for the answer here to see whether it's worth making it optional or not: https://twitter.com/fernandojsg/status/1286968102476029953
On Firefox Reality for HoloLens there's IPC involved, so once you enable hand input we have to calculate and send over hand poses each frame. I haven't perceived a major perf impact on simple demos, but it might not be great for complex ones.
It might be worth unconditionally requesting hands in VR so that you can render a hand model, but I think opt-in might be the better choice. Either way, it's worth measuring on both Oculus and FxR.
It is the same for Oculus.
It seems like the IPC/compute overhead would be enough to make it opt-in, let alone the current pose GC issue (which we both hope to fix 😉).
It's also worth mentioning that we currently can't enter VR with hands: we need to hit Enter VR with the controller, then put the controllers aside and wait until the hands are detected. Oculus is working on fixing this in the next version.
It seems entering VR with hands could also be fixed by enabling this parameter in OB: (context: https://twitter.com/DePanther/status/1286974408469381120)
Is there some way to figure out which hand is the left hand and which is the right hand?
Ah got it!
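On the handedness question above: the index passed to `xr.getHand()` is not guaranteed to map to a side; the usual WebXR pattern is to read `handedness` ('left' or 'right') from the XRInputSource delivered when the hand connects. A rough, device-free sketch of that pattern; `HandStub` and `watchHandedness` are stand-ins written here so it runs without a headset, not three.js API:

```javascript
// Minimal stand-in for an object that fires 'connected' events, so the
// pattern is runnable without an XR device.
class HandStub {

	constructor() { this.listeners = {}; }

	addEventListener( type, fn ) {

		( this.listeners[ type ] = this.listeners[ type ] || [] ).push( fn );

	}

	dispatchEvent( event ) {

		( this.listeners[ event.type ] || [] ).forEach( ( fn ) => fn( event ) );

	}

}

// When the hand input source connects, report its handedness
// ('left' or 'right') rather than relying on the index.
function watchHandedness( hand, onHandedness ) {

	hand.addEventListener( 'connected', ( event ) => onHandedness( event.data.handedness ) );

}
```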
API: https://immersive-web.github.io/webxr-hand-input/
Demo: https://twitter.com/fernandojsg/status/1286803121512108032
Some thoughts:
hand2.add( handModelFactory.createHandModel( hand2, "highpolyhand.glb" ) ); or so
xr.getHand()
it won't be positioned in the space (it stays at 0,0,0), because the joints and tips are in world space, so the Group holding them is never updated. It could be nice to use the wrist, or compute the palm center, to update that group if you want to add objects that follow your hand.
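One way to sketch the palm-center idea from the last point: average a few joint positions each frame and move the holding Group (or an attached object) there. `palmCenter` is a hypothetical helper, using plain `{ x, y, z }` objects in place of THREE.Vector3:

```javascript
// Hypothetical helper (not part of the PR): approximate a palm center as the
// mean of a set of world-space joint positions, e.g. the wrist and the bases
// of the fingers.
function palmCenter( jointPositions ) {

	const c = { x: 0, y: 0, z: 0 };

	for ( const p of jointPositions ) {

		c.x += p.x; c.y += p.y; c.z += p.z;

	}

	const n = jointPositions.length;
	return { x: c.x / n, y: c.y / n, z: c.z / n };

}

// Each frame, the result could be copied into the position of the Group
// holding the hand, so attached objects follow the hand.
```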