
WebXR: Add experimental Hand Input API support #19922

Merged: 12 commits into mrdoob:dev on Jul 25, 2020

Conversation

fernandojsg
Collaborator


API: https://immersive-web.github.io/webxr-hand-input/
Demo: https://twitter.com/fernandojsg/status/1286803121512108032

Some thoughts:

  • I'd like to add profiles in a similar way to what we do with the controllers. Currently XRHandController creates sphere hands, but that should be fully customizable. Basically, we could provide a mesh/glTF with specific naming on the nodes, and they would be kept in sync with the joints. So you could use custom procedural meshes, as in the current example, or a more complex mesh. You could do something like hand2.add( handModelFactory.createHandModel( hand2, "highpolyhand.glb" ) ); or similar.
  • Currently, if you add something to the hand controller itself (returned by xr.getHand()), it won't be positioned in space (it stays at 0,0,0), because the joints and tips are in world space, so the Group holding them is never updated. It might be nice to use the wrist, or compute the palm center, to update that group so you can add objects that follow your hand.
  • The gesture detector should be a separate class, and we could provide new gestures, eventually also for training and recording new ones.
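The node-naming sync described in the first bullet could be sketched roughly as follows. This is only an illustration: the helper name, the pose object shape, and the joint-name convention are hypothetical, not part of this PR (which only ships procedural sphere hands so far).

```javascript
// Sketch: keep named nodes of a loaded hand mesh in sync with XR joint poses.
// nodesByName maps node names (e.g. 'wrist') to scene nodes; jointPoses maps
// the same names to { position, quaternion } objects for the current frame.
function syncJointNodes( nodesByName, jointPoses ) {

	for ( const [ name, pose ] of Object.entries( jointPoses ) ) {

		const node = nodesByName[ name ];
		if ( node === undefined ) continue; // mesh has no node for this joint

		node.position = { ...pose.position };
		node.quaternion = { ...pose.quaternion };

	}

}
```

A loaded glTF hand would expose its nodes by name; calling something like this each frame would keep the custom mesh posed like the tracked hand.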

@mrdoob
Owner

mrdoob commented Jul 25, 2020

Considering that webxr-hand-input is a draft and we'll likely be adapting this as we go, I think this is good to merge as is.

@mrdoob mrdoob added this to the r119 milestone Jul 25, 2020
@mrdoob mrdoob merged commit 1168ec4 into mrdoob:dev Jul 25, 2020
@fernandojsg
Collaborator Author

Wow that was fast :_D

@fernandojsg fernandojsg deleted the handtracking branch July 25, 2020 08:50
@mrdoob mrdoob changed the title Add experimental WebXR Handtracking API support WebXR: Add experimental Hand Input API support Jul 25, 2020
@Mugen87
Collaborator

Mugen87 commented Jul 25, 2020

@fernandojsg What VR device is required to test the API? Does it work with a Quest?

@avaer

avaer commented Jul 25, 2020

Quest works, behind a Chrome flag.

@fernandojsg
Collaborator Author

Yep, Oculus Quest using a flag, and HoloLens 2 using Servo.
On Oculus Quest:

  • Enable the experimental support in chrome://flags/
    • Enable "WebXR experiences with joints tracking" (#webxr-hands)
    • If already enabled, disable "WebXR experiences with hands tracking" (#webxr-hands-tracking)
  • Enable automatic switching between hands and controllers in Oculus Settings

@@ -71,7 +71,7 @@ var VRButton = {
 // ('local' is always available for immersive sessions and doesn't need to
 // be requested separately.)

-var sessionInit = { optionalFeatures: [ 'local-floor', 'bounded-floor' ] };
+var sessionInit = { optionalFeatures: [ 'local-floor', 'bounded-floor', 'hand-tracking' ] };
Collaborator Author

This is important to mention, as adding hand-tracking by default could have performance implications, since the device could be constantly looking for hands.
I didn't find a way to add options to VRButton, as they were deprecated.
@Mugen87 @mrdoob, ideas?
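One way to avoid hard-coding the feature would be to build the session init conditionally; a minimal sketch, where the `enableHands` flag and the helper are hypothetical, not the current VRButton API:

```javascript
// Sketch: request 'hand-tracking' only when the caller asks for it, so
// devices don't constantly look for hands in apps that don't need them.
function buildSessionInit( enableHands ) {

	const optionalFeatures = [ 'local-floor', 'bounded-floor' ];

	if ( enableHands ) optionalFeatures.push( 'hand-tracking' );

	return { optionalFeatures };

}
```

The resulting object would then be passed to navigator.xr.requestSession( 'immersive-vr', sessionInit ) as VRButton already does.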

Collaborator

Well, we could bring options back. When setting the reference space was refactored, the object had no purpose anymore and thus was removed.
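If options were brought back, they could be merged over the defaults; a sketch under the assumption of a caller-supplied `overrides` object (this is not the current VRButton signature):

```javascript
// Sketch: merge a caller-supplied sessionInit over VRButton's defaults,
// deduplicating optionalFeatures, so callers could opt in to
// 'hand-tracking' without patching the library.
function mergeSessionInit( defaults, overrides = {} ) {

	const optionalFeatures = [ ...new Set( [
		...( defaults.optionalFeatures || [] ),
		...( overrides.optionalFeatures || [] )
	] ) ];

	return { ...defaults, ...overrides, optionalFeatures };

}
```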

Collaborator Author

Let's wait for the answer here to see whether it's worth making it optional: https://twitter.com/fernandojsg/status/1286968102476029953


On Firefox Reality for HoloLens there's IPC involved, so once you enable hand input we have to calculate and send over hand poses each frame. I haven't perceived a major perf impact on simple demos, but it might not be great for complex ones.

It might be worth unconditionally requesting hands in VR so that you can render a hand model, but I think an opt-in might be good. Either way, it's worth measuring on both Oculus and FxR.

Contributor

It is the same for Oculus.


It seems like the IPC/compute overhead would be enough to make it opt-in, let alone the current pose GC issue (which we both hope to fix 😉).

@mrdoob
Owner

mrdoob commented Jul 25, 2020

It's also worth mentioning that, currently, we can't enter VR with hands.

We need to hit Enter VR with the controller, then put the controllers aside and wait until our hands are detected.

Oculus is working on fixing this in the next version.

@fernandojsg
Collaborator Author

fernandojsg commented Jul 25, 2020

> It's also worth mentioning that, currently, we can't enter VR with hands.
>
> We need to hit Enter VR with the controller, then put the controllers aside and wait until our hands are detected.
>
> Oculus is working on fixing this in the next version.

It seems that entering VR with hands could also be fixed by enabling this parameter in OB:
com oculus vrshell-20200725-124101
Although it crashes the browser from time to time when trying to enter VR.

(context https://twitter.com/DePanther/status/1286974408469381120)

@jespertheend
Sponsor Contributor

Is there some way to figure out which hand is the left hand and which is the right hand?

@jespertheend
Sponsor Contributor

Ah got it!

const hand = renderer.xr.getHand(id);
hand.addEventListener("connected", e => {
    console.log(e.data.handedness);
});
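Building on that, one could keep a small registry keyed by handedness; a sketch, where the `hands` registry is an illustration and `event.data` is the XRInputSource, whose `handedness` is 'left', 'right', or 'none':

```javascript
// Sketch: track which hand object is left vs. right via the
// connected/disconnected events fired on the objects from xr.getHand().
const hands = {};

function onHandConnected( hand, event ) {

	hands[ event.data.handedness ] = hand;

}

function onHandDisconnected( event ) {

	delete hands[ event.data.handedness ];

}
```

Wired up as, for example: hand.addEventListener( 'connected', ( e ) => onHandConnected( hand, e ) );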

7 participants