Decode 3D texture coordinates during upgrade #146

Closed · wants to merge 2 commits

Conversation

@javagl (Contributor) commented Aug 13, 2024

This is structurally similar to #98 (where "structurally similar" means "largely copy and paste"), with a similar summary:

  • glTF 1.0 (in B3DM or I3DM) could contain 3D (!) texture coordinates stored as SHORT
  • In glTF 1.0, these were decoded in the shader (shaders were part of the glTF 1.0 asset)
  • When upgrading glTF 1.0 to glTF 2.0 with gltf-pipeline, the 3D texture coordinates are carried over to the glTF 2.0 asset unchanged
  • This results in an invalid asset (e.g. with rendering errors in CesiumJS)

The exact encoding of the texture coordinates is not yet clear. It just seems to be "one (arbitrary) way" of storing VEC3/SHORT (3 * 2 = 6 bytes) instead of VEC2/FLOAT (2 * 4 = 8 bytes).

In one of the relevant glTF files, the decoding in the shader was done like this:

```glsl
const float uvMultiplier = 0.0000305185; // 1/32767
v_texcoord0 = a_texcoord0.xy * uvMultiplier * (a_texcoord0.z + 32767.0);
```

This was for an accessor with VEC3/SHORT.

This PR implements the decoding of VEC3/SHORT (or VEC3/BYTE) texture coordinates into VEC2/FLOAT texture coordinates. A quick test with one of the B3DM files that contain such glTF 1.0 data shows that the result can be rendered in CesiumJS.
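
For illustration, here is a minimal sketch of what this decoding might look like on the CPU side, assuming the shader formula quoted above is the one being reversed. The names and the array-based layout are illustrative only and are not the actual gltf-pipeline code of this PR:

```js
// Minimal sketch (not the actual code of this PR) of decoding VEC3/SHORT
// texture coordinates into VEC2/FLOAT, assuming the shader formula above.
const UV_MULTIPLIER = 1.0 / 32767.0;

// Decode one VEC3 element (raw SHORT components) into a [u, v] pair.
function decodeTexCoord(sx, sy, sz) {
  const scale = UV_MULTIPLIER * (sz + 32767.0);
  return [sx * scale, sy * scale];
}

// Decode a whole accessor: an Int16Array with 3 components per element
// becomes a Float32Array with 2 components per element.
function decodeTexCoords(shorts) {
  const count = shorts.length / 3;
  const floats = new Float32Array(count * 2);
  for (let i = 0; i < count; i++) {
    const [u, v] = decodeTexCoord(
      shorts[3 * i], shorts[3 * i + 1], shorts[3 * i + 2]);
    floats[2 * i] = u;
    floats[2 * i + 1] = v;
  }
  return floats;
}
```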

But... there are some guesses and degrees of freedom.

  • Is the encoding always implemented like this?
  • Does this have a name (like "oct-encoded" for normals)?
  • Should this also cover the case of UNSIGNED_SHORT components?
  • ...

Some of this still has to be sorted out and confirmed, so I'll just open this as a DRAFT for now.


(NOTE: The state of this PR currently tries to ignore missing BATCHID attributes in B3DM files. This should not be necessary, and it might be omitted in the final state. But there are reasons to assume that this is one quirk that can appear in ~"some old legacy data sets" - details TBD)

@lilleyse (Contributor) commented:

> Does this have a name (like "oct-encoded" for normals)?

Not sure. gltf-pipeline used to compress texture coordinates but from a quick glance it's a different technique: https://github.com/CesiumGS/gltf-pipeline/blob/0d44fb90a4bd9e9e6fba5abce3bb92752f801945/lib/compressTextureCoordinates.js

@javagl (Contributor, Author) commented Aug 14, 2024

The fact that there could have been "arbitrary" forms of texture coordinate compression in glTF (with all that arbitrariness being compensated for in a custom shader) makes it difficult to implement this generically, in a form that always works.

On the one hand, the specific form of compression used here appears in multiple files, but it is probably not widespread and unambiguous enough to commit to one decoding method. On the other hand: when we encounter a glTF with 3D texture coordinates and no clue about how they might be encoded, then trying to decode them (as in this PR) cannot "break" anything, because leaving them as a VEC3 would make the asset invalid in any case.

No strong opinion, though. Let's leave it as a 'draft' and see whether we encounter more such models, and notice a pattern...

@javagl (Contributor, Author) commented Aug 17, 2024

Split up and cleaned up into #147 and #148.

@javagl closed this Aug 17, 2024