
Nodes: Add SSAAPassNode. #29106

Merged
merged 7 commits into from Aug 12, 2024

Conversation

@Mugen87 Mugen87 commented Aug 10, 2024

Related issue: -

Description

This ports SSAARenderPass to WebGPURenderer.

SSAAPassNode does not yet support MRT since I'm not sure how to proceed.

I guess it's expected to supersample not just the beauty output but all other attachments as well. However, I'm not sure how to do the accumulation for the additional outputs and the depth. The accumulation itself is just a simple copy with an additively blended material (see the code in setup()), but I have not yet figured out how to apply it to an MRT configuration.
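For context, here is a minimal sketch of the single-output accumulation, loosely following the WebGL SSAARenderPass that this PR ports. It assumes renderer, scene, camera, width, height and the two render targets already exist; names like sampleRenderTarget, finalRenderTarget, copyMaterial (a copy shader material with an opacity uniform) and fsQuad (a fullscreen quad driven by copyMaterial) are illustrative, not the actual SSAAPassNode internals.

import * as THREE from 'three';

// Illustrative sketch only, not the actual setup() code.
const jitterOffsets = [ [ 0, 0 ], [ 4, 4 ], [ 4, 0 ], [ 0, 4 ] ]; // 4x SSAA, offsets in 1/16-pixel units
const sampleWeight = 1 / jitterOffsets.length;

copyMaterial.blending = THREE.AdditiveBlending; // accumulate the jittered samples
copyMaterial.transparent = true;
copyMaterial.depthTest = false;
copyMaterial.depthWrite = false;

for ( let i = 0; i < jitterOffsets.length; i ++ ) {

	const [ jx, jy ] = jitterOffsets[ i ];

	// jitter the projection by a sub-pixel offset
	camera.setViewOffset( width, height, jx * 0.0625, jy * 0.0625, width, height );

	// render the jittered scene into an intermediate target ...
	renderer.setRenderTarget( sampleRenderTarget );
	renderer.render( scene, camera );

	// ... and additively copy it into the final target, weighted by 1 / numSamples
	copyMaterial.uniforms.opacity.value = sampleWeight;
	renderer.setRenderTarget( finalRenderTarget );
	if ( i === 0 ) renderer.clear();
	fsQuad.render( renderer );

}

camera.clearViewOffset();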

We will need this bit for TRAA at a later point as well, so it would be beneficial to find a solution for SSAAPassNode now.

github-actions bot commented Aug 10, 2024

📦 Bundle size

Full ESM build, minified and gzipped.

Filesize dev: 685.1 kB (169.6 kB) | Filesize PR: 685.1 kB (169.6 kB) | Diff: +0 B

🌳 Bundle size after tree-shaking

Minimal build including a renderer, camera, empty scene, and dependencies.

Filesize dev: 462 kB (111.4 kB) | Filesize PR: 462 kB (111.4 kB) | Diff: +0 B

sunag commented Aug 11, 2024

SSAAPassNode does not yet support MRT since I'm not sure how to proceed.

I added some commits that should bring support. AutoMRT will not work on .fragmentNode; in these cases the user will have to set .fragmentNode = mrt( .. ) manually. I also cloned the RenderTarget, which should help when the user gets new textures using pass.getTexture().
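For illustration, setting the MRT layout manually when a custom fragmentNode is used might look like the snippet below. This is a sketch that assumes the TSL mrt() and transformedNormalView exports (shown here via 'three/tsl') and uses myColorNode as a placeholder for the custom fragment logic.

import { mrt, transformedNormalView } from 'three/tsl';

// AutoMRT does not apply when fragmentNode is set, so declare the outputs explicitly.
material.fragmentNode = mrt( {
	output: myColorNode, // placeholder for the material's custom fragment color
	normal: transformedNormalView
} );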

We need this bit at a later point for TRAA

Yes, rendering the same scene multiple times in a single frame seems really expensive...

Mugen87 commented Aug 11, 2024

I've tried to replace pass with ssaaPass in webgpu_postprocessing_ao, but only the beauty output is correct. Normal and depth appear to be empty. The code looks like this:

const scenePass = ssaaPass( scene, camera );
scenePass.setMRT( mrt( {
	output: output,
	normal: transformedNormalView
} ) );

const scenePassColor = scenePass.getTextureNode( 'output' );
const scenePassNormal = scenePass.getTextureNode( 'normal' );
const scenePassDepth = scenePass.getTextureNode( 'depth' );

postProcessing.outputNode = scenePassColor; // scenePassNormal and scenePassDepth do not work yet

Mugen87 commented Aug 11, 2024

Side note: I always find it great when engines or games offer SSAA as an option. In games, you normally can't use this type of anti-aliasing at release. But after some years of hardware progress it's nice if you can enable SSAA when you revisit a game. That way you get superior anti-aliasing without the temporal artifacts known from TRAA.

sunag commented Aug 11, 2024

Thanks for the tests with the normals. Replacing the MRT was necessary, otherwise it would get the normal information from the QuadMesh. It appears to be working now.

[screenshot]

sunag commented Aug 11, 2024

I added depth. It should be possible to gain performance by allocating it outside the jitterOffsets loop; using the depth of the sampleRenderTarget could be another possibility.

[screenshot]
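A sketch of the allocation idea, assuming the pass currently creates a depth texture per jittered sample; the property and variable names below are illustrative, not the actual SSAAPassNode fields.

// Illustrative only: create the depth texture once, outside the jitterOffsets loop,
// and reuse it for every jittered sample instead of recreating it per iteration.
if ( this._sampleDepthTexture === null ) {

	this._sampleDepthTexture = new THREE.DepthTexture( width, height );

}

sampleRenderTarget.depthTexture = this._sampleDepthTexture;

for ( const jitterOffset of jitterOffsets ) {

	// ... jitter the camera, render to sampleRenderTarget, accumulate ...

}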

@sunag sunag added this to the r168 milestone Aug 11, 2024

Mugen87 commented Aug 11, 2024

Regarding the depth: Is it already possible with the node material to write to frag_depth and gl_FragDepth? I guess we could use that to produce a supersampled depth value.

Edit: It seems it won't work since the blend does not honor the depth buffer...

sunag commented Aug 11, 2024

We would have to implement it for MRT as well, but I assume it won't work without an update. Another possibility would be to store the depth in a third MRT texture.
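Storing the depth as a third MRT output could look roughly like the snippet below. This is a sketch that reuses the earlier setMRT() example and assumes a view-space depth derived from TSL's positionView; the attachment name 'depth' is illustrative.

import { mrt, output, transformedNormalView, positionView } from 'three/tsl';

scenePass.setMRT( mrt( {
	output: output,
	normal: transformedNormalView,
	depth: positionView.z.negate() // illustrative: write linear view-space depth into a third attachment
} ) );

const scenePassDepth = scenePass.getTextureNode( 'depth' );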

sunag commented Aug 12, 2024

Edit: It seems it won't work since the blend does not honor the depth buffer...

Maybe we can leave this for later. I think the priority is the beauty output and allowing other MRT outputs; we already have great progress here.

@Mugen87 Mugen87 marked this pull request as ready for review August 12, 2024 07:36
@Mugen87 Mugen87 merged commit f7a09bb into mrdoob:dev Aug 12, 2024
12 checks passed

Mugen87 commented Aug 12, 2024

Agreed!
