Merged
2 changes: 1 addition & 1 deletion examples/jsm/tsl/display/SSGINode.js
@@ -471,7 +471,7 @@ class SSGINode extends TempNode {

If( this.useScreenSpaceSampling.equal( true ), () => {

-stepRadius.assign( RADIUS.mul( this._resolution.x.div( 2 ) ).div( float( STEP_COUNT ) ) );
+stepRadius.assign( RADIUS.mul( this._resolution.x.div( 2 ) ).div( float( 16 ) ) ); // SSRT3 has a bug where stepRadius is divided by STEP_COUNT twice; fix here
Collaborator:

Is 16 an arbitrary number? Why not use 8 or 32?

Collaborator:

Also fixed an issue where the radius would change with the step count in ScreenSpace Sampling mode. (This bug is also present in SSRT3.)

/ping @cdrintherrieno Are we doing the right thing here?

Contributor Author:


16 is arbitrary… I suppose we want it to scale across the long axis of the image. Since the example’s radius caps out at 25, this number could be 12.5, so that when the radius is maxed out, light propagates all the way from the left side of the screen to the right.

Is 25 an arbitrary number?

As for whether it’s the right thing to do… just try adjusting the step count and radius in dev (in a better scene) and you can see that they’re intertwined in a way they shouldn’t be (and that is inconsistent with Worldspace Sampling).
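The coupling described here can be checked with plain arithmetic. Below is a hedged sketch (plain JavaScript with illustrative names, not the node's actual code) of the total marched distance, assuming the ray takes `stepCount` steps and the marching loop divides the step radius by the step count once more:

```javascript
// Old formula: the radius is already divided by STEP_COUNT when
// stepRadius is computed, and the marching loop divides by STEP_COUNT
// again, so the total reach shrinks as the step count grows.
function totalMarchedOld( radius, halfResX, stepCount ) {

	const stepRadius = ( radius * halfResX ) / stepCount;
	return ( stepRadius / stepCount ) * stepCount; // sum of stepCount steps

}

// Fixed formula: divide by a constant (16) instead, so the total
// marched distance stays proportional to the radius regardless of
// the step count.
function totalMarchedFixed( radius, halfResX, stepCount ) {

	const stepRadius = ( radius * halfResX ) / 16;
	return ( stepRadius / stepCount ) * stepCount;

}

// With the old formula, doubling the step count halves the reach:
console.log( totalMarchedOld( 25, 960, 8 ) );   // 3000
console.log( totalMarchedOld( 25, 960, 16 ) );  // 1500
// With the fix, the reach is independent of the step count:
console.log( totalMarchedFixed( 25, 960, 8 ) );  // 1500
console.log( totalMarchedFixed( 25, 960, 16 ) ); // 1500
```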


} ).Else( () => {

41 changes: 21 additions & 20 deletions examples/webgpu_postprocessing_ssgi.html
@@ -28,7 +28,7 @@
<script type="module">

import * as THREE from 'three/webgpu';
-import { pass, mrt, output, normalView, velocity, add, vec3, vec4 } from 'three/tsl';
+import { pass, mrt, output, normalView, diffuseColor, velocity, add, vec3, vec4 } from 'three/tsl';
import { ssgi } from 'three/addons/tsl/display/SSGINode.js';
import { traa } from 'three/addons/tsl/display/TRAANode.js';

@@ -78,11 +78,13 @@
const scenePass = pass( scene, camera );
scenePass.setMRT( mrt( {
output: output,
diffuseColor: diffuseColor,
normal: normalView,
velocity: velocity
} ) );

const scenePassColor = scenePass.getTextureNode( 'output' );
const scenePassDiffuse = scenePass.getTextureNode( 'diffuseColor' );
@Mugen87 (Collaborator), Sep 11, 2025:

Do you mind adding this below the getTextureNode() block? This will change the format from RGBA16 to RGBA8, saving half of the memory and bandwidth. For modulating the GI, this precision should be sufficient.

const diffuseTexture = scenePass.getTexture( 'diffuseColor' );
diffuseTexture.type = THREE.UnsignedByteType;
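The saving is easy to sanity-check with arithmetic: a half-float RGBA texel is 8 bytes, an unsigned-byte RGBA texel is 4. A quick sketch (illustrative, assuming an uncompressed full-screen texture):

```javascript
// Bytes for an uncompressed RGBA texture: width x height x 4 channels
// x bytes per channel. Half-float uses 2 bytes per channel, unsigned
// byte uses 1, so switching the type halves memory and bandwidth.
function textureBytes( width, height, bytesPerChannel ) {

	return width * height * 4 * bytesPerChannel;

}

const halfFloat = textureBytes( 1920, 1080, 2 ); // RGBA16F
const byteType = textureBytes( 1920, 1080, 1 );  // RGBA8
console.log( halfFloat / byteType ); // 2
```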

const scenePassDepth = scenePass.getTextureNode( 'depth' );
const scenePassNormal = scenePass.getTextureNode( 'normal' );
const scenePassVelocity = scenePass.getTextureNode( 'velocity' );
@@ -97,8 +99,8 @@

const gi = giPass.rgb;
const ao = giPass.a;
-const compositePass = vec4( add( scenePassColor.rgb, gi ).mul( ao ), scenePassColor.a );
+const compositePass = vec4( add( scenePassColor.rgb.mul( ao ), ( scenePassDiffuse.rgb.mul( gi ) ) ), scenePassColor.a );
@Mugen87 (Collaborator), Sep 11, 2025:

Sidenote: If we want to incorporate GI and AO into the lighting in a more physically correct way, we need something like #28863.

AO should only modulate the indirect light (diffuse+specular). The computed GI should only contribute to the indirect diffuse.

For this, I would create a different (physically correct) example though since the setup is going to be way more complex. I believe we also need enhancements in the core.

@Mugen87 (Collaborator), Sep 11, 2025:

Unfortunately, we need an additional MRT output for the new formula.

Can we please preserve the previous composite as a comment? If developers reach the maximum number of MRT attachments, they then see a simpler formula that does not require scenePassDiffuse. Reducing the GI intensity should approximate a similar result and is maybe good enough for certain use cases.

When we improve packing with MRT in the future, additional MRT data should hopefully have less impact on performance.

@Mugen87 (Collaborator), Sep 11, 2025:


I wonder if the multiplication is correct, though 🤔. Isn't it producing an overall too-weak indirect diffuse? As far as I understand, SSGI should add an additional source of indirect diffuse light to the scene. If areas receive no indirect diffuse, the SSGI can't contribute any light because of the multiplication. You would have to brighten up the areas first, e.g. with an ambient light, so the SSGI is visible. That workflow seems strange to me.

I think if we want to treat SSGI as an alternative to an ambient/hemisphere light, light probe or indirect diffuse environment map, we need to stick to an additive formula.

@zalo (Contributor Author), Sep 11, 2025:


In dev, if you bounce white light onto a black surface, it will turn the surface white. You HAVE to at least multiply it by the albedo; it’s what SSRT3 does. I tried using the normalized scene color instead and it didn’t look nearly as correct.

What are the challenges with an extra MRT attachment? What is the maximum? Is it a WebGL fallback issue? Maybe we can make a sceneColor that has its deviation from albedo in the alpha channel (which would approximate monochromatic lighting intensity)?

I’m all for a proper GBuffer compositing example, but it’s worth noting that people will come to this example to incorporate the GI into their larger project. It shouldn’t be too wrong… 💀

@Mugen87 (Collaborator), Sep 11, 2025:


> What are the challenges with an extra MRT attachment? What is the maximum?

If you additionally need, for example, emissive or metal/roughness, you already reach the maximum. We recently had a case where this happened with WebGPU:

#31538 (comment)

So it's not a fallback issue; it happens with WebGPU as well. We just need to be aware of that restriction and try to keep the number of attachments as low as possible, optimize the texture formats as shown above, and start using packing.
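For reference, the attachment budget can be queried at runtime from the WebGPU adapter limits. A hedged sketch (browser API; the fallback value is an assumption based on the spec's default limit of 8):

```javascript
// Query how many color attachments (MRT outputs) the device supports.
// `navigator.gpu` only exists in WebGPU-capable browsers, so we fall
// back to the spec's default limit of 8 when it is unavailable.
async function getMaxMRTAttachments() {

	const gpu = globalThis.navigator?.gpu;
	if ( gpu === undefined ) return 8; // assumption: spec default

	const adapter = await gpu.requestAdapter();
	if ( adapter === null ) return 8;

	return adapter.limits.maxColorAttachments;

}
```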

@Mugen87 (Collaborator), Sep 11, 2025:


@WestLangley Does the new formula for incorporating the additional indirect diffuse from the SSGI look plausible to you? As outlined in #31882 (comment), it is not physically correct, but is multiplying the GI with the scene's diffuse color more sensible than just adding the GI to the scene's color? Do you see other options that we could use?

@cdrintherrieno It would be good for us to know what the albedo term in the Unity shader represents. Is it just the diffuse, or the total reflected light of a pixel? Is it a combined direct and indirect light term?

@cdrintherrieno, Sep 11, 2025:


@Mugen87 Albedo is the base color of a material (it does not contain any lighting). It's typically stored directly in the G-Buffer in deferred mode, or accessible as an input material texture in forward mode. You should multiply the diffuse lighting by the albedo to obtain the correct lit values (any form of diffuse light needs to be multiplied by the albedo, whether it's direct, ambient, or indirect diffuse). Here is a basic lighting example: https://learnopengl.com/PBR/Lighting.

Contributor Author:


Yeah… with packing we can probably also save on the normal map by just computing it from the depth map, and the depth map can probably be packed into the alpha of the velocity or albedo channels.

I assume we can also just render to more targets across multiple passes… but I guess that’s slow?

Perhaps when the number of MRT attachments goes above the WebGPU limit, it should transparently split the rendering into multiple passes?

AAA compositing pipelines aren’t going to want fewer GBuffers 😅

Collaborator:


> with packing we can probably also save on the normal map by just computing it from the depth map

Normal reconstruction is already supported. If you pass no normal node to the GI pass, it automatically reconstructs the data from the depth. However, if you are not bandwidth limited, I would recommend the normal map usage since the reconstruction is more expensive than a single texture lookup.
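For context, the usual reconstruction technique crosses two view-space tangents derived from neighbouring depth samples. A minimal plain-JavaScript sketch of the math (illustrative only; SSGINode's actual implementation may differ):

```javascript
// Reconstruct a surface normal from the view-space positions of a
// pixel and its right/up neighbours (each recovered from the depth
// buffer elsewhere): cross the two screen-space tangents.
function cross( a, b ) {

	return [
		a[ 1 ] * b[ 2 ] - a[ 2 ] * b[ 1 ],
		a[ 2 ] * b[ 0 ] - a[ 0 ] * b[ 2 ],
		a[ 0 ] * b[ 1 ] - a[ 1 ] * b[ 0 ]
	];

}

function normalize( v ) {

	const len = Math.hypot( v[ 0 ], v[ 1 ], v[ 2 ] );
	return v.map( ( c ) => c / len );

}

function reconstructNormal( pCenter, pRight, pUp ) {

	const dx = pRight.map( ( c, i ) => c - pCenter[ i ] );
	const dy = pUp.map( ( c, i ) => c - pCenter[ i ] );
	return normalize( cross( dx, dy ) );

}

// A flat patch facing the camera (constant depth) yields (0, 0, 1):
console.log( reconstructNormal( [ 0, 0, 5 ], [ 1, 0, 5 ], [ 0, 1, 5 ] ) );
```

This also illustrates the cost trade-off mentioned above: the reconstruction needs several depth fetches plus this math per pixel, versus a single texture lookup for a stored normal.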

Collaborator:


@Mugen87 AO should only attenuate indirect (ambient) light sources, and GI should add to indirect light sources. Total indirect light is then attenuated by the diffuse reflectance of the material.

Given the model appears to have both direct and indirect lighting baked into diffuse, I think this PR is a reasonable improvement.
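The rule above can be written out as a tiny per-channel sketch (plain JavaScript; all names are illustrative, not the example's actual node graph):

```javascript
// Per-channel compositing sketch following the rule above:
// AO attenuates only the indirect light, SSGI adds to the indirect
// diffuse, and the total indirect diffuse is tinted by the albedo.
function composite( direct, indirectDiffuse, gi, ao, albedo ) {

	const totalIndirectDiffuse = ( indirectDiffuse + gi ) * ao;
	return direct + albedo * totalIndirectDiffuse;

}

// A zero-albedo (black) surface reflects no diffuse light, so bounced
// GI cannot brighten it, consistent with multiplying by the albedo:
console.log( composite( 0.0, 0.2, 0.5, 1.0, 0.0 ) ); // 0
```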


// traa

@@ -123,27 +125,16 @@

//

-const gui = new GUI();
-gui.title( 'SSGI settings' );
-gui.add( giPass.sliceCount, 'value', 1, 4 ).step( 1 ).name( 'slice count' );
-gui.add( giPass.stepCount, 'value', 1, 32 ).step( 1 ).name( 'step count' );
-gui.add( giPass.radius, 'value', 1, 25 ).name( 'radius' );
-gui.add( giPass.expFactor, 'value', 1, 3 ).name( 'exp factor' );
-gui.add( giPass.thickness, 'value', 0.01, 10 ).name( 'thickness' );
-gui.add( giPass.backfaceLighting, 'value', 0, 1 ).name( 'backface lighting' );
-gui.add( giPass.aoIntensity, 'value', 0, 4 ).name( 'AO intenstiy' );
-gui.add( giPass.giIntensity, 'value', 0, 100 ).name( 'GI intenstiy' );
-gui.add( giPass.useLinearThickness, 'value' ).name( 'use linear thickness' );
-gui.add( giPass.useScreenSpaceSampling, 'value' ).name( 'screen-space sampling' );
-gui.add( giPass, 'useTemporalFiltering' ).name( 'temporal filtering' );

const params = {
output: 0
};

const types = { Default: 0, AO: 1, GI: 2, Beauty: 3 };

-gui.add( params, 'output', types ).onChange( value => {
+const gui = new GUI();
+gui.title( 'SSGI settings' );
+let outputDropdown = gui.add( params, 'output', types );
+outputDropdown.onChange( value => {

if ( value === 1 ) {

@@ -159,14 +150,24 @@

} else {

-postProcessing.outputNode = traaPass;
+postProcessing.outputNode = giPass.useTemporalFiltering ? traaPass : compositePass;

}

postProcessing.needsUpdate = true;

} );

+gui.add( giPass.sliceCount, 'value', 1, 4 ).step( 1 ).name( 'slice count' );
+gui.add( giPass.stepCount, 'value', 1, 32 ).step( 1 ).name( 'step count' );
+gui.add( giPass.radius, 'value', 1, 25 ).name( 'radius' );
+gui.add( giPass.expFactor, 'value', 1, 3 ).name( 'exp factor' );
+gui.add( giPass.thickness, 'value', 0.01, 10 ).name( 'thickness' );
+gui.add( giPass.backfaceLighting, 'value', 0, 1 ).name( 'backface lighting' );
+gui.add( giPass.aoIntensity, 'value', 0, 4 ).name( 'AO intenstiy' );
+gui.add( giPass.giIntensity, 'value', 0, 100 ).name( 'GI intenstiy' );
+gui.add( giPass.useLinearThickness, 'value' ).name( 'use linear thickness' );
+gui.add( giPass.useScreenSpaceSampling, 'value' ).name( 'screen-space sampling' );
+gui.add( giPass, 'useTemporalFiltering' ).name( 'temporal filtering' ).onChange( outputDropdown._onChange );
}

function onWindowResize() {