SSGINode: Improve GI Blending #31882
```diff
@@ -28,7 +28,7 @@
 	<script type="module">

 		import * as THREE from 'three/webgpu';
-		import { pass, mrt, output, normalView, velocity, add, vec3, vec4 } from 'three/tsl';
+		import { pass, mrt, output, normalView, diffuseColor, velocity, add, vec3, vec4 } from 'three/tsl';
```
*zalo marked this conversation as resolved.*
```diff
 		import { ssgi } from 'three/addons/tsl/display/SSGINode.js';
 		import { traa } from 'three/addons/tsl/display/TRAANode.js';
```
```diff
@@ -78,11 +78,13 @@
 	const scenePass = pass( scene, camera );
 	scenePass.setMRT( mrt( {
 		output: output,
+		diffuseColor: diffuseColor,
 		normal: normalView,
 		velocity: velocity
 	} ) );

 	const scenePassColor = scenePass.getTextureNode( 'output' );
+	const scenePassDiffuse = scenePass.getTextureNode( 'diffuseColor' );
```
**Review comment:** Do you mind adding this below the

```js
const diffuseTexture = scenePass.getTexture( 'diffuseColor' );
diffuseTexture.type = THREE.UnsignedByteType;
```
```diff
 	const scenePassDepth = scenePass.getTextureNode( 'depth' );
 	const scenePassNormal = scenePass.getTextureNode( 'normal' );
 	const scenePassVelocity = scenePass.getTextureNode( 'velocity' );
```
```diff
@@ -97,8 +99,8 @@
 	const gi = giPass.rgb;
 	const ao = giPass.a;
-	const compositePass = vec4( add( scenePassColor.rgb, gi ).mul( ao ), scenePassColor.a );
+	const compositePass = vec4( add( scenePassColor.rgb.mul( ao ), ( scenePassDiffuse.rgb.mul( gi ) ) ), scenePassColor.a );
```
**Review comment:** Sidenote: if we want to include GI and AO more physically correctly in the lighting, we need something like #28863. AO should only modulate the indirect light (diffuse + specular), and the computed GI should only contribute to the indirect diffuse. For this I would create a different (physically correct) example, though, since the setup is going to be way more complex. I believe we also need enhancements in the core.

**Review comment:** Unfortunately, we need an additional MRT output for the new formula. Can we please preserve the previous composite as a comment? Then, if developers reach the maximum number of MRT attachments, they see a more simplified formula that does not require the additional attachment. When we improve packing with MRT in the future, additional MRT data should hopefully have less influence on performance.

**Review comment:** I wonder if the multiplication is correct, though 🤔. Isn't that producing an overall too-weak indirect diffuse? As far as I understand, SSGI should add an additional source of indirect diffuse light to the scene. If areas receive no indirect diffuse, the SSGI can't contribute any light because of the multiplication. You would have to brighten up the areas first, e.g. with an ambient light, so the SSGI is visible. That workflow seems strange to me. I think if we want to treat SSGI as an alternative to an ambient/hemisphere light, light probe, or indirect diffuse environment map, we need to stick to an additive formula.

**Review comment:** In `dev`, if you bounce white light onto a black surface, it will turn the surface white. You HAVE to at least multiply it by the albedo; it's what SSRT3 does. I tried using the normalized scene color instead and it didn't look nearly as correct. What are the challenges with an extra MRT attachment? What is the maximum? Is it a WebGL fallback issue? Maybe we can make a sceneColor that has its deviation from albedo in the alpha channel (which would approximate monochromatic lighting intensity)? I'm all for a proper GBuffer compositing example, but it's worth noting that people will come to this example to incorporate the GI into their larger projects. It shouldn't be too wrong… 💀

**Review comment:** If you additionally need, for example, emissive or metal/roughness, you already reach the maximum. We lately had a case where this happened with WebGPU, so it's not a fallback issue; it happens with WebGPU as well. We just need to be aware of that restriction, try to keep the number of attachments as low as possible, optimize the texture formats like below, and start using packing.

**Review comment:** @WestLangley Does the new formula for incorporating the additional indirect diffuse from the SSGI look plausible to you? As outlined in #31882 (comment), it is not physically correct, but is multiplying the GI with the scene's diffuse color more sensible than just adding the GI to the scene's color? Do you see other options that we could use? @cdrintherrieno It would be good for us to know what the albedo is in this context.

**Review comment:** @Mugen87 Albedo is the base color of a material (it does not contain any lighting). It's typically stored directly in the G-Buffer in deferred mode, or accessible as an input material texture in forward mode. You should multiply the diffuse lighting by the albedo to obtain the correct lit values (any form of diffuse light needs to be multiplied by the albedo, whether it's direct, ambient, or indirect diffuse). Here is a basic lighting example: https://learnopengl.com/PBR/Lighting.

**Review comment:** Yeah… with packing we can probably also save on the normal map by just computing it from the depth map, and the depth map can probably be packed into the alpha of the velocity or albedo channels. I assume we can also just render to more targets across multiple passes… but I guess that's slow? Perhaps when the number of MRT attachments goes above the WebGPU limit, it should transparently split the rendering into multiple passes? AAA compositing pipelines aren't going to want fewer GBuffers 😅

**Review comment:** Normal reconstruction is already supported. If you pass no normal node to the GI pass, it automatically reconstructs the data from the depth. However, if you are not bandwidth limited, I would recommend using the normal map, since the reconstruction is more expensive than a single texture lookup.

**Review comment:** @Mugen87 AO should only attenuate indirect (ambient) light sources, and GI should add to indirect light sources. Total indirect light is then attenuated by the diffuse reflectance of the material. Given the model appears to have both direct and indirect lighting baked into diffuse, I think this PR is a reasonable improvement.
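The disagreement above comes down to simple per-pixel arithmetic. A minimal sketch (plain JavaScript scalars standing in for one color channel, not actual TSL nodes; the function names are hypothetical) contrasting the previous additive composite with the new albedo-modulated one on the black-surface case described above:

```javascript
// Previous composite: ( sceneColor + gi ) * ao
function compositeAdditive( sceneColor, gi, ao ) {

	return ( sceneColor + gi ) * ao;

}

// New composite: sceneColor * ao + diffuseColor * gi
function compositeAlbedoModulated( sceneColor, diffuseColor, gi, ao ) {

	return sceneColor * ao + diffuseColor * gi;

}

const gi = 1.0; // strong white indirect bounce hitting the surface
const ao = 1.0; // fully unoccluded

// A black-albedo surface reflects nothing, so both its diffuse color
// and its lit scene color are 0.
console.log( compositeAdditive( 0.0, gi, ao ) );             // 1 — black surface turns white
console.log( compositeAlbedoModulated( 0.0, 0.0, gi, ao ) ); // 0 — stays black
```

This mirrors the point that bounce light must be attenuated by the receiving surface's albedo before it is added to the frame.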
```diff
 	// traa
```
```diff
@@ -123,27 +125,16 @@
 	//

-	const gui = new GUI();
-	gui.title( 'SSGI settings' );
-	gui.add( giPass.sliceCount, 'value', 1, 4 ).step( 1 ).name( 'slice count' );
-	gui.add( giPass.stepCount, 'value', 1, 32 ).step( 1 ).name( 'step count' );
-	gui.add( giPass.radius, 'value', 1, 25 ).name( 'radius' );
-	gui.add( giPass.expFactor, 'value', 1, 3 ).name( 'exp factor' );
-	gui.add( giPass.thickness, 'value', 0.01, 10 ).name( 'thickness' );
-	gui.add( giPass.backfaceLighting, 'value', 0, 1 ).name( 'backface lighting' );
-	gui.add( giPass.aoIntensity, 'value', 0, 4 ).name( 'AO intensity' );
-	gui.add( giPass.giIntensity, 'value', 0, 100 ).name( 'GI intensity' );
-	gui.add( giPass.useLinearThickness, 'value' ).name( 'use linear thickness' );
-	gui.add( giPass.useScreenSpaceSampling, 'value' ).name( 'screen-space sampling' );
-	gui.add( giPass, 'useTemporalFiltering' ).name( 'temporal filtering' );
-
 	const params = {
 		output: 0
 	};

 	const types = { Default: 0, AO: 1, GI: 2, Beauty: 3 };

-	gui.add( params, 'output', types ).onChange( value => {
+	const gui = new GUI();
+	gui.title( 'SSGI settings' );
+	let outputDropdown = gui.add( params, 'output', types );
+	outputDropdown.onChange( value => {

 		if ( value === 1 ) {
```
```diff
@@ -159,14 +150,24 @@
 		} else {

-			postProcessing.outputNode = traaPass;
+			postProcessing.outputNode = giPass.useTemporalFiltering ? traaPass : compositePass;

 		}

 		postProcessing.needsUpdate = true;

 	} );

+	gui.add( giPass.sliceCount, 'value', 1, 4 ).step( 1 ).name( 'slice count' );
+	gui.add( giPass.stepCount, 'value', 1, 32 ).step( 1 ).name( 'step count' );
+	gui.add( giPass.radius, 'value', 1, 25 ).name( 'radius' );
+	gui.add( giPass.expFactor, 'value', 1, 3 ).name( 'exp factor' );
+	gui.add( giPass.thickness, 'value', 0.01, 10 ).name( 'thickness' );
+	gui.add( giPass.backfaceLighting, 'value', 0, 1 ).name( 'backface lighting' );
+	gui.add( giPass.aoIntensity, 'value', 0, 4 ).name( 'AO intensity' );
+	gui.add( giPass.giIntensity, 'value', 0, 100 ).name( 'GI intensity' );
+	gui.add( giPass.useLinearThickness, 'value' ).name( 'use linear thickness' );
+	gui.add( giPass.useScreenSpaceSampling, 'value' ).name( 'screen-space sampling' );
+	gui.add( giPass, 'useTemporalFiltering' ).name( 'temporal filtering' ).onChange( outputDropdown._onChange );

 }

 function onWindowResize() {
```
**Review comment:** Is `16` an arbitrary number? Why not use `8` or `32`?

**Review comment:** /ping @cdrintherrieno Are we doing the right thing here?
**Review comment:** 16 is arbitrary… I suppose we want it to scale across the long axis of the image, so since the example's radius caps out at 25, this number could be 12.5, so that when the radius is maxed out, it propagates light all the way from the left side to the right side. Is 25 an arbitrary number? As for whether it's the right thing to do… just try adjusting the step count and radius in `dev` (in a better scene) and you can see that they're intertwined in a way they shouldn't be (and that is inconsistent with Worldspace Sampling).
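The coupling being described can be sketched with simple arithmetic. This is illustrative only, assuming the ray march advances a fixed fraction of the radius per step (as uniform marching typically does); `stepSize` is a hypothetical helper, not SSGINode's actual code:

```javascript
// If each of stepCount samples advances radius / stepCount along the ray,
// then enlarging the radius also enlarges the gap between samples:
function stepSize( radius, stepCount ) {

	return radius / stepCount;

}

console.log( stepSize( 25, 16 ) ); // 1.5625 — sparse: thin occluders can be stepped over
console.log( stepSize( 5, 16 ) );  // 0.3125 — dense: same step count, finer sampling
```

Under this assumption the two GUI sliders are intertwined: raising the radius silently lowers the effective sample density unless the step count is raised with it.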