
GenerateEnvironmentMapLight #9414

Closed
wants to merge 58 commits into from

Conversation

JMS55
Contributor

@JMS55 JMS55 commented Aug 10, 2023

Objective

Solution

  • Adds the GenerateEnvironmentMapLight component.
  • Implements "Fast Filtering of Reflection Probes" for the specular filtering.
    • Special thanks to Godot - their implementation was an extremely valuable reference.
  • Implements cosine-weighted hemisphere importance sampling for the diffuse filtering.

Changelog

  • Added GenerateEnvironmentMapLight for automatically generating an EnvironmentMapLight component from a Skybox component. This can be used instead of KhronosGroup's glTF-IBL-Sampler.

@JMS55 JMS55 added C-Feature A new feature, making something new possible A-Rendering Drawing game state to the screen labels Aug 10, 2023
@JMS55 JMS55 added this to the 0.12 milestone Aug 10, 2023
@github-actions
Contributor

Example no_renderer failed to run, please try running it locally and check the result.

@JMS55
Contributor Author

JMS55 commented Nov 28, 2023

@robtfm add GenerateEnvironmentMapLight::default() to the camera in the skybox example, then use RenderDoc to inspect the generated textures and compare them against the textures generated manually with Khronos's tool.

@superdump
Contributor

Is filter_coefficents.rs still needed now that the coefficients have been serialised into filter_coefficents.bin?

Also, coefficents -> coefficients.

@JMS55
Contributor Author

JMS55 commented Nov 28, 2023

The Rust file is not needed, no. It's unused, but I left it in as a reference. I can remove it if we're OK with an opaque .bin file. Up to you/cart.

Also damn, I thought I had double checked my spelling :(

@JMS55 JMS55 requested a review from pcwalton December 3, 2023 18:27
Contributor

@pcwalton pcwalton left a comment


I haven't fully digested the guts of the shader yet but here are a bunch of comments.

case 0u: { return vec3(1.0, v, -u); }
case 1u: { return vec3(-1.0, v, u); }
case 2u: { return vec3(u, 1.0, -v); }
case 3u: { return vec3(u, -1.0, v); }
Contributor Author

@JMS55 JMS55 Dec 15, 2023


I used the same get_dir function from https://www.activision.com/cdn/research/filter_using_table_128.txt that I used in the other shaders
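For reference, the face-to-direction lookup quoted above can be sketched CPU-side. Faces 0..=3 match the WGSL excerpt; the remaining two faces (the ±Z faces) are assumed here to follow the same pattern as Activision's `get_dir` table and are not confirmed by the excerpt:

```rust
// CPU-side sketch of the cube-face direction lookup. Faces 0..=3 match
// the quoted WGSL cases; faces 4 and 5 are assumed (+Z and -Z).
fn get_dir(u: f32, v: f32, face: u32) -> [f32; 3] {
    match face {
        0 => [1.0, v, -u],
        1 => [-1.0, v, u],
        2 => [u, 1.0, -v],
        3 => [u, -1.0, v],
        4 => [u, v, 1.0],
        _ => [-u, v, -1.0],
    }
}

fn main() {
    // The center of each face (u = v = 0) points along that face's axis.
    assert_eq!(get_dir(0.0, 0.0, 0), [1.0, 0.0, 0.0]);
    assert_eq!(get_dir(0.0, 0.0, 2), [0.0, 1.0, 0.0]);
    assert_eq!(get_dir(0.0, 0.0, 4), [0.0, 0.0, 1.0]);
}
```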

Comment on lines +39 to +40
/// * The first frame this component is added to the skybox entity, an [`EnvironmentMapLight`]
/// component will be generated and added to the skybox entity.
Contributor


You could just say "this is generally a quick operation".

var color = vec4(0.0);
for (var axis = 0u; axis < 3u; axis++) {
let other_axis0 = 1u - (axis & 1u) - (axis >> 1u);
let other_axis1 = 2u - (axis >> 1u);
Contributor


I feel like if you're switching over axis below anyway you might as well just set other_axis0 and other_axis1 explicitly in a switch statement instead of using these bit tricks.
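The two forms are equivalent; a small standalone Rust sketch (not part of the PR) checks the bit tricks against the explicit table:

```rust
// The bit-trick form from the quoted WGSL, ported to Rust.
fn other_axes_bit_trick(axis: u32) -> (u32, u32) {
    (1 - (axis & 1) - (axis >> 1), 2 - (axis >> 1))
}

// The explicit form the review suggests: for each axis, the other two.
fn other_axes_explicit(axis: u32) -> (u32, u32) {
    match axis {
        0 => (1, 2),
        1 => (0, 2),
        _ => (0, 1),
    }
}

fn main() {
    // Both agree for all three axes.
    for axis in 0..3 {
        assert_eq!(other_axes_bit_trick(axis), other_axes_explicit(axis));
    }
}
```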

var color = vec3(0.0);
for (var sample_i = 0u; sample_i < 32u; sample_i++) {
// R2 sequence - http://extremelearning.com.au/unreasonable-effectiveness-of-quasirandom-sequences
let r = fract(0.5 + f32(sample_i) * vec2<f32>(0.75487766624669276005, 0.5698402909980532659114));
Contributor


Factor this out into a rand function :)
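Pulled out into a function, the quoted R2 line looks like this (a standalone Rust sketch; the constants are 1/p and 1/p² for the plastic number p ≈ 1.3247179572):

```rust
// Sketch of the R2 low-discrepancy sequence from the quoted shader line,
// factored into a standalone function as the review suggests.
fn r2_sequence(i: u32) -> (f32, f32) {
    const A1: f32 = 0.754_877_7; // 1/p
    const A2: f32 = 0.569_840_3; // 1/p^2
    let n = i as f32;
    ((0.5 + n * A1).fract(), (0.5 + n * A2).fract())
}

fn main() {
    // Every sample stays inside [0, 1)^2, and sample 0 is the center.
    for i in 0..32 {
        let (x, y) = r2_sequence(i);
        assert!((0.0..1.0).contains(&x) && (0.0..1.0).contains(&y));
    }
    assert_eq!(r2_sequence(0), (0.5, 0.5));
}
```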

Contributor Author


It's 1 line, I'm inclined not to 😅

@JMS55 JMS55 requested review from pcwalton and IceSentry December 15, 2023 20:12
Contributor

@pcwalton pcwalton left a comment


LGTM

@JMS55
Contributor Author

JMS55 commented Dec 29, 2023

Diffuse filtering is suffering from too few samples, and fireflies. I should look at https://www.shadertoy.com/view/4c2GRh.

// R2 sequence - http://extremelearning.com.au/unreasonable-effectiveness-of-quasirandom-sequences
let r = fract(0.5 + f32(sample_i) * vec2<f32>(0.75487766624669276005, 0.5698402909980532659114));

let cos_theta = sqrt(1.0 - f32(r.y));
Contributor


Suggested change
let cos_theta = sqrt(1.0 - f32(r.y));
// map uniformly distributed [0..1)^2 into hemisphere with cosine importance (Lambertian distribution)
let cos_theta = sqrt(1.0 - f32(r.y));
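The full cosine-weighted mapping behind that line can be sketched standalone in Rust (not part of the PR): a uniform point in [0,1)² maps to a direction on the +Z hemisphere with density cos(θ)/π.

```rust
use std::f32::consts::TAU;

// Cosine-weighted (Lambertian) hemisphere sampling: the quoted
// `cos_theta = sqrt(1 - r.y)` plus the matching sin_theta and phi.
fn cosine_sample_hemisphere(rx: f32, ry: f32) -> [f32; 3] {
    let cos_theta = (1.0 - ry).sqrt(); // the line the suggestion annotates
    let sin_theta = ry.sqrt();         // sin^2 + cos^2 = 1
    let phi = TAU * rx;
    [sin_theta * phi.cos(), sin_theta * phi.sin(), cos_theta]
}

fn main() {
    // Samples are unit length and never below the hemisphere.
    let d = cosine_sample_hemisphere(0.25, 0.5);
    let len2 = d[0] * d[0] + d[1] * d[1] + d[2] * d[2];
    assert!((len2 - 1.0).abs() < 1e-5);
    assert!(d[2] >= 0.0);
    // (0, 0) maps straight up the hemisphere axis.
    assert_eq!(cosine_sample_hemisphere(0.0, 0.0), [0.0, 0.0, 1.0]);
}
```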

fn main(@builtin(global_invocation_id) global_id: vec3<u32>) {
var id = global_id;
var level = 0u;
if id.x < 128u * 128u {
Contributor


I think it has to process the biggest mip first for other reasons, maybe because all the mips after it depend on it. But WGSL workgroups are not guaranteed to run in any particular order, so I'm not sure.

id.x -= id.y * res;

let u = (f32(id.x) * 2.0 + 1.0) / f32(res) - 1.0;
let v = -(f32(id.y) * 2.0 + 1.0) / f32(res) + 1.0;
Contributor


It maps to the centers of every 2x2 texel patch, because the lower mip level must represent those texels as one larger texel that covers the area of all four original ones.

id.y = id.x / res;
id.x -= id.y * res;

let u = (f32(id.x) * 2.0 + 1.0) / f32(res) - 1.0;
Contributor


Suggested change
let u = (f32(id.x) * 2.0 + 1.0) / f32(res) - 1.0;
// remap integers [0..res-1]^2 to the centers of every 2x2 texel patch we are mipping down to one texel, in (-1..1)^2
let u = (f32(id.x) * 2.0 + 1.0) / f32(res) - 1.0;
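The remap in that line is easy to verify with a standalone Rust sketch (not part of the PR): integer texel index x in [0, res) lands on the texel center in (-1, 1).

```rust
// Remap a texel index to its center in a (-1, 1) coordinate range,
// mirroring the quoted `u = (x * 2 + 1) / res - 1` line.
fn texel_center(x: u32, res: u32) -> f32 {
    (x as f32 * 2.0 + 1.0) / res as f32 - 1.0
}

fn main() {
    let res = 4;
    // Centers are symmetric about 0 and strictly inside (-1, 1).
    assert_eq!(texel_center(0, res), -0.75);
    assert_eq!(texel_center(3, res), 0.75);
    for x in 0..res {
        assert!(texel_center(x, res).abs() < 1.0);
    }
}
```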

@JMS55
Contributor Author

JMS55 commented Dec 30, 2023

This PR has artifacts unfortunately. It can't be merged as-is.


id.z = id.y;
let res = 128u >> level;
id.y = id.x / res;
Contributor


Could save an integer division here with:

Suggested change
id.y = id.x / res;
id.y = id.x >> (7 - level);
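The suggested strength reduction holds because `res = 128 >> level` is always a power of two, so dividing by it equals shifting; a standalone Rust sketch (not part of the PR) checks every mip level:

```rust
// With res = 128 >> level, `x / res` equals `x >> (7 - level)`
// for every level 0..=7, since res is a power of two.
fn main() {
    for level in 0u32..=7 {
        let res = 128u32 >> level;
        for x in 0..(res * res) {
            assert_eq!(x / res, x >> (7 - level));
        }
    }
    println!("division and shift agree for all mip levels");
}
```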

@JMS55 JMS55 marked this pull request as draft December 31, 2023 06:46
@JMS55 JMS55 removed this from the 0.13 milestone Jan 20, 2024
@JMS55 JMS55 closed this Mar 29, 2024
pcwalton added a commit to pcwalton/bevy that referenced this pull request Jun 13, 2024
This commit introduces a new type of camera, the *omnidirectional*
camera. These cameras render to a cubemap texture, and as such extract
into six different cameras in the render world, one for each side. The
cubemap texture can be attached to a reflection probe as usual, which
allows reflections to contain moving objects. To use an omnidirectional
camera, create an [`OmnidirectionalCameraBundle`].

Because omnidirectional cameras extract to six different sub-cameras in
the render world, render world extraction code that targets components
present on cameras now needs to be aware of this fact and extract
components to the individual sub-cameras, not the root camera component.
They also need to run after omnidirectional camera extraction, as only
then will the sub-cameras be present in the render world. New plugins,
`ExtractCameraComponentPlugin` and `ExtractCameraInstancesPlugin`, are
available to assist with this.

Each side of an omnidirectional camera can be individually marked as
active via the `ActiveCubemapSides` bitfield. This allows for the common
technique of rendering only one (or two, or three) sides of the cubemap
per frame, to reduce rendering overhead. It also allows for on-demand
rendering, so that an application that wishes to optimize further can
choose sides to refresh. For example, an application might wish to only
rerender sides whose frusta contain moving entities.

In addition to real-time reflection probes, this patch introduces much
of the infrastructure necessary to support baking reflection probes from
within Bevy as opposed to in an external program such as Blender, which
has been the status quo up to this point. Even with this patch, there
are still missing pieces needed to make this truly convenient, however:

1. Baking a reflection probe requires more than just saving a cubemap:
   it requires pre-filtering the cubemap into diffuse and specular parts
   in the same way that the [glTF IBL Sampler] does. This is not yet
   implemented in Bevy; see bevyengine#9414 for a previous attempt.

2. The cubemap needs to be saved in `.ktx2` format, as that's the only
   format that Bevy presently knows how to load. There's no
   comprehensive Rust crate for this, though note that my [glTF IBL
   Sampler UI] has code to do it for the specific case of cubemaps.

3. An editor UI is necessary for convenience, as otherwise every
   application will have to create some sort of bespoke tool that
   arranges scenes and saves the reflection cubemaps.

The `reflection_probes` example has been updated in order to add an
option to enable dynamic reflection probes, as well as an option to spin
the cubes so that the impact of the dynamic reflection probes is
visible. Additionally, the static reflection probe, which was previously
rendered in Blender, has been changed to one rendered in Bevy. This
results in a change in appearance, as Blender and Bevy render somewhat
differently.

Partially addresses bevyengine#12233.

[glTF IBL Sampler]: https://github.com/KhronosGroup/glTF-IBL-Sampler

[glTF IBL Sampler UI]: https://github.com/pcwalton/gltf-ibl-sampler-egui
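The `ActiveCubemapSides` idea described above can be sketched as a plain bitfield. This is a hypothetical illustration of the round-robin technique, not the actual Bevy type, whose API may differ:

```rust
// Hypothetical sketch of a per-side activity bitfield like the
// `ActiveCubemapSides` described in the commit message above.
// Each of the low six bits marks one cubemap face for re-rendering.
#[derive(Clone, Copy, PartialEq, Debug)]
struct ActiveCubemapSides(u8);

impl ActiveCubemapSides {
    const ALL: Self = Self(0b0011_1111);

    fn is_active(self, side: u8) -> bool {
        (self.0 >> side) & 1 == 1
    }

    /// Round-robin: activate one of the six sides per frame, spreading
    /// the cubemap refresh over six frames to reduce per-frame cost.
    fn one_side_per_frame(frame: u64) -> Self {
        Self(1 << (frame % 6))
    }
}

fn main() {
    let f0 = ActiveCubemapSides::one_side_per_frame(0);
    assert!(f0.is_active(0) && !f0.is_active(1));
    // Frame 7 wraps around to side 1.
    assert_eq!(ActiveCubemapSides::one_side_per_frame(7).0, 0b10);
    assert!(ActiveCubemapSides::ALL.is_active(5));
}
```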
pcwalton added a commit to pcwalton/bevy that referenced this pull request Jun 14, 2024
@JMS55
Contributor Author

JMS55 commented Nov 8, 2024

Labels
A-Rendering Drawing game state to the screen C-Feature A new feature, making something new possible
Projects
Status: Done
Development

Successfully merging this pull request may close these issues.

Automatic Skybox -> EnvironmentMapLight generation
9 participants