Apply suggestions from code review
Co-authored-by: Robert Swain <[email protected]>
alice-i-cecile and superdump authored Jun 18, 2024
1 parent 410875a commit a96726d
Showing 8 changed files with 10 additions and 10 deletions.
@@ -1,7 +1,7 @@
Percentage-closer filtering is a standard anti-aliasing technique used to get softer, less jagged shadows.
To do so, we sample from the shadow map near the pixel of interest using a Gaussian kernel, averaging the results to reduce sudden transitions as we move in / out of the shadow.

-As a result, Bevy's point lights now look softer and more natural, without any changes to end user code. As before, you can configure the exact strategy used to alias your shadows by setting the [`ShadowFilteringMethod`](https://dev-docs.bevyengine.org/bevy/pbr/enum.ShadowFilteringMethod.html) component on your 3D cameras.
+As a result, Bevy's point lights now look softer and more natural, without any changes to end user code. As before, you can configure the exact strategy used to anti-alias your shadows by setting the [`ShadowFilteringMethod`](https://dev-docs.bevyengine.org/bevy/pbr/enum.ShadowFilteringMethod.html) component on your 3D cameras.

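If you want to pick a different filter yourself, here is a minimal sketch of overriding the method on a camera. The `Gaussian` variant name is an assumption about the 0.14 enum, so check the `ShadowFilteringMethod` docs linked above:

```rust
use bevy::pbr::ShadowFilteringMethod;
use bevy::prelude::*;

fn setup(mut commands: Commands) {
    commands.spawn((
        Camera3dBundle::default(),
        // Variant name assumed from the 0.14 API; see the docs linked above.
        ShadowFilteringMethod::Gaussian,
    ));
}
```
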
{{ compare_slider(
left_title="Without PCF filtering",
@@ -3,12 +3,12 @@

When looking at objects far away, it's hard to make out the details!
This obvious fact is just as true in rendering as it is in real life.
-As a result, using complex, high-fidelity models for distant objects is a waste: we can replace them with simplified equivalents (whose lower resolution textures are called mipmaps).
+As a result, using complex, high-fidelity models for distant objects is a waste: we can replace their meshes with simplified equivalents (whose lower resolution textures are called mipmaps).

-By automatically varying the **level-of-detail** (LOD) of our models in this way, we can render much larger scenes (or the same open world with a higher draw distance), swapping out models on the fly based on their proximity to the player.
+By automatically varying the **level-of-detail** (LOD) of our models in this way, we can render much larger scenes (or the same open world with a higher draw distance), swapping out meshes on the fly based on their proximity to the player.
Bevy now supports one of the most foundational tools for this: **visibility ranges** (sometimes called hierarchical levels of detail, as it allows users to replace multiple meshes with a single object).

-By setting the `VisibilityRange` component on your model entities, developers can automatically control the range from the camera at which their models will appear and disappear, automatically fading between the two options using dithering.
+By setting the `VisibilityRange` component on your mesh entities, developers can automatically control the range from the camera at which their meshes will appear and disappear, automatically fading between the two options using dithering.
Hiding meshes happens early in the rendering pipeline, so this feature can be efficiently used for level of detail optimization.
As a bonus, this feature is properly evaluated per-view, so different views can show different levels of detail.

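Here is a rough sketch of what a two-level setup might look like. The `start_margin` / `end_margin` field names and the import path are assumptions based on the 0.14 API, so treat this as a starting point rather than a reference:

```rust
use bevy::prelude::*;
use bevy::render::view::VisibilityRange;

// `high_detail`, `low_detail` and `material` are assumed, already-loaded handles.
fn spawn_lod_pair(
    commands: &mut Commands,
    high_detail: Handle<Mesh>,
    low_detail: Handle<Mesh>,
    material: Handle<StandardMaterial>,
) {
    // Visible from the camera out to ~100 units, cross-fading away between 80 and 100.
    commands.spawn((
        PbrBundle { mesh: high_detail, material: material.clone(), ..default() },
        VisibilityRange { start_margin: 0.0..0.0, end_margin: 80.0..100.0 },
    ));
    // Fades in over the same 80..100 band, then fades out again past 500 units.
    commands.spawn((
        PbrBundle { mesh: low_detail, material, ..default() },
        VisibilityRange { start_margin: 80.0..100.0, end_margin: 500.0..550.0 },
    ));
}
```
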
@@ -1,5 +1,5 @@
In rendering, **depth of field** is an effect that mimics the [limitations of physical lenses](https://en.wikipedia.org/wiki/Depth_of_field).
-By virtue of the way light works, lens (like that of the human eye or a film camera) can only focus on objects that are within a specific range (depth) from them, causing all others to be blurry and out of focus.
+By virtue of the way light works, lenses (like that of the human eye or a film camera) can only focus on objects that are within a specific range (depth) from them, causing all others to be blurry and out of focus.

Bevy now ships with this effect, implemented as a post-processing shader.
There are two options available: a fast Gaussian blur or a more physically accurate hexagonal bokeh technique.
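
As a rough sketch of how the effect might be enabled on a camera: the `DepthOfFieldSettings` / `DepthOfFieldMode` names and the field names below are assumptions about the 0.14 `dof` module, not something this commit confirms:

```rust
use bevy::core_pipeline::dof::{DepthOfFieldMode, DepthOfFieldSettings};
use bevy::prelude::*;

fn setup(mut commands: Commands) {
    commands.spawn((
        Camera3dBundle::default(),
        DepthOfFieldSettings {
            mode: DepthOfFieldMode::Bokeh,
            focal_distance: 6.0,   // world-space distance that stays in focus
            aperture_f_stops: 1.4, // lower f-number, shallower depth of field
            ..default()
        },
    ));
}
```
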
@@ -1,5 +1,5 @@
Not all fog is created equal.
-Bevy's existing implementation covers [distance fog](https://en.wikipedia.org/wiki/Distance_fog), which is fast, simple and not particularly realistic.
+Bevy's existing implementation covers [distance fog](https://en.wikipedia.org/wiki/Distance_fog), which is fast, simple, and not particularly realistic.

In Bevy 0.14, this is supplemented with volumetric fog, based on [volumetric lighting](https://en.wikipedia.org/wiki/Volumetric_lighting), which simulates fog using actual 3D space, rather than simply distance from the camera.
As you might expect, this is both prettier and more computationally expensive!
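
As a hedged sketch of the expected wiring, assuming the 0.14 API exposes a `VolumetricFogSettings` component for cameras and a `VolumetricLight` marker for lights that should scatter:

```rust
use bevy::pbr::{VolumetricFogSettings, VolumetricLight};
use bevy::prelude::*;

fn setup(mut commands: Commands) {
    // The fog itself is configured on the camera...
    commands.spawn((
        Camera3dBundle::default(),
        VolumetricFogSettings::default(),
    ));
    // ...and individual lights opt in to volumetric scattering with a marker.
    commands.spawn((
        DirectionalLightBundle::default(),
        VolumetricLight,
    ));
}
```
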
@@ -11,7 +11,7 @@ To support this, Bevy's [existing tonemapping tools](https://bevyengine.org/news

We've followed [Blender's](https://www.blender.org/) implementation as closely as possible to ensure that what you see in your modelling software matches what you see in the game.

-![A very orange image of a test scene, with controls for exposure, temperature, tint and hue. Saturation, contrast, gamme, gain and lift can all be configured for the highlights, midtones and shadows separately.](filmic_color_grading.png)
+![A very orange image of a test scene, with controls for exposure, temperature, tint and hue. Saturation, contrast, gamma, gain, and lift can all be configured for the highlights, midtones, and shadows separately.](filmic_color_grading.png)

We've provided a new, [`color_grading`](https://github.com/bevyengine/bevy/blob/main/examples/3d/color_grading.rs) example, with a shiny GUI to change all the color grading settings.
Perfect for copy-pasting into your own game's dev tools and playing with the settings!
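
For a taste of what driving these values from code might look like, here is a sketch assuming the 0.14 split of `ColorGrading` into a `global` block plus per-range `ColorGradingSection` values; the linked example is the authoritative reference:

```rust
use bevy::prelude::*;
use bevy::render::view::{ColorGrading, ColorGradingGlobal, ColorGradingSection};

// Tweak the grading on every camera, e.g. from a dev tool like the example's GUI.
fn warm_up_the_scene(mut cameras: Query<&mut ColorGrading, With<Camera>>) {
    for mut grading in &mut cameras {
        // Scene-wide controls: exposure, temperature, tint and hue.
        grading.global = ColorGradingGlobal { exposure: 0.3, temperature: 0.2, ..default() };
        // Saturation, contrast, gamma, gain and lift are set per tonal range.
        grading.shadows = ColorGradingSection { gamma: 1.2, ..default() };
        grading.highlights = ColorGradingSection { saturation: 0.8, ..default() };
    }
}
```
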
@@ -2,7 +2,7 @@
<!-- https://github.com/bevyengine/bevy/pull/13423 -->

Jagged edges are the bane of game developers' existence: a wide variety of anti-aliasing techniques have been invented and are still in use to fix them without degrading image quality.
-In addition to [MSAA](https://en.wikipedia.org/wiki/Multisample_anti-aliasing), [FXAA](https://en.wikipedia.org/wiki/Fast_approximate_anti-aliasing) and [TAA](https://en.wikipedia.org/wiki/Temporal_anti-aliasing), Bevy now implements [SMAA](https://en.wikipedia.org/wiki/Morphological_antialiasing): subpixel morphological antialiasing.
+In addition to [MSAA](https://en.wikipedia.org/wiki/Multisample_anti-aliasing), [FXAA](https://en.wikipedia.org/wiki/Fast_approximate_anti-aliasing), and [TAA](https://en.wikipedia.org/wiki/Temporal_anti-aliasing), Bevy now implements [SMAA](https://en.wikipedia.org/wiki/Morphological_antialiasing): subpixel morphological antialiasing.

SMAA is a 2011 antialiasing technique that detects borders in the image, then averages nearby border pixels, eliminating the dreaded jaggies.
Despite its age, it's been a continual staple of games for over a decade. Four quality presets are available: low, medium, high, and ultra. Due to advancements in consumer hardware, Bevy's default is high.
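
Here is a minimal sketch of opting a camera in, assuming the `SmaaSettings` component and `SmaaPreset` enum live in the 0.14 `core_pipeline::smaa` module:

```rust
use bevy::core_pipeline::smaa::{SmaaPreset, SmaaSettings};
use bevy::prelude::*;

fn setup(mut commands: Commands) {
    commands.spawn((
        Camera3dBundle::default(),
        // The default preset is the "high" tier; pick another one explicitly if you like.
        SmaaSettings { preset: SmaaPreset::Ultra, ..default() },
    ));
}
```
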
@@ -18,7 +18,7 @@ Anisotropy strength, which ranges from 0 to 1, represents how much the roughness
In effect, it controls how stretched the specular highlight is. Anisotropy rotation allows the roughness direction to differ from the tangent of the model.

In addition to these two fixed parameters, an anisotropy texture can be supplied.
-Such a texture should be a 3-channel RGB texture, where the red and green values specify a direction vector using the same conventions as a normal map ([0, 1] color values map to [-1, 1] vector values), and the the blue value represents the strength.
+Such a texture should be a 3-channel RGB texture, where the red and green values specify a direction vector using the same conventions as a normal map ([0, 1] color values map to [-1, 1] vector values), and the blue value represents the strength.
This matches the format that the `KHR_materials_anisotropy` specification requires.
Such textures should be loaded as linear and not sRGB.
Note that this texture does consume one additional texture binding in the standard material shader.
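
To make the new fields concrete, here is a sketch of a brushed-metal style material; the `anisotropy_strength` / `anisotropy_rotation` field names follow the description above but are otherwise assumptions:

```rust
use bevy::prelude::*;

// `mesh` is an assumed, already-loaded handle; only the material setup matters here.
fn spawn_brushed_metal(
    commands: &mut Commands,
    materials: &mut Assets<StandardMaterial>,
    mesh: Handle<Mesh>,
) {
    let material = materials.add(StandardMaterial {
        base_color: Color::srgb(0.8, 0.8, 0.9),
        metallic: 1.0,
        perceptual_roughness: 0.4,
        // How stretched the specular highlight is (0.0 disables the effect).
        anisotropy_strength: 0.8,
        // Rotation of the anisotropy direction away from the mesh tangent, in radians.
        anisotropy_rotation: 0.0,
        ..default()
    });
    commands.spawn(PbrBundle { mesh, material, ..default() });
}
```
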
@@ -9,6 +9,6 @@ Conversely, if the camera is pointing at a stationary object, and a fast moving
The implementation is configured with [camera shutter angle](https://en.wikipedia.org/wiki/Rotary_disc_shutter), which corresponds to how long the virtual shutter is open during a frame.
In practice, this means the effect scales with framerate, so users running at high refresh rates aren't subjected to over-blurring.

-![A series of cartoony cars whiz past low polygon trees. You can see the trees and the cars blurring as the camera moves, with faster objects (relative to the field of vision) blurring more.](motion_blur_cars.mp4)
+![A series of cartoony cars whiz past low polygon trees. The trees and the cars blur as the camera moves, with faster objects (relative to the field of vision) blurring more.](motion_blur_cars.mp4)

You can enable motion blur by adding [`MotionBlurBundle`](https://dev-docs.bevyengine.org/bevy/core_pipeline/motion_blur/struct.MotionBlurBundle.html) to your camera entity, as shown in our [`motion blur` example](https://github.com/bevyengine/bevy/blob/main/examples/3d/motion_blur.rs).

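As a rough sketch, assuming the bundle wraps a `MotionBlur` component whose `shutter_angle` and `samples` fields match the description above:

```rust
use bevy::core_pipeline::motion_blur::{MotionBlur, MotionBlurBundle};
use bevy::prelude::*;

fn setup(mut commands: Commands) {
    commands.spawn((
        Camera3dBundle::default(),
        MotionBlurBundle {
            motion_blur: MotionBlur {
                shutter_angle: 0.5, // fraction of a frame the virtual shutter stays open
                samples: 4,
                ..default()
            },
            ..default()
        },
    ));
}
```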