diff --git a/release-content/0.14/release-notes/12910_Implement_percentagecloser_filtering_PCF_for_point_lights.md b/release-content/0.14/release-notes/12910_Implement_percentagecloser_filtering_PCF_for_point_lights.md
index b000e845f9..53747ef65a 100644
--- a/release-content/0.14/release-notes/12910_Implement_percentagecloser_filtering_PCF_for_point_lights.md
+++ b/release-content/0.14/release-notes/12910_Implement_percentagecloser_filtering_PCF_for_point_lights.md
@@ -1,7 +1,7 @@
 Percentage-closer filtering is a standard anti-aliasing technique used to get softer, less jagged shadows.
 To do so, we sample from the shadow map near the pixel of interest using a Gaussian kernel, averaging the results to reduce sudden transitions as we move in / out of the shadow.
 
-As a result, Bevy's point lights now look softer and more natural, without any changes to end user code. As before, you can configure the exact strategy used to alias your shadows by setting the [`ShadowFilteringMethod`](https://dev-docs.bevyengine.org/bevy/pbr/enum.ShadowFilteringMethod.html) component on your 3D cameras.
+As a result, Bevy's point lights now look softer and more natural, without any changes to end user code. As before, you can configure the exact strategy used to anti-alias your shadows by setting the [`ShadowFilteringMethod`](https://dev-docs.bevyengine.org/bevy/pbr/enum.ShadowFilteringMethod.html) component on your 3D cameras.
 
 {{ compare_slider(
     left_title="Without PCF filtering",
diff --git a/release-content/0.14/release-notes/12916_Implement_visibility_ranges_also_known_as_hierarchical_lev.md b/release-content/0.14/release-notes/12916_Implement_visibility_ranges_also_known_as_hierarchical_lev.md
index de5ce5317b..c98ee2ad04 100644
--- a/release-content/0.14/release-notes/12916_Implement_visibility_ranges_also_known_as_hierarchical_lev.md
+++ b/release-content/0.14/release-notes/12916_Implement_visibility_ranges_also_known_as_hierarchical_lev.md
@@ -3,12 +3,12 @@
 When looking at objects far away, it's hard to make out the details!
 This obvious fact is just as true in rendering as it is in real life.
 
-As a result, using complex, high-fidelity models for distant objects is a waste: we can replace them with simplified equivalents (whose lower resolution textures are called mipmaps).
+As a result, using complex, high-fidelity models for distant objects is a waste: we can replace their meshes with simplified equivalents (whose lower resolution textures are called mipmaps).
 
-By automatically varying the **level-of-detail** (LOD) of our models in this way, we can render much larger scenes (or the same open world with a higher draw distance), swapping out models on the fly based on their proximity to the player.
+By automatically varying the **level-of-detail** (LOD) of our models in this way, we can render much larger scenes (or the same open world with a higher draw distance), swapping out meshes on the fly based on their proximity to the player.
 
 Bevy now supports one of the most foundational tools for this: **visibility ranges** (sometimes called hierarchical levels of detail, as it allows users to replace multiple meshes with a single object).
-By setting the `VisibilityRange` component on your model entities, developers can automatically control the range from the camera at which their models will appear and disappear, automatically fading between the two options using dithering. 
+By setting the `VisibilityRange` component on your mesh entities, developers can control the range from the camera at which their meshes will appear and disappear, automatically fading between the two options using dithering.
 Hiding meshes happens early in the rendering pipeline, so this feature can be efficiently used for level of detail optimization.
 As a bonus, this feature is properly evaluated per-view, so different views can show different levels of detail.
 
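To make the crossfade concrete, here is a rough sketch of a two-level LOD setup. The helper name, handles, and distances are illustrative placeholders, and the `start_margin`/`end_margin` fields are taken from the 0.14 `VisibilityRange` API as we understand it, so double-check them against the docs.

```rust
use bevy::prelude::*;
use bevy::render::view::VisibilityRange;

// Hypothetical helper: the mesh/material handles and the distances are made-up values.
fn spawn_tree_lods(
    commands: &mut Commands,
    high_detail: Handle<Mesh>,
    low_detail: Handle<Mesh>,
    material: Handle<StandardMaterial>,
) {
    // High-detail mesh: shown up close, dithering out between 40 and 50 units from the camera.
    commands.spawn((
        PbrBundle {
            mesh: high_detail,
            material: material.clone(),
            ..default()
        },
        VisibilityRange {
            start_margin: 0.0..0.0,
            end_margin: 40.0..50.0,
        },
    ));

    // Low-detail mesh: dithers in over the same 40..50 band, then disappears past ~200 units.
    commands.spawn((
        PbrBundle {
            mesh: low_detail,
            material,
            ..default()
        },
        VisibilityRange {
            start_margin: 40.0..50.0,
            end_margin: 180.0..200.0,
        },
    ));
}
```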
diff --git a/release-content/0.14/release-notes/13009_Implement_fast_depth_of_field_as_a_postprocessing_effect.md b/release-content/0.14/release-notes/13009_Implement_fast_depth_of_field_as_a_postprocessing_effect.md
index 301dbe0af5..3f1de14934 100644
--- a/release-content/0.14/release-notes/13009_Implement_fast_depth_of_field_as_a_postprocessing_effect.md
+++ b/release-content/0.14/release-notes/13009_Implement_fast_depth_of_field_as_a_postprocessing_effect.md
@@ -1,5 +1,5 @@
 In rendering, **depth of field** is an effect that mimics the [limitations of physical lenses]((https://en.wikipedia.org/wiki/Depth_of_field)).
-By virtue of the way light works, lens (like that of the human eye or a film camera) can only focus on objects that are within a specific range (depth) from them, causing all others to be blurry and out of focus.
+By virtue of the way light works, lenses (like those of the human eye or a film camera) can only focus on objects that are within a specific range (depth) from them, causing all others to be blurry and out of focus.
 
 Bevy now ships with this effect, implemented as a post-processing shader.
 There are two options available: a fast Gaussian blur or a more physically accurate hexagonal bokeh technique.
diff --git a/release-content/0.14/release-notes/13057_Implement_volumetric_fog_and_volumetric_lighting_also_know.md b/release-content/0.14/release-notes/13057_Implement_volumetric_fog_and_volumetric_lighting_also_know.md
index 12a3eb797a..0dc0bca3a2 100644
--- a/release-content/0.14/release-notes/13057_Implement_volumetric_fog_and_volumetric_lighting_also_know.md
+++ b/release-content/0.14/release-notes/13057_Implement_volumetric_fog_and_volumetric_lighting_also_know.md
@@ -1,5 +1,5 @@
 Not all fog is created equal.
-Bevy's existing implementation covers [distance fog](https://en.wikipedia.org/wiki/Distance_fog), which is fast, simple and not particularly realistic.
+Bevy's existing implementation covers [distance fog](https://en.wikipedia.org/wiki/Distance_fog), which is fast, simple, and not particularly realistic.
 
 In Bevy 0.14, this is supplemented with volumetric fog, based on [volumetric lighting](https://en.wikipedia.org/wiki/Volumetric_lighting), which simulates fog using actual 3D space, rather than simply distance from the camera.
 As you might expect, this is both prettier and more computationally expensive!
diff --git a/release-content/0.14/release-notes/13121_Implement_filmic_color_grading.md b/release-content/0.14/release-notes/13121_Implement_filmic_color_grading.md
index 643ecb84b2..cad0c06647 100644
--- a/release-content/0.14/release-notes/13121_Implement_filmic_color_grading.md
+++ b/release-content/0.14/release-notes/13121_Implement_filmic_color_grading.md
@@ -11,7 +11,7 @@ To support this, Bevy's [existing tonemapping tools](https://bevyengine.org/news
 
 We've followed [Blender's](https://www.blender.org/) implementation as closely as possible to ensure that what you see in your modelling software matches what you see in the game.
 
-![A very orange image of a test scene, with controls for exposure, temperature, tint and hue. Saturation, contrast, gamme, gain and lift can all be configured for the highlights, midtones and shadows separately.](filmic_color_grading.png)
+![A very orange image of a test scene, with controls for exposure, temperature, tint, and hue. Saturation, contrast, gamma, gain, and lift can all be configured for the highlights, midtones, and shadows separately.](filmic_color_grading.png)
 
 We've provided a new, [`color_grading`](https://github.com/bevyengine/bevy/blob/main/examples/3d/color_grading.rs) example, with a shiny GUI to change all the color grading settings.
 Perfect for copy-pasting into your own game's dev tools and playing with the settings!
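For readers who want to drive these controls from code rather than a GUI, a minimal sketch might look like the following. The `ColorGradingGlobal`/`ColorGradingSection` paths, field names, and all values here are assumptions that mirror the controls listed above; check them against the 0.14 docs and the `color_grading` example before relying on them.

```rust
use bevy::prelude::*;
use bevy::render::view::{ColorGrading, ColorGradingGlobal, ColorGradingSection};

// Illustrative values only: warm the image slightly and tweak shadows/highlights.
fn spawn_graded_camera(mut commands: Commands) {
    commands.spawn(Camera3dBundle::default()).insert(ColorGrading {
        global: ColorGradingGlobal {
            exposure: 0.4,
            temperature: 0.3, // push the white balance slightly warmer
            tint: 0.0,
            hue: 0.0,
            ..default()
        },
        shadows: ColorGradingSection {
            lift: 0.05,
            gamma: 1.1,
            ..default()
        },
        midtones: ColorGradingSection::default(),
        highlights: ColorGradingSection {
            saturation: 0.9,
            gain: 1.2,
            ..default()
        },
    });
}
```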
diff --git a/release-content/0.14/release-notes/13423_Implement_subpixel_morphological_antialiasing_or_SMAA.md b/release-content/0.14/release-notes/13423_Implement_subpixel_morphological_antialiasing_or_SMAA.md
index e7e56f089a..8e73959f70 100644
--- a/release-content/0.14/release-notes/13423_Implement_subpixel_morphological_antialiasing_or_SMAA.md
+++ b/release-content/0.14/release-notes/13423_Implement_subpixel_morphological_antialiasing_or_SMAA.md
@@ -2,7 +2,7 @@
 
 Jagged edges are the bane of game developers' existence: a wide variety of anti-aliasing techniques have been invented and are still in use to fix them without degrading image quality.
 
-In addition to [MSAA](https://en.wikipedia.org/wiki/Multisample_anti-aliasing), [FXAA](https://en.wikipedia.org/wiki/Fast_approximate_anti-aliasing) and [TAA](https://en.wikipedia.org/wiki/Temporal_anti-aliasing), Bevy now implements [SMAA](https://en.wikipedia.org/wiki/Morphological_antialiasing): subpixel morphological antialiasing.
+In addition to [MSAA](https://en.wikipedia.org/wiki/Multisample_anti-aliasing), [FXAA](https://en.wikipedia.org/wiki/Fast_approximate_anti-aliasing), and [TAA](https://en.wikipedia.org/wiki/Temporal_anti-aliasing), Bevy now implements [SMAA](https://en.wikipedia.org/wiki/Morphological_antialiasing): subpixel morphological antialiasing.
 SMAA is a 2011 antialiasing technique that detects borders in the image, then averages nearby border pixels, eliminating the dreaded jaggies.
 Despite its age, it's been a continual staple of games for over a decade.
 Four quality presets are available: low, medium, high, and ultra. Due to advancements in consumer hardware, Bevy's default is high.
diff --git a/release-content/0.14/release-notes/13450_Implement_PBR_anisotropy_per_KHR_materials_anisotropy.md b/release-content/0.14/release-notes/13450_Implement_PBR_anisotropy_per_KHR_materials_anisotropy.md
index e422c95523..419cf873a6 100644
--- a/release-content/0.14/release-notes/13450_Implement_PBR_anisotropy_per_KHR_materials_anisotropy.md
+++ b/release-content/0.14/release-notes/13450_Implement_PBR_anisotropy_per_KHR_materials_anisotropy.md
@@ -18,7 +18,7 @@ Anisotropy strength, which ranges from 0 to 1, represents how much the roughness
 In effect, it controls how stretched the specular highlight is.
 Anisotropy rotation allows the roughness direction to differ from the tangent of the model.
 In addition to these two fixed parameters, an anisotropy texture can be supplied.
-Such a texture should be a 3-channel RGB texture, where the red and green values specify a direction vector using the same conventions as a normal map ([0, 1] color values map to [-1, 1] vector values), and the the blue value represents the strength. 
+Such a texture should be a 3-channel RGB texture, where the red and green values specify a direction vector using the same conventions as a normal map ([0, 1] color values map to [-1, 1] vector values), and the blue value represents the strength.
 This matches the format that the `KHR_materials_anisotropy` specification requires.
 Such textures should be loaded as linear and not sRGB.
 Note that this texture does consume one additional texture binding in the standard material shader.
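As a sketch of how these parameters come together on a material, the snippet below sets strength, rotation, and an optional anisotropy texture. The `anisotropy_*` field names follow the parameters described above, but the values, helper name, and assumption that the rotation is given in radians are ours, not the PR's.

```rust
use bevy::prelude::*;
use std::f32::consts::FRAC_PI_4;

// Hypothetical helper: the texture handle is assumed to be loaded elsewhere as a
// linear (non-sRGB) image, per the note above.
fn brushed_metal(anisotropy_texture: Handle<Image>) -> StandardMaterial {
    StandardMaterial {
        base_color: Color::srgb(0.8, 0.8, 0.85),
        metallic: 1.0,
        perceptual_roughness: 0.4,
        // 0.0 = isotropic highlight, 1.0 = fully stretched along the anisotropy direction.
        anisotropy_strength: 0.8,
        // Rotate the roughness direction away from the mesh tangent (assumed radians).
        anisotropy_rotation: FRAC_PI_4,
        // Per-pixel direction (red/green) and strength (blue), as described above.
        anisotropy_texture: Some(anisotropy_texture),
        ..default()
    }
}
```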
diff --git a/release-content/0.14/release-notes/9924_PerObject_Motion_Blur.md b/release-content/0.14/release-notes/9924_PerObject_Motion_Blur.md
index 7da204b8d0..2616ecf3e2 100644
--- a/release-content/0.14/release-notes/9924_PerObject_Motion_Blur.md
+++ b/release-content/0.14/release-notes/9924_PerObject_Motion_Blur.md
@@ -9,6 +9,6 @@ Conversely, if the camera is pointing at a stationary object, and a fast moving
 The implementation is configured with [camera shutter angle](https://en.wikipedia.org/wiki/Rotary_disc_shutter), which corresponds to how long the virtual shutter is open during a frame.
 In practice, this means the effect scales with framerate, so users running at high refresh rates aren't subjected to over-blurring.
 
-![A series of cartoony cars whiz past low polygon trees. You can see the trees and the cars blurring as the camera moves, with faster objects (relative to the field of vision) blurring more.](motion_blur_cars.mp4)
+![A series of cartoony cars whiz past low polygon trees. The trees and the cars blur as the camera moves, with faster objects (relative to the field of vision) blurring more.](motion_blur_cars.mp4)
 
 You can enable motion blur by adding [`MotionBlurBundle`](https://dev-docs.bevyengine.org/bevy/core_pipeline/motion_blur/struct.MotionBlurBundle.html) to your camera entity, as shown in our [`motion blur` example](https://github.com/bevyengine/bevy/blob/main/examples/3d/motion_blur.rs).
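In practice, wiring this up might look roughly like the sketch below. The shutter angle and sample count are illustrative values rather than recommendations; see the linked `motion_blur` example for the full setup.

```rust
use bevy::core_pipeline::motion_blur::{MotionBlur, MotionBlurBundle};
use bevy::prelude::*;

fn spawn_camera(mut commands: Commands) {
    commands.spawn((
        Camera3dBundle::default(),
        MotionBlurBundle {
            motion_blur: MotionBlur {
                // Fraction of the frame the virtual shutter stays open (0.5 ~= a 180 degree shutter).
                shutter_angle: 0.5,
                // More samples give smoother blur at a higher per-pixel cost.
                samples: 4,
                ..default()
            },
            ..default()
        },
    ));
}
```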