The bug
When creating a custom dimension with the field simplex_surface_noise set to false in the noise settings, the game uses a Perlin noise sampler instead of a simplex noise sampler to calculate the surface noise. This Perlin noise sampler causes discontinuity issues, as shown in the attached screenshot. These issues become visible in biomes whose surface builders compare the noise against a threshold to choose the block, such as mountains, shattered savannas, and giant tree taigas. It also happens in the vanilla nether without any datapacks, since the nether sets simplex_surface_noise to false by default, but it is less visible there due to the nether's generation.
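For reproduction, the relevant fragment of a custom noise settings file might look like the sketch below. This is only the field in question, not a complete file; the exact placement inside the "noise" object follows my reading of the 1.16 custom worldgen format, and every other required field is omitted:

```json
{
  "noise": {
    "simplex_surface_noise": false
  }
}
```

A real datapack would need the rest of the noise settings (height, sampling, slides, default block and fluid, etc.) filled in; only simplex_surface_noise: false is needed to trigger the Perlin code path.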
Code analysis
In NoiseBasedChunkGenerator's buildSurfaceAndBedrock method, the following code is used to calculate the noise:
double noise = this.surfaceNoise.getSurfaceNoiseValue(x * 0.0625, z * 0.0625, 0.0625, x * 0.0625) * 15.0;
The 4th parameter is scaled along the x axis and passed to the surface noise. When simplex_surface_noise is set to true, PerlinSimplexNoise is the class used. Its implementation of getSurfaceNoiseValue ignores the 3rd and 4th parameters, so this issue isn't visible in that case. When it's false, PerlinNoise is the class used for the sampling, and it does use the 3rd and 4th values. Because the x value here is the chunk-local x value (scaled from 0 to 16), this causes discontinuities along chunk borders. Changing the 4th parameter from x * 0.0625 to just 0.0625, like the 3rd parameter, fixes this problem, with the results attached in the second screenshot.
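The border jump can be illustrated with a self-contained toy sampler. The sample function below is a hypothetical stand-in, not the real octave-summed PerlinNoise implementation; only the 0.0625 scaling and the local-x 4th argument are taken from the snippet above:

```java
// Toy demonstration (not Mojang code) of why a 4th parameter that follows the
// chunk-local x produces a jump at chunk borders, while a constant 4th
// parameter (the proposed fix) stays continuous.
public class SurfaceNoiseDiscontinuity {

    // Hypothetical stand-in for a sampler that, like PerlinNoise, actually
    // uses its 4th argument (PerlinSimplexNoise would ignore it).
    static double sample(double x, double z, double yScale, double yMax) {
        return Math.sin(x) * Math.cos(z) + 0.5 * yMax;
    }

    // Absolute difference between the two adjacent columns straddling the
    // border between chunk 0 (world x 15, local x 15) and chunk 1
    // (world x 16, local x 0), at z = 0.
    static double borderJump(boolean fixedFourthParam) {
        double z = 0.0;
        double a = sample(15 * 0.0625, z, 0.0625,
                fixedFourthParam ? 0.0625 : 15 * 0.0625); // local x = 15
        double b = sample(16 * 0.0625, z, 0.0625,
                fixedFourthParam ? 0.0625 : 0 * 0.0625);  // local x = 0
        return Math.abs(b - a);
    }

    public static void main(String[] args) {
        // The buggy variant jumps because the 4th argument drops from
        // 15 * 0.0625 to 0 between neighboring columns.
        System.out.println("buggy jump: " + borderJump(false));
        System.out.println("fixed jump: " + borderJump(true));
    }
}
```

With the constant 4th argument, the only change between neighboring columns is the smooth first argument, so the output varies continuously; with the local-x 4th argument, the input jumps by 15 * 0.0625 at every chunk border, which is exactly the seam visible in the screenshots.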
Comments
Could this also be the cause of MC-172393?
Can confirm. Very obvious. I added a data pack for easy reproduction.