I just checked the current version (1.11.2) and it's still unaddressed. You can easily see the effect with the stars at night: they should appear "doubled" on screen, about 2 inches apart, but they are in the same spot for the left and right eye. This simply doesn't work for the brain: if an object is at infinity, the eyes look at it in parallel, which gives the brain a sense of that object's distance. In Minecraft the eyes are telling the brain that the stars are right at the computer display and all other objects are closer, sticking out of the screen. In my opinion everything should be behind the display and the stars should be at infinity.
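To make the argument concrete, here is a minimal sketch of the standard screen-parallax relation for parallel stereo viewing. The eye separation (~6.3 cm) and the 60 cm viewing distance are my own assumed figures, not values from the game:

```python
def screen_parallax(eye_sep_cm: float, screen_dist_cm: float, object_dist_cm: float) -> float:
    """On-screen horizontal separation (cm) of an object's left/right images.

    p = e * (Z - D) / Z, where e is the eye separation, D the distance from
    the viewer to the screen, and Z the distance to the object. Objects at the
    screen plane (Z == D) have zero parallax; objects at infinity converge to
    p == e, a full eye-width apart.
    """
    return eye_sep_cm * (object_dist_cm - screen_dist_cm) / object_dist_cm

e = 6.3   # typical interpupillary distance in cm (assumption)
D = 60.0  # assumed viewing distance to the monitor in cm

print(screen_parallax(e, D, D))      # object at the screen plane -> 0.0
print(screen_parallax(e, D, 1e12))   # object "at infinity" -> ~6.3 cm (about 2.5 inches)
print(screen_parallax(e, D, 30.0))   # object in front of the screen -> negative ("pops out")
```

So stars at infinity should be drawn roughly one eye-width apart on screen; zero separation is exactly the "stars glued to the display" symptom described above.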
Unfortunately, rotating the left/right cameras won't help. The result may look similar, but the projection for the left and right eye would be wrong (because of the much too large field of view), the images won't match up in the brain, and the mismatch would be disruptive.
Please read the Wikipedia article I linked in my original post ( https://en.wikipedia.org/wiki/Anaglyph_3D#Depth_adjustment ). It gives a good explanation of what to do and why. Look at the two images of the Martian surface there: the top image is like Minecraft is now (the Martian landscape "pops out of the screen", which makes it very exhausting to look at), and the image below is how it should be (the landscape "is behind the computer screen", which is the natural way of looking at it). This effect is achieved just by shifting the left/right images so that the nearby rocks are aligned and the distant mountains have a large gap between the left and the right image.
I totally understand that rotating the cameras a bit would be much easier than shifting and cropping the images, but the latter is what's needed. If only the rotation solution can be implemented as a compromise, I suggest adding a user parameter for it too. (Actually, you would need to rotate the cameras towards each other.)
@Marcono1234: The "anaglyph" technique is only one way to separate the left and right images for the left and right eye. The two images are combined into one image (because your monitor can only output one image at a time) by mixing colors. If you are wearing red/green filtered glasses, your left eye will only see the "left camera" image and your right eye will only see the "right camera" image of the scene.
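As a rough sketch, the channel split described later in this thread (red channel from the left camera, green and blue from the right camera) can be expressed like this; the function name and demo frames are mine, not anything from Minecraft's code:

```python
import numpy as np

def combine_anaglyph(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Merge two H x W x 3 camera frames into one anaglyph frame."""
    out = np.empty_like(left)
    out[..., 0] = left[..., 0]    # red channel: left eye's view
    out[..., 1] = right[..., 1]   # green channel: right eye's view
    out[..., 2] = right[..., 2]   # blue channel: right eye's view
    return out

# Tiny demo: the left camera sees a bright frame, the right camera a dark one.
left = np.full((2, 2, 3), 200, dtype=np.uint8)
right = np.full((2, 2, 3), 50, dtype=np.uint8)
frame = combine_anaglyph(left, right)
print(frame[0, 0])   # -> [200  50  50]: red from the left eye, green/blue from the right
```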
However, the problem is that the left/right camera images are combined in the wrong way, so your physical eyes have to focus on the screen to see something far away (in the scene) and have to cross even further towards your nose to see something nearby. The right way would be: the physical eyes should converge on the screen when objects in the scene are nearby, and should be parallel when looking at objects far away (looking through the computer screen).
To solve this problem, you need to shift the left camera image to the left and the right camera image to the right before they get combined into the anaglyph image that is presented on the screen. The big challenge is how far to shift the images: this depends entirely on the screen dimensions of the user's hardware setup, so the shift factor should be adjustable as a user preference.
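A minimal sketch of that shift-and-crop step, under my own assumptions (numpy image arrays, a pixel-count shift parameter standing in for the proposed user preference):

```python
import numpy as np

def shift_and_crop(left: np.ndarray, right: np.ndarray, shift_px: int):
    """Shift the left frame left relative to the right frame, cropping to match.

    A feature that sat at the same pixel column in both frames (zero parallax,
    i.e. "at the screen") ends up shift_px apart afterwards, with the right-eye
    copy to the right of the left-eye copy (uncrossed parallax), which pushes
    the scene behind the screen plane.
    """
    if shift_px == 0:
        return left, right
    left_shifted = left[:, shift_px:]     # drop leftmost columns -> content moves left
    right_shifted = right[:, :-shift_px]  # drop rightmost columns to keep widths equal
    return left_shifted, right_shifted

left = np.arange(2 * 8 * 3, dtype=np.uint8).reshape(2, 8, 3)
right = left.copy()
l2, r2 = shift_and_crop(left, right, shift_px=2)
print(l2.shape, r2.shape)   # both (2, 6, 3): same size, ready to combine
```

The `shift_px` value is exactly the screen-dependent factor the comment argues should be user-adjustable.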
Does this make sense?
@Tim Weber: Can you provide screenshots of the other games you mentioned, so I can compare them to the Minecraft-rendered ones? It would be interesting to see what makes the difference (make sure they also have the high contrast of white areas against dark surroundings). TIA!
My guess is that the debug xyz crosshair is in some sense correct, because it is the point "nearest" to your eyes. However, the parallax value (horizontal shift) between the left and right eye images isn't correct. If it were, the xyz crosshair would still appear "as one" because it is "at the depth of the computer screen" (just like the status information and game stats display), but everything in the Minecraft world would be "behind" the screen, and currently it isn't.
So my suggestion is to close this bug as "resolved" and create another bug that describes the wrong parallax rendering of anaglyph mode in Minecraft ( https://bugs.mojang.com/browse/MC-86556 )
If you take a look at the red/green/blue channels of the attached screenshot (e.g. in Photoshop), you can see that the separation is done correctly by Minecraft: the red channel carries the left eye's image, the blue channel carries the right eye's image, and the green channel is filled with the right eye's image to support its brightness.

The problem with anaglyph glasses is that the red filter often tends to pass not only the red channel but also some of the green, so your left eye sees roughly 80% left-eye image and 20% right-eye image. The same is true for the blue filter. (Actually this helps produce more realistic colors in your brain, but it also produces the ghosting!)

Shifting the hue of the image doesn't help at all, because you are then moving color information from the blue channel to the red channel and vice versa, producing even more ghosting, since your glasses can only filter red and blue. You can't make an "orange/purple" anaglyph image on your computer screen: it only emits red/green/blue, and once the "orange" and "purple" information is mixed across the only three channels available, you can no longer separate it into two distinct images for your left eye and your right eye. What you need is a sharper "red" filter that doesn't let green or blue light through, so the left eye can't see those channels. Makes sense?
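The leaky-filter ghosting can be modeled in a few lines. The 80%/20% split below is the illustrative figure from the comment above, not a measured value, and the brightness numbers are made up for the demo:

```python
import numpy as np

left_view = np.array([200.0])   # brightness the left camera recorded (demo value)
right_view = np.array([50.0])   # brightness the right camera recorded (demo value)

# Anaglyph channels, per the separation described above:
# red = left eye's image, green/blue = right eye's image.
red, green = left_view, right_view

# Ideal red filter: the left eye sees only the red channel (pure left image).
ideal_left_eye = red
# Leaky red filter: 20% of the green channel bleeds through.
leaky_left_eye = 0.8 * red + 0.2 * green

print(ideal_left_eye[0])   # -> 200.0 (no ghosting)
print(leaky_left_eye[0])   # -> 170.0 (the right eye's image ghosts into the left eye)
```

Note that a sharper filter changes the 0.8/0.2 mix at the glasses, while a hue shift only shuffles which channel the leakage comes from; the mixing itself remains.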
I consider this bug as "resolved" because Minecraft does the color separation correctly.
Screenshot of Minecraft 1.8.8 on a Mac
@Djfe: Thanks for chiming in. You are right: it hurts to look at it. That's because everything sticks out of the screen rather than being behind it, as it should be. The blocks in the background of your screenshot should be at infinity, but currently they are in the display plane. To accomplish this, it's necessary to move the 3D cameras a bit so that the blue and red silhouettes of the ender chest become more aligned on screen and the blocks in the background move further apart.