The debug display shows memory usage with MB (megabyte = 10^6 bytes) as the unit.
Expected behaviour:
The memory usage is displayed in Megabytes.
Actual behaviour:
In reality it displays MiB (mebibyte = 2^20 bytes). I've verified this in the code, which divides the incoming number of bytes by 1024 twice.
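To illustrate the discrepancy, here is a minimal sketch. The method names `toMiB`/`toMB` and the class are illustrative, not the actual game code; only the divide-by-1024-twice behaviour is taken from the report:

```java
public class MemoryUnits {
    // What the code actually computes: binary mebibytes (2^20 bytes)
    static long toMiB(long bytes) {
        return bytes / 1024 / 1024;
    }

    // What the "MB" label implies: SI megabytes (10^6 bytes)
    static long toMB(long bytes) {
        return bytes / 1_000_000;
    }

    public static void main(String[] args) {
        long bytes = 2_147_483_648L; // 2 GiB, e.g. from Runtime.getRuntime().maxMemory()
        System.out.println(toMiB(bytes) + " MiB"); // 2048 MiB
        System.out.println(toMB(bytes) + " MB");   // 2147 MB
    }
}
```

For a 2 GiB heap the two conventions already differ by about 5%, so the label is not just a cosmetic nitpick.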
And before you ask: yes, I know "MB" is somewhat commonly used that way. But I consider it an error, and it contradicts the SI. It would be fixable by simply adding an "i" in the middle, which could also help reduce misuse in other places.
Can confirm in 20w51a.