Description update by [~IonicEcko]
I'm going to use this ticket as the master for the performance degradation experienced in BDS on Linux installs post-1.14. I've done my best to summarise the info provided in this ticket and others, but please feel free to comment below if you have further info.
What is happening?
It appears that post-1.14, CPU usage on Linux is far higher than in the 1.13 releases. Testing on identical kit, Windows yields far better performance than Linux.
How to reproduce
Build a Linux box on Ubuntu 18.04 LTS (per the last published system requirements).
Create a new server using the default properties file.
Load into the world and note significantly higher CPU utilization than a Windows equivalent, even with only one active player (see the sketch below for one way to quantify the comparison).
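To put rough numbers on that comparison, here is a minimal sketch (Linux-only; the install path, sample interval, and output format are illustrative assumptions, not official guidance) that launches BDS the documented way (LD_LIBRARY_PATH=. ./bedrock_server) and logs the process's CPU usage while a player loads in. On the Windows box, Task Manager or any equivalent counter can supply the other half of the comparison.

```python
#!/usr/bin/env python3
"""Launch bedrock_server and print its CPU usage (as a percentage of one core,
like top's default view) every few seconds.  A rough sketch for comparing a
Linux run against a Windows run of the same world; the path and interval below
are assumptions.  Server console output will interleave with the samples."""
import os
import subprocess
import time

SERVER_DIR = "/opt/bedrock-server"   # wherever the BDS zip was unpacked (assumption)
INTERVAL = 5                         # seconds between samples

def total_jiffies():
    # First line of /proc/stat: aggregate CPU time of the whole machine.
    with open("/proc/stat") as f:
        return sum(int(x) for x in f.readline().split()[1:])

def proc_jiffies(pid):
    # utime + stime of the process, read from /proc/<pid>/stat after the comm field.
    with open(f"/proc/{pid}/stat") as f:
        fields = f.read().rsplit(")", 1)[1].split()
    return int(fields[11]) + int(fields[12])

def main():
    env = dict(os.environ, LD_LIBRARY_PATH=".")   # the documented way to start BDS on Linux
    server = subprocess.Popen(["./bedrock_server"], cwd=SERVER_DIR, env=env)
    ncpu = os.cpu_count() or 1
    prev_p, prev_t = proc_jiffies(server.pid), total_jiffies()
    try:
        while server.poll() is None:
            time.sleep(INTERVAL)
            cur_p, cur_t = proc_jiffies(server.pid), total_jiffies()
            pct = (cur_p - prev_p) / max(cur_t - prev_t, 1) * 100 * ncpu
            print(f"[monitor] bedrock_server CPU: {pct:.1f}% of one core")
            prev_p, prev_t = cur_p, cur_t
    finally:
        server.terminate()

if __name__ == "__main__":
    main()
```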
Notes from community members
World (chunk) generation appears to contribute to the slowness. When logging into a new box, the CPU pinning seems to occur while chunks are generating and falls off over time.
Chunk generation appears slow to the player (the view distance expands slowly).
While under load, server time is obviously affected, causing numerous issues for players.
Was not present in the 1.13 releases.
Does not appear to have improved with the subsequent 1.14 releases.
CPU maxes out at 100% usage with a single client connected to the dedicated Bedrock server on a Linux/Windows server. Testing environments include Ubuntu 18.04 LTS on one physical box (Celeron 1007U @ 1.50 GHz, 2 cores) and an Azure Standard D2s v3 hosted VM.
When the server is idle with no connections, CPU usage is perfectly fine.
The Java server has no adverse effect on connections and performance is much better. Is this typical?
Comments


I'm having the same issue - one user brings the CPU up to 60%, two make it run at 100%, on a single core.
I'm running an Intel NUC 815, which has an i5-8259U.
Testing on my much older desktop with an i5-3470, one user barely brings it up to 40%, and that's with running it on the same machine as my client - I'll test with other clients shortly.

Actually, it's even lower than that - 40% is only when loading and creating new chunks with one player. If it's loading existing chunks, it really doesn't get above 25%, and doing nothing but standing around it's at like 3%.
That's on my desktop, yet on the much more powerful server, just standing around runs at 60%.
The desktop with the 3470 is running Windows; the server is running Ubuntu Server 18.04.3.
I think there may be an issue with the Linux version of the server.

Have you tried 1.14.20.1 or 1.14.21.0?
I only ask because the version you listed in the title is 2 versions old now. I haven't noticed the same on mine, although I have noticed some pretty hefty load on player login.

I've done some more testing on this, and it seems that at least on my hardware, the Linux version is just not very good.
With the NUC 815, on Ubuntu Server 18.04, just standing around doing nothing has one core running at 50-95%, depending on the area. Our world does have a lot of entities, so I tested in a new empty world, and in that one it's still a 40% single-core load doing nothing.
Switched to Windows 10 to test, and on that, the CPU load on a single core is at <20% when doing nothing, and only goes to like 40% when I go near one of the areas with lots of entities.
On top of that, the loading that happens when a player logs in or is flying to a new area is much quicker, and only uses a few cores at 70-90% instead of all 8 at 100%.

I have the same problem. When the server runs with no users, CPU usage is about 4-5%. Just one user brings the usage up to 98%.
Environment: 1 core, 1 GB RAM
Ubuntu 18.04.1
Minecraft version: bedrock-server-1.14.21.0

Hey Zhu,
Do you have the ability to assign another core to the VM? The original documentation that came out with BDS indicated 2 cores as the minimum. There's no longer anything official published (the page was lost, by the looks of it, when they moved to help.minecraft.net), so I've asked if they're able to republish the minimum specs.
Ionic

Haha, I can't find any page about the minimum requirements on the web. OK, I'll try - thanks for helping.

Yeah, for me it's also always at 100%, and we only have 2 players online.
My server:
CPU: 2-core Epyc 7351P, 2.9 GHz
RAM: 2 GB DDR4 2600 MHz
SSD: 35 GB
With the Java server I never had this problem, and we are only mining.

Running BDS on a new SSD, a fresh 19.10 Ubuntu install, AMD A8-3500M quad core, 8 GB RAM + 8 GB swap. It's a laptop, yes, but it runs the Windows 10 Bedrock version fine (the desktop version, not the server, so it can host and do graphics at the same time). It is connected to the router via a gigabit Ethernet cable, with two players on Wi-Fi in the same room playing. Chunk generation is very slow, but the most obvious issue is that the mobs are "jerky": they move for 1 second, then pause for 1 second, then this repeats. This is a new world (only a few hours played), so there's no redstone or farms whatsoever. Here's a video of some zombie villagers showing the effect, with both players standing still watching them: https://photos.app.goo.gl/nW7pypyRq5A8TxDQ7 - this happens to all mobs. (Also, my non-zombie villagers have mysteriously vanished, but that's another issue.) Here's another video of the server load about 20 seconds later: https://photos.app.goo.gl/QggjXnp6ph6d3GL67

I can confirm similar issues on my server: once 4 to 5 people are connected, extreme lag begins to show up. If 7+ people are online then it's almost unplayable, with people barely even able to eat food. I am attaching a screenshot of server usage during a time with 5 people connected and another screenshot when 7 were connected. In the first screenshot no redstone devices were being used; in the second screenshot, with 7 people on, there was a redstone mob farm being used. However, this seems like a lot of CPU usage either way.
There was also a time when 2 redstone devices were being used with 9 players online and the server was running at 100% on all 12 cores; the lag was so bad people had to disconnect to even turn off the devices. Without devices running, the server was still at 100% on 3 cores with extreme lag - to the point where you could open a door and wait almost 2 seconds before the door actually opened.
[media][media]

Can reproduce; we run Bedrock in Docker and it fully uses the 4 vCores at 100% with 7 players online, without any farms. I'm running a 1650 v3.

Also having this issue. Unfortunately I'm only now seeing this thread. We (our church) wanted to build a Minecraft server for our kids to have an "Easter egg hunt". We have a dual-socket Dell R720 with plenty of resources available right now, and I thought it would be plenty powerful enough. Everything was running fine with the 3-person event team building the world, but yesterday, once kids started connecting, it completely fell over. htop showed a single vCore just getting hammered and the other 29 cores sitting idle. Super frustrating, since generally from a performance perspective, and definitely from a licensing perspective, Linux is a better server than Windows.

I've seen a report on the Minecraft Discord that performance in last night's release has improved. Can anyone confirm?
Ionic

I would say a tentative "yes" that performance has improved, but it's still not on par with Windows. I'm going to do a bit of proper testing to confirm. Chunk loading seems a bit better though.
Edit: After some tests, the Windows version still uses much, much less CPU.

I would say no; chunk loading is marginally faster when the CPU isn't maxed out yet. As soon as you have 3 people on and the CPU hits 100%, chunk loading is delayed. When 7-9 are on there's so much lag, even with chunk loading, that while flying with elytra you actually stop mid-flight for the chunk to load every time you hit a new chunk. The server is still hanging around 80% CPU utilization with 1 person on, and when it reaches 5+ people there's so much lag it affects the ability to even eat properly.

I've tested and found that when using my old worlds created in former versions, performance is still poor. However, when I use the latest version, 1.14.60.5, to create a new world and play, it seems to be better - but only a little.

Thanks Zijian.
I asked about this internally and there were no specific changes in 1.14.60 regarding Linux performance, so this is still considered active, as we've also heard from many that 1.14.60 is actually worse.

I have experienced this issue as well, enough that I have temporarily moved my worlds to BDS on Windows. I am using Intel Xeon E5-2637v2 processors in Dell servers. I was running Ubuntu in virtual environments at first, then moved to a dedicated physical server in my lab with the same results when performance dropped with the 1.14 update.
Using BDS on Windows 10 on the same hardware results in normal, non-degraded performance.

Sadly this is still happening on 1.16 and seems to be worse. A world with around 4-10 players at the same time causes CPU usage to climb to 100-200%. This needs to be fixed ASAP since players are lagging really badly.
Perhaps decreasing the view distance from 32 to 10 may help in some way?
[media]
BDS 1.16 for Linux performance issues
OS Version: Debian 10
CPU: Intel Xeon D-1521
RAM: 4 GB
Everything seems to be fine when BDS is idle.
When 1 player joins the server, CPU usage increases to 75-100%, and if 2-5 players join, CPU usage increases to 100-150%. When any server chunks get loaded by the client, the client suffers FPS drops along with short freezes (1-2 seconds).

I have been running BDS on Ubuntu 18.04 Server since BDS 1.12.1, up to and including 1.14.60, on view distance 32 and tick distance 10 without any issues. However, the new 1.16.0.2 update is extremely laggy (the CPU maxes out) when loading new chunks - it makes flying with elytra impossible. I have lowered the view and tick distance, which has improved it, but it still almost makes the game unplayable.
When nobody is connected, the CPU usage for bedrock_server is only 1.25%; as soon as I log in it spikes to 100%, then settles to between 15% for low tick/view distance and 40% for high tick/view distance, and loading new chunks is super laggy.
I will change to Windows (maybe that's the point, as Minecraft is owned by Microsoft? Haha) as per Cricket the Wise's comment above; however, this is definitely an upvote for this issue.

We have a dedicated server running multiple Bedrock servers. Prior to 1.16, a single server would average 50% or less with occasional spikes up to 150%.
Every server that installs 1.16 runs normally until the first person logs in, then it spikes to over 1000%. If the player stays still it will settle around 400%, but as soon as they start moving it spikes again, reaching up to 1400%.
(Our panel represents 100% as one thread.)
Intel I9-9900K - 64GB RAM
[media]
Minecraft Bedrock v1.16 on Ubuntu Server 18.04 LTS VM on Hyper-V Server. Dual Xeon E5-2667v2 (4GHz max frequency), Intel S2600CO4, 256GB registered ECC RAM, dual Intel 750 NVMe SSDs.
Performance hasn't been great on this world for a while, particularly around the home base that has a few dozen villagers, extensive mines, lots of torches, etc. However, after the Nether Update (v1.16) dropped this week (the server auto-updates and runs a backup every morning at 4am), this world/server is basically unplayable in the Overworld (if you can make it to the Nether, that is much more playable). Clearly a world that's been played in for a year or so is going to have quite a lot of changes, but the result is that it takes about twenty minutes to load into the world (everything's transparent and you can't move the character until then), and then everything is very laggy and stuttery. Shooting a bow causes the arrow to drop out of the sky about halfway to the target, and mobs move like a one-frame-per-three-seconds slideshow. Giving the server a whole hour to load the world doesn't help - the immediate environment may render, but as soon as you start moving around you can quickly find areas with no textures loaded (and that's just running - flying is impossible as you hit invisible walls almost immediately).
To investigate this, I've tried numerous things. First I added extra CPU cores and memory (usually has 4 cores and 8GB RAM allocated), increasing the CPU and memory weight so the Hyper-V server prioritises this VM over everything else (8 CPU cores and 64GB of RAM allocated). Memory usage typically only hits 2 or 3GB, but I wanted to see what would happen if I gave it everything I could - it made no difference. Whatever the limitation is with a Minecraft server, it isn't performance of the underlying hardware.
I went into the world and used the /kill command to kill everything, then logged out and back in; it made no difference.
So I'm at a loss here - my son spent all week bouncing off the walls waiting to play the new Nether update, only to have his expectations dashed with a game that's utterly unplayable. He's had to satisfy himself with playing local worlds generated on the PC.
Why has the 1.16 update caused such catastrophic performance issues? He's been building in that world for ages, there's no way we're just going to abandon it and start again. But we're stuck between a rock and a hard place, because it is completely, totally unplayable in its current state.
As other people have noticed, with no one logged in the server's performance is fine - ~99% idle CPU. The moment a single player logs in, CPU is maxed out. Here's a snapshot of top:
[media]I tried throwing 16 CPU cores at it; it makes no difference.
[media](all other VMs have a relative weight of 100, so this minecraft server will be prioritised for all CPU loads)
[media](again, all other VMs have the default memory weight, so the minecraft server will take priority for memory space. Not that that's an issue - of the 256GB available on the host server, only around 64GB is currently allocated to running VMs).
This is before user login:
[media]This is after (with one active user not moving):
[media]Even though it's showing 31% idle, it is completely unplayable - the character doesn't even move, it is locked in place as the world attempts to load, and frequently crashes the bedrock_server process thread.
Hey Mojang, you have a MASSIVE problem with your Linux code - if you had this problem on Windows the Internet would light up with outrage. We've also noticed problems since 1.14 but were told on user forums we probably have too many mobs; despite the performance degradation in heavily populated areas (the main town with dozens of villagers), in most areas the game remained playable. It isn't playable any longer - it's completely broken.
While putting this together, I managed to see it hit 1600% CPU utilisation. Here's a snapshot I took just seconds before:
[media]It only showed as 1600% for a single refresh poll (about three seconds). At the next refresh, it dropped down to 100%, and 'id' (idle percentage) in the top bar showed 99.9%. Bizarrely it wasn't crashing - the player was standing next to a bed at night, and right-clicking on it allowed the player to sleep (albeit taking about five times longer than it should have). The player 'awoke' in the morning and stood next to the bed, yet the world was still an invisible wireframe.
[media]While writing this, the world loaded around the player. I took a look around using the Elytra, and within seconds had reached the outer limit of the draw distance, hitting an invisible wall. Not seeing anything noteworthy, I logged out and shut down the app, but then noticed the CPU on the server was still flat out:
[media]It took around five minutes before whatever it was about the bedrock_server thread that was maxing out the CPU finally wound down.
Sadly htop doesn't provide a great deal more in the way of detailed insight into the bedrock_server process:
[media]Although immediately after capturing that screenshot I observed that most threads seemed to crash, leaving only a single thread running at 100% trying to do the job of all the rest - for the player, the world was still stuck in the same partially loaded state:
[media]So it is interesting that something about the bedrock server is inefficiently drawing so much CPU power, but also interesting that at some point it becomes a single-threaded operation, which surely must exacerbate the issue?
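For what it's worth, the per-thread picture can also be captured as text rather than htop screenshots with something like the sketch below (assumptions: Linux procfs, a process literally named bedrock_server, and an arbitrary five-second sample interval).

```python
#!/usr/bin/env python3
"""Print the three busiest bedrock_server threads every five seconds, as a
percentage of one core.  A rough sketch built on /proc; nothing here is
specific to, or endorsed by, BDS itself."""
import os
import time

def find_pid(name="bedrock_server"):
    # Scan /proc for a process whose comm matches the server binary name.
    for entry in os.listdir("/proc"):
        if entry.isdigit():
            try:
                with open(f"/proc/{entry}/comm") as f:
                    if f.read().strip() == name:
                        return int(entry)
            except OSError:
                continue
    raise RuntimeError(f"{name} not found")

def thread_cpu(pid):
    # utime + stime (clock ticks) per thread, keyed by thread id.
    times = {}
    for tid in os.listdir(f"/proc/{pid}/task"):
        try:
            with open(f"/proc/{pid}/task/{tid}/stat") as f:
                fields = f.read().rsplit(")", 1)[1].split()
            times[tid] = int(fields[11]) + int(fields[12])
        except OSError:
            continue  # the thread exited between listdir() and open()
    return times

if __name__ == "__main__":
    hz = os.sysconf("SC_CLK_TCK")
    interval = 5
    pid = find_pid()
    prev = thread_cpu(pid)
    while True:
        time.sleep(interval)
        cur = thread_cpu(pid)
        busiest = sorted(((cur[t] - prev.get(t, 0)) / hz / interval * 100, t)
                         for t in cur)[-3:]
        print("busiest threads:",
              ", ".join(f"tid {t}: {pct:.0f}%" for pct, t in reversed(busiest)))
        prev = cur
```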
Question: Is it possible to export a world from Ubuntu Server and import it into Windows? I can spin up a Windows 10 VM in seconds; the reason I'm running Ubuntu Server is that it should, in theory, be vastly more efficient and reliable... Ha!

This graph is the CPU use from our server yesterday. The baseline you see is a combination of 16 Java & Bedrock servers all running 1.14. The spike you see at the end is from one Bedrock server running 1.16 and represents a 300% increase in CPU resources.
[media]
I can confirm the same. Major performance degradation after the 1.16 update on a Linux dedicated server: 100% load on one core very quickly. If I run the same world and do the same thing (running a gold farm) hosting the server on Windows (the other of the two available downloads), it runs with basically "zero" CPU load.
Load on Linux (Ubuntu 20.04), Xeon quad-core with HT, thread limit set to 8 in server.properties:
[media]
Load on Windows 10, on an older Core i7 laptop (same world, same server.properties, same actions done in the world, same number of clients):
[media][media]
Here we see the process spread out over multiple cores, and overall it uses much less CPU in total - doing exactly the same thing, on exactly the same server and client version. The only change is the server platform, Linux vs Windows.
Server version on both platforms: bedrock-server-1.16.1.02
When I log into the (Linux) server for the first time, it actually uses multiple cores for a very short while as it loads up, but then it goes back to overloading one core and starts lagging:
[media]
FYI: both the Linux and Windows instances are running natively - no VM and no container/Docker.

Thank you all for the additional information provided in some of the comments.
Please remember though that the comment section of the bug tracker isn't a place for discussion; comments can and will be hidden to keep the focus on the bug and supporting information. To discuss ongoing bugs please head over to our Mojira Discord server, which has a dedicated channel for BDS (albeit not often frequented - I'm there though!).
Just briefly, on the conjecture from some earlier commenters: there has been no indication of any drop in support for BDS. In fact, new features are being added on a regular basis. This comes largely from the fact that BDS is reportedly very close to the service that runs Realms. Realms is provided as the simple, family-friendly solution, and BDS in no way competes with it for the majority of the player base.
I'd also ask that those affected limit new comments to information not mentioned previously. Comments stating that you too have been affected only serve to obscure some of the more important information. A vote on the report is the best way to show you are affected.

After reading through all of the comments on this report I have a hypothesis that I hope will be of some help. I am not a programmer and know nothing about running a server, so this is just based on my general knowledge of Bedrock.
It seems like the Linux server is actually running the client thread for each client, and possibly running each client's rendering processes through its own CPU instead of the client's GPU. (I don't know if it would be running client threads authoritatively or as a sort of ineffective shadow that just eats up processing power.)
What leads me to this hypothesis is the fact that in regular Bedrock, to my knowledge, rendering of new chunks outside of simulation distance is largely a client-side process based on world seed. What I know for a fact is that most rendered chunks are not saved until they are simulated, unless they contain a tile entity (see comments on MCPE-84772 for details).
In sum, the fact that the Linux BDS uses almost no processing power when no one is logged in but spikes as soon as someone joins, and gets worse with more players or with any movement even in brand new worlds, suggests that the server is being bogged down by client-side processes, and even rendering, that should be handled by the clients' GPUs.

Hi. Running BDS on Ubuntu 20.04, Intel 960 processor (quad core, 3.2 GHz), 12 GB RAM, Nvidia GeForce 690.
It had been running fine for over a month, then suddenly the other day it started lagging (half a second to a full second of lag). We are 2 people playing here via LAN (Xbox One X) and, as stated before, it was working perfectly. I've tried updating the PC, updating the router, etc. Same result. I even tried moving the BDS server over to my girlfriend's PC (Ubuntu 20.04, i7-2600K CPU, 16 GB RAM), with the same result. And by the looks of it, the CPU (and none of the individual cores) does not go over 30% usage. So I'm at a loss as to what could cause it, especially since we did not change anything - it started by itself.

So the latest update, 1.16.100, makes it so we can't join the server. We get a crash to desktop every time we try to join it.

The most obvious symptom I've seen of this is a weird juddering of mobs. All mobs will move for approx. 1 second, then pause for 1 second, then move for 1 second, pause, move, pause, move, etc. The sun will also move across the sky for a while, then occasionally jump back a few steps. This suggests to me some kind of time/synchronisation issue.

I just wanted to say this was an issue on my server as well. I run it with CentOS 8, 4 vCPUs and 16 GB of RAM on a Dell R320 rack server. I only have 2 other VMs running, with 2 vCPUs each, and they barely touch the resources. The host has 1 Intel Xeon E5-2470 v2, so 10 + 10 = 20 threads available, running VMware ESX 7. It also has 192 GB of RAM. I was thinking of giving the VM more CPUs, but 4 should be enough; there are usually only 2 to 4 of us playing. Once one joins, load goes to 1-2. Then the second joins and it's 2-3. Then with the 3rd/4th it rides at about 4-6 load. Once the second player joins, that's when the issues start. We get stuttering and odd jumping of mobs. I only recently started hosting a Bedrock server, so this has always been the load trend; I have no previous benchmark to go off of.

What is the current status on a fix for this? This bug has been open for over a year now and seriously hinders the BDS experience.
I'm running into the issue on a server that should be well overpowered for BDS: Ubuntu 20 on a 3.5 GHz Intel Xeon processor, 4 GB of memory, an SSD, and 10 Gbps of network bandwidth. No mods are installed. Chunks load in extremely slowly, riding minecarts is horribly laggy, and mobs in the Nether constantly freeze. If anybody goes exploring new areas it brings the server to its knees. I've attached a graph that shows the per-CPU usage with only 2 people on the server, building in already explored areas. Each CPU core runs at around 20-50% but then has random 100% spikes. As you can imagine, it gets much worse with more than 2 people.
[media]
@ Nathan Telles, IDK, they are asleep at the wheel. It seems I've been following this issue forever, and it existed long before that. I just gave up my BDS plans entirely for this reason, along with my faith in a properly performing Minecraft experience.
The following info probably won't help diagnose the problem at all, but at least it adds to the current bank of affected systems:
Same abysmal performance issues as others have pointed out, on a Synology NAS DS918+:
Intel Celeron J3455 quad-core 1.5 GHz, burst up to 2.3 GHz
4 GB DDR3L
2x 250 GB NVMe cache drives
3x 4 TB WD Red HDD (BTRFS)
Running the MarcTV Docker image.
I know it's not a powerhouse, but it's decent IMO. One should expect to be able to play with a couple of friends without completely lagging out or cooking the CPU...

TL;DR: I am happy that running Windows Server 100% resolved the performance issues. I am not happy about running Windows Server.
The beginning:
@Nathan Telles, @Antoine Talbot
I agree with both of you. The performance is very poor on decent hardware. For me:
Ryzen 3400G (quad core Zen+)
8GB DDR4 2666MHz
250GB NVMe (HP EX900)
Ubuntu Linux 20.04.x - 20.10.x
This server is 100% dedicated to only running Minecraft Bedrock Server.
I ran the Linux Bedrock server in a screen session (I know tmux is probably preferred). I started on a Realm when 1.16 came out and moved over after a month. Even in relatively light areas, just a single player would often load up the main game thread to ~25% or so. In heavier areas, ~60% or so. In some areas, 100%, and having many villagers in an area will cause the villagers to start stuttering in their movements, as do cows, etc. I'm the most frequent player, but there is one other somewhat frequent player and a couple of other occasional players. We are all friends, so this is just our own server.
[media]
Figure 1: One player standing in a relatively light area. One large building, but minimal redstone (one comparator, one repeater, one dispenser, one observer, and a few bits of redstone dust). Pic taken around 03JAN2021, on Ubuntu 20.10 Server running the Linux Bedrock Dedicated Server.
[media]
Figure 2: CPU utilization with just one player, near a village of 20 villagers and a few natural farms. Picture taken around 13JAN2021, on Ubuntu 20.10 Server running the Linux Bedrock Dedicated Server.
One thing that is subtle is that everything is a bit slow to react - doors, buttons, etc. The worst, however, was flying. Basically impossible. Not only was it hard to jump + rocket-launch with an elytra, just flying around would constantly result in me running into "empty" chunks that hadn't loaded in yet. Even on a well-travelled (flying) path, I would have to wait a while for the chunks to load in. It made travelling between two spots (approximately 1000 blocks apart) very tedious, and exploration was just unenjoyable.
When flying around, all 8 threads of the CPU would be pegged at 100% utilization.
One of the updates in the past month or so (so 1.16.5x - 1.16.2xx) improved performance a tiny bit when flying. With that, I ran into the first chunk walls at around 800 blocks, instead of ~200.
Hardware Upgrade & Disappointment:
I then had the opportunity to upgrade the server hardware.
Ryzen 3600 (hexa core Zen 2)
16GB DDR 3000MHz
1TB 970 Evo Plus
Ubuntu Linux Server 20.04.1 LTS
Performance was a tiny, tiny bit better. CPU loads that were ~90% were now ~70%. But the large village that was (seemingly) causing the 100% CPU load was still at 100%. Flying around still loaded up 8 threads to 100% (and a few other threads to some medium-low amount - I left the default 8-thread max in the server configuration file).
This was just unacceptable. Performance was still abysmal, relative to the large upgrade.
There is no way any major Minecraft Bedrock server would be acceptable if it were running like this. They would need massively expensive CPUs, just to lag out with minimal builds and just one or two players. Not to name names, but the likes of Hypixel and Lifeboat (both available in the default server list) would be absolutely impossible if this were the case. I have played on their servers, and their performance was far, far better than I was getting with my server. It does not seem possible, given the high single-thread performance of Zen 2 (sure, Intel is marginally faster in single-thread workloads, but not nearly enough to explain away the extremely poor performance of my server; furthermore, when running my server's world in a single-player Bedrock instance on my own Zen 2 computer, the performance was absolutely fine, without lag, stutters, etc.).
One more thing to try:
However, given Mojang is owned by Microsoft, I had to try one more thing: install the wretched abomination of GUI and poorly implemented ideas (Windows Server 2019 Essentials) onto my current server and run the Bedrock Dedicated Server (Windows) off of that, with the same world and the same settings.
To my absolute shock and horror, everything played wonderfully - it was like playing single-player Minecraft again! Villagers ran smoothly and quickly across the village, cows stopped stuttering about the field, and I could fly as fast and as far as I wanted without encountering invisible loading walls. Flying around would only load 1-2 threads on the server ("logical cores" in the Task Manager's utilization window). CPU utilization when in other areas of the world amounted to <20% (and probably single-digit percentage) utilization of 1-2 threads.
[media]Figure 3: Windows Server 2019 Essentials. Picture taken on 20JAN2021, running Windows Bedrock Dedicated Server
It wasn't even causing the CPU to really raise its clock speed - you can see it staying around base clocks. On Linux, single-thread loads were pegging the CPU at 4.2 GHz, and flying was so taxing the CPU would throttle down to ~3.9 GHz. I was able to expand the render distance to 80 with no noticeable downside whatsoever. Simulation distance is 12.
Outcome:
The performance difference was appalling and shocking.
Future:
The next steps are to test the older server with a virtualized Windows Server 2019 Essentials guest and see if the performance also improves on that hardware, then see if Wine 6.0 can run the server without Windows, so all of the management scripts I hacked together can be used again (based off a couple of other management scripts I found on GitHub - thanks TheRemote and the other, unknown source). I did find an open-source management tool for the Windows Bedrock Dedicated Server called Vellum that supports hot backups and is written in familiar C#, so I'm working on modifying that to support auto-updates for the Bedrock server software, and I'm willing to at least keep a Windows Server guest VM around.
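The hot-backup half of those scripts is simple enough to sketch. The outline below is not Vellum's actual code: the paths, the timeout, and the exact "ready to be copied" wording BDS prints are assumptions, and a production tool would also truncate the copied files to the lengths listed by "save query".

```python
#!/usr/bin/env python3
"""Rough outline of a BDS "hot backup": hold saving, wait until the server says
the files are safe to copy, copy the world, then resume.  Paths, the timeout
and the exact log wording are assumptions."""
import os
import queue
import shutil
import subprocess
import threading
import time
from datetime import datetime
from pathlib import Path

SERVER_DIR = Path("/opt/bedrock-server")   # assumption: where BDS is unpacked
BACKUP_DIR = Path("/opt/bedrock-backups")  # assumption

def _pump(stream, q):
    # Forward every server log line into a queue so the main thread can scan it.
    for line in stream:
        q.put(line)

def start_server():
    proc = subprocess.Popen(
        ["./bedrock_server"], cwd=SERVER_DIR, text=True,
        stdin=subprocess.PIPE, stdout=subprocess.PIPE,
        env={**os.environ, "LD_LIBRARY_PATH": "."},
    )
    lines = queue.Queue()
    threading.Thread(target=_pump, args=(proc.stdout, lines), daemon=True).start()
    return proc, lines

def send(proc, command):
    proc.stdin.write(command + "\n")
    proc.stdin.flush()

def hot_backup(proc, lines):
    send(proc, "save hold")
    try:
        for _ in range(60):                      # give up after roughly a minute
            send(proc, "save query")
            time.sleep(1)
            while not lines.empty():
                # Assumed wording of the "snapshot ready" console message.
                if "ready to be copied" in lines.get():
                    BACKUP_DIR.mkdir(exist_ok=True)
                    dest = BACKUP_DIR / datetime.now().strftime("world-%Y%m%d-%H%M%S")
                    shutil.copytree(SERVER_DIR / "worlds", dest)
                    print("backup written to", dest)
                    return
        raise TimeoutError("save query never reported the files as ready")
    finally:
        send(proc, "save resume")
```

Roughly: proc, lines = start_server(), then call hot_backup(proc, lines) on whatever schedule the surrounding management script uses.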
Conclusion:
As stated above, the performance difference was appalling and shocking. For now, I'd recommend running Windows as the server. I don't know whether running Windows in a VM would preserve the performance of the Windows Bedrock Dedicated Server, nor whether running Wine would preserve it while running on a Linux OS. For now, I am merely happy with the smooth gameplay.

Hi,
Could someone from the developers (or the BDS product owner) please comment on this issue? Why has it not even been assigned for more than a year? It has been open for a very long time now and its impact is huge. The workaround "use Windows or the Java edition" is simply not acceptable for many of your customers.
To add something to the stack: I've noted that when you start a new world, it can work quite well for some time until you run into some type of chunk that causes this havoc, and then it's useless for eternity. On one world I was able to make it to the End in survival, while sometimes a world is broken after 10 minutes of gameplay.

Workaround "use Windows or Java edition" is not somewhat acceptable for many of your customers.
Or at least make Windows bin work in a headless environment with Wine without much fuss. Currently only way to do this is to use some third-party hacked BDS full of bugs.

We have been using BDS since the first release and have had the view-distance set to 32 since the beginning. I have been fighting the CPU issue for the last few releases and it became unbearable in 1.16.200. I reduced the view-distance to 10 yesterday and we are running significantly better. It has even reduced the impact of logons. CPU utilization has dropped from 100% to an average of 60% with 14 players online last night. We no longer suffered from rubber-banding. Block lag was occasional, but nowhere near what we saw before the change. For us, this improved it.
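For anyone who wants to script the same change across several servers, a small sketch follows. It assumes the stock server.properties key names (view-distance, tick-distance) and an install path of /opt/bedrock-server; nothing else about it is official.

```python
#!/usr/bin/env python3
"""Rewrite selected keys in a BDS server.properties while leaving every other
line untouched.  A sketch only: the path is an assumption."""
from pathlib import Path

def set_properties(path: Path, **overrides: str) -> None:
    # server.properties keys use dashes; Python keyword arguments use underscores.
    wanted = {k.replace("_", "-"): v for k, v in overrides.items()}
    lines = []
    for line in path.read_text().splitlines():
        key = line.split("=", 1)[0].strip()
        if not line.startswith("#") and key in wanted:
            line = f"{key}={wanted.pop(key)}"
        lines.append(line)
    # Append any requested keys that were missing from the file entirely.
    lines += [f"{k}={v}" for k, v in wanted.items()]
    path.write_text("\n".join(lines) + "\n")

if __name__ == "__main__":
    # Mirror the change described above: drop view-distance from 32 to 10.
    set_properties(Path("/opt/bedrock-server/server.properties"),
                   view_distance="10")
```

The server still needs a restart afterwards for the change to take effect.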

Just learned all the basics of Ubuntu Server only to find out the MCBE server software is slow as crap on it... welp, back to Windows, since running it on Wine takes more time than it's worth.

10th-gen i7, 32 GB RAM. The Windows BDS world is fine and smooth; with the same world and the same computer, Linux performance is awful, with 100% CPU (12-core) usage on logins, and mobs stutter like crazy (the whole time, not just during spikes). Not playable on Linux; Windows is butter smooth.

Has anyone tried running the Windows version with Wine on Linux?

Answering my own question: running it from Wine is not straightforward. I found some tutorials on how to make it work but haven't tried them yet:
https://github.com/Element-0/ElementZero/wiki/linux-install-no-docker

I tried to make ElementZero work - not possible, it does not build for the latest BDS version. If you are OK running only 1.16.20.6, then it works.

Just as a data point: I am running on a Ryzen 5 3600 with 16 GB, and with two or three players on, the tick frequency at least halves.

@Foobar Vanilla BDS depends on chakra.dll, which is not something you can get working on Wine. There are patches, maintained by third parties, to make it work with ChakraCore, but you're at the whims of the maintainer. I use this one: https://github.com/bdsx/bdsx

I have the same issue, running on Debian 10, BDS 1.16.210.06: with 4 players, CPU usage is 200% (100% per core; 2 cores are allocated to the server, so the server uses the maximum). CPU: Intel Xeon E5-2680 v4.

For anyone else with this issue:
I ended up using the Java edition PaperMC server and then used GeyserMC to let Bedrock players join. Not only is this somehow MORE stable than the official native Bedrock server, it also means you can use server plugins (for example, our server uses things like Dynmap, and that's a game changer). I highly suggest this over any official or unofficial Bedrock server, as somehow it actually works better on a Java server with Geyser lmao.
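If it helps anyone wire that up, below is roughly the launcher I use, reduced to a sketch: the directory, jar names, and memory flags are assumptions (Geyser ships as a Paper/Spigot plugin, and Bedrock clients reach it over UDP 19132 by default, alongside Java's usual TCP 25565).

```python
#!/usr/bin/env python3
"""Tiny launcher for the Paper + Geyser workaround.  A sketch only: the paths,
jar names and memory flags are assumptions, not an official setup."""
import subprocess
import sys
from pathlib import Path

SERVER_DIR = Path("/opt/paper-server")                      # assumption
PAPER_JAR = SERVER_DIR / "paper.jar"                        # downloaded Paper build, renamed
GEYSER_JAR = SERVER_DIR / "plugins" / "Geyser-Spigot.jar"   # Geyser installed as a plugin

def main() -> int:
    for jar in (PAPER_JAR, GEYSER_JAR):
        if not jar.exists():
            print(f"missing {jar} - download Paper and the Geyser plugin first")
            return 1
    # Bedrock clients connect to Geyser on UDP 19132 by default, so that port
    # needs to be open in addition to Java's TCP 25565.
    cmd = ["java", "-Xms2G", "-Xmx2G", "-jar", str(PAPER_JAR), "nogui"]
    return subprocess.call(cmd, cwd=SERVER_DIR)

if __name__ == "__main__":
    sys.exit(main())
```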

I've updated to 1.16.220.02, and so far I've seen a very noticeable drop in system load, as well as a sizeable performance increase.

I can confirm as well that 1.16.220.02 improves performance on Linux.
I have no concrete metrics to show, but it feels almost as smooth as running the Windows binary through Wine. I think there is still a small performance disadvantage running Linux native versus Wine, but it's marginal.
The main thread spikes to 100% CPU usage for a few seconds when activating logic in redstone-heavy chunks, whereas this is not as prominent on the Windows version through Wine.
Hardware config for reference:
Ubuntu Server 18.04
i5-9400
16GB RAM
Bare-metal instance running off an NVMe SSD.

One more confirmation that things are improving. Whatever tended to destroy the CPU on fresh/young worlds on Ubuntu is now much more in line with Windows. I also don't have concrete metrics, but at the very least, comparing CPU usage of the VM process in each case, they are within the same 5-10% CPU margin. That includes OS overhead in each case though, so it's not a perfect comparison.
Our world isn't particularly heavy on redstone, so I can't speak to performance there.
Windows VM (Bare) vs Ubuntu 20.04 VM (w/Docker)
CPU usage is markedly better on hosts like CubedHost as well.

Resolving as Fixed in 1.16.220 based on comments above.

I know this is kinda old, but I'm still having this issue.
Approx 2 updates ago performance was perfect, then it degraded after an update.
Now, with 1 player the server reaches 100% CPU utilization and the performance drop is especially noticeable when loading chunks (extremely slow).

Space Katzzz: thank you for the update. Since the issue reported here was not a problem for several versions, we would like you to make a new report.

I too am still experiencing this issue. Nothing has been reported because nothing has changed. Please reopen this ticket, as all of the contained information and comments are relevant to the issue. We were also told not to post unless there was new information to provide, so now we are getting conflicting statements...

Before we reopen this, can you confirm whether you still have the issue after the fix to MCPE-143156 in 1.17.41 (released today)?

I'd argue that comparative performance metrics between Linux and Windows on the latest version (similar to what has already been attached to this ticket for previous versions) should be submitted before deciding to close this ticket. It should not be closed just because a few subjective comments were left saying that things seem to have improved.
After all, we wouldn't open a ticket if someone only claimed "performance sucks on Linux". The same should go for closing.

Ninja Dankinate: I understand your sentiment, but often a few subjective comments are all the information we have to go on. We rely on users to continue reporting affected versions so we know that an issue is still a problem. One reason for this is that many issues get fixed indirectly through other changes. If no one updates a report for 3-6 months, that tells Mojang it is no longer a concern for the community.
Trevor Kensiski: whether a problem affects the current version is the kind of information we do want in comments. Especially if you continue to experience an issue while other users are saying it is fixed for them.
No one disputed the comments saying that performance improved markedly in 1.16.220 until 6 months later. At this point it will be clearer to have a new report with details for the current version if you are experiencing performance problems unique to Linux BDS.

OK, so I finally had some free time to mess around with the Linux VM again and got it updated to 1.18.
I have noticed a few hiccups where the game kind of stutters for a second (game saving?).
Either way, these stutters, while a bit distracting, are not ruining the experience.
That being said, at the moment the performance is much better than it was back in 1.14-1.16.
I'll continue monitoring my VM and see how it continues to perform as we explore more of the world and start doing redstone etc., as well as dig in and see if anything stands out during the stuttering issues.
VM Config
8 vCPU
32 GB RAM
100 GB SSD
Minecraft Config:
Ticking distance of 12
Render distance of 32
2 players via Xbox split screen
Threads set to 0
Defaults for everything else
Let me know if any other settings are useful for debugging this.

There is definitely a stark performance difference between the Windows and Linux versions, though I am still trying to work out an apples-to-apples test that I actually have time to demonstrate.
Trevor Keniski: It is nice to see that someone else is working on this as well. What I have found is that the problem really shows itself if you turn up the tick speed and then add a second player to the game. The first player to join a game seems to get priority treatment.
On Windows, even with the tick speed cranked you will be hard pressed to notice any side effects.
On Linux, if you turn the tick speed up much, then every player after the first one is likely to have a terrible experience. One of the CPU cores will be pegged all the time, mobs will move in small increments, and everything will stutter in general. On my Linux server with a Ryzen 9 3950X, setting the tick speed to 50 will easily cap one CPU core and present issues for the second player.
I caught on to this when I moved a game that my wife was hosting on her Core i5-8400. She and my two kids were playing at a tick speed of 150 with no issues. After moving the game to the Linux server (the Ryzen 9 3950X), we had to turn the tick speed down to 10 to make it playable again. I suppose I should also note that all of our games have a 10-chunk simulation distance.
I am surprised that Mojang isn't all over this issue already as I strongly suspect it is hindering the performance and scalability of realms.
Moderator: If you think this belongs in a new ticket then I will be happy to create one. Thanks!

Yes, please create a new ticket. I will link it as related to this one.

Has this been fixed to the point where performance on Linux is similar to Windows on 1.18 and 1.19?

For me, the issue disappeared just over a month ago.
The OP mentioned chunk generation being a primary cause of the degradation in performance. When I had this issue, my server CPU would hit 100% with a single player flying around; now it's ~5% :3