The world's most practical PC.
#1 PC Processing Bottleneck: One Queue Depth 4K Random Read
This is where the PC is busy trying to read random file locations quickly and can't keep up. Despite major technological improvements in various speed attributes, this bottleneck continues to plague today's PCs in real-world performance tests. Even the world's fastest consumer NVMe drive only improves these test results by about 50%.
There are three primary PC bottlenecks:
Virtual paging overflow used to be the most common bottleneck, because RAM used to be expensive, resulting in most PCs not having enough to adequately handle software and paging requirements. However, due to today's extremely low RAM pricing, this bottleneck is far less likely to surface, which is why we put it in 3rd place.
Here's the twist. These bottlenecks combine with caching deficiencies to produce the mechanisms behind planned obsolescence. No one can prove that it's intentional, because the design deficiencies from different technologies combine to cause the problems. And it's generally easy enough to fix if you are aware of it. The trick is that the more powerful marketers know not to touch this. It's making money the way it is. People believe their PC is going out of date, and that leads to another sale. As a result, all spotlights beam in other directions.
That's why these marketers push market segmentation as hard as they can.
Today's biggest unspoken dilemma in PC marketing is that people commonly think their CPU is the bottleneck when, more often than not, it is an inadequacy in their File System Cache configuration. And as a result, they go out and buy a new PC with a faster CPU. At first it's much faster. But over time the new PC slows back down to the terrible performance level of their old PC. To the uninformed it then seems a good time to go buy a new one, again.
Was the PC out of date?
A very common example of this is where a low-end Seagate hard drive is put into a lower-priced PC with a CPU that has an impressive specification, making it seem like a good value. Even though the CPU may be fast, the consumer's use of that PC will at some point expand beyond its useful cache size. This causes that low-end hard drive to be accessed more and more often to read random locations, which is the slowest thing it can do. That slows down everything the PC is doing, and causes the drive to burn out early.
A new PC will be fast only at first, because it takes time for this bottleneck to kick in. But just to be clear, what most old and new software requires, and what most PCs are terrible at, is well-performing "1 queue depth 4K random read". New technologies come out each year making the other speed attributes faster, but this particularly important speed attribute only progresses in tiny increments every few years, and therefore remains a serious performance bottleneck, even in brand-new PCs.
If your software is too slow, and you wish to speed it up with a faster PC, then this issue should be very important to you.
Even the fastest Samsung NVMe drive only speeds up 1 queue depth 4K random read by about 50%. All the others? Hardly at all.
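For the curious, this speed attribute is easy to measure in principle. Below is a minimal Python sketch (our illustration, not a rigorous benchmark) that times queue-depth-1 4 KiB random reads against a file. Note that a real measurement should use a dedicated tool such as fio with the OS cache bypassed, since a warm File System Cache will serve most of these reads from RAM:

```python
import os
import random
import time

def qd1_4k_random_read_iops(path, reads=200):
    """Issue one 4 KiB read at a time (queue depth 1) at random offsets.

    Toy sketch only: reads may be served from the OS page cache,
    so this measures the drive itself only on a cold cache.
    """
    blocks = os.path.getsize(path) // 4096
    fd = os.open(path, os.O_RDONLY)
    try:
        start = time.perf_counter()
        for _ in range(reads):
            # os.pread is available on Unix-like systems
            os.pread(fd, 4096, random.randrange(blocks) * 4096)
        elapsed = time.perf_counter() - start
    finally:
        os.close(fd)
    return reads / elapsed  # reads per second
```

On Windows, `os.pread` is unavailable; substitute `os.lseek` followed by `os.read`.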
There are three lasting solutions:
Intel's Optane accelerator has to be large enough to sufficiently cache the storage in use, and it has to be configured right. It works on some newer chipsets and CPUs, but not all of them.
Rewriting all your software on a threaded platform comes with heavy development and licensing fees. Most new software is developed by individuals and small businesses that like to stay away from these fees. That's why most software, both new and old, still suffers from the single queue 4K random read bottleneck.
The very best solution is a properly designed disk-cache, which is ordinarily the operating system's responsibility.
It potentially accelerates single queue 4K random read by 60x, but it requires more RAM to make this work. RAM is now very cheap, making this the very best solution. However, there's the matter of memory persistence, which can be fixed with the right NVMe. Not just any NVMe drive will work for this purpose. Further technical details narrow the field to just two specific NVMe products that we know of. And there's the fact that Microsoft defaults their File System Cache to a dynamic allocation, rather than fixing its size at an optimal level. This decision was clearly left over from the expensive-RAM era, to better handle bottleneck #3 (see above). In a RAM-rich PC it's better to lock the File System Cache to just the right fixed size, to prevent an almost inevitable tug-of-war that would otherwise bring one of the bottlenecks back despite having plenty of RAM.
If you don't lock the File System Cache size, then some software will likely hog its memory to the point of bringing back bottleneck #1. In these cases you'd rather get a memory error from the software, so you know to replace it with something more memory-efficient, or to add more RAM. But Microsoft has not addressed this specific point (likely for reasons explained above). Instead, leaving the allocation logic the way it is results in one bottleneck or the other losing the tug-of-war. And you get the idea that your PC is going out of date.
If you don't know what to do, the safest bet is to set both the minimum and maximum cache size to about 1/2 of your total RAM, and leave it there. There are all sorts of possible issues, and this setting is the most likely to resolve them.
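Windows exposes this setting through the `SetSystemFileCacheSize` function in kernel32. Below is a minimal Python sketch of the half-of-RAM rule; the 16 GB figure is a hypothetical example, and the actual call requires administrator rights (SeIncreaseQuotaPrivilege) and only runs on Windows:

```python
import ctypes
import platform

def fixed_cache_bytes(total_ram_gb):
    """The safe-bet rule from the text: half of total RAM, in bytes."""
    return (total_ram_gb * 1024**3) // 2

# Hypothetical example: a PC with 16 GB of RAM.
size = fixed_cache_bytes(16)

if platform.system() == "Windows":
    # Flags from winbase.h: enforce both limits as hard limits.
    FILE_CACHE_MAX_HARD_ENABLE = 0x1
    FILE_CACHE_MIN_HARD_ENABLE = 0x4
    ctypes.windll.kernel32.SetSystemFileCacheSize(
        ctypes.c_size_t(size),  # minimum cache size
        ctypes.c_size_t(size),  # maximum cache size
        FILE_CACHE_MIN_HARD_ENABLE | FILE_CACHE_MAX_HARD_ENABLE)
```

Third-party utilities that "lock" the cache size generally wrap this same API.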
You can calculate the right amount of RAM to buy as follows... First add together the RAM requirements for each software package that you intend to use simultaneously. Then add together the sizes of their active resource files. Then if you are a light-weight user add another 2GB, for medium-weight add 5GB, and for heavy-weight add another 10GB. Then round up the total to the nearest power of 2 (4GB, 8GB, 16GB, or 32GB). This is a very good guess as to the amount of RAM that you need.
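The steps above can be sketched as a small calculation (the package and resource sizes below are made-up examples):

```python
def ram_to_buy_gb(package_ram_gb, resource_file_gb, usage="medium"):
    """Apply the sizing steps from the text and round up to a power of 2."""
    headroom = {"light": 2, "medium": 5, "heavy": 10}[usage]
    total = sum(package_ram_gb) + sum(resource_file_gb) + headroom
    size = 4
    while size < total:  # round up: 4, 8, 16, or 32 GB
        size *= 2
    return size

# Hypothetical example: two packages needing 2 GB and 1.5 GB of RAM,
# with 0.5 GB of active resource files, for a medium-weight user.
ram_to_buy_gb([2, 1.5], [0.5])  # 2 + 1.5 + 0.5 + 5 = 9 -> 16 GB
```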
Between choosing the right amount of RAM, configuring the File System Cache correctly, and adding the right NVMe product, you can eliminate the single queue depth 4K random read bottleneck. And that in turn prevents your PC from getting slower prematurely.
Your PC will no longer creep past a nebulous limitation, unwittingly turning on a bottleneck that had no reason to be there in the first place.
A PC Server?
Coffee Lake Refresh is a terrific platform for servers. It so easily breaks all three PC bottlenecks. A correctly equipped Coffee Lake Refresh server will cruise past all the rest at the same price level. Plus it is generally the more reliable choice.
But what about that indecision as to whether to beef up one server, or to split its load between two or more servers? Our product, The Mini, is the sweet spot in value: ultra-high reliability, ultra-low price, ultra-fast, and so on. Given all it has to offer, it can make sense to divide your demand across multiple identical (and therefore interchangeable) PCs.
Just imagine how much simpler and easier it is to add and subtract the exact same optimally designed PC than to fight with upgrades and parts that are always different.