
PC Tips

Is Intel's iGPU Enough?

We see third-party GPUs and Intel's integrated graphics (aka iGPU, aka IGP) advancing from year to year, but shouldn't there be a line we can draw to take some of those third-party GPUs out of consideration because they won't do any better than Intel's iGPU? Well, the answer is more of a twisted, sideways type of answer...

It depends on what you're doing with it.

Basically, Intel's iGPU is great at encoding 4K video (even better than most of those third-party GPUs), 4K still graphics look great, and slow-paced moving graphics work just fine. Intel's Coffee Lake (and later) iGPUs include a very nice 60Hz 4K output over DisplayPort. It's hard for anyone to complain about this level of graphics capability. However, serious gamers will not be satisfied with the graphics performance in some of their games: Adaptive Sync is only available in Intel's Alder Lake iGPUs (a feature considered important by some of the more serious gamers), and texture calculations can fall behind on very fast-moving graphics. These are things a third-party GPU can do better. One may also want more screens, and adding a third-party GPU is great for this; in fact, you can use the iGPU for two of the screens and a good third-party GPU for three or more additional screens. For these reasons we do not recommend buying our Unified-PC if you are a serious gamer with specific games that you must be able to play. For practically all other uses (including a majority of very good games), Intel's UHD 730-770 iGPUs (included in Alder Lake) work wonderfully.

Here's what you may not be expecting:

Historically, each new version of Intel's UHD graphics was only marginally better than the previous one. With Alder Lake, however, things change more aggressively. Lighting, reflections, and performance are all quite a bit better. And it doesn't stop there: games actually require a fair number of different chip-level capabilities, and Intel's Alder Lake integration of those capabilities is very good, so little improvements here and there all stack together, resulting in a gaming or engineering experience that's considerably improved. So if your previous engineering or gaming system using an older UHD version was barely tolerable, you may find that Alder Lake becomes a real solution. And last but not least, there's the raw core speed factor. Advanced games and engineering software are generally limited more by raw core speed than by graphics speed. All you have to do is dial the graphics resolution back one notch to radically improve graphics speed, and doing that in combination with Intel's superior raw core speed gives you a very affordable yet superior gaming or engineering system: one that is vastly simpler, easier to work with, more responsive, and lasts a long time.
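To put a rough number on that "one notch," here's a quick back-of-the-envelope pixel count (a simple sketch; the actual frame-rate gain depends on the game and on whether the iGPU, not the CPU, is the bottleneck):

```python
# Rough per-frame pixel workload at common gaming resolutions.
# Fewer pixels to shade generally means higher frame rates when
# the graphics hardware is the limiting factor.
resolutions = {
    "4K (2160p)":  (3840, 2160),
    "QHD (1440p)": (2560, 1440),
    "FHD (1080p)": (1920, 1080),
}

pixels_4k = 3840 * 2160
for name, (width, height) in resolutions.items():
    pixels = width * height
    print(f"{name}: {pixels:,} pixels ({pixels / pixels_4k:.0%} of the 4K workload)")
```

Dropping from 4K to 1440p cuts the per-frame pixel workload to well under half, and 1080p cuts it to about a quarter, which is why a single resolution notch can make an iGPU feel dramatically faster.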

Which "Sync" Do I Need?

Adaptive-Sync is a standard created by VESA to help eliminate image artifacts caused by signal timing issues between the GPU and the monitor, which is more of an issue with fast-paced graphics. A very high refresh rate compensates for most of the problem.
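To see why a high refresh rate helps even without Adaptive-Sync, compare refresh intervals; a mistimed frame can linger on screen for roughly one extra interval (a rough sketch, not a perceptual model):

```python
# Refresh interval at common monitor refresh rates. Without adaptive
# sync, a late frame waits for the next refresh, so a shorter interval
# means timing glitches stay on screen for less time.
for hz in (60, 120, 144, 240):
    interval_ms = 1000 / hz
    print(f"{hz:>3} Hz -> refresh interval {interval_ms:.1f} ms")
```

At 144Hz a late frame lingers for well under half as long as at 60Hz, which is why fast monitors hide most of these timing artifacts on their own.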

Intel includes Adaptive-Sync with their Alder Lake CPU line.

FreeSync is AMD's branded support for the Adaptive-Sync standard. In other words, if it says "FreeSync" then it is a product that supports the Adaptive-Sync standard.

G-Sync is a costly hardware-based solution invented by NVIDIA. Diehard gamers insist it is better and will shell out the big bucks.

Intel's Quick Sync Video has to do with something else altogether: video encoding and decoding. But because people look for some sort of "sync" in Intel's brochure, it can easily be mistaken for the same thing.
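As a concrete illustration of what Quick Sync actually is: it's the hardware encode/decode engine that tools such as ffmpeg can hand video work to. Below is a minimal sketch, assuming an ffmpeg build with Quick Sync (QSV) support; the input file, bitrate, and output name are placeholder values, not recommendations:

```python
# Minimal sketch: hand an H.264 encode to Intel Quick Sync via ffmpeg.
# Requires an ffmpeg build compiled with QSV support; file names and
# bitrate below are placeholders.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "input.mp4",     # placeholder source file
    "-c:v", "h264_qsv",    # Quick Sync H.264 encoder (runs on the iGPU)
    "-b:v", "8M",          # placeholder target bitrate
    "output_qsv.mp4",      # placeholder output file
], check=True)
```

If ffmpeg reports an unknown encoder, the build most likely lacks QSV support. Either way, none of this has anything to do with Adaptive-Sync, FreeSync, or G-Sync.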

Intel's older iGPUs do not support Adaptive-Sync; for years it was only rumored to be coming, and support finally arrived with Alder Lake, as noted above. More and more monitors and third-party GPUs are adding Adaptive-Sync, making it the right standard to plan on. If clean, fast-moving graphics are important to you, then be sure you buy a monitor with Adaptive-Sync (it may say "FreeSync"), and be sure it has a DisplayPort connection, because the HDMI implementation is often not solid even when the spec sheet says it's supported.

Intel or AMD?

In short: given all points of value to the consumer, Intel is king, and will likely stay that way.

It is easy to debate details endlessly, and right now the web is filled with these debates. However, what's important to take away from all this is that each time AMD comes up with something that benchmarks better than Intel on some point, Intel then takes action to improve things on their end. This is an effect of capitalism that accelerates the creation of more value for the consumer. It keeps Intel in check. If it weren't for AMD, Intel would leave pockets of untapped potential untended, and prices would be much higher. Just think of these debates as a reminder that this aspect of capitalism is working the way we want it to.

But how do we choose between Intel and AMD?

The right answer: "It doesn't matter."

The reason we say this is that, no matter what, AMD and Intel will both spend billions trying to beat the other in value. And this will go on forever, or until one of them is way better than the other and the marketplace tips its vote (its purchases) in that direction. Or at least that's true in theory.

Deceptive marketing also plays a role.

According to a July 2019 analysis by Digital Trends, Intel and AMD were at that moment neck-and-neck overall, with Intel ahead in single-core speed and AMD ahead in iGPU capability. This was a case where AMD had just released their latest chip and Intel was about to, so the comparison of course unfairly favored AMD. Now that the situation is reversed (Intel has since launched Ice Lake and Tiger Lake CPUs), the numbers unfairly favor Intel. But Intel does not leverage this with marketplace deceptions.

Looking back at last year, AMD added their previously missing integrated graphics and managed to leap-frog Intel's. There's a lot of this going on between them.

But what matters most? When looking at a cross-section of all the software packages people use today, it's core speed that matters in most cases. This has always been true in the past, and it will likely remain true for a number of years into the future. It's really quite amazing that Intel's aging Coffee Lake Refresh platform, a 14nm platform, still beats all of AMD's advances in miniaturization (all the way down to 7nm) at raw core speed, even after all these years of intense effort. This is a flag that something is up with their 7nm technology.

But what about AMD's 64-core Threadripper 3990X? This is yet another example of deceptive marketing. Intel also has higher-end CPUs that cost thousands of dollars, going into servers and other special-purpose machines. What AMD did was take one of their high-end CPUs, add a desktop chipset, and shove it into the consumer marketplace with a hefty price tag (about $4,000). And it worked! Extreme hobbyists are emptying their pockets for this high-end CPU. Intel could learn from this tactic. But the tactic wasn't about making more money for AMD; its sole purpose was to save face in light of the fact that their 7nm chip can't keep up with Intel's 14nm chip. AMD wants the market to believe that they, not Intel, have the world's fastest consumer-grade CPU. This is deception at its best. These really aren't consumer-grade CPUs in the first place, and Intel could easily pull the same trick with one of their high-end CPUs; all they have to do is wrap it up in the same manner and market it to top-tier consumers under a cool-sounding name. But Intel generally doesn't play this type of game.

Now, if AMD had a 64-core CPU at a "consumer-affordable" price, then that would be something to talk about. But they don't.

It's easy to point at AMD's progress in miniaturization, having already launched their 7nm chip while Intel is still dabbling in 10nm and 14nm details. But the real challenge comes afterwards: all the supporting technologies that have to sync up with it. Intel is advancing more slowly with miniaturization because they are pulling a far bigger effort along with them, and this results in more value.

Getting the whole picture together first (a full-featured 14nm Coffee Lake Refresh platform from Intel) is a lot more important than launching one part of the picture first (a working 7nm Ryzen platform with a couple of features that top Intel's 14nm platform).

An example of something either CPU manufacturer could easily have done at any time is to significantly boost the internal cache, which then significantly boosts the performance of just those programs that primarily process software code rather than data, such as a flight simulator. This capability is important here and there, but not very important overall. But AMD was first, and they are making use of this in their marketing materials as if it proves that their 7nm Ryzen 9 is better than Intel's 14nm i9-9900. This is just plain twisted nonsense. Don't fall for it. If they have to use this tactic to make their 7nm technology look better than Intel's 14nm technology, then you know that something is not at all optimal with their 7nm technology, and that this was likely a cover for a weakness in it. A good implementation of 7nm technology should be so much faster than everything else that there's no question at all about anything comparing to it, because the advantages of each level of miniaturization compound on one another. Since this is not happening, we know something is wrong with AMD's technology. At the very least it is not efficient.

These days, in just about any read-through of a technical forum involving PCs, you'll find at least one crafty plug for AMD. Each one is a masterfully written put-down of Intel that simply sounds right, yet isn't. Unfortunately, very few forums go to the trouble of straightening these out, and most of these plugs seem to be effective. That means more sales for the lesser technology, and that's the power of deceptive marketing. It works all too well.

All those added dollars resulting from AMD's relentless marketing deceptions fuel real R&D. This is making their technology better, and it may at some point actually surpass Intel for real. The thing is, this type of business is very unlikely to ever let go of deceptive practices. It's like a bear with a honey pot; it's not going to let go. So even if they produce technology of superior value, you should beware of other types of deceptive tactics that they may use. We warn about some of those here.

Putting AMD's endless deceptive marketing practices aside, we are convinced that, judging purely on the value of the technology, the sum of all the details affecting value continues to favor Intel. However, we will watch AMD's 7nm progress closely, and if they do get the whole picture together (most notably a superior raw core speed) then our opinion might change.

Meanwhile, Intel has launched its game-changer Alder Lake CPU platform, settling the score. This platform aggressively advances raw core speed (25-40% faster), introduces efficient cores (significantly more performance at lower power levels), and significantly improves iGPU graphics (to the point where there is very little market left wanting a third-party GPU).

Coffee Lake? Ice Lake? Comet Lake? Tiger Lake? Rocket Lake? Alder Lake?

Ok then, which CPU platform, or "Lake", is best? Intel pushed hard to move forward into 10nm with its new Ice Lake platform, and in 2019 finally launched a major invasion of Ice Lake CPUs into the mobile markets. Dell and other big names have been selling laptops with Ice Lake CPUs en masse. These Ice Lake CPUs include about double the graphics capability of Coffee Lake, and they perform about 30% better per equivalent GHz-rated core. However, Ice Lake CPUs are capped at just 4 cores and 8 threads. Their primary advantage is low power and low heat production, making them ideally suited to mobile. The desktop counterparts were cancelled as Intel turned its resources toward developing an improved 10nm platform called Tiger Lake.

To tide people over, Intel did a quick workover at 14nm to produce a line-up of CPUs that it calls "Comet Lake" and "10th generation" (the same generation number as the Ice Lake CPUs that went into mobile). So it would appear that Comet Lake is there to plug a hole in the market (people expecting new 10th gen desktop chips). These Comet Lake CPUs are about the same thing as Coffee Lake except that you get a couple more threads and a bit more RAM speed; Intel pushed the envelope a bit to make a slightly faster CPU without actually advancing the technology. Altogether you might see a 10% to 15% increase in performance over Coffee Lake depending on the scenario, but not with any of the existing motherboards, because Intel switched to an entirely new CPU socket. There is one compelling motivation remaining: Comet Lake allows twice as much RAM (up to 64GB) within just two slots, allowing the super popular Mini-ITX form factor to hold as much RAM as a big board. It really doesn't make much sense to go after these CPUs unless you need that additional RAM; they're otherwise about the same as a Coffee Lake CPU, which has had more time to prove itself reliable.

We think Comet Lake was created simply to buy time. Buy time for what? Tiger Lake. This is a redesign of Intel's 10nm process using improved core technology that they claim delivers twice the performance per core of AMD's Ryzen, plus a far better iGPU. This game-changer Tiger Lake iGPU is now a proven thing: a Tiger Lake system has demonstrated 89fps in the cutting-edge game Overwatch, while AMD's 4800U could only manage 46fps. This uber-advanced iGPU (capable of 10K graphics) wipes out roughly 95% of the motivation to even bother with third-party graphics boards, resulting in a potentially dramatic shift toward miniaturization in game machines. And with Tiger Lake, per-core performance is about 30% improved, which is about 100% faster than an AMD core. Real performance without real heat? Traditionally, a laptop is only fast at first and then slows down because of challenges with heat. Tiger Lake might change that.

But what about real Tiger Lake desktop chips? Four CPUs are listed on the Intel website, but they are not available, nor are any motherboards supporting the odd new socket (FCBGA1787). Either they are about to come out, or it's something we are supposed to fix our gazes on while something else is happening. And this is where Rocket Lake comes in. Rocket Lake is just like Comet Lake in that it fills the rest of the market demand for the current generation number (now gen 11) without actually delivering the new 10nm technology (Rocket Lake remains at 14nm).

So what you see actually hitting the desktop market are not the game-changer Tiger Lake CPUs, but rather the very power-hungry Rocket Lake CPUs, also marked "11th gen" (possibly to get the marketplace to overlook this detail). You might see this as deceptive, or as a slick way to be sure to make a deadline with a new technology. Keep in mind Rocket Lake does a far better job than Comet Lake of including newer secondary technologies (such as PCIe 4), and some of those are important, but wow does it generate heat; some say as much as a 300-watt heater. Rocket Lake seriously cranks up power and heat to produce more performance, which is a relatively easy thing to do, and it's primarily just a reshuffling of existing tech. One good note, however, is that Rocket Lake includes a port of the revolutionary Tiger Lake circuitry (from 10nm back to 14nm). This "quick fix" delivers some of the revolutionary new Tiger Lake iGPU features in desktop Rocket Lake CPUs, which is of value if you are a gamer.

But where are the real Tiger Lake desktop CPUs? They have been cancelled as well, with the new 12th gen Alder Lake taking over that dream. Alder Lake is Intel's first 10nm desktop CPU platform, and it started shipping in December 2021. That's about as sudden as a UFO landing outside right now and aliens getting out while we all watch. Ok, so they launch Tiger Lake but we can't find it, they launch Rocket Lake to tide us over, and then they launch Alder Lake. With a little thinking, our best theory is that Alder Lake's breakthrough hybridization of two types of core technologies explains the shuffle. This hybridization could ultimately give Intel the best of both worlds in the ongoing contest with AMD: both the fastest cores and the most cores. Thinking about that for a moment, it makes sense. Alder Lake is why Intel has been dragging its feet. It is the game-changer tech we have been waiting for.

With Alder Lake, Intel has now unquestionably taken over the "best of the best" CPU arena, for both gaming and business. It is fast, reliable, and cheap, placing it front and center in our PC philosophy.

ECC or NON-ECC?

Most experts consider this matter settled: if you want your computer to not produce memory errors, then you should go with memory designed with error-correcting code (ECC). However, there is a lot more to this than a black-and-white answer. ECC may actually make things worse if not handled properly, and there's a software workaround that trumps it altogether. If your use for the PC is critical in nature, then you may want to learn more here... The Truth About ECC
