Intel's new i9-9900K CPU brings eight cores and sixteen threads to their mainstream lineup, but how does it compare to the old king from the 8th gen, the i7-8700K? To find out, I've tested both CPUs in 10 different games at 1080p, 1440p and 4K resolutions, as well as a number of other applications, to help you decide which to pick and whether it's worth upgrading. Let's start with the differences in specs. The obvious difference is, of course, the core count, with the 8700K having 6 cores and 12 threads while the new 9900K features 8 cores and 16 threads. While the 9900K has a slightly lower base clock speed, it's got higher turbo boost speeds, and although both are based on Intel's 14nm Coffee Lake architecture, the 9th generation features soldered TIM, which should help cool those extra cores compared to the paste that was used in the 8th gen. To see how these differences practically affect games and applications, I've tested them both in the same system. I'm using the MSI Gaming Pro Carbon motherboard, which supports both 8th and 9th gen Intel CPUs, so there's no need to swap boards. I've got 16GB of T-Force Night Hawk CL16 memory running at DDR4-3200, which was kindly provided by Team Group; hopefully the extra RGB boosts our performance. For the graphics I'm using my EVGA FTW 1080, as it's the best graphics card I've got available, and I'm cooling the CPUs with my Fractal S36 360mm all-in-one liquid cooler with the same Noctua NT-H1 thermal paste, so the exact same PC was used for testing both the 8700K and 9900K CPUs, about as apples to apples as things come. With that in mind, let's check out the games first, followed by a few specific benchmarks afterwards.

I've tested all games at 1080p, 1440p and 4K resolutions using these Nvidia drivers, with the same Windows and game updates installed, and no overclocking has been done at this point. Let's start with Ashes of the Singularity, because what kind of CPU comparison video would be complete without it? At 1080p with max settings there was just a 3% improvement to average frame rates with the 9900K, although a larger 11% improvement to the 1% lows. Moving on to 1440p, there's now minimal difference between the highest setting levels as the graphics start taking more of the load, so the CPU difference that was clear earlier at high settings starts to blur. At 4K this extends further, with only low settings now showing a clear performance difference between our two CPUs. Fortnite was tested using the same replay, and at 1080p there's basically no difference at high or epic settings, presumably as we're GPU bound, but at medium or low there is quite a large improvement, although realistically we're around the 300 FPS mark anyway, so I don't think this matters. Stepping up to 1440p saw the medium results come in much closer together, just like the higher settings, while only minimum settings now see an obvious improvement. By 4K it basically doesn't matter which CPU you're using, as the graphics become the bottleneck, resulting in very similar performance at all setting levels. Shadow of the Tomb Raider was tested using the built-in benchmark, and at 1080p the results were very close together, with the same results recurring from both CPUs at some setting levels.

At 1440p the 9900K is anywhere from one to three frames per second ahead of the 8700K, or a 3% improvement at max settings, so nothing very interesting, and then at 4K the results are all within one frame of each other; again, no practical difference between the CPUs observed at this point. Assassin's Creed Odyssey was also tested with the built-in benchmark tool, and at 1080p the bulk of the difference is seen at the lowest setting levels, as these are typically less GPU bound: just a 1% boost to the averages with the 9900K at max settings, but this changes to a 10% improvement at low settings. At 1440p the results start squeezing closer together, with just a small 4% improvement at low settings now with the 9900K, otherwise no major differences worth noting. At 4K I noticed that with the 9900K, although it was just barely ahead in terms of average frame rates, the 1% lows were consistently down in this test. I'm honestly not sure what the deal was; there were no resource issues that I could detect, so I'm not sure if the game at this resolution just didn't like something about the 9900K. Far Cry 5 was again tested using the built-in benchmark, and at 1080p we're finally seeing some obvious, if small, gains from the 9900K, performing 5% ahead of the 8700K at ultra settings, and this percentage rises further as we drop down in settings. At 1440p the average frame rates are much closer together now, but there's more of a difference in the 1% lows, with the 9900K improving them by 5% at ultra settings. At 4K we're very GPU bound and there's no real practical difference observed at any setting level, although the 9900K results are just slightly ahead.

Overwatch was tested playing in the practice range, and keep in mind the game has a frame rate cap of 300 FPS. While at 1080p we're averaging this cap at low through to high settings, it's worth noting the 9900K is giving us better frame times, shown by the 1% lows. At 1440p there's basically no difference in terms of average frame rates regardless of setting level, though the 9900K does seem to slightly improve the 1% lows again. At 4K, while the averages are closer together, the 1% lows are again actually down on the 9900K, similar to what we saw in Assassin's Creed Odyssey. CS:GO was tested using the ulletical FPS benchmark, and at 1080p the 9900K was providing a 3% improvement to average frame rates at max settings, a 6% boost at medium and a 7% increase at minimum. Going up to 1440p, these differences changed to the 9900K now being ahead by 2% at max settings and 7% for the rest. At 4K, once again there's hardly any difference between all setting levels, with the largest, at minimum settings, now just being a 3% improvement with the 9900K. PUBG was tested using the same replay, and at 1080p there's just a small 3% improvement to average frame rates with the 9900K, and this rises to 5% at very low settings, with larger improvements to the 1% lows than the averages, granted at this resolution they're high enough at most settings anyway. At 1440p the averages creep closer together while the 1% lows remain further apart, but again these aren't major differences that would justify paying more for the 9900K so far.

At 4K it's the same story, just even closer in terms of averages, as we're dependent on the 1080 graphics. Watch Dogs 2 was tested as I've found it to be a game that uses quite a lot of CPU, and while at 1080p there's only a 1% improvement to the averages with the 9900K, this rises as we lower the settings, right up to a 19% boost at low settings. At 1440p it's the same story we've seen again and again, with results closer together as the CPU plays less of a critical role at higher resolutions, and just a 4% improvement with the 9900K at low settings now. Once again, at 4K we're basically dependent on the graphics, and the differences in CPU aren't accounting for much. Ghost Recon was tested with the built-in benchmark, and I only bothered testing 1080p, as even at this resolution the results are extremely close together regardless of CPU due to a graphical bottleneck. It may be more interesting with more powerful graphics like a 2080 Ti, but unfortunately the 1080 is the best I've got. As we've seen, as we step up in resolution and setting levels there's less of a gap between the two CPUs, as the 1080 graphics card becomes the bottleneck; this is why the games tested at 1080p show the biggest performance improvement, then less so at 1440p and even less at 4K.

If you're planning on playing games at higher resolutions, you'd generally be better off putting the money towards a better graphics card instead. As I suspected, there are no major differences in gaming between these two CPUs, at least from the range of games tested here. I think for most games the 6 cores on offer with the 8700K will be enough for a while yet; otherwise we're only seeing small boosts with the 9900K due to the clock speed differences, which, as we'll now see, aren't even that much once we overclock the 8700K. To be fair, we may have seen larger differences at 1440p and maybe even 4K if I had a better graphics card, but unfortunately I don't have 2080 Tis laying around, so I have to work with what I've got. Alright, so as for overclocking, I was able to get my 8700K to 5GHz on all six cores at 1.36 volts, while I could only get my 9900K to 4.9GHz on all eight cores at the same 1.36 volts. I was able to run the 9900K at 5GHz on all cores at around 1.4 volts, and while most of my tests passed, some failed, so I ended up stepping back down to 4.9GHz. In terms of how overclocking affected gaming, I've only retested one game, Ashes of the Singularity, at 1080p, 1440p and 4K with the high setting preset. There's a larger difference at the lower resolutions, as expected, due to being less GPU bound, but realistically not too much of a difference. It will of course vary between games, and overclocks will depend on the silicon lottery anyway and vary from chip to chip.

Now let's check out some CPU specific benchmarks; I've tested both CPUs at stock speed and with the overclocks previously mentioned. I'll also note that in these tests we'll see a bigger jump in performance between stock and overclocked with the 8700K, as we're effectively increasing its all-core speed by 700MHz, while the 9900K is only increasing by 200MHz. Starting out with Cinebench, we're seeing slightly better single core performance with the overclocked 8700K, as expected, owing to the higher 5GHz overclock that I was able to get, but when it comes to multi core, the extra 2 cores in the 9900K push it out in front. Between the two CPUs at stock, the 9900K is performing 49% better in the multi core test, or 31% better between the two with the overclocks applied. Adobe Premiere was tested using the newest CC 2019 version; I've just exported one of my laptop reviews at 1080p using the built-in high bitrate preset, and the 9900K completes the task over 20% faster than the stock 8700K, but this shrinks to around a 10% difference once overclocking the 8700K. Blender was tested using the BMW and Classroom benchmarks, and we're starting to see the same pattern emerge, where there's minimal difference between the 9900K at stock and overclocked, with a larger difference once overclocking the 8700K, though there's still a fair improvement with the 9900K's two extra cores.
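Just to be clear on how I'm working out these percentage figures, here's a minimal sketch of the calculation; the Cinebench-style scores below are made-up placeholders purely to illustrate the maths, not my actual results.

```python
# Percentage uplift of one benchmark score over another.
# The scores are hypothetical placeholders, not measured results.
def uplift(new: float, old: float) -> float:
    """How much faster `new` is than `old`, as a percentage."""
    return (new / old - 1) * 100

cinebench_multi = {
    "8700K stock":  1400,  # placeholder score
    "8700K 5.0GHz": 1600,  # placeholder score
    "9900K stock":  2090,  # placeholder score
    "9900K 4.9GHz": 2100,  # placeholder score
}

print(f"stock vs stock: {uplift(cinebench_multi['9900K stock'], cinebench_multi['8700K stock']):.0f}%")
print(f"OC vs OC:       {uplift(cinebench_multi['9900K 4.9GHz'], cinebench_multi['8700K 5.0GHz']):.0f}%")
```

The same formula is behind the per-game percentages earlier in the video as well, just applied to average frame rates instead of benchmark scores.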

7-Zip was used to test compression and decompression speeds, and the extra two cores in the 9900K help out in decompression tasks, with a 42% improvement at stock speeds and a 31% improvement between the overclocked speeds, as the bigger overclock on the 8700K closes that gap. Compression speeds, on the other hand, see less of a difference, around an 18% improvement at stock speeds between both CPUs. VeraCrypt was used to test AES encryption and decryption speeds, and there's not too much of a performance difference in this workload: a 9% and 15% increase for decryption and encryption speeds respectively with the 9900K at stock over the 8700K at stock. Handbrake was used to convert a 4K video file to 1080p and then a separate 1080p file to 720p. At stock speeds the 9900K completes the 1080p task 47% faster than the stock 8700K, although if we overclock both the difference lowers to 30%. The Corona benchmark uses the CPU to render a scene, and we're seeing fairly similar results to what we've already seen, with the 9900K at stock completing the task over 30% quicker than the 8700K; this is a good example of how the overclock on the 8700K gives us a bigger boost compared to the 9900K's overclock. The V-Ray benchmark gave very similar results to what we just saw in Corona, with no major difference between the 9900K at stock or once overclocked. I've attempted to summarize this information here to try and give you an idea of the differences in these specific applications: the first graph shows how much the 9900K at stock was ahead of the 8700K at stock.

On average, the 9900K was 23% better in these tests. If we then compare the overclocked 9900K against the overclocked 8700K, this lowers to the 9900K being just under 15% better, as the 8700K is able to close the gap with its better overclock. Remember, though, this is only based on my results in the applications tested, and is even specific to the overclocks I was able to gain on my chips. These are the temperatures of both CPUs at stock and while overclocked, at idle and while running a Blender benchmark, with an ambient room temperature of 20 degrees Celsius. I've included two readings, in red and purple, with the three fans on my 360mm AIO running at different speeds; in order to avoid thermal throttling while overclocked, I did have to have the fans running quite loud, but they still seemed to be able to do the job. It's also worth remembering that while overclocked, both have the same 1.36 volts applied, which should further help this comparison. I've measured the total system power draw from the wall rather than relying on software readings, so this should actually be accurate. These are the results of the system at idle and then while under heavy CPU load during the Blender benchmark. As expected, the 9900K uses more power as it's got those two extra cores.

However, there's a larger difference between stock and overclocked results on the 8700K, as there's a larger clock speed change there. Just for fun, I've also briefly tested the 9900K with two of its cores disabled, essentially making it a six core, 12 thread CPU just like the 8700K, to attempt to even the playing field, and I've also locked them both to 4.5GHz on all six cores. We can see that under this somewhat apples to apples test, the 9900K is using slightly less power and is also running a little cooler in the same Blender tests, likely due to the soldered TIM, which should help it cool better compared to the paste between the die and IHS in the 8700K. With both CPUs configured essentially the same, I've retested Cinebench and taken the averages from 5 runs, and the 9900K was slightly ahead in multi core performance in this test, so perhaps an indication of small IPC improvements over the 8th gen. Granted, this is only a 0.67% improvement, but the 9900K was consistently in front. Likewise, I've also rerun the Blender benchmarks, and the 9900K was again only just ahead here, but honestly too close, within margin of error territory.
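Since the gap here is well under one percent, it's worth spelling out the sort of margin-of-error check I have in mind; this is a minimal sketch with invented placeholder scores, not my recorded Cinebench data.

```python
from statistics import mean, stdev

# Hypothetical multi-core scores from 5 runs per chip (both configured as
# 6 cores / 12 threads at 4.5GHz); placeholders only, not my real numbers.
runs_8700k = [1342, 1347, 1339, 1345, 1344]
runs_9900k_6c = [1351, 1349, 1355, 1350, 1353]

avg_old, avg_new = mean(runs_8700k), mean(runs_9900k_6c)
gain_pct = (avg_new / avg_old - 1) * 100

# If the average gap is no bigger than the run-to-run spread,
# it's margin-of-error territory rather than a clear IPC difference.
print(f"average gain: {gain_pct:.2f}%")
print(f"run-to-run spread: +/-{stdev(runs_8700k):.1f} and +/-{stdev(runs_9900k_6c):.1f} points")
```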

For updated pricing for either CPU, check the links in the description. Here in Australia at the time of recording, the 8700K is going for 589 Australian dollars, while the 9900K is going for 859 Australian dollars, so around a 45% higher cost. In the US, the 8700K currently goes for around 370 US dollars, while the 9900K is about 530 US dollars, so a similar increase here of 43%. In games you're definitely not going to see this much of an improvement, but in some CPU specific workloads it may be possible, as we've seen. To be fair, I do expect the 9700K to be a closer comparison to the 8700K, but I still think it's pretty crazy that the price difference is so much just to get a couple of extra cores, especially when you compare it to an AMD Ryzen 2700X, but I'll save that analysis for the upcoming 2700X versus 9900K video. I wanted to do it first, but wasn't able to get my hands on a 2700X at the last minute, so I ended up buying one and I'm now waiting for it to arrive, so don't forget to subscribe so you don't miss that. Intel are advertising the 9900K as the best gaming CPU, and while strictly speaking they're not wrong in terms of performance, as we've seen here there's negligible improvement in most games over the 8700K, so it's kind of hard to justify that 45% higher price. I couldn't see myself buying this if I was looking at building a gaming machine; there are much better choices out there for the money. But if money isn't a problem and you really do want the best, then the 9900K might be for you. If you're currently running an 8th gen system, the 9th gen may appear attractive, as after a BIOS update you can simply put in the new CPU and away you go.
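To put the value question in rough numbers, here's a quick sketch combining the prices quoted above with approximate uplifts from my own results; the gaming figure is a loose ballpark, since as shown it varies a lot by title, settings and resolution.

```python
# Price premium of the 9900K over the 8700K versus the uplift I measured.
prices = {
    "AUD": {"8700K": 589, "9900K": 859},
    "USD": {"8700K": 370, "9900K": 530},
}

uplift_apps_pct = 23   # average gain across the applications tested, at stock
uplift_games_pct = 3   # rough typical gain in the games tested at 1080p

for currency, p in prices.items():
    premium_pct = (p["9900K"] / p["8700K"] - 1) * 100
    print(f"{currency}: {premium_pct:.1f}% more money for ~{uplift_apps_pct}% "
          f"in apps and ~{uplift_games_pct}% in most games")
```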

Honestly, I think in most cases though, the price difference isn't worth it unless you're really going to be doing something that actually utilizes the extra cores. If you're just gaming, then it doesn't really seem like the 9900K is worth it at the current price, and you'd most likely be better off putting that money into a GPU upgrade, especially if you play at resolutions above 1080p or use higher settings. Let me know which of these two CPUs you'd pick and why down in the comments, and don't forget to subscribe for the Ryzen 2700X and Intel 9900K comparison, as well as future tech videos.