sakichop
Guest
Intel still is the best for gaming, I look forward to the day that’s not the case though.
AMD processors consistently fail our QA tests. They may work on the average commercial game, but fail hard when running business apps that are processor hungry. This doesn't mean you shouldn't buy them for a gaming rig, but if you're going to run multiple apps that want CPU cycles all at the same time, stick with IBM.
Note: I'm not on the testing team, but I'm a frequent consultant when purchasing desktops/laptops/VDI's/thin clients/tablets/etc. Of course, when we buy it's in the hundreds of machines typically, so the little vendors don't have a shot.
YMMV
Yeah. I mean it might not be the "best for gaming" if you have a fixed amount of money to spend. But in the end, I am also not ignoring old experiences just because some numbers are higher or lower.
With AMD you can still see hiccups now and then, like with the release of Red Dead Redemption 2, where AMD board manufacturers advised updating the BIOS so that the game doesn't crash.
This might not be AMD's fault alone. But before I switch over to AMD (again), I'd rather let them ripen a bit.
Same goes for graphics cards…had multiple bad experiences with AMD in the past. Never going to switch to AMD for GPU.
Had horrible experiences with Pixelview Monitors, will never buy a pixelview monitor again even if they might look good on paper.
Ofc that's all extremely subjective and hard to argue with.
All I am saying is that in case of hardware there is more than just benchmark values.
> Intel still is the best for gaming, I look forward to the day that’s not the case though.

> AMD processors consistently fail our QA tests. They may work on the average commercial game, but fail hard when running business apps that are processor hungry. This doesn't mean you shouldn't buy them for a gaming rig, but if you're going to run multiple apps that want CPU cycles all at the same time, stick with IBM.
No. Intel is clearly better at running unoptimized rubbish code. Of the thousands of games released each year, almost all are decently optimized; in the two that aren't, brute force lets Intel push the atrocious code past whatever tests it is given.
Why do reviewers never pan unoptimized horrors? Because, just like hotel owners, they play games on an i9 and assume everyone has an i9 - while the Steam statistics are clear:
https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam
Physical cores:
2 cpus 24.75%
4 cpus 51.31%
6 cpus 17.55%
8 cpus 4.09%
12 cpus 0.12%
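To put those survey shares in perspective, a quick back-of-the-envelope sum over the percentages quoted above shows how dominant low core counts still are:

```python
# Core-count shares quoted from the Steam hardware survey above
# (physical cores -> share in %); remaining core counts are omitted.
shares = {2: 24.75, 4: 51.31, 6: 17.55, 8: 4.09, 12: 0.12}

# Share of surveyed machines with 4 or fewer physical cores.
four_or_fewer = sum(pct for cores, pct in shares.items() if cores <= 4)
print(f"<= 4 cores: {four_or_fewer:.2f}%")  # 76.06%

# Share with 8 or more physical cores.
eight_plus = sum(pct for cores, pct in shares.items() if cores >= 8)
print(f">= 8 cores: {eight_plus:.2f}%")  # 4.21%
```

So roughly three out of four surveyed machines have at most four physical cores, while fewer than one in twenty has eight or more.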
Remember, the RDR2 stutterfest wasn't caused by Intel or AMD. It was caused by CPU-thread mismanagement in Rockstar's code, so bad it felt like the same trash as Ubisoft's AC4.
Yeah, they did accuse nVidia, and nVidia did botch Vulkan support in their drivers, but there was nothing wrong with DX12 on nVidia's side from the beginning.
From what I read, Intel is the clear best at 1080p by about 10% to 15%. However when you go up in resolution that drops down to 2% or 3% very quickly.
That’s why, since I’m only interested in pure FPS, I look at 720p benches. At that resolution the 9900K can beat the 3900X by 30-40% in games.
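The shrinking gap has a simple explanation: frame rate is set by whichever of the CPU or GPU takes longer per frame, and raising the resolution only adds GPU work. A toy model sketches the effect (all the per-frame millisecond figures below are invented for illustration, not benchmark data):

```python
# Toy bottleneck model: frame time = max(cpu_time, gpu_time).
# CPU time per frame is roughly resolution-independent, while GPU time
# grows with pixel count. All numbers here are hypothetical.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when the slower component sets the pace."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_fast, cpu_slow = 5.0, 6.5  # assumed per-frame CPU times (ms)

for label, gpu_ms in [("720p", 3.0), ("1080p", 6.0), ("4K", 20.0)]:
    a, b = fps(cpu_fast, gpu_ms), fps(cpu_slow, gpu_ms)
    print(f"{label}: {a:.0f} vs {b:.0f} fps ({(a / b - 1) * 100:+.0f}%)")
```

In this sketch the faster CPU's full 30% lead shows at 720p, shrinks once the GPU starts to dominate at 1080p, and vanishes entirely at 4K, which matches the pattern in the benchmarks discussed above.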
It really depends on what you are looking for though. Generally I'd say that future"proofing" makes almost no sense for CPUs at that cost.
Even with a mid-range CPU (e.g. an i5 of almost any sort) you will likely not need to upgrade for quite a while. You will probably have gone through two different graphics cards by then.
So rather than paying twice the money to get a rather small performance boost, you might want to save the money instead and put it either in the graphics card or into a replacement of the CPU along the road.
It's better to replace your i5 with a brand-new system after a few years than to try to cope with an i9 for two "cycles". Replacing the whole system once is probably even cheaper and more performant for the second cycle.
However, if you keep that CPU for a couple of video card generations, you’ll gain that difference back, and probably more, depending on how fast future generations of video cards are.
While this might be true in some cases, in the extreme case you are illustrating here I think your reasoning is quite flawed.
I never understood the upgrade advantage of PCs. I just buy a well-balanced PC (right now that would be a Ryzen 2600 + GTX 1660). By the time it can't keep up anymore there are new sockets, memory speeds, etc., which basically requires building from scratch.
> While this might be true in some cases, in the extreme case you are illustrating here I think your reasoning is quite flawed.
I think this is rather the "normal" case.
Let's say you had to buy a CPU in 2013, and you either bought the i5-4670K for $243 or the Core i7-4770K for $350. These days it's slowly getting to be time to upgrade again.
Yes, the i7 has 100 MHz more boost clock and 4 more logical cores due to hyperthreading. But it will not make a significant difference in your decision on whether to upgrade now or not. You can actually check some recent benchmarks in this video (skip to the end): https://www.youtube.com/watch?v=R5qwTsdMMg8
So was this really worth the extra $100? Probably not.
Of course you can go even further on both ends: up to socket 2011 on the one end, or down to an i3 with even better price/performance on the other.
In addition, the gap between what is useful and the maximum you can get is even wider today: the i9-9900KS for $513 versus, say, an i5-9500 for about $200, or an i5-9600K for $262.
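The argument above boils down to simple arithmetic. Using the 2013 prices quoted, and a hypothetical placeholder price for a later mid-range replacement (not a real quote), the "buy mid-range twice" path costs only slightly more than stretching the high-end part, while ending the second cycle on a current-generation CPU instead of a six-year-old one:

```python
# Prices quoted above for the 2013 parts; the replacement price is a
# hypothetical assumption for illustration, not a real quote.
i5_4670k = 243
i7_4770k = 350
i5_replacement = 200  # assumed price of a later mid-range i5

buy_once = i7_4770k                    # one i7 stretched over two cycles
buy_twice = i5_4670k + i5_replacement  # mid-range now, replace later

print(f"i7 for two cycles:       ${buy_once}")    # $350
print(f"i5 now + i5 later:       ${buy_twice}")   # $443
print(f"extra cost of upgrading: ${buy_twice - buy_once}")  # $93
```

Under these assumed numbers, roughly $93 extra buys a mid-cycle refresh; put the saved difference toward the graphics card instead and the trade looks even better.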
Yes and no. Luckily the graphics card is pretty separate AND it's the biggest boost. So you can just replace your graphics card after 3-5 years and you have a top-notch PC again without upgrading the rest. After another 3-5 years you will need to replace everything, though.
> I think this is rather the "normal" case.

> Yes and no. Luckily the graphics card is pretty separate AND it's the biggest boost. So you can just replace your graphics card after 3-5 years and you have a top-notch PC again without upgrading the rest. After another 3-5 years you will need to replace everything, though.
Please don't invest into a CPU because of one unoptimized game. Whatever you upgrade to, the next Warner's Batman will still struggle on it; the latest Batman game still struggles on a lot of older hardware. Anyway, I've said it before: we need more engines like Fox, the one used in The Phantom Pain.