RPGWatch Forums » General Forums » Tech Help » System Requirements & CPUs

System Requirements & CPUs

April 8th, 2014, 03:52
It's getting pretty difficult for me to understand what CPU specs are even supposed to mean these days. The "recommended" specs for the upcoming Watch Dogs game are:

Eight core - Intel Core i7-3770 @3.5 GHz or AMD FX-8350 X8 @ 4 GHz

We've got a problem right off the bat, as the i7-3770 has just 4 physical cores. So I guess hyperthreading's virtual cores count?

My main problem, though, is that CPUs have basically been treading water for several years now. AMD and Intel are both playing to the mobile/laptop crowd by reducing power draw and beefing up built-in graphics. There is some monkeying around with the sizes of the instruction caches, but raw speed and core count have been at a standstill. Yet the specs inevitably list CPUs that are fairly recent - something in stores now.

Right now I've got an old i7-980. According to PCMark 7 Entertainment benchmarks on Tom's Hardware, my old beast gets a score of 4897 while the AMD CPU listed gets a slightly lower 4881. So it looks like I should be good, but the Intel CPU they list is the second best on the entire list with a score of 5539.
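To put those scores in perspective, here's a quick bit of arithmetic on the numbers quoted above (the figures are the PCMark 7 scores from this post; nothing else is implied):

```python
# PCMark 7 Entertainment scores quoted above (via Tom's Hardware).
scores = {"i7-980": 4897, "FX-8350": 4881, "i7-3770": 5539}

base = scores["FX-8350"]  # the "recommended" AMD chip as the baseline
for name, score in scores.items():
    print(f"{name}: {100 * (score - base) / base:+.1f}% vs FX-8350")
```

The two "recommended" chips come out roughly 13% apart, while the old i7-980 lands within half a percent of the AMD one.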

The other benchmarks seem to tell the same story: the listed Intel processor scores a good bit higher than the listed AMD processor, with quite a few CPUs ranked in between the two.

So what are these things really? I know "recommended" is already a pretty soft term, but come on - the two CPUs listed should at least score fairly close to each other. They REALLY need to make tech demos we can download for free to check these things out.
Zloth
I smell a… wumpus!?

#1

Join Date: Aug 2008
Location: Kansas City
Posts: 2,962

April 8th, 2014, 04:31
So long as you are running a "modern" CPU, it won't be holding you back much. I remember trying out a budget single-core in place of my overclocked Core2 Duo (3-point-something GHz) and firing up a few games. The difference was barely noticeable. Keep in mind this was an XP machine tweaked for gaming, and several years ago. I'd say the CPU is important, but not as big a deal as the people selling them would have you believe.

Now if you do something more CPU-intensive - video encoding, Folding@home, etc. - then the difference is much bigger.
CrazyIrish
Keeper of the Watch

#2

Join Date: May 2008
Posts: 604

April 8th, 2014, 06:39
If you're in the market, I would stay away from AMD. They are complete garbage in single-threaded applications (which games are, for the most part). My new i7-4770K absolutely destroys the 3-year-old Phenom II 965 I had previously.

As to GHz ratings, the technologies are so varied these days that such comparisons only make sense between like architectures. I once saw an analogy comparing a current 4 GHz chip to one from 5 years ago as something akin to comparing a Mazda Miata with a Ferrari - they'll both get you to 100 mph, but… there's a bigger story to tell.
Drithius
Misbegotten Alien

#3

Join Date: Nov 2008
Location: Florida, USA
Posts: 2,617

April 8th, 2014, 23:57
The fact that a $340 current Intel processor blows the doors off of a 3-year-old $250 AMD is hardly conclusive. I built a gaming rig for a friend recently using a $200 AMD and it performs flawlessly. If your budget is well over $200 for a processor, then yes, it makes sense to go Intel. Otherwise, either will serve.
CrazyIrish
Keeper of the Watch

#4

Join Date: May 2008
Posts: 604

April 9th, 2014, 01:57
Originally Posted by CrazyIrish
The fact that a $340 current Intel processor blows the doors off of a 3-year-old $250 AMD is hardly conclusive. I built a gaming rig for a friend recently using a $200 AMD and it performs flawlessly. If your budget is well over $200 for a processor, then yes, it makes sense to go Intel. Otherwise, either will serve.
Actually, it's pretty conclusive. The AMD FX-8350 chip noted by Zloth is comparable to an Intel Celeron processor in single-threaded performance.

AMD's R&D for the past 5 years has been a joke, amounting to throwing more cores on a chip to try to obfuscate the fact that their CPU architecture is leaps and bounds behind Intel's. Multiple cores are all well and good if you're video editing but, nine times out of ten, an end user's typical workload is better served by faster per-core performance. Gaming lies within this realm.

I used to be a fan of AMD CPUs from a price-to-performance perspective, but that was back when they were still technologically competitive.
Drithius
Misbegotten Alien

#5

Join Date: Nov 2008
Location: Florida, USA
Posts: 2,617

April 9th, 2014, 02:17
Going back on topic, somewhat, I googled a terrific article I'd recommend reading for anyone mildly curious: HERE.

Some of the more applicable parts:
The Megahertz-myth was a result of people being conditioned to see the clockspeed as an absolute measure of performance. It is an absolute measure of performance, as long as you are talking about the same microarchitecture. So yes, if you have two Pentium 4 processors, the one with the higher clockspeed is faster. Up to the Pentium 4, the architectures of Intel and competing x86 processors were always quite similar in performance characteristics, so as a result, the clockspeeds were also quite comparable. An Athlon and a Pentium II or III were not very far apart at the same clockspeed.

However, when the architectures are different, clockspeed becomes quite a meaningless measure of performance. For example, the first Pentiums were introduced at 66 MHz, the same clockspeed as the 486DX2-66 that went before it. However, since the Pentium had a superscalar pipeline, it could often perform 2 instructions per cycle, where the 486 did only one at most. The Pentium also had a massively improved FPU. So although both CPUs ran at 66 MHz, the Pentium was a great deal faster in most cases.
__________________________________________________
The more cores you add to a CPU, the faster the parallel parts of an application are processed, so the more the performance becomes dependent on the performance in the sequential parts.

In other words: the single-threaded performance becomes more important. And that is what makes the multi-core myth a myth!

What we see today is that Intel’s single-threaded performance is a whole lot faster than AMD’s. This not only gives them an advantage in single-threaded tasks, but also makes them perform very well in multi-threaded tasks. We see that Intel’s CPUs with 4 cores can often outperform AMD’s Bulldozer architecture with 6 or even 8 cores. Simply because Intel’s 4 cores are that much faster.
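The argument in those last quoted passages is Amdahl's law. A minimal sketch in Python - the chip numbers below are invented for illustration, not real benchmarks:

```python
def relative_runtime(per_core_speed, cores, parallel_fraction):
    """Amdahl's law: the serial part runs on one core, the parallel part splits across all."""
    serial = (1 - parallel_fraction) / per_core_speed
    parallel = parallel_fraction / (per_core_speed * cores)
    return serial + parallel

# Hypothetical chips: 4 fast cores vs. 8 cores each only 60% as fast,
# on a workload that is only 60% parallelizable (plausible for a game).
fast_quad = relative_runtime(per_core_speed=1.0, cores=4, parallel_fraction=0.6)
slow_octo = relative_runtime(per_core_speed=0.6, cores=8, parallel_fraction=0.6)
print(fast_quad, slow_octo)  # 0.55 vs ~0.79 - the 4 fast cores finish first
```

The serial fraction dominates: doubling the core count can't compensate for slower per-core speed once the sequential part becomes the bottleneck.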
Drithius
Misbegotten Alien

#6

Join Date: Nov 2008
Location: Florida, USA
Posts: 2,617

April 9th, 2014, 04:47
I'm not in the market; I'm just trying to figure out what to do with that "CPU" system requirement we see tacked on to every game out there.

Different games put very different stresses on the CPU. Some may need multi-threading, some may not. Some might benefit from a big L2 cache, some might not. With the technology advancements of late having little to do with gaming, it seems like we should either see a really old CPU listed or see stats listed instead of model numbers. That CPU requirement I quoted above should really say something more like: "cores times GHz of at least 10, but don't count more than 6 cores, and hyper-threading cores count as half a core; L2 cache should be at least 3 MB." Or, if a benchmark correlates really well with the game, point to a benchmark score. But instead we get model numbers, and those model numbers don't really agree with each other when you compare them.
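Just to make that proposed rule concrete, here it is as code. The formula and its thresholds are this post's invention, not anything a publisher actually uses:

```python
def meets_requirement(physical_cores, ht_threads, ghz, l2_cache_mb):
    """Hypothetical check: effective cores x GHz >= 10, HT threads count as half,
    no more than 6 effective cores counted, and at least 3 MB of L2 cache."""
    effective_cores = min(physical_cores + 0.5 * ht_threads, 6)
    return effective_cores * ghz >= 10 and l2_cache_mb >= 3

# i7-3770-style chip: 4 physical cores + 4 HT threads at 3.5 GHz -> 6 effective cores
print(meets_requirement(4, 4, 3.5, 8))   # True  (6 * 3.5 = 21)
# an old dual-core at 2.4 GHz
print(meets_requirement(2, 0, 2.4, 3))   # False (2 * 2.4 = 4.8)
```

A rule like this would at least let two very different architectures be compared on the same scale, which is the whole complaint about model-number requirements.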
Zloth
I smell a… wumpus!?

#7

Join Date: Aug 2008
Location: Kansas City
Posts: 2,962

April 9th, 2014, 05:50
The problem with fiat benchmarks is that they get to decide the scale, which in turn greatly influences your perception of the difference between scores. I personally stared at benchmarks for years. Then one day I woke up and realized I should actually use my computer instead of fantasizing about how fast it is in fiat terms compared to someone else's computer measured in the same fiat terms.

There's not much point in getting into it, as everyone has their own standards, but this is an apt analogy: cars come in a wide variety of performance levels. A new car in the US can be had with as little as 60 hp or as much as 1,600 hp. Sure, a GT-R will smoke a Camry in all ways measurable (and in the automotive world, the "benchmarks" actually have real-world meaning), but for the average driver going on the average drive, the average car will do just fine. And a 15% increase in performance simply won't be noticed by most. But lots of people convince themselves that they can feel every GHz… I mean, horsepower.

And to get back on topic - the "required/recommended CPU" is primarily a way for game developers to limit their exposure/liability when it comes to older systems. Very few games are CPU-intensive enough for it to be a big deal. If you are rocking a mid-grade processor from 4 generations ago, then yes, you may have an issue. Otherwise, don't sweat it.
CrazyIrish
Keeper of the Watch

#8

Join Date: May 2008
Posts: 604

April 9th, 2014, 06:13
Originally Posted by CrazyIrish
The problem with fiat benchmarks is they get to decide the scale, which in turn greatly influences your perception of the difference between different scores. I've personally stared at benchmarks for years. Then one day I woke up and realized I should actually use my computer instead of fantasizing about how fast it is measured in fiat terms compared to someone else's computer measured in the same fiat terms.
I used to think that too, not putting much importance in benchmarks - until I got royally pissed off when I lagged in a 4-year-old game (New Vegas) with my then 2-year-old processor. And I researched everywhere as to what the problem was… AMD's architecture is inferior, no two ways about it - whether or not you drive it hard enough to notice the problem is another matter.

But, yes, the "technical requirements" exist primarily to absolve publishers of having to Q/A more configurations than they care to. There are so many chipsets these days (thanks in no small part to Intel) that a single realistic performance rating/requirement doesn't exist.
Drithius
Misbegotten Alien

#9

Join Date: Nov 2008
Location: Florida, USA
Posts: 2,617

April 9th, 2014, 11:32
Well, CPU discussion has been fairly useless in gaming for a long time now. It might be that now that we have new consoles with more cores, games will be built for more than 2 cores, and it'll start getting interesting again.

I wouldn't say architecture is the only reason the GHz discussion more or less died; the main reason is that going above a certain clock speed is not cost-effective. Four 2 GHz cores generate less heat and cost less than one 5 GHz core would - at least it used to be like that - which means adding more cores made sense. For gaming at this point in time, though, the single 5 GHz core would have been much more useful, since games didn't take advantage of the four 2 GHz cores. A lot of tasks in games are still single-threaded in nature and hard to split into different threads, which makes GHz the important measure of how fast the bottleneck single-threaded task can be completed.

Another way to improve performance is to let the processor do more things per clock cycle, which is also what they've been working on, but that too is only useful for the kinds of tasks that can use these instructions.
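A rough way to see that per-clock point: throughput is approximately instructions-per-cycle (IPC) times clock speed, so a lower-clocked chip with better IPC can come out ahead. The figures below are made up purely for illustration:

```python
# throughput ~ IPC x clock speed (relative units; numbers are invented)
chips = {
    "older chip, 3.8 GHz, IPC 1.0": 1.0 * 3.8,
    "newer chip, 3.5 GHz, IPC 1.6": 1.6 * 3.5,
}
for name, throughput in chips.items():
    print(f"{name}: {throughput:.1f}")
# the lower-clocked chip wins: 5.6 vs 3.8
```

This is the same reason the quoted article gives for the 66 MHz Pentium beating the 66 MHz 486: more work done per cycle at the same clock.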

It'd be interesting if there were some new technology that could give us things like 10 GHz processors.
GothicGothicness
SasqWatch

#10

Join Date: Oct 2006
Posts: 4,408

April 11th, 2014, 13:02
Originally Posted by GothicGothicness
It'd be interesting if there was some new technology that could give us things like 10 GHZ processors.
There probably is, but it's either too complicated for the current dumbsters in the industry or too expensive compared to junkyard resources.

I wouldn't want to see processors with higher frequencies, for one reason only.
When we had "slow" CPUs, there was one particular thing in software development that would show the difference between morons and talented programmers.
Because of the skyrocketing number of MHz and later GHz, today no one wants to pay for quality; CEOs hire semi-useless teams and we get half-baked products.

The one thing that was very important in the past, and is ignored today, is code optimization. And it's something you can't do (right) if you're not an artist in coding.
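As a toy example of the sort of hand-optimization being mourned here (my own illustration, nothing from the post): hoisting a loop-invariant computation so it isn't redone on every iteration.

```python
import math

def norms_naive(points, scale):
    # recomputes the invariant scale factor on every iteration
    return [math.hypot(x, y) * (scale / math.sqrt(2)) for x, y in points]

def norms_hoisted(points, scale):
    # compute the loop-invariant factor once, outside the loop
    k = scale / math.sqrt(2)
    return [math.hypot(x, y) * k for x, y in points]

print(norms_hoisted([(3.0, 4.0)], 2.0))  # hypot(3,4)=5, scaled by 2/sqrt(2)
```

Modern compilers and fast CPUs often paper over this kind of thing, which is part of why such books stopped being required reading.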
Books like this one are no longer considered an essential read, simply because of overpumped CPU clocks.
Which CEO would hire you if you state that you can optimize just about anything? None. They don't know what code optimization is. And the ones who do know simply don't care any more.
And then we get a stuttering Thief.

Toka Koka
joxer
The Smoker
RPGWatch Donor

#11

Join Date: Apr 2009
Posts: 6,733

Tags
benchmarks, comparisons, system specs
Copyright by RPGWatch