RPGWatch Forums » General Forums » Tech Help » nVidia 400 series

nVidia 400 series

July 6th, 2010, 18:10
Originally Posted by MasterKromm View Post
(Tangent/rambling thought) AMD is clearly in the lead now, no question (if I needed to upgrade I'd be all over a 5870). But it's worth mentioning that Nvidia's G80 (their unified shader architecture) was incredibly potent. IMO, if GT200 had achieved the clocks Nvidia set out for, they wouldn't be nearly as bad off as they are now. Yes, AMD would still have wrestled away the performance and price/performance crowns with the dawn of their Evergreen lineup (some say Nvidia has taken back performance with Fermi, at least in DX11/GPGPU/tessellation).
I don't see AMD having the lead in performance, or price/performance, right now. Just curious, what makes you believe that AMD is "clearly in the lead now"?
JDR13 (#21)

July 6th, 2010, 20:01
Originally Posted by JDR13 View Post
I don't see AMD having the lead in performance, or price/performance, right now. Just curious, what makes you believe that AMD is "clearly in the lead now"?
Well, it depends on what you're after. From a pure single-GPU performance perspective (not a single-card dual-GPU solution like the 5970), I'll concede the GTX 480 is king of the hill… When I said ATI had the lead, I meant more in terms of product success, overall price/performance, thermals/acoustics/power consumption, and image quality.

That said, since this is a Fermi thread, I'll point out where it falls short compared to AMD's Evergreen lineup:

-First and foremost, AMD has a full DX11 lineup, unlike NV: from the lowly 5670 up to the 5970. The real issue for NV's Fermi is that every current part uses the exact same core, GF100. The difference between the GTX 480, 470, and 465 is simply a few disabled blocks (i.e. binning), either by design or by necessity (disabling defective blocks). That translates into parts that sometimes consume similar power while providing lower performance (http://www.anandtech.com/show/3745/n…rce-gtx-465/14). AMD, by contrast, has the Cypress (5800), Juniper (5700), Redwood (5600/5500), and Cedar (5400) series, each with its own dedicated die. Now, GF104 might finally shake things up a bit; frankly, that part is long overdue, as is a refresh/shrink of GF100. It's worth noting that even a die shrink might not be enough to rein in power and let Nvidia produce a dual-GPU card.

-If you read through the previously linked AnandTech review, you'll see that Fermi's thermals, acoustics, and power consumption are terrible. Heck, AMD's dual-GPU 5970 consumes less power under load than a single GTX 480. Not only is there a question of performance per watt, but you also have to consider that an AMD GPU might be a pure drop-in upgrade, whereas the GTX 480 could require a new PSU (quick headroom numbers after this list). That's literally where I'd find myself if I were considering a GPU upgrade, since I have a Corsair VX550 (amazing PSU, btw). Sure, you could argue that Corsair's CWT- and Seasonic-built units are great and can be pushed up to, and in some cases beyond, their rated maximum (for my 550 that's 41A on the 12V rail, roughly 490W). But why go there? I know performance per watt might sound like a crazy metric to even consider, yet in an area with exceptionally high energy costs, and living in a house with my brother plus one more housemate (both of whom use my gaming PC more than I do), that performance-per-watt figure becomes very important when weighing any potential upgrade.

-If you look at AMD's design, their smaller, more efficient approach has paid off in spades. Fermi is a monster at 3bn transistors and a die size of 529mm², especially compared to Cypress at 2.15bn transistors and 334mm². Not only does AMD get more dies per wafer thanks to the physically smaller die, the less complex core should also yield significantly better (rough dies-per-wafer math after this list).

die_comparison.png

-AMD has also finally improved on something where they were sorely lacking and getting beaten by Nvidia: texture filtering. Before, I found Nvidia's angle-dependent anisotropic filtering algorithm superior; now that AMD has implemented angle-independent anisotropic filtering, they rule the roost.

http://www.bit-tech.net/hardware/gra…re-analysis/12

-And lastly, Nvidia's drivers are no longer perceptibly better than ATI's; they're roughly equal. That isn't necessarily bad, just no longer an advantage for the green team.
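
A quick sanity check on the PSU point above (a minimal sketch; the board-power figures and the 200 W allowance for the rest of the system are rough assumptions, not measurements):

Code:
# Rough 12 V headroom check for the "drop-in upgrade vs. new PSU" question.
rail_amps = 41.0                        # VX550's rated 12 V capacity (from the post above)
rail_watts = rail_amps * 12.0           # ~492 W available on the 12 V rail

gpu_board_power = {"HD 5870": 188, "GTX 480": 250}   # approximate vendor TDP figures (W)
rest_of_system = 200                    # CPU, motherboard, drives, fans: a generous guess (W)

for card, watts in gpu_board_power.items():
    headroom = rail_watts - watts - rest_of_system
    print(f"{card}: ~{headroom:.0f} W of 12 V headroom left")

And a rough feel for the die-size point (a minimal sketch using the standard gross-die approximation and a simple Poisson yield model; the defect density is purely illustrative, not TSMC's actual 40 nm figure):

Code:
import math

def gross_dies_per_wafer(die_area_mm2, wafer_diameter_mm=300.0):
    # First-order approximation: wafer area over die area, minus a correction
    # for the partial dies lost around the wafer edge.
    radius = wafer_diameter_mm / 2.0
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_cm2):
    # Simple Poisson model: yield drops exponentially with die area.
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)

for name, area in [("GF100 / Fermi", 529.0), ("Cypress / 5870", 334.0)]:
    dies = gross_dies_per_wafer(area)          # ~104 vs ~175 candidates per 300 mm wafer
    good = dies * poisson_yield(area, 0.3)     # 0.3 defects/cm^2 is an illustrative guess
    print(f"{name}: ~{dies} gross dies/wafer, ~{good:.0f} good dies at that defect density")
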
Last edited by MasterKromm; July 6th, 2010 at 20:08. Reason: added picture
MasterKromm (#22)

July 6th, 2010, 21:13
Originally Posted by MasterKromm View Post
When I said ATI had the lead I meant more in terms of product success, overall price/performance, thermals/acoustics/power consumption and image quality.
I'd say overall price/performance is *real* close right now, with ATI perhaps having a small advantage. ATI is better if you're really worried about power consumption (few enthusiast gamers are), but their cards are not significantly quieter than nVidia's.

Not sure where you get ATI having better image quality, because that's definitely not true.


Originally Posted by MasterKromm View Post
-First and foremost, AMD has a full DX11 lineup, unlike NV: from the lowly 5670 up to the 5970. The real issue for NV's Fermi is that every current part uses the exact same core, GF100. The difference between the GTX 480, 470, and 465 is simply a few disabled blocks (i.e. binning), either by design or by necessity (disabling defective blocks). That translates into parts that sometimes consume similar power while providing lower performance (http://www.anandtech.com/show/3745/n…rce-gtx-465/14). AMD, by contrast, has the Cypress (5800), Juniper (5700), Redwood (5600/5500), and Cedar (5400) series, each with its own dedicated die.
I can see how that might be an advantage in terms of overall sales (uninformed people buying a cheap video card because it says "DirectX 11 compatibility" on the box), but in truth, those lower/mid-range cards can't actually run many DX11 games at acceptable framerates.


Originally Posted by MasterKromm View Post
-AMD has also finally improved on something where they were sorely lacking and getting beaten by Nvidia: texture filtering. Before, I found Nvidia's angle-dependent anisotropic filtering algorithm superior; now that AMD has implemented angle-independent anisotropic filtering, they rule the roost.

http://www.bit-tech.net/hardware/gra…re-analysis/12.
Sure, if you want to compare the 5xxx series to nVidia's previous generation of cards, which is exactly what that article does. Personally, I'd rather be able to use Supersampling Anti-Aliasing in DX 10/11 games, which ATI cards are incapable of doing.
JDR13 (#23)

July 6th, 2010, 21:44
Originally Posted by JDR13 View Post
I'd say overall price/performance is *real* close right now, with ATI perhaps having a small advantage. […] Personally, I'd rather be able to use Supersampling Anti-Aliasing in DX 10/11 games, which ATI cards are incapable of doing.
The Angle Independent Anisotropic Filtering was the IQ boost I was talking about… As for SSAA, how many good DX10/11 games are out right now and of those how many are RPGs?

DX11 is the future, but (and this is entirely my opinion) why jump on the bandwagon as an early adopter? It goes without saying that there will always be better tech down the line, and some might argue that if you wait too long you fall into the trap of always waiting for the next technological breakthrough. Still, I'm of the opinion that if you have a GPU capable of cranking out acceptable framerates for the games you enjoy, at your monitor's native resolution (plus whatever eye candy you like), why bother?

Also, in the games that support SSAA, how does the card perform? I doubt it's good, since at 4x SSAA the GPU has to sample (and shade) each pixel four times… That kind of increased workload typically translates into a huge drop in performance (rough numbers below). Does it take SLI to run 1920x1200 with 4x SSAA at playable framerates? Also, gaming on a monitor with a higher pixel density (i.e. a smaller pixel pitch) reduces the need for AA, so there are scenarios where the IQ difference could be a complete wash.
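
(Rough numbers, assuming the game is purely shader/bandwidth bound; real titles are partly geometry and CPU limited, so the actual hit is usually somewhat smaller:)

Code:
width, height = 1920, 1200
ssaa = 4                                   # 4x SSAA = four shaded samples per pixel

base = width * height                      # 2,304,000 shaded samples per frame
with_ssaa = base * ssaa                    # 9,216,000 shaded samples per frame

print(f"No AA:   {base:,} samples/frame")
print(f"4x SSAA: {with_ssaa:,} samples/frame ({with_ssaa // base}x the shading and bandwidth load)")

# If a game runs at 60 fps without AA and is fully shader bound, the naive expectation is:
print(f"60 fps -> roughly {60 / ssaa:.0f} fps with 4x SSAA")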

I'm curious now, do you have any comparison screens of your own you'd be willing to share?
MasterKromm (#24)

July 6th, 2010, 22:05
For me, I'd rather run at a higher resolution than run with AA on. Of course, being able to have both is best, but that's not realistic when playing bleeding-edge games. SSAA would seem to make this problem worse. Is it a moot point for newer games, and only useful for older titles that aren't too demanding?
Thrasher (#25)

July 6th, 2010, 22:07
Originally Posted by Thrasher View Post
For me, I'd rather run at a higher resolution than run with AA on. Of course, being able to have both on is best, but not realistic when playing bleeding edge games.
Sure it is, if you have the hardware. I'm currently playing Metro 2033 at 1920x1200 with 4x AA.

Of course it depends on the graphics engine of the game. Some games look fine without any AA at all.
JDR13 (#26)

July 6th, 2010, 22:18
And some games were impossible to play at full resolution with AA turned on (like Crysis).
Thrasher (#27)

July 6th, 2010, 22:39
Originally Posted by MasterKromm View Post
The Angle Independent Anisotropic Filtering was the IQ boost I was talking about… As for SSAA, how many good DX10/11 games are out right now and of those how many are RPGs?
Not a huge amount, but enough for me, and I don't limit myself to crpgs.


Originally Posted by MasterKromm View Post
DX11 is the future, but (and this is entirely my opinion) why jump on the bandwagon as an early adopter? It goes without saying that there will always be better tech down the line, and some might argue that if you wait too long you fall into the trap of always waiting for the next technological breakthrough. Still, I'm of the opinion that if you have a GPU capable of cranking out acceptable framerates for the games you enjoy, at your monitor's native resolution (plus whatever eye candy you like), why bother?
I agree in part, but I wasn't happy with my previous card (Radeon 4890), and I got a very good price on the GTX 470. I didn't upgrade just for DirectX 11, although I'm already enjoying its benefits. (Tessellation is quite nice.)

Originally Posted by MasterKromm View Post
Also, in the games that support SSAA, how does the card perform? I doubt it's good since at 4x SSAA the GPU has to sample each pixel 4 times… That kind of increased workload typically translates into a huge drop in performance. Does it take SLI to run 1920x1200 and 4x SSAA at playable frames?
Afaik it doesn't matter if a game supports SSAA because you can force it almost every time. Of course performance is going to depend on the game itself. I wouldn't dare to try it with something like Crysis or Metro 2033, but plenty of older games are quite playable. At least the option is there for the people who do have dual-GPU setups.

Originally Posted by MasterKromm View Post
I'm curious now, do you have any comparison screens of your own you'd be willing to share?
Not quite sure what you mean? Are you talking about SSAA specifically? Here's an article that talks about it, and shows a few comparison screens as well.
http://www.pcgameshardware.com/aid,7…ames/Practice/
JDR13 (#28)

July 6th, 2010, 22:45
Originally Posted by Thrasher View Post
And some games were impossible to play at full resolution with AA turned on (like Crysis).
Yep, but the key word there is "were", as in when it was first released. Crysis can be played at high resolution + AA on newer high-end systems, and it's definitely not a problem for those with dual GPUs.
JDR13 (#29)

July 6th, 2010, 23:13
And there are new games that are impossible to play at full resolution with AA turned on, right?
Thrasher (#30)

July 7th, 2010, 00:42
Originally Posted by Thrasher View Post
And there are new games that are impossible to play at full resolution with AA turned on, right?
I don't know about "impossible", but there are a few that are going to suffer from slower framerates with a single GPU.
JDR13 (#31)

July 7th, 2010, 00:59
Below 30 fps for me is "impossible".
Thrasher (#32)

July 7th, 2010, 01:23
Originally Posted by JDR13 View Post
Not a huge amount, but enough for me, and I don't limit myself to crpgs.
Unfortunately I don't have as much spare time anymore so I have to limit my game selection quite a bit.

Originally Posted by JDR13 View Post
I agree in part, but I wasn't happy with my previous card (Radeon 4890), and I got a very good price on the GTX 470. I didn't upgrade just for DirectX 11, although I'm already enjoying its benefits. (Tessellation is quite nice.)
I don't doubt that… I was just throwing in my own 2 cents. I've always been partial to Nvidia, but for me their last two major GPU releases (GT200 and now GF100) were big letdowns. Not that either was bad; I just had high expectations (their constant and seemingly endless delays didn't help either).

Originally Posted by JDR13 View Post
Afaik it doesn't matter if a game supports SSAA because you can force it almost every time. Of course performance is going to depend on the game itself. I wouldn't dare to try it with something like Crysis or Metro 2033, but plenty of older games are quite playable. At least the option is there for the people who do have dual-GPU setups.
I meant DX10/11 titles (or am I mistaken; can the DX9 API also support full-screen SGSSAA?)… Random info: DX11 is a superset of DX10.1, which in turn is a superset of DX10. In other words, you have to really dig into a "DX11" game's functionality to determine whether it actually uses any of the features introduced by the new superset. The two biggest are tessellation and multithreaded rendering (yes, on the CPU side; finally something for our quad- and hexa-core chips to do).
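
(A simplified capability map of that point; this is an illustrative summary, not the full Direct3D 11 feature-level matrix:)

Code:
# Which headline features each Direct3D feature level actually exposes.
FEATURE_LEVELS = {
    "10_0": {"shader_model": "4.0", "tessellation": False, "compute": "cs_4_0 (optional)"},
    "10_1": {"shader_model": "4.1", "tessellation": False, "compute": "cs_4_1 (optional)"},
    "11_0": {"shader_model": "5.0", "tessellation": True,  "compute": "cs_5_0"},
}

# Multithreaded rendering (deferred contexts / command lists) is a DX11 API feature
# rather than a hardware feature level, so it can help on DX10-class GPUs too,
# driver support permitting.
for level, caps in FEATURE_LEVELS.items():
    print(f"Feature level {level}: {caps}")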

Originally Posted by JDR13 View Post
Not quite sure what you mean? Are you talking about SSAA specifically? Here's an article that talks about it, and shows a few comparison screens as well.
http://www.pcgameshardware.com/aid,7…ames/Practice/
Yeah, I remember seeing that when it was first published… I was hoping for real-world examples, as their textures seemed somewhat blurred; then again, they applied/increased TSSAA, not SSAA.

Oh, and I've seen ToMMTi-Systems' SSAA tool posted a couple of times while searching for info on SSAA. Have you tried it out? (If so, any thoughts?)

Link : http://www.tommti-systems.de/start.html

Originally Posted by Thrasher View Post
Below 30 fps for me is "impossible".
While I didn't really like Crysis (it came free with my G92 GTS), it surprisingly played somewhat "smooth" even under 30 fps. Though in general I tend to agree: below 30 fps = extreme pain.
MasterKromm (#33)

July 7th, 2010, 01:29
Impossible to me equals extreme frustration. But possibly playable by someone with a strong ability to dissociate themselves from time.
Thrasher (#34)

July 7th, 2010, 01:33
To test that theory perhaps it is time to play an "impossible" game while getting acquainted with Mary Jane?

MasterKromm (#35)

July 7th, 2010, 01:57
Originally Posted by MasterKromm View Post
I've always been partial to Nvidia, but for me their last two major GPU releases (GT200 and now GF100) were big letdowns. Not that either was bad; I just had high expectations (their constant and seemingly endless delays didn't help either).
I wouldn't be so quick to judge Fermi from second-hand sources. I'm pleasantly surprised with it, and it has performed even better than I expected. A good friend and gaming partner of mine has a 5850, and he says that now he wishes he had held out for a GTX 470. The only downside atm is cost.


Originally Posted by MasterKromm View Post
I meant DX10/11 titles (or am I mistaken; can the DX9 API also support full-screen SGSSAA?)… Random info: DX11 is a superset of DX10.1, which in turn is a superset of DX10. In other words, you have to really dig into a "DX11" game's functionality to determine whether it actually uses any of the features introduced by the new superset. The two biggest are tessellation and multithreaded rendering (yes, on the CPU side; finally something for our quad- and hexa-core chips to do).
I don't think any version of DirectX would be incompatible with SGSSAA, but don't quote me on that; I'm too lazy to actually look it up right now. I do remember using it years ago (before MSAA) with earlier 3D games. IIRC, didn't the old 3dfx cards and early GeForce cards use the same method?


Originally Posted by MasterKromm View Post
Oh, and I've seen ToMMTi-Systems' SSAA tool posted a couple of times while searching for info on SSAA. Have you tried it out? (If so, any thoughts?)
Not familiar with it. Thanks for the link though, maybe I'll give it a try when I have some spare time.

Oh… and go easy on Thrasher. He's extremely sensitive to anything about a game that's not perfect to him.
JDR13 (#36)

July 7th, 2010, 02:11
Back to my earlier point: I really don't think they can call it Fermi until they implement the software.

The GPGPU features are supposed to be fully available as an import library rather than running CUDA files and accessing them as external objects.

And again, how much faster would your computer be with 120-512 cores running at anywhere from 512 MHz to 1 GHz, compared to, say, 2 or 4 CPU cores at 3.2 GHz with so much of the architecture taken up by caching?
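
(Back-of-envelope on that question, using peak theoretical throughput only; the clocks, core counts, and flops-per-clock below are illustrative, and real workloads rarely get near either peak:)

Code:
# Peak single-precision throughput, GPU vs. CPU (illustrative figures).
gpu_cores, gpu_clock_ghz = 480, 1.4        # e.g. GTX 480 shader cores at ~1.4 GHz
gpu_flops_per_core = 2                     # fused multiply-add = 2 flops per clock

cpu_cores, cpu_clock_ghz = 4, 3.2          # quad core at 3.2 GHz
cpu_flops_per_core = 8                     # 4-wide SSE add + 4-wide SSE mul per clock

gpu_gflops = gpu_cores * gpu_clock_ghz * gpu_flops_per_core
cpu_gflops = cpu_cores * cpu_clock_ghz * cpu_flops_per_core
print(f"GPU peak: ~{gpu_gflops:.0f} GFLOPS, CPU peak: ~{cpu_gflops:.0f} GFLOPS "
      f"(~{gpu_gflops / cpu_gflops:.0f}x), assuming the problem is massively parallel")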



Also, my other research machine is running a 5870 but I haven't played with Stream yet.

My purpose is to do comparative work between the machines using OpenCL. When I get it working, it will open up a new benchmark you guys can argue about.
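
For what it's worth, a minimal OpenCL vector-add in Python (a sketch assuming the pyopencl bindings; the same kernel source runs unchanged on both the NVIDIA and the AMD box, which is what makes it handy for that kind of comparison):

Code:
import numpy as np
import pyopencl as cl

n = 1_000_000
a = np.random.rand(n).astype(np.float32)
b = np.random.rand(n).astype(np.float32)

ctx = cl.create_some_context()              # picks a platform/device (or set PYOPENCL_CTX)
queue = cl.CommandQueue(ctx)

mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

prg = cl.Program(ctx, """
__kernel void add(__global const float *a, __global const float *b, __global float *out) {
    int gid = get_global_id(0);
    out[gid] = a[gid] + b[gid];
}
""").build()

prg.add(queue, (n,), None, a_buf, b_buf, out_buf)   # time this call for a crude benchmark

out = np.empty_like(a)
cl.enqueue_copy(queue, out, out_buf)
print("max error:", float(np.abs(out - (a + b)).max()))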



Speaking of the argument, I haven't heard any mention of Intel's cards yet.

Developer of The Wizard's Grave Android game. Discussion Thread:
http://www.rpgwatch.com/forums/showthread.php?t=22520
Last edited by Lucky Day; July 7th, 2010 at 02:22.
Lucky Day (#37)

July 7th, 2010, 02:39
Beware of criticizing anything JDR may like; you'll never hear the end of it.
Thrasher (#38)

July 7th, 2010, 04:20
Originally Posted by Lucky Day View Post
Speaking of the argument, I haven't heard any mention of Intel's cards yet.
Meh… Who needs Intel?



Originally Posted by Thrasher View Post
Beware of criticizing anything JDR may like; you'll never hear the end of it.
I don't think he has anything to worry about; he doesn't seem opposed to trading factual information.
JDR13 (#39)

July 9th, 2010, 02:51
Or discussing the real power of video cards: general-purpose GPU programming!

Lucky Day (#40)