PCG tested R9 Fury X

joxer

The Smoker
Original Sin Donor
Original Sin 2 Donor
Joined
April 12, 2009
Messages
23,459
My post from another thread…
Just to say, while it's a bit off-topic: AMD revealed the Quantum and Fury lines. Quantum isn't interesting at all, but Fury?
Based on a leak (take it with a grain of salt), the Fury and the later Fury 2x, supposedly twice as fast, are meant to eat the Titan for breakfast:
http://www.guru3d.com/news-story/ra...an-x-and-fury-to-gtx-980-ti-3dmark-bench.html

So definitely stick with the GTX 770 a while longer and don't upgrade yet; even if these benchmarks are fabricated, the prices of other cards will drop soon.

Seems the Fury X is about as good as a GTX 980, but keep in mind that the drivers aren't optimized for it yet, so its performance can only improve over time. The upcoming DX12 will certainly help it as well.
http://www.pcgamer.com/amd-radeon-r9-fury-x-tested-not-quite-a-980-ti-killer/

Now… where are those huge price drops on slightly older GPUs? :D
 
Joined
Apr 12, 2009
Messages
23,459
Looks like the same story I'm used to seeing about a new AMD graphics card. It's close, but no cigar.
 
Joined
Oct 21, 2006
Messages
39,413
Location
Florida, US
Bit disappointed as well. I was hoping for good performance and pricing from AMD to keep Nvidia in check, but it doesn't look that way so far.
 
Joined
Oct 8, 2009
Messages
4,425
Location
UK
It's a joke that the cards only have 4 GB of memory. However, I guess this is a test "card" for the new memory type; once they get HBM under control, I think they have a lot of potential. But they need to get something good out with it quickly, before NVIDIA is ready with its counter-move!
 
Joined
Oct 25, 2006
Messages
6,292
Gamestar also ran several tests and came to similar results.

Benchmarks without AA and with AA, average:
http://www.gamestar.de/hardware/gra...087409,2.html#benchmarks-ohne-kantenglaettung

4k Benchmarks:
http://www.gamestar.de/hardware/gra...eon_r9_furyx,928,3087409,3.html#4k-benchmarks

And they also have a video where they discuss it, but it's entirely in German:
http://www.gamestar.de/index.cfm?pid=1589&pk=83440&ci=search&search=fury

So overall, at "normal" resolutions it performs about the same as the GTX 980.
At 4K it shines and gets almost as good as the GTX 980 Ti, which, however, costs the same.

So in comparison the GTX 980 Ti is more flexible and in addition supports newer DirectX 12 features which the Fury card does not (but which will most likely stay irrelevant).

I think the most interesting information can be taken from the discussion in the video and from this table:
http://www.gamestar.de/hardware/gra...on_r9_furyx,928,3087409.html#technische-daten

It shows that even though the Fury runs at only about a quarter of the memory clock of the GTX cards, it compensates for that with the new memory architecture (the memory interface is more than 10 times as wide).
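To put some rough numbers on that (peak theoretical bandwidth only; the data rates below are the commonly quoted specs rather than figures from the Gamestar table, so treat them as my own assumption):

```python
# Rough back-of-the-envelope memory bandwidth comparison.
# bandwidth (GB/s) = effective data rate per pin (Gbit/s) * bus width (bits) / 8

cards = {
    # name: (effective data rate in Gbit/s per pin, bus width in bits)
    "R9 Fury X (HBM1, 500 MHz)":  (1.0, 4096),  # 500 MHz double data rate ~ 1 Gbit/s per pin
    "GTX 980 (GDDR5, 7 Gbps)":    (7.0, 256),
    "GTX 980 Ti (GDDR5, 7 Gbps)": (7.0, 384),
}

for name, (rate_gbps, bus_bits) in cards.items():
    bandwidth = rate_gbps * bus_bits / 8  # GB/s
    print(f"{name}: ~{bandwidth:.0f} GB/s")
# -> ~512 GB/s for the Fury X vs ~224 / ~336 GB/s for the GTX 980 / 980 Ti
```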

That is where we can see the potential of the new cards. Keep in mind, though, that while the Fury X uses first-generation HBM, the new NVIDIA cards will come straight with the second generation of this memory next year.

What the tests do not show yet, though, is how much performance can be gained through overclocking.
And considering that the clock speed is currently extremely low, it will probably benefit more from overclocking than the NVIDIA cards.
 
Joined
Jun 2, 2012
Messages
4,699
I don't think AMD has done too well here. Of the new range, most of the cards are rehashes of older cards, and the Fury doesn't compete well with Nvidia. I was hoping they would show some really competitive new technology. I hope they can still come up with something for next year. They are not in great shape, and I'd hate to see them exit the market.
 
Joined
Nov 8, 2014
Messages
12,085
But they test their games at "low" resolutions. Once you actually need 4K, the card is decent.

In any case - right now there is not a single reason to get this one instead of a 980 Ti.
But let's wait for some overclocking tests.
 
Joined
Jun 2, 2012
Messages
4,699
Well, it's not 4k aka 3840×2160, which is where this card "shines"

They have a 4K section, which concludes: the GTX 980 Ti is 7% faster than the R9 Fury X, and the R9 Fury X is 38% faster than the R9 290X.
 
Joined
Jun 2, 2012
Messages
4,699
That may be, but 4K is mostly just for show right now. We're probably still at least a year away before it starts to become mainstream.
 
Joined
Oct 21, 2006
Messages
39,413
Location
Florida, US
While that may be true, anything more than a GTX 970 is probably just for "show" right now as well, including everything with more than 4GB of RAM.

Well, and then there is VR.
 
Joined
Jun 2, 2012
Messages
4,699
A lot of the newer games need a 970 or higher if you want to run with all the bells and whistles and still get around 60 FPS. Especially if you're playing with anti-aliasing and ambient occlusion.

I agree about the RAM though. There's nothing out there right now that needs more than 4GB of VRAM for 1080p resolutions.
 
Joined
Oct 21, 2006
Messages
39,413
Location
Florida, US
Well, it's not 4k aka 3840×2160, which is where this card "shines"

It does not "shine" at all (just as the 980 Ti and Titan X do not shine either). Have you looked at the settings at 4K? They have turned off AA or turned down the detail settings in most of the games to get somewhat playable fps. How stupid is that?
Given the choice, you should always want to play at a resolution that allows for the maximum possible quality settings.
All of the current cards that are being advertised as "4K ready" cards are in reality 1440p cards because that is the resolution where the cards really shine (at least most of the time).
Real "4K ready" cards are still at least one generation, maybe two, away.

There are many reviews out there, of course, but I found that the HardOCP review summed up very well why the Fury X is so disappointing.

Here's a snip on the 4K gaming debate:
Who is Built for 4K? You Tell Us

To make a video card that is built for 4K gaming some basic things need to happen. First, the GPU must be fast, it must be able to handle high resolutions and pump out the performance needed to push 4K resolution. In terms of pixels, 4K is 8,294,400 pixels. Compare that to 1440p's pixel mass of 3,686,400.

Part of the GPU specification that helps push these pixels are known as the ROPs. NVIDIA scaled up its ROPs with the GeForce GTX 980 Ti and TITAN X up to 96 ROPs. AMD however did not scale up its ROPs compared to the 290X/390X, it is still at 64 ROPs.

NVIDIA sought to up the GeForce GTX 980 Ti to 6GB and the TITAN X to 12GB. This is where a video card needs to be if you are aiming for a video card that is quote: "built for 4K." No current amount of memory bandwidth is going to overcome the physical limitation of VRAM.

NVIDIA also made sure not only DisplayPort 1.2 is on board for 4K 60Hz support but also HDMI 2.0 which is needed for HDMI support of 4K at 60Hz. AMD removed the DVI connections, but then only gave us HDMI 1.4, with no 4K 60Hz support. If, in our opinion, you are going to remove I/O connection options then at least support the latest versions of each connection for the latest resolution and refresh rate support. In this case that would be DisplayPort 1.3 and HDMI 2.0, if only AMD had done that, it would have felt like a more capable 4K card for displays.

Finally, performance must be there. Your video card must perform. So far, in our testing, the AMD Radeon Fury X trails performance of the GeForce GTX 980 Ti even though it is the same price. Add all these facts up, and you tell us which video card is "built for 4K gaming?"
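Just for reference, the pixel counts in that quote are simply width x height; a quick sketch of the arithmetic (nothing card-specific assumed):

```python
# Pixel counts for the resolutions discussed above.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels")  # 4K -> 8,294,400; 1440p -> 3,686,400

# 4K pushes 2.25x the pixels of 1440p and 4x the pixels of 1080p.
print(3840 * 2160 / (2560 * 1440))  # 2.25
print(3840 * 2160 / (1920 * 1080))  # 4.0
```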
 
Joined
Oct 18, 2006
Messages
3,201
That really depends. I personally don't have a 4K monitor, but from what I have read several times, you normally don't use AA at 4K, as you cannot see the edges anyway.
 
Joined
Jun 2, 2012
Messages
4,699
That really depends. I personally don't have a 4K monitor, but from what I have read several times, you normally don't use AA at 4K, as you cannot see the edges anyway.

I believe people were already using that argument years ago when we were moving to 1280x1024 on 15" LCDs ;).

It is not really true, though. First, screen diagonals are also increasing. Many 4K displays are in the 30" to 34" range, while the "standard" full HD display is 24". Sure, you might not really need AA at 4K on a 24" LCD (though it might still make sense to use at least a basic level of AA… more on that below), but on a 32"+ screen it can be an entirely different story.

Secondly, no matter how high the resolution, you should always enable at least 2x AA for a smoother picture. There is always a certain level of jaggies simply because textures are never 100% seamless. With high anisotropic filtering (16x AF), which makes textures sharper and crisper, you get a certain amount of "noise" in the image even at very high resolutions. That is why you will always want to take advantage of the smoothing effect of anti-aliasing: you get less flickering and a much nicer, smoother image (for example, when you are standing in tall grass in a game like Skyrim and looking around).

Finally, AA on/off is just part of the issue. If you look at the 4K benchmarks on just about any site, you will see that they turn off more than just AA (or only use blurry FXAA at best). They often turn down the settings too, from ultra to high or even down to medium/low. I just think it's really dumb to game at 4K with reduced settings when you can game at 1080p/1440p in all its glory.
I will be upgrading to 4K as soon as GPUs become available that can really handle 4K with the highest in-game quality settings and push 60fps across the vast majority of (then-modern) games, but not before that.
What for? Bragging rights? "Yo man, I'm gaming at 4K hohoho" ("but what I'm not telling you is that I have to turn down the settings in every game so I can actually run shit").
Yeah. Stupid.
 