Hunk'a Memory Video Card? - RPGWatch Forums
RPGWatch Forums » General Forums » Tech Help » Hunk'a Memory Video Card?

Default Hunk'a Memory Video Card?

August 30th, 2015, 21:01
With DirectX 12, does it now make sense to buy a video card purely for its memory? For instance, you could drop in a GT 740 with 4GB of GDDR5 for just $100. That's a huge memory boost for ultra-high-res textures! (At least for a DirectX 12 game that wants them or allows mods.)
Zloth is offline

Zloth

Zloth's Avatar
I smell a… wumpus!?

#1

Join Date: Aug 2008
Location: Kansas City
Posts: 7,637
Mentioned: 34 Post(s)

Default 

August 30th, 2015, 21:12
There are only about a dozen games, released or in development, that will use DirectX 12 in the near future: http://www.pcgameshardware.de/Direct…tizen-1164994/

So I guess unless you are playing one of those games, the question is irrelevant anyway.

And on top of that, even the games which support DX12 have to be specially configured to make use of the stacked memory/GPUs.

I am pretty sure that it won't make much sense for a while.
--
Doing Let's Plays Reviews in English now. Latest Video: Encased
Mostly playing Indie titles, including Strategy, Tactics and Roleplaying-Games.
And here is a list of all games I ever played.
Kordanor is offline

Kordanor

Kordanor's Avatar
Wastelander

#2

Join Date: Jun 2012
Posts: 4,320
Mentioned: 45 Post(s)

Default 

August 30th, 2015, 21:22
DX12 allows for the use of multiple mismatched GPUs, but, as far as I know, each GPU will handle a percentage of each frame, depending on its power. So, a weak card would only handle a small portion of the frame when paired with a more powerful card. If each card had 4 GB of VRAM, the VRAM on the weaker card would only be used for its small slice of the rendering, and the more powerful card would have to stretch its 4 GB for the larger portion of the scene. So, the memory is not combined efficiently when cards are mismatched in power.
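A rough back-of-the-envelope sketch of the split described above (all numbers are invented for illustration, not measurements of any real card pairing):

```python
# Hypothetical illustration of split-frame rendering with mismatched GPUs.
# Each card renders a share of the frame proportional to its compute power,
# so its VRAM demand for frame-local resources scales with that share.

def frame_shares(power_a, power_b):
    """Return the fraction of each frame assigned to cards A and B."""
    total = power_a + power_b
    return power_a / total, power_b / total

# Example: a strong card paired with a weak one (arbitrary power units).
strong, weak = frame_shares(power_a=8.0, power_b=2.0)
print(f"Strong card renders {strong:.0%} of the frame, weak card {weak:.0%}")

# Frame-local VRAM demand splits with the share, but shared assets such as
# textures may need to live on BOTH cards, so two 4 GB cards do not simply
# add up to 8 GB of usable memory.
frame_local_gb = 1.0   # hypothetical per-frame working set
print(f"Strong card holds ~{strong * frame_local_gb:.1f} GB of frame-local "
      f"data, weak card ~{weak * frame_local_gb:.1f} GB")
```

The point of the sketch: the weak card's VRAM is mostly idle, while the strong card still has to fit the bulk of the scene into its own memory.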
Ripper is offline

Ripper

Ripper's Avatar
Бажаю успіху

#3

Join Date: Nov 2014
Posts: 11,256
Mentioned: 120 Post(s)

Default 

August 30th, 2015, 21:55
In the case of VRAM, investing in it won't make any difference anywhere at all. Shadow of Mordor, which wants 6GB of VRAM for ultra textures, will remain a pathetic grinder. Wolfenstein, which eats up 50 gigs of your HDD space, will still look low-res, since its huge unoptimized textures are mostly garbage. Etc.

Sure, there will be a game or two soon that looks drastically different on 2GB versus 4GB of VRAM.
But IMO buying something extremely expensive for the sake of just a few games doesn't make sense.

I've said it before: if you have so much money that you won't feel an itch from buying a pair of MSI 980 Ti Lightning cards to overclock and pair in a machine - do it. If, however, your budget has a limit, have patience; prices will drop.
Instead of investing in abnormally priced VRAM you won't use much today, how about buying yourself a new SSD instead? An SSD is something you *will* use. Frequently.
--
Toka Koka
joxer is offline

joxer

joxer's Avatar
The Smoker
Original Sin 1 & 2 Donor

#4

Join Date: Apr 2009
Posts: 23,468
Mentioned: 230 Post(s)

Default 

August 30th, 2015, 22:37
@Joxer: I don't think you got his point. His idea was to throw in a cheap and weak card, just for the additional VRAM.
--
Doing Let's Plays Reviews in English now. Latest Video: Encased
Mostly playing Indie titles, including Strategy, Tactics and Roleplaying-Games.
And here is a list of all games I ever played.
Kordanor is offline

Kordanor

Kordanor's Avatar
Wastelander

#5

Join Date: Jun 2012
Posts: 4,320
Mentioned: 45 Post(s)

Default 

August 30th, 2015, 22:53
Jesus Christ, yes, I totally missed the point. Sorry.
After rereading his post - basically it comes down to one GPU using another's VRAM.

I'm not sure what to say, as theoretically, with proper drivers, it could work - if DX12 allows such a thing (and from what I've read it doesn't, which still means nothing, as I never researched the material thoroughly).

We need someone who has read all the material about DX12 to answer this one.
Or perhaps ask on Tom's Hardware?
--
Toka Koka
joxer is offline

joxer

joxer's Avatar
The Smoker
Original Sin 1 & 2 Donor

#6

Join Date: Apr 2009
Posts: 23,468
Mentioned: 230 Post(s)

Default 

August 30th, 2015, 23:12
It doesn't work that way, and it would defeat the point of VRAM anyway. The whole point is that the video card has its own dedicated RAM directly connected to the GPU over a high-speed bus.
Ripper is offline

Ripper

Ripper's Avatar
Бажаю успіху

#7

Join Date: Nov 2014
Posts: 11,256
Mentioned: 120 Post(s)

Default 

August 31st, 2015, 02:14
Hmmm, good point Ripper. Games have been able to use main memory as a 'backup' to video memory since DirectX 10 or something. Would video card A be better off tossing the extra textures into video card B's memory or (assuming there is some spare) just putting them in main memory? Either way, the data has to go through card A's PCIe port, but is it better to go through the second card's PCIe port or to use the slower main memory?

Tom's Hardware might be a good idea… or maybe the 3D Gurus. Jeez, I haven't been there in centuries.
Zloth is offline

Zloth

Zloth's Avatar
I smell a… wumpus!?

#8

Join Date: Aug 2008
Location: Kansas City
Posts: 7,637
Mentioned: 34 Post(s)

Default 

August 31st, 2015, 03:06
The key thing is that once a graphics card is forced to use memory off the card, stuttering will be the result. The bandwidth of a PCIe slot tops out around 30 GB/s at the absolute max. The bandwidth between the GPU and its VRAM is more like 300 GB/s on a decent card. Once the card is forced to go through PCIe to find more memory, you already have a problem.
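The bandwidth gap above can be put in frame-time terms with some simple arithmetic (using the round figures quoted in this thread, and a hypothetical texture working set):

```python
# Rough transfer-time comparison using the approximate figures from the post.
PCIE_GBPS = 30.0    # quoted ceiling for a PCIe slot, GB/s
VRAM_GBPS = 300.0   # typical GPU <-> VRAM bandwidth on a decent card, GB/s

def transfer_ms(size_gb, bandwidth_gbps):
    """Milliseconds needed to move size_gb at the given bandwidth."""
    return size_gb / bandwidth_gbps * 1000.0

texture_gb = 0.5  # hypothetical texture set fetched mid-frame
print(f"Over PCIe:       {transfer_ms(texture_gb, PCIE_GBPS):.1f} ms")
print(f"From local VRAM: {transfer_ms(texture_gb, VRAM_GBPS):.1f} ms")

# At 60 fps the whole frame budget is ~16.7 ms, so a mid-frame PCIe fetch
# of this size would eat the entire budget by itself -> visible stutter.
```

That tenfold difference is why spilling past local VRAM hurts regardless of how much memory sits on the far side of the bus.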

Also bear in mind that there is nothing special about the VRAM itself (though it's usually a bit faster than system RAM) - the main thing is the speed at which it communicates with the GPU, using the dedicated memory controllers and bus on the card.

I would virtually guarantee that DX12 will not try to facilitate cards using each other's VRAM. What it does allow for is split-frame rendering, so that if you had two equal cards, they would each use their own VRAM to render half the scene - effectively combining their VRAM total, but never actually accessing each other's memory.
Last edited by Ripper; August 31st, 2015 at 03:19.
Ripper is offline

Ripper

Ripper's Avatar
Бажаю успіху

#9

Join Date: Nov 2014
Posts: 11,256
Mentioned: 120 Post(s)
+1:

Default 

September 1st, 2015, 02:36
Stuttering shouldn't be a thing anymore. Under DX12, the game (or the game's engine) is responsible for getting all the resources to the right place at the right time. That only helps if the game/engine developers can do a better job with their specific situation than driver developers have done with the general case, though.

Reading up more on this "explicit multi-adapter" technology, it does sound like what you're saying, Ripper. Some textures and such will need to be on both cards, but not all of them, depending on what's being rendered. (If the secondary GPU is just rendering the GUI for the game, then it might have completely unique memory contents… but it sure isn't going to need a whole gigabyte.)
Zloth is offline

Zloth

Zloth's Avatar
I smell a… wumpus!?

#10

Join Date: Aug 2008
Location: Kansas City
Posts: 7,637
Mentioned: 34 Post(s)

Tags
directx 12, memory, video