"Nextgen" GPUs
May 26th, 2016, 01:14
It looks like the entire nVidia Pascal line-up has leaked via AIDA64.
The TL;DR is:
GP-100 = Tesla P100 professional GPU (that one has been available for a while already) and Quadro (TBA)
GP-102 = Titan, GTX 1080Ti (ETA early 2017)
GP-104 = GTX 1080/1070 (May 27/June 10 2016)
GP-106 = GTX 1060 (ETA fall 2016)
GP-107/GP-108 = GTX 1050 and OEMs (TBA)
May 26th, 2016, 01:22
Also the first GTX 1070 Benchmark is available now (leak as well):
http://videocardz.com/60265/nvidia-g…ike-benchmarks
--
Doing Let's Plays Reviews in English now. Latest Video: Encased
Mostly playing Indie titles, including Strategy, Tactics and Roleplaying-Games.
And here is a list of all games I ever played.
May 26th, 2016, 11:41
Originally Posted by Kordanor:
Also the first GTX 1070 Benchmark is available now (leak as well):
http://videocardz.com/60265/nvidia-g…ike-benchmarks

Any leaks about the cost of that card?
May 26th, 2016, 12:02
Originally Posted by Kordanor:
Also the first GTX 1070 Benchmark is available now (leak as well):
http://videocardz.com/60265/nvidia-g…ike-benchmarks

Hmm, maybe I should flog my GTX 970 and get a 1070. Looks like a 40% performance boost. People are saying the 1070 might cost ~£340 and the 1080 ~£540.
May 26th, 2016, 13:22
Originally Posted by GothicGothicness:
Any leaks about the cost of that card?

NVIDIA already announced it (the MSRP) at the same time as the 1080. Price for the 1070:
$379 US for the "normal" edition
$450 US for the Founders Edition
Most people are pretty sure there won't be any sub-$450 version at launch though; $450 is the price you pay even for the version with a cheap fan and crappy overclocking capability.
--
It's developer is owned by Sony which means it'll remain a hostage of inferior hardware. ~ joxer
Last edited by azarhal; May 26th, 2016 at 14:07.
SasqWatch
Original Sin Donor
May 26th, 2016, 13:31
Originally Posted by azarhal:
NVIDIA already announced it (the MSRP) at the same time as the 1080:
$379 US for the "normal" edition
$450 US for the Founders Edition

That does not make sense. In pounds that is ~£260 for the 1080? They are selling the GTX 970 for around that price here in the UK!
May 26th, 2016, 14:07
Originally Posted by lostforever:
That does not make sense. In pounds that is ~£260 for 1080? They are selling GTX 970 for around that price here in UK!

That's for the 1070, not the 1080.
--
It's developer is owned by Sony which means it'll remain a hostage of inferior hardware. ~ joxer
SasqWatch
Original Sin Donor
May 26th, 2016, 14:10
Originally Posted by lostforever:
That does not make sense. In pounds that is ~£260 for 1080? They are selling GTX 970 for around that price here in UK!

I think that when they quote the MSRP in $, for the UK you will need to add VAT when calculating how much it will be over here. So I reckon the 1070 will be around £310-£320 for the non-Founders edition. Founders Edition 1080s are going on sale at around £635 at the moment.
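For what it's worth, the arithmetic behind that estimate looks roughly like this (the exchange rate and the assumption that US MSRPs exclude sales tax are mine, not from the thread):

```python
# Rough sketch of the UK price estimate above. Assumptions:
# ~0.69 GBP per USD (mid-2016 rate) and 20% UK VAT; US MSRPs
# are quoted before sales tax, UK prices include VAT.
USD_TO_GBP = 0.69
UK_VAT = 0.20

def uk_price(usd_msrp):
    """Convert a pre-tax US MSRP to an approximate UK retail price in GBP."""
    return usd_msrp * USD_TO_GBP * (1 + UK_VAT)

print(round(uk_price(379)))  # non-Founders GTX 1070 -> ~314
print(round(uk_price(450)))  # Founders Edition GTX 1070 -> ~373
```

Which lands right in the £310-£320 range quoted above for the $379 card.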
--
“We've got facts," they say. But facts aren't everything; at least half the battle consists in how one makes use of them!”
― Fyodor Dostoyevsky, Crime and Punishment
May 26th, 2016, 19:25
Originally Posted by lostforever:
Hmm may be I should flog my GTX 970 and get 1070. Looks like 40% performance boost.

If you put it like that, it's not correct. When you speak about a boost, you should start from the component which is boosted. If the 970 delivers 58% of the GTX 1070's performance, then the GTX 1070 is at 172% of the 970's performance. So the 1070 is 72% faster than the 970, not 40%.
And of course that is single screen only. In VR you'd get an additional boost due to the new multiscreen technology.
As soon as you can see one of the cards in your local shop, you should be able to extrapolate its price to the other card as well.
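A one-liner makes the point about the direction of the ratio (the 58% figure is just the one quoted above; real benchmark ratios vary):

```python
# If the slower card delivers a fraction `ratio` of the faster card's
# performance, the faster card's speedup is 1/ratio - 1, not 1 - ratio.
def speedup(ratio):
    """Relative speedup of the faster card over the slower one."""
    return 1.0 / ratio - 1.0

# GTX 970 at 58% of a GTX 1070 (figure from the post above):
print(f"{speedup(0.58):.0%}")  # ~72% faster, not 42%
```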
--
Doing Let's Plays Reviews in English now. Latest Video: Encased
Mostly playing Indie titles, including Strategy, Tactics and Roleplaying-Games.
And here is a list of all games I ever played.
May 27th, 2016, 15:11
Total War got a beta DX12 patch and some tests were done. They highlight the CPU bottleneck of the game, show that DX12 helps everyone get better FPS, and show what happens with a GTX 1080…
--
It's developer is owned by Sony which means it'll remain a hostage of inferior hardware. ~ joxer
SasqWatch
Original Sin Donor
May 27th, 2016, 15:28
"Hammering" the CPU means only one thing, and that's no bloody code optimization!
And some people in the comments say they run the game without a glitch (except some flicker bug) on a stock i7-4790K at 1080p with ultra settings.
But even if this PCG test is to be accepted as a reason to buy something, I don't want to pay $350 for an Intel Core i7-6700K that runs at 4 GHz by default and has four hyperthreaded cores just to run one game's messy executable. One game that I don't even care about.
There is more to this story, important as we're mostly about RPGs here.
No one seems to remember that Risen 3 had a bug on release that prevented it from working on these chest-busting and super-expensive i7 CPUs. It was patched later.
Also, the FF10/10-2 port tends to crash on i7 machines.
So no, man. I'll stick to my good old i5-4670K as long as I can.
--
Toka Koka
May 27th, 2016, 15:53
@azarhal and joxer,
I've seen similar issues reported with Fallout 4. Apparently some of the video tasks associated with shadows, and perhaps some other video functions, are programmed to be done by the CPU instead of the GPU. People report that the game runs smoother and faster on an i7 compared to an i5 (it also runs smooth and fast on my cheap eight-core AMD FX-8350, btw).
I sure wouldn't buy a new CPU and/or motherboard for any single game either. I think an extra eight gigs of fast RAM might help, and RAM is cheap these days. For new computer purchasers, more cores are probably desirable, as some game developers may make a habit of assigning more video tasks to the CPU in the future.
Regards.
__
Guest
May 27th, 2016, 16:05
Oddly, I haven't experienced that F4 issue.
Everything as high as possible, and the GPU is a GTX 760. I didn't feel a single frame drop; the game was stuck at 60 the whole time (because my monitor doesn't go over 60; yes, I know, I need a new one).
Then again, I didn't run the game at 4K, nor was I using mods.
The gaming community is definitely the one pushing new technologies forward.
But this doesn't mean we paid Intel to try (and fail) in the phone market with Atom.
If Intel (or AMD) can't provide a good enough CPU at an acceptable price for home use, screw them. I'm not designing houses that need real-time rendering, nor am I doing supercomputer-sized calculations at home.
And I certainly won't die if I can't run one game whose devs believe that modern CPUs have 20 cores and run at 6725235864237852785432 GHz.
--
Toka Koka
May 27th, 2016, 16:14
Originally Posted by joxer:
Oddly I haven't experienced that similar F4 issue.
Everything as high as possible, and GPU is GTX760. Not a single framedrop I felt, the game was stuck at 60 all the time (was because my monitor doesn't go over 60, yes, I know, I need a new one).

I've also seen reports that F4's programming caps the game at 60 FPS no matter the capabilities of the GPU and monitor. In my case I can run the GPU and monitor at higher frame rates, but with F4 I only see anything higher than 60 FPS on loading screens.
As to higher frame rates in general, I'm not sure they usefully accomplish anything other than more heat anyway.
__
Guest
May 27th, 2016, 16:28
I've noticed a difference in Sims 4 at rates higher than 60; I can only imagine the difference in RPGs or shooters.
A visual difference, not heat.
But well, I don't have a monitor of my own for that stuff yet. With the new GPU I'll probably buy the VG248QE I've been boring all my friends with for years now.
--
Toka Koka
May 27th, 2016, 16:50
I bought a gaming monitor a few months ago (BenQ RL2460HT), it's 60GHz refresh rates but my GPU is a GTX 770, a game is running real smooth when I get 30-40FPS. lol
I really need to upgrade that GPU this year, Nvidia already put them on legacy driver support, lol.
--
It's developer is owned by Sony which means it'll remain a hostage of inferior hardware. ~ joxer
SasqWatch
Original Sin Donor
May 27th, 2016, 19:46
Originally Posted by azarhal:
I bought a gaming monitor a few months ago (BenQ RL2460HT), it's 60GHz refresh rates but my GPU is a GTX 770, a game is running real smooth when I get 30-40FPS. lol

Uhm… got to disappoint you: it's 60Hz, meaning 60 pictures a second, which has been the standard for a long, long time.
And as it's only 1920 × 1080, you should actually be pretty fine with a GTX 770 at the moment.
30-40 FPS is certainly not great, but it should be OK if it's not an action game.
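Since refresh rate and frame time come up here, a quick sketch of the relationship (the rates below are just examples):

```python
# A display's refresh rate fixes how often it draws a new picture:
# a 60 Hz panel refreshes every 1000/60 ≈ 16.7 ms, no matter how
# many frames per second the GPU actually produces.
def frame_time_ms(rate_hz):
    """Milliseconds between refreshes (or between frames, for an FPS figure)."""
    return 1000.0 / rate_hz

for rate in (30, 40, 60, 120):
    print(f"{rate:>3} Hz -> {frame_time_ms(rate):.1f} ms")
```

So a GPU delivering 30-40 FPS is producing a new frame only every 25-33 ms, and a 60 Hz panel simply repeats the last one in between.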
--
Doing Let's Plays Reviews in English now. Latest Video: Encased
Mostly playing Indie titles, including Strategy, Tactics and Roleplaying-Games.
And here is a list of all games I ever played.
May 27th, 2016, 20:03
Originally Posted by Kordanor:
Uhm… got to disappoint you: it's 60Hz, meaning 60 pictures a second.

Disappoint me? How? Oh, the typo. Sorry. Yeah, 60Hz, not 60GHz.
--
It's developer is owned by Sony which means it'll remain a hostage of inferior hardware. ~ joxer
SasqWatch
Original Sin Donor
May 27th, 2016, 20:07
More the fact that 60 Hz is what every monitor and its mom has.
There are some 120Hz monitors, but those are still rather rare.
--
Doing Let's Plays Reviews in English now. Latest Video: Encased
Mostly playing Indie titles, including Strategy, Tactics and Roleplaying-Games.
And here is a list of all games I ever played.
May 27th, 2016, 20:18
Originally Posted by Kordanor:
More the fact that 60 Hz is what every monitor and its mom has.
There are some 120Hz monitors, but those are still rather rare.

And I have no problem with that; I don't have a setup that would require anything higher, the reviews for the monitor were good, and the price was in my budget. Maybe I should have mentioned that I upgraded from an old crappy 21-inch LCD monitor. I'm very happy with my purchase.
--
It's developer is owned by Sony which means it'll remain a hostage of inferior hardware. ~ joxer
SasqWatch
Original Sin Donor