RPGWatch Forums

RPGWatch Forums (https://www.rpgwatch.com/forums/index.php)
-   Tech Help (https://www.rpgwatch.com/forums/forumdisplay.php?f=25)
-   -   RTX 2080 ti specs leaked? (https://www.rpgwatch.com/forums/showthread.php?t=40580)

SirJames August 19th, 2018 01:03

RTX 2080 ti specs leaked?
 
NVIDIA’s newest flagship graphics card is a revolution in gaming realism and performance. Its powerful NVIDIA Turing™ GPU architecture, breakthrough technologies, and 11 GB of next-gen, ultra-fast GDDR6 memory make it the world’s ultimate gaming GPU.

The GeForce® RTX graphics cards are powered by the Turing GPU architecture and the all-new RTX platform. This gives you up to 6× the performance of previous-generation graphics cards and brings the power of real-time ray tracing and AI to games.
When it comes to next-gen gaming, it’s all about realism. GeForce RTX 2080 Ti is light years ahead of other cards, delivering truly unique real-time ray-tracing technologies for cutting-edge, hyper-realistic graphics.

NVIDIA® CUDA® Cores 4352
Core Clock 1350 MHz
Boost Clock 1545 MHz
Memory Amount 11GB GDDR6
Memory Interface 352-BIT
Memory Bandwidth 616 GB/s
TDP 285 W
SLI NVLink 2-way
Multi-Screen Yes
Max Resolution 7680 × 4320 @60Hz (Digital)
Power Input 2 x 8-pin
Bus Type PCI Express 3.0
Card Dimensions 1.73" x 12.36" x 5.04"
Width Dual slot
Box Dimensions 6.76" x 13.8" x 4.32"
Virtual Reality Ready Yes
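For what it's worth, the bandwidth figure in the leak is internally consistent: GDDR6 bandwidth is just bus width (in bytes) times per-pin data rate, and a 352-bit bus at 14 Gbps (the per-pin rate later confirmed in this thread) lands exactly on 616 GB/s. A quick sanity check:

```python
def bandwidth_gb_s(bus_bits: int, pin_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width / 8 bits per byte) * per-pin rate."""
    return bus_bits / 8 * pin_rate_gbps

# 352-bit bus * 14 Gbps per pin -> 616.0 GB/s, matching the leaked spec
print(bandwidth_gb_s(352, 14))
```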

JDR13 August 19th, 2018 01:07

Did you really need to copy the advertisement verbatim? ;)

Sounds pretty impressive, but each new generation makes similar boasts. I'm curious to see what the MSRP ends up being.

SirJames August 19th, 2018 05:49

It's supposed to be $1000 USD and I hear the non-TI version is $650 USD. Grain of salt, and all that.

Sure, each generation claims to be better than the last. It's not untrue. But these boasts are quite new. They claim to have a whole bunch of processing technologies going on inside these cards. Some "RT" core for ray-tracing. "Tensor" cores for AI deep learning and crypto mining. Plus the standard cuda cores that we'll actually use in gaming.

I'd say before we actually see ray-tracing used in games there will be better cards out. Still, it's exciting to see the first ever push for the technology. Start of the future!

Here's a little more advertising. ;)
Naturally, the professional cards come out first with the technology.


If they follow the same successful launch cycle as the gaming cards, it would be the 1080 first, then the 1070, then the 1060, then the 1080 Ti. I wouldn't count on a 1070 Ti again, but if there is one it would be last. Plus all the 1050 and 1030 stuff that no one should bother with is in there somewhere too.

deltaminds August 19th, 2018 10:41

Quote:

Originally Posted by SirJames (Post 1061523138)
It's supposed to be $1000 USD and I hear the non-TI version is $650 USD. Grain of salt, and all that.

Sure, each generation claims to be better than the last. It's not untrue. But these boasts are quite new. They claim to have a whole bunch of processing technologies going on inside these cards. Some "RT" core for ray-tracing. "Tensor" cores for AI deep learning and crypto mining. Plus the standard cuda cores that we'll actually use in gaming.

I'd say before we actually see ray-tracing used in games there will be better cards out. Still, it's exciting to see the first ever push for the technology. Start of the future!

Here's a little more advertising. ;)
Naturally, the professional cards come out first with the technology.


If they follow the same successful launch cycle as the gaming cards, it would be the 1080 first, then the 1070, then the 1060, then the 1080 Ti. I wouldn't count on a 1070 Ti again, but if there is one it would be last. Plus all the 1050 and 1030 stuff that no one should bother with is in there somewhere too.

Budget gaming depends on 1030 and 1050 you know?

SirJames August 19th, 2018 11:40

I wouldn't wish a 1030 on my worst enemy! :P

1050 Ti if you're really that poor and can't save up for another week or two, maybe.

joxer August 19th, 2018 12:46

Saw a few similar rumors where the new generation is supposedly 50% faster than the current one. Yea, right, maybe it is during bitcoin mining.
I suggest not trusting the "leak" nor buying that card.

Raytracing will need years to appear in games. CDpr would be the first to use it, then Square Enix will adopt it for FF16 on PC just like it happened with hairworks. Other companies don't want to invest in new technologies - EA is shifting to phones where hairworks won't happen in a million years, Activision Blizzard is onto patenting scamming methods and Ubisoft is more interested in gaas than upgrading their engine. Bethesda? I'll just say lol.

As CDpr is already releasing a game for the current GPU generation, I really see no reason to go for 2080ti unless you've jumped on the silly 4K hypewagon, bought 4K blurry 5+ms response videowall instead of 144Hz capable monitor and discovered that everything is either upscaled or a slideshow.

SirJames August 19th, 2018 13:59

Quite right, joxer. But imagine if the Tensor core was dedicated purely to hairworx! That's the sort of thing I think it could be used for in games. Sort of a co-processor, like the PhysX card was for physics.

My monitor is only 1080p but 144Hz and has GSYNC. Cost more than a cheap 4k monitor but it's been fantastic.

sakichop August 19th, 2018 17:17

Quote:

Originally Posted by joxer (Post 1061523158)
Saw a few similar rumors where the new generation is supposedly 50% faster than the current one. Yea, right, maybe it is during bitcoin mining.
I suggest not trusting the "leak" nor buying that card.

Raytracing will need years to appear in games. CDpr would be the first to use it, then Square Enix will adopt it for FF16 on PC just like it happened with hairworks. Other companies don't want to invest in new technologies - EA is shifting to phones where hairworks won't happen in a million years, Activision Blizzard is onto patenting scamming methods and Ubisoft is more interested in gaas than upgrading their engine. Bethesda? I'll just say lol.

As CDpr is already releasing a game for the current GPU generation, I really see no reason to go for 2080ti unless you've jumped on the silly 4K hypewagon, bought 4K blurry 5+ms response videowall instead of 144Hz capable monitor and discovered that everything is either upscaled or a slideshow.

Right on point as usual, joxer. I mean, you haven’t gamed until you’ve experienced a turn-based RPG at 144fps.:biggrin:

I switched to a 55 inch 4K oled from a 21:9, 3440x1440 gsync monitor and I’ll not go back.

Games have never looked better. Yes I lose some response time and it’s only 60hz but I don’t play competitive shooters or anything. It’s plenty fast enough for ARPG’s, TB and RTWP which is what I play.

I’m really interested in the Nvidia big displays. 65”, HDR, 120Hz with G-Sync. Sounds awesome, but I bet it will be crazy expensive. True 120Hz 4K TVs should come out next year anyway.

So I’ll definitely be picking up a 2080ti when available.

joxer August 19th, 2018 18:22

Quote:

Originally Posted by sakichop (Post 1061523180)
Right on point as usual, joxer. I mean, you haven’t gamed until you’ve experienced a turn-based RPG at 144fps.:biggrin:


SirJames August 20th, 2018 06:38

Heaps more 20 series leaks! Here's some prices. Not confirmed, of course!
Quote:

NVIDIA GeForce RTX 2080 – $500-$700 US (50% Faster Than 1080)

NVIDIA GeForce RTX 2070 – $300-$500 US (40% Faster Than 1070)

NVIDIA GeForce GTX 2060 – $200-$300 US (27% Faster Than 1060)

NVIDIA GeForce GTX 2050 – $100-$200 US (50% Faster Than 1050 Ti)
The 2070 is rumoured to have 2304 cores and the 2080 2944 cores, both on a 256-bit memory bus, same as the 10 series, with GDDR6 memory. The Ti will be the 384-bit memory one, as usual.

The RTX cards will have the tensor and ray tracing stuff.

The official announcement is expected at Gamescom in a couple of days time. :D

SirJames August 20th, 2018 06:46

Quote:

Originally Posted by sakichop (Post 1061523180)
Right on point as usual, joxer. I mean, you haven’t gamed until you’ve experienced a turn-based RPG at 144fps.:biggrin:

You know what's funny? When Into The Breach launched there were heaps of people complaining about the "60fps lock in 2018?!"

Probably trolling but there were a lot of them.

lostforever August 20th, 2018 18:19

IIRC, the 1070 was supposed to be equal to the old 980 Ti. Will that still hold? So do we get the new 2070 == 1080 Ti?

lostforever August 20th, 2018 18:22

Quote:

Originally Posted by sakichop (Post 1061523180)
I switched to a 55 inch 4K oled from a 21:9, 3440x1440 gsync monitor and I’ll not go back.

I take it you no longer game from the table/chair setup?

You don't miss the bigger game world from going from 3440x1440 to 4K? This would be my biggest issue against 4K.

sakichop August 20th, 2018 18:37

Quote:

Originally Posted by lostforever (Post 1061523395)
I take it you no longer game from the table/chair setup?

You don't miss the bigger game world from going from 3440x1440 to 4K? This would be my biggest issue against 4K.

I absolutely still sit in my computer chair. I find it more comfortable than the couch actually. I have a coffee table that the top extends up like a desk. When I want to game I just raise it and pull my chair up and use my M&K.

I actually didn’t like the 21:9 aspect ratio. Everything felt squished vertically. I like the 16:9 ratio better. As for the difference in resolution from 3440x1440 to 3840x2160, I can’t tell any difference really. I can really tell a difference from 1080p to 4K on the bigger screen though. I only sit 6-8 feet away though.

Mosaic August 20th, 2018 18:47

Thankfully the bleeding edge tech is still far outpacing the content we need it for. Maybe if you're really into VR?

sakichop August 20th, 2018 18:50

Quote:

Originally Posted by Mosaic (Post 1061523411)
Thankfully the bleeding edge tech is still far outpacing the content we need it for. Maybe if you're really into VR?

Depends on what you’re doing. I’ve done a few mod sessions for Skyrim that have brought my Titan X (Pascal) to its knees.

If they give us the power we’ll find a way to use it all.:biggrin:

lostforever August 20th, 2018 19:03

Quote:

Originally Posted by sakichop (Post 1061523403)
I absolutely still sit in my computer chair. I find it more comfortable than the couch actually. I have a coffee table that the top extends up like a desk. When I want to game I just raise it and pull my chair up and use my M&K.

I am finding it hard to picture this! I find 35 inch too big for a desk/chair setup, so I can't imagine what a 55 inch looks like :)

Quote:

Originally Posted by sakichop (Post 1061523403)
I actually didn’t like the 21:9 aspect ratio. Everything felt squished vertically. I like the 16:9 ratio better. As for the difference in resolution from 3440x1440 to 3840x2160, I can’t tell any difference really. I can really tell a difference from 1080p to 4K on the bigger screen though. I only sit 6-8 feet away though.

You say it felt squished, but it really isn't, right? The amount of world you see vertically is the same in both 21:9 and 16:9, but you see a lot more of the horizontal world in 21:9. That's what I mean: you see more in 21:9 compared to 16:9, not just more pixels.

Ragnaris August 20th, 2018 20:18

Well, prices were finally announced. And, as I predicted, the 2080 Ti will be $999.

RTX 2070: $499
RTX 2080: $699
RTX 2080 Ti: $999

RTX 2070 (Founder's Edition)
Boost clock: 1710 MHz (OC)
Frame buffer: 8 GB GDDR6
Memory speed: 14 Gbps

RTX 2070
Boost clock: 1620 MHz
Frame buffer: 8 GB GDDR6
Memory speed: 14 Gbps

RTX 2080 (Founder's Edition)
Boost clock: 1800 MHz (OC)
Frame buffer: 8 GB GDDR6
Memory speed: 14 Gbps

RTX 2080
Boost clock: 1710 MHz
Frame buffer: 8 GB GDDR6
Memory speed: 14 Gbps

RTX 2080 Ti (Founder's Edition)
Boost clock: 1635 MHz (OC)
Frame buffer: 11 GB GDDR6
Memory speed: 14 Gbps

RTX 2080 Ti
Boost clock: 1545 MHz
Frame buffer: 11 GB GDDR6
Memory speed: 14 Gbps
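Since every announced card runs its GDDR6 at 14 Gbps, the peak bandwidths follow directly from the bus widths. Note the 256-bit figure for the 2070/2080 is the rumoured one from earlier in the thread, not part of this announcement; only the Ti's 352-bit bus appeared in the leaked spec sheet.

```python
def bandwidth(bus_bits, pin_rate_gbps=14):
    """Peak GDDR6 bandwidth in GB/s at a given per-pin data rate."""
    return bus_bits / 8 * pin_rate_gbps

# Rumoured 256-bit bus for the 2070/2080, leaked 352-bit for the Ti
for name, bus in [("RTX 2070", 256), ("RTX 2080", 256), ("RTX 2080 Ti", 352)]:
    print(f"{name}: {bandwidth(bus):.0f} GB/s")
```

That works out to 448 GB/s for the non-Ti cards and 616 GB/s for the Ti, consistent with the spec sheet at the top of the thread.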

you August 20th, 2018 21:54

Too expensive. See me when AMD's solution arrives.

Couchpotato August 20th, 2018 22:00

Quote:

Originally Posted by you (Post 1061523442)
Too expensive. See me when AMD solution arrives.

Yeah, I went AMD myself. Only problem with that is the damn crypto miners who jack up the prices. I'm curious to see what Intel will release with their new graphics cards.

In my opinion more competition is better than just Nvidia and AMD.



Copyright by RPGWatch