Dhruin
SasqWatch
According to Rock, Paper, Shotgun, Call of Pripyat will support DX11, with a new trailer on offer, presumably to show that off.
More information.
Are there any cards out that support it already? I guess there are, if this is going to support it.
Yes, it looks like ATI is going to take the lead in certain market segments with the 5xxx series. The HD 5870 is rumored to be ~60% faster than the HD 4870. The price of the new gen is supposed to be $599 for the HD 5870X2 (two GPUs on one PCB), $379 - 399 for the HD 5870 and $279 - 299 for the HD 5850. The first significant price cut should occur around the time nVidia releases its G300-based cards.
I will be very surprised if those prices are accurate; they seem way too high. ATI has stayed competitive with nVidia over the last few years by offering cards with a great price/performance ratio.
Same here, for driver, compatibility, and power-consumption reasons. Moi is going to wait for the G300 before upgrading to a new card, but I wish ATI the best of luck, since nVidia needs a strong competitor. That being said, I still prefer nVidia cards.
Exactly. And for just that reason it makes me wonder why you would question the accuracy of the prices. Of course we're very, very deep in rumor territory here, but if the 60% performance-gain figure is accurate, then the prices sound about right to me.
Think about it. A card that performs ~60% better than an HD 4870 should be able to (almost) compete with the GTX 295. The GTX 295 is currently priced at ~$480 - 500. The HD 5870 is supposed to sell for just $379 - 399.
There you go: a great price/performance ratio, just as you stated. How did you figure that the prices "seem way too high"? Way too high compared to what?
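A quick back-of-the-envelope sketch of that comparison, using the rumored figures from this thread (performance normalized to an HD 4870 = 1.0; every number here is a rumor or rough estimate, not a benchmark):

```python
def perf_per_dollar(relative_perf, price_usd):
    """Relative performance (HD 4870 = 1.0) per dollar spent."""
    return relative_perf / price_usd

# Rumored/assumed figures -- illustrative only:
hd5870 = perf_per_dollar(1.6, 389)   # rumored +60% over HD 4870, $379-399
gtx295 = perf_per_dollar(1.7, 490)   # rough guess at its class, ~$480-500

print(f"HD 5870: {hd5870:.4f} perf/$")
print(f"GTX 295: {gtx295:.4f} perf/$")
```

Even granting the GTX 295 a slight performance edge, the rumored 5870 pricing still comes out ahead on value.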
No thanks. Pass on AMD's ATI for 10 years. Maybe by then their drivers won't try to create a black hole to destroy the universe.
Say what you will about nVidia's drivers. I've had my share of problems with them, but nothing on the scale of ATI's.
Even though it shares its first two letters with the GT200 architecture [GeForce Tesla], GT300 is the first truly new architecture since SIMD [Single Instruction, Multiple Data] units first appeared in graphics processors.
The GT300 architecture groups processing cores in sets of 32 - up from 24 in the GT200 architecture. But the bigger difference is that GT300 parts ways with the SIMD design that dominates today's GPU architectures. GT300 cores rely on MIMD-like [Multiple Instruction, Multiple Data] operation - all the units work in MPMD mode, executing simple and complex shader and compute operations on the fly. We're not exactly sure whether we should continue to use the terms "shader processor" or "shader core", as these units are now almost on equal terms with the FPUs inside the latest AMD and Intel CPUs.
GT300 itself packs 16 groups of 32 cores - yes, we're talking about 512 cores for the high-end part. That number alone raises GT300's computing power by more than 2x compared to the GT200 core. Before the chip tapes out, there is no way anybody can predict working clocks, but if the clocks remain the same as on GT200, we would have more than double the computing power.
If, for instance, nVidia gets a 2 GHz clock for the 512 MIMD cores, we are talking about no less than 3 TFLOPS single precision. Double precision is highly dependent on how efficient the MIMD-like units turn out to be, but you can count on a 6 - 15x improvement over GT200.
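Those figures check out as rough arithmetic. A minimal sketch, assuming (as on GT200) each core can retire 3 single-precision FLOPs per clock (MAD + MUL) and taking GT200's 240 cores as the baseline - everything here is speculation until real specs appear:

```python
def peak_gflops(cores, clock_ghz, flops_per_clock=3):
    """Theoretical peak single-precision throughput in GFLOPS."""
    return cores * clock_ghz * flops_per_clock

# Rumored GT300: 512 cores at a hoped-for 2 GHz shader clock.
gt300 = peak_gflops(512, 2.0)
print(f"GT300: ~{gt300 / 1000:.1f} TFLOPS single precision")

# The "more than 2x GT200" claim follows from core count alone:
print(f"Core-count ratio vs GT200: {512 / 240:.2f}x")
```

At equal clocks the 512-vs-240 core count alone gives ~2.13x, and a 2 GHz clock would push the theoretical peak just past 3 TFLOPS.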
I'm still using DX9, since XP doesn't support anything higher. I guess I don't know what I'm missing. I'm proudly a graphics whore, but Stalker looks and runs great with DX9 and the 2009 Complete mod @ 1920x1200 with graphics options on high. (8800GT)
So why all the hoopla about DX11, never mind 10?
Those may indeed be the MSRPs for those cards, but I'd be willing to bet that the actual street prices are lower.
The GTX 295 will be a card from the previous generation by the time the 5870 is released. I would *expect* a newer generation of cards to provide one that is both cheaper and as fast or faster. That's pretty much what happens every time a new generation of cards is introduced.
I thought that would be obvious, but perhaps you haven't been following ATI the last few years. Their 4xxx series was *much* cheaper upon introduction. Even the 4870, which was their high end product when it was released, debuted at $299.
But until then ATI is competing only with nVidia's current gen and the rumored pricing falls very much in line with what ATI has always been doing.
I wouldn't be surprised if they are going to have to really slash the prices once the G300 comes out because then ATI is obviously going to have to readjust and pick new targets based on the usual price/performance analysis of nVidia's new gen.
This is why I don't agree about the price/performance ratio. When the cheapest product of a series is still going to cost you nearly $300, those are enthusiast products/prices, and most enthusiasts are just going to wait and get the faster cards from nVidia.
The reason the 4xxx series was so successful was that semi-casual gamers could get cards in the $150-$200 range that still provided good performance. I think it's a huge mistake for ATI to abandon that market segment, which it looks like they're doing, unless they simply haven't revealed everything yet….