Patches successful?
December 7th, 2006, 23:24
Originally Posted by Thaurin
If you want anti-aliasing, you'd have to disable all post-processing. I think this is still a problem with most video cards (all but the very latest); you can't do HDR and AA at the same time, and I think the same goes for bloom and depth-of-field. However, I'm sure they could include a setting where you could enable just anti-aliasing.

That is not quite correct. ATI X1xxx graphics cards can do AA and "true" HDR (FP16) just fine. It's only nVidia cards that cannot render AA and FP16 HDR at the same time. However, both nVidia and ATI can usually combine AA with any other HDR implementation (non-FP16) or a "fake HDR" bloom effect. There is no true HDR in Gothic III. They are using bloom, so just about any nVidia or ATI card with pixel shader 2.0 capabilities or higher should be able to do both AA and the bloom effect. Why this is not possible in practice is one of the secrets of PB's engine programmers.
Depth of field has absolutely nothing to do with it, BTW. I think you need a pixel shader 3.0 card for the blur effect, but that's about it. Otherwise, I'm pretty sure that any shader 3.0 card would be perfectly able to do AA + HDR (or bloom) + DoF blur at the same time. It's questionable whether this would make sense, though, since all of these features require a lot of processing power. But let's think about it for a moment… do you really need AA if the textures in the distance get blurred out anyway, i.e. if any "jaggies" are made invisible by the blur effect? It certainly doesn't make sense.
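The "fake HDR" bloom discussed above needs no special framebuffer format, which is why it runs on plain shader 2.0 hardware. The idea can be sketched without a GPU at all: extract the pixels above a brightness threshold, blur them, and add the result back onto the frame. A minimal toy version (assuming a NumPy float image in [0, 1]; the function name and parameters are illustrative, not taken from the Gothic III engine):

```python
import numpy as np

def bloom(image, threshold=0.8, blur_passes=4, intensity=0.5):
    """Toy bloom post-process: bright-pass filter, cheap blur, additive blend.
    'image' is an HxWx3 float array with values in [0, 1]."""
    # Bright-pass: keep only the pixels above the brightness threshold.
    bright = np.where(image > threshold, image, 0.0)
    # Cheap cross-shaped blur, repeated a few times to approximate a Gaussian.
    for _ in range(blur_passes):
        bright = (np.roll(bright, 1, axis=0) + np.roll(bright, -1, axis=0) +
                  np.roll(bright, 1, axis=1) + np.roll(bright, -1, axis=1) +
                  bright) / 5.0
    # Add the blurred highlights back, clamped to the displayable range.
    return np.clip(image + intensity * bright, 0.0, 1.0)
```

In a real engine the same three steps run as pixel shader passes over a (usually downsampled) render target; no FP16 surface is involved, which is the sense in which bloom is cheaper than true HDR.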
December 8th, 2006, 03:03
The problem isn't with distant textures though, it's with the close and medium range objects. I think they made a big mistake by not making G3 compatible with AA because it needs it bad! Even playing at a resolution as high as 1280x1024 you can see a lot of aliasing in that game.
Last edited by JDR13; December 8th, 2006 at 03:44.
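The crawling edges described here are exactly what AA averages away. Supersampling, the brute-force form of AA, can be illustrated in a few lines (a toy sketch only, not how driver-level MSAA actually works; the function name is illustrative):

```python
import numpy as np

def ssaa_downsample(hi_res, factor=2):
    """Toy supersampled AA: given a frame rendered at factor x the display
    resolution, average each factor x factor block down to one display pixel."""
    h, w, c = hi_res.shape
    return hi_res.reshape(h // factor, factor,
                          w // factor, factor, c).mean(axis=(1, 3))
```

Pixels straddling an edge come out as intermediate shades instead of hard steps. This is also why simply raising the resolution (e.g. to 1280x1024) shrinks the jaggies but never removes them: each display pixel is still a single point sample.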
December 8th, 2006, 10:22
Thanks for clearing that up, Moriendor. I read about this stuff a good while ago, and only superficially. It's actually very interesting how the graphics pipelines and render paths and whatnot work, but I don't really know all the details. I do remember that ATi was able to do AA + FP16 HDR, and I'm pretty sure that nVidia's latest (8-series) cards can, too.
December 8th, 2006, 20:49
Originally Posted by JDR13
The problem isn't with distant textures though, it's with the close and medium range objects. I think they made a big mistake by not making G3 compatible with AA because it needs it bad! Even playing at a resolution as high as 1280x1024 you can see a lot of aliasing in that game.

Yes, but you can't do "regional" AA for only certain textures of a frame (as far as I know, at least). It's always the full frame/scene that gets AA'd, and it just seems pretty uneconomical to AA the entire frame if more than half of it is blurred out.
Besides, I somewhat disagree about the importance of AA. As long as you keep moving, you should barely notice any "jaggies" at all. So… get your ass in gear some more and all will be fine.
December 9th, 2006, 00:03
Moriendor,
Maybe I'm just spoiled, but I'm not used to seeing jaggies like that in my games. I didn't pay over $400 for my video card to see ugly crawling edges all over the place.
The graphics engine they used for Gothic 3 has particularly bad edge aliasing compared to other recent graphics engines.
It's just something I'm pointing out, I'm not trying to bring the game down.
It's a fantastic game, I'm just a little surprised by that design decision.
December 9th, 2006, 03:08
I miss FSAA. Usually I'm willing to sacrifice graphical detail for it. I just can't stand jagged edges…
December 10th, 2006, 17:25
I never use AA, that is, unless I can really afford it. I'd rather have a smoother frame rate than AA and most of the time the reduction in frame rate just isn't worth it for me. Personally, I've never understood why people can't live without it, but it's nice to have if your graphics card is up to it (or the engine isn't cutting edge any more).

