Adaptive AA, Quality AF
PH!: There are lots of forum posts about enabling Adaptive AA on older Radeons. We got the feature working on an R350-based card, and the latest version of ATI Tool supports it too. Is Adaptive AA going to be available for older Radeons in future Catalyst versions? What is the essence of Adaptive AA? Is it not just the activation of supersampling when a texture has transparent parts?
ED: Some form of Adaptive AA is going to be made available on previous architectures. Our fundamental AA hardware has always been very flexible and gives us the ability to alter sampling locations and methods. In the X1k family, we've done more to improve performance when doing Adaptive AA, so the performance hit there will be much smaller than on previous architectures. However, the fundamental feature should be available on earlier products. I'm not sure about the timeline, as we've focused our QA for this feature on the X1k up to now. Beta testing is ongoing, and for now users can use registry keys to enable the feature on older products.
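To illustrate the idea raised in the question – supersampling applied only where alpha-tested (transparent) textures are drawn, multisampling everywhere else – here is a minimal, hypothetical C++ sketch. The names DrawCall, AaMode and selectAaMode are made up for this example; they are not ATI driver code or the registry keys mentioned above.

```cpp
// Conceptual sketch of adaptive ("transparency") AA, not ATI's actual implementation.
#include <vector>

enum class AaMode { Multisample, Supersample };

struct DrawCall {
    bool usesAlphaTest;  // the bound texture has transparent, alpha-tested parts
};

// Pick the AA method per draw call: supersampling resolves the alpha-tested texels
// inside a polygon that plain multisampling cannot smooth, so the costly path is
// enabled only where it actually helps, keeping the overall performance hit small.
AaMode selectAaMode(const DrawCall& dc)
{
    return dc.usesAlphaTest ? AaMode::Supersample : AaMode::Multisample;
}

int main()
{
    // Fences, foliage and similar alpha-tested geometry get supersampling;
    // opaque geometry keeps the cheaper multisampling path.
    std::vector<DrawCall> frame = { {false}, {true}, {false} };
    for (const DrawCall& dc : frame)
        (void)selectAaMode(dc);
    return 0;
}
```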
PH!: Quality AF looks impressive. It finally does what AF is supposed to do – congratulations. In our tests, Quality AF was almost as fast as the “angle-dependent” Performance AF. Did we make a mistake? If not, why is the regular AF needed at all when Quality AF offers basically the same speed?
ED: There are some cases where area-based AF does perform slower than our previous AF. Our previous AF algorithm used reasonable compromises to achieve excellent quality and performance. It was superior to simple bilinear/trilinear implementations; in fact, I feel it was superior to today's offerings from competing vendors. However, we did listen to the criticisms, and while not always agreeing, we decided that our customers' requirements are our top priority. Also, as hardware improves, it should never give worse quality. So we improved many items in the pipeline to increase precision (subpixel, Z, color, iteration, etc.), as well as improving the LOD and AF algorithms. I think it's obvious that we now offer the very best quality filtering, bar none. Personally, I'd rather have high quality at a good frame rate than terrible quality at breakneck speeds.
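For readers wondering what “area-based” versus “angle-dependent” AF means in practice, the sketch below shows the textbook way to derive the needed degree of anisotropy from the pixel's footprint in texture space (along the lines of the EXT_texture_filter_anisotropic description). It is an illustration under those assumptions, not ATI's or any other vendor's actual hardware algorithm.

```cpp
// Illustrative footprint math behind anisotropic filtering; not vendor hardware code.
#include <algorithm>
#include <cmath>
#include <cstdio>

// Degree of anisotropy for one pixel: the ratio of the longer to the shorter axis of
// the pixel's footprint in texture space, computed from texture-coordinate derivatives.
double anisotropy(double dudx, double dvdx, double dudy, double dvdy)
{
    const double px = std::hypot(dudx, dvdx);                 // footprint extent along screen x
    const double py = std::hypot(dudy, dvdy);                 // footprint extent along screen y
    const double longer  = std::max(px, py);
    const double shorter = std::max(std::min(px, py), 1e-6);  // guard against division by zero
    return std::min(longer / shorter, 16.0);                  // clamp to the user's 16x setting
}

int main()
{
    // A receding surface rotated 45 degrees on screen: an angle-dependent shortcut that
    // applies full filtering only at certain surface angles would under-filter cases
    // like this, while an area-based scheme uses the full ratio computed above.
    std::printf("needed anisotropy: %.1f\n", anisotropy(0.7, 0.7, 0.05, -0.05));
    return 0;
}
```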
PH!: To test the X1800 at 1600x1200 we had to warm up our good old CRT, because our good new LCDs can only handle 1280x1024. ATI and NVIDIA suggest testing at 1600x1200 or even higher. Does that make sense when most users (even power users) have LCDs with a native 1280x1024 resolution? Is HDTV 1080i going to become popular among PC gamers? We think 1280x1024 with Quality AF, Adaptive AA and HDR is the best setup today, and it can even make the X1800 XT sweat :)
ED: Well, even the cheaper monitors used to support 16x12 without too much trouble. The popularity of low-cost LCDs has actually caused a bit of a resolution stall, but newer, larger LCDs certainly support 16x12 and even higher. I also think there will be a plateau of standardization at 1920x1080, as this is the standard for broadcast and an excellent resolution for monitors. But for everyone else, the focus should be on quality. I myself use a 19” LCD at 1280x1024, so I turn the AF setting up to the highest and enable our 6xAA – it gives the highest quality picture available.