Has anyone considered the new 3Dc technology? Better image quality at no cost in performance? I don't know if that's true, but I have read about it. It sounds like a really big advantage for the X800 XT over the 6800 Ultra.
Man, both companies have been accused of cheating, and in both cases it doesn't make much difference. I'm not biased either way; I've bought, tested, and sold second-hand something like six graphics cards in the last two months, and I have to say: forget the brand, look at the specific card you want to buy.
P.S. I WAS an ATI fan, but I'm afraid the 6800 looks more attractive in the long run: more room for driver improvement. And if they cheat? Use the slightly older drivers. Big deal!
An Xbox 360 and a 12" iBook....
And a Kawasaki ER-6n to mod instead
How do you differentiate between "cheating" and "optimising"? All this episode does is show how meaningless the benchmark programs (the 3DMark series and so on) are. ATi and Nvidia both know how these programs work, so it's easy for them to tweak the drivers to do only what's necessary to achieve a high score. The only way to get a true picture is to run real programs with standard drivers.
And still no one has considered the new 3Dc technology. I'm still an ATI fan all the way; they used older technology than nVidia and still came close. Now that's pushing your limits. And I can't wait for them to use new technology, especially the 3Dc tech. It's gonna be a bomb! nVidia, though, is a pretty good company too.
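For anyone wondering what 3Dc actually does: it compresses normal maps by storing only the X and Y components of each normal, and the pixel shader rebuilds Z on the fly. A rough sketch of that reconstruction step, in plain C with invented names rather than ATI's actual shader code:

```c
#include <math.h>

/* Hypothetical sketch: rebuild the Z component of a normal whose X and Y
 * were stored by a 3Dc-style two-channel format. x and y are assumed to
 * already be expanded from the [0, 1] storage range to [-1, 1]. */
static void reconstruct_normal(float x, float y, float n[3])
{
    float zz = 1.0f - x * x - y * y;
    n[0] = x;
    n[1] = y;
    n[2] = zz > 0.0f ? sqrtf(zz) : 0.0f;  /* clamp against compression/rounding error */
}
```

Because only two channels ever hit memory, the format can spend its bits where normals need precision, which is where the "better image quality at little performance cost" claim comes from.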
Well, as Sander mentioned, the real issue is whether it is an optimization that detects when to turn trilinear filtering on and off, or whether it simply turns trilinear off all the time. One is an enhancement and the other is a cheat. Personally, it looks to me as though ATI has just added another means of bandwidth conservation, like Hyper Z or the more recent 3Dc texture compression.
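To make that enhancement-vs-cheat distinction concrete, here is a rough sketch (C, thresholds invented) of the kind of reduced trilinear blend being discussed: full blending happens only in a narrow band around the mip transition, and plain bilinear is used elsewhere, where the two mip levels would look nearly identical anyway.

```c
#include <math.h>

/* Hypothetical sketch of an adaptive ("reduced") trilinear blend weight.
 * lod  - continuous mip level for this pixel
 * band - half-width of the region where real blending happens (made-up value, e.g. 0.15f)
 * Returns the blend factor between mip floor(lod) and mip floor(lod)+1. */
static float trilinear_weight(float lod, float band)
{
    float f = lod - floorf(lod);                 /* position between the two mip levels */
    if (f < 0.5f - band) return 0.0f;            /* lower mip only: effectively bilinear */
    if (f > 0.5f + band) return 1.0f;            /* upper mip only: effectively bilinear */
    return (f - (0.5f - band)) / (2.0f * band);  /* blend only across the narrow band */
}
```

If the driver applies this only where adjacent mip levels really are near-identical, it's an optimization; if it applies it unconditionally, it's trading image quality for benchmark numbers.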
Good interview; I read it all. I don't think they cheated. I think what they did was smart. It's true: why bother using 16x on a wall that will look the same with no AF at all? That increases performance. There is one thing I want to ask, though. They mentioned the new IQ setting and how it auto-detects stuff. What is it? Is it like: you have the wall, 16x is off, and when you move to the edge of the wall, 16x turns on to smooth it out automatically? Or is it more like multi-level AF, where some parts only need 2x and some need 4x, and it auto-detects the best balance of image quality and performance? I still like the new 3Dc tech.
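As I understand it, adaptive anisotropic filtering works roughly like the second option: the hardware looks at how stretched each pixel's texture footprint is and only takes as many samples as that stretch requires, so a wall viewed head-on effectively gets 1x-2x while an oblique floor gets the full 16x. A rough sketch of that per-pixel decision (C, simplified, names invented):

```c
#include <math.h>

/* Hypothetical sketch: pick a per-pixel anisotropy degree from the texture-
 * coordinate derivatives, i.e. how much the pixel's footprint is stretched
 * in texture space. The clamping policy is invented for illustration. */
static int aniso_degree(float dudx, float dvdx, float dudy, float dvdy, int max_aniso)
{
    float fx = sqrtf(dudx * dudx + dvdx * dvdx);    /* footprint length along screen x */
    float fy = sqrtf(dudy * dudy + dvdy * dvdy);    /* footprint length along screen y */
    float major = fx > fy ? fx : fy;
    float minor = fx > fy ? fy : fx;
    float ratio = major / (minor > 1e-6f ? minor : 1e-6f);
    int n = (int)ceilf(ratio);                      /* head-on wall -> ~1, oblique floor -> large */
    if (n < 1) n = 1;
    if (n > max_aniso) n = max_aniso;
    return n;
}
```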
But think about it: they're taking load off the GPU and putting it on the CPU by switching modes and setting render effects independently per object! They're basically saying, "if it performs poorly, blame the CPU."
Sounds like ATI has followed Mother Nature, doing what our own eyes do with analog information. If we saw everything our eyes can take in, our brains would not be able to process it all. Our eyes keep only a small portion of the image in focus, and the rest stays blurred until they move to refocus on another part. In addition, our eyes refocus continually (floating-point focusing, haha), making these fine adjustments every millisecond. Plus, the distance between our eyes gives us the depth perception we need to avoid bumping into things.
It's quite logical, actually. Plus, it's not "The Matrix" we're rendering here. We're hitting the ceiling of pixel-level technology. The clincher: as long as I can snipe my enemy from a fair distance and tell the clouds from the ground when I'm in the air, that's all that concerns me.
The next step is photon energy manipulation in three-dimensional space: holograms in four dimensions.
Maybe a marriage between the two companies (ATIvia or Nviati?) to make the Holodeck a reality. Ha ;-)
Bravo to ATI for squeezing the last bit of juice from their engine.
I look forward to NVidia meeting the challenge of improving their new platform.