You know, I have heard people say that 8X AGP is more of a marketing ploy than an actual performance increase. I guess it would be like having a car that can do 80 MPH max on a highway where the speed limit is 75 MPH. That would be 4X AGP. Then you get 8X AGP and the speed limit is raised to 160 MPH, but you can still only do 80 MPH. Make any sense?
Performance-wise, 8x AGP provides a huge bandwidth increase over the old 4x AGP, but the problem is that 4x AGP was already enough bandwidth for general purposes. That's because the AGP bus is really only used to send control signals telling the card what to render; texture transfers to the card are a one-off cost anyway.
The only time I can see 8x AGP being of real benefit is when you don't have enough graphics card RAM and have to swap textures in and out of main RAM. But if you're doing that, your performance is going to be bad anyway, so there's no real improvement.
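For reference, here's a quick back-of-the-envelope sketch of the raw numbers behind the speed-limit analogy, assuming the standard AGP spec figures (a ~66.66 MHz base clock and a 32-bit bus, with the "x" multiplier being transfers per clock):

```python
# Peak theoretical AGP bandwidth for each mode.
# Spec figures assumed: ~66.66 MHz base clock, 32-bit (4-byte) bus,
# and the mode multiplier (1x/2x/4x/8x) is transfers per clock cycle.
BASE_CLOCK_HZ = 66_666_000
BUS_WIDTH_BYTES = 4  # 32-bit parallel bus

def agp_bandwidth_mb_s(multiplier):
    """Peak theoretical bandwidth in MB/s for a given AGP mode."""
    return BASE_CLOCK_HZ * BUS_WIDTH_BYTES * multiplier / 1_000_000

for mode in (1, 2, 4, 8):
    print(f"AGP {mode}x: ~{agp_bandwidth_mb_s(mode):.0f} MB/s")
```

So 8x doubles the ceiling to roughly 2.1 GB/s, but if the game never pushes past the ~1 GB/s that 4x already offers, the extra headroom goes unused, exactly like raising the speed limit on a car that tops out at 80.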
As for RDRAM, I don't know the current frequencies (PC1066 was the last I heard), but really it's not as cheap or as fast as DDR (at least dual-channel DDR). Basically only Intel backed it, and they were way off on that one. Steer clear of RDRAM and go with the mainstream here.
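A rough sketch of why dual-channel DDR wins on raw bandwidth, assuming the commonly quoted figures (PC1066 RDRAM uses 16-bit channels at 1066 MT/s; DDR400 uses 64-bit channels at 400 MT/s):

```python
# Peak theoretical memory bandwidth in MB/s (decimal megabytes).
def bandwidth_mb_s(transfers_per_sec, bus_width_bits, channels=1):
    return transfers_per_sec * (bus_width_bits // 8) * channels / 1_000_000

# Dual-channel PC1066 RDRAM: two 16-bit channels at 1066 MT/s.
rdram_pc1066 = bandwidth_mb_s(1_066_000_000, 16, channels=2)
# Dual-channel DDR400 (PC3200): two 64-bit channels at 400 MT/s.
ddr400_dual = bandwidth_mb_s(400_000_000, 64, channels=2)

print(f"Dual-channel PC1066 RDRAM: ~{rdram_pc1066:.0f} MB/s")
print(f"Dual-channel DDR400:       ~{ddr400_dual:.0f} MB/s")
```

The narrow 16-bit RDRAM channels can't keep up with the wide 64-bit DDR channels even at much higher transfer rates, which is the "not as fast" part; the "not as cheap" part came from licensing and manufacturing costs.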
8x will be useful for the next-generation graphics cards that will launch sometime next year (Q2 2004) with games such as Doom III. As for the earlier graphics cards that were first produced in 4x and then reissued in 8x, the person above who said it was all marketing was exactly right: in benchmark tests the 8x version was actually an fps slower (lol). Currently no game can really touch the full capabilities of 4x. There will probably never be a game that fills up 8x AGP, since in a year or two everything will have gone to PCI Express. But oh well: if you have it, use it; if not, don't worry. No big deal yet.
Uh, Jason, have you tried pushing the graphics in Unreal Tournament 2003 all the way up, with resolution and refresh rate at the max your monitor supports, while running under AGP 4x without tweaking or using hacked drivers?
Well, PCI Express is a 1-bit serial bus (like USB); it just runs at a way higher frequency than any parallel connection can manage.
The PCI Express that video cards will use is 16x PCI Express, which is 16 of those serial lanes running side by side rather than a single parallel bus.
The difference between PCI Express and regular PCI is that because the bus is not as wide (1 bit compared to 32 bits), each connector can have its own dedicated connection to the hub, which will be integrated into the south bridge. This will make PCI Express way faster than regular PCI, where devices need to share a bus and therefore have to implement an arbitration protocol for it.
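To put numbers on the shared-bus-versus-dedicated-lanes point, here's a back-of-the-envelope comparison assuming the usual first-generation figures (33 MHz / 32-bit for classic PCI; 2.5 GT/s per lane with 8b/10b encoding for PCIe, so 8 data bits ride in every 10 bits on the wire):

```python
# Peak bandwidth: shared PCI bus vs dedicated PCI Express lanes.
def pci_mb_s(clock_hz=33_333_000, width_bits=32):
    # Classic PCI: one parallel bus shared by every device on it.
    return clock_hz * width_bits / 8 / 1_000_000

def pcie_mb_s(lanes, gt_per_sec=2.5e9):
    # Each lane is a dedicated serial link, per direction.
    # 8b/10b encoding costs 20%: only 8 of every 10 bits carry data.
    return gt_per_sec * (8 / 10) / 8 * lanes / 1_000_000

print(f"PCI (32-bit/33 MHz, shared):  ~{pci_mb_s():.0f} MB/s total")
print(f"PCIe x1  (dedicated, per dir): ~{pcie_mb_s(1):.0f} MB/s")
print(f"PCIe x16 (dedicated, per dir): ~{pcie_mb_s(16):.0f} MB/s")
```

So even a single dedicated lane beats the entire shared PCI bus, and an x16 slot leaves AGP 8x behind as well; and since every slot gets its own link, no device ever has to wait for another one to finish using the bus.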
PS: Next-generation video cards will be based on 16x PCI Express technology. There will be AGP versions too, I'm sure, because too many people still have AGP to let it go. But first of all, PCI Express is on everyone's mind, and AGP 8x will never be used to its full potential.
Going along the same lines, I'm a bit confused. I want to do some upgrades, and I noticed that the GeForce FX cards I'm looking to buy are 8x AGP; then I found some mobos that have 8X AGP slots on them. Then I read under the requirements of a "256MB Nvidia GeForce FX5200 8xAGP Video Card w/DVI/TV Out" that it requires a "1.5V AGP 2X; AGP 2X/4X or AGP 8x universal slot". Do I NEED a mobo with 8X AGP to use a video card that has 8X AGP on it, and what's the difference between the 2x, 2x/4x, and 8x universal?!