Nvidia driver cheats? Truth or fiction?
  Of Screenshots and Microscopes 
 
Corvus Raven Apr 28, 2004, 12:53pm EDT
I used to work in production, and unless I was told, I could rarely tell 25 fps from 30 fps. If they were beside each other I could pick which was which, but without knowing, who would ever guess? Perhaps another tech or production person who was at it every day. Perhaps. My glitch lasted 2-3 frames; most people don't notice unless it's more than 5. Having been away from that environment for so long, I would be surprised if I could still notice such a tiny event.

There is a reason that 25-30 frames per second are used: more than that goes unnoticed by the human eye. At 25-30 fps a sequence is seen as a moving image without visible steps from one frame to the next, and even at 21 fps some people don't see the gaps between frames (film).

The human brain can only process so much information at a time, especially when there is more than the eye can take in.
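For reference, the arithmetic behind these thresholds is just reciprocals. A minimal sketch (editor's illustration; it makes no claim about what the eye can actually resolve):

```python
# Frame rate vs. frame time (illustrative only).

def frame_time_ms(fps: float) -> float:
    """Milliseconds each frame stays on screen at a given frame rate."""
    return 1000.0 / fps

for fps in (21, 25, 30, 60, 100):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.1f} ms per frame")

# 30 fps holds each frame for ~33.3 ms, 60 fps for ~16.7 ms; whether that
# gap is visible is exactly what this thread goes on to argue about.
```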

BTW, just how big does a pixel have to be?


--------------------------
ASUS A7N8X Deluxe PCB 2.00 w/ BIOS 1005 (cause? Dunno.. ATM)
AMD XP 2800+ @ 0.0 GHz 0MHz FSB (overheating for no apparent reason)
(2) - Corsair XMS512-3500C2 (5-2-2-2T) @ 0MHz
ATI Radeon 9700 Pro @0
Enermax 500W p/s (ok.
Michael Chan Apr 28, 2004, 03:01pm EDT
>> Re: Of Screenshots and Microscopes
I can't say that I agree with the 30-fps thing. Being a gamer, I've played games that run the performance gamut from a steady 100 fps down to probably 20 fps or lower. While science supposedly states that the human eye can't process more than 29.97 fps, the change really is noticeable when you move from a steady 100 down to a steady 30. Perhaps it's just the occasional choppiness, or how images seem a little less smooth.

Besides that, to address the article, I can't agree that all "shortcuts" are legitimate optimizations. When you're buying a midrange consumer-level video card for 150 dollars, I'm sure it doesn't matter as long as you're getting playable performance. However, if I am to pay 400, 500 or even 600 dollars for a video card, I want everything; I don't think that's too much to ask. I'm paying as much for a video card as many people pay for their whole computer. The reasoning is this: if a company has to skip rendering unseen pixels to get great performance out of current games, how future-proof is the product? In theory, if you need to "cheat" to get current games running at a playable framerate, wouldn't you have to "cheat" even more in future games, to the point where image quality starts to suffer?

I don't know about anyone else, but I can tell the difference between an MP3 and FLAC (Free Lossless Audio Codec).

SurrealBeingX Apr 28, 2004, 03:24pm EDT
>> Re: Of Screenshots and Microscopes
Corvus Raven, bud, you are completely wrong. I've been a gamer for a while, and believe me, 25-30 fps is SLOW. I can definitely tell the difference once speeds get up to 60 fps and higher. Also, people who play first-person shooters avidly have a visual perception rate about 30% higher than that of a normal person. I'm not joking; I read that in an article posted a long time ago, based on research by Dr. Bavelier.

But whatever you believe, I KNOW that I can tell the difference between 30 fps and something higher. 30 fps is really slow when it comes to fast-paced action games. My dad can't really tell much of a difference, but then he's not an FPS player.

Lawrence Heffernan Apr 28, 2004, 03:42pm EDT
>> Re: Of Screenshots and Microscopes
I can visually differentiate between 60Hz, 75Hz and 85Hz. 25 fps was only ever a guideline; you can notice the difference. In games especially, where your brain is using the images on screen and responding accordingly, the faster the framerate, the better your brain can comprehend what's happening. However, framerates above the monitor's refresh rate are of no benefit, as those frames are never drawn.
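That last point is worth a number: however fast the card renders, the monitor presents at most its refresh rate in distinct frames each second. A small sketch of that ceiling (editor's illustration; it ignores tearing and vsync details):

```python
# The refresh-rate ceiling on visible frames (illustrative only).

def visible_fps(render_fps: float, refresh_hz: float) -> float:
    """Upper bound on distinct frames the monitor can present per second."""
    return min(render_fps, refresh_hz)

for fps in (30, 60, 100, 200):
    print(f"rendering {fps:>3} fps on an 85 Hz monitor -> "
          f"{visible_fps(fps, 85.0):.0f} fps actually shown")
```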

Simon Tremblay Apr 28, 2004, 04:30pm EDT
Edited: Apr 28, 2004, 04:34pm EDT
>> Re: Of Screenshots and Microscopes
I would have to agree with Corvus about one thing: MP3 and CD are way different when it comes to quality, and so are Nvidia and ATI. Since the Radeon 9700 event, Nvidia has been struggling to keep up performance-wise. Eventually they did catch up, but at a price, and a noticeable one (not under a microscope, or even while idling in the game, but AT FULL SPEED IN AN FPS VEHICLE while there's a bazillion things going on screen). Which game? Halo. (For the record, this artifact has been reported in a plethora of games too long to list here.) What kind of artifacts? Anisotropic filtering artifacts (you know, "brilinear" filtering).

This was the first artifact they introduced to alleviate the performance gap with ATI (you can CLEARLY see, at ALL TIMES, those nifty lines between texture mipmaps crawling along with you on the floors, walls and whatnot; I found it ANNOYING, but bearable). Then they did it again, this time with shader effects and shading quality: where we should see a beautiful gradient of color, we see huge blocky shading, the likes of which even old grandpa Gouraud shading didn't produce.
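For readers who haven't met the term: trilinear filtering blends between adjacent mipmap levels across the whole transition, while "brilinear" shrinks that blend to a narrow band and falls back to cheaper bilinear sampling elsewhere, which is what leaves visible seams between mip levels. A rough sketch of the difference (editor's illustration; the 0.3 band width is invented, since the real driver thresholds were never published):

```python
# Trilinear vs. "brilinear" mip blending, as a blend-weight curve (illustrative).

def trilinear_weight(frac: float) -> float:
    """Blend weight toward mip level N+1; frac is the fractional LOD."""
    return frac  # smooth blend across the whole [0, 1] transition

def brilinear_weight(frac: float, band: float = 0.3) -> float:
    """Blend only inside a narrow band around the mip switch; snap elsewhere."""
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if frac < lo:
        return 0.0  # pure bilinear on mip N (cheaper, but transitions abruptly)
    if frac > hi:
        return 1.0  # pure bilinear on mip N+1
    return (frac - lo) / (hi - lo)  # short blend: hence the visible seams

for f in (0.0, 0.2, 0.4, 0.5, 0.6, 0.8, 1.0):
    print(f"LOD frac {f:.1f}: trilinear {trilinear_weight(f):.2f}, "
          f"brilinear {brilinear_weight(f):.2f}")
```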

I can stand the first, but I can't stand a leap BACK five (if not more) YEARS in image quality for some features, especially when those features are supposedly "top of the line" like PS 3.0.

I used Nvidia products up to the GeForce4 Ti 4200 8x; now I'm endlessly happy on a Radeon 9800 Pro. Go competition. And please, Nvidia, try to be a LOT smarter, because thinking we can't see those nice artifacts while PLAYING the game is simply unacceptable (for a high-end, top-of-the-line *cough* cutting-edge *cough* product).

SurrealBeingX Apr 28, 2004, 05:11pm EDT
>> Re: Of Screenshots and Microscopes
Yeah, well, nVidia is really giving ATI a run for its money with this new 6800 Ultra. It's kicking the 9800XT's a$$ by as much as 100% in some benchmarks.

Shadow_Ops_Airman1 Apr 28, 2004, 09:23pm EDT
>> Re: Of Screenshots and Microscopes
Sorry, I would still get a 9800XT. It seems the stock cooler on the XT copes better than the cooler Nvidia had to put on the 6800; that cooler is just as ridiculous as the first FX board's. Besides, remember that the next ATI boards are not out yet. Another thing: I can believe that Nvidia is cheating again, because they certainly did it once before.

AMD Athlon XP-M 2500+ (133x14= 1867MHz) (209x11= 2299MHz)
DFI LP NF2 Ultra-B (Hellfire 3EG Rev2)
Antec SX800, Neo HE 500, 4 Antec 8CM Fans
Thermalright SI-97 1 Antec Tricool 12CM Fan
CL SB XFi Xtreme Music
2x Barracuda HDs (250/400)
2x Samsung Write
Shadow_Ops_Airman1 Apr 28, 2004, 09:28pm EDT
>> Re: Of Screenshots and Microscopes
Oh, and if you follow this: Nvidia and Intel are back together, so I think of Nvidia as more like Intel than like AMD. Personally, I think AMD and ATI are more like each other in how they develop their stuff; it seems that whenever Nvidia comes out with a new board, ATI is usually able to outdo it at lower clock speeds.

AMD Athlon XP-M 2500+ (133x14= 1867MHz) (209x11= 2299MHz)
DFI LP NF2 Ultra-B (Hellfire 3EG Rev2)
Antec SX800, Neo HE 500, 4 Antec 8CM Fans
Thermalright SI-97 1 Antec Tricool 12CM Fan
CL SB XFi Xtreme Music
2x Barracuda HDs (250/400)
2x Samsung Write
SurrealBeingX Apr 28, 2004, 10:51pm EDT
>> Re: Of Screenshots and Microscopes
Airman, why would you get a 9800XT? They aren't much faster than a 9800 Pro, which costs hundreds less.

And yeah, I know ATI is coming out with a new card soon.

Zach Beck Apr 28, 2004, 10:57pm EDT
>> Re: Of Screenshots and Microscopes
OK, as for nVidia's 6800 beating the 9800XT: no duh. Look at the thing; it's nothing but heatsink. And besides, who has two Molex connectors to spare? I mean, come on, NV, what are you thinking? Can't beat ATI without pumping in twice the power? I can't wait to see ATI's new card at the end of the year. I think ATI rules over NV any day, especially in the budget cards. I got a 9000 Pro 128MB as an upgrade to my NV card, and I was amazed at what I had been missing, at a much better price. ATI is just the best.

And as for framerates: I can tell the difference between 30 fps and most anything else. It looks slow and choppy. Most times, I can tell if a game drops below 25.

P4 2.0
512mb PC2100
Creative Sound Blaster Live! 5.1 Digital
Logitech Z-560 400W
Sapphire Radeon 9500 Pro
ColorCases 425W p/s
WD 60G hdd





burningrave101 Apr 28, 2004, 11:42pm EDT
Edited: Apr 28, 2004, 11:51pm EDT
>> Re: Of Screenshots and Microscopes
Well, it was only recently that ATi began to have much of a lead over Nvidia in the video card industry. Before the 9700 Pro came along, back when the Titanium series was out, nVidia was king of the hill and had been for a while. ATi has some nice cards, but I definitely prefer nVidia. I've put together quite a few systems for people using both brands of cards, and I really don't notice a difference in image quality at all. If you're really paranoid and go looking for stuff, you might find a little something here and there. I think a lot of the things people have pointed out about the nVidia drivers come from ATi users trying to put down nVidia cards. I haven't seen awfully many actual nVidia card owners, especially ones with a 59XX, complaining about the image quality.

Personally, I think the 6800 Ultra is going to be quite a bit better than the R420. From what I've seen, the R420 is just a revved-up 9800XT on a 130 nm process. The 6800 Ultra has quite a few new features, including Pixel Shader 3.0 and Vertex Shader 3.0. Also, the R420 will most likely be limited to 24-bit floating point, while the NV40 supports 32-bit floating point.
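To put that precision difference in numbers: FP24, as used in ATi's pixel shader pipeline, carries a 16-bit mantissa against FP32's 23 bits, so its smallest representable relative step is about 128 times coarser. A back-of-the-envelope sketch (editor's illustration; format layouts per the public R300 documentation and IEEE 754):

```python
# FP24 vs. FP32 relative precision, from mantissa width alone (illustrative).

def relative_epsilon(mantissa_bits: int) -> float:
    """Smallest relative step representable with the given mantissa width."""
    return 2.0 ** -mantissa_bits

fp24_eps = relative_epsilon(16)  # FP24: 1 sign, 7 exponent, 16 mantissa bits
fp32_eps = relative_epsilon(23)  # FP32: 1 sign, 8 exponent, 23 mantissa bits

print(f"FP24 relative epsilon ~ {fp24_eps:.1e}")  # ~1.5e-05
print(f"FP32 relative epsilon ~ {fp32_eps:.1e}")  # ~1.2e-07
print(f"FP32 is ~{fp24_eps / fp32_eps:.0f}x finer-grained")
```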

I've also had a lot fewer problems with nVidia drivers than with ATi drivers. nVidia's drivers have quite a few more features, too, especially if you are into running multiple monitors.

Right now I'm running an nVidia 5900XT that I bought for $195; stock, it's 390/700. At the moment I have it clocked at 540/810, and I've run hours of 3DMark03 without a problem. I could go even higher on the core if I wanted. The RAM on this specific card (the Gigabyte) isn't as good as on some of the others, though, so it tops out around 820-830. That's a hell of an overclock for that low-end a card.

Another reason I prefer nVidia is that nVidia-based cards aren't constantly stuck on the reference design. I get really sick of seeing red-PCB ATi cards with the stock coolers on them. Plus, the software bundles with ATi cards usually leave a lot to be desired. I got three full-version games with my 5900XT, along with other software.

And I may have read the Intel/nVidia comment incorrectly, so don't jump on me, but Intel has closer ties with ATi than with nVidia. It's AMD and nVidia that are more "together": look at the fact that nVidia makes AMD's chipsets, plus onboard sound and other things. And I'm an Intel user.

All in all, they're both great companies, and I'm glad to have both of them because it keeps prices down. But me personally, I'll take an nVidia card over ATi any day of the week.

burningrave101 Apr 28, 2004, 11:42pm EDT
Edited: Apr 28, 2004, 11:44pm EDT
>> Re: Of Screenshots and Microscopes
OOOOPSSS...Double post....Sorry :)


SurrealBeingX Apr 29, 2004, 12:21am EDT
>> Re: Of Screenshots and Microscopes
I can't wait to see the speed of the new ATi chip coming out. I want to see whether nVidia is really kicking their a$$es yet or not, and I also want to wait before buying the 6800, to make sure I get the fastest card.

Drew s Apr 29, 2004, 08:09am EDT
>> Re: Of Screenshots and Microscopes
I don't really mind which is the fastest; I've owned both and have had good and bad experiences with each. I just hope they keep competing with each other. If they don't compete and only one is on top, boy, will that slow down progress, and then we will be stuck around the same level for ages. I just wish a third party would come along with something new and rev up the competition even more.

Drew

Chris M Apr 29, 2004, 01:52pm EDT
>> Re: Of Screenshots and Microscopes
The belief that the eye sees full motion at 30 fps is a half-truth. Most movies run at only about 18-24 fps, yet you see full motion. That is because of motion blur: each film frame integrates the movement that happened while the shutter was open. Games render sharp, instantaneous frames with no motion blur, so they need more of them to make motion look fluid. It is not as simple as "the eye sees anything past 30 fps as full motion."
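A toy illustration of that shutter effect (editor's sketch on a one-dimensional ten-pixel "scene"; no film camera or game works this crudely): averaging several instants across the shutter interval smears a moving dot the way film does, while a game frame captures a single sharp instant.

```python
# Shutter integration vs. instantaneous rendering, in one dimension (toy model).

def render_instant(t: float) -> list[float]:
    """A 10-pixel row with a bright dot whose position depends on time t."""
    row = [0.0] * 10
    row[int(t * 9) % 10] = 1.0
    return row

def render_with_shutter(t0: float, t1: float, samples: int = 8) -> list[float]:
    """Average several instants across the shutter interval [t0, t1)."""
    acc = [0.0] * 10
    for i in range(samples):
        t = t0 + (t1 - t0) * i / samples
        acc = [a + p / samples for a, p in zip(acc, render_instant(t))]
    return acc

print("game frame (sharp):  ", render_instant(0.0))
print("film frame (blurred):", [round(p, 2) for p in render_with_shutter(0.0, 0.5)])
```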

--
It is said that when it comes to life, those who feel see it as a tragedy. Those who think see it as a comedy.
Matt Walker Apr 29, 2004, 03:06pm EDT
>> Re: Of Screenshots and Microscopes
All I have to say is that we have finally gotten to the point where the video cards are keeping up with the processors. FINALLY. Video cards were developing too slowly, and right now the CPU is actually the bottleneck for the new nVidia 6800; it will probably be the same for the ATI card when it comes out. One thing that aggravates me, though, is every single ATI fanboy out there saying the ATI card will be better. No duh it's going to be better: nVidia just launched their card while ATI's isn't out yet, so ATI gets to scope out the competition. I'm glad nVidia has the courage to be first out; it keeps development moving along. I'm an ATI fanboy myself, but I despise the bickering between ATI and nVidia fans. Who cares? We don't make the cards, and this sort of bickering scares a lot of people away from trying to learn about computers.

Simon Tremblay Apr 29, 2004, 04:35pm EDT
Edited: Apr 29, 2004, 04:43pm EDT
>> Re: Of Screenshots and Microscopes
If you think there is bickering here, don't be mistaken: up until burningrave showed up, we were discussing objectively, with facts and proofs, not hearsay and consumer satisfaction.

Now for more facts:

nView is certainly sweet and was better than Hydravision for a while, up to the Radeon 9700; then Hydravision (quoting a famous reviewer) "finally came up to par, if not a little ahead" in dealing with multiple monitors and flexibility. I haven't seen that big a change myself; both schemes handle multiple monitors like a charm.

I'd kill for a color vibrance slider on my Ati card like the one I used to have with Nvidia.

Why is video-in still not standard? Three outputs but no input?

ATi beats Nvidia clock-for-clock.

Both companies seem to have reached a "stalemate" with their latest offerings (notwithstanding the 6800 and R420 parts).

The only differences from which to choose one card over the other seem to be price, bundle, driver features and image quality.

Of the points above, the only ones with direct relevance to games are image quality and, to a much lesser extent, driver features (digital color vibrance, give it to me, ATi :P). So on one side we have great image quality but no tuning for color, and on the other some sacrifices made to image quality for a bit of speed, but a godsend of a slider that could make Quake 1 actually look lively (at max digital vibrance, that is :P).

And one last fact: the artifacts were there a year ago, they still are, and they're getting worse. I know Nvidia is going to fix it, since there has been so much fuss about it; I just wish they hadn't done it at all. I know it sounds a lot like the "no T&L, no 32-bit" days back when 3dfx died. But one thing couldn't be more certain: Nvidia, unlike 3dfx, is kicking, hard, and I'm looking forward to the next three years with anticipation.

Zach Beck Apr 29, 2004, 05:28pm EDT
>> Re: Of Screenshots and Microscopes
And speaking of bundles: has anybody ever seen a DVI-VGA adapter with an nVidia card? I've gotten one every time I've bought an ATI card, whether from ATI, Sapphire or Connect3D. As for clock speed, I don't know how well the newer NV cards overclock, but my first one was terrible at it. I'd take the core and memory up by maybe 5 MHz, and I'd get artifacts to hell. That may not be entirely the card's fault, but my Radeon 9000 Pro overclocked very well: stock was 275/275, and I got it stable at 308/308. My newer Sapphire Radeon 9500 Pro overclocks even better than that; I can't remember what it went to, but it was running like a 9700. I stopped because I was afraid of the heat. Who here overclocks NV cards with the stock heatsink? On overclocking my Radeon, my 3DMark 2001SE scores went from 8200 to 9915. Now, tell me that's not a big difference, I dare you. :-D
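For what it's worth, the arithmetic on those figures (his numbers, editor's percentages):

```python
# Percentage gains from Zach's overclocking figures.

def pct_gain(before: float, after: float) -> float:
    """Percentage improvement from `before` to `after`."""
    return (after - before) / before * 100

print(f"core/mem: 275 -> 308 MHz = {pct_gain(275, 308):.1f}% overclock")
print(f"3DMark 2001SE: 8200 -> 9915 = {pct_gain(8200, 9915):.1f}% higher score")
# A ~12% clock bump producing a ~21% score gain is more than proportional,
# which is why he calls it a big difference.
```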

SurrealBeingX Apr 29, 2004, 07:29pm EDT
>> Re: Of Screenshots and Microscopes
Yeah, my EVGA 5900SE didn't overclock that well at all; my 3DMark03 scores even went down after overclocking, I guess because the card was less stable. Still doesn't seem right, though. But yeah, this card OCs OK, not great.

Jesse Bufton Apr 29, 2004, 09:43pm EDT
>> Re: Of Screenshots and Microscopes
My biggest beef is not with artifacts in image quality or whose card has better multi-monitor support; my beef is with the video card companies and the game companies not bringing us the technologies they jam down our throats.
With every new video card there is a new technology they're trying to market to us. I remember when bump mapping was being jammed down everyone's throats. We don't even SEE bump mapping in ANY game! Sure, Doom III is going to use it, if Doom III in fact exists, which after a year of release dates being pushed back I'm beginning to doubt. Next thing we know, John Carmack will be saying, 'What, Doom III? No, look at Quake 4. Isn't it pretty? Look at Quake 4!'
Seriously. This guy started endorsing the freaking GeForce 2 when it came out, telling us all that it was the official card of Doom III. To this day we just don't see all the technologies these cards are packed with actually being used in games. They just turn our attention to the next version of DX and what it supports, which hardly ever actually gets used.

I want my Bump Mapping. I want freaking Doom III.

Simon Tremblay Apr 29, 2004, 09:50pm EDT
>> Re: Of Screenshots and Microscopes
About bump mapping: it's been in use for years, just not in a lot of games. Expendable is the first game I remember that used bump mapping, for the water and ground effects (also wood), and that's a long way back. I had a Voodoo 3 then, and the bump mapping was rendered, albeit at a lower quality than in today's games like Halo, Far Cry and Morrowind (water effects).
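For anyone curious what bump mapping actually does under the hood, here is a toy sketch (editor's illustration; none of the games named above necessarily implement it this way): perturb a flat surface's normal by the height map's finite-difference gradient, then shade with a simple Lambert term.

```python
# Classic bump mapping on a flat +Z surface (toy model).

import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def bumped_normal(height, x, y, strength=1.0):
    """Perturb the surface normal by the height map's central differences."""
    dhdx = height[y][x + 1] - height[y][x - 1]
    dhdy = height[y + 1][x] - height[y - 1][x]
    return normalize((-strength * dhdx, -strength * dhdy, 1.0))

def lambert(normal, light_dir):
    """Diffuse intensity: clamped dot product of normal and light direction."""
    return max(0.0, sum(n * l for n, l in zip(normal, normalize(light_dir))))

# A tiny height map with a bump in the middle, lit from the upper left:
# the bump's near side brightens and its far side darkens, faking relief.
H = [[0, 0, 0, 0, 0],
     [0, 1, 2, 1, 0],
     [0, 2, 4, 2, 0],
     [0, 1, 2, 1, 0],
     [0, 0, 0, 0, 0]]
for y in range(1, 4):
    print([round(lambert(bumped_normal(H, x, y, 0.25), (-1.0, -1.0, 1.0)), 2)
           for x in range(1, 4)])
```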

