Wow, that's a pretty bold statement. Care to bet--oh, say, $10,000 on it? I am game. Seriously. Like taking candy from (literally) a baby!
You guys still just do not get it, and continue to talk out your a$$ in spite of a complete void of fact or experiential context. You have an utterly uninformed opinion. That's great. But you are also making the mistake of thinking it is in any way a reflection of reality.
I want you to copy the text of what you wrote, and save it to a file named "READ ME ON JAN 1 2013.doc". Just be prepared to feel like a complete idiot--or at least to muse over how foolish you USED to be.
The list of people who were once as absolutely dead-certain as you (or at least made idiotically narrow-minded, short-sighted, absolute pronouncements like yours) is very, very long.
Microsoft is quoted on the record in 1980 as saying, "DOS addresses only one megabyte of RAM because we cannot imagine any application needing more." Microsoft--love 'em or hate 'em, they do have some very talented, forward-thinking people--said that. ONE MEGABYTE! Can you imagine!? (Actually, I can. That itself used to be unthinkable, back in the days of 4 KILOBYTE computers.)
Also: "640k ought to be enough for anybody"--Bill Gates, 1981.
Like they did (and they learned their lesson), you are making the fundamental error of projecting TODAY'S software technology onto TOMORROW'S hardware. Sorry, try again. It doesn't work that way. You've got to project TOMORROW'S software onto tomorrow's hardware. What will tomorrow's software look like? No one knows precisely. We can guess. Let me ask you this: 10 years ago, could you have imagined the sheer realism and vast expanse of today's games like Far Cry or HL2--compared to Doom 1 (which was jaw-dropping at the time)? Back then, plenty of people were sure video game technology couldn't possibly advance much further. So the evidence is right in front of your nose: you simply cannot imagine what tomorrow's software will look like, any better than anyone could at any point in the past. We can theorize--just as we can theorize about and explain a 4-dimensional hypercube--but we can't really grasp it. We will be able to someday, though, as the future slowly unfolds in front of us.
In 1976, the Cray-1 supercomputer was released. It was a 64-bit machine with 8 MB of RAM, it sustained around 80 megaflops, and it cost $8 million. The average computer user at the time thought "what in god's name would anybody need something like that for?" In fact it was such an absurd question that nobody really asked it.
Let me spell it out for you in simple terms:
* Humans doing long division: MILLI-flops (1/1000th of one flop)
* Cray-1 supercomputer, 1976, $8M: 80 megaflops (up to 120 depending on who you ask)
* Pentium II, 400 MHz: 100 megaflops
* TYPICAL HIGH-END PC TODAY: about 1,000 megaflops
* Sony PlayStation 3, 2006 (combined CPU+GPU, and depending on who you ask): 25,000 to 2,000,000 megaflops
* IBM TRIPS, 2010 (one-chip solution, CPU only): 1,000,000 megaflops
* IBM Blue Gene, before 2010 (with 65,536 standard RISC microprocessors): 360,000,000 megaflops
News item: "Infineon is working on DDR2-400 memory modules with a capacity of 8 GByte! Paving the way for future specifications of 16, 32 and 64GB of RAM" (http://www.techspot.com/story17578.html
); of course it mentions this is only useful for high-end servers--which is precisely what they said about the 733 dual Xenon setup I bought several years ago for home use, at incredible cost, which now chokes as a simple "low-bandwidth" (by today's standards) web server.
I remember only about 5 years ago, a company I did some work for had an 8-way Citrix server running an OUTRAGEOUS 1 GB of RAM. It cost about $30,000. (Of course it also had about a 250 GB RAID 5 SCSI array and redundant NICs, power, etc.--but the RAM was a significant chunk.)
So please spare me the arguments that computing capacity will not continue to increase, or will not need to.
Arthur C. Clarke once wrote that "any sufficiently advanced technology is indistinguishable from magic". I would add that in context he meant "advanced technology seen for the first time". Eventually, of course, you get used to it. Imagine glimpsing Mac OS X Jaguar (or Longhorn) back when Hercules monochrome graphics was the hottest thing going. Try it for a moment. Really take yourself back. Well my friend, if the past is any indication (and, unfortunately for you, it is), the very near future will see the same exponential gains (on the order of 2^X, where X is the number of 18-month periods). Longhorn and its "3D" interface are just SCRATCHING the surface. That breakthrough--using 3D acceleration for the UI--mark my words, will be seen, looking back from the future, as one of the great liberators from the 2D windows paradigm. We've been stuck in 2D for too long, and in the very near future the limitations will be removed. Longhorn's flashy effects are kid's stuff, just eye candy--nothing terribly useful. But that's just because no one has had time to explore what can be achieved with it, and how the human/machine interface can be evolved--or even broken through--using it.
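To put that 2^X figure in concrete terms, here's a two-minute back-of-the-envelope calculation (plain Python; the only assumption is the 18-month doubling period mentioned above):

```python
# Rough "doubles every 18 months" projection: capability grows by 2**x,
# where x is the number of 18-month periods elapsed.
DOUBLING_PERIOD_MONTHS = 18  # assumption from the paragraph above

def growth_factor(years: float) -> float:
    """Rough multiple of today's capability after `years`, at one doubling per 18 months."""
    periods = years * 12 / DOUBLING_PERIOD_MONTHS
    return 2 ** periods

for years in (1.5, 5, 10, 20):
    print(f"{years:>4} years out -> roughly {growth_factor(years):,.0f}x today's capability")

# Prints, approximately: 2x, 10x, 102x, and 10,321x respectively.
```

Which is exactly why the "absurd" machine of any given year looks quaint a decade later.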
And as for games--imagine how expansive the world of "Far Cry" is compared to "Doom". In 10 years we have gone from stuffy corridors, 320x200 2-D graphics, and 8-bit color and sound--to whole island chains at 1920x1080, 24-bit color, and 16-bit sound. But Far Cry will seem just as limited compared to games 10 years from now--when we have photo-realistic characters that can "learn" (or simulate it), and environments that will blow your mind in detail and expanse, all without the painful in-play loads after every 10th door. No longer will objects be composed of hollow polygons; rather they will be modeled like--and behave like--real materials. Sounds will be generated by physical models, not months of labor-intensive sampling and weeding through libraries.

People say games are getting harder and harder to bring to market as they get more elaborate. This is a temporary trend. As with any software development, future generations build on and leverage work done in the past. Physics libraries will be widely available. World simulators will exist that simulate geology over billions of years--no longer will you have to create worlds out of freakin' polygons. Motion simulators will exist (all this for free or for license), so no more motion capture for human or monster characters. The shape, size, mass, skeleton, and musculature of the monster will dictate how it moves. All game developers will need to do is provide it with a motivation. Game developers will be more like movie directors than hard-core coders. (Who will still have a place in game development--but like all software trends, they will move towards library creation rather than the final product.)
And such fine-grained physics simulators (forget Source) will not come cheap (except in the future)--they will chew up resources like nobody's business. I remember when Quake II ran at about 15 fps. Now it runs at something like 300+ fps. Now picture today's resource hogs the same way, looking back from ten years out.
Still not convinced? Try reading these articles--if you haven't been using a computer continuously for at least two decades, this might be enlightening, and help expand your vision beyond its current narrow bounds (sorry if these aren't showing up as links):
Required reading in order to be able to remotely intelligently debate this topic (if you haven't lived through it):
Other reading of interest that might help you get a clue - CPU performance charts from last 10 years:
Computer hardware FAQ from 1994 (quaint):
Let me make another challenge: try actually researching and educating yourselves, instead of just giving another worthless opinion as if you actually knew something. Go to Google's Usenet archive (the old Deja News). Search back as far as you can--I don't know how far back they go, but I think it's at least 8 years. I do know that Google's goal is to catalog every usenet conversation ever made, which would go back what--15 years? Find discussions on how much RAM would be too much. You will hear EXACTLY the same kind of short-sighted arguments. Over. And over. I can't tell you how many debates like this I've gotten into. And you know what? I always win, because I don't narrow-mindedly project today's needs onto tomorrow.
And you know what else? The rate of growth in resource requirements is ACCELERATING, not slowing down. This is not just an opinion; read any study published by the Gartner Group, various university research groups, etc. Search for publications coming out of MIT. The simple reason? Leverage. Just as I've suggested with game evolution, the more our computer hardware and software technology evolves, the more it builds on itself, the larger the chunks of functionality that get moved into easy-to-use libraries, and the faster software evolves. Look at .NET for example (or J2EE if you prefer). Say what you want about how good or bad it is, but as an example of leveraged technology it's a decent one. It does HUGE amounts of work with very little code (e.g. resource management, garbage collection, etc.), compared to what used to be required 7.5 years ago, and even more so than 20 years ago, when most serious stuff was programmed in assembler or at best C. So not only is hardware accelerating in capability and complexity according to Moore's law, so is software.
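To make the leverage point concrete with a toy example (Python here rather than .NET, purely because it's short; the file name is just an illustration):

```python
# Count word frequencies in a text file -- a handful of lines, because the
# managed runtime supplies the dynamic strings, the hash table, the file
# handling, and the garbage collection. Twenty years ago in C or assembler
# you would have hand-rolled (and debugged) every one of those pieces.
from collections import Counter

def word_counts(path: str) -> Counter:
    with open(path, encoding="utf-8") as f:        # file handle closed for us
        return Counter(f.read().lower().split())   # hash table built for us

if __name__ == "__main__":
    for word, n in word_counts("example.txt").most_common(10):
        print(f"{n:6d}  {word}")
```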
There also seems to be this pervasive assumption that we are at the limits of hardware technology. Microprocessors strictly based on today's technology ARE approaching the limits of thermal dissipation and are even running into quantum tunneling problems. However, there is a lot of headroom left just in current technology. Clockless chips are a good example. They already exist and are in use in many products today. CPUs these days spend an ever higher share of their power and transistor budget just distributing the clock. Get rid of the clock, and you suddenly have a lot more headroom. And then of course there are quantum computers. Already we have workable quantum encryption for sale right now (not exactly the same thing, but no less magical). Even molecular computers. Researchers can already solve incredibly complex problems with "DNA computing" that today's silicon simply could not solve in any reasonable amount of time--say, before the universe winks out in about 100 billion years.
The only problem is that we have no idea, right now, which technologies will pan out and be used in the future. In the days of vacuum tubes there was speculation about the future, but no one (or very few) would have guessed it would wind up being microscopic transistors etched into semiconducting material by light. We can no more pin down which new technologies will emerge now than we could then. We can guess and speculate, though--maybe it will be something already posited as possible, maybe something not a single human has yet thought of.
Already there are holographic "DVDs" being marketed (with non-rotating readers) that can hold something like 100 gigabytes, with technical headroom for a terabyte. They are ultra-fast because the discs don't rotate and thus aren't subject to the physical forces that have current drive speeds nearly maxed out by the limits of affordable material strength (discs flying apart). Who would have guessed that? You were probably thinking DVDs had to rotate, and bam, along comes something that totally changes the rules right out from underneath you.
So my advice is: engage brain before putting mouth into gear. Otherwise your ignorance and short-sightedness shows (as does your tender young age).