Computers of the future

Started by
42 comments, last by Agincourt 22 years, 5 months ago
Ok, I'm doing an informative speech for a gen-ed communications class. I'm thinking along the lines of "Computers of the future" or "What your kids will use to IM all their friends (a big waste of computing power)": two titles, same speech. Anyway, I'm wondering what new computing technologies will be available in 20, 30, or 40 years. I'm making a mental list and have optical computers, biological computers, quantum computers, and maybe some radical advances in good ol' silicon (although I've heard it can't take us much further). What other technologies are out there that have a possibility of making it into our homes, and which are most likely? I'd like to avoid sci-fi. And have you got any credible articles? I'm looking as we speak, but if you've got any really cool stuff, I'm in it even just for a good read. Thanks in advance.

Use the WriteCoolGame() function
Works every time
Parallelism (parallel computing). Why continue to have a single processor split up its time among multiple tasks? The technology is here right now; in the future it will simply become commonplace.
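The idea in a nutshell, as a minimal sketch in modern Python (which obviously didn't exist in this form when this thread was written): instead of one processor time-slicing between tasks, split the work into chunks that several workers handle at once. The function names here (`sum_chunk`, `parallel_sum`) are illustrative, not from any post in this thread.

```python
# Illustrative sketch of parallelism: split a job across workers
# instead of having one processor time-slice through it.
from concurrent.futures import ThreadPoolExecutor

def sum_chunk(chunk):
    """Each worker sums its own slice of the data independently."""
    return sum(chunk)

def parallel_sum(data, workers=4):
    # Split the data into roughly equal chunks, one per worker.
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    # Run the chunks concurrently, then combine the partial results.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(sum_chunk, chunks))

print(parallel_sum(list(range(1000))))  # same answer as sum(range(1000)): 499500
```

Whether this actually runs faster depends on the workload and the hardware, but the structure (partition, compute in parallel, combine) is the core of it.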


I wanna work for Microsoft!
Thanks again Oluseyi. That's the second time you've helped me this week, I believe. I'll look that one up. That seems like a likely one.

Use the WriteCoolGame() function
Works every time
What kind of computer does the Starship Enterprise use?
ha

Use the WriteCoolGame() function
Works every time
Well, I'm not sure if it's true or not, but I heard the physical limit of a regular processor is 2 GHz; after that you have to start parallel processing or you won't get any faster.

Oh, and the Enterprise supposedly used positronic computers, which are of the same vein as their imaginary transporters and replicators. They don't exist and prolly never will. (But who knows!)

--
Relee the Squirrel
Distributed processing. As we begin to have more and more devices (and as devices become more and more miniaturized; look at what you can do with today's cellphone/PDAs), many of our tasks will migrate from the "central" PC to other devices. For example, IM isn't terribly popular in Japan due to DoCoMo's wireless web applications (and 1.2 million Koreans have subscribed to new 3G services for a year).

Voice recognition and speech/natural language synthesis. Rather than having a spreadsheet or a word processor, we'd be able to dictate our documents and transactions to the computer (which means we might not need keyboards, mice, and screens). As a matter of fact, other than software/firmware development and games, there'll be little use for input devices in general. I think that eventually we'll have what I like to call "invisible computers."


I wanna work for Microsoft!
On the topic of HCI, I really like the "Starfire" office setup as described by HCI guru Bruce Tognazzini. Check out a quick site based on the idea here: http://www.asktog.com/starfire/starfireHome.html. This is more of an "in the near future" idea, not more than a decade or two away (I hope!!)

Basically, we do away with the monitor, keyboard, and mouse, and use the entire desktop for both input and output. Drag documents around the "screen" and such. Imagine a monitor the size of your desk! (Plus a good portion of the wall behind it as well...)

We're actually not that far away with plasma screens and those new "paper thin" displays...

codeka.com - Just click it.
No input devices? Make sure your passwords are hard to pronounce
-----------------------
"When I have a problem on an Nvidia, I assume that it is my fault. With anyone else's drivers, I assume it is their fault" - John Carmack
Silicon Graphics Industries have already ditched silicon chips in favour of cobalt-based processors in many of their machines. Their integrated architecture also gives them an edge: the RAM is physically plugged into the hard drive and the main processor plugged into the RAM. This all saves valuable nanoseconds, as 1 millimetre is a large distance for an electron to be travelling.

I don't see quantum computers happening anytime soon, since nobody really knows enough about quantum physics to make one. In theory a quantum computer would be infinitely fast. Doesn't sound likely in practice.

Biological processors have been constructed successfully to perform simple mathematical tasks such as addition. I have heard rumours of atomic processors being investigated, where a single molecule is used to store a binary digit; a matrix of these forms the processor, as for conventional chips. I think the theory was proved to be sound, but the costs of development were too astronomical even for the military.
Geocyte Has Committed Suicide.

This topic is closed to new replies.
