Agincourt

Computers of the future

Ok, I'm doing an informative speech for a gen-ed comms class. I'm thinking along the lines of "Computers of the future" or "What your kids will use to IM all their friends (a big waste of computing power)" - two titles, same speech. Anyway, I'm thinking about what new computing technologies will be available in 20, 30, or 40 years. I'm making a mental list and have optical computers, biological computers, quantum computers, and maybe some radical advances in good ol' silicon (although I've heard it can't take us much further). What other technologies are out there that have a possibility of making it into our homes, and which are most likely? I'd like to avoid sci-fi. And have you got any credible articles? I'm looking as we speak, but if you've got any really cool stuff, I'm in, even just for a good read. Thanks in advance.

Use the WriteCoolGame() function
Works every time

Well, I'm not sure if it's true or not, but I heard the physical limit of a regular processor is 2GHz; after that you have to start parallel processing or you won't get any faster.
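
For what it's worth, parallel processing isn't a free lunch either: the serial part of a program caps the speedup (Amdahl's law), so doubling the processor count doesn't double the speed. A quick C++ sketch - the 90% parallel fraction is a made-up example value:

#include <cstdio>

// Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the
// fraction of the program that can run in parallel and n is the
// number of processors. p = 0.9 is an invented example value.
int main() {
    const double p = 0.9;
    for (int n = 1; n <= 64; n *= 2) {
        double speedup = 1.0 / ((1.0 - p) + p / n);
        printf("%2d processors -> %5.2fx speedup\n", n, speedup);
    }
    return 0;
}

With 10% of the work stuck serial, even 64 processors buy you less than a 9x speedup.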

Oh, and the Enterprise supposedly used positronic computers, which are of the same vein as their imaginary transporters and replicators. They don't exist and prolly never will. (But who knows!)

--
Relee the Squirrel

Distributed processing. As we begin to have more and more devices (and as devices become more and more miniaturized - look at what you can do with today's cellphone/PDAs), many of our tasks will migrate from the "central" PC to other devices. For example, IM isn't terribly popular in Japan due to DoCoMo's wireless web applications (and 1.2 million Koreans subscribed to new 3G services within a year).

Voice recognition and speech/natural language synthesis. Rather than having a spreadsheet or a word processor, we'd be able to dictate our documents and transactions to the computer (which means we might not need keyboards, mice and screens). As a matter of fact, other than software/firmware development and games, there'll be little use for input devices in general. I think that eventually we'll have what I like to call "invisible computers."



I wanna work for Microsoft!

On the topic of HCI, I really like the "Starfire" office setup as described by HCI guru Bruce Tognazzini. Check out a quick site based on the idea here: http://www.asktog.com/starfire/starfireHome.html. This is more of an "in the near future" idea, not more than a decade or two away (I hope!)

Basically, we do away with the monitor, keyboard and mouse, and use the entire desktop for both input and output. Drag documents around the "screen" and such. Imagine a monitor the size of your desk! (Plus a good portion of the wall behind it as well...)

We're actually not that far away with plasma screens and those new "paper thin" displays...


codeka.com - Just click it.

Silicon Graphics (SGI) have already ditched silicon chips in favour of cobalt-based processors in many of their machines. Their integrated architecture also gives them an edge: the RAM is physically plugged into the hard drive and the main processor plugged into the RAM. This all saves valuable time, as 1 millimetre is a large distance for an electron to be travelling.

I don't see quantum computers happening anytime soon, since nobody really knows enough about quantum physics to make one. In theory a quantum computer would be infinitely fast. Doesn't sound likely in practice.

Biological processors have been constructed successfully to perform simple mathematical tasks such as addition. I have also heard rumours of atomic processors being investigated, where a single molecule is used to store a binary digit and a matrix of these forms the processor, as for conventional chips. I think the theory was proved to be sound, but the costs of development were too astronomical even for the military.
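
For scale, here's a back-of-the-envelope C++ calculation of that propagation delay, assuming signals travel at roughly half the speed of light in a trace (strictly it comes out to picoseconds per millimetre, which only adds up to nanoseconds over centimetres of wiring):

#include <cstdio>

// Back-of-the-envelope signal propagation delay over 1 mm,
// assuming signals travel at roughly 0.5c in a copper trace.
int main() {
    const double c = 3.0e8;              // speed of light, m/s
    const double signal_speed = 0.5 * c; // rough speed in a trace
    const double distance = 1.0e-3;      // 1 millimetre, in metres

    double delay = distance / signal_speed; // ~6.7e-12 seconds
    double clock_period = 1.0e-9;           // one cycle at 1GHz

    printf("Delay over 1 mm: %.2f picoseconds\n", delay * 1e12);
    printf("Fraction of a 1GHz cycle: %.2f%%\n",
           100.0 * delay / clock_period);
    return 0;
}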

I know some computer company was working on AI chips: instead of hard-coding neural networks, we could soon have them built at the hardware level for quick access. Let me know if you're interested and I'll dig up the links.

Cheers,
MK83

Guest Anonymous Poster
Programmable grid arrays - Field-Programmable Gate Arrays (FPGAs), or something like that - are another (new?) technology.

It works something like this:

A standard CPU has a fixed set of transistors set in a fixed order.

The grid array CPU has nothing except the ability to form itself into some "transistor layout". It molds itself to solve the problem. Most likely it's programmed by another "standard" computer; the hardware is created on-the-fly.

NASA has one.
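
The usual reconfigurable building block is a lookup table (LUT): a tiny programmable memory that can stand in for any logic gate. Here's a minimal software model of a 2-input LUT in C++, just to show the idea (a real FPGA wires thousands of these together):

#include <cstdint>
#include <cstdio>

// Software model of a 2-input FPGA lookup table (LUT). The 4-bit
// config value is the truth table, so the same cell can "become"
// AND, OR, XOR, or any other 2-input gate.
struct Lut2 {
    uint8_t config; // bit i holds the output for inputs (a, b) = i

    bool eval(bool a, bool b) const {
        int index = (b << 1) | a;
        return (config >> index) & 1;
    }
};

int main() {
    Lut2 gate{0b1000}; // truth table for AND: only (1,1) -> 1
    printf("AND(1,1) = %d\n", gate.eval(true, true));   // prints 1

    gate.config = 0b0110; // rewrite the bits: now it's XOR
    printf("XOR(1,0) = %d\n", gate.eval(true, false));  // prints 1
    return 0;
}

Reprogramming the chip is just rewriting those configuration bits, which is why it can be erased and "burned" again.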

quote:
Original post by Maximus
No input devices? Make sure your passwords are hard to pronounce

You won't need passwords, as we'll be using biometrics (a combination of voice recognition, retina scans, thumbprints, heat signatures, etc. as appropriate).
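
Just to illustrate, here's a hypothetical sketch in C++ of how several biometric scores might be fused into one accept/reject decision - the weights and threshold are invented, and real systems tune them empirically:

#include <cstdio>

// Hypothetical fusion of biometric match scores (each 0.0-1.0) into
// a single decision. Weights and threshold are invented values.
int main() {
    double voice = 0.82, retina = 0.95, thumbprint = 0.88;

    // Weighted average, favouring the harder-to-fake factors.
    double score = 0.2 * voice + 0.5 * retina + 0.3 * thumbprint;
    const double threshold = 0.85;

    printf("Combined score: %.3f -> %s\n", score,
           score >= threshold ? "access granted" : "access denied");
    return 0;
}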



I wanna work for Microsoft!

quote:
Original post by Dean Harding
Basically, we do away with the monitor, keyboard and mouse, and use the entire desktop for both input and output. Drag documents around the "screen" and such. Imagine a monitor the size of your desk! (Plus a good portion of the wall behind it as well...)

I just wonder about the feasibility of this, though. I mean, if I can dictate my documents to my computer, see a rough draft or finished product, and have that immediately sent to all 1400 branch offices worldwide, why would I want to drag it about the screen? It's like procedural versus OO programming: it requires a mental shift that hasn't hit yet; we're still stuck in our current form of GUI applications. Other than painting applications (which might become redundant, since I'll be able to draw - or describe or measure, in the case of 3D - what I intend and present it to the computer for optical scanning and replication), what need do I really have for drag and drop?

None.

When the interface changes, the entire paradigm will change with it. However, voice-driven computing is still much farther away than what you describe, so my guess is that the "Starfire" office setup might be another stop "along the way."

quote:
We're actually not that far away with plasma screens and those new "paper thin" displays...

Mmmm, plasma... Let's make things better (Philips)



I wanna work for Microsoft!

Guest Anonymous Poster
quote:
Original post by Anonymous Poster
Programmable grid arrays - Field-Programmable Gate Arrays (FPGAs), or something like that - are another (new?) technology.

It works something like this:

A standard CPU has a fixed set of transistors set in a fixed order.

The grid array CPU has nothing except the ability to form itself into some "transistor layout". It molds itself to solve the problem. Most likely it's programmed by another "standard" computer; the hardware is created on-the-fly.

NASA has one.



Those are pretty common in universities/schools that teach electronics, since you can easily 'burn' one with VHDL code, and then just erase it and burn it again if you had bugs or whatever. You can't do very large things, though, since they are basically just a matrix of transistors, some of which are burned/destroyed to make the correct connections. You need an extremely large FPGA to get anything 'advanced' (like a CPU) in it, and I don't think such a large one would be worth the costs - and it won't be as fast as a purpose-made ('real') chip.

Ha. What is this, comedy hour? All these ideas sound nice and fancy, but let's be realistic here. They're expensive, and some are impossible. Anyone who knows business knows that companies don't make stuff to imitate sci-fi. They make what is cheapest. Well, recently Intel has been able to make transistors 3 atoms thick. With this, they predict they can make a single processor go to 6GHz+. When we get to 6GHz, we put in two. When we get to 12, we put in four, and so on. It seems to me that 24GHz of processing power is going to be plenty for the next few years.

As for the elimination of input devices, why would we do this? Typing isn't that difficult (you are programmers, so you should nod now). And mice? They're great. It's not very hard to click on a link or folder. And it may be harder for the computer to tell where you want to go when certain links have the same name. Not only that, but a mouse is much cheaper to make than something that not only recognizes voices, but implements a whole system for detecting what the person wants to do (to the effect of click, double-click, drag, etc.).

One person mentioned touching a screen instead of using a mouse. Well, I predict that the monitor and TV will soon become fused into one. I mean, all an HDTV is is a big monitor. I dunno about you, but when I have a 60-inch monitor, I don't want to do aerobics just to get into MS Word.

There's my little opinion. Feel free to bash all you want.

--Vic--

Edited by - Roof Top Pew Wee on November 5, 2001 10:49:13 AM

quote:
Original post by Roof Top Pew Wee
As for the elimination of input devices, why would we do this? Typing isn't that difficult (you are programmers, so you should nod now). And mice? They're great. It's not very hard to click on a link or folder. And it may be harder for the computer to tell where you want to go when certain links have the same name. Not only that, but a mouse is much cheaper to make than something that not only recognizes voices, but implements a whole system for detecting what the person wants to do (to the effect of click, double-click, drag, etc.).

Sing "If I Were A Paraplegic".



I wanna work for Microsoft!

What's that supposed to mean? You think that we are going to change the medium of input for a minority of the population? I mean, don't get me wrong, I'm not trying to insult anyone, but our society has yet to replace all stairs with ramps because there are a few people in wheelchairs. And we do have the technology for this. And to turn it around, what about the mute? We need to keep keyboards for them.

--Vic--

I think that the entire paradigm of "desktop computing" will disappear with emerging technologies. Once selective audio in noisy environments becomes feasible, why would I need a keyboard (other than for programming and games, which I mentioned above)? If we throw advances in AI into the mix, the concept of programming itself - at all but the lowest levels - may be radically altered. No longer will I need to spend hours encoding obtuse and cryptic sequences of keywords; I'll be able to describe to my development environment (in a human language, I might add) the intent of my algorithm. This description is what will then be transferred from device to device, with each interpreting it as appropriate. If a device is incapable of rendering the instructions therein, it will inform the user (and possibly make suggestions). Sounds far off? How about 50 years?

Input devices will always be with us (until they start hooking people's brains right into computers, at which point I'm jumping ship), but in increasingly less mainstream use. When I can dictate, why type? When I can command and instruct, why point or click? When I can pause, ask for a presentation, and verbally correct (and if I have to get my hands dirty, reposition on screen or gesture), much as a professor with a student, why would I mess with nasty, non-intuitive input devices?

Think about it. The mouse sucks; we just happen to be used to it. The keyboard is even worse; how many people "fingerpick" their letters? Place your computer-illiterate grandma in front of a computer with no assistance or instruction; she'll have no idea what to do.

The concept of a "PC" doesn't have much longer to live - for most people, anyway. Portable devices, roll-up displays, HCI, NLS - those are technologies on the periphery of the market which have the potential to revolutionize computing. The only mainstream use I see remaining for input devices like the mouse or keyboard is games. In an FPS, though, a VR headset and "lightgun" could go a lot further (add force feedback and some way of simulating the weight of heavier guns and you have a winning combination - long live deathmatch!). A "pressure glove" which simulates the response of physical contact with virtual objects is another interesting application. Input devices will remain, but in a new and interesting way.

I can't wait to see what the future brings...



I wanna work for Microsoft!

quote:
Original post by Roof Top Pew Wee
What's that supposed to mean? You think that we are going to change the medium of input for a minority of the population?

No, that's not what I meant. I meant that voice-driven interfaces would level the playing field for them, making them just as productive in any capacity as anybody else (without enormous cost).

quote:
And to turn it around, what about the mute? We need to keep keyboards for them.

I agree entirely. Devices hardly ever die out entirely; they linger on in specific usage. There's also a fascinating little device under development which uses the motion of the eye as a cursor.

If you're blind, mute and paralyzed, though, you'll have to wait for the DirectBrainInput™ addon...



I wanna work for Microsoft!

Only problem being, when they get all this advanced stuff in 40/50 years' time or whatever, I'm gonna be old (going back to the "whack an old person in front of a comp and they have no idea what to do"** thingy), but the old person will be me, thus I'm gonna lose out on a whole buncha cool stuff.

Let's hope that before I'm that old they'll have a device that can hook me up to a comp and transfer my life energy into it, so I can live forever inside a computer. Hooray! Although the idea of living forever is none-too-appealing. Perhaps until someone deletes me, then.

** But then again, when the old people of today were young, they most likely didn't have computers to learn to use; thus when I'm old, I will have a knowledge of how to use them, therefore destroying the idea of my first paragraph!

Quick question: when computers get really good (not that they aren't already), with this "parallel computing" that Oluseyi talked about, are developers really gonna be able to produce games that use all that power? There must be a limit to how advanced AI can get, how good graphics can look, etc...

So really, there's not gonna be a great need for these overly powerful machines of the future.

Just an idea off the top of my head; no doubt someone has a good response.

Haven't read through the entire thread, but I'd like to add that the starship Voyager was the first ship in Starfleet to implement the use of bio-neural gel packs (biological computers)... just my $0.02

quote:
Original post by Roof Top Pew Wee
Ha. What is this, comedy hour?

Take my wife. No, please.

quote:
Anyone who knows business knows that companies don't make stuff to imitate sci-fi. They make what is cheapest.

You seen the iMac? Looks sci-fi to me. I bet they could have made a cheap grey box, but they went for the wacky sci-fi curvo-rama.
quote:
Well, recently Intel has been able to make transistors 3 atoms thick. With this, they predict they can make a single processor go to 6GHz+. When we get to 6GHz, we put in two. When we get to 12, we put in four, and so on. It seems to me that 24GHz of processing power is going to be plenty for the next few years.

That, presumably, depends upon what you're doing with it. Also bear in mind that a processor running at 24GHz will still be limited by the rest of the system.
quote:
As for the elimination of input devices, why would we do this?

Not elimination, augmentation.
quote:
And mice? They're great. It's not very hard to click on a link or folder.

It's easier to tap on a link or folder.
quote:

And it may be harder for the computer to tell where you want to go when certain links have the same name.


Well, it's got to put that 24GHz to some use.
quote:
Not only that, but a mouse is much cheaper to make than something that not only recognizes voices, but implements a whole system for detecting what the person wants to do (to the effect of click, double-click, drag, etc.).

We already have voice recognition software. Any processing complexity should be easy for your 24GHz processor to cope with, and don't forget touch-based input. In all probability, we'll still keep the mouse around (I couldn't play an FPS without one), but we wouldn't use it all the time.
quote:
One person mentioned touching a screen instead of using a mouse. Well, I predict that the monitor and TV will soon become fused into one. I mean, all an HDTV is is a big monitor. I dunno about you, but when I have a 60-inch monitor, I don't want to do aerobics just to get into MS Word.

An image I had in my head was a simple interface panel that ran vertically and horizontally, with a smooth curve in between. You might put a keyboard window in the horizontal section and a document view window in the vertical section. If you wanted to move things about, you could drag the view down to the horizontal section, where it's easy to get at.

Tog also mentions an integrated scanning system: place a paper document face down on the interface, and it'd scan the image into a software document. This kind of scanning technology is probably a way off and will certainly be very expensive at first.

I think that one thing that will definitely happen in the future is genuine 'Plug-and-Play', i.e. you connect a device and it starts working, no questions asked. We can also expect to see larger and faster portable storage devices.

One thing I thought of a while back was display devices with adaptive resolution. Imagine having a GUI running at 64 dpi and viewing a document at 512 dpi on the same device. The lower-resolution GUI is fast to draw, and the higher-resolution document doesn't get the aliasing you probably associate with word processing applications.
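
To put rough numbers on that pixel budget, here's a quick C++ sketch (the 10x8 inch region size is an invented figure):

#include <cstdio>

// Pixel budget for the same screen region at GUI vs. document
// resolution. The 10x8 inch region is invented for illustration.
int main() {
    const double width_in = 10.0, height_in = 8.0;

    long gui_pixels = (long)(width_in * 64) * (long)(height_in * 64);
    long doc_pixels = (long)(width_in * 512) * (long)(height_in * 512);

    printf("GUI at 64 dpi:       %ld pixels\n", gui_pixels);
    printf("Document at 512 dpi: %ld pixels\n", doc_pixels);
    printf("Ratio: %ldx\n", doc_pixels / gui_pixels); // 64x more work
    return 0;
}

Eight times the dpi in each direction means 64 times the pixels, which is exactly why you'd only spend that on the document.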

All your bases belong to us

quote:
Original post by Bezzant
Quick question: when computers get really good (not that they aren't already), with this "parallel computing" that Oluseyi talked about, are developers really gonna be able to produce games that use all that power? There must be a limit to how advanced AI can get, how good graphics can look, etc...

We are the limit to AI. But beyond AI, there are many things we could be doing with that extra processor time. How many games do you know where you can knock a hole in any wall? Where the game map is the size of a planet? Where liquid is simulated at a level that has you fooled? Where things can melt? Where several thousand units with mammal-level intelligence can carry out your plans in realtime?
quote:
So really, there's not gonna be a great need for these overly powerful machines of the future.

Well, there is this.

All your bases belong to us

quote:
Quick question: when computers get really good (not that they aren't already), with this "parallel computing" that Oluseyi talked about, are developers really gonna be able to produce games that use all that power? There must be a limit to how advanced AI can get, how good graphics can look, etc...

In addition to what Mayrel said, graphics will always improve. At least until we get to the point where you can't tell the difference between a game and a movie...

quote:
So really, there's not gonna be a great need for these overly powerful machines of the future.

That's like saying 640K is enough...
