A.I. Microchip PCI Card


I'm very sure someone has had this idea before me; if so, what are they waiting for? My idea is simple. Neural networks and other methods for creating AI agents have all been defined. A company could simply create a chip with around 1 million neurons, and we, the programmers, would fill that chip and make the axonal connections. Then we would say "render", and the chip would process its data, sending the output results to a temporary memory buffer, with which we could decide how to move the alien monster that is creeping through the mothership in our game... OK, I think you guys have the general idea, and no, I don't believe we should stick with neural nets only. We could go several ways.

If AI is one of the biggest pieces of the pie in processing time, then with hardware support, imagine what we could do... OK guys, waiting for your input on this one. (A rough sketch of what I imagine the host side looking like is at the end of this post.)

Hugo Ferreira
UniteK Future
"Concentrate, and you can see it. If you see it, then it is possible. If it is possible, you can Achieve it."
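A minimal sketch of that host side, assuming a purely hypothetical driver API: every name below is invented, and a small software class stands in for the chip so the snippet actually runs.

// Purely hypothetical sketch: no such card or driver exists, every name is made up.
// A tiny software stand-in plays the role of the "AI chip": the host fills it with
// neurons and connections, calls Render(), and reads the result back from a buffer.
#include <cstdio>
#include <vector>

struct Connection { int from, to; float weight; };

class AICardSim {                           // stand-in for the card's driver interface
public:
    explicit AICardSim(int neurons) : activation(neurons, 0.0f) {}
    void Connect(int from, int to, float w) { wires.push_back({from, to, w}); }
    void SetInput(int neuron, float v)      { activation[neuron] = v; }
    void Render() {                         // on real hardware this would run on the card
        std::vector<float> next(activation.size(), 0.0f);
        for (const Connection& c : wires) next[c.to] += activation[c.from] * c.weight;
        activation = next;
    }
    float ReadOutput(int neuron) const { return activation[neuron]; }  // the "temporary memory buffer"
private:
    std::vector<float> activation;
    std::vector<Connection> wires;
};

int main()
{
    AICardSim card(8);                      // the real chip would have ~1 million neurons
    card.Connect(0, 2, 0.9f);               // the programmer makes the "axonal" connections
    card.Connect(1, 2, -0.4f);

    card.SetInput(0, 1.0f);                 // e.g. "player spotted"
    card.SetInput(1, 0.5f);                 // e.g. "monster health is low"
    card.Render();                          // tell the chip to process its data

    printf("monster decision value: %f\n", card.ReadOutput(2));
    return 0;
}

On a real card, the Connect/SetInput/ReadOutput calls would presumably be thin wrappers over bus transfers to and from the chip's memory.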

Share this post


Link to post
Share on other sites
Advertisement
I had this idea already many years ago, but like you, I just waited for something to happen.
I guess the AI industry is far too young and needs to mature before the appropriate hardware can be manufactured.
The main problem today is that nobody really knows how to implement it, and the market is still unsure.

But one thing is important to note:
The one with the first AI PCI card will get VERY RICH fast, because the game industry is crying out for more AI power every day, and the market wants it badly.

Some established hardware vendors must take the first steps, because you and I cannot.



The Game Industry
OpenGL/OpenAL/OpenNL

I've thought of something similar. I came up with an "AI acceleration card", but what you describe looks better. Neural networks on a card is a very good idea. It's a shame I can't help you make it come true; I'm barely surviving here in Argentina.

YOUR IDEA IS GREAT. MAKE MORE PEOPLE AWARE OF IT; MAYBE SOMEONE WITH THE PROPER KNOWLEDGE WILL HELP YOU.

I think the problem with your idea specifically (apart from the usual hardware chicken-and-egg syndrome of the hardware being unsellable until software supports it, and vice versa) is that neural networks are not really that widely used. There also seem to be a lot of different implementations of them, which would presumably involve a lot of different code on the hardware.

I think a more viable proposition would aim at the more generally applicable side of AI. Perhaps hardware-accelerated pathfinding/state search would be more useful in the short term. You could throw a "node buffer" at the card, much like a 3D card takes a vertex buffer or display list, and tell it how you want it processed... it could then return the path for you. I expect a hardware BFS search could be done trivially. The other algorithms sound a bit harder and might require you to adopt "standard" graph layouts to simplify things like heuristic calculation (e.g. specify that the nodes have a 2D or 3D "location" parameter and then choose between Manhattan or Pythagorean distance heuristics).
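To make the node-buffer idea concrete, here is a rough sketch of what the host side might look like. None of this is a real API; the names are invented, and a plain software BFS stands in for what the card would do in hardware:

// Sketch only: a "node buffer" layout plus a software BFS standing in for the card.
#include <cstdio>
#include <queue>
#include <vector>

struct PathNode {                     // one entry of the node buffer handed to the "card"
    float x, y;                       // 2D "location" parameter, usable for distance heuristics
    std::vector<int> neighbours;      // indices of connected nodes
};

// What the card might compute when told "BFS from start to goal".
std::vector<int> HardwareBFS(const std::vector<PathNode>& buffer, int start, int goal)
{
    std::vector<int> parent(buffer.size(), -1);
    std::queue<int> open;
    parent[start] = start;
    open.push(start);
    while (!open.empty()) {
        int n = open.front(); open.pop();
        if (n == goal) break;
        for (int next : buffer[n].neighbours)
            if (parent[next] == -1) { parent[next] = n; open.push(next); }
    }
    std::vector<int> path;            // walk back from the goal to the start
    for (int n = goal; parent[n] != -1 && n != start; n = parent[n]) path.push_back(n);
    path.push_back(start);
    return path;                      // in reverse order, which is enough for the sketch
}

int main()
{
    std::vector<PathNode> buffer = {  // a trivial three-node corridor: 0 - 1 - 2
        { 0.0f, 0.0f, {1} }, { 1.0f, 0.0f, {0, 2} }, { 2.0f, 0.0f, {1} }
    };
    std::vector<int> path = HardwareBFS(buffer, 0, 2);
    printf("path has %d nodes\n", (int)path.size());
    return 0;
}

An A*-style query would presumably just add a goal location and a choice of heuristic to the call.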

I think that once more basic stuff like this is implemented in hardware, you might find more movement towards hardware implementations of neural nets and so on. But I think the problem with comparing AI to 3D graphics is that 3D graphics is more standardised, and there are "correct" ways to do things based on the laws of physics. AI, although just as mathematical in nature, is still heavily theoretical, and therefore harder to pin down to a single specific implementation.

I'd be interested to hear the thoughts of people more qualified than I am, though.

Guest Anonymous Poster
I think the biggest problem with a card that does AI is that it must send the data back. A video card processes data in pretty much a one-way stream; you usually don't get any processed information back. With an AI card you need access to the final product, which means using the bus twice. Since moving data around takes a long time, this is a very big strike against such a card. A card that could be given small programs to run in parallel with the CPU might be a worthwhile investment, but a way to limit the data movement is needed.
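One way to limit that movement would be to batch every agent's query into a single upload and read all the answers back once per frame, rather than paying the bus cost per agent. A rough sketch, with everything invented and a CPU loop standing in for the card:

// Sketch: batch per-agent queries, one round trip per frame instead of one per agent.
#include <cstdio>
#include <vector>

struct AgentQuery  { float distanceToPlayer, health; };   // compact per-agent inputs
struct AgentResult { unsigned char action; };              // a single byte comes back per agent

// Stand-in for "upload the batch, let the card process it, download the batch".
std::vector<AgentResult> ProcessBatchOnCard(const std::vector<AgentQuery>& queries)
{
    std::vector<AgentResult> results(queries.size());
    for (size_t i = 0; i < queries.size(); ++i)
        results[i].action = (queries[i].health < 0.3f) ? 1 /* flee */ : 0 /* attack */;
    return results;
}

int main()
{
    std::vector<AgentQuery> frameQueries = { { 10.0f, 0.9f }, { 3.0f, 0.1f } };
    std::vector<AgentResult> r = ProcessBatchOnCard(frameQueries);  // one round trip for all agents
    printf("agent 0 -> %d, agent 1 -> %d\n", r[0].action, r[1].action);
    return 0;
}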

Obviously you know nothing about AI, but thank you for showing your lack of knowledge in public.

A neural net can be "programmed" to return a single value: a byte. With a byte you can have 256 different instructions, like "I want to go east, so I can eat that guy" or "That guy has too much firepower; I'm going to catch him by surprise". Bandwidth is NO problem.
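For example, decoding that single byte on the host could be as simple as this (purely illustrative; the action names are invented):

// One byte out of the net, up to 256 possible instructions.
#include <cstdio>

enum MonsterAction {
    ACTION_IDLE    = 0,
    ACTION_GO_EAST = 1,   // "I want to go east, so I can eat that guy"
    ACTION_AMBUSH  = 2    // "that guy has too much firepower; catch him by surprise"
    // ...room for 253 more actions in the same byte...
};

void ActOnOutput(unsigned char netOutput)   // the single byte the net returns
{
    switch (netOutput) {
        case ACTION_GO_EAST: printf("moving east\n");    break;
        case ACTION_AMBUSH:  printf("setting ambush\n"); break;
        default:             printf("idle\n");           break;
    }
}

int main() { ActOnOutput(ACTION_GO_EAST); return 0; }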

Anyone else care to join in?
My question would be: Would you buy such a card?

Hugo Ferreira
UniteK Future
"Concentrate, and you can see it. If you see it, then it is possible. If it is possible, you can Achieve it."

Just to rain on everyone's parade...

The idea of a neural net card that can be programmed for specific uses is already done, gone... and on the market. They are called Field Programmable Gate Arrays (FPGAs). You can buy a machine decked out with them for around US$10k, I believe. They're yet to prove their worth over standard (read: static) IC-based systems, but they do come up with some funky results.

Cheers,

Timkin

The idea of an AI Addon Card was discussed at great length years (5+) ago on comp.ai.games, on Gamasutra, and in various AI roundtables at GDC.

Initially I thought it was a great idea; however, over the course of the discussions I came to agree with many opponents of the idea. The one point I was convinced of was that the overwhelming majority of computer game players already find the AI in their computer games "tough" enough to play against now, and thus would have no interest in paying the additional $$$$ for an AI Addon Card. Such a preference would leave the only real market for an AI Addon Card to be the hard-core computer gamers who craved a more challenging AI opponent. That size of market made it hard to justify the R&D and market promotion costs (to convince developers to use the AI Addon Card in their games), and thus it was felt that there was no real market.

Maybe this accounts for the poor acceptance of the FPGA card that Timkin talks about?

Anyway, I just wanted to point out that this is an old idea, and to say that it will be interesting to see if anyone can really make it work now.

Eric

Hey,

I don't know a whole lot about neural nets or AI in general, but this post is in response to the one above.

I believe that with AI, less is more when it comes to computer player difficulty. Let me explain.

If the "AI" in a computer is being computed as a direct result of whatever the player is doing, then of course you could make an unbeatable game/opponent with very simple calculations. Take a game like Mortal Kombat, for example. All the computer opponent has to do is say:



if (high_kick)    // the instant the player throws a high kick...
    block();      // ...the computer opponent blocks it, every time



A very simple example, but it works.

So you see, if the AI were enhanced into more of a brain-like structure that thinks, rather than giving an instant computerized response, it would be more human-like, and not necessarily more difficult.

Just my 2 cents

Guest Anonymous Poster
That is exactly the point.

It's not 'tougher' we are after when coding AI. It's the 'life-like' feel of a given situation and the responses from the opponents.

Who hasn't played against bots for some time, only to quit because the opponents are 'boring' or 'not real people'?

With a real boost in AI power, these situations could be calculated to simulate 'almost real people' as opponents, giving a MUCH more intriguing situation for the player.

It's not only games that can use the power of an Artificial Neural Network Accelerator Card. We could use these neural nets to learn and react to almost ANYTHING.

Here is an example:

A neural net with only a few nodes can learn to distinguish whether a photo of a human face is that of a male or a female.

A game example:

A neural net with only a few nodes can learn to predict the general outcome of a strategy game simply by recognizing the current battlefield.

Another game example:

A neural net would be able to foresee how a human player would solve or react to a given situation in a game, and could therefore choose such behavior to produce a more 'life-like' approach.


As you can see, neural nets are not only useful for games; they are a way of giving the hardware a learning capability.

These judgements could be retrieved almost instantaneously using hardware-accelerated AI.
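As a toy version of the strategy-game example above, a tiny hand-wired net that scores the battlefield might look like this. The weights and inputs are invented; a real net would learn them, and the hypothetical card would evaluate the net in hardware:

// Toy battlefield evaluator: a hand-wired two-hidden-node net, purely illustrative.
#include <cstdio>
#include <cmath>

float Sigmoid(float x) { return 1.0f / (1.0f + std::exp(-x)); }

// Inputs: our unit strength, enemy unit strength, and how much territory we hold (all 0..1).
float PredictWinChance(float ourStrength, float enemyStrength, float territory)
{
    float h1 = Sigmoid(4.0f * (ourStrength - enemyStrength));  // hidden node: strength difference
    float h2 = Sigmoid(3.0f * (territory - 0.5f));             // hidden node: map control
    return Sigmoid(2.0f * h1 + 1.0f * h2 - 1.5f);              // output: estimated chance of winning
}

int main()
{
    printf("even fight, half the map:  %.2f\n", PredictWinChance(0.5f, 0.5f, 0.5f));
    printf("outnumbered, little land:  %.2f\n", PredictWinChance(0.2f, 0.8f, 0.2f));
    return 0;
}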

Perhaps my point got missed?

I totally agree that, with an AI Addon Card, "better" AI could probably be built for computer games.

But is there a sufficient market for such an additional card among the majority of computer game players? When this discussion was held years ago, it was thought not. Has that market situation changed? If so, what evidence can anyone offer to suggest that the typical computer game player would shell out the $$$$ to get an AI Addon Card?

The simple economics are that if there is an insufficient market for the card, then it is unlikely that anyone would invest in developing one for our AI usage.

Eric

I know nothing about proper AI, but I believe that more "lifelike" results can be obtained with current hardware. My opinion is that a lot of contemporary developers use their available cycles and processing power rather carelessly. Black & White (for all its alleged AI shortcomings, it is still a vast improvement over most of the field) runs on the same hardware as Quake III. Demis Hassabis (I hope that's the man's name) has discussed a new method he's created for large-scale AIs.

Restricting problems to a specific domain and focusing on results rather than means - that is, employing heuristics or "synthetic" intelligence (aiming for similar results) rather than artificial intelligence (aiming for a similar process) - may yield the greatest gains in the field of computer gaming. Additional hardware would (currently) only be applicable in industrial AI, where clock cycles and processing power are even more plentiful...

JMHO.

quote:
Original post by Oluseyi
I know nothing about proper AI, but I believe that more "lifelike" results can be obtained with current hardware. My opinion is that a lot of contemporary developers use their available cycles and processing power rather carelessly. Black & White (for all its alleged AI shortcomings, it is still a vast improvement over most of the field) runs on the same hardware as Quake III. Demis Hassabis (I hope that's the man's name) has discussed a new method he's created for large-scale AIs.

Restricting problems to a specific domain and focusing on results rather than means - that is, employing heuristics or "synthetic" intelligence (aiming for similar results) rather than artificial intelligence (aiming for a similar process) - may yield the greatest gains in the field of computer gaming. Additional hardware would (currently) only be applicable in industrial AI, where clock cycles and processing power are even more plentiful...

JMHO.


I mostly agree. One of the promises of an AI Addon Card would be to provide processing power for the exclusive use of AI, so that processing cycles would not be "carelessly" consumed by graphics and the user interface. However, I disagree that additional hardware would only be applicable in industrial AI. Computer gaming AI has been suffering from a dearth of available resources, and even now, with more publisher awareness of AI (I can't call it emphasis, because it's not), the resources being made available are still less than those for graphics and UI. The new generation of graphics cards (GeForce 3, etc.) is even helping by taking over more of the graphics workload than ever before.

Also, after visiting with Demis at GDC 2001 and attending his lecture, I am looking forward to his game Revolution and the AI it is hoped to demonstrate. Demis has some ambitious plans, and I hope they succeed.

Eric
