AI and computer graphics?

THE_RB

Joined Feb 11, 2008
5,438
To NSAspook: Sure, it's great seeing all the tech advances where the abilities of machines get better. The point I was making is that trying to make a machine that is *like us* is neither the best way to make intelligent machines nor the path to the highest form of machine intelligence.

Intelligence is the ability to process, and in many ways your PC is much more intelligent than you in terms of its math intellect and its ability to process fast streams of data. In some ways you are more intelligent. Your car is much better at movement than you are, although you can play the piano better (which is just a different movement task).

I have an issue with the assumption that a robot must be developed to be just like a human, as if that were the highest development there is. The best form for advanced robots is a robot form. :)

...
No, machines do not think, and humans do. This is not just about being able to hold an intelligent conversation.
That's a strong statement and I don't agree with it, although our individual definitions of "think" are probably very different.

I design robot systems to make machines "think", and to me the term means that the machine receives sensory input, processes it, makes decisions on a course of action, and carries out those actions.

For instance, the new world-record micromouse maze solver:
http://www.geek.com/articles/chips/min7-micromouse-robot-solves-maze-in-3-921-seconds-20111122/
traverses the maze in 3.9 seconds! It's a good example of a machine that does the thinking part very similarly to a human. First it navigates the maze slowly (as you would), learning where the corners are, and finds the goal. Then it sits and thinks, as you would, and plots the fastest path to the goal. Finally it runs at full speed, as you would, using its brain for the fastest physical performance and turning at the corners it remembers, but not thinking much along the way.
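The explore-then-plan-then-sprint strategy can be sketched in a few lines. This is a hypothetical simplification of my own, not the Min7's actual firmware: assume the slow exploration run has already produced a grid map (0 = open cell, 1 = wall), and plan the fastest path with a breadth-first search (real micromice typically use a flood-fill variant of the same idea).

```python
from collections import deque

def shortest_path(maze, start, goal):
    """Breadth-first search over a learned grid map.
    maze: list of rows, 0 = open cell, 1 = wall.
    Returns the list of cells from start to goal, or None."""
    rows, cols = len(maze), len(maze[0])
    prev = {start: None}          # each visited cell -> its predecessor
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            # Walk the predecessor chain back to the start.
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and maze[nr][nc] == 0 and (nr, nc) not in prev):
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # goal unreachable

# A tiny 4x4 "maze" learned during the slow exploration run.
maze = [[0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0],
        [0, 1, 1, 0]]
path = shortest_path(maze, (0, 0), (3, 3))
```

The "sit and think" phase is just this search; the sprint phase then replays `path` at full speed with no further deliberation.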
 

Wendy

Joined Mar 24, 2008
21,906
Your machine could not survive in the wild. Even if it had the tools, it couldn't survive. Robots can do set tasks; this is not intelligence. They can do some problem solving, but in most cases the problems have to be set up for them. There is a long way to go IMO, and we are not even close to a good start.

You contend that humans are poor models to emulate. I contend that humans (and other biological systems) are the only working models we have.
 

THE_RB

Joined Feb 11, 2008
5,438
Your machine could not survive in the wild. Even if it had the tools, it couldn't survive. Robots can do set tasks; this is not intelligence. They can do some problem solving, but in most cases the problems have to be set up for them. There is a long way to go IMO, and we are not even close to a good start.
...
I respect your point and am not arguing with it, apart from arguing the specific definition of "intelligence". You seem to place a very high emphasis on abstract learning and adaptability as defining intelligence. But those are just one small subset of intelligence, which to me spans abilities across a very broad range of tasks.

...
You contend that humans are poor models to emulate. I contend that humans (and other biological systems) are the only working models we have.
Not at all! Many small biological entities, like insects, are of very limited intelligence, and robotics pioneers have shown that similar biological behaviour can be demonstrated with brains made from just a few transistors. Even while mimicking biological behaviour, these robots are not as intelligent as your PC, which can perform massively complex tasks including deduction and forms of learning.
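The classic illustration of this few-transistor style of "brain" is a Braitenberg-type vehicle: two light sensors cross-wired to two motors produce convincing light-seeking behaviour with no planning or learning at all. Here is a minimal simulation sketch of the idea; every name and constant (sensor spacing, gains, step sizes) is my own illustration, not any particular robot:

```python
import math

def braitenberg_step(x, y, heading, light_x, light_y):
    """One update of a light-seeking Braitenberg-style vehicle:
    each motor is driven by the sensor on the OPPOSITE side,
    which steers the vehicle toward the light source."""
    def sensor_reading(angle_offset):
        # Sensor mounted 0.1 units from the centre, angled off the heading.
        sx = x + 0.1 * math.cos(heading + angle_offset)
        sy = y + 0.1 * math.sin(heading + angle_offset)
        d2 = (light_x - sx) ** 2 + (light_y - sy) ** 2
        return 1.0 / (1.0 + d2)          # brighter when closer
    left, right = sensor_reading(0.5), sensor_reading(-0.5)
    left_motor, right_motor = right, left         # cross-coupled wiring
    heading += (right_motor - left_motor) * 5.0   # differential turning
    speed = 0.02 * (left_motor + right_motor) / 2
    return (x + speed * math.cos(heading),
            y + speed * math.sin(heading),
            heading)

# Start at the origin facing away from a light at (1, 0)
# and watch the vehicle home in on it.
x, y, h = 0.0, 0.0, 2.0
min_dist = math.hypot(1.0 - x, 0.0 - y)
for _ in range(500):
    x, y, h = braitenberg_step(x, y, h, 1.0, 0.0)
    min_dist = min(min_dist, math.hypot(1.0 - x, 0.0 - y))
```

The point is that purely reactive wiring like this looks purposeful from the outside, yet involves nothing a PC would recognise as deduction or learning.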

If you were designing a form of transport, which is better: a car with pneumatic tyres and smooth 100 MPH performance, or a transport machine built with two human-like legs to carry us to work? Just because something works for us, or has worked for us through the ages, does not mean it is the highest form of development or that we should make machines mimic it.

Thinking as such is just another task, and we are simply not that good at it. Machines are already outthinking us in a huge variety of tasks, in the same way they are outrunning us to work, outflying birds, outswimming fish, and so on.

Trying to make machines think in the fuzzy, fault-prone way a human mess of neurons does is great as a research goal to increase understanding, but it is not (in my opinion) the "grail" of AI or some measure of the ultimate AI achievement. Machines should (and will) continue to develop in the areas where they already exceed our abilities, not be forced to pretend to be human.
 