The Orion's Arm Universe Project Forums

Poll: Does language (as a strategy for getting stuff done) imply self-awareness?
(first option label not shown): 2 votes — 20.00%
I think so: 2 votes — 20.00%
I doubt it: 5 votes — 50.00%
(last option label not shown): 1 vote — 10.00%
Total: 10 vote(s) — 100%

On creating the kind of AI that humans imagine
Well, whatever it is.... sufficient or not, it's definitely necessary.

I don't think that what ants and bees have is language. What they have is ... maybe "biological signalling" is the right concept? First, there's a very limited range of things it can express, and the expression is invariant (not learned). An ant isn't "communicating" - its feet and its glands just do what they do as a result of the ant being well-nourished or having walked x steps since leaving the nest or whatever. And other ants don't receive communication as such; chemical traces they pick up with their feet have direct and specific effects on their neurology and the behavior it directs. There is absolutely no interpretation or evaluation or decision that happens there.

It's sort of like the difference between animal cries and language, only more so. The parts of the brain that are active when vervet or rhesus monkeys vocalize particular cries are the parts active in the human brain when we yell "Ow!" or "Oh Shit!" or similar without thinking. When we use language, with structure and syntax, entirely different (and distant) parts of the brain are active, and current thought is that there is no reason to suppose that the one evolved from, or otherwise has anything to do with, the other. When monkeys hear monkey cries, there is no flexibility in interpretation. The "OhShitFearOfSomethingFlying" cry presents these critters with a choice of running up a tree or doing nothing. There is evidently some degree of decision about *whether* to act, but never any different action.

But is this all nitpicking? Apes reared to use sign language are able to communicate more sophisticated concepts; the signs are symbols, and they're getting used as symbols. And the apes are using different parts of the brain when they sign than when they vocalize anything. Linguists call what they're doing 'protolanguage' because the symbols aren't related to one another in any communicated way. That is, there's no "syntax." The ape can throw out symbols for "cat" and "eat" and "rat" (or whatever), but it throws them out in any order at all, and there's no clue other than domain knowledge whether it was the cat that (killed and) ate the rat or the rat that is eating (as carrion) the cat. Still, the ape clearly has an understanding of itself as a being that's communicating, and an understanding that there's an 'other' that it communicates with, and as far as I'm concerned that's the same kind of awareness that a human using language to communicate has.

Hmmm.... Well, sufficient or not, it's clearly necessary for an AI to have it, as steve said, if it's to have a consciousness the same "shape" as ours.

Messages In This Thread
RE: On creating the kind of AI that humans imagine - by Bear - 01-21-2017, 04:00 AM
