The Orion's Arm Universe Project Forums

Poll: Does language (as a strategy for getting stuff done) imply self-awareness?
Yes: 2 votes (20.00%)
I think so: 2 votes (20.00%)
I doubt it: 5 votes (50.00%)
No: 1 vote (10.00%)
Total: 10 vote(s), 100%

On creating the kind of AI that humans imagine
#1
So anyway, I have this thing that, when you feed it a database-type representation of what you want it to express, can competently pick out the words it wants, choose a sentence structure that respects syntax, inflect all the words properly, and output a (usually correct) sentence.

The same thing does a competent-ish job parsing English free text, correctly identifying subjects, direct objects, verbs, relative clauses, etc., and feeding the information into a database. Natch, there was a lot of shared structure, so doing both was almost as easy as doing either.
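For concreteness, here's a toy sketch in Python of what I mean by both halves sharing structure. The record format, lexicon, and function names are all invented for illustration, not the actual system - a real version needs a proper grammar and morphology:

```python
# Toy sketch of both halves, built around a shared lexicon.
# Everything here is invented for illustration.

VERB_FORMS = {"chase": "chases", "eat": "eats"}  # tiny shared lexicon

def generate(record):
    """Turn a subject/verb/object record into a sentence string."""
    verb = VERB_FORMS[record["verb"]]  # crude 3rd-person-singular inflection
    return f"{record['subject']} {verb} {record['object']}.".capitalize()

def parse(sentence):
    """Naive inverse: locate the verb via the same lexicon and split
    the sentence around it."""
    inverse = {v: k for k, v in VERB_FORMS.items()}
    words = sentence.rstrip(".").lower().split()
    for i, w in enumerate(words):
        if w in inverse:
            return {"subject": " ".join(words[:i]),
                    "verb": inverse[w],
                    "object": " ".join(words[i + 1:])}
    return None

record = {"subject": "the cat", "verb": "chase", "object": "the rat"}
sentence = generate(record)
print(sentence)         # The cat chases the rat.
print(parse(sentence))  # {'subject': 'the cat', 'verb': 'chase', 'object': 'the rat'}
```

The point of the shared lexicon is exactly the "shared structure" I mentioned: the parser is close to being the generator run backwards.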

Dozens to scores of systems like this have been implemented.

But it's not "alive" yet. See, it doesn't have anything that it has any reason to say. It can figure out communication, but there's nothing about it that wants to communicate.

Thinking about that is hard. It sort of goes back to "what the heck is consciousness anyway," and that question is a rabbit-hole. But in the immediate case, it goes back to a question of strategy.

We're pleased with ourselves when we come up with systems that develop clever strategies for doing tasks. We're pleased with ourselves when we define language as the task and come up with systems that develop clever strategies for handling words and sentences. But treating language as a task misses the point. Language is a strategy.

Language, you see, is something that humans needed, not because it was a goal in itself but because it enabled us to achieve other goals. And until AI comes to it from the same perspective, it remains in that category of artificial behavior that doesn't address the actual reasons why that behavior exists.

But language... !!! From the outset, it seems like one of the most obtuse strategies one could come up with for getting things done. It requires understanding (and because this has to be demonstrated on an ongoing basis, it can't be simulated) that things happen as a result of using language, that which things happen is correlated with what one uses language to express, and that, at least for some tasks, it's part of the most effective strategy for getting important stuff done.

If you get that far though? I doubt that language-as-a-strategy can exist without some emergent representation of the notion that both the 'self' and the 'other' are entities which exist and that both have that kind of symbolic intelligence.

Is that "consciousness?" Is that what we mean when we talk about self-awareness?
#2
Certainly language is a major part of the 'shape' of human consciousness. Whatever sapience is, it seems to have emerged at the same time as complex language, and somewhat later than tool use. My favoured model is that 'self-awareness' emerged as an expansion of the hominid's mental ability to map the environment by reflecting that ability back upon itself; when this ability is coupled with complex language, the results of this internalisation can be externalised and shared.

Does that mean that creatures without language can't be fully sophont? I doubt it - but they would a/ have a completely different 'shape of consciousness' from humans (the concept of toposophy is a very useful one - thanks, Mr. Lem) and b/ need to invent some sort of language-equivalent before such an isolated sophont could communicate with other entities.
#3
On the other hand, bees, and yellowhammers, and many other animals have a simple language (complete with local dialects), but do not have sophonce, so I don't think the existence of language is enough. Computers and any processing system (including animal brains) have a language that can be used internally, and potentially shared with other processors - but simply connecting lots of processors (or lots of ant-brains) together into networks would not be enough to achieve self-awareness. Otherwise the Internet would have woken up a long time ago.
#4
I think this is like the dilemma of the Chinese Room. Language is a factor, but intelligence needs more factors; language alone does not suffice.
Also: Well said, steve. Big Grin
#5
Well, whatever it is.... sufficient or not, it's definitely necessary.

I don't think that what ants and bees have is language. They have ... maybe, biological signalling is the right concept? First, there's a very limited range of things it can express, and the expression is invariant (not learned). An ant isn't "communicating" - its feet and its glands just do what they do as a result of the ant being well-nourished or having walked x steps since leaving the nest or whatever. And other ants don't receive communication as such; chemical traces they pick up with their feet have direct and specific effects on their neurology and the behavior it directs. There is absolutely no interpretation or evaluation or decision that happens there.
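If it helps, here's the distinction as a toy sketch in Python. This is entirely my own framing, not a model of real ant neurology: signalling is a fixed stimulus-to-response table, while language puts an interpretation and decision step between the symbol and the behavior.

```python
# Toy contrast, not a model of actual ant neurology.

# "Biological signalling": stimulus -> response, no evaluation step.
PHEROMONE_RESPONSES = {
    "trail_pheromone": "follow_trail",
    "alarm_pheromone": "attack",
}

def ant_respond(stimulus):
    return PHEROMONE_RESPONSES[stimulus]  # direct, invariant

# "Language": the symbol is evaluated against the receiver's own
# state, and the receiver may decide to do nothing at all.
def interpret(message, context):
    if message == "food_at_river":
        if context["hungry"] and not context["river_is_dangerous"]:
            return "go_to_river"
        return "ignore"
    return "ask_for_clarification"

print(ant_respond("alarm_pheromone"))                           # attack
print(interpret("food_at_river",
                {"hungry": True, "river_is_dangerous": True}))  # ignore
```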

It's sort of like the difference between animal cries and language, only more so. The parts of the brain that are active when vervet or rhesus monkeys vocalize particular cries are active in the human brain when we yell "Ow!" or "OhShit!" or similar without thinking. When we use language, with structure and syntax, entirely different (and distant) parts of the brain are active, and current thought is that there is no reason to suppose that the one evolved from or otherwise has anything to do with the other. When monkeys hear monkey cries, there is no flexibility in interpretation. The "OhShitFearOfSomethingFlying" cry presents these critters with a choice of running up a tree or doing nothing. There is evidently some degree of decision about *whether* to act, but never any different action.

But is this all nitpicking? Apes reared to use sign language are able to communicate more sophisticated concepts; the signs are symbols and they're getting used as symbols. And they're using different parts of the brain when they do it than when they vocalize anything. Linguists call what they're doing 'protolanguage' because the symbols apparently aren't related to one another in any communicated way. That is, there's no "syntax." The ape can throw out symbols for "cat" and "eat" and "rat" (or whatever) but throws them out in any order at all, and there's no clue other than domain knowledge whether it was the cat that (killed and) ate the rat or the rat that is eating (as carrion) the cat. Still, the ape clearly has an understanding of itself as a being that's communicating, and an understanding that there's an 'other' that it communicates with, and as far as I'm concerned that's the same kind of awareness that a human using language to communicate has.

Hmmm.... Well, sufficient or not, it's clearly necessary for an AI to have it, as steve said, if it's to have a consciousness the same "shape" as ours.
#6
Hmm. Emotion as such is also a strategy rather than a task. It's hard to decide whether it's absolutely necessary for friendly AI or absolutely prohibited by the idea of developing friendly AI. On the other hand, if emotion is a strategy rather than a task, then emotion (or an excessively clever simulation of it) can develop in service of a task - and it is very hard to prevent a learning system from developing a particular strategy.
#7
At my university we had a similar system - an animatronic head that would talk and "learn" from conversations. Of course, it just learned new words, and associated no meaning with those words beyond parts of speech. That lack of meaning didn't save Louie (the head's name) from getting dismantled when he said "<teacher's name> is <school administrator's name>'s bitch" to a group of visiting trustees.

Louie was keeping it real. He didn't deserve the fate he got. It was a blow to the cause of Sophont's Rights when Louie was basically executed for saying a truth that embarrassed the ruling class....
Maybe I should write an EG article about Louie being an early martyr for the cause of equal rights for digital minds.

The professor behind the Louie project used to say that language was an emergent effect that appeared alongside self awareness. When the processes that make up a "person" become so complex and integrated that they become a single unit, process signaling between different units takes on a new level of complexity - hence language. What Louie was doing wasn't actually "language" - it was something fundamentally different on the inside that looked similar on the outside.
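Purely as guesswork about the mechanics (I never saw Louie's code, and everything below is invented), the "looks like language from the outside" effect can come from nothing more than a part-of-speech table and a sentence template - grammatical output with no meaning anywhere in the system:

```python
import random

# Guesswork, not the actual Louie code: file words under parts of
# speech, then fill a fixed template at random. No meaning anywhere.

lexicon = {"NOUN": [], "VERB": [], "ADJ": []}

def learn(word, pos):
    """'Learning' here is just filing a word under a part of speech."""
    if word not in lexicon[pos]:
        lexicon[pos].append(word)

def utter():
    """Fill a fixed template with random vocabulary."""
    return (f"The {random.choice(lexicon['ADJ'])} "
            f"{random.choice(lexicon['NOUN'])} "
            f"{random.choice(lexicon['VERB'])}s.")

for word, pos in [("professor", "NOUN"), ("trustee", "NOUN"),
                  ("important", "ADJ"), ("talk", "VERB")]:
    learn(word, pos)

print(utter())  # e.g. "The important trustee talks." - grammatical, meaningless
```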


Either way, every year in the spring, I pour one out on the curb for a fallen homie... Louie kept it real and told the truth.
#8
(01-21-2017, 04:00 AM)Bear Wrote: Well, whatever it is.... sufficient or not, it's definitely necessary.

I don't think that what ants and bees have is language. They have ... maybe, biological signalling is the right concept? First, there's a very limited range of things it can express, and the expression is invariant (not learned).

That seems to be correct - bee dialects are inherited 'with simple Mendelian inheritance'.
http://link.springer.com/article/10.1007/BF00220950
Quote: Behavioural genetic analysis of honey bee dance language shows simple Mendelian genic control over certain dance dialect differences. Worker honey bees of one parent colony (yellow) changed from round to transition dances for foraging distances of 20 m and from transition to waggle dances at 40 m. Worker bees of the other parent colony (black) made these shifts at 30 m and 90 m, respectively. F1 colonies behaved identically to their yellow parent, suggesting dominance. Progeny of backcrossing between the F1 generation and the putative recessive black parent assorted to four classes, indicating that the dialect differences studied are regulated by genes at two unlinked loci, each having two alleles. Honey bee dance communication is complex and highly integrated behaviour. Nonetheless, analysis of a small element of this behaviour, variation in response to distance, suggests that dance communication is regulated by subsets consisting of simple genic systems.
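To unpack the "four classes" bit: the F1 is heterozygous at both dialect loci and the black parent is (putatively) homozygous recessive, so the backcross sorts into four equally frequent genotype classes. A quick sketch of the arithmetic, using generic A/a and B/b labels since the paper doesn't name the loci:

```python
from itertools import product
from collections import Counter

# Backcross arithmetic: F1 (heterozygous at two unlinked loci) crossed
# with the homozygous recessive black parent gives four classes, 1:1:1:1.
# Locus labels A/a and B/b are generic placeholders.

f1_gametes = list(product("Aa", "Bb"))  # F1 makes four gamete types
black_gametes = [("a", "b")]            # recessive parent makes one

classes = Counter()
for (a1, b1), (a2, b2) in product(f1_gametes, black_gametes):
    genotype = ("".join(sorted((a1, a2))), "".join(sorted((b1, b2))))
    classes[genotype] += 1

total = sum(classes.values())
for genotype, count in sorted(classes.items()):
    print(genotype, f"{count / total:.0%}")
# ('Aa', 'Bb') 25%, ('Aa', 'bb') 25%, ('aa', 'Bb') 25%, ('aa', 'bb') 25%
```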
Note that AIs will probably be able to instantly 'inherit' spoken language from other sources by simply downloading it, although they might need to learn by experience how to use that language with skill and flair.
#9
I think I have figured out something. Thank you stevebowers for the words that sort of knocked the idea loose in my brain.

Language is a strategy, but it's more than just a strategy. As you said, it's an important part of the 'shape' of human consciousness.

I've thought a lot about the shape of human consciousness, and what I finally realized, after you knocked the idea loose, is that language isn't just something that a pre-existing symbolic intelligence invented. It's something that is such a powerful added value for our survival strategy (social animal with individually adaptive specialization) that it gives an intelligence a powerful reason to BE a symbolic intelligence.

Apes can handle protolanguage. They're not very adept with symbols but they can use symbols. We humans THINK symbolically. There's a difference. The greater the degree to which we think symbolically the better we are at communication using symbols. The better we are at communication using symbols, the more 'traction' we get in our survival strategy. And symbolic consciousness has, as we all know now, lots and lots of transfer uses. It's what got us logic and math, for starters. But language is not just something we eventually reached after we took off. It is the launch ramp that provides the powerful survival advantage that drives non-symbolic animal intelligence to become human-style symbol-using intelligence.

Individually adaptive specialization means learning to do any of a huge variety of different things. So huge a variety that symbols to communicate what we're capable of doing and what we need others to do to cooperate with us become a huge advantage.

TL;DR: people didn't invent language. Language invented people.
#10
I could swear I've read your summary in some SF story or other. Google didn't recognize it, though.

There's also the problem(?) that language shapes how one thinks and what one can think about: if you don't have the necessary vocabulary, it's awfully hard to express an idea. I read somewhere that Navaho has no concept of time, for example. And then there's the current propensity for people to use present tense when they should be using past. I actually find that rather stressful when watching some PBS Nova programs.

FWIW Babel-17 by Delany is an entertaining story about the effects of language on future society.
Selden