The Orion's Arm Universe Project Forums





Poll: Does language (as a strategy for getting stuff done) imply self-awareness?
Yes: 2 votes (20.00%)
I think so: 2 votes (20.00%)
I doubt it: 5 votes (50.00%)
No: 1 vote (10.00%)
Total: 10 vote(s), 100%

On creating the kind of AI that humans imagine
#1
So anyway, I have this thing that, when you feed it a database-type representation of what you want it to express, can competently pick out the words it wants, pick a sentence structure respecting syntax, inflect all the words properly, and output a (usually correct) sentence.

The same thing does a competent-ish job parsing English free text, correctly identifying subjects, direct objects, verbs, relative clauses, etc., and feeding the information into a database. Natch, there was a lot of shared structure, so doing both was almost as easy as doing either.
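To make the round trip concrete, here's a deliberately tiny sketch of the idea (my own toy illustration, not the actual system described above, which handles real syntax and inflection): parse a sentence into a database-style record, then generate a sentence back from such a record, with the two directions sharing one vocabulary of roles.

```python
# Toy illustration of the parse <-> generate round trip described above.
# A real system would handle inflection, relative clauses, etc.; this
# only covers three-word "subject verb object" sentences.
import re

def parse(sentence):
    """Naively split a 'subject verb object' sentence into a record."""
    words = re.findall(r"[a-z]+", sentence.lower())
    if len(words) != 3:
        raise ValueError("toy parser only handles 3-word sentences")
    subject, verb, obj = words
    return {"subject": subject, "verb": verb, "object": obj}

def generate(record):
    """Render a stored record back into an English sentence."""
    return f"{record['subject'].capitalize()} {record['verb']} {record['object']}."

fact = parse("Cats chase mice.")
print(fact)            # {'subject': 'cats', 'verb': 'chase', 'object': 'mice'}
print(generate(fact))  # Cats chase mice.
```

The shared-structure point shows up even at this scale: `parse` and `generate` are near-mirrors over the same role labels, which is why doing both is almost as easy as doing either.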

Dozens to scores of systems like this have been implemented.

But it's not "alive" yet. See, it doesn't have anything that it has any reason to say. It can figure out communication, but there's nothing about it that wants to communicate.

Thinking about that is hard. It sort of goes back to "what the heck is consciousness anyway," and that question is a rabbit-hole. But in the immediate case, it goes back to a question of strategy.

We're pleased with ourselves when we come up with systems that develop clever strategies for doing tasks. We're pleased with ourselves when we define language as the task and come up with systems that develop clever strategies for handling words and sentences. But treating language as a task misses the point. Language is a strategy.

Language, you see, is something that humans needed, not because it was a goal in itself but because it enabled us to achieve other goals. And until AI comes to it from the same perspective, it remains in that category of artificial behavior that doesn't address the actual reasons why that behavior exists.

But language... !!! From the outset, it seems like one of the most oblique strategies one could come up with for getting things done. It requires understanding, demonstrated on an ongoing basis rather than merely simulated, that things happen as a result of using language; that which things happen is correlated with what one uses language to express; and that, for at least some tasks, language is part of the most effective strategy for getting important stuff done.

If you get that far, though? I doubt that language-as-a-strategy can exist without some emergent representation of the notion that both the 'self' and the 'other' are entities that exist, and that both have that kind of symbolic intelligence.

Is that "consciousness?" Is that what we mean when we talk about self-awareness?


Messages In This Thread
On creating the kind of AI that humans imagine - by Bear - 01-20-2017, 03:53 PM


