The Orion's Arm Universe Project Forums

What scientific ideas should society get rid of?
If Artificial Intelligence is impossible, then the existence of intelligent life on this planet other than Homo sapiens, as well as the existence of extraterrestrial intelligence capable of building a complex civilisation, should be impossible too. Homo sapiens would then be the only intelligent species in the whole universe. But that's only one counterargument to Schank.

The obvious major counterargument is of course Homo sapiens itself. Humans exist thanks to the laws of physics that govern this universe, so the creation of other forms of intelligence through those same laws should be possible as well.

In any case, when I look at Karl Sims' virtually evolved creatures:

I find the moment at 3 minutes 16 seconds of the video especially interesting. Both creatures are competing for possession of the "precious green cube". Unlike before, where each creature simply tried to reach the cube as fast as possible before its rival could, the genetic algorithm governing the simulation eventually found a solution in which it is more advantageous to hinder or even "damage" the competing creature, because a "damaged" creature can no longer reach the cube at all, leaving it to the attacker. In other words, competition for resources always includes, among its possible solutions, what we call 'aggression'.
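The dynamic can be illustrated with a toy genetic algorithm (this is my own sketch, not Sims' actual system; the genes, the round-robin contest, and all numbers are invented for illustration). Each creature has a "speed" gene and a "block" gene; blocking can disable a rival outright but carries a speed cost, so evolution gets to weigh aggression against pure racing:

```python
import random

random.seed(1)

def random_creature():
    # Two evolvable genes in [0, 1]: running speed and "aggression".
    return {"speed": random.random(), "block": random.random()}

def contest(a, b):
    """Return the winner of one race for the cube."""
    # Each creature may "damage" its rival with probability
    # proportional to its block gene.
    a_damaged = random.random() < b["block"] * 0.5
    b_damaged = random.random() < a["block"] * 0.5
    if a_damaged and not b_damaged:
        return b
    if b_damaged and not a_damaged:
        return a
    # Neither (or both) damaged: the faster runner wins,
    # but aggression costs some speed.
    a_eff = a["speed"] - 0.3 * a["block"]
    b_eff = b["speed"] - 0.3 * b["block"]
    return a if a_eff >= b_eff else b

def mutate(g):
    # Small Gaussian mutation, clamped to the gene range.
    return min(1.0, max(0.0, g + random.gauss(0, 0.05)))

def evolve(pop_size=40, generations=80):
    pop = [random_creature() for _ in range(pop_size)]
    for _ in range(generations):
        # Fitness = wins in round-robin contests.
        wins = {id(c): 0 for c in pop}
        for i, a in enumerate(pop):
            for b in pop[i + 1:]:
                wins[id(contest(a, b))] += 1
        pop.sort(key=lambda c: wins[id(c)], reverse=True)
        survivors = pop[: pop_size // 2]
        # Offspring are mutated copies of the winners.
        children = [{"speed": mutate(p["speed"]), "block": mutate(p["block"])}
                    for p in survivors]
        pop = survivors + children
    return pop

final = evolve()
avg_block = sum(c["block"] for c in final) / len(final)
print(f"average 'aggression' gene after evolution: {avg_block:.2f}")
```

The point of the sketch is that nothing in the fitness function mentions aggression; blocking is selected (or not) purely through its effect on who reaches the cube, just as in the video.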

I wonder if one way to achieve human-level AI would be to create much more detailed virtual ecosystems with many different creatures competing for resources. The more detailed the simulation, the more opportunities it would create for an evolutionary "arms race" in which the "winners" become more and more intelligent. I think that in order to produce an Artificial General Intelligence you would need a world with many competitors and many different resources at different locations. The more complicated and "random" the distribution of resources, and the more traps and pitfalls a ("sadistic" Smile ) creator builds into the world, the more intelligent its "winners" would become. To raise the intelligence of the inhabitants of such a virtual ecosystem even further, one could simulate major disasters in which large amounts of resources disappear and many competing creatures die, perhaps on the scale of the Permian–Triassic or the Cretaceous–Paleogene extinction event. Another possibility might be to move the traps and pitfalls to different random locations at random points in time, in order to avoid "evolutionary overspecialization".
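The world-building scheme above can be sketched as a minimal grid-world loop (again my own illustration, with invented parameters and no evolving controller): resources are periodically shuffled to new random locations, and rare mass-extinction events wipe out most of the population at once:

```python
import random

random.seed(7)

GRID = 20  # side length of the square world

def random_pos():
    return (random.randrange(GRID), random.randrange(GRID))

def step_toward(pos, target):
    # Move one cell toward the target on each axis.
    x, y = pos
    tx, ty = target
    return (x + (tx > x) - (tx < x), y + (ty > y) - (ty < y))

def simulate(n_creatures=30, n_resources=15, ticks=500,
             shuffle_every=50, disaster_prob=0.01):
    creatures = [{"pos": random_pos(), "energy": 10.0}
                 for _ in range(n_creatures)]
    resources = {random_pos() for _ in range(n_resources)}
    for t in range(ticks):
        if t % shuffle_every == 0:
            # Relocate resources to discourage over-specialisation.
            resources = {random_pos() for _ in range(n_resources)}
        if random.random() < disaster_prob and len(creatures) > 2:
            # Mass-extinction event: most creatures die at once.
            random.shuffle(creatures)
            creatures = creatures[: max(2, len(creatures) // 10)]
        for c in creatures:
            if resources:
                # Forage: walk toward the nearest resource.
                target = min(resources,
                             key=lambda r: abs(r[0] - c["pos"][0])
                                         + abs(r[1] - c["pos"][1]))
                c["pos"] = step_toward(c["pos"], target)
                if c["pos"] in resources:
                    resources.discard(c["pos"])
                    c["energy"] += 5
            c["energy"] -= 0.1  # metabolic cost
        creatures = [c for c in creatures if c["energy"] > 0]
        # Creatures with surplus energy reproduce. (There is no genome
        # here; a real system would mutate an evolvable controller,
        # as in Sims' work.)
        for c in list(creatures):
            if c["energy"] > 20:
                c["energy"] -= 10
                creatures.append({"pos": c["pos"], "energy": 10.0})
    return creatures

survivors = simulate()
print(f"{len(survivors)} creatures alive after the run")
```

The skeleton only implements the environmental pressures (scarcity, relocation, disasters); the hard and unsolved part, of course, is giving each creature an evolvable controller rich enough for intelligence to emerge.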

I'm sure that at some point an intelligent "species" (perhaps on the level of a chimpanzee) would appear inside the simulation. The simulation could be stopped at that point, because we would then be able to study the code of these creatures, and their evolutionary history, in detail. We would then know how to create an AGI without evolving it first. Or maybe we would discover that the easiest way to create an AGI is always through "directed virtual evolution"?

Perhaps a quantum supercomputer would be enough to run simulations of this kind. Unfortunately, however, scientists think that such computers are still at least 30 years in the future.
"Hydrogen is a light, odorless gas, which, given enough time, turns into people." -- Edward Robert Harrison

Messages In This Thread
RE: What scientific ideas should society get rid of? - by chris0033547 - 01-25-2014, 07:22 PM
