Tuesday, July 16, 2019

Indigenous Knowledge & AI


One of the things humans are currently trying to do is create artificial intelligence (AI). In humans, we begin with a system that has lots of local connections; then we hit a tipping point, and it turns into a system that has fewer but much stronger, longer-distance connections. So we start out with a system that’s very flexible but not very efficient, and end up with one that’s very efficient but not very flexible.


And therein lies the rub. Instead of trying to produce a program that simulates the modern mind, why not try to produce one that simulates indigenous knowledge? The explosion of machine learning as a basis for AI has made people appreciate that if you’re interested in systems that learn about the external world, the system we know of that does this better than anything else is indigenous knowledge.

This means taking a leaf from nature’s playbook. The strategy of producing just a few young organisms, giving them a long period in which they’re incapable of taking care of themselves, and dedicating a lot of resources to keeping them alive turns out, over and over again, to be associated with higher levels of intelligence. And that’s not just true for humans; it’s true for animals, insects, and even plants.

It’s interesting that this isn’t an architecture that has typically been used in AI, yet it’s one that life seems to use over and over again to implement intelligent systems. One of the questions we could ask is: why? Why would we see this relationship, this characteristic neural architecture, especially in highly intelligent species?

A good way of thinking about this may be as a way of resolving the explore-exploit tradeoffs we see in AI. A problem characteristic of AI is that moving toward a more intelligent system means considering a greater range of possible solutions: a system that understands the world in more different ways also faces a much bigger search problem.
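As a toy illustration of the explore-exploit tradeoff the paragraph invokes (a sketch of my own, not anything from the post), here is the classic multi-armed bandit with an epsilon-greedy agent: it mostly exploits its best-known option but occasionally explores at random. All names and parameters are invented for illustration.

```python
import random

def epsilon_greedy_bandit(true_means, steps=1000, epsilon=0.1, seed=0):
    """Toy explore-exploit agent: with probability epsilon, explore a
    random arm; otherwise exploit the arm with the best estimated mean."""
    rng = random.Random(seed)
    n_arms = len(true_means)
    counts = [0] * n_arms        # pulls per arm
    estimates = [0.0] * n_arms   # running mean reward per arm

    for _ in range(steps):
        if rng.random() < epsilon:
            arm = rng.randrange(n_arms)                           # explore
        else:
            arm = max(range(n_arms), key=lambda a: estimates[a])  # exploit
        reward = rng.gauss(true_means[arm], 1.0)                  # noisy payoff
        counts[arm] += 1
        estimates[arm] += (reward - estimates[arm]) / counts[arm]
    return estimates

est = epsilon_greedy_bandit([0.2, 0.5, 0.9])
print("estimated arm values:", [round(e, 2) for e in est])
```

With epsilon = 0 the agent can lock onto whichever arm looked best first; a little exploration is what lets it discover the genuinely better option.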

One way to solve this problem, which comes out of computer science, is to start with a very wide-ranging exploration of the space and then gradually narrow in on solutions that are going to be more effective. The problem with such a high-temperature search is that we could spend a lot of time considering solutions that aren’t very effective; and while we’re considering ineffective solutions, we aren’t going to be very good at acting in the world.
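The "start broad, then narrow" strategy described here is essentially simulated annealing: search at a high temperature, where even bad moves are often accepted, then cool down until only improvements survive. A minimal sketch, with an invented one-dimensional objective standing in for a real problem:

```python
import math
import random

def simulated_annealing(cost, x0, steps=5000, t_start=5.0, t_end=0.01, seed=0):
    """High-temperature search that gradually narrows: early on, worse
    moves are often accepted (exploration); as the temperature drops,
    only improvements tend to survive (exploitation)."""
    rng = random.Random(seed)
    x, best = x0, x0
    for i in range(steps):
        # Geometric cooling schedule from t_start down to t_end.
        t = t_start * (t_end / t_start) ** (i / steps)
        candidate = x + rng.gauss(0, 1.0)
        delta = cost(candidate) - cost(x)
        # Always accept improvements; accept worse moves with
        # probability exp(-delta / t), which shrinks as t falls.
        if delta < 0 or rng.random() < math.exp(-delta / t):
            x = candidate
        if cost(x) < cost(best):
            best = x
    return best

# A bumpy objective with many local minima; global minimum near x = -0.3.
bumpy = lambda x: x * x + 3 * math.sin(5 * x)
print("found minimum near:", round(simulated_annealing(bumpy, x0=8.0), 3))
```

The cost the paragraph points out is visible in the schedule: while the temperature is high, the search spends most of its time on candidates it will ultimately discard.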

Indigenous knowledge embodies that early, high-temperature phase: it produces a lot of random variability. Being impulsive and acting on the world are good ways of getting more feedback, but they’re not very good ways of planning effectively. This gives a different picture of the kinds of things we should be looking for in intelligence. It means that some of the things that have been difficult for AI to do, like creativity and reaching solutions that are genuinely new, are things that indigenous people are remarkably good at.

For example, one of the things we know indigenous people do is get into everything. That’s active learning: determining exactly what kind of information would cause them to change their current view of the world. It’s a very unusual thing to be able to do, to go out into the world and spend energy in order to risk being wrong. That’s something modern humans very characteristically don’t do.
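"Getting into everything" maps naturally onto what machine learning calls active learning: rather than passively receiving data, the learner queries exactly the point it is least sure about. A hedged sketch, using an invented threshold-learning setup where each query is the single most informative point:

```python
def learn_threshold(oracle, lo=0.0, hi=1.0, tol=1e-3):
    """Active learner for a 1-D threshold concept of the form 'x >= theta'.
    The learner always queries the midpoint of its current uncertainty
    interval -- the point whose label it is most unsure about -- so it
    needs only O(log 1/tol) labels instead of O(1/tol)."""
    queries = 0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        queries += 1
        if oracle(mid):   # label is 1: the threshold lies at or below mid
            hi = mid
        else:             # label is 0: the threshold lies above mid
            lo = mid
    return (lo + hi) / 2, queries

theta = 0.637  # hidden ground truth known only to the oracle
estimate, n = learn_threshold(lambda x: x >= theta)
print(f"estimated threshold {estimate:.3f} after {n} queries")
```

Every query is deliberately placed where the learner might be wrong; that willingness to spend effort on being wrong is what makes each answer maximally informative.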

Another aspect of what indigenous people do that is informative for thinking about intelligence in general is that they are cultural learners. One effect is that this gives them a capacity for cultural ratcheting, a way of balancing innovation and imitation. There is a constant tension between how much they can build on what the previous generation has done, and how much they produce something new enough to be worth the next generation imitating.
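One way to see why the ratchet needs both imitation and innovation is a toy cultural-transmission model (entirely my own sketch; all names and rates are invented): each generation copies the best solution of the previous one, and a fraction of agents innovate by perturbing it.

```python
import random

def cultural_ratchet(fitness, pop_size=50, generations=40,
                     innovate_rate=0.2, seed=0):
    """Toy cultural transmission: each generation, agents imitate the best
    solution of the previous generation, and a fraction innovate by
    perturbing it. Imitation preserves gains; innovation supplies the
    variation the ratchet needs to keep climbing."""
    rng = random.Random(seed)
    population = [rng.uniform(-10, 10) for _ in range(pop_size)]
    for _ in range(generations):
        exemplar = max(population, key=fitness)  # best of this generation
        next_gen = []
        for _ in range(pop_size):
            if rng.random() < innovate_rate:
                next_gen.append(exemplar + rng.gauss(0, 1.0))  # innovate
            else:
                next_gen.append(exemplar)                      # imitate
        population = next_gen
    return max(population, key=fitness)

# Fitness peaks at x = 3; imitation alone would freeze the first guess.
peak = lambda x: -(x - 3) ** 2
print("best cultural variant:", round(cultural_ratchet(peak), 3))
```

Set innovate_rate to 0 and the population copies its first exemplar forever; set it to 1 and each generation throws away what the last one learned. The ratchet lives in between.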

The extraordinary affinity indigenous people have with nature keeps their brains in a state of plasticity: it increases local connections and breaks up the long-distance network connections. What modern humans can learn from them is how to take a system that’s relatively rigid and inject variability, shaking it out of its local optima and letting it settle into something new. Computers that play and explore might therefore be a model for AI that’s different from the models of intelligence we currently have in modern society.
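The "shake it out of its local optima" idea has a direct algorithmic analogue, sketched below under my own invented names: greedy local search settles into an optimum, then a large random kick injects variability and lets the search settle into a new, possibly better, basin.

```python
import math
import random

def kick_and_settle(cost, x0, rounds=20, settle_steps=200, seed=0):
    """Greedy local search with periodic injected variability: settle into
    a local optimum by small hill-climbing steps, then 'shake' the solution
    with a large random kick and settle again, keeping the best basin found."""
    rng = random.Random(seed)
    x, best = x0, x0
    for _ in range(rounds):
        # Settle: small greedy steps only, so we stay in the local basin.
        for _ in range(settle_steps):
            candidate = x + rng.gauss(0, 0.1)
            if cost(candidate) < cost(x):
                x = candidate
        if cost(x) < cost(best):
            best = x
        x = best + rng.gauss(0, 3.0)  # inject variability: jump basins
    return best

# The same bumpy landscape as before; pure greedy search from x0 = 8
# would stall in the nearest dip, while the kicks find deeper basins.
bumpy = lambda x: x * x + 3 * math.sin(5 * x)
print("with injected variability:", round(kick_and_settle(bumpy, x0=8.0), 3))
```

The rigid system is the inner greedy loop; the playful, exploratory one is the outer loop of kicks. The post's suggestion amounts to building AI that keeps the outer loop running.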
