Yesterday I returned the Lego Mindstorms sets that the Faculty of Computer and Information Science lent us for the past two months (a big thank you to Miha Štajdohar and prof. Blaž Zupan for letting CoderDojo kids play with the Lego robots).
Having done a little robot programming again, I was reminded how inadequate the current model of programming robots really is. At the moment, every robot is locked in a cycle of sensing the environment, understanding the environment, and finally acting on it. While sensors and actuators are becoming extremely capable, it is the step of understanding the environment that has become the true bottleneck in robot development.
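The cycle described above can be sketched in a few lines of Python. This is a toy illustration, not a real robot API: the sensor readings, thresholds, and action names are all made up.

```python
# A minimal sketch of the sense-understand-act loop.
# All names and values here are hypothetical, for illustration only.

def sense():
    # Read raw sensor values (say, an ultrasonic distance in cm).
    return {"distance_cm": 12.0}

def understand(readings):
    # The bottleneck: turn raw readings into an internal world model.
    return {"obstacle_ahead": readings["distance_cm"] < 20.0}

def act(world_model):
    # Pick an action based on the world model.
    return "turn_left" if world_model["obstacle_ahead"] else "drive_forward"

def control_step():
    # One pass through the classic loop: sense -> understand -> act.
    return act(understand(sense()))
```

Note how the `understand` stage sits squarely in the middle: however good the sensing and acting get, everything funnels through that one function.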
The sense-understand-act model came about because of the false impression that we know how a human being functions. Since our self-delusion through "consciousness" is so incredibly strong, it is hard to fathom any possibility other than a homunculus inside a mortal body. One principle that casts away the need for understanding the environment and goes straight from sensors to actuators is the principle of affordances. The principle of affordances states that we don't need to build an internal model of the outside world. Instead, we think about the world in terms of the actions we can perform on it. So, in the famous squirrel example,
instead of describing the processes in the squirrel's brain as: I see something - oh, it's a leopard - leopards are dangerous - when in danger, run!, we could describe them as: I see something - when I see something like that, I had better raise my adrenaline levels and start running.
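The squirrel example can be sketched the same way, and the contrast with the loop above becomes obvious: the stimulus maps straight to a response, with no "it's a leopard, leopards are dangerous" reasoning in between. Again, the names and the stimulus encoding are made up for illustration.

```python
# A toy sketch of an affordance-style controller: no world model,
# just a direct coupling from a perceived pattern to a response.
# Stimulus and action names are hypothetical.

def squirrel_reflex(stimulus):
    # The affordance: this kind of shape directly affords fleeing.
    if stimulus == "large_spotted_blur":
        return ["raise_adrenaline", "run"]
    # Anything else affords carrying on as before.
    return ["keep_foraging"]
```

There is no `understand` step to get wrong or to become a bottleneck; the cost is that every useful stimulus-response coupling has to be learned or wired in directly.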