No room for manoeuvre

Chinese Room - illustration 

I’m as big a fan of Ray Kurzweil as the next man, but this post by Northern Planner – fast turning into a favourite (and extremely prolific) blogger – reminded me of something I meant to do a post about ages ago.

Kurzweil and other futurologists often talk about how long it will be before computers become “conscious”. You can see how the thinking goes: computers used to be a bit shit, then they became good enough to do sums, then the internet. Soon, computers will be able to perform more calculations than our own brains and then soon after, they’ll have more computing power than the whole bunch of us.

This is basically Moore’s law – the power of computers (or, more to the point, their power:cost ratio) will double every 12-18 months. And the rule continues to hold. Mathematically, the effect is that useful computing power grows like 2^n, where ‘n’ is the number of 12-18 month periods. It’s dizzying growth that will keep us in awe of the power of the machines. But it will never amount to consciousness – just like no amount of cheesecake will ever build an elephant – they’re different sorts of things.
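To see how quickly 2^n runs away, here’s a minimal sketch of that doubling rule. The function name, the starting figure of 1 unit, and the sample periods are all my illustrative assumptions, not anything from Kurzweil:

```python
# Moore's-law-style growth: if useful computing power doubles every
# period (12-18 months), then after n periods it has grown 2**n times.
def power_after(n_periods, base=1):
    """Computing power after n doubling periods, starting from `base`."""
    return base * 2 ** n_periods

# Ten doublings (roughly a decade to fifteen years) is already a
# thousandfold gain; twenty is a millionfold.
for n in (1, 5, 10, 20):
    print(n, power_after(n))
```

Dizzying, as promised – but note that nothing in that loop is any closer to consciousness at n=20 than it was at n=1.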

Considering it’s our finest feature, humans are not well disposed to feel protective of our consciousness, and people find it very hard not to think of consciousness as some higher order of information processing.

But it’s not.

Luckily there is an absolutely fabulous analogy to help us understand. This comes from John Searle’s Chinese Room Argument.

Searle asks us to imagine a Chinese man wandering through the wilderness who comes across a large room. This completely sealed box has two slots on the outside, as well as a pad and a pen. A sign pointing at the top slot invites passers-by to submit questions in writing, in Cantonese, through the top slot. Our wandering man does this, asking the room a series of questions: directions, common facts, popular-culture questions. Each time, an appropriate or correct answer pops out of the bottom slot.

What do we conclude? That the room understands the questions? That there is someone inside who understands?

Now let me tell you what’s actually going on inside the room. As questions come in, a young YTS trainee from Hull (who only speaks English) takes them and checks them against a series of books. All the slips of paper contain strange, incomprehensible symbols (Cantonese characters). When he finds an exact match, it includes a long number; he takes this number over to the other side of the room and looks it up in a different set of books, where he finds that the number links to a different set of symbols. He traces these onto a new piece of paper and pushes them back through the slot.

Will our YTS trainee ever learn Cantonese? How could he? All he gets is syntax; he would never get a foothold on even the first rung of semantics.

This is what modern computers do. A better, faster processor is just a better, faster YTS trainee.
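The trainee’s whole procedure can be sketched in a few lines – my toy illustration, not Searle’s own formulation, and the two book entries are invented placeholders. The point is that the code is pure symbol shuffling: nothing in it “knows” what the strings mean.

```python
# Book one: question symbols -> a long index number (exact match only).
QUESTION_BOOK = {"你好嗎": 4017}
# Book two: index number -> answer symbols to trace out and return.
ANSWER_BOOK = {4017: "我好好"}

def room(question):
    number = QUESTION_BOOK[question]  # check the slip against book one
    return ANSWER_BOOK[number]        # copy the linked symbols from book two

print(room("你好嗎"))
```

From the outside, the room answers the question perfectly well; on the inside there is only matching and copying – syntax all the way down.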

(illustration stolen from here)