A while back, the husband of a friend of mine got a nasty, painful rash on his face. When it got up to his eye and started to affect his vision, he went to the hospital, and after a bunch of tests they found out what was going on. I asked my friend about it when they got home, and apparently the hospital staff had been a lot less helpful than they could have been. She didn’t know exactly what the problem was; she said they had called it “zoister” or something like that, and she probably wasn’t even remembering it right.
I figured she probably wasn’t, because that doesn’t sound like any disease I’ve ever heard of. So I tried punching it into Google, and sure enough, it had the answer: “Did you mean zoster?” I clicked the link, and there it was: herpes zoster, better known as shingles, the revenge of the chickenpox virus. Why the hospital folks didn’t just say “he has shingles,” I’ll never know.
It took a few days before I realized the implications of what I’d done there, though. You may have heard the famous quote from Charles Babbage:
“On two occasions I have been asked,—‘Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?’ In one case a member of the Upper, and in the other a member of the Lower, House [of Parliament] put this question. I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.”
As programmers, we like to laugh at stuff like this. Oh, look at the clueless politicians who don’t know the first thing about engineering or GIGO, asking stupid questions whose answer would be obvious if they only gave it half a moment’s thought! But isn’t this exactly what I had just done? I had deliberately, knowingly given a computer bad input, with the full expectation that it would give me the right answer, and it had! It kind of made me think.
The members of Parliament Babbage was dealing with weren’t engineers at all; they were politicians, who were used to dealing with people, not machines. And if you say something that’s obviously wrong to a person, there’s a good chance that they might try to correct you in one way or another. It made me wonder if Babbage wasn’t missing the real point of the question: “Mr. Babbage, are you able to build a machine that has any common sense?”
I think a big part of it is that Google searches operate on natural language, while programming is built on formal language. Natural language is understood intuitively rather than formally, which means that adding a bit of virtual intuition to natural language processing is far less likely to produce really bad results than doing the same for code. My experience wasn’t the result of a simple spell check, either: there are pages out there that match the word “zoister,” but Google’s system knew that there was a similar word that was more likely to be relevant.
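To make that last point concrete, here is a toy sketch of how a “did you mean” suggester can prefer a common near-match over a rare exact match. This is not Google’s actual algorithm; the word list, the frequency counts, and the distance penalty are all invented for illustration. The idea is just to score each candidate by how common it is minus a penalty for its edit distance from the query, so a rare exact match like “zoister” can lose to a very common one-edit neighbor like “zoster.”

```python
import math

def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(
                prev[j] + 1,               # deletion
                cur[j - 1] + 1,            # insertion
                prev[j - 1] + (ca != cb),  # substitution
            ))
        prev = cur
    return prev[-1]

# Hypothetical frequency table standing in for "how many pages match
# this word" -- the numbers are made up.
WORD_COUNTS = {"zoster": 90_000, "zoister": 40, "roister": 12_000}

def suggest(query: str) -> str:
    # Score = log(frequency) minus a penalty per edit; a common word one
    # edit away can outscore a rare word that matches exactly.
    return max(WORD_COUNTS,
               key=lambda w: math.log(WORD_COUNTS[w])
                             - 5.0 * edit_distance(query, w))

print(suggest("zoister"))  # -> zoster
```

Even though “zoister” matches itself with zero edits, its invented frequency is so low that “zoster” wins, which is roughly the behavior I saw: the system weighed likelihood, not just literal matching.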
This is certainly an interesting time we’re living in. Intuition has always been the major thing that sets real intelligence apart from mere computing power. With engineers now beginning to develop virtual intuition routines, how much longer before we start to see true AIs emerging?