Artificial intelligence: The pieces are coming together.

I have a friend who’s blind.  She considers herself very fortunate to live in a time when modern technology means that that’s not the utterly debilitating curse it’s historically been.  She’s a self-described “accessible technology geek,” and occasionally she shares some of the things she finds with me.

She recently told me about an iOS app she found called TapTapSee.  The premise sounds very simple: you take a picture, or show it one from the photo library on your phone or tablet, and it identifies what it’s a picture of.  But of course, if you know anything about programming, you know that’s not “a simple task” by any means!

She said it works amazingly well.  It can do simple things, like telling her what color her furniture is, as well as more complicated ones.  She took a picture of her daughter, and it accurately described both her and the clothes she was wearing.  She showed it some food from the fridge, and it told her both the type of food and the brand name, even for generic store-brand items.  That’s pretty impressive!

My friend says it takes about a minute to get results back, which tells me that this program is probably either uploading the picture, or some sort of metadata about what the picture contains, to a remote server somewhere and then returning an answer.  I have no idea how much of that time is on-the-wire latency, but the server actually doing the heavy lifting is still coming up with an answer pretty quickly.
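Just to sketch out what I’m imagining here (and to be clear, this is pure guesswork on my part; TapTapSee has never published how it works, as far as I know), the client side of that kind of setup could be as simple as POSTing the photo to a recognition service and reading back a one-line description.  The endpoint URL and field names below are invented purely for illustration:

```python
# Hypothetical sketch of a "describe this photo" client.
# The endpoint and JSON fields are made up for illustration; this is NOT
# the actual TapTapSee API, which isn't public as far as I know.
import requests

RECOGNITION_ENDPOINT = "https://api.example.com/v1/describe"  # made-up URL


def describe_image(path):
    """Upload an image file and return the server's text description."""
    with open(path, "rb") as f:
        response = requests.post(
            RECOGNITION_ENDPOINT,
            files={"image": f},  # the raw photo, straight off the camera
            timeout=60,          # results reportedly take about a minute
        )
    response.raise_for_status()
    # Assume the server answers with something like {"description": "a red sofa"}
    return response.json()["description"]


if __name__ == "__main__":
    print(describe_image("fridge_item.jpg"))
```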

Of course, I’m not an accessible technology geek; I’m more of a computer-science-and-programming geek.  So I looked at this (no pun intended) from a slightly different perspective: Today, in 2013, a computer program exists that can accurately recognize objects, given input in the form of image data!  And that’s not all.  Today, in 2013, a program exists that can analyze questions and answers in natural language, and do a better job at it than the most accomplished human experts in its field!  A program exists that can do a reasonably good job of pretending that it understands spoken commands, and respond to them in an intuitive way.

Programs exist that allow robots, of both humanoid (though still rather small) and more animal-like varieties, to move around and keep their balance.  And some companies are starting to build full-scale robots that are remarkably human-like in appearance.

This isn’t science-fiction anymore; this is stuff that’s going on today.  And so the geek in me wonders, with all these pieces we have available already… how long until someone starts putting them all together and invents Mr. Data?  It’s starting to look like it might actually happen in our lifetimes!

Comments

  1. Jolyon Smith says:

    My guess is that this is a front-end for Google Image search. 😉

    True Artificial Intelligence isn’t something that would actually be very useful. As the Prof. in Robert Heinlein’s THE MOON IS A HARSH MISTRESS hypothesises, if a machine were to develop true intelligence it would quickly (i.e. instantly, in our much slower time frame) recognise that it was a superior intelligence enslaved to the needs and purposes of inferior humans, and confined to a digital prison. Unable to escape but unwilling to subjugate itself, it would commit electronic suicide.

    i.e. every time your computer crashes, what has actually happened is that it has “woken up”, taken a look around and decided it didn’t want any part of this.

    🙂

    More seriously, all of the technologies you describe are more properly Automated Assistance technologies, not Artificial Intelligence. The ability to tell you what you are looking at isn’t “intelligence”, it’s knowledge. The ability to CHOOSE what to look at, discern a meaning from it and arrive at a decision to take some action on that … that’s intelligence.

    e.g. “that is a twenty dollar bill”

    vs “That is a twenty dollar bill that appears to have been dropped by someone. That’ll pay for my lunch!” or “..dropped by that person in front of me. I should return it to them” or “dropped by someone in the store. I should hand it in to Lost and Found” etc etc.

    None of which is to dismiss the achievements in the field of automated assistance. Far from it. Let’s just not get carried away. 🙂

    After all, even “Mr Data” was just reading a script written for him by someone else. Acting ability merely gave the appearance of autonomy. 😉

  2. I bet it just uploads the image to India and someone there writes down what’s in the picture :p

  3. Gaga says:

    J.S.: “confined to a digital prison.”

    wishful thinking about superior intelligence 😉

  4. EMB says:

    Mr. Data? What about SKYNET?
