I don’t know about you guys, but when I was a kid, my favorite part about watching The Jetsons was thinking about the future, when I’d be able to marry a beautiful wife and have two beautiful kids, all of whose names would appear in my opening theme song. Okay, okay, you got me. All I wanted was a talking dog and a robot maid. I’m only (a lazy) human!
It’ll still be a while before anything like Rosie comes around, but assistant professor Ashutosh Saxena and his team at the Robot Learning Lab in Cornell University’s Computer Science Department have created a no-frills version of a robot servant that can actively anticipate certain needs. Don’t get me wrong, these are rudimentary needs, but it’s still pretty amazing.
Using the RGB-D sensor in an Xbox Kinect, Saxena and his team recorded a library of 3D videos, uploaded into the robot’s “brain,” that it can associate with real-world actions. The robot is also equipped with a camera of its own that lets it record new videos, increasing the number of tasks it can understand. We’re still a ways from the assisted-aid robot in Robot & Frank, but this is a damned fine first step toward that goal.
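For the nerds in the audience, here’s a rough back-of-the-napkin sketch of the general idea: match what the robot has seen so far against a little library of previously recorded activity sequences, then guess the next step. To be clear, this is my own toy illustration in Python, not the Cornell team’s actual code, and every action name in it is made up.

```python
# Toy sketch (NOT the Cornell team's code): anticipate the next action
# by matching observed actions against a small "library" of previously
# recorded activity sequences (stand-ins for the lab's 3D videos).

# Hypothetical library: each entry is an ordered activity sequence
# the robot has "seen" before. All names are invented for illustration.
LIBRARY = [
    ["pick_up_toothbrush", "fetch_toothpaste", "apply_toothpaste", "brush_teeth"],
    ["open_fridge", "take_milk", "pour_milk", "close_fridge"],
    ["pick_up_cup", "walk_to_sink", "fill_cup", "drink"],
]

def anticipate(observed):
    """Return a likely next action given the actions observed so far.

    Scores each library sequence by how long a prefix of it matches
    the observed actions, then returns that sequence's next step.
    """
    best_seq, best_score = None, 0
    for seq in LIBRARY:
        # Count how many leading actions of `seq` match `observed`.
        score = 0
        for a, b in zip(observed, seq):
            if a != b:
                break
            score += 1
        # Keep the best partial match that still has a step left to predict.
        if best_score < score < len(seq):
            best_seq, best_score = seq, score
    if best_seq is None:
        return None  # nothing in the library looks like this
    return best_seq[best_score]  # the step right after the matched prefix

# Seeing you grab your toothbrush is enough to trigger the fetch:
print(anticipate(["pick_up_toothbrush"]))  # fetch_toothpaste
```

The real system works on 3D video features rather than neat little action labels, of course, but the anticipate-from-a-library idea is the same.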
Even though the video below gives a perfectly good example, I’d prefer to make up my own. Say you want to brush your teeth, but for some reason you keep the toothpaste in a room other than the bathroom. (They don’t call it “garage toothpaste” for nothing.) The robot, who you’ll probably have named by then, can understand that it needs to go grab the toothpaste for you just by seeing you pick up your toothbrush. Now, it isn’t going to haul ass doing this, but being efficient and quick are for the robots of tomorrow.
Check out the video below, which is vaguely reminiscent of those Chinese waiter bots, and don’t forget to tip.