IBM Foresees Electronics With Five Senses in Five Years

By Nick Venable | 8 years ago

When it comes to predictions, I usually take mine with a grain of salt, regardless of who’s making them. This time of year is usually fueled by guesses about Christmas presents, Super Bowl opponents, and what the new year may bring. This year, we’ve mentally resurrected the Mayans for their end-of-the-world prophecies, but let’s focus on something rooted in the real world. IBM recently presented their annual “5 in 5,” a prognosis of what the world of computers and technology will be like in five years’ time. Those holding out for a Virtuosity reference will be disappointed, besides just being disappointing.

In years past, IBM has stressed the “mind reading” capabilities of electronics, but this year, they’re sticking strictly to improvements on the physical senses, all under the umbrella of “cognitive computing.” Using the commonly accepted five senses, IBM employees theorize how our devices will become more human than human. Read with me as I turn my vastly unprofessional mind’s eye toward summing up their predictions.

1. Touch

Like many advances in technology, this one has a creep factor going for it. The nutshell is that a library of digitized texture signatures will be created, allowing any fabric or surface to be translated into a pattern of minute vibrations through the air that can mimic the way the item feels to the fingertips. An example — one that obviously isn’t geared toward me — is online shopping for clothing, where the fingers can do as much browsing as the eyes can. “There’s a sale on silk dresses, and never mind that it’s got a dinosaur rollerblading on it, just feel how soft that is!” A second medical example proposes off-site doctors being able to give more efficient preliminary diagnoses by looking at, and being able to touch, images of patients’ injuries.
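If you want a picture of how such a texture library might work under the hood, here’s a minimal Python sketch. To be clear, the fabrics, frequencies, and function names are all invented for illustration; IBM hasn’t published anything this specific.

```python
# Hypothetical sketch: a "texture signature" library mapping fabrics to
# vibration patterns a haptic screen could play back. All names and
# numbers below are invented for illustration.

# Each signature entry: (frequency in Hz, amplitude 0-1, pulse length in ms)
TEXTURE_SIGNATURES = {
    "silk":   [(250, 0.2, 5), (180, 0.1, 5)],   # fine, low-amplitude buzz
    "denim":  [(90, 0.6, 20), (110, 0.5, 15)],  # coarse, heavier pulses
    "burlap": [(60, 0.9, 30)],                  # rough, strong and slow
}

def vibration_pattern(fabric: str) -> list[tuple[int, float, int]]:
    """Look up the stored signature for a fabric, falling back to a
    neutral pulse when the texture hasn't been digitized yet."""
    return TEXTURE_SIGNATURES.get(fabric, [(120, 0.3, 10)])
```

The interesting (and hard) part, of course, is building the signatures in the first place; the lookup is trivial once somebody has digitized how silk actually feels.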

Personally, fruit is probably the only thing I buy purely based on touch, though I won’t wear a long sleeve shirt if it isn’t softer than puppy fur, so maybe my attitude will change should this come to light. And even though it’s second nature for my sex-addled brain to “go there,” especially in this case, I figure just saying that should have put some hideous imagery in your head.

2. Sight

When you post a photo to a social networking site showing you and a friend standing in front of the fountains at the Bellagio in Las Vegas, your computer knows where you were and who you were with only if you tag the person and location yourself. Using more giant databases, this time of photographs instead of touch information, computers may soon learn to recognize what they’re seeing. Give one enough videos of you doing the Hammer Dance, and it will be able to look at a still image and tell that you’re doing the dance, probably from the lack of friends nearby. Also, for social media usage, your pictures will be dissected into marketable actions and products, because in the future, generic advertising will not exist.
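To make the “giant database of photographs” idea concrete, here’s a toy nearest-neighbor sketch: each known photo is boiled down to a feature vector with a label, and a new photo gets the label of its closest match. The three-number “features” and labels are made up, and real systems use far richer learned representations, but the database-lookup spirit is the same.

```python
import math

# Toy database of labeled photos, each reduced to an invented 3-number
# feature vector. Real recognition systems learn features automatically.
LABELED_PHOTOS = [
    ([0.9, 0.1, 0.2], "hammer dance"),
    ([0.8, 0.2, 0.1], "hammer dance"),
    ([0.1, 0.9, 0.7], "bellagio fountains"),
]

def tag_photo(features: list[float]) -> str:
    """Return the label of the closest known photo (1-nearest neighbor)."""
    closest = min(LABELED_PHOTOS,
                  key=lambda item: math.dist(features, item[0]))
    return closest[1]
```

Scale that dictionary up to billions of labeled images and you have the crude outline of what IBM is describing.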

The medical field wins out here, too. For example, if 10 people had been going to the same doctor for 10 years and all were eventually diagnosed with colon cancer, a computer could study their medical records and imaging to determine what early signs, if any, could be used to detect or prevent the disease earlier than ever before. And when it comes to cancer, early detection is a vital advantage. Similar goals apply to first-response systems for weather and natural disasters: by coordinating the pictures people post online, emergency workers would get a better understanding of what they’re going into and how to plan for it.

3. Hearing

As soon as you started reading this, you thought to yourself, “Man, I hope for the hearing part, it’ll say an app will exist that will allow me to understand what dogs and dolphins want.” Well, pat yourself on the back, because it’s happening. Likewise, your baby’s cries will become distinguishable, alerting you individually to hunger, fatigue, and pain. Another application is short-term weather forecasting based on the sound of leaves rustling in a forest, or something like that. (To be fair, IBM has been working on the weather side of this for years.) Finally, the same advances in frequency processing that will alert you to your dog’s wish to live inside a peanut butter aquarium will let you hold cell phone conversations with most or all of the background noise eliminated. Similarly, a phone conversation across a crowded room would sound as if both parties were in empty rooms. This could have unique applications in hearing aids and other aural implants as well.
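For the baby-cry part, the basic idea is classifying sounds by their acoustic features. Here’s a deliberately naive Python sketch; a real classifier would learn from thousands of labeled recordings, and the thresholds below are invented purely for illustration.

```python
# Toy cry classifier using two crude acoustic features. The cutoffs are
# invented for illustration; a real system would learn them from
# labeled recordings.

def classify_cry(pitch_hz: float, duration_s: float) -> str:
    """Map crude acoustic features of a cry to a guessed need."""
    if pitch_hz > 500 and duration_s > 4:
        return "pain"       # high-pitched and sustained
    if duration_s > 4:
        return "hunger"     # lower pitch but persistent
    return "fatigue"        # short, lower-energy fussing
```

The promised dog-and-dolphin app would presumably be this same pattern with a much bigger feature set and far more training data.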

4. Taste

In the last 20 or so years, it’s become obesely obvious what first-world humans will do with food when left to our own devices. The “extra belt notch” industry is thriving. But what if we allowed computers to decide what we eat? Hey, even I’m skeptical about something that probably won’t add bacon to every meal. Computers will soon be able to create recipes based as much on chemical composition as on personal taste. Developments in science are how cooking got started in the first place, so it only makes sense to come full circle, with less focus on technique and more on how foods work in conjunction with one another; only after that is what a person actually likes considered. From there, it only gets more interesting and improvised. Picky eaters may be scared off at first.
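One way the “chemical composition” angle is often framed is the flavor-pairing hypothesis: ingredients that share flavor compounds tend to go well together. Here’s a small Python sketch of that idea, filtered by personal dislikes; the compound lists are simplified examples, not real food chemistry data.

```python
# Sketch of flavor pairing: score ingredient pairs by shared flavor
# compounds, then rank candidates while skipping foods the eater
# dislikes. Compound sets are simplified examples.

FLAVOR_COMPOUNDS = {
    "chocolate":   {"pyrazine", "vanillin", "furaneol"},
    "blue cheese": {"methyl ketone", "furaneol"},
    "strawberry":  {"furaneol", "vanillin"},
    "bacon":       {"pyrazine", "methyl ketone"},
}

def pairing_score(a: str, b: str) -> int:
    """Number of flavor compounds two ingredients share."""
    return len(FLAVOR_COMPOUNDS[a] & FLAVOR_COMPOUNDS[b])

def best_pairings(ingredient: str, dislikes: set[str]) -> list[str]:
    """Rank the other ingredients by shared compounds, most first,
    skipping anything the eater refuses to touch."""
    others = [i for i in FLAVOR_COMPOUNDS
              if i != ingredient and i not in dislikes]
    return sorted(others, key=lambda i: pairing_score(ingredient, i),
                  reverse=True)
```

Chemistry proposes, the eater disposes: the score ranks candidates, and personal taste prunes the list.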

To combat the negative connotations of “healthy food,” computers could create meals that are as high in flavor as they are low in calories, which could revolutionize school meals, as well as those bland frozen things taking up so much of my grocer’s freezers. Even if the produced recipe isn’t quite what’s desired, it serves as an optimal blueprint from which a more suitable meal can be built. Win-win for the belly.

5. Smell

Certain to bring millions of unknowing souls the hilarious comedy of “Who Farted?” in app form, this kind of technology will not only be able to identify the passer of gas, but also whether he has tuberculosis. The main stated goal here is mostly medical: sensors would work like breathalyzers, collecting the telling specimens that could identify a number of diseases early on. Obviously, this isn’t really smelling, which wouldn’t have fit the quintet, but it’s far more useful than anything our noses alone are capable of. Though the complex information processing needed will require a computing upheaval of some kind, this is 2017-2018 we’re talking about. If it doesn’t come true, you can get in your flying car and go complain to somebody.
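At its simplest, that breathalyzer idea is threshold screening: measure the levels of certain volatile compounds in a breath sample and flag anything elevated. Here’s a bare-bones Python sketch; breath acetone really is studied as a diabetes-related marker and exhaled nitric oxide in airway inflammation, but the threshold numbers below are invented for illustration.

```python
# Sketch of breath-analysis screening: flag any volatile compound whose
# measured level exceeds a threshold. The compounds are real biomarker
# candidates, but these parts-per-billion limits are invented.

THRESHOLDS_PPB = {
    "acetone": 1800.0,      # studied as a diabetes-related marker
    "nitric oxide": 50.0,   # studied in airway inflammation
}

def screen_breath(sample_ppb: dict[str, float]) -> list[str]:
    """Return the compounds whose measured level exceeds its threshold."""
    return [compound for compound, limit in THRESHOLDS_PPB.items()
            if sample_ppb.get(compound, 0.0) > limit]
```

The sensor hardware and the disease-specific chemistry are the hard part; the software side of a first-pass screen is almost embarrassingly simple.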
