IBM’s tip for the next five years: computers thinking like humans

IBM has once again stuck its neck out and made its prediction for the five technologies that will dominate our lives over the next five years. This year the focus is very much on how humans will interact with technology through our senses, and the way that will change our lives.

Here is IBM’s 2012 five in five:

Big Blue reckons that soon enough mobile phone users will be able to touch a product through their device’s screen. Using haptic technology (which some speculated would appear on the iPhone 5 earlier this year) people could, for example, feel the material on a piece of clothing before deciding whether to buy it.

Of course users will not literally touch the material; it will be a simulation of what the material would feel like. IBM says its scientists are already hard at work using haptic, infrared and pressure-sensitive technologies to make this possible, with the retail industry likely to be the first to benefit.

Anyone who has used the "similar images" feature on Google’s image search will know that its algorithms are getting better at taking an image and finding similar ones. But they do that using information tagged onto the image by humans, which makes them quite limited.

IBM says that is soon going to change. Computers will soon be able to learn what is in an image, essentially making sense out of it in the same way a human would. The obvious use for this technology is the healthcare industry, where patient diagnosis could be given a helping hand. Telemedicine is one area that would certainly benefit from this sort of technology.
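The shift IBM describes is from matching human-written tags to matching what is actually in the picture. One common approach is to have a vision model turn each image into a vector of numbers (an "embedding") and then compare those vectors directly. The sketch below illustrates the idea with made-up feature vectors; the image names and the vectors themselves are hypothetical, standing in for what a real model would produce.

```python
import math

def cosine_similarity(a, b):
    """Score how alike two feature vectors are (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def most_similar(query, library):
    """Return the name of the library image whose features best match the query."""
    return max(library, key=lambda name: cosine_similarity(query, library[name]))

# Toy feature vectors standing in for a vision model's output.
library = {
    "beach_sunset": [0.9, 0.1, 0.2],
    "xray_chest":   [0.1, 0.8, 0.7],
    "xray_arm":     [0.2, 0.9, 0.6],
}

# A new X-ray's (made-up) features: the match is found from pixels, not tags.
query = [0.12, 0.8, 0.7]
print(most_similar(query, library))  # prints xray_chest
```

In a medical setting the same comparison could surface previously diagnosed scans that resemble a new patient's scan, which is the kind of diagnostic assistance the article alludes to.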

There is an episode of The Simpsons where Herb, Homer’s long-lost brother, invents a machine that can translate a baby’s babbling into speech. IBM thinks technology to actually do that will be available within five years. Sensors will detect elements of sounds and computers will interpret them to find any meaning that would otherwise be hidden.

As well as the baby talk example, IBM thinks that sensors will be able to use sounds to predict when a tree may fall in a forest or when a landslide is imminent. A real-life example is happening in Galway Bay in Ireland, where IBM has put sensors underwater to pick up on noises made by wave energy conversion machines and work out what impact they may be having on marine life.

This one would interest Heston Blumenthal, that’s for sure. IBM claims that within the next five years computers will be able to pull together the perfect meal, one that uses a human’s favourite smells and tastes and selects appropriate food. Big Blue said, "It will break down ingredients to their molecular level and blend the chemistry of food compounds with the psychology behind what flavours and smells humans prefer."

It wouldn’t just be about the taste though. The technology would be able to recommend healthy options that whoever sits down to eat the meal is guaranteed to like. It will even be able to take into account medical issues such as diabetes or allergies, IBM said.

Within five years it will be possible for your smartphone to smell you and work out whether you are coming down with something like a cough or a cold. Tiny sensors embedded in the device will analyse biomarkers and the thousands of molecules in someone’s breath, picking up on any abnormalities.

Technology like this is already in use, such as art galleries using it to monitor the air around works of art, but IBM thinks it will eventually be used for clinical hygiene purposes as well. For example, hospitals will be able to use it to work out whether a room has been sanitised or not.

"These five predictions show how cognitive technologies can improve our lives, and they’re windows into a much bigger landscape: the coming era of cognitive systems," said Bernard Meyerson, Chief Innovation Officer, IBM.

"The world is tremendously complex. We face challenges in deciphering everything from the science governing tiny bits of matter to the functioning of the human body to the way cities operate to how weather systems develop," Meyerson said. "Gradually, over time, computers have helped us understand better how the world works. But, today, a convergence of new technologies is making it possible for people to comprehend things much more deeply than ever before, and, as a result, to make better decisions.

"In the coming years, computers will become even more adept at dealing with complexity. Rather than depending on humans to write software programs that tell them what to do, they will program themselves so they can adapt to changing realities and expectations. They’ll learn by interacting with data in all of its forms: numbers, text, video and more. And, increasingly, they’ll be designed so they think more like humans," he concluded.


