IBM has unveiled the seventh annual “IBM 5 in 5” (#ibm5in5), a list of innovations that have the potential to change the way people work, live and interact during the next five years.
The IBM 5 in 5 is based on market and societal trends as well as emerging technologies from IBM’s R&D labs around the world that can make these transformations possible. This year’s IBM 5 in 5 explores innovations that will be the underpinnings of the next era of computing, which IBM describes as the era of cognitive systems.
This new generation of machines will learn, adapt, sense and begin to experience the world as it really is. This year’s predictions focus on one element of the new era, the ability of computers to mimic the human senses — in their own way, to see, smell, touch, taste and hear.
These sensing capabilities will help us become more aware and productive, and help us think, but not think for us, according to IBM. Cognitive computing systems will help us see through complexity, keep up with the speed of information, make more informed decisions, improve our health and standard of living, enrich our lives and break down all kinds of barriers, including geographic distance, language, cost and inaccessibility.
“IBM scientists around the world are collaborating on advances that will help computers make sense of the world around them,” says Bernie Meyerson, IBM Fellow and vice president of Innovation. “Just as the human brain relies on interacting with the world using multiple senses, by bringing combinations of these breakthroughs together, cognitive systems will bring even greater value and insights, helping us solve some of the most complicated challenges.”
Here are five predictions from IBM that will define the future:
Imagine using your smartphone to shop for your wedding dress and being able to feel the satin or silk of the gown, or the lace on the veil, all from the surface of the screen. Or imagine feeling the beading and weave of a blanket made by a local artisan halfway around the world. In five years, industries such as retail will be transformed by the ability to “touch” a product through your mobile device, using haptic, infrared and pressure-sensitive technologies to simulate touch.
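To make that idea concrete, here is a minimal Python sketch of one way a fabric’s texture could be rendered as a vibration signal for a phone’s haptic actuator. The roughness-to-frequency mapping is invented purely for illustration and is not how any shipping haptic system works:

```python
import numpy as np

def texture_to_vibration(roughness, duration=0.5, rate=1000):
    """Map a 0..1 roughness score to a vibration waveform.

    Hypothetical mapping: rougher textures get a higher-frequency,
    stronger buzz. Real haptic rendering is far more sophisticated.
    """
    t = np.linspace(0, duration, int(duration * rate))
    base_hz = 50 + 200 * roughness      # rougher fabric -> faster vibration
    amplitude = 0.2 + 0.8 * roughness   # rougher fabric -> stronger vibration
    return amplitude * np.sin(2 * np.pi * base_hz * t)

smooth_silk = texture_to_vibration(roughness=0.1)
coarse_lace = texture_to_vibration(roughness=0.7)
```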
We take 500 billion photos a year. Seventy-two hours of video is uploaded to YouTube every minute. The global medical diagnostic imaging market is expected to grow to $26.6 billion by 2016.
Computers today only understand pictures by the text we use to tag or title them; the majority of the information — the actual content of the image — is a mystery.
In the next five years, IBM predicts that systems will not only be able to look at and recognize the contents of images and visual data, they will turn pixels into meaning, beginning to make sense of visual content much as a human views and interprets a photograph. In the future, “brain-like” capabilities will let computers analyze features such as color, texture patterns or edge information and extract insights from visual media. This will have a profound impact on industries such as healthcare, retail and agriculture.
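As a small illustration of that kind of analysis, the Python sketch below (using scikit-image; the feature choices are illustrative, not IBM’s) extracts simple color, texture and edge descriptors from an image:

```python
import numpy as np
from skimage import io, color, feature

def extract_basic_features(path):
    """Summarize an image by simple color, texture and edge statistics."""
    rgb = io.imread(path)
    gray = color.rgb2gray(rgb)

    # Color: mean intensity per channel.
    mean_color = rgb.reshape(-1, rgb.shape[-1]).mean(axis=0)

    # Texture: histogram of local binary patterns, a classic descriptor.
    lbp = feature.local_binary_pattern(gray, P=8, R=1.0, method="uniform")
    texture_hist, _ = np.histogram(lbp, bins=10, density=True)

    # Edges: fraction of pixels the Canny detector marks as edges.
    edge_density = feature.canny(gray, sigma=2.0).mean()

    return mean_color, texture_hist, edge_density
```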
Within five years, these capabilities will be put to work in healthcare, making sense of massive volumes of medical information such as MRIs, CT scans, X-rays and ultrasound images to capture information tailored to particular anatomy or pathologies. What is critical in these images can be subtle or invisible to the human eye and requires careful measurement. By being trained to discriminate what to look for in images, such as differentiating healthy from diseased tissue, and correlating that with patient records and scientific literature, systems that can “see” will help doctors detect medical problems with far greater speed and accuracy.
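In miniature, “being trained to discriminate” looks like supervised learning over feature vectors such as those above. The sketch below uses entirely synthetic stand-in data and scikit-learn; it is a toy, not a diagnostic tool:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-ins: 200 "scans", each reduced to 12 numeric features,
# labeled by a made-up rule (real labels would come from radiologists).
rng = np.random.default_rng(42)
features = rng.normal(size=(200, 12))
labels = (features[:, 0] > 0).astype(int)  # 1 = "diseased" in this toy setup

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```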
Ever wish you could make sense of the sounds all around you and be able to understand what’s not being said? According to IBM, within five years, a distributed system of clever sensors will detect elements of sound such as sound pressure, vibrations and sound waves at different frequencies. It will interpret these inputs to predict when trees will fall in a forest or when a landslide is imminent. Such a system will “listen” to our surroundings and measure movements, or the stress in a material, to warn us if danger lies ahead.
Sensors will detect raw sounds, and the system receiving this data will, much like the human brain, take into account other “modalities,” such as visual or tactile information, and classify and interpret the sounds based on what it has learned. When new sounds are detected, the system will form conclusions based on previous knowledge and its ability to recognize patterns.
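A toy version of that pipeline in Python: reduce a raw signal to a frequency-domain fingerprint, then match new sounds against ones the system has already “learned.” The signals and labels here are synthetic and purely illustrative:

```python
import numpy as np

def spectral_fingerprint(signal, n_bands=16):
    """Summarize a 1-D audio signal as normalized energy per frequency band."""
    spectrum = np.abs(np.fft.rfft(signal))
    bands = np.array_split(spectrum, n_bands)
    energy = np.array([band.mean() for band in bands])
    return energy / (energy.sum() + 1e-12)

# "Learned" reference sounds: label -> fingerprint (synthetic test tones).
t = np.linspace(0, 1, 8000)
known = {
    "low rumble": spectral_fingerprint(np.sin(2 * np.pi * 60 * t)),
    "high whine": spectral_fingerprint(np.sin(2 * np.pi * 3000 * t)),
}

def classify(signal):
    """Match a new sound to the closest known fingerprint."""
    fingerprint = spectral_fingerprint(signal)
    return min(known, key=lambda label: np.linalg.norm(known[label] - fingerprint))

rng = np.random.default_rng(0)
noisy_rumble = np.sin(2 * np.pi * 70 * t) + 0.1 * rng.standard_normal(t.size)
print(classify(noisy_rumble))  # prints "low rumble"
```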
For example, “baby talk” will be understood as a language, telling parents or doctors what infants are trying to communicate. Sounds can be a trigger for interpreting a baby’s behavior or needs. By being taught what baby sounds mean – whether fussing indicates a baby is hungry, hot, tired or in pain – a sophisticated speech recognition system would correlate sounds and babbles with other sensory or physiological information such as heart rate, pulse and temperature.
In the next five years, by learning about emotion and being able to sense mood, systems will pinpoint aspects of a conversation and analyze pitch, tone and hesitancy to help us hold more productive dialogues, according to IBM. That could improve customer call center interactions or let us interact seamlessly across cultures.
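Of those cues, pitch is the easiest to demonstrate. The sketch below estimates a voiced segment’s fundamental frequency by autocorrelation, a textbook signal-processing technique rather than anything IBM has published:

```python
import numpy as np

def estimate_pitch(signal, rate=16000, fmin=60, fmax=400):
    """Return a rough fundamental-frequency estimate (Hz) via autocorrelation."""
    signal = signal - signal.mean()
    # Keep only the non-negative lags of the autocorrelation.
    corr = np.correlate(signal, signal, mode="full")[len(signal) - 1:]
    lo, hi = int(rate / fmax), int(rate / fmin)
    lag = lo + np.argmax(corr[lo:hi])  # strongest repeat within the voice range
    return rate / lag

t = np.linspace(0, 0.5, 8000, endpoint=False)  # half a second at 16 kHz
print(estimate_pitch(np.sin(2 * np.pi * 220 * t)))  # close to 220 Hz
```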
During the next five years, tiny sensors embedded in your computer or cell phone will detect if you’re coming down with a cold or other illness. By analyzing odors, biomarkers and thousands of molecules in someone’s breath, doctors will have help diagnosing and monitoring the onset of ailments such as liver and kidney disorders, asthma, diabetes and epilepsy by detecting which odors are normal and which are not.
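Framed as “which odors are normal and which are not,” this is an anomaly-detection problem. A hypothetical sketch with entirely synthetic biomarker readings, using scikit-learn’s IsolationForest:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Synthetic baseline: 500 breath samples, each with 8 molecule concentrations.
rng = np.random.default_rng(7)
baseline = rng.normal(loc=1.0, scale=0.1, size=(500, 8))
detector = IsolationForest(random_state=0).fit(baseline)

# A new reading with one molecule far outside the person's normal range.
todays_breath = np.array([[1.0, 1.1, 0.9, 1.0, 2.5, 1.0, 1.0, 0.95]])
print(detector.predict(todays_breath))  # [-1] flags an abnormal reading
```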
For more information about the IBM 5 in 5, visit http://tinyurl.com/cmkj3nq.