Forget Humans vs. Machines: It’s a Humans + Machines Future

Organized by IBM, the colloquium brought together machine learning leaders from industry and academia to discuss how artificial intelligence can augment human intelligence by helping us make sense of the quintillions of bytes of data generated each day. It’s not about machine intelligence taking over the world, said Kelly.
 
It’s not even about recreating the human brain or its basic architecture. It’s about taking inspiration from the brain, or from wherever else we can get it, and changing the current computing architecture to better handle data and further our understanding of the world.
 
Around 80% of data is unstructured, meaning that current computing systems can’t make sense of it. By 2020, that figure is expected to reach 93%. To a human, unstructured data is far from enigmatic: think of describing a video recording of a street scene to a friend. Easy. To a current computer, however, the task is nearly insurmountable.
 
Yet analyzing unstructured data is far from a theoretical problem. Take medicine, for example. In a single lifetime, a person can generate over one million gigabytes of health-related data, mostly in the form of electronic records and medical images. Multiply this by the world’s population, and the “secret to well-being” may be hidden in this data, said Kelly. Yet we don’t have the means to analyze, interpret and extrapolate from this vast resource.
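For a rough sense of that scale, here is a back-of-the-envelope calculation using the figure quoted above; the world-population number is an assumption added for illustration, not something stated in the article.

```python
# Back-of-the-envelope scale check for the health-data figure quoted above.
# The world-population figure is an assumption added for illustration.
per_person_bytes = 1_000_000 * 10**9   # "over one million gigabytes" ~= 1 petabyte per lifetime
population = 7.5e9                      # assumed world population, roughly mid-2010s

total_bytes = per_person_bytes * population
print(f"per person: {per_person_bytes / 1e15:.0f} PB")
print(f"world total: {total_bytes / 1e21:,.0f} ZB")   # zettabytes, 10^21 bytes
```

Even under these rough assumptions, the total runs to thousands of zettabytes, far beyond what any current system can meaningfully analyze.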
 
The problem lies in both hardware and software. The challenge is formidable, said Dr. Yoshua Bengio, a neural network and deep learning expert at the University of Montreal and an invited speaker. But scientists are making headway on both fronts.
 
Currently, the basic unit of computation, the silicon chip, still relies on an outdated architecture, the von Neumann design first proposed nearly 70 years ago. These chips separate processing and memory, the two main functions they carry out, into different physical regions, which necessitates constant communication between the regions and lowers efficiency. Although this organization is sufficient for basic number crunching and tackling spreadsheets, it falters when fed torrents of unstructured data, as in vision and language processing.
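As a loose illustration of that separation (a toy sketch, not a model of any real processor): in a von Neumann-style machine, every operation pulls operands out of memory, computes on them in the processor, and writes the result back, so even trivial work turns into traffic between the two regions.

```python
# Toy model of a von Neumann-style machine: memory and the processor are separate,
# so every instruction shuttles data across the memory bus.
memory = {"a": 3, "b": 4, "sum": 0}
program = [("load", "a"), ("add", "b"), ("store", "sum")]

accumulator = 0
bus_transfers = 0   # trips between memory and the processor

for op, addr in program:
    if op == "load":
        accumulator = memory[addr]     # memory -> processor
    elif op == "add":
        accumulator += memory[addr]    # memory -> processor
    elif op == "store":
        memory[addr] = accumulator     # processor -> memory
    bus_transfers += 1

print(memory["sum"], "computed with", bus_transfers, "bus transfers")
```

Scaled up to torrents of pixels or words, those round trips, rather than the arithmetic itself, come to dominate the time and energy budget, which is the inefficiency described above.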
 
This is why we took the long and winding road to a production-scale neuromorphic computing chip, said Dr. Dharmendra Modha, chief scientist at IBM. In a paper published last year in the prestigious journal Science, Modha and colleagues at IBM and Cornell University described TrueNorth, a chip that works more like a mammalian brain than the tiny electronic chips that currently inhabit our smartphones.
 
When you look at the brain, it’s both digital and analog, said Dr. Terry Sejnowski, a pioneer in computational neuroscience at the Salk Institute and an invited speaker.
 
It’s digital in the sense that it processes electrical spikes, especially for information that needs to travel long distances without decay. But it’s also analog in how it integrates information. It’s quite noisy and can be very imprecise, but it gets by remarkably well by producing “ok” solutions under strict energy constraints, something that completely evades current computer chips.
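A standard way to capture that digital-plus-analog behavior in simulation is a leaky integrate-and-fire neuron: the membrane potential accumulates noisy input continuously (analog) and emits an all-or-nothing spike when it crosses a threshold (digital). The sketch below is a generic textbook model with arbitrary parameters, not a description of any system presented at the colloquium.

```python
import random

# Leaky integrate-and-fire neuron: analog integration of noisy input,
# digital (all-or-nothing) spikes once a threshold is crossed.
v = 0.0              # membrane potential
leak = 0.9           # fraction of the potential retained each time step
threshold = 1.0      # spike threshold
spike_times = []

for t in range(100):
    input_current = 0.12 + random.gauss(0, 0.05)  # noisy, imprecise input
    v = leak * v + input_current                  # analog accumulation with leak
    if v >= threshold:                            # digital event: fire and reset
        spike_times.append(t)
        v = 0.0

print(f"{len(spike_times)} spikes at steps {spike_times}")
```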
 
The brain is also a master at parallel computing and capable of dealing with immense complexity. Part of this is due to how neurons, the brain’s basic computational units, are dynamically connected. Each individual neuron talks to thousands of neighboring ones through chemical signals at synapses. A message can ripple through the brain’s 100 billion neurons and 100 trillion synapses without the need for pre-programming: neuronal networks that fire together regularly are reinforced, whereas those that don’t are trimmed away.
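That last sentence is essentially Hebbian plasticity, often summarized as “neurons that fire together wire together.” A minimal sketch of the idea, with learning and decay rates chosen arbitrarily for illustration:

```python
import numpy as np

# Hebbian plasticity sketch: strengthen synapses between co-active neurons
# and let little-used connections decay ("use it or lose it").
rng = np.random.default_rng(0)
n_neurons = 8
weights = np.zeros((n_neurons, n_neurons))
learning_rate, decay = 0.1, 0.01

for step in range(1000):
    fired = (rng.random(n_neurons) < 0.3).astype(float)   # which neurons fired this step
    co_active = np.outer(fired, fired)                     # pairs that fired together
    weights += learning_rate * co_active                   # reinforce co-activity
    weights *= (1.0 - decay)                               # gradually trim unused links
    np.fill_diagonal(weights, 0.0)                         # ignore self-connections

print(weights.round(2))
```

Connections that are repeatedly co-active climb toward a stable strength, while rarely used ones decay toward zero, with no pre-programmed wiring diagram.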
 
It’s a highly adaptable, energy-efficient computing architecture, distributed across multiple processing levels and physical regions of the brain. This means that there’s less need to shuttle data from one region to another, said Sejnowski.
 
TrueNorth mimics the brain by wiring 5.4 billion transistors into 1 million “neurons” that connect to each other via 256 million “synapses.” The chip doesn’t yet have the ability to incorporate dynamic changes in synaptic strength, but the team is working towards it.
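Dividing the figures in that paragraph gives a quick sense of the chip’s connectivity, using only the numbers quoted above:

```python
# Connectivity arithmetic from the figures quoted in the article.
neurons = 1_000_000
synapses = 256_000_000
transistors = 5_400_000_000

print(synapses / neurons)      # 256.0 -> about 256 connections per artificial "neuron"
print(transistors / neurons)   # 5400.0 -> about 5,400 transistors per "neuron"
```

An average fan-out of roughly 256 connections per neuron is far sparser than the thousands of synapses per biological neuron mentioned earlier, one of the ways the chip is brain-inspired rather than brain-identical.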
 
“The chip is a fundamental departure from current architectures,” said Modha. But he stressed that it’s not a precise replica of the brain. The brain is very good at some things, such as image perception, intuition, reasoning, even a sense of morality, but it is inept at making sense of vast amounts of data. We’re trying to augment human intelligence with AI, not replicate it, he emphasized.