Computers taking over the world? You’ve been watching too many movies. In fact there’s a lot the human brain can teach its electronic counterpart, according to one of our leading professors, whose work could improve treatments for brain disease as well as build better computers.
Machines with vast artificial intelligence are a constant theme in popular culture. Hollywood films like Ex Machina and hit TV shows like Westworld and Humans all share a common thread: robots outsmarting humans.
But we need not get too excited – or too worried – about the future just yet. A robotic brain that matches our own is a long way off – not least because it would require colossal power.
“For a full-scale computer model of the human brain, we’d be looking at a machine that would need to be housed in an aircraft hangar and consume tens of megawatts,” explains Professor Steve Furber, ICL Professor of Computer Engineering.
“It would need a small power station just to keep it going.
“Frankly, despite the imaginative science fiction that has built up around this, we won’t be building a ‘walking, talking’ android-type robot any time in the near future.”
Our brains run on about 20 to 25 watts, equivalent to a small light bulb. To understand this amazing efficiency, Professor Furber is building the first low-power, large-scale digital system to support real-time neural network models of the human brain.
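A rough calculation from the figures quoted above shows just how wide the efficiency gap is. The sketch below takes "tens of megawatts" at its lowest plausible value of 10 MW, so the real gap would likely be larger:

```python
# Back-of-the-envelope comparison using the article's figures.
# Assumption: "tens of megawatts" is read conservatively as 10 MW.
brain_watts = 20           # human brain: roughly 20-25 W, like a small light bulb
machine_watts = 10_000_000 # hypothetical full-scale brain model: >= 10 MW

ratio = machine_watts / brain_watts
print(f"The machine would draw at least {ratio:,.0f} times more power")
# -> The machine would draw at least 500,000 times more power
```

Even on this deliberately low estimate, the biological brain is around half a million times more power-efficient than a conventional machine modelling it.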
His pioneering approach has led to the development of the SpiNNaker machine (short for Spiking Neural Network Architecture).
Funded by the Engineering and Physical Sciences Research Council (EPSRC), SpiNNaker is based on a computing platform made up of a million Advanced RISC Machine (ARM) microprocessors.
As part of Acorn Computers in the 1980s, Professor Furber was a principal designer of the BBC Microcomputer, which helped to bring computing into people’s homes and schools – a pioneering role later dramatised in BBC television’s ‘Micro Men’ drama. He also co-designed the transformational ARM microprocessor, which remains the foundation of mobile computing worldwide and is found in all kinds of common devices, from our smartphones to the family car.
After helping to revolutionise computing, Professor Furber believes the next big technological development will be ‘cognitive systems’, in other words machines that will be less passive.
“They won’t just sit around waiting for humans to input meaningful data – instead they will actually respond to the environment and interact and engage with it. Obviously, this will require some degree of awareness and understanding.”
SpiNNaker supports computer models of systems that work in ways that are similar to the brain, enabling researchers to model areas of the brain and to test new hypotheses about how the brain might work.
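The "spiking" neurons that give SpiNNaker its name can be illustrated with a standard textbook model. The sketch below is a generic leaky integrate-and-fire neuron – not SpiNNaker's own code, and all parameter values are arbitrary assumptions – showing the basic behaviour such hardware simulates: a membrane voltage that integrates input, leaks toward rest, and emits a discrete spike on crossing a threshold.

```python
# Minimal leaky integrate-and-fire (LIF) neuron: an illustrative,
# generic model of the kind of spiking neuron SpiNNaker simulates.
# Parameter values (tau, thresholds, resistance) are assumed, not sourced.

def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=-65.0,
                 v_reset=-65.0, v_thresh=-50.0, r_m=10.0):
    """Return the membrane voltage trace and spike times (in ms)."""
    v = v_rest
    trace, spikes = [], []
    for step, i_in in enumerate(input_current):
        # Leaky integration: voltage decays toward rest, driven by input.
        v += (-(v - v_rest) + r_m * i_in) * (dt / tau)
        if v >= v_thresh:              # threshold crossed: emit a spike
            spikes.append(step * dt)
            v = v_reset                # reset membrane after spiking
        trace.append(v)
    return trace, spikes

# A constant input current held for 100 ms drives repeated spiking.
trace, spikes = simulate_lif([2.0] * 100)
print(f"{len(spikes)} spikes at times (ms): {spikes}")
```

A SpiNNaker-scale machine runs up to a billion such neurons in real time, with spikes passed between ARM cores as small packets – which is what makes it useful for testing hypotheses about brain function.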
It’s a key player in the €1 billion Human Brain Project, a major European project to build an ICT-based research infrastructure helping scientists work together in the fields of neuroscience, computing and brain-related medicine.
Outclassed by the humble human
Professor Furber’s passion for painstakingly ‘building a brain’ is partly inspired by the fact that computers, despite their huge advances, are still often outclassed by the humble human equivalent.
“After a couple of decades in the conventional processor business, seeing performance rise by more than a thousand-fold, I became frustrated with the things that people – even young babies – can do with apparent ease that computers still struggle with.
For example, an infant is able to instinctively recognise its own mother’s face or reach out and touch something it sees,” he recalls. “So I decided to turn my attention to using computers to try to understand the brain. The ultimate outcome of the work – and it’s a long way down the road – will be to build brain models that can be used to test the effects of new drugs on diseases of the brain, which in turn may lead to better treatments for those diseases.
“Current research into new drugs for brain diseases has all but stopped because modern drug development is based on understanding disease processes – and that understanding is missing for the brain.
“Brain diseases cost the developed economies more than heart diseases, cancer and diabetes put together – not to mention their impact on the quality of life of those affected and their families. To help provide a solution to this huge challenge would perhaps be one of the most important achievements computers could offer us humans.
“In addition, we hope to learn lessons from the brain that will help us build better computers in the future. The brain has a lot to teach us about effective use of parallelism and fault-tolerance, as well as energy-efficiency – which are all major issues for the computer industry.”
So it seems Hollywood has it all wrong. If anything, computers have a lot to learn from us.