Facebook co-founder Mark Zuckerberg has expressed interest in brain-computer interface applications, and the company has recently issued job advertisements for neural imaging engineers to join its research team. Here, cognitive neuroscience expert Dr Jason Taylor discusses whether this is feasible – and argues that social media billionaires should utilise their immense resources to solve more important medical and scientific problems.
[Notification: Jason’s brain has poked you!]
So Facebook is getting into the brain-computer interface (BCI) business. This is equal parts interesting and terrifying. I have no doubt that a direct connection to our brains would be of interest to Facebook – they'd love to know what we're thinking in the ~10 minutes per day we aren't already staring at our phones – but there are some considerable hurdles to be cleared before the device in your pocket can read your most private thoughts directly.
The kinds of 'brain reading' or 'decoding' experiments that often make the news these days require participants to lie still in a magnetic resonance imaging (MRI) scanner for an hour or so whilst they watch a film or a stream of images of different types of objects. Computer algorithms operate on patterns of brain activity and try to 'guess' what the participant was looking at, or even to 'reconstruct' the image from brain activity, and this can be done with some success. Jack Gallant’s group at UC Berkeley made headlines a few years ago with a slightly unnerving video, which shows a stream of stimulus images next to images reconstructed from brain activity whilst subjects viewed those stimulus images. For some images, particularly those containing a human face, the algorithm does rather well. (See also the video by Nature reviewing decoding work.)
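At its heart, this kind of 'guessing' is just pattern classification: learn what brain activity typically looks like for each stimulus category, then assign new activity patterns to the closest match. Here's a minimal sketch of the idea using entirely invented, simulated 'voxel' patterns (a nearest-centroid decoder; real studies use actual fMRI responses and fancier algorithms):

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented voxel-pattern "templates" for two stimulus categories.
n_voxels = 50
templates = {"face": rng.normal(size=n_voxels),
             "house": rng.normal(size=n_voxels)}

def simulate_trial(category, noise=1.0):
    """A noisy activity pattern evoked by viewing `category`."""
    return templates[category] + rng.normal(scale=noise, size=n_voxels)

# "Training": average 20 trials per category to get one centroid each.
centroids = {c: np.mean([simulate_trial(c) for _ in range(20)], axis=0)
             for c in templates}

def decode(pattern):
    """Guess which category evoked `pattern`: the closest centroid wins."""
    return min(centroids, key=lambda c: np.linalg.norm(pattern - centroids[c]))

# Evaluate on fresh simulated trials the decoder hasn't seen.
trials = [(c, simulate_trial(c)) for c in templates for _ in range(25)]
accuracy = np.mean([decode(p) == c for c, p in trials])
print(f"decoding accuracy: {accuracy:.0%}")
```

With clean simulated data the toy decoder is nearly perfect; the hard part in real experiments is that genuine brain responses are far noisier and far less conveniently structured than this.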
[You’ve got neuro-mail! New message from Jason’s brain: “Hey, apparently FB is hiring neuroscientists. Maybe you should apply.”]
Some progress has been made on similar experiments aimed at decoding signals from electroencephalography (EEG), which bypasses the giant magnetic tube in favour of a simple stretchy swimming cap full of electrodes. Unfortunately, due to the nature of the EEG signal, the fidelity of decoding from EEG isn't as good as with MRI. Mobile EEG systems exist which allow the participant to move about freely, so recordings can be made 'in the wild', but these ambulatory systems are generally noisier and often have fewer electrodes, reducing the decoding ability even further. Most research to date has focused on decoding intended movement in order to help individuals with movement disorders or paraplegia. Remember the opening kick-off at the 2014 World Cup in Brazil? It was delivered by a paralysed man wearing a brain-controlled exoskeleton.
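The noise problem is why EEG researchers average over many trials: the evoked response is tiny compared with the background activity, and averaging N trials shrinks the noise by roughly the square root of N. A toy illustration, with a completely made-up evoked-response shape and noise level:

```python
import numpy as np

rng = np.random.default_rng(1)

t = np.linspace(0, 0.6, 300)                     # one 600 ms epoch
erp = 2.0 * np.exp(-((t - 0.3) ** 2) / 0.002)    # invented evoked response, peaking at 300 ms

def record_trial(noise_sd=5.0):
    """One simulated EEG epoch: the tiny evoked response buried in noise."""
    return erp + rng.normal(scale=noise_sd, size=t.size)

def residual_noise(n_trials):
    """Noise left over after averaging n_trials epochs together."""
    avg = np.mean([record_trial() for _ in range(n_trials)], axis=0)
    return (avg - erp).std()

print(f"noise after   1 trial : {residual_noise(1):.2f}")
print(f"noise after 100 trials: {residual_noise(100):.2f}")
# Averaging 100 trials cuts the noise by roughly a factor of 10 (sqrt of 100).
```

This is also why a decoder fed single trials from a cheap mobile headset struggles: it doesn't get the luxury of averaging away the noise before it has to make its guess.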
[A nearby Neuro-Tinder user, “amygdala”, is interested in you! Think ‘right’ to connect, ‘left’ to ignore.]
So what is Facebook up to? If their goal is to implement some sort of 'neuro-LIKE' feature, they might be able to achieve their goal in the not-too-distant future. I can imagine the set-up: In a top-secret lab in the shadowy basement of Building 8, participants wear portable EEG kit whilst they view social media posts on their phones and ‘react’ to each one by pressing the appropriate emoji. Computer algorithms are applied to the data in order to learn which patterns of brain activity correspond to ‘like’, ‘love’, and ‘lol’. Soon thereafter, Facebook releases an update that allows you to react to your friends’ posts via brain activity directly, freeing up your thumbs for more important tasks. (You can have that one for free, Mark; for more ideas, contact me about my consulting fee.) Facebook’s investors might have more lucrative ideas in mind.
Your brain activity might also give away your feelings towards an advert that has been insidiously inserted into your news feed. And it doesn’t bear thinking about what the world’s intelligence agencies might derive from these data, should they be given access.
[Thank you for ordering from neuro-shop! Your new Necomimi EEG-controlled cat-ear headband will be delivered by an Amazon Now drone in the next 2 hours. If your brain activity ordered this product by accident, please think ‘refund’ now.]
Anything approaching Mr Zuckerberg’s stated goal of sending “full, rich thoughts to each other directly” would take a considerable advance in technology. But it’s maybe not as far-fetched as it sounds. Some research groups are working on brain-to-brain communication using EEG on the sending end and transcranial magnetic stimulation (TMS) on the receiving end. The scientist behind the World Cup exoskeleton, Miguel Nicolelis, has demonstrated brain-to-brain collaboration in monkeys: pairs of primates worked together to move an avatar’s virtual arms towards a goal using only their combined brain activity, recorded via implanted electrodes. These demonstrations are promising but ultimately limited, and they require invasive technology and heaps of patience. At the risk of becoming a prediction-fail meme, I suspect it will take rather longer than the two years allotted by the Facebook job advert to reach the level of real-time communication of full-blown thoughts.
[New brain-friend request: Miguel’s brain and your brain reacted similarly to a video of a cat barking like a dog. Maybe you should be friends…?]
Whatever Facebook is working on, the good news is that you are unlikely to become an unwitting participant in a Facebook experiment to read your mind through your brain activity. Why? Because at the very least, you’d need to acquire and wear the appropriate sensors and connect them to your device. Would it be worth it? We’ll have to wait and see what comes out of the project.
Rather than the trivial or somewhat sinister applications I have imagined above, I certainly hope the world’s social media billionaires would choose to apply their considerable resources to more important medical and scientific problems, like enabling movement or communication in patients with neurological disorders, communicating with minimally conscious patients, detecting and diagnosing dementias, or just figuring out how the brain works!
[Comments? Just stare at the space below the article and think* about your opinions! (*Some future technology required. Results may vary.)]