Without words – what if computers could intuitively understand us
Wednesday, 18. January 2017
Media information No. 8/2017
[Photo: © TU Berlin/PR/Oana Popa]
The experimental setting is not entirely unlike the popular children’s party game Topfschlagen (“Hit the Pot!”): one child is blindfolded and has to find a hidden pot by “blindly feeling around”. The other children sit in a circle, calling out “hot” or “cold”, depending on how close the blindfolded child is to the pot.
Dr. Thorsten Zander, postdoc in the research group for Biological Psychology and Neuroergonomics led by Prof. Dr. Klaus Gramann at TU Berlin, working in collaboration with Laurens Krol (TU Berlin) and Prof. Dr. Nils Birbaumer (University of Tübingen), conducted an experiment with his participants based on a similar principle: the participants were instructed to watch a computer screen showing a flashing cursor that randomly jumps through a grid of 16 nodes toward a pre-specified target in one of the corners. Each participant wore a headset with a network of several electrodes. Their brainwaves were captured by a so-called brain-computer interface (BCI) and sent to a special software application to be analyzed and evaluated. The participants were given one single task: watch the cursor as it randomly jumps around.
“The results were spectacular: over time, as the participants simply watched the screen, the cursor found the target more and more quickly. In the first round, it needed 27 jumps on average to reach the objective. But in subsequent rounds, it could do it in 13,” reported Thorsten Zander, a mathematician by training. The computer “learns” from the participants without requiring them to participate intentionally – or without them necessarily even knowing. The participants’ brains naturally play the role of the children in the party game, unknowingly revealing information to the computer about which movements are “hot” (when the cursor moves toward the target) and which are “cold” (when the cursor moves away from the target).
“We were able to show for the very first time that a passive brain-computer interface is capable of detecting unconscious brain signals, analyzing them, and turning them into an actionable instruction for the computer,” explained Thorsten Zander. These results have now been published in the prestigious journal “Proceedings of the National Academy of Sciences” (PNAS).
“The brain activity used by the computer to determine the accuracy of the motion of the cursor is emitted by the medial prefrontal cortex region of the brain. We know from the literature that precisely this area of the brain is where so-called ‘predictive coding’ takes place.” ‘Predictive coding’ describes the brain’s tendency to construct a specific model of its surroundings and continuously make predictions about what will happen next in order to be able to respond adequately. For example, this allows people to anticipate the trajectory of a falling cup within a split second, so that they can catch it before it hits the floor.
“When the participants look at the flashing cursor, which they do not know they can influence, and see that it moves in the ‘hot’ direction – toward the target – the brain’s prediction is confirmed, resulting in a certain ‘peak’ in their brain activity. If the cursor jumps in the ‘cold’ direction, the brain’s prediction is rejected, which creates a different peak,” explained Thorsten Zander. Each of these types of peak is detected by the BCI and converted into motion commands for the cursor by a special algorithm. The algorithm deduces the direction of the target from the brain’s unconscious reactions.
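The feedback loop described above can be sketched in a few lines of Python. This is a minimal illustration, not the published algorithm: the grid size, the evidence-accumulation rule, and the function names are assumptions, and brain_response is a stand-in for the classified EEG peak that, in the real experiment, signals whether a jump was “hot” or “cold”.

```python
import random

GRID = 4           # 4 x 4 grid of 16 nodes, as in the experiment
TARGET = (3, 3)    # pre-specified target in one corner

def distance(a, b):
    """Manhattan distance between two grid nodes."""
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def brain_response(old, new, target):
    """Stand-in for the classified EEG peak: 'hot' if the jump moved
    the cursor toward the target, 'cold' otherwise."""
    return "hot" if distance(new, target) < distance(old, target) else "cold"

def run_episode(seed=0):
    """Random cursor walk that accumulates 'hot'/'cold' evidence per
    direction and gradually prefers directions judged 'hot'."""
    rng = random.Random(seed)
    pos = (0, 0)
    scores = {}    # accumulated evidence for each candidate direction
    jumps = 0
    while pos != TARGET:
        x, y = pos
        moves = [(dx, dy) for dx, dy in [(1, 0), (-1, 0), (0, 1), (0, -1)]
                 if 0 <= x + dx < GRID and 0 <= y + dy < GRID]
        # bias the otherwise random jump toward directions with more
        # accumulated 'hot' evidence
        weights = [1.0 + max(0, scores.get(m, 0)) for m in moves]
        move = rng.choices(moves, weights=weights)[0]
        new = (x + move[0], y + move[1])
        # update the evidence from the (simulated) brain response
        delta = 1 if brain_response(pos, new, TARGET) == "hot" else -1
        scores[move] = scores.get(move, 0) + delta
        pos = new
        jumps += 1
    return jumps
```

In this toy version the cursor starts as a random walk and, as evidence accumulates, increasingly favors the directions the simulated brain response marked as “hot” – mirroring how the real cursor reached the target in fewer jumps over time.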
Thorsten Zander’s vision is not just to move cursors reliably across the screen. He is working toward a new type of neuroadaptive technology: “We’ve had computers for around 60 years. In that time, the performance of these computers has grown exponentially, but the interaction between humans and computers remains limited by the bottleneck of communicating human intentions to the machine by pressing keys or moving the mouse. In this paper, we demonstrated for the first time that, after a suitable calibration phase, passive brain-computer interfaces can do more than simply detect yes/no decisions from our brain activity. They can reconstruct a model of complex thought processes from it and extract various pieces of information, allowing the computer to independently deduce machine instructions without requiring conscious human input.” This opens completely new avenues for interaction: consider for example how computers are already capable of suggesting frequently visited websites. In Thorsten Zander’s vision, the computers of the future might be able to use the current peaks in our brain activity to determine whether we are more likely to go online shopping or look for work.
This also carries serious ethical implications. Because of this and other similar work, a large conference on neuroadaptive technologies will be held in Berlin in July 2017, focusing on the ethical aspects of this technology, among other things.
One interesting question remains unanswered: can participants suppress the unconscious information revealed to the computer by their brain activity via the BCI? “That’s exactly what our next project will attempt to find out. It has just now been approved by the DFG,” says Thorsten Zander.
For further information please contact:
Dr. Thorsten Zander
Biological Psychology and Neuroergonomics