Newly developed technology is bringing thought-controlled devices within reach in fields such as aerospace, healthcare, and advanced manufacturing.
At the University of Technology Sydney (UTS), researchers have developed biosensor technology that enables people to operate devices, such as machines and robots, purely by thought.
The sophisticated brain-computer interface was developed by Distinguished Professor Chin-Teng Lin and Professor Francesca Iacopi, from the UTS Faculty of Engineering and IT, in partnership with the Australian Army and the Defence Innovation Hub.
Beyond its defense applications, the technology has considerable potential in fields like aerospace, healthcare, and advanced manufacturing—for instance, enabling people with a disability to operate prosthetics or control a wheelchair.
The hands-free, voice-free technology works outside laboratory settings, anytime, anywhere. It makes interfaces such as consoles, keyboards, touchscreens, and hand-gesture recognition redundant.
Francesca Iacopi, Professor, Faculty of Engineering and IT, University of Technology Sydney
Iacopi added, “By using cutting-edge graphene material, combined with silicon, we were able to overcome issues of corrosion, durability, and skin contact resistance, to develop the wearable dry sensors.”
A new study describing the technology has just been reported in the peer-reviewed journal ACS Applied Nano Materials. It shows that the graphene sensors developed at UTS are user-friendly, strong, and highly conductive.
The hexagon-patterned sensors are placed over the back of the scalp to detect brainwaves from the visual cortex. The sensors are resilient to harsh conditions, so they could be used in extreme operating environments.
The user wears a head-mounted augmented reality lens that displays white flickering squares. When the operator concentrates on a particular square, the biosensor picks up their brainwaves, and a decoder translates the signal into commands.
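This flicker-and-focus scheme resembles classic frequency-tagged brain-computer interfaces, in which each on-screen target flickers at a distinct rate and the decoder identifies which rate dominates the signal recorded over the visual cortex. The sketch below is purely illustrative and is not the UTS team's decoder: it assumes one hypothetical EEG channel, a distinct flicker frequency per square, and a simple FFT-based peak pick.

```python
import numpy as np

def decode_ssvep(eeg, fs, stim_freqs):
    """Pick the command whose flicker frequency dominates the EEG spectrum.

    eeg        : 1-D array of occipital EEG samples (hypothetical single channel)
    fs         : sampling rate in Hz
    stim_freqs : flicker frequency (Hz) assigned to each on-screen square
    """
    # Window the signal to reduce spectral leakage, then take the magnitude spectrum.
    spectrum = np.abs(np.fft.rfft(eeg * np.hanning(len(eeg))))
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    # Spectral power at each candidate flicker frequency (nearest FFT bin).
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in stim_freqs]
    return int(np.argmax(powers))  # index of the selected square/command

# Simulated operator focusing on the square flickering at 12 Hz.
fs = 256
t = np.arange(0, 2, 1 / fs)  # two seconds of data
eeg = np.sin(2 * np.pi * 12 * t) + 0.5 * np.random.randn(len(t))
print(decode_ssvep(eeg, fs, [8, 10, 12, 15]))  # index 2, i.e. the 12 Hz square
```

In practice a real decoder would use multiple electrodes, harmonics, and learned noise rejection, but the core idea—mapping a dominant stimulus frequency in the brain signal to a discrete command—is the same.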
Recently, the technology was demonstrated by the Australian Army, with soldiers operating a Ghost Robotics quadruped robot via the brain-machine interface. The device enabled hands-free command of the robotic dog with up to 94% accuracy.
Our technology can issue at least nine commands in two seconds. This means we have nine different kinds of commands and the operator can select one from those nine within that time period. We have also explored how to minimize noise from the body and environment to get a clearer signal from an operator’s brain.
Chin-Teng Lin, Distinguished Professor, University of Technology Sydney
The scientists believe the technology will be of interest to industry, government, and the scientific community, and they hope to continue advancing brain-computer interface systems.
Faisal, S. N., et al. (2023) Noninvasive Sensors for Brain–Machine Interfaces Based on Micropatterned Epitaxial Graphene. ACS Applied Nano Materials. https://doi.org/10.1021/acsanm.2c05546.