Robot controlled by humans using nothing but THOUGHTS and hand gestures


The machine, which scientists at America’s Massachusetts Institute of Technology have named Baxter, is capable of reading human brainwaves which tell it whether somebody is unhappy with its actions.

In this way, Baxter is able to notice and correct his mistakes.

His owner can then direct the machine into performing a different task using subtle hand gestures.

Project supervisor Daniela Rus said the aim was to “move away from a world where people have to adapt to the constraints of machines to develop robotic systems that are a more natural and intuitive extension of us”.

A team at the Massachusetts Institute of Technology’s Computer Science and Artificial Intelligence Laboratory (MIT CSAIL) staged a demonstration with a robot using a drill.

During the test, the robot moves the drill to one of three possible targets on a mock plane.

The team proved the system works on people Baxter has never seen before, meaning companies could use him in real-world settings without the need to train him to work with different users.

The team makes use of two bio-sensing techniques: electroencephalography (EEG) to measure brain activity and electromyography (EMG) to measure muscle activity.

Users have a series of electrodes placed on their scalp and forearm to detect impulses.

PhD candidate Joseph DelPreto, who was lead author on the paper, explained: “By looking at both muscle and brain signals, we can start to pick up on a person’s natural gestures along with their snap decisions about whether something is going wrong.

“This helps make communicating with a robot more like communicating with another person.”

In isolation, neither EEG nor EMG signals are entirely reliable – but by merging the two, the team hit upon a more robust approach to bio-sensing.

CSAIL director Daniela Rus, who supervised the work, said: “This work combining EEG and EMG feedback enables natural human-robot interactions for a broader set of applications than we’ve been able to do before using only EEG feedback.

“By including muscle feedback, we can use gestures to command the robot spatially, with much more nuance and specificity.”
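The fusion idea described above can be illustrated with a minimal sketch. This is not MIT’s actual pipeline – the function names, weights and threshold below are illustrative assumptions – but it shows why combining two individually unreliable detector confidences can yield a more robust decision than either channel alone.

```python
# Illustrative sketch only: fusing two noisy detector confidences
# (an EEG "error detected" score and an EMG "gesture detected" score),
# each normalised to the range 0..1.

def fuse(eeg_score: float, emg_score: float, eeg_weight: float = 0.5) -> float:
    """Weighted average of the two channel confidences."""
    return eeg_weight * eeg_score + (1 - eeg_weight) * emg_score

def detect_error(eeg_score: float, emg_score: float, threshold: float = 0.5) -> bool:
    """Flag a robot mistake when the fused confidence crosses a threshold."""
    return fuse(eeg_score, emg_score) >= threshold

# EEG alone is ambiguous (0.45 sits just below the threshold), but
# supporting EMG evidence pushes the fused score over it:
print(detect_error(0.45, 0.7))  # fused score 0.575 -> True
print(detect_error(0.45, 0.3))  # fused score 0.375 -> False
```

In practice each score would come from a trained classifier over the raw electrode signals, and the weights would be tuned per application; the weighted-average rule here is simply the most common baseline for combining classifier confidences.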

The team envisages the system being useful for the elderly, as well as for workers with speech difficulties or limited mobility.

Their research will be presented at the Robotics: Science and Systems (RSS) conference in Pittsburgh next week. 
