MIT researchers demonstrate ability to tell a Boston Dynamics robotic dog to fetch while wearing a pair of AttentivU mind-reading smart glasses

Nataliya Kosmyna, Ph.D., controlling the robotic Spot using thoughts read by AttentivU smart glasses. (Source: Nataliya Kosmyna, Ph.D., BRAINI)

An MIT research group has demonstrated the ability to mind-control a four-legged Boston Dynamics Spot. Participants wore a pair of AttentivU smart glasses with built-in electrodes that read brain activity, and were able to tell the robotic dog to fetch items and move about by thought alone.

A research group at MIT has published a paper reporting its ability to control a robotic dog through thoughts picked up by a pair of mind-reading smart glasses. The Boston Dynamics Spot was instructed to fetch bottles and toys from different rooms: the participant thought responses to a series of preset queries asking for the next command. These responses were picked up by a pair of AttentivU smart glasses, which have built-in electrodes that measure brain activity from behind the ears.

2023 saw demonstrations of many mind-controlled robots. Most of these require either implanted electrodes or a brain cap worn over the skull and wired to dozens of electrodes to pick up thoughts. While such methods can control robotic limbs and hands almost as precisely as natural ones, the equipment is cumbersome and expensive to buy and set up.

The AttentivU glasses have electrodes to measure brain activity, sensors to track eye movement, and Wi-Fi and Bluetooth connectivity to transmit data. They are easier and cheaper to wear than brain caps, as well as more aesthetically pleasing.

During the demonstration with two participants, data was transmitted to an iPhone running an app that used AI/ML cloud computing to analyze the thoughts. The app asked yes/no questions about what Spot should do next and correctly understood the participants' answers approximately 84% of the time.
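
The paper's own code is not reproduced here, but a minimal, purely illustrative sketch of such a yes/no query loop might look like the following. Every name in it (the command list, classify_yes_no, read_eeg_window, send_spot_command) is a hypothetical placeholder, not the MIT/Ddog implementation or the Boston Dynamics Spot SDK; in the real system the decisions come from AttentivU EEG data streamed to an iPhone app and a cloud-hosted classifier rather than the stand-in threshold used below.

```python
# Illustrative sketch only: a yes/no "next command" loop for a robot dog.
# All functions below are placeholders standing in for the glasses, the
# cloud classifier and the robot interface described in the article.

import random
from typing import List

COMMANDS: List[str] = [
    "fetch the bottle",
    "fetch the toy",
    "walk to the next room",
    "return to the user",
]

def read_eeg_window(n_samples: int = 256) -> List[float]:
    """Placeholder for streaming a window of EEG samples from the glasses."""
    return [random.uniform(-1.0, 1.0) for _ in range(n_samples)]

def classify_yes_no(eeg_window: List[float]) -> bool:
    """Placeholder for the cloud ML classifier: True means a 'yes' thought.
    Here it is faked with the sign of the mean sample value."""
    return sum(eeg_window) / len(eeg_window) > 0.0

def send_spot_command(command: str) -> None:
    """Placeholder for dispatching the chosen command to the robot."""
    print(f"Sending to Spot: {command}")

def command_loop() -> None:
    # Walk through the preset yes/no questions; the first 'yes' response
    # selects that command, mirroring the query flow described above.
    for command in COMMANDS:
        print(f"Should Spot '{command}'? (think yes or no)")
        if classify_yes_no(read_eeg_window()):
            send_spot_command(command)
            return
    print("No command selected; asking again.")

if __name__ == "__main__":
    command_loop()
```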

The Ddog project demonstrates how quickly mind-reading interfaces have improved in cost and accuracy. In the meantime, why not play around with a robotic dog of your own (like this one at Amazon)?

David Chien – News Writer – 43 articles published on Notebookcheck since 2023

Having worked at Activision, UCLA, Anime Expo and more, I’ve seen technology used to save lives, create games, and build fantastic 3D VR/AR worlds. There’s always something fun in emerging technology that I want to get my hands on, and my friends turn to me to find the best gear for their needs, so I’m glad to bring my experience to Notebookcheck.

David Chien, 2024-01-01 (Update: 2024-01-01)

Copyright for syndicated content belongs to the linked source: NotebookCheck – https://www.notebookcheck.net/MIT-researchers-demonstrate-ability-to-tell-a-Boston-Robotics-robotic-dog-to-fetch-while-wearing-a-pair-of-AttentivU-mind-reading-smart-glasses.788418.0.html