Researchers at the Massachusetts Institute of Technology (MIT) have developed a new drone that integrates a muscle-control system called "Conduct-A-Bot."

According to TechCrunch's latest report, the newly developed drone can be navigated by its pilot using only gestures.


(Photo: Matthew Lejune on Unsplash)

MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) released a video showing how the muscle-control feature is used to operate the device. In the demonstration, a pilot steers the lab's newest drone through a series of rings using only hand and arm gestures.

The feature is impressive not only because it uses biofeedback rather than other kinds of gesture recognition to control the drone, but also because such controls open up a range of potential applications and make the remote technology more accurate.


New MIT Drone Will Have Muscle-Control System; Pilots Can Now Use Gestures to Navigate

According to TechCrunch, the researchers have been exploring different applications of the muscle-control feature, including collaborative robotics in industrial settings, where it could improve a variety of devices, drones among them.

The report stated that drone piloting is another area that could see real-world benefits from the muscle-control feature, especially if a single pilot could control flocks of drones while viewing what they see through virtual reality (VR).

The upgrade could also make the new drones well suited to site surveying, allowing them to inspect offshore platforms and other infrastructure that is hard for people to reach.

Seamless interaction between robots and humans is the researchers' main goal: working with a robot should feel as intuitive as the way people use their own abilities and movements to manipulate their environment.

The team of researchers and developers believes the process should be as smooth as the way people normally move and work alongside robots.

According to MIT CSAIL, the researchers achieve this with electromyography (EMG) and motion sensors worn on the forearms, biceps, and triceps to measure muscle signals and movement for accurate navigation.

The muscle signals are processed by algorithms that detect gestures in real time, without any offline calibration or per-user training data. To reduce the barrier operators face when interacting with the drones, the system uses just two or three wearable sensors.
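To illustrate the general idea, here is a minimal sketch of threshold-based gesture detection over smoothed EMG envelopes. It assumes two simulated channels (biceps and triceps); the channel names, thresholds, and command mapping are illustrative assumptions, not CSAIL's actual Conduct-A-Bot pipeline.

```python
# Minimal sketch: rectify raw EMG, smooth it into an envelope, and map
# envelope activity to drone commands. All values here are hypothetical.
from collections import deque
import math
import random

WINDOW = 50  # samples in the moving-average envelope (assumed)


class EmgChannel:
    """Rectify raw EMG samples and track a moving-average envelope."""

    def __init__(self):
        self.buf = deque(maxlen=WINDOW)

    def update(self, sample: float) -> float:
        self.buf.append(abs(sample))            # full-wave rectification
        return sum(self.buf) / len(self.buf)    # smoothed muscle-activation level


def classify(biceps_env: float, triceps_env: float, threshold: float = 0.5) -> str:
    """Map envelope activity to a command (illustrative mapping only)."""
    if biceps_env > threshold and triceps_env > threshold:
        return "stop"      # co-contraction: both muscles tensed
    if biceps_env > threshold:
        return "ascend"    # biceps flex
    if triceps_env > threshold:
        return "descend"   # triceps extension
    return "hover"         # relaxed arm


def simulated_emg(active: bool) -> float:
    """Stand-in for a hardware read: noisy signal, larger when the muscle is tense."""
    amplitude = 1.0 if active else 0.1
    return amplitude * math.sin(random.uniform(0, 2 * math.pi)) + random.gauss(0, 0.05)


if __name__ == "__main__":
    biceps, triceps = EmgChannel(), EmgChannel()
    for t in range(200):
        flexing = 80 <= t < 160                      # pretend the pilot flexes mid-run
        b_env = biceps.update(simulated_emg(flexing))
        t_env = triceps.update(simulated_emg(False))
        if t % 40 == 0:
            print(f"t={t:3d} command={classify(b_env, t_env)}")
```

A real system would also fuse motion-sensor data and support a richer gesture vocabulary; this sketch only shows the envelope-and-threshold intuition behind calibration-free, real-time detection.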

TechCrunch also said that "cobotics," the industry focused on creating robots that can work safely alongside and in close collaboration with humans, would greatly benefit from the advances made by MIT's research team.

The "Conduct-A-Bot" can make the interaction between people and robotic equipment more instinctive, natural, and safe.

