Robots, specifically four-legged, dog-like ones, are learning how to walk properly using virtual simulations.


These robots, according to Ars Technica, are not exactly real just yet, since they currently exist only as computer simulations. Still, they were developed by a team of researchers from Switzerland's ETH Zurich in collaboration with NVIDIA.

Aptly called ANYmals, the four-legged robots are being trained to overcome various virtual obstacles such as steps, slopes, and steep drops.

Every time the robots succeeded in overcoming a challenge, they were presented with a harder one. The researchers did this by slightly modifying the control algorithm, making it a bit more complex with each round.
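This kind of gradually escalating training is often called a curriculum. As a rough illustration only (this is not the team's actual code, and `simulate_episode` is a hypothetical stand-in for a full physics rollout), a simple curriculum can be written as: track the success rate at the current difficulty, and bump the difficulty up once the robot clears it often enough.

```python
import random

def simulate_episode(difficulty: float) -> bool:
    """Hypothetical stand-in for a full physics rollout.

    Returns True if the simulated robot clears the obstacle.
    Here, harder terrain simply lowers the success probability.
    """
    return random.random() < max(0.1, 0.9 - 0.5 * difficulty)

def run_curriculum(levels=10, episodes_per_level=100, promote_at=0.8):
    difficulty = 0.0
    for level in range(levels):
        successes = sum(simulate_episode(difficulty) for _ in range(episodes_per_level))
        success_rate = successes / episodes_per_level
        print(f"level {level}: difficulty={difficulty:.1f}, success rate={success_rate:.2f}")
        if success_rate >= promote_at:
            # The robot handles this terrain well, so present a harder version.
            difficulty += 0.1

run_curriculum()
```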

The main goal of this experiment, according to the researchers, is to train a machine learning algorithm that could be used to drive the legs of robots in the real world.

You can watch a video of the experiment, called "Massively Parallel Deep Reinforcement Learning," posted on the YouTube channel of the Robotic Systems Lab:

ANYmals, according to ANYbotics, are designed to "provide unparalleled mobility" when navigating real-world obstacles. This includes stairs, steps, gaps, and even tight spaces.

Furthermore, it seems like the four-legged robots were also designed to be really tough. ANYbotics promises "reliable performance" even in harsh outdoor conditions, such as rain, wind, snow, dust, and splashes of water.

Read also: Robot Cockroach Features Engineering that Does Not Get Crushed, Moves Fast like the Insect

How Do These Robots 'Learn'?

For many years, the field of robotics has struggled to mimic natural walking gaits, especially for bipedal machines. Even the most advanced of the bunch (like Boston Dynamics' parkour robots) still have fairly limited mobility, despite largely being able to walk on two legs.

These four-legged machines, however, are tackling the problem of learning how to walk by looking to nature, specifically through "reinforcement learning."


This kind of training is how animals in the real world learn specific life skills. For the ANYmals, the researchers used this specific AI method to train the robots by simulating positive and negative feedback.

As the robots navigate the virtual course, the algorithm "judges" how well they performed the task, then proceeds to tweak the controls.
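In very rough terms, that feedback loop looks like the sketch below. This is a deliberately simplified stand-in, not the deep reinforcement learning system the researchers actually used, and the "stride length" parameter is made up for illustration: try a tweak to the controls, score it with a reward signal, and keep the tweak only if the score improves.

```python
import random

# Toy task: find a stride length that maximizes a reward signal.
# The reward peaks at OPTIMAL_STRIDE, which stands in for "walking well."
OPTIMAL_STRIDE = 0.7

def reward(stride: float) -> float:
    """Judges how well the (imaginary) robot performed: higher is better."""
    return -(stride - OPTIMAL_STRIDE) ** 2

def train(steps=200, noise=0.05):
    stride = 0.0                       # initial control parameter
    best_reward = reward(stride)
    for _ in range(steps):
        candidate = stride + random.gauss(0.0, noise)  # tweak the controls
        r = reward(candidate)
        if r > best_reward:            # positive feedback: keep the tweak
            stride, best_reward = candidate, r
        # negative feedback: discard the tweak and try again
    return stride

print(f"learned stride length: {train():.3f}")
```

The real system replaces the single stride parameter with a neural-network policy and uses far more sophisticated updates, but the core idea of scoring behavior and nudging the controls toward higher rewards is the same.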

NVIDIA's AI Tech Proving Effective

NVIDIA's AI technology is being employed to help the ANYmals achieve their objectives. By running the training on this AI hardware, the researchers claim the amount of time spent "training" the robots has been cut to less than one-hundredth of what it previously took.
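The speed-up comes largely from simulating thousands of robots at once as one big batched computation, which is exactly the kind of workload GPUs handle well. The NumPy snippet below only illustrates that batching idea with placeholder dynamics and made-up numbers; the real system uses a full GPU physics simulator and a neural-network policy.

```python
import numpy as np

# Minimal sketch of "massively parallel" training: instead of stepping one
# simulated robot at a time, thousands of copies are stepped together as a
# single batched array operation.

NUM_ROBOTS = 4096   # thousands of simulated robots in parallel (illustrative)
STATE_DIM = 12      # hypothetical per-robot state (joint angles, etc.)
ACTION_DIM = 12

rng = np.random.default_rng(0)
states = rng.standard_normal((NUM_ROBOTS, STATE_DIM))
policy = rng.standard_normal((STATE_DIM, ACTION_DIM)) * 0.01  # toy linear policy

def step_all(states, actions):
    """Advance every simulated robot at once (placeholder dynamics)."""
    next_states = 0.9 * states + 0.1 * actions
    rewards = -np.square(next_states).sum(axis=1)  # toy reward: stay near zero
    return next_states, rewards

for _ in range(10):                      # a few batched simulation steps
    actions = states @ policy            # one matrix multiply controls all robots
    states, rewards = step_all(states, actions)

print(f"mean reward across {NUM_ROBOTS} parallel robots: {rewards.mean():.3f}")
```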

It's this kind of AI technology that helped push NVIDIA CEO Jensen Huang onto TIME's list of the 100 Most Influential People.

However, it's still not without its challenges.

For now, the AI tech is not designed to simulate every physical interaction, such as sliding and climbing. It's these physics that play a massive part in teaching the dog-like robots how to walk.

Looking Ahead

With enough time, the four-legged robots could usher in the era of all-terrain machines that can navigate almost anywhere in the world. They could be given tasks like delivering much-needed supplies to far-off places, and perhaps even helping out in rescue missions during catastrophes.

Related: Robot 'Police' Using 360-Degree Cameras With AI Are Now Patrolling Public Areas in Singapore

This article is owned by Tech Times

Written by RJ Pierce

© 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.