Robot helper
(Photo: @askkell / Unsplash)

Google and Alphabet are combining two of their most ambitious research initiatives, robotics and AI language understanding, to create a "helper robot" that can fully comprehend natural-language instructions.

Google's Robot Projects

Alphabet has been developing robots that can do simple jobs, such as fetching beverages and scrubbing floors, since 2019. These are all Everyday Robots projects, but they are still in their early stages, with the robots slow and hesitant. Now, these robots are about to receive an upgrade: enhanced language understanding thanks to Google's large language model (LLM), PaLM.

Sure, helper robots have been roaming the earth for a while, but most can only answer straightforward requests, like fetching you a bottle of water from across the room. LLMs like GPT-3 and Google's MUM, however, can better understand the intention behind more subtle demands.

In Google's case, according to The Verge, you may say to one of the Everyday Robots prototypes, "I spilled my drink, can you assist?" The robot then interprets this command as "bring me the sponge from the kitchen" after filtering it through an inbuilt list of possible courses of action.

Google has named the resultant system PaLM-SayCan, a name that captures how the model combines the language comprehension skills of LLMs ("Say") with the "affordance grounding" of its robots (that's the "Can": mapping instructions onto actions the robot can actually perform).
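The article does not give implementation details, but the Say/Can combination it describes can be illustrated with a toy sketch. Everything below is an assumption for illustration: the skill list, the hand-set feasibility scores, and the word-overlap stand-in for a real language model's relevance score are all hypothetical, not part of the actual PaLM-SayCan system.

```python
import re

# Hypothetical robot skills with hand-set "Can" scores: the robot's estimated
# probability of executing each skill successfully in its current state.
SKILLS = {
    "find a sponge": 0.9,
    "pick up the sponge": 0.6,
    "go to the kitchen": 0.8,
    "bring me a drink": 0.4,
}

def say_score(instruction: str, skill: str) -> float:
    """Stand-in for the 'Say' side: how relevant a skill is to the instruction.

    A real system would use an LLM's likelihood that the skill description is a
    useful next step; here we just use crude word overlap for illustration.
    """
    words = lambda s: set(re.findall(r"[a-z]+", s.lower()))
    overlap = words(instruction) & words(skill)
    return len(overlap) / max(len(words(skill)), 1)

def pick_skill(instruction: str) -> str:
    # Combine language relevance ("Say") with feasibility ("Can") and
    # choose the skill with the highest product.
    return max(SKILLS, key=lambda s: say_score(instruction, s) * SKILLS[s])

print(pick_skill("I spilled my drink, can you help me find the sponge?"))
# -> "find a sponge"
```

The key design idea, as the article describes it, is that neither score alone is enough: the language model proposes what would be helpful to say, while the robot's feasibility estimate grounds that in what it can actually do.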

"With PaLM, we're seeing new capabilities emerge in the language domain such as reasoning via chain of thought prompting. This allows us to see and improve how the model interprets the task," the Google announcement read.


How AI Language Models Can Improve Helper Robots

Google reports that with PaLM-SayCan incorporated into its robots, the bots could plan proper responses to 101 user instructions 84% of the time and successfully execute them 74% of the time.

Per a recent Wired story, a demonstration at Google's robotics lab in Mountain View, California, showed that these helper robots require no human programmer to teach them how to respond to moderately complex demands. Instead, the control software, trained on millions of pages of text from the internet, has learned how to convert a spoken phrase into a sequence of physical actions.

This demonstrates how helper robots may be able to adapt to the unpredictability of the real world. As Google Research and Everyday Robots collaborate to integrate the best language models with robot learning, robots' capacity to draw on internet-scale language knowledge and respond to instructions is already a step forward.

"Language models contain enormous amounts of information about the world, and it turns out that can be pretty helpful to the robot. PaLM can help the robotic system process more complex, open-ended prompts and respond to them in ways that are reasonable and sensible," says Google.

Meanwhile, for those concerned that things may go wrong, Google reiterates that it takes a proactive stance on this research and adheres to Google's AI Principles in developing helper robots. Safety is the top consideration, especially for a learning robot.


This article is owned by Tech Times

Written by Thea Felicity

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.