Google DeepMind Announces Plans to Use AI Models for Physical Robots

Introduction to Gemini 2.0
On Wednesday, Google DeepMind announced two new artificial intelligence (AI) models designed for robotics. Both are built on Gemini 2.0, which Google describes as its most capable AI model to date, and they extend its outputs beyond text and images to physical actions in the real world. With Gemini Robotics and Gemini Robotics-ER (for embodied reasoning), Google aims to control robots directly and improve their ability to perform tasks in varied environments.
Collaboration with Apptronik
To develop the next generation of humanoid robots, Google has partnered with Apptronik, a Texas-based robotics company known for its work with NASA and Nvidia. Google recently participated in Apptronik's $350 million funding round. The partnership signals a strategic move by Google to pair Apptronik's hardware expertise with its own AI models in practical robotics applications.
Demonstration of Capabilities
In demonstration videos, Google showed Apptronik robots running the new AI models. The robots plugged devices into power strips, packed lunchboxes, and moved plastic vegetables around, all in response to spoken commands. The demonstrations illustrate the goal of making robots more interactive and responsive to human instructions, though Google has not given a timeline for when such robots might be commercially available.
Key Qualities for Robotic AI Models
Google emphasizes three essential qualities that AI models for robotics must possess to be effective:
- Generality: The robots need the ability to adapt to different situations and environments, applying learned skills across various tasks.
- Interactivity: They must quickly understand and respond to user instructions and any changes occurring around them.
- Dexterity: They should be able to handle tasks that require fine motor control, such as manipulating objects with precision.
Advanced Model for Developers
The Gemini Robotics-ER model serves as a foundation for roboticists, who can use it to train their own AI models. Alongside Apptronik, trusted testers such as Agile Robots, Boston Dynamics, and Enchanted Tools will have access to Gemini Robotics-ER, opening the door for other companies to build on Google's AI technology in the robotics sector.
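Google has not detailed the developer interface in this announcement, but a minimal sketch, assuming a Gemini-style API, suggests how a roboticist might query an embodied-reasoning model for object locations in a camera frame. The model id, prompt format, and file names below are illustrative placeholders, not the documented Gemini Robotics-ER interface; only the google-generativeai client calls are real.

```python
# Hypothetical sketch: asking a Gemini-style embodied-reasoning model to
# locate objects in a workspace image. The model id is a placeholder that
# trusted testers would replace with the id they are actually given.
import json
import google.generativeai as genai
from PIL import Image

genai.configure(api_key="YOUR_API_KEY")  # assumption: a standard Gemini API key

model = genai.GenerativeModel("gemini-robotics-er")  # placeholder model id

workspace = Image.open("workspace.jpg")  # example overhead camera frame

prompt = (
    "Point to the banana and the lunchbox in this image. "
    "Reply as JSON: [{\"label\": str, \"point\": [y, x]}] "
    "with coordinates normalized to 0-1000."
)

response = model.generate_content([prompt, workspace])

# A robot controller would parse the returned points and map them into
# workspace coordinates before planning a grasp.
for item in json.loads(response.text):
    print(item["label"], item["point"])
```

In a real stack, the parsed points would feed a motion planner rather than a print loop; the sketch only shows the request-and-parse pattern such a model might support.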
The Bigger AI Landscape
Google is not alone in pursuing AI for robotics. In November, OpenAI invested in Physical Intelligence, a startup focused on bringing general-purpose AI into robotics. The move reflects a broader trend of major tech companies pushing into humanoid robotics; Tesla, for example, is entering the market with its Optimus robot.
Google’s Vision for Robotics
Sundar Pichai, CEO of Google, said in a recent post that the company views robotics as an important platform for testing and applying AI advances in the physical world. He noted that robots developed under the initiative will use Google's multimodal AI models, allowing them to adapt to changes in their surroundings. By investing in robotics, Google aims not only to advance the technology but also to build practical applications that can assist people in their daily lives.
Looking Forward
The intersection of AI and robotics continues to evolve rapidly, with major tech companies like Google and Tesla leading the charge. As these technologies advance, they promise to reshape how we interact with machines, transforming our expectations of what robots can achieve in everyday tasks. The development and implementation of AI in robotics hold the potential for groundbreaking changes in various sectors, from manufacturing to personal assistance.