Using Spot’s API, we integrated a machine learning model to show Spot how to play fetch.
Many applications for Spot center on the robot’s ability to collect frequent, consistent, reliable data. We’ve been working on tools to help customers use the Spot API to integrate machine learning (ML) into those applications, so they can make the most of the data the robot collects. Machine learning algorithms help Spot find and apply patterns in that data. These integrations allow Spot to do things like identify specific objects in its environment and respond based on what it detects.
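The detect-and-respond pattern described above can be sketched in plain Python. This is a minimal, generic sketch, not Spot SDK code: the `Detection` class, the handler dictionary, and the 0.7 confidence threshold are all illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Detection:
    """One object an ML model found in a camera image."""
    label: str         # class name predicted by the model
    confidence: float  # model score in [0, 1]

def respond_to_detections(detections: List[Detection],
                          handlers: Dict[str, Callable[[Detection], None]],
                          min_confidence: float = 0.7) -> List[str]:
    """Dispatch a behavior for each confident detection.

    `handlers` maps an object label to a callback (e.g. a function that
    commands the robot). Returns the labels that were acted on.
    """
    acted = []
    for det in detections:
        if det.confidence >= min_confidence and det.label in handlers:
            handlers[det.label](det)
            acted.append(det.label)
    return acted

# Example: when the model spots a dog toy, trigger a (stubbed) pickup.
detections = [Detection("dog_toy", 0.92), Detection("person", 0.55)]
acted = respond_to_detections(detections, {"dog_toy": lambda d: None})
print(acted)  # ['dog_toy']
```

In a real application the callback would issue a robot command rather than do nothing; the low-confidence "person" detection is ignored by the threshold.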
To help our users get started, we’ve released a new tutorial that teaches your robot to play fetch.
We go in-depth, covering every step of implementing ML in an application.
In this case, the tutorial runs you through the process of teaching Spot to recognize a dog toy, find the toy in the area, pick it up, and return it to you. By the end of the tutorial, you’ll know how to train ML models and use them to have Spot pick up almost any object you want. You’ll also learn to see the output of the model on the tablet while you drive Spot. And finally, we cover using multiple ML models in the same application, including a model you can download and run immediately.
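The fetch sequence the tutorial builds (recognize the toy, go to it, pick it up, return it) is essentially a small state machine. Here is a minimal sketch of that control flow; the state names and the retry-by-searching policy are assumptions for illustration, not the tutorial’s exact code.

```python
from enum import Enum, auto

class FetchState(Enum):
    SEARCH = auto()   # look around until the model detects the toy
    GO_TO = auto()    # walk toward the detected toy
    GRASP = auto()    # use the arm to pick the toy up
    RETURN = auto()   # carry the toy back to the operator
    DROP = auto()     # release the toy and start over

# Happy-path transitions; DROP loops back so Spot fetches again.
TRANSITIONS = {
    FetchState.SEARCH: FetchState.GO_TO,
    FetchState.GO_TO: FetchState.GRASP,
    FetchState.GRASP: FetchState.RETURN,
    FetchState.RETURN: FetchState.DROP,
    FetchState.DROP: FetchState.SEARCH,
}

def step(state: FetchState, succeeded: bool) -> FetchState:
    """Advance on success; on any failure, go back to searching."""
    return TRANSITIONS[state] if succeeded else FetchState.SEARCH

# Example run: detection and approach succeed, but the grasp fails,
# so the robot returns to searching for the toy.
state = FetchState.SEARCH
for ok in (True, True, False):
    state = step(state, ok)
print(state)  # FetchState.SEARCH
```

Structuring the application this way keeps each robot behavior (searching, walking, grasping) in its own step, which makes failures easy to recover from.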
You can find the full tutorial in our Developer Documentation.
The fetch example takes advantage of a range of API features and demonstrates how easy it can be to program custom applications for Spot. The tutorial uses new API features in software release 2.3, such as the Network Compute Bridge and the ML Model Viewer, which let you easily connect ML models to robot behavior and visualize their results in real time. We combine that with our Manipulation and Grasping API to show how to use your models’ results in the loop with sophisticated grasping on the Spot Arm. Even if your robot doesn’t have an arm, you’ll still be able to follow the ML parts of the tutorial.
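To connect a model’s output to a grasp, the application needs a pixel target in the source image for the arm to aim at. One simple choice, shown here as an illustrative sketch rather than the tutorial’s exact code, is the center of the detection’s bounding box:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class BoundingBox:
    """Axis-aligned detection box in image pixel coordinates."""
    x_min: int
    y_min: int
    x_max: int
    y_max: int

def pick_point(box: BoundingBox) -> Tuple[int, int]:
    """Return the (x, y) pixel at the box center -- a simple grasp
    target to pass to the robot along with the source image."""
    return ((box.x_min + box.x_max) // 2, (box.y_min + box.y_max) // 2)

# Example: a dog-toy detection returned by the model
toy_box = BoundingBox(x_min=120, y_min=200, x_max=220, y_max=280)
print(pick_point(toy_box))  # (170, 240)
```

Given that pixel and the image it came from, the robot can plan and execute the grasp itself; the application only has to say *what* to pick up and *where* it appears.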
Of course, we don’t expect to see Spot regularly playing fetch in the park. But these same features serve as the basis for more practical applications—for example, automated roadside trash cleanup. You might teach Spot to recognize litter (as distinct from other items in the environment), pick it up, and bring it to a trash can. Dog toys are just a jumping-off point to a range of real-world possibilities.
Want to learn more about using the Spot API? Watch our on-demand webinar to see how our customers are making custom payloads, controls, and behaviors.