
Many applications for Spot center on the robot's ability to collect frequent, consistent, reliable data. We've been building tools that help customers use the Spot API to integrate machine learning (ML) into those applications, so they can make the most of the data the robot collects. Machine learning algorithms help Spot find and apply patterns in that data, allowing it to do things like identify specific objects in its environment and respond based on what it detects.

To help our users get started, we've released a new tutorial that teaches your robot to play fetch.

We go in-depth to cover every step of implementing ML in an application, including:

  • Collecting data
  • Training an ML model
  • Evaluating the model while you're driving Spot
  • Grasping an object identified by the ML model
  • Using multiple models' results in an autonomous application
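The steps above can be sketched as a simple control loop. The sketch below is purely illustrative: every function name in it (`detect_dog_toy`, `grasp`, and so on) is a hypothetical stand-in, not the Spot SDK's actual API.

```python
# Hypothetical sketch of one fetch cycle, with stand-in stubs for the
# ML detectors and robot actions. Not real Spot SDK calls.

def run_fetch(detect_dog_toy, detect_person, grasp, carry_to, drop):
    """One fetch cycle: find the toy, pick it up, return it to a person."""
    toy = detect_dog_toy()          # ML model 1: locate the dog toy
    if toy is None:
        return "no toy found"
    if not grasp(toy):              # arm pick would happen here
        return "grasp failed"
    person = detect_person()        # ML model 2: locate the person
    if person is None:
        return "no person found"
    carry_to(person)                # walk to the person with the toy
    drop()                          # release the toy
    return "fetched"

# Example run with trivial stubs standing in for real models and motion:
result = run_fetch(
    detect_dog_toy=lambda: (0.5, 0.2),   # fake toy position
    detect_person=lambda: (2.0, 1.0),    # fake person position
    grasp=lambda toy: True,
    carry_to=lambda person: None,
    drop=lambda: None,
)
print(result)  # fetched
```

The point of the structure is the last bullet above: two independent models (toy detector, person detector) feed one autonomous behavior, with each step checked before the next runs.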

Specifically, the tutorial walks you through teaching Spot to recognize a dog toy, find the toy in the area, pick it up, and return it to you. By the end, you'll know how to train ML models and use them to have Spot pick up almost any object you want. You'll also learn how to view the model's output on the tablet while you drive Spot. And finally, we cover using multiple ML models in the same application, including a model you can download and run immediately.

You can find the full tutorial in our Developer Documentation.





Figure: Using machine learning to find the rope dog toy

API Features

The fetch example takes advantage of a range of API features and demonstrates how easy it can be to program custom applications for Spot. The tutorial uses new API features in software release 2.3, such as Network Compute Bridge and the ML Model Viewer, to let you easily connect ML models to robot behavior and visualize their results in real time. We combine those with our Manipulation and Grasping API to show how your models' results can drive sophisticated grasping with the Spot Arm in the loop. Even if your robot doesn't have an arm, you can still follow the ML parts of the tutorial.
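The Network Compute Bridge pattern boils down to this: the robot-side client ships a camera image to an off-board model server and gets labeled detections back, and the application then picks a detection to act on. The sketch below illustrates only that selection logic with plain Python; the `Detection` type and field names are assumptions for illustration, not the bosdyn API's actual types.

```python
# Illustrative sketch of acting on detections returned by an off-board
# model server. The Detection type here is hypothetical, not a bosdyn type.

from dataclasses import dataclass

@dataclass
class Detection:
    label: str              # class name the model assigned
    confidence: float       # model confidence in [0, 1]
    center_px: tuple        # (x, y) image pixel a grasp request could target

def best_detection(detections, label, min_confidence=0.5):
    """Pick the highest-confidence detection of the wanted label."""
    hits = [d for d in detections
            if d.label == label and d.confidence >= min_confidence]
    return max(hits, key=lambda d: d.confidence, default=None)

# Pretend the model server returned these for one camera image:
detections = [
    Detection("person", 0.91, (410, 120)),
    Detection("dog_toy", 0.62, (233, 305)),
    Detection("dog_toy", 0.88, (240, 310)),
]
target = best_detection(detections, "dog_toy")
print(target.center_px)  # (240, 310)
```

In the real application, the chosen pixel location is what gets handed to the grasping side of the pipeline, which is why filtering on label and confidence before acting matters.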

Of course, we don't expect to see Spot regularly playing fetch in the park. But these same features serve as the basis for more practical applications—for example, automated roadside trash cleanup. You might teach Spot to recognize litter (as distinct from other items in the environment), pick it up, and bring it to a trash can. Dog toys are just a jumping-off point to a range of real-world possibilities.

Want to learn more about using the Spot API? Join our upcoming webinar to see how our customers are making custom payloads, controls, and behaviors. Register today to have your questions answered live.
