Leaps, Bounds, and Backflips
Perception and adaptability enable Atlas to perform varied, high-energy behaviors like parkour.
What does it take for a robot to run, flip, vault, and leap like an athlete? Creating these high-energy demonstrations is a fun challenge, but our technical goals go beyond flashy performance. On the Atlas project, we use parkour as an experimental theme to study problems related to rapid behavior creation, dynamic locomotion, and connections between perception and control that allow the robot to adapt, quite literally, on the fly.
Robot perception algorithms are used to convert data from sensors like cameras and lidar into something useful for decision making and planning physical actions. While Atlas uses IMUs, joint positions, and force sensors to control its body motion and feel the ground for balance, it requires perception to identify and navigate obstacles such as the gap and narrow beam seen in Figure 1.
Atlas uses a time-of-flight depth camera to generate point clouds of the environment at 15 frames per second. The point cloud is a large collection of range measurements. Atlas’ perception software extracts surfaces from this point cloud using an algorithm called multi-plane segmentation. The output of this algorithm is fed into a mapping system that builds models of the different objects that Atlas sees with its camera.
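To make this concrete, here is a minimal sketch of the peel-off-one-plane-at-a-time technique behind multi-plane segmentation, written with Open3D's RANSAC plane fitting. It illustrates the general idea, not Atlas' production code; the thresholds and the input file name are assumptions.

```python
# A minimal sketch of multi-plane segmentation via repeated RANSAC plane
# fitting (Open3D). Illustrative only -- not Boston Dynamics' implementation;
# the thresholds and file name below are assumptions.
import open3d as o3d

def extract_planes(cloud, max_planes=6, min_inliers=500):
    """Fit a plane, peel off its inliers, and repeat on the remainder."""
    planes = []
    remaining = cloud
    for _ in range(max_planes):
        if len(remaining.points) < min_inliers:
            break
        # Points within 2 cm of the candidate plane count as inliers.
        model, inliers = remaining.segment_plane(distance_threshold=0.02,
                                                 ransac_n=3,
                                                 num_iterations=1000)
        if len(inliers) < min_inliers:
            break
        planes.append((model, remaining.select_by_index(inliers)))
        remaining = remaining.select_by_index(inliers, invert=True)
    return planes

cloud = o3d.io.read_point_cloud("frame.pcd")  # hypothetical input frame
for (a, b, c, d), patch in extract_planes(cloud):
    print(f"plane {a:.2f}x + {b:.2f}y + {c:.2f}z + {d:.2f} = 0, "
          f"{len(patch.points)} points")
```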
Figure 2 shows what Atlas sees and how that perception is used to plan actions. In the top left is the infrared image captured by the depth camera. The white points in the main image form the point cloud. Orange outlines mark the detected rectangular faces of parkour obstacles, which are tracked over time from the sensor observations. These detected faces are then used for planning specific behaviors. For example, the green footsteps represent a future plan of where to jump and jog next.
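One way to picture planning footsteps "relative to" a tracked face is to store the step targets in the face's local coordinate frame and re-project them into the world whenever the tracker updates the pose, so the plan follows the obstacle if it moves. The toy sketch below uses hypothetical frames and numbers; it is not Atlas' planner.

```python
# Toy sketch: footstep targets anchored to a tracked obstacle face. Targets
# live in the face's local frame; when the tracker updates the face pose,
# the world-frame footsteps move with it. All values are illustrative.
import numpy as np

def pose_to_matrix(xyz, yaw):
    """Build a 4x4 world-from-face transform from position and heading."""
    c, s = np.cos(yaw), np.sin(yaw)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0], [s, c, 0], [0, 0, 1]]
    T[:3, 3] = xyz
    return T

# Footsteps annotated in the face's own frame (homogeneous coordinates).
steps_local = np.array([[0.2,  0.1, 0.0, 1.0],
                        [0.4, -0.1, 0.0, 1.0]])

world_from_face = pose_to_matrix([3.0, 1.0, 0.4], yaw=0.1)  # tracker output
steps_world = (world_from_face @ steps_local.T).T[:, :3]
print(steps_world)
```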
In order to execute an extended parkour course, we give the robot a high-level map that includes where we want it to go and what stunts it should do along the way. This map is not an exact geometric match for the real course; it is an approximate description containing obstacle templates and annotated actions. Atlas uses this sparse information to navigate the course, but fills in the details with live perception data. For example, Atlas knows to look for a box to jump on; if the box is moved 0.5 meters to the side, Atlas will find it and jump onto it. If the box is moved too far away, the system won't find a match and the robot will come to a stop.
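That matching step can be pictured as a nearest-detection search with a cutoff. The sketch below is a hedged illustration of the idea; the function name and the 1.0 m search radius are assumptions, not values from the Atlas system.

```python
# Hedged sketch of matching a map template to live detections: accept the
# nearest detection within a search radius, else report no match (the robot
# then comes to a stop). Radius and names are assumptions.
import numpy as np

def match_template(expected_xy, detections_xy, search_radius=1.0):
    """Return the detection nearest the expected pose, or None if every
    detection is farther than search_radius."""
    if len(detections_xy) == 0:
        return None
    d = np.linalg.norm(np.asarray(detections_xy) - expected_xy, axis=1)
    best = int(np.argmin(d))
    return detections_xy[best] if d[best] <= search_radius else None

# Box moved 0.5 m sideways from where the map expects it: still matched.
print(match_template(np.array([3.0, 0.0]), [np.array([3.0, 0.5])]))
# Moved too far: no match, and the behavior aborts.
print(match_template(np.array([3.0, 0.0]), [np.array([3.0, 2.0])]))
```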
This animation is a 3D visualization that shows what the robot is seeing and planning as it navigates the parkour obstacle course. Actively tracked objects are drawn in green and fade from green to purple as they go out of view of the robot’s perception sensors. The tracking system continuously estimates the poses of objects in the world and the navigation system plans the green footsteps relative to those objects using information from the map.
Each of the moves you see Atlas perform during a parkour routine is derived from a template created ahead of time using trajectory optimization. Creating a library of these templates allows us to keep adding new capabilities to the robot by adding new trajectories to the library. Given target locations from perception, the robot chooses behaviors from the library that match those targets as closely as possible.
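As a rough illustration, the library can be indexed by a few geometric parameters and queried for the entry closest to what perception measured. The feature set and entries below are invented for the example; they are not the actual Atlas behavior parameterization.

```python
# Illustrative template selection: pick the pre-optimized behavior whose
# geometric parameters best match the measured obstacle. Invented entries.
import numpy as np

# (behavior name, platform height m, gap length m)
library = [("jump_up_40cm", 0.40, 0.0),
           ("jump_up_52cm", 0.52, 0.0),
           ("leap_gap_1m",  0.00, 1.0)]

def choose_template(measured_height, measured_gap):
    feats = np.array([[h, g] for _, h, g in library])
    target = np.array([measured_height, measured_gap])
    return library[int(np.argmin(np.linalg.norm(feats - target, axis=1)))][0]

print(choose_template(0.48, 0.0))  # -> "jump_up_52cm", the nearest template
```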
Designing behaviors offline via trajectory optimization allows our engineers to explore the limits of the robot's capabilities interactively ahead of time and reduce the amount of computation we have to do on the robot. For example, the details of how exactly the robot coordinates its limbs to launch and tuck for a backflip can have a major impact on success due to physical constraints like actuation limits. Leveraging offline optimization allows us to capture important constraints like this at design time and adapt them online using a single, general-purpose controller.
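A toy version of offline trajectory design makes the idea concrete: direct transcription over a short horizon, with the actuation limit expressed as simple bounds on the decision variables. This 1D point-mass problem is a stand-in for illustration only, nowhere near Atlas' full-body dynamics.

```python
# Toy direct-transcription trajectory optimization: choose accelerations for
# a 1D point mass that reach a target at rest, minimizing effort, subject to
# an actuation limit. Horizon, timestep, and limit are assumed values.
import numpy as np
from scipy.optimize import minimize

N, dt, u_max = 20, 0.05, 8.0  # steps, step size (s), accel limit (m/s^2)

def rollout(u):
    """Integrate x'' = u from rest; return final position and velocity."""
    x = v = 0.0
    for ui in u:
        v += ui * dt
        x += v * dt
    return x, v

def effort(u):
    return float(np.sum(np.square(u)))

cons = [{"type": "eq", "fun": lambda u: rollout(u)[0] - 1.0},  # end at x = 1
        {"type": "eq", "fun": lambda u: rollout(u)[1]}]        # end at rest

res = minimize(effort, np.zeros(N), method="SLSQP",
               bounds=[(-u_max, u_max)] * N, constraints=cons)
print("feasible:", res.success, "peak |u|:", np.abs(res.x).max())
```

Solving problems like this offline, at design time, is what lets the online controller start from a template that already respects the hardware's limits.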
Once the robot has identified the boxes, ramps, or barriers in front of it and planned a sequence of maneuvers to get over them, the remaining challenge is filling in all of the details needed to reliably carry out the plan.
Atlas’ controller is what’s known as a model-predictive controller (MPC) because it uses a model of the robot’s dynamics to predict how its motion will evolve into the future. The controller works by solving an optimization that computes the best thing to do right now to produce the best possible motion over time.
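The structure of such a controller can be sketched in a few lines: predict over a horizon using a dynamics model, minimize a cost, apply only the first planned control, then re-solve at the next tick. The example below uses a 1D double integrator and cvxpy purely for clarity; the model, weights, and limits are assumptions, and Atlas' MPC uses a far richer model.

```python
# Minimal receding-horizon MPC sketch on a 1D double integrator. Shows only
# the structure of MPC, not Atlas' controller; all constants are assumed.
import numpy as np
import cvxpy as cp

dt, H = 0.05, 15                      # timestep and prediction horizon
A = np.array([[1, dt], [0, 1]])       # state: [position, velocity]
B = np.array([[0], [dt]])
u_max, target = 10.0, np.array([1.0, 0.0])

def mpc_step(x0):
    x = cp.Variable((2, H + 1))
    u = cp.Variable((1, H))
    cost, cons = 0, [x[:, 0] == x0]
    for k in range(H):
        cost += cp.sum_squares(x[:, k + 1] - target) \
                + 0.01 * cp.sum_squares(u[:, k])
        cons += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k],
                 cp.abs(u[:, k]) <= u_max]
    cp.Problem(cp.Minimize(cost), cons).solve()
    return u.value[0, 0]              # apply only the first planned control

state = np.array([0.0, 0.0])
for t in range(40):                   # closed loop: re-plan every step
    state = A @ state + B.flatten() * mpc_step(state)
print("final state:", state)
```

Re-solving at every step is what gives the controller its ability to absorb surprises: the plan is always recomputed from the state the robot actually ended up in.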
As we described above, each template in our behavior library gives the controller information about what a good solution looks like. The controller adjusts details like force, posture, and behavior timing to cope with differences in environment geometry, foot slips, or other real-time factors. Having a controller that can deviate significantly from template motions simplifies behavior creation, since we don't need a template for every possible scenario the robot will encounter. For example, jumping off of a 52 cm platform isn't that different from a 40 cm one, and we can trust MPC to figure out the details.
The predictive property of MPC also allows Atlas to see across behavior boundaries. For example, knowing that a jump is followed by a backflip, the controller can automatically create smooth transitions from one move to another. This again simplifies the behavior creation problem, since we need not account for all possible sequences of behaviors ahead of time. There are, of course, limits to the improvisation we can expect from MPC; attempting to transition to a backflip from a fast forward jogging motion, for example, wouldn't work. In general, we have to strike a balance between controller complexity and behavior library size.
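One way to picture seeing across boundaries is that the reference the controller tracks is sliced from the concatenation of consecutive templates, so the prediction window naturally straddles the seam between two moves. The sketch below uses made-up reference data purely to show that windowing.

```python
# Sketch: a predictive horizon that "sees" across a behavior boundary. The
# reference window is sliced from two concatenated templates, so near the
# seam the optimizer is already shaping the jump's ending to set up the
# backflip. Reference data is invented for illustration.
import numpy as np

jump_ref = np.linspace(0.0, 0.6, 30)       # e.g., pelvis height over a jump
backflip_ref = np.linspace(0.6, 0.9, 40)   # then the flip's reference
timeline = np.concatenate([jump_ref, backflip_ref])

H = 15  # prediction horizon in steps
def reference_window(t):
    """Reference slice the controller tracks at time t."""
    return timeline[t:min(t + H, len(timeline))]

print(reference_window(25))  # window straddles the jump/backflip boundary
```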
Our work on parkour has given us a strong understanding of how to create and control a wide range of dynamic behavior on Atlas (also including dance). But more importantly, it created the opportunity to design an extensible software system that will grow with our team as Atlas gains new abilities to perceive and change its environment. We’re excited to continue building on this foundation as we expand the scope of what Atlas can do.