
In the past few years, advances in neural networks and deep learning have helped artificial intelligence find useful and commercial applications in many domains.
But one industry that has continued to struggle despite these breakthroughs is robotics.
To be fair, we’ve seen interesting advances in robotics thanks to research in AI subfields such as reinforcement learning. But the old vision of versatile robot assistants that autonomously navigate various settings and assist humans in different tasks still seems a distant dream.
Last week’s release of Spot, Boston Dynamics’ flagship four-legged robot, might have changed things (a bit) for the robotics industry. Spot, formerly known as SpotMini, became famous for YouTube videos in which it hauls trucks, opens doors, and dances to Mark Ronson and Bruno Mars’ hit track “Uptown Funk.”
Now, Spot will be finding its first real home in construction sites, where it will be helping customers by patrolling and inspecting their sites. Boston Dynamics is also exploring possible partnerships with energy companies (for more inspection) and Cirque du Soleil (for entertainment?).
I will not be reviewing Spot here, because I haven’t had first-hand experience with the robot. There are plenty of good reviews out there if you want to know what its capabilities and limits are (I suggest Wired’s in-depth story and The Verge’s video on Spot).
I will be sharing my own observations on what Spot’s launch means for the robotics and the broader AI space. Though it’s a bit too early to judge the outcome, here are the reasons I believe Spot has a chance to succeed where other robots have so far failed. (At the end I will also discuss why there’s still reason to be skeptical about Spot.)
Human-machine collaboration
In many ways, Spot is a reality check on the capabilities and limits of current artificial intelligence technologies. Deep learning and neural networks, now core components of most AI and robotics applications and products, are very good at classification tasks. For instance, a well-trained neural network can perform simple computer vision tasks, such as examining a video feed and taking inventory of the different objects that surround the robot.
Most robots, Spot included, pack several cameras, which they use to monitor their surroundings. Deep learning algorithms then analyze the camera feeds in real time and help the robot make sense of its environment. Some systems, such as self-driving cars, complement deep learning with other technologies such as lidar, which creates three-dimensional models of the vehicle’s surroundings.
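To make the “taking inventory” idea concrete, here’s a toy sketch of the last step of such a perception pipeline. It assumes an upstream object detector (not shown) has already turned one camera frame into (label, confidence) pairs; the class names, the sample detections, and the `min_conf` threshold are all made up for illustration.

```python
from collections import Counter

def inventory(detections, min_conf=0.5):
    """Count objects per class, keeping only detections the
    model is reasonably confident about."""
    return Counter(label for label, conf in detections if conf >= min_conf)

# Hypothetical detector output for a single frame on a work site.
frame_detections = [
    ("person", 0.91), ("pallet", 0.88), ("pallet", 0.72),
    ("forklift", 0.64), ("pallet", 0.31),  # low confidence, dropped
]
counts = inventory(frame_detections)
# e.g. counts tells us the frame contains 1 person, 2 pallets, 1 forklift
```

The hard part, of course, is the detector itself, which is exactly where deep learning shines; the bookkeeping above is trivial by comparison.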
But deep learning struggles when it comes to understanding the relations between objects or solving problems that require common sense. These AI models are also very poor at making decisions in scenarios they haven’t been trained for. Unfortunately for robots, they must often navigate open environments that require a lot of background knowledge. That’s why robots powered by deep learning and neural networks often fail at simple tasks that a human would perform subconsciously.
Spot, however, has not been designed to be completely autonomous. A human operator controls Spot remotely with a fully featured controller that looks like a Nintendo Switch. But this doesn’t mean that Spot is a dumb robot that depends fully on its remote operator. In fact, the robot makes full use of its AI smarts to make life easier for the operator.
For instance, the operator can tap a point on the controller’s video screen, and Spot will find a way to get there. Also, as the many videos featuring Spot show, the robot is very good at handling different terrains and maintaining its balance on various slopes and ground conditions. It will also figure out how to avoid collisions and walk around obstacles.
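At a very high level, that tap-to-waypoint feature boils down to a path-planning problem: given a map of free and occupied space, find a route from the robot’s position to the goal. Spot’s actual planner is proprietary; the sketch below only illustrates the idea with a plain breadth-first search over a toy occupancy grid (the grid and all names are illustrative).

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search over a 2D occupancy grid.
    grid[r][c] == 1 marks an obstacle; returns a list of
    (row, col) cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            # Walk parent pointers back to the start.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and nxt not in came_from:
                came_from[nxt] = cell
                frontier.append(nxt)
    return None  # goal unreachable

# A 4x4 map with a wall across row 1; the planner routes around it.
grid = [
    [0, 0, 0, 0],
    [1, 1, 1, 0],
    [0, 0, 0, 0],
    [0, 1, 1, 1],
]
path = plan_path(grid, (0, 0), (2, 0))
```

A real robot works with continuous space, noisy sensor maps, and its own body dynamics rather than a clean grid, but the division of labor is the same: the machine handles the route, the human picks the destination.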
So basically, the robot might not be the smartest decision maker, but it makes good use of AI and robotics to handle the basics of walking around and standing upright. This enables its human operators to focus on the goal they want to accomplish and make the commonsense decisions that Spot can’t.
This is the kind of intelligence augmentation that has so far proven the most successful way to apply current AI. Similar projects and use cases are emerging in other fields. An example is drone flight, where machine learning–based flight controllers take care of things such as flight stability and let pilots focus on where to fly.
Until we have AI technologies that can handle open environments, the controller-based model of Spot seems to be a successful formula.
Focusing on industrial uses

Another thing I like about how Boston Dynamics handled Spot’s rollout is the domain they chose. Instead of going for a broad domain with many use cases and uncertainties, the company chose to launch and test their robot in a limited and controlled environment.
Unlike homes, streets, or offices, which serve many purposes and are open environments, industrial settings such as factories, warehouses, and construction sites are much more limited in nature. Lighting and weather conditions, the variety of objects, and the kinds of activities you see in these professional environments are fairly constant. For instance, you’ll probably never see a child running around a construction site, or a pedestrian pushing their bicycle through it in the dark of night. Situations like these are called edge cases, and current AI models are very bad at handling them.
Therefore, by deploying Spot in environments where edge cases and uncertainties are much more limited, Boston Dynamics can make sure that their robot will have a smoother transition into real-world use cases.
There’s precedent for this kind of approach succeeding in AI-based systems. For instance, while AI researchers continue to search for ways to create robust self-driving cars that can handle the open environment of public roads, self-driving forklifts have already become a growing market. This is because forklifts operate in warehouses and industrial complexes, where lighting, weather, and terrain are constant and edge cases are very few.
There’s reason to believe that Spot will have a better chance of succeeding in these kinds of environments.
Spot will still have to pass its test
While I’m optimistic overall about Spot’s release, I also have a few reservations. Boston Dynamics is leasing its robots instead of selling them, and the company has refused to give any clear figures on how much each Spot will cost its customers. The lack of a public price point is a bit worrying.
Also, the release of Spot comes against the backdrop of the shutdown of several robotics companies. Rethink Robotics, the manufacturer of the famous Baxter and Sawyer robots, closed shop last October. Anki, a startup that raised $200 million to create cute home robots, shut down in April, after it ran out of money to support its hardware and software business (update: the company’s assets were later acquired by German robotics firm HAHN Group). Mayfield Robotics, the maker of the Kuri home robot, also ceased operations in 2018.
All these events have cast a shadow of doubt over the viability of robotics companies. The truth is, building a successful business model around a robot is very challenging. Further complicating the picture is the fact that Boston Dynamics itself has struggled to find a successful business model for its robots in recent years.
It’s one thing to upload YouTube videos of robots doing backflips and parkour. But delivering a robot to market that provides a solid return on investment for customers is a totally different challenge, especially when you have investors who expect you to become profitable at some point.
Google, which acquired Boston Dynamics in 2013, sold it to SoftBank in 2017 after failing to find a way to integrate the company’s innovations into its own products.
So, is Spot’s release a desperate attempt by the 26-year-old Boston Dynamics to keep its new owners satisfied, or is it a real step toward creating useful robots that can work alongside humans? Time will tell.