From materials handling to inventory and floor cleaning, autonomous mobile robots are finding more uses beyond factories and warehouses. Brain Corp. has developed BrainOS, a software-as-a-service offering that provides autonomy and reporting features; it now powers more than 16,000 robots and has logged over 4.9 million hours of operation.
The San Diego, Calif.-based company was founded in 2009, and its customers include Walmart, Kroger, and the Mall of America. Denver Public Schools, one of the largest school districts in the U.S., plans to use Tennant Co. floor scrubbers with BrainOS, starting with 14 buildings.
Brain recently hired Jon Thomason as its new chief technology officer. He is former vice president of autonomous solutions at Uber Advanced Technologies Group (ATG), and he has also held senior roles at Amazon and Qualcomm. Robotics 24/7 spoke with Thomason about his experience, his views on autonomous mobile robots (AMRs), and his plans for Brain Corp.
You've been a software leader at tech companies like Uber ATG, Oculus, and Qualcomm. What brought you to Brain Corp?
Thomason: While it seems like they're all different, the big commonality is that there's a lot of systems software. While artificial intelligence and deep learning are obviously components, there's a lot of other software—Web portals, interactions, real-time pipes, updates—with remarkable parallels in my 30 years in the industry.
So much of robotics is making a solid system. What was exciting about joining Brain Corp is bringing a lot of systems software rigor to the process and seeing Brain Corp's software powering 16,000 robots. Self-driving cars are a ways from that.
What is it about AMRs that make them so potentially helpful? Don't people expect too much or too little from robots?
Thomason: Imagination has preceded robots by a substantial degree. The word “robot” is 100 years old, and we've been dreaming about robots in science fiction for years. Expectations are sky-high, but we've been talking about humanoid robots for decades. Nobody really dreamed about touchscreens.
This has created a level of unsustainable hype, like what has hurt self-driving cars. It might have helped raise money, but expectations for robots are unrealistic.
With AMRs, humans are needed to teach and train robots. If a programmer has to go to a site and write unique code because every site is different, this makes maturing robotics hard.
Businesses could get Ph.D.s from leading universities to program an 80% solution using ROS [the open-source Robot Operating System], but they need things that can be deployed in the real world today. Workers change jobs every six months, high school and store environments are cluttered, and they change every day. This is not some sleek science fiction world; it's messy.
Our “secret sauce” at Brain Corp is that rather than build robots, we have pioneered the intellectual property in this space with an unbelievably simple user interface [UI]. There's a button—press “Teach,” drive the robot where it needs to go, and press “Save.” It's localized in 12 languages.
When we save, we do magic, taking all the sensor data and building a map on the fly. Self-driving cars follow a very different process. We converge the map so it's all square, and clever software and algorithms retain all the priors we need.
Next time, when you want to go in autonomous mode, pushing the “Clean” button shows a list of routes, and all you need to do is push “Go.” From a UI point of view, that's it.
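The teach-and-repeat flow Thomason describes—record while an operator drives, save, then replay—can be illustrated with a minimal sketch. This is not BrainOS code; the `RouteRecorder` class, `Pose` type, and method names are hypothetical, chosen only to mirror the Teach/Save/Go buttons.

```python
from dataclasses import dataclass, field

@dataclass
class Pose:
    """A 2-D robot pose: position plus heading (illustrative)."""
    x: float
    y: float
    heading: float

@dataclass
class RouteRecorder:
    """Hypothetical teach-and-repeat flow: capture poses while an
    operator drives, then replay them as waypoints in autonomous mode."""
    routes: dict = field(default_factory=dict)
    _recording: list = field(default_factory=list)

    def teach(self):
        """Operator presses 'Teach': start capturing poses."""
        self._recording = []

    def record_pose(self, pose: Pose):
        """Called periodically as the operator drives the robot."""
        self._recording.append(pose)

    def save(self, name: str):
        """Operator presses 'Save': store the taught route by name."""
        self.routes[name] = list(self._recording)

    def replay(self, name: str):
        """Operator presses 'Go': yield waypoints for the controller."""
        yield from self.routes[name]

# Teach a short route, then replay it.
recorder = RouteRecorder()
recorder.teach()
recorder.record_pose(Pose(0.0, 0.0, 0.0))
recorder.record_pose(Pose(1.0, 0.0, 0.0))
recorder.save("aisle-3")
waypoints = list(recorder.replay("aisle-3"))
```

The point of the sketch is the division of labor: the operator only touches three buttons, while mapping, localization, and obstacle avoidance happen underneath.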
What's happening underneath is the AI and robotics stuff. You don't need a programmer to get robots to drive around obstacles. Others treat it like a research project, and then you have to send someone out with a laptop. That's not scalable. With BrainOS, you just retrain the route.
Reality is complicated, and companies like Walmart need something that people without specialized education can use all the time.
Speaking of the workforce, how do you address concerns about displacement?
Thomason: Brain did research and saw a clear need. Rather than build robots from scratch, we work with OEMs like Tennant. We went to the floor-cleaning space because, as the saying goes, “Nobody got fired for buying from IBM,” and nobody got fired for buying from Tennant.
We know how the scrubbing machines work. They can all be manually driven, and it's crazy what some people do with them, resulting in damage. A robot does the thing the same way each time.
We don't displace people—they go to do more productive work, such as stocking. Where's the ROI [return on investment]? With our software, staffers are more productive, the robots are not damaging the store, and we provide proof of safe cleaning. We're doing lots of stuff in the cloud and working on a mobile app for store managers.
In fact, at Walmart [which uses SoftBank's Whiz cleaning robots powered by BrainOS], it's good ROI if technology is not displacing people. It increased its number of robots this year by 200.
What other applications besides cleaning does Brain enable?
Thomason: We strategize a lot. We have an extended, paid-for pilot with Sam's Club to do shelf scanning in 11 stores today. We are doing daily runs of shelf analytics with partners. We handle the data collection and localization, and they do OCR [optical character recognition] and image analysis.
We can do very precisely located images, which they've found useful. We're in negotiations for final products after a six-month pilot. ROI is really complicated, and robots can be very expensive.
Are there any applications where AMRs aren't the best fit?
Thomason: For the next couple of decades, we'll be discovering different uses for different types of automation. When you say, “robot,” people immediately snap to a $70,000 humanoid and ask what it does. That's a focus on form over function. Very specialized robots are going to be the norm.
We've been successful at Brain Corp by not inventing form factors. We're not hunting for “What is this thing useful for?” We can put our software in an efficient robot.
For example, self-driving car sensors can cost more than $100,000 per vehicle—some double that. You might be able to afford that if it's a ride-share vehicle that's in use 18 to 24 hours per day, but that's not true for a robot that spends three to four hours a day cleaning.
Robot parts need to be a few thousand dollars, and you have to start with the function and see if you can make it at a reasonable cost. Uber ATG used lidar and radar, and Tesla looks at traffic signs. A vision-based approach alone won't work in stores; you just need some sensors that can enable robots to avoid people and shelves and that don't make hundreds or thousands of pieces of equipment prohibitive.
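The utilization argument above is just amortization arithmetic. The sketch below makes it concrete; the `cost_per_operating_hour` function and the five-year service life are illustrative assumptions, not figures from Brain Corp.

```python
def cost_per_operating_hour(sensor_cost: float, hours_per_day: float,
                            service_years: float = 5) -> float:
    """Amortized sensor cost per hour of actual use.
    The five-year service life is an assumed figure for illustration."""
    total_hours = hours_per_day * 365 * service_years
    return sensor_cost / total_hours

# Ride-share vehicle: $100,000 of sensors, ~20 hours/day over 5 years
print(round(cost_per_operating_hour(100_000, 20), 2))   # ≈ 2.74 per hour
# Floor scrubber: same sensor suite, ~3.5 hours/day of cleaning
print(round(cost_per_operating_hour(100_000, 3.5), 2))  # ≈ 15.66 per hour
```

At low daily utilization, the same sensor suite costs roughly six times more per working hour, which is why a cleaning robot's bill of materials has to stay in the low thousands.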
What's the most exciting thing you're working on right now?
Thomason: I took the following four initiatives to the board in December:
- Autonomy excellence, which improved 25% this calendar year.
- Customer and robot insights, which are code words for “cloud”—sending insights based on data we're creating back to customers through a Web portal and mobile app. The portal even shows things like hardware starting to fail.
- A next-generation hardware autonomy kit, which is a big initiative this year.
- The shelf-scanning initiative with Sam's Club.
These are reasonably tactical. For strategic things, we're looking at autonomy excellence for things that have nothing to do with floor cleaning, like delivery robots and shelf scanning. Our pilots with Walmart on in-store delivery are mostly for restocking, moving goods from back rooms to stockers.
Customers also like to have AMRs pull carts for trash runs. They go to drop zones, drop off and pick up carts. The cloud is also a huge feature for managing fleets of robots.
Speaking of fleets, who should manage the different types of AMRs that are emerging?
Thomason: Depending who you ask, you'll get different answers. Formant and InOrbit will say there should be a centralized company or software.
We're big enough to do this sort of thing and manage our own fleet. The data really belongs to the customer. We don't share their data with anybody else. Brain is super-strict about the idea of sharing data to manage fleets. Farming data isn't good practice, and it's a tough sales proposition.
We tailor the solution to what works best with their stores—how user experience, training, and intervention are different online versus with a laptop. We're making it part of our suite. We're not just commoditizing ourselves as an autonomy provider; we need to provide more.
Since we use a RaaS [robotics-as-a-service] business model, we're charging people for the software. Others sell the robots, and we sell the service. The data goes to their own apps, portals, and reports. With APIs [application programming interfaces], several big customers can do their own data mining.
What do you think about the interoperability efforts currently under way?
Thomason: There is such a thing as standardizing APIs too early. For example, there was an early push for RPC and CORBA that went awry. Does OLAP still matter? It doesn't; the Web crushed all that. Everybody thinks the iPad was the first tablet, but before that was Windows Tablet Edition—the hardware just wasn't ready.
We won't know what the Web will be like for robotics, so I'm not sure about standardizing functions before it makes sense. When robots are ubiquitous, we'll need anti-collision protocols, ways to query for status and position, and query-publish protocols like those for webcams but with better security.
You can't “do it light and cheap” with robots. Even with AMRs, some weighing up to 1,000 lb., safety is mission-critical. I'm not pessimistic—IoT [the Internet of Things] and 5G are going to happen, and at some point, warehouses and distribution centers will start to buy standardized equipment.
As Brain Corp expands applications, do you foresee your autonomy systems applying to robots outdoors?
Thomason: We're focused on cleaning and delivery and are going to expand environments for both. We're not currently looking at outdoors. We could, but as soon as robots are where there are vehicles, things can get exciting. Even in parking lots, which are simpler than roads—a lot can happen, and the long tail of stuff is very long.
When I worked at Uber ATG, I worked on safety systems. We tried to figure out what to do in the case of sensor or compute failure. The best thing to do is pull off the road and stop—that's usually the right answer, but 5% of the time, it's the worst thing to do. You can't just stop in highway traffic or at an intersection or a ramp.
To be profitable, robots can't be dangerous. We're not anywhere near the “singularity” with robotics and AI. We write rules-based systems—a series of “if, then” statements—and rely on computer vision. Reinforcement learning won't work with cars; there's no way to converge the branches, which deep learning networks can do.
It's easier to stay with deterministic behavior and train for use cases and add rules. It would be great to make robots that could get shopping carts in a parking lot, and we're looking at working with a cart tug maker.
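A rules-based system of the kind Thomason describes reduces to deterministic “if, then” branches over sensor inputs. The sketch below is a hypothetical illustration of that style; the thresholds and action names are invented for the example, not Brain Corp values.

```python
def next_action(obstacle_distance_m: float,
                stop_threshold: float = 0.5,
                slow_threshold: float = 2.0) -> str:
    """Deterministic, rules-based reaction to a sensed obstacle.
    Given the same input, the robot always does the same thing—
    thresholds here are illustrative, not production values."""
    if obstacle_distance_m < stop_threshold:
        return "stop"            # obstacle too close: halt immediately
    if obstacle_distance_m < slow_threshold:
        return "slow_and_replan" # obstacle nearby: slow down, steer around
    return "continue"            # path clear: keep following the route
```

Because each branch is explicit, new use cases are handled by adding rules and retesting, rather than retraining an opaque learned policy.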