How Simbe Robotics is Innovating in Retail with Brad Bogolea

Simbe Robotics         
23 November 2021

share this:

Kate speaks with Brad Bogolea, CEO and Co-founder of Simbe Robotics. Simbe Robotics developed a mobile robot named Tally, which is bringing advanced shelf insights to improve the retail shopping experience.

Tally provides a state-of-the-art sensing system on a robust, scalable platform that collects analytics in real-time.

Brad Bogolea

Brad Bogolea is the CEO and Co-Founder of Simbe Robotics, where he is responsible for the company’s vision and the execution of its leading retail intelligence solution. In November 2015, Brad brought to market the Tally robot, the world’s first autonomous shelf auditing and analytics solution, which helps retailers ensure merchandise is always stocked, in the right place, and correctly priced. The National Retail Federation Foundation has named Brad to its list of “People Shaping Retail’s Future.”

Prior to Simbe, Brad spent 10 years in the energy and wireless sensor industry. Most recently Brad worked at Silver Spring Networks, where he led global product management and business development efforts for their energy management and data analytics platform for energy utilities. Products under Brad’s leadership allowed the world’s largest utility companies and their energy consumers alike to gain efficiency and visibility of their energy usage through smart sensors and data.

Brad holds a B.S. in Computer Science and Engineering from Pennsylvania State University where he also returned to serve as an Entrepreneur-in-Residence.



Jana: [00:00:00] Simbe Robotics, with Robohub, the podcast for news and views on robotics.

Welcome to the Robohub podcast. In today’s episode, we find out how mobile robots can improve the retail industry, helping with tasks like knowing when to restock [00:01:00] shelves and determining whether products are priced correctly. Our interviewer Kate speaks with Brad Bogolea, CEO and co-founder of Simbe Robotics, a company that has created a mobile robot named Tally, which is designed to solve some of the key issues in retail efficiency.

Kate Zhou: Hello, welcome to Robohub. Would you please introduce yourself?

Brad Bogolea: Sure. Hi there, I’m Brad Bogolea, CEO and co-founder of Simbe Robotics.

Kate Zhou: Welcome, Brad. Very happy to interview you today. Could you begin by telling us what led you to found Simbe Robotics? What problems were you trying to solve?

Brad Bogolea: Yes, absolutely.

The specific problem we’re focused on at Simbe is one we’ve all experienced as retail store shoppers: we’ve all walked into our local retail store and found that the product we’re looking for wasn’t actually there. Retail stores happen to [00:02:00] be these really dynamic environments, with tens of thousands or hundreds of thousands of products coming in and out of the store.

And it’s incredibly difficult to actually keep track of product stock: is it in the right place? Does it have the right price? As a founding team, we felt strongly that the latest advancements in robotics and computer vision could really be brought into the retail market and help transform the retail shopping experience.

So that’s a little bit on the problem we’re solving. As far as how we got started: prior to founding Simbe, I had spent most of my career leveraging advanced technology to help transform specific industries. I spent a long time in the energy space, and then had the fortunate opportunity to reconnect with an old friend and college buddy who happened to be working at Willow Garage, [00:03:00] and through the lens of Willow and his work there, I started to see all the unique possibilities of mobile robotics and computer vision.

And when Willow disbanded in late 2013, we really thought it was a great opportunity to come together and build an enterprise technology company.

Kate Zhou: Very interesting. Cool. So you chose a robotics-based solution for the retail problems that we’re seeing currently. You mentioned you have some experience with robotics and computer vision, and that’s what led you there, but why do you think this is so appropriate for the space right now? And were there other, non-robotic solutions out there before you started?

Brad Bogolea: Yeah. So the status quo in the retail industry today is manually checking shelves with labor. But these environments are so [00:04:00] dynamic that it’s actually an impossible problem to keep up with all the various changes.

And you want your labor in these stores to be highly focused on things like customer service, restocking, and picking online grocery orders. What we have found in actual blind studies is that an automated solution can find between five and 25 times the number of anomalies on shelf that a store team can.

So capturing this data at a greater frequency and fidelity provides a lot of transformational value to the retail store. And this is not just something you can solve with back-office AI software. Most retailers know roughly what products leave the warehouse, and they of course know what shoppers are buying in their stores.

But everything that happens in between is a little bit of a mystery, because the store shelves aren’t instrumented. And given the [00:05:00] sheer scale of these stores, it’s not cost-effective to put things like weight sensors or fixed cameras throughout these really large-format environments. What is on these shelves, and how they’re organized, are also changing quite regularly.

So it really becomes a perfect opportunity for something like a mobile robotics platform. And you don’t have to deal with all the challenges that autonomous vehicles have to deal with in the outside world.

Kate Zhou: I see, interesting. Can you walk us through Tally, the mobile robotics platform that your company has?

Brad Bogolea: Absolutely. So Tally is really designed to capture high-quality image data, both 2D and 3D, from retail store shelves. What’s great is that it doesn’t require any infrastructure changes to get into these environments. You really just [00:06:00] unpack it, turn it on, and it starts to build a map of the store environment using many classic simultaneous localization and mapping (SLAM) techniques from robotics.

Once we have a map of the store environment, we’re then able to begin extracting value from the store shelves. The way that Tally operates is that, typically somewhere between two and five times a day, it leaves its charging dock autonomously and goes up and down the store aisles,

performing a raster scan of one side of an aisle at a time. It will go through the entire store, identifying which products need to be restocked and which prices are incorrect, and gathering up-to-date product location information. We can then give that to shoppers who may want to come to the store and buy a product.

Or we can give that information to companies like Instacart or DoorDash, so they know what store they should go to and where to find a product, making that [00:07:00] online picking process super efficient.
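The up-and-down aisle pattern Brad describes is essentially a simple coverage route. As a rough illustration, here is a minimal sketch of how such a raster-style scan plan could be generated; the aisle layout, the half-metre offset, and the waypoint format are illustrative assumptions, not Simbe’s actual planner:

```python
def scan_route(aisles, offset=0.5):
    """aisles: list of (x, y0, y1) aisle center-line segments, in metres.
    Returns waypoints (x, y, face): each aisle face is traversed end to
    end, alternating direction so the route snakes like a raster scan."""
    route, forward = [], True
    for (x, y0, y1) in aisles:
        for face, dx in (("left", -offset), ("right", offset)):
            start, end = (y0, y1) if forward else (y1, y0)
            route.append((x + dx, start, face))  # enter this aisle face
            route.append((x + dx, end, face))    # scan to the far end
            forward = not forward                # snake back the other way
    return route

# Two aisles at x = 0 and x = 4, each 20 m long.
plan = scan_route([(0, 0.0, 20.0), (4, 0.0, 20.0)])
```

Each pair of waypoints covers one face of one aisle, matching the "each side of the aisle at a time" description; a real planner would of course also handle obstacle avoidance and docking.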

Kate Zhou: I see. So how long does it take to build the initial map of the store? And you mentioned two to five passes a day; what’s the duration of those passes?

Brad Bogolea: Yeah. So building a map in these stores is actually fairly quick today, especially with modern sensing technology. The average grocery store environment that we’re in is traditionally between about 50,000 and 100,000 square feet, so building maps in these environments is a pretty quick exercise.

It normally takes less than an hour. We actually don’t even have to be there; it’s something that can be done remotely and almost entirely autonomously. And once we have that, we’ve built a number of machine learning tools that help us identify where all the products are.

As Tally [00:08:00] goes through the store, it isn’t racing through the store. We want to create a very thoughtful experience for the shoppers there, so Tally moves across the floor at slower than walking speed; it’s sort of a shy robot that will stay out of people’s way.

But I would say the average grocery store scan is maybe two to three hours for a full pass of the store.
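That two-to-three-hour figure is easy to sanity-check with a back-of-envelope calculation. All numbers below are illustrative assumptions (shelf frontage, scan speed, overhead), not Simbe’s published specifications:

```python
# Rough scan-time estimate for a full store pass.
# Assumptions (illustrative): ~2.5 km of total shelf face in a mid-size
# grocery store, scanned at 0.4 m/s (slower than walking speed, as noted
# in the interview), with 20% overhead for turns and yielding to shoppers.

shelf_face_m = 2500      # total shelf frontage to scan, in metres
scan_speed_mps = 0.4     # scanning speed, metres per second
overhead = 1.2           # multiplier for turns, pauses, re-scans

scan_hours = shelf_face_m / scan_speed_mps * overhead / 3600
# comes out to roughly 2.1 hours, consistent with "two to three hours"
```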

Kate Zhou: I see. Yeah, it’s interesting that you mentioned the customer’s perspective, in addition to helping the retailers. Have you received feedback on the customer experience, and were any of your design decisions made to improve the customer experience?

Brad Bogolea: Yes, absolutely. You know, we consider ourselves in a very unique position, and fortunate to actually be in an industry where we can bring robotics to a place most people visit every week: your [00:09:00] neighborhood grocery store, to get milk, bread, juice, or eggs, whatever you’re looking for. And as you can imagine, many folks across the country have not really seen robots outside of maybe the movies or the vacuum cleaner that they have at home.

They’re not quite as familiar with robotics as us, and we’ve put a lot of thought into Tally’s design for that reason. Nobody wants a big, scary robot coming into their business or their local store. So we really focused on Tally having a very slim, compact, thoughtful physical design, with lots of neutral colors.

It doesn’t take up much space in the aisles, so there’s ample room to get shopping carts by. It’s a differential drive base, so it can turn in place; you’re not dealing with car or Ackermann-style steering mechanisms in a highly trafficked environment. [00:10:00] And we’ve added things like small lights and sounds to help there.

As we all know, electrically driven vehicles can be quiet, so those let people know that Tally is in the environment. So we think we’ve built quite a thoughtful product there, and the feedback from shoppers and customers has really been quite positive. A number of folks don’t even notice it; others are kind of like, “Hey, that’s pretty cool.”

And kids, of course: it’s amazing that even at such a young age they can recognize that this thing is a robot, and they’re kind of excited to see it. But Tally isn’t really designed to interact with those folks; it’s really focused on doing its task at hand, and it does its best to stay out of the way.
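The differential-drive base mentioned above is what lets Tally rotate in place: driving the two wheels at equal and opposite speeds produces pure rotation with no forward motion, unlike Ackermann (car-style) steering. A minimal sketch of the standard kinematics; the wheel radius and track width are made-up numbers, not Tally’s dimensions:

```python
def body_twist(w_left, w_right, wheel_radius=0.1, track_width=0.4):
    """Standard differential-drive forward kinematics.
    w_left, w_right: wheel angular speeds in rad/s.
    Returns (v, omega): linear speed (m/s) and yaw rate (rad/s)."""
    v = wheel_radius * (w_right + w_left) / 2.0              # mean rim speed
    omega = wheel_radius * (w_right - w_left) / track_width  # speed difference over track
    return v, omega

# Equal and opposite wheel speeds: the robot turns in place (v == 0).
v_spin, w_spin = body_twist(-2.0, 2.0)

# Equal wheel speeds: the robot drives straight (omega == 0).
v_straight, w_straight = body_twist(2.0, 2.0)
```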

Kate Zhou: I see. [00:11:00] So you mentioned that a mobile-robotics-based solution is quite perfect for a scenario such as a grocery store; it doesn’t have the same challenges as autonomous driving in the real world. Were there any features or parts of the robot that were adapted from other applications? And, on the other hand, which parts were the most challenging to develop for your particular application and couldn’t be borrowed from existing robotics technology?

Brad Bogolea: So as a company today, we’re really full stack, meaning we have designed and developed our entire robot. Now, of course, there are certain components, particularly things like sensing components and compute platforms, that we will use off the shelf.

But as it relates to things like drive mechanisms and other pieces: we don’t need a robot that goes very fast, and safety is hypercritical. So a lot of thought went into [00:12:00] speed, stopping distance, how we very gradually accelerate, and how we make thoughtful turns.

So, a lot of healthy mechanical study of the environment. Fortunately, these stores have to be Americans with Disabilities Act compliant, and there’s such high traffic that you don’t deal with things like trip hazards. Traditionally they have tile or cement floors; you rarely see carpeting or heavy transitions.

So it becomes a very nice environment in which to build a robot. And given a lot of the great research that has historically gone on in autonomous mobile robotics, we were able to get the core solution, building a map and being able to traverse the environment, up very quickly. Where we spent a very significant portion of time as a company is in building a lot of the tools around that.

So, how do you [00:13:00] think about managing all of these assets? Think of them as servers or data capture devices on wheels, deployed across the globe; we have robots in about five different countries. How do you think about things like fleet management and application performance management at scale? And how do we deploy these things in a way where we know they’re going to last their expected three-to-five-year life cycle out in the business?

The other area where we’ve invested a lot is computer vision; we use a lot of bleeding-edge computer vision. In the early days, when we founded the company, techniques like deep learning were just starting to come into existence.

So as a company, we apply both very classical computer vision techniques as well as modern ones to extract a lot of value from these retail store shelves. As [00:14:00] a company, I think one of the challenges that most robotics companies have to deal with is that we often have to be pretty full stack.

You don’t see the same level of platformization in robotics that you have in other industries. You might use tools like the Robot Operating System, or common compute platforms like Intel or Nvidia, but so much of the upper application layer is something you have to build yourself, versus really being able to go out and license other software or tools.

Jana: Cool.

Kate Zhou: What are the sensors on Tally, and how does their information work together for decision-making?

Brad Bogolea: Absolutely. So, much like many classic mobile robot platforms, we combine inertial data from a standard IMU, [00:15:00] wheel odometry, and a low-cost LiDAR solution within the robot, coupling that with 3D cameras to allow Tally to have a high-quality, 360-degree 3D view of the environment as it goes down store aisles.

We’re going down store Isles. You know, there might be some advertising displays hanging down from the ceiling or a mop or broom sticking out on the aisle or. You know, someone left a bag of chips on the floor, you know, we need to be able to dynamically see all of these types of things in and really adjust for the environment.

So those are a lot of the sensors on the navigation and autonomy side. When we think about extracting value from store shelves, that’s really a combination of 2D and 3D sensing: capturing images of the store shelves, equivalent to what you would have in your smartphone, in both 2D and [00:16:00] 3D. We also have a flavor of our robot that leverages RFID.

So we don’t only operate in stores like grocery stores, drug stores, or large general merchandise stores; we’ve also operated in sporting goods environments and high-end clothing stores that may be leveraging technology like RFID for item inventory or asset tracking as well.
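A common textbook way to combine the wheel odometry and IMU data Brad lists is a complementary filter on heading: integrate the gyro for short-term responsiveness, and pull toward the odometry heading to bound long-term drift. The blend factor and rates below are illustrative assumptions; Simbe’s actual fusion pipeline is not public:

```python
def fuse_heading(theta_prev, gyro_rate, odom_theta, dt, alpha=0.98):
    """Complementary filter for yaw.
    theta_prev: previous fused heading (rad)
    gyro_rate:  IMU yaw rate (rad/s), trusted over short horizons
    odom_theta: heading implied by wheel odometry (rad)
    dt:         time step (s); alpha weights the gyro branch."""
    gyro_theta = theta_prev + gyro_rate * dt        # short term: integrate gyro
    return alpha * gyro_theta + (1.0 - alpha) * odom_theta  # long term: lean on odometry

# One 10 ms step: previous heading 0, gyro reads 0.5 rad/s,
# wheel odometry currently estimates 0.01 rad.
theta = fuse_heading(0.0, 0.5, 0.01, 0.01)
```

Real systems typically use an extended Kalman filter and fold in the LiDAR and camera observations as well, but the blend-fast-and-slow idea is the same.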

Kate Zhou: Interesting. How much of the decision-making is tailored to each individual customer, in terms of the different sectors of retail? And after Tally gathers all this information, is there one recommendation in terms of stocking or re-pricing, et cetera?

Brad Bogolea: Yeah. So as Tally goes down the aisle, let’s say in your average neighborhood grocery store, fortunately in most of these [00:17:00] environments things are fairly similar, right?

Products are traditionally on these metal gondola rack shelves, and every store has products in bags, cans, and boxes. So we’re able to really generalize this type of environment and get it to a point where, after working with a handful of retailers, we can just put the solution on the ground today and show immense value out of the box.

And when Tally is going down the aisle, we are building a very comprehensive detection model of everything we see. Now, we don’t want to hit our retail clients with a tsunami of data, right? We really help them digest what are the most important things they need to focus on.

So: here are the 2,000 products that are out of stock on shelf in the store today. Of those [00:18:00] 2,000, here are the 500 you told me you have inventory for, so go focus on those. Here are the 200 price tags that we have found to be incorrect, and all of these products are on promotion this week, so go print new tags and put them up in this area.

Retailers now have the ability, really for the first time, to actually understand their product availability rates and precisely where products are, from an intraday view. So this is really transformational for their business in a number of ways.
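The triage Brad describes, filtering thousands of detections down to a short actionable list, amounts to a couple of set operations over the scan results. A minimal sketch; the field names and data shapes are hypothetical, not Simbe’s actual schema or API:

```python
def triage(scan, backroom_inventory, planned_prices):
    """scan: list of dicts like {"sku": str, "on_shelf": bool, "price": float}
    backroom_inventory: set of SKUs the retailer says it has in stock.
    planned_prices: dict mapping SKU -> the price that should be on the tag.
    Returns (restock_now, bad_tags) in the spirit of the workflow above."""
    out_of_stock = [item["sku"] for item in scan if not item["on_shelf"]]
    # Only surface out-of-stocks the store team can actually fix right now.
    restock_now = [sku for sku in out_of_stock if sku in backroom_inventory]
    # On-shelf items whose observed tag price disagrees with the plan.
    bad_tags = [item["sku"] for item in scan
                if item["on_shelf"] and planned_prices.get(item["sku"]) != item["price"]]
    return restock_now, bad_tags

scan = [{"sku": "A", "on_shelf": False, "price": 1.99},
        {"sku": "B", "on_shelf": False, "price": 2.49},
        {"sku": "C", "on_shelf": True,  "price": 3.99}]
restock, bad = triage(scan, {"A"}, {"A": 1.99, "B": 2.49, "C": 2.99})
```

Here "B" is out of stock but has no back-room inventory, so it is filtered out of the restock list, mirroring the "here are the 500 you have inventory for" step.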

Kate Zhou: Cool. I guess in recent times, do you think the goals of your products have changed due to potential changes in consumer habits during the pandemic, with more online orders or deliveries? How does this change the retail [00:19:00] industry in general, and how can your robotics solution continue to evolve to support these goals?

Brad Bogolea: Yes, absolutely. So, given that our core business is really in the fast-moving consumer goods segment, which is grocery stores and drug stores, these are fairly recession-proof businesses. And in the COVID era that we live in now, where I think many of these habits will stick, you actually have more people working from home and eating at home, so these stores have actually been seeing all-time-high sales since the pandemic. And although online grocery solutions exist, like the Instacarts and the DoorDashes of the world, all of those orders are picked in physical grocery stores:

your local Safeway or Kroger, these types of environments. And the data we collect is actually super [00:20:00] valuable to those types of entities too. The online shopping experience for these types of workers today is like being an Uber driver without Google Maps, because they don’t know, when they get to the store, where the product is going to be.

And whether it’s actually going to be on the shelf. So we’ve actually seen substantial growth in our business since the pandemic, more than seven times. And we believe strongly, as it relates to grocery, that these are not environments that are going anywhere; rather, they are reinventing themselves with technology, because not a lot has changed since the barcode or the cash register.

Kate Zhou: Very interesting. A final question: I’m interested in hearing whether there were any concerns in terms of privacy, given the amount of data you’re collecting and the potential customers in [00:21:00] your retail clients’ stores.

Brad Bogolea: Yeah. So Tally is really just focused on capturing the shelf, and the navigational data that we collect is really used for making real-time navigational decisions.

Now, there are certain retailers that require us, on their behalf, to store some of that information for liability purposes. But our focus as a business is to analyze the state of products in the store and to safely and reliably avoid customers and stay out of their way. We don’t think about customers in any other way beyond that.

Kate Zhou: I see. Yeah, that totally makes sense. What are some next steps that your company is working on?

Brad Bogolea: So we’ve been fortunate in the sense that we’ve announced a number of new partnerships this year, as well as some of the industry’s first chain-wide adoptions of this sort of technology. [00:22:00] So much of the future for us, we believe, is very ripe and focused on growth.

So, how do we land and expand with more customers in the U.S., and begin to dip our toes into the international market? Because everyone around the globe buys groceries. In addition, we’re beginning to think about downstream solutions. Now that we’re getting to chain-wide penetration with a number of customers with this type of technology, there are a lot of ecosystem stakeholders within the retail environment that would like access to this data.

If you’re a brand, let’s say Unilever or Procter & Gamble, you care about how your product is being positioned on the shelf, whether it’s being restocked, and whether the promotion you’re paying for is actually being put up. Although the retailer pays for our solution today, [00:23:00] we’ve begun to build a model where, in partnership with a retailer, we can begin brokering this data downstream to all of these other stakeholders: that may be brands,

that may be market insight companies, and we talked about the online grocery use case of folks like Instacart and DoorDash. In addition, we think we’ve built a very strong intellectual property moat around our computer vision processing pipeline, and we think that, longer term, there could be a way to externalize that for additional use cases.

But we’re just staying focused on our core market today: the North American grocery market is a $1 trillion business that is very rich, and we’re very focused there for the time being. But we think the future is exciting for both robotics [00:24:00] and computer vision.

Kate Zhou: Yeah, for sure. Very exciting. You mentioned that the current state could be seen as a mix of robotics technology and human cognition, so there’s the retail workforce that is still required to be spread all over the store. Do you envision a future where retail experiences could be primarily operated by robots?

Brad Bogolea: There are aspects of retail that are able to do some of that today, but it’s mostly warehouses. For our markets, you can’t serve groceries out of a warehouse that’s 50 miles from the customer you need to deliver groceries to. A simple way to think about this: there’s a reason Amazon, with the depth of their robotics technology, actually bought Whole Foods and is opening its own grocery stores, [00:25:00] because handling these types of perishable products is actually really quite hard.

And although there are a lot of exciting micro-fulfillment solutions out there, they’re really only serving a very small portion of people’s baskets, of what they buy, and many of them are best suited for products that are on a regular subscription. So, as far as the stores that we operate in, I think it’s important not to underestimate the value of the human touch and human customer service.

We really just see our technology as being a power tool to help elevate store teams to do that, especially given the state of the labor workforce today, where there actually is a shortage of labor in these types of industries, and [00:26:00] many retailers need all of the help that they can get.

Kate Zhou: I see. My final question is whether you have any tips for listeners who are interested in robotics, or interested in optimizing the retail industry, on how to get into this space.

Brad Bogolea: Robotics-wise, in the modern day and age, you can just learn so much from open source, whether it’s YouTube or getting involved with things like the Robot Operating System and setting up a TurtleBot.

That was really my introduction to robotics, beginning with the TurtleBot. My co-founders and other colleagues had years and years of really deep applied robotics experience, but it’s such a great way to start. And depending on where someone is, either age- or career-wise, there are so many avenues out there today, through

high school [00:27:00] robotics clubs or college robotics groups, these types of things. I think there is a massive opportunity, especially for people going into academia and someday looking to go into the workforce; there will continue to be massive opportunities in robotics.

And if you’re someone that likes to play with a combination of hardware, software, and a myriad of other things, the multidisciplinary nature of robotics is really intriguing. On the retail front, what I would say is that this is such a massive industry; there is a lot of opportunity within the retail space. As I was mentioning earlier, retail and our food supply system are really critical to our basic needs.

We need food, water, and shelter, but the reality is that everything from how this food is grown [00:28:00] to how it is distributed to households, not a lot of that has changed in the last hundred years. And I think there’s a really unique opportunity for technology to make that more efficient, especially as we operate in this new world where we have to be mindful of things like global warming, carbon, and health and safety.

Kate Zhou: Cool. Thank you so much, Brad.

Brad Bogolea: Thank you, Kate. It’s been such a pleasure.

Jana: And that’s the end of today’s podcast. As always, simply go to robohub.org/podcast for loads more exciting episodes. And did you know that the Robohub podcast is actually run by an international team of volunteers? If you enjoy our interviews and would like to support our small team, please check out our Patreon campaign, where you can help us from as little as a dollar. [00:29:00]

The money we raise goes straight to producing more exciting new content for you, enabling our interviewers to meet and speak to more researchers, engineers, and robot enthusiasts, and to cover the latest and greatest from the big international conferences. To find out more, go to robohub.org/podcast and read up about how you can become a patron.

We’ll be back again in two weeks’ time. Until then: Simbe Robotics, with Robohub, the podcast for news and views on robotics. [00:30:00]



Kate Zhou



©2021 - ROBOTS Association