
Bringing robots to retail: Interview with Fellow Robots CEO and co-founder, Marco Mascorro

by Andra Keay and Silicon Valley Robotics
19 May 2016



Photo courtesy: Fellow Robots, Inc.

Robotics today is a fast-growing industry with applications in myriad markets, including retail, transportation, manufacturing, and personal assistance. Fellow Robots is at the forefront of reimagining retail – improving the shopping experience for customers and giving employees up-to-date information on products and on the location of goods and services.

Fellow Robots is a young, multidisciplinary team from different backgrounds, ranging from roboticists, software engineers, and data scientists to designers and business experts. We have built many platforms, including telepresence robots, humanoid robots, sensors, cloud computing, and other cutting-edge technology platforms. We came out of SU Labs at Singularity University at NASA Research Park in Silicon Valley. SU Labs connects corporate innovation teams with startups and other organizations to explore exponentially accelerating technologies and create new sustainable business solutions. What distinguishes Fellow Robots is our ability to partner and work shoulder to shoulder with customers to learn how robotics can improve their retail operations.

Interview with Marco Mascorro, CEO and Co-Founder, Fellow Robots
(edited for clarity)

Source: Silicon Valley Robotics

I’m Marco Mascorro, CEO and co-founder of Fellow Robots. Fellow Robots came out of Singularity University, which focuses on exponential technologies, robotics being one of them. We started with a team of very passionate robotics engineers looking into industries that have not changed a great deal. We quickly homed in on the retail industry; specifically, offline retail. For all the accomplishments of ecommerce and online retail in recent decades, about 90% of retail purchases still take place in store. This is clearly an industry overdue for a big step into technology, and we saw that robotics could be a great fit there.

We began by meeting with retailers and listening to them discuss the problems they are facing today, and have probably been facing for a long time. One of those early conversations was with Lowe’s. That conversation very quickly led us down the path toward a customer service robot.

I think that was a really interesting way to hear from our customers, to see what kind of problems they were facing and how robotics technology could fit. Today, we have a partnership with Lowe’s to launch OSHbot, a customer service robot that helps customers find things in the store. As the robot navigates the store, it knows where all the products are located. When a customer comes into the store they can just talk to the robot, in multiple languages. We can add up to twenty-five languages right now; we have English and Spanish working in the Orchard Supply Hardware store, owned by Lowe’s.

When a customer comes in and says, “Hey, I’m looking for nails and paint” then the robot can tell that customer where to find those items. It shows customers that it understands what is said. It displays on-screen the products that the store has in stock. It has a touch screen so customers can just navigate on the screen and see the pictures of the products and then click on the one that they actually want to see. OSHbot then tells the customer that the product is located in Aisle 15, for example, whether it’s in stock and some more information about the item. Customers can click on a button and follow OSHbot to the location of the item in the store.

The robot guides customers to the product location using its own fully autonomous navigation. While customers are following OSHbot, there is a screen on its back for engagement. It’s not very often that we see robots talking to humans on a daily basis, but I think we are starting to see that now, and Fellow Robots and our customers are pretty happy about that.

The robot takes you to the product and then it gives you some extra options. If someone is buying paint, then it’s pretty common that they will also buy a brush. OSHbot can provide that information, and take the customer to the brushes as well. It’s a really interesting experience that the customer has now. It’s a whole new experience. Customers are going to stores and getting really accurate information about products, about what is in stock and where to find it.

If customers don’t want to follow the robot, OSHbot can just show them the product location on the map, and they can go on their own. One of the most interesting things we’ve seen so far is how quickly new technologies are being adopted right now. Of course, when we launched the robot in November last year, there was a “wow” factor. But today if you go into the store it’s become so common that there’s no more “wow.” We’re seeing exactly the same path with OSHbot as we saw with the smartphone industry.

So far you’ve described really successful interactions with OSHbot. Have there been difficulties?

There were a lot of unknowns when we first launched OSHbot, because hardly anyone has done this before. This is a customer service robot that’s actually working in the front of the store, helping customers and guiding them to specific locations. It’s very complicated to know how a robot is going to interact with people in stores, and how people and workers will respond to that robot.

Our primary realization was that customers interact with the robot almost exactly the same way that they interact with humans. The questions asked of OSHbot were questions they would ask a person. In the first month of deployment, we had someone right next to the robot collecting feedback, seeing how people interacted with the robot and where we could improve. That person was there taking notes on how customers were acting around and interacting with OSHbot.

Source: Silicon Valley Robotics

Source: Fellow Robots

What’s complicated is that people ask questions in very different ways. If someone says, “Hi OSHbot, I’m looking for a quarter-inch screw for my door,” the robot needs to be intelligent enough to know which product they are looking for: is it the door? Is it the screw? So speech recognition was one of the things we needed to improve on, and we have. To make the interaction with the customer as smooth as possible, we focus a great deal on natural language processing.
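To make that disambiguation challenge concrete, here is a minimal illustrative sketch (not Fellow Robots’ actual pipeline) of how a free-form query might be scored against a product catalog by simple keyword overlap. The catalog entries, keywords, and scoring rule are assumptions made up for this example.

```python
# Illustrative sketch only: matching a spoken query to catalog products by
# keyword overlap. Catalog contents and scoring are hypothetical.

CATALOG = {
    "1/4 in. wood screw": {"aisle": 12, "keywords": {"quarter", "inch", "screw", "wood"}},
    "interior door":      {"aisle": 30, "keywords": {"door", "interior"}},
    "door hinge":         {"aisle": 31, "keywords": {"door", "hinge"}},
}

def rank_products(query, catalog=CATALOG):
    """Score each catalog product by how many of its keywords appear in the query."""
    tokens = set(query.lower().replace("-", " ").replace(",", " ").split())
    scored = []
    for name, info in catalog.items():
        overlap = len(tokens & info["keywords"])
        if overlap:
            scored.append((overlap, name, info["aisle"]))
    # Highest overlap first; a real system needs far richer disambiguation.
    return sorted(scored, reverse=True)

print(rank_products("Hi OSHbot, I'm looking for a quarter inch screw for my door"))
# -> [(3, '1/4 in. wood screw', 12), (1, 'interior door', 30), (1, 'door hinge', 31)]
```

Even this toy version shows why “screw” should outrank “door” for that query, and why ties between candidates are where the harder natural language work begins.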

Another priority is the customer interface. When you open an app, for example, you need to understand what the app does and how to naturally proceed to the next step. We have a very similar challenge with the robot UI. That’s why we have people on the Fellow Robots team whose sole focus is the screen interface and these interactions. How many clicks? Are they clear? Are they sufficient? Too many?

We have come to realize that what’s best is a combination of speech and screen interface. When the screen shows the products, the robot tells you at the same time, “Here are the products available, click on the one you’re looking for.” People know what to do, and when they click on a product, they can choose to be guided there. So you provide a full interaction with the customer, with some speech and some visual interaction. Those were the big lessons learned from the first month of working with OSHbot in the store.

Will you be adding a product scanner to the OSHbot experience?

We have some of the things working along these lines but haven’t deployed them in the store just yet. One of the reasons is that there are a lot of factors not under our control. For example, we need to have a full 3D database of the products to be able to match those scans. That’s one of the reasons we decided to wait a little bit to deploy that functionality.

Probably about 20 or 30% of people who come to hardware stores have an object with them that they’d like to buy or have a question about. When a customer has a very specific question that the robot can’t answer, like “My pipe broke. What should I use to fix it?” or “What glue should I use?”, we have a store representative available at any time for that customer.

We added a small button on the screen that says, “Talk to an expert.” If a customer has a complicated question then the robot will tell you to click on that button and it connects you remotely to a store associate. That store associate can be located somewhere else, just as in a normal telepresence video conference call. That person can also be an expert in plumbing, in electricity, or any other field. This way the customer gets the ideal mix of automation and high touch service that retailers typically struggle with. That’s actually what’s happening right now.

So customers can have a mixed physical and virtual shopping experience?

Exactly. They could be talking to someone who’s an expert in paint, or electricity, or design, depending on what they need. This enables a really interesting way of interacting with store associates that’s very valuable.

What is the staff response to OSHbot?

It’s been really positive so far. I think one of the most interesting things we’ve seen is how fast OSHbot’s colleagues got used to working alongside a robot. I think we’re lucky that everyone had a very positive response. People like it, people use it.

How many OSHbots are in stores so far?

We currently have two robots located at the OSH in San Jose, CA.

What is involved logistically in rolling out the robots? Are you getting close to rolling out more OSHbots?

We need to integrate the robots with the product database, but one of the nice things is that the robots have everything built in, so we don’t need to add any major additional infrastructure or sensors in the store to make the robot operate and navigate.

We are in conversations with customers about rolling out robots elsewhere.

How does OSHbot map the store and identify product locations?

Without going into too much detail, we can say that mapping is one of the robot’s core functionalities. The robot knows precisely where it is located at all times. We integrate that with the planogram, so this way OSHbot knows exactly where these objects are located on the shelves.
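As a rough illustration of that idea, the sketch below shows how a planogram lookup could turn a product SKU into a shelf position in the same map frame the robot localizes in, which is what allows it to guide a customer to the item. The data structures, SKUs, and coordinates are hypothetical, not details Fellow Robots has disclosed.

```python
# Illustrative sketch: combining a store planogram with the robot's map frame
# so a product lookup yields a physical goal location. All data is made up.

from dataclasses import dataclass
from math import hypot

@dataclass
class ShelfSlot:
    aisle: int
    bay: int
    x: float  # position of the shelf slot in map coordinates (meters)
    y: float

# Planogram: product SKU -> where it sits on the shelves (assumed data)
PLANOGRAM = {
    "SKU-PAINT-0123": ShelfSlot(aisle=15, bay=3, x=24.5, y=8.2),
    "SKU-BRUSH-0456": ShelfSlot(aisle=15, bay=5, x=26.0, y=8.2),
}

def locate(sku, robot_xy):
    """Return the shelf slot for a SKU and its straight-line distance from the robot."""
    slot = PLANOGRAM[sku]
    distance = hypot(slot.x - robot_xy[0], slot.y - robot_xy[1])
    return slot, distance

slot, dist = locate("SKU-PAINT-0123", robot_xy=(10.0, 5.0))
print(f"Aisle {slot.aisle}, bay {slot.bay}, about {dist:.1f} m from the robot")
```

The point of the sketch is simply that once product locations and the robot’s own position live in one coordinate frame, “take me to the paint” becomes an ordinary navigation goal.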

What other retail experiences would OSHbot apply to?

At the moment, our primary focus is in-store retail, but what’s key here is the level of premium customer service that the robot can provide. A robot can maintain accurate information on 100,000 or more different products.

I think this capability can apply to different industries that also require customer service. When we talk to retailers, customer service is one of the biggest challenges, and one of the biggest opportunities. Therefore customer service functionality in the robot is vital, as is the speech recognition and the human interaction—to make it as natural as possible to provide a really nice experience.

So in the future you see robots greeting people when they walk into bricks and mortar shops?

That’s the plan. The idea is to rethink how retail is done in terms of customer service and the experience you have when you go and purchase items in a store. You normally come in with the idea that you’re going to buy one product, but you often end up buying more. We want to help businesses make the process as smooth as possible, so that their customers find everything they need, everything they’re looking for, and have a great experience. That scenario is a win/win for both the retailer and the customer.

That’s a clear value proposition for the front of store. Does it extend into the back of store – doing inventory etc.?

There’s lots of potential there. For example, when the robot navigates the store it creates a map in order to know where products are located. But if a store associate moves products from one location to another, it’s challenging to inform everyone in the store. The robot can then provide that information to the store associates.

So OSHbot will be providing customer service to the associates too?

Yes, right now we’re talking about how we can make this experience even richer for employees.

We feel that we’re at the very beginning of a trend where robots are gaining traction with industries that haven’t traditionally looked to robotics.

Do you have anything to add? What makes a robotics company like Fellow Robots possible now that wasn’t possible five years ago?

This is a really interesting time in robotics because so many forces are coming together. The price of sensors is coming down, because the need is growing. The price of computing is also dropping. The price of software development is much less expensive than in the past.

We’re also getting more powerful technologies at the same time that can be combined with robotics, such as natural language processing. That’s a technology that has been around for years and years but is getting better and better. Many other industries are benefiting from this, not just the mobile and smartphone industry, and robotics is clearly benefiting.

So all these advances are merging now into robotics and the timing is perfect. All these robotic platforms are coming out of the labs and really going to market.

We are also learning a great deal about the customers as they arrive at OSH. For example, OSH now knows the most common questions that customers ask, so they can factor that information into their store planning. OSHbot already offers service in English and Spanish, but as we grow to include more languages, stores will be able to better understand the languages spoken by their customers. All of this information has previously been unavailable.

Do you have any advice for a robotics startup, coming out of the lab and trying to turn into a business?

Everything works fine in the lab, but when you put it out in the real world, interacting with people in a real space in a busy environment, then it’s a very different story. You need to be prepared for the unknown and be in a position to quickly adjust.

In most cases we need to go back to the lab to figure out what is happening, why the robot is not working as well as it did in the confines of our lab.

One of the amazing things our customers allowed us to do was come and test in their facilities and see how the robot behaved. We learned a lot from that. It’s a different environment, it’s a totally different interaction and that really helped a lot for us—testing in the real world.

Read all the latest emerging robotics trends in the Silicon Valley Robotics online magazine, or download the rest of the Silicon Valley Robotics free report on service robotics here.





Andra Keay is the Managing Director of Silicon Valley Robotics, founder of Women in Robotics and is a mentor, investor and advisor to startups, accelerators and think tanks, with a strong interest in commercializing socially positive robotics and AI.

Silicon Valley Robotics is an industry association supporting innovation and commercialization of robotics technologies.




