
Robots Podcast #173: RoboThespian, with Will Jackson

January 10, 2015


In today’s podcast, Ron Vanderkley speaks with Will Jackson from Engineered Arts Limited about his team’s work making robot actors.

Engineered Arts was founded in October 2004 by Will Jackson to produce mixed-media installations for UK science centres and museums, many of which involved simple mechanical figures animated by standard industrial controllers.

In early 2005, the company began work on the Mechanical Theatre for the Eden Project. This involved three figures, with storylines focused on genetic modification. Rather than designing another set of figures for this new commission, Engineered Arts decided to develop a generic programmable figure that could be used both for the Mechanical Theatre and for the succession of similar commissions that would hopefully follow. The result was RoboThespian Mark 1 (RT1).

From then on, Engineered Arts changed direction, and it now concentrates entirely on the development and sale of an ever-expanding range of humanoid and semi-humanoid robots featuring natural, human-like movement and advanced social behaviours.

RoboThespian, now in its third version, is a life-sized humanoid robot designed for human interaction in a public environment. It is fully interactive, multilingual, and user-friendly. Clients range from NASA's Kennedy Space Center through to Questacon, the National Science and Technology Centre in Australia. You can watch it in action in the video below.

Will Jackson
Will Jackson has a BA in 3D Design from the University of Brighton, UK, and is the founder of Engineered Arts Ltd.


Episode #173 Transcript – RoboThespian

Ron Vanderkley: Good morning Will – welcome to the podcast. Can you introduce yourself to the listeners and describe what Engineered Arts Ltd. does?

Will Jackson: My name is Will Jackson. I'm the Director of Engineered Arts Ltd. We're a small company based in the far south-west of the UK, and we make robots for human interaction. Our best-known product is RoboThespian; it's been around for about eight years now, it's currently installed in 17 countries I think, and we've made around 70 or 80 units. We have a smaller rear-projected robot designed for facial expressions, mimicking human expressions and social interaction scenarios. The company now has around 18 people and is growing quite fast. We also have a number of advanced projects in our research lab that are under development.

Ron Vanderkley: What type of technologies do you use in your product?

Will Jackson: My background is in the film and TV industry, so you could say I come from an animatronics background. Our robots have always been built for specific commercial tasks, so even though we now supply a number of research robots (we have robots installed in more than ten universities) this was not the design rationale. These things are built as working machines.

Animatronics tends to refer to things that are 'dumb' – they have no kind of programmability or interactive capability – and that's not what our robots are about now. They're now very interactive, very programmable … we have visual feedback, depth sensors, facial recognition, and sensitive force sensors built into our robots. From an actuation point of view, our robots are quite unusual in that they are hybrid; even our simplest RT3 robot is a hybrid pneumatic-electric robot.

The pneumatics themselves subdivide into two categories. We use air muscles (aka fluidic muscles, or McKibben muscles). These are a novel kind of pneumatic actuator that exhibits really nice properties for humanoid robots: very high strength-to-weight ratio, no 'stiction', good force control; they are a bit tricky on position control, but they have a lot of nice human-like characteristics. We also use pneumatic cylinders for smaller actuations like fingers, and then we use DC servo drives for the more rotary actuators; for example, the yaw axis, or the head pitch and roll axes, tend to be DC servo drives.

We develop everything in-house, from the hardware design right down to the electronics and firmware. We even do our own robot operating system.

We don’t use ROS (or any of the better known ones) mainly because when we started ROS was immature, and it’s also not well suited to robots that are designed for human interaction. ROS was built for robots that were autonomous mobile platforms – basically PR2, which is not what our robot is about. Our robot is about speech, it’s about facial recognition, it’s about social environments, and it was not an appropriate operating system at the time.

Under development we have Byrun, which is a fully dynamic biped. It uses a hybrid pneumatic-electric design, some custom-designed air muscles that we've built in partnership with Festo in Germany, and some very high-power bespoke brushless DC motors. It uses series-elastic actuators, sensing the deflection of a spring to measure force. One of the key features is that it's bi-articulate, so single actuators affect multiple joints. It's based on a human body model: if you look at the human arrangement of muscles, muscles often span multiple joints and will affect more than one joint when they contract. This is very significant if you're focusing on bipedal gaits and trying to maintain balance, because you can actually solve a lot of your balance problems at the mechanical level rather than the control level.
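The series-elastic measurement described here, inferring joint force from how far a spring between motor and joint has deflected, is just Hooke's law. A minimal sketch, with illustrative stiffness and angle values rather than Byrun's actual parameters:

```python
# Series-elastic actuation: a spring sits between the motor and the joint,
# and joint torque is inferred from how much the spring deflects.

def series_elastic_force(motor_angle_rad: float,
                         joint_angle_rad: float,
                         stiffness_nm_per_rad: float) -> float:
    """Estimate joint torque from spring deflection (Hooke's law)."""
    deflection = motor_angle_rad - joint_angle_rad
    return stiffness_nm_per_rad * deflection

# The motor has wound 0.05 rad further than the joint has moved, through
# a 200 Nm/rad spring: roughly 10 Nm of torque at the joint.
torque = series_elastic_force(0.55, 0.50, 200.0)
```

Because force is read from position measurements on either side of the spring, the joint can be force controlled without a dedicated load cell.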

We’re not the only people working on those ideas; you’ll see groups at MIT, Oregon State University; Darmstadt in Germany. There are a lot of people working on these kinds of ideas now.

Byrun’s been under development for two and a half years now, and in another six months we have our first demonstrator ready. We’re now running simulations and running tests on real hardware for gait control, gait strategy, hopping, and jumping.

Many people will be familiar with Boston Dynamics and their hydraulically driven robots. For our kind of entertainment applications, hydraulics is not really feasible or appropriate. Hydraulics are big, powerful and fast – which is great if you're trying to build dynamic robots – but they're also dangerous, noisy and hot, and those are three big negatives. We've focused on efficient designs and try to use the natural dynamics of the system; [for example] we use the pneumatic components as energy storage and try to conserve energy that way. We also have parallel springs as well as series springs in our design, so we balance gravity loads with springs. This kind of thing is obvious, and we do it even with our current RT3, our very economical commercial robot.

You’ll find a lot of balance springs in that robot. The reason being is that the robot is always the same way up and gravity is always affecting in the same way: there’s no point in putting a large motor there to counter a static load. You’ll see you’ll see this kind of balance technology used on cranes and larger Kuka- type robots, but it’s not something that many people apply to humanoids – though we do.

SociBot is a smaller desktop robot that we have under development; it uses rear-projected facial expression technology. We've put a lot of time into developing human face models as 3D meshes that mimic human facial muscle groups and can be controlled dynamically on the fly. You can pass parameters to SociBot's face projection software in real time and animate the face with appropriate and subtle expressions, such as lip sync, eye blinks, etc.

And underlying all of these robots, we now have a new software framework called Tritium, which is GPL. It's not quite in the state where I would encourage anybody to adopt it yet, but it's becoming more robust all the time. It's based on ZeroMQ and Google Protocol Buffers, and uses Python Tornado web servers, so it's got a lot of fast, modern communication technologies built in. Faster, we think, than the XML-RPC that underlies frameworks like ROS.

Ron Vanderkley: I see from what you have told me, that you have a good understanding of robotics. What is your background?

Will Jackson: I built my first PC from scratch (it was an Acorn Atom – not really a PC, but my first home computer, if you will) when I was about 13 years old. I soldered that together, and I've been making robots since I was about 10 years old. I'm now 48, so I have a fair bit of experience. At college I studied design, but I never stopped making mechanical things. So I am not a formally trained engineer; however, I've spent my entire life doing it. We now have a large team of 18 people, many of whom are formally trained and do the calculations to a better degree than I do. However, I still find that an intuitive understanding of mechanics gets you a long way very fast. You can invest a lot of time in calculations that go nowhere, so having an intuitive understanding of mechanical systems is, I think, very important.

I started coding in assembler when I was 12 years old, so I've been writing code most of my life. I've also done digital circuit design, so I learned Boolean algebra and digital circuits, and a lot of my early robots were hard-wired logic gates built before MCUs were readily available or easy to use – so there's a lot of history there. But I've done all of this in the context of an arts background, so it's a little bit odd.

Ron Vanderkley: It’s a question of where would a person get the experience to apply robotics in art? What are the robotics careers in your industry?

Will Jackson: One of the biggest problems we have in robotics is that it’s very multidisciplinary and you really have to have tight integration between many disciplines. We’ve got electronic design, firmware design, we’ve got control theory, and we’ve got mechanical design, knowledge of actuators … all these kinds of things need to tie together.

There’s also an aesthetic component: if your robot looks like a bag of junk, people will treat it like a bag of junk even if it’s the cleverest robot ever. It’s not great to make one that looks like a bag of scrap metal, so there’s a design element [that’s important]. If you can marry these things together very closely you can get some really good results. Our team includes creative people, it includes engineers, CNC operators, electronic designers, and we all work together very closely.

We have a saying in our workshop: "We design in the morning, we make in the afternoon, and we fail in the evening." We have a very tight cycle from CAD, through simulation and testing, through manufacture, through on-the-bench tests, and then we throw it away and start again the next day. Nothing beats having something in your hand and seeing how it works. You can run it through MATLAB, Maple, Gazebo, whatever you want to use, as much as you like, but nothing beats having a real robot. That's where the real discoveries come from.

Ron Vanderkley: Based on the products we’ve been talking about, what is your customer base?

Will Jackson: We’re probably a little bit different as a robot company in that more than 90% of our money is from commercial sources. We designed original robots for science communication, to be used in public spaces where you had a group of people and you had a message to deliver; we were looking for ways of interacting with people that were a fund and engaging.
A lot of our early customers were science centers; the Carnegie Science Center in Pittsburg USA had one of our first robots. Continuum in the Netherlands had one too. [We made about six robots in our first series.]

As we’ve continued to produce robots we found other application areas. One of our major [application areas] now is for commercial users, for example Telstra (in Australia) recently took two of our robots for their experience center to communicate some ideas about things that they’re working on. It was a very engaging way to get those ideas across. That would be a typical commercial user.

Almost by accident we strayed into making research robots. I think the first person to use one of our robots for R&D was from Chapel Hill in North Carolina. They were studying avatars, remote presences, and used one of our robots for that. Since then we have similar projects running in Barcelona, UCL in London, Oxford Brookes, Bristol Robotics Lab, Nanyang in Singapore, and Central Florida. All of these educational institutions now use our robots as an R&D platform, broadly because they’re open and they work when you take them out of the box.

We’re used to tough commercial situations where failure is not an option, and it’s pretty attractive if you’re working in a Lab to have a robot that just works. So many R&D robots don’t, ever; I think that’s partly why they’ve been adopted for that.

Also there are HRI users … we just had one of our SociBots go to Plymouth University, where they're studying human-robot interaction, and again, Bristol Robotics Lab is studying that kind of thing with projected-face technology too. So that's another market segment for our robots, but kind of an accidental one. Our main focus was the commercial applications. We're looking at "what can you do with a humanoid robot now?" Not in five years, not in ten years, but now.

We believe that in order to drive the technology forward you have to find real world applications. People will tell you “my robot is going to be a robot butler” or “it’s going to look after your elderly mother” or “it’s going to get the shopping” or it’s going to do this, that and the other … well, [ask them to] show it, and the usual answer you get is “in about five years.”

That doesn’t cut it with us. We want to see things we can do now, and the entertainment-information-interaction style applications are things we can do now safely. That’s another reason for using compliant robots, to make them inherently safe. If you’re going to be around people, you cannot use big, tough position-controlled actuators like those you might find in an industrial robot, it’s just going to go right through someone. It has to stop when it comes up against you, and it has to use minimal force.

That’s an overview of the application areas that we’re working in.

The final thing I should mention is that there are spin-out technologies that have come from robot development using parallel actuation and hybrid pneumatic-electric drives. Some of these have possible industrial applications, so we're also working with industrial partners, looking at how we might use compliant drives and hybrid drives in industrial situations.

Ron Vanderkley: What is the future for role-playing or character robotics?

Will Jackson: The main attraction of a performing robot is the fact that it’s a robot. [If you are an actor, don’t be] worried, you’re not going to be replaced.

It’s not about replacing the jobs that people are doing, it’s about providing something that people haven’t seen before. Our robots are very particularly robots, they’re not people. They have human characteristics, we identify with them, they have the same kind of movement curves, they have the same proportions, we try to emulate human expressions, but at the same time they’re very much robots. We have already done a number of stage installations … we have three robots working on stage together that have been running for a couple of years now in Warsaw, Poland, and these are robots performing on their own. It’s fixed program – they don’t interact much with the audience – so from a control point of view it’s a one-way street.

We’re now moving on to things that are much more interactive: robots that are able to focus on particular members of the audience, to know when people are not reacting well to it, to be able to direct any speech they’re making appropriately … these are the kinds of areas we’re moving into now.

For the future – and I'm going to keep this very specifically to humanoids, because this is our area – I think these kinds of communication and entertainment applications are the ones people are going to see more and more over the next few years. You're going to start seeing the odd robot around your airport terminal, probably selling you ice cream or just telling jokes. There are already a few technologies that have popped up … you might have seen rear-projected people popping up at airports. They're going to become much more sophisticated; we're going to see much more movement. You're going to see them around shopping malls and that kind of thing.

The really hard tasks – manipulating objects, moving around difficult environments – they’re really still a few years off. We don’t have any legal framework for that kind of machine… What are the regulations for a humanoid robot that has to run through a crowd of people? There just aren’t any [regulations yet] and that’s potentially a really dangerous thing. Getting a regulatory framework in place is one of the things that has to happen before we see these things being used.

Ron Vanderkley: It’s interesting that you spoke about compliance. What are the safety aspects required in public demonstrations?

Will Jackson: Our robots are force controlled, so even if you stand right up close and the robot gives you a whack, you're not going to get much more than a small bruise. To this day we've never had any significant injury from a robot – touch wood. Mainly because they only have enough power to make the gestures that they need to make. And they are inherently compliant; they're pneumatically actuated on all the larger movements. Our next generation of robots have active compliance and force sensing built in, and are slightly more dangerous … it's a bit like fly-by-wire: if your control system fails, you can lose your compliance, so we have to build in a lot of safety factors. Our current commercial robots are not capable of manipulating objects or precision movements. Our next generation of robots very much are; they're capable of lifting quite high loads, making precision manipulations, and things like that.

So the control task is entirely different. Basically, the current generation is not that dangerous, so putting them around people is not too scary. We say the people are usually more dangerous to the robot than the robot is to the people. It's much easier for you to break it than it is for it to break you. It isn't Atlas – don't be afraid! [laughs]
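The force-limited behaviour Jackson describes, stopping when the robot comes up against a person, can be sketched as a velocity command scaled down by sensed contact force. The threshold and interface here are hypothetical illustration, not Engineered Arts' controller:

```python
def safe_velocity_command(desired_velocity: float,
                          sensed_force_n: float,
                          force_limit_n: float = 20.0) -> float:
    """Back off the commanded velocity as contact force rises, and stop
    entirely once the force limit is reached."""
    if sensed_force_n >= force_limit_n:
        return 0.0
    # Linear back-off: full speed in free space, zero at the limit.
    scale = 1.0 - sensed_force_n / force_limit_n
    return desired_velocity * scale

v_free = safe_velocity_command(0.5, 0.0)      # free space: full speed
v_contact = safe_velocity_command(0.5, 25.0)  # pressing on someone: stop
```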

Ron Vanderkley: Where do you think we are with mobile ground-based robotics?

Will Jackson: One really interesting thing is that the water-based robots and the air-based robots are really the most advanced autonomous vehicles that we have. By autonomous I mean in a mechanical sense, not in a control sense.

So why don’t we have autonomous land-based vehicles? Well, we have some wheeled ones; there are self-driving cars. We don’t have anything else with legs other than BigDog from Boston Dynamics, but it seems to be way behind what we have in terms of UAVs and ROVs. I would say the reason for that is that air and water are fluid mediums: they’re soft, and if you put a rigid robot in a fluid medium, the medium complies to the robot. So you’re able to maintain control and everything’s good. When your UAV hits the side of a mountain, things go badly wrong. Same thing with your submersible when it hits a rock, everything goes bad when rigid robot hits a rigid thing.

That’s exactly the situation we have with biped robots or even quadruped robots. We have a rigid robot coming in contact with a rigid terrain. To try and tackle that with position control is almost impossible. You have the Honda ASIMO scenario: ASIMO works great if you put him on a polished surface where every piece of geometry is known to the tenth of a millimeter, but as soon as you put him on unknown terrain it’s not fine. We need force control; we need bouncy robots to work against substrate or in a rigid environment. Boston Dynamics have demonstrated that with their hydraulic robots; they’re able to respond fast enough to the environment, and that they can behave in a springy way. However, generally, they are not springy. BigDog does have some springs, but it’s generally mimicking the action of a spring through a very high speed hydraulic … not very efficient.

There are really interesting robots coming up; there's a nice Cheetah that's just come out of MIT. They're using things like really low-impedance drives, so the robot is very back-drivable and can conform to the environment without even having to have active force sensing. These are the exciting areas, and I think that once we kick off very compliant robots that are very bouncy and can work in rigid environments, then land-based robots will catch up with the submersibles and the UAVs.

Ron Vanderkley: What is your take on social robotics?

Will Jackson: There are some really interesting things you discover. One of the main things is that when you're in a lab environment, or you're working as a company and trying to imagine what people will do, what you imagine is not reality at all. What actually happens in an interactive scenario is nothing like what you imagine, and keeping your robot locked up in development means you're probably going in the wrong direction.

We have a policy that we don't try to imagine what people are going to do, and we don't try to imagine what the problems are going to be – "don't address problems that don't exist" – because you can invest a lot of time and energy in that. What we find in an interactive scenario is that people are actually really predictable; they're probably more predictable than the robot. You can set up certain things for the robot to do, and the response you'll get from a person will be very predictable. That's probably because we're used to interacting in social environments. There are cues, there are clues, there are things that we just take for granted and do every day, and it's quite easy to get robots to elicit those same kinds of responses from people. A really simple example: if I raise my eyebrows in a social context, it can mean "your turn to speak". Turn-taking in a robot conversation can be greatly simplified by things like that.

Ron Vanderkley: What is your take on the use of humanoids in robotics? Where should we be going?

Will Jackson: I just love robots and I particularly like humanoids. We get a lot of people who are aggressive about that or very negative about it. They say, “What a useless branch of robotics to be in, what’s the point of making a humanoid?”

I always agree with them because I think they’re absolutely right. For utility purposes, humanoid robots are not at all appropriate. If you want to wash your dishes, put them in the dishwasher, that’s a dishwashing robot. Don’t go out and try and buy some $2 million robot to wash your dishes, it’s just a stupid idea. If you want to get a beer from the fridge, get a conveyor belt, or get up and get it yourself. These are not the kind of tasks that we want to focus expensive hardware on. The other thing you need to look at is the economics of it. If you’ve got one really expensive piece of equipment servicing one person, that person’s got to be extremely rich, otherwise the economics don’t work.

If you look at industrial automation, you'll see very expensive production-line robots – $1-2 million, possibly. The economics work because those robots are producing 10,000 or 100,000 expensive cars, and the cost of that robot is spread over many units. If we apply the same thinking to humanoid robots, the humanoid robot has to interact with a large number of people. If it's working in a big public space and it sees 10,000 people a day, the cost of that robot becomes insignificant. If it's performing a utility task, i.e. replacing very low-paid labor for one person, that economic model is bound to fail. So I agree: humanoid robots are useless for utility tasks, you don't need to tell me.
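The amortization argument above is simple arithmetic; the figures below are illustrative, not real prices:

```python
# Spreading a robot's cost over the people it serves: the same machine is
# economically absurd for one user and trivial for a busy public space.

def cost_per_interaction(robot_cost: float,
                         people_per_day: int,
                         service_days: int) -> float:
    return robot_cost / (people_per_day * service_days)

# A $200,000 robot serving one person over 5 years: over $100 per day.
butler = cost_per_interaction(200_000, 1, 5 * 365)
# The same robot greeting 10,000 visitors a day for 5 years: about a cent
# per visitor.
greeter = cost_per_interaction(200_000, 10_000, 5 * 365)
```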

Ron Vanderkley: On behalf of myself and the podcast, Will, I’d like to thank you for taking the time to speak to us.

Will Jackson: Thanks very much for talking to me Ron.
