Robohub.org

Robots Podcast, episode 208

Ladybird with James Underwood

21 May 2016




In this episode, Ron Vanderkley interviews James Underwood from the Australian Centre for Field Robotics. Underwood discusses his work on an autonomous vegetable harvesting robot, Ladybird.

Ladybird is a lightweight, omnidirectional electric vehicle inspired by the Coccinellidae family of insects (ladybirds, or ladybugs). It is equipped with sensing, manipulation, and communication hardware and software. Various user interfaces will be developed so that growers, contractors, and harvesters can control the robot and use information derived from the system.

James Underwood


James Underwood is a senior research fellow at the Australian Centre for Field Robotics (ACFR) at The University of Sydney. He is an expert in perception systems for field robotics: the study of how outdoor robots working in complex, unstructured environments can make sense of their world through multi-modal sensing, data fusion, and mapping.



The transcript below has been edited for clarity.

Ron Vanderkley: Good morning James! Can you introduce yourself to our podcast listeners?

James Underwood: Hi, my name is Dr. James Underwood. I'm a researcher at the Australian Centre for Field Robotics, which is a robotics research group at the University of Sydney. I've been working there for the last several years on applications of robotics to agriculture, focused specifically on ground-based crops like vegetables, more recently cereals and grains, and also tree crops.

Ron Vanderkley: Great. We've heard a lot about a particular robot you are working on. I don't know if it's the only one, but it's the one that seems to be predominant: Ladybird. Could you describe the work that you're doing on that particular machine?

James Underwood: Yeah, Ladybird is a robot we custom-designed for the vegetable industry. We've also done a lot of research on a different platform called Trent, which we've used on tree crops; that was more of a general-purpose platform. This one has really been designed specifically for the vegetable crop industry. We built it as a kind of research platform carrying a number of different sensors that are relevant to a new wave of research in crop sensing. It's also equipped with a general-purpose manipulator arm, so we can actually interact with the crops.

In terms of the physical construction of the system, it has omnidirectional wheels, so all four wheels can point in any direction. It's a very maneuverable platform: it can go from one row to the next without using much of the headland, because we can come out, go straight sideways, and go back in. Another interesting feature of the platform is that it's covered in solar panels. These actually serve two functions. They act as covers, so we can shield some of the sensors underneath from direct sunlight, and we get better imagery of the crops that we are looking at.

The second function, of course, is the energy we get from the solar panels in these typical, very open field applications. On a sunny day they put a lot of energy into the all-electric system, which allows us to go for days on end without recharging.

Ron Vanderkley: You briefly talked about the sensors on the Ladybird. You mentioned infrared and optical frequencies, laser scanning, and 3-D imaging. Are there any other types of sensing equipment on the machine itself?

James Underwood: We also use stereo vision; that's a variation that allows us to estimate the 3-D shape of the structure that the cameras are looking at. We have a couple of sensors for navigation too: GPS guidance and an inertial sensor on board as well. Otherwise, the list you've mentioned basically covers it. We're looking at a lot of different parts of the optical spectrum, as you say, from visual to infrared and so on. We have hyperspectral sensors to look at that spectrum in detail.

There's also thermal imaging, which is long-wave infrared. So we're covering a large range of the electromagnetic spectrum with the sensing approach there.

Ron Vanderkley: What are the difficulties in autonomous pest control? Actually, not just pest control; there's weed control too. Are you using anything other than chemical sprays, such as steam or microwaves? Have you looked at mechanical approaches?

James Underwood: We have done a little bit of exploration there. As I mentioned, we have this general-purpose manipulator arm on the platform, and we can put a tool on the end of it to mechanically remove weeds and so on. But we haven't, at this stage, done extensive testing. There's quite a lot of work still to be done there, in particular on how fast that type of thing could be done. We could stop, identify the weed, and then position the arm to remove it.

Would that be fast enough for realistic commercial applications? That's the question we don't really know the answer to at this point. There are some other groups around the world who have looked at, for example, mechanical weeding mechanisms, in particular a stabbing mechanism: rather than carefully pulling the weed out, you just punch it directly. That type of technology looks like it might show some promise and allow these sorts of things to be done more rapidly.

Certainly our initial viewpoint on this is to use a rapid target-and-spray mechanism, which can be used even while the vehicle is in motion; for example, it can find weeds and shoot at them as it rolls past. While it is desirable to remove the chemical element altogether, I think this is probably a good first step. We can reduce the quantity of chemical that is used while keeping up the pace of the robot as it goes over and deals with these weeds. Certainly mechanical weeding is something that we're very interested in, because we can get rid of that chemical element altogether. That may be a very interesting way to deal with herbicide resistance and problems of that nature.
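
For a rough sense of why spraying on the move is so much faster than stop-and-manipulate weeding, here is a toy latency-budget calculation. The speed and camera-to-nozzle distance are invented for illustration; they are not Ladybird's actual parameters.

```python
# Toy latency budget for spot-spraying while in motion (all numbers made up).
# If the camera sees a weed ahead of the nozzle, the whole detection and
# aiming pipeline must finish before the nozzle rolls over that weed.
SPEED_MPS = 1.0            # assumed vehicle ground speed
CAMERA_TO_NOZZLE_M = 0.5   # assumed look-ahead between camera view and nozzle

budget_s = CAMERA_TO_NOZZLE_M / SPEED_MPS
print(f"Detection + aiming must complete within {budget_s:.2f} s")
# At 1 m/s with 0.5 m of look-ahead the budget is 0.50 s;
# doubling the speed halves the time available.
```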

Ron Vanderkley: Great. I don't know if I brought it up, but microwaves were looked at at one stage. I don't know how viable that is. Is that something that has been investigated?

James Underwood: I think there are a few groups who have looked into that. To my knowledge there aren't any viable systems of that nature yet, but it is also something we are very interested in. There are many challenges that have to be solved with a microwave-type application: for example, the power consumption such a device would require, and safety considerations around having high voltages in the system, and things of that nature. I believe the research shows it can be an effective mechanism for dealing with weeds, and in terms of combining such a system with a robot, there's no reason why they couldn't be put together.

I think that there’s more work to be done in that regard, to do a proof of concept of a useful whole system.

Ron Vanderkley: This is an interesting one for me. How do you encode the detection of a weed or a pest in an algorithm, to create software to deal with it? Is it visual references, colors, leaf shapes, like in botany, or its location relative to the crop?

James Underwood: Well, this is an interesting question for us, too. These are exactly the types of algorithms that we're interested in. A lot of the things that you mentioned are relevant. Obviously, what you need to be able to do is detect those weeds reliably, so as to minimize false detections, where you think part of the crop is a weed and you spray it. You want to avoid that kind of thing. Similarly, you want a sufficiently high rate of detection to actually deal with the weed infestation.

It's no good if you go through a field and miss a large percentage of the weeds that are there. So the objective is to do this reliably, with low error rates. Now, all of the things you mentioned can be used to detect weeds. You can use a color camera, and in some cases the weeds are a different color. You could use hyperspectral sensing, which does really the same sort of thing as a color camera, although it tends to go into the infrared range as well. It's a very detailed way of looking at color.

Where you might have two things that are almost the same shade of green, if there's a statistically significant, albeit nuanced, difference in the color, hyperspectral sensing can pick that out. The cost of the sensor goes up, but it may make the algorithms simpler and the whole system more reliable. Of course, if you can do it with a color camera, that's a better approach.
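
To illustrate why extra spectral bands can separate two near-identical greens, here is a minimal sketch of nearest-class-mean classification in spectral space. The spectra, band count, and noise levels are all synthetic assumptions, not ACFR data or methods: the two classes look alike in the visible range but differ slightly in the later (near-infrared) bands.

```python
import numpy as np

rng = np.random.default_rng(0)
bands = 50  # hypothetical number of hyperspectral bands

# Synthetic mean reflectance spectra: the weed differs from the crop only by
# a small offset in the last 20 bands (a subtle near-infrared difference).
crop_mean = 0.40 + 0.05 * np.sin(np.linspace(0.0, 3.0, bands))
weed_mean = crop_mean.copy()
weed_mean[30:] += 0.03

def classify(pixels, class_means):
    """pixels: (N, B) reflectance spectra. Returns index of the nearest
    class mean (0 = crop, 1 = weed) by Euclidean distance."""
    dists = np.linalg.norm(pixels[:, None, :] - class_means[None, :, :], axis=-1)
    return dists.argmin(axis=1)

# 100 noisy pixels of each class.
pixels = np.vstack([
    crop_mean + 0.01 * rng.standard_normal((100, bands)),
    weed_mean + 0.01 * rng.standard_normal((100, bands)),
])
labels = classify(pixels, np.stack([crop_mean, weed_mean]))
print(f"weed pixels flagged as weed: {labels[100:].mean():.0%}")
print(f"crop pixels flagged as weed: {labels[:100].mean():.0%}")
```

The tiny per-band difference accumulates across many bands, so the classes separate cleanly even though an RGB camera would see nearly the same green.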

The key with all of these things, where you're using color and texture together to try to separate weeds from the crop, is that you have to be able to deal with the variability that you get. Not only will different instances of the same species of weed look different, but you also have a lot of variation just due to sunlight changes and things like that. Being able to do this reliably is really the key thing, and it's quite a challenging task.

In a lot of cases, the approach that we take, following the state of the art in computer vision, is to try not to specify hand-coded rules. We try to avoid saying a weed is something that is this green and this size, or something like that. We try to get large datasets where we can, and learn directly from the data which facets of the sensor data best discriminate the weed class from the crop class in question. It's a fairly standard approach. It's generally based on color and texture, but not necessarily in a human-specified way, like a particular shape or a particular color.
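
A minimal sketch of that data-driven idea follows: generic color and texture features are extracted from labeled image patches, and a standard classifier learns the discrimination rather than a human writing rules. The feature set and the placeholder data are hypothetical, not the actual Ladybird pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def features(patch):
    """Color + crude texture descriptor for an (H, W, 3) RGB patch in [0, 1]."""
    mean_rgb = patch.reshape(-1, 3).mean(axis=0)     # average color
    std_rgb = patch.reshape(-1, 3).std(axis=0)       # color variability
    grey = patch.mean(axis=-1)
    texture = np.abs(np.diff(grey, axis=0)).mean()   # gradient energy
    return np.concatenate([mean_rgb, std_rgb, [texture]])

# Placeholder patches; in practice these would be hand-labeled field imagery.
rng = np.random.default_rng(1)
crop_patches = rng.uniform(0.2, 0.5, (100, 32, 32, 3))
weed_patches = rng.uniform(0.3, 0.8, (100, 32, 32, 3))

X = np.stack([features(p) for p in np.concatenate([crop_patches, weed_patches])])
y = np.array([0] * 100 + [1] * 100)  # 0 = crop, 1 = weed

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```

The point is that the learner, not the engineer, decides which combinations of color and texture separate the classes, which is what makes the approach robust to the variability Underwood describes.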

You mentioned the position of the weed on the ground. That's another really important cue, because crops tend to be seeded in straight lines. For any vegetation that you find off that seed line, the probability is much higher that it's actually a weed. You can encode that knowledge into the system and let it learn what weeds look like by finding the examples that are off the seed line, learning from those, and then learning automatically to find those same weeds amongst the seed line as well. That's the kind of approach that we are taking.
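
Here is a toy sketch of that seed-line prior. The row geometry, spacing, and threshold are hypothetical, and a real system would first have to estimate the row lines from GPS or imagery; the off-row detections it flags could then bootstrap training labels for an in-row classifier, as Underwood describes.

```python
import numpy as np

ROW_SPACING_M = 0.5      # assumed distance between seed lines
OFF_ROW_THRESH_M = 0.10  # farther than this from any row => probable weed

def off_row_weed_mask(xy):
    """xy: (N, 2) detected-plant positions. Rows are assumed to run parallel
    to the y axis at x = 0, ROW_SPACING_M, 2*ROW_SPACING_M, ...
    Returns a boolean mask of detections sitting off the seed lines."""
    dist_to_row = np.abs((xy[:, 0] + ROW_SPACING_M / 2) % ROW_SPACING_M
                         - ROW_SPACING_M / 2)
    return dist_to_row > OFF_ROW_THRESH_M

xy = np.array([[0.02, 1.0],    # on the first seed line  -> probably crop
               [0.25, 2.0],    # mid-way between rows    -> probably weed
               [0.48, 3.0]])   # near the second line    -> probably crop
print(off_row_weed_mask(xy))   # [False  True False]
```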

There can be a lot of sophistication in how that's done, and that's necessary in order to get those accuracies up and to be able to do weeding once the field is no longer fallow. When the field is fallow, detecting weeds is easy, because anything that's green you just shoot, and there are already commercial systems that do that.

Ron Vanderkley: Cool, so what are the advantages and disadvantages of autonomous harvesting, fertilizing and planting today?

James Underwood: There are a number of different advantages. One of them, obviously, is labor saving; for a lot of the growers we talk to, that's at the top of their list. They have difficulties with labor supply in Australia, and I understand there are similar issues in other countries around the world. A lot of the work is done by temporary migrant workers.

We have an aging workforce. The average age of farm workers in this country is actually surprisingly high, around 55 or 60 years old, and the younger generation don't seem to be showing as much interest in getting involved in that area. When we talk to growers, the cost of labor and the security of the labor force are pretty much number one on their list. So clearly, autonomous systems have an advantage in that regard.

There are many other advantages you start to get when you have an autonomous system, such as repeatability, and the ability to operate 24/7 without getting tired while keeping attention to detail at 100% for the entire time it's working. There are advantages for biosecurity, with applications in disease detection and management, and advantages in having fewer instances where people handle the food that we eat. All these things have benefits for the quality of the produce and the safety of the food.

Whichever thread you pick up, you see a whole number of different advantages that might not necessarily be at the top of the grower's list. But when you start to look at them all coming together in this one system that can provide many different advantages at once, it starts to look like a pretty good way to move.

Ron Vanderkley: What would be the ideal goal in crop management using multiple small robot tractors?

James Underwood: There are a number of advantages. One of the key things with using smaller autonomous systems is that the vehicle has a much lighter mass, which has benefits in terms of soil compaction. Productivity in farming over the last few decades has really been driven by getting bigger and bigger equipment, so that one person driving a tractor can tend to more hectares per day simply by virtue of having a bigger machine. The cost of that is the much greater soil damage and soil compaction it causes. This has been mitigated to some extent by controlled-traffic farming, where the vehicle puts its wheels over exactly the same positions each pass, so the extent of the damage is minimized. But the size of these machines really does do a lot of damage to the soil. Smaller vehicles mean avoiding that problem.

As I said before, there are all these advantages jumping out at you. For example, by virtue of having a smaller machine, you might need several of them to cover the same number of hectares per hour that you could do with a much bigger machine. But now, if one of your ten smaller machines breaks down completely, you're still running at 90% capacity. Whereas when that one big machine, which cost a fortune incidentally, breaks down, you're down to 0% capacity and there's very little you can do about it. By having teams of smaller platforms, you get scalability and you get resilience to malfunction; you can keep going with a subset of the team.
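
As a back-of-the-envelope check on that resilience argument, here is a small calculation assuming each machine fails independently with the same probability; the numbers are illustrative only, not from the interview.

```python
# Fleet resilience vs. a single large machine (illustrative numbers).
p_up = 0.9   # assumed chance any one machine is operational on a given day
n = 10       # fleet size; each small machine carries 1/n of the workload

# Expected capacity is the same either way (90%), but total stoppage is not:
p_all_down_fleet = (1 - p_up) ** n    # every one of ten machines down at once
p_all_down_single = 1 - p_up          # the single big machine down

print(f"P(zero capacity), fleet of {n}: {p_all_down_fleet:.0e}")  # ~1e-10
print(f"P(zero capacity), one machine: {p_all_down_single:.0%}")  # 10%
```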

A small farmer could have just one of these systems, while a really big property could use a hundred of them, or however many are required. It's scalable, more resilient to breakages, and better for soil compaction.

Ron Vanderkley: It’s also an approach to integrated farm management using farm bots both aerial and ground in the production, is that a viable option as well?

James Underwood: It doesn't exist today as a system you can just buy and use; we're talking about research and where the technology is heading. There would definitely be scenarios where aerial vehicles could team up with ground vehicles to very effectively manage certain aspects of a farm. We haven't seen it enough in practice to get a hands-on view of what those benefits actually look like. It takes longer for a ground-based vehicle to scan across an entire [inaudible 00:16:28], whereas it is much quicker for a flying vehicle to go over the top at a higher altitude and get a lower-resolution snapshot of the status of the crop. If something was identified from that data, you could then do some sort of targeted management response based on what you saw from the air, sending vehicles specifically to the locations on the ground that need attention, whatever that might be. It is likely that there will be combinations of different bits of technology that work together and combine in that way.

Ron Vanderkley: From a grass-roots perspective, what interests local farmers? Where do they want us to go with robotics?

James Underwood: I mentioned labor saving being at the top of the list for the growers we have spoken to. I think they'd like to see more technology coming online: assistive technology that allows greater areas to be managed in some sense using this technology rather than doing everything by hand, or technology that in some cases can completely replace certain functions that are currently done by manual labor. This could include a number of different things. For example, depending on the crop and the time of year, weeding certainly has a manual labor component. I have been to farms where they have to send people out just to look at the fields; they walk the whole field and look at the crop to make sure there are no contaminants, which could be leaves that blow in from adjacent native trees and things like that. You don't want gum leaves in your bag of spinach; as consumers we're fussy. You could have robots going out and doing that job.

There are a lot of labor-saving approaches that tend to be high up on the list. Where we are now, we are doing some early prototypes. Growers are going to want to see the technology actually be ready to use in a commercial sense, and we still need to move in that direction. The early prototypes demonstrate the functions, but we need to get a few more miles under our belt before we're ready to sell the equipment.

Ron Vanderkley: Where do you see the future of robotics in the next ten or twenty years, for instance in agriculture?

James Underwood: Ten to twenty years is a very, very long time, especially in terms of technological progress. It's really hard to predict that far forward with any kind of accuracy.

Within that time frame, I think it would be good to see the entire production process for certain types of crops done in a digital fashion. I don't necessarily mean that there are no people involved; I mean that you have technology at all the right key points in the production cycle. All the information that's relevant to growing the crop is recorded, stored, and used to make optimal decisions and put down the right inputs and so on, to grow the best possible crop and for that crop to be as resilient as possible, [inaudible 00:20:07] all using digital technology. That can be a big combination of different types of hardware; robotics is one part. It could also be stationary sensors, satellite data or UAV data, handheld sensors, sensors mounted onto conventional vehicles, and a whole number of things like that.

All that data then comes together to address the information requirements across the whole production cycle, coupled with various automated and optimized processes, in terms of harvesting and so on. If we're talking twenty years' time, then I hope by then we'll have managed to really digitize this whole system. In the nearer term, I think we'll start to see individual applications get ticked off the list. Not a general type of robot that can do all of these things in the next five years, but maybe a package that you can add to a tractor that will specifically target weeds in a non-fallow period. That's a piece of technology that will come into commercial existence fairly soon, because it could be integrated fairly easily into current practices.

You might then see certain crops that are not currently harvested in a mechanized way start to become mechanized. A lot of crops are already harvested mechanically; with spinach, for example, you have a machine with a blade at the front that cuts it off, and onto a conveyor belt it goes. It's all manually driven, but it's quite an efficient method of harvesting. For those crops that are currently not mechanized, there are opportunities to tick them off the list: not necessarily one special harvesting robot with arms that deals with all the different crops, but say an asparagus harvester that uses modern robotics technology and does that one job very well. In the next five years or so we'll start to see these kinds of specific applications get ticked off one by one, and eventually you get a kind of matrix of connectivity between all these different applications that start talking to each other and dealing with the whole production cycle in the bigger picture.

Ron Vanderkley: What do you think the options are for new players in robotics, for instance in agriculture? Are there a lot of opportunities for graduate engineers?

James Underwood: Yeah, I guess that's a fairly broad question, but I think so. I think the industry, by and large, is seeing the potential in this kind of technology. Throughout the whole chain we're starting to see more funding in fundamental blue-sky research, more funding in applied research leading towards commercialization opportunities, and more commercialization funding to get those off the ground. There are more commercialization opportunities popping up all the time, and there are brand-new opportunities in building service companies that supply the services.

We're at the start of the ramp, I think, at this stage. We will start to see some of these companies' products being used more universally, becoming best practice in farming. Then we will have hundreds and hundreds of these systems, whatever they look like, out there in the field. They will need servicing and maintenance, just like tractors do today; there is no magic behind robotics such that you somehow don't need to maintain them, look after them, and keep developing them. At every stage in the pipeline there will be increasing opportunities, from blue-sky research all the way through to actual farming practice.

We also hope that as farming becomes more high-tech, it might attract the younger generation back into those sorts of roles. The demographics of those who work on farms today skew towards an aging population, and younger people tend not to be so interested in farming as a career choice right now, so there are hopes that this technology might help fix that demographic issue.

Ron Vanderkley: James, thanks on behalf of the Robots Podcast for a most interesting session.



Podcast team The ROBOTS Podcast brings you the latest news and views in robotics through its bi-weekly interviews with leaders in the field.




