Blog

We show an end-to-end system for acquiring high-resolution information to support precision agriculture in almond orchards.

The robot drives along the orchard rows autonomously, gathering laser and camera data while passing the trees. Each tree can be automatically identified, and information such as flower and fruit counts is produced. The information can be stored in a database, compared through the season and from one year to the next, mapped and displayed visually.

Our first full motion test of the Ladybird outside.

We exercise the whole system for the first time, including translation, rotation and combined manoeuvres, as well as autonomous row alignment and following.

Meet our newest robot, the Ladybird! A robot we have designed and built for the vegetable industry. In this video, you can see the internal framework and components as it takes its first steps. Stay tuned for more videos as we develop the platform!

We have designed and built this robot as a new research platform to support Australia's vegetable industry. The omnidirectional wheel base allows traversal over most existing farm configurations, treading much more lightly where tractor wheels currently run. In addition to the low weight of the vehicle, the ability to turn each wheel allows precision guidance and manoeuvrability while minimising damage to the soil. In the undercarriage, the Ladybird carries a variety of optical sensors, including stereo and hyperspectral cameras, and the versatile robot arm enables development in a wide variety of applications, including spraying, weeding, thinning and, of course, supporting harvesting research. We are looking forward to our first tests on a vegetable farm in the coming weeks.

We exhibited a selection of our robots at CeBIT, with an emphasis on the future of agriculture.

During our recent field trip to the Yarra Valley, we demonstrated autonomous row following for the trellis structured apple configuration. The system worked reliably, and we used it to gather data for yield prediction for approximately 30 rows of apples.

This video shows Shrimp driving fully autonomously in an apple orchard in the Yarra Valley, Australia. It uses a 360 degree lidar to guide it along the row (no need for GPS).

Unlike Mantis, the 2D lidars on Shrimp are looking sideways to scan the trees, so the 360 degree Velodyne sensor was used instead. To emulate a lower cost 2D lidar, only one of the 64 Velodyne lasers was processed. We used the autonomous system to obtain fruit yield data from approximately 30 rows of the farm without error.
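The single-ring emulation described above can be sketched as a simple filter over the Velodyne point stream. The point format `(x, y, z, ring)` and the chosen ring index are illustrative assumptions, not the actual data layout used on Shrimp:

```python
# Sketch: emulate a 2D lidar by keeping only one of the 64 Velodyne rings.
# The (x, y, z, ring) tuple format is an assumption for illustration.

def emulate_2d_scan(points, ring_index=32):
    """Return only the points from a single laser ring,
    approximating what a planar 2D lidar would see."""
    return [(x, y) for (x, y, z, ring) in points if ring == ring_index]

# Example: three points on different rings; only ring 32 survives.
cloud = [(1.0, 0.5, 0.2, 31), (1.1, 0.4, 0.0, 32), (0.9, 0.6, -0.1, 33)]
print(emulate_2d_scan(cloud))  # [(1.1, 0.4)]
```

Discarding the other 63 rings gives a fair indication of how the row-following algorithm would behave on much cheaper hardware.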

This is a demonstration on our research platform, but the technology could easily be applied to any existing or new farm equipment, enabling smart farm vehicles to act as assistants to farmers.

We are interested in using robotic manipulators for harvesting and weeding applications.

This video from our field lab illustrates our concept for how a robot arm might look in performing a harvesting task.

This video shows some of the data and first processing from our recent trip to a banana plantation.

We gathered data from a banana plantation near Mareeba in the far north of Australia at the end of 2013. Using Shrimp, we drove up and down rows of the plantation, acquiring 3D maps and image data.

Farmers typically use a system of coloured bags to denote the expected harvest date, which can be detected and mapped by the system. It is also hoped that in the future, growth rates of shoots or 'suckers' can be measured, to predict maturation times of the fruit directly, many months in advance.
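One way the coloured bags might be picked out of camera imagery is by thresholding hue, which is far more stable under lighting changes than raw RGB. The hue bands and bag colours below are illustrative assumptions, not the plantation's actual colour scheme:

```python
import colorsys

# Sketch: classify a bag's average RGB colour into a coloured-bag category
# via hue. The hue bands below are illustrative assumptions, not the
# farm's actual bagging scheme.
def classify_bag(r, g, b):
    """Map an RGB sample (0-255 per channel) to a bag colour, or None."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    if s < 0.2:               # too grey/washed out to be a coloured bag
        return None
    hue = h * 360.0
    if hue < 20 or hue >= 340:
        return "red"
    if 20 <= hue < 70:
        return "yellow"
    if 90 <= hue < 160:
        return "green"
    if 190 <= hue < 260:
        return "blue"
    return None

print(classify_bag(210, 40, 30))   # red
print(classify_bag(40, 180, 60))   # green
```

Each detection would then be tagged with the vehicle's pose so that bags, and hence expected harvest dates, can be placed on the plantation map.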

Aquatic weed surveillance using robotic aircraft

We built and tested a prototype robotic aircraft and surveillance system to detect aquatic weeds in inaccessible habitats.

Read the report

Detecting Wheel Cacti
This project examined the role of unmanned aerial vehicles in detecting, classifying and mapping infestations of wheel cactus, Opuntia robusta, over large areas of rangelands in outback Australia.

Wheel cactus, which is native to Mexico, has now naturalised in South Australia, New South Wales and Victoria. It is often located in terrain that is difficult to access, so monitoring and control using unmanned aerial vehicles and remote sensing offers significant potential.

Read the report

One of our research ground vehicles, Mantis, was used to successfully demonstrate autonomy at an almond orchard.

The robot uses its forward-looking laser to estimate the geometry of the tree foliage in front, enabling it to drive along the row without needing GPS. Additionally, it can detect people ahead of it, slowing down and coming to a safe halt.
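In its simplest form, driving along the row from laser geometry reduces to centring the vehicle between the foliage on either side. This is a minimal sketch of that idea, not Mantis's actual controller; the proportional gain and the left/right range grouping are assumptions:

```python
# Sketch: centre the vehicle between two tree rows using side ranges
# taken from a forward-looking laser. The gain k and the pre-grouped
# left/right ranges are illustrative assumptions.

def steering_command(left_dists, right_dists, k=0.5):
    """Proportional steering toward the row centreline.
    Positive output steers left; ranges are in metres."""
    left = sum(left_dists) / len(left_dists)     # mean range to left row
    right = sum(right_dists) / len(right_dists)  # mean range to right row
    # If the left row is farther away than the right, the vehicle sits
    # right of the centreline; steer left (positive) to correct.
    return k * (left - right)

print(round(steering_command([2.2, 2.0], [1.6, 1.8]), 2))  # 0.2 -> gentle left
```

A real implementation would also fit the row direction to correct heading, not just lateral offset, but the centring term above captures why no GPS is needed.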

The video shows a conceptual demonstration of how this could be used as a farmer assistance mechanism, whereby the vehicle could accompany a farmer, carrying heavy loads such as buckets of fruit, or towing other forms of equipment. Although demonstrated on one of our Perception Research Ground Vehicles (Mantis), the core technology can easily be applied to existing or new farm machinery.

GPS can be unreliable in canopied environments, because the canopy occludes the satellites from the vehicle. Forms of autonomy that require no GPS are therefore likely to be more reliable.

Alligator Weed Detection Using UAV
We used a hexacopter to map and classify alligator weed from an aerial perspective.

In this trial, a lightweight hexacopter was used to detect alligator weed infestations. The final map product can be opened in Google Earth and can help weed controllers locate the infestations.
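A map product that Google Earth can open directly is plain KML, so emitting one detection per placemark is enough. The function and the coordinates below are illustrative placeholders, not the trial's actual output pipeline:

```python
# Sketch: write detected infestation points to a KML string that
# Google Earth can open. Names and coordinates are placeholders.

def detections_to_kml(detections):
    """detections: list of (name, longitude, latitude) tuples."""
    placemarks = "".join(
        f"  <Placemark><name>{name}</name>"
        f"<Point><coordinates>{lon},{lat},0</coordinates></Point>"
        f"</Placemark>\n"
        for name, lon, lat in detections
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n'
        + placemarks
        + "</Document>\n</kml>\n"
    )

kml = detections_to_kml([("weed-001", 151.0, -33.9)])
print("weed-001" in kml)  # True
```

Saving the string with a `.kml` extension makes each detection clickable on the satellite imagery, which is how a controller would navigate to an infestation.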

Woody Weed Detection, Classification and Control
"J3 Cub", our unmanned aerial vehicle (UAV), was used to detect and map various species of woody weed in Northern Queensland.

This trial aimed to produce a weed distribution map over a large area in Northern Queensland. During the trial we mapped various woody weeds, including prickly acacia (Acacia nilotica), parkinsonia (Parkinsonia aculeata) and mesquite (Prosopis pallida). The map product can be used by farmers to plan control and eradication.