The University of Sydney’s Australian Centre for Field Robotics (ACFR) has been conducting research in autonomous remote sensing systems, and developing innovative robotics and intelligent software for the environment and agriculture community for over a decade.
For more information about our research, see: www.sydney.edu.au/engineering/acfr/agriculture
The ACFR team has demonstrated SwagBot autonomously spot spraying weeds on a grazing property near Marulan, New South Wales. SwagBot can be seen automatically detecting and spraying serrated tussock. This solution has the potential to significantly reduce the burden of ongoing weed management.
SwagBot has been designed for the grazing livestock industry to assist with a range of tasks including weed control, animal monitoring and pasture surveys.
Serrated tussock is a highly invasive weed found throughout temperate Australia. It threatens native grasslands by reducing biodiversity, and has a severe impact on agricultural productivity.
The Digital Farmhand comprises a small mobile platform, carrying a smartphone, sensors and on-board computing, that can be remotely or autonomously controlled. The robot also has a three-point-hitch system which allows farming implements to be used for activities such as precision seeding, spraying and weeding. Through its ability to monitor individual plants, the data it produces has the potential to support better on-farm decision making, helping growers increase yield and productivity, reduce input costs, and maximise nutrition security.
As part of the Launch Food program, the Pacific Islands were chosen for a pilot study because of the need to improve food security in the region and the strong ties between Australia and the Pacific Island community. In this video, we travelled to Samoa to trial the robot on three different farms and conducted a workshop with local farmers to get feedback on how a system like Digital Farmhand could be used in the region.
In addition to the trial our team assessed:
- The current level of digital technology readiness and understanding amongst farmers;
- The ICT infrastructure currently in place to support platforms like Digital Farmhand; and
- An economic analysis of current farming practices, and of how technology could help reduce input costs and increase productivity and yield.
Digital Farmhand and SwagBot were trialled on an orchard (apple, nectarine, peach) near Bilpin, NSW. The team wanted to see how Digital Farmhand and SwagBot would perform in an orchard setting.
Below is a short video montage of the trial.
Digital Farmhand is a modular, low-cost platform designed to assist smallholder farmers in improving their productivity and yields, and ultimately provide a more reliable income amidst changing markets and climates. In its simplest form it is a small electric tractor-like vehicle that can tow a variety of implements such as seeders, weeders and bed preparation tools. Digital Farmhand can also use accessible smartphone technologies along with AI to provide crop analytics such as yield estimation and pest and disease identification, as well as precision automation of many labour-intensive farm tasks, e.g. weeding, spraying and seeding. The ACFR team conducted trials on three different farms in Fiji with the Digital Farmhand robotic platform in June 2018.
Below is a short video of the trial and an article from the ABC about the project.
Created through the Horticulture Innovation Centre for Robotics and Intelligent Systems at the University of Sydney's internationally-recognised Australian Centre for Field Robotics, RIPPA is a production prototype robot for the vegetable industry. This video provides an update on RIPPA’s functionality and future research and development work.
This video shows our latest work in light interception modelling for orchards. Tree geometry is modelled using point clouds from a handheld LiDAR, while public weather data is used to model the sky at a given time and date. The light in the sky is then ray-traced through the tree to provide an estimate of light interception, distribution and absorption. The model can be applied to decision support systems, for example to inform optimal pruning practices. For more information see our arxiv preprint.
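The core of the model can be sketched in a few lines: voxelise the LiDAR point cloud, march a ray through the canopy along the sun direction, and attenuate the beam through occupied foliage voxels. The voxel size and extinction coefficient below are illustrative assumptions, not parameters from the published model:

```python
import numpy as np

def light_interception(points, sun_dir, voxel_size=0.25, k=0.5, n_steps=60):
    """Fraction of an incoming beam intercepted by a tree canopy,
    estimated by marching a ray through a voxelised point cloud and
    applying Beer-Lambert-style attenuation 1 - exp(-k * path_in_foliage).
    voxel_size and extinction coefficient k are illustrative values."""
    # occupancy set: voxel indices containing at least one canopy point
    occupied = {tuple(v) for v in np.floor(points / voxel_size).astype(int)}
    # start the ray above the canopy, centred over the points
    pos = points.mean(axis=0).astype(float)
    pos[2] = points[:, 2].max() + 1.0
    step = np.asarray(sun_dir, float)
    step = step / np.linalg.norm(step) * voxel_size
    path = 0.0  # metres of ray travel spent inside foliage voxels
    for _ in range(n_steps):
        pos = pos + step
        if tuple(np.floor(pos / voxel_size).astype(int)) in occupied:
            path += voxel_size
    return 1.0 - np.exp(-k * path)
```

A full model ray-traces many sky directions weighted by the modelled sky radiance; this single-ray version only shows the voxel-marching idea.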
This small bot is capable of doing useful tasks in a fully autonomous fashion. Features include on-board computing, communications, sensing, power management, solar system and high torque motors. There is also space for additional payloads.
Small robots have a number of potential applications including materials inspection (i.e., pipes), navigating through collapsed buildings, intelligent transportation/delivery, micro surgery, surveillance and more.
This video shows the first steps towards an automatic lameness detection system for dairy cows. Four sensors record 3D data as cattle walk past. A Neural Network has been trained with hand-labelled data to detect hooves. These detections are projected to 3D and tracked to provide four hoof trajectories. Limb motion is an important indicator of lameness, and an example of hoof placement is shown for a healthy and severely lame cow.
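Once four hoof trajectories are tracked, simple motion statistics can flag asymmetric gaits. A toy sketch of one such indicator follows; the trajectory format and the metric itself are assumptions for illustration, and the actual system compares richer limb-motion features:

```python
import numpy as np

def stride_asymmetry(left_traj, right_traj):
    """Crude lameness indicator for a contralateral pair of tracked 3D
    hoof trajectories (N x 3 arrays, x along the walking direction):
    the relative difference in total forward travel between the hooves.
    0 means symmetric; values near 1 mean one hoof barely advanced."""
    def travel(traj):
        # net forward displacement over the tracked walk-past
        return float(traj[-1, 0] - traj[0, 0])
    l, r = travel(left_traj), travel(right_traj)
    return abs(l - r) / max(abs(l), abs(r), 1e-9)
```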
SwagBot was recently demonstrated at a cattle station near Nevertire, NSW. SwagBot is a lightweight, electric vehicle designed to collect data on pasture and livestock. Local farmers were shown how SwagBot can automatically detect and spray weeds in grazing land using various spray attachments. The team also completed aerial surveying of the property which will be used to develop farm maps and resources for weed mapping.
Thank you to Central West Local Land Services for organising the event.
The RIPPA team recently completed a successful field trial on a broccoli crop at Fresh Select farms in Werribee, Victoria. RIPPA's tasks included data collection, foreign object removal, a solar endurance characterisation, and testing a new deep learning algorithm for weed detection, used for real-time mechanical weeding.
Thanks to the team at Fresh Select for making this possible.
This video shows our latest results in mango fruit detection, localisation and mapping. The multi-sensor robot 'Shrimp' acquires data with a variety of different sensors, including lidar for tree canopy segmentation and colour vision for fruit detection and triangulation. This is arguably the world's most accurate system for mapping individual whole fruit in commercial orchards, while the fruit is still on the tree. Compared to post-harvest yield estimates for individual trees, the system counts accurately (linear fit, near unity slope of 0.96 and r^2 value of 0.89). The system has now been validated on two subsequent seasons, with the third planned later this year (2017). Scanning is performed 2 months before harvest time, meaning there's plenty of opportunity to use it for precision agriculture and on-farm decision making, towards optimised fruit production.
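The quoted validation statistics (near-unity slope of 0.96, r^2 of 0.89) come from a standard least-squares fit of per-tree robot counts against post-harvest counts. A sketch of that calculation, with hypothetical data in the test rather than the trial's real counts:

```python
import numpy as np

def count_validation(robot_counts, harvest_counts):
    """Least-squares line through per-tree (robot count, harvest count)
    pairs, returning (slope, intercept, r_squared). A slope near 1 and
    high r^2 indicate the robot counts track the true per-tree yield."""
    x = np.asarray(robot_counts, float)
    y = np.asarray(harvest_counts, float)
    slope, intercept = np.polyfit(x, y, 1)
    residuals = y - (slope * x + intercept)
    r2 = 1.0 - np.sum(residuals**2) / np.sum((y - y.mean())**2)
    return slope, intercept, r2
```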
The team at Jungle Creations has compiled footage of our Ladybird robot into a viral video.
This video shows footage from a recent demonstration of the Digital Farmhand robot at Richmond, NSW.
Digital Farmhand is a low-cost row-crop robot aimed at helping small-scale farmers in Australia and overseas to perform crop analytics and automate simple farming tasks. The design of the platform is based around low-cost sensors, computing and manufacturing techniques, which allow farmers to easily maintain and modify the platform to suit their needs.
The platform comes with an actuated three-point hitch mechanism which allows various implements to be attached (similar to a tractor). Four implements have been manufactured for this platform so far: a sprayer, a seeder, a tine weeder and a tow-ball hitch.
For more details, visit http://sydney.edu.au/acfr/agriculture
On 23 June 2017, ACFR was invited to a Local Land Services NSW field day to present the work it has done over the last six months on a platform called the Digital Farmhand (previously referred to as the Di-Wheel). The event generated a large amount of interest within the local farming community, with over 100 registrations. During the event, the team presented:
- the project overview
- the design concept of the Digital Farmhand
- plant analytics via low-cost sensors (smartphone camera)
- the future vision of the project
- live demonstration of automated row turning via low-cost sensors (smartphone camera)
- live demonstration of a farming implement (spray boom) mounted on the digital farmhand
Below are some photos from the event. A news article appeared in the Hawkesbury Gazette.
We had a robotic arm lying around and thought we’d have some fun in the lab with a pneumatic pruner.
Shown here is a UR5 arm configured to navigate to way points on a tree. Once in position, the pruner is activated and a branch is removed.
We had a robotic arm lying around and thought we’d have some fun in the lab with a new type of gripper.
Shown here is a UR5 arm configured to navigate to way points on a tree. Once in position, the gripper is activated, then the arm twists and pulls the apple from the tree and places the fruit in a tray.
We recently conducted a trial demonstrating the RIPPA robot working on an apple orchard in Three Bridges, Victoria, Australia. RIPPA operated autonomously up and down the apple rows and was able to change rows at the headlands by moving sideways. The trial demonstrated VIIPA autonomously detecting and then targeting apples in real time with variable rates of fluid.
The video below shows some of the experiments conducted on the trial:
Future applications of the technology include pest management, pruning, thinning, and pollinating in tree crop farming.
In early October, ACFR conducted a series of field trials with the Di-Wheel robot in Lembang, on the outskirts of the city of Bandung, Indonesia. The objective of the trip was to investigate how robotics can be deployed and utilised in a farming context in a developing country. As part of our investigation, a community of local farmers was interviewed to gain a better understanding of their requirements and their situation. We also visited a variety of engineering firms to understand the engineering capabilities within Bandung to support future field trials in that region.
Below are some videos and photos from the trip.
Over the last few months, the RIPPA robot has been working on several commercial vegetable farms around Australia. Various experimental autonomous crop interaction tasks have been demonstrated including:
- autonomous row following and data collection
- autonomous real time mechanical weeding
- autonomous real time variable rate fluid dispensing using VIIPA
- autonomous soil sampling and mapping
Our work was featured in IEEE's Video Friday on November 4, 2016: http://spectrum.ieee.org/automaton/robotics/robotics-hardware/video-friday-rescue-quadruped-gesture-controlled-robot-arm-self-driving-van-1986
View the video below to see RIPPA in action.
The video shows the Di-Wheel being demonstrated on a kale crop row at Cobbitty farm (a University of Sydney campus).
Our initial tests of SwagBot last month have been featured in media outlets around the world.
International articles include: New Scientist, IEEE Spectrum, Mashable, CNET, The Telegraph UK, Popular Science, Popular Mechanics, Engadget, The Enquirer, Quartz, Gizmodo, Tech Insider, Modern Farmer, New Atlas, The Times of India and Reuters.
Upcoming trials will focus on applying research toward autonomous farm activities including monitoring and interacting with plants and animals.
Meet SwagBot – our latest farming robot. SwagBot proved successful in its first field test. SwagBot successfully demonstrated the ability to operate in the rugged cattle station environment. Future trials will focus on applying research toward autonomous farm activities including monitoring and interacting with plants and animals.
Mary O'Kane reflects on "Trends to inform smart choices" in the June edition of FOCUS (see pages 11-13).
New Scientist video featuring ACFR robots.
Members of the House of Representatives Standing Committee on Agriculture and Industry visited the ACFR technical laboratory on 14 April 2016, to hear about the latest innovations around robotics in agriculture. The committee were briefed on the current research and how it is directly related to aiding farmers and growers, such as sensory and imaging processes to improve apple growing, the RIPPA robot which can target and destroy weeds in crops, and UAVs for identifying problem weeds in the Australian outback. This visit was part of the federal parliament public hearing on agricultural innovation. More information about the hearing can be found at http://sydney.edu.au/news-opinion/news/2016/04/12/federal-parliament-public-hearing-on-agricultural-innovation-at-.html
On 6 April 2016, RIPPA ran its first endurance trial and completed almost 22 hours of continuous operation using only battery and solar power. This was a major accomplishment and a testament to the RIPPA design, and to the focus of ACFR's agricultural robots on being a real solution for farmers. The run began at 0530, one hour before sunrise, and completed at 0317 the next morning, nine hours after sunset. RIPPA roved autonomously up and down the spinach crop rows imaging the leaves, then waited until the solar panels had sufficiently recharged the batteries, and at 1000 resumed where it left off, continuing to rove up and down the rows. The irrigation created muddy and uneven terrain at the row ends, which was no problem for RIPPA, as you can see in the video. A fantastic effort from the ACFR team.
Thanks to Horticulture Innovation Australia and to Ed Fagan for hosting and supporting us at his farm.
A new three year program of high tech R&D for orchard management has begun, with the use of our Shrimp robot to acquire data from mango, avocado and macadamia orchards.
The data includes lidar, vision, thermal, hyperspectral, soil conductivity and natural gamma, demonstrating that there are many ways to view the humble tree:
RIPPA has just had its first ever field trial, on a spinach crop at Mulyan farms in Cowra, NSW. RIPPA drove up and down the rows autonomously, using satellite-based corrections to achieve 4 cm precision. You can see RIPPA and VIIPA in action on the WIN News Central West Facebook page.
Here's a video showing the first outdoor test of our new precision ground vehicle RIPPA™ (Robot for Intelligent Perception and Precision Application). VIIPA™ (Variable Injection Intelligent Precision Applicator) is shown autonomously shooting weeds at high speed using a directed micro dose of liquid. The first on-farm trial will be in Cowra late October, 2015!
With its comprehensive array of sensors, and ability to precisely and repeatably scan the field, Ladybird is well suited as a scientific research tool to measure crop phenotypes. We're working with the South Australian Research and Development Institute (SARDI) to test this application.
James Underwood gave a talk about autonomous information systems for tree crops, at the APAL speed updating session, alongside the National Horticulture Convention on the Gold Coast in June 2015. All the talks are available here.
This video shows the Ladybird performing targeted spot spraying in real time. In this example, we show real-time results, first in the lab and then on a commercial vegetable farm in Cowra, NSW, Australia. Ladybird detects the locations of seedlings in 3D using a stereo camera, then fires a small and controllable volume of spray at each target. Coupled with the algorithms shown in previous videos for automatic weed detection, this technology can be used to deliver tiny amounts of herbicide exactly where it is needed, anywhere on the farm, reducing herbicide volume to only 0.01% of conventional blanket spraying.
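The targeting geometry rests on standard stereo triangulation: a seedling feature matched in both cameras at disparity d lies at depth Z = f * B / d. A minimal sketch, with illustrative calibration values rather than Ladybird's actual ones:

```python
def stereo_depth_m(focal_px, baseline_m, disparity_px):
    """Depth of a matched feature from a calibrated, rectified stereo
    pair, via the standard triangulation relation Z = f * B / d.
    focal_px: focal length in pixels; baseline_m: camera separation in
    metres; disparity_px: horizontal pixel offset between the views."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

With an assumed 700 px focal length and 12 cm baseline, a seedling at 84 px disparity sits 1 m from the cameras; combined with the pixel coordinates this gives the full 3D target position for the sprayer.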
This video demonstrates the use of a reconfigurable rover for crop row monitoring.
The Ladybird robot and the Agricultural Robotics team at ACFR, The University of Sydney would like to wish everyone a safe and happy holiday period!
Here’s a demonstration of concept weeding methods using the robotic manipulator on our Ladybird robot. We’ll be doing some field trials early 2015!
We've just returned from another successful trip to the farm. Ladybird scanned corn to detect different varieties of weeds within the crop and beetroot just prior to harvest for yield monitoring and to evaluate the performance of different seed spacings. With harvest occurring all around us, it was great to see Ladybird operating autonomously alongside traditional farm equipment, showing that high-tech autonomous systems can easily coexist with current methods. The farm of the future is nearer than you might think.
2-6 February 2015
Applications due: 8 December 2014 (extended)
General enquiries: email@example.com
NEW: Preliminary program now listed at http://www.acfr.usyd.edu.au/education/ssar2015.shtml
The IEEE RAS Summer School on Agricultural Robotics (SSAR 2015) is a new summer school to be held at The University of Sydney, Australia over five days during the southern hemisphere summer, from 2-6 February 2015. SSAR 2015 is supported in part by the IEEE Robotics and Automation Society and The University of Sydney.

Agricultural robotics is an area of growing interest with the potential to bring about profound economic and social benefits. The School aims to promote robotics research that will enable safe, efficient, and economical production in agriculture and horticulture. The School will consist of presentations by world experts covering a broad range of topics in agricultural robotics, hands-on activities that encourage deep learning, and collaboration activities including a student poster session as well as several social events. Attendance is open to graduate students, postdocs, academics, and industry practitioners.
The main technical objective of the School is to cover the motivation driving research in agricultural robotics, existing projects and results, and open research problems in key areas of agricultural robotics. Underlying research topics include systems design of outdoor platforms, perception in semi-structured outdoor environments, planning and control for single and multiple robot systems, and manipulators for harvesting and weeding.
The School will include presentations by (and opportunities to interact with) representatives from the USDA, GRDC, Horticulture Innovation Australia Limited, and the Cotton Research and Development Corporation.
Please check the website for updates on the detailed technical program.
APPLICATION AND REGISTRATION
Application details can be found on the SSAR 2015 website (http://www.acfr.usyd.edu.au/education/ssar2015.shtml).
Applications will be processed as received. Spaces are limited so please send your application as soon as possible.
Applications are due by 8 December 2014.
General enquiries can be addressed to firstname.lastname@example.org.
We've finished constructing the Ladybird and successfully commissioned it on a commercial veggie farm near Cowra, New South Wales. In two parts, the videos show the construction, automation, data and processing.
In part 1, we show the construction and testing of the vehicle on a commercial vegetable farm near Cowra, New South Wales. The vehicle can drive autonomously up and down rows of a vegetable farm, gathering data that we think will be useful for growers to manage the farm. The Ladybird is a solar electric powered vehicle, and during our three day trip, we didn't need to charge the vehicle once.
In part 2, we show some examples of the types of data we obtain and how it can be processed, to provide useful information to growers.
Robert Fitch's presentation in Minlaton (SA) on “Robotics in agriculture now, and a potential solution for robotic snail management on the YP” was featured in the Yorke Peninsula Country Times newspaper.
The Ladybird has captured the imagination of growers and the public alike, with online news and radio articles featured around Australia and globally. Links to stories here.
The robot drives along the orchard rows autonomously, gathering laser and camera data while passing the trees. Each tree can be automatically identified, and information such as flower and fruit counts is produced. The information can be stored in a database, compared through the season and from one year to the next, mapped and displayed visually.
We exercise the whole system for the first time, including translation, rotation and combined manoeuvres, including autonomous row alignment and following.
We have designed and built this robot as a new research platform to support Australia's vegetable industry. The omnidirectional wheel base allows traversal over most existing farm configurations, treading much more lightly where existing tractor wheels currently run. In addition to the low weight of the vehicle, the ability to turn each wheel allows precision guidance and manoeuvrability, while minimising damage to the soil. In the undercarriage, the Ladybird carries a variety of optical sensors, including stereo and hyperspectral cameras, and the versatile robot arm enables development in a wide variety of applications, including spraying, weeding, thinning and, of course, harvesting research. We are looking forward to our first tests on a vegetable farm in the coming weeks.
We exhibited a selection of our robots at CeBIT, with an emphasis on the future of agriculture.
This video shows Shrimp driving fully autonomously in an apple orchard in the Yarra Valley, Australia. It uses a 360 degree lidar to guide it along the row (no need for GPS).
Unlike Mantis, Shrimp's 2D lidars look sideways to scan the trees, so the 360-degree Velodyne sensor was used for guidance instead. To emulate a lower-cost 2D lidar, only one of the 64 Velodyne lasers was processed. We used the autonomous system to obtain fruit yield data from approximately 30 rows of the farm without error.
This is a demonstration on our research platform, but the technology could easily be applied to any existing or new farm equipment, enabling smart farm vehicles to act as assistants to farmers.
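The lateral cue for this kind of GPS-free row following can be sketched very simply: with a row of trees on each side of the vehicle, the midpoint of the mean left-side and right-side lidar ranges gives the offset of the row centreline, which a steering controller then drives towards zero. An illustrative toy version, not the ACFR implementation:

```python
import numpy as np

def lateral_offset(scan_xy):
    """Given 2D lidar returns in the vehicle frame (x forward, y left)
    with tree rows on both sides, estimate where the row centreline
    lies relative to the vehicle. Positive means the centreline is to
    the left, so the vehicle should steer left to re-centre."""
    y = np.asarray(scan_xy, float)[:, 1]
    left, right = y[y > 0], y[y < 0]   # returns from each tree row
    return (left.mean() + right.mean()) / 2.0
```

A real system also needs outlier rejection, a heading estimate (e.g. from a line fit along x), and end-of-row detection, but the centring principle is the same.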
We are interested in using robotic manipulators for harvesting and weeding applications. This video from our field lab illustrates our concept for how a robot arm might perform a harvesting task.
We gathered data from a banana plantation near Mareeba in the far north of Australia, at the end of 2013. Using Shrimp, we drove up and down rows of the plantation, acquiring 3D maps and image data.
Farmers typically use a system of coloured bags to denote the expected harvest date, which can be detected and mapped by the system. It is also hoped that in the future, growth rates of shoots or 'suckers' can be measured, to predict maturation times of the fruit directly, many months in advance.
We built and tested a prototype robotic aircraft and surveillance system to detect aquatic weeds in inaccessible habitats.
Wheel cactus, which is native to Mexico, has now naturalised in South Australia, New South Wales and Victoria. It is often located in terrain that is difficult to access, so monitoring and control using unmanned aerial vehicles and remote sensing offers significant potential.
The robot uses its forward looking laser to estimate the geometry of the tree foliage in front, enabling it to drive along the row without needing GPS. Additionally, it can detect people out in front, slowing down and coming to a safe halt.
The video shows a conceptual demonstration of how this could be used as a farmer assistance mechanism, whereby the vehicle could accompany a farmer, carrying heavy loads such as buckets of fruit, or towing other forms of equipment. Although demonstrated on one of our Perception Research Ground Vehicles (Mantis), the core technology can easily be applied to existing or new farm machinery.
GPS can be unreliable under canopied environments, due to occlusions between the vehicle and satellites. Therefore, forms of autonomy that require no GPS are likely to be more reliable.
In this trial, a lightweight hexacopter was used to detect alligator weed infestation. The final map product can be opened in Google Earth and can help weed controllers locate the infestation.
This trial aimed to provide a weed distribution map over a large area in Northern Queensland. During the trial we mapped various woody weeds including prickly acacia (Acacia nilotica), parkinsonia (Parkinsonia aculeata) and mesquite (Prosopis pallida). The map product can be used by farmers to plan the control and eradication process.
Images are collected as "Shrimp" surveys the orchard. The algorithm classifies and counts apples in each image and provides a yield estimate for each row surveyed. This estimate can give the farmer an early indication of potential yield, allowing the farmer to refine and optimise farm operations.
As "Shrimp" drives along a row of an orchard, 3D maps are built from laser data. From this, we can segment and recognise individual trees, which is useful for data management. For example, when combined with our yield estimation techniques, it allows us to measure and associate the yield of each individual tree. This can be used to track information, such as yield, over time and it can be used actively, for example, to target autonomous or computer assisted spray trucks with spray programs for each tree.
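The per-tree association step can be sketched as nearest-tree assignment of fruit detections. Here it is simplified to along-row distances; the real pipeline works from full 3D tree segments:

```python
def per_tree_counts(fruit_pos, trunk_pos):
    """Associate each detected fruit with the nearest segmented tree to
    build per-tree yield counts. Positions are along-row distances in
    metres; returns {tree_index: fruit_count}."""
    counts = {i: 0 for i in range(len(trunk_pos))}
    for f in fruit_pos:
        # nearest trunk by along-row distance
        nearest = min(counts, key=lambda i: abs(trunk_pos[i] - f))
        counts[nearest] += 1
    return counts
```

Once counts are keyed by tree identity, they can be compared across seasons or fed to a per-tree spray program, as described above.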
The trial aimed to test the response of cows to the presence of a robot, and to determine the feasibility of remote or autonomous herding using an unmanned ground vehicle. The cows were calm with the robot in their midst, and were willing to be herded into the dairy at a gentle pace, demonstrating the potential of this technology. The story has captured the public's imagination, with media coverage around the world: Discovery Channel Canada, ABC Rural, BBC News, tested.com, cnet.com, International Business Times, and more.
Blog: SwagBot autonomous weed spraying demo
Feb 12, 2019
Blog: Digital Farmhand - Samoa Trials
Jan 16, 2019
Blog: Digital Farmhand and Swagbot - Orchard Trials
Jan 15, 2019
Blog: Digital Farmhand - Fiji Trials
Jan 15, 2019
Blog: RIPPA Functionality & Industry Update 2018
Jun 25, 2018
Blog: Modelling light interception in orchards
Jun 13, 2018
Blog: The Centimetre Bot
Dec 06, 2017
Blog: Automatic Dairy Cattle Lameness Detection System
Nov 17, 2017
Blog: SwagBot demonstrated at Nevertire
Oct 30, 2017
Blog: RIPPA Crop Interaction and Solar Endurance Field Trial
Oct 19, 2017
Blog: Mapping and counting mango fruit in orchards with machine vision, lidar and robotics
Sep 25, 2017
Blog: New Ladybird Video
Jul 26, 2017
Blog: Digital Farmhand Video
Jul 17, 2017
Blog: ACFR demonstrates the Digital Farmhand platform to local farmers
Jun 25, 2017
Blog: Robotic Arm With Pruner
May 22, 2017