Autonomy

How the AutoCart® Perception System is Made

AutoCart® is the first Driverless Ag Technology for grain cart harvest operations, and Raven officially opened pre-orders for the system last month! It allows the farmer to monitor and operate a driverless tractor from the cab of the harvester. With AutoCart, the user can set a field plan, establish staging locations, adjust speeds, monitor location activity, and command the tractor pulling a grain cart to sync with the harvester. The harvester can offload on the go in the field, and the tractor then returns to a predetermined unloading area, all without a second driver. But how is AutoCart made? Here's a look behind the scenes of this revolutionary ag technology.
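To picture that workflow, here is a minimal sketch of one sync-and-unload cycle modeled as a simple state machine. The state names and event triggers are illustrative assumptions for this article, not the actual AutoCart implementation.

```python
from enum import Enum, auto

class CartState(Enum):
    """Hypothetical states for one grain-cart cycle."""
    STAGED = auto()      # parked at a staging location, waiting for a call
    SYNCING = auto()     # driving to match the harvester's speed and position
    UNLOADING = auto()   # alongside the harvester, receiving grain on the go
    RETURNING = auto()   # hauling grain back to the predetermined unloading area

def next_state(state: CartState, event: str) -> CartState:
    """Advance the cycle on operator or system events (assumed event names)."""
    transitions = {
        (CartState.STAGED, "operator_calls_cart"): CartState.SYNCING,
        (CartState.SYNCING, "speed_and_position_matched"): CartState.UNLOADING,
        (CartState.UNLOADING, "cart_full_or_released"): CartState.RETURNING,
        (CartState.RETURNING, "arrived_at_unload_area"): CartState.STAGED,
    }
    return transitions.get((state, event), state)

# Example: one full cycle driven by commands from the harvester cab.
state = CartState.STAGED
for event in ["operator_calls_cart", "speed_and_position_matched",
              "cart_full_or_released", "arrived_at_unload_area"]:
    state = next_state(state, event)
    print(event, "->", state.name)
```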

[Image: AutoCart and a CNHi combine]

Quincy Milloy has been working on AutoCart for nearly five years and has played an integral role in the system's development. Today, as a Staff Design Engineer, he primarily assists with the architecture of the Raven Autonomy™ product line.

How is the AutoCart perception system made?

Quincy: The perception system hardware is built and tested in-house to ensure the quality of each component. Specialized test rigs verify that all the peripherals on the main controller are functioning properly. Each component has undergone environmental testing to ensure a high level of performance when mounted on a vehicle.

The software is an ever-evolving component of the system. Each unit is capable of pushing data back up to our machine learning pipeline. Once in the pipeline, the data is processed, tagged, and run through machine learning model training so our system can continue to improve.
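As a rough illustration of that feedback loop, the sketch below shows how uploaded field captures might be tagged and gathered into a training batch. The schema, function names, and labels are hypothetical; the actual Raven pipeline is not described in this article.

```python
from dataclasses import dataclass, field

@dataclass
class FieldSample:
    """One sensor capture uploaded from a unit in the field (hypothetical schema)."""
    unit_id: str
    image_path: str
    tags: list[str] = field(default_factory=list)

def tag_sample(sample: FieldSample) -> FieldSample:
    # Placeholder for the labeling step; a real pipeline would attach class labels
    # (obstacle, crop, person, etc.) produced by annotators or auto-labeling.
    sample.tags.append("unlabeled")
    return sample

def build_training_batch(samples: list[FieldSample]) -> list[FieldSample]:
    """Process and tag uploaded samples so they can feed model training."""
    return [tag_sample(s) for s in samples if s.image_path]

# Example: two captures pushed back up by one unit.
uploads = [FieldSample("unit-01", "captures/0001.jpg"),
           FieldSample("unit-01", "captures/0002.jpg")]
print(build_training_batch(uploads))
```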

How is the perception system installed?

Quincy: The main brains of the operation are generally installed in a protected spot on the vehicle. The perception system uses front-facing cameras that detect obstacles in the path of the vehicle, as well as hazard cameras that keep the tractor from moving when someone is in an exclusion zone around the vehicle. These cameras need to be mounted high on the vehicle to give them a sufficient field of view. We also use a radar system mounted on the front of the vehicle to more accurately measure the distance to obstacles.
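As a sketch of how those sensor roles could fit together, the toy function below holds the tractor whenever someone is inside an exclusion zone and otherwise uses a radar range to decide whether a camera-detected obstacle is close enough to stop for. The thresholds, parameter names, and logic are assumptions for illustration, not the production safety system.

```python
def allow_motion(person_in_exclusion_zone: bool,
                 camera_obstacle_detected: bool,
                 radar_range_m: float | None,
                 stop_distance_m: float = 15.0) -> bool:
    """Decide whether the tractor may keep moving (toy logic, assumed threshold)."""
    if person_in_exclusion_zone:
        return False  # hazard cameras veto motion outright
    if camera_obstacle_detected and radar_range_m is not None:
        # Radar refines the distance to the obstacle the camera flagged.
        return radar_range_m > stop_distance_m
    return not camera_obstacle_detected

# Examples: obstacle 40 m ahead -> keep moving; person in the zone -> hold.
print(allow_motion(False, True, 40.0))  # True
print(allow_motion(True, False, None))  # False
```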

How do you achieve autonomy? What does it all entail?

Quincy: A big aspect of autonomy is that you need to be able to perceive the world around you. We do that through a combination of sensors: cameras and other sensors around the vehicle that help it perceive its surroundings. The hard part isn't getting the sensor data itself; it's what you do with that data and how the system uses it to classify and understand the world around it.
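To make that point concrete, the snippet below sketches the last step of that chain: mapping a classified detection to a vehicle response. The object classes, confidence threshold, and responses are hypothetical choices for illustration.

```python
# Hypothetical mapping from a perception model's object class to a vehicle response.
RESPONSE_BY_CLASS = {
    "person": "stop",
    "vehicle": "stop",
    "standing_crop": "continue",    # expected in the path during harvest
    "unknown": "slow_and_reassess",
}

def respond_to_detection(object_class: str, confidence: float,
                         min_confidence: float = 0.6) -> str:
    """Choose a response for one classified detection (illustrative thresholds)."""
    if confidence < min_confidence:
        return "slow_and_reassess"  # low confidence is treated cautiously
    return RESPONSE_BY_CLASS.get(object_class, "slow_and_reassess")

print(respond_to_detection("person", 0.95))         # stop
print(respond_to_detection("standing_crop", 0.90))  # continue
```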

Of course, autonomy is not just about perception. We have many critical algorithms, such as path planners and communications strategies, that help ensure a smooth operation. All of this technology has to be pulled together with a cohesive user experience that enables the customer to get something done!
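As one small example of the planning side, the sketch below generates evenly spaced waypoints from the cart's position to a goal such as a staging location. It is a toy straight-line planner under assumed field coordinates, not Raven's path planner.

```python
import math

def straight_line_plan(start: tuple[float, float],
                       goal: tuple[float, float],
                       spacing_m: float = 5.0) -> list[tuple[float, float]]:
    """Toy planner: evenly spaced waypoints from start to goal (ignores obstacles)."""
    dx, dy = goal[0] - start[0], goal[1] - start[1]
    steps = max(1, int(math.hypot(dx, dy) // spacing_m))
    return [(start[0] + dx * i / steps, start[1] + dy * i / steps)
            for i in range(steps + 1)]

# Example: plan from the cart's current position to an assumed staging location.
for waypoint in straight_line_plan((0.0, 0.0), (30.0, 40.0)):
    print(waypoint)
```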

Can you turn any modern tractor into an autonomous tractor?

Quincy: You asked an engineer, so the answer is technically yes, but the real question is how much you have to add to make that happen. You need a perception system (sensors), a communication suite (antennas), and a speed controller on the vehicle.
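One way to picture that retrofit is as a checklist of subsystems that all have to be present before the tractor can run autonomously. The class below mirrors the three items named in the answer; the structure itself is an illustrative assumption.

```python
from dataclasses import dataclass

@dataclass
class RetrofitKit:
    """Subsystems a tractor needs before it can run autonomously (illustrative)."""
    has_perception_sensors: bool  # cameras and radar
    has_comms_antennas: bool      # link back to the harvester cab
    has_speed_controller: bool    # control of propulsion and braking

    def ready_for_autonomy(self) -> bool:
        return all((self.has_perception_sensors,
                    self.has_comms_antennas,
                    self.has_speed_controller))

# Example: a tractor with sensors and antennas but no speed controller is not ready.
print(RetrofitKit(True, True, False).ready_for_autonomy())  # False
```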

An incredible feature that the team developed is the ability to drive the tractor manually as well as in autonomous mode, creating a seamless user experience. We are looking forward to seeing it in action when our first AutoCart units are delivered later this year!