AI Breakthrough Speeds Up Crop Breeding with Minimal Human Input

A novel artificial intelligence (AI) approach developed by scientists at the University of Illinois Urbana-Champaign is poised to transform crop breeding by dramatically reducing the labor required to monitor flowering traits in plants. By harnessing aerial drone imagery and a self-training machine-learning model, the research team has made it possible to identify flowering stages in grasses with minimal human supervision.

The study, led by Andrew Leakey, professor of plant biology and crop sciences, and Sebastian Varela, a scientist at the Center for Advanced Bioenergy and Bioproducts Innovation, focused on thousands of Miscanthus grass varieties—each with unique flowering timelines and characteristics. The research, published in Plant Physiology, demonstrates how this new method can streamline large-scale agricultural field studies.

“Tracking flowering time is vital to understanding productivity and environmental adaptation in many crops,” said Leakey. “But manually inspecting thousands of plants in field trials is incredibly labor-intensive and unsustainable at scale.”

To automate this process, the team employed drones to collect aerial imagery of Miscanthus fields during the 2020 growing season. However, traditional AI models often struggle with image variability across different environments or stages of plant development, typically requiring thousands of manually labeled training images for each new scenario. According to Leakey, this has hindered widespread adoption of AI tools in agricultural research.

To overcome this bottleneck, Varela introduced an innovative solution using a type of machine learning architecture known as a generative adversarial network (GAN). GANs involve two models working in tandem—one generates synthetic images while the other evaluates them, improving their performance over time. This adversarial training allows the system to develop a deep understanding of visual features related to flowering without relying heavily on annotated datasets.
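The adversarial loop described above can be sketched in a toy form. The following NumPy example is purely illustrative and not from the paper: it replaces drone images with a one-dimensional statistic, uses a linear generator and a logistic discriminator, and alternates the two updates that define GAN training.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# "Real" data: a stand-in scalar statistic drawn from N(3, 1).
def sample_real(n):
    return rng.normal(3.0, 1.0, n)

a, b = 1.0, 0.0      # generator g(z) = a*z + b
w, c = 0.1, 0.0      # discriminator d(x) = sigmoid(w*x + c)
lr, batch = 0.05, 64

for step in range(2000):
    z = rng.normal(size=batch)
    x_fake = a * z + b
    x_real = sample_real(batch)

    # Discriminator update: push d(real) toward 1 and d(fake) toward 0.
    d_real = sigmoid(w * x_real + c)
    d_fake = sigmoid(w * x_fake + c)
    grad_w = np.mean(-(1 - d_real) * x_real + d_fake * x_fake)
    grad_c = np.mean(-(1 - d_real) + d_fake)
    w -= lr * grad_w
    c -= lr * grad_c

    # Generator update (non-saturating loss): push d(fake) toward 1.
    d_fake = sigmoid(w * (a * z + b) + c)
    grad_a = np.mean(-(1 - d_fake) * w * z)
    grad_b = np.mean(-(1 - d_fake) * w)
    a -= lr * grad_a
    b -= lr * grad_b

print(f"generated mean ~ {b:.2f} (real mean 3.0)")
```

After training, the generator's output mean drifts toward the real mean, which is the sense in which the two models "improve each other" without any human labels.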

Varela’s modified system, called the "efficiently supervised generative and adversarial network" (ESGAN), builds visual expertise by learning to identify real versus generated images, significantly cutting down the number of human-labeled examples required for accurate training.
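The paper's exact ESGAN formulation is not reproduced here, but the general semi-supervised GAN idea it builds on can be sketched: the discriminator predicts K flowering-stage classes plus one extra "fake" class, so large pools of unlabeled and generated images train the same network that a small labeled set teaches to classify. A NumPy sketch of that combined loss, with random arrays standing in for network outputs:

```python
import numpy as np

rng = np.random.default_rng(1)

def softmax(logits):
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

K = 2  # flowering stages, e.g. pre-heading and post-heading
# Discriminator emits K+1 logits per image: K stage classes + 1 "fake" class.
logits_labeled = rng.normal(size=(4, K + 1))    # few human-labeled real images
labels = np.array([0, 1, 0, 1])                 # the scarce annotations
logits_unlabeled = rng.normal(size=(32, K + 1)) # many unlabeled real images
logits_generated = rng.normal(size=(32, K + 1)) # GAN-generated images

# Supervised term: cross-entropy on the handful of labeled images.
p_lab = softmax(logits_labeled)
loss_sup = -np.mean(np.log(p_lab[np.arange(len(labels)), labels]))

# Unsupervised terms: real images should avoid the fake class,
# generated images should land in it.
p_unl = softmax(logits_unlabeled)
p_gen = softmax(logits_generated)
loss_unsup = (-np.mean(np.log(1.0 - p_unl[:, K] + 1e-8))
              - np.mean(np.log(p_gen[:, K] + 1e-8)))

loss = loss_sup + loss_unsup
print(f"combined discriminator loss: {loss:.3f}")
```

The unsupervised terms need no annotation at all, which is why such a discriminator can learn strong visual features from far fewer labeled examples.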

The results were striking. ESGAN matched the accuracy of conventional fully supervised models while requiring up to 100 times fewer annotated images. Grad-CAM activation maps generated during the experiments, which color-code the image regions driving each prediction, confirmed that the system identifies pre-heading and post-heading stages from relevant visual cues in the drone imagery.
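Grad-CAM itself is simple to sketch: each channel of the last convolutional layer is weighted by its spatially averaged gradient, the weighted maps are summed, and only positive evidence is kept. Below, random arrays stand in for the activations and gradients a trained network would supply; the shapes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-ins for quantities a CNN would provide:
# A: last-conv-layer activations, shape (channels, H, W)
# dY_dA: gradient of the predicted class score w.r.t. those activations
A = rng.random((8, 7, 7))
dY_dA = rng.normal(size=(8, 7, 7))

# Weight each channel by its spatially averaged gradient, sum the
# weighted maps, then apply ReLU to keep only positive evidence.
weights = dY_dA.mean(axis=(1, 2))               # one weight per channel
cam = np.maximum(np.tensordot(weights, A, axes=1), 0.0)

# Normalize to [0, 1] so the map can be overlaid on the drone image.
if cam.max() > 0:
    cam = cam / cam.max()
print(cam.shape)  # (7, 7) heat map of flowering-relevant regions
```

Upsampled to the input resolution, this heat map is what lets researchers check that the model attends to the plants rather than to background artifacts.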

“The reduction in effort is substantial,” the researchers reported, noting that ESGAN offers a scalable and flexible solution for agricultural AI applications across varying climates, species, and field conditions.

Leakey and Varela are now collaborating with Miscanthus breeder Erik Sacks to apply this method across a multistate breeding program. The project aims to develop Miscanthus varieties that are regionally adapted and suitable for use as biofuel feedstocks on marginal agricultural land.

“Our goal is to help accelerate the adoption of AI tools in crop improvement, enabling broader advances in plant science and contributing to the growing bioeconomy,” Leakey said.

Leakey is also affiliated with the Carl R. Woese Institute for Genomic Biology, the Institute for Sustainability, Energy and Environment, and the Center for Digital Agriculture at the University of Illinois.

This advancement paves the way for future innovations in digital agriculture, particularly in reducing the costs and time associated with data collection and AI training. By minimizing the reliance on manual image annotation, ESGAN could become a cornerstone tool in modern crop science.

Source: https://phys.org/news/2025-04-approach-ai-vision-crop.html
