Drones and AI: The New Age of Cotton Production

Producers know that harvesting cotton at the right time – when leaves are shed and cotton bolls burst open – ensures high quality and yield. A new machine learning model that tracks plant aging over time can assist them in determining the right moment for harvest.

Two images at the top of the figure show a cotton field and boll at the start of the growing season. The leaves are green and the boll is immature and only partially opened. Two images at the bottom of the figure show a cotton field and boll at the end of the growing season. Plants are brown and leafless, and the boll is mature. The boll is open, has a dry appearance, and the cotton fibers are white and fluffy.
Figure 1: Immature cotton boll and plants (top), mature cotton boll and plants ready for harvest (bottom).

As in most plants, cotton senescence follows a predictable sequence of events. Its timing, however, is influenced by a variety of factors, including genotype, environmental conditions, and management practices. Different genotypes senesce at different rates, which affects how quickly plants mature. Environmental stresses such as extreme temperatures, drought, and nutrient deficiencies can accelerate senescence, while management practices such as irrigation, fertilization, and pruning can mitigate these stress effects and help optimize the senescence process.

By analyzing senescence data, producers can choose the most suitable genotypes and management strategies to enhance cotton yield and fiber quality. This information is also valuable for plant breeders aiming to develop more resilient cultivars.

Collecting this data is challenging because the cotton maturation period can span weeks to months. Observing plants at isolated intervals fails to offer a comprehensive view of the senescence process, so continuous data collection is essential for a more accurate understanding. However, traditional methods are often time-consuming and labor-intensive, typically relying on hand-held tools to measure chlorophyll content.

Recently, high-throughput phenotyping has gained prominence due to rapid advancements in platform and sensor technologies, as well as data analytics methodologies. Drones equipped with various sensors fly over agricultural fields to collect large volumes of high-resolution images. However, issues like uneven lighting and spatial variation can introduce random errors and irrelevant information, obscuring important patterns in the dataset. Convolutional Neural Networks (CNNs), a type of deep learning model used for processing and analyzing images, can help address these challenges. For example, they can be used to distinguish plant material from soil and to recognize patterns of pigmentation.

A new paper published in in silico Plants highlights how deep learning methods can tackle these challenges. PhD candidate Aaron DeSalvio and colleagues at Texas A&M University developed the first CNN designed for single-plant analysis of senescence over time using aerial images of a field-grown crop.

DeSalvio explained the significance of gathering data from single plants. “Phenotyping is typically done at a plot level. By tracking change of single plants rather than a plot, statistical power increases while the size of the research field remains the same. Additionally, researchers can try to identify varieties that demonstrate more (or less) uniformity in their phenotypes across the growing season by tracking single-plant replicates of the same variety.”

The authors grew cotton genotypes with varying rates of senescence and captured images using a drone-mounted camera throughout the growing season. Currently, the gold standard for quantifying senescence from this type of image involves visual senescence ratings. Each plant's images were manually assigned ratings from 0% (completely green) to 100% (completely dead), a process that is quite time-consuming.

To tackle this issue, the authors developed CNN models capable of quantifying senescence as effectively as visual ratings. The models were trained on values derived from the visual senescence ratings and on vegetation indices, which are quantitative indicators of senescence progression calculated from color intensity measurements in the images.
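As a rough illustration of how such an index is computed (not the authors' actual pipeline), the short Python sketch below calculates the widely used Excess Green (ExG) vegetation index from an RGB image crop; the choice of index, the function name, and the placeholder image are assumptions for demonstration only.

    import numpy as np

    def excess_green(rgb):
        # Excess Green (ExG) vegetation index: 2g - r - b, where r, g, b are
        # the RGB channels normalized so that r + g + b = 1 at each pixel.
        # Higher values indicate greener (less senesced) tissue.
        rgb = rgb.astype(np.float64)
        total = rgb.sum(axis=-1, keepdims=True)
        total[total == 0] = 1.0  # avoid division by zero on black pixels
        r, g, b = np.moveaxis(rgb / total, -1, 0)
        return 2 * g - r - b

    # Summarize one plant's image crop by its mean ExG value
    crop = np.random.randint(0, 256, size=(128, 128, 3), dtype=np.uint8)  # stand-in for a drone image crop
    print(f"Mean ExG: {excess_green(crop).mean():.3f}")

Tracking how such a per-plant summary value changes between drone flights provides a simple numerical signature of senescence progression.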

To enable temporal analysis, the authors used an innovative method for handling time-series images: stacking the images of each individual plant captured at different time points into a single "temporal image sandwich" before feeding it into the CNN.

“This method allowed the CNNs to incorporate temporal dynamics into their training and analysis. This meant that all time points belonging to a single plant could be evaluated by the model simultaneously, which allowed it to detect differences in the progression of senescence between plants, not just differences at isolated times during the season.”
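To make the stacking idea concrete, the sketch below concatenates several time-point crops of the same plant along the channel axis and passes the resulting "sandwich" through a small CNN that outputs a single senescence score. The number of flights, crop size, and network architecture are placeholder assumptions for illustration and do not reproduce the authors' models.

    import torch
    import torch.nn as nn

    # Four drone flights, each contributing a 64x64 RGB crop of the same plant.
    # Concatenating along the channel axis yields one (4*3)-channel input.
    T, H, W = 4, 64, 64
    crops = [torch.rand(3, H, W) for _ in range(T)]  # placeholder crops, one per time point
    sandwich = torch.cat(crops, dim=0)               # shape: (T*3, H, W)

    class SenescenceCNN(nn.Module):
        # Tiny CNN that regresses one senescence value from a stacked input.
        def __init__(self, in_channels):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(in_channels, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.head = nn.Linear(32, 1)  # e.g. a visual rating or vegetation index value

        def forward(self, x):
            return self.head(self.features(x).flatten(1))

    model = SenescenceCNN(in_channels=T * 3)
    score = model(sandwich.unsqueeze(0))  # add a batch dimension
    print(score.shape)                    # torch.Size([1, 1])

Because every flight is present in the same input tensor, the convolution filters can respond to change between time points, not just to the appearance of the plant on a single date.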

Six CNN models were developed using different training data. Among these, two reliably predicted visual senescence ratings and vegetation index values from the stacked images. Additionally, the models captured how different genotypes senesce over the course of the growing season (see Figure 2). This demonstrates that the model could be used in the future to identify new genetic factors that influence senescence within a genetically diverse population.

A figure with two panels: genotypes exhibiting rapid senescence on the left and stay-green genotypes on the right. Both panels show a graph with time (days after transplant) on the X axis and predicted senescence score on the Y axis, alongside four images from drone flights depicting the senescence trajectories. Most replicates of the three rapid-senescence genotypes reach the maximum predicted senescence score of 5, corresponding to full senescence, after 110 days. In contrast, the senescence scores of most replicates of the three stay-green genotypes peak around 120 days at scores between zero and three, and most decline (re-green) after the peak. The images of the rapid-senescence genotypes progress from green to brown plants, while the stay-green genotypes remain green.
Figure 2: The model was able to capture differences in senescence trajectories between two genotypes (top). Sample time series images from drone flights showing the senescence trajectories of the two genotypes (bottom).

DeSalvio concludes, “As the demands to identify and breed for resilient crops intensify, breeding programs need scalable tools to collect data about varieties that surpass end-season measurements. Methods such as the CNN model described here can analyze and categorize plants by their life trajectories, enabling the selection of varieties whose developmental trajectories are adapted to specific environments.”


READ THE ARTICLE:

Aaron J DeSalvio, Alper Adak, Mustafa A Arik, Nicholas R Shepard, Serina M DeSalvio, Seth C Murray, Oriana García-Ramos, Himabindhu Badavath, David M Stelly, Temporal image sandwiches enable link between functional data analysis and deep learning for single-plant cotton senescence, in silico Plants, Volume 6, Issue 2, 2024, diae019, https://doi.org/10.1093/insilicoplants/diae019
