Xu Wang1, Hong Xuan2, Byron Evers1, Sandesh Shrestha1, Robert Pless2, Jesse Poland1. 1. Department of Plant Pathology, Kansas State University, 4024 Throckmorton PSC, 1712 Claflin Road, Manhattan, KS 66506, USA. 2. Department of Computer Science, George Washington University, 4000 Science and Engineering Hall, 800 22nd Street NW, Washington, DC 20052, USA.
Abstract
BACKGROUND: Measurement of plant traits with precision and speed on large populations has emerged as a critical bottleneck in connecting genotype to phenotype in genetics and breeding. This bottleneck limits advances in understanding plant genomes and the development of improved, high-yielding crop varieties.
RESULTS: Here we demonstrate the application of deep learning to proximal imaging from a mobile field vehicle to directly estimate plant morphology and developmental stages in wheat under field conditions. We developed and trained a convolutional neural network with image datasets labeled from expert visual scores and used this "breeder-trained" network to classify wheat morphology and developmental stages. For both a morphological trait (awned) and a phenological trait (flowering time), we demonstrate high heritability and very high accuracy against the "ground-truth" values from visual scoring. Using the traits predicted by the network, we tested genotype-to-phenotype association and uncovered novel epistatic interactions for flowering time. Enabled by the time-series high-throughput phenotyping, we describe a new phenotype, the rate of flowering, and show heritable genetic control for this trait.
CONCLUSIONS: We demonstrated a field-based high-throughput phenotyping approach using deep learning that can directly measure morphological and developmental phenotypes in genetic populations from field-based imaging. The deep learning approach presented here represents a conceptual advance in high-throughput plant phenotyping because it can potentially estimate any trait in any plant species for which the combination of breeder scores and high-resolution images can be obtained, capturing the expert knowledge of breeders, geneticists, pathologists, and physiologists to train the networks.
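The rate-of-flowering phenotype described above can be derived from time-series, plot-level flowering fractions such as those produced by an image classifier. The following is a minimal NumPy sketch of one plausible derivation, not the authors' implementation: heading date is taken as the interpolated day the flowering fraction crosses 50%, and flowering rate as the change in fraction per day between the 10% and 90% crossings. Function names, thresholds, and the example data are illustrative assumptions.

```python
import numpy as np

def crossing_day(days, frac, threshold):
    """Linearly interpolate the day on which the flowering fraction
    first reaches `threshold` (assumes `frac` is non-decreasing)."""
    days = np.asarray(days, dtype=float)
    frac = np.asarray(frac, dtype=float)
    # First index where frac >= threshold, clamped to a valid segment
    idx = int(np.searchsorted(frac, threshold))
    idx = min(max(idx, 1), len(frac) - 1)
    d0, d1 = days[idx - 1], days[idx]
    f0, f1 = frac[idx - 1], frac[idx]
    return d0 + (threshold - f0) * (d1 - d0) / (f1 - f0)

def flowering_phenotypes(days, frac):
    """Return (heading date at 50% flowering, flowering rate per day
    between the 10% and 90% crossings)."""
    d50 = crossing_day(days, frac, 0.5)
    d10 = crossing_day(days, frac, 0.1)
    d90 = crossing_day(days, frac, 0.9)
    rate = 0.8 / (d90 - d10)  # fraction flowered per day
    return d50, rate

# Illustrative data: flowering fraction for one plot across imaging dates
days = [130, 133, 136, 139, 142]   # day of year
frac = [0.0, 0.1, 0.4, 0.8, 1.0]   # fraction of plot flowering
d50, rate = flowering_phenotypes(days, frac)
```

For the example plot, the 50% crossing falls at day 136.75 and the 10%-to-90% window spans 7.5 days, giving a rate of roughly 0.107 per day. Scoring many plots this way yields per-plot phenotypes suitable for heritability and association analyses.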