BACKGROUND: In recent years, quantitative analysis of root growth has become increasingly important as a way to explore the influence of abiotic stresses such as high temperature and drought on a plant's ability to take up water and nutrients. Segmentation and feature extraction of plant roots from images present a significant computer vision challenge: root images contain complicated structures and vary in size, background, occlusion, clutter, and lighting conditions. We present a new image analysis approach that provides fully automatic extraction of complex root system architectures from a range of plant species in varied imaging set-ups. Driven by modern deep-learning approaches, RootNav 2.0 replaces previous manual and semi-automatic feature extraction with an extremely deep multi-task convolutional neural network architecture. The network also locates seeds and first- and second-order root tips to drive a search algorithm that seeks optimal paths through the image, extracting accurate architectures without user interaction. RESULTS: We develop and train a novel deep network architecture that explicitly combines local pixel information with global scene information in order to accurately segment small root features across high-resolution images. The proposed method was evaluated on images of wheat (Triticum aestivum L.) from a seedling assay. Compared with semi-automatic analysis via the original RootNav tool, the proposed method demonstrated comparable accuracy with a 10-fold increase in speed. The network was able to adapt to different plant species via transfer learning, offering similar accuracy when transferred to an Arabidopsis thaliana plate assay. A final instance of transfer learning, to images of Brassica napus from a hydroponic assay, still demonstrated good accuracy despite far fewer training images. CONCLUSIONS: We present RootNav 2.0, a new approach to root image analysis driven by a deep neural network.
The tool can be adapted to new image domains with a reduced number of training images, and offers substantial speed improvements over semi-automatic and manual approaches. It outputs root architectures in the widely accepted RSML standard, for which numerous analysis packages exist (http://rootsystemml.github.io/), as well as segmentation masks compatible with other automated measurement tools. The tool will give researchers the ability to analyse root systems at larger scales than ever before, at a time when large-scale genomic studies have made this more important than ever.
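The abstract describes a search algorithm that traces optimal root paths through the image once the network has located seeds and root tips. As an illustration of the general idea only (not RootNav 2.0's actual implementation), the sketch below runs Dijkstra's algorithm over a toy cost grid, where cost is derived from an assumed per-pixel root probability so that the cheapest path follows the most root-like pixels from a seed position to a tip position.

```python
import heapq

def least_cost_path(cost, start, goal):
    """Dijkstra's algorithm over a 2-D cost grid (4-connected).

    cost[r][c] is the price of stepping onto cell (r, c); lower-cost
    cells correspond to pixels the network believes are root.
    Returns the list of (row, col) cells from start to goal.
    """
    rows, cols = len(cost), len(cost[0])
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        r, c = node
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = node
                    heapq.heappush(heap, (nd, (nr, nc)))
    # Walk predecessors back from the goal to recover the path.
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]

# Toy "probability map": high root probability down the middle column.
prob = [
    [0.1, 0.9, 0.1],
    [0.1, 0.9, 0.1],
    [0.1, 0.9, 0.1],
]
cost = [[1.0 - p for p in row] for row in prob]
path = least_cost_path(cost, (0, 1), (2, 1))  # seed at top, tip at bottom
```

In this toy grid the cheapest route from seed to tip stays on the high-probability middle column; on a real probability map produced by a segmentation network, the same principle recovers a plausible root centreline.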
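Since the tool emits architectures in the XML-based RSML standard, downstream traits can be computed with an ordinary XML parser. The sketch below is a minimal, hand-written example: the fragment and the `root_lengths` helper are illustrative assumptions (element names `rsml`, `scene`, `plant`, `root`, `geometry`, `polyline`, `point` follow the published RSML schema), not part of RootNav 2.0 itself.

```python
import math
import xml.etree.ElementTree as ET

# A minimal hand-written RSML-like fragment describing one root as a
# polyline of (x, y) points.
RSML = """<?xml version="1.0"?>
<rsml>
  <scene>
    <plant>
      <root ID="1">
        <geometry>
          <polyline>
            <point x="0" y="0"/>
            <point x="0" y="3"/>
            <point x="4" y="6"/>
          </polyline>
        </geometry>
      </root>
    </plant>
  </scene>
</rsml>"""

def root_lengths(rsml_text):
    """Return {root ID: polyline length} for every root in the document."""
    doc = ET.fromstring(rsml_text)
    lengths = {}
    for root in doc.iter("root"):
        pts = [(float(p.get("x")), float(p.get("y")))
               for p in root.iter("point")]
        # Sum Euclidean distances between consecutive polyline points.
        lengths[root.get("ID")] = sum(
            math.hypot(x2 - x1, y2 - y1)
            for (x1, y1), (x2, y2) in zip(pts, pts[1:]))
    return lengths

lengths = root_lengths(RSML)
```

Dedicated RSML analysis packages (linked above) provide far richer trait extraction; this only shows that the output format is plain XML and easy to post-process.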