
Raspberry Pi-powered imaging for plant phenotyping.

Jose C Tovar, J Steen Hoyer, Andy Lin, Allison Tielking, Steven T Callen, S Elizabeth Castillo, Michael Miller, Monica Tessman, Noah Fahlgren, James C Carrington, Dmitri A Nusinow, Malia A Gehan.

Abstract

PREMISE OF THE STUDY: Image-based phenomics is a powerful approach to capture and quantify plant diversity. However, commercial platforms that make consistent image acquisition easy are often cost-prohibitive. To make high-throughput phenotyping methods more accessible, low-cost microcomputers and cameras can be used to acquire plant image data.
METHODS AND RESULTS: We used low-cost Raspberry Pi computers and cameras to manage and capture plant image data. Detailed here are three different applications of Raspberry Pi-controlled imaging platforms for seed and shoot imaging. Images obtained from each platform were suitable for extracting quantifiable plant traits (e.g., shape, area, height, color) en masse using open-source image processing software such as PlantCV.
CONCLUSIONS: This protocol describes three low-cost platforms for image acquisition that are useful for quantifying plant diversity. When coupled with open-source image processing tools, these imaging platforms provide viable low-cost solutions for incorporating high-throughput phenomics into a wide range of research programs.

Keywords:  Raspberry Pi; imaging; low‐cost phenotyping; morphology

Year:  2018        PMID: 29732261      PMCID: PMC5895192          DOI: 10.1002/aps3.1031

Source DB:  PubMed          Journal:  Appl Plant Sci        ISSN: 2168-0450            Impact factor:   1.936


Image‐based high‐throughput phenotyping has been heralded as a solution for measuring diverse traits across the plant tree of life (Araus and Cairns, 2014; Goggin et al., 2015). In general, there are five steps in image‐based plant phenotyping: (1) image and metadata acquisition, (2) data transfer, (3) image segmentation (separation of target object and background), (4) trait extraction (object description), and (5) group‐level data analysis. Image segmentation, trait extraction, and data analysis are the most time‐consuming steps of the phenotyping process, but protocols that increase the speed and consistency of image and metadata acquisition greatly speed up downstream analysis steps. Commercial high‐throughput phenotyping platforms are powerful tools to collect consistent image data and metadata, and are even more effective when designed for targeted biological questions (Topp et al., 2013; Chen et al., 2014; Honsdorf et al., 2014; Yang et al., 2014; Al‐Tamimi et al., 2016; Pauli et al., 2016; Feldman et al., 2017; Zhang et al., 2017). However, commercial phenotyping platforms are cost‐prohibitive to many laboratories and institutions. There is also no such thing as a “one‐size‐fits‐all” phenotyping system; different biological questions often require different hardware configurations. Therefore, low‐cost technologies that can be used and repurposed for a variety of phenotyping applications are of great value to the plant community. Raspberry Pi computers are small (credit card sized or smaller), low‐cost, and were originally designed for educational purposes (Upton and Halfacree, 2014). Several generations of Raspberry Pi single‐board computers have been released, and most models now feature built‐in modules for wireless and Bluetooth connectivity (Monk, 2016). The Raspberry Pi Foundation (Cambridge, United Kingdom) also releases open‐source software and accessories such as camera modules (five and eight megapixel). 
Additional sensors or controllers can be connected via USB ports and general‐purpose input/output pins. A strong online community of educators and hobbyists provides support (including project ideas and documentation), and a growing population of researchers use Raspberry Pi computers for a wide range of applications including phenotyping. We and others (e.g., Huang et al., 2016; Mutka et al., 2016; Minervini et al., 2017) have utilized Raspberry Pi computers in a number of configurations to streamline collection of image data and metadata. Here, we document three different methods for using Raspberry Pi computers for plant phenotyping (Fig. 1). These protocols are a valuable resource because, while many papers outline phenotyping systems in detail (Granier et al., 2006; Iyer‐Pascuzzi et al., 2010; Jahnke et al., 2016; Shafiekhani et al., 2017), few provide step‐by‐step instructions for building them (Bodner et al., 2017; Minervini et al., 2017). We provide examples illustrating automation of photo capture with open‐source tools (based on the Python programming language and standard Linux utilities). Furthermore, to demonstrate that these data are of high quality and suitable for quantitative trait extraction, we segmented example image data (plant isolated from background) using the open‐source, open‐development phenotyping software PlantCV (Fahlgren et al., 2015).
Figure 1

Low‐cost Raspberry Pi phenotyping platforms. (A) Raspberry Pi time‐lapse imaging in a growth chamber. (B) Raspberry Pi camera stand. (C) Raspberry Pi multi‐image octagon.


METHODS AND RESULTS

Raspberry Pi initialization

This work describes Raspberry Pi setup (Appendix 1) and three protocols (Appendices 2, 3, 4) that utilize Raspberry Pi computers for low‐cost image‐based phenotyping, and gives examples of the data they produce. Raspberry Pi computers can be reconfigured for different phenotyping projects and can be easily purchased from online retailers. The first application is time‐lapse plant imaging (Appendix 2), the second protocol describes setup and use of an adjustable camera stand for top‐view photography (Appendix 3), and the third project describes construction and use of an octagonal box for acquiring plant images from several angles simultaneously (Appendix 4). All three phenotyping protocols use the same Raspberry Pi initialization procedure, which is provided in Appendix 1. This initialization protocol parallels the Raspberry Pi Foundation's online documentation and provides additional information on setting up passwordless secure shell (SSH) login to a remote host for data transfer and/or to control multiple Raspberry Pis. Passwordless SSH allows one to pull data from the data collection computer to a remote server without having to manually enter login information each time. Reliable data transfer is an important consideration in plant phenotyping projects because, while it is possible to process image data directly on a Raspberry Pi computer, most users will prefer to process large image data sets on a bioinformatics cluster. Remote data transfer is especially important for time‐lapse imaging setups, such as the configuration described in Appendix 2, because data can be generated at high frequency over the course of long experiments, and thus can easily exceed available disk space on the micro secure digital (SD) cards that serve as local hard drives. 
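The pull‐based transfer described above can be sketched in a few lines of Python. This is an illustrative sketch, not the published configuration: the hostnames, directory paths, and rsync options are assumptions, and it presumes passwordless SSH (Appendix 1) is already working.

```python
import subprocess

def build_rsync_cmd(host, remote_dir, local_dir, user="pi"):
    """Assemble the rsync command that pulls new images from one Pi.

    Requires key-based (passwordless) SSH login, as set up in Appendix 1,
    so no password prompt interrupts a scheduled transfer.
    """
    return [
        "rsync",
        "-az",        # archive mode; compress data in transit
        "--partial",  # keep partially transferred files after interruption
        f"{user}@{host}:{remote_dir}/",
        local_dir,
    ]

def pull_from_pis(hosts, remote_dir="/home/pi/images", local_dir="/data/images"):
    """Pull images from every Pi in turn; intended to run from server-side cron."""
    for host in hosts:
        subprocess.run(build_rsync_cmd(host, remote_dir, local_dir), check=True)
```

Running `pull_from_pis` from a server‐side cron job keeps transfers regular without any interactive logins.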
Once one Raspberry Pi has been properly configured and tested, the fully configured operating system can be backed up, yielding a disk image that can be copied (“cloned”) onto as many additional SD cards as are needed for a given phenotyping project (Appendix 1).
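Cloning of this kind is commonly done with the standard Linux dd utility. The helper below only assembles the dd command strings; the device path /dev/sdX is a deliberate placeholder (confirm the real device with lsblk before running anything), and the image filename is an arbitrary example.

```python
def backup_cmd(sd_device="/dev/sdX", image_path="pi-phenotyping.img"):
    """dd command that backs up a configured SD card to a disk image.
    /dev/sdX is a placeholder -- confirm the real device with lsblk first."""
    return f"sudo dd if={sd_device} of={image_path} bs=4M status=progress"

def restore_cmd(image_path="pi-phenotyping.img", sd_device="/dev/sdX"):
    """dd command that clones the saved image onto a fresh SD card."""
    return f"sudo dd if={image_path} of={sd_device} bs=4M status=progress"
```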

Raspberry Pi time‐lapse imaging

Time‐lapse imaging is a valuable tool for documenting plant development and can reveal differences that would not be apparent from endpoint analysis. Raspberry Pi computers and camera modules work effectively as phenotyping systems in controlled‐environment growth chambers, and the low cost of Raspberry Pi computers allows this approach to scale well. Growth chambers differ from (agro)ecological settings but are an essential tool for precise control and reproducible experimentation (Poorter et al., 2016). Time‐lapse imaging with multiple cameras allows for simultaneous imaging of many plants and can capture higher temporal resolution than conveyor belt and mobile‐camera systems. Appendix 2 provides an example protocol for setting up the hardware and software necessary to capture plant images in a growth chamber. The main top‐view imaging setup described is aimed at imaging flats or pots of plants in a growth chamber. We include instructions for adjusting the camera–plant focal distance (yielding higher plant spatial resolution) and describe how to adjust the temporal resolution of imaging. The focal distance can be optimized to the target plant, trait, and degree of precision required; large plant–camera distances allow a larger field of view, at the cost of lower resolution. For traits like plant area, where segmentation of individual plant organs is not critical, adjusting the focal length might not be necessary. Projected leaf area in top‐down photos correlates well with fresh and dry weight, especially for relatively flat plants such as Arabidopsis thaliana (L.) Heynh. (Leister et al., 1999). A stable and level imaging configuration is important for consistent imaging across long experiments and to compare data from multiple Raspberry Pi camera rigs. 
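The resolution trade‐off can be made concrete with a back‐of‐the‐envelope calculation: the ground footprint of one image pixel grows linearly with camera–plant distance. The sketch below assumes nominal specifications for the 5‐megapixel v1 camera module (approximately 3.6 mm focal length and 1.4 µm pixel pitch); check the datasheet for your own module.

```python
def ground_resolution_mm(distance_mm, focal_length_mm=3.6, pixel_pitch_um=1.4):
    """Approximate size of one image pixel at the plant, in mm.

    Defaults are nominal values for the 5-megapixel v1 camera module
    (OV5647 sensor); treat them as illustrative assumptions.
    Pinhole model: ground pixel size = distance * pixel pitch / focal length.
    """
    return distance_mm * (pixel_pitch_um / 1000.0) / focal_length_mm
```

At a 500 mm camera–plant distance this gives roughly 0.19 mm per pixel; doubling the distance doubles the footprint (halving effective spatial resolution) while covering twice the field of view.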
Although there is more than one way to suspend Raspberry Pi camera rigs in a flat and stable top‐view configuration, AC power socket adapters were attached to the back of cases with silicone adhesive (Appendix 2). Raspberry Pi boards and cameras were then encased and screwed into the incandescent bulb sockets built into the growth chamber (Fig. 1). Users with access to a 3D printer may prefer to print cases, so we have provided a link to instructions for printing a suitable case (with adjustable ball‐joint Raspberry Pi camera module mount) in Appendix 2. This type of 3D‐printed case also works well for side‐view imaging of plants grown on plates (Huang et al., 2016; Mutka et al., 2016). For this top‐down imaging example, 12 Raspberry Pi camera rigs were powered through two USB power supplies drawing power (via extension cord and surge protector) from an auxiliary power outlet built into the growth chamber. Although we use 12 Raspberry Pi camera rigs in this example, the setup can be scaled up or down, with a per‐unit cost of approximately US$100. A single Pi camera rig is enough for a new user to get started, and laboratories can efficiently scale up imaging as they develop experience and refine their goals. Time‐lapse imaging was scheduled at five‐minute intervals using the software utility cron. A predictable file‐naming scheme that includes image metadata (field of view number, timestamp, and a common identifier) was employed to confirm that all photo time points were captured and transferred as scheduled. Images were pulled from each Raspberry Pi to a remote server twice per hour (using a standard utility called rsync) by a server‐side cron process using the configuration files described in Appendix 2. Optimizing imaging conditions for maximum consistency can simplify downstream image processing. To aid in image normalization during processing, color standards and size markers can be included in images. 
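A minimal sketch of such a file‐naming helper is shown below; the exact pattern is an illustrative choice, not the authors' published scheme.

```python
from datetime import datetime

def image_filename(camera_id, experiment, when=None):
    """Build a predictable filename embedding the metadata listed above:
    a common experiment identifier, the field-of-view (camera) number,
    and a timestamp."""
    when = when or datetime.now()
    return f"{experiment}_cam{camera_id:02d}_{when.strftime('%Y-%m-%d_%H-%M')}.jpg"

# On each Pi, a crontab entry such as the following would run a capture
# script every five minutes (the script path is illustrative):
#   */5 * * * * /usr/bin/python3 /home/pi/capture.py
```

Because the timestamp is embedded in the name, a simple listing of the transferred files is enough to confirm that no five‐minute time point was missed.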
Placing rubberized blue mesh (e.g., Con‐Tact Brand, Pomona, California, USA) around the base of plants can sometimes simplify segmentation (i.e., distinguishing plant foreground pixels from soil background pixels), although this was not necessary for the A. thaliana example described here. Care should be taken to ensure that large changes in the scene (including gradual occlusion of blue mesh by leaves) do not dramatically alter automatic exposure and color balance settings over the course of an experiment. If automatic exposure becomes an issue, camera settings can be manually set (see Appendix 4). In this example, cameras and flats were set up to yield a similar vantage point (a 4 × 5 grid of pots) in each field of view, such that very similar computational pipelines can be used to process images from all 12 cameras. An example image has been processed with PlantCV (Fahlgren et al., 2015) in Fig. 2, and a script showing and describing each step in the analysis is provided at https://github.com/danforthcenter/apps-phenotyping. Further image‐processing tutorials and tips can be found at http://plantcv.readthedocs.io/en/latest/.
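For readers who want to see what a segmentation step boils down to, the sketch below uses the excess‐green index (ExG = 2G − R − B), a simple stand‐in for the thresholding a PlantCV pipeline performs; the threshold value is arbitrary and would be tuned per imaging setup.

```python
import numpy as np

def excess_green_mask(rgb, threshold=20):
    """Segment green (plant) pixels from background using the
    excess-green index, ExG = 2G - R - B. Not PlantCV's own pipeline;
    an illustrative stand-in, with an arbitrary default threshold."""
    rgb = rgb.astype(np.int32)  # avoid uint8 overflow in the arithmetic
    exg = 2 * rgb[..., 1] - rgb[..., 0] - rgb[..., 2]
    return exg > threshold

def plant_area_px(mask):
    """Projected plant area in pixels is just the number of True pixels."""
    return int(mask.sum())
```

With a size marker in the frame, the pixel count converts directly to physical area.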
Figure 2

Examples of data collected from Raspberry Pi phenotyping platforms that have plant and/or seed tissue segmented using open‐source open‐development software PlantCV (Fahlgren et al., 2015). (A) PlantCV‐segmented image of a flat of Arabidopsis acquired from Raspberry Pi time‐lapse imaging protocol in a growth chamber. (B) PlantCV‐segmented image of quinoa seeds acquired from Raspberry Pi camera stand. (C) Example side‐ and top‐view images of quinoa plants acquired from Raspberry Pi multi‐image octagon. Plant convex hull, width, and length have been identified with PlantCV and are denoted in red.


Raspberry Pi camera stand

An adjustable camera stand is a versatile piece of laboratory equipment for consistent imaging. Appendix 3 is a protocol for pairing a low‐cost home‐built camera stand with a Raspberry Pi computer for data capture and management. Altogether, the camera stand system costs approximately US$750. The camera stand (79 cm width × 82.5 cm height) was built from aluminum framing (80/20 Inc., Columbia City, Indiana, USA) to hold a Nikon COOLPIX L830 camera (Nikon Corporation, Tokyo, Japan) via a standard mount (Fig. 1). For this application, we prefer to use a single‐lens reflex (SLR) digital camera (rather than a Raspberry Pi camera module) for adjustable focus and to improve resolution. The camera was affixed to a movable bar, so the distance between camera and object can be adjusted up to 63 cm. A Python script that utilizes gPhoto2 (Figuière and Niedermann, 2017) for data capture and rsync for data transfer to a remote host is included in the protocol (Appendix 3). When the “camerastand.py” script is run, the user is prompted to enter the filename for the image. The script verifies that the camera is connected to the Raspberry Pi, acquires the image with the SLR camera, retrieves the image from the camera, renames the image file to the user‐provided filename, saves a copy in a local Raspberry Pi directory, and transfers this copy to the desired directory on a remote host. Because image filenames are commonly used as the primary identifier for downstream image processing, it is advised to use a filename that identifies the species, accession, treatment, and replicate, as appropriate. The Python script provided appends a timestamp to the filename automatically. We regularly use this Raspberry Pi Camera Stand to image seeds, plant organs (e.g., inflorescences), and short‐statured plants. For seed images, a white background with a demarcated black rectangular area ensures that separated seeds are in frame, which speeds up the imaging process. 
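The capture step can be approximated with the gPhoto2 command‐line tool, as in the sketch below. This is not the published camerastand.py; the base filename and timestamp format are illustrative assumptions.

```python
import subprocess
from datetime import datetime

def capture_cmd(basename, when=None):
    """gphoto2 invocation that captures a photo on the attached camera
    and downloads it under a timestamped name, approximating the
    capture-rename-save behaviour described for camerastand.py."""
    when = when or datetime.now()
    filename = f"{basename}_{when.strftime('%Y-%m-%d_%H-%M-%S')}.jpg"
    return ["gphoto2", "--capture-image-and-download", "--filename", filename]

def capture(basename):
    # Raises CalledProcessError if no camera is connected.
    subprocess.run(capture_cmd(basename), check=True)
```

A descriptive basename (e.g., species_accession_treatment_replicate) keeps the downstream image‐processing metadata inside the filename itself.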
Color cards (white, black, and gray; DGK Color Tools, New York, New York, USA) and a size marker to normalize area are also included in images to aid in downstream processing and analysis steps. It is advised to use the same background, and, if possible, the same distance between object and camera for all images in an experimental set. However, including a size marker in images can be used to normalize data extracted from images if the vantage point does change. Chenopodium quinoa Willd. (quinoa) seed images are shown as example data from the camera stand (Fig. 2). To show that images collected from the camera stand are suitable for image analysis, seed images acquired with the camera stand were processed using PlantCV (Fahlgren et al., 2015) to quantify individual seed size, shape, color, and count; these types of measurements are valuable for quantifying variation within a population. The step‐by‐step image processing instructions are provided at https://github.com/danforthcenter/apps-phenotyping. This overall process (Appendix 3) provides a considerable cost savings relative to paying for seed imaging services or buying a commercial seed imaging station.

Raspberry Pi multi‐image octagon

Different plant architecture types require different imaging configurations for capture. For example, top‐down photographs can capture most of the information about the architecture of rosette plants (as described above), but plants with orthotropic growth such as rice or quinoa are better captured with a combination of both side‐view and top‐view images. Therefore, platforms for simultaneously imaging plants from multiple angles are valuable. In Appendix 4, a protocol is described to set up an octagon‐shaped chamber for imaging at different angles. The complete octagon‐shaped imaging system costs approximately US$1500. A “master” Raspberry Pi computer with a Raspberry Pi camera module is used to collect image data and also to trigger three other Raspberry Pi computers and cameras. Data are transferred from the four Raspberry Pi computers to a remote host using rsync. The octagon chamber (122 cm tall, with octagonal sides of 53.5 cm each) was constructed from aluminum framing and 3‐mm white polyvinyl chloride (PVC) panels (80/20 Inc.; Fig. 1). The top of this structure is left open but is covered with a translucent white plastic tarp to diffuse light when acquiring images. A latched door was built into the octagon chamber to facilitate loading of plants. Four wheels were attached at the bottom of the chamber for mobility. The four Raspberry Pis with Raspberry Pi camera modules (one top view and three side views approximately 45° apart) in cases were affixed to the octagon chamber using heavy‐duty Velcro. To maintain a consistent distance between the Raspberry Pi cameras and a plant within the Raspberry Pi multi‐image octagon, a pot was affixed to the center of the octagon chamber, with color cards affixed to the outside of the stationary pot (white, black, and gray; DGK Color Tools) so that a potted plant could be quickly placed in the pot during imaging. 
To facilitate data acquisition and transfer on all four Raspberry Pis, scripts are written so the user only needs to interact with a single “master” Raspberry Pi (here the master Raspberry Pi is named “octagon”). From a laptop computer one would connect to the “master” Pi via SSH, then run the “sshScript.sh” on that Pi. The “sshScript.sh” script triggers the image capture and data transfer sequence in all four Raspberry Pis and appends the date to a user‐input barcode. When the “sshScript.sh” script is run, a prompt asks the user for a barcode sequence. The barcode can be inputted manually, or, if a barcode scanner (e.g., Socket 7Qi) is available, a barcode can be used to input the filename information. Again, it is advised to use a plant barcode that identifies the species, accession, treatment, and replicate, as appropriate. Once a barcode name has been inputted, another prompt asks if the user would like to continue with image capture. This pause in the “sshScript.sh” script gives the user the opportunity to place the plant in the octagon before image capture is triggered. The “sshScript.sh” runs the script “piPicture.py” on all four Raspberry Pis. The “piPicture.py” script captures an image and appends the user‐inputted filename with the Raspberry Pi camera ID and the date. The image is then saved to a local directory on the Raspberry Pi. The “syncPi.sh” script is then run by “sshScript.sh” to transfer the images from the four Raspberry Pis to a remote host. The final script (“shutdown_all_pi”) is optionally run when image acquisition is over, allowing the user to shut down all four Raspberry Pis simultaneously. Examples of quinoa plant images captured with the Raspberry Pi multi‐image octagon are analyzed with PlantCV (Fahlgren et al., 2015) to show that the area and shape data can be extracted (Fig. 2). Step‐by‐step analysis scripts are provided at https://github.com/danforthcenter/apps-phenotyping.
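The master–worker triggering logic can be sketched as follows. The hostnames, the script location, and the use of Python in place of the published shell scripts are all assumptions for illustration; passwordless SSH between the Pis (Appendix 1) is assumed to be configured.

```python
import subprocess

# Hypothetical hostnames for the three side-view worker Pis.
WORKER_PIS = ["octagon-side1", "octagon-side2", "octagon-side3"]

def trigger_cmd(host, barcode):
    """ssh command the master Pi would run to start image capture on one
    worker, passing the barcode so every Pi names its image consistently.
    Assumes piPicture.py sits in the pi user's home directory."""
    return ["ssh", f"pi@{host}", "python3", "piPicture.py", barcode]

def trigger_all(barcode):
    """Start capture on all workers in parallel, then wait for each."""
    procs = [subprocess.Popen(trigger_cmd(h, barcode)) for h in WORKER_PIS]
    for p in procs:
        p.wait()
```

Launching the workers in parallel keeps the four views close to simultaneous, which matters if leaves move (e.g., from air circulation) between exposures.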

Protocol feasibility

The appendices that follow provide step‐by‐step instructions for using Raspberry Pi computers for plant phenotyping in three different configurations. The majority of components for all three protocols are readily available for purchase online. Low‐cost computers and components are especially important because some experiments involve harsh environmental conditions, and equipment may need to be replaced over the long term. Each of the platforms was built and programmed in large part by high‐school students, undergraduates, or graduate students and does not require a large investment of time to build or set up. Because Raspberry Pi computers are widely used by educators, hobbyists, and researchers, there is a strong online community that can be called upon for troubleshooting or to extend the functionality of a project. The best way to start troubleshooting is to use an online search engine to see if others have solved similar issues. If an error message has been triggered, start by using the error message as a search term. If a satisfactory answer is not found through an online search, posting on a community support forum like Stack Overflow is a good next step (https://raspberrypi.stackexchange.com/; https://stackoverflow.com/questions/tagged/raspberry-pi). When posting on online community forums, it is helpful to be specific. For example, if an error message is triggered, it is vital to include the exact text of the error message, to describe the events that triggered that error message, and to explain the target end goal. Automation increases the consistency of image and metadata capture, which streamlines image segmentation (Fig. 2) and is thus preferable to manual image capture. Furthermore, the low cost of each system and the flexibility to reconfigure Raspberry Pi computers for multiple purposes makes automated plant phenotyping accessible to most researchers.

CONCLUSIONS

The low‐cost imaging platforms presented here provide an opportunity for labs to introduce phenotyping equipment into their research toolkit, and thus increase the efficiency, reproducibility, and thoroughness of their measurements. These protocols make high‐throughput phenotyping accessible to researchers unable to make a large investment in commercial phenotyping equipment. Paired with open‐source, open‐development, high‐throughput plant phenotyping software like PlantCV (Fahlgren et al., 2015), image data collected from these phenotyping systems can be used to quantify plant traits for populations of plants that are amenable to genetic mapping. These Raspberry Pi–powered tools are also useful for education and training. In particular, we have used time‐lapse imaging to introduce students and teachers to the Linux environment, image processing, and data analysis in a classroom setting (http://github.com/danforthcenter/outreach/). As costs continue to drop and hardware continues to improve, there is enormous potential for the plant science community to capitalize on creative applications, well‐documented designs, and shared data sets and code.

Aluminum 80/20 Inc. (Columbia City, Indiana, USA) frame parts (Fig. A3‐1):

ID | Quantity | 80/20 Part no. | Description
A | 1 | 1515‐UL | 75 cm width × 3.81 cm height × 3.81 cm length T‐slotted bar, with 7040 counterbore in A left and A right
B | 2 | 1515‐UL | 75 cm width × 3.81 cm height × 3.81 cm length T‐slotted bar, with 7040 counterbore in D left
C | 1 | 1515‐UL | 88.9 cm width × 3.81 cm height × 3.81 cm length T‐slotted bar
D | 2 | 1515‐UL | 36 cm × 3.81 cm height × 3.81 cm length T‐slotted bar
E | 2 | 6525 | Double flange short standard linear bearing with brake holes
F | 2 | 6800 | 15 S gray "L" handle linear bearing brake kit
G | 4 | 3360 | 15 S 5/16‐18 standard anchor fastener assembly
H | 1 | 65‐2453 | 10.16 cm width × 10.16 cm height × 0.3175 cm thick aluminum plate. Three holes will be needed, two to bolt the plate to the crossbar and one hole below the bar to mount the camera with a nut.
I | 6 | 3203 | 15 series 5/16‐18 standard slide in T‐nut
J | 6 | 3117 | 15/16‐18 × 0.875 inch button head socket cap screw

Aluminum 80/20 Inc. (Columbia City, Indiana, USA) frame parts and paneling for octagon (Fig. A4‐1):

ID | Quantity | 80/20 Part no. | Description
A | 2 | 40‐4002 | 113.919 cm width × 4 cm height × 4 cm length T‐slotted bar, with 7040 counterbore in B left and 7040 counterbore in B right
B | 2 | 40‐4002 | 113.284 cm width × 4 cm height × 4 cm length T‐slotted bar. Bi‐slot adjacent T‐slotted extrusion.
C | 8 | 40‐4003 | 47.1856 cm width × 4 cm height × 4 cm length T‐slotted bar, with 7044 counterbore in A left and 7044 counterbore in A right
D | 2 | 40‐4003 | 30.3911 cm width × 4 cm height × 4 cm length T‐slotted bar, with 7044 counterbore in C left and 7044 counterbore in C right
E | 8 | 40‐4004 | 47.1856 cm width × 4 cm height × 4 cm length T‐slotted bar, with 7044 counterbore in C left and 7044 counterbore in C right
F | 1 | 40‐4080‐UL | 121.92 cm width × 4 cm height × 8 cm length T‐slotted bar, with 7044 counterbore in E left, 7044 counterbore in R left, 7044 counterbore in R right, and 7044 counterbore in E right
G | 8 | 40‐4094 | 121.92 cm width × 4 cm height, 40 series T‐slotted bar with 45° outside radius
H | 1 | 40‐2061 | Medium plastic door handle, black
I | 2 | 40‐2085 | 40 S aluminum hinge
J | 1 | 65‐2053 | Deadbolt latch with top latch
K | 44 | 40‐3897 | Anchor fastener assembly with M8 bolt and standard T‐nut
L | 8 | 75‐3525 | M8 × 1.2 cm black button head socket cap screw (BHSCS) with slide‐in economy T‐nut
M | 2 | 75‐3634 | M8 × 1.8 cm black socket head cap screw (SHCS) with slide‐in economy T‐nut
N | 4 | 40‐2426 | 40 S flange mount caster base plate
O | 12 | 13‐8520 | M8 × 2 cm SHCS blue
P | 8 | 40‐3915 | 15 S M8 roll‐in T‐nut with ball spring
Q | 4 | 65‐2323 | 12.7 cm flange mount swivel caster with brake
R | 2 | 2616 | 129.921 cm width × 64.9605 cm height × 0.3 cm thick white PVC panel
S | 7 | 65‐2616 | 116.119 cm width × 49.3856 cm height × 0.3 cm thick white PVC panel
T | 1 | 65‐2616 | 107.483 cm width × 32.5911 cm height × 0.3 cm thick white PVC panel

References

1.  Phenotiki: an open software and hardware platform for affordable and easy image-based phenotyping of rosette-shaped plants.

Authors:  Massimo Minervini; Mario V Giuffrida; Pierdomenico Perata; Sotirios A Tsaftaris
Journal:  Plant J       Date:  2017-03-02       Impact factor: 6.417

Review 2.  Pampered inside, pestered outside? Differences and similarities between plants growing in controlled conditions and in the field.

Authors:  Hendrik Poorter; Fabio Fiorani; Roland Pieruschka; Tobias Wojciechowski; Wim H van der Putten; Michael Kleyer; Uli Schurr; Johannes Postma
Journal:  New Phytol       Date:  2016-10-26       Impact factor: 10.151

3.  3D phenotyping and quantitative trait locus mapping identify core regions of the rice genome controlling root architecture.

Authors:  Christopher N Topp; Anjali S Iyer-Pascuzzi; Jill T Anderson; Cheng-Ruei Lee; Paul R Zurek; Olga Symonova; Ying Zheng; Alexander Bucksch; Yuriy Mileyko; Taras Galkovskyi; Brad T Moore; John Harer; Herbert Edelsbrunner; Thomas Mitchell-Olds; Joshua S Weitz; Philip N Benfey
Journal:  Proc Natl Acad Sci U S A       Date:  2013-04-11       Impact factor: 11.205

4.  A Versatile Phenotyping System and Analytics Platform Reveals Diverse Temporal Responses to Water Availability in Setaria.

Authors:  Noah Fahlgren; Maximilian Feldman; Malia A Gehan; Melinda S Wilson; Christine Shyu; Douglas W Bryant; Steven T Hill; Colton J McEntee; Sankalpi N Warnasooriya; Indrajit Kumar; Tracy Ficor; Stephanie Turnipseed; Kerrigan B Gilbert; Thomas P Brutnell; James C Carrington; Todd C Mockler; Ivan Baxter
Journal:  Mol Plant       Date:  2015-06-20       Impact factor: 13.164

5.  Dissecting the phenotypic components of crop plant growth and drought responses based on high-throughput image analysis.

Authors:  Dijun Chen; Kerstin Neumann; Swetlana Friedel; Benjamin Kilian; Ming Chen; Thomas Altmann; Christian Klukas
Journal:  Plant Cell       Date:  2014-12-11       Impact factor: 11.277

6.  phenoSeeder - A Robot System for Automated Handling and Phenotyping of Individual Seeds.

Authors:  Siegfried Jahnke; Johanna Roussel; Thomas Hombach; Johannes Kochs; Andreas Fischbach; Gregor Huber; Hanno Scharr
Journal:  Plant Physiol       Date:  2016-09-23       Impact factor: 8.340

7.  Combining high-throughput phenotyping and genome-wide association studies to reveal natural genetic variation in rice.

Authors:  Wanneng Yang; Zilong Guo; Chenglong Huang; Lingfeng Duan; Guoxing Chen; Ni Jiang; Wei Fang; Hui Feng; Weibo Xie; Xingming Lian; Gongwei Wang; Qingming Luo; Qifa Zhang; Qian Liu; Lizhong Xiong
Journal:  Nat Commun       Date:  2014-10-08       Impact factor: 14.919

8.  Vinobot and Vinoculer: Two Robotic Platforms for High-Throughput Field Phenotyping.

Authors:  Ali Shafiekhani; Suhas Kadam; Felix B Fritschi; Guilherme N DeSouza
Journal:  Sensors (Basel)       Date:  2017-01-23       Impact factor: 3.576

9.  RGB and Spectral Root Imaging for Plant Phenotyping and Physiological Research: Experimental Setup and Imaging Protocols.

Authors:  Gernot Bodner; Mouhannad Alsalem; Alireza Nakhforoosh; Thomas Arnold; Daniel Leitner
Journal:  J Vis Exp       Date:  2017-08-08       Impact factor: 1.355

10.  High-throughput phenotyping to detect drought tolerance QTL in wild barley introgression lines.

Authors:  Nora Honsdorf; Timothy John March; Bettina Berger; Mark Tester; Klaus Pillen
Journal:  PLoS One       Date:  2014-05-13       Impact factor: 3.240


