Internet of Things Platform for Smart Farming: Experiences and Lessons Learnt
Prem Prakash Jayaraman, Ali Yavari, Dimitrios Georgakopoulos, Ahsan Morshed, Arkady Zaslavsky.
Abstract
Improving farm productivity is essential for increasing farm profitability and meeting the rapidly growing demand for food that is fuelled by rapid population growth across the world. Farm productivity can be increased by understanding and forecasting crop performance in a variety of environmental conditions. Crop recommendation is currently based on data collected in field-based agricultural studies that capture crop performance under a variety of conditions (e.g., soil quality and environmental conditions). However, crop performance data collection is currently slow, as such crop studies are often undertaken in remote and distributed locations, and such data are typically collected manually. Furthermore, the quality of manually collected crop performance data is very low, because it does not take into account earlier conditions that have not been observed by the human operators but are essential for filtering out collected data that would lead to invalid conclusions (e.g., solar radiation readings taken in the afternoon after even a short rain or an overcast morning are invalid and should not be used in assessing crop performance). Emerging Internet of Things (IoT) technologies, such as IoT devices (e.g., wireless sensor networks, network-connected weather stations, cameras, and smartphones), can be used to collect vast amounts of environmental and crop performance data, ranging from time series data from sensors, to spatial data from cameras, to human observations collected and recorded via mobile smartphone applications. Such data can then be analysed to filter out invalid data and compute personalised crop recommendations for any specific farm. In this paper, we present the design of SmartFarmNet, an IoT-based platform that can automate the collection of environmental, soil, fertilisation, and irrigation data; automatically correlate such data and filter out invalid data from the perspective of assessing crop performance; and compute crop forecasts and personalised crop recommendations for any particular farm. SmartFarmNet can integrate virtually any IoT device, including commercially available sensors, cameras, and weather stations, and store their data in the cloud for performance analysis and recommendations. An evaluation of the SmartFarmNet platform and our experiences and lessons learnt in developing this system conclude the paper. SmartFarmNet is the first and currently largest system in the world (in terms of the number of sensors attached, crops assessed, and users it supports) that provides crop performance analysis and recommendations.
Keywords: Internet of Things; semantic web; smart agriculture
Year: 2016 PMID: 27834862 PMCID: PMC5134543 DOI: 10.3390/s16111884
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
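The abstract's invalid-data rule (discarding solar radiation readings taken soon after rain or an overcast morning) can be illustrated with a minimal sketch. The class below is not the authors' code; the `SolarRadiationFilter` name and the six-hour validity window are assumptions for illustration only.

```java
import java.time.Duration;
import java.time.Instant;

// Illustrative sketch only: the class name and window length are assumed,
// not taken from the SmartFarmNet implementation.
public class SolarRadiationFilter {

    // Assumed window during which solar radiation readings are untrustworthy
    // after rain or overcast conditions were last observed.
    private static final Duration INVALID_WINDOW = Duration.ofHours(6);

    /** Keep a reading only if no rain/overcast event occurred recently. */
    public static boolean isValid(Instant readingTime, Instant lastRainOrOvercast) {
        if (lastRainOrOvercast == null) {
            return true; // no disturbing event recorded
        }
        return Duration.between(lastRainOrOvercast, readingTime)
                       .compareTo(INVALID_WINDOW) > 0;
    }

    public static void main(String[] args) {
        Instant rainEnded = Instant.parse("2016-03-01T09:00:00Z");
        Instant afternoonReading = Instant.parse("2016-03-01T14:00:00Z");
        // Five hours after rain: still inside the assumed invalid window.
        System.out.println(isValid(afternoonReading, rainEnded)); // false
    }
}
```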
Figure 1. Overview of Phenonet.
Table 1. Comparison of SmartFarmNet with other Internet of Things (IoT) platforms. EC2: Elastic Compute Cloud.
| Platform | Sensor Discovery | Bring-Your-Own IoT Device | Scalable Data Analysis | Sharing Sensors, Data, and Analysis Results (Virtual Lab) |
|---|---|---|---|---|
| UBIDOTS | Not Supported | Yes, but requires considerable efforts to develop new interfaces | No | No. Only provides API for raw data access |
| Xively | Partial support with no specific approach for metadata description/management | Yes, but requires considerable efforts to develop new interfaces | No | No. Only provides API for raw data access |
| SensorCloud | Not Supported | Supports only vendor-specific sensors (some support for CSV file data) | Partial | Partial |
| IBM Bluemix | Not Supported | Yes, but requires considerable efforts to develop new interfaces | Partial, with additional development required | Partial, by using the existing Bluemix infrastructure-as-a-service platform |
| Amazon IoT | Not Supported | Yes, but requires considerable efforts to develop new interfaces | Partial, with additional development required | Partial, by using the existing EC2 infrastructure-as-a-service platform |
| IoTCloud | Not Supported | Yes, but requires considerable efforts to develop new interfaces | No | No |
| Apache Storm | Not Supported | Yes, but requires considerable efforts to develop new interfaces | No | No |
| SmartFarmNet | Supported via Semantic Web Technologies | Yes, with built-in support for 30+ commercial and experimental sensors | Yes, real-time data analytics functions are built in | Yes, with an easy-to-use, e-commerce-like user interaction model |
Figure 2. SmartFarmNet’s data model.
Figure 3. The SmartFarmNet ontology (phen denotes the Phenonet ontology namespace, dul denotes the DOLCE+DnS Ultralite ontology namespace, and ssn denotes the SSN namespace).
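As a rough illustration of how the ontology in Figure 3 might be populated, the snippet below uses Apache Jena (listed in the implementation table) to describe a sensor with the SSN `ssn:observes` property. The `phen` namespace URI, sensor name, and observed property are hypothetical placeholders, not the actual Phenonet URIs.

```java
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.rdf.model.Property;
import org.apache.jena.rdf.model.Resource;
import org.apache.jena.vocabulary.RDF;

public class SensorDescription {
    public static void main(String[] args) {
        // Namespace URIs: the SSN URI is the W3C incubator namespace; the
        // phen URI is a hypothetical placeholder for the Phenonet namespace.
        String SSN = "http://purl.oclc.org/NET/ssnx/ssn#";
        String PHEN = "http://example.org/phenonet#";

        Model model = ModelFactory.createDefaultModel();
        model.setNsPrefix("ssn", SSN);
        model.setNsPrefix("phen", PHEN);

        Resource sensorClass = model.createResource(SSN + "Sensor");
        Property observes = model.createProperty(SSN, "observes");

        // Describe a (hypothetical) soil moisture sensor.
        model.createResource(PHEN + "soilMoistureSensor01")
             .addProperty(RDF.type, sensorClass)
             .addProperty(observes, model.createResource(PHEN + "SoilMoisture"));

        model.write(System.out, "TURTLE");
    }
}
```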
Figure 4. SmartFarmNet: scalable data analysis of sensor data.
Figure 5. SmartFarmNet architecture. DIY: do-it-yourself; RDF: Resource Description Framework; API: application programming interface.
Table 2. SmartFarmNet platform implementation details. LSM-Light: Linked Sensor Middleware-Light.
| Components | Implementation Details |
|---|---|
| SmartFarmNet gateway (X-GSN) | Java-based semantic sensor stream processor; Arduino and ArduCrop sensor wrappers interface with IoT devices |
| Cloud Data Store (LSM-Light) | LSM-Light, developed in Java on the OpenLink Virtuoso triple store |
| Sensor Explorer | Java applications deployed on JBoss |
| Reasoner Service | Apache Jena, supported by the SmartFarmNet OWL ontology |
| User Interfaces | Do-it-yourself tools developed with JavaServer Faces (JSF) |
| Data Analytics | Redis in-memory key-value store (see the sketch below) |
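The implementation table names Redis for data analytics. Below is a minimal sketch of how time-stamped sensor readings could be stored and summarised with Redis sorted sets, using the Jedis client; the `sensor:<id>:<metric>` key scheme and the hourly-average logic are assumptions, not SmartFarmNet's actual design.

```java
import redis.clients.jedis.Jedis;

public class HourlySummariser {
    public static void main(String[] args) {
        try (Jedis jedis = new Jedis("localhost", 6379)) {
            String key = "sensor:42:soilTemp"; // hypothetical key scheme

            // Store readings in a sorted set scored by epoch seconds so that
            // time-range queries become ZRANGEBYSCORE calls.
            jedis.zadd(key, 1456812000, "1456812000:21.4");
            jedis.zadd(key, 1456813800, "1456813800:22.1");
            jedis.zadd(key, 1456815600, "1456815600:22.9");

            // Fetch one hour of readings (inclusive bounds) and average them.
            long hourStart = 1456812000, hourEnd = hourStart + 3600;
            double sum = 0;
            int n = 0;
            for (String member : jedis.zrangeByScore(key, hourStart, hourEnd)) {
                sum += Double.parseDouble(member.split(":")[1]);
                n++;
            }
            System.out.printf("hourly average: %.2f (%d readings)%n",
                              n > 0 ? sum / n : 0.0, n);
        }
    }
}
```

Sorted sets keep range scans over a time window cheap, which is one plausible way to support the hourly/daily/weekly/monthly summarisations reported in Table 3 below.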
Figure 6. Sensor Schema Editor (SSE).
Figure 7. Sensor data discovery and exploration.
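Under the Semantic Web approach listed in Table 1, the sensor discovery of Figure 7 could amount to a SPARQL query over the sensor descriptions. The Jena-based sketch below finds sensors observing a given property; the query text and namespace URIs are illustrative assumptions, not the platform's actual queries.

```java
import org.apache.jena.query.QueryExecution;
import org.apache.jena.query.QueryExecutionFactory;
import org.apache.jena.query.QuerySolution;
import org.apache.jena.query.ResultSet;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;

public class SensorDiscovery {
    public static void main(String[] args) {
        // In practice the model would be loaded from the triple store;
        // an empty in-memory model keeps the sketch self-contained.
        Model model = ModelFactory.createDefaultModel();

        // Hypothetical query: find all sensors observing soil moisture.
        String query =
            "PREFIX ssn:  <http://purl.oclc.org/NET/ssnx/ssn#> " +
            "PREFIX phen: <http://example.org/phenonet#> " +
            "SELECT ?sensor WHERE { ?sensor ssn:observes phen:SoilMoisture }";

        try (QueryExecution exec = QueryExecutionFactory.create(query, model)) {
            ResultSet results = exec.execSelect();
            while (results.hasNext()) {
                QuerySolution row = results.next();
                System.out.println(row.getResource("sensor"));
            }
        }
    }
}
```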
Figure 8. Query access latency: single sensor stream.
Figure 9. Query access latency: multiple sensor streams.
Figure 10. Query access latency with real-time statistical analysis: multiple sensor streams.
Table 3. SmartFarmNet platform: real-time analysis computation times.
| Summarisation Interval | Total Processing Time (ms) |
|---|---|
| Hourly | 6620 |
| Daily | 9971 |
| Weekly | 10,926 |
| Monthly | 24,543 |