Chen Li, Shijie Bian, Tongzi Wu, Richard P. Donovan, Bingbing Li.
Abstract
With the rapid concurrent advance of artificial intelligence (AI) and Internet of Things (IoT) technology, manufacturing environments are being upgraded or equipped with a smart and connected infrastructure that empowers workers and supervisors to optimize manufacturing workflows and processes for improved energy efficiency, equipment reliability, quality, safety, and productivity. Such upgrades, however, pose capital-cost and complexity challenges for many small and medium-sized manufacturers (SMMs), who rely heavily on people to supervise manufacturing processes and facilities. This research aims to create an affordable, scalable, accessible, and portable (ASAP) solution to automate the supervision of manufacturing processes. The proposed approach seeks to reduce the cost and complexity of smart manufacturing deployment for SMMs through the use of consumer-grade electronics and a novel AI development methodology. The proposed system, AI-assisted Machine Supervision (AIMS), provides SMMs with two major subsystems: direct machine monitoring (DMM) and human-machine interaction monitoring (HIM). The AIMS system was evaluated and validated with a case study in 3D printing through the affordable AI accelerator solution of the vision processing unit (VPU).
Keywords: 3D printing; IoT; affordable AI accelerator; computer vision; machine supervision; smart manufacturing
Year: 2022 PMID: 36016006 PMCID: PMC9414792 DOI: 10.3390/s22166246
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.847
Figure 1. The framework of AIMS.
Figure 2. The canonical right-handed coordinate system.
Figure 3. High-level architecture of the Region-based Convolutional Neural Network.
The performance and speed of three candidate models, where AP stands for mean average precision, as defined in Equation (5), and FPS stands for frames per second, the processing rate.
| Model | AP (COCO) | FPS (COCO) | AP (3D Printer) | FPS (3D Printer) |
|---|---|---|---|---|
| YOLOv3 | 51.5 | 23.8 | 98.48 | 29.4 |
| YOLOv4 | 64.9 | 19.2 | 99.80 | 22.3 |
| Mask-RCNN | 60.0 | 5.00 | 98.80 | 5.90 |
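The paper's Equation (5) is not reproduced in this record, but average precision for a detector is conventionally the area under the precision-recall curve obtained by sweeping the confidence threshold. The sketch below is a minimal illustration of that standard computation, not the authors' implementation; the scores and labels are made-up toy data.

```python
def average_precision(scores, labels):
    """Average precision for one class: area under the precision-recall
    curve, with detections ranked by descending confidence.

    scores -- detection confidence values
    labels -- 1 if the detection matches a ground-truth object, else 0
    """
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    total_pos = sum(labels)
    tp = fp = 0
    ap, prev_recall = 0.0, 0.0
    for i in order:
        if labels[i]:
            tp += 1
        else:
            fp += 1
        precision = tp / (tp + fp)
        recall = tp / total_pos
        ap += precision * (recall - prev_recall)  # rectangle under the curve
        prev_recall = recall
    return ap

# Toy example: 4 detections, 3 of which match the 3 ground-truth objects.
print(round(average_precision([0.9, 0.8, 0.7, 0.6], [1, 1, 0, 1]), 4))  # → 0.9167
```

Near-perfect scores such as the 98-99 AP values above indicate that almost every detection at every rank is a true positive, so the precision-recall curve hugs the top of the unit square.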
Figure 4. Workflow of the finger-text detection module.
Figure 5. Processed camera frames from different visual perspectives.
Figure 6. Illustration of determining button positions from different visual perspectives.
Figure 7. Some results under different working conditions.
The accuracy of DMM on the test dataset, where AP is average precision.
| Index | Class Name | AP |
|---|---|---|
| 0 | extruder | 0.99935 |
| 1 | buildplate | 0.99946 |
| 2 | axis | 0.99539 |
Figure 8. Test results for the Human-machine Interaction Monitoring.
Figure 9. The set of items used for the normal operation and the abnormal condition tests.
The numerical results of the 19 tests for AIMS.
| Precision | Recall |
|---|---|
| 0.923 | 0.936 |
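Precision and recall in the table above are the standard detection metrics: precision = TP / (TP + FP) and recall = TP / (TP + FN). The sketch below shows the computation; the counts are illustrative only, since the paper's actual true-positive/false-positive/false-negative breakdown is not given in this record.

```python
def precision_recall(tp, fp, fn):
    """Return (precision, recall) from confusion counts.

    tp -- true positives (correctly flagged events)
    fp -- false positives (spurious alarms)
    fn -- false negatives (missed events)
    """
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return precision, recall

# Illustrative counts only (not from the paper).
p, r = precision_recall(tp=90, fp=10, fn=5)
print(round(p, 3), round(r, 3))  # → 0.9 0.947
```

A precision of 0.923 with a recall of 0.936, as reported, means roughly 8% of raised alerts were spurious while roughly 6% of true abnormal conditions were missed.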