Walid Ben Ali, Ahmad Pesaranghader, Robert Avram, Pavel Overtchouk, Nils Perrin, Stéphane Laffite, Raymond Cartier, Reda Ibrahim, Thomas Modine, Julie G Hussin.
Abstract
Driven by recent innovations and technological progress, the increasing quality and amount of biomedical data coupled with advances in computing power have allowed for much progress in artificial intelligence (AI) approaches for health and biomedical research. In interventional cardiology, the hope is for AI to provide automated analysis and deeper interpretation of data from electrocardiography, computed tomography, magnetic resonance imaging, and electronic health records, among others. Furthermore, high-performance predictive models supporting decision-making hold the potential to improve safety and diagnostic and prognostic prediction in patients undergoing interventional cardiology procedures. These applications include robotic-assisted percutaneous coronary intervention procedures and automatic assessment of coronary stenosis during diagnostic coronary angiograms. Machine learning (ML) has been used in these innovations that have improved the field of interventional cardiology, and more recently, deep learning (DL) has emerged as one of the most successful branches of ML in many applications. It remains to be seen whether DL approaches will have a major impact on current and future practice. DL-based predictive systems also have several limitations, including lack of interpretability and lack of generalizability due to cohort heterogeneity and low sample sizes. There are also challenges for the clinical implementation of these systems, such as ethical limits and data privacy. This review is intended to bring the attention of health practitioners and interventional cardiologists to the broad and helpful applications of ML and DL algorithms in the field to date. Their implementation challenges in daily practice and future applications in the field of interventional cardiology are also discussed.
Keywords: cardiology; deep learning; interventional cardiology; neural networks; prognosis
Year: 2021 PMID: 34957230 PMCID: PMC8692711 DOI: 10.3389/fcvm.2021.711401
Source DB: PubMed Journal: Front Cardiovasc Med ISSN: 2297-055X
Types of learning methods.

| Learning type | Description | Typical tasks |
|---|---|---|
| Supervised | Uses labeled outcome data. The labels are typically assigned by experts in the field prior to model training. | Involves tasks such as regression, classification, predictive modeling, and survival analysis. |
| Unsupervised | No labeled outcome data. The algorithm uncovers similarities, relationships, and, where possible, causal structure among groups and variables. | Used for tasks such as dimensionality reduction, clustering, and feature extraction. |
| Semi-supervised | The input data contain both labeled and unlabeled outcomes. | Labeled data are used to identify specific groups in the data and their parameters; these are then fed to the algorithm along with unlabeled data to explore the boundaries of the parameters. |
| Reinforcement | Based on behavioral psychology. The learning agent interacts with the environment to maximize a reward, updating its parameters based on the feedback it receives from the choices it makes. Learning stops when the reward criteria for the decision-making function are met. | Can be used in medical imaging analytics and personalized prescription selection. Popular in automated robotics. |
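The supervised/unsupervised distinction in the table above can be made concrete with a small, invented sketch (not from the review): a nearest-centroid classifier that relies on expert-assigned labels, next to a two-group k-means-style clustering that receives no labels at all. The point coordinates (e.g., two normalized biomarkers per patient) and the "low-risk"/"high-risk" labels are hypothetical.

```python
# Supervised vs. unsupervised learning on invented toy data.

def nearest_centroid_fit(points, labels):
    """Supervised: average the points of each expert-labeled class."""
    groups = {}
    for p, y in zip(points, labels):
        groups.setdefault(y, []).append(p)
    return {y: tuple(sum(coord) / len(pts) for coord in zip(*pts))
            for y, pts in groups.items()}

def nearest_centroid_predict(centroids, p):
    """Assign a new point to the class with the closest centroid."""
    return min(centroids, key=lambda y: sum((a - b) ** 2
                                            for a, b in zip(centroids[y], p)))

def two_means(points, iters=10):
    """Unsupervised: split the points into two groups with no labels at all."""
    centers = [points[0], points[-1]]  # naive initialization; fine for this toy data
    for _ in range(iters):
        clusters = ([], [])
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(c, p)) for c in centers]
            clusters[d.index(min(d))].append(p)
        centers = [tuple(sum(coord) / len(cl) for coord in zip(*cl))
                   for cl in clusters]
    return sorted(centers)

# Hypothetical 2D measurements for six "patients", two obvious groups.
data = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.15), (0.9, 0.8), (0.8, 0.9), (0.85, 0.85)]
labels = ["low-risk", "low-risk", "low-risk", "high-risk", "high-risk", "high-risk"]

cents = nearest_centroid_fit(data, labels)
print(nearest_centroid_predict(cents, (0.2, 0.2)))  # -> low-risk
print(two_means(data))  # two centers, near (0.15, 0.15) and (0.85, 0.85)
```

Note how the clustering recovers essentially the same two groups that the supervised method was told about, which is the sense in which unsupervised learning "observes similarities among groups" without labeled outcomes.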
Figure 1. Graphical representation of the decision boundary (red line) in an optimally fitted model (A) and an overfitted model (B). Overfitting describes a model with poor generalizability due to excessive fitting of the noise present in the training dataset.
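Figure 1's point can also be shown numerically with a small invented example: six points that follow the trend y = x except for one noisy observation. A simple least-squares line (the well-fitted case) tolerates the noise, while a degree-5 interpolating polynomial (the overfitted case) reproduces the training data exactly but predicts a held-out point worse. The data and the held-out point are made up for illustration.

```python
# Overfitting illustrated: a flexible model that fits the training noise
# exactly generalizes worse than a simpler model. Invented data.

xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [0.0, 1.0, 3.5, 3.0, 4.0, 5.0]   # y = x, plus one noisy observation at x = 2

def linear_fit(xs, ys):
    """Ordinary least-squares line: the optimal-fitting panel (A)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return lambda x: slope * x + intercept

def lagrange_fit(xs, ys):
    """Degree-5 interpolating polynomial: zero training error, panel (B)."""
    def f(x):
        total = 0.0
        for i, (xi, yi) in enumerate(zip(xs, ys)):
            basis = 1.0
            for j, xj in enumerate(xs):
                if j != i:
                    basis *= (x - xj) / (xi - xj)
            total += yi * basis
        return total
    return f

line, poly = linear_fit(xs, ys), lagrange_fit(xs, ys)
x_test, y_true = 2.5, 2.5           # held-out point from the noiseless trend y = x
print(abs(line(x_test) - y_true))   # 0.25: small generalization error
print(abs(poly(x_test) - y_true))   # ~0.88: the overfitted model misses it
```

The interpolant's training error is essentially zero, yet its held-out error is more than three times that of the straight line, which is exactly the failure mode the figure caption describes.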
Algorithmic overview with prominent examples of implementation in cardiology.
| Method | Description | Advantages | Limitations | Example application in cardiology |
|---|---|---|---|---|
| **Supervised learning** | | | | |
| Decision trees, random forest, boosting | Decision trees are flowchart-type algorithms. Each variable is a condition on which the tree splits into branches, until the outcome "leaf." Random forest and boosting are its derivatives. | Interpretability. Integrated feature selection. No preprocessing required. Handles non-linear relationships. | Computationally expensive. Can overfit or create biased trees in the case of unbalanced outcome classes. | Long-term cardiovascular outcome prediction from clinical, ECG, imaging, and biomarker data. |
| Support vector machine | Builds a hyperplane in a high-dimensional space that separates the data into two outcome categories with the maximum margin. | Can integrate many sparse features, limits overfitting, and is computationally effective. | Needs preprocessing. Limited interpretability. | Automated echocardiographic assessment of mitral regurgitation. |
| Regularized regression | Type of regression in which coefficient estimates are constrained by penalty terms (e.g., LASSO, ridge). | Familiar interpretation of variable-outcome associations, applicable to high-dimensional data. | Variable pre-selection is often advisable. | 1-year mortality predictors after MitraClip implantation. |
| **Unsupervised learning** | | | | |
| K-means clustering | Assigns each data point to one of k clusters (groups) based on its distance to the cluster centroids. | Easy to implement. Computationally fast. | The number of groups must be known or assigned. | Separating QRS and non-QRS regions in the ECG signal. |
| Principal component analysis | Uses an orthogonal transformation to convert possibly correlated variables into a set of linearly uncorrelated principal components. | Can be used for dimensionality reduction. | Only captures linear relationships. Limited interpretability. | MACE prediction from clinical and biomarker data representing metabolic syndrome. |
| **Neural networks (deep learning)** | | | | |
| Shallow neural networks | A set of nodes ("neurons") arranged in layers connected by edges (weights). The network connects input data to the predicted outcome through a parallel set of parameterized non-linear transformations. | Can capture non-linear relationships (often encountered in real-life datasets) as well as linear ones. NNs can handle heteroskedasticity, have been praised for the generalizability of trained models, and are computationally effective. Flexible. | Variable pre-selection is often advisable. Needs variable pre-processing. | Diagnosis of coronary artery disease from myocardial perfusion scintigraphy. |
| Deep fully connected neural network | An extension of the shallow NN architecture that uses many hidden layers (layers between input and output). The weights and biases of the NN are trained via back-propagation. | Performance increases with the quantity of data. Surpasses other machine learning methods on very high-dimensional data. Flexible architecture; the basis of CNNs and RNNs. | Requires a large quantity of data. Can easily overfit. Low interpretability. Sensitive to changes in input data. | Mortality, readmission, length-of-stay, and diagnosis prediction from EHR. |
| Convolutional neural network | Type of NN that learns multiple feature sets at increasing levels of abstraction. | One of the most popular deep learning architectures. Flexible. Well suited to image classification. | Requires a large quantity of data. Can easily overfit. Low interpretability. | 3D aortic valve annulus planimetry in TAVI. |
| Recurrent neural network | Type of NN that encodes sequential data by capturing context in memory. | Suited to natural language processing, text or video, genetic sequences, and other temporal data. | Computationally expensive. Limited length of sequences that can be encoded. | EHR text data extraction for mortality prediction in congenital heart disease. |
| **Unsupervised deep learning** | | | | |
| Autoencoder | Encodes unlabeled inputs into compact codes, then uses those codes to reconstruct the original input as output. | Dimensionality reduction. Well suited to denoising and image segmentation. | Low interpretability. | MRI-extracted cardiac motion model denoising for survival prediction. |
| Deep generative models | Model a distribution that is as similar as possible to the true data distribution, typically with GANs or VAEs. | Data augmentation and preservation of data privacy via synthetic data samples. Domain translation and domain adaptation. Content and style matching using adversarial inference. | Can be computationally expensive. Still maturing for high-fidelity sample generation. Lack of stability during training. | Noise reduction in low-dose CT. |
| **Reinforcement learning** | | | | |
| Deep reinforcement learning | RL learns to maximize a reward function by exploring the actions available from given states. A deep RL agent tries an action to see what reward is returned by the environment in which it acts. | Besides robotic assistance, potential applications include microbots that travel through blood vessels to deliver medications, interventional training simulators, and tele-intervention. | Still in its infancy. Complexity and cost. Not preferable for solving simple problems. Requires large amounts of training data. | Robotic control of an electrophysiology catheter. |
AMI, acute myocardial infarction; EHR, electronic healthcare records; LASSO, least absolute shrinkage and selection operator; MACE, major adverse cardiovascular event; NN, neural network; CV, cardiovascular; MRI, magnetic resonance imaging; ECG, electrocardiogram; BP, blood pressure; CT, computed tomography; TAVI, transcatheter aortic valve implantation; PCI, percutaneous coronary intervention; VAE, variational autoencoders; GAN, generative adversarial networks; RL, reinforcement learning; STEMI, ST-segment elevation myocardial infarction.
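The building block behind the neural-network rows of the table — a node that applies a parameterized non-linear transformation to its inputs and is fitted by gradient descent, the same update that back-propagation applies layer by layer in deep networks — can be sketched with a single logistic neuron. This is an editor's illustrative sketch, not code from the review; the two inputs (e.g., two normalized risk factors) and the binary outcome are invented.

```python
# One logistic "neuron" trained by gradient descent on invented, separable data.
import math

def sigmoid(z):
    """Non-linear activation mapping the weighted sum to a probability."""
    return 1.0 / (1.0 + math.exp(-z))

def train(points, labels, lr=0.5, epochs=200):
    """Fit weights w and bias b by stochastic gradient descent on cross-entropy loss."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), y in zip(points, labels):
            p = sigmoid(w[0] * x1 + w[1] * x2 + b)
            err = p - y                  # gradient of the loss w.r.t. the pre-activation
            w[0] -= lr * err * x1
            w[1] -= lr * err * x2
            b -= lr * err
    return w, b

# Hypothetical normalized inputs with a binary outcome (0 = no event, 1 = event).
points = [(0.1, 0.2), (0.2, 0.1), (0.8, 0.9), (0.9, 0.8)]
labels = [0, 0, 1, 1]

w, b = train(points, labels)
preds = [int(sigmoid(w[0] * x1 + w[1] * x2 + b) > 0.5) for x1, x2 in points]
print(preds)  # -> [0, 0, 1, 1]
```

A shallow or deep network stacks many such units in layers; back-propagation simply chains this same error-times-input gradient through the layers, which is why the table lists the fully connected network as the basis of CNNs and RNNs.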
Figure 2. Domains of implementation of machine learning tools in cardiology.