| Literature DB >> 33692435 |
Xiuqing Zhu1,2, Wencan Huang3, Haoyang Lu1,2, Zhanzhang Wang1,2, Xiaojia Ni1,2, Jinqing Hu1,2, Shuhua Deng1,2, Yaqian Tan1,2, Lu Li1,2, Ming Zhang1,2, Chang Qiu1,2, Yayan Luo2,4, Hongzhen Chen1, Shanqing Huang1, Tao Xiao1, Dewei Shang5,6, Yuguan Wen7,8.
Abstract
The pharmacokinetic variability of lamotrigine (LTG) plays a significant role in its dosing requirements. Our goal here was to use noninvasive clinical parameters to predict the dose-adjusted concentration (C/D ratio) of LTG based on machine learning (ML) algorithms. A total of 1141 therapeutic drug-monitoring measurements were used, 80% of which were randomly selected as the "derivation cohort" to develop the prediction algorithm; the remaining 20% constituted the "validation cohort" used to test the final model. Fifteen ML models were optimized and evaluated by tenfold cross-validation on the derivation cohort and filtered by mean absolute error (MAE). On the whole, the nonlinear models outperformed the linear models. The extra-trees regression algorithm delivered good performance and was chosen to establish the predictive model. The important features were then analyzed and the model's parameters tuned to develop the best prediction model, which accurately described the C/D ratio of LTG, especially in the intermediate-to-high range (≥ 22.1 μg mL⁻¹ g⁻¹ day), as illustrated by minimal bias (mean relative error = +3%), good precision (MAE = 8.7 μg mL⁻¹ g⁻¹ day), and a high percentage of predictions within ±20% of the empirical values (60.47%). To the best of our knowledge, this is the first study to use ML algorithms to predict the C/D ratio of LTG. These results can help clinicians adjust LTG doses administered to patients to minimize adverse reactions.
Year: 2021 PMID: 33692435 PMCID: PMC7946912 DOI: 10.1038/s41598-021-85157-x
Source DB: PubMed Journal: Sci Rep ISSN: 2045-2322 Impact factor: 4.379
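The modeling workflow described in the abstract (80/20 derivation/validation split, tenfold cross-validation scored by MAE, extra-trees regression, held-out evaluation) can be sketched with scikit-learn. This is a hypothetical illustration only: the real clinical covariates and the paper's tuned hyperparameters are not given in the abstract, so synthetic regression data and default-style settings stand in for them.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.metrics import mean_absolute_error

# Synthetic stand-in for the 1141 therapeutic drug-monitoring measurements;
# features approximate the noninvasive clinical parameters (not specified here).
X, y = make_regression(n_samples=1141, n_features=10, noise=10.0, random_state=0)

# 80% "derivation cohort" for model development, 20% "validation cohort" held out.
X_dev, X_val, y_dev, y_val = train_test_split(X, y, test_size=0.2, random_state=0)

# Extra-trees regression, the algorithm the study selected among 15 candidates.
model = ExtraTreesRegressor(n_estimators=200, random_state=0)

# Tenfold cross-validation on the derivation cohort, filtered by MAE
# (scikit-learn reports negated MAE, so flip the sign).
cv_mae = -cross_val_score(model, X_dev, y_dev, cv=10,
                          scoring="neg_mean_absolute_error")
print(f"CV MAE: {cv_mae.mean():.2f}")

# Refit on the full derivation cohort, then test on the validation cohort.
model.fit(X_dev, y_dev)
pred = model.predict(X_val)
print(f"Validation MAE: {mean_absolute_error(y_val, pred):.2f}")
```

In practice, the "important features were then analyzed" step would correspond to inspecting `model.feature_importances_` (or a comparable importance measure) before re-tuning the model's parameters.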