Jui-Hsiang Lin, Wen-Chung Lee.
Abstract
The logistic regression model is the workhorse of epidemiological data analysis. The model helps to clarify the relationship between multiple exposures and a binary outcome. Logistic regression analysis is readily implemented using existing statistical software, and this has contributed to it becoming a routine procedure for epidemiologists. In this paper, the authors focus on a causal model which has recently received much attention from the epidemiologic community, namely, the sufficient-component cause model (causal-pie model). The authors show that the sufficient-component cause model is associated with a particular 'link' function: the complementary log link. In a complementary log regression, the exponentiated coefficient of a main-effect term corresponds to an adjusted 'peril ratio', and the coefficient of a cross-product term can be used directly to test for causal mechanistic interaction (sufficient-cause interaction). The authors provide detailed instructions on how to perform a complementary log regression using existing statistical software and use three datasets to illustrate the methodology. Complementary log regression is the model of choice for sufficient-cause analysis of binary outcomes. Its implementation is as easy as conventional logistic regression.
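In related work on sufficient-cause modeling, the 'peril' of a binary outcome is commonly taken to be −ln(1 − risk), i.e., the negated complementary log of the risk; the peril ratio then compares exposed and unexposed groups on that scale. A minimal sketch, assuming that definition and using hypothetical risks (not data from this paper):

```python
import math

def peril(risk):
    """Peril = -ln(1 - risk): the negated complementary log of the risk."""
    return -math.log1p(-risk)

# Hypothetical risks for exposed and unexposed groups.
risk_exposed, risk_unexposed = 0.30, 0.10

peril_ratio = peril(risk_exposed) / peril(risk_unexposed)
risk_ratio = risk_exposed / risk_unexposed

print(f"peril ratio = {peril_ratio:.3f}")  # larger than the risk ratio here
print(f"risk ratio  = {risk_ratio:.3f}")
```

For rare outcomes, −ln(1 − risk) ≈ risk, so the peril ratio and the risk ratio nearly coincide; they diverge as risks grow.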
Year: 2016 PMID: 27958353 PMCID: PMC5154187 DOI: 10.1038/srep39023
Source DB: PubMed Journal: Sci Rep ISSN: 2045-2322 Impact factor: 4.379
Figure 1. All 9 classes of sufficient cause for two binary exposures.
The results of the complementary log regression for Example 1.
| Variables | Regression coefficient | 95% confidence interval | P value |
|---|---|---|---|
| Intercept | 0.0446 | 0.0348, 0.0545 | <0.0001 |
| Age | 0.1142 | 0.0815, 0.1469 | <0.0001 |
| BMI | 0.0724 | 0.0514, 0.0934 | <0.0001 |
| Age × BMI | 0.0866 | 0.0335, 0.1397 | 0.0014 |
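Per the abstract, the cross-product coefficient serves directly as a test of sufficient-cause interaction. As a consistency check, the Wald z statistic and two-sided p-value for the Age × BMI term can be recovered from the estimate and its 95% confidence interval (a sketch assuming a normal approximation; the numbers are taken from the table above):

```python
import math

# Age x BMI row of Example 1: point estimate and 95% confidence limits.
beta, lo, hi = 0.0866, 0.0335, 0.1397

se = (hi - lo) / (2 * 1.959964)   # back out the standard error from the CI width
z = beta / se                     # Wald statistic
p = math.erfc(z / math.sqrt(2))   # two-sided p-value, P(|Z| > z)

print(f"SE = {se:.4f}, z = {z:.2f}, p = {p:.4f}")  # p agrees with the table's 0.0014
```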
The results of the complementary log regression for Example 2.
| Variables | Regression coefficient | 95% confidence interval | P value |
|---|---|---|---|
| Intercept | 0.0426 | 0.0053, 0.0799 | 0.0254 |
| Age | 0.1660 | 0.0570, 0.2743 | 0.0028 |
| Treatment | 0.0359 | −0.0300, 0.1019 | 0.2859 |
| Age × Treatment | 0.0098 | −0.1520, 0.1716 | 0.9057 |
The results of the complementary log regression for Example 3.
| Variables | Regression coefficient | 95% confidence interval | P value |
|---|---|---|---|
| Intercept | 0.0398 | 0.0163, 0.0633 | 0.0009 |
| Age (years) | | | |
| <40 (reference) | 0.0000 | | |
| 40–44 | −0.0039 | −0.0319, 0.0242 | 0.7879 |
| 45–49 | 0.0196 | −0.0150, 0.0543 | 0.2664 |
| 50–54 | 0.0486 | 0.0004, 0.0967 | 0.0480 |
| ≥55 | 0.0362 | −0.0187, 0.0911 | 0.1966 |
| Personality Type | 0.0399 | −0.0022, 0.0821 | 0.0631 |
| Age × Personality | | | |
| <40 (reference) | 0.0000 | | |
| 40–44 | −0.0049 | −0.0557, 0.0459 | 0.8514 |
| 45–49 | 0.0364 | −0.0258, 0.0986 | 0.2514 |
| 50–54 | 0.0387 | −0.0410, 0.1185 | 0.3412 |
| ≥55 | 0.0898 | −0.0032, 0.1828 | |