
A classification method of normal and overweight females based on facial features for automated medical applications.

Bum Ju Lee, Jun-Hyeong Do, Jong Yeol Kim.

Abstract

Obesity and overweight have become serious public health problems worldwide. Obesity and abdominal obesity are associated with type 2 diabetes, cardiovascular diseases, and metabolic syndrome. In this paper, we first suggest a method of predicting normal and overweight females according to body mass index (BMI) based on facial features. A total of 688 subjects participated in this study. We obtained an area under the ROC curve (AUC) value of 0.861 and a kappa value of 0.521 in the Female: 21-40 (females aged 21-40 years) group, and an AUC value of 0.760 and a kappa value of 0.401 in the Female: 41-60 (females aged 41-60 years) group. In both groups, we found many features showing statistically significant differences between normal and overweight subjects using an independent two-sample t-test. We demonstrated that it is possible to predict BMI status using facial characteristics. Our results provide useful information for studies of obesity and facial characteristics, and may provide useful clues in the development of applications for alternative diagnosis of obesity in remote healthcare.


Year:  2012        PMID: 22919277      PMCID: PMC3420233          DOI: 10.1155/2012/834578

Source DB:  PubMed          Journal:  J Biomed Biotechnol        ISSN: 1110-7243


1. Introduction

Obesity and overweight have become major health issues because the prevalence of obesity has risen rapidly worldwide. The causes of this phenomenon are excessive food intake, lack of physical activity, and environmental and genetic factors [1, 2]. Obesity and abdominal obesity are potential risk factors for insulin resistance and type 2 diabetes, cardiovascular diseases, stroke, ischemic heart disease, and metabolic syndrome [3-6], and many studies have investigated the relationship between obesity, disease, and body mass index (BMI) [7-13]. In medicine and public health, BMI is commonly used as an indicator of overall adiposity; thus, BMI is essential medical information for the prognostic prediction of diseases and for clinical therapy. The principal cutoff points for underweight (<18.50 kg/m2), normal range (18.50–24.99 kg/m2), overweight or preobese (25.00–29.99 kg/m2), and obese (≥30.00 kg/m2) have been set by the World Health Organization (WHO).

A large number of studies on the human face have focused on facial morphology, face recognition, and medicine [14-23]. Facial characteristics provide clinical information on the present or future health conditions of patients. For example, the status of the cheeks, neck circumference, and craniofacial morphology are associated with health complications such as type 2 diabetes, hypertension, and sleep apnea [18]. Using computed tomographic (CT) scanning, Levine et al. [19] showed that the quantity of buccal fat is strongly related to visceral abdominal fat accumulation, based on the observation that patients with chubby facial cheeks tend to have upper-body obesity, and argued that plump cheeks may indicate a high potential risk for metabolic complications related to obesity. Furthermore, using facial measurements, Sadeghianrizi et al. [20] showed that craniofacial morphology differs significantly between normal and obese adolescents; they suggested that the facial skeletal structures of obese adolescents tend to be relatively large and that obesity is associated with bimaxillary prognathism.

The motivation for this study is conveyed by the following 2 questions. Which features or facial characteristics are associated with overweight and normal BMI status? If we identify facial features that differ between normal and overweight subjects, how accurately can we classify normal and overweight using these features? The contributions of this study are as follows. We first propose a method of classifying normal and overweight status using only facial characteristics; to date, no study has addressed a method that predicts BMI status using facial features. Furthermore, through statistical analysis we introduce meaningful and discriminatory features that show statistically significant differences between normal and overweight subjects, and we identify compact and useful feature sets for BMI classification from facial features in the female groups. The results of this study will be useful in understanding the relationship between obesity-related diseases and facial characteristics.

2. Materials and Methods

2.1. Data Collection

A total of 688 subjects participated in this study. At the Korea Institute of Oriental Medicine, frontal and profile photographs of the subjects' faces with a neutral expression were acquired using a digital camera with a ruler (Nikon D700 with an 85 mm lens), and the subjects' clinical information, such as name, age, gender, weight, height, blood pressure, and pulse, was recorded. All images were captured at a resolution of 3184 × 2120 pixels in JPEG format. The height and weight of the subjects were measured with a digital scale (GL-150; G Tech International Co., Ltd, Republic of Korea). Based on identifiable feature points in the frontal and profile images, a total of 86 features were extracted. The extracted features included the distance between points n1 and n2 in a frontal (or profile) image, the vertical distance between n1 and n2 in a frontal (or profile) image, the angle formed by 3 points n1, n2, and n3 in a frontal (or profile) image, the area of the triangle formed by the 3 points n1, n2, and n3 in a profile image, and so forth. All points in the frontal and profile images are shown in Figure 1, and all the extracted features with brief descriptions are given in Table 1.
Figure 1

All points in a facial image for feature extraction ((a): points and areas in the frontal image; (b): points in the profile image; (c): points in the right eye; (d): points in the left eye). Distance, angle, and area measurements were performed with a custom tool implemented in MATLAB on Windows XP.

Table 1

All features used in this study and brief descriptions.

Feature | Brief description
FD n1_n2 | Distance between points n1 and n2 in a frontal (or profile) image
FDH n1_n2 | Horizontal distance between n1 and n2 in an image
FDV n1_n2 | Vertical distance between n1 and n2 in an image
FA n1_n2_n3 | Angle of three points n1, n2, and n3 in an image
FA n1_n2 | Angle between the line through 2 points n1 and n2 and a horizontal line
FR02_psu | FD(17, 26)/FD(18, 25)
FR03_psu | (FD(18, 25) + FD(118, 125))/FDH(33, 133)
FR05_psu | FDH(33, 133)/FD(43, 143)
FR06_psu | FDH(33, 133)/FDV(52, 50)
FR08_psu | FD(43, 143)/FDV(52, 50)
FArea02 | Area of the contour formed by the points 53, 153, 133, 194, 94, 33, and 53
FArea03 | Area of the contour formed by the points 94, 194, 143, 43, and 94
Fh_Cur_Max_Distan | Distance between points 7 and 77 in a profile image
Fh_Angle_n1_n2 | Angle between the line through 2 points n1 and n2 and a horizontal line
Nose_Angle_n1_n2 | Angle between the line through 2 points n1 and n2 and a horizontal line
Nose_Angle_n1_n2_n3 | Angle of 3 points n1, n2, and n3 in a frontal (or profile) image
SA n1_n2 | Angle between the line through 2 points n1 and n2 and a horizontal line
Fh_Cur_Max_R79_69 | FD(77, 9)/FD(6, 9)
Nose_Area_n1_n2_n3 | Area of the triangle formed by 3 points n1, n2, and n3 in a profile image
EUL_L_el1 ~ EUL_L_el7 | Slope of the tangent at a point (el1~el7) in a frontal image
EUL_L_DH | FDH(el1, el7)
EUL_L_MAX | FDH(el1, elmax)
EUL_L_RMAX | FDH(el1, elmax)/FDH(el1, el7)
EUL_L_Sb | FDV(el7, el1)/FDH(el7, el1)
EUL_L_St | FDV(elmax, el7)/FDH(elmax, el7)
EUL_L_Sf | FDV(elmax, el1)/FDH(elmax, el1)
EUL_L_Khmean | Average curvature of the left (or right) upper eyelid contour
EUL_L_khmax | Maximum curvature of the left (or right) upper eyelid contour
EUL_R_er1 ~ EUL_R_er7 | Slope of the tangent at a point (er1~er7) in a frontal image
EUL_R_DH | FDH(er1, er7)
EUL_R_MAX | FDH(er1, ermax)
EUL_R_RMAX | FDH(er1, ermax)/FDH(er1, er7)
EUL_R_Sb | FDV(er7, er1)/FDH(er7, er1)
EUL_R_St | FDV(ermax, er7)/FDH(ermax, er7)
EUL_R_Sf | FDV(ermax, er1)/FDH(ermax, er1)
EUL_R_Khmean | Average curvature of the left (or right) upper eyelid contour
EUL_R_khmax | Maximum curvature of the left (or right) upper eyelid contour
PDH44_53 | Horizontal distance between n1 and n2 in a frontal (or profile) image
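
The distance, angle, and area features in Table 1 are simple functions of 2-D landmark coordinates. The following Python sketch illustrates how such measurements could be computed; the helper names and example coordinates are hypothetical stand-ins, not the authors' MATLAB tool.

import math

# Hypothetical helpers illustrating the geometric features in Table 1,
# assuming landmarks are given as (x, y) pixel coordinates.

def fd(p1, p2):
    """FD n1_n2: Euclidean distance between two landmarks."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

def fdh(p1, p2):
    """FDH n1_n2: horizontal distance between two landmarks."""
    return abs(p2[0] - p1[0])

def fdv(p1, p2):
    """FDV n1_n2: vertical distance between two landmarks."""
    return abs(p2[1] - p1[1])

def fa(p1, p2, p3):
    """FA n1_n2_n3: angle (degrees) at p2 formed by the segments p2-p1 and p2-p3."""
    a1 = math.atan2(p1[1] - p2[1], p1[0] - p2[0])
    a2 = math.atan2(p3[1] - p2[1], p3[0] - p2[0])
    angle = abs(math.degrees(a1 - a2))
    return angle if angle <= 180 else 360 - angle

def triangle_area(p1, p2, p3):
    """Nose_Area_n1_n2_n3: area of the triangle formed by three landmarks (shoelace formula)."""
    return abs((p2[0] - p1[0]) * (p3[1] - p1[1]) - (p3[0] - p1[0]) * (p2[1] - p1[1])) / 2.0

# Example: a ratio feature such as FR05_psu = FDH(33, 133)/FD(43, 143),
# with made-up coordinates standing in for detected landmarks.
pts = {33: (100.0, 250.0), 133: (247.0, 252.0), 43: (105.0, 300.0), 143: (233.0, 303.0)}
fr05 = fdh(pts[33], pts[133]) / fd(pts[43], pts[143])  # about 1.15 for these sample points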

2.2. Normal and Overweight Cutoff Points

BMI was calculated as weight (kg) divided by the square of height (m) of the individual. The health consequences and BMI ranges of overweight and obesity are open to dispute [10, 24]. This is a natural consequence of the physiological and environmental factors associated with race, which lead to differences in BMI values; the assignment of BMI cutoffs for obesity and overweight depends on various factors, such as ethnic group, national economic status, and rural/urban residence [8]. For instance, BMI values of populations in Asian regions tend to be lower than those of populations in Western regions; however, Asians exhibit risk factors for cardiovascular disease and obesity-related diabetes at relatively low BMI values [11, 25]. In this study, we followed the WHO suggestions for the Asia-Pacific region to assign the cutoff point for each class [25]. The proposed categories are as follows: normal, 18.5–22.9 kg/m2; overweight, ≥23 kg/m2. Because facial features and BMI are influenced by gender and age [26], participants were divided into 2 groups: female: 21–40 (females aged 21–40 years) and female: 41–60 (females aged 41–60 years). Detailed data and basic statistics of each group are presented in Table 2; a small worked example of this BMI categorization follows the table.
Table 2

Subject characteristics and basic statistics (data are presented as mean (standard deviation); N: number of subjects, BMI: body mass index).

Class | Female: 21–40 | Female: 41–60
Normal: N | 189 | 193
Normal: Age | 32.1 (5.64) | 50.0 (5.42)
Normal: BMI | 22.2 (2.97) | 23.6 (2.86)
Overweight: N | 77 | 229
Overweight: Age | 32.91 (5.29) | 50.31 (5.44)
Overweight: BMI | 26.0 (2.75) | 25.6 (2.31)
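
As a small illustration of the BMI computation and the Asia-Pacific cutoffs described above, the following Python sketch assigns the two study classes; it is a minimal example under the stated cutoffs, not part of the original analysis.

def bmi(weight_kg, height_m):
    """BMI = weight (kg) divided by the square of height (m)."""
    return weight_kg / (height_m ** 2)

def bmi_class(value):
    """Asia-Pacific categories used in this study: normal 18.5-22.9, overweight >= 23."""
    if 18.5 <= value <= 22.9:
        return "normal"
    if value >= 23.0:
        return "overweight"
    return "underweight"  # outside the two classes used in the experiments

# Example: 62 kg and 1.61 m give a BMI of about 23.9, i.e., overweight under this cutoff.
print(round(bmi(62, 1.61), 1), bmi_class(bmi(62, 1.61)))
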
For the selection of useful and discriminatory features, only features with P-values < 0.05 in each group under an independent two-sample t-test were used in this study; that is, only features with P < 0.05 were included in the classification experiments. Consequently, the features used in each group differ because of age. A detailed analysis of the statistical data and the selected features is presented in Section 3.2.
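
A minimal Python sketch of this screening step is shown here, assuming a numeric feature matrix X (subjects by features), binary labels y (0 = normal, 1 = overweight), and a list of feature names; SciPy's independent two-sample t-test stands in for the SPSS analysis, and all variable names are hypothetical.

import numpy as np
from scipy import stats

def select_features(X, y, names, alpha=0.05):
    """Keep only features whose class means differ at P < alpha (independent two-sample t-test)."""
    X = np.asarray(X)
    y = np.asarray(y)
    selected = []
    for j, name in enumerate(names):
        t, p = stats.ttest_ind(X[y == 0, j], X[y == 1, j])
        if p < alpha:
            selected.append((name, t, p))
    return selected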

2.3. Preprocessing and Experiment Configurations

In the preprocessing step, the experiments were performed in 2 ways: (1) only normalization (scaling each feature to the range 0–1) was applied to the raw datasets, and (2) normalization and discretization were applied for better classification accuracy. We used the entropy-based multi-interval discretization (MDL) method introduced by Fayyad and Irani [27]. For classification performance evaluation, we used the area under the ROC curve (AUC) and kappa as the major evaluation criteria; additionally, sensitivity, 1-specificity, precision, F-measure, and accuracy were used for detailed performance analysis. All results were based on 10-fold cross-validation for a statistical evaluation of the learning algorithm. All experiments were conducted with the Naive Bayes classifier in the WEKA software [28], and statistical analyses were conducted with SPSS version 19 for Windows (SPSS Inc., Chicago, IL, USA).
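
For readers who want to reproduce a comparable pipeline outside WEKA, the sketch below is an approximate scikit-learn analogue of the first preprocessing variant (0–1 normalization, Naive Bayes, 10-fold cross-validation, with AUC, kappa, and accuracy). The Fayyad-Irani MDL discretization of variant (2) has no direct scikit-learn equivalent and is omitted; X (the selected facial features) and y (0 = normal, 1 = overweight) are assumed inputs.

from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MinMaxScaler
from sklearn.model_selection import StratifiedKFold, cross_val_predict
from sklearn.metrics import roc_auc_score, cohen_kappa_score, accuracy_score

def evaluate(X, y):
    """Cross-validated AUC, kappa, and accuracy for a 0-1 normalized Naive Bayes model."""
    model = make_pipeline(MinMaxScaler(), GaussianNB())
    cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
    proba = cross_val_predict(model, X, y, cv=cv, method="predict_proba")[:, 1]
    pred = (proba >= 0.5).astype(int)
    return {
        "AUC": roc_auc_score(y, proba),
        "kappa": cohen_kappa_score(y, pred),
        "accuracy": accuracy_score(y, pred),
    }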

3. Results and Discussion

3.1. Performance Evaluation

As a brief summary of the performance evaluation, the AUC and kappa values for the 2 groups with and without the MDL method (i.e., the 2 preprocessing approaches) are depicted in Figure 2.
Figure 2

A comparison of performance evaluations using AUC and kappa in 2 female groups (AUC-MDL and Kappa-MDL: use of MDL, AUC and Kappa: without the use of MDL).

The AUC values of the method using MDL in the 2 female groups ranged from 0.760 to 0.861, whereas the AUC values of the method without MDL ranged from 0.730 to 0.771. The AUC and kappa values of the method using MDL showed improvements of 0.09 and 0.115, respectively, in the female: 21–40 group, and of 0.03 and 0.073, respectively, in the female: 41–60 group. Comparing AUC and kappa values, the classification performance of the method with MDL was higher than that of the method without MDL; that is, applying MDL yielded a clearly better BMI classification than not applying it. The identification of normal and overweight subjects in the female: 41–60 group was more difficult than in the female: 21–40 group. The exact reason behind this phenomenon is unknown, but obesity- and menopause-related research offers some clues [29-31]. Menopause leads to changes in fat tissue distribution, body composition, waist-to-hip ratio (WHR), and waist-to-height ratio (W/Ht) in females. For instance, Douchi et al. [29] demonstrated that the lean mass of the head did not differ between premenopausal and postmenopausal females, whereas that of the trunk and legs was altered following menopause. Detailed results of the performance evaluation for each class and group are described in Tables 3 and 4. We believe these results imply the possibility of predicting normal and overweight status using facial information.
Table 3

Detailed performance evaluation of experiments using the MDL method in 2 groups (Sen.: sensitivity, 1-spe.: 1-specificity, Pre.: precision, F-Me.: F-measure, and Acc.: accuracy).

Group | Class | Sen. | 1-spe. | Pre. | F-Me. | Acc.
Female: 21–40 | Normal | 0.884 | 0.377 | 0.852 | 0.868 | 80.8%
Female: 21–40 | Overweight | 0.623 | 0.116 | 0.686 | 0.653 |
Female: 41–60 | Normal | 0.653 | 0.253 | 0.685 | 0.668 | 70.4%
Female: 41–60 | Overweight | 0.747 | 0.347 | 0.718 | 0.732 |

Table 4

Detailed performance evaluation of experiments without the use of MDL method (Sen.: sensitivity, 1-spe.: 1-specificity, Pre.: precision, F-Me.: F-measure, and Acc.: accuracy).

Group | Class | Sen. | 1-spe. | Pre. | F-Me. | Acc.
Female: 21–40 | Normal | 0.788 | 0.364 | 0.842 | 0.814 | 74.4%
Female: 21–40 | Overweight | 0.636 | 0.212 | 0.551 | 0.59 |
Female: 41–60 | Normal | 0.684 | 0.354 | 0.62 | 0.65 | 66.4%
Female: 41–60 | Overweight | 0.646 | 0.316 | 0.708 | 0.676 |

3.2. Statistical Analysis of Facial Features

Statistical analysis of the comparison between normal and overweight classes was performed using an independent two-sample t-test, and a P-value < 0.05 was considered statistically significant. Features with a P-value < 0.05 in each group are described in Tables 5 and 6.
Table 5

Statistical analysis of female: 21–40 group by an independent two-sample t-test (Std.: standard deviation).

Feature | Normal, mean (Std.) | Overweight, mean (Std.) | t | P-value
FD17_26 | 9.473 (1.317) | 8.941 (1.115) | 3.118 | 0.002
FD117_126 | 9.483 (1.303) | 8.904 (1.257) | 3.319 | 0.001
FDH25_125 | 96.53 (5.116) | 98.52 (6.32) | −2.69 | 0.0076
FDH36_136 | 23.57 (2.469) | 24.46 (2.191) | −2.75 | 0.0064
FD18_25 | 29.94 (2.675) | 30.68 (2.753) | −2.036 | 0.0428
FD43_143 | 125.2 (7.101) | 133.6 (7.384) | −8.625 | 0.0000
FD53_153 | 145.4 (5.941) | 150.7 (7.642) | −5.991 | 0.0000
FD94_194 | 140.1 (6.022) | 147.6 (6.934) | −8.875 | 0.0000
FDH33_133 | 147.2 (5.63) | 153.1 (7.02) | −7.261 | 0.0000
FA18_17_25 | 126.2 (6.591) | 128.6 (6.75) | −2.684 | 0.0077
FA118_117_125 | 125 (7.339) | 128.3 (6.199) | −3.56 | 0.0004
FA18_25_43 | 95.38 (5.104) | 97.91 (4.896) | −3.722 | 0.0002
FA118_125_143 | 96.16 (4.753) | 98.39 (5.082) | −3.396 | 0.0008
FA18_17_43 | 76.97 (6.255) | 80.66 (6.108) | −4.39 | 0.0000
FA118_117_143 | 76.82 (6.824) | 80.9 (5.583) | −4.644 | 0.0000
FA117_125 | 21.24 (3.645) | 19.19 (4.142) | 3.983 | 0.0001
FA17_18 | 34.01 (5.091) | 32.61 (5.32) | 2.002 | 0.0463
FR02_psu | 0.318 (0.044) | 0.293 (0.041) | 4.199 | 0.0000
FR05_psu | 1.178 (0.055) | 1.148 (0.048) | 4.183 | 0.0000
FR06_psu | 2.039 (0.117) | 2.123 (0.115) | −5.334 | 0.0000
FR08_psu | 1.736 (0.151) | 1.854 (0.147) | −5.783 | 0.0000
FArea02 | 6470 (644.4) | 6654 (652.2) | −2.106 | 0.0362
FArea03 | 3596 (364.9) | 3873 (361.9) | −5.637 | 0.0000
Fh_Cur_Max_Distan | 3.654 (1.564) | 3.233 (1.585) | 1.984 | 0.0483
FDH12_14 | 18.58 (2.713) | 19.69 (2.817) | −3.006 | 0.0029
Nose_Angle_14_12 | 61.07 (4.611) | 59.29 (4.108) | 2.946 | 0.0035
Nose_Angle_12_14_21 | 106.7 (4.634) | 105.1 (5.237) | 2.397 | 0.0172
EUL_L_el2 | −0.637 (0.095) | −0.597 (0.087) | −3.135 | 0.0019
EUL_L_el3 | −0.22 (0.118) | −0.17 (0.11) | −3.206 | 0.0015
EUL_L_el6 | 0.483 (0.105) | 0.432 (0.113) | 3.473 | 0.0006
EUL_L_DH | 3.178 (0.248) | 3.268 (0.292) | −2.53 | 0.0120
EUL_L_Sf | 0.408 (0.106) | 0.371 (0.132) | 2.442 | 0.0153
EUL_R_er2 | −0.63 (0.087) | −0.582 (0.095) | −3.957 | 0.0001
EUL_R_er3 | −0.208 (0.112) | −0.167 (0.1) | −2.822 | 0.0051
EUL_R_er6 | 0.466 (0.106) | 0.43 (0.111) | 2.492 | 0.0133
EUL_R_er7 | 0.647 (0.235) | 0.556 (0.29) | 2.432 | 0.0165
EUL_R_DH | 3.188 (0.226) | 3.322 (0.241) | −4.292 | 0.0000
EUL_R_RMAX | 0.443 (0.069) | 0.424 (0.066) | 2.061 | 0.0403
EUL_R_St | −0.633 (0.117) | −0.592 (0.123) | −2.525 | 0.0122
EUL_R_Sf | 0.395 (0.106) | 0.36 (0.104) | 2.452 | 0.0149
EUL_R_Khmean | 0.024 (0.007) | 0.022 (0.007) | 2.868 | 0.0045
PDH44_53 | 89.38 (6.081) | 91.79 (5.527) | −3.017 | 0.0028

Table 6

Statistical analysis of female: 41–60 group by an independent two-sample t-test (Std.: standard deviation).

Feature | Normal, mean (Std.) | Overweight, mean (Std.) | t | P-value
FDH25_125 | 94.63 (5.466) | 96.29 (5.493) | −3.097 | 0.0021
FDH36_136 | 24.84 (2.283) | 25.36 (2.805) | −2.055 | 0.0405
FD18_25 | 29.37 (3.287) | 30.04 (2.923) | −2.199 | 0.0284
FD17_25 | 17.83 (2.717) | 18.36 (2.471) | −2.076 | 0.0385
FD43_143 | 127.4 (6.471) | 133.1 (7.721) | −8.184 | 0.0000
FD53_153 | 143.9 (6.343) | 147.2 (7.141) | −4.848 | 0.0000
FD94_194 | 141.8 (6.01) | 146.9 (6.485) | −8.385 | 0.0000
FDH33_133 | 146.8 (6.057) | 150.9 (6.582) | −6.615 | 0.0000
FA18_25_43 | 99.88 (5.308) | 101.2 (4.954) | −2.589 | 0.0100
FA118_125_143 | 99.74 (4.776) | 101.9 (5.373) | −4.343 | 0.0000
FA117_125_143 | 124.7 (5.38) | 126 (5.471) | −2.438 | 0.0152
FA18_17_43 | 81.11 (6.753) | 82.85 (6.574) | −2.676 | 0.0077
FA118_117_143 | 80.69 (6.449) | 83.16 (7.35) | −3.632 | 0.0003
FR02_psu | 0.295 (0.044) | 0.285 (0.051) | 2.182 | 0.0297
FR05_psu | 1.154 (0.046) | 1.135 (0.049) | 3.966 | 0.0001
FR06_psu | 2.006 (0.104) | 2.068 (0.121) | −5.688 | 0.0000
FR08_psu | 1.743 (0.134) | 1.827 (0.157) | −5.935 | 0.0000
FArea02 | 6358 (618.3) | 6501 (696.7) | −2.212 | 0.0275
FArea03 | 3886 (397.6) | 4052 (402.6) | −4.245 | 0.0000
FDV12_14 | 33.85 (3.313) | 33 (3.571) | 2.516 | 0.0123
FDH14_21 | 12.9 (1.633) | 12.53 (1.889) | 2.163 | 0.0311
Nose_Angle_14_21 | 45.73 (4.983) | 46.98 (5.765) | −2.402 | 0.0168

In female: 21–40, 42 features were significantly different between the normal and overweight classes (P < 0.05), and 11 of these features exhibited highly significant differences (P < 0.0001). Four features concerning distances between points n1 and n2 in a frontal image (FD43_143, FD53_153, FD94_194, and FDH33_133, related to mandibular width or face width) exhibited particularly significant differences. The features FA18_17_43 and FA118_117_143, representing the angles formed by the three points n1 (medial canthus), n2 (midpoint of the upper eyelid), and n3 (mandibular ramus) in a frontal image, were also highly significantly different. Comparing the female: 21–40 and female: 41–60 groups, many features related to the eyelid were significant in female: 21–40 but not in female: 41–60. For instance, EUL_R_DH (the horizontal distance from er1 to er7 in the eye image) was highly significantly different between the normal and overweight classes: the means of EUL_R_DH in the normal and overweight classes were 3.188 (0.226) and 3.322 (0.241), respectively (t = −4.292, P < 0.0001).

In female: 41–60, a total of 21 features were significantly different between the normal and overweight classes, and 8 of these features were highly significantly different (FD43_143, FD53_153, FD94_194, FDH33_133, FA118_125_143, FR06_psu, FR08_psu, and FArea03; P < 0.0001). Many features were significantly different between the normal and overweight classes only in a particular age group: 25 features, such as EUL_R_St, FD117_126, Fh_Cur_Max_Distan, FDH12_14, EUL_R_DH, and EUL_R_Khmean, were found only in the female: 21–40 group, whereas FD17_25, FA117_125_143, FDV12_14, FDH14_21, and Nose_Angle_14_21 were found only in female: 41–60.

3.3. Medical Applications and Limitations

Patients or potential patients with obesity-related diseases must constantly check their own BMI based on their weight. Measurements using calibrated scales and rulers are ideal, but they may not always be possible for critically ill patients [32] or in real-time telemedicine and emergency medical services at remote locations. Our method was designed for situations in which such measurements cannot be made, for example elderly trauma or intensive care in emergency medicine, remote healthcare, and so forth. Several studies have addressed patient BMI and weight estimation in emergency medical services and telemedicine [32-35]; these estimates are important for accurate drug dosage, countershock voltage calculation, and treatment, particularly in situations of serious illness such as elderly trauma or intensive care [33, 34]. Moreover, many patients are not aware of their exact body weight because body weight changes over time. For example, although patient self-estimates of weight are better than estimates by residents and nurses in emergency departments, 22% of patients do not estimate their own weight to within 5 kg [34]. The method described herein may provide clues for the development of alternative BMI-estimation methods in the above situations and in telemedicine, because facial characteristics provide substantial clinical information on the present or future health conditions of patients [18, 19].

4. Conclusions

The relationship between obesity, disease, and facial characteristics associated with health complications has long been researched. Here, we have proposed and demonstrated the possibility of identifying normal and overweight status using only facial characteristics, and we found statistically significant differences between the 2 classes in 2 female groups. Although problems remain to be solved before BMI status can be classified completely, this method provides basic information that may benefit studies in face recognition, obesity, facial morphology, medical science, telemedicine, and emergency medicine.