| Literature DB >> 35254406 |
Bingjie Zhou1, Shiwei Liang1, Kyle M Monahan2, Gitanjali M Singh1, Ryan B Simpson1, Julia Reedy1, Jianyi Zhang1, Annie DeVane1, Melissa S Cruz1, Anastasia Marshak1, Dariush Mozaffarian1, Dantong Wang3, Iaroslava Semenova3, Ivan Montoliu3, Daniela Prozorovscaia3, Elena N Naumova1.
Abstract
The rapid expansion of food and nutrition information requires new ways of data sharing and dissemination. Interactive platforms integrating data portals and visualization dashboards have been effectively utilized to describe, monitor, and track information related to food and nutrition; however, a comprehensive evaluation of emerging interactive systems is lacking. We conducted a systematic review of publicly available dashboards using a set of 48 evaluation metrics for data integrity, completeness, granularity, visualization quality, and interactivity, based on 4 major principles: evidence, efficiency, emphasis, and ethics. We evaluated 13 dashboards; summarized their characteristics, strengths, and limitations; and provided guidelines for developing nutrition dashboards. We applied mixed effects models to summarize evaluation results adjusted for interrater variability. The proposed metrics and evaluation principles help to improve data standardization and harmonization and dashboard performance and usability, broaden information and knowledge sharing among researchers, practitioners, and decision makers in the field of food and nutrition, and accelerate data literacy and communication.
Keywords: dashboard evaluation; data visualization; nutrition and food surveillance; nutrition data quality; quality metrics
Year: 2022 PMID: 35254406 PMCID: PMC9156375 DOI: 10.1093/advances/nmac022
Source DB: PubMed Journal: Adv Nutr ISSN: 2161-8313 Impact factor: 11.567
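The abstract notes that mixed effects models were used to adjust dashboard scores for interrater variability. As a rough, self-contained illustration only (not the authors' actual model, data, or rater identities), the sketch below simulates 6 raters scoring 13 dashboards with some ratings missing, then subtracts each rater's leniency (rater mean minus grand mean) before averaging — a simple approximation to a model with a random rater intercept:

```python
# Illustrative sketch of rater adjustment in the spirit of the mixed
# effects models mentioned in the abstract. When ratings are missing for
# some rater-dashboard pairs, crude dashboard means are biased by which
# raters happened to score them; removing each rater's leniency corrects
# for this. All values are simulated; none come from the study.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
records = []
for d_idx in range(13):                       # 13 dashboards
    quality = 2.5 + 0.15 * d_idx              # latent dashboard quality
    for r_idx in range(6):                    # 6 raters
        if (d_idx + r_idx) % 5 == 0:          # deterministic missingness
            continue
        leniency = 0.25 * (r_idx - 2.5)       # latent rater leniency
        score = quality + leniency + rng.normal(0, 0.3)
        records.append({"dashboard": f"D{d_idx + 1:02d}",
                        "rater": f"R{r_idx + 1}",
                        "score": float(np.clip(score, 1, 5))})
df = pd.DataFrame(records)

grand_mean = df["score"].mean()
rater_effect = df.groupby("rater")["score"].transform("mean") - grand_mean
df["score_adj"] = df["score"] - rater_effect  # remove rater leniency

crude = df.groupby("dashboard")["score"].mean()
adjusted = df.groupby("dashboard")["score_adj"].mean()
```

In a fully balanced design this adjustment leaves dashboard means unchanged; it matters precisely when some rater-dashboard pairs are unrated, as simulated above.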
Three sets of search terms and keywords
| Set 1 | Set 2 | Set 3 |
|---|---|---|
| Nutrition | Dashboard | Surveillance |
| Nutritional | Data visualization tool | Monitor |
| Dietary | Data visualization platform | Health |
| Food | Atlas | Population health |
| Malnutrition | Database | — |
| Hunger | — | — |
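Search-term sets like those above are typically combined with OR within a set and AND across sets. A hypothetical sketch of assembling such a boolean query (the exact query syntax the authors used is not specified in this entry):

```python
# Hypothetical sketch: combine the three keyword sets into one boolean
# query string -- OR within a set, AND across sets. Multi-word phrases
# are quoted so they match as a unit.
set1 = ["Nutrition", "Nutritional", "Dietary", "Food", "Malnutrition", "Hunger"]
set2 = ["Dashboard", "Data visualization tool", "Data visualization platform",
        "Atlas", "Database"]
set3 = ["Surveillance", "Monitor", "Health", "Population health"]

def or_group(terms: list[str]) -> str:
    quoted = [f'"{t}"' if " " in t else t for t in terms]
    return "(" + " OR ".join(quoted) + ")"

query = " AND ".join(or_group(s) for s in (set1, set2, set3))
print(query)
```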
Principles, indicators, and metrics for nutrition data and dashboards
| Principles | Indicators | Subindicators | Metrics |
|---|---|---|---|
| Evidence | Goals and scope | — | 1. Overall goals are clearly stated |
| | | | 2. Data offered by the platform support the claimed goals |
| | | | 3. Data are related to nutrition and diet |
| | | | 4. A sufficient description and reasonable justification are provided for each indicator/variable |
| | Data quality | Data integrity | 5. All the data are accompanied by a clear description of data sources, links, references, and metadata |
| | | | 6. The dashboard provides information on population coverage, representativeness, timeframe, and sampling design |
| | | | 7. The dashboard clarifies data collection methods: surveys, cohorts, census, surveillance, or estimates from statistical models |
| | | Standardization | 8. The data can be compared across time, geographical locations, and different indicators |
| | | | 9. The quantity unit for data in the same category is uniform, with sufficient information on terms, definitions, units, measurement, and calibration methods |
| | | | 10. The dashboard clarifies the levels of available geospatial aggregation: postal code, village/city/town, state/province/region, and country |
| | | | 11. The dashboard clarifies the levels of available temporal aggregation: day, week, month, quarter, year, etc. |
| | | Granularity | 12. For a given level of geospatial aggregation, the data can be further divided into geographical subdivisions |
| | | | 13. For a given level of temporal aggregation, the data can be further divided into temporal subunits |
| | | | 14. The data are broken down by exhaustive demographic characteristics for an individual (e.g., gender, age, ethnicity, education) |
| | | | 15. The data are broken down by exhaustive demographic characteristics for a household |
| | | | 16. The data are broken down by exhaustive nutrition characteristics such as nutrients, food items, food groups, and diets |
| | | Completeness | 17. The amount and proportion of missing data for individual variables, including time and georeferencing, are stated |
| | | | 18. The proportion of missing data is <10% |
| | | | 19. Missing data are clearly denoted as missing, NA, none, or not available |
| | | | 20. Reasons for missingness are clearly stated |
| Efficiency | Platform capability | — | 21. The dashboard includes a data portal that allows users to access the data |
| | | | 22. The platform allows users to explore, review, and download selected data in commonly used formats |
| | | | 23. The dashboard is up to date and has the flexibility to incorporate more data |
| | Visualization quality | Readability | 24. The dashboard provides adequate and effective visualizations to demonstrate the distribution, time trend, quality, quantity, and relations of the data |
| | | | 25. The dashboard provides a sufficient description for each visualization with adequate details on the context of the data |
| | | | 26. The visualizations have clear titles, axis labels, data units, and color/marker legends |
| | | | 27. The visualizations use effective and accessible color schemes and font sizes |
| | | | 28. The visualizations are available in high resolution |
| | | Interactivity | 29. The dashboard includes interactive data visualizations |
| | | | 30. The dashboard includes drop-down menus with multiple indicators |
| | | | 31. The dashboard includes drop-down menus with different levels of detail |
| | | | 32. The dashboard includes a zoom in/out option |
| | | | 33. The dashboard includes a visualization download option |
| | | | 34. The dashboard allows users to manipulate visualizations to accommodate key interests |
| Emphasis | Platform accessibility | — | 35. The dashboard appears within the top results in search engines (e.g., Google, Bing) and follows SEO guidelines (keywords present, text highlightable, fast loading) |
| | | | 36. The platform is user-friendly and allows users to easily navigate and explore data and visualizations according to ADA website compliance (e.g., text readable, alt text on hovering over images, headers present) |
| | | | 37. The dashboard includes a tutorial to walk users through the entire platform (e.g., informative pop-ups, a tutorial video) |
| | | | 38. The dashboard includes sufficient reference material: publications related to the topics in the platform and information on the methodologies for building the dashboard and handling the data |
| | Comprehension | — | 39. The dashboard visualizations provide appropriate highlights of the text, markers, reference lines, common trends, and patterns |
| | | | 40. The dashboard has a good storytelling strategy with a clear focus on the main points |
| | | | 41. There are clear logical flows and connections between visualizations, multipanel plots, and embedded descriptions |
| | | | 42. The dashboard offers meaningful comparisons between groups, time periods, and geographic locations using visualizations |
| | | | 43. The classification, presentation order, and length of different topics are well balanced |
| | Contacts and communication | — | 44. The dashboard includes contact information for questions, suggestions, and feedback |
| | | | 45. The dashboard requests information from users and provides a password-protected working environment |
| Ethics | Conflicts of interest | — | 46. Funding sources and roles of funders are clearly stated |
| | | | 47. Roles of all contributors to the dashboard are clearly described |
| | Responsible conduct | — | 48. The dashboard explicitly highlights responsibilities to the public, funders, research subjects, research team colleagues, and other statisticians or statistics practitioners, according to ASA Guidelines |
ADA, Americans with Disabilities Act; ASA, American Statistical Association; SEO, Search Engine Optimization.
FIGURE 1 A heatmap of dashboards' evaluation results, with metrics arranged by principle, indicator, and subindicator. Metrics are shown as numbers aligned with their order in Table 2. The heatmap represents crude mean scores from 6 raters for each metric and each of the 13 dashboards, with light yellow indicating a score of 1 and dark green a score of 5. Crude scores with SD across all dashboards for each metric are shown in the second-to-last column. Average scores adjusted for rater and dashboard variability, obtained from the mixed effects models, are shown in the last column and under the name of each principle, indicator, and subindicator. Dashboards are listed in descending order of overall performance scores, aligned with Supplemental Table 1. Dashboard performance scores with SD adjusted for interrater variability are shown in the third- and second-to-last rows, and dashboard scores significantly different from the average score (P < 0.05) are marked with asterisks. Intraclass correlation (ICC) scores for each dashboard are shown in the last row.
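The ICC reported in the last row of the figure quantifies agreement among raters. As a self-contained illustration (the authors' exact ICC formulation is not specified in this entry), the sketch below computes a one-way random-effects ICC(1,1) from a targets-by-raters score matrix, using simulated data shaped like this study's design (48 metrics, 6 raters):

```python
# One-way random-effects ICC(1,1) from a one-way ANOVA decomposition.
# This is one common ICC variant; the study's exact formulation may differ.
import numpy as np

def icc_oneway(scores: np.ndarray) -> float:
    """ICC(1,1) for an (n_targets, k_raters) matrix of ratings."""
    n, k = scores.shape
    grand = scores.mean()
    target_means = scores.mean(axis=1)
    # Between-target and within-target mean squares
    msb = k * ((target_means - grand) ** 2).sum() / (n - 1)
    msw = ((scores - target_means[:, None]) ** 2).sum() / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Simulated for illustration: 48 metrics rated by 6 raters
rng = np.random.default_rng(0)
true_scores = rng.uniform(1, 5, size=(48, 1))          # latent metric scores
ratings = true_scores + rng.normal(0, 0.4, size=(48, 6))  # rater noise
print(round(icc_oneway(ratings), 2))
```

With identical ratings across raters the statistic equals 1; larger rater noise relative to between-metric spread drives it toward 0.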