Qaiser Mukhtar, Prachi Mehta, Erica R Brody, Jenny Camponeschi, Michael Friedrichs, Angela M Kemple, Brenda Ralls.
Abstract
Developing a Web-based tool that involves the input, buy-in, and collaboration of multiple stakeholders and contractors is a complex process. Several elements facilitated the development of the Web-based Diabetes Indicators and Data Sources Internet Tool (DIDIT). The DIDIT is designed to enhance the ability of staff within the state-based Diabetes Prevention and Control Programs (DPCPs) and the Centers for Disease Control and Prevention (CDC) to perform diabetes surveillance. It contains information on 38 diabetes indicators (measures of health or factors associated with health) and 12 national- and state-level data sources. Developing the DIDIT required one contractor to conduct research on content for diabetes indicators and data sources and another contractor to develop the Web-based application to house and manage the information. During 3 years, a work group composed of representatives from the DPCPs and the Division of Diabetes Translation (DDT) at the CDC guided the development process by 1) gathering information on and communicating the needs of users and their vision for the DIDIT, 2) reviewing and approving content, and 3) providing input into the design and system functions. Strong leadership and vision of the project lead, clear communication and collaboration among all team members, and a commitment from the management of the DDT were essential elements in developing and implementing the DIDIT. Expertise in diabetes surveillance and software development, enthusiasm, and dedication were also instrumental in developing the DIDIT.
Year: 2005 PMID: 16356373 PMCID: PMC1500969
Source DB: PubMed Journal: Prev Chronic Dis ISSN: 1545-1151 Impact factor: 2.830
Timetable for Development of the Diabetes Indicators and Data Sources Internet Tool (DIDIT), 2001–2005
| Date | Activity |
|---|---|
| August 2001–December 2001 | Formed a work group composed of staff representing the Division of Diabetes Translation (DDT) at the Centers for Disease Control and Prevention (CDC) and eight Diabetes Prevention and Control Programs (DPCPs). |
| | Conducted six focus groups in Atlanta, Ga, to assess needs and elicit input on a draft vision statement from DPCP staff members. |
| | Recruited additional work group members. |
| | Revised the concept for the tool based on DPCP feedback. |
| January 2002–December 2002 | Selected 10 indicators to be piloted and finalized the data fields that would describe indicators and their data sources, based on feedback from work group members. |
| | Developed content for 10 pilot indicators with assistance from a contractor. |
| | Presented the revised concept of the DIDIT at the annual conference for the DDT. |
| | Worked with a contractor to develop a Web site that would house DIDIT information. |
| | Selected an additional 28 indicators to be included in the DIDIT, using a two-round Delphi process. |
| December 2002 | Finalized development of a demonstration Web site to be used for displaying and reviewing DIDIT content as it was developed. |
| January 2003 | Organized the first in-person meeting in Atlanta, Ga. The work group reviewed the demonstration Web site containing information on the 10 pilot indicators and developed a process for reviewing the content for indicators and associated data sources. |
| January 2003–March 2003 | Revised content of the 10 pilot indicators and completed content development of the 28 additional indicators. |
| | Uploaded indicators and data sources to the demonstration Web site. |
| | Revised content based on work group feedback. |
| | Obtained final approval of content. |
| April 2003 | Uploaded revised indicators and data sources to the demonstration Web site. |
| | Conducted DIDIT usability testing at the DDT annual meeting. |
| | Presented the DIDIT demonstration model at the DDT annual meeting. |
| May 2003–June 2003 | Revised DIDIT content based on usability-test feedback. |
| July 2003–August 2003 | Pilot tested content, design, and functionality with nine DPCPs. |
| August 2003 | Revised content and design based on pilot-test feedback and work group review. |
| September 2003 | Made a live version of the DIDIT available to all DPCPs and DDT staff. |
| November 2003 | Conducted a panel presentation describing and demonstrating the DIDIT at the annual American Public Health Association conference. |
| December 2003 | Presented the DIDIT at the annual meeting of DPCP program directors. |
| October 2003–April 2004 | Provided training to CDC project development officers and DPCP staff. |
| April 2004–January 2005 | Presented the DIDIT at the monthly DDT "All Hands" meeting. |
| | Had users share DIDIT experiences at the annual meeting of DPCP program directors. |
| | Developed and released a DIDIT glossary with 100 epidemiology and surveillance terms. |
| | Developed and released a section in which users share DIDIT experiences. |
| | Analyzed requirements for adding a section to provide users with resources on surveillance and epidemiology. |
| January 2005–May 2005 | Developed and conducted a DIDIT evaluation with nine DPCPs. |
| | Designed and developed the resources section. |
| | Finalized and tested a protocol for updating DIDIT content. |
Four Phases of the Software Development Life Cycle, Diabetes Indicators and Data Sources Internet Tool (DIDIT)
| Phase | Description |
|---|---|
| 1. Planning | Articulate objectives and scope of DIDIT systems; ensure technical feasibility. |
| 2. Analysis | Define in detail the information system that will provide users with the benefits they desire. |
| 3. Design | Focus on how the information system will function, including design of the application, database, user interface, and operating environment. |
| 4. Implementation | Code, pilot test, and deploy the application; train users. |
| Benefit |
|---|
| 1. Provides validation that system function, design, and content are consistent with the responses elicited from users during the processes of requirements gathering and usability testing. This validation closes the information loop and confirms earlier assumptions. |
| 2. Enables exploration of requirements or ideas suggested by users after the processes of requirements gathering and usability testing are complete. Although it might be too late to include these features in the first release, they can be incorporated into later phases. |
| 3. Allows users to work with a real-life model, permitting them to visualize and respond to more advanced requirements they may find difficult to comprehend without such a model. Users also understand more advanced requirements when they can work with a system designed for fundamental needs and functions. In addition, requirements often build on one another. |
| 4. Permits testing among a small subset of a large population of users, preferably subsets that differ from those selected in earlier development phases. This ensures a more representative sampling throughout the development process, and it ensures that feedback is well-rounded and unbiased. Although not all suggestions made during the pilot-testing phase are ultimately incorporated, the process often sparks ideas for future enhancements and provides insights for training and user support. |