Jack S Nunn1,2, Thomas Shafee3, Steven Chang4, Richard Stephens5, Jim Elliott6, Sandy Oliver7,8, Denny John9,10, Maureen Smith11, Neil Orr12,13, Jennifer Preston14, Josephine Borthwick15, Thijs van Vlijmen16, James Ansell17, Francois Houyez18, Maria Sharmila Alina de Sousa19, Roan D Plotz20, Jessica L Oliver21, Yaela Golumbic22, Rona Macniven23,24,25, Samuel Wines26, Ann Borda15,27, Håkon da Silva Hyldmo28, Pen-Yuan Hsing29,30, Lena Denis31, Carolyn Thompson27,32.
Abstract
Keywords: Data; Democracy; Evidence; Genomics; Health; Indigenous; Open; Participatory; Standardised; Systematic
Year: 2022 PMID: 35854364 PMCID: PMC9294764 DOI: 10.1186/s40900-022-00363-9
Source DB: PubMed Journal: Res Involv Engagem ISSN: 2056-7529
Example applications of STARDIT
| Area | Sub-area | Relevant data categories |
|---|---|---|
| Research | Health research | Reporting: Funding, conflicting or competing interests, co-design, experts involved, people affected involved, methods, process for deciding and measuring outcomes, protocols, who is accountable for ensuring protocol is followed, information about data storage, sharing, ownership and custodianship, information about data security practices and standards, information about consent and withdrawal processes evaluation of entire research process, ethical review, information about data analysis and data validation |
| Social research | ||
| Genomics research | ||
| Environmental research | ||
| Policy | Health and social policy | Reporting: Values of people involved, sources of data and evidence, data on past and current initiatives and spending |
| Other government policy (transport, arts, education, environment etc.) | ||
| Foreign policy | ||
| Proposed policy (including draft policy and manifestoes) | ||
| International development | ||
| Education and learning | Educational initiatives | Reporting: Sources of data and evidence for intervention, purpose of intervention, process for educational intervention creation, funding, conflicting or competing interests, experts involved, people affected involved, process for deciding and measuring outcomes, outcomes from intervention, evaluation of intervention, ethical review |
| Arts | Community arts projects | Reporting: Purpose of project, process for project design and implementation, experts involved, people from communities intended to benefit involved, funding, conflicting or competing interests, process for evaluating project, project evaluation, project outcomes |
| Arts funding | Reporting: People involved in deciding funding process, purpose of funding, people allocating funding (funding sources), funding amount, conflicting or competing interests, process for deciding outcomes of funding, evaluating the funding allocation process | |
| Information, media and cultural heritage | Health and medical information | Reporting: People involved in researching, writing (including medical writers), creating, reviewing (including peer reviewers), disseminating and funding, information about any potential risks (to human health or lifeforms, natural or cultural heritage), information about who assessed those risks and how (for example, medical information standards) |
| Disaster and emergency communication | ||
| Public interest, factual information commentary, documentaries and other informative media | ||
| Intangible cultural heritage (including folklore, traditions, language), traditional, local and Indigenous knowledge and wisdom | Reporting: Who created any content containing the Indigenous or traditional knowledge, what tasks they had, how this knowledge was shared and any relevant concepts of ‘owning’ or ‘property’; reporting who knows certain things (for example, people who are recognised as ‘Preservers of Important Intangible Cultural Properties’) | |
| Tangible cultural heritage (including cultural property) | Reporting: Who was involved in creating the property, any concepts of ownership or guardianship in relation to the property, data about ongoing management (including monitoring, exhibiting, restoring or moving), data about cultural significance and stakeholders involved in defining this | |
| Hardware designs (including hardware architecture, device designs or other abstract representations) | Reporting: Who was involved in creating the designs and how, who reviewed them and how (including relevant safety, regulation or standards information), what formats are the designs shared as and in what medium, information on licence(s), outcomes and impact of the hardware | |
| Code and algorithms | Reporting: Who created code (including algorithms), who is involved in reviewing and scrutinising code (including who is involved in which ethical review processes), what code is part of which distinct projects or forks, what language the code is in, what medium (for example, machine or DNA), information about ownership of data or knowledge (including concepts of intellectual property and copyright), information on licence(s), purpose of code, outcomes and impact of the code | |
| Management and monitoring | Environmental and natural heritage, natural resource management | Reporting: Data about who was involved in service design, monitoring and management processes, data about funding for monitoring or management (for example, funding for pollution monitoring), data about how information will be stored and shared (including what will be redacted and data security), data about who decides what data will be redacted and how this decision is made, information about how data will be analysed (including relevant code and algorithms) and how learning from data will be shared, information about relevant data privacy legislation and regulation |
| Public and private essential services management (health, infrastructure, waste and recycling, water and sewage, electricity) | ||
| Data management and monitoring | ||
| Evaluation | Process evaluation | Reporting: Data about processes (industrial, public health, organisational) |
| Evaluation of participatory methods | Reporting: Data about participatory research methods (including ‘citizen science’) | |
| Transparent rating | Reporting: Processes of transparency rating (or ‘scoring’) data quality about initiatives based on how much information about the initiatives is shared in a publicly accessible way (or reasons for redaction, including Indigenous knowledge) | |
| Production, consumerism and business | Industry standards | Reporting: Internal processes and data sharing practices of self-regulating industry standards (for example, the Forest Stewardship Council, Marine Stewardship Council) |
| ‘Green’ industries and eco-tourism | Reporting: Transparent process for defining ‘green’ and ‘eco’, experts involved, people affected involved, process for deciding and measuring outcomes, outcome measures, evaluation of process | |
| Infrastructure, construction and interiors | Reporting: Transparent reporting of sources of building and furniture materials, such as wood (including relevant DNA information to verify sources of timber), metals and other materials (including information verifying the supply chain is slavery free), data from building and structural assessments | |
| Finance and financial services | Reporting: Who is involved in decision making (including investment and divestment), who scrutinises decision making, who is involved in holding individuals to account and who scrutinises this process, competing or conflicting interests of people involved in decision making, data about how concepts such as ‘ethical investments’ are defined, impacts or outcomes from investments or donations, data sharing practices and security practices, data about who scrutinises security practices | |
| Donation and philanthropy | Reporting: Any stated purposes or caveats for donation, organisations or individuals donating, how money was spent, who was involved in deciding how it was spent, what was the method for deciding this, who is accountable for overseeing this, any outcomes or impacts | |
| Other products (medical devices, electronics) | Reporting: Experts involved in production, other people involved in production process, resources involved in production process (including relevant DNA information to verify products from plants, animals and fungi), ingredients, funding for resources (for example demonstrating it is ‘slavery free’), process reporting (including Good Manufacturing Practice), regulation and authorisation processes (for example medicines and medical devices), code and algorithm checking (for example, autonomous vehicles) process for designing impact assessment, impact assessment (including human and environmental), experts involved in dismantling process (including recycling), other people involved in dismantling process and disposal, evaluation of product according to transparently-decided outcome measures | |
| Products for human use or ingestion | Food | |
| Medicines | ||
| Products for non-human lifeforms | Food | |
| Medicines | ||
| Other products | ||
| Health technology assessment | Assessment process for pharmaceuticals, devices, procedures and organisational systems used in health care | Reporting: Process for deciding health technology assessment (oversight and scrutiny), sources of data and evidence, process for deciding and measuring outcomes, experts involved, people affected involved, conflicting or competing interests, outcomes from assessment decisions (including outcomes measured by those affected by assessment decisions), collation of adverse event reports from Governments and reputable sources, assessment evaluation (did it achieve what was intended?), results of economic evaluations |
| Health and social care and services | Health care and services | Reporting: Process for assessing needs (including who was involved, the method and budget), process for prioritisation of services (including budgets and ‘rationing’ decisions), process for designing and implementing service or care (including who was involved, the method and the budget), process for evaluating service or care (including impacts), patterns for evaluating service improvement initiatives, process for reporting adverse events and malpractice (including the overview and scrutiny of this process), process for identifying patterns of sub-optimal service, process for responding to malpractice or other identified issues, process for identifying impact indicators (including geolocation data) |
| Social care and services | ||
| Other services |
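The reporting categories in the table above lend themselves to machine-readable representation, which is central to STARDIT's aim of standardised reporting. As a minimal sketch, one area from the table could be held as structured data in Python; all key names here (`sub_areas`, `reporting`) are illustrative assumptions, not STARDIT-defined terms:

```python
# A minimal sketch of holding one row of the applications table as
# machine-readable data. Key names are illustrative assumptions,
# not part of any STARDIT specification.

example_applications = {
    "Research": {
        "sub_areas": [
            "Health research",
            "Social research",
            "Genomics research",
            "Environmental research",
        ],
        "reporting": [
            "funding",
            "conflicting or competing interests",
            "co-design",
            "experts involved",
            "people affected involved",
            "methods",
        ],
    },
}


def reporting_items(area: str) -> list[str]:
    """Return the reporting data categories listed for an area (empty if unknown)."""
    return example_applications.get(area, {}).get("reporting", [])


print(reporting_items("Research")[0])  # first listed item: "funding"
```

Structuring the categories this way would let different initiatives be compared on which reporting items they supply, though the actual STARDIT data model is defined by the tables in this article rather than any particular encoding.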
Fig. 1STARDIT Logo
Fig. 2STARDIT development
Values of the STARDIT project
| Value | Summary |
|---|---|
| System and language agnostic | STARDIT is system and language agnostic; it should always be designed to work across and with as many systems as possible, in as many countries and languages as possible |
| Designs and code should always be open access | In alignment with the UNESCO Recommendation on Open Science, STARDIT designs and code should always be open access |
| Participatory paradigm | STARDIT development will be guided by the participatory action research (PAR) paradigm |
| United Nations rights-based paradigm | STARDIT will be guided by the United Nations rights-based paradigm, including human rights, environmental rights and other emerging rights |
Summary of STARDIT Beta Version data fields
| Section | Data category | Data field | |
|---|---|---|---|
| Core: Initiative context—This information locates the initiative within a clear context | Identifying information | Initiative name* | |
| Geographic location(s)* | |||
| Purpose of the initiative (aims, objectives, goals)* | |||
| Organisations or other initiatives involved (list all if multi-centre)* | |||
| Relevant publicly accessible URLs/URIs | |||
| Other identifiers (e.g. RAiD) |||
| Keywords or metatags—including relevant search headings (e.g. MeSH) |||
| Other relevant information (free text) | |||
| Status of initiative | What is the current state of the initiative?* Select from: 1. Prospective—this report is prospective or describes planned activity 2. Ongoing—the initiative is still taking place 3. Completed—the initiative has finished | |
| Date range (start and end dates of initiative) | |||
| Methods and paradigms | Methods of the initiative (what is planned to be done, or is being reported as done). Include information about any populations or eco-systems being studied, any ‘interventions’, comparators and outcome measures (qualitative or quantitative)* | ||
| Include any information about theoretical or conceptual models or relevant ‘values’ of people involved with this initiative, including any rationale for why certain methods were chosen | |||
| Report authorship—Information about who completed the report and how | Identifying information for each author | Name* | |
| Publicly accessible profiles, institutional pages* | |||
| Open Researcher and Contributor ID (orcid.org)* | |||
| Tasks in report completion | |||
| Other information | |||
| Accountability | Key contact at initiative for confirming report content (include institutional email address)* | ||
| Date | Date of report submission | ||
| Input: Ethics assessment | Ethics approval information (if applicable) | Assessing organisation or group* | |
| Approval date and approval ID |||
| Input: Human involvement in initiative—Who is involved in this initiative and how? Editors assessing involvement may need to use the STARDIT ‘Indicators of involvement’ tool | Details about how each group or individual was involved in the initiative | Who was involved or how would you label those involved (select from group labels or submit new group label name in free-text)* | |
| How many people were in each grouping label? | |||
| Tasks of this person or group (list as many as possible)* |||
| Method of doing task? How did these people complete these tasks? (what methods were used) |||
| Communication modes? What modes of communication were used? |||
| How were people recruited, contacted or informed about these tasks? | |||
| Involvement appraisal | Methods of appraising and analysing involvement (assessing rigour, deciding outcome measures, data collection and analysis) | ||
| Enablers of involvement (what do you expect will help these people get involved—or what helped them get involved) |||
| Barriers of involvement (what do you expect will inhibit these people from getting involved—or what inhibited them from getting involved). Are there any known equity issues which may contribute? |||
| How did the initiative change as a result of involving people? |||
| Involvement outcomes, impacts or outputs | Were there any outcomes, impacts or outputs from people being involved?* | ||
| Learning points from involving people | What worked well, what could have been improved? Was anything learned from the process of involving these people? | ||
| Stage | Which stage of the initiative were these people involved? | ||
| Financial or other interests (including personal or professional interests) | Describe any interests (financial or otherwise), conflicting or competing interests, or how anyone involved may be personally, financially or professionally affected by the outcome of the initiative* | ||
| Input: Material involvement in initiative—Mapping financial or other ‘interests’ | Financial | What was the estimated financial cost for the initiative? | |
| Funding information (link to publicly accessible URL if possible) |||
| Time | How much time was spent on this project? |||
| Other | Describe any costs or resources that cannot be measured financially or quantitatively |||
Outputs: Data including code, hardware designs or other relevant information | Sensitive data | Secure criteria | Data adheres to relevant industry/discipline data security requirements |
| Repository | How is data entered, changed or removed within a repository? | ||
| Usage | Who is the data from this initiative shared with? | ||
| Who has access to sensitive data and how is this decided? | |||
| Safety | Is data encrypted? Is it anonymised or de-identified? What methods are used for re-identification? What is the risk of unauthorised re-identification? | ||
| Open data | FAIR criteria | Data adheres to FAIR criteria | |
| Findable | Describe relevant metadata, how the data is machine readable and other relevant information | ||
| Accessible | How can data be accessed? |||
| Interoperable | How is data interoperable or integrated with other data? | ||
| Reusable | How can data be replicated and/or combined? | ||
| Indigenous data | CARE principles | Data adheres to CARE principles | |
| Collective Benefit | How will Indigenous Peoples derive benefit from the data? |||
| Authority to Control | How will Indigenous Peoples and their governing bodies determine how relevant data are represented and identified? |||
| Responsibility | How will those using the data provide evidence of these efforts and the benefits accruing to Indigenous Peoples? |||
| Ethics | How have Indigenous Peoples’ rights and wellbeing been centred during the data life cycle? |||
| All data | Hosting | Where is the data stored and hosted? | |
| Owner | Who ‘owns’ the data or claims any kind of copyright, patent(s), or other specific types of intellectual property? |||
| Analysis methods | Describe methods used to analyse the data (including a link to any relevant code and information about validity) | ||
| Usage | How can data be used? | ||
| Dissemination | How is information about this data disseminated? | ||
| Impact | Impact or effect of the output |||
| Data control | Who controls access to the data? How are decisions about data access made? Is data anonymised or de-identified? What methods are used for re-identification? What is the risk of unauthorised re-identification? How is this risk managed? | ||
| Management and quality | Which person (or organisation) is responsible for managing (or ‘curating’) the data? | ||
| Who is accountable for ensuring the quality and integrity of the data? (this may be an individual or organisation) | |||
| Impacts and outputs: Publications, events, changes, learning items etc. | What was learned | What new knowledge has been generated? (if appropriate, include effect size, relevant statistics and level of evidence)* | |
| Knowledge translation | Describe how the learning or knowledge generated from this initiative has or will be used | ||
| Impacts | Have there been any outcomes, or has anything changed or happened as a result of this initiative that isn’t captured in previous answers?* | ||
| Measurement and evaluation | How has or how will this be measured or evaluated? | ||
| Who is involved in measuring or evaluating this? | |||
| Who was or is involved in deciding on the outcomes used to evaluate any impacts or outcomes? How were they involved? | |||
| Information completed by Editors | |||
| STARDIT report version number (assigned) | Report number assigned to distinguish it from any future updated reports | ||
| Indicators completed by Editors and/or peer reviewers—Editors and peer reviewers assessing the report will need to look for indicators in the following categories on publicly accessible URLs* | Indicators of involvement | Use the STARDIT ‘Indicators of involvement’ tool | |
| Indicators of data practice compliance | Use the relevant criteria | ||
| Indicators of translation and impact | |||
| Other indicators | |||
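Because STARDIT is intended to be system and language agnostic, a report built from the data fields above could be serialised in any common interchange format. A minimal sketch of a few of the core fields as JSON follows; the JSON key names are assumptions chosen for illustration, since the Beta version defines data categories rather than a fixed schema:

```python
import json

# Hypothetical machine-readable STARDIT report covering a few of the Beta
# version data fields. Key names are illustrative assumptions, not a
# STARDIT-mandated schema.
report = {
    "initiative": {
        "name": "Example initiative",            # Initiative name*
        "locations": ["Global"],                 # Geographic location(s)*
        "purpose": "Illustrative placeholder",   # aims, objectives, goals*
        "status": "Prospective",                 # Prospective / Ongoing / Completed
    },
    "report_authors": [
        {
            "name": "A. Author",                 # Name*
            "orcid": "0000-0000-0000-0000",      # Open Researcher and Contributor ID*
            "tasks": ["drafted report"],         # Tasks in report completion
        }
    ],
    "report_version": 1,                         # assigned by Editors
}

# Serialising to a plain-text format keeps the report portable across systems.
serialised = json.dumps(report, indent=2)
print(serialised.splitlines()[0])  # "{"
```

Any equivalent structured format (XML, YAML, RDF) would serve equally well, which is the point of the system-agnostic value stated earlier.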
Fig. 3STARDIT technical information summary
Fig. 4Planning and evaluating initiatives using STARDIT
Fig. 5Reporting initiative design in STARDIT
Summary of reporting initiative design in STARDIT
| Initiative stage | Data reported |
|---|---|
| | The idea is refined with a small group of stakeholders |
| | Develop a communication plan to invite people to co-create involvement |
Bold text indicates the stage summarised in Fig. 5
Questions for mapping preferences for involvement
| Question | Rationale for question |
|---|---|
| Which stakeholder group does this person align with? | To establish which grouping(s) the person identifies as being part of—for example ‘researcher’ or ‘participant’ (noting any groupings should be co-defined) |
| Describe any financial relationship or other interest this person has to this project | To provide a public record of any potential conflicting or competing financial interest |
| Views on the purpose and values of the research | To establish the purpose of the research, and the motivations and values of the initiative from multiple perspectives |
| Describe how you think the learning from this initiative could be used | To establish views about knowledge translation and application of learning |
| Views on which data from this project should be shared with which people and how | To establish that person’s view about data sharing and ownership |
| Views on who should be involved (which ‘groups’ of people)—including who should not be involved | To establish that person’s views on which ‘groups’ of people they think should be ‘involved’ in research—that is, having a role in shaping the research design, direction and outcomes |
| Views on specific tasks of this person or group | To establish that person’s views on the tasks of the specific stakeholders who they think should be involved |
| Preferred modes of communication | To establish that person’s preferences on communication modes with stakeholder groups |
| Views on what methods should be used | To establish that person’s views on which methods should be used to involve people—for example ‘online survey’ |
| Views on facilitators of involvement | To explore that person’s perceptions of what might facilitate involving specified groups of people and help inform the design of involvement |
| Views on barriers of involvement | To explore that person’s perceptions of what might be a barrier to involving specified groups of people and help inform the design of involvement |
| Views on what the outcome or output of the involvement could be | To ascertain the expectations of that person about what involving the specified groups of people might achieve |
| Views on which stage of the research this group should be involved | To establish that person’s views on which stage of the research the specified groups of people should be involved in |
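Answers to the questions above could be captured in a structured form so that preferences can be compared across stakeholders. A small sketch follows; the class and field names are illustrative assumptions, not STARDIT-defined terms:

```python
from dataclasses import dataclass, field


@dataclass
class InvolvementPreference:
    """One person's answers to the preference-mapping questions.

    Field names are illustrative assumptions, not STARDIT-defined terms.
    """
    stakeholder_group: str                               # group the person aligns with
    financial_interests: str = ""                        # declared interests, if any
    preferred_communication: list = field(default_factory=list)
    perceived_facilitators: list = field(default_factory=list)
    perceived_barriers: list = field(default_factory=list)


def groups_represented(prefs):
    """Set of stakeholder groups that have recorded preferences so far."""
    return {p.stakeholder_group for p in prefs}


prefs = [
    InvolvementPreference("researcher", preferred_communication=["email"]),
    InvolvementPreference("participant", perceived_barriers=["limited time"]),
]
print(sorted(groups_represented(prefs)))  # ['participant', 'researcher']
```

Recording responses per stakeholder in this way would make it straightforward to check which groups have not yet been heard from before involvement is designed.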