Defining, illustrating and reflecting on logic analysis with an example from a professional development program.

Marie-Claude Tremblay, Astrid Brousselle, Lucie Richard, Nicole Beaudet.

Abstract

Program designers and evaluators should make a point of testing the validity of a program's intervention theory before investing either in implementation or in any type of evaluation. In this context, logic analysis can be a particularly useful option, since it can be used to test the plausibility of a program's intervention theory using scientific knowledge. Professional development in public health is one field among several that would truly benefit from logic analysis, as it appears to be generally lacking in theorization and evaluation. This article presents the application of this analysis method to an innovative public health professional development program, the Health Promotion Laboratory. More specifically, this paper aims to (1) define the logic analysis approach and differentiate it from similar evaluative methods; (2) illustrate the application of this method by a concrete example (logic analysis of a professional development program); and (3) reflect on the requirements of each phase of logic analysis, as well as on the advantages and disadvantages of such an evaluation method. Using logic analysis to evaluate the Health Promotion Laboratory showed that, generally speaking, the program's intervention theory appeared to have been well designed. By testing and critically discussing logic analysis, this article also contributes to further improving and clarifying the method.
Copyright © 2013 Elsevier Ltd. All rights reserved.

Keywords:  CSSS; DSPM; Health and Social Services Centres; Intervention theory; Logic analysis; Logic model; Program evaluation; Public Health Directorate for Montreal; Theory-based evaluation

Year:  2013        PMID: 23807118     DOI: 10.1016/j.evalprogplan.2013.05.004

Source DB:  PubMed          Journal:  Eval Program Plann        ISSN: 0149-7189


Related articles:  6 in total

1.  Learning reflexively from a health promotion professional development program in Canada.

Authors:  Marie-Claude Tremblay; Lucie Richard; Astrid Brousselle; Nicole Beaudet
Journal:  Health Promot Int       Date:  2013-08-30       Impact factor: 2.483

2.  Optimizing the development and evaluation of complex interventions: lessons learned from the BetterBirth Program and associated trial.

Authors:  Dale A Barnhart; Katherine E A Semrau; Corwin M Zigler; Rose L Molina; Megan Marx Delaney; Lisa R Hirschhorn; Donna Spiegelman
Journal:  Implement Sci Commun       Date:  2020-02-25

3.  An analysis of the adaptability of a professional development program in public health: results from the ALPS Study.

Authors:  Lucie Richard; Sara Torres; Marie-Claude Tremblay; François Chiocchio; Éric Litvak; Laurence Fortin-Pellerin; Nicole Beaudet
Journal:  BMC Health Serv Res       Date:  2015-06-14       Impact factor: 2.655

4.  Five ways to get a grip on the shortcomings of logic models in program evaluation.

Authors:  Betty Onyura; Hollie Mullins; Deena M Hamza
Journal:  Can Med Educ J       Date:  2021-12-29

5.  Using collaborative logic analysis evaluation to test the program theory of an intensive interdisciplinary pain treatment for youth with pain-related disability.

Authors:  Karen Hurtubise; Astrid Brousselle; Chantal Camden
Journal:  Paediatr Neonatal Pain       Date:  2020-04-23

6.  Healthcare professionals' longitudinal perceptions of group phenomena as determinants of self-assessed learning in organizational communities of practice.

Authors:  François Durand; Lucie Richard; Nicole Beaudet; Laurence Fortin-Pellerin; Anahi Morales Hudon; Marie-Claude Tremblay
Journal:  BMC Med Educ       Date:  2022-02-03       Impact factor: 2.463

Beijing Coyote Bioscience Co., Ltd. (北京卡尤迪生物科技股份有限公司) © 2022-2023.