
Bonell C, Jamal F, Harden A, et al. Systematic review of the effects of schools and school environment interventions on health: evidence mapping and synthesis. Southampton (UK): NIHR Journals Library; 2013 Jun. (Public Health Research, No. 1.1.)


Chapter 8 Research question 3: process evaluations

Research question

How feasible and acceptable are the school environment interventions examined in the studies addressing RQ2, and how does context affect this? These questions were examined using process evaluations linked to the outcome evaluations reported under RQ2.

Methods

Inclusion and exclusion criteria

As reported in the previous chapter, 16 reports addressing RQ2 (outcome evaluations) were included. We included process evaluations linked to these outcome evaluations. To identify them, the full texts of the 16 outcome evaluation reports were retrieved and the following exclusion criteria were applied by one reviewer and checked by a second reviewer (there were no discrepancies to be resolved):

  • exclude if study is not a process evaluation
  • exclude if study does not report on an intervention subject to an outcome evaluation included in stage 2
  • exclude if study is not written in English.

Quality assessment

All included reports were quality assessed using the following criteria:

  • whether or not study has clear RQs/aims
  • whether or not sampling and sample are described
  • whether or not study examines planning (using qualitative data)
  • whether or not study examines delivery (using quantitative or qualitative data)
  • whether or not study examines coverage (using quantitative or qualitative data)
  • whether or not study examines receipt (using quantitative or qualitative data)
  • whether or not study examines acceptability (using quantitative or qualitative data)
  • whether or not study examines context (using quantitative or qualitative data).

These criteria for assessing methodological quality were adapted from those used in a previous review.94 They allowed us to assess which studies were well reported, which examined the intervention process comprehensively and which enabled examination of the process from a range of perspectives. Reports were not excluded or graded on the basis of these quality assessment ratings; instead, the assessment was used qualitatively when weighing up the evidence from each evaluation.

The criteria were piloted on a random sample of two reports by two reviewers (CB and HW) before being applied by one reviewer (HW) and checked by another (CB), with any differences being settled by discussion without recourse to a third reviewer.

Data extraction

Because only those process evaluation studies that were linked to included outcome evaluations (RQ2) were included, we had already extracted data on study RQs/hypotheses, study site and population, sampling, data collection methods, analysis methods and results. Informed by existing tools for data extraction of process evidence126,128 we extracted data related to the following: part of the process examined (planning, delivery, receipt), aspect of the process examined (feasibility, fidelity, coverage/accessibility, acceptability) and aspect of the intervention context examined (e.g. measured need, policy, institutional and professional capacity, collaboration, ‘product champions’). Data extraction tools were piloted on a random sample of two reports by two reviewers (CB and HW) before being applied by one reviewer (CB) and checked by another (HW), with any differences being settled by discussion.

Synthesis

A narrative synthesis was conducted for the process evaluation studies. We included all studies in the narrative but made clear where studies were subject to methodological limitations, informed by our quality assessment. We aimed for the narrative synthesis to develop overarching themes, but in practice it was largely restricted to narrative summaries of the findings of each study in context, because the studies were too heterogeneous in design and methods to support meaningful or detailed overarching themes. Nevertheless, a narrative overview is provided in this chapter's discussion section.

Overview of included reports

Flow of literature

Only those process evaluations that were linked to outcome evaluations included in RQ2 were considered for inclusion. Therefore, the 16 included outcome evaluation study reports were screened for accompanying process evaluations (all of the reports were written in English). Of these 16, five reports included process evaluations. From checking the references of the five included reports, we identified one further linked process evaluation that was included in the in-depth synthesis. Fagen and Flay129 reported on a process evaluation of the sustainability of the AAYP intervention, but this focused only on the curriculum component and so is not considered further here. Thus, six reports (one linked)55,58–60,84,85 of four studies were included in the process evaluation in-depth synthesis (Figure 9).

FIGURE 9. Flow of literature: process evaluation synthesis (stage 2: in-depth synthesis).

Quality assessment

Study reports varied in whether or not they set clear RQs. Clear questions were provided by Bonell et al.59,60 and Solomon et al.58 Nearly all described their sampling methods and samples; Solomon et al.58 did not describe the sample. Studies varied in the extent to which they sought the perspectives of a range of stakeholders on the interventions. Battistich et al.55 and Solomon et al.58 relied solely on research observations of delivery. Bonell et al.59,60 collected data from external providers, school staff and students. Dzewaltowski et al.63 collected data from students, site co-ordinators and teachers. Flannery et al.119 collected data solely from teachers. Only Bonell et al.59,60 drew on qualitative and quantitative data, the others drawing only on quantitative data.

Only Bonell et al.59 examined the planning of the intervention prior to delivery, drawing on interviews with providers. Reports from all four interventions described the fidelity of intervention delivery using observations or data from teachers, other intervention staff or students. Only Battistich et al.55 and Solomon et al.,58 reporting on the CDP intervention, examined the fidelity of delivery by comparing intervention with control schools. Bonell et al.59,60 and Dzewaltowski et al.63 drew on self-report data from providers to quantify delivery. The HSE evaluation59,60 also drew on qualitative observational data on delivery and interview data on the feasibility of delivery, while Flannery et al.119 examined quantitative data from teachers on feasibility. Bonell et al.59,60 and Dzewaltowski et al.63 examined coverage by determining recognition of the intervention among students. Bonell et al.59,60 evaluated acceptability qualitatively through interviews and focus groups. Flannery et al.119 evaluated acceptability quantitatively through surveys, but only of the training component for teachers. Dzewaltowski et al.63 examined the extent to which training provided site co-ordinators with the self-efficacy to undertake their work. Only Bonell et al.59,60 drew on qualitative data and examined how context might influence intervention delivery or uptake.

Study characteristics

Five reports examined three interventions that encouraged staff and students to develop school climates characterised by a stronger sense of community and better relationships.55,58–60,119 One study evaluated an intervention that enabled staff and students to advocate for school environments promoting healthier eating and physical activity.63

Results

Narrative summary of findings from each process evaluation

Battistich et al.55 report that, across all years of the CDP intervention, teaching practices across the five areas addressed by the programme differed significantly from those in comparison schools, suggesting that the intervention was feasible to deliver with good fidelity. Solomon et al.58 report that, according to observations of teachers, there were significant differences between intervention and control schools in around half of the teaching practices and classroom activities intended to be brought about by the project, for example teachers' use of group praise and students participating in rule development. According to student reports, there were significant differences between intervention and control arms in most indicators of teaching practices and classroom activities prescribed by the project. This study examined intervention fidelity in a rigorous manner, so conclusions in this area are likely to be sound, but it did not examine other aspects of or perspectives on process, or assess context.

Bonell et al.59 report that the intervention was delivered as intended with all components implemented, although it should be noted that this study did not examine fidelity of delivery through observation sessions. Qualitative data suggest that the external facilitator enabled schools to convene an action team involving staff/students. Inputs were feasible and acceptable and enabled similar actions in both schools. Locally determined actions (e.g. peer mediators) were generally more feasible and acceptable than preset actions (e.g. modified pastoral care). This study alone used qualitative data to examine the effect of contextual factors on implementation. This suggested that implementation was facilitated when it built on aspects of schools' baseline ethos (e.g. a focus on engaging all students, formalised student participation in decisions) and when senior staff led actions, acting as ‘product champions’. Student awareness of the intervention was high. Quantitative data on students' attitudes and behaviours suggested that the intervention aims corresponded with local needs in each intervention school. Bonell et al.60 report that some activities such as rewriting school rules involved broad participation, which was assessed through qualitative methods such as interviews.

Flannery et al.119 report that teachers being trained to deliver the PeaceBuilders intervention found the philosophy behind the intervention easy to understand. They regarded the training as clear, effective and easy to follow, and reported that the school administration supported the intervention, that it would be easy to implement and that it would be effective in the classroom. Surveys with teachers also suggested that the intervention was delivered regularly: approximately half rated implementation as extensive and half as moderate, with around half using half or more of the intervention materials. Other aspects of the process, other perspectives (such as those of students) and context were not examined.

Dzewaltowski et al.63 undertook a thorough quantitative assessment of process and reported that training for the HYP intervention occurred as planned, with site co-ordinator attendance very high in both years and self-efficacy arising from training being high. Site co-ordinators formed ‘change teams’ in each school that met regularly. Site co-ordinators reported an average of 26.5 implemented programme, policy or practice changes. Teachers implemented around two-thirds of planned lessons. Student surveys suggested that around one-third of students had heard about the intervention or its activities, around half of whom had participated on ‘change teams’. Context was not examined in this study.

Discussion

Narrative overview of findings

Of the 16 included outcome evaluations, five reports included process evaluations and one further linked process evaluation paper was found by reference checking. These employed a range of research methods, most frequently drawing on quantitative data collected from students and/or teachers. These reports addressed some aspects of our third RQ more than others. Although most examined feasibility or fidelity in some way, fewer examined acceptability and only one study used a mix of quantitative and qualitative methods to examine local context and how it influenced intervention processes. Process evaluations reported largely positive results regarding intervention feasibility, fidelity, reach and acceptability, although differences in methods prevent any comparison of the delivery and uptake of each intervention. The single study that examined context suggested that it was important: implementation was facilitated when the intervention built on schools' existing ethos and when senior staff championed it.

Strengths and limitations

We limited our in-depth review of process evaluations to those linked to outcome evaluation studies because, in consultation with our stakeholders, we deemed it most useful to synthesise evidence about the feasibility and acceptability of interventions for which we also have evidence of effects. This pragmatically limited the scope of our review by preventing us from synthesising evidence, for example, on potentially innovative interventions that have been feasibility tested but not yet subjected to outcome evaluation.

Most of our outcome evaluations were accompanied by process evaluations, but these involved a diversity of methods, making it impossible to compare the feasibility, coverage, acceptability or context of the interventions. The small number of heterogeneous studies also made it impossible to draw conclusions about how context, processes and outcomes might inter-relate.

Copyright © Queen's Printer and Controller of HMSO 2013. This work was produced by Bonell et al. under the terms of a commissioning contract issued by the Secretary of State for Health. This issue may be freely reproduced for the purposes of private research and study and extracts (or indeed, the full report) may be included in professional journals provided that suitable acknowledgement is made and the reproduction is not associated with any form of advertising. Applications for commercial reproduction should be addressed to: NIHR Journals Library, National Institute for Health Research, Evaluation, Trials and Studies Coordinating Centre, Alpha House, University of Southampton Science Park, Southampton SO16 7NS, UK.

Included under terms of UK Non-commercial Government License.
