Level 1
Aim: Design an evaluation exercise for a simulation session (e.g. a single training session)
Learning outcomes
Demonstrates knowledge of:
- 1.1 established evaluation tools, such as Kirkpatrick's model, and their limitations
- 1.2 how to ensure that evaluation approaches and tools are accessible and inclusive
Demonstrates the ability to:
- 1.3 create a reusable satisfaction survey for an end-of-session simulation event
- 1.4 select clear, concise and relevant questions for inclusion in the evaluation of simulated activity
- 1.5 use the information gathered to continuously develop, improve and optimise simulated activity
Demonstrates an appreciation that:
- 1.6 meaningful input from learners is essential for improving programmes
- 1.7 meaningful input from teachers/facilitators is essential for improving programmes
- 1.8 learners should be offered an equal opportunity to be involved in evaluation and feedback

Level 2
Aim: Design an evaluation and longitudinal monitoring plan for a simulation programme
Learning outcomes (all the knowledge and skills previously accumulated, plus the following)
Demonstrates knowledge of:
- 2.1 the advantages, disadvantages and limitations of each level (I-IV) of Kirkpatrick's model
- 2.2 the ASPiH Standards regarding evaluation of simulated activity
- 2.3 the different methods of evaluating simulated activity: evaluation forms, focus groups, metrics, observation, patient safety data, patient satisfaction surveys, peer review and semi-structured interviews
Demonstrates the ability to:
- 2.4 draw on a theory-based approach to establish anticipated causal relationships and anticipated results from simulation activities, to guide the evaluation process
- 2.5 develop performance indicators and targets (where appropriate)
- 2.6 identify data collection processes and tools
- 2.7 identify evaluation questions that require criteria and standards
- 2.8 identify evaluation methods for each question
- 2.9 participate in reflective practice, including peer observation of simulation activity
- 2.10 thematically analyse free text and describe the findings
- 2.11 present information drawn from evaluation tools in a clear, logical and meaningful manner
- 2.12 integrate information from evaluation, review and peer review to inform changes in practice
- 2.13 apply descriptive and analytical statistics to data presentation where relevant

Level 3
Aims: Evaluate programmes of simulated activity at departmental, regional or complex level; engage in the formation and utilisation of new knowledge in simulation-based education through research activity
Learning outcomes (all the knowledge and skills previously accumulated, plus the following)
Demonstrates knowledge of:
- 3.1 a wide range of educational research methodologies, to an advanced level
Demonstrates the ability to:
- 3.2 utilise learning from SBE research to support the professional development of other simulation-based educators
- 3.3 present evaluation findings to stakeholders to demonstrate impact and value
- 3.4 identify and lead research opportunities for SBE
- 3.5 develop educational insights, theories and practice through academic scholarship
- 3.6 apply evaluation findings to departmental, regional or national programmes to advance the quality of simulation activity
- 3.7 use evaluation data as a quality and risk management resource, helping organisations achieve improved patient safety and quality
Demonstrates:
- 3.8 engagement in continuing professional development (CPD), with regular evaluation of performance by both learners and fellow faculty, including equality, diversity and inclusion training and/or guidance
Examples of relevant evidence or learning resources
- Courses attended or programmes undertaken, including face-to-face sessions, e-learning and webinars
- Literature reviews, research, reflective practice, personal reading/learning
- Feedback from supervisors, peers, learners
- Other equivalent demonstrable experience