Michigan Standard Simulation Experience Scale (MiSSES)
The Michigan Standard Simulation Experience Scale (MiSSES) Template
[Working document available from: http://www.med.umich.edu/umcsc/research/MiSSES.doc]
[HTML version available from: http://www.med.umich.edu/umcsc/research/MiSSES2.html]
This document contains a preliminary template comprising the six components of the MiSSES toolkit. It is designed to provide a framework for assessment, offering the option of assessing the entire range of domains identified through a review of the simulation literature.
The template is designed to be manually adapted by investigators to specific use scenarios. Investigators can select the sections of the template relevant to their own needs.
Refinement of the toolkit is ongoing, and additional user testing will result in iterative improvements to the MiSSES.
Users are free to adapt this template for their own use. Please include the following citation in published work:
“Assessment was adapted from the MiSSES template for evaluation of simulation.”
Seagull FJ, Rooney DM. Filling a void: Developing a standard subjective assessment tool for surgical simulation through focused review of current practices. Surgery. 2014 Sep;156(3):718-22. doi: 10.1016/j.surg.2014.04.048. PubMed PMID: 25175506.
The following template contains six sections.
- Each section represents a domain for evaluation.
- Each section includes highlighted text. This text should be personalized to reflect the type of activity to be evaluated.
- Preserve the five-point response scale. It is widely used in simulation research, and provides a degree of standardization.
Demographics: Collect demographics only to the extent needed. For example, in a session with only 2nd-year medical students, no indication of “training year” would be needed. Previous experience may include clinical or simulated experience, and can be expanded to any relevant activities (e.g., video game use for laparoscopic surgery).
Self-efficacy: These questions address issues of knowledge, confidence, and ability to work independently. As written, they address the experience as a whole. These questions can be expanded to address individual aspects of the simulation, simulator, or curriculum.
Fidelity: Simulator characteristics can be assessed globally, or with the addition of specific subcomponents. Fidelity can be assessed across an array of characteristics of the simulator or simulation setting, including suspension of disbelief, visual characteristics, tactile characteristics, procedural authenticity, etc.
Educational value: Knowledge and skills are two components of educational value. They can be assessed globally, or for the individual components of the activity, such as the individual learning objectives.
Teaching quality: These questions assess the quality of instructors and resources that support the activity. They can additionally include adequacy of resources such as time allotted, space provided, and ease of access to simulation.
Overall: It has been noted in the literature that a simple global rating is often more highly correlated with objective measures of assessment than the individual subcomponents of a scale. We have included one to align with best practices in the assessment literature.
Versions of this instrument are available online through Qualtrics and SurveyMonkey. For more information, validation data, or collaboration, please contact:
Deb Rooney, Director of Education and Research, Clinical Simulation Center, University of Michigan. DMRooney@umich.edu