Study protocol

Health Care Research Issues

In Germany, several governmental and non-governmental institutions are involved in organizing civil protection and disaster response. In reaction to the terrorist attacks of September 11, 2001 in New York City and the Elbe floods in Germany in the summer of 2002, the Conference of the Ministers of the Interior enacted a New Strategy for the Protection of the Population [1]. This strategy established the necessity of preparedness exercises at the political-administrative level (strategic crisis management). Since 2009, this task has been embodied in the constitution of the Federal Republic of Germany [2].
In recent years, questions have been raised regarding emergency preparedness for a range of disasters, including weather-related catastrophes, bio threats, and terrorism. Particular concern has focused on the ability of the public health care system to cope with a large-scale emergency such as pandemic influenza [3-7]. However, it has yet to be determined whether recent initiatives [8, 9] have measurably improved the ability to respond to large-scale emergencies. Often it remains unknown whether public health personnel are trained according to the appropriate response plans and procedures, how well new equipment such as communications or surveillance systems functions, or the degree to which public health plans are integrated with the capabilities of other emergency responders such as law enforcement, fire services, emergency medical services, emergency management, and hospitals.


Various instruments for measuring public health emergency preparedness (PHEP) have been developed, most of them designed to assess capacity. However, the availability of physical resources is only one predictor of successful emergency response; more important is the ability of the system to function in a coordinated manner. Particularly critical are the measurement of the health care system’s capabilities and of health care workers’ (HCW) competencies. The inability to measure these aspects of PHEP accurately and reliably can lead to substantial problems at several levels. At the system level, the inability to create or receive structured and reliable feedback may hamper quality improvement. At the individual (HCW) level, enthusiasm and support for continued efforts may wane over time if reliable data confirming the utility of investments in preparedness are not available. At both the system and the individual level, the limited ability to quantify capabilities and competencies and to demonstrate improvement threatens their continued availability.
The purpose of the present project is to develop and validate an instrument for the assessment of competencies that emergency nurses demonstrate during exercises to assure preparedness. The project will result in a set of valid, reliable and practical measures of emergency nurses’ competencies related to disaster response.

Capacities, Capabilities, and Competencies
While a number of instruments for measuring PHEP have been established, most current measures are designed to assess capacity, i.e. quantities of material assets and infrastructure elements. Moreover, only limited data support the validity and reliability of such measures [10]. As illustrated by the 2005 Gulf Coast hurricanes in the USA and the 2002 Elbe floods in Germany, availability of physical resources is only one predictor of successful emergency response; arguably, more important is the ability of the system and its components to function in a coordinated manner. Particularly critical are the measurement of a public health system’s capabilities (i.e. its collective ability to undertake functional or operational actions using available preparedness assets to effectively identify, characterize, respond to, and recover from an emergency) and the individual competencies that HCWs demonstrate during disaster response.
Valid measurement of individual preparedness could be based on the competencies a HCW demonstrates during disaster response [12]. In the field of public health practice, the term competency may be defined as “a complex combination of knowledge, skills, and abilities demonstrated by organization members that are critical to the effective and efficient function of the organization” [13]. In general, competencies or skills can be categorized into three domains [14]: 1) occupational skills involve the technical abilities to perform the required tasks; 2) academic skills include math, science, communication, problem solving, and critical thinking; 3) employability skills include social skills as well as leadership abilities, which are essential for increased productivity, flexibility, and work management.
Competency itself can only be measured indirectly. For assessment purposes, competency needs to be properly operationalized into its measurable dimensions (in the following: testable terminal objectives (tTO) for each competency).
There have recently been several publications on the measurement of HCW competency in disaster response. In 2002, Gebbie et al. identified competencies that are essential to prepare local public health workers for response to public health emergencies [15]. Using a panel of experts in public health and emergency response, they systematically identified the competencies most needed by state and local public health staff in order to be prepared to respond to any emergency situation. In 2006, Hsu et al. identified seven so-called cross-cutting competencies for hospital health care staff [16]. Their study incorporated a multi-stage consensus-building process including review of peer-reviewed literature, published training objectives, expert panel review, and development of tTO for each competency.
The American College of Emergency Physicians [17] and the International Nursing Coalition [18] have systematically developed different sets of competencies for professional nurses in relation to mass casualty incidents.

Barriers to measuring preparedness
There are several significant barriers to measuring preparedness. Firstly, opportunities to systematically observe or measure the response are limited, since large-scale public health emergencies are rare. Even when emergencies occur, their complexity and the limited advance warning make it difficult to evaluate the response systematically.
Secondly, because of the heterogeneity of public health systems and the range of possible emergencies, defining the “gold standard” of an appropriate response is challenging. Practice guidelines or standard operating procedures (SOPs) can be considered a quasi “gold standard” for single or combined procedures, but their usefulness for quantifying an overall response is limited [19]. In a recent study, we identified a methodological problem relating to the use of SOPs extracted from guidelines as a “gold standard”: the observed study participants (firefighters) in our exercise study did not act fully in adherence to the guidelines. We concluded that the guidelines used were actually incongruent with what people involved in major emergencies are most likely to do. A truly effective response is multifactorial, relying on an array of individual HCW competencies and a wide range of capabilities of numerous institutions. Assessing the performance of such a system is enormously complex.
Thirdly, employees below the state or regional level are rarely given full-time PHEP responsibilities. Since responsibilities frequently compete with one another, staff may conclude that additional time spent on assessment of PHEP detracts from attention to other tasks. The net result of such competing objectives may be that PHEP programs are implemented but rarely, if ever, tested.

Exercises for Evaluative Purposes
Because “real world” opportunities to evaluate preparedness in action are limited, the most feasible methodological alternative is to employ potential proxy events that capture the key elements of a public health emergency. Exercises that simulate emergencies are increasingly used because of their utility in training and planning. They are effective in familiarizing personnel with emergency plans, allowing different agencies to practice working together, and identifying gaps and shortcomings in emergency planning [19, 20]. Exercises can be used to evaluate the performance of individuals, specific agencies, or an overall (multi-agency) system [21].
The term exercise can be defined as any event beyond the planning process that gathers people to test or improve preparedness [22]. In 2009, the Bundesamt für Bevölkerungsschutz und Katastrophenhilfe (BBK) last updated its 2004 Guideline for Developing, Performing and Evaluating Exercises and Drills [23]. According to this guideline, exercises can be assigned to eight categories (Planübung, Stabsübung, Stabsrahmenübung, Rahmenübung, Fachdienstübung, Vollübung, Alarmübung, and Marschübung). In the context of PHEP research, there are two major categories of exercises: discussion-based exercises and operations-based exercises [22]. Discussion-based exercises comprise four subtypes (seminar, workshop, tabletop exercise, and games); operations-based exercises comprise three subtypes (drill, functional exercise, and full-scale exercise).
In Germany, exercises are used to assess PHEP at the national, federal-state, local, institutional, and individual levels. For example, every other year the BBK sets up a joint Federal Government-States project organization for the planning and execution of the so-called LÜKEX exercise (Länderübergreifende Krisenmanagement-Übung/Exercise) [24]. LÜKEX stands for a series of exercises in the area of national crisis management whose superordinate objective is to evaluate the response actions of governmental crisis staffs. The exercises have been based on different scenarios such as power blackout and terrorist attack (LÜKEX 04), pandemic (LÜKEX 07), and dirty-bomb terrorist attack (LÜKEX 09/10). At the local level, for example, the Behörde für Soziales, Familie, Gesundheit und Verbraucherschutz (BSG) of the City of Hamburg conducts up to four major exercises per year to assure the emergency preparedness of the city’s hospitals. These full-scale exercises cover different scenarios and are systematically evaluated by external observers. Up to 50 participants (hospital staff) take part during the execution phase, and structured feedback is provided to hospital management.
There have been several key articles on exercises as a means of PHEP evaluation [10, 19, 20, 25-33]. The methods used for documentation and evaluation include: 1) post-exercise self-assessment questionnaires for rescue workers [27, 30]; 2) post-exercise questionnaires filled out by simulated victims [30]; and 3) the use of trained external observers [19, 29, 30]. However, the methods used for measuring response performance are mostly reported without transparency and lack scientific rigor [10, 33]. Most literature on disaster education and training is based on research reporting “lessons learned” and other subjective measures [33]. Agencies are often forced to construct their own training and evaluation packages without a clear understanding of systems developed elsewhere.


The project consists of three methodological phases combining qualitative and quantitative methods. The core of phase 1 is the development of a conceptual framework that outlines the network of roles and disciplines, integrating discipline-specific competencies. This phase will include a systematic literature review, a modified Delphi process, and focus groups. In phase 2, standard psychometric methods [35, 36] will be employed to devise a preliminary item pool, to construct a draft of the instrument, to assess content validity, and to test comprehensiveness and item difficulty. The reliability and validity of the instrument will be evaluated in phase 3, again using standard psychometric methods [35, 36]. Exercises will serve as the intervention used to test the instrument. Generalizability will be achieved by assuring representativeness of participants, settings, and exercise scenarios.
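One standard psychometric statistic for the phase 3 reliability evaluation is Cronbach's alpha [38], which estimates the internal consistency of an item set from the item variances and the variance of the total score. The following sketch illustrates the computation only; the four items and five participants are invented example data, not part of the planned instrument:

```python
# Illustrative sketch: Cronbach's alpha for internal-consistency reliability.
# Item scores below are invented example data (4 items x 5 participants).

def cronbach_alpha(items):
    """items: one list of scores per item; all lists cover the same participants."""
    k = len(items)          # number of items
    n = len(items[0])       # number of participants

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance

    sum_item_vars = sum(variance(item) for item in items)
    totals = [sum(item[p] for item in items) for p in range(n)]  # per-person total
    return (k / (k - 1)) * (1 - sum_item_vars / variance(totals))


scores = [
    [3, 4, 2, 4, 3],
    [3, 5, 2, 4, 3],
    [2, 4, 1, 5, 3],
    [3, 4, 2, 5, 2],
]
print(round(cronbach_alpha(scores), 2))
```

Alpha close to 1 indicates that the items measure a common underlying construct; values below conventional thresholds would prompt revision of the draft item pool in phase 2/3.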


Only personal information that helps to describe the sample will be requested from participants. Participants and observers will not be asked to include their names on any questionnaire. The individual-level data collected (agency of employment, title, and length of time in current position) will be presented in aggregate form only; individual questionnaires will never be disclosed. Confidentiality will be further maintained by keeping all data in locked file drawers and in a password-protected electronic database. All information obtained from subjects will be accessible to research staff only.
Informed consent statements will include an explanation of the purposes, aims, significance and a description of the research. In addition, the consent form will include eligibility requirements for participation in the studies, and the benefits and risks of participation.
The final version of the study protocol will be submitted to the local ethics committee.


This project will result in a set of valid, reliable and practical measures of emergency nurses’ competencies related to disaster response. These measures could then be used by researchers and public health practitioners to monitor an essential aspect of PHEP. The project may be prototypic for the development and validation of instruments for the assessment of HCW competencies other than emergency nurses. Furthermore, methods and data retrieved may serve as a valuable basis to explore other aspects of PHEP such as system-level capabilities. Methods and findings of this project will be published in one paper in an international scientific journal.


Matthias Lenz (ML) has been mainly involved in research activities related to Evidence-based Patient Information and Shared Decision Making (e.g. development and qualitative evaluation of an evidence-based decision aid for people with type 2 diabetes [40]; development and testing of an instrument for quality assessment of patient decision aids and education programs [41]; overweight and obesity in adulthood as health risk factors [42]; methodological challenges of using systematic review and meta-analysis to assess complex interventions [43, 44]; and issues and methods of systematic literature searches using scientific databases to identify complex interventions [45]). ML is in good command of all important qualitative and quantitative methodologies utilized in the present project. ML has published a number of scientific papers, presented projects and research findings at international congresses, and served as an active peer reviewer for a number of international scientific journals. Before working as a research fellow at the UHSE UHH, ML was head of an Emergency Medical Service school in Hamburg, where one of his main tasks was the organization of disaster response exercises, guiding their planning and evaluation. Based on those former activities and his expertise in the emergency medical system, ML developed and published an innovative systematic approach to transparently evaluating complex operational situations in disaster response [19].
Anke Steckelberg (AS) is a member of the research group at the UHSE UHH. Her scientific activities focus on the development and evaluation of evidence-based patient information. AS has developed curricula in Evidence-based Medicine (EbM) for various target groups and has developed and validated an instrument to measure critical health competence in non-medical target groups [46]. AS holds expertise in qualitative and quantitative methods [47-50].
Jürgen Kasper (JK) is a member of the research groups at the UHSE UHH, the Institute of Neuroimmunology and Clinical Research in Multiple Sclerosis, and the Department of Dental Prosthetics at the University Medical Centre Hamburg. JK contributes methodologically to a large number of projects in the context of health communication and EbM [41, 46, 51-54]. His scientific activities range from shared decision making and the quality of decision support strategies to communication analysis and evidence-based patient information. JK holds profound expertise in probabilistic and classical test theory and the corresponding strategies of instrument development, including qualitative methods.
Ramona Kupfer (RK) is a member of the research group at the UHSE UHH and a Ph.D. student mainly involved in the current project. In addition, she participates in research activities related to evidence-based patient information and the development of decision aids. RK studied Health Sciences, Education, and English Studies at the UHH and gained work experience as a paramedic and pediatric nurse.


Co-operation has been initiated with the Freie und Hansestadt Hamburg, Behörde für Soziales, Familie, Gesundheit und Verbraucherschutz (BSG), Fachabteilung Versorgungsplanung (head: Elke Huster-Nowack), Billstraße 80, 20539 Hamburg. The BSG conducts up to four major exercises per year to assure the emergency preparedness of the city’s hospitals. These full-scale exercises cover different scenarios and are systematically evaluated by external observers.


1.    Bundesamt für Bevölkerungsschutz und Katastrophenhilfe (BBK). Beschlüsse der Ständigen Konferenz der Innenminister und -senatoren der Länder zum Bevölkerungsschutz. 2009. Available from: __Rechtsgrundlagen/05__IMK-Beschluesse/ (accessed 15 January 2010).
2.    Gesetz über den Zivilschutz und die Katastrophenhilfe des Bundes (Zivilschutz- und Katastrophenhilfegesetz – ZSKG). BGBl I 2009; 57:2941-2988.
3.    Pandemic preparedness in the European Union – multi-sectoral planning needed. Euro Surveill 2007; 12:E070222 070221.
4.    Pandemic preparedness: different perspectives. How ready are we for the next pandemic? Vaccine 2006; 24:6800-6806.
5.    Influenza as an issue on the agenda of policy makers and government representatives. What can we do? What do we need? Vaccine 2006; 24:6793-6795.
6.    How to develop and implement pandemic preparedness plans? The need for a coherent European policy. Vaccine 2006; 24:6766-6769.
7.    Fock R, Bergmann H, Bussmann H, Fell G, Finke EJ, Koch U, et al. Influenza pandemic: preparedness planning in Germany. Euro Surveill 2002; 7:1-5.
8.    Bundesamt für Bevölkerungsschutz und Katastrophenhilfe (BBK). Forschung im BBK.2009. Available from: (accessed 1. January 2010).
9.    European Commission (CORDIS). Seventh Framework Programme for Research and Technological Development (FP7).2009. Available from: (accessed 1. January 2010)
10.    Asch SM, Stoto M, Mendes M, Valdez RB, Gallagher ME, Halverson P, et al. A review of instruments assessing public health preparedness. Public Health Rep 2005; 120:532-542.
11.    Savoia E, Massin-Short SB, Rodday AM, Aaron LA, Higdon MA, Stoto MA. Public health systems research in emergency preparedness: a review of the literature. Am J Prev Med 2009; 37:150-156.
12.    Subbarao I, Lyznicki JM, Hsu EB, Gebbie KM, Markenson D, Barzansky B, et al. A consensus-based educational framework and competency set for the discipline of disaster medicine and public health preparedness. Disaster Med Public Health Prep 2008; 2:57-68.
13.    Nelson J.C., Essien J.D.K., Latoff J.S., Wiesner P.J. Collaborative competence in the public health agency: Defining performance at the organizational and individual employee levels. PREVENTION 97 Conference: Research Linkages between Academia and Practice. Atlanta, Georgia 1997.
14.    Hedges LE, Axelrod VM. Assessing Learning. Columbus, USA: Ohio State University, Vocational Instructional Materials Lab; 1995.
15.    Gebbie K, Merrill J. Public health worker competencies for emergency response. J Public Health Manag Pract 2002; 8:73-81.
16.    Hsu EB, Thomas TL, Bass EB, Whyne D, Kelen GD, Green GB. Healthcare worker competencies for disaster training. BMC Med Educ 2006; 6:19.
17.    American College of Emergency Physicians NBC Task Force. Developing Objectives, Content, and Competencies for the Training of Emergency Medical Technicians, Emergency Physicians, and Emergency Nurses to Care for Casualties from Nuclear, Biological, or Chemical (NBC) Incidents: Final Report. Washington, DC: Department of Health and Human Services, Office of Emergency Preparedness; 2001.
18.    International Nursing Coalition for Mass Casualty Education (INCMCE). Educational Competencies for Registered Nurses Responding to Mass Casualty Incidents. 2003. Available from: (accessed 15 May 2010).
19.    Lenz M, Richter T. Disaster response to the release of biohazardous agent: instrument development and evaluation of a firefighter’s exercise. Prehosp Disaster Med 2009; 24:197-203.
20.    Biddinger PD, Cadigan RO, Auerbach BS, Burstein JL, Savoia E, Stoto MA, et al. On linkages: using exercises to identify systems-level preparedness challenges. Public Health Rep 2008; 123:96-101.
21.    Dausey DJ, Buehler JW, Lurie N. Designing and conducting tabletop exercises to assess public health preparedness for manmade and naturally occurring biological threats. BMC Public Health 2007; 7:92.
22.    U.S. Department of Homeland Security. Office for Domestic Preparedness. Homeland Security Exercise and Evaluation Program, Volume I: HSEEP Overview and Exercise Program Management.2007. Available from: (accessed 1. January 2010).
23.    Leitfaden für das Anlegen, Durchführen und Auswerten von Katastrophenschutzübungen. Edition 2009. Bonn, Germany: Bundesamt für Bevölkerungsschutz und Katastrophenhilfe (BBK); 2009.
24.    Bundesamt für Bevölkerungsschutz und Katastrophenhilfe (BBK). Länder Übergreifende Krisenmanagementübung (EXercise).2009. Available from: (accessed 1. January 2010).
25.    Savoia E, Massin-Short SB, Rodday AM, Aaron LA, Higdon MA, Stoto MA. Public health systems research in emergency preparedness: a review of the literature. Am J Prev Med 2009; 37:150-156.
26.    FitzGerald DJ, Sztajnkrycer MD, Crocco TJ. Chemical weapon functional exercise–Cincinnati: observations and lessons learned from a “typical medium-sized” city’s response to simulated terrorism utilizing Weapons of Mass Destruction. Public Health Rep 2003; 118:205-214.
27.    Beaton RD, Oberle MW, Wicklund J, Stevermer A, Boase J, Owens D. Evaluation of the Washington State National Pharmaceutical Stockpile dispensing exercise: Part I – Patient volunteer findings. J Public Health Manag Pract 2003; 9:368-376.
28.    Jasper E, Miller M, Sweeney B, Berg D, Feuer E, Reganato D. Preparedness of hospitals to respond to a radiological terrorism event as assessed by a full-scale exercise. J Public Health Manag Pract 2005; Suppl:S11-16.
29.    Klein KR, Brandenburg DC, Atas JG, Maher A. The use of trained observers as an evaluation tool for a multi-hospital bioterrorism exercise. Prehospital & Disaster Medicine 2005; 20:159-163.
30.    Schleipman AR, Gerbaudo VH, Castronovo FP, Jr. Radiation disaster response: preparation and simulation experience at an academic medical center. J Nucl Med Technol 2004; 32:22-27.
31.    Vinson E. Managing bioterrorism mass casualties in an emergency department: lessons learned from a rural community hospital disaster drill. Disaster Management & Response 2007; 5:18-21.
32.    Wang C, Wei S, Xiang H, Xu Y, Han S, Mkangara OB, et al. Evaluating the effectiveness of an emergency preparedness training programme for public health staff in China. Public Health 2008; 122:471-477.
33.    Williams J, Nocera M, Casteel C. The effectiveness of disaster training for health care workers: a systematic review. Ann Emerg Med 2008; 52:211-222, 222 e211-212.
34.    Crane JS, McCluskey JD, Johnson GT, Harbison RD. Assessment of community healthcare providers ability and willingness to respond to emergencies resulting from bioterrorist attacks. J Emerg Trauma Shock; 3:13-20.
35.    McDowell I, Newell C. Measuring Health: A Guide to Rating Scales and Questionnaires. (2nd ed.) Oxford, UK: Oxford University Press; 1996.
36.    Streiner DL, Norman GR. Health Measurement Scales: A Practical Guide to Their Development and Use (2nd ed.) Oxford, UK: Oxford University Press; 1995.
37.    Ericsson KA, Simon HA. Protocol analysis; Verbal reports as data (revised edition). Cambridge, MA: Bradfordbooks/MIT Press; 1993.
38.    Cronbach LJ. Coefficient alpha and the internal structure of tests. Psychometrika 1951; 16:297-334.
39.    Davidoff F, Batalden P, Stevens D, Ogrinc G, Mooney SE. Publication guidelines for quality improvement studies in health care: evolution of the SQUIRE project. BMJ 2009; 338:a3152.
40.    Lenz M, Kasper J, Mühlhauser I. Development of a patient decision aid for prevention of myocardial infarction in type 2 diabetes – rationale, design and pilot testing. Psychosoc Med 2009; 6:Doc05.
41.    Lenz M, Kasper J. MATRIX – development and feasibility of a guide for quality assessment of patient decision aids. GMS Psychosoc Med 2007; 4:Doc10.
42.    Lenz M, Richter T, Mühlhauser I. The morbidity and mortality associated with overweight and obesity in adulthood: a systematic review (German). Dtsch Arztebl Int 2009; 106:641-648.
43.    Lenz M, Steckelberg A, Richter B, Mühlhauser I. Meta-analysis does not allow appraisal of complex interventions in diabetes and hypertension self-management: a methodological review. Diabetologia 2007; 50:1375-1383.
44.    Lenz M, Steckelberg A, Mühlhauser I. Patient education programmes and decision aids – evaluation of complex interventions (an update). Av Diabetol 2008; 24:443-452.
45.    Lenz M, Kasper J, Mühlhauser I. Searching for diabetes decision aids and related background information. Diabet Med 2006; 23:912-916.
46.    Steckelberg A, Hulfenhaus C, Kasper J, Rost J, Mühlhauser I. How to measure critical health competences: development and validation of the Critical Health Competence Test (CHC Test). Adv Health Sci Educ Theory Pract 2009; 14:11-22.
47.    Berger B, Steckelberg A, Meyer G, Kasper J, Mühlhauser I. Training of patient and consumer representatives in the basic competencies of evidence-based medicine: a feasibility study. BMC Med Educ; 10:16.
48.    Steckelberg A, Balgenorth A, Berger J, Mühlhauser I. Explaining computation of predictive values: 2 x 2 table versus frequency tree. A randomized controlled trial. BMC Med Educ 2004; 4:13.
49.    Steckelberg A, Hulfenhaus C, Kasper J, Mühlhauser I. Ebm@school–a curriculum of critical health literacy for secondary school students: results of a pilot study. Int J Public Health 2009; 54:158-165.
50.    Steckelberg A, Kasper J, Redegeld M, Mühlhauser I. Risk information–barrier to informed choice? A focus group study. Soz Praventivmed 2004; 49:375-380.
51.    Kasper J, Geiger F, Freiberger S, Schmidt A. Decision-related uncertainties perceived by people with cancer – modelling the subject of shared decision making. Psychooncology 2008; 17:42-48.
52.    Kasper J, Légaré F, Scheibler F, Geiger F. Turning signals into meaning – ‘Shared decision making’ meets communication theory. Health Expectations 2010 (in press).
53.    Kasper J, Köpke S, Mühlhauser I, Nubling M, Heesen C. Informed shared decision making about immunotherapy for patients with multiple sclerosis (ISDIMS): a randomized controlled trial. Eur J Neurol 2008; 15:1345-1352.
54.    Kasper J, Lenz M. Criteria for the development and evaluation of decision aids (German). Z Arztl Fortbild Qualitatssich 2005; 99:359-365.
55.    Savoia E, Testa MA, Biddinger PD, Cadigan RO, Koh H, Campbell P, et al. Assessing public health capabilities during emergency preparedness tabletop exercises: reliability and validity of a measurement tool. Public Health Rep 2009; 124:138-148.