Author: Wilbeck, Jennifer
Date published: October 1, 2011
During the past decade, as resident work hours have undergone mandated reductions, the scope of the Acute Care Nurse Practitioner (ACNP) has increasingly expanded to include the performance of advanced procedures in hospital inpatient and intensive care units. Not only to meet the requirements of national credentialing bodies but also to ensure best patient outcomes, ACNP procedural competence for central line insertions must be adequately defined and effectively measured. To date, these definitions and measurement tools remain elusive.
The need for an ongoing competency evaluation method for ACNP procedures in the hospital is not an isolated or localized problem. Although any specialty educated and certified NP may work in an acute care or inpatient setting, the designation of ACNP is reserved for those who have graduated from an accredited ACNP master's program and have successfully passed the ACNP certification board exam. National estimates indicate that of the more than 254,000 advanced practice nurses, only 2.6% or 6,624 are certified as ACNPs.1 With respect to practice settings, 8.5% of responding NPs in the 2004 national Nurse Practitioner Sample Survey2 were practicing in hospital inpatient settings, while only 4.5% of respondents were considered to be ACNPs by certification. Of these, up to 22% were credentialed to "initiate use of central venous catheters."3
As defined by the credentialing body, ACNP scope of practice includes both noninvasive and invasive assessments and interventions, ranging from interpretation of X-rays and electrocardiograms to hemodynamic monitoring, line or tube insertions, and lumbar punctures.4 Although most ACNPs practice within a tertiary care setting, approximately half do so in areas other than intensive care units or other acute care areas. Given the diversity of practice scope and settings for the ACNP, it is a common misperception that the "focus of ACNP practice... involves invasive skills."3 As such, the ACNP scope of practice for advanced procedure performance within the hospital setting is primarily based upon three indicators: educational preparation, certification specialty, and facility-specific credentialing and privileging.
All ACNPs functioning in an inpatient role are subject to The Joint Commission requirement that institutions seeking accreditation demonstrate competence for all credentialed and privileged hospital providers. Professional competence has been defined as "the habitual and judicious use of communication, knowledge, technical skills, clinical reasoning, emotions, values, and reflection in daily practice for the benefit of the individual and community being served."5 More concisely stated, competence is "the possession of a required skill, knowledge, qualification, or capacity."6 Regardless of its definition, the evaluation methods for procedural competence are left to be determined by the internal policies of individual institutions and tend to build upon current knowledge and requirements found within graduate medical education.
In order for an ACNP to function as a provider within a hospital setting, he or she must be both credentialed and privileged to perform specific functions - a process most commonly defined by individual institutional medical staff regulations. The process of credentialing and privileging is designed to collect, validate, and evaluate data representative of the provider's education and training, licensure, current competence, and ability to carry out specified duties. Most institutions currently utilize some type of evaluation which includes, at least in part, a minimum number of procedures which must be safely performed under supervision to demonstrate competency. Current Accreditation Council for Graduate Medical Education (ACGME) guidelines, however, state that "assessment of procedural competence should not be based solely on a minimum number of procedures performed but on a formal evaluation process," creating a challenge for many institutions.7
At Vanderbilt University Medical Center (Figure 1), the six ACGME areas are assessed for general competency during the initial credentialing process.8 The evaluation process includes patient care, medical/clinical knowledge, practice-based learning and improvement, interpersonal and communication skills, professionalism, and systems-based practice. Integration of these areas into an institution's initial credentialing and privileging process meets The Joint Commission requirements for both Focused Professional Practice Evaluation (FPPE) and performance evaluations. There are multiple components of the credentialing and privileging process which address competency (see highlighted boxes of Figure 1). It falls to the individual institution to develop and maintain a reliable process which evaluates the credentials of an ACNP with regard to his or her current competency to perform requested privileges. The fundamental challenge to this assessment of ACNP procedural competency is the lack of validated, evidence-based metrics by which to evaluate provider practices.
Review of the Literature
At initial glance, operationalizing the requirements to evaluate competency would seem easily achieved. This, however, is not the case. A search for methods satisfying the ACGME mandate for formal evaluative processes reveals no evidence-based, standardized evaluation tools for assessing ACNP procedural competence. Those tools identified in the literature are generally found within studies of graduate medical education and do not speak specifically to the procedural competence of ACNPs in the hospital setting. Furthermore, clear relationships between ACNP competency assessments and documentation are not identified in the literature. Therefore, a two-part literature review was undertaken, the first part focusing on competency evaluation methods for ACNP skills and the second evaluating the use of templates and checklists as a means to encourage compliance with national, evidence-based guidelines.
Part I: Methods of Competency Evaluations for ACNP Skills
For the first review, assessments within medical training programs and practices were sought due to the lack of literature specifically addressing ACNP skills competency evaluations. A comprehensive literature search using the search terms "provider competency assessment", "competency assessment tools", and "evaluation tools AND competency" was conducted in the MEDLINE and CINAHL databases from 1995 to 2010. The search was limited to English-language texts describing methods of competency evaluation. Using these search criteria, a total of 353 potentially relevant articles were identified. Further narrowing was accomplished by excluding studies which discussed the concept of competency as it relates to professional behavior rather than procedural skills competence, or the processes of teaching competence or curriculum development. Additionally, articles which did not clearly discuss evaluation methods and those addressing competency assessment within nursing education or non-advanced practice nursing settings were excluded. Finally, because the ACGME and The Joint Commission requirements for competency evaluations apply only within the United States, articles originating outside the United States were excluded. Following cursory article reviews and exclusions, a total of 15 articles were considered for this literature review.
Among the 15 articles included for review, the majority were research studies (n = 7) and reviews (n = 6). Two additional articles consisted of consensus group recommendations for competency evaluation methods. The majority of included articles represented specialty-specific competencies from the emergency medicine, general surgical, gastrointestinal, obstetric and gynecological, and rhinologic and allergy literature.
Interventions Utilized in the Literature.
A total of seventeen different methods for assessing competence were identified in the literature. Methodologies which were not discussed in at least two of the fifteen articles were excluded from further consideration (n = 6). Appendix Table 1 summarizes the eleven most common methods for competency evaluation identified in the current literature: benchmarking, checklist evaluations, global assessments, simulation, standardized patients, objective structured assessment of technical skills (OSATS), objective structured clinical examinations (OSCEs), peer reviews, portfolios, procedure logs, and 360-degree evaluations.
Three studies described the development of a checklist for use in competency assessment within surgical residency programs, and all found the checklists to demonstrate high reliability and construct validity.9-11 One checklist9 utilized a Likert scale to evaluate performance of specific technical skills within a general surgical residency program. Another checklist10 was noted to possess high reliability (0.97 overall) and construct validity specific to 4 of the 6 ACGME competencies, including the patient care domain under which procedural competency falls. Using Cronbach's alpha, the reliability coefficients of these four competencies ranged from 0.92 to 0.95. The competency domains were found to have a moderate degree of correlation, with correlation coefficients between 0.64 and 0.75.
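For readers less familiar with the reliability coefficients reported above, Cronbach's alpha measures internal consistency across the items of a rating instrument. The following is a purely illustrative sketch (not drawn from the cited studies) of the standard computation, using made-up item scores:

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha for k checklist items rated across n observations.

    items: list of k lists, each holding one item's scores across the
           same n observations (e.g., rated procedure performances).
    Formula: alpha = (k / (k - 1)) * (1 - sum(item variances) / total variance)
    """
    k = len(items)
    n = len(items[0])
    # Total score per observation, summed across all items.
    totals = [sum(col[i] for col in items) for i in range(n)]
    item_var_sum = sum(variance(col) for col in items)
    return (k / (k - 1)) * (1 - item_var_sum / variance(totals))
```

When items vary together perfectly, alpha approaches 1.0; values such as the 0.92 to 0.95 reported above indicate high internal consistency of the instrument's subscales.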
One study11 assessed emergency medicine resident procedural competence using procedure specific checklists comprised of critical actions in conjunction with simulations. The evaluation of residents in this study was based upon the checklist of critical actions, in addition to a summative evaluation by physician faculty. The authors determined that the evaluation of procedural competence using both the checklists and summative evaluations correlated with evaluations using written testing methods. Specific to the practice of emergency medicine, a consensus group for Emergency Medicine graduate medical education advocates the use of checklist evaluations as the primary assessment tool for performance of patient care.12
Simulation and OSCEs.
The use of simulation was explored by Weinberg and colleagues as a method for pediatric providers to demonstrate competency.13 Their comprehensive literature review concluded that simulation is an effective tool for training pediatric acute care providers and is also useful in demonstrating competency. Sufficient supporting literature was discussed related to the following aspects of pediatric training: resuscitation, trauma management, airway management, procedural skills, crisis resource management/team training, and disaster/mass casualty training. They concluded that further research is needed to develop valid assessment tools for performance evaluation with regard to competency assessed using simulation.13
An additional literature review identified simulation as an appropriate evaluation method for competence among anesthesia providers.14 In this review, it was noted that simulation is especially helpful in assessing competency related to uncommon clinical presentations or skills. This finding was supported by others who advocated simulation as the primary assessment tool for procedures, and who also agreed that OSCEs are appropriate alternate assessment methods.12
OSATS and Global Assessments.
One recent study conducted among five surgical residency programs evaluated the feasibility of an objective structured assessment of technical skills (OSATS) while performing six different abdominal procedures.15 Included within the OSATS were multiple evaluation methods - a task-specific checklist, global rating scale, and pass/fail assessment. Data analysis for this study supports the conclusion that the OSATS can provide objective, reliable and valid assessments of surgical skills among residency programs.
Based on prior pilot-testing,16 OSATS were used to evaluate the use of video-based assessments of endoscopic sinus surgeries. A two-part assessment tool was utilized including a checklist and global assessment portions, with videoed OSATS assessed by five evaluators using this tool.17 The researchers concluded that the use of videotapes combined with objective assessment tools and the previously validated OSATS method is a feasible and valid method for evaluating surgical skills. Because multiple evaluations increase validity and reliability, an additional benefit of this assessment method is the ability for each video to be viewed and scored by multiple evaluators.
Portfolios (including varied additional methodologies).
The use of portfolios to demonstrate competency was also evaluated. Carraccio and Englander concluded that both written and web-based portfolios show great potential as a single assessment method for competency evaluation among all six of the ACGME domains.18 Following the literature review, these authors developed a web-based portfolio application which incorporated numerous tools for assessment such as procedure logs, benchmarking, and 360-degree evaluations. Challenges to portfolio use, they found, involve the time investment required for accurate compilation.
On a more skill-specific basis, Salerno and colleagues19 concluded that initial provider competency in 12-lead electrocardiogram (ECG) interpretation could be sufficiently demonstrated by completion of an internal medicine residency program, coupled with ACLS courses and board certification. With regard to maintenance of ECG competency, no required number of yearly ECG interpretations or continuing medical education hours was identified. These authors did note that further data on competency testing methods specific to ECGs are needed.
Some have suggested that increased proficiency and improved patient outcomes occur when providers perform high volumes of selected procedures.7,19 Given the exclusion criteria previously detailed, only one study addressing competency related to frequency of procedure performance was identified and included in this literature review.7 Spier and colleagues assessed the number of colonoscopies required for physicians completing a gastrointestinal fellowship to reach a 90% level of independent performance. Using a retrospective analysis, they concluded that significantly more procedures (N = 500) were required to ensure reliable independent practice than the currently recommended minimum (N = 140).
Part II: Templates and Checklists to Facilitate Guideline Compliance
For the second part of the literature review, the impact of standardized documentation template use on compliance with national clinical practice guidelines was assessed. A comprehensive literature search was conducted using the MEDLINE and CINAHL databases from 1990 to 2009 using the following search terms: "chart abstraction", "clinical documentation", "clinical guidelines", "checklists", "compliance", "documentation", "documentation compliance", "documentation templates", "practice guidelines", "standardized documentation", "standardized templates", "templates", and "template documentation methods". Studies addressing care in the outpatient arena were excluded, as were studies which examined general methods to increase guideline compliance and those originally published in languages other than English. Studies which did not explicitly link documentation to compliance with a national guideline and those which focused on the use of electronic charting were also excluded. From 737 potentially relevant articles, a total of eight studies and one consensus paper were critically reviewed. One of the studies,20 however, raised serious concerns for conclusion validity (i.e., the evidence was "D" level) and was therefore not considered further. The remaining seven studies and one expert consensus addressed the use of standardized forms (n = 4) and templates/checklists (n = 4) as they related to national guideline compliance.
Impact of Standardized Forms.
Three studies21-23 and one expert consensus panel24 addressed the effect of standardized forms on national guideline compliance. Chart audits were used to gather data both before and after the standardized forms were initiated in each of the studies; a total of 2,780 charts were reviewed across the three studies. An overall improvement of 12% (N = 200, p < 0.001) in chart completeness of wound care documentation was found in one study,21 while an improvement of 49% (N = 175, p < 0.001) in compliance with national acute asthma treatment guidelines was demonstrated in another.22 Overall improvement scores were not calculated in a third study,23 but improvement among 19 documentation variables was found to be statistically significant with p < 0.001 (N = 2,405). Although only two of these studies22-23 specifically address documentation as it relates to compliance with national clinical practice guidelines, all three agree that the use of structured, standardized forms led to improved overall completeness of documentation compared with free-text documentation.
The use of standardized forms is further supported by the International Expert Wound Care Advisory Panel.24 In response to recent CMS guidelines requiring specific provider documentation to maintain reimbursements, this expert panel recommends using standardized forms and consistent terminology as a method to facilitate documentation and ensure patient safety throughout inpatient stays. With particular focus on pressure ulcers, the Panel advocates for standardized forms which include validated risk assessment scales and detailed documentation checklists.
Impact of Templates and Checklists.
An estimated 1,745 charts were reviewed in four studies25-28 to address the effect of standardized templates/checklists on national guideline compliance. Three of the four studies were of high quality ("A" and "B" ratings). The one study26 found to have less than ideal quality demonstrated differences in group compositions, blurring of the study groups (cross-over potential), and a lack of strict definitions for diseases.
Higgins and Becker25 noted an overall improvement in documentation compliance of 40% (N = 1000, p < 0.001) when templates and ongoing quality improvement initiatives were utilized. Although their study assessed compliance with national guidelines specific to academic medical center documentation rather than with general clinical practice guidelines, and concurrent quality improvement initiatives were in place throughout the study, the template must still be presumed to have played some role in the positive outcome. A second study26 reported a 29% improvement (N = 201, p < 0.001) in compliance with national bariatric surgery quality measures when dictation templates were introduced. Another27 used a simple stamped template to improve compliance with adolescent psychosocial screenings in the emergency department (an increase of 9%, N = 201, p < 0.01). Checklists and pathways were shown in the fourth study28 to increase core measure adherence by 21.4% in the emergency department (N = 239). Additionally, two studies noted that the use of standardized documentation methods not only increased documentation compliance but also changed clinical practices overall to adhere more closely to national guideline recommendations.22-23
In summary, all of the studies reviewed supported the use of standardized documentation as a method to increase compliance with national guidelines, and all showed statistically significant results to support the recommendation. Of the eleven methods described above for assessing competency, each is addressed within the ACGME Toolbox,8 a detailed description of various evaluation methods with suggestions for use in graduate medical education. Despite the numerous evaluation methods, guidelines, and recommendations identified, a clearly superior method for ongoing procedural competency evaluation remains undefined.
The OSATS is considered, by some, the superior method for patient care topics,29 while simulations are best for evaluating medical procedures such as central line placements. Additionally, as technology advancements lead to new electronic and web-based programs, many of the tools discussed may be revised into electronic formats. This literature review identified programs and researchers already exploring and validating such methods; these applications are expected to increase in coming years.
Another lingering question concerns the number of procedures required to demonstrate competency. Many contend that procedural competence has never been validated by numbers alone.6,7,12 Even if competency is defined as the percentage of time in which successful, independent skill performance is achieved, there is no established, evidence-based percentage known to represent competence. Some advocate that 90% independence be required for physicians to be considered competent in independent colonoscopy performance7; however, might others not consider 80% an acceptable, safe alternative? Additionally, patient outcomes are a poor indicator of provider competence, as they are rarely related only to the provider's performance; too many variables influence these outcomes.
Consensus does exist that written exams and patient outcomes are outdated and undesirable as evaluation tools in competency assessment.5,14,29,30 Considering the individual capabilities of providers, scores on multiple-choice exams could exceed actual competence if the provider had, for example, a photographic memory yet deficient clinical judgment. Alternately, for providers with significant test anxiety, the level of actual competence may not be accurately reflected on a written exam. Perhaps for these reasons, traditional knowledge testing with written exams has reportedly not been proven valid.14
There is overwhelming consensus that the use of multiple and overlapping evaluation methods is necessary to achieve the most valid assessment of provider competency; a single parameter is not sufficient.5,7,14,31 Though some evidence-based metrics for competency evaluation among physicians and house staff have been developed, none specifically focus on the evaluation of ACNP procedural competence. As such, ongoing competency assessment models for the inpatient ACNP are left to be identified within medical models and translated to the advanced practice nursing domain.
Strategies for Practice
Despite the identified gaps and ambiguities within the literature, clear applications for assessment of ACNP procedural competence for central line insertion can be made. The individuals and offices responsible for overseeing ongoing ACNP procedural competence must ensure that the chosen multiple methods are both appropriate to the setting and feasible. For example, although simulation has been shown to be a valid and reliable measure for procedural competency assessment, it is better utilized within the settings of education and focused professional practice evaluations during initial credentialing rather than for ongoing evaluation.
First and foremost, any suitable method for valid, ongoing ACNP procedural competency evaluation must encompass multiple parameters. Portfolios encompass a collection of various documents, such as procedure logs and benchmarking, allowing providers to highlight learning experiences and achievements. In one identified literature review, the use of both written and web-based portfolios was shown to have great potential as a single assessment method for competency evaluation among all six of the ACGME domains.18 Portfolios, ideally electronic, were also considered by a consensus group for Emergency Medicine education as appropriate choices for performance evaluations.12
Pervasive concerns regarding the use of procedure logs as a singular method for competency assessment were identified despite the overall support for their use within larger portfolios. Although procedure logs have historically been utilized as documentation of competence, a strictly numbers-based approach to procedural competence lacks validation. Additionally, portfolios require a large time investment and accurate documentation by providers for accurate compilation. For these reasons, an attestation statement of competent performance is warranted for both procedure logs and portfolios.4,12
Although benchmarking and checklists provide valuable data within a portfolio, ongoing procedural competence cannot be demonstrated without concurrent use of a validated standard. With each of these methods, the incorporation of national clinical practice guidelines as the comparison metric for ACNP competence is not only feasible, but appropriate and attainable. As evidence-based guidelines by definition represent best practice for safe, quality care, adherence to such guidelines can be used in part as evidence of procedural competence. One guideline detailing best practices for central line insertion is available from the Agency for Healthcare Research and Quality's National Guideline Clearinghouse, entitled "Strategies to prevent central line associated bloodstream infections in acute care hospitals."32 Review of this practice guideline using the AGREE instrument found the guideline to be valid and reliable. Tying competency checklists to evidence-based national guidelines ensures that the evaluation process assesses much more than a mere technical skill.
To integrate these recommendations and most effectively demonstrate ongoing evaluation of ACNP central line insertion, it is recommended that an electronic procedural documentation template be developed from evidence-based clinical practice guidelines. Doing so will capture documentation that demonstrates adherence to national standards and best practices. Review of this electronic data may then be easily translated for the purposes of benchmarking each provider's practices. An attestation statement by the ACNP's collaborating physician or another privileged provider combined with a portfolio including the documentation data/benchmarking will not only meet TJC mandates, but encourage ACNP adherence to clinical practice guidelines.
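As a purely hypothetical sketch of this recommendation, a guideline-derived template can be represented as a checklist whose per-record completion rates feed provider-level benchmarking. The field names below are illustrative placeholders, not items taken from the cited guideline:

```python
# Hypothetical sketch of a guideline-derived documentation template for
# central line insertion, with a simple per-provider compliance benchmark.
# Checklist element names are illustrative placeholders only.

CENTRAL_LINE_CHECKLIST = [
    "hand_hygiene_performed",
    "full_barrier_precautions_used",
    "skin_antisepsis_documented",
    "insertion_site_selection_documented",
    "sterile_technique_maintained",
]

def compliance_rate(record):
    """Fraction of checklist elements documented as completed in one record."""
    done = sum(1 for field in CENTRAL_LINE_CHECKLIST if record.get(field))
    return done / len(CENTRAL_LINE_CHECKLIST)

def benchmark(provider_records):
    """Average documentation compliance across a provider's procedure log."""
    rates = [compliance_rate(r) for r in provider_records]
    return sum(rates) / len(rates)
```

In such a scheme, each completed electronic template yields a compliance rate, and the averages across a provider's procedure log could be compared against peers or an institutional threshold, giving the portfolio a quantitative, guideline-anchored component.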
Today's healthcare environment, with its heightened attention to high-quality, safe patient outcomes, requires ACNPs to demonstrate competency in order to obtain and maintain positions as credentialed and privileged providers in hospital settings. Unfortunately, no evidence-based metrics specific to APN advanced skill performance can be identified in the literature. This gap leaves competency assessment models to be identified within medical models and translated to the advanced practice nursing domain. The overwhelming consensus is that the use of multiple and overlapping evaluation methods is necessary to achieve the most valid assessment of provider competence; a single parameter is not sufficient.5,7,14,31 To close this gap, new APN-specific competency assessment tools that are feasible, valid, and reliable must be developed. Development of such evidence-based tools must become a priority within institutions that utilize ACNPs for central line insertions. Doing so will not only promote a culture of patient safety and quality outcomes but also encourage ACNP engagement and transformation toward strong, competent performance.
1. Health Resources & Services Administration. The registered nurse population: findings from the 2008 national sample survey of registered nurses. http://bhpr.hrsa.gov/healthworkforce/rnsurveys/rnsurveyfinal.pdf. Accessed July 20, 2011.
2. Goolsby MJ. 2004 AANP national nurse practitioner sample survey, part 1: an overview. J Am Acad Nurse Pract. 2005;17:337-341.
3. Kleinpell RM. Acute care nurse practitioner practice: results of a 5-year longitudinal study. Am J Crit Care. 2005;14:211-221.
4. Melander S, Kleinpell R, McLaughlin R. Ensuring clinical competency for NPs in acute care. Nurse Pract. 2007;32(4):19-20.
5. Epstein RM, Hundert EM. Defining and assessing professional competence. JAMA. 2002;287:226-235.
6. Vargo JJ. North of 100 and south of 500: where does the "sweet spot" of colonoscopic competence lie? Gastrointest Endosc. 2010;71:325-326.
7. Spier BJ, Benson M, Pfau PR, Nelligan G, Lucey MR, Gaumnitz EA. Colonoscopy training in gastroenterology fellowships: determining competence. Gastrointest Endosc. 2010;71:319-324.
8. Accreditation Council for Graduate Medical Education. Toolbox of assessment methods. http://www.acgme.org/Outcome/assess/Toolbox.pdf. Accessed July 20, 2011.
9. Anderson CI, Jentz AB, Harkema JM, Kareti LR, Apelgren KN, Slomski CA. Assessing the competencies in general surgery residency training. Am J Surg. 2005;189:288-292.
10. Brasel KJ, Bragg D, Simpson DE, Weigelt JA. Meeting the accreditation council for graduate medical education competencies using established residency training program assessment tools. Am J Surg. 2004;188:9-12.
11. Noeller TP, Smith MD, Holmes L, et al. A theme-based hybrid simulation model to train and evaluate emergency medicine residents. Acad Emerg Med. 2008;15:1199-1206.
12. King RW, Schiavone F, Counselman FL, Panacek EA. Patient care competencies in emergency medicine graduate medical education: results of a consensus group on patient care. Acad Emerg Med. 2002;9:1227-1235.
13. Weinberg ER, Auerbach MA, Shah NB. The use of simulation for pediatric training and assessment. Curr Opin Pediatr. 2009;21:282-287.
14. Tetzlaff JE. Assessment of competence in anesthesiology. Curr Opin Anaesthesiol. 2009;22:809-813.
15. Goff B, Mandel L, Lentz G, et al. Assessment of resident surgical skills: is testing feasible? Am J Obstet Gynecol. 2005;192:1331-1340.
16. Lin SY, Kulsoom L, Ishii M, et al. Development and pilot testing of a feasible, reliable, and valid operative competency assessment tool for endoscopic sinus surgery. Am J Rhinol Allergy. 2009;23:354-359.
17. Laeeq K, Infusino S, Lin SY, et al. Video-based assessment of operative competency in endoscopic sinus surgery. Am J Rhinol Allergy. 2010;24:1-4.
18. Carraccio C, Englander R. Evaluating competence using a portfolio: a literature review and web-based application to the ACGME competencies. Teach Learn Med. 2004;16:381-387.
19. Salerno SM, Alguire PC, Waxman HS. Training and competency evaluation for interpretation of 12-lead electrocardiograms: recommendations from the American College of Physicians. Ann Intern Med. 2003;138:747-750.
20. Nicol MF. A risk management audit: are we complying with the national guidelines for sedation by non-anaesthetists? J Accid Emerg Med. 1999;16:120-122.
21. Kanegaye JT, Cheng JC, McCaslin RI, Trocinski D, Silva P. Improved documentation of wound care with a structured encounter form in the pediatric emergency department. Ambul Pediatr. 2005;5:253-257.
22. Robinson SM, Harrison BDW, Lambert MA. Effect of a preprinted form on the management of acute asthma in an accident and emergency department. J Accid Emerg Med. 1996;13: 93-97.
23. Wrenn K, Rodenwald L, Lumb E, Slovis C. The use of structured, complaint-specific patient encounter forms in the emergency department. Ann Emerg Med. 1993;22:805-812.
24. Armstrong DG, Ayello EA, Capitulo KL, et al. New opportunities to improve pressure ulcer prevention and treatment: implications of the CMS Inpatient Hospital Care Present on Admission (POA) Indicators/Hospital-Acquired Conditions (HAC) Policy. A consensus paper from the International Expert Wound Care Advisory Panel. J Wound Ostomy Continence Nurs. 2008;35:485-492.
25. Higgins GL, Becker MH. A continuous quality improvement approach to IL-372 documentation compliance in an academic emergency department, and its impact on dictation costs, billing practices, and average length of stay. Acad Emerg Med. 2000;7:269-275.
26. Parikh JA, Yermilov I, Jain S, McGory M, Ko CY, Maggard MA. How much do standardized forms improve the documentation of quality of care? J Surg Res. 2007;143:158-163.
27. van Amstel LL, Lafleur DL, Blake K. Raising our HEADSS: Adolescent psychosocial documentation in the emergency department. Acad Emerg Med. 2004;11: 648-655.
28. Wolff AM, Taylor SA, McCabe JF. Using checklists and reminders in clinical pathways to improve hospital inpatient care. Med J Aust. 2004;181:428-431.
29. Sultana CJ. The objective structured assessment of technical skills and the ACGME competencies. Obstet Gynecol Clin North Am. 2006;33:259-265.
30. Baker WE. Evaluation of physician competency and clinical performance in emergency medicine. Emerg Med Clin North Am. 2009;27:615-626.
31. Marple BF. Competency-based resident education. Otolaryngol Clin North Am. 2007;40:1215-1225.
32. Agency for Healthcare Research and Quality, National Guideline Clearinghouse. Strategies to prevent central line-associated bloodstream infections in acute care hospitals. http://www.guideline.gov/content.aspx?id=13395&search=central+line+insertion. Accessed July 20, 2011.
Jennifer Wilbeck, DNP, APRN, CEN, Assistant Professor and FNP/ACNP-ED Program Coordinator
Vanderbilt University School of Nursing, Nashville, TN
Marguerite Murphy, DNP, RN, DNP Program Director
Georgia Health Sciences University, College of Nursing, Augusta, GA
Janie Heath, PhD, APRN, FAAN, Associate Dean of Academic Affairs and E. Louise Grant Endowed Chair of Nursing
Georgia Health Sciences University, College of Nursing, Augusta, GA
Clare Thomson-Smith JD, MSN, RN, FAANP, Director for Advanced Practice Nursing
Vanderbilt University Medical Center, & Assistant Dean for Faculty Practice
Vanderbilt University School of Nursing, Nashville, TN
Correspondence concerning this article should be addressed to Jennifer.firstname.lastname@example.org