DRIVERS BEST PRACTICES ASSESSMENT
Version 2.7
About NIRN
The mission of the National Implementation Research Network (NIRN) is to contribute to the best practices and science of implementation, organization change, and system reinvention to improve outcomes across the spectrum of human services.
email: nirn@unc.edu
web: http://nirn.fpg.unc.edu
Citation and Copyright
This document is based on the work of the National Implementation Research Network (NIRN).
SUGGESTED CITATION
Ward, C., Metz, A., Louison, L., Loper, A., & Cusumano, D. (2018). Drivers Best Practices Assessment. Chapel Hill, NC:
National Implementation Research Network, University of North Carolina at Chapel Hill. Based on: Fixsen, D.L.,
Blase, K., Naoom, S., Metz, A., Louison, L., & Ward, C. (2015). Implementation Drivers: Assessing Best Practices.
Chapel Hill, NC: National Implementation Research Network, University of North Carolina at Chapel Hill.
This content is licensed under the Creative Commons license CC BY-NC-ND, Attribution-NonCommercial-NoDerivs.
You are free to share, copy, distribute, and transmit the work under the following conditions: Attribution — You
must attribute the work in the manner specified by the author or licensor (but not in any way that suggests that
they endorse you or your use of the work); Noncommercial — You may not use this work for commercial purposes;
No Derivative Works — You may not alter, transform, or build upon this work. Any of the above conditions can be
waived if you get permission from the copyright holder.
We ask that you let us know how you use these items so we can use your experience and data to improve and expand
the survey. Please email us at nirn@unc.edu.
Table of Contents
Introduction & Purpose
Focus of the Assessment
Administration Process & Key Roles
Scoring
Preparation for the Administration
Research Basis and Outcomes from Completion
Next Steps After Administration
Drivers Best Practices Assessment: Fidelity Checklist
Drivers Best Practices Assessment: Scoring Form
Drivers Best Practices Assessment: Scoring Rubric
References
Appendix A: Validation
Appendix B: Action Plan Template
Introduction & Purpose
The purpose of the Drivers Best Practices Assessment (DBPA) is to assist organizations in assessing their current
supports and resources for quality use of selected programs or practices. Specifically, organizations can use it to:
Identify strengths and opportunities for improvement in their current supports and resources;
Select implementation best practices to strengthen staff competency and organizational practices; and
Provide an implementation team with a structured process to develop an action plan and data to monitor
progress.
Focus of the Assessment
The Drivers Best Practices Assessment is administered for a specific practice or program, rather than for the
organization in general. The essential functions of the program or practice should be known and clearly defined.
It is important to choose one practice or program and answer the questions with that selected practice or program
in mind.
Administration Process & Key Roles
The administration of this tool is conducted by the Facilitator, who introduces the assessment's purpose, provides an overview of the administration process and scoring, introduces the concepts or big ideas measured, reads each item aloud and provides necessary clarification, and engages the team in the discussion and voting process. Information about key roles is provided in the table below:
Facilitator
An individual who has been trained in the administration process, has experience with the organization, and
has a relationship with the respondent.
The facilitator is responsible for:
Leading discussion and adhering to the administration process; and
Contextualizing items for respondents or providing examples of the organization’s work.
The facilitator does not vote.
Note Taker
Key responsibilities include recording scores, ideas shared for action planning, and any questions and issues that are raised during administration. The Note Taker does not vote.
Participants
Participants include implementation team members and other staff who have roles in implementation of the selected practice or program, are involved in different support activities, or are in a leadership role for the organization and responsible for overseeing aspects of the implementation infrastructure. Participants vote on each item, discuss differences in scores, and achieve modified consensus.
Observer
Observers are invited with permission of the implementation team to learn about the process or the activities
in the organization. Observers do not vote.
Scoring
The identified participants complete the Drivers Best Practices Assessment by discussing each item and coming
to consensus on the final score for each item. The respondents score each item on a three-point scale (in
place = 2 points, partially in place = 1 point, not in place = 0 points) using a simultaneous and public
voting process. This type of voting facilitates participation of all respondents and neutralizes any potential power
influences. When asked to vote (e.g., “Ready, set, vote”), participants simultaneously hold up either two fingers to
vote “Fully in Place,” one finger to vote “Partially in Place,” or a closed hand to vote “Not Yet.” Alternately, teams
can use numbered cards to vote. If the team is unable to arrive at consensus, additional data sources documented
in the Scoring Rubric can be used to prompt thinking and help achieve modified consensus. Modified consensus
means that voters in the minority can live with and support the majority decision on an item. If modified
consensus cannot be reached, the Facilitator guides the team to identify a later time for further discussion. The
majority vote is recorded.
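As a minimal sketch of how this tally works, assuming the 0/1/2 point values described above (the function name and structure below are illustrative, not part of the official tool), the recorded score for an item could be derived from the individual votes as follows:

from collections import Counter

def record_item_score(votes):
    """votes: one rating per participant (2 = in place, 1 = partially in place, 0 = not in place)."""
    tally = Counter(votes)
    if len(tally) == 1:
        # Everyone held up the same number of fingers: consensus on the first vote.
        return votes[0]
    # Otherwise the team discusses the differences and re-votes; whether or not modified
    # consensus is reached, the protocol records the majority vote for the item.
    majority_score, _count = tally.most_common(1)[0]
    return majority_score

For example, votes of [2, 2, 1] would be recorded as a 2 once discussion is complete, with the minority view and any open questions captured by the Note Taker.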
Preparation for the Administration
The following should be in place prior to administering the Drivers Best Practices Assessment:
Facilitator should have knowledge of the concepts measured in the assessment and experience supporting
organizations using implementation best practices;
Implementation Team has agreed to administration and commitment of time (approximately one hour for
preparation, two hours for completing the assessment, and one hour for action planning); and
Materials have been assembled in preparation for administration, including:
» Blank copies (paper or electronic) of the DBPA rubric accessible to all respondents;
» Data sources (e.g., policies, procedures) to inform the assessment; and
» If relevant, previously completed administrations including reports.
Research Basis and Outcomes from Completion
SCALES & SUBSCALES (ITEM #)
Competency: average of items 1-16
• Selection: 1, 2, 3, 4, 5
• Training: 6, 7, 8
• Coaching: 9, 10, 11, 12
• Fidelity: 13, 14, 15, 16
Organization: average of items 17-30
• Decision-Support Data System: 17, 18, 19, 20
• Facilitative Administration: 21, 22, 23, 24, 25, 26, 27
• Systems Intervention: 28, 29, 30
Total: average of all items
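For teams that tabulate results electronically, the table above implies a simple averaging scheme. The following sketch (illustrative only; the names are hypothetical and not part of the tool) shows how subscale, scale, and total scores can be computed from the 30 consensus ratings:

SUBSCALE_ITEMS = {
    "Selection": [1, 2, 3, 4, 5],
    "Training": [6, 7, 8],
    "Coaching": [9, 10, 11, 12],
    "Fidelity": [13, 14, 15, 16],
    "Decision-Support Data System": [17, 18, 19, 20],
    "Facilitative Administration": [21, 22, 23, 24, 25, 26, 27],
    "Systems Intervention": [28, 29, 30],
}

def dbpa_scores(ratings):
    """ratings: dict mapping item number (1-30) to its consensus rating (0, 1, or 2)."""
    def avg(item_numbers):
        return sum(ratings[n] for n in item_numbers) / len(item_numbers)
    return {
        "subscales": {name: avg(items) for name, items in SUBSCALE_ITEMS.items()},
        "Competency": avg(range(1, 17)),     # average of items 1-16
        "Organization": avg(range(17, 31)),  # average of items 17-30
        "Total": avg(range(1, 31)),          # average of all 30 items
    }

Each resulting score falls between 0 (nothing in place) and 2 (fully in place), which makes repeated administrations directly comparable.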
The Implementation Drivers include:
Competency: Strategies to develop, improve, and sustain practitioners' ability to implement a program or practice as intended in order to achieve desired outcomes. Competency Drivers include: Fidelity, Selection, Training, and Coaching.
Organization: Strategies for analyzing, communicating, guiding, and responding to data in ways that result in continuous improvement of supports for staff to use the selected program or practice. Organization Drivers include: Decision-Support Data System, Facilitative Administration, and Systems Intervention.
The Drivers Best Practices Assessment is derived from implementation science research and the
Active Implementation Framework of the Implementation Drivers (Fixsen, Naoom, Blase, Friedman, & Wallace,
2005; Metz, Bartley, Ball, Wilson, Naoom, & Redmond, 2014). Implementation Drivers are core components
or building blocks of the infrastructure needed to support practice, organizational, and systems change. The
Drivers emerged from a synthesis of commonalities among successfully implemented programs and practices
(Fixsen et al., 2005). See Appendix A for information regarding validation of the measure.
Implementation Drivers are the key
components of functional supports that
enable a program’s success.
Next Steps After Administration
The Drivers Best Practices Assessment was created to guide organizations as they develop the infrastructure to
support use of selected programs or practices. As such, it is recommended that teams engage in the following
activities after they complete each administration:
Review and use the (a) Total score, (b) Scale Scores, and (c) Item Scores to identify areas of strength and
need;
Identify priorities to address within a plan;
Develop and create an Action Plan (Appendix B) that defines immediate and short-term actions focusing on
improving the infrastructure activities to support use of the selected program or practice.
If this is a repeated administration, review and update existing plan to continue support for the selected
program or practice.
For more information on the Implementation Drivers derived by the National Implementation Research Network,
visit http://nirn.fpg.unc.edu and the Active Implementation Hub at http://implementation.fpg.unc.edu.
DRIVERS BEST PRACTICES ASSESSMENT: Fidelity Checklist
PROTOCOL STEPS
Rating key:  + : Yes   - : No   / : Unsure or not applicable

1. SKILLED FACILITATOR
An individual with knowledge of implementation drivers and skill in administering the assessment is identified to facilitate.

2. RESPONDENTS INVITED
Facilitator invites participants, including Implementation Team members, who have a role in developing, monitoring, and improving implementation supports.

3. PROGRAM/PRACTICE IDENTIFIED
A well-defined program or practice is identified for the assessment.

4. MATERIALS PREPARED IN ADVANCE
Facilitator ensures that language in the assessment has been contextualized for the agency, copies (paper or electronic) are available for each participant, a note taker has been identified, and a room is set up with a laptop, projector, internet connection, and conference phone (video if possible) for any participants joining remotely.

5. OVERVIEW
Facilitator provides a review of the assessment, its purpose, and instructions for scoring the items.

6. CONSENT
Facilitator obtains informed consent from participants to collect and use their responses to understand implementation status and inform action planning.

7. DOCUMENTATION
Facilitator documents the date of the assessment, the names and roles of participants, and the intervention being assessed.

8. ADMINISTRATION & INTRODUCTION
The Facilitator introduces the Implementation Drivers one at a time and provides an overview of the best practices of each Driver. Facilitator then asks the team to describe their current practices, asks the team which agency in the system has responsibility for the driver, and then directs the team to complete the items through discussion and consensus.

9. CONSENSUS
The team is given time to review, discuss, and come to consensus on the score for each item through a voting process. Facilitator answers questions, contextualizes, and provides clarification as needed for the respondents. The Facilitator also seeks equity of voice from all participants to ensure a complete assessment of the context.
10. RECORDING
The team documents each scoring decision electronically or on the scoring form used to record scores.

11. NOTE-TAKING
For items where further clarity or information is needed, the Facilitator notes the question in the "Notes" section. A note taker captures the team discussion of each Implementation Driver in the relevant section.

12. DATA SUMMARY
After the last question has been asked and answered, the Facilitator or Note Taker generates the reports and distributes graphs of total scores.

13. REVIEW
While viewing the graphs, Facilitator prompts the team in a discussion of the results to identify strengths and opportunities. If a repeated administration, Facilitator highlights all of the subscales that moved in a positive direction and celebrates progress. Facilitator initiates a discussion of updates on achievements, progress, and major milestones or barriers that have occurred since the previous administration.

14. PLANNING
If there is time to review the results and action plan, Facilitator engages the team in a prioritization process for identifying key areas for planning and needed actions. If there is not sufficient time for review of results and action planning, the Facilitator ensures that a date and time are set for the review and action planning.

15. CONCLUSION
Facilitator thanks the team for their openness and for sharing in the discussion.

NOTES
DRIVERS BEST PRACTICES ASSESSMENT: Scoring Form
Today's Date:
Individuals Participating in the Assessment:
Relevant Staff for Practice/Program:
Which staff are involved in use of the practice/program?
Which of those staff are considered in this assessment?
Facilitator(s):
Practice/Program Being Assessed Today:

Use the Scoring Form below to capture the respondent team's final score for each item. If the respondent team is unable to arrive at consensus, additional data sources for each item are documented in the Scoring Rubric.
Note a rating for each item below: 2 - IN PLACE   1 - PARTIALLY IN PLACE   0 - NOT IN PLACE

SELECTION
1. There is someone accountable for the recruitment and selection of relevant staff for the program or practice.
2. Job descriptions are in place for relevant staff that carry out the program or practice.
3. Individuals accountable for selection understand the skills and abilities needed for relevant staff.
4. Selection protocols are in place to assess competencies for relevant staff that carry out the program or practice.
5. Selection processes are regularly reviewed.

TRAINING
6. There is someone accountable for the training of relevant staff for the program or practice.
7. Agency staff provide or secure skill-based training for relevant staff on the program or practice.
8. Agency staff use training data for improvement.

COACHING
9. There is someone accountable for coaching of relevant staff for the program or practice.
10. Coaching is provided to improve the competency of relevant staff for the program or practice.
11. Agency staff use a coaching service delivery plan.
12. Agency staff regularly assess coaching effectiveness.

FIDELITY
13. There is someone accountable for the fidelity assessments of relevant staff for the program or practice.
14. The agency uses a fidelity assessment for the program or practice.
15. Agency staff follow a protocol for fidelity assessments.
16. Agency staff use fidelity data to improve program or practice outcomes and implementation supports.

DECISION-SUPPORT DATA SYSTEM
17. There is someone accountable for the decision-support data system.
18. Agency staff have access to relevant data for making decisions for program improvement.
19. Data are useful and usable.
20. Agency staff have a process for using data for decision-making.

FACILITATIVE ADMINISTRATION
21. Leadership sets aside resources to support the development of staff competency to deliver the program or practice.
22. Leadership develops and/or refines internal policies or procedures that support the program or practice.
23. Leadership makes changes in organization roles, functions, and structures as needed to accommodate the program or practice.
24. Leadership engages in regular communication with all staff and service users regarding the program or practice.
25. Leadership visibly promotes the importance of effectively implementing the program or practice.
26. Leadership problem-solves challenges to implement the program or practice effectively.
27. Leadership recognizes and appreciates staff contributions to implement the program or practice effectively.

SYSTEMS INTERVENTION
28. Leadership engages stakeholders and staff in developing a shared understanding of the need for the program or practice.
29. Leadership creates opportunities for stakeholders and staff to learn and design solutions together to support the program or practice.
30. Leadership regularly communicates with stakeholders regarding the program or practice.
DRIVERS BEST PRACTICES ASSESSMENT: Scoring Rubric

SELECTION
The Selection Driver refers to use of a purposeful process for selection of staff with the required skills, abilities, and other program/practice-specific prerequisite characteristics.
Tell me about your selection process(es). Record responses:
What agency is primarily responsible for this driver? Record responses:

1. There is someone accountable for the recruitment and selection of relevant staff for the program or practice.
2 - IN PLACE: A specific person is responsible for coordinating the quality and timeliness of recruitment and selection processes for relevant staff supporting the program or practice. This person is able to execute the responsibilities related to his/her role in the selection process.
1 - PARTIALLY IN PLACE: A specific person is responsible for coordinating the quality and timeliness of recruitment and selection processes for relevant staff supporting the program or practice.
0 - NOT IN PLACE: There is not a specific person responsible for coordinating the quality and timeliness of recruitment and selection processes for relevant staff supporting the program or practice.
DATA SOURCE: Job description of person accountable for recruitment and selection

2. Job descriptions are in place for relevant staff that carry out the program or practice.
2 - IN PLACE: Job descriptions are: clear about expectations for the position; aligned with the competencies required for the program to be used competently.
1 - PARTIALLY IN PLACE: Job descriptions are clear about expectations for the position.
0 - NOT IN PLACE: Job descriptions are not clear about expectations for the position or aligned with the competencies.
DATA SOURCE: Job descriptions
3. Individuals accountable for selection understand the skills and abilities needed for relevant staff.
2 - IN PLACE: Individuals accountable for selection: know the knowledge, skills, and abilities related to the staff position; accurately assess applicant knowledge, skills, and abilities.
1 - PARTIALLY IN PLACE: Individuals accountable for selection know the knowledge, skills, and abilities related to the staff position.
0 - NOT IN PLACE: Individuals accountable for selection have little or no knowledge of the knowledge, skills, and abilities related to the staff position.
DATA SOURCE: Job descriptions; selection protocol

4. Selection protocols are in place to assess competencies for relevant staff that carry out the program or practice.
2 - IN PLACE: Selection protocol includes all of the following: an assessment of core skills needed for the position; specific procedures (e.g., scenario, role play) for assessing the individual's ability to perform key skills; specific procedures for assessing ability to receive and use feedback provided during the interview; a documented process for review of adherence to the interview protocol; a record of the ratings of individuals' responses.
1 - PARTIALLY IN PLACE: Selection protocol includes all of the following: an assessment of core skills needed for the position; a documented process for review of adherence to the interview protocol; a record of the ratings of individuals' responses.
0 - NOT IN PLACE: A generic selection protocol (e.g., a similar protocol for any position) exists.
DATA SOURCE: Selection protocol (including procedures used during the selection process); data showing the results of core skills assessments

5. Selection processes are regularly reviewed.
2 - IN PLACE: Selection processes are annually reviewed and revised as needed to improve the selection process. The annual review examines at least three of the following: interview results (e.g., protocol adherence, applicant responses); training data; turnover data; fidelity data; exit interview results.
1 - PARTIALLY IN PLACE: Selection processes are annually reviewed and revised as needed to improve the selection process. The annual review examines at least one of the following: interview results (e.g., protocol adherence, applicant responses); training data; turnover data; fidelity data; exit interview results.
0 - NOT IN PLACE: Selection processes are not reviewed and revised.
DATA SOURCE: Selection process documentation; data on selection outcomes
TRAINING
The Training Driver refers to use of purposeful, skill-based, and adult-learning-informed processes designed to support relevant staff in acquiring the skills and information needed to support the program/practice. Training of relevant staff at the agency provides knowledge related to the theory and underlying values of the program/practice, opportunities to practice new skills to meet fidelity criteria, and feedback in a safe and supportive training environment.
Tell me about your training process(es). Record responses:
What agency is primarily responsible for this driver? Record responses:

6. There is someone accountable for the training of relevant staff for the program or practice.
2 - IN PLACE: A specific person is responsible for coordinating quality and timeliness of training for relevant staff supporting the program or practice. This person is able to execute the responsibilities related to his/her role in training.
1 - PARTIALLY IN PLACE: A specific person is responsible for coordinating quality and timeliness of training for relevant staff supporting the program or practice.
0 - NOT IN PLACE: There is not a specific person responsible for coordinating quality and timeliness of training for relevant staff supporting the program or practice.
DATA SOURCE: Job description of person accountable for training
7. Agency staff provide or secure skill-based training for relevant staff on the program or practice.
2 - IN PLACE: Training is: required and provided before staff begin to use the program or practice; provided by trainers who have deep content knowledge and who are effective trainers; skill-based and includes opportunities for practice and feedback in a safe environment; comprehensive, including practice-specific and complementary skills (e.g., equity, diversity, and inclusion).
1 - PARTIALLY IN PLACE: Training is: required and provided before staff begin to use the program or practice; provided by trainers who have deep content knowledge and effective presentation delivery skills.
0 - NOT IN PLACE: Training is not required and/or is not provided before staff begin to use the new program or practice, or is not provided by trainers who have deep content knowledge and effective presentation delivery skills.
DATA SOURCE: Professional learning schedule; training outlines or agendas; training evaluations; presenter qualifications; agendas for training presenters

8. Agency staff use training data for improvement.
2 - IN PLACE: Training assessment data are: collected and used to improve future training activities; and provided to supervisors and coaches in a timely manner to improve staff competency and other implementation drivers.
1 - PARTIALLY IN PLACE: Training assessment data are collected and used to improve future training activities.
0 - NOT IN PLACE: Training assessment data are not collected.
DATA SOURCE: Training outcome data; evidence that data are used for improvements
COACHING
The Coaching Driver refers to the purposeful process of supporting staff to generalize newly learned skills for the program/practice, to be used competently by the practitioner in real-world settings and interactions.
Tell me about your coaching process(es). Record responses:
Which agency is primarily responsible for this driver? Record responses:

9. There is someone accountable for coaching of relevant staff for the program or practice.
2 - IN PLACE: A specific person is responsible for coordinating the quality and timeliness of coaching relevant staff supporting the program or practice. This person is able to execute the responsibilities related to his/her role in the coaching process.
1 - PARTIALLY IN PLACE: A specific person is responsible for coordinating the quality and timeliness of coaching relevant staff supporting the program or practice.
0 - NOT IN PLACE: There is not a specific person responsible for coordinating the quality and timeliness of coaching relevant staff supporting the program or practice.
DATA SOURCE: Job description of person accountable for coaching

10. Coaching is provided to improve the competency of relevant staff for the program or practice.
2 - IN PLACE: Coaching is provided at least monthly to relevant staff. Coaches' feedback to staff is based on direct observation and at least one other data source, such as: group or individual reflections; product or document review; fidelity data; interviews with key stakeholders.
1 - PARTIALLY IN PLACE: Coaching is provided at least monthly to relevant staff. Coaches' feedback to staff is based on one of the following: group or individual reflections; product or document review; fidelity data; interviews with key stakeholders.
0 - NOT IN PLACE: Relevant staff do not receive coaching at least monthly.
DATA SOURCE: Coaching schedules; samples of coaching feedback
11. Agency staff use a coaching service delivery plan.
2 - IN PLACE: A written plan outlines coaching provided to relevant staff, including three of the following: skill sets for being a coach; frequency of coaching; coaching methods; feedback methods and timeframe; communication protocols for coach and supervisor. Adherence to the plan is reviewed regularly.
1 - PARTIALLY IN PLACE: A written plan outlines the coaching supports provided to relevant staff, including at least one of the following: skill sets for being a coach; frequency of coaching; coaching methods; feedback methods and timeframe; communication protocols for coach and supervisor.
0 - NOT IN PLACE: A written coaching service delivery plan does not exist.
DATA SOURCE: Sample of coaching service delivery plans; content and concept lists used by coaches

12. Agency staff regularly assess coaching effectiveness.
2 - IN PLACE: Agency staff assess coaching effectiveness quarterly through the use of two or more data sources: practitioner fidelity; coach fidelity; staff satisfaction with coaching surveys. Coaching effectiveness data are used to improve coaching and other implementation drivers.
1 - PARTIALLY IN PLACE: Agency staff assess coaching at least annually through the use of at least one data source: practitioner fidelity; coach fidelity; staff satisfaction with coaching surveys.
0 - NOT IN PLACE: Coaching effectiveness is not assessed.
DATA SOURCE: Coaching fidelity (observations of coaches conducting coaching activities, coaching logs, coaching notes); satisfaction surveys from those being coached; evidence the data are used to inform improvements in coaching methods
FIDELITY
The Fidelity Driver refers to the purposeful process of using fidelity assessments to evaluate the extent to which a program/practice is implemented as intended.
Tell me about your fidelity process(es), including how often fidelity data are reviewed. Record responses:
Which agency is primarily responsible for this driver? Record responses:

13. There is someone accountable for fidelity assessments of relevant staff for the program or practice.
2 - IN PLACE: A specific person is responsible for coordinating fidelity assessments of relevant staff for the program or practice. This person is able to execute the responsibilities related to his/her role.
1 - PARTIALLY IN PLACE: A specific person is responsible for coordinating fidelity assessments of relevant staff for the program or practice.
0 - NOT IN PLACE: There is not a specific person responsible for coordinating fidelity assessments of relevant staff for the program or practice.
DATA SOURCE: Job description of person accountable for fidelity assessments

14. The agency uses a fidelity assessment for the program or practice.
2 - IN PLACE: The agency consistently uses a fidelity assessment for the program or practice.
1 - PARTIALLY IN PLACE: The agency inconsistently uses a fidelity assessment for the program or practice.
0 - NOT IN PLACE: The agency does not use a fidelity assessment.
DATA SOURCE: Fidelity assessment (may include multiple measures to address context, content, and competency); technical manual; research documents
15. Agency staff follow a protocol for fidelity assessments.
2 - IN PLACE: Agency staff follow a written protocol that includes all of the following: orientation process for relevant staff; process for how fidelity data are used; communication protocol for sharing fidelity data.
1 - PARTIALLY IN PLACE: Agency staff follow a written protocol that includes some but not all of the following: orientation process for relevant staff; process for how fidelity data are used; communication protocol for sharing fidelity data.
0 - NOT IN PLACE: Agency staff do not follow a written protocol for fidelity assessments.
DATA SOURCE: Fidelity assessment protocol; documentation of fidelity assessments

16. Agency staff use fidelity data to improve program or practice outcomes and implementation supports.
2 - IN PLACE: Agency staff review fidelity assessment data regularly and use assessment data to improve implementation drivers.
1 - PARTIALLY IN PLACE: Agency staff review fidelity assessment data regularly, but data are used inconsistently to improve implementation drivers.
0 - NOT IN PLACE: Agency staff do not review or use fidelity assessment data.
DATA SOURCE: Documentation of action plans for improvement of selection, training, or coaching processes; documentation of feedback to coaches and/or trainers; documentation of feedback provided to practitioners
DECISION-SUPPORT DATA SYSTEM
The Decision-Support Data System refers to the development and use of data systems to support decision making and improvement activities, including the collection and use of programmatic data, fidelity data, and outcome data.
Tell me about your decision-support data system process(es). Record responses:
Which agency is primarily responsible for this driver? Record responses:

17. There is someone accountable for the decision-support data system.
2 - IN PLACE: A specific person is responsible for coordinating a data system that is used to support decision-making for the program or practice and its implementation. This person is able to execute the responsibilities related to his/her role in overseeing the decision-support data system.
1 - PARTIALLY IN PLACE: A specific person is responsible for coordinating a data system used to support decision-making for the program or practice and its implementation.
0 - NOT IN PLACE: There is no person responsible for coordinating a data system used to support decision-making for the program or practice and its implementation.
DATA SOURCE: Job description of person accountable for decision-support data system

18. Agency staff have access to relevant data for making decisions for program improvement.
2 - IN PLACE: Relevant staff have access to and can analyze all of the following data for program improvement: fidelity data; outcome data; programmatic data, including feedback from practitioners and program beneficiaries; financial data.
1 - PARTIALLY IN PLACE: Relevant staff have access to and can analyze some but not all of the following data for program improvement: fidelity data; outcome data; programmatic data, including feedback from practitioners and program beneficiaries; financial data.
0 - NOT IN PLACE: Relevant staff do not have access to any of the following data for program improvement: fidelity data; outcome data; programmatic data, including feedback from practitioners and program beneficiaries; financial data.
DATA SOURCE: Sample data reports
19. Data are useful and usable.
2 - IN PLACE: Data collected meet all of the following criteria to be useful and usable: collected in a standardized way by trained staff; provide relevant information that can support improvement processes; available when relevant staff are making decisions; an important component of practice routines.
1 - PARTIALLY IN PLACE: Data collected meet some but not all of the following criteria to be useful and usable: collected in a standardized way by trained staff; provide relevant information that can support improvement processes; available when relevant staff are making decisions; an important component of practice routines.
0 - NOT IN PLACE: Data collected do not meet any of the following criteria to be useful and usable: collected in a standardized way by trained staff; provide relevant information that can support improvement processes; available when relevant staff are making decisions; an important component of practice routines.
DATA SOURCE: Sample data team meeting notes

20. Agency staff have a process for using data for decision-making.
2 - IN PLACE: Agency staff have a process for using data for decision-making that includes all of the following: data are disaggregated, analyzed, and summarized at least quarterly; data summaries are communicated clearly in written reports to relevant staff; action plans are developed and monitored regularly to improve implementation supports and outcomes; data summaries and action plans are shared with key stakeholders.
1 - PARTIALLY IN PLACE: Agency staff have a process for using data for decision-making that includes some but not all of the following: data are disaggregated, analyzed, and summarized at least quarterly; data summaries are communicated clearly in written reports to relevant staff; action plans are developed and monitored regularly to improve implementation supports and outcomes; data summaries and action plans are shared with key stakeholders.
0 - NOT IN PLACE: Agency staff do not have a process for using data for decision-making.
DATA SOURCE: Documentation of processes used by the agency to review data and make decisions; sample data reports; sample action plans
FACILITATIVE ADMINISTRATION
Facilitative Administration refers to an agency's leaders, managers, and implementation teams developing and using strategies that facilitate and support use of the program/practice, and that make the work of practitioners easier. For the purpose of this assessment, leadership is inclusive of your executive leaders, managers, and team members who are responsible for the program or practice.
Tell me about your agency's/site's organizational structure (e.g., leadership, management, teams). For the purpose of this assessment, leadership is inclusive of your executive leaders, management, and team members who are responsible for the program or practice. Record responses:

21. Leadership sets aside resources to support the development of staff competency to deliver the program or practice.
2 - IN PLACE: Leadership sets aside resources to support staff competency development: selection; training; ongoing coaching; and monitoring fidelity.
1 - PARTIALLY IN PLACE: Leadership sets aside some but not all resources to support staff competency development: selection; training; ongoing coaching; and monitoring fidelity.
0 - NOT IN PLACE: Leadership does not set aside resources at all or does so only in general (i.e., not for the specific program/practice).
DATA SOURCE: Budget

22. Leadership develops and/or refines internal policies or procedures that support the program or practice.
2 - IN PLACE: Leadership consistently develops and/or refines policies and procedures to make it possible to do the work of the program or practice.
1 - PARTIALLY IN PLACE: Leadership develops and/or refines policies and procedures inconsistently.
0 - NOT IN PLACE: Leadership does not develop and/or refine policies and procedures to make it possible to do the work of the program or practice.
DATA SOURCE: Budget; training; resources

23. Leadership makes changes in organization roles, functions, and structures as needed to accommodate the program or practice.
2 - IN PLACE: Leadership consistently makes changes to organization roles, functions, and structures.
1 - PARTIALLY IN PLACE: Leadership inconsistently makes changes to organization roles, functions, and structures.
0 - NOT IN PLACE: Leadership does not make changes to organization roles, functions, and structures.
DATA SOURCE: Organizational chart; position descriptions
24. Leadership engages in regular communication with all staff and service users regarding the program or practice.
2 - IN PLACE: Leadership communicates with all staff and service users and receives and responds to feedback from all staff and service users.
1 - PARTIALLY IN PLACE: Leadership communicates to all staff and service users.
0 - NOT IN PLACE: Leadership does not communicate regularly with staff and service users.
DATA SOURCE: Communication plan; example communications

25. Leadership visibly promotes the importance of effectively implementing the program or practice.
2 - IN PLACE: Leadership speaks about and can answer questions regarding what it takes to effectively implement the program or practice.
1 - PARTIALLY IN PLACE: Leadership speaks about the importance of implementing the program or practice but struggles to answer questions about what it will take to do so effectively.
0 - NOT IN PLACE: Leadership struggles to speak about and answer questions regarding what it takes to effectively implement the program or practice.
DATA SOURCE: Communication plan; example communications

26. Leadership problem-solves challenges to implement the program or practice effectively.
2 - IN PLACE: Leadership consistently problem-solves challenges using data to effectively implement the program or practice.
1 - PARTIALLY IN PLACE: Leadership inconsistently problem-solves challenges using data to effectively implement the program or practice.
0 - NOT IN PLACE: Leadership does not problem-solve challenges using data to effectively implement the program or practice.
DATA SOURCE: Meeting minutes; observations

27. Leadership recognizes and appreciates staff contributions to implement the program or practice effectively.
2 - IN PLACE: Leadership consistently recognizes and appreciates staff contributions to effectively implement the program or practice.
1 - PARTIALLY IN PLACE: Leadership inconsistently recognizes and appreciates staff contributions to effectively implement the program or practice.
0 - NOT IN PLACE: Leadership does not recognize and appreciate staff contributions to effectively implement the program or practice.
DATA SOURCE: Meeting minutes; observations
SYSTEMS INTERVENTION
Systems Intervention refers to how agency leaders, managers, and implementation teams work with diverse and representative external partners. These partners include funders, the organization's board or governing entity, beneficiaries of the practice or program, and community partners. Leadership works with these partners to ensure availability of resources required to align and deliver the practice.
Tell me about your systems intervention process(es). For the purpose of the assessment, stakeholders are external groups who are necessary for the successful use of the program or practice. Record responses:

28. Leadership engages stakeholders and staff in developing a shared understanding of the need for the program or practice.
2 - IN PLACE: Leadership works together with all of the following stakeholder groups and agency staff to develop a shared understanding of the need for a program or practice: funders and/or board; beneficiaries of the practice or program (e.g., clients); community partners.
1 - PARTIALLY IN PLACE: Leadership works together with at least one of the following stakeholder groups and agency staff to develop a shared understanding of the need for a program or practice: funders and/or board; beneficiaries of the practice or program (e.g., clients); community partners.
0 - NOT IN PLACE: Leadership does not work with stakeholder groups and agency staff to develop a shared understanding of the need for a program or practice.
DATA SOURCE: Meeting minutes; communications

29. Leadership creates opportunities for stakeholders and staff to learn and design solutions together to support the program or practice.
2 - IN PLACE: Leadership consistently creates opportunities to learn and design solutions together to support the program or practice.
1 - PARTIALLY IN PLACE: Leadership inconsistently creates opportunities to learn and design solutions together to support the program or practice.
0 - NOT IN PLACE: Leadership does not create opportunities to learn and design solutions together to support the program or practice.
DATA SOURCE: Meeting minutes; agendas; communications
30. Leadership regularly communicates with stakeholders regarding the program or practice.
2 - IN PLACE: Leadership: provides information to stakeholders regarding the program or practice; receives information from stakeholders regarding the program or practice; and requests and responds to feedback from all stakeholders regarding the program or practice.
1 - PARTIALLY IN PLACE: Leadership provides information to stakeholders regarding the program or practice.
0 - NOT IN PLACE: Leadership does not engage in communication with stakeholders regarding the program or practice.
DATA SOURCE: Communications; meeting minutes; observations
References
Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation Research: A synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network (FMHI Publication #231). http://nirn.fmhi.usf.edu/resources/detail.cfm?resourceID=31
Metz, A., Bartley, L., Ball, H., Wilson, D., Naoom, S., & Redmond, P. (2014). Active Implementation Frameworks (AIF) for successful service delivery: Catawba County child wellbeing project. Research on Social Work Practice, 1-8. doi:10.1177/1049731514543667
Ogden, T., Bjørnebekk, G., Kjøbli, J., Patras, J., Christiansen, T., Taraldsen, K., & Tollefsen, N. (2012). Measurement of implementation components ten years after a nationwide introduction of empirically supported programs: a pilot study. Implementation Science, 7, 49. doi:10.1186/1748-5908-7-49. Retrieved from http://www.implementationscience.com/content/pdf/1748-5908-7-49.pdf
Appendix A
Validation
Ogden et al. (2012) at the Atferdssenteret - Norsk senter for studier av problematferd og innovativ praksis -
Universitet i Oslo (The Norwegian Center for Child Behavioral Development, University of Oslo) validated a
previous version of the Drivers Best Practices items. Ogden et al. collected data to establish the reliability and
validity of the Implementation Driver items. The researchers interviewed 218 practitioners, supervisors, and
managers associated with two well-established evidence-based programs in Norway. The Cronbach alphas
obtained in their study were: selection, 0.89; training, 0.91; coaching, 0.79; fidelity, 0.89; decision-support data
systems, 0.84; facilitative administration, 0.82; systems intervention, 0.82; and leadership, 0.88.
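For readers interpreting these reliability figures, Cronbach's alpha for a scale of k items is computed with the standard formula (not reproduced in the original source):

\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right)

where \sigma^{2}_{Y_i} is the variance of item i and \sigma^{2}_{X} is the variance of the total scale score; values above roughly 0.80, as reported for most of the Driver scales here, are conventionally taken to indicate good internal consistency.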
Metz et al. (2014) assessed Implementation Drivers in a county social service system before, during, and after
implementation capacity was developed. Low scores on the Drivers assessment at baseline were associated
with low levels of fidelity in the use of the innovation. As implementation capacity was developed, the scores on the Drivers assessment increased (nearly doubled). Higher scores on the Drivers assessment were related to much higher fidelity in the use of the innovation.
Appendix B
Drivers Best Practices Assessment Action Plan Template
Contributors to Action Plan:
Date of Action Plan:
Focus of DBPA:
Based on your review of the DBPA results, identify at least 2-3 priorities to address within an action plan. Create an Action Plan using the template below that defines immediate and short-term actions focusing on improving the infrastructure activities to support use of the selected program or practice.
AREA | ACTIONS NEEDED | BY WHO | BY WHEN