The Hexagon: An Exploration Tool
Hexagon Discussion & Analysis Tool Instructions
© 2019 NIRN – University of North Carolina at Chapel Hill
Metz, A., & Louison, L. (2019). The Hexagon Tool: Exploring Context. Chapel Hill, NC: National Implementation Research Network, Frank Porter Graham Child Development Institute, University of North Carolina at Chapel Hill. Based on Kiser, Zabel, Zachik, & Smith (2007) and Blase, Kiser & Van Dyke (2013).
PROGRAM INDICATORS
Program indicators assess a new or existing program or practice along the following domains: evidence, supports, and usability. These indicators specify the extent to which the identified program or practice demonstrates evidence, supports for implementation, and usability across a range of contexts.
IMPLEMENTING SITE INDICATORS
Implementing site indicators assess the extent to which a new or existing program or practice matches
the implementing site along the following domains: population need, fit, and capacity. The assessment
specifies suggested conditions and requirements for a strong match to need, fit, and capacity for the
identified program or practice.
PRIOR TO USING
1. Identify the program or practice to be assessed.
2. Review the discussion questions prior to meeting to ensure any data or resources that need
to be reviewed for this discussion are available. If appropriate, an organization may prioritize
components for deeper exploration based on the context and potential programs or practices.
3. Identify a team to participate in the discussion. If the site has an Implementation Team, that team
can complete the assessment as part of their work. If not, identify key stakeholders internal and
external to the organization who have important diverse perspectives on the need and possible
programs or practices. Suggested team members include leaders, managers, direct practitioners
and consumers or community members.
DURING USE
1. The team should review and discuss the questions for each indicator and document relevant
considerations. Extra space is included in each section for notes and additional questions identified
by the team to address unique needs and contexts.
2. Aer discussing each component, the team rates the component using the 5-point Likert scale in
each section.
3. Using the discussion notes and ratings, the team makes recommendations about whether to
adopt, replicate, or de-implement the program or practice. While ratings should be taken into
account during the decision-making process, the ratings alone should not be used to determine
final recommendations.
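The sketch below is one hypothetical way a team might capture its component ratings and discussion notes electronically while working through the steps above. The six component names and the 1-5 scale come from the Hexagon Tool itself; the HexagonRecord class, its fields, and the example entries are illustrative assumptions, not part of the tool.

```python
# Hypothetical helper for documenting a Hexagon discussion; not part of the tool.
from dataclasses import dataclass, field

# The six components assessed by the Hexagon Tool.
PROGRAM_INDICATORS = ["Evidence", "Supports", "Usability"]
IMPLEMENTING_SITE_INDICATORS = ["Need", "Fit", "Capacity to Implement"]
COMPONENTS = PROGRAM_INDICATORS + IMPLEMENTING_SITE_INDICATORS


@dataclass
class HexagonRecord:
    """One team's ratings and notes for a single program or practice."""
    program_or_practice: str
    ratings: dict = field(default_factory=dict)  # component -> rating (1-5)
    notes: dict = field(default_factory=dict)    # component -> discussion notes

    def rate(self, component: str, rating: int, note: str = "") -> None:
        if component not in COMPONENTS:
            raise ValueError(f"Unknown component: {component}")
        if not 1 <= rating <= 5:
            raise ValueError("Use the 5-point Likert scale (1-5)")
        self.ratings[component] = rating
        if note:
            self.notes[component] = note


# Example use: ratings inform, but do not by themselves determine,
# the team's final recommendation (see step 3 above).
record = HexagonRecord("Program/Practice 1")
record.rate("Evidence", 4, "One rigorous study with a control group")
record.rate("Fit", 3, "Alignment with community values still unclear")
print(record.ratings)
```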
WHEN TO USE
The Hexagon Tool can be used at any stage in a program's implementation to determine its fit with the local context. It is most commonly used during the Exploration stage: the period when a site is identifying possible new programs or practices to implement. If the organization has an Implementation Team, the Implementation Team can carry out this function for the organization.
HOW TO USE
The Hexagon Discussion and Analysis Tool helps organizations evaluate new and existing programs and practices. This tool is designed to be used by a team to ensure diverse perspectives are represented in a discussion of the six contextual fit and feasibility factors.
The Hexagon: An Exploration Tool
The Hexagon can be used as a planning tool to guide selection and evaluate potential programs and practices for use. The hexagon graphic presents the six contextual fit and feasibility indicators, three program indicators and three implementing site indicators, with the following elements on each side of the hexagon.

PROGRAM INDICATORS
EVIDENCE
Strength of evidence (for whom, in what conditions): number of studies, population similarities, diverse cultural groups, efficacy or effectiveness
Outcomes – Is it worth it?
Fidelity data
Cost-effectiveness data
SUPPORTS
Expert assistance
Staffing
Training
Coaching & supervision
Racial equity impact assessment
Data systems technology supports (IT)
Administration & system
USABILITY
Well-defined program
Mature sites to observe
Several replications
Adaptations for context

IMPLEMENTING SITE INDICATORS
NEED
Target population identified
Disaggregated data indicating population needs
Parent & community perceptions of need
Addresses service or system gaps
FIT WITH CURRENT INITIATIVES
Alignment with community, regional, state priorities
Fit with family and community values, culture and history
Impact on other interventions & initiatives
Alignment with organizational structure
CAPACITY TO IMPLEMENT
Staff meet minimum qualifications
Able to sustain staffing, coaching, training, data systems, performance assessment, and administration
Financial capacity
Structural capacity
Cultural responsivity capacity
Buy-in process operationalized (practitioners, families)
Today’s Date:
Individuals Participating in the Assessment:
Facilitator(s):
Practice/Program Being Assessed:
Identify the program or practice to be assessed. Write the numerical rating that best describes each
component below.
                          PROGRAM/     PROGRAM/     PROGRAM/
                          PRACTICE 1   PRACTICE 2   PRACTICE 3
PROGRAM
  EVIDENCE                ________     ________     ________
  SUPPORTS                ________     ________     ________
  USABILITY               ________     ________     ________
IMPLEMENTING SITE
  NEED                    ________     ________     ________
  FIT                     ________     ________     ________
  CAPACITY                ________     ________     ________
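For teams that keep this summary electronically rather than on paper, the short sketch below is one hypothetical way to lay out the same grid: a 1-5 rating per component for each candidate program or practice. The dictionary layout and the example ratings are assumptions for illustration only; the tool itself does not prescribe any digital format.

```python
# A minimal, self-contained sketch (not part of the tool): the summary grid
# above as a small data structure, one 1-5 rating per component for each
# candidate program or practice. All names and ratings here are illustrative.

COMPONENTS = ["EVIDENCE", "SUPPORTS", "USABILITY", "NEED", "FIT", "CAPACITY"]

# Hypothetical ratings a team might enter after its discussion.
worksheet = {
    "Program/Practice 1": {"EVIDENCE": 4, "SUPPORTS": 3, "USABILITY": 4,
                           "NEED": 5, "FIT": 3, "CAPACITY": 2},
    "Program/Practice 2": {"EVIDENCE": 3, "SUPPORTS": 4, "USABILITY": 3,
                           "NEED": 4, "FIT": 4, "CAPACITY": 3},
}

# Print the grid in the same shape as the paper worksheet.
print(f"{'':<12}" + "".join(f"{name:<22}" for name in worksheet))
for component in COMPONENTS:
    row = f"{component:<12}" + "".join(
        f"{ratings.get(component, '-'):<22}" for ratings in worksheet.values()
    )
    print(row)
```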
EVIDENCE
Program Indicator
1. Are there research data available to demonstrate the effectiveness (e.g., randomized trials, quasi-experimental designs) of the program or practice? If yes, provide citations or links to reports or publications.
2. What is the strength of the evidence? Under what conditions was the evidence developed?
3. What outcomes are expected when the program or practice is implemented as intended? How much of a change can be expected?
4. If research data are not available, are there evaluation data to indicate effectiveness (e.g., pre/post data, testing results, action research)? If yes, provide citations or links to evaluation reports.
5. Is there practice-based evidence or community-defined evidence to indicate effectiveness? If yes, provide citations or links.
6. Is there a well-developed theory of change or logic model that demonstrates how the program or practice is expected to contribute to short-term and long-term outcomes?
7. Do the studies (research and/or evaluation) provide data specific to the setting in which it will be implemented (e.g., has the program or practice been researched or evaluated in a similar context)? If yes, provide citations or links to evaluation reports.
8. Do the studies (research and/or evaluation) provide data specific to effectiveness for culturally and linguistically specific populations? If yes, provide citations or links specific to effectiveness for families or communities from diverse cultural groups.
RATING
5 High Evidence
The program or practice has documented evidence of effectiveness based on at least two rigorous, external research studies with control groups, and has demonstrated sustained effects at least one year post-treatment
4 Evidence
The program or practice has demonstrated effectiveness with one rigorous research study with a control group
3 Some Evidence
The program or practice shows some evidence of effectiveness through less rigorous research studies that include comparison groups
2 Minimal Evidence
The program or practice is guided by a well-developed theory of change or logic model, including clear inclusion and exclusion criteria for the target population, but has not demonstrated effectiveness through a research study
1 No Evidence
The program or practice does not have a well-developed logic model or theory of change and has not demonstrated effectiveness through a research study
Additional Questions/Notes
Ratings
SUPPORTS
Program Indicator
1. Is there a qualified "expert" (e.g., consultant, program developer, intermediary, technical assistance provider) who can help with implementation over time? If yes, list names and/or organization (e.g., center, university) and contacts.
2. Are there start-up costs for implementation of the program or practice (e.g., fees to the program developer)? If yes, itemize in notes section. What does the implementing site receive for these costs?
3. Are there curricula and other resources related to the program or practice readily available? If so, list publisher or links. What is the cost of these materials? Enter in notes section.
4. Is training and professional development related to this program or practice readily available? Is training culturally sensitive? Does it address issues of race equity, cultural responsiveness or implicit bias? Include the source of training and professional development. What is the cost of these materials? Enter in notes section.
5. Is coaching available for this program or practice? Is coaching culturally sensitive? If so, list coaching resources and cost in notes section.
6. Are sample job descriptions and interview protocols available for hiring or selecting new staff for this practice? If so, identify them here, along with any associated costs.
7. Is guidance on administrative policies and procedures available? If so, identify resources and any associated costs.
8. Are there resources available to develop a data management plan for this program or practice (including data system and monitoring tools)? If so, identify resources and any associated costs.
9. Is there a recommended orientation to facilitate "buy-in" for staff, key stakeholders and collaborative partners? If so, explain/describe briefly in notes section.
RATING
5 Well Supported
Comprehensive resources are available from an expert (a program developer or intermediary) to support implementation, including resources for building the competency of staff (staff selection, training, coaching, fidelity) and organizational practice (data system and data use support, policies and procedures, stakeholder and partner engagement)
4 Supported
Some resources are available to support implementation, including limited resources to support staff competency (e.g., training and coaching) and limited resources to support organizational changes (e.g., data systems)
3 Somewhat Supported
Some resources are available to support competency development or organizational development but not both
2 Minimally Supported
Limited resources are available beyond a curriculum or one-time training
1 Not Supported
Few to no resources are available to support implementation
Additional Questions/Notes
Ratings
USABILITY
Program Indicator
1. Is the program or practice clearly defined (e.g., what it is, for whom it is intended)?
2. Are core features of the program or practice identified, listed, named (e.g., key components of the program or practice that are required in order to be effective)?
3. Is each core feature well operationalized (e.g., staff know what to do and say, how to prepare, how to assess progress)?
4. Is there guidance on core features that can be modified or adapted to increase contextual fit?
5. Is there a fidelity assessment that measures practitioner behavior (i.e., assessment of whether staff use the practice as intended)? If yes, provide citations, documents, or links to fidelity assessment information.
6. Has the program or practice been adapted for use within culturally and linguistically specific populations and/or is there a recommended process for gathering community input into culturally specific enhancements?
7. What do we know about the key reasons for previous successful replications?
8. What do we know about the key problems that led to unsuccessful replication efforts previously?
9. Are there mature sites with successful histories of implementing the program or practice that are willing to be observed?
RATING
5 Highly Usable
The program or practice has operationalized principles and values, core components that are measurable and observable, and a validated fidelity assessment; modifiable components are identified to support contextualization for new settings or populations
4 Usable
The program or practice has operationalized principles and values and core components that are measurable and observable, has tools and resources to monitor fidelity, but does not have a validated fidelity measure; modifiable components are identified to support contextualization for new settings or populations
3 Somewhat Usable
The program or practice has operationalized principles and values and core components that are measurable and observable but does not have a fidelity assessment; modifiable components are not identified
2 Minimally Usable
The program or practice has identified principles and values and core components; however, the principles and core components are not defined in measurable or observable terms; modifiable components are not identified
1 Not Usable
The program or practice does not identify principles and values or core components
Additional Questions/Notes
Ratings
NEED
Implementing Site Indicator
1. Who is the identified population of concern?
2. What are the identified needs of this population?
3. Was an analysis of data conducted to identify specific area(s) of need relevant to the program or practice? If yes, what data were analyzed? Were these data disaggregated by race, ethnicity and language?
4. How do affected individuals and community members perceive their need? What do they believe will be helpful? How were community members engaged to assess their perception of need?
5. Is there evidence that the program or practice addresses the specific area(s) of need identified? If so, how was this evidence generated (e.g., experimental research design, quasi-experimental research design, pre-post, descriptive)?
6. If the program or practice is implemented, what can potentially change for this population?
RATING
5 Strongly Meets Need
The program or practice has demonstrated meeting need for the identified population through rigorous research (e.g., experimental design) with a comparable population; disaggregated data have been analyzed to demonstrate that the program or practice meets the needs of specific subpopulations
4 Meets Need
The program or practice has demonstrated meeting need for the identified population through rigorous research (e.g., experimental design) with a comparable population; disaggregated data have not been analyzed for specific subpopulations
3 Somewhat Meets Need
The program or practice has demonstrated meeting need for the identified population through a less rigorous research design (e.g., quasi-experimental, pre-post) with a comparable population; disaggregated data have not been analyzed for specific subpopulations
2 Minimally Meets Need
The program or practice has demonstrated meeting need for the identified population through practice experience; disaggregated data have not been analyzed for specific subpopulations
1 Does Not Meet Need
The program or practice has not demonstrated meeting need for the identified population
Additional Questions/Notes
Ratings
FIT
Implementing Site Indicator
1. How does the program or practice fit with the priorities of the implementing site?
2. How does the program or practice fit with family and community values in the impacted community, including the values of culturally and linguistically specific populations?
3. What other initiatives currently being implemented will intersect with the program or practice?
4. How does the program or practice fit with other existing initiatives?
5. Will the other initiatives make it easier or more difficult to implement the proposed program or practice and achieve the desired outcomes?
6. How does the program or practice fit with the community's history?
RATING
5 Strong Fit
The program or practice fits with the priorities of the implementing site; community values, including the values of culturally and linguistically specific populations; and other existing initiatives
4 Fit
The program or practice fits with the priorities of the implementing site and community values; however, the values of culturally and linguistically specific populations have not been assessed for fit
3 Somewhat Fit
The program or practice fits with the priorities of the implementing site, but it is unclear whether it aligns with community values and other existing initiatives
2 Minimal Fit
The program or practice fits with some of the priorities of the implementing site, but it is unclear whether it aligns with community values and other existing initiatives
1 Does Not Fit
The program or practice does not fit with the priorities of the implementing site or community values
Additional Questions/Notes
Ratings
CAPACITY TO IMPLEMENT
Implementing Site Indicator
1. Typically, how much does it cost to run the program or practice each year? Are there resources to support this cost? If the current budget cannot support this cost, outline a resource development strategy.
2. What are the staffing requirements for the program or practice (number and type of staff, e.g., education, credentials, content knowledge)?
3. Does the implementing site currently employ or have access to staff that meet these requirements?
4. If so, do those staff have a cultural and language match with the population they serve, as well as relationships in the community?
5. What administrative practices must be developed or refined to support the use of this program or practice?
6. Is leadership knowledgeable about and in support of this program or practice? Do leaders have the diverse skills and perspectives representative of the community being served?
7. Do staff have the capacity to collect and use data to inform ongoing monitoring and improvement of the program or practice?
8. What administrative policies or procedures must be adjusted to support the work of practitioners and others to implement the program or practice?
9. Will the current communication system facilitate effective internal and external communication with stakeholders, including impacted families and the community?
10. Will the program or practice require use of or changes to building facilities? Use notes section to explain. List required uses of and/or changes. Include costs if known.
11. Does the program or practice require new technology (hardware or software, such as a data system)? Use notes section to explain. List required hardware and/or software. Include costs if known.
12. Does the program or practice require use of or changes to the monitoring and reporting system? Use notes section to explain. List required uses of and/or changes. Include costs if known.
RATING
5 Strong Capacity
The implementing site adopting this program or practice has all of the capacity necessary, including a qualified workforce, financial supports, technology supports, and administrative supports required to implement and sustain the program or practice with integrity
4 Adequate Capacity
The implementing site adopting this program or practice has most of the capacity necessary, including a qualified workforce, financial supports, technology supports, and administrative supports required to implement and sustain the program or practice with integrity
3 Some Capacity
The implementing site adopting this program or practice has some of the capacity necessary, including a qualified workforce, financial supports, technology supports, and administrative supports required to implement and sustain the program or practice with integrity
2 Minimal Capacity
The implementing site adopting this program or practice has only minimal capacity, including the qualified workforce, financial supports, technology supports, and administrative supports required to implement and sustain the program or practice with integrity
1 No Capacity
The implementing site adopting this program or practice does not have the capacity necessary, including a qualified workforce, financial supports, technology supports, and administrative supports required to implement and sustain the program or practice with integrity
Additional Questions/Notes
Ratings