Dwayne Hodgson


An Evaluation Framework to Assess Learning Programs

Evaluating Learning requires looking at the big picture and zooming in on the details.

It has been a while since I've written a blog post, but in my current role as a Knowledge Services Advisor with the Green Municipal Fund at the Federation of Canadian Municipalities, I keep getting questions about how to evaluate the effectiveness of our workshops, webinars and other learning events. So, in an effort to collect my thoughts, here goes...
 

Some Current Models

The most widely-recognized learning evaluation frameworks were developed by Kirkpatrick (Reaction, Learning, Behaviour and Results) and Phillips (who added "Return on Investment" as a measure of efficiency).

When I worked at Global Learning Partners, we taught a great course called Learning Evaluation by Design that was based on the book How Do They Know They Know? by GLP's founder, Dr. Jane Vella, and her colleagues Jim Burrow and Paula Berardinelli. Their work grafted what was essentially a results chain onto the Principles and Practices of Dialogue Education to develop a simple evaluation framework to assess:

  • Learning: how well the learners met the Achievement-Based Objectives during the workshop;

  • Transfer: how the learners apply their learning in their life, work or community; and

  • Impact: the change in their organization and/or community.

For each level, they would then think through evidence of the change they would look for -- either qualitative or quantitative -- and develop a plan for collecting this data. This elegant approach mirrored a lot of what I had learned about using Results-Based Management (RBM) in community development work.

However, neither of these approaches focuses enough on assessing how the training itself was researched, designed and facilitated. Instead, they look primarily "downstream" to see the results. But if the original workshop does not take into account the learners' needs, context and prior experience, or if the conference design does not follow sound adult learning principles, and/or the facilitators are not skilled at supporting the learning process, there is little point in looking for downstream evidence of learning in the workplace or community.
 

Proposing A New Framework

To address these challenges, I would like to propose a new learning evaluation framework that draws on the insights of the Dialogue Education approach to researching, designing and facilitating learning, as well as the downstream results that Kirkpatrick, Phillips and others have suggested.


Ways You Might Assess Each Stage

For each stage of the learning process, here are some ways you might assess it.

Learning Needs & Resources Assessment

  • Ask whether (!) and how the organizers conducted a Learning Needs and Resources Assessment (LNRA), either before they designed the event and/or to customize the event to this particular group of learners. What did they:

    • Ask the participants and other stakeholders about their background, current context, work, successes, challenges, questions and opinions;

    • Study to better understand these issues (e.g., reports, evaluations, websites);

    • Observe about the learners, their situation, organization or community.

  • How did this information change their decisions about the Content, Achievement-Based Objectives, Process or other design parameters?

  • What else might they have found out that would have helped the design and/or facilitation?


Design

  • Document the expected Theory of Change of the training program to understand how this event fits into the larger change process they are trying to support.

  • Review the design parameters as outlined in the Steps of Design:

    • People;

    • Purpose;

    • Transfer & Impact Objectives;

    • Date and Time;

    • Place, Venue and/or Platform;

    • Content: Knowledge, Skills and Attitudes;

    • Achievement-Based Objectives (ABOs);

    • Process, or the Learning Tasks to help them meet the ABOs.

  • Check for congruence between these parameters. If you like, map them on a Learning Design Canvas to summarize the key parameters.

  • Note which parameters were given to them, which ones they made decisions about, and why.

  • Review the Achievement-Based Objectives to assess whether they target the appropriate level of learning for the desired Transfer Objectives.

  • Review the Process to see how the training design embodies selected principles and practices of how adults learn most effectively.

Facilitation

  • Observe the facilitation to see how it embodies good adult learning facilitation principles and practices, and note where it might be improved.

  • Debrief the facilitator’s own impressions of the learning process.

  • Compare the design parameters against the actual situation (e.g. start times vs. schedule, participation rates, technology platforms), and note any impromptu changes that the facilitation team may have made in light of changing circumstances or emergent learning needs.

  • Consider any other factors that influenced learning during the workshop or teleconference (e.g. group dynamics, technical problems).

Learning

  • Observe any mid-course opportunities for personal and group learning synthesis during the workshop, both to assess their learning and to permit changes in response to any emerging learning needs.

  • Check how well participants met each Achievement-Based Objective (ABO) by reviewing the products of the learning activities to assess what they learned and how well.

  • Hold a short post-workshop quiz on the content, if appropriate.

  • Invite the participants to name their most significant learning, their progress compared to before the workshop, and their transfer objectives (e.g. what they will apply to their work after the workshop, and how they will assess this).

Feedback

  • In an online survey, email or interview, invite the participants to reflect on their experience of the learning event, possibly including:

    • publicity for the event -- how did they find out about it?

    • registration -- was it easy to sign up for the event?

    • pre-workshop LNRA -- did they take part in this? did they see how the facilitators used this information?

    • design -- what did they think about the design of the workshop?

    • facilitation -- how did the facilitators do in guiding them through the design?

    • their own participation in the event

    • other participants' contributions to the learning

    • their most significant learning, experience or insight ("a-ha!")

    • what new questions they now have -- every good learning experience generates as many questions as it does answers.

    • what challenges they anticipate in applying their learning in their life, workplace and/or community

    • what support they would like in applying their learning afterwards -- coaching? resources? peer support?


Transfer

  • Look for evidence that they have applied their new knowledge, skills and/or attitudes (KSAs) to their work situation through observation, reviewing plans, reading reports, etc.

  • Conduct interviews and/or focus groups to ask for their feedback on the utility of the training, now that they are applying what they have learned:

    • What has proven useful?

    • What was not so helpful?

    • What else do you wish you had learned?

  • Analyze the contextual factors that have influenced how they have applied their learning. These could include positive forces (e.g., supplemental information, peer support) or negative forces (e.g., workload, resistance from the boss, inflexible systems).

  • Suggest tools and processes to reinforce their learning and enhance transfer.

Impact

  • Look for changes in Behaviour and/or Conditions that suggest that the transfer (application) of the new KSAs made a difference for the participant, organization, peers (e.g. volunteers) and/or community (e.g. improved volunteer recruitment, retention and satisfaction).

  • Analyze any contextual factors that influenced impact (e.g. supporting and/or hindering conditions in the community or workplace, other actors). These can be both positive and negative influences.

  • Review the Theory of Change in light of the evidence and suggest changes that better reflect how change actually occurs.



Your Turn

What are your questions? What do you think? What parts work for you? What would you change? Please enter your thoughts in the comments below.

Of course, evaluating the effectiveness of your training programs is never as straightforward as this! And in the next blog post, I'll explain why. Stay tuned!