Evaluation Planning
Originally written in 2014, with minor changes for the web
This document was written as part of a project to use iPads to assist with training staff in higher education (HE) to use interactive whiteboards effectively.
Approach
I will be using a modified version of Kirkpatrick's approach to
learning evaluation. Kirkpatrick's original design is considered
a cornerstone in the learning industry
(Clark, 2012) and is the most often cited framework
(Arthur et al. in Praslova, 2010). He suggested four steps to evaluation: reaction, learning, behaviour and results.
Alterations
Clark suggests a number of alterations to this design:
Motivation not Reaction
Kirkpatrick wanted to measure reactions
(Kirkpatrick in Praslova, 2010). To put it another way,
how well the learners liked a particular learning process
(Clark, 2012). However, the research in this area shows
only a weak link between learners' reported reactions and job performance
after training (Boehle in Clark, 2012). It is possible
to have an enjoyable learning experience without meeting the
learning objectives and equally possible to struggle through but
meet objectives.
It is suggested instead to look at motivation (Clark, 2012), which can comprise other learner-reportable criteria such as how important and doable the learning is perceived to be (Markus in Clark, 2012) and how much learners believe they have learned (Alliger et al. in Praslova, 2010).
Performance not Behavior
Kirkpatrick wanted to measure behaviour: "What changes in job
performance resulted from the learning process?"
(Clark, 2012). Gilbert, however, suggests that instead we
should be looking at performance. Performance not only includes
desired behaviours but also desired consequences
(Gilbert in Clark, 2012).
Focusing too strongly on just behaviours can lead to blind, robotic repetition, whereas it is more desirable to root those behaviours in the context of their outcomes. A common example of this in the service industry is training staff to ask customers questions such as "Is everything OK with your meal?" but then reacting negatively if the customer does complain about something. If we were to look only at behaviour when evaluating the training (e.g. 96% of staff remember to ask customers if their meal is OK) then we would miss the problematic consequences in our evaluation (e.g. 80% of customers who responded with a complaint were unsatisfied with how it was dealt with).
The other problem with simply measuring behaviour in the classroom
context is that you do not then know how learners will perform in
the real world, and you may end up teaching to the test instead
of preparing them for life's tests
(Halpern and Hakel in Praslova, 2010, p221). By equally
focusing on performance consequences this danger is somewhat
mitigated as the teacher is forced to find ways of evaluating
realistic outcomes.
Flipped model
Perhaps the biggest change suggested by Clark is to flip the order of the steps and then use them as a planning model (Clark, 2012). The planning process now has eight stages that tightly link the learning activity design and the design of the evaluation.
The activity is thus planned to be measurable, and any problems discovered in evaluation can be linked to a specific stage in the planning (Chyung in Clark, 2012). The stages can also be divided into those internal to the learning activity (motivation and learning) and those external to it (performance and result). Since the external stages are influenced by outside factors, this can be taken into account when evaluating the learning activity on its own (Alliger et al. in Praslova, 2010), which helps avoid blaming a good learning activity for the effects of bad business management.
Goals (Planning)
Desired result
The organisational results I am looking for are increased student engagement, increased formative assessment and improved student outcomes.
Desired performance
In order to meet my desired result, my learning activity will lead to the desired behaviour of academics designing interactive whiteboard use into their lectures and the desired consequence that the boards are used well in those lectures.
Desired learning
In order to bring about my desired performance my learning activity will give academics knowledge of how to operate SMART Boards (the most common interactive whiteboard used at The University of Warwick) and the skills to incorporate engagement, formative assessment and learning objectives into their use.
Desired motivation
In order to facilitate my desired learning my learning activity will demonstrate the benefits of using interactive whiteboards in lectures, namely:
- increased student engagement through interactivity (Macdonald, 2008)
- increased formative assessment
- improved student outcomes
- readily updatable resources (Macdonald, 2008)
- hyperlinks to related material (Macdonald, 2008)
- use of multimedia (Macdonald, 2008)
- portability of lecture materials (Macdonald, 2008)
Additionally, my learning activity will need to introduce
learners gently to using the technology, as
"even the most apparently confident individuals need support at the
beginning" (Salmon, 2004).
Evaluation
Actual motivation
What
I will evaluate how easy learners found the activity, how beneficial they felt the outcomes were to themselves and how beneficial they felt the outcomes were to their students.
Why
The learners must be motivated to meet the learning outcomes. "If the goal or task is judged as important and doable, then the learner is normally motivated to engage in it." (Markus in Clark, 2012)
Who
- Library colleagues
- Customers:
  - academics
  - postgraduates
  - Postgraduate Award Technology Enhanced Learning students
How
Self-completion questionnaires: one built into the activity and one online.
When
Immediately after the learning activity and 2 months later.
Actual learning
What
I will evaluate whether learners are able to operate SMART Boards and whether they are able to incorporate engagement, formative assessment and learning objectives into their use.
Why
Learners will need these skills and knowledge in order to perform well.
Who
- Library colleagues
- Customers:
  - academics
  - postgraduates
  - Postgraduate Award Technology Enhanced Learning students
How
Formative and summative assessment, as
"pre and post tests provide the most direct measure of learning"
(Arthur in Praslova, 2010, p220).
When
Before and after the learning activity.
Actual performance
What
I will evaluate whether learners design interactive whiteboard use into their lectures and whether the boards are used to increase engagement, provide formative assessment and meet learning objectives.
Why
If the desired result is to be achieved then learners will have to put their skills and knowledge into effective action.
Who
- Customers:
  - academics
  - postgraduates
  - Postgraduate Award Technology Enhanced Learning students
How
Self-completion questionnaire.
When
1 term after the learning activity.
Actual result
What
I will evaluate whether there is increased student engagement, increased formative assessment and improved student outcomes.
Why
If the desired result is not met then the purpose of the learning activity is undermined.
Who
- Customers:
  - academics
  - postgraduates
  - Postgraduate Award Technology Enhanced Learning students
How
Self-completion questionnaire. This will be somewhat subjective, as it will be based on the learners' perceptions, but it will allow data to be collected practically.
When
1 term after the learning activity.
Next steps
After the completion of the evaluation, the results will be used to inform the development of similar learning activities for other items of technology available in the Teaching Grid.