Thursday 15 January 2009

Evaluation Mtg - part 1



Evaluation meeting 15 Jan - part 1

Dominic Bygate (our new ESCAPE Project Manager) and I attended the JISC-organised, Inspire Research-facilitated Evaluation Event at Maple House, Birmingham. The event presented many opportunities - not only a great chance to catch up with our cluster (Universities of … Westminster, Exeter and Bristol) but also with our Critical Friend, Malcolm Ryan. Malcolm had suggested we catch up the night before the event so that we could meet on a more informal and social basis. It was great to meet Inspire Research (aka Rachel and Alison), who joined us, and good too to discuss our work outside the framework of Thursday's agenda items - great job Malcolm - thank you!

What follows is a collection of thoughts that came out of the Evaluation Event …

We started the day by trying to sketch out what we saw as evaluation. Many metaphors were produced around our table.

They included …
* A running track that had a start and finish line but was littered with hurdles to encounter and pathways that accelerated the journey.
* Shark-infested waters, firing squads and comfy chairs
* Scales of judgement and also cycles and cycles
* Mirrors that gave the evaluator a chance to reflect on the whole and not just the individual pieces of the project - I likened this to suggesting it's a great tie and a fab shirt - but together they look just lousy!
* Boulders of various sizes suggesting that evaluation has many levels of granularity


Other things I scribbled from this session included an Ice Cream Sundae that showed much pleasure hidden below the surface, but potentially also some hidden fruits that might be sour to the palate. Notions of knowns and unknowns were also presented. I wondered too about unknown-unknowns as we engaged with our project.


What the metaphors were describing were journeys, multiple pathways, some good times ahead and also some difficult things to confront. The importance of stakeholder engagement was raised and a great point was offered in terms of helping students develop their skills so that they might better articulate and describe their needs and experiences. I liked that.

After this visioning exercise we were presented with some useful definitions of evaluation. Notions of paradigms, and of where you are located within them, suggested that some definitions fit different projects better than others. It is conceivable too that a hybrid definition might better reflect what is needed for some projects.


Using some of the information from the ‘exploring the tangible benefits of e-learning’ document, we were asked to reflect as a cluster on why we might engage in evaluation. We were presented with a list that included …
* Cost savings / resources
* Improved retention & recruitment
* Enhancing skills and employability
* Increased student achievement
* Improved inclusion

My immediate thoughts were that, whilst the items on the list are extremely laudable, our project wants to get to them by working with staff - and so notions of staff development and changing thinking were vital. Closer inspection of the ‘exploring the tangible benefits of e-learning’ document suggests that the initial list given to us was only a subset and that supporting staff was indeed noted.

Our cluster also spoke about
* the tension in evaluation between ‘numbers and stories’ - the quantitative and qualitative debate. You can go and watch a 5-star-rated film - but what do 5 stars mean?
* the potential conflict between different stakeholders as to what evidence might be needed
* notions of developing transformative change and also culture change
(Such notions are at the heart of the ESCAPE project)
* the need for alignment of the project with institutional direction for sustainable change, or possibly alignment / engagement with external communities (disciplines / areas of interest etc) to support sustainability

Other groups fed back …

* All the initial items on the list related to outputs (as we had indicated)
* Various stakeholders and key outputs
* Sustainability, culture, embedding, lessons learned for the sector
* Proving improvement
* Need to develop systems and measures
* Benefits to the individual learner
* And also supporting a community
(resonates with sections of the How People Learn framework)
* Approaches and evaluation methods were presented, as well as the importance of baselining new and existing data.
* Getting a decent response, which was not just about the number of responses but the level of sophistication that will help the project develop - this nicely aligned with the earlier point about helping students articulate their experiences and not just expecting them to be able to.
* Quality of conversations and how to measure and evidence the conversation.
* Does the process of capturing evidence get in the way, or does it create its own evidence?
* How to measure behavioural change
* The idea of the reverse novelty effect was described, which raised issues about when to evaluate and how to evaluate


It was really interesting to see how the groups tackled the same task. One group focused more on ‘methods’, one more on ‘stakeholders’, and two (us included) probably focused more on wider issues.

MBR
