Thursday 29 January 2009

Times Higher Education


Some time ago I was passed an email from the Times Higher Education (THE) looking for examples of innovative practice in assessment. I responded positively on behalf of the University. Clearly I don't know all the good work that is going on at UH, but given my position in the Blended Learning Unit, I do get a scent of good things outside of engineering too!

Little happened for a while, and then a couple of weeks back I did a telephone interview with Rebecca Attwood, the THE journalist writing the piece. I was really pleased to see that 'some' of what I said found its way into the article, and pleased too to help put Hertfordshire on the map. We have many things to be proud of here - just as other HEIs do. The article can be found here


I really enjoyed the article - great job, Rebecca - but it was a tad disappointing to note that the contributions came from the same familiar 'names'. Surely there are more of us out there with a passion for assessment?

I would love to read a piece with great contributions from academics and staff developers whose names I have never heard of, or who are not so intimately connected with assessment, but who are nevertheless as passionate about the students' experience and the impact that assessment has. Just a thought :-)

Wednesday 28 January 2009

Meeting with Management Team of the Business School


The ESCAPE project will work with the Business School and also the School of Life Sciences. Naturally we are presently on many paper trails looking at current practice; this will inform our project and evaluation plans. To support and enhance our ‘paper-trailing’, Dominic and I sat down yesterday with Mike Broadbent, Head of Department: Accounting, Finance and Economics (and Chair of the Faculty Learning and Teaching Group), and Karen Robins, Associate Head of Department (Management Systems), both from the Business School. We had a really productive meeting. They seem to have as much energy and enthusiasm for the ESCAPE project as we have. We discussed some of their general Teaching, Learning and Assessment challenges as well as the ESCAPE project.

Predominantly their challenges lie in the area of large groups and direct-entry students. Large classes present many different challenges which we need to unpack and provide appropriate responses to. I am aware of the excellent work done by David Kraithman (and others), who spent some time and effort providing really useful and accessible electronic resources to the students. The motivation was to move the lectures from being didactic, information-giving affairs to being more seminar-based interactions. It will come as little surprise that class performance improved. Great job David! Not only does this show how technology can help the teaching and learning nexus, but also how out-of-class engagement can, and indeed should, inform in-class engagement.


I was also musing on some of the practicalities of assessing large classes. My largest class probably peaked at around 200. For some modules in the Business School student numbers are higher. I heard that a self-service coursework hand-in box was withdrawn from use due to potential health and safety issues, as queues of students formed to submit their work just before the submission deadline. I then got thinking about a class size of, say, 360 students and considered the case where each student submits their work in a plastic wallet - a reasonably common phenomenon. Assuming it takes around 5 seconds to pick up and take the work out of the wallet and another 5 seconds to put it back after marking (i.e. 10 seconds per student), this trivial activity alone adds another hour to the assessment burden. Okay, this might not seem like much, but as class sizes grow so too does the time and distraction of engaging in trivial, non-learning-oriented activity.
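For anyone wanting to sanity-check that back-of-the-envelope figure (the 360 students and the 5-second handling times are, of course, just my assumed numbers): 360 students x 10 seconds = 3,600 seconds = 60 minutes, so a full hour of wallet-wrangling before a single mark is awarded. Double the handling to 20 seconds per script and it becomes two hours.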

Clearly there is much potential in electronic submission, which UH has, but making those systems useable and also configurable to suit the various usage patterns is no trivial activity.

We spoke of the Business School’s desire to meet the University’s target to reduce its time to return coursework, and conversations moved to maintaining the standard of assessment. Again, I have views here, which essentially centre on the notion of ‘what’s the use of an integrated, challenging, well-thought-through piece of coursework if the attendant marking demand means it takes 6 weeks to return the feedback?’. Clearly I am not suggesting moving away from establishing authentic and challenging assessment tasks, but there might be some work here in looking at the assessment experience in its entirety.

A really good meeting and one which established a positive tone for engagement, but which quite rightly set down some challenges from the chalk (PC) face for us to consider.

A presentation to the University's Academic Quality and Enhancement Committee (AQEC)

I was invited to present some examples of effective assessment and feedback practice to the University's Academic Quality and Enhancement Committee (Tuesday 26th Jan 2009). Although the presentation was not specifically related to our ESCAPE activity, it was a committee that the ESCAPE project needs to touch. If we are to be successful in making our activity mainstream and embedded then such committees need to be shown the value of our endeavours, and we need them to help us respond to our challenges. Working with enthusiasts and champions is not enough. By definition, ESCAPE has an explicit intent to be sustained long after the project funding has finished.

I started by outlining the importance of assessment and feedback - both in terms of the challenges raised by the results of the National Student Survey (NSS) and also (and ultimately) learning! Any meaningful and aligned curriculum has assessment at its core.

I then moved on to show some work that Helen Barefoot and I have been doing in terms of collecting case studies from the LTI and, more importantly, overlaying principles of good assessment and feedback practice on those case studies. The case studies are tagged to help staff find resources relating to their challenge.

I think (hope) the committee liked the presentation and also the pragmatism of recognising that academic colleagues are busy, and so not only do we need to collect resources but we also need to help staff access them in ways that meet their needs.

The perpetual problem was raised about engaging students with feedback, and the frustrations felt by staff in producing feedback that is not even collected, let alone used, by the students. I have my own views on why this is the case - but in many cases I think the problem resides with us. Just as assessment creates activity, so too should feedback. Feedback should also create consequences.

Friday 16 January 2009

Evaluation Meeting - Birmingham, 15th January 2009




This meeting was my first chance to really immerse myself in the project and I was looking forward to meeting our critical friend Malcolm Ryan and the others from the JISC support network. I was very interested in having a chance to meet with people from our sister projects - particularly those in our cluster. I am hoping that as our projects progress our links will grow stronger and we will develop ways of supporting each other.

The venue in Birmingham was an excellent one, as it was flexible and spacious enough to accommodate and encourage all the meetings - scheduled and unscheduled, formal and informal - that took place. It really brought home to me how a space can influence the productivity and the modality of the events that occur within it.

Reflecting on the evaluation of chocolate chip cookies (see Mark's previous entry), which was one of the exercises we performed to stimulate a discussion on evaluation, made me think about how evaluation should be right there in the design process. Chocolate chip cookies do not appear naturally in nature (at least not in Hertfordshire), therefore there will have been (I would hope) an evaluation policy before the biscuits were put into general production. This would not have been the case with, say, a banana - which does occur naturally in nature. We can evaluate it - but it is an artificial evaluation, one that we impose on it from our perspective. We want to eat it, so we want it to taste nice, be nutritious and not poison us. This occurs by chance... not design.

This simple exercise has really made me think about evaluation and how important it is for it to be there right at the planning and design stage.

Thursday 15 January 2009

Evaluation Mtg - part 2


Evaluating a chocolate-chip cookie

During the evaluation meeting (15 Jan 2009) we were asked to evaluate a chocolate chip cookie. We did this in our cluster.

We were presented with four bowls, each filled with a whole cookie and bite-sized pieces of the same cookie. We were asked to devise evaluation criteria that we could use to choose the best cookie. Sure, the exercise might not be overly related to our project, but (putting that aside and chomping away) it threw up lots of interesting observations.

We spent a long time drawing up an extensive list.

Naturally taste was on the list, but so too were colour, distribution of choc-chips, density of choc-chips, look of the cookie, crunchiness, chewiness, buttery-ness, smell, size, cost, packaging, etc. The list goes on...

Having identified the list we then tasted them. It was interesting that after producing the list someone immediately said something about a cookie that was not on the list!

This raises the question: can all useful evaluation criteria be identified before the evaluation takes place? I wonder if this resonates with the notion of unexpected consequences?

I struggled to offer any sensible thoughts and did not have a sophisticated enough palate to sense/taste, and hence talk about, buttery-ness. It was also suggested that just because we might be able to identify criteria, they might not be relevant. This was certainly the case for me with the buttery-ness criterion.

Another group mentioned they just went for ‘taste’, and this reminded me of notions of connoisseurship. Can we really set out what we are looking to measure, or is there a sense of connoisseurship that has a part to play here too?


The obvious points arose from the wider group:

* just because you can measure it, it doesn't mean it's important, and
* nothing ever improved, grew or was enhanced by being measured alone.
Hmm, I’m mindful here of the Hawthorne Effect.

Nevertheless, these were some useful reminders as we now engage in developing our evaluation plan. There were certainly some useful lessons to learn from this exercise.

MBR

Evaluation Mtg - part 1



Evaluation meeting 15 Jan - part 1

Dominic Bygate (our new ESCAPE Project Manager) and I attended the JISC-organised, Inspire Research-facilitated Evaluation Event at Maple House, Birmingham. The event presented many opportunities - not only a great chance to catch up with our cluster (Universities of … Westminster, Exeter and Bristol) but also our Critical Friend, Malcolm Ryan. Malcolm had suggested we catch up the night before the event so that we could meet on a more informal and social basis. It was great to meet Inspire Research (aka Rachel and Alison), who also joined us, and good too to discuss our work outside the framework of Thursday's agenda items - great job Malcolm - thank you!

What follows is a collection of thoughts that came out of the Evaluation Event …

We started the day by trying to sketch out what we saw as evaluation. Many metaphors were produced at our table.

They included …
* A running track that had a start and finish line but was littered with hurdles to encounter and pathways that accelerate the journey.
* Shark infested waters, firing squads and comfy chairs
* Scales of judgement and also cycles and cycles
* Mirrors that gave the evaluator a chance to reflect on the whole and not just the individual pieces of the project - I liken this to suggesting it’s a great tie and a fab shirt - but together they look just lousy!
* Boulders of various sizes suggesting that evaluation has many levels of granularity


Other things I scribbled down from this session included an ice cream sundae that showed much pleasure hidden below the surface, but potentially also some hidden fruits that might be sour to the palate. Notions of knowns and unknowns were also presented. I wondered too about unknown-unknowns as we engaged with our project.


What the metaphors were describing were journeys, multiple pathways, some good times ahead and also some difficult things to confront. The importance of stakeholder engagement was raised and a great point was offered in terms of helping students develop their skills so that they might better articulate and describe their needs and experiences. I liked that.

After this visioning exercise we were presented with some useful definitions of evaluation. Notions of paradigms and where you are located suggested that some definitions better fit different projects. It is conceivable too that a hybrid definition might better reflect what is needed for some projects.


Using some of the information from the ‘Exploring the tangible benefits of e-learning’ document, we were asked to reflect as a cluster on why we might engage in evaluation. We were presented with a list that included
* Cost savings / resources
* Improved retention & recruitment
* Enhancing skills and employability
* Increased student achievement
* Improved inclusion

My immediate thoughts were that, whilst the items on the list are extremely laudable, our project wants to get to them by working with staff - and so notions of staff development and changing thinking were vital. Closer inspection of the ‘Exploring the tangible benefits of e-learning’ document suggests that the initial list given to us was only a subset, and supporting staff was indeed noted.

Our cluster also spoke about
* the tension in evaluation between ‘numbers and stories’ - the quantitative and qualitative debate. So you can go and watch a 5-star-rated film - but what does 5 stars mean?
* the potential conflict between different stakeholders as to what evidence might be needed
* notions of developing transformative change and also culture change
(Such notions are at the heart of ESCAPE project)
* the need for alignment of the project with institutional direction for sustainable change, or possibly alignment / engagement with external communities to support sustainability - disciplines / areas of interest etc.

Other groups fed back

* All the initial items on the list related to outputs (as we had indicated)
* Various stakeholders and key outputs
* Sustainability, culture, embedding, lessons learned for the sector
* Proving improvement
* Need to develop systems and measures
* Benefits to the individual learner
* And also supporting a community
(resonates with sections of the How People Learn framework)
* Approaches and evaluation methods were presented, as was the importance of baselining new and existing data.
* Getting a decent response - which was not just about the number of responses but the levels of sophistication that will help the project develop. This idea was nicely aligned with the previous idea of helping students articulate their experiences and not just expecting them to be able to.
* Quality of conversations and how to measure and evidence the conversation.
* Does the process of capturing get in the way of / does it create its own evidence?
* How to measure behavioural change
* The idea of the reverse novelty effect was described which raised issues on
* When to evaluate and how to evaluate


It was really interesting to see how the groups tackled the same task. One group focused ‘more’ on methods, one more on ‘stakeholders’, and two (us included) probably focused more on wider issues.

MBR

Saturday 10 January 2009

Project manager

Our new project manager, Dominic, started work on the ESCAPE project this week. I am sure he will be a great asset to the project and will drive the work forward. Dominic has experience of working with staff on change management projects and I am sure his expertise in this area will be invaluable when he works with colleagues in the Business School and Life Sciences.

I am looking forward to being a part of the ESCAPE steering group and am feeling better about taking a back seat. I currently have a number of other projects which I am working on, and keeping all the plates spinning is certainly proving challenging.

I am looking forward to following the progress of ESCAPE via the blog and hope that Mark and Dominic keep us up to date with their thoughts.