
Tuesday, 7 September 2010

Final Cluster Meeting

We have just had our last cluster meeting, which was held at the University of Exeter. The theme of the meeting was to give each project a chance to present what had been achieved over the two years and to look to the future - beyond the official end of the projects at the end of October. It gave us a chance to peer-critique each other's projects. We presented on the work of ESCAPE and Mark demonstrated aspects of the ESCAPE Toolkit. Themes that we covered included:


  • the development of a set of ESCAPE Principles

  • mapping the current assessment landscape to these principles

  • considering efficiency versus effectiveness

  • what transformative change looks like

  • demonstration of some of the "themes in practice" videos

We talked about how the project has laid a foundation for staff engagement with a year-long, university-wide assessment project that is running post-ESCAPE.

The meeting was in the usual format of a two-day timetable, running from lunchtime on the Thursday to lunchtime on the Friday. Helen Beetham joined us on Friday to facilitate a session exploring what we have learnt as a cluster. We were looking to build upon the collaborative efforts of our joint cluster presentation at the University of Greenwich e-learning conference. There were some interesting ideas for further collaboration, including working with the University of Exeter (INTEGRATE project) to share resources from our projects.

We agreed that beyond the final programme meeting in October we would look to meet in 12-18 months to see what impact our projects have had.


Wednesday, 31 March 2010

Cluster Meeting 25th & 26th March 2010


We have just hosted the third meeting of our cluster (the universities of Bristol, Exeter, Hertfordshire and Westminster). The venue was the Comfort Hotel in St Albans. The hotel was at one time the home of Samuel Ryder of Ryder Cup fame. The meeting ran, as usual, from lunchtime on the 25th to lunchtime on the 26th. This gives everyone a chance to arrive in good time to start the meeting, and allows for a social event in the evening - usually a group meal at a local restaurant. This time is valuable, as much of the meeting is about sharing experiences of our projects and seeking advice from our fellow cluster project teams and our critical friend Malcolm Ryan.


The first day kicked off with a welcome and introduction by Malcolm and a review of the ground rules for the two days. The Chatham House Rule means that we can speak candidly about issues and events arising in the projects. We then moved on to looking at how we had used the Appreciative Inquiry approach in our project, in a facilitated session run by Rachel Harris of Inspire Research. We then turned to evaluation and looked at how we are approaching this - thinking about "what a successful project looks like" and discussing the value in understanding why something did not work out as planned. The day was rounded off by a very pleasant meal at which the topics discussed during the day were further explored.


The second day looked at the plans for the final cluster meeting at Exeter - a draft agenda was proposed and celebrations planned! We then moved on to plans for a joint conference submission by the cluster at the Greenwich e-learning conference in the summer. The day finished with a session on using Fishbone Analysis to explore the sustainability agenda of the projects.


All in all, a really useful two days - we have similar experiences and issues, and it was an excellent opportunity to raise them and to identify ways to move forward.

Wednesday, 7 October 2009

Poster and Video


This last week we have been involved in the production of a poster and a video in preparation for a joint meeting of the curriculum design and curriculum development projects in Manchester on the 13th and 14th of October. The poster is a visual representation of the stages and mechanisms of the project. It captures the stakeholders' journey through the project and illustrates the approaches taken by the ESCAPE team at the various stages. It was heavily influenced by the change management development activities that we were involved in at our last CAMEL meeting. One aspect of change management we looked at was a four-stage model of change, viz:
  • Awareness
  • Interest
  • Trying
  • Adopting

It was interesting mapping our project activities to these four stages and realising that the stages were not atomic - that is to say, they could overlap - and that it was possible for stakeholders to drop back a stage (or two!).

The poster does not really capture the complexity of what we are doing - for example, we are working with 9 different module teams, with some modules attempting quite complex and innovative changes in assessment practice - but it is a good start as a "top layer" map of the project.

I am now working on a three-minute Flip cam video that will introduce the ESCAPE project and include a stakeholder voice. This, along with videos introducing all the other projects, can be found at www.youtube.com/jisccdd - a dedicated JISC channel on YouTube.

Thursday, 28 May 2009

It's a Two Day Event


Things are starting to pick up speed with the project. We are planning a two-day off-site event at the university conference venue - the Fielder Centre. The likely date is towards the end of July - once all the exam boards are out of the way. The aim of the event is to give the nine module teams that we are working with the time and space to critically examine their assessment and feedback practices, and then to start the re-engineering process. We are planning (at this early stage) for the two days to be a mixture of presentations, facilitated workshops and module team meetings. The module teams will be able to call on the resources of BLU and the LTI to examine the particular pedagogical and technological aspects of possible assessment scenarios. In particular, we are calling on our experts in curriculum design and change management.


By the end of this week all the module coordinators will have had appreciative inquiry reflective interviews in order to build up case studies of how each module is delivered and assessed. These case studies, along with data from the student assessment surveys that we are carrying out, will give us a really clear picture of the issues surrounding each particular module. They will become the jumping-off point for the re-engineering process.

Friday, 20 March 2009

Meeting Life Sciences Module Teams - Scene Setting

So, after much behind-the-scenes activity, we recently met with the module teams from the School of Life Sciences. We are engaging in Appreciative Inquiry (as an evaluation framework) and, before we enter the Inquire phase (with them), we wanted to set the scene and help get them on board with our work. They, like other Schools, are very busy and I have no desire to draw on their time unnecessarily. We set aside 2.5 hours for a meeting where we could outline the project and get them interacting. The time flew by - for me anyway. Dominic did a great job introducing the project - I simply made up the numbers and responded to some of their concerns.

Naturally (given it was a 2.5-hour session), we built in things for them to do and feed back on - I thought (and I did say that I hoped I had not misread the mood of the room) that they were very positive.

Helen Barefoot came along too, since this is her School and she spans both the Learning and Teaching Institute and the School. Helen was a great advocate and could help relate our ideas to their context.

We asked for their immediate thoughts - which are now being collated and will be fed back here.

At the end of the session we also gave them Flip cameras and asked them to introduce themselves and talk very briefly about a positive assessment experience. The room was lively and vibrant, and although it was late and we were at the end of the 2.5 hours, we got some great clips. This will help set a very positive picture when we start the Inquire phase.

Thanks Life Sciences - great start!

Tuesday, 17 March 2009

Steering Group - More thoughts

Colleagues at our inaugural Steering Group meeting offered support, encouragement and (quite rightly) probing questions - I thought I would post some of the observations and thoughts here too.

A few 'things' that resonated with me from the discussions -

1. the NSSE vs. the NSS
2. taking my (our) own agenda to the Schools
3. giving due consideration to all items of the plan
4. ambition - making sure that we are not being overly ambitious
5. drawing on projects that have already engaged in similar-ish activity


1. I have flirted with the NSSE / seen pointers to it on my USA conference travels, and it is something that I would like to explore further. Clearly we will have to balance the institutional requirements of 'success' in the NSS, and so this will need to stay on our radar too. I would like us to focus on learning and on seeing how that aligns with the NSS - i.e. not wanting to be NSS-driven.

2. David Nicol asked what agenda we will be taking to the Schools - i.e. in addition to working with the agendas of the Schools. Great question! I have thought long and hard about this since the meeting - I am aware of David's self-regulation agenda, for instance. Most of my work has been about trying to develop a personalised experience for students. That is not to trample over the benefits of collaboration, community-centredness or the importance of socialisation, but I am keen to see students as individuals in the current mass HE system. In some sense this is in part a response to one of the 14 grand challenges raised by the National Academy of Engineering. I think personalised learning / seeing students as individuals will resonate with us here, since we can quite rightly draw on David's work and promote the benefits of community-centredness, but also remind ourselves that we are dealing with individuals. We are a large university, and losing contact with our students (as individuals) will be of concern to many of us here.

3. Our project plan was detailed in some areas but less so in others - we need to balance our endeavours and show commitment to all aspects of the project and its stages. Lisa Gray raised similar observations when she asked for more specificity.

4. We write in our project plan about transformative and sustained change. I don't wish to retreat from these aims, but the Steering Group encouraged a close look at the size of the 'ambition'.

5. I am aware of the FAST and SENLEF projects, and also the Engaging Students with Assessment Feedback project. Actually, now I come to write about those and the other A&F-related projects (SPRINTA etc.) that Chris Rust and I were involved with at the University of Essex, it's kind of bonkers that there are still so many issues with some of the basics of assessment and feedback practice.
I will make sure that we revisit those projects and use what others have already found. I will also draw on the experiences and expertise of Malcolm Ryan and the Learner Experience projects. Learner Experience is also part of this project.

Hmm. Lots to do - but it was just great to get the input from colleagues. I have every confidence that our project will benefit from the experience and expertise of our Steering Group - Great input - thank you!

Mark

Monday, 9 March 2009

Scottish Journey

Mark and I travelled up to Glasgow last week to meet with Rachel Harris from Inspire Research. We were looking at how using appreciative inquiry techniques could help us in our work with the module teams in the schools. AI looks at growing what works in an organisation to foster change, rather than framing things in terms of something needing fixing.

The day was very productive and we were able to flesh out exactly how we were going to approach things at a module level. In addition, we have started to put together a detailed timeline for the work we are doing with the schools. We had a chance to knock around quite a few ideas, and we were able to develop some ways of looking at things that should really help sustain the project's influence after the project itself finishes. In particular, we looked at how many modules you would need to influence in order to bring about institutional change. Mark compared it to how many people are needed to start a Mexican wave in a stadium. Is there a critical mass of modules that you need, and if so, what factors influence that number? We also looked at things from a person-centred approach and at how we could measure a person's influence within a school as an agent for change. We wanted to capture how many modules a particular person could influence, and see if we could express that as a kind of index or quotient. It was good to have Rachel's input and experience to keep us both grounded on what was possible within our time frame.
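The "critical mass" question above can be played with as a toy threshold model. To be clear, this is purely an illustrative sketch and not anything the project built: the `simulate` function, the module counts, the threshold and the adoption rule are all invented for the sake of the thought experiment.

```python
import random

def simulate(n_modules=50, seed_count=5, threshold=0.3, rounds=20, rng=None):
    """Toy 'Mexican wave' model of adoption spreading between module teams.

    seed_count - modules the project works with directly at the start
    threshold  - fraction of modules that must have adopted before the
                 change starts to spread on its own (the 'critical mass')
    Returns the cumulative number of adopting modules after each round.
    """
    rng = rng or random.Random(42)  # fixed seed so runs are repeatable
    adopted = set(rng.sample(range(n_modules), seed_count))
    history = [len(adopted)]
    for _ in range(rounds):
        frac = len(adopted) / n_modules
        if frac >= threshold:
            # past the critical mass: each remaining team follows on with
            # probability equal to the current adoption fraction
            newly = {m for m in range(n_modules)
                     if m not in adopted and rng.random() < frac}
            adopted |= newly
        history.append(len(adopted))
    return history
```

Seeding 5 of 50 modules with a 0.3 threshold leaves adoption flat, while seeding 20 pushes it over the critical mass and it spreads - which is essentially the question being asked: how many seed modules are enough, and what moves the threshold.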

At the end of last week we also got the feedback on the project plan from JISC - so this week will be concerned with updating the plan in the light of these and the steering group's comments. In addition, we will be arranging meetings with the module teams within the schools to get them to examine their current practice within an AI framework.

Wednesday, 25 February 2009

First Meeting of the Steering Group

Last Thursday we had our first meeting of the steering group, and we had an excellent turnout. It was good to see Malcolm Ryan again - our critical friend from JISC and fellow VW owner - and to meet Prof. Margaret Price from Oxford Brookes University. In addition, Prof. David Nicol from the University of Strathclyde joined us via Elluminate, along with Prof. Peter Bullen, head of BLU at UH.

We used a very large plasma screen to display the Elluminate interface, and Mark managed to solve some technical hitches prior to the start of the meeting, so the sound was excellent.

The steering group were very supportive of what we had achieved so far and were in agreement with our approach as detailed in the draft project plan - which was approved. They were able to offer advice on areas where they felt their experience was relevant. It is good to know that we are backed up in our endeavours by such a depth and breadth of experience. The steering group’s guidance and advice will be invaluable to helping make the project a success.

Some things that I got from the meeting:

  • look at the student perspective of assessment & compare with the lecturer view
  • think about how teaching & learning staff development for new (and existing) staff can support the sustainability of the project
  • create hard data for analysis
  • what are the staff and student drivers here?
  • students will always want more feedback - we need to encourage and facilitate self regulated learners
  • look at the national survey of student engagement
  • think about the context of the change we are trying to achieve - make sure that senior champions are on side
  • the challenge of taking general principles of good assessment practice - of which there are a number of schools of thought - and turning them into an actual assessment regime for a particular subject module.

One of the main themes that I picked up on is how much work is being done in areas similar to our own, and how we need to be aware of it. One of the discussions that Mark and I have had is his feeling that we need to have a good idea of what is out there - perhaps by carrying out a survey exercise.

Monday, 2 February 2009

Appreciating what works ...

As we develop and map out our evaluation plans, at the forefront of our thinking is a desire to ...

* evaluate 'as we go' and learn lessons quickly
* ensure the evaluation activity does not upset or interfere with our collaborative endeavours with our partner schools
* seek out what already works
* get as much stakeholder input into the evaluation activity as possible.

Appreciative Inquiry (AI) seems to offer an evaluation framework that meets many of the characteristics we are looking for. We will frame our questions in the spirit of AI and go looking for what works and how we can use more of it.

Visioning a brighter future, rather than taking a deficit approach, is something we are looking forward to engaging with.

We will keep you informed of progress and, in keeping with the ESCAPE team's approach, will reflect back things that challenge and excite us.

Friday, 16 January 2009

Evaluation Meeting - Birmingham, 15th January 2009

This meeting was my first chance to really immerse myself in the project, and I was looking forward to meeting our critical friend Malcolm Ryan and the others from the JISC support network. I was very interested in having a chance to meet people from our sister projects - particularly those in our cluster. I am hoping that as our projects progress our links will grow stronger and we will develop ways of supporting each other.

The venue in Birmingham was an excellent one, as it was flexible and spacious enough to accommodate and encourage all the meetings - scheduled and unscheduled, formal and informal - that took place. It really brought home to me how a space can influence the productivity and the modality of the events that occur within it.

Reflecting on the evaluation of chocolate chip cookies (see Mark's previous entry), which was one of the exercises we performed to stimulate a discussion on evaluation, made me think about how evaluation should be right there in the design process. Chocolate chip cookies do not appear naturally in nature (at least not in Hertfordshire), so there will have been (I would hope) an evaluation policy before the biscuits were put into general production. This would not have been the case with, say, a banana, which does occur naturally in nature. We can evaluate it, but it is an artificial evaluation, one that we impose on it from our perspective. We want to eat it, so we want it to taste nice, be nutritious and not poison us. This occurs by chance, not design.

This simple exercise has really made me think about evaluation and how important it is for it to be there right at the planning and design stage.

Thursday, 15 January 2009

Evaluation Mtg - part 2

Evaluating a chocolate-chip cookie

During the evaluation meeting (14 Jan 2009) we were asked to evaluate a chocolate chip cookie. We did this in our cluster.

We were presented with four bowls, each filled with a whole cookie and bite-size pieces of the same cookie. We were asked to devise evaluation criteria that we could use to choose the best cookie. Sure, the exercise might not be overly related to our project, but (putting that aside and chomping away) it drew out lots of interesting observations.

We spent a long time drawing up a very long and extensive list.

Naturally taste was on the list, but so too were colour, distribution of choc-chips, density of choc-chips, look of the cookie, crunchiness, chewiness, buttery-ness, smell, size, cost, packaging, etc. The list goes on...

Having identified the list, we then tasted them. It was interesting that after producing the list someone immediately said something about a cookie that was not on the list!

This raises the question: can all useful evaluation criteria be identified before the evaluation takes place? I wonder if this resonates with the notion of unexpected consequences?

I struggled to offer any sensible thoughts, and did not have a sophisticated enough palate to sense/taste, and hence talk about, buttery-ness. It was also suggested that just because we might be able to identify criteria, they might not be relevant. This was certainly the case with me on the buttery-ness criterion.

Another group mentioned they just went for 'taste', and this reminded me of notions of connoisseurship. Can we really set out what we are looking to measure, or is there a sense of connoisseurship that has a part to play here too?

The obvious things arose from the wider group:

* just because you can measure it, it doesn't mean it's important, and
* nothing ever improved/grew/enhanced by being measured alone.

Hmm, I'm mindful here of the Hawthorne Effect.

Nevertheless, these were some useful reminders as we now engage in developing our evaluation plan. There were certainly some useful lessons to learn from this exercise.

MBR

Evaluation Mtg - part 1

Evaluation meeting 15 Jan - part 1

Dominic Bygate (our new ESCAPE Project Manager) and I attended the JISC-organised, Inspire Research-facilitated Evaluation Event in Maple House, Birmingham. The event presented many opportunities - not only a great chance to catch up with our cluster (Universities of … Westminster, Exeter and Bristol) but also our Critical Friend - Malcolm Ryan. Malcolm had suggested we catch up the night before the event so that we could meet on a more informal and social basis. It was great to meet Inspire Research (aka Rachel and Alison), who also joined us, and good too to discuss our work outside the framework of Thursday's agenda items - great job Malcolm - thank you!

What follows is a collection of thoughts that came out of the Evaluation Event …

We started the day by trying to sketch out what we saw as evaluation. Many metaphors were produced from our table.

They included …
* A running track that had a start and finish line but was littered with hurdles to encounter and pathways that accelerate the journey.
* Shark-infested waters, firing squads and comfy chairs.
* Scales of judgement, and also cycles and cycles.
* Mirrors that gave the evaluator a chance to reflect on the whole and not just the individual pieces of the project - I liken this to suggesting it's a great tie and a fab shirt, but together they look just lousy!
* Boulders of various sizes, suggesting that evaluation has many levels of granularity.

Other things I scribbled from this session were an ice cream sundae that showed much pleasure hidden below the surface, but potentially also some hidden fruits that might be sour to the palate. Notions of knowns and unknowns were also presented. I wondered too about unknown-unknowns as we engaged with our project.

What the metaphors were describing were journeys, multiple pathways, some good times ahead and also some difficult things to confront. The importance of stakeholder engagement was raised, and a great point was offered in terms of helping students develop their skills so that they might better articulate and describe their needs and experiences. I liked that.

After this visioning exercise we were presented with some useful definitions of evaluation. Notions of paradigms and where you are located suggested that some definitions better fit different projects. It is conceivable too that a hybrid definition might better reflect what is needed for some projects.


Using some of the information from the 'Exploring the Tangible Benefits of e-Learning' document, we were asked to reflect as a cluster on why we might engage in evaluation. We were presented with a list that included:
* Cost savings / resources
* Improved retention & recruitment
* Enhancing skills and employability
* Increased student achievement
* Improved inclusion

My immediate thoughts were that whilst the items on the list are extremely laudable, our project wants to get to those items by working with staff - and so notions of staff development and changing thinking were vital. Closer inspection of the 'Exploring the Tangible Benefits of e-Learning' document suggests that the initial list given to us was only a subset, and supporting staff was indeed noted.

Our cluster also spoke about:
* the tension in evaluation between 'numbers and stories' - the quantitative and qualitative debate. So you can go and watch a 5-star-rated film - but what does 5 stars mean?
* the potential conflict between different stakeholders as to what evidence might be needed
* notions of developing transformative change and also culture change (such notions are at the heart of the ESCAPE project)
* the need for alignment of the project with institutional direction for sustainable change, or possibly alignment / engagement with external communities - disciplines, areas of interest etc. - to support sustainability

Other groups fed back:

* All the initial items on the list related to outputs (as we had indicated)
* Various stakeholders and key outputs
* Sustainability, culture, embedding, lessons learned for the sector
* Proving improvement
* The need to develop systems and measures
* Benefits to the individual learner
* And also supporting a community (resonates with sections of the How People Learn framework)
* Approaches and evaluation methods were presented, and also the importance of baselining new and existing data
* Getting a decent response, meaning not just the number of responses but levels of sophistication that will help the project develop - this idea was nicely aligned with the earlier idea of helping students articulate their experiences and not just expecting them to be able to
* The quality of conversations and how to measure and evidence those conversations
* Does the process of capturing get in the way / does it create its own evidence?
* How to measure behavioural change
* The idea of the reverse novelty effect was described, which raised issues of when to evaluate and how to evaluate

It was really interesting to see how the groups tackled the same task. One group focused 'more' on methods, one more on 'stakeholders', and two (us included) probably focused more on wider issues.

MBR