
Thursday, 9 September 2010

ESCAPE themes - student focus

The ESCAPE (Assessment for Learning) themes have been synthesised from the literature. Indeed, we purposely drew on the literature to create the questions that prompt teachers to think about their assessment design in light of the six ESCAPE themes.

But the design is only one facet of good assessment. The role of the students also needs to be considered. Sure, the design should stimulate appropriate student behaviours and so on, but we wanted to go further.

What we are working on now is a set of 'accessing questions' for our ESCAPE themes that are explicitly written for students. We have the same themes (of course) but the accessing questions are different. Although we have not yet tested the ideas out, I wanted to share some of our current thinking. Comments, questions and thoughts are most welcome.


Good Practice in Assessment-For-Learning:
Engages students with the assessment criteria
Assessment is an important aspect of student learning and should be used to help reinforce the expected standards. Our interactions with students, through assessment and feedback, should help students engage with the assessment criteria.

Q1.1 I seek out opportunities to help me understand the academic standards expected of Higher Education.
Q1.2 I take advantage of the resources available (across UH) to ensure my work meets the academic standards expected of me.
Q1.3 When presented with an assessment task I read and ensure I correctly understand the assessment criteria.
Q1.4 When I receive feedback on my work I look at my feedback and link it back to the assessment criteria to support my future learning.
Q1.5 I ensure my assessment submission responds to all the assessment criteria / learning outcomes described in the assessment briefing documents.

In what ways do you engage with the assessment criteria?


Good Practice in Assessment-For-Learning:
Supports personalised learning
Students have their own motivations and interests. As individuals, students also have differing needs to support their learning. Whilst individual assessment tasks are likely to be an impractical proposition, it is helpful to consider how assessment can support the personalisation of learning.

Q2.1 Where appropriate, I use my own personal experiences to support my assessment submissions.
Q2.2 I take opportunities to let my lecturer know about the areas I would like feedback on (strengths and weaknesses).
Q2.3 I take advantage of any assessment choices presented to me to suit my learning preferences (topic/weighting/timing/criteria).
To finish!
In what ways do you seek out opportunities for personalised learning?


Good Practice in Assessment-For-Learning:
Ensures feedback leads to improvement
Feedback is an essential aspect of assessment activity. Feedback will be more effective if it is prompt and makes sense to the students. Moreover, good feedback provides a commentary on the students’ submissions, offers advice on how the work could be developed and provides opportunities for students to demonstrably engage with the feedback.

Q3.1 I recognise the many ways that feedback is presented to me about my work and my learning.
Q3.2 I know when and where feedback on my assessment is available and pick up my feedback as soon as it is released.
Q3.3 I take opportunities to discuss feedback with my lecturers and my peers.
Q3.4 I take the time to identify (by myself and / or with my peers) the strengths and weaknesses of my own assessment.
Q3.5 I use feedback from previous assessment tasks to help me improve my understanding and my next assessment task.

In what ways do you ensure that the feedback leads to your improvement?



Good Practice in Assessment-For-Learning:

Focuses on student development
Assessment has a significant influence on student motivation and the ways in which students approach their learning. Good assessment develops the students’ interests and motivations and encourages appropriate study behaviours. Ultimately, good assessment motivates good learning.

Q4.1 When constructing my assessment submission I focus my effort on learning (i.e. linking concepts together) rather than just remembering information.
Q4.2 When I receive feedback on my assessment I look carefully at the comments, advice and encouragement and do not just concentrate on the marks I received.
Q4.3 I take the time to review my own assessment (self assessment) before and after I submit my work.
Q4.4 I make sure I identify the positive aspects of my own work as well as areas for improvement.

In what ways do you ensure your activity focuses on the development of your learning?


Good Practice in Assessment-For-Learning:
Stimulates dialogue
A good learning environment considers the individual student whilst also recognising the importance of a learning community. Further, learning is enhanced if students are able to share their conceptions and misconceptions. Good assessments support the development of a learning community and provide opportunities for students to engage in a dialogue about their learning.

Teachers, too, should have an opportunity to engage in a dialogue: one that helps them shape their teaching and engage in staff, module and programme development activity.

Q5.1 I take every opportunity to contribute to group and class discussions relating to assessment and learning.
Q5.2 I look for opportunities to discuss my assessment with my peers and my teachers.
Q5.3 I look for other sources of help to support my assessment. This might include reading lists, learning groups, central support systems etc.
Q5.4 I use the assessment tasks (and subsequent feedback) to help me develop my understanding of the standards expected of me.
Q5.5 I always read the feedback I receive and use it to help me shape my learning.

In what ways do you engage in dialogue about your learning and assessment activity?


Good Practice in Assessment-For-Learning:
Considers student and staff effort
Good assessments create a good educational experience. Good assessments set out high expectations, foster appropriate study behaviours and stimulate students’ inquisitiveness, motivation and interest. Good assessment should distribute the students’ effort across the study period and topic areas. Good assessments will demand an appropriate amount of student effort. Good assessments will not, however, overload students or their teachers. Good assessments ensure there is adequate time for teachers to create and deliver feedback in ways that support student learning.

Q6.1 I put all my assessment deadlines in a diary/online calendar so that I am aware of what is expected of me.
Q6.2 I plan my work so that I am able to work on assessment tasks that have overlapping deadlines.
Q6.3 I avoid cramming and spread out my time on assessment tasks.
Q6.4 I follow the advice given (on the assignment briefing document) regarding how much time I should typically spend on my assessment activity.
Q6.5 For each assessment task I carefully plan each stage of the work (e.g. reviewing previous feedback, reading/research for the new assignment, creating a first draft, reviewing and amending, proof reading, self evaluation, submission).

In what ways do you plan and manage your effort to enhance your learning?


Again, comments / thoughts / questions most welcome.

Mark

Friday, 23 April 2010

It's Quality Time!


One of the things that has struck me this week is how often subtle changes to a module's delivery can have quite wide-ranging effects on the learning interactions that take place. The example I am thinking of is the Corporate Social Responsibility and Sustainable Marketing module. The module coordinator, using his experience from previous years, has posted a set of FAQs on his module Studynet site. The questions address points and issues that were raised by students last year and were generated from the experiences of the whole module team.
As a result, the module team's interactions with students, both face to face and online, have been able to focus more on developing the students' understanding of the material covered in the module. Students can be pointed in the direction of where to find specific information. This leaves the team time to develop the "high quality" dialogue and interaction with students that can really support and personalise the students' learning. This simple strategy (one of a number adopted by the team) of using a well-primed bank of FAQs has acted as a filter, so that the more complex and interesting questions that students generate can form the basis of meaningful discussion with the module team.

Here comes the data (1) ...


We are now starting to see the fruits of our labour on the ESCAPE project. Some of the 'fruits' arise when we talk to staff about their assessment practice and resulting experience, whereas other 'fruits' arise from observations of student performance and their engagement with their studies. I thought I would share some of what we are finding ...

A new module (not one of the original ESCAPE modules) wanted help with Peer Assessment. The member of staff was already engaging her students with peer assessment, and her students were hence reaping numerous learning gains:

* Students were able to see how their peers responded to the same task
* Students were able to engage more with the marking criteria and standards

Previously, however, she was kept very busy after the peer assessment by dealing with more students than expected questioning their marks. Whilst it is highly appropriate that students are exposed to a fair and reliable assessment, many of the efficiency gains made by the staff member were lost due to the need to deal with students on a one-to-one basis.

With the help of the ESCAPE project we were able to re-purpose a web-based data collection facility (developed to support computer-based assessment). Using the web-based data collection facility we posted to the students a series of questions asking them to reflect on their own submission and on the peer assessment process. This was in addition to the work previously done and hence created additional learning gains. Importantly, we also included an opportunity for the students to 'comment on their mark' and, if they felt they had been over- or under-marked, to provide evidence of this with reference to the marking criteria used in the peer assessment process. The result was a vast reduction in the 'additional' time required by the staff member to look at the concerns.
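To make the mechanics a little more concrete, here is a minimal sketch of how such a reflection-and-comment step might be modelled and triaged. The names, prompts and filtering rule are my own assumptions for illustration, not the actual UH web-based facility.

```python
# Hypothetical sketch of the post-peer-assessment reflection step; field names,
# prompts and the triage rule are illustrative assumptions, not the real UH tool.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ReflectionResponse:
    student_id: str
    reflection_on_submission: str                 # prompt: reflect on your own submission
    reflection_on_process: str                    # prompt: reflect on the peer assessment process
    comment_on_mark: str = ""                     # completed only if the mark is disputed
    criteria_evidence: List[str] = field(default_factory=list)  # evidence tied to the marking criteria

def mark_queries_for_staff(responses: List[ReflectionResponse]) -> List[ReflectionResponse]:
    """Forward only queries that dispute the mark AND cite the marking criteria;
    everything else is resolved by the reflection exercise itself, which is
    where the staff time saving comes from."""
    return [r for r in responses if r.comment_on_mark and r.criteria_evidence]

# Example: one routine reflection and one evidenced query about the mark.
responses = [
    ReflectionResponse("s001", "I covered the method well", "Marking others clarified the criteria"),
    ReflectionResponse("s002", "My analysis was thin", "The process felt fair",
                       comment_on_mark="I think criterion 2 was under-marked",
                       criteria_evidence=["Criterion 2: analysis is present on page 3"]),
]
print(len(mark_queries_for_staff(responses)), "query needs a one-to-one follow-up")
```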

This addition was a real win-win. Students were now reflecting on the process (and sharing the reflections with staff) and the staff were reaping efficiency gains.

Slightly self-promoting, but this typifies exactly the type of thing this project is about: using technology (led by pedagogy) to reap learning and efficiency gains.

I can't see this technology-supported intervention stopping when the project finishes.

Thursday, 4 February 2010

Off to Scotland


Helen Barefoot and I were invited by the HEA to run a workshop on assessment and feedback at an all-day assessment event at the Robert Gordon University. What follows is a collection of (very quick) thoughts on the day …

Helen and I have run a few workshops together (inside UH) and so it was just great to take our work (much of which is guided by our ESCAPE activity) to colleagues outside UH. For various reasons the workshop did not run :-(. Despite our disappointment we did get to hear some great assessment related presentations.

Dai Hounsell presented a really grounded keynote and, in fact, covered much of what we were covering too: that assessment is not a new challenge, that good assessment is planned activity and that good assessment stimulates learning. We did not need the NSS to get us thinking that assessment is important. Some really useful slides from Dai, which I will explore and come back to in a later post. A great start to the day.

A couple of student perspective presentations followed.
A student-led campaign had successfully introduced a turnaround-time policy for coursework and, interestingly, a policy to provide feedback on examination scripts. The learning gains to be had from providing feedback on examination scripts seem rather limited to me. I’m always banging on about feedback creating consequences, and I’m just not convinced I know what consequences flow, or are able to flow, from feedback on end-of-process, high-stakes assessment tasks. That’s a post for another day. But just to say, I’m off the fence on this one: I just don’t get it. Another delegate did note that they had a similar policy at his institution and only 15% of the scripts (with feedback) were picked up. Surely we would be better placed putting our feedback on work that will be picked up and, more importantly, attended to by the students. And relax!

Steve Draper, engaging as ever, had a couple of threads running through his presentation. First was the interesting anomaly that a department was rated 5th against other departments (107 in total) for the overall NSS question, 'Overall, I am satisfied with the quality of the course', and yet the questions relating to feedback were ranked much lower: 'Feedback on my work has been prompt' (ranked 54/107), 'Feedback on my work has helped me clarify things I did not understand' (ranked 79/107) and 'I have received detailed comments on my work' (ranked 101/107). Steve asked us to explore what might be going on. Was there a complex weighting algorithm for all the items on the NSS? Should the individual items of the NSS not sum to the overall score? If not, what was missing; what was the missing ingredient? Second, Steve separated declarative and procedural learning and pondered where our efforts on providing feedback might prove most effective, i.e. might we get more learning from less (or better targeted) feedback?
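Out of curiosity, a tiny synthetic example of the kind of anomaly Steve described is sketched below. The numbers are invented purely for illustration; the point is only that the 'overall satisfaction' item is answered directly by students rather than being computed from the other items, so a department's overall rank need not track its rank on the feedback items.

```python
# Illustrative only: invented scores for three hypothetical departments, showing how
# a department (A) can rank first on the 'overall' item yet last on every feedback item.
depts = {
    "A": {"overall": 4.6, "prompt": 3.2, "clarified": 3.1, "detailed": 3.0},
    "B": {"overall": 4.1, "prompt": 4.4, "clarified": 4.3, "detailed": 4.2},
    "C": {"overall": 3.9, "prompt": 4.0, "clarified": 4.1, "detailed": 4.0},
}

def rank_on(item):
    """Departments ordered best-first on a single NSS item."""
    return sorted(depts, key=lambda d: depts[d][item], reverse=True)

print("overall  :", rank_on("overall"))    # ['A', 'B', 'C']
print("prompt   :", rank_on("prompt"))     # ['B', 'C', 'A']
print("clarified:", rank_on("clarified"))  # ['B', 'C', 'A']
print("detailed :", rank_on("detailed"))   # ['B', 'C', 'A']
```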

Friday, 26 June 2009

Railway Journey


Yesterday I attended the "Engaging and Responding to Learners" workshop run by JISC in Birmingham. It was an extremely valuable day - both in terms of the workshop content and as a forum to network and engage with colleagues on other projects.

I took the train from Euston. The different types of bright, shiny new rolling stock that seemed to appear briefly and then whizz past the window provided an interesting distraction during the journey. It struck me that when there was just British Rail operating the railways, there was a one-size-fits-all policy, such that trains would often depart with four, eight or eleven nearly empty carriages.

The situation on the railways has a parallel with education: we are investing heavily in technological solutions for students' educational journeys. There is much more willingness to look at bespoke solutions and to embrace new ways of thinking than to stay with the one-size-fits-all delivery mechanisms of the past. The result? Faster, smoother journeys for all!

For those interested - on my journey the train was hauled by an EWS Class 90, and I returned to Euston on a Virgin Pendolino.

Thursday, 4 June 2009

Casting Bread On The Water.................


Yesterday afternoon we went live with the student assessment survey. We emailed all students registered on the nine modules that we are working with, to invite them to fill in the survey. The university exam period only ended last Friday, and we were unsure when students would log on again to pick up their university emails after their post-exam celebrations and the consequent recuperation period! When I checked this morning there were 51 responses already! Many students have given quite detailed feedback on their assessment experience, filling out the optional parts of the survey with their thoughts and suggestions. Let's hope that this level of response continues. This data is really important - along with the module case studies, it will provide us with the information to ensure the assessment benchmarking exercise is as comprehensive as it can be.
Regarding our two-day event: one of our module teams is having difficulty making the dates, 21 and 22 July, so we are looking to hold a separate event. This module is the most complex in terms of its delivery and assessment modes and has the largest module team. It is a Bioscience double-credit module that encompasses three different areas of science.





Friday, 10 April 2009

Curriculum Design Toolkit



Although I am the ESCAPE Project Director I only (officially) get to work on the project for 0.2 of my time. The other 0.8 is dedicated to my role as the Deputy of the Blended Learning Unit (BLU). In that 0.8 capacity, part of my remit is to lead a team of seconded staff in the area of Curriculum Design and Innovation (CDI).

Some of the CDI team are currently working on developing a Curriculum Design Toolkit. The toolkit has three components and is intended to help individuals, module teams and programme teams reflect on, and where appropriate re-engineer, their curricula. The primary components of the toolkit are listed below (a rough sketch of how they might fit together follows the list):

A Diagnostic tool – used to establish where the problems in the curricula might lie.
A Features and Consequences map – used to show what the ‘features and consequences’ of the diagnosed curricula are likely to be.
A Suggestions for Improvement resource bank – used to provide staff with different forms of advice should they wish to tackle any diagnosed problems.
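As a way of picturing how the three components might hang together, here is a rough sketch of the flow from diagnosis, to likely consequences, to suggestions. The issue, consequence and suggestion text are placeholders of my own, not content from the actual toolkit.

```python
# Hypothetical sketch of the toolkit flow; all strings are illustrative placeholders.

# 1. Diagnostic tool output: issues flagged for a module or programme.
diagnosed_issues = ["feedback arrives after the next assessment is due"]

# 2. Features and Consequences map: issue -> likely consequence for learning.
consequences_map = {
    "feedback arrives after the next assessment is due":
        "students cannot act on the feedback, so it does not feed forward",
}

# 3. Suggestions for Improvement resource bank: issue -> possible actions.
suggestions_bank = {
    "feedback arrives after the next assessment is due": [
        "return quick, cohort-level feedback first and detailed marks later",
        "stage the assessment so an early draft receives formative comments",
    ],
}

for issue in diagnosed_issues:
    print("Issue:      ", issue)
    print("Consequence:", consequences_map.get(issue, "not yet mapped"))
    for suggestion in suggestions_bank.get(issue, []):
        print("Suggestion: ", suggestion)
```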

On Wednesday and Thursday of this week I drew the team together to review progress and accelerate the development. The team are only seconded to the BLU on a fractional basis and so they, too, have many varied demands placed on them.

The two days were full of energy and enthusiasm for the development.
After I had outlined the vision for the toolkit and fed back initial thoughts from users of a pilot Diagnostic tool, day one was set aside for the team to show where they were at.

Marija Cubric outlined the Technology-to-Principles Matrix. Given that UH is wedded to the notion of Blended Learning there is a need for us to help staff make appropriate decisions about the use of the varied technologies available. For us it is never ‘the answer is a wiki, now what was the question?’

Following Marija’s presentation / discussion we spent the rest of the day exploring the progress of the various CD strands.

Currently the strands under development are ‘Curriculum Design Toolkits for…

* Core values (Here we use the Principles of Good Practice in Undergraduate Education offered by Chickering and Gamson)

* Employability – this development is being led by Amanda (Mandi) Relph and Sarah Flynn

* Research Informed Teaching – this development is being led by Phil Porter

* Assessment for Learning – this development is being led by Maria Banks and myself

Other strands to be developed include

* Internationalisation
* Entrepreneurship


The Diagnostic tool offers a set of research informed principles to the user. The Principles are different for each strand.

Each of the Principles is accessed separately by five questions. The Diagnostic tool rates the responses to the questions and shows, graphically, whether there is good (or otherwise) alignment with each of the Principles. We spent the whole of Wednesday looking at the Principles and the ‘accessing questions’.
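As a rough illustration of the rating step (the response scale, thresholds and aggregation rule below are my own assumptions, not the toolkit's actual algorithm), the five accessing-question responses for each Principle could be collapsed into a simple alignment rating like this:

```python
# Hypothetical sketch of the Diagnostic tool's rating step.
# The 1-5 scale, thresholds and the labels are assumptions for illustration.
from statistics import mean

# Five accessing-question responses per Principle, scored 1 (never) to 5 (always).
responses = {
    "Engages students with the assessment criteria": [4, 5, 3, 4, 4],
    "Supports personalised learning": [2, 3, 2, 2, 3],
}

def alignment(scores, good=4.0, partial=3.0):
    """Collapse five question scores into a simple alignment rating."""
    avg = mean(scores)
    if avg >= good:
        return "good alignment"
    if avg >= partial:
        return "partial alignment"
    return "weak alignment"

for principle, scores in responses.items():
    print(f"{principle}: {alignment(scores)} (mean {mean(scores):.1f})")
```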


Wednesday was really productive and we moved on our developments and thinking significantly. Wednesday also included a trip out to a local Indian restaurant. It was great to continue the discussions as well as spend some time socialising with colleagues.

Thursday was due to be a testing day. We were hoping (somewhat ambitiously) to get other members of the BLU / LTI to test the toolkits under different scenarios. Alas we were just too ambitious and so we stood the ‘testers’ down.

We continued with our work from Wednesday and also spent part of the day engaging with students. This was to incorporate their voice into the Features and Consequences Map. We thought getting the students involved would be a valuable contribution. I’m not sure any of us knew just how valuable that activity was going to be. We took input from Lauren Anderson and Elizabeth Terry (placement students in the BLU), Yoeri Goosens – BLU Student Consultant, Ruth Hyde – BLU research assistant recently engaged in the JISC funded STROLL project and Dawn Hamlet. Dawn is our link with the UH student Union. We have worked with this group before and they are always dedicated and committed. Thank you!

Each of the strand leaders met separately with the students to get the student view. It was really interesting to see how the strand leaders engaged the students differently. I suspect the different facilitation techniques kept the students alert too; that is, they were not overwhelmed by the same technique.


Gaining the voice of the student was followed by a super-efficient and dialogic facilitation technique (Marija Cubric) to get us to vote on the likely ability of the various technologies to meet the needs of the various Principles.

We came a long way in those two days and, although we are left with some ongoing activity, I doubt we would have got so far if we had stayed in our offices and worked on the toolkits on an ad hoc basis.

Oh, why am I writing this here? Mainly because of the Assessment-for-Learning strand that is being developed. It will be just great to get this and the Technology-to-Principles matrix fed into ESCAPE.

Watch this space…

Great job team!

Thursday, 2 April 2009

A few thoughts from Prof. Brenda Smith. HEA Assessment and Feedback Event

Brenda, as always, presents a really coherent argument and draws on a wealth of experience and expertise. Brenda’s position allows her to talk with numerous institutions, which subsequently feeds back into her knowledge. She uses this really well to provide some concrete examples of practice in the sector.

A few things I wrote down from her presentation …

* Engage students in planning and curriculum development – students can be great change agents.
* Assessments ‘can’ be set at week 3.
* Get the Heads of Academic Schools at Teaching and Learning sessions – get T&L valued.
* Policies of professorship for T&L – not valued, or limited take-up?
* A student-friendly academic School gives a higher student feedback return.
* The University of Lincoln provides a booklet relating to students’ feedback. The booklet includes ‘this is what you asked for, this is what you said, and this is what we did’. Helen and I are trying to get a You Said – We Did campaign up and running at UH. This resonated with us.
* Staff taking students to lunch and other simple schemes that attempt to engage students with staff. This again reminds me of Chickering and Gamson’s first principle: ‘Good practice in undergraduate education encourages staff–student contact’.
* Too much feedback might not be helpful
* Don’t just respond to students’ needs – Brenda gave the example of groups work and students not wanting it– Don’t forget to tell them why you are using it and what the purposes are. It’s not a secret!
* Strathclyde – podcast lectures and use lecture time more productively; this frees up time for better feedback. This reminds me of what we do with Blended Learning: use the technology to enhance and extend the traditional teaching sessions.
* Use of voice – audio supported feedback
* What do students do with feedback?
* Useful ways of presenting feedback – quickly!

Lots and lots more really useful ideas - I'm sure I've only touched the surface and done an injustice to Brenda's session.

Wednesday, 25 March 2009

ESCAPE coverage in the Universe




Hi there, my name is Elizabeth Terry and I am a Business Studies student currently on placement with the Blended Learning Unit / Learning and Teaching Institute (BLU/LTI). Working in such a committed learning environment, I have come to realise just how much the BLU/LTI value the students' opinion. Before working here, I had no idea how much thought went into enhancing the student experience, particularly within the classroom. Because of this, I have decided to do all that I can to make students fully aware of what activities we do and the benefits that they bring, not only to the university, but also to their studies.

Liaising closely with Dawn Hamlet (Vice President Academic Support and Campaigns), I have successfully been able to get numerous stories on various projects that the BLU/LTI are running published in the Universe (the students’ paper, which has on average 4,000 readers). This week, the Universe features a story on the ESCAPE project. The article highlights concerns that are attached to assessment and invites students to give their opinions and views on assessment and feedback.