Innovations in Assessment seminar at UCL

This was one of the HEA seminar series on assessment; it was well attended and participants were enthusiastic.
There was a real mix of traditional and innovative assessment approaches presented, including examinations, MCQs, scenario-based assessment, research folders and lecturecasts, and these were sometimes combined in the same programme.

Two interesting ideas:
1. Carl Gombrich, Philosophy: a lecturecast was sent to students in advance; students posted the questions they would like answered online and voted on the most popular ones, which were then answered in the taught session. Although based on transmission (the lecture), the student questions were interpreted as self-formative assessment.

2. Chiara Ambrosio, History of Science: students researched a topic, which could be a continuation of something a previous student had started researching. The aim was eventual publication of the research. A research folder, rather like a portfolio, was presented for summative assessment.

Students also had to read others’ projects and were tested on these in an exam, which seemed a bit incongruous to me but appeared to work. There was thus a clear link between peer formative assessment and summative assessment. A student joined the presenters and was very enthusiastic about the assessment approach.

See outputs soon at http://www.heacademy.ac.uk/events/detail/2012/seminars/themes/ts049_ucl.

Notes from an online seminar about the Making Assessment Count project

The project team, led by Gunter Saunders at Westminster, have developed a tool – e-Reflect – for linking feedback, self-review questionnaires and student reflections on these in a learning journal, supported by tutorials.
Details are at: https://sites.google.com/a/staff.westminster.ac.uk/mace/home
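
Gwyneth’s notes below refer to the feedback cycle the tool supports. As a rough illustration of that cycle (feedback received, a short self-review questionnaire completed, and the outcome logged in a learning journal ready for a tutorial), here is a minimal Python sketch; all names and structures are hypothetical illustrations, not e-Reflect’s actual data model or API.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical sketch of a feedback -> self-review -> journal cycle;
# not e-Reflect's actual data model.

@dataclass
class Feedback:
    assignment: str
    comments: str
    received: date

@dataclass
class SelfReview:
    # A simple Y/N questionnaire completed after reading the feedback
    answers: dict

@dataclass
class JournalEntry:
    feedback: Feedback
    review: SelfReview
    reflection: str               # the student's written reflection
    discussed_in_tutorial: bool = False

def log_cycle(journal, fb, answers, reflection):
    """Record one complete feedback cycle in the learning journal."""
    journal.append(JournalEntry(fb, SelfReview(answers), reflection))

journal = []
log_cycle(
    journal,
    Feedback("Essay 1", "Clear argument, but the evidence is thin.", date(2012, 1, 10)),
    {"Did you understand the feedback?": True,
     "Will you act on it in your next assignment?": True},
    "I need to support my claims with more primary sources.",
)
```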

Gwyneth attended an online seminar presented by Gunter and had the following thoughts:

  • The principles behind this tool fit well into our AC framework for this project.
  • Automating the feedback process made it easy for students to use, though it could also be frustrating at times.
  • The main benefit is that it encourages learners to act on feedback.
  • It all depends on the quality of feedback from staff and it is not clear whether or not using the system encourages staff to reflect on the feedback they give – there are still QA issues for MAC to address.
  • I wonder if there are less complex ways of achieving something similar.
  • Nearly all those participating in the seminar voted that a strength of this system is encouraging feedback dialogue and reflection over a whole programme. This fits again with the AC longitudinal approach.
  • Also, this tool is aimed at undergraduates; the Y/N questions in the questionnaire might not be so well received by PGs, who might expect something more sophisticated.

MAC is worth looking at for the pilots even if we do not use the toolkit in its present form.

Notes from an Elluminate seminar with David Nicol

I took part in a JISC-sponsored webinar today, run by David Nicol, entitled “Assessment and feedback: in the hands of the student”.

David’s presentation raised several points that are useful for our project:

  • He discussed the idea that the purpose of feedback might be to develop students’ capacity for evaluative judgement, not just for them to receive comments on specific pieces of work. He linked evaluative judgement to ideas of critical thinking, and emphasised the use of feedback and knowledge building (rather than just focusing on giving feedback).
  • The emphasis on “timely/detailed/clear” feedback, driven by national student surveys, was criticised for adopting a “delivery” model of feedback, rather than a cognitive one in which students are expected to decode; evaluate and compare; identify discrepancies; revise and construct knowledge; and transfer this understanding to new areas.
  • Practical strategies for fostering this included responding to comments; sequencing assignments to encourage drafting and re-drafting; overlapping tasks; patchwork texts that need to be ‘stitched’ together; reflection on feedback; and ipsative assessment (Gwyneth was name-checked here).
  • Echoing our own discussions, Simon argued for the importance of separating grading work from commenting on it and providing feedback. Learning about academic standards isn’t the same as learning how to improve work. This was in the context of self-review by students, but the point may well stand more generally.
  • Peer review is a useful model, since it encourages engagement with comments, prompts revisions and so on. Similarly, offering peer review comments to others fosters learning: it prompts reviewers to reflect on their own work (since their work is used as a point of reference) and encourages engagement with the marking criteria. Important principles here would include maximising the number of reviews undertaken, engaging in dialogue and relating the feedback offered to students’ own work.

Resources from the session, including the recording, slides and text chat are available from the Design Studio.

Participating in Assessment “Swap Shop”

This Monday, I took part in the JISC assessment technology “swap shop” Elluminate meeting. As well as providing an update on our project and what we’re up to, a few points came up that are of relevance to us:

  • One project is using a Moodle plugin for managing assessment, developed by ULCC. The project site is http://as.exeter.ac.uk/support/educationenhancementprojects/current_projects/ocme/. Here’s some blurb from their project plan: “The system (which is being developed by our Moodle hosting partner ULCC) will integrate fully with the ELE (Exeter Learning Environment) Moodle installation to allow students to submit coursework through ELE and receive a Turnitin similarity score. Assignments may then be marked on-line using Turnitin GradeMark or Word ‘track changes’. Feedback (which may consist of uploaded files or completed bespoke feedback forms) will be returned to students via ELE. Personal tutors will also be able to view feedback for their tutees.”
  • There’s also quite a simple administrative project about feedback notification, e.g. publicising the date when feedback to students is due, using just the VLE calendar and Outlook (a sketch of the general idea follows this list). (http://www.jisc.ac.uk/whatwedo/programmes/elearning/assessmentandfeedback/glamorgan.aspx)
  • There seems to be quite a lot of overlap with the Dundee project, interACT, which is focused on medical education but is concerned with feedback processes that look a lot like ipsative processes. They’re working with Blackboard rather than Moodle, however.
  • The LDSE modelling tools are being pushed for the programme as a whole, so we’re a bit ahead of the game in that we’ve already committed to using them.
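
On the feedback-notification point above: publishing a “feedback due” date as a calendar entry is straightforward. The sketch below writes a minimal iCalendar (.ics) file that Outlook or a VLE calendar can import; it illustrates the general approach only and is not the Glamorgan project’s actual tooling (the module code and addresses are made up).

```python
from datetime import datetime

def feedback_due_event(module, due):
    """Return a minimal iCalendar VEVENT announcing when feedback is due."""
    stamp = due.strftime("%Y%m%dT%H%M%S")
    return "\r\n".join([
        "BEGIN:VCALENDAR",
        "VERSION:2.0",
        "PRODID:-//example//feedback-dates//EN",
        "BEGIN:VEVENT",
        f"UID:feedback-{module}@example.org",   # hypothetical identifier
        f"DTSTAMP:{stamp}",
        f"DTSTART:{stamp}",
        f"SUMMARY:Feedback due for {module}",
        "END:VEVENT",
        "END:VCALENDAR",
    ])

# Write a file that staff or students can import into Outlook
with open("feedback_due.ics", "w") as f:
    f.write(feedback_due_event("EDU101", datetime(2012, 2, 17, 12, 0)))
```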

Second team meeting 12/1/12

This second team meeting was used firstly to receive a summary of the views of the steering group. The suggestion of including doctoral students in the project was a good one, especially since there are already plans to provide a way of capturing doctoral supervision sessions and feedback in Moodle. Secondly, the researchers reported on progress with the Baseline report. The data have been collected, although there may be some gaps to fill in later, and will be synthesised next week, ready for a draft of the report at the end of January.

The remainder of the meeting focussed on developing pilot plans, and the REAP assessment principles were discussed. The three pilot leaders present thought that these would be useful for pilot planning and will discuss them further with the programme teams involved.

Steering group meeting 12/1/12

The project steering group met, with Peter Chatterton, our critical friend, also in attendance. The steering group agreed not only to monitor the project and sign off deliverables, but also to have a critical input into the project and discuss hot topics as they arise. Suggestions arising from this meeting included:

  1. Possibility of the involvement of doctoral students.
  2. Interest in pedagogic modelling/pedagogic patterns being extended to assessment (Diana Laurillard to present on this at a future team meeting).
  3. Interest in looking at module-level practice in relation to advising students about assessment, e.g. handbook or advice given in class.

Stakeholder engagement was identified as likely to be an issue and a stakeholder engagement plan was recommended.

Meeting with JISC Dec 8th 2011

Gwyneth, Martin, Tim and the pilot leaders attended a meeting with the JISC reps (Lisa Gray and Paul Bailey) at the IOE on 8.12.11.
Mary Stiasny also attended for lunch.

Notes from the meeting will be available soon. There was some useful discussion around the Baseline report, with a recognition that the draft report for Jan. 2012 will be incomplete and that we will identify ‘known unknowns’ (Martin’s phrase).

The pilots will also need to baseline the programmes before piloting the Assessment Career framework, and these baselines will be included in the Baseline report at a later stage.
JISC will send us some links to other relevant projects and tools that we might wish to use.

Programme evaluation workshop

A while back, I attended a programme-wide workshop on evaluation for the project with Gwyneth. I meant to post about this at the time but I’ve been swamped, so I wanted to make sure I posted something before it slips my mind entirely.

This meeting provided a chance to think about the project’s evaluation in a little more detail. The slides from the event are available, as is a recording of the session. There was a helpful discussion of which measures we might have access to that could describe impact, and there was a lot of interest in the possibility of linking assessment practice to workload in some way: concerns about efficiency were widespread, as was the hope of raising quality whilst lowering costs. Our plan is to do this using the tools developed in the LDSE project, although we’ll have to see how well this works when we try it.