At this final programme meeting we presented the feedback analysis tool and gave a demonstration of the Moodle reporting plug-in, which is now working. The report enables a tutor to select a student and see the grades and feedback for all modules on the programme in one place (provided that the feedback was submitted in Moodle); see below.
Demo of Moodle Assessment Reporting
There was a great deal of interest in it. The reports mean that tutors can easily view feedback from a previous module and use it to comment on how students are progressing. Knowing that tutors can view past feedback may also encourage students to go back and look at it when preparing a new assignment. Some attendees wanted information and source code, so we will put instructions on how to implement the plug-in, together with the source code, in the JISC Design Studio. We will also present the results of the pilot of the reporting tool, which we are running this term.
Some people asked whether students, as well as staff, could access the tool; that might be something for the future.
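At its core, the cross-module report aggregates one student's grades and feedback across every module on the programme. As a rough illustration only (not the plug-in's actual code, which runs against Moodle's own grade tables), here is a minimal sketch in Python using an in-memory SQLite database with a made-up simplified schema:

```python
import sqlite3

# Hypothetical simplified schema; the real plug-in queries Moodle's grade tables.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE assessment (
    student TEXT, module TEXT, assignment TEXT, grade TEXT, feedback TEXT)""")
rows = [
    ("S001", "Research Methods", "Essay 1", "B+",
     "Clear argument; expand the literature review."),
    ("S001", "Curriculum Studies", "Report", "A-",
     "Well structured. Acts on earlier advice about referencing."),
    ("S002", "Research Methods", "Essay 1", "B",
     "Good start; tighten the conclusion."),
]
conn.executemany("INSERT INTO assessment VALUES (?, ?, ?, ?, ?)", rows)

def programme_report(student):
    """All grades and feedback for one student, across every module."""
    cur = conn.execute(
        "SELECT module, assignment, grade, feedback FROM assessment "
        "WHERE student = ? ORDER BY module", (student,))
    return cur.fetchall()

for module, assignment, grade, feedback in programme_report("S001"):
    print(f"{module} / {assignment}: {grade} - {feedback}")
```

The tutor-facing report is essentially this query rendered as a page, one row per assignment, grouped by module.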
We had a useful discussion with the other projects in our CAMEL group in Belfast. All have synergy with Assessment Careers.
The Queen's e-AFFECT project uses electronic feedback in the VLE and online marking. This saves days of administrative time; handing in electronically is quicker for students, who no longer need to travel, and markers also prefer marking online now that they are used to it. They are also using REAP principles and analysing feedback. The Dundee project will likewise introduce online marking and has analysed feedback.
The project at MMU overlaps with the work of the IOE Assessment Working Group. They have undertaken a review of appeals procedures and a review of the reassessment period to reduce failed resits. They are developing a specification for e-assessment with new codes of practice. E-portfolios are among the top three forms of assessment there. They are also developing marking rubrics.
We also explored how to engage with different stakeholders (senior management, students and colleagues); this is an area we could discuss at our next meeting.
I also gave a keynote presenting early results from Assessment Careers at the QUB Assessment and Feedback conference. The feedback tool worked well; one comment on the praise category was that it does not distinguish brief comments from constructive praise. The principles appeared to be uncontentious to this audience, being something of a variation on the themes of other principles already in use.
On 20th Feb. 2013 I gave a seminar to a group of staff at Glamorgan University in which I outlined the project and the feedback analysis tool. They tested the tool with some feedback samples and the majority found it easy to use. The issue of how much praise to provide was raised again, and there was a comment on the cultural nature of feedback: the suggestion was that giving praise is a manifestation of British politeness, which is why in the UK we feel praise is so important to soften critique. I would be interested to find out whether there is any research on cultural differences in feedback.
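The recurring comment about the praise category points to a refinement we could try: separating brief praise ("Well done!") from constructive praise that explains what was good. A purely illustrative heuristic (the cue words and length threshold below are my assumptions, not rules from the feedback analysis tool) might look like this:

```python
# Illustrative heuristic only: these cue words and the length threshold are
# assumptions, not the actual rules of the feedback analysis tool.
EXPLANATORY_CUES = ("because", "since", "in particular", "especially", "the way")

def classify_praise(comment: str) -> str:
    """Label a praise comment as brief or constructive."""
    text = comment.lower()
    if len(text.split()) > 6 or any(cue in text for cue in EXPLANATORY_CUES):
        return "constructive praise"
    return "brief praise"

print(classify_praise("Well done!"))  # → brief praise
print(classify_praise("Good work, especially the way you framed the question."))  # → constructive praise
```

Even a crude rule like this would let the tool report the two kinds of praise separately, which is what the audiences were asking for.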
We also discussed the draft principles, which were mostly seen as uncontentious. One comment was that each principle needs to make clear who is to take the action (students, assessors, etc.).
There is a brief mention of the IOE in the JISC on Air broadcast: Driving change in assessment and feedback. The main focus is technological innovation but some of the principles have resonance with Assessment Careers.
Martin, Holly and I gave a presentation at the SRHE 2012 conference on the project entitled: Assessment Careers: towards a vision of post-modularisation. The slides are available in the documents page of this blog.
We presented the feedback analysis tool and some results, as well as the student responses so far on how they use feedback and Martin’s pre-pilot on peer feedback. The audience was receptive and offered the following questions and comments:
Q1 Will you be able to have a look at the kind of language that tutors use in their feedback?
Q2 If feedback is so problematic, maybe we should only do feed-forward; we’ve switched to only giving feedback on formative work, not summative.
Q3 How do you develop the self-assessment skills of students?
Other presentations worth noting were:
1. Asghar and Hall from York St John University talking about Dialogue Days, where students and staff meet informally to discuss any aspect of their course outside the classroom. They claim that this is more useful than course committees.
2. Dai Hounsell from Edinburgh has done a meta-level literature review on feedback. He was not convinced that the feedback and assessment guidelines publicised across the sector are evidence-informed. Most research on feedback is based on undergraduates. Most feedback is on current content and is not feed-forward, which matches our findings.
There were other critiques of feedback that is delivered to students rather than discussed with them, and more evidence that students do not find praise helpful, but these other papers were not saying anything new.
Finally, the venue was very good!
Celtic Manor resort lobby
I presented our feedback analysis tool at the JISC online pre-conference session today alongside Dundee presenting their similar but different feedback auditing tool. See http://onlineconf12.jisc.ac.uk/ and https://sas.elluminate.com/site/external/jwsdetect/nativeplayback.jnlp?sid=2009077&psid=2012-11-15.0205.M.429B2124FC94CAC8371A338C2B143D.vcr for a session recording.
There was plenty of discussion about and interest in: ipsative feedback, the issue of consistency of feedback, how the context might influence feedback profiles as well as the importance of longitudinal assessment and encouraging learners to read and act on feedback.
The Dundee feedback audit tool goes into more depth on the quality of feedback, e.g. whether or not examples and explanations are provided and whether or not the feedback encourages learners to become self-regulating. Perhaps we can look at feedback in more depth too?
In keeping with our initial results, the Dundee team identified that much feedback related to the current task and did not advise on future work. They also looked at individual staff feedback profiles, which we avoided as it might be a sensitive issue, but I think there is some mileage in using the tool for individuals’ private self-reflection on practice.
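The current-task versus future-work distinction that both teams found could, in principle, be profiled automatically. A hedged sketch, assuming a simple keyword heuristic (the cue phrases are illustrative and are not the categories either tool actually uses):

```python
# Hedged sketch: the cue phrases are illustrative assumptions, not the
# categories that either the Dundee or the IOE tool actually uses.
FUTURE_CUES = ("next time", "in future", "future work", "next assignment")

def feedback_profile(comments):
    """Count comments addressing the current task vs advising on future work."""
    counts = {"current": 0, "future": 0}
    for comment in comments:
        key = "future" if any(cue in comment.lower() for cue in FUTURE_CUES) else "current"
        counts[key] += 1
    return counts

sample = [
    "The argument in section 2 is unclear.",
    "Next time, plan the structure before you start writing.",
    "Good use of sources here.",
]
print(feedback_profile(sample))  # → {'current': 2, 'future': 1}
```

Run privately over an individual’s own comments, a profile like this could support the kind of self-reflection on practice suggested above without raising the sensitivities of comparing staff.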
Holly and I attended this meeting, where our video (http://youtu.be/VSaGbPoXPh0) was presented; we also gave a poster presentation, which attracted a lot of interest.
Programme and longitudinal assessment were mentioned several times by JISC and the HEA, so the Assessment Careers theme is very topical. Two other projects, from the OU and Dundee, have developed feedback analysis tools, although different from ours, and there was interest in unpacking feedback and being more transparent about the nature of feedback and what is appropriate in different contexts. For example, does the very last piece of work for a programme need to have feedback? And would it be helpful to share feedback with colleagues, perhaps anonymously?
Some other points raised that we might consider were:
How do we know where the start and end of an assessment career is in a modularised programme?
What about increasing use of part-time staff who do not have an overview of the programme? Is it the programme leader’s role to provide this overview?
The term ‘dialogue’ may imply lots of conversations on top of the feedback process, and so suggests work intensification.
More use could be made of peer feedback.
The concept of assessment careers may not be obvious to stakeholders other than lecturers, e.g. QA or admin staff, so do we need a different project slogan?
This was the final conference for an NTFS-funded project on programme-focused assessment, or PFA. This is already practised in the US, e.g. at Alverno College, and there are examples from the project in the UK too.
This means assessment of programme learning outcomes rather than module learning outcomes, and integrative assessment drawing on more than one module.
The PFA approach complements our Assessment Careers approach in that it encourages the linking of modules and cumulative work, rather than the fragmentation that is often found under modularisation.
Other benefits include the possibility of synoptic assessments that draw on skills from the whole programme. The EdD thesis would be a good example of this.
We discussed a few issues about implementation of PFA including the possible need for flexible regulations and better working in programme teams.
I think that combining PFA and the Assessment Careers approach to feedback would work well and would enhance assessment on a large scale.
I presented a paper on ipsative assessment at the HEA event “Self-assessment: strategies and software to stimulate learning” held at the Open University.
The concept of ipsative assessment attracted much interest but was out of step with the rest of the day, which focussed on online or computer-based self-testing. I would query whether self-testing is self-assessment, since it is the academic who designed the test who is doing the assessing, via computer software. There was, however, discussion of certainty-based marking, which does require the student to self-assess how confident they are about an answer.
Software from New Zealand, “PeerWise”, was mentioned as being innovative in that it allows learners to create their own questions and comment on each other’s questions as well as answer them. Again, this was for multiple-choice questions, but the idea could be extended to longer-answer questions. I liked the idea of a community of learners discussing assessment questions.
The replay of the event is now available at: http://stadium.open.ac.uk/stadia/preview.php?whichevent=1955&s=1.
Copies of the presentations are available on the HEA’s website at: http://www.heacademy.ac.uk/events/detail/2012/seminars/themes/tw037_ou
I recently attended a one-day event organised by the TeAL project at Middlesex University. This provided an interesting and frank account of the challenges as well as the successes they faced. It was particularly good to hear from senior leaders and teachers, as well as the project team. There were many issues that are relevant to this project’s work, and that of the Assessment Working Group. I brought back and shared some of the documents from the event – particularly the flow chart talking through the change process, and the table that maps things like values, roles and processes against stages and questions.