
Bloomsbury Learning Environment event on assessment

Gwyneth, Holly and Tim presented tools from the project, and Tim gave the first formal presentation on the Moodle reporting system, at:

BLE Seminar: Exploring the use of digital technologies to enhance assessment and feedback, 4th June

While asking students to reflect on feedback is not a new idea, reporting feedback from all modules in a programme did not seem to happen elsewhere. Capturing feedback in this way was well received and could help tutors get an overview of a student’s progress. The current system is for staff only, and it was suggested that students might want access to the reports, but this would require more Moodle development work.

Slides can be viewed here.

 

JISC final Programme Meeting

At this final programme meeting we presented the feedback analysis tool and gave a demonstration of the Moodle reporting plug-in, which is now working. The report enables a tutor to select a student and see the grades and feedback for all modules on the programme in one place (provided that the feedback was submitted in Moodle); see below.

Demo of Moodle Assessment Reporting

There was a great deal of interest in it. The reports mean that tutors can easily view feedback from a previous module and use it to comment on how students are progressing. Knowing that tutors can view past feedback may also encourage students to go back and look at it when preparing a new assignment. Some wanted information and the source code, so we will put instructions on how to implement the plug-in, along with the source code, in the JISC Design Studio. We will also present results of the pilot of the reporting tool, which we are running this term.
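For anyone curious about the mechanics before the Design Studio instructions appear, here is a minimal sketch of the kind of cross-module query such a report runs. It assumes Moodle’s standard gradebook tables (mdl_grade_grades, mdl_grade_items, mdl_course) and placeholder connection details; the actual plug-in is written in PHP inside Moodle, so this Python version is an illustration only, not our source code.

# Minimal sketch, not the actual plug-in: fetch every grade and feedback
# comment for one student across all modules from Moodle's gradebook tables.
import mysql.connector  # assumes the Moodle database runs on MySQL

def assessment_report(student_id):
    """Return (module, assignment, grade, feedback) rows for one student."""
    conn = mysql.connector.connect(
        host="localhost", user="moodle", password="secret", database="moodle"
    )
    query = """
        SELECT c.fullname, gi.itemname, gg.finalgrade, gg.feedback
        FROM   mdl_grade_grades gg
        JOIN   mdl_grade_items  gi ON gi.id = gg.itemid
        JOIN   mdl_course       c  ON c.id  = gi.courseid
        WHERE  gg.userid = %s        -- the student the tutor selected
          AND  gi.itemtype = 'mod'   -- real activities, not course totals
        ORDER  BY c.fullname, gi.itemname
    """
    cursor = conn.cursor()
    cursor.execute(query, (student_id,))
    rows = cursor.fetchall()
    conn.close()
    return rows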

Some people asked whether students, as well as staff, could access the reports; that might be something for the future.

 

CAMEL meeting and presentation at Queen’s Belfast

We had a useful discussion with the other projects in our CAMEL group in Belfast. All have synergy with Assessment Careers.

The Queen’s EAffect project uses electronic feedback in the VLE together with online marking. This saves days of administrative time; handing in electronically is quicker for students, who no longer need to travel; and markers also prefer marking online now that they are used to it. They are also using REAP principles and analysing feedback. The Dundee project will likewise introduce online marking and has analysed feedback.

The project at MMU overlaps with the work of the IOE Assessment Working Group. They have undertaken a review of appeals procedures and a review of the reassessment period to reduce failed resits. They are developing a specification for e-assessment with new codes of practice. E-portfolios are among their top three forms of assessment. They are also developing marking rubrics.

We also explored how to engage with different stakeholders: senior management, students and colleagues. This is an area we could discuss at our next meeting.

I also gave a keynote presenting early results from Assessment Careers at the QUB Assessment and Feedback conference. The feedback tool worked well; there were comments on the praise category, namely that it does not distinguish brief comments from constructive praise. The principles appeared uncontentious to this audience, being a slight variation on the themes of other principles already in use.

Seminar at Glamorgan University

On 20th Feb. 2013 I gave a seminar to a group of staff at Glamorgan University in which I outlined the project and the feedback analysis tool. They tested the tool with some feedback samples and the majority found it easy to use. The issue of how much praise to provide was raised again, and there was a comment on the cultural nature of feedback: it was suggested that giving praise is a manifestation of British politeness, and that is why in the UK we feel praise is so important to soften critique. I would be interested to find out whether there is any research on cultural differences in feedback.

We also discussed the draft principles, which were mostly seen as uncontentious. One comment was that each principle needs to make clear who is to take the action: students, assessors, etc.

Presentation at SRHE 2012 conference

Martin, Holly and I gave a presentation on the project at the SRHE 2012 conference, entitled ‘Assessment Careers: towards a vision of post-modularisation’. The slides are available on the documents page of this blog.
We presented the feedback analysis tool and some results, as well as the student responses so far on how they use feedback and Martin’s pre-pilot on peer feedback. The audience was receptive and offered the following questions and comments:
Q1 Will you be able to have a look at the kind of language that tutors use in their feedback?
Q2 If feedback is so problematic, maybe we should only do feed forward; we’ve switched to only giving feedback on formative work, not summative.
Q3 How do you develop the self-assessment skills of students?

Other presentations worth noting were:

1. Asghar and Hall from York St John University talked about Dialogue Days, where students and staff meet informally outside the classroom to discuss any aspect of their course. They claim that this is more useful than course committees.
2. Dai Hounsell from Edinburgh has done a meta-level literature review on feedback. He was not convinced that the feedback and assessment guidelines publicised across the sector are evidence-informed. Most research on feedback is based on undergraduates, and most feedback is on current content rather than feed forward, which matches our findings.

There were other critiques of feedback that is delivered to students rather than discussed with them, and more evidence that students do not find praise helpful, but these other papers were not saying anything new.

Finally, the venue was very good!

Celtic Manor resort lobby

Analysing Feedback presentation online at JISC conference

I presented our feedback analysis tool at the JISC online pre-conference session today, alongside Dundee presenting their similar but different feedback auditing tool. See http://onlineconf12.jisc.ac.uk/ and https://sas.elluminate.com/site/external/jwsdetect/nativeplayback.jnlp?sid=2009077&psid=2012-11-15.0205.M.429B2124FC94CAC8371A338C2B143D.vcr for a session recording.
There was plenty of discussion about, and interest in, ipsative feedback, the consistency of feedback, how context might influence feedback profiles, and the importance of longitudinal assessment and of encouraging learners to read and act on feedback.
The Dundee feedback audit tool goes into more depth on the quality of feedback, e.g. whether or not examples and explanations are provided and whether or not the feedback encourages learners to become self-regulating. Perhaps we can look at feedback in more depth too?

In keeping with our initial results, the Dundee team identified that much feedback related to the current task and did not advise on future work. They also looked at individual staff feedback profiles, which we avoided as it might be a sensitive issue, but I think there is some mileage in using the tool for individuals’ private self-reflection on practice.
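For anyone who wants to try that kind of private self-reflection, here is a minimal sketch of the tallying step, assuming the feedback comments have already been hand-coded with category labels. The file name and the ‘marker’ and ‘category’ column names are hypothetical, not those of our tool or Dundee’s.

# Minimal illustration, not the actual analysis tool: tally hand-coded
# feedback comments into a per-marker profile from a simple CSV.
import csv
from collections import Counter

def feedback_profile(path, marker):
    """Count coded feedback comments per category for one marker."""
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["marker"] == marker:       # keep one marker's comments
                counts[row["category"]] += 1  # e.g. praise, correction, feed forward
    return counts

# Print a private profile for self-reflection, most frequent category first.
for category, n in feedback_profile("coded_comments.csv", "marker_a").most_common():
    print(category, n)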

JISC Programme Meeting, Birmingham, 17-18th Oct.

Holly and I attended this meeting, where our video (http://youtu.be/VSaGbPoXPh0) was presented, and we also gave a poster presentation, which attracted lots of interest.

Programme and longitudinal assessment were mentioned several times by JISC and the HEA, so the Assessment Careers theme is very topical. Two other projects, from the OU and Dundee, have developed feedback analysis tools, although different from ours, and there was interest in unpacking feedback and being more transparent about its nature and about what is appropriate in different contexts, e.g. does the very last piece of work in a programme need feedback? Would it also be helpful to share feedback with colleagues, perhaps anonymously?

Some other points raised that we might consider were:

How do we know where an assessment career starts and ends in a modularised programme?

What about the increasing use of part-time staff who do not have an overview of the programme? Is it the programme leader’s role to provide this overview?

The term ‘dialogue’ may imply lots of conversations on top of the feedback process, and so may imply work intensification.

More use could be made of peer feedback.

The concept of assessment careers may not be obvious to stakeholders other than lecturers, e.g. QA or admin staff, so do we need a different project slogan?