
Electronic Management of Assessment and Assessment Analytics

In June 2015, LACE collaborated with Jisc and the EUNIS E-Learning Task Force to run a workshop session on “Electronic Management of Assessment and Assessment Analytics” prior to the 2015 EUNIS Congress. Jisc is the charity with a mission to maximise the benefits of technology for the UK Higher Education, Further Education and Skills sectors. The EUNIS Congress is the annual meeting of the European UNiversity Information Systems community, and its agenda spans highly technical infrastructure issues through to educational technology.

The backdrop to the workshop was Jisc’s ongoing programme of work on the Electronic Management of Assessment (EMA), which was explained by Gill Ferrell, EUNIS Board member and Jisc consultant. Jisc has developed a guide to EMA and has run a number of co-design workshops, which are reported on the EMA blog. Assessment and learning analytics are, in many ways, natural bedfellows and, with some justification, assessment practice, and especially psychometrics, can be labelled “learning analytics before we had the title ‘learning analytics’.”

The section of the workshop on assessment analytics, led by LACE’s Adam Cooper, set the scene on learning analytics and made the claim that assessment has been somewhat neglected in the discourse on learning analytics, yet it is arguably an excellent jumping-off point for adopting learning analytics, because both teachers and learners universally expect assessment to be part of a teaching and learning experience. It is also something which clearly benefits all students, and we know that assessment and feedback have a pivotal role in shaping what and how students learn.

The presentation set the scene for a discussion among the more than 60 participants from a diverse range of European universities and colleges. We used the Jisc EMA Assessment Lifecycle as a way of structuring the discussion, to reflect on the diverse range of possibilities for making more use of data, not just data arising from assessment but data in support of the range of processes involved in the whole lifecycle. The lifecycle approach is helpful in drawing attention to possibilities other than the stereotypical case of a teacher using assessment data to show a learner where they are in deficit relative to an idealised graduate in subject X.

Assessment lifecycle: specifying, setting, supporting, submitting, marking and production of feedback, recording marks, returning marks and feedback, reflecting (return to specifying)

Jisc EMA Assessment Lifecycle Model (based on an original from Manchester Metropolitan University, UK)

This approach seemed to work well, given the lively group discussions. The following topics of discussion on promising areas for exploration of assessment analytics were captured (they are based on post-it notes from the session, augmented with verbal comments made during it). The vast majority of attendees were not from academic research units, so they were encouraged to focus on what is realistic as an innovation topic in the practical institutional settings with which they are familiar.

Specifying

  • Analyse how different assessment methods influence results.
  • Better understand curriculum design: statistics on the assessment of learning objectives, the scale of assessment, and confirming or countering claims that “we over-assess” (a sketch of this kind of count follows this list).
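
To make the curriculum-design statistics more concrete, here is a minimal sketch of counting how many distinct assessments address each learning objective in a module, which is one way to put numbers behind a “we over-assess” claim. The table layout and column names (module, assessment_id, learning_objective) are hypothetical, not a description of any particular curriculum or student record system.

```python
import pandas as pd

# Hypothetical extract from a curriculum map: one row per
# (assessment, learning objective) pairing within a module.
curriculum = pd.DataFrame({
    "module":             ["CS101"] * 5 + ["CS102"] * 3,
    "assessment_id":      ["A1", "A1", "A2", "A3", "A4", "B1", "B1", "B2"],
    "learning_objective": ["LO1", "LO2", "LO1", "LO1", "LO3", "LO1", "LO2", "LO2"],
})

# How many distinct assessments touch each learning objective?
coverage = (curriculum
            .groupby(["module", "learning_objective"])["assessment_id"]
            .nunique()
            .rename("n_assessments"))
print(coverage)

# Objectives assessed unusually often feed the "we over-assess" discussion;
# objectives assessed only once (or missing entirely) point to gaps.
print(coverage[coverage > 2])
```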

Setting

  • (no points recorded)

Supporting

  • Can we use behaviour/engagement data to improve support while the assessment is in progress, including to guide ePortfolio activity?
  • What do video-watching statistics show about problem areas? (One way into this is sketched after this list.)
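
Where video lectures form part of the supporting materials, a minimal sketch of the second point might look like the following: find segments of a recording that are replayed much more often than average, on the assumption that heavy re-watching can flag a problem area. The event-log shape and column names are assumptions, not any particular platform’s export format.

```python
import pandas as pd

# Hypothetical viewing log: one row each time a student plays a
# 30-second segment of a lecture recording.
views = pd.DataFrame({
    "student": ["s1", "s1", "s2", "s2", "s2", "s3", "s3", "s1"],
    "segment": [0, 3, 0, 3, 3, 3, 5, 3],   # segment index within the video
})

# Plays per segment per student, then the average number of plays per segment.
plays = views.groupby(["segment", "student"]).size().rename("plays").reset_index()
per_segment = plays.groupby("segment")["plays"].mean()

# Segments replayed well above the video-wide average may indicate
# content that students find difficult.
threshold = per_segment.mean() + per_segment.std()
print(per_segment[per_segment > threshold])
```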

Submitting

  • Relate submission time (relative to the deadline) to results, and explore differences between full-time and part-time students (a sketch of one such analysis follows this list).
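
As a first pass at this, the sketch below joins submission timestamps to marks, computes the hours of slack before the deadline, and compares the correlation with marks for full-time and part-time students separately. The data layout, the deadline value, and the mode_of_study labels are illustrative assumptions.

```python
import pandas as pd

# Hypothetical submission records joined to marks.
df = pd.DataFrame({
    "student":       ["s1", "s2", "s3", "s4", "s5", "s6"],
    "mode_of_study": ["full-time", "full-time", "full-time",
                      "part-time", "part-time", "part-time"],
    "submitted_at":  pd.to_datetime(["2015-05-01 09:00", "2015-05-01 22:00",
                                     "2015-04-30 15:00", "2015-05-01 23:30",
                                     "2015-04-29 10:00", "2015-05-01 20:00"]),
    "mark":          [62, 55, 71, 58, 74, 60],
})
deadline = pd.Timestamp("2015-05-01 23:59")

# Hours of slack between submission and the deadline.
df["hours_before_deadline"] = (deadline - df["submitted_at"]).dt.total_seconds() / 3600

# Does submitting earlier go with higher marks, and does the pattern differ
# between full-time and part-time students?
for mode, group in df.groupby("mode_of_study"):
    corr = group["hours_before_deadline"].corr(group["mark"])
    print(mode, round(corr, 2))
```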

Marking and production of feedback

  • Analysing feedback from teachers to improve what students receive, possibly leading to staff development opportunities and raising awareness of “assessment literacies”.
  • Sentiment analysis of student feedback.
  • Item analysis, e.g. using Classical Test Theory (see the slide-set for a link which introduces CTT; a minimal example is sketched after this list).
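
As a hedged illustration of what the simplest Classical Test Theory item analysis involves, the sketch below computes item difficulty (the proportion of correct answers) and a corrected item-total discrimination index from a matrix of dichotomously scored responses. The response matrix is invented, and a real analysis would of course need far more students and items.

```python
import numpy as np

# Hypothetical scored responses: rows = students, columns = items,
# 1 = correct, 0 = incorrect.
responses = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
    [0, 1, 0, 1],
])

# Item difficulty: proportion of students answering each item correctly.
difficulty = responses.mean(axis=0)

# Item discrimination: correlation between each item and the total score
# on the remaining items (a corrected item-total correlation).
totals = responses.sum(axis=1)
discrimination = np.array([
    np.corrcoef(responses[:, i], totals - responses[:, i])[0, 1]
    for i in range(responses.shape[1])
])

print("difficulty:    ", difficulty.round(2))
print("discrimination:", discrimination.round(2))
```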

Recording marks

  • (no points recorded)

Returning marks and feedback

  • Discovering the relationship between the feedback medium and the effect on student performance (also satisfaction).
  • Simply providing a student’s position in relation to their cohort is not enough: it gives little information to guide improvement, and for some students being close to the median may hide the potential to improve. Assessment analytics therefore needs to go into more detail, and ideally be used within a tutoring process to translate analytical results into action (a small sketch of a more detailed, per-criterion view follows this list).
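
The second point can be made concrete with a small sketch: instead of reporting a single cohort position, compute a student’s percentile within the cohort on each marking criterion, which gives a tutor something specific to act on. The criteria names and marks here are invented.

```python
import pandas as pd

# Hypothetical per-criterion marks for a cohort on one assignment.
marks = pd.DataFrame({
    "student":   ["s1", "s2", "s3", "s4", "s5"],
    "argument":  [65, 55, 70, 60, 58],
    "evidence":  [50, 68, 72, 61, 66],
    "structure": [70, 52, 68, 59, 64],
})
criteria = ["argument", "evidence", "structure"]

# Percentile rank of one student within the cohort, per criterion,
# rather than a single overall position.
student = "s1"
ranks = marks[criteria].rank(pct=True).loc[marks["student"] == student, :]
print(ranks.round(2))

# A tutor can then point to the specific criteria where the student sits
# lowest relative to the cohort, instead of quoting one overall rank.
```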

Reflecting

  • Can we understand student engagement with the feedback stream?
  • Use detailed assessment data for course evaluation, curriculum re-design, etc.
  • Item analysis – improve the reliability of the assessment instruments and markers (one common reliability measure, Cronbach’s alpha, is sketched below).
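
One widely used way to quantify the reliability mentioned in the last point is Cronbach’s alpha; a minimal sketch follows, again using invented marks, and again only as an illustration of the calculation rather than a recommendation about sample sizes or interpretation.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a students-by-items matrix of marks."""
    n_items = scores.shape[1]
    item_variance_sum = scores.var(axis=0, ddof=1).sum()
    total_variance = scores.sum(axis=1).var(ddof=1)
    return (n_items / (n_items - 1)) * (1 - item_variance_sum / total_variance)

# Hypothetical marks for five students on four exam questions.
scores = np.array([
    [8, 7, 9, 6],
    [5, 6, 5, 4],
    [9, 9, 10, 8],
    [4, 5, 3, 5],
    [7, 6, 8, 7],
])
print(round(cronbach_alpha(scores), 2))
```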

Although this was quite a short session, and it is impossible to capture the richness of the discussions or the way in which different perspectives on assessment led to different conceptualisations of assessment analytics, it was clear that there are abundant opportunities to make more use of data to enhance teaching, learning, and assessment as a whole. The moves under way to develop EMA, together with greater adoption of e-assessment, provide a relatively easy pathway to introduce some learning analytics without making a big deal of it.
