
dc.contributor.author: Andreassen, Helene N.
dc.contributor.author: Låg, Torstein
dc.contributor.author: Stenersen, Mark
dc.date.accessioned: 2016-03-11T09:41:03Z
dc.date.available: 2016-03-11T09:41:03Z
dc.date.issued: 2015-10-19
dc.description.abstract: Although online resources can complement and support face-to-face teaching, they run the risk of creating a distance between the student and the teacher. However, this can be counteracted through careful design of structure and content. In particular, interactive tasks can provide the student with much-needed feedback while at the same time giving the teacher access to both state and event data from student replies with which to evaluate and improve the course content. This winter, the University Library at UiT launched a MOOC on information literacy, iKomp (www.edx.bibsys.no, with an estimated launch of the English version in May this year). Built on the open-source platform Open edX, iKomp consists of four modules: learning strategies, source evaluation, information searching, and academic integrity. As with most MOOCs, the content is a mix of text, videos, learning activities, and tests. The final exam consists of a 40-question multiple-choice test. The lack of direct teacher-student interaction makes the assessment of learning outcomes from MOOCs and other web-based courses a challenge. While it is important that the selected assessment method tests the students' understanding of the course content, we consider it equally important that the exam instigates learning. Each question in the multiple-choice test closing iKomp has four alternative answers, and each distractor, or wrong answer, is formulated as a plausible answer, thus encouraging thoughtful deliberation in the student. An advantage of using event-recording technology in teaching is the possibility of gaining insight into student learning through usage data. In this paper, we present the results from a deep log analysis of the exam results from a period of six months. The study has a twofold objective. First, by examining the students' performance, we aim to evaluate the exam content: specifically, by analysing response patterns, we can assess whether some of our alternative answers are confusing or whether the questions are easily misunderstood. Second, by examining the type and rate of errors, we aim to determine the areas where students need more input. By filling these gaps, we respond to the students' needs and thereby improve the overall value of the course. The analysis of the exam answer distribution log reveals that several questions are too easy, as the majority of students succeed on the first attempt. Other questions show a more even distribution, with more than one answer alternative being selected quite frequently. This paper presents patterns, or the lack thereof, between answer distributions and course content. Specifically, do certain content areas stand out in terms of error rates? We discuss whether a revision of the exam or the course content is called for, and whether some areas may be harder to teach online than others. We consider this type of analysis to have at least three possible benefits: (i) improving our information literacy courses, (ii) refining our understanding of student learning, and (iii) increasing student-teacher interaction online. (An illustrative sketch of such an answer-distribution analysis follows the record below.) [en_US]
dc.identifier.citation: European Conference on Information Literacy (ECIL), 19-22 October 2015 [en_US]
dc.identifier.cristinID: FRIDAID 1314327
dc.identifier.uri: https://hdl.handle.net/10037/8879
dc.identifier.urn: URN:NBN:no-uit_munin_8435
dc.language.iso: eng [en_US]
dc.publisher: Tallinn University, Estonia [en_US]
dc.rights.accessRights: openAccess
dc.subject: VDP::Samfunnsvitenskap: 200::Biblioteks- og informasjonsvitenskap: 320 [en_US]
dc.subject: VDP::Social science: 200::Library and information science: 320 [en_US]
dc.title: The long and winding road: Insights from student misconceptions [en_US]
dc.type: Conference object [en_US]
dc.type: Konferansebidrag [en_US]
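The deep log analysis described in the abstract lends itself to a small illustration. The following is a minimal, hypothetical Python sketch, not the authors' actual pipeline: the event-log format (dicts with student, question, attempt, chosen and correct fields), the function name summarise_exam_log, and the 0.9 "too easy" threshold are all assumptions made here for illustration. It shows how first-attempt success rates and per-alternative answer distributions could be computed from submission events to flag questions that look too easy or distractors that attract many students.

# Hypothetical sketch of an answer-distribution analysis over exam submission events.
# Field names and threshold are illustrative assumptions, not the study's actual pipeline.
from collections import Counter, defaultdict

def summarise_exam_log(events):
    """events: iterable of dicts with keys 'student', 'question',
    'attempt', 'chosen', 'correct' (hypothetical log format)."""
    first_attempt_outcomes = defaultdict(list)   # question -> [True/False per student]
    answer_counts = defaultdict(Counter)         # question -> Counter over chosen alternatives

    for e in events:
        answer_counts[e["question"]][e["chosen"]] += 1
        if e["attempt"] == 1:
            first_attempt_outcomes[e["question"]].append(e["correct"])

    summary = {}
    for question, outcomes in first_attempt_outcomes.items():
        rate = sum(outcomes) / len(outcomes) if outcomes else 0.0
        counts = answer_counts[question]
        total_answers = sum(counts.values())
        summary[question] = {
            # Share of students who answered correctly on their first attempt.
            "first_attempt_success": rate,
            # Questions most students get right immediately are revision candidates.
            "too_easy": rate >= 0.9,
            # How often each alternative (correct answer or distractor) was chosen.
            "answer_distribution": {opt: n / total_answers for opt, n in counts.items()},
        }
    return summary

if __name__ == "__main__":
    demo = [
        {"student": "s1", "question": "Q1", "attempt": 1, "chosen": "A", "correct": True},
        {"student": "s2", "question": "Q1", "attempt": 1, "chosen": "C", "correct": False},
        {"student": "s2", "question": "Q1", "attempt": 2, "chosen": "A", "correct": True},
    ]
    print(summarise_exam_log(demo))

A question whose first-attempt success rate approaches 1.0 would be a candidate for revision, while a distractor chosen by a large share of students would point to confusing wording or a content area needing more input; the cut-off used here is arbitrary and only meant to illustrate the idea.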

