University student performance in multiple choice questions : an item analysis of Mathematics assessments

dc.contributor.advisor Van Staden, Surette
dc.contributor.coadvisor Harding, Ansie
dc.contributor.postgraduate Brits, Gideon Petrus
dc.date.accessioned 2018-07-13T06:44:35Z
dc.date.available 2018-07-13T06:44:35Z
dc.date.created 2018-05-03
dc.date.issued 2017
dc.description Dissertation (MEd)--University of Pretoria, 2017.
dc.description.abstract The University of Pretoria has experienced a significant increase in student numbers in recent years. This increase has inevitably affected the Department of Mathematics and Applied Mathematics. The department is understaffed in terms of lecturing staff, which impacts negatively on postgraduate study and research outputs. The disproportion between the teaching staff on the one hand and the lecturing load and research demands on the other has led to an excessive grading and administrative load on staff. The department therefore decided to use multiple choice questions in assessments that can be graded by means of computer software. The responses to the multiple choice questions are captured on optical reader forms that are processed centrally. Multiple choice questions are combined with constructed response questions (written questions) in semester tests and end-of-term examinations. The quality of these multiple choice questions has, however, never been determined. This research project addresses the research question: How do the multiple choice questions in mathematics, as posed to first-year engineering students at the University of Pretoria, comply with the principles of good assessment for determining quality? A quantitative secondary analysis was performed on data sourced from the first-year engineering calculus module WTW 158 for the years 2015, 2016 and 2017. The study shows that, in most cases, the questions are commendable, with well-balanced indices of discrimination and difficulty as well as well-chosen functional distractors. The item analysis also included determining the cognitive level of each multiple choice question. Problematic questions are highlighted and recommendations are made to improve or revise such questions for future use.
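For readers unfamiliar with classical item analysis, the sketch below illustrates how item difficulty (the proportion of students answering an item correctly) and an upper-lower discrimination index are commonly computed from 0/1-scored multiple choice responses. This is an illustrative Python example based on those conventional definitions, not code or data from the dissertation; the function name item_analysis, the 27% upper/lower grouping and the sample scores are assumptions made here for the example.

import numpy as np

# Illustrative sketch of classical item-analysis indices; not code from the dissertation.
def item_analysis(responses, top_frac=0.27):
    """responses: students x items array of 0/1 multiple choice scores."""
    responses = np.asarray(responses, dtype=float)
    n_students, _ = responses.shape
    totals = responses.sum(axis=1)                    # total score per student
    order = np.argsort(totals)                        # students ranked by total score
    k = max(1, int(round(top_frac * n_students)))     # size of the upper and lower groups
    lower, upper = order[:k], order[-k:]
    difficulty = responses.mean(axis=0)               # p: proportion answering each item correctly
    discrimination = (responses[upper].mean(axis=0)
                      - responses[lower].mean(axis=0))  # D: upper-group p minus lower-group p
    return difficulty, discrimination

# Example with 5 students and 3 items (1 = correct, 0 = incorrect)
scores = [[1, 0, 1],
          [1, 1, 0],
          [0, 0, 1],
          [1, 1, 1],
          [0, 0, 0]]
p, d = item_analysis(scores)
print("difficulty:", p)        # proportion of students answering each item correctly
print("discrimination:", d)    # larger D means the item better separates strong and weak students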
dc.description.availability Unrestricted
dc.description.degree MEd
dc.description.department Science, Mathematics and Technology Education
dc.identifier.citation Brits, GP 2017, University student performance in multiple choice questions : an item analysis of Mathematics assessments, MEd Dissertation, University of Pretoria, Pretoria, viewed yymmdd <http://hdl.handle.net/2263/65477>
dc.identifier.other A2018
dc.identifier.uri http://hdl.handle.net/2263/65477
dc.language.iso en
dc.publisher University of Pretoria
dc.rights © 2018 University of Pretoria. All rights reserved. The copyright in this work vests in the University of Pretoria. No part of this work may be reproduced or transmitted in any form or by any means, without the prior written permission of the University of Pretoria.
dc.subject UCTD
dc.title University student performance in multiple choice questions : an item analysis of Mathematics assessments
dc.type Dissertation

