dc.contributor.author | Long, Caroline | |
dc.contributor.author | Bansilal, Sarah | |
dc.contributor.author | Debba, Rajan | |
dc.date.accessioned | 2014-10-13T06:38:23Z | |
dc.date.available | 2014-10-13T06:38:23Z | |
dc.date.issued | 2014-08-26 | |
dc.description.abstract | Mathematical Literacy (ML) is a relatively new school subject that learners study in the final three years of high school and that is examined as a matric subject. An investigation of a 2009 provincial examination written by matric pupils was conducted on both the curriculum elements of the test and learner performance. In this study we supplement the prior qualitative investigation with an application of Rasch measurement theory to review and revise the scoring procedures so that they better reflect scoring intentions. In an application of the Rasch model, checks are made on the test as a whole, the items and the learner responses to ensure coherence of the instrument for the particular reference group, in this case Mathematical Literacy learners in one high school. In this article we focus on the scoring of polytomous items, that is, items that are scored 0, 1, 2 … m. In some instances we found indiscriminate mark allocations, which contravened assessment and measurement principles. Through investigation of each item, the associated scoring logic and the output of the Rasch analysis, rescoring was explored. We report here on the analysis of the test prior to rescoring, the analysis and rescoring of individual items, and the post-rescore analysis. The purpose of the article is to address the question: How may detailed attention to the scoring of the items in a Mathematical Literacy test, through theoretical investigation and the application of the Rasch model, contribute to a more informative and coherent outcome? | en_US |
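As an illustrative sketch (not part of the record itself): polytomous items scored 0, 1, 2 … m are commonly analysed with the partial credit formulation of the Rasch model; whether the authors used this exact parameterisation is an assumption here, based only on the abstract's description. For person n with ability β_n and item i with threshold parameters δ_{i1}, …, δ_{im}, the probability of a score of x can be written as

\[
P(X_{ni} = x) \;=\;
\frac{\exp\!\bigl(x\beta_n - \sum_{k=1}^{x} \delta_{ik}\bigr)}
     {\sum_{j=0}^{m} \exp\!\bigl(j\beta_n - \sum_{k=1}^{j} \delta_{ik}\bigr)},
\qquad x \in \{0, 1, \dots, m\},
\]

where an empty sum (the case x = 0 or j = 0) is taken to be zero. Under this kind of model, indiscriminate mark allocations typically surface as disordered thresholds δ_{ik}, which is one reason an analysis of this type can motivate the rescoring (for example, collapsing score categories) described in the abstract.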
dc.description.librarian | am2014 | en_US |
dc.description.uri | http://www.pythagoras.org.za | en_US |
dc.identifier.citation | Long, C., Bansilal, S., & Debba, R. (2014). An investigation of Mathematical Literacy assessment supported by an application of Rasch measurement. Pythagoras, 35(1), Art. #235, 17 pages. http://dx.doi.org/10.4102/pythagoras.v35i1.235 | en_US |
dc.identifier.issn | 1012-2346 | |
dc.identifier.other | 10.4102/pythagoras.v35i1.235 | |
dc.identifier.uri | http://hdl.handle.net/2263/42351 | |
dc.language.iso | en | en_US |
dc.publisher | AOSIS Open Journals | en_US |
dc.rights | © 2014. The Authors. Licensee: AOSIS OpenJournals. This work is licensed under the Creative Commons Attribution License. | en_US |
dc.subject | Investigation | en_US |
dc.subject | Rasch measurement | en_US |
dc.subject | Learners | en_US |
dc.subject | Mathematical Literacy (ML) | en_US |
dc.subject | School subject | en_US |
dc.title | An investigation of mathematical literacy assessment supported by an application of Rasch measurement | en_US |
dc.type | Article | en_US |