The Role of Retained Austenite on the Performance of High Chromium White Cast Iron and Carbidic Austempered Nodular Iron for Grinding Ball Applications

dc.contributor.advisor Siyasiya, Charles Witness
dc.contributor.coadvisor Mostert, R.J. (Roelf)
dc.contributor.postgraduate Moema, Shumane Joseph
dc.date.accessioned 2019-08-12T11:18:48Z
dc.date.available 2019-08-12T11:18:48Z
dc.date.created 19/04/11
dc.date.issued 2018
dc.description Dissertation (MSc)--University of Pretoria, 2018.
dc.description.abstract The role of retained austenite in the performance of cast iron based grinding media (balls) is somewhat controversial. One school of thought is that retained austenite improves performance through work hardening and transformation to martensite. Others argue that the same phenomenon compromises the performance of the balls through spalling. In this study, high chromium white cast iron (HCWCI) and carbidic austempered nodular iron (CANI) grinding balls were subjected to different heat treatments to yield various amounts of retained austenite, after which they were subjected to abrasive wear testing. The tests comprised high stress abrasion (pin-on-belt abrasion test), low stress abrasion (rubber wheel abrasion test according to ASTM G65), combined abrasion-impact conditions (ball mill test) and impact tests (drop test). As expected, it was found that the percentage retained austenite increases with increasing austenitising temperature due to the dissolution of carbides. The surface hardness, in turn, decreases as the destabilisation temperature and the amount of retained austenite increase. The retained austenite content of the alloy before the high stress abrasion wear test (pin-on-belt test, POB) was found to be significantly higher than that after testing. The decrease in retained austenite was due to the transformation of retained austenite (γ) to strain-induced martensite (α′) that occurs during the high stress abrasion wear test. An austenitising temperature of 1000 °C resulted in low percentage mass loss values, i.e. a lower wear rate, in the ball mill test. The best high stress abrasive wear resistance for HCWCI was achieved when the retained austenite content was reduced to 22.6%, which gave the optimum combination of properties. It was found that the high stress abrasion resistance decreased abruptly once the retained austenite content fell below 10%. The CANI alloy austempered at 275 °C (with graphite spheroids and carbides in an ausferrite matrix and 17.4% retained austenite) performed better under both low stress abrasion and wet abrasive wear (ball mill) conditions.
dc.description.availability Unrestricted
dc.description.degree MSc
dc.description.department Materials Science and Metallurgical Engineering
dc.identifier.citation Moema, SJ 2018, The Role of Retained Austenite on the Performance of High Chromium White Cast Iron and Carbidic Austempered Nodular Iron for Grinding Ball Applications, MSc Dissertation, University of Pretoria, Pretoria, viewed yymmdd <http://hdl.handle.net/2263/71010>
dc.identifier.other A2019
dc.identifier.uri http://hdl.handle.net/2263/71010
dc.language.iso en
dc.publisher University of Pretoria
dc.rights © 2019 University of Pretoria. All rights reserved. The copyright in this work vests in the University of Pretoria. No part of this work may be reproduced or transmitted in any form or by any means, without the prior written permission of the University of Pretoria.
dc.subject UCTD
dc.title The Role of Retained Austenite on the Performance of High Chromium White Cast Iron and Carbidic Austempered Nodular Iron for Grinding Ball Applications
dc.type Dissertation
