CLOSING THE ASSESSMENT GAP: THE SIMMARKS APP FOR EQUITABLE COURSEWORK EVALUATION
DOI:
https://doi.org/10.35631/IJMOE.724054

Keywords:
Coursework Evaluation, Mark Simulation, Application, Lecturer, Teaching and Learning Process

Abstract
Coursework comprises a variety of technical activities and assignments that assess students' affective and psychomotor abilities, which are crucial components of student evaluation. Some coursework relies on the lecturer's subjective judgment, which can lead to inconsistent mark distributions across different student groups. To address this issue, a scoring rubric is designed to assign marks for each domain and criterion requirement. Even so, significant disparities in ratings persist, mainly because assessors neglect to refer to the appropriate domain criteria during evaluations. The SimMarks App was developed to allow the selection of domain criteria before any assessment is conducted, ensuring reliable score simulations and helping identify potential issues. Testing has demonstrated the app's viability, effectiveness, and user-friendliness: it executed commands with over 90% accuracy, target users rated its usability as good, and the average score error margin was just 1.1 marks. The app's success is largely attributed to the adoption of the ADDIE paradigm during its development. As the SimMarks App continues to evolve, it is poised for broad deployment, ensuring more accurate and consistent evaluations.