Automatic Assessment of Computer Programs in eLearning Systems

Dangel, Ulrich, Clarke, David, Dichev, Kiril, Rychkov, Vladimir, Lobb, Richard, Murphy, John and Lastovetsky, Alexey (2014) Automatic Assessment of Computer Programs in eLearning Systems. In: The 15th Educational Technology Conference of the Irish Learning Technology Association (ILTA), May 29–30, 2014, UCD, Dublin, Ireland.

Abstract

eLearning systems provide online quiz tools that can be used for assessment, giving instantaneous feedback to both students and lecturers. Moodle, a popular open-source eLearning platform, supports quizzes with various question types, such as multiple choice, calculated answer, and essay. For Computer Science courses, which often involve programming, the functionality of traditional quizzes is insufficient, as programming exercises cannot be mapped onto these existing question types. Automated execution and verification of code is already established in the context of computer programming contests, and integrating these existing techniques into eLearning systems provides the required functionality for Computer Science courses.

In this work, we present CodeRunner, an extension to Moodle that allows students to submit code as the solution to an assigned question. The code is compiled and executed, and its output is compared to the solution provided by the teacher. Feedback is returned to the student, who may then modify the solution and resubmit. This immediate feedback, together with the student's desire to earn a green checkmark, encourages students to learn through an iterative process until they reach a correct solution. The automated system lifts the burden of correcting student submissions from course demonstrators and provides uniformity and equality in the final grade.

CodeRunner was originally developed and introduced at the University of Canterbury, New Zealand. The main limitation of the original plugin was security: executing untrusted code, such as student submissions, on a production server that also hosts websites, including the eLearning platform itself, is a major security vulnerability. Fortunately, this vulnerability can be closed by isolating the code execution with the help of virtualization and cloud computing. We solved the problem by separating the Moodle quiz plugin from the code execution.
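The compile-execute-compare cycle described above can be sketched as follows. This is a minimal illustration in Python, not CodeRunner's actual implementation; the function name, timeout, and whitespace-trimmed comparison are all assumptions for the example.

```python
import os
import subprocess
import tempfile

def grade_submission(source: str, test_input: str, expected_output: str,
                     timeout_s: int = 5) -> bool:
    """Run a student's Python submission on a test input and compare
    its stdout to the teacher's expected output (whitespace-trimmed).
    Hypothetical sketch of a CodeRunner-style grading step."""
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(source)
        path = f.name
    try:
        result = subprocess.run(
            ["python3", path],
            input=test_input,
            capture_output=True, text=True, timeout=timeout_s)
        return (result.returncode == 0
                and result.stdout.strip() == expected_output.strip())
    except subprocess.TimeoutExpired:
        # A hung submission counts as incorrect rather than blocking the grader.
        return False
    finally:
        os.unlink(path)

# Example question: read an integer and print its square.
student_code = "n = int(input())\nprint(n * n)\n"
print(grade_submission(student_code, "7", "49"))  # prints True
```

A real grader would run multiple test cases per question and report which ones failed, giving the student the feedback loop described above.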
Student source code is sent from the production Moodle server to a remote virtual server for execution, and the output is then returned for grading. We introduced automatic code assessment at the UCD School of Computer Science and Informatics in 2014, for approximately 500 students across 4 courses. CodeRunner supports several programming languages, including C, Python, and Java; we extended this list with Bash shell and Scheme.

We conducted a survey of students, teachers and demonstrators on the automated code assessment. We found that students liked interacting with the system and were motivated to work at a problem until they got it right. Demonstrators and teachers liked the system and found it gave them more time to teach coding techniques. Setting the questions takes more time than for a conventional lab or assignment; however, for medium to large classes, the time saved in correcting outweighs this cost. In general, the system is unsuitable for teaching software engineering that involves multiple files, complex structures and tools, and it cannot automate the correcting of individual project-based assignments. However, it is ideal for introductory programming courses and simple problem-solving exercises.
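The separation described above — the Moodle server shipping student code to a remote execution server and receiving the output back — can be sketched as a small HTTP round trip. Everything here is a hypothetical illustration: the endpoint, the JSON field names, and the in-process sandbox are assumptions, not the paper's actual protocol, and a real sandbox would add OS-level isolation rather than a bare subprocess.

```python
import json
import subprocess
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib import request

class SandboxHandler(BaseHTTPRequestHandler):
    """Hypothetical remote execution endpoint: receives
    {"code": ..., "stdin": ...}, runs the code, returns its output."""
    def do_POST(self):
        length = int(self.headers["Content-Length"])
        job = json.loads(self.rfile.read(length))
        try:
            proc = subprocess.run(["python3", "-c", job["code"]],
                                  input=job.get("stdin", ""),
                                  capture_output=True, text=True, timeout=5)
            body = {"stdout": proc.stdout, "returncode": proc.returncode}
        except subprocess.TimeoutExpired:
            body = {"stdout": "", "returncode": -1}
        payload = json.dumps(body).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):
        pass  # silence per-request logging in this demo

def submit_to_sandbox(url: str, code: str, stdin: str = "") -> dict:
    """Client side, as run on the Moodle server: ship the student's
    code to the remote sandbox and return the execution result."""
    data = json.dumps({"code": code, "stdin": stdin}).encode()
    req = request.Request(url, data=data,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return json.loads(resp.read())

# Demo: run the sandbox locally and grade one submission through it.
server = HTTPServer(("127.0.0.1", 0), SandboxHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}"
result = submit_to_sandbox(url, "print(int(input()) * 2)", "21")
print(result["stdout"].strip() == "42")  # prints True
server.shutdown()
```

The key design point is that the web-facing Moodle host never executes untrusted code itself; a compromised submission can only damage the disposable virtual server on the other end of the request.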
