Automark at London South Bank

So it begins

There have not been many days as refreshing and nerve-racking as this one. Today we were going to test our platform in a real-life tutorial at London South Bank University, where we were expecting 90 students split between a morning and an afternoon session. The brave lecturers Dr Manik Gupta, Kasra Kassai and Souheil Fenghour gave us the opportunity to test our system on their C coding tutorial.

We had the problems set out by Manik and were ready to test. Students managed to log in to the system with ease and got started on the problems. They had the option of writing their code in the online editor, or working offline and then submitting their solutions online.

The problem statements were set by the lecturers. This session contained three C problems: Positive, Negative or Zero; Number Cubes; and Days of the Month. Students were immediately hooked, racing to solve them and repeatedly submitting to see whether they had the right answer, even before their code was complete. They clearly enjoyed using the new environment to test their code and receive instant feedback on their work. It also enabled teachers to monitor their students' submissions in real time and gain insight into their performance. The staff no longer had to wade through tens of emails and mark by hand; instead they instantly received a live table of their students' results. This will save the staff hours of marking time every week, extra time which can be spent with the students to help them with any difficulties they are facing.

We found that students immediately called the teachers over to ask why their submissions were incorrect, and the teachers were then able to provide hints and tips. Even though the environment was new, it made the lesson a lot more engaging: the students got used to it within 15 minutes and without any prior training.

Submissions in Numbers

To gain more insight into student submissions, let's take a look at some statistics. For this particular exercise there were 3 problems and 63 students enrolled on the course. Surprisingly, there were a total of 4,000+ submissions on the system! This means that on average each student made 23 submissions on each problem (though, as we will see below, the distribution is far from even).

The two workshops ran from 9–11am and 2–4pm, which is why we see two peaks in the chart below corresponding to these sessions. However, we can also see that some students kept making attempts throughout the day: submissions continued to come through until 9pm on Friday, with a few more over the weekend.

[Chart: submissions over time, showing the two session peaks]

These are the 63 data points for student submissions. We can see that the submissions are uneven: some students made far more attempts than others. This could be because they were uncertain, or in some cases did not understand something in the problem statement. Something to delve deeper into is why some students made as many as 130+ submissions.

We can also see that most students made attempts on the first two problems, but very few attempted the third. This could be because they ran out of time during the session and did not continue outside their allocated lab times.

[Chart: submissions per student, per problem]

Around 70% of the submissions on the system were compilable code (a wrong or accepted answer). This agrees with our observation that the majority of students preferred to write and test their code in their own development environments and then copy and paste it for online submission; the versatility of the platform allows for this. Runtime errors were usually due to errors in input/output reading.

[Chart: breakdown of submission verdicts]

Conclusion

Even though it was the first time the students had used the system, they navigated it with ease and used it without any great difficulty. All in all, the day was a success for testing the platform and our assumptions. Dr Manik Gupta was instrumental in allowing us to introduce the system to LSBU and would love for us to return next semester to help with their Python coding module. “I am keen to try the platform again next term with the students’ Python assignments,” said Dr Gupta. “I think the platform is perfect as it stands; with just a few more features, I can see it adding great value to our modules,” said Kasra, a PhD researcher and TA for the module.

It is visionaries like Dr Gupta, Kasra and Souheil who enable forward thinking and progress in our education sector. Without them, it would not be possible for startups like Automark to succeed and deliver on their vision. Together we will work to bridge the gap in digital skills, and bring the appropriate tech skills to the market sooner.

