Randomized Assessments from Question Bank

11:03 AM on July 14, 2022 | Computer Science, Programming

As computer science enrollments soar, instructors are being asked to scale their courses and move them online. Moving courses online has a number of benefits; however, it also raises concerns about academic integrity, particularly around exams.

Benefits of Computer-based Exams

In computer science, one of the main benefits of a computer-based exam is that students can write code in an actual development environment, with access to a compiler and the ability to run and test their code. For many of the topics we cover, the level of mastery we want students to reach is applying a concept in code, so it makes sense that their assessments ask them to write code.

Additionally, with exams on a computer, students do not need to be physically present in a given place at a given time; they can take the exam at their convenience during a pre-set window. This is particularly helpful for online courses where students may not all be in the same time zone. Within that window, instructors can set a time limit on the exam itself to ensure all students have the same amount of time.

Finally, moving exams to a computer enables auto-grading and instant feedback. Not only do auto-graders mean less work for instructors and TAs after an exam, but they also allow timely feedback to students.

Deploying Unique Exams

Despite all of these benefits, there is still hesitancy about moving exams online due to academic integrity concerns. One way to address those concerns is to individualize student exams. Codio has a few exam proctoring settings that help without requiring any changes to the exam itself.

The first is the ability to shuffle Guides pages. If each assessment is placed on its own Guides page, toggling on Shuffle Question Order presents each student with a randomized ordering of the questions. This makes it harder for students to collude, since one student's first question differs from another's. Additionally, instructors can toggle on Forward Only Navigation, which prevents students from revisiting earlier assessments.
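Conceptually, per-student shuffling can be modeled as seeding a random shuffle with something unique to each student, so the ordering is stable for that student but differs across students. The short Python sketch below is only a model of the idea, not Codio's implementation, and the student identifiers are made up for illustration:

```python
import hashlib
import random

def shuffled_questions(question_ids, student_id):
    """Return a per-student ordering of questions.

    Seeding the shuffle with a hash of the student ID keeps the
    ordering stable for that student but different across students.
    """
    seed = int(hashlib.sha256(student_id.encode()).hexdigest(), 16)
    rng = random.Random(seed)
    order = list(question_ids)
    rng.shuffle(order)
    return order

# Two students see the same questions in different orders.
questions = ["Q1", "Q2", "Q3", "Q4", "Q5"]
print(shuffled_questions(questions, "student-alice"))
print(shuffled_questions(questions, "student-bob"))
```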

Another way to deploy unique exams is to randomize which assessments or questions a student sees.

Randomized Assessments

Codio enables instructors to add Random Assessments. Simply specify the list of tag values, and Codio will pull an assessment matching those criteria from the associated assessment library.

 

As assessments are added to the assessment library, instructors can synchronize their assignments with a single click to ensure they are always pulling from its full breadth.
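Conceptually, this kind of tag-based pull amounts to filtering a pool of assessments by tag values and drawing one match at random. The Python sketch below is only a model of that idea; the tag names and data layout are illustrative assumptions that mirror the tagging dimensions described in the next section, not Codio's internal format:

```python
import random

# Hypothetical library entries; the tag fields are placeholders
# for the kinds of tags an assessment library might carry.
library = [
    {"id": "java-loops-01", "language": "Java", "category": "Loops", "bloom": "Apply"},
    {"id": "java-loops-02", "language": "Java", "category": "Loops", "bloom": "Apply"},
    {"id": "py-strings-01", "language": "Python", "category": "Strings", "bloom": "Understand"},
]

def pick_random_assessment(library, **tags):
    """Return one randomly chosen assessment matching all given tag values."""
    matches = [a for a in library if all(a.get(k) == v for k, v in tags.items())]
    if not matches:
        raise ValueError(f"No assessment matches tags {tags}")
    return random.choice(matches)

print(pick_random_assessment(library, language="Java", category="Loops", bloom="Apply"))
```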


Codio’s Global Assessment Library

Instead of starting from scratch, instructors can pull from Codio's existing Global assessment library. All of its assessments are auto-graded, and each is tagged by:

  • Programming language
  • Assessment type
  • Category (topic-level)
  • Content (sub-topic level)
  • Learning Objective (in SWBAT form - “Students will be able to….”)
  • Bloom’s Taxonomy level

Bloom’s taxonomy is a way to represent the level of mastery of the content being assessed. See the graphic below for a description of each level and a list of common verbs associated with that level.

For example, when assessing a student’s knowledge of loops, different assessments reveal different levels of mastery.

Learning Objective | Bloom's Level
List the different types of loops in Java | Remember
Select the type of loop to use given the described use case | Understand
Implement an iterative algorithm to calculate factorials | Apply
Design a piece of turtle art using a loop | Create
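As a concrete illustration of the "Apply" row above, such an assessment might ask students to write something like the following (a generic example, not an item pulled from the Codio library):

```python
def factorial(n):
    """Compute n! iteratively, applying the concept of loops."""
    if n < 0:
        raise ValueError("n must be non-negative")
    result = 1
    for i in range(2, n + 1):
        result *= i
    return result

# A couple of quick checks an auto-grader might run.
assert factorial(0) == 1
assert factorial(5) == 120
```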

 

Amria and colleagues (2018) described how generating exams with questions standardized to a learning taxonomy such as Bloom's helps ensure alignment between course learning objectives and assessments.
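One way to picture that alignment is to draw one question per targeted Bloom's level from the tagged pool, so every generated exam spans the same levels of mastery even though the specific questions differ between students. The Python sketch below is a simplified model under that assumption; the pool contents are hypothetical:

```python
import random

def generate_exam(pool, target_levels, rng=random):
    """Build an exam with one randomly chosen question per Bloom's level."""
    exam = []
    for level in target_levels:
        candidates = pool[level]  # questions tagged at this level
        exam.append(rng.choice(candidates))
    return exam

pool = {
    "Remember": ["list-loop-types"],
    "Understand": ["select-loop-for-use-case"],
    "Apply": ["factorial-loop", "sum-array-loop"],
    "Create": ["turtle-art-loop"],
}
print(generate_exam(pool, ["Remember", "Understand", "Apply", "Create"]))
```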


Create Your Own Assessment Library

Codio also allows instructors to create their own assessment libraries. Assessment libraries are associated with an organization, making it easy for instructors who teach different sections of the same course, or even different variations of it (e.g., CS1 for majors and CS1 for non-majors), to collaborate on building a question pool.

Unlike many other tools that allow the creation of question banks, Codio lets you save the entire page layout and associated files as a single assessment in the library. Within an assessment library, an assessment can have a simple layout, which saves just the contents of the assessment, or a complex layout, which saves the entire page. This makes it easy to save full coding exercises with auto-grading scripts and starter code.
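To give a sense of what an auto-grading script bundled with such an exercise can look like, here is a generic Python sketch. The file name, the expected `factorial` function, and the JSON output format are illustrative assumptions rather than Codio's grading interface:

```python
import importlib.util
import json

def grade_submission(path="student_code.py"):
    """Load a student's file and check it against a few reference cases.

    The file name, expected function (factorial), and result format
    are assumptions for illustration, not a specific grading API.
    """
    spec = importlib.util.spec_from_file_location("submission", path)
    submission = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(submission)

    test_cases = [(0, 1), (1, 1), (5, 120), (7, 5040)]
    passed = 0
    for arg, expected in test_cases:
        try:
            if submission.factorial(arg) == expected:
                passed += 1
        except Exception:
            pass  # a crash on one case simply earns no credit for it

    score = passed / len(test_cases)
    print(json.dumps({"passed": passed, "total": len(test_cases), "score": score}))

if __name__ == "__main__":
    grade_submission()
```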


Are Randomized Exams Fair?

One concern about randomized exam questions is ensuring an equivalent level of difficulty across the generated exams. Fowler and colleagues (2022) studied randomized assessments in a CS0 and Data Structures course and found “the apparent fairness of the generated exam permutations is reasonably pleasing. The worst semester in this respect was Fall 2019, but even this variance in score is less than half a letter grade.”

References

Amria, A., Ewais, A., & Hodrob, R. (2018, January). A Framework for Automatic Exam Generation based on Intended Learning Outcomes. In CSEDU (1) (pp. 474-480).

Fowler, M., Smith IV, D. H., Emeka, C., West, M., & Zilles, C. (2022, February). Are We Fair? Quantifying Score Impacts of Computer Science Exams with Randomized Question Pools. In Proceedings of the 53rd ACM Technical Symposium on Computer Science Education V. 1 (pp. 647-653).

 

Elise Deitrick

Elise is Codio's VP of Product & Partnerships. She believes in making quality educational experiences available to everyone. With a BS in Computer Science and a PhD in STEM Education, she has spent the last several years teaching robotics, computer science and engineering. Elise now uses that experience and expertise to shape Codio's product and content.