QuizCon is an online quizzing tool that provides an alternative to standard multiple choice questions. Using a triangle with graduated intervals between its corners, students can indicate how confident they are in one option over another. Students can also indicate that they do not know the answer without losing all of the points for the question. By providing these options, students can earn partial credit, and instructors get a more accurate view of student understanding and misconceptions. To learn more about confidence-weighted multiple choice questions, please see the CTL QuizCon Information Page that I developed.
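To make the partial-credit idea concrete, here is a minimal sketch of one way a confidence-weighted response could be scored. The representation of a triangle position as a set of confidence weights, the linear scoring rule, and the function name score_response are all illustrative assumptions on my part, not QuizCon's actual rubric; published confidence-weighted schemes such as Bruno's MCW-APM use more elaborate scoring rules.

```python
# Illustrative sketch of confidence-weighted scoring (an assumption, not QuizCon's rubric).
# Assumed model: the three answer options sit at the corners of a triangle, and a
# response is a set of confidence weights over the options that sums to 1. A corner
# means full confidence, a point between corners splits confidence, and "I don't know"
# is represented here as equal weights on all options.

def score_response(weights: dict[str, float], correct: str, max_points: float = 1.0) -> float:
    """Award partial credit in proportion to the confidence placed on the correct option."""
    total = sum(weights.values())
    if abs(total - 1.0) > 1e-9:
        raise ValueError("Confidence weights must sum to 1.")
    # Simple linear rule: credit equals the weight placed on the correct option.
    return max_points * weights.get(correct, 0.0)

# Full confidence in the correct answer earns full credit.
print(score_response({"A": 1.0, "B": 0.0, "C": 0.0}, correct="A"))    # 1.0
# Splitting confidence between two options earns partial credit.
print(score_response({"A": 0.75, "B": 0.25, "C": 0.0}, correct="A"))  # 0.75
# "I don't know" (equal weights) still earns some credit instead of zero.
print(score_response({"A": 1/3, "B": 1/3, "C": 1/3}, correct="A"))    # ~0.33
```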
The tool was developed after I submitted a grant proposal to Columbia's Office of Teaching, Learning, and Innovation. The inspiration for the proposal came after I learned about confidence-weighted multiple choice questions from Elizabeth and Robert Bjork, professors in UCLA's psychology department, who presented on the subject at Columbia's Science of Learning Symposium. After the grant was approved, I served as principal investigator as well as learning experience designer on the project. Working with the development team at the Center for Teaching and Learning, we built the application in six months and first piloted it with two instructors at Columbia in Fall 2021.
Improve the writing of (confidence-weighted) multiple choice questions based on research-backed recommendations.
Increase instructor use of competitive incorrect alternatives for (confidence-weighted) multiple choice questions.
Improve the use of (confidence-weighted) multiple choice questions to assess higher-order skills.
Improve instructor analysis of student performance on (confidence-weighted) multiple choice questions.
Increase the use of (confidence-weighted) multiple choice quizzes as a learning tool rather than just an assessment tool.
Increase the use of retrieval strategies that lead to long-term retention by having students evaluate all of the incorrect alternatives before choosing what they believe to be the correct answer. This increases engagement in retrieval processes that are productive for learning.
Decrease the incentive to guess on multiple choice quizzes, thereby increasing the accuracy of the feedback students receive about their current level of knowledge and understanding of the content.
Improve performance on follow-up assessments of the same or similar content (e.g., midterms and final exams).
Decrease the anxiety students feel on multiple choice quizzes.
As the person who proposed this project, I had the unique role of being the client as well as a member of the team developing the tool. Over the course of the project, I performed the following tasks:
Determined the vision of the project and intended outcomes for instructors and students.
Decided on key functions needed to meet the most basic needs of instructors at Columbia University.
Recruited faculty members to use the tool and expanded its reach through the development of marketing materials.
Collaborated with the UX designer and software developer to implement functions and site elements that enhance the learning experience for instructors and students.
Worked with piloting faculty to address their needs when implementing the tool in their classes.
Led the project's assessment efforts by designing and creating student surveys to measure the learning experience and ease of use.
Bruno, J. E. (1989). Using MCW-APM test scoring to evaluate economics curricula. Journal of Economic Education, 20, 5–22.
Butler, A. C., & Roediger III, H. L. (2007). Testing improves long-term retention in a simulated classroom setting. European Journal of Cognitive Psychology, 19(4–5), 514–527.
Cisneros-Pahayahay, M., & Pahayahay, G. (2017). Level and quality of knowledge using confidence-weighted NRET scoring method in multiple choice test. Advanced Science Letters, 23(2), 885–889.
Kim, M. K., Patel, R. A., Uchizono, J. A., & Beck, L. (2012). Incorporation of Bloom’s taxonomy into multiple-choice examination questions for a pharmacotherapeutics course. American Journal of Pharmaceutical Education, 76(6).
Little, J. L. (2011). Optimizing multiple-choice tests as learning events (Doctoral dissertation). University of California, Los Angeles.
Little, J. L., & Bjork, E. L. (2015). Optimizing multiple-choice tests as tools for learning. Memory & Cognition, 43(1), 14–26.
Snyder, L. G., & Snyder, M. J. (2008). Teaching critical thinking and problem solving skills. The Journal of Research in Business Education, 50(2), 90.
Sparck, E. M., Bjork, E. L., & Bjork, R. A. (2016). On the learning benefits of confidence-weighted testing. Cognitive Research: Principles and Implications, 1(1), 1–10.