Several instructors I work with regularly suggest that it would be very instructive for students to develop test questions themselves. In a project for the Digital University Consortium of the Netherlands, a tool for this purpose was developed: Question Bank. The tool was introduced at a number of institutions and an article was written about it (Draaijer & Boter, 2005). However, for lack of professional support and funding, the tool is no longer in active use or development (though it is open source, and anyone is free to use and develop it further). Still, there are some developments in the field.
The first development I picked up last week was PeerWise, a system from the University of Auckland, New Zealand. It allows students to enter questions, which other students can then answer and rate. It is a purely formative tool, and its developers are very enthusiastic about it.
Sally Jordan of the Open University in the UK is less convinced; she wonders whether such a system would really help. This brings me to a research report that was just published by Papinczak et al. (2011), who investigated the deployment, use and effect of student-generated questions in a medical curriculum. For specific courses, students were asked to develop questions for the final exam, and 25% of the questions on that exam would be drawn from the student-generated pool.
The researchers conclude that students engaged in a surface-learning approach, which the students themselves did not find very worthwhile. However, the researchers also note: “Although students may have memorised the questions and answers, there is no evidence that they do not understand the information.” In other words, using student-generated questions is not that bad either. The format does add to the repertoire available to instructors in higher education.