This is the blog for the SGC4L project, funded under the JISC Assessment and Feedback programme and led by the Physics Education Research Group at the University of Edinburgh.

As well as this blog, the project wiki contains documents and information on the progress, development and dissemination activities associated with the project.

Wednesday, March 7, 2012

Qualitative categorization

Our two final year undergraduate project students are making great progress in categorising samples of the PeerWise questions created by students in our first year Physics class.

They've developed and refined a set of categories capturing information from each question authored by a student, as follows (a sketch of how these ratings might be recorded appears after the list):


Clarity of Question:
0 – Unclear (including spelling & grammar that make the question unclear)
1 – Clear

Feasible Distractors:
0 – None
1 – At least 2 but not all
2 – All Feasible

Explanation:
0 – Missing
1 – Inadequate or wrong
2 – Minimal/unclear
3 – Good/Detailed
4 – Excellent (describes the physics thoroughly, remarks on the plausibility of the answer, uses appropriate diagrams, and perhaps explains how the distractors might have been obtained)

Quality of Author Comments:
0 – None
1 – Irrelevant
2 – Relevant

Correct:
0 – Obviously Incorrect
1 – Most Likely Correct

Recognised as Incorrect: 
0 – N/A or not recognised
1 – Recognised as incorrect by students (or disagreement with author)

Diagram: 
0 – None
1 – Contextual picture but not relevant
2 – Relevant diagram or picture

Plagiarism:
0 - Potentially Plagiarised
1 – Not obviously plagiarised

Context of Question: 
0 – None (formulas, recalling info)
1 – Irrelevant or extraneous context (entertaining, imaginary)
2 – Physics (frictionless, idealized situation)
3 – Relevant real-world context (applicable to everyday situations, e.g. cars on racetracks)

Revised Taxonomy:
1 – Remember, Recognise or Recall OR just plugging in numbers
2 – Understand, Interpret or predict (No calculation needed, understanding 3rd law for example)
3 – Apply, Implement or Calculate (1 step calculation)
4 – Analyse, differentiate or organise (multi-step calculation, higher analysis)
5 – Evaluate, Assess or Rank (evaluating various options and assessing their validity)
6 – Create, Combine or Produce (asked to combine various areas of physics; need to get the structure right to solve the whole problem)
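
To make the scheme concrete, here is a minimal sketch of how a rating against these categories might be recorded in Python. The category names and score ranges follow the list above; the QuestionRating container and its rate helper are purely illustrative, not our actual tooling.

```python
# Minimal sketch (not project code): one way to record a rating against the
# rubric above. Category names and score ranges follow the list; the
# dataclass and validation helper are purely illustrative.
from dataclasses import dataclass, field

RUBRIC = {
    "clarity": range(0, 2),                 # 0 unclear, 1 clear
    "feasible_distractors": range(0, 3),
    "explanation": range(0, 5),
    "author_comments": range(0, 3),
    "correct": range(0, 2),
    "recognised_as_incorrect": range(0, 2),
    "diagram": range(0, 3),
    "plagiarism": range(0, 2),
    "context": range(0, 4),
    "taxonomy": range(1, 7),                # Bloom's revised taxonomy, levels 1-6
}

@dataclass
class QuestionRating:
    question_id: str
    scores: dict = field(default_factory=dict)

    def rate(self, category: str, score: int) -> None:
        """Record a score, checking it lies in the allowed range for that category."""
        if category not in RUBRIC:
            raise KeyError(f"Unknown rubric category: {category}")
        if score not in RUBRIC[category]:
            raise ValueError(f"{category} must be in {list(RUBRIC[category])}, got {score}")
        self.scores[category] = score

# Example usage (hypothetical question id):
rating = QuestionRating("PW-0123")
rating.rate("clarity", 1)
rating.rate("taxonomy", 5)
```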

This last category maps the question onto levels in the cognitive domain of Bloom's (revised) taxonomy. After doing some tests to establish an acceptable level of inter-rater reliability, we've let the two students loose on their own sets of questions.
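
For the record, one common measure of inter-rater agreement on a categorical scheme like this is Cohen's kappa; the sketch below (using scikit-learn and made-up ratings) just shows how such a check might be run, rather than reproducing our actual procedure.

```python
# Sketch only: Cohen's kappa for two raters coding the same set of questions.
# The ratings below are made-up illustrative data, not our project data.
from sklearn.metrics import cohen_kappa_score

rater_a = [3, 4, 3, 5, 2, 4, 3, 6, 4, 3]   # taxonomy level assigned by rater A
rater_b = [3, 4, 4, 5, 2, 4, 3, 5, 4, 3]   # taxonomy level assigned by rater B

kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa = {kappa:.2f}")      # values around 0.6-0.8 are often taken as substantial agreement
```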

The analysis is still in progress, but early indications are that, in contrast to a recently published study (A participatory learning approach to biochemistry using student authored and evaluated multiple-choice questions, Denny and Bottomley, DOI: 10.1002/bmb.20526), relatively few of our questions inhabit the lower reaches of this scale: most fall into categories 3 and 4, with non-negligible numbers in the highest categories. I'll post more results when we have them.

Here's a nice example of a question classified at level 5 in the taxonomy. 


