Table 2.


  • Categorize the question based on what students are being asked to do, not on how challenging the question may be. (For example, a “comprehend” question for a difficult concept could be a more challenging problem than an “analyze” question on an easier concept.)

  • Evaluate questions with reference to the material to which we know students were exposed.

Question 1. Could students memorize the answer to this specific question?
    Yes: go to question 2.
    No: go to question 4.
Question 2. To answer the question, are students repeating nearly exactly what they have heard or seen in class materials (including lecture, textbook, laboratory, homework, clicker, etc.)?
    Yes → See Remember
    No: go to question 3.
Question 3. Are students demonstrating a conceptual understanding by putting the answer in their own words, matching examples to concepts, representing a concept in a new form (words to graph, etc.), etc.?
    Yes → See Comprehend
    No: go back to question 1. If you are sure the answer to question 1 is yes, the question should fit into “remember” or “comprehend.”
Question 4. Is there potentially more than one valid solution* (even if a “better” one exists or if there is a limit to what solutions can be chosen)?
    Yes: go to question 5.
    No: go to question 8.
Question 5. Are students making a judgment and/or justifying their answer?
    Yes → See Evaluate
    No: go to question 6.
Question 6. Are students synthesizing information into a bigger picture (coherent whole) or creating something they haven’t seen before (a novel hypothesis, novel model, etc.)?
    Yes → See Synthesize/create
    No: go to question 7.
Question 7. Are students being asked to compare/contrast information?
    Yes → See Analyze
    No: go to question 16.
Question 8. To answer the question, do students have to interpret data (graph, table, figure, story problem, etc.)?
    Yes: go to question 9.
    No: go to question 14.
Question 9. Are students determining whether the data are consistent with a given scenario or whether conclusions are consistent with the data? Are students critiquing the validity or quality of experimental data/methods?
    Yes → See Evaluate
    No: go to question 10.
Question 10. Are students building up a model or novel hypothesis from the data?
    Yes → See Synthesize/create
    No: go to question 11.
Question 11. Are students coming to a conclusion about what the data mean (they may or may not be required to explain the conclusion) and/or having to decide what data are important to solve the problem (i.e., picking out relevant from irrelevant information)?
    Yes → See Analyze
    No: go to question 12.
Question 12. Are students using the data to calculate the value of a variable?
    Yes → See Apply
    No: go to question 13.
Question 13. Are students redescribing the data to demonstrate they understand what the data represent?
    Yes → See Comprehend
    No: go back to questions 4 and 8.
Question 14. Are students putting information from several areas together to create a new pattern/structure/model/etc.?
    Yes → See Synthesize/create
    No: go to question 15.
Question 15. Are students predicting the outcome or trend of a fairly simple change to a scenario?
    Yes → See Apply
    No: go to question 16.
Question 16. Are students demonstrating that they understand a concept by putting it into a different form (new example, analogy, comparison, etc.) than they have seen in class?
    Yes → See Comprehend
    No: go back through each category, or refer to the category descriptions, to see which fits best.
  • * This question originally had the word “answer” in place of the word “solution.” In subsequent use of the BDK, we found that the word “solution” caused less confusion when applying this question. This was not an issue in our initial use of the BDK for this report.

  • Originally, if answering “no” to question 7, we had reviewers go back to question 4; if they were sure the answer there was “yes,” they should then have been able to answer “yes” to question 5, 6, or 7. This did not lead to any difficulties in our initial use of the BDK for this report. However, in subsequent use of the key, we found examples in which a comprehension-level categorization was also possible. Therefore, we revised the BDK to lead raters to question 16 at this point to account for those question types.
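The key above is effectively a small decision tree: each question routes a “yes” or “no” answer either to a Bloom category or to a follow-up question. As an illustration only (this code is not part of the original BDK), the routing can be sketched in Python, with integer targets standing for follow-up questions, string targets for categories, and `None` for the “no automatic category” outcome at question 16:

```python
# Sketch of the BDK routing table: question number -> (yes_target, no_target).
# Integers route to another question; strings are Bloom categories; None means
# the rater must fall back to the category descriptions (question 16, "no").
BDK = {
    1: (2, 4),
    2: ("Remember", 3),
    3: ("Comprehend", 1),           # "no" sends the rater back to question 1
    4: (5, 8),
    5: ("Evaluate", 6),
    6: ("Synthesize/create", 7),
    7: ("Analyze", 16),
    8: (9, 14),
    9: ("Evaluate", 10),
    10: ("Synthesize/create", 11),
    11: ("Analyze", 12),
    12: ("Apply", 13),
    13: ("Comprehend", 4),          # "no" sends the rater back to questions 4/8
    14: ("Synthesize/create", 15),
    15: ("Apply", 16),
    16: ("Comprehend", None),
}

def classify(answers):
    """Walk the key with `answers`, a dict mapping question number -> bool.

    Returns a Bloom category string, or None when the key loops back or an
    answer is missing (i.e., the rater must revisit earlier questions).
    """
    node = 1
    seen = set()
    while isinstance(node, int):
        if node in seen or node not in answers:
            return None
        seen.add(node)
        yes_target, no_target = BDK[node]
        node = yes_target if answers[node] else no_target
    return node
```

For example, a question that students could memorize and that repeats class material (“yes” to questions 1 and 2) resolves to “Remember”, while a data-interpretation calculation (“no” to 1, “no” to 4, then “yes” at question 12) resolves to “Apply”.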