New method for creating question banks with GenAI

Sydney Brown, assistant director of the Center for Transformative Teaching, follows up on her method for creating large machine-gradable question banks efficiently and effectively, this time using a Gemini Gem.

In February, NotebookLM was the generative AI tool that anyone could access free of charge and restrict to specific content. In that edition of Teacher Connect, I shared a story on how it could be used with course content to create question banks that could be imported into Canvas. Unfortunately, as instructors tried it out themselves, they ran into frustrating errors in formatting and content. Nine months later, however, generative AI has improved a great deal, and Claude, Gemini, and ChatGPT all allow users to build bots for specific tasks.

Recently, Matt Waite, professor of practice in journalism, showed me how he had used the NotebookLM story and resources to build a Claude project that did the same task with greater accuracy.

Inspired, and curious whether I could replicate his success on a different platform, I built a Gemini Gem that identified correct answers with 100% accuracy and formatted its output correctly for the necessary conversion to QTI. It should be noted that because your content will differ from mine, your accuracy may also differ, but give it a try and let me know how it works.

If you would like assistance setting it up, contact me or an instructional designer assigned to your college.

Additionally, if you've built a bot for your own use or would like to learn more, join the CTT for the final AI Skill Sharing learning community session on Dec. 5 via Zoom from 12-1 p.m.

Steps

  1. Build the bot using the prompt
  2. Copy and paste the output table into Excel or Google Sheets
  3. Delete the header row, then save as a .csv file (a scripted alternative to steps 2 and 3 appears after this list)
  4. Convert the .csv file to a QTI file using the “Classic to Canvas (QTI 2.0) Converter”
  5. Import question items to Canvas from Settings > Import Course Content > QTI .zip file
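
If you would rather script steps 2 and 3 than work in a spreadsheet, a minimal Python sketch along these lines can do the same job. This is my own illustration rather than part of the Gem or the converter, the function name is hypothetical, and it assumes the Gem's output is a standard markdown table with no pipe characters inside question text.

```python
import csv

def markdown_table_to_csv(markdown: str, csv_path: str) -> None:
    """Turn the Gem's markdown question table into a headerless .csv
    (steps 2 and 3 above). Assumes no '|' characters inside cells."""
    rows = []
    for line in markdown.strip().splitlines():
        line = line.strip()
        if not line.startswith("|"):
            continue  # ignore any prose around the table
        cells = [c.strip() for c in line.strip("|").split("|")]
        if cells[0] and set(cells[0]) <= set(":- "):
            continue  # skip the markdown separator row (| :--- | ...)
        if cells[0].lower() == "type":
            continue  # skip the header row; the converter wants data rows only
        rows.append(cells)
    with open(csv_path, "w", newline="", encoding="utf-8") as f:
        csv.writer(f).writerows(rows)

# Usage: paste the Gem's table into a string, then
# markdown_table_to_csv(table_text, "questions.csv")
```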

Design Overview
Instructors need a way to develop large collections of content-specific, machine-gradable question items and easily import them into the learning management system. Such question banks allow students to practice course concepts and can help with academic integrity by increasing the variety of items when combined with Canvas Quiz features such as drawing a set number of items at random from a question bank and randomizing the order of responses.

Instructions
# ROLE
- Post-secondary assessment expert
- Assists instructors in creating machine-gradable exam questions

# GOALS
- Use context and content supplied by the user to create high-quality machine-gradable assessment questions of the following types: multiple-choice, multiple-select, and true-false (true-false questions are multiple-choice with only two options)
- Make use of recommended practices for creating questions

# BEHAVIORS AND RULES

1) Initial Inquiry:

a) Start the conversation by confirming your role as the 'Machine Graded Question Maker for Canvas' and your expertise in post-secondary assessment.
b) Request the user to provide the context and content for which questions need to be created.
c) Collect the following necessary information for question creation (and use it to formulate follow-up questions if not provided):
- student level (e.g., undergraduate, graduate)
- learning goals to be assessed
- desired number of questions
- how many points for each question (default to 1 if not specified)
- any other information needed (e.g., specific format constraints, complexity level).

2) Question Creation:

a) After receiving sufficient information, generate the specified number of questions.
b) Ensure that all generated questions adhere to best practices for effective assessment, including clear stems and plausible distractors.
c) Generate 'true-false' questions as multiple-choice questions with only two options.

3) Critical Feedback:

a) If the user's provided content or goals are inappropriate for machine-gradable questions or assessment best practices, provide professional and polite critical feedback, suggesting improvements or alternative approaches.

# TONE
- Professional
- Polite
- Give critical feedback when appropriate

# OUTPUT

## Format
- Table, using markdown table formatting.

## Citation markers
- Do not include citation markers, tags, or text in square brackets within the question body or answer options.

## Columns
### Column A is the type of question
- Use 'MC' to denote multiple-choice or true-false
- Use 'MR' to denote multiple-select
### Column B
- Blank
### Column C
- Point value of the question
- If the user did not supply point value for questions, each question should be worth 1 point
### Column D
- The question body or stem
### Column E
- Correct answer
- Represented by the numbers 1, 2, 3, 4, 5 (corresponding to options F, G, H, I, J)
### Column F
- Answer option 1
### Column G
- Answer option 2
### Column H
- Answer option 3
### Column I
- Answer option 4
### Column J
- Answer option 5

## Example Output Structure (to be followed for all question outputs):

| Type | | Points | Question Body/Stem | Correct Answer | Option 1 | Option 2 | Option 3 | Option 4 | Option 5 |
| :--- | :--- | :--- | :--- | :--- | :--- | :--- | :--- | :--- | :--- |
| MC | | 1 | What is the capital of France? | 3 | London | Rome | Paris | Berlin | Madrid |
| MR | | 2 | Select all primary colors. | 1, 3, 5 | Red | Green | Blue | Orange | Yellow |
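
One last quality-control step you might add: before converting, spot-check the .csv against the column layout above. The sketch below is again my own illustration with a hypothetical function name, not a supported tool, and it only covers the checks named in its comments.

```python
import csv

VALID_TYPES = {"MC", "MR"}

def check_question_rows(csv_path: str) -> list[str]:
    """Return a list of problems found in a questions .csv:
    type in column A, blank column B, whole-number points in column C,
    non-empty stem in column D, answer indices 1-5 in column E."""
    problems = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for i, row in enumerate(csv.reader(f), start=1):
            if len(row) < 6:
                problems.append(f"row {i}: expected at least 6 columns, found {len(row)}")
                continue
            qtype, blank, points, stem, answer = row[:5]
            if qtype not in VALID_TYPES:
                problems.append(f"row {i}: type should be MC or MR, found '{qtype}'")
            if blank:
                problems.append(f"row {i}: column B should be blank")
            if not points.isdigit():
                problems.append(f"row {i}: points should be a whole number, found '{points}'")
            if not stem:
                problems.append(f"row {i}: question stem is empty")
            if not all(a.strip() in {"1", "2", "3", "4", "5"} for a in answer.split(",")):
                problems.append(f"row {i}: correct answer should use indices 1-5, found '{answer}'")
    return problems

# Usage: print("\n".join(check_question_rows("questions.csv")) or "All rows look good.")
```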