Course Inquiry Starter Kit
Your compass for course and program-level data-driven inquiry.
Every course tells a story in its data. The Course Inquiry Starter Kit helps you start reading yours with guided prompts and AI tools you can use right now to explore questions about your course design, student outcomes, and learning alignment.
Good teaching starts with good questions
Faculty regularly notice things in their courses that are hard to explain — patterns in grades, recurring feedback, assignments that don't land the way they should. But finding time to investigate those patterns is another matter.
The Course Inquiry Starter Kit gives you tools and guided prompts to begin exploring those questions on your own — using your syllabus, your assignments, and your course data. These aren't diagnostic reports. They're starting points: structured ways to look at your own course with fresh eyes.
For faculty who want to go deeper, our Educational Effectiveness and Teaching Innovation staff can help — with institution-level data, comparative analysis, and one-on-one consultations.
Start with a question. We'll help you find the answer.
Choose a direction. Each cluster contains activities, starter prompts, and tools sized to a single course. Pick the question that feels most urgent for you right now.
Who is my course designed for — and who does it actually serve?
Every course is designed with an imagined student in mind. This cluster helps you make that imagined student visible — and ask whether your course works for the students actually in your room.
Where are students getting lost?
The data in your gradebook and LMS already tells a story. This cluster helps you read it — looking for patterns that are easy to miss when you're in the middle of the semester.
Does my course actually do what I think it does?
Misalignment between what a course says it does and what it actually measures is one of the most common — and most fixable — problems in course design. This cluster helps you audit the gap.
What do my students actually think is happening?
Student feedback is easy to read reactively and hard to read systematically. This cluster helps you step back from the loudest comments and find what's actually recurring across time.
All activities use tools you already have access to at UMD. No new accounts. No new software. Your data stays secure and private.
- ELMS-Canvas (your LMS): Built-in analytics including grade distributions, assignment performance, weekly engagement, page views, and individual student activity reports.
- NotebookLM: Upload your own documents — syllabus, assignments, eval reports — and have a structured conversation with them. Best for deep analysis of your own course materials.
- Gemini (linked to Google Workspace): Access your Google Drive files directly. Best for structured comparison tasks when your materials already live in Drive.
- TerpAI: UMD's open-access AI tool, available to all faculty, staff, and students. Best for open-ended prompting and quick reflective exercises.
- Microsoft Copilot: Integrated with Microsoft 365. Best for faculty working primarily in Word, Excel, or OneNote — useful for rubric analysis and document comparison.
We encourage you to use only UMD-approved GenAI tools, which are deployed in alignment with institutional security and compliance requirements.
Need a sounding board?
The Teaching Innovation Team is here to answer your questions about the kit, help you refine your pedagogical approach, or brainstorm your project inquiry. Request a consultation today!
Ready to go further?
The activities in this kit are designed to help you start the inquiry. When your questions get bigger than what you can see from inside your own classroom — when you want to know how your course compares to similar courses at UMD, or to understand patterns across student populations — that's where deeper support comes in.
Our Educational Effectiveness team works with faculty to analyze institution-level data, identify systemic patterns, and connect findings to teaching interventions. This is not a self-serve service — it's a collaboration.
Activity Menu
Each activity below contains three components. It begins with an Overview of the time and tools you will need to complete the activity. Next, How to do it outlines the concrete steps for setting up and running the activity, including a starter prompt you can paste or adapt. Finally, Notes describe what the activity can and cannot tell you, with a bridge pointing to pathways for deeper institutional support.
Activity Cluster 1:
Who is my course designed for — and who does it actually serve?
Prerequisite knowledge mapping
What prior knowledge is needed to successfully complete my first assignment?
Syllabus assumption audit
Based on my syllabus, what does a student need to succeed in my course?
High-stakes front-loading check
Does the weight of my assignments set students up for success?
OSCQR Course Quality Review
Does my course support problem solving and have real-world applications?
Activity Cluster 2:
Where are students getting lost?
Grade distribution cliff finder
How can I better understand how course sequencing impacts student performance?
Engagement drop-off analysis
What insights can I gain from analyzing student engagement data?
Assignment performance comparison
What can I learn about my course by closely analyzing a few assignments?
Resource access audit
Are students utilizing my course materials effectively?
Activity Cluster 3:
Does my course actually do what I think it does?
Outcomes to assessment alignment check
Are my assessments properly aligned with my course's learning outcomes?
Bloom's level audit
Are my assessments asking students to demonstrate the skill level I want?
Rubric to outcome coherence check
Are my assignment rubrics measuring my course's learning outcomes?
Grading policy coherence review
Does my course grading policy center and support students' learning?
Activity Cluster 4:
What do my students actually think is happening?
Course evaluation pattern finder
How can I use patterns in student feedback to improve my course?
Mid-semester check-in design
How can mid-semester feedback from students inform how I adjust my course?
Assignment confusion detector
Is the clarity of my assignment prompts impacting student performance?
Multi-semester evaluation trend analysis
What can student feedback over time reveal about my course redesign?
Have you tried one or two of the activities above and found yourself wanting more than the tool could show you?
That's an important signal. Maybe the data wasn't there, or the analysis raised a question your course analytics couldn't answer. Maybe you're curious about a pattern that is more comprehensive than one section, one semester, or one course. When a single course inquiry isn't enough, our Educational Effectiveness team can help take you further.
Our Educational Effectiveness and Teaching Innovation staff works with faculty at two levels: course-level consultations for questions about a single course, and broader educational effectiveness engagements for faculty exploring patterns across multiple sections, a program pathway, or a curriculum as a whole. If you're not sure which you need, start by reaching out. We'll figure it out together.
AI Usage Statement:
The TLTC outlined each activity and used Claude (Anthropic, 2026) to generate possible prompts and limitations, which were then revised and adapted.