Course Inquiry Starter Kit

Your compass for course and program-level data-driven inquiry.

Every course tells a story in its data. The Course Inquiry Starter Kit helps you start reading yours with guided prompts and AI tools you can use right now to explore questions about your course design, student outcomes, and learning alignment.

Good teaching starts with good questions

Faculty regularly notice things in their courses that are hard to explain — patterns in grades, recurring feedback, assignments that don't land the way they should. But finding time to investigate those patterns is another matter.

The Course Inquiry Starter Kit gives you tools and guided prompts to begin exploring those questions on your own — using your syllabus, your assignments, and your course data. These aren't diagnostic reports. They're starting points: structured ways to look at your own course with fresh eyes.

For faculty who want to go deeper, our Educational Effectiveness and Teaching Innovation staff can help — with institution-level data, comparative analysis, and one-on-one consultations.

Start with a question. We'll help you find the answer.

Choose a direction. Each cluster contains activities, starter prompts, and tools sized to a single course. Pick the question that feels most urgent for you right now.

  1. Who is my course designed for — and who does it actually serve?
    Every course is designed with an imagined student in mind. This cluster helps you make that imagined student visible — and ask whether your course works for the students actually in your room.

  2. Where are students getting lost?
    The data in your gradebook and LMS already tells a story. This cluster helps you read it — looking for patterns that are easy to miss when you're in the middle of the semester.

  3. Does my course actually do what I think it does?
    Misalignment between what a course says it does and what it actually measures is one of the most common — and most fixable — problems in course design. This cluster helps you audit the gap.

  4. What do my students actually think is happening?
    Student feedback is easy to read reactively and hard to read systematically. This cluster helps you step back from the loudest comments and find what's actually recurring across time.

All activities use tools you already have access to at UMD. No new accounts. No new software.

  • ELMS-Canvas (your LMS): Built-in analytics including grade distributions, assignment performance, weekly engagement, page views, and individual student activity reports.
  • NotebookLM: Upload your own documents — syllabus, assignments, eval reports — and have a structured conversation with them. Best for deep analysis of your own course materials.
  • Gemini (linked to Google Workspace): Access your Google Drive files directly. Best for structured comparison tasks when your materials already live in Drive.
  • TerpAI: UMD's open-access AI tool, available to all faculty, staff, and students. Best for open-ended prompting and quick reflective exercises.
  • Microsoft Copilot: Integrated with Microsoft 365. Best for faculty working primarily in Word, Excel, or OneNote — useful for rubric analysis and document comparison.

Ready to go further?

The activities in this kit are designed to help you start the inquiry. When your questions get bigger than what you can see from inside your own classroom — when you want to know how your course compares to similar courses at UMD, or to understand patterns across student populations — that's where deeper support comes in.

Our Educational Effectiveness team works with faculty to analyze institution-level data, identify systemic patterns, and connect findings to teaching interventions. This is not a self-serve service — it's a collaboration.

Request a Consultation

Activity Menu

Each activity below contains four components: it begins with How to do it, which outlines the concrete steps for setting up and running the activity, and Spark, a starter prompt you can paste or adapt. It then includes an Honest note describing what the activity can and cannot tell you, and a Bridge pointing to pathways for deeper institutional support.

Key: ELMS-data = ELMS-Canvas analytics; NLM = NotebookLM; GEM = Gemini; TAI = TerpAI; COP = Microsoft Copilot


Activity Cluster 1: 

Who is my course designed for — and who does it actually serve?

Activity Cluster 2: 

Where are students getting lost?

Grade distribution cliff finder

Engagement drop-off analysis

Assignment performance comparison

Resource access audit

Activity Cluster 3: 

Does my course actually do what I think it does?

Outcomes to assessment alignment check

Bloom's level audit

Rubric to outcome coherence check

Grading policy coherence review

Activity Cluster 4: 

What do my students actually think is happening?

Course evaluation pattern finder

Mid-semester check-in design

Assignment confusion detector

Multi-semester evaluation trend analysis

Have you tried one or two of the activities above and found yourself wanting more than the tool could show you?

That's an important signal. Maybe the data wasn't there, or the analysis raised a question your course analytics couldn't answer. Maybe you're curious about a pattern that extends beyond one section, one semester, or one course. When a single course inquiry isn't enough, our Educational Effectiveness team can help take you further.

Our Educational Effectiveness and Teaching Innovation staff works with faculty at two levels: course-level consultations for questions about a single course, and broader educational effectiveness engagements for faculty exploring patterns across multiple sections, a program pathway, or a curriculum as a whole. If you're not sure which you need, start by reaching out. We'll figure it out together.
