Mudslide Full Talk

Slide Content
  1. Mudslide: A Spatially Anchored Census of Student Confusion for Online Lecture Videos

    • Elena L. Glassman (1,2), Juho Kim (1,2)
    • Andrés Monroy-Hernández (1), Meredith Ringel Morris (1)
    • (1) Microsoft Research, Redmond, WA, USA
    • (2) MIT CSAIL, Cambridge, MA, USA
    • {juhokim, elg}@mit.edu
    • {andresmh, merrie}@microsoft.com
  2. Slide 2

    • Teachers can assess confusion in traditional classrooms with muddy cards.
    • After class, students write questions about what confused them.
  3. Mudslide translates the practice of muddy cards into the realm of online lecture videos.

    • Lecture video
    • Student highlights a point of confusion (sketched below)
    • Student asks a question
    • Pile of muddy cards from an in-person class
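
    The talk doesn't expose Mudslide's internals, but the flow above implies that each submission pairs a spatial anchor on a slide with a question. A minimal sketch of such a record follows; the interface and field names are illustrative assumptions, not Mudslide's actual schema.

    ```typescript
    // Hypothetical record for one spatially anchored "muddy point".
    // Names and shape are illustrative, not Mudslide's actual schema.
    interface MuddyPoint {
      slideIndex: number; // which slide of the lecture video
      x: number;          // horizontal anchor, normalized to [0, 1]
      y: number;          // vertical anchor, normalized to [0, 1]
      question: string;   // the student's question about this spot
      studentId: string;  // anonymized submitter id
    }

    // Example: a student marks a diagram near the top right of slide 3.
    const example: MuddyPoint = {
      slideIndex: 3,
      x: 0.78,
      y: 0.22,
      question: "Why does the arrow loop back to the first step?",
      studentId: "s-017",
    };
    ```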
  4. Design Goals

    • Encourage students to reflect on the entire lecture
    • Encourage students to provide specific feedback
    • Provide a fast and intuitive way for students to give feedback
    • Allow teachers to quickly interpret student confusion
  5. User Experience

    • Teachers produce a slide-based online lecture video
    • Students watch the online lecture video
    • Students submit points of confusion (Submission Interface; sketch after this list)
    • Teachers and students see the resulting map of confusion across lecture slides (Viewing Interface)
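
    The slides don't say how the Submission Interface talks to the server; a plausible minimal sketch is a single JSON POST of the anchored record. The endpoint and error handling here are assumptions, and MuddyPoint is the illustrative interface sketched under slide 3.

    ```typescript
    // Minimal sketch of the student-side submission step.
    // "/api/muddy-points" is a hypothetical endpoint, not Mudslide's real API.
    async function submitMuddyPoint(point: MuddyPoint): Promise<void> {
      const res = await fetch("/api/muddy-points", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(point),
      });
      if (!res.ok) {
        throw new Error(`Submission failed with status ${res.status}`);
      }
    }
    ```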
  6. Submission Interface

  7. Submission Interface

  8. Mudslide

    • Student Submission Interface
    • Teacher Viewing Interface
  9. Viewing Interface

  10. Spatial Annotation

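
    The talk doesn't specify how Mudslide turns individual clicks into the confusion map teachers see; one straightforward way, sketched here under that assumption, is to bin the normalized anchors into a per-slide grid whose counts drive a heatmap overlay. The grid size and function names are illustrative.

    ```typescript
    // Sketch: bin muddy points into a per-slide grid for a heatmap overlay.
    // The 16x16 grid and binning rule are assumptions, not Mudslide's code.
    // MuddyPoint is the illustrative interface sketched under slide 3.
    const GRID = 16;

    function heatmapFor(points: MuddyPoint[], slideIndex: number): number[][] {
      // counts[row][col] = number of muddy points anchored in that cell
      const counts = Array.from({ length: GRID }, () => new Array<number>(GRID).fill(0));
      for (const p of points) {
        if (p.slideIndex !== slideIndex) continue;
        const col = Math.min(GRID - 1, Math.floor(p.x * GRID));
        const row = Math.min(GRID - 1, Math.floor(p.y * GRID));
        counts[row][col] += 1;
      }
      return counts;
    }
    ```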
  11. Evaluation

    • Constructed a baseline interface: a simple text box
    • Teachers (19) produced slide-based online lecture videos
    • A crowd of Mechanical Turk workers simulated an entire classroom of students, collectively populating the slides with a mean of 45 muddy points
    • Students (25):
      1. Watch the online lecture video
      2. Submit confusion with interface A (either Mudslide or the baseline)
      3. Repeat steps 1 and 2 using interface B
      4. View the confusion heatmap
      5. Complete a survey
    • Teachers returned to view the map of confusion, then completed a survey
  12. Evaluation

    • Constructed a baseline interface: a more literal translation of the index card, i.e., a simple text box
    • 19 local middle and high school teachers produced online lecture videos based on their classroom lecture slides
    • Simulated a 20-30 person classroom: Amazon Mechanical Turk workers watched the teachers' lecture videos and submitted muddy points using either the Mudslide or the baseline submission interface
    • 25 middle and high school students came into the lab, watched two online lecture videos, and submitted feedback with either Mudslide or the baseline in a counterbalanced, within-subjects experimental design (a counterbalancing sketch follows this slide); 21 of them also had time to interact with the Mudslide viewing interface, then took a survey
    • Teachers saw the resulting map of confusion across their lecture slides, then took a survey
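
    The slides report a counterbalanced, within-subjects design but not the assignment rule; a common scheme, assumed here, alternates which interface each successive participant uses first so that both orders occur equally often. Names are illustrative.

    ```typescript
    // Sketch of counterbalancing a two-condition within-subjects study.
    // Alternation by participant index is an assumption; the slides only
    // state that the design was counterbalanced.
    type Condition = "Mudslide" | "baseline";

    function interfaceOrder(participantIndex: number): [Condition, Condition] {
      return participantIndex % 2 === 0
        ? ["Mudslide", "baseline"]
        : ["baseline", "Mudslide"];
    }

    // Example: orders for the first four of the 25 students.
    for (let i = 0; i < 4; i++) {
      console.log(`P${i + 1}: ${interfaceOrder(i).join(" then ")}`);
    }
    ```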
  13. Evaluation: Student Submission Experience

    • 21 of 25 students preferred Mudslide over the baseline.
    • Baseline
      • More freedom: 36% appreciated that they “could write down whatever [they] wanted”
      • Less support: 28% expressed the difficulty or tedium of expressing the muddiest point only through text “instead of just pointing to it.”
      • “Advanced scenarios were very difficult to explain and say what about them was confusing.”
  14. Evaluation: Student Submission Experience

    • Mudslide
      • Pros
        • Visual cues: it “was much easier to provide the muddiest moment when I had the visual cues to help remind me what they were.”
        • Show rather than tell: “I could easily show the instructor the area where I was confused with pretty accurate precision”
      • Cons
        • Hard to “point” at confusing narration
        • “You had to come up with something very relevant and specific to the slides you were clicking on.”
    • Fewer students gave non-substantive comments in the Mudslide condition (p < 0.05)
  15. Evaluation: Teachers' Viewing Experience

    • Compared to the baseline, teachers:
      • rated Mudslide as significantly more useful (p < 0.05)
      • thought Mudslide gave them “a better sense of students' confusion” (p < 0.001)
      • “[The Mudslide UI] allowed me to see EXACTLY [emphasis theirs] where the kids were confused while [the baseline] method was more vague”
      • observed that Mudslide made it harder for students to give general feedback, e.g., about pace and tone quality
  16. Evaluation: Spatial Annotation (Heatmap)

    • Teachers
      • “I found that the heatmap was much better for a quick at a glance view, and I could literally zoom in on problem areas. … If I had more time I’d prefer the comments in text form but I am pretty sure that doesn’t happen very often.”
      • “Seeing what slide was the most confusing and what was confusing about it was helpful and students normally aren't able to provide that feedback on paper.”
  17. Evaluation: Reflection

    • Teachers
      • “More than just gauging how a student does with the activity, I can quickly see where my presentation may [have been] lacking or unclear.”
      • Their perceptions about the clarity of their lectures changed, and they wanted to revise their lectures.
    • Students
      • “It helped me see what other people were confused about, which reassured me when I was confused.”
  18. Mudslide: A Spatially Anchored Census of Student Confusion for Online Lecture Videos

    • Elena L. Glassman (1,2), Juho Kim (1,2)
    • Andrés Monroy-Hernández (1), Meredith Ringel Morris (1)
    • (1) Microsoft Research, Redmond, WA
    • (2) MIT CSAIL, Cambridge, MA
    • {juhokim, elg}@mit.edu
    • {andresmh, merrie}@microsoft.com
  19. We found that this spatially anchored census of student confusion is helpful.
