Generation of Personalized Tasks and Sample Solutions in MATLAB for Anonymous Peer Feedback

Recording of my talk at the MATLAB EXPO 2024

Website: https://www.matlabexpo.com/online/202...

Learning the fundamentals of electrical engineering is not a spectator sport. Lecturers can explain the basics, but they cannot make students understand. Just as with learning to ride a bicycle or play the piano, students need to actively work through many examples, from simple to more sophisticated. Simple topics can be covered with multiple-choice questions or numerical problems. For more complex problems, handwritten solutions are more appropriate, since circuits, formula transformations, or diagrams can be sketched quickly.

Students also need frequent and timely feedback on their progress and results. To increase engagement in electrical engineering and improve student understanding, we use MATLAB and LaTeX to create personalized assignments so that students can collaborate and discuss different approaches to a handwritten solution, but cannot simply copy and plagiarize from each other. Students then correct and evaluate their peers' submissions in a double-blind, anonymous peer-feedback process to reduce the review burden on instructors and supervisors.

Personalized sample solutions help students perform the review in a technically correct way, even if they were not able to solve their own problem correctly. The talk shows how MATLAB can be used as a framework to automate the generation of individualized problems and sample solutions, their distribution to students, and the administrative process of peer review and grading. It concludes with experiences from past semesters and an outlook with suggestions for further improvements.
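
To give an idea of the approach (this sketch is not taken from the talk itself), personalization can be achieved by seeding MATLAB's random number generator with a student ID, drawing task parameters, computing the sample solution, and filling placeholders in a LaTeX template. The template file name, the placeholder syntax, and the voltage-divider example below are assumptions chosen for illustration:

% Minimal sketch: one personalized task plus its sample solution.
studentId = 12345;            % hypothetical matriculation number
rng(studentId);               % reproducible, per-student randomization

R1 = randi([10 100]) * 10;    % resistor values in ohms
R2 = randi([10 100]) * 10;
U  = randi([5 24]);           % source voltage in volts

% Sample solution computed alongside the task (voltage divider as an example)
U2 = U * R2 / (R1 + R2);

% Fill a LaTeX template containing placeholders such as <<R1>> or <<U2>>
tpl = fileread('task_template.tex');           % assumed template file
tpl = strrep(tpl, '<<R1>>', sprintf('%d', R1));
tpl = strrep(tpl, '<<R2>>', sprintf('%d', R2));
tpl = strrep(tpl, '<<U>>',  sprintf('%d', U));
tpl = strrep(tpl, '<<U2>>', sprintf('%.2f', U2));

fid = fopen(sprintf('task_%d.tex', studentId), 'w');
fprintf(fid, '%s', tpl);
fclose(fid);

% Compile to PDF (requires a LaTeX installation on the system path)
system(sprintf('pdflatex -interaction=nonstopmode task_%d.tex', studentId));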

Chapter marks:
00:00 Introduction
01:11 Motivation
07:47 Process
10:21 Tasks and Sample Solutions
12:37 Implementation
14:40 Evaluation
19:25 Summary
