Like many other academics, Sean Wise noticed a sharp drop in student engagement in class following the pandemic. In response, the professor of entrepreneurship and innovation at Toronto Metropolitan University (TMU) sought to create a tool to help students engage with course material where they organically hang out: on their phones.
“I created ProfBot as an attempt to undermine the global trend of quiet quitting,” he said.
ProfBot, a highly restricted AI-powered chatbot, acts as students’ personal study assistant. Designed to help students prepare for exams, the bot provides course-specific guidance curated by the professor. ProfBot asks students practice questions and compares their answers to A-level answers their professor has previously inputted. Then, the tool provides real-time feedback and explains why a student’s answer may not be up to par.
Dr. Wise said he aimed to make the tool user-friendly for professors. He recommends educators input past exam questions and grading rubrics. “The computer just learns that,” he explained.
Dr. Wise’s colleague, Michael Mihalicz, an assistant professor at TMU who teaches entrepreneurship and strategy, tested the tool with his students and can vouch for the user-friendly interface. “Ultimately, you’re going to get out of it what you put into it,” he said.
Despite the positive effect on students’ grades in a recent pilot testing whether academics would be willing to adopt this type of tech in the classroom, Dr. Wise said reactions have been mixed. Post-pilot interviews found that stigma around misuse, along with sweeping university policies banning AI, left professors hesitant.
“When a professor’s first reaction is, ‘Oh, I read about that; it’s all about cheating,’ they don’t tend to want to expose themselves to adopting new tools because they’re concerned that any adoption could be negative,” he said. “[They] may not be ready for this.”
Dr. Wise readily acknowledged the possible risks associated with AI, but emphasized that “reining in” the bot is essential to differentiate it from an open-ended tool like ChatGPT, which may be susceptible to incorrect answers, racism and abuse.
“We created a bot that was incredibly limited. I mean, so limited; it doesn’t even use any ‘Wow factor,’ but it does its job. And it only does its job,” he said.
Although relatively basic, the tool appears to aid learning. Results from a trial of ProfBot published last May found a correlation between students’ use of the tool and improved final grades. Over a nine-day pilot, 243 students were given access to ProfBot. Fifty-seven per cent of the students who used ProfBot more than once increased their grades by five per cent or more on the final exam. Anecdotally, some students said the bot substantially streamlined their study sessions.
“It was the only thing I used for my final, and I scored a 98,” said Margaret Koca, a student who used ProfBot in Dr. Wise’s entrepreneurial behaviour and strategy course.
Similarly, Susan Odus, who has worked with ProfBot both as a teaching assistant and as a student, said the tool helped her make learning connections she wouldn’t have made otherwise.
As a TA, Ms. Odus said being able to refer students to a tool like ProfBot reduced her workload and helped them tailor their answers to the professor’s expectations. “You can tell who used it and who didn’t,” she added.
Careful application of classroom AI for increased learning equity
Along with reducing educators’ workloads, Mr. Mihalicz suggests that limited AI tools such as ProfBot may be an avenue to ensure greater learning equity. He specializes in using behavioural economics to understand educational disparities in Canada, particularly among Indigenous populations.
“You can create a system like this and make that more broadly available to help make higher learning more accessible to a broader range of students. I also think it could be really valuable in terms of personalized learning experiences,” he said. “But, it’s also pretty dangerous if it’s not done in a good way.”
Ms. Odus also noted that because ProfBot recognizes semantic similarity between a student’s answer and the reference answer, it accommodates a range of word choices. However, she cautioned that students’ overdependence on AI tools, especially with the rise of ChatGPT, can be problematic.
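ProfBot’s internals are not public, so the following is only a minimal sketch of the general idea behind comparing a student’s answer to a professor’s reference answer by similarity rather than exact wording. Real systems typically use sentence embeddings; this illustration uses a simple bag-of-words cosine similarity, and all names and example sentences are hypothetical.

```python
import math
from collections import Counter

def cosine_similarity(a: str, b: str) -> float:
    """Cosine similarity between two answers using bag-of-words counts.

    Illustrative only: a production tool would use semantic embeddings,
    which also match synonyms, not just shared words.
    """
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)          # overlap of word counts
    norm = (math.sqrt(sum(c * c for c in va.values()))
            * math.sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0

# Hypothetical reference ("A-level") answer and student answer:
reference = "a startup tests assumptions through rapid experiments"
student = "startups test their assumptions by running quick experiments"
score = cosine_similarity(reference, student)  # higher = closer wording
```

A grader built this way would flag answers below some threshold for feedback; note that word-count similarity alone misses paraphrases (“startups” vs. “startup”), which is exactly the gap semantic models are meant to close.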
“Sometimes, students really rely on these tools in a way that’s not very healthy,” she said. Given that risk of overdependence, Mr. Mihalicz said he recognizes the need for caution when implementing these tools. Still, he worries that students who never learn to use AI may not be competitive in the future job market.
“You have a lot of people who are worried about AI and the impact that it’s going to have on education and on academia. I think it’s happening, regardless of how people feel about it. And, I think that we are doing our students a disservice by not teaching them how to use these tools in a responsible way,” he said.
No such thing as “incredibly limited” in coding. Coding in general, and machine learning and AI more specifically, are inherently human-redundant (yes, that is the jargon). Any current limitation is one programmer away from being removed. The deskilling of teachers began more than 40 years ago in North American public education; that is why private schools maintain a lower student/teacher ratio and offer a plethora of extracurricular activities that promote engagement. Contrary to machine learning’s fundamental assumption, people are not stupid. If they are not engaged with a course, it is probably due to three core design aspects: student/teacher ratio, qualitative methods, and/or faculty expertise. AI teaching is, as the name implies, artificial.