Answering Your AI Questions with Laura Dumin, PhD

Macmillan Employee

The introduction of generative AI in academic environments has sparked a vibrant discussion on its impact on academic integrity, creativity, and the evolving roles of educators. This same dialogue inspired the creation of the Institute at Macmillan Learning and its first course, “Teaching with Generative AI: A Course for Educators.” Laura Dumin, PhD, a leading voice in this discourse, is one of three subject matter experts who contributed to the course, which offers a blend of asynchronous and synchronous learning, including hands-on experience developing a course policy around AI, designing assignments with considerations for AI, and navigating conversations with students about the use of AI. 

To get a glimpse into the practical knowledge and insights the course will offer, we asked Dr. Dumin five questions about AI in higher education that emerged from our AI webinar series last fall. Read on to get her take as she shares her insights on real questions from instructors like you.

Laura Dumin, PhD, is a professor of English and Technical Writing at the University of Central Oklahoma. She has been exploring the impact of generative AI on writing classrooms and runs a Facebook learning community to allow instructors to learn from each other. When she is not teaching, Laura works as a co-managing editor for the Journal of Transformative Learning, directs the Technical Writing BA, and advises the Composition and Rhetoric MA program; she has also been a campus Scholarship of Teaching and Learning (SoTL) mentor. Laura has created four micro-credentials for the Technical Writing program and one for faculty who complete her AI workshop on campus.

In your experience, when it comes to verifying the authenticity of AI-generated content, particularly in academic papers, are there methods that can be used beyond relying solely on AI-detection tools like Turnitin?

Laura Dumin: I would turn this question on its head and ask why we are verifying content. Are we concerned about facts that might be hallucinations [or false information generated by AI]? If so, we need to teach our students research skills. Are we worried about AI writing the whole assignment or large portions of it? This gets to a different question about why students are in school and taking classes. Do they need the knowledge for the future, or is this course just a graduation requirement that they care little about? How can the information be made more relevant to students so that they are less likely to cheat? And finally, teach students about ethical and transparent AI use so that they are comfortable telling you when and where they used AI to augment their own work.

Should educators consider it their responsibility to educate students on the ethical and responsible use of AI tools, akin to how they teach the responsible use of platforms like Google and Wikipedia and tools like graphing calculators?

Laura Dumin: Yes! I get that there is a lot of content for instructors to get through in 15 weeks, but the reality is that these technologies are still new to a lot of students. We can’t expect them to magically learn ethical and responsible AI use if we don’t help them get there. Also, what looks responsible in one field might be different in another field. We want students to learn and understand the tools and the nuances for responsible use.

With the increasing role of AI in academic writing, what are your thoughts on universities introducing prerequisite courses dedicated to teaching students how to effectively use AI tools?

Laura Dumin: I’m not sold on the need for these types of courses. I’m seeing talk that AI is being addressed more in middle school, which means that we have about five years before students come to us with a better understanding of ethical and responsible AI use. If it takes an average of 18 months to get a new course on the books, the course probably won’t have a long lifespan. And since the AI landscape keeps rapidly shifting, it would be hard to put together a course that looked the same from initial proposal to the first time it is taught.

What are your thoughts on using AI to aid brainstorming while nurturing students’ independent thinking? Can it potentially hinder students’ creativity in generating original ideas?

Laura Dumin: I’m okay with it, and I am a writing instructor. I get that the struggle of brainstorming can be part of the process in a writing class. If that’s the case, make that clear to students and help them understand the rationale. But if a student is so frozen by an inability to move past the brainstorming phase that they can’t get the project done, no one wins. In that case, AI can give students a path forward. AI brainstorming can also open new possibilities for students by showing them sides of an issue that they weren’t aware of.

Despite the availability of AI-driven revision tools, how might educators motivate students to actively seek feedback from peers and writing centers and recognize its value?

Laura Dumin: Having students reflect on AI feedback versus human feedback can help them see the pros and cons of each type. I’m more interested in students getting feedback that is helpful and that works for them. I don’t think it has to be, or even should be, an either/or situation. If a student can’t make it to the writing center or doesn’t have access to peers for help, why not use AI to get at least some feedback?

Learn more about the "Teaching with Generative AI" course.

Learn More About Laura Dumin.