AI in Education: Six Steps to a Strong Classroom Policy
Creating effective AI policies for classrooms can be a challenging but crucial task. By implementing good policies, instructors can create a robust set of instructions that not only guide students in the ethical use of AI tools but also foster a learning environment where they can develop their critical thinking and independent learning skills.
How to create a policy, and what to include in it, are among the learning objectives of the “Teaching with Generative AI Course” from the Institute at Macmillan Learning. While course participants get hands-on experience and guidance creating policies best suited to their classrooms, some best practices are universally beneficial. Here are six ideas to help you create an effective and impactful policy.
Define Clear Guidelines for AI Use, Including Use Cases.
Clearly outline what is allowed and what is prohibited when using AI in coursework to avoid any ambiguity. Spell out acceptable uses, prohibited uses, and how to attribute and cite AI assistance. For example, some uses you may want to designate as acceptable or prohibited include: brainstorming ideas, checking for grammar or formatting errors, paraphrasing complex source material, summarizing key themes and arguments in source texts, or exploring different perspectives on a topic.
You may find only some AI tools acceptable, or want to specify different guidelines for ChatGPT, Gemini, Grammarly, Copilot, and Claude, among many others. For example, you might specify that a tool like Grammarly can be used for grammar checks and style improvements, while a tool like ChatGPT can be used for brainstorming ideas but not for generating entire paragraphs or essays. For instance, "Students may use ChatGPT to generate ideas for essay topics but must draft the actual content themselves." Adam Whitehurst, Senior Director of Course Design at Macmillan Learning, noted, “Given how rapidly AI tools are evolving, it's likely that you will discover what AI is and isn't good at alongside your students. Set a tone that leaves an opening for grace and flexibility.”
Explain Why the Policies Are Put in Place.
Whether it is to maintain academic integrity, help students develop their own critical thinking skills, or teach appropriate referencing and citation, there are many good reasons to have a policy in place. While it may be tempting for students to take the easy way out of their work, they would be missing out on many of the benefits of learning. Explaining this and other reasons for the policy can help students understand why following it matters. "It's important to connect your guidelines with your course outcomes so that students understand why the use of AI in your course does or does not support their learning," Whitehurst said.
One instructor has the following language in their policy: “College is about discovering your own voice and style in writing, creating the version of yourself you want to be in the future. To do that, you need to trust in your own ideas and develop the skills you have to become that person you want to be.”
Help Students Develop Strategies for Responsible AI Use.
Provide students with practical ways to integrate AI into their learning process without relying on it to do their work. This sets them up for success in your class and better prepares them for entering a workforce where AI will likely be used. AI tools can be used to enhance learning, and can provide additional resources and perspectives. But using them as shortcuts to bypass the learning process can be harmful and deprive students of the knowledge they need to succeed in classes that build on what they should have learned in your course.
For example, you may want to encourage students to use AI tools for initial brainstorming or to explore different perspectives on a topic. For instance, “Use ChatGPT to generate a list of potential research questions, then choose the one that interests you the most and develop your thesis from there.”
Outline Misuse Clearly.
Create clear guidelines around the misuse of AI tools by defining misuse explicitly, providing concrete examples, and explaining the consequences. This puts everyone on the same page and fosters a culture of integrity by emphasizing the importance of using AI tools ethically and the value of academic honesty.
For example, you may explain in your policy that misuse of AI includes, but is not limited to, using AI to generate entire essays, copying AI-generated content without proper attribution, and submitting AI-generated content as your own work. For instance, “Submitting a paper where the majority of the content is generated by ChatGPT without proper citation is considered academic dishonesty.” It’s also important to outline what happens when there is misuse. Speaking of misuse …
Create Consequences for Misuse.
Clearly state the consequences students face for submitting AI-generated content as their own work in order to uphold academic integrity. Policies should include any penalties students will incur if they submit AI-assisted work outside an assignment's parameters. Consider whether the consequences should differ between a first offense and later offenses, and consider building in flexibility, laying out the options in a way that lets you tailor them as needed.
Consequences may range from a warning to a failing grade on the assignment or in the course. For example: "First offense: The student will receive a warning and be required to redo the assignment. Second offense: The student will receive a failing grade for the assignment and may be referred to the academic integrity committee. Third offense: The student may receive a failing grade for the course and face disciplinary action." By outlining the consequences of misuse, you strengthen your policy and help students better understand what's at stake.
Require Transparency, Documentation, and Attribution.
Ensure students understand the importance of citing AI tools appropriately in their work by asking them to include attribution. Students may not be sure of the best way to do that, and a policy that spells out how removes the excuse of not knowing what was allowed. Requiring documentation of how AI tools were used in assignments encourages students to be fully transparent and to reflect on how the tools have shaped their work and their learning. Some examples of this could include:
“[Text/Visuals] were created with assistance from [name the specific Gen AI tool]. I affirm that the ideas within this assignment are my own and that I am wholly responsible for the content, style, and voice of the following material.”
“Text generated by ChatGPT, OpenAI, Month, Day, Year, https://chat.openai.com/chat.”
Crafting effective AI policies is essential to fostering an environment where students can leverage technology ethically while developing their critical thinking and independent learning skills. By implementing policies that meet their unique needs, instructors can ensure that AI serves as a valuable tool for enhancing education rather than undermining it. The result is a balanced, transparent, and accountable classroom that prepares students for the future, equipping them with both the technical know-how and the ethical framework to succeed in an AI-enabled world.
Want more on AI? Check out:
How Authors & Educators Can Shape AI’s Future in Education
Can AI Create a More Engaged Student?
Answering Your AI Questions with Laura Dumin, PhD
Answering Your AI Questions with Daniel Frank, PhD
Answering Your AI Questions with Antonio Byrd, PhD