Learning Stories Blog
Showing articles with label AI.
MarisaBluestone, Community Manager | 09-06-2024 05:52 AM
As a new school year begins, educators are once again tasked with finding innovative ways to engage and support their students. The tools and technologies available today are more advanced than ever, and AI is at the forefront of many of these innovations. While AI might sound futuristic or even intimidating, it’s already changing the educational landscape and reshaping classrooms and learning experiences in meaningful ways. This is true in high schools and colleges; in STEM or humanities courses; and with new and seasoned educators.
While AI may feel overwhelming, there are practical steps you can take now to ensure that its use in your class helps, rather than hinders, students’ learning. From personalized learning paths to more efficient administrative tasks, AI offers a variety of practical benefits that can enhance teaching and make the learning journey more enriching for students.
AI is not, in any way, a substitute for human interaction. But by embracing it thoughtfully, we can help create more dynamic, personalized, and effective learning environments. At Macmillan Learning, we believe AI is a powerful tool that can elevate human capabilities, enrich learning experiences, and empower both educators and students. Here are some practical tips to help you seamlessly integrate AI into your classroom.
1. Start Small and Build Confidence
The prospect of using AI in the classroom can be daunting, both for educators and students. You can start small by introducing AI tools that address common challenges in a simple, approachable way. For instance, you might ask students to use an AI, like ChatGPT, to brainstorm topics for an upcoming essay. This low-stakes introduction can help demystify AI, allowing students to see it as a supportive tool rather than a replacement for their creativity or effort. Over time, as both you and your students become more comfortable, you can expand the use of AI to more complex tasks, gradually building confidence and proficiency.
2. Focus on Supplementing, Not Replacing
AI should be seen as an aid to enhance the teaching and learning process, not as a replacement for the teacher's role. You can use AI-driven tools like Macmillan Learning’s AI Tutor to support students' learning when, where, and how they need it most, transforming homework and self-study into an interactive and engaging experience that promotes deeper understanding. For example, the AI can handle more routine questions and provide basic explanations when the student needs it, freeing up class time for you to dive deeper into complex topics or facilitate richer discussions. Learning is an inherently human experience, and this approach both maximizes learning efficiency and reinforces your essential role in guiding and mentoring students.
3. Provide Clear Guidelines and Uses
Establishing clear guidelines on how AI should be used in your classroom is crucial to maintaining academic integrity. It’s helpful to outline what constitutes acceptable AI-assisted work in your class, such as using AI for brainstorming, researching information, or checking grammar and punctuation in drafts. Emphasize that while AI can be a valuable partner in a student’s learning journey, it should not replace critical thinking or be used to complete assignments in their entirety. Additionally, provide guidance on how to cite AI sources, reinforcing the importance of transparency and accountability. For example, if a student uses an AI tool to gather initial research, they should include a citation similar to how they would credit a book or article. (Read more about best practices for creating an AI policy).
4. Promote Ethical Use and Awareness
It may not be intuitive for students to know the “right” way to use AI. While having an AI policy in place helps students understand what counts as acceptable use, it’s also important to explain why those policies exist, focusing on the value of critical thinking, problem-solving, and building authentic communication skills. Make sure your students are aware of the potential pitfalls, such as AI-generated content that may be biased, incorrect, or based on outdated information. One way to do this is to discuss real-world scenarios from your assignments where AI-generated errors could have serious consequences, helping students grasp the importance of cross-verifying information and taking responsibility for the final content. By fostering an ethical mindset, you help prepare students not just for success in the classroom but for a future where AI will likely play a significant role in their careers.
5. Integrate AI into Existing Assignments
Instead of creating entirely new assignments, find ways to integrate AI into what you're already doing. This can be done without compromising the core objectives of your assignments, such as critical thinking, content mastery, and creativity. For example, you might encourage students to use AI for the initial stages of a project, such as generating ideas or outlining their thoughts. After this AI-assisted start, students can develop these ideas further on their own, ensuring the final output is a genuine reflection of their understanding and creativity. This approach maintains the focus on developing critical thinking and original thought while leveraging AI to enhance the process.
6. Encourage Reflection and Metacognition
Reflection can be a powerful tool for learning, helping students to think about their learning process and identify areas where they may need further improvement or additional study. Recognizing these gaps allows them to seek help, adjust their learning strategies, and focus on areas that require more attention. (Read more about our research on its impact here.) Surprisingly, there’s an easy way to integrate AI as well as encourage metacognition: design some assignments that require students to provide a critical reflection on their work and the AI's role in it. For example, after completing a project, ask students to write a brief reflection on what role AI played in their work and what they learned from using it. Did it help them think more broadly about a topic? Did they find it limited in certain ways? This exercise not only promotes metacognition but also helps students develop a more nuanced understanding of how to use AI effectively and ethically.
7. Stay Updated and Experiment
AI is a rapidly evolving field, and new tools and features are constantly emerging. Make it a habit to stay informed about the latest developments in AI technology and educational applications. Experiment with different AI tools to find those that best complement your teaching style and meet your students' needs, consider signing up for Teaching with Generative AI or a similar course, and come back to the Macmillan Learning Community for tips and best practices.
Whether you’re a seasoned educator or just beginning to explore AI’s potential, these tips provide a solid starting point and are just some of the many ways you can get started with AI in your classroom. Like AI itself, there’s no one-size-fits-all approach, so it’s essential to experiment, adapt, and discover what works best for you and your students. As you begin to integrate AI, remember that it's a tool meant to enhance your teaching, not replace it. By staying curious and open to new ideas, you’ll not only keep up with the latest advancements but also lead the way in preparing students for the future.
MarisaBluestone, Community Manager | 08-08-2024 12:57 PM
Creating effective AI policies for classrooms can be a challenging but crucial task. By implementing good policies, instructors can create a robust set of instructions that not only guide students in the ethical use of AI tools but also foster a learning environment where they can develop their critical thinking and independent learning skills.
Learning how to create a policy, and what should be included in one, is among the learning objectives of the “Teaching with Generative AI” course from the Institute at Macmillan Learning. While course participants get hands-on experience and guidance creating policies best suited to their classrooms, some best practices are universally beneficial. Here are six ideas to help you create an effective and impactful policy.
Define Clear Guidelines for AI Use, Including Use Cases.
Clearly outline what is allowed and what is prohibited when using AI in coursework to avoid any ambiguity: acceptable uses, prohibited uses, and how to attribute and cite AI-assisted work. For example, some uses you may want to designate as acceptable or not include brainstorming ideas, checking for grammar or formatting errors, paraphrasing complex source material, summarizing key themes and arguments in source texts, or exploring different perspectives on a topic.
You may find only some AI tools acceptable, or want to specify different guidelines for ChatGPT, Gemini, Grammarly, Copilot, and Claude, among many others. For example, you might specify that AI tools like Grammarly can be used for grammar checks and style improvements, while tools like ChatGPT can be used for brainstorming ideas but not for generating entire paragraphs or essays. For instance, "Students may use ChatGPT to generate ideas for essay topics but must draft the actual content themselves." Adam Whitehurst, Senior Director of Course Design at Macmillan Learning, noted, “Given how rapidly AI tools are evolving, it's likely that you will discover what AI is and isn't good at alongside your students. Set a tone that leaves an opening for grace and flexibility.”
Explain Why the Policies are Put in Place.
Whether it’s to maintain academic integrity, help students develop their own critical thinking skills, or teach appropriate referencing and citation, there are many good reasons to have a policy in place. While it may be tempting for students to take the easy way out of their work, they would be missing out on many of the benefits of learning. Explaining this and the other reasons behind your policy helps students see that value. "It's important to connect your guidelines with your course outcomes so that students understand why the use of AI in your course does or does not support their learning," Whitehurst said.
One instructor has the following language in their policy: “College is about discovering your own voice and style in writing, creating the version of yourself you want to be in the future. To do that, you need to trust in your own ideas and develop the skills you have to become that person you want to be.”
Help Students Develop Strategies for Responsible AI Use.
Provide students with practical ways to integrate AI into their learning process without relying on it to do their work. This sets them up for success in your class and better prepares them for entering a workforce where AI will likely be used. AI tools can be used to enhance learning, and can provide additional resources and perspectives. But using them as shortcuts to bypass the learning process can be harmful and deprive students of the knowledge they need to succeed in classes that build on what they should have learned in your course.
For example, you may want to encourage students to use AI tools for initial brainstorming or to explore different perspectives on a topic. For instance, “Use ChatGPT to generate a list of potential research questions, then choose the one that interests you the most and develop your thesis from there.”
Outline Misuse Clearly.
Creating clear guidelines around the misuse of AI tools can be done by defining misuses explicitly, providing concrete examples, and explaining the consequences of misuse. This puts everyone on the same page and helps foster a culture of integrity by emphasizing the importance of using AI tools ethically and the value of academic honesty.
For example, you may explain in your policy that misuse of AI includes, but is not limited to, using AI to generate entire essays, copying AI-generated content without proper attribution, and submitting AI-generated content as your own work: “Submitting a paper where the majority of content is generated by ChatGPT without proper citation is considered academic dishonesty.” It’s also important to outline what happens when there is misuse. Speaking of misuse …
Create Consequences for Misuse.
Clearly state the consequences students will face for submitting AI-generated content as their own in order to uphold academic integrity. Policies should include any penalties students will incur if they submit AI-generated work outside an assignment’s parameters. Consider whether the consequences should differ between a first offense and later offenses, and build in flexibility, laying out the options in a way that allows you to tailor them as needed.
Consequences may range from a warning to a failing grade on the assignment or in the course. For example: "First offense: The student will receive a warning and be required to redo the assignment. Second offense: The student will receive a failing grade for the assignment and may be referred to the academic integrity committee. Third offense: The student may receive a failing grade for the course and face disciplinary action." By outlining the consequences of misuse, you strengthen your policy and help students better understand what's at stake.
Require Transparency, Documentation and Attribution.
Ensure students understand the importance of citing AI tools appropriately in their work by asking them to include attribution. Students may not be sure of the best way to do that, and a policy that clarifies how eliminates the excuse of not knowing whether a use was allowed. Requiring documentation of how AI tools were used in assignments encourages students to be fully transparent and to reflect on how the tools have impacted their work and their learning. Some examples of this could include:
“[Text/Visuals] were created with assistance from [name the specific Gen AI tool]. I affirm that the ideas within this assignment are my own and that I am wholly responsible for the content, style, and voice of the following material.”
“Text generated by ChatGPT, OpenAI, Month, Day, Year, https://chat.openai.com/chat.”
Crafting effective AI policies is essential to fostering an environment where students can leverage technology ethically while developing their critical thinking and independent learning skills. By implementing policies that meet their unique needs, instructors can ensure that AI serves as a valuable tool for enhancing education rather than undermining it. The result is a balanced, transparent, and accountable classroom that prepares students for the future, equipping them with both the technical know-how and the ethical framework to succeed in an AI-enabled world.
Want more on AI? Check out:
How Authors & Educators Can Shape AI’s Future in Education
Can AI Create a More Engaged Student?
Answering Your AI Questions with Laura Dumin, PhD
Answering Your AI Questions with Daniel Frank, PhD
Answering Your AI Questions with Antonio Byrd, PhD
MarisaBluestone, Community Manager | 07-17-2024 07:05 AM
Many of us can pinpoint the one person who helped us become who we are today. Whether it was a parent helping us decide what career to pursue, a teacher igniting passion for a topic, a counselor offering understanding and belonging, or someone else entirely, that interaction changed the course of our lives. Sparking that flame of curiosity is the reason why many of us show up every day to work for a learning company like Macmillan Learning, and it’s also the reason why instructors’ and authors' work is more important than ever.
It’s that feeling that Macmillan Learning CEO Susan Winslow evoked in her keynote speech at the Textbook and Academic Authors Association annual meeting last month in Nashville. Her discussion with authors, most of whom were also educators, focused on the critical role that they can and should play in guiding AI use in the classroom. Her key message was this: learning is very much a human experience, and while the integration of AI and other tech tools in educational settings may change how teachers teach and learners learn, the core of the experience, what makes us want to learn, has not.
Below, learn more about what she had to say about how authors and educators can shape AI and how AI can be used as a tool to help students learn even better. You might also pick up some practical tips for integrating AI into your classroom.
AI in Education Today
AI is fast becoming a familiar presence in classrooms. From students' use of large language models (LLMs) to conduct research, to AI-driven tutoring systems, to automated grading and personalized learning platforms, it’s hard to avoid its impact on teaching and learning. And while these developments are welcome, getting here wasn't seamless.
While many of us were enthusiastic, just as many were apprehensive and skeptical about the benefits of AI in educational settings. Winslow shared her own experience with this and explained that shortly after ChatGPT launched, "we were slammed with very urgent worried messages from instructors saying, 'I think my students just cheated their way through the final.'" Instructors wanted a voice in how AI was being used in their classrooms, and while cheating was the initial reaction, it wasn't their only concern. Many were also worried about the potential for increased inequities, the perpetuation of biases within AI systems, and their inability to keep up with the rapid changes.
She described her own learning journey as a series of “stages of learning” with AI, ultimately moving from early denial to enthusiasm about the opportunities ahead. She encouraged Macmillan Learning employees to learn as much as possible and experiment with different technologies as quickly as possible because “even to experiment you have to protect content and understand what AI is capable of.”
With Winslow's own learning journey and the company’s experimenting came important and still unanswered questions: How is this going to affect equity? How are we going to empower learners with the skills they need for the AI-enabled workforce of the future? And what did any of it mean if nobody was speaking with educators? While big tech companies had been chasing the dream and the challenge of AI, educators had not been brought to the proverbial table to discuss where it went next. Even with all those unanswered questions, Winslow’s direction about the way forward was clear. “We could all be great at teaching AI how to be an exceptional learner. But that isn't our job. Our jobs are still to help humans learn, and so our mission really didn't change.” She recognized that learning companies like Macmillan Learning could help to inspire what was possible for every learner. And she also knew that students would be best served if the company did it with input and feedback from educators.
Shaping AI Through Collaborative Efforts
Macmillan Learning's initiatives in customizing AI tools to meet specific educational needs have shown promising results. “The most successful projects in the past were when that little synergy came to life and we went from serendipity and big ideas to ‘yes, let's make that happen,’” Winslow said. Educators' involvement in the design and implementation phases ensures that AI tools are developed with a deep understanding of educational needs and contexts.
On the flip side, passive adoption of AI could lead to unintended consequences, including increased inequities and the perpetuation of biases. Educators bring unique insights and expertise that are crucial in refining and improving AI tools, ensuring they serve the educational community effectively.
Winslow's keynote emphasized the necessity for educators to actively engage with AI tools rather than adopting them passively. "We need you in this space more than ever. You are the critical piece," Winslow said.
Alongside instructors and students, who are beta testing new products, Macmillan Learning is actively exploring how and to what extent AI should be integrated in its products and services. Internally, the company has conducted experiments with AI tools to understand their capabilities better. Upcoming products, such as the AI Tutor and iClicker Question Creator, are designed to assist instructors in creating more engaging and personalized learning experiences.
Practical Steps for Educators and Authors to Integrate AI
Integrating AI into the classroom may feel overwhelming, but there are practical steps that instructors and authors can take to engage effectively with this technology. Susan Winslow emphasized the importance of experimenting with AI tools and sharing findings with peers. Here are some actionable steps based on her insights:
Engage in Safe Exploration: Platforms like Playlab.ai offer educators a safe space to experiment with AI tools, create assignments, and share experiences without risking intellectual property issues. Winslow highlighted that using such platforms helps educators get familiar with AI, brainstorm new ideas, and rethink classroom activities with AI skills. Winslow further stressed that educators should not underestimate how confused students can be about the rules of AI use in the classroom. By experimenting with AI tools and sharing their experiences, educators can learn what works and what doesn’t, and develop ethical guidelines for AI use.
Participate in Training and Workshops: Given the rapid advances of AI, it’s a good idea to stay up to date on the latest thinking on AI and education. Winslow shared her own journey of learning about AI by attending conferences, reading research articles, and staying updated on the latest developments. Educators and authors should continuously seek knowledge about AI and its implications for education. One good way to do that is through participation in workshops and training sessions, such as the AI for Educators eight-week course offered by the Institute at Macmillan Learning. These sessions provide structured support and professional development on AI basics, helping educators and authors understand AI’s capabilities and limitations.
Collaborate with Publishers: Working closely with publishers in this space can provide valuable feedback and insights. Winslow mentioned that Macmillan Learning frequently organizes gatherings where educators can play with tools, give feedback, and discuss the practical applications of AI in education. This collaboration ensures that AI tools are developed with a deep understanding of educational needs and contexts.
Address Ethical Considerations: Winslow highlighted the ethical concerns related to AI use, such as ensuring equity and addressing biases. Educators should consider these factors when integrating AI into their teaching practices. Creating a transparent and honest environment where the use of AI is documented can help mitigate some of these concerns.
Focus on Human Connection: Despite the capabilities of AI, Winslow noted that human interactions are essential for deep learning. Educators should leverage AI to enhance, not replace, human interactions in education. AI can handle administrative tasks and provide core explanations, but it cannot offer the deeper insights and emotional engagement that human instructors can.
Looking ahead, Winslow envisions a future where AI and human educators work together to enhance learning outcomes. She believes, "In a future where AI is changing our jobs and skill sets and is still prone to hallucinations and is still citing awards and articles that do not exist, the reverse is true of what some of the media says … expertise matters more. Good training matters more." The ongoing role of educators in shaping and refining AI tools will be crucial in ensuring that AI enhances rather than replaces human interactions in education.
While AI tools can provide good core explanations about topics, “they don't offer deeper insights or answer questions about what students should now do with the information they just learned.” And that is yet another reason why the human component of learning is so important. As we move forward, it is more important than ever for educators and authors to take proactive steps in shaping the future of AI in education, ensuring that it serves to enhance, rather than undermine, this very human experience of learning.
Stay tuned for even more insights from the TAA meeting in the coming weeks and months. To read more about AI and the inherently human experience of learning from Susan Winslow, click here, and to learn more about Susan's presentation at TAA, click here.
shanifisher, Macmillan Employee | 04-30-2024 07:00 AM
What if instead of getting in the way, AI could help students get deeper into topics that interest them?
I’ve been thinking about this since attending the College Board’s 2023 AP Annual Conference, where a panel of educators spoke about the role of generative AI in the classroom. The majority of teachers attending the session had not used AI and were apprehensive about how it could support them, until some of the early adopters started to share their experiences. Light bulbs began to go on, and the conversation quickly turned from a negative focus on cheating to an emphasis on providing students with new opportunities to learn. Time has allowed more teachers to better understand what AI can do for them, and it is fueling excitement!
One teacher from the panel I caught up with, Bruno Morlan of Acalanes High School (CA), has his AP History students practice answering questions with feedback from AI to help elicit the additional details needed for full credit. He’s also used AI to help students decide where to do additional research in order to prepare for a presentation. “[AI] has not gotten in the way of honest academic work; it has students getting deeper into topics that interest them.”
We know that teachers are constantly learning, and AI literacy is at the forefront of lifelong learning for educators. Former Chief Reader for the AP® Psychology Exam and new coauthor on Myers' Psychology for the AP® Course, Elizabeth Yost Hammer, is empowering her students to learn AI literacy alongside other key skills in the college psychology classroom. “We teach critical thinking, we teach tech literacy, and we talk about how these come together in thinking critically about what we’re reading online. Now we are talking about AI literacy broadly: how do you use it well, how do you critically think about it, and how do you use it ethically.”
Hammer is teaching students to create a hypothesis before using AI to assist in a research project. Students were a bit surprised to hear their professor suggesting they use AI. At Xavier University of Louisiana, where Hammer serves as the director of the Center for the Advancement of Teaching and Faculty Development, their mission is to develop students into leaders. Hammer encourages her students to use AI as a tool, but to always imbue their voice in their work: “We need you at the table, and you are not going to be at the table if you don’t have a voice.” AI literacy involves students understanding the benefits and the risks, checking information sources, and always developing their own thoughts. This is preparing them for their time outside the classroom as they think about career opportunities, too.
At Macmillan Learning, we see AI creating more engaged students with an AI Tutor that we’re piloting in several disciplines in higher ed. The tutor is prompted to chat with students in a Socratic communication style, aiming to stimulate intellectual curiosity and facilitate self-directed learning. The tutor will not simply provide the answer but is instead instructed to guide students through a specific homework question. Instructors piloting the tutor have reported that their students are getting the help they need in a safe space, where they are not embarrassed to keep asking questions the way they might be with a human tutor. Students are engaging and asking more questions when they don’t understand or need a reminder on key concepts or equations. One instructor noted: “The AI Tutor is helping students get started, resulting in less questions about the basics of how to work problems. This is allowing us to use help room time/office hours for more advanced questions and even some discussions!”
Whether you’re a college instructor or teaching high school, AI is sure to have an impact on your classroom and influence how students learn. Embracing AI doesn't mean losing the essence of teaching and learning; it's about amplifying it, making sure every voice is heard, and every piece of history is explored with a fresh perspective. Let’s work together to prepare students not just for exams, but for a world where technology and humanity work together to spark new ideas and new voices with our engaged students.
MarisaBluestone, Community Manager | 04-08-2024 07:38 AM
Daniel Frank, PhD, is one of three subject matter experts who contributed to the first course at the Institute at Macmillan Learning, "Teaching with Generative AI: A Course for Educators." The course integrates diverse perspectives into the discourse surrounding AI in education by blending asynchronous and synchronous learning. It offers practical experience in formulating AI-related course policies, designing AI-informed assignments, and fostering dialogues with students on AI applications.
Dr. Frank offers a unique perspective on AI in higher education, tackling three key questions from our AI webinar series last fall. Explore his background and insights on real queries from fellow professors for a closer look at the practical knowledge the Institute course will offer.
Daniel Frank, PhD, teaches First Year Composition, multimedia, and technical writing within the Writing Program at the University of California, Santa Barbara. His research interests include AI Writing technologies, game-based pedagogy, virtual text-spaces and interactive fiction, passionate affinity spaces, and connected learning. Dan is always interested in the ways that new technologies interface with the methods of making, communicating, learning, and playing that students are engaged with across digital ecosystems. His pedagogical focus is always rooted in helping students find their own voices and passions as they learn to create, play, and communicate research, argumentation, and writing, across genres, networks, and digital communities.
Should educators consider it their responsibility to educate students on the ethical and responsible use of AI tools, akin to how they teach the responsible use of platforms like Google and Wikipedia and tools like graphing calculators?
Daniel Frank: It’s long been my position that the technology is (and is becoming increasingly) ubiquitous, and that attempting to ban all use or consideration of the technology will not remove the tech from our students’ lives, but will instead remove only honest approaches and conversations about the tech from the classroom. Generative AI is a strange technology that can be easily misunderstood and misused. I think it’s much more productive to bring the tools into the light so that they can be critically considered, rather than swept into the shadows for students to use in all the wrong ways.
What are some strategies to foster students' intrinsic motivation through generative AI, focusing on methods that go beyond external incentives such as grades or assignment completion?
Dan Frank: It’s worth noting that the points-based, grade-focused approach of much of traditional education isn’t conducive to valuing personal growth and development. If education is framed as a transactive process where students are here to get their grade and move on, they will turn to tools that promise to automate or alleviate that arduous process. If we want to instill intrinsic motivation in our students, we’ll have to create spaces in our curriculum for experimentation and risk taking. Students should be encouraged to see LLMs as the limited technologies they are and to value their own critical thinking, choices, and rhetorical sovereignty when interfacing with these tools, but the pressure to make their work ‘perfect’ to get the points they need will short-circuit that process and tempt them to cut corners. I think it can be very valuable to think about how, for instance, a paper that clearly uses too much generative AI at the cost of clear, unique, personalized, critical thinking might serve as a learning opportunity rather than an ‘I caught you’ moment.
How can we harness AI to boost students' writing skills while ensuring they actively engage in the writing process rather than solely relying on AI-generated content?
Dan Frank: I think the key to this is to help students learn to value what they can bring to the table that AI cannot. It’s very important to help students learn to critically ‘read’ the output of a Large Language Model (LLM) such as ChatGPT. Though this is a revolutionary technology, it is still deeply limited: it lacks the deeper thinking, creativity, and critical insight that only a human brain can bring to a paper. Students can be taught to see how LLMs produce predictable sentence structures, throw around unnecessary ‘fluff,’ tend to sound like they’re ‘selling’ rather than analyzing, make gestures at ideas but don’t really unpack them, and so forth. The second part of this is to help students demystify the processes of composition. Many students think that if they can’t produce perfect, beautiful writing on the first attempt, they won’t be able to at all, but concepts such as freewriting, iterative drafting, think-pair-shares, and clustering and mind mapping (which LLMs might help with!) can help students see that writing is a constant, continual, developing process, and that this is true for even the best writers in the world. I think that in understanding both of these elements, students can learn to value the development of their own unique voice and will be less inclined to resort to LLM output at the cost of their own rhetorical options.
Learn more about the "Teaching with Generative AI" course.
Learn more about Daniel Frank
MarisaBluestone, Community Manager | 04-03-2024 06:16 AM
The flagship course at the Institute at Macmillan Learning, "Teaching with Generative AI: A Course for Educators," was created with leading voices in the discourse of AI in higher education, including Antonio Byrd, PhD. The course combines asynchronous and synchronous learning, providing hands-on practice in crafting course policies regarding AI, creating AI-informed assignments, and engaging in discussions with students about AI usage.
Dr. Byrd shares his unique perspective and insights centered around AI in education by answering three questions from our AI webinar series last fall. To gain insight into the practical knowledge offered by the Institute course, delve into his background and insights on real questions from professors like you.
Antonio Byrd, PhD, is assistant professor of English at the University of Missouri, Kansas City. He teaches courses in professional and technical communication, multimodal composition, composition studies, and qualitative research methods. He serves on the Modern Language Association and Conference on College Composition and Communication Joint Task Force on Writing and AI (MLA-CCCC Joint Task Force on Writing and AI). Established in February 2023, this task force of scholars from different subfields of English gathers to support policies, assessments, and teaching about and with artificial intelligence in humanities classes and research. Antonio's first book manuscript From Pipeline to Black Coding Ecosystems: How Black Adults Use Computer Code Bootcamps for Liberation (The WAC Clearinghouse/University Press of Colorado) is forthcoming fall 2024.
Based on your knowledge, experience, and/or research, how do students perceive the meaningfulness of feedback provided by AI compared to feedback from human sources?
Antonio Byrd: In my first year writing class on research methods, I gave students the option to use a large language model for feedback on their literature reviews. Most students did not take this option, and instead relied on my and their peers’ comments. One student wrote in their self-assessment at the end of the unit that they didn’t like using artificial intelligence and found the human feedback more than helpful. I’ve given students the option to use LLMs for other tasks, and most do not take them. I suspect students bring some critical orientations to AI already and we should reveal those orientations to shape our policies and pedagogical decisions.
What does the future of AI in education look like, and how can educators prepare for upcoming advancements and challenges in this field?
Antonio Byrd: The future of AI in education is probably already here. Many educational technology companies offer software already fused with artificial intelligence, such as Grammarly and Google Docs. Rather than students going to a website to access a chatbot, the chatbot will come to them inside their learning management software. Arizona State University has gone a step further by partnering with OpenAI to create AI tools specific to the needs of their students. Educators need to be at the decision-making table when their institutions decide to switch from banning generative AI to willingly integrating it into existing learning tools.
Given the absence of established institutional policies regarding AI usage, particularly in the context of plagiarism, how can educators navigate the ethical considerations surrounding AI adoption? Should using ChatGPT or other generative AI tools to respond to exam questions be considered a form of plagiarism?
Antonio Byrd: Navigating ethical considerations and policies for AI adoption may need to be a grassroots effort among faculty and even students. What those policies look like might depend on the discipline and its specific approach to critical inquiry and problem-solving. I think there should be some kind of tiered alignment from institutions down to the classroom syllabus; not a copy and paste of the department’s copy and paste of the institution’s broad policy, but one that takes themes from the bigger tier and adapts them down the line to individual classrooms. Even within classrooms, instructors may set ground rules or guidance with students based on the class content.
Learn more about the "Teaching with Generative AI" course.
Learn more about Antonio Byrd
MarisaBluestone, Community Manager | 04-01-2024 06:11 AM
Leonardo da Vinci epitomizes the essence of productivity and innovation. His remarkable ability to juggle painting, engineering, anatomy, and invention with mastery and creativity can serve as a beacon for those seeking to enhance their productivity in today's fast-paced world. In an era where the digital landscape presents both vast opportunities and challenges, the wisdom of one of history's most brilliant minds can offer some invaluable lessons.
As students navigate the challenges and opportunities of the digital age, they may want to consider how the wisdom of one of history’s greatest minds could be applied to enhance their own productivity. Below, we explore how da Vinci's timeless strategies, coupled with AI and other modern technology, can lead to a renaissance in modern productivity for students.
Curiosity as a Productivity Engine: Da Vinci’s insatiable curiosity propelled him to explore diverse fields of study, much like how a modern professional might seek continuous learning opportunities to stay ahead in their career. In a world where information is at our fingertips, fostering a culture of curiosity is more crucial than ever. AI may be able to serve as a modern torchbearer of da Vinci’s quest for knowledge.
AI-powered educational platforms, or the tools within these platforms, can adapt to a student's learning style and pace, presenting personalized challenges and topics of interest. For instance, AI tutoring systems can suggest resources on new subjects based on the student’s interactions and progress, fueling curiosity and encouraging self-directed learning.
Meticulous Organization and Note-Taking: Da Vinci kept detailed notebooks filled with sketches, scientific diagrams, and observations. Just as his notebooks were the holding place of new ideas, today's digital tools offer students unparalleled opportunities for organization.
AI-enhanced apps and programs not only store information but also actively help us make connections between disparate ideas. With these tools, students can keep organized notes, prioritize tasks so they can meet deadlines, track their coursework, monitor their progress and even document their ideas.
Setting and Reflecting on Goals: Da Vinci often undertook projects that pushed the boundaries of his knowledge and skills. Similarly, goal setting in the digital age is not just about ambition; it's about reflection and adaptation. Regular reflection on goals can be instrumental in helping students assess where they are and help them get to where they want to be.
Macmillan Learning understands that goal setting and reflection are critical to the learning process, and we’ve embedded goal-setting and reflection surveys (GRS) within Achieve, our digital learning platform. These surveys engage each of the three phases of metacognition: planning (where students set goals and plan how to accomplish them), monitoring (where students check in on and track their progress), and evaluating (where students decide whether their strategies have been successful and whether to seek help). Knowing what you want to achieve and setting a plan for how to achieve it can be a helpful boost for productivity!
Balancing Breadth and Depth: While da Vinci is known for his diverse interests, he also delved deeply into subjects, mastering them. The Renaissance is distinguished by its holistic approach to knowledge and creativity, where disciplines were deeply interconnected. This encouraged individuals like da Vinci to be both artists AND scholars.
Personalized learning platforms can support that holistic approach, helping students achieve a balance between exploring a wide range of subjects and diving deep into specific areas of interest. By analyzing a student’s engagement and comprehension levels, these platforms (like LearningCurve, Achieve’s adaptive quizzing engine) can suggest when to broaden learning horizons and when to focus more intensely on mastery.
Rest and Diversification as Sources of Inspiration: Da Vinci recognized the value of rest and varied pursuits, knowing that stepping away from his work to find inspiration in the world around him fueled his creativity and productivity. Modern productivity advice often echoes this, advocating for breaks, hobbies, and activities outside of work to rejuvenate the mind and inspire innovation.
Today, AI can remind us when to take a break, ensuring our brains have time to rest and our creativity remains sparked. One way to do this is through AI-based wellness and productivity apps. These can monitor a student's study habits and suggest optimal times for breaks, relaxation, and engaging in hobbies or physical activities, which can help prevent burnout. (Learn more about our thinking on this here.)
By looking to the past, we can find enduring strategies to navigate the complexities of modern life and work, much like how da Vinci navigated the renaissance era's challenges and opportunities. He showed us the power of blending curiosity, planning, and learning.
Today, we have the tools to bring his vision into the 21st century, transforming how students can plan for both their present and future. At Macmillan Learning, inspired by da Vinci's enduring curiosity, we constantly explore innovative methods to boost student productivity. As we continue to harness educational technology and AI, we help pave the way for a new era of productivity, where balance, curiosity, and continuous learning all fuel students’ success.
kate_geraghty, Macmillan Employee | 03-27-2024 07:42 AM
“I have no special talent. I am only passionately curious.” -Albert Einstein
I love questions. I always have. From the simplest idea to the most complex mystery, I have always loved how the sheer act of asking questions opened the door to a new idea, a better understanding, or a discovery. And I’m not the only one.
From an early age, questioning emerges naturally. Toddlers famously pepper their caregivers with endless "whys?"—inquiring about everything from why the sky is blue to why bedtime comes so early. That early questioning hints at a valuable lifelong tool: curiosity, an innate desire to explore and better understand the world around us.
As we get older, the questions don’t stop. We explore new ideas in classrooms, our homes, and the world around us. We thrive on discovering and learning. In today’s modern workplace, honing that skill is essential.
Curating a Culture of Curiosity
The world of work has changed a lot over the past five years, but arguably one of the most impactful changes for our collective future was the explosive re-emergence of artificial intelligence in 2022. Almost immediately, workers in all professions began worrying about being replaced by machines, or more accurately, by an uber-efficient AI.
The speed with which AI is evolving demands a workforce that is driven by a desire to question, learn, and adapt; a workforce that imagines what could be. Those who explore their curiosity are not only better equipped to leverage AI's potential but also to anticipate its impact on their work.
One critical step in nurturing curiosity is to recognize the need for a change in mindset; or, in other words, reevaluating how we could view this technology and what it may make possible. Instead of fearing the creep of AI or the possibility of a future where it replaces work, we could see it as a collaborator that enhances our capabilities. At Macmillan Learning, we offer our employees a continuous series of learning and development opportunities to encourage them to dig in, learn, and experiment. And with a pinch of curiosity, we can dare to explore what could be.
Asking the Right Questions
How can AI help us to understand and meet teaching and learning needs more effectively? In what ways can AI contribute to a more equitable and inclusive workplace? What role should AI play in improving engagement and outcomes? How can we best prepare students for the workforce of the future?
By asking questions, we begin a journey of learning and discovery, and ultimately personal and professional growth. Using curiosity as a tool means not just asking questions, but asking the right questions. It means understanding the problem you are trying to solve and creating the questions to get to the core, layer by layer.
Beyond the big questions, we can use AI in our everyday work to enhance our creativity and productivity. The more we ask and learn, the better AI will perform for us. Each question, each additional prompt, reveals something that we can then use our uniquely human skills to address. Our capacity to question, think, and critically analyze is essential to finding the best results. By learning how to develop questions that challenge the way we think, we can spot opportunities for change and innovation, making curiosity one of the most powerful tools for exploration. The path forward is filled with both promise and uncertainty, and there is real courage in stepping into the unknown armed only with questions and a willingness to discover. Providing a safe environment where employees are encouraged to inquire, experiment, learn, and even fail at times will result in a team that is not only better equipped to tackle the uncertainties of the future but that, importantly, becomes an active participant in shaping that journey.
What’s Next?
The future of work isn’t waiting for us to be ready for it. It’s here now. So how does one even begin to cultivate a culture of curiosity? Maybe a first step is to ask yourself today: How can I leverage AI to enhance my unique human capabilities?
In this journey, it is important to note that new technical skills matter, but it’s most often the softer skills that will need to be nurtured to help employees thrive in a future workplace: adaptability, creativity, critical thinking, and collaboration. AI is a tool that works best when coupled with human drive, so with a shift in mindset and an openness to explore what’s possible, employees can more effectively harness AI's power to navigate ongoing change.
It is curiosity that propels us to explore that “next thing.” Embracing curiosity in the age of AI is not just about keeping pace with technology—it's about shaping the future of work. It’s about fostering innovation and resilience. It’s about creating a space where we can use our human abilities to take the next steps to create meaningful change.
Could the simple power of curiosity be the asset that unlocks that next great opportunity? Because, I have a few questions...
EllieC, Macmillan Employee | 03-25-2024 12:45 PM
The introduction of generative AI in academic environments has sparked a vibrant discussion on its impact on academic integrity, creativity, and the evolving roles of educators. This same dialogue inspired the creation of the Institute at Macmillan Learning and its first course, “Teaching with Generative AI: A Course for Educators.” Laura Dumin, PhD, a leading voice in this discourse, is one of three subject matter experts who contributed to the course, which offers a blend of asynchronous and synchronous learning, including hands-on experience developing a course policy around AI, designing assignments with considerations for AI, and navigating conversations with students about the use of AI.
To get a glimpse into the practical knowledge and insights the course will offer, we asked Dr. Dumin five questions about AI in higher education that emerged from our AI webinar series last fall. Read on to get her take as she shares her insights on real questions from instructors like you.
Laura Dumin, PhD, is a professor of English and Technical Writing at the University of Central Oklahoma. She has been exploring the impact of generative AI on writing classrooms and runs a Facebook learning community to allow instructors to learn from each other. When she is not teaching, Laura works as a co-managing editor for the Journal of Transformative Learning, directs the Technical Writing BA, and advises the Composition and Rhetoric MA program; she has also been a campus Scholarship of Teaching and Learning (SoTL) mentor. Laura has created four micro-credentials for the Technical Writing program and one for faculty who complete her AI workshop on campus.
In your experience, when it comes to verifying the authenticity of AI-generated content, particularly in academic papers, are there other methods that can be used aside from solely depending on AI and tools like Turnitin?
Laura Dumin: I would turn this question on its head and ask why we are verifying content. Are we concerned about facts that might be hallucinations [or false information generated by AI]? If so, we need to teach our students research skills. Are we worried about AI writing the whole assignment or large portions of it? This gets to a different question about why students are in school and taking classes. Do they need the knowledge for the future or is this course just a graduation requirement that they care little about? How can the information be made more relevant to the students so that they are less likely to cheat? And then finally, teach students about ethical and transparent AI use so that they are willing to tell you and comfortable with telling you when and where they used AI to augment their own work.
Should educators consider it their responsibility to educate students on the ethical and responsible use of AI tools, akin to how they teach the responsible use of platforms like Google and Wikipedia and tools like graphing calculators?
Laura Dumin: Yes! I get that there is a lot of content for instructors to get through in 15 weeks, but the reality is that these technologies are still new to a lot of students. We can’t expect them to magically learn ethical and responsible AI use if we don’t help them get there. Also, what looks responsible in one field might be different in another field. We want students to learn and understand the tools and the nuances for responsible use.
With the increasing role of AI in academic writing, what are your thoughts on universities introducing prerequisite courses dedicated to teaching students how to effectively use AI tools?
Laura Dumin: I’m not sold on the need for these types of courses. I’m seeing talk that AI is being approached more in middle school, which means that we have about 5 years before students come to us with a better understanding of ethical and responsible AI use. If it takes an average of 18 months to get a new course on the books, this means that the course probably won’t have a long lifespan. And since the AI landscape keeps rapidly shifting, it would be hard to put together a course that looked the same from its initial proposal to the first time it was taught.
What are your thoughts on using AI to aid brainstorming while nurturing students' independent thinking? What does it mean and can it potentially hinder students’ creativity in generating original ideas?
Laura Dumin: I’m ok with it and I am a writing instructor. I get that the struggle of brainstorming can be part of the process in a writing class. If that’s the case, make that clear to students and help them understand the rationale. But if a student is so frozen by an inability to move past the brainstorming phase that they can’t get the project done, no one wins. In that case, AI can help students have a path forward. AI brainstorming can also help open new possibilities for students to see sides of an issue that they weren’t aware of.
Despite the availability of AI-driven revision tools, how might educators motivate students to actively seek feedback from peers and writing centers and recognize its value?
Laura Dumin: Having students reflect on AI feedback versus human feedback can help them see the pros and cons of each type. I’m more interested in students getting feedback that is helpful and that works for them. I don’t think it has to be, or even should be, an either/or situation. If a student can’t make it to the writing center or doesn’t have access to peers for help, why not use AI to get at least some feedback?
Learn more about the "Teaching with Generative AI" course.
Learn More About Laura Dumin.
MarisaBluestone, Community Manager | 03-14-2024 01:08 PM
Whether in a social or professional context, community plays a critical role in our lives. It creates a support system, offers opportunities for networking and allows people to connect based on shared values. For educators, being part of a vibrant, collaborative community is not just beneficial—it's essential for growth and innovation.
Community in education goes beyond mere networking; it's about building a shared space where experiences, insights, and challenges can be exchanged openly. At Macmillan Learning, we’ve seen the impact that a strong and engaged community of educators can have. We’ve witnessed firsthand the impact that our peer consultants have had on instructors across the country through knowledge sharing and the power that ideas carry when instructors can engage freely and openly with their colleagues.
Understanding the importance of community and peer learning, Macmillan Learning recently launched The Institute at Macmillan Learning. The new venture aims to build and support a community of instructors, provide practical knowledge, and empower educators to meet modern teaching challenges with confidence.
The first course, “Teaching With Generative AI: A Course for Educators,” was designed to create a community of practice with asynchronous and synchronous components, interactive workshops, and platforms for discussion and collaboration. During the two-month course, attendees improve their abilities, keep up with the latest in technology, and discover ways to leverage artificial intelligence to create enduring benefits for their students and institutions. Importantly, they’ll participate in an active and accountable community with their peers.
Simply put, we designed the Institute not just to educate on a topic, but to connect educators from various backgrounds, disciplines and institutions to create a rich tapestry of knowledge and experience.
Building a Community of Instructors
At the heart of The Institute is community. We want educators to engage in meaningful conversations, explore current educational topics, and collectively seek solutions to contemporary challenges in education. This approach is not only about imparting knowledge; it's about fostering a collaborative environment where every instructor feels heard, valued, and empowered to contribute. And AI is just the beginning.
The Institute’s community offers a private online space where instructors can ask and reflect on the complex questions that have been keeping them awake at night. There’s more than one way to take on a challenge, and we believe that hearing perspectives from across disciplines and institutions, from both seasoned educators and those newer to teaching, offers unparalleled insight. In addition to the course’s asynchronous learning and synchronous sessions, feedback and reflection within this community will support both personal and professional development and, importantly, help foster a culture of continuous improvement in AI pedagogy. By participating in this community, educators not only gain access to a wealth of knowledge and resources but also become part of a larger movement shaping the future of education.
The Institute's focus on collaboration and community building is a testament to Macmillan Learning’s commitment to enhancing the educational landscape, not just for students, but for the instructors who inspire them. It will be a place that provides practical knowledge, meaningful community, and opportunities for educators to showcase newfound expertise.
About “Teaching with Generative AI: A Course for Educators”
Teaching with Generative AI: A Course for Educators is a two-month course that offers a blend of asynchronous and synchronous learning, including hands-on experience developing a course policy around AI, designing assignments with considerations for AI, and navigating conversations with students about the use of AI. Each week will delve into a new topic designed to deepen participants' pedagogical practices, enhance their comfort with and understanding of AI, provide practical assignment blueprints for classroom use, and build a vibrant community of practice for professional growth and innovation in education. The course will be delivered in Macmillan Learning's courseware platform, Achieve.
Participants who successfully complete the course will receive digital certification that can be easily shared with their institutions and with colleagues on platforms like LinkedIn. Registration begins on April 15, and the first 100 to sign up get a 60% discount. For more details, and for information on future courses, visit the Institute’s website.
marcy_baughman
Macmillan Employee
01-31-2024
11:21 AM
At Macmillan Learning, our commitment to education and the transformative power of learning are at the forefront of our mission: "inspiring what’s possible for every learner." We understand that learning is an inherently human experience, and that technology can play an important role in supporting it. But new technologies bring new responsibilities and new questions we must ask ourselves, and in our journey to develop AI-driven educational tools, we’re aiming to establish thoughtful governance.
While AI can be incredibly helpful to both students and instructors, there are flaws in many AI tools that may negatively impact the learning journey. Given the potential challenges of AI-driven tools, we became determined to avoid perpetuating biases whenever possible, maintain high ethical standards to guide development of digital tools, and seek active guidance from experts.
To navigate these ethical waters successfully, we sought guidance from experts with deep knowledge of both higher education and AI tools. Additionally, we wanted to ensure that the perspectives and experiences of our instructors and students were represented in the advisory process. This led to the creation of two advisory boards, the Macmillan Learning Ethics Advisory Board and the Data Privacy Board. Together, the boards help us to explore complex issues like ethical and equitable ways to ensure representation of the diverse groups we serve, effective mitigation of data privacy concerns, and improvements in how we could leverage AI to fortify our research practices.
Our Ethics Advisory Board
In my role leading our Learning Science and Insights team, I helped manage the ethics advisory board, which is composed of distinguished individuals who bring a wealth of knowledge and experience to the table. Each of the advisory board members was carefully selected for their deep understanding of AI and their direct interactions with students; their roles range from data scientist and data activist to professor of learning analytics.
The advisory board held two virtual sessions during the summer of 2023. In the first session, we presented the Macmillan Learning preliminary AI guidelines, which aimed to establish principles for the responsible development and use of AI in education. The board members provided valuable feedback, helping us refine our guidelines and develop a mission statement that would guide all our AI-related work. Our Mission: "We create research-driven opportunities through AI for educators to better support the diversity of students within their learning environments." We next worked together to create AI principles that will guide our product development.
Our AI Principles
Based on the feedback and insights gained from these sessions, the Macmillan Learning team is now diligently working on the development of new AI technologies. These technologies are designed to support our instructors and students in meaningful and equitable ways while adhering to the ethical principles established with the guidance of our advisory board. As refined with the advisory board's input, our principles are:
We are transparent, accountable, and ethical in how we use AI.
We keep people at the center of AI oversight and improvement.
We promote AI literacy and education on AI safety, ethics, and responsible use.
We strictly observe, monitor, and improve data privacy and security principles.
We continually monitor our AI tools and research to verify that we are not introducing unintended harms or exacerbating inequalities.
We continually learn from diverse groups of experts to improve AI safety and ethics.
As we move forward, we remain committed to using AI as a tool to augment learning, while also retaining the humanity of learning and fostering meaningful and transferable educational experiences. AI can be a helpful companion on the path to learning, and we are determined to make that learning journey safe, reliable, effective, and unbiased for every learner.
We recognize that our work isn't done. The landscape of AI is constantly evolving, and so we remain committed to updating our practices and policies to ensure our research and development practices are ethical and transparent. By continuing to have conversations with our learners, seeking expert guidance, and undertaking collaborative explorations of AI's potential uses, we underscore the broader narrative: the integration of AI into education must always be guided by ethical considerations and a deep commitment to keeping learning at the heart of what we do at Macmillan Learning. Stay tuned as we continue to shape the future of AI in education.
Read more about Macmillan Learning’s AI journey, including Generative AI and Product Development: What We’ve Learned & What’s Ahead, The Impact of AI on the Inherently Human Experience of Learning, and how we’re Readying our Workforce for an AI World.
susan_winslow
Macmillan Employee
01-17-2024
06:06 AM
This time last year, Macmillan Learning updated our mission, vision and values. We remain committed to our mission: to inspire what’s possible for every learner. At the core of this mission is our unwavering belief in the enduring value of education. We believe that an investment in education is an investment in human potential. Whether that means a more prosperous future or a stepping-stone to personal and societal growth, we know we have a role to play.
Over the past year, there have been concerns about AI in education leading to its overuse, students’ diminished critical thinking skills, or a reduction in students' investment in their own learning. But I believe that those of us who intimately know what makes learning successful understand that an education is more than just accumulating facts or mastering skills; rather, it's an exploration of self and world and the genesis of lifelong curiosity and learning.
This is a new era for education: one that embraces the dynamics of a digital age, but also one that acknowledges the timeless value of human connection and the importance of learning from each other. As we navigate this rapidly changing landscape, our mission remains our anchor, guiding our endeavors and fueling our passion. AI doesn't change this. We believe all technology enables the human-centric experience of learning.
In a world marked by political turbulence, the role of education becomes even more critical. We remain steadfast in adhering to our values and mission and believe that classrooms must be places where ideas are fostered, engaged with, and critiqued, not removed or banned. Which brings me to another thing that hasn’t changed for us in 2024: our dedication to diversity and inclusion.
We understand how access to our course materials and technology can help transform 'what’s possible' into reality, and we carry this responsibility with a deep sense of purpose and optimism. Importantly, we take pride in creating products that reflect the rich diversity of the student populations we serve and strengthen the inclusive value of educational environments. It’s through such commitments that we envision closing persistent gaps in outcomes along racial, ethnic, gender, socioeconomic, and cultural lines. We believe that educators and students should feel represented in our learning materials, and that no student should be expected to adopt any particular political or cultural point of view in order to succeed in the classroom.
As we proceed in this new year, expect to see us further strengthen our commitment to our mission. Expect us to stand firm in our belief in the value of education, and steadfast in our dedication to inspiring possibilities: to fostering curiosity and enabling success for every student. Our ultimate goal? To make a difference in learners' lives, to not just educate, but to inspire what’s possible. And we envision a world in which every learner succeeds. Through our content, services and tools we aim to make that a reality. We invite you to join us on this journey, as it will surely be an interesting and exciting one in 2024.
MarisaBluestone
Community Manager
01-03-2024
06:13 AM
AI has the potential to transform education by improving students' learning experiences and bolstering their critical thinking skills. This topic is of particular interest to many instructors and surfaced many times during “The Importance of Truth, Honesty, and Pedagogy in an AI World” webinar, which featured Macmillan Learning CEO Susan Winslow.
For some instructors, AI raises the concern that LLMs (Large Language Models) will serve as a shortcut for students, hindering the development of their critical thinking skills, including problem-solving and writing. However, that doesn’t need to be the case. One instructor noted during the webinar: “If the gauntlet is just to submit a written paper, we’ve missed the mark of what and how to teach.” He said that the goal instead should be to encourage students to critically navigate life, which would require teaching in a way that helps students recognize when or why to use these LLMs and, in turn, helps them better understand the world around them.
From educators becoming AI learners themselves, to the crucial role of teaching responsible AI use, there are many different ways that it is showing up in the classroom. Before the start of the Spring term, here are seven ways that AI can impact students’ critical thinking skills.
Teaching Responsible Use: A crucial aspect of fostering critical thinking in an AI-driven world is teaching responsible AI usage. AI is undeniably a powerful tool, and one instructor on the webinar stated that "it is on us to teach responsible use." By imparting the importance of using AI consciously and ethically, instructors can help equip students with the skills needed to make the best possible choices both in and out of the classroom.
Understanding AI Limitations: Critical thinking involves not only leveraging AI, but also understanding its limitations. Knowing those limitations helps students make informed decisions about when and how to use AI as a resource. Beyond fostering a deeper understanding of AI's role in their learning journey, this awareness helps students recognize the kinds of human contributions that AI simply cannot replicate, something they'll need to know as they consider their future careers. According to one instructor, part of teaching responsible use is letting students know "what AI can do well, what it can't, and what do WE add that AI can't?"
Mitigation, Not Elimination: Because challenges such as plagiarism may never be fully eliminated, some instructors support the idea of "mitigation" instead. One instructor noted, “as educators, we have the duty to help students navigate the gray areas and use AI to improve, not undermine their writing.” To that end, instructors can encourage students to view AI as a means to expedite research, fix grammatical errors, or enhance their understanding of a topic. By steering students away from the temptation to copy and paste, educators help students leverage technology to augment their learning rather than supersede it.
AI's Role in Feedback: Constructive feedback is paramount to learning, but not all students readily embrace it. Introducing AI into the feedback loop can transform this experience and make it more positive, engaging and effective for students. As students interact with AI-generated feedback that’s personalized to them, they are prompted to reflect critically on their work, fostering continuous improvement and enhancing their overall critical thinking abilities.
Evolving Classroom Dynamics: AI introduces a level of scalability and personalization that can change classroom dynamics. By tailoring learning experiences to individual student needs and pace, students can engage in a more participatory learning process tailored to their abilities. One instructor noted: “if we allow for AI to take care of most of the basic education students need, then brick and mortar schools have the opportunity to become a place for flourishing of project-based learning, social and cultural development, and experiential education-- all of which will still need us teachers to facilitate!"
AI as an Enhancer of the Learning Process: Although AI cannot replicate human touch or original ideas, it can contribute significantly to the learning process. It can provide explanations, aid comprehension, and offer a platform for students to explore and understand complex topics. For students who may feel uncomfortable asking repetitive questions in a traditional setting, AI can act like a tutor and serve as a reliable and accessible resource. It democratizes education and becomes what CEO Susan Winslow refers to as “the great equalizer.”
Educators as AI Learners: When it comes to AI, instructors are also students. Many are actively engaged with AI technologies, continually experimenting with different approaches to harness its power in and out of the classroom. This iterative process of improvement ensures that they’re well-equipped to guide their students as well as take advantage of the latest advancements. As they adapt and refine their approaches, they become role models for students in embracing technology to help augment their critical thinking abilities.
AI in education represents more than just a technological advancement; it is a paradigm shift that creates new landscapes for students to adapt, explore, and refine their critical thinking skills. As educators embrace AI as a tool for enrichment, they pave the way for a more engaging, equitable, and intellectually stimulating learning environment.
If you're curious about the intersection of education and AI, check out additional insights from instructors on Supporting Academic Integrity in an AI World or check out Five Things Instructors Should Consider When Assigning Homework in a World with ChatGPT.
kristin-peikert
Macmillan Employee
12-19-2023
06:24 AM
Since its explosion over the last year, AI is no longer a distant, futuristic concept. It's here, woven into our daily lives in ways we might not even realize. From voice assistants guiding our mornings, to algorithms helping us choose what to watch or buy, AI is everywhere, quietly shaping our world. But how do we prepare our workforces for this new era?
Macmillan Learning CEO Susan Winslow recently described her AI journey as akin to processing the stages of grief; I see some of our teams having the same experience: denying the impact AI will have on their workflows, hoping it will go away, or feeling angry at being asked to evaluate processes that seem fine already. It is clear to me, however, that we all need to embrace the AI learning curve. That extends beyond our tech talent and coding gurus, and it starts with seeing AI not as a replacement for our jobs, but as an opportunity to enhance human potential.
At Macmillan Learning, we are providing our teams with foundational courses to get them started, with content that ranges from explaining what AI is to exploring the various ways AI can be leveraged as a learning tool. We’ve been training teams on best practices for effective prompting and creating guidelines on how to ensure safe and ethical use along the way. These basics are followed with a challenge for each of us to keep an open mind, take initiative, and experiment.
Our leaders are not exempt from this. It’s proven important for each of us to lead by example, rolling up our sleeves and diving in to set the tone for our teams. I recently took a hands-on approach to testing HR use cases for generative AI and even dug into the development of an employee self-service chatbot. The lessons I have learned from both successes and failures have helped me communicate firsthand insights that will hopefully make it a little less scary for our teams to take a first step.
But change is hard and saying “this will be a fun adventure” doesn’t always quiet the fears people have about the unknown. That’s why we have to continually remind ourselves that people will always be at the center of successful evolution. AI can handle repetitive tasks, and serve as a great starting point for ideation; but empathy, creativity, and intuition? Those are OUR superpowers. That's what sets us apart from the algorithms.
As a People & Culture leader, I talk every day to our team about jobs, and AI has left a big question mark for a lot of people as they worry about proficiency or replacement. I have told our teams to try not to view AI as competition, but as a collaboration between human ingenuity and AI efficiency that frees us up to focus on what we do best: using our creativity, emotional intelligence, and problem-solving skills. Incidentally, those are the skills I’m leaning into to get the most out of using AI tools too.
Our approach may not be perfect, and I keep reminding myself that it will take resilience to stay energized about the possibilities when advances in tomorrow’s technology require us to retool our thinking altogether. I will challenge you, just like our teams, to embrace the potential of AI, learn from it, and, together, shape a future where technology empowers us to be the best versions of ourselves. As one of our core values reminds us: learning is a journey we are on together.
Tim_Flem
Macmillan Employee
11-30-2023
06:35 AM
In a recent Executive Order from the White House on AI, President Biden encouraged the industry to “shape AI’s potential to transform education by creating resources to support educators deploying AI-enabled educational tools, such as personalized tutoring in schools.” We’ve known since research completed in the 1980s that personalized tutoring is significantly more effective than “factory models of education,” and I have spent a good portion of my career considering how human-centered product design and technology can help fulfill the promise of personalized learning.
The time is now. This fall, I’ve watched with great interest and excitement as our teams at Macmillan Learning have tested personalized tutoring with thousands of students in courses in our digital courseware product, Achieve.
Early results indicate that AI-enhanced personalized tutoring positively impacts student engagement and progress, especially when students need help with assignments. Without giving answers away, the AI-powered tutor uses Socratic-style questions delivered in a chatbot to guide students step-by-step to the correct conclusion. Importantly, we have seen that whereas students sometimes feel self-conscious asking their instructor or teaching assistants for help, they are open and persistent with the AI tutor, asking questions repeatedly until they gain better understanding.
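For readers curious what that Socratic framing can look like in practice, below is a minimal, illustrative sketch of a chatbot loop that answers each student message with a guiding question rather than the solution. To be clear, this is not the Achieve AI Tutor's actual implementation; the OpenAI client, model name, prompt wording, and sample assignment are assumptions chosen only to make the idea concrete, and any chat-completion provider could be substituted.

```python
# Illustrative sketch only -- not the Achieve AI Tutor's code.
# Assumes the `openai` Python package (v1.x) and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

SOCRATIC_PROMPT = (
    "You are a patient tutor helping a student with the assignment below. "
    "Never state the final answer. Reply to each student message with one "
    "guiding question or a small hint that moves the student a single step "
    "closer to the correct conclusion, and ground every hint in the "
    "assignment text.\n\n"
    "ASSIGNMENT:\n{assignment_text}"
)


def tutor_session(assignment_text: str) -> None:
    """Run a simple command-line Socratic tutoring loop."""
    # The system prompt carries the assignment context so hints stay grounded.
    messages = [{
        "role": "system",
        "content": SOCRATIC_PROMPT.format(assignment_text=assignment_text),
    }]
    print("AI tutor ready. Type 'quit' to end the session.")
    while True:
        student_turn = input("Student: ").strip()
        if student_turn.lower() == "quit":
            break
        messages.append({"role": "user", "content": student_turn})
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # assumption: any chat-capable model works here
            messages=messages,
        )
        reply = response.choices[0].message.content
        # Keep the full conversation so follow-up questions build on earlier hints.
        messages.append({"role": "assistant", "content": reply})
        print(f"Tutor: {reply}")


if __name__ == "__main__":
    tutor_session("Compute the mean and standard deviation of the sample 2, 4, 4, 4, 5, 5, 7, 9.")
```

The design choice the sketch tries to capture is the one described above: the system prompt carries the assignment context and explicitly forbids giving the final answer, so the model's role is to nudge the student forward rather than solve the problem for them.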
Faculty have also responded positively, noting that the AI tutor is available late at night or generally when faculty and teaching assistants are not available to answer questions. Because our AI tutor is grounded in the specific Achieve course content from the instructor’s assignment, faculty have reported confidence that their students are receiving better assistance than other online options. While it’s still a bit early for us to understand the efficacy of the AI tutor, we have enough early positive indicators that we are eager to now understand the impact it has on learning. Please stay tuned!

AI tutoring is just the beginning of the opportunities in front of us. Imagine a learning environment in which teachers have a learning assistant that knows each student's preferences and levels of preparedness, paces lessons accordingly, and provides timely interventions when needed. Imagine that instructors can intrinsically engage students by framing knowledge acquisition and skill-building in ways that acknowledge the student’s curiosity, lived experience, cultural background, and personal goals and mission, all while ensuring that students make progress against faculty course outcomes. The promise of AI is that we can support both instructors and students in making learning more deeply personal, accessible, and engaging.
But how do we get there? It may be easy to surmise that, with the breathtakingly fast evolution of generative AI technology, the promise of personalized learning assistants will soon be delivered by large language models alone. But human-centered products never result purely from technological advancement; instead, we must roll up our sleeves and intentionally create products that students and teachers find valuable and trustworthy.
One of the most challenging and important problems to solve with AI-enhanced personalized learning is the management and protection of student data privacy and security. In surveys Macmillan Learning has conducted this fall, 63% of students indicated that they have concerns about how data is used, stored, and generated by AI applications and companies. In our fall 2023 AI tutor tests, we have been firmly grounded in the AI safety and ethics principles and processes that we developed with the help of two advisory boards of experts. Good intentions are important, but they’re not enough.
We have been actively monitoring, and will continue to monitor, our systems to ensure that data, privacy, and security measures are working as intended. We will continually work with experts to stay current on quickly evolving tools and best practices and, importantly, to implement auditing processes for the AI products and features we’re developing. We are resolute that AI tutors and assistants in our Achieve and iClicker platforms will align with our rigorous human-centered AI ethical principles and processes.
We’ve also heard concerns that the use of AI may create disparities in education, as economically disadvantaged individuals might not have access to the same resources. There's also the risk of AI systems inheriting biases present in their training data, which can also perpetuate disparities. We believe that it's essential to approach the integration of AI in education with an awareness of these challenges and a commitment to use these tools ethically, inclusively, and equitably.
Again, good intentions are important, but not enough. We are working with AI bias experts to determine if we can proactively detect and, when possible, mitigate bias in training data. This fall, we have conducted research specifically with students and faculty at minority-serving institutions to ensure that we acknowledge the needs, questions, and concerns around AI from traditionally underrepresented populations.
As if these substantial challenges are not enough, we also have new AI software infrastructure, QA testing, and monitoring projects to tackle. Every workday feels more full and fulfilling than the day before, but I’ve truly never been more energized in my career in education. I share President Biden’s observation that we have not recently had such a tangible opportunity to fundamentally transform education–and to do so in a way that benefits every learner.