OpenAI and College Composition (Part 3)

davidstarkey
Author

If, as I noted in last month’s post, attempting to keep students from using chat generative pre-trained transformers is all but impossible, how might ChatGPT and other artificial intelligence language programs be used productively for teaching and learning? That question has already generated a tremendous amount of thought and research among educators, with ideas and suggestions proliferating on a daily, if not hourly, basis. Among the resources I recommend are the work being done by Anna Mills, who curates AI Text Generators: Sources to Stimulate Discussion Among Teachers for the WAC Clearinghouse, and by Rhonda Grego, Dean of the School of English and Humanities at Midlands Technical College in Columbia, South Carolina, who has created a detailed and annotated list of scholarly articles focused on classes in which writing is a major component of the syllabus.

As with my previous posts, the rapid development of AI, and of educators’ responses to its perils and triumphs, means the following ideas are only suggestions, places to begin.

Actively assign ChatGPT as part of the coursework.

While we want our class discussions to be models of invention and creativity, ChatGPT obviously offers a much quicker way to generate essay ideas worth discussing. Rather than spending a desultory ten or fifteen minutes kicking around one or two obvious or half-formed ideas, students could devote their time to evaluating and critiquing the ideas offered by ChatGPT, which is particularly adept at generating pros and cons for specific arguments.

AI appears equally deft at summarizing complex arguments. Students can practice this essential skill by writing their own summaries of readings or topics in class, then comparing them with AI-generated summaries. Conversely, the AI summary can be created first, then dissected by the class for flaws and omissions, of which there are sure to be some.

Indeed, just about any early-process writing activity, from generating a thesis to locating sources to creating an outline, can be supplemented, or complicated, by AI input. In this model, AI acts as a kind of tutor, prompting students to try ideas, answering questions, responding to student concerns and skepticism--essentially becoming something like the online writing guide that so many software developers have worked so long to create.

Allow students to use ChatGPT as they wish, but ask them to be honest about how they have used it.

Once students begin using AI as a partner, it will be tempting for them to say, as they might to an overeager parent, “You’re so good at this, why don’t you just go ahead and do it yourself?” If, as we will see in next month’s post, detecting this sort of plagiarism is problematic, should we just give in and acknowledge its inevitability?

Ethan Mollick, a professor of management at Wharton, concedes: “I think everybody is cheating ... I mean, it’s happening. So what I’m asking students to do is just be honest with me.... Tell me what they use ChatGPT for, tell me what they used as prompts to get it to do what they want, and that’s all I’m asking from them. We’re in a world where this is happening, but now it’s just going to be at an even grander scale.”

Clearly, this approach is not without its drawbacks. What criteria are we using to grade student work not actually produced by students? Whom (or what), exactly, are we grading? Mollick’s proposal may be pragmatic, but it is not far from a tactic discussed last month: not grading at all.

Emphasize the writing process and have students show their work.

A more productive approach is to insist that students be transparent about their own writing processes. While we may preach the gospel of process, too often, especially for teachers with heavy composition loads, it’s much easier simply to assess product. Among the many recommendations composition teachers have made for responding to AI, two occur frequently: 1) Have students do more work in class, with the teacher maintaining a productively intrusive presence from the beginning to the end of the assignment, and 2) Insist that each stage in the process is read and assessed by the instructor to ensure that the work is consistent with the student’s own writing. If ChatGPT is part of the process, its use should be akin to that of a tutoring session or a database search, and every aspect of that use should be well documented.

Prioritize quality over quantity.

An emphasis on an instructor’s close involvement in the composition process, in tandem with AI’s ease in creating competent product—ChatGPT can meet a semester’s word count in a couple of minutes—should encourage educators to move away from word count as a mark of achievement and toward fewer essays, with more drafts, more in-class work, and more attention to detail. Again, students may consult AI as they compose, but the instructor’s emphasis should be on helping them craft their own sentences and paragraphs rather than cutting and pasting ready-made computer-generated prose.

Assign multimodal writing.

Many professors devoted to multimodal composition have been frustrated by the slow pace at which their colleagues have adopted non-alphabetic writing practices, but ChatGPT’s wizardry with words should go a long way toward making college composition classes places where, in addition to written text, “essays” consist of images, sound, video, computer graphics, and whatever else persuasively forwards an argument.

Insist on accuracy and facts.

Those who are doubtful of AI’s impending ability to conquer the world often point to the wild inaccuracies to which it is given. In class, let AI have its say on the topic under discussion, then have students do their best to identify what is false or misstated. Because ChatGPT is so error-prone, students will need to be more alert than ever when fact-checking information, certainly a worthwhile development in our era of exaggerations, lies, and blatant misinformation.

Nurture the individual writer.

ChatGPT relies on groupthink and hivemind; its prose lacks individual creativity and flair. Media studies professor Ian Bogost compared a conversation with ChatGPT to “every other interaction one has on the internet, where some guy (always a guy) tries to convert the skim of a Wikipedia article into a case of definitive expertise.” Bloviating generalizations that anyone can make are just as unappealing in college writing as they ever were. Instead, we writing teachers should be cultivating the distinctive voice given to every human being. In class, analyze the prose of ChatGPT, pointing out its blandness, the fact that, as Bogost notes, the writing is “formulaic in structure, style, and content” and “consistently uninteresting as prose.” Rewrite sentences and paragraphs to rehumanize AI’s list of facts and figures. Peter Elbow, Anne Lamott, Natalie Goldberg: be ready in the wings; we may need you.