Bits Blog - Page 37

Author
04-08-2015
09:14 AM
This blog was originally posted on February 5th, 2015. Flying across the country a few weeks ago, I read Diogo Mainardi’s The Fall: A Father’s Memoir in 424 Steps (you can hear an interview with the author here). It’s a slim book—166 pages—so I had time to read it twice through, which I did with pleasure and gratitude. While the story of Mainardi’s son Tito’s botched birth in a Venice hospital, which left him with cerebral palsy, is gripping from first to last, what fascinated me most about the book was its structure: it is divided into 424 brief passages, some as short as a four-word sentence (“Tito has cerebral palsy,” which opens the book), others as long as half a page. Why 424 steps? As Mainardi reveals, “four hundred and twenty-four steps” is “the farthest that Tito has ever walked” without falling. In these 424 brief passages, Mainardi introduces readers to his family and most of all to Tito in a way so full of love that I was quickly drawn in and wanted to linger there with them long after my plane had touched down. I wanted to hear about more and more steps, get to know Tito even better (the photos of Tito that accompany the text are breathtakingly beautiful). But The Fall is more than a father’s memoir and a love song to his first son; it is also a tightly woven meditation on the web of associations that circle Tito, from the Scuola Grande di San Marco’s façade, designed by Pietro Lombardo in 1489, which now stands at the entrance to Venice Hospital—scene of many mistakes, including the one made during Tito’s birth—to Ezra Pound’s praise of Lombardo and the “stupid aestheticism” that Mainardi had shared with Pound before Tito’s birth. The web grows denser and more cross-referenced as the steps proceed. This 424-step-long meditation on disability and on love got me thinking about Winston Weathers, whose book An Alternate Style (1980) introduced us to the Grammar A of school discourse and the Grammar B of, well, everything else. 
One of the alternates Weathers showed readers was a simple list; another was a series of what he called “crots”: bits or fragments of text. But it also reminded me of David Shields’s much more recent Reality Hunger, a manifesto made up of brief snippets of text, many of them copied verbatim from other people’s work without acknowledgment. This musing led me to consider whether the time is ripe for this particular kind of fragmented or fragmentary writing (my experience with social media writing makes me say “yes!”), and also made me want to experiment with this form, and to engage students in experimenting with it. So now I am imagining a writing assignment that would begin: “Create a series of very brief passages, all related to one topic and arranged so that they reach a climax or make a very telling point by the end.” I’d start out with low stakes—just a few pages and meant for in-class sharing rather than a formal grade. But now I’m thinking that many others may be way ahead of me and have perfected such an assignment. If you have, please share now! In the meantime, check out Mainardi’s book and get to know the amazing Tito. [Image: The Fall: A Father’s Memoir in 424 Steps by Diogo Mainardi. From Other Press.]


Author
04-08-2015
07:33 AM
This blog was originally posted on February 2nd, 2015. In the United States comics generally appeal to those who already know how to read and write, but in other contexts sequences of images with relatable characters and stories convey important information to the illiterate about how to avoid danger or pursue opportunities. For example, Mudita Tiwari and Deepti KC of India’s Institute for Financial Management and Research are distributing comic books about financial literacy in the slum of Dharavi in Mumbai to discourage women from relying on vulnerable hiding places in their homes to squirrel away cash. As a co-author of Understanding Rhetoric, a comic textbook, I was particularly interested to see their financial literacy tools for women, which emphasized graphic media for storytelling and sequential art as a means of communication. As they explained to the annual conference of the Institute for Money, Technology, and Financial Inclusion, before adopting this approach they found that the lack of information about banking alternatives was compounded by apathy toward generic information that “didn’t click.” To provide meaningful context, they developed an interactive storytelling approach using comic books that starred two major characters: Radha, who is always struggling with financial adversities, and Saraswati, her sensible money-managing friend. The researchers used real-life stories to compose the narrative.
Financial Literacy for Women Entrepreneurs
The literacy problem in India is serious: the country has 287 million illiterate adults, or 37 percent of all illiterate adults globally (UNESCO Education for All Global Monitoring Report). However, many countries have large populations of illiterate adults, and in the United States, public health efforts have enlisted comic books for decades (Schneider, “Quantifying and Visualizing the History of Public Health Comics”). 
Even in the supposedly conservative 1950s, Planned Parenthood used comics to get out the word about family planning. Selene Biffi was asked by the United Nations to write a public health comic book for Afghanistan. The experience inspired her to found Plain Ink, a nonprofit organization that makes graphically appealing, storytelling-oriented print materials for the developing world. According to their website, rather than donate books manufactured in the West, the organization supports “the use of local skills in the countries where we work” and strives to “find the best authors, illustrators, printers and distributors to collaborate with” to “create employment and contribute to local economic and social development.” A story on the organization in Fast Company includes some sample pages, which show children making a lid for a well and a sign warning of contaminated water. These panels need to communicate information efficiently, simply, and without ambiguity. Composition instructors can create interesting audience-oriented assignments that ask students to create comics for audiences lacking fundamental literacy skills, perhaps as part of a larger research project exploring a topic such as ways to ameliorate disease or the effects of natural disasters. As an example, faculty could show recent pamphlets with visual instructions about containing the Ebola epidemic. Explaining complex phenomena with simple illustrations can also pose a grand challenge to classes exploring different communication modalities. For example, how could global warming, or discoveries about the benefits of breastfeeding, be explained to non-literate people using only pictures? The peer-reviewed research may use relatively advanced scientific models, but the issues you assign should be ones that affect rich and poor alike.

Author
04-08-2015
07:03 AM
This blog was originally posted on April 18th, 2014. Think 1957. Think the inimitable Jerry Lee Lewis. Or Elvis Presley. Both sang about a whole lotta shakin’ goin’ on.
I said come on over baby, a-whole lotta shakin’ goin’ on
Yeah I said come on over baby, a-whole lotta shakin’ goin’ on
Well we ain’t fakin’, a-whole lotta shakin’ goin’ on
I was a high school kid in 1957, and little did I imagine that fifty-plus years later this song would keep popping into my head in relation to digital literacy and the ways it has helped us reimagine writing as, always, multimodal. And we ain’t fakin’! Since I teach courses on digital literacies and the digital essay, I decided that this year at 4Cs, I would try to go to every session on multimodal writing. Until I saw the program, that is: there were so many panels devoted to a range of perspectives on multimodality that I would have had to clone myself several times over in order to attend them all. That fact reaffirmed what I’ve been seeing as I visit schools across the country: writing programs are increasingly inviting their students to produce multimodal projects, with some pretty stunning results. Last month in Arkansas, for example, I heard a teacher describe an assignment that asked students to create and “pitch” proposals for new apps, and another teacher describe the animated smartphone mini-lessons she and her students were producing to help each other learn and retain material. On my own campus, instructors are guiding students in doing everything from digital research projects to beautifully illustrated and published storybooks. Most important, students I encounter continue to tell me that they are highly engaged and motivated by such projects. So I was delighted to hear that Bedford/St. Martin’s was sponsoring a Multimodal Celebration during the 4Cs meeting, where participants could showcase their students’ projects. 
When I arrived at the celebration, the large room was already jammed with people eager to see what students across the country had come up with. Lining three sides of the room were posters describing instructor assignments—along with examples of student work in response to those assignments. Liz Losh was there talking about her students’ mini-Comic Con; Erik Ellis’s students’ fabulous storybooks were on display; posters such as the ones seen below proved yet again that today’s writers are thinking about how to use visuals and infographics to get and hold an audience’s attention. These projects testified to the imaginative, creative, and serious work being produced by students across the United States. I was particularly thrilled, since I believe we are coming close to the point of not having to label such projects as “multimodal.” In sum, it seems to me that the word “writing” will soon carry with it the assumption (entirely justified) of multimodality. As we move toward that day, I see two areas that need our careful attention. The first has to do with colleagues who are still puzzled by or resistant (or indifferent) to multimodal writing, who don’t understand how all writing could be said to be multimodal. I sympathize with these colleagues: after all, writing has a way of changing on us—constantly, and we have had a steep learning curve ever since I entered the profession, as new and emerging technologies have shaped and affected what we think of as “writing.” So we need to find ways to link what may seem new and foreboding to the tried and true principles of rhetoric and to provide support and encouragement to those who are uncomfortable with multimodality. Second, we need more research on how to assess such projects, and in this regard we can turn to our students, creating rubrics together and testing them for accuracy. 
Luckily, both these areas of concern are already being attended to by leading scholars like this year’s 4Cs Co-Exemplars, Cindy Selfe and Gail Hawisher. From where I stand, I think it’s safe to say that multimodal writing is alive and well and prospering in writing programs across the country. No wonder that during the Bedford/St. Martin’s celebration, participants and attendees called for a follow-up celebration of student multimodal writing next year in Tampa – to loud applause. Oh yeah, there’s a whole lot of multimodalin’ goin’ on!

Author
04-03-2015
12:09 PM
This blog was originally posted on October 14th, 2014. In a recent conversation on the Council on Basic Writing’s listserv (CBW), a correspondent asked about minimum qualifications for teaching Basic Writing. A listserv discussion ensued about appropriate degrees and necessary training. As minimum qualifications remain a long-standing question for the theory and practice of BW, we examined this conversation as part of our Teaching Basic Writing Practicum. On the listserv, key theorists and practitioners from our field offer their insights. Peter Adams (whose co-authored article on ALP is included in Teaching Developmental Writing [TDW] 4e) and Gerald Nelms address the promise of studying student development as an essential part of BW teacher training. Michael Hill, new co-chair of CBW, inquires about the need for national policies on teacher training. Hill asks if policy work and best practices statements remain of concern to CBW members. For my own perspective on this conversation, I turn again to Adrienne Rich’s “Teaching Language in Open Admissions,” and her recently published course notes and syllabi for teaching Basic Writing in the SEEK program at City University of New York. In “Teaching Language,” Rich offers what she sees as the most significant qualification for a teacher of BW courses: “a fundamental belief in the students is more important than anything else….This fundamental belief is not a sentimental matter: it is a very demanding matter of realistically conceiving the student where he or she is, and at the same time never losing sight of where he or she can be” (TDW 4e 25). In other words, the student is not a problem to be solved, but a human being learning to write as a socio-cultural subject, within and beyond the constructs of a BW course. As Nelms suggests, students in BW do not arrive in our classrooms as “blank slates” (also see Shannon Carter’s work in TDW 4e). 
However, for me, this issue moves in a somewhat different direction from Nelms’ concern that “prior knowledge can both help and hinder learning.” Instead, I want to turn the question back on ourselves, as new and experienced teachers of BW: What about our own multiple literacies? What stated or unstated assumptions and values—as expressed in syllabi, writing assignments, and course activities—may become barriers to our own students’ learning? What can we do to recognize such barriers, and to begin to ameliorate them? In the practicum class, we attempt to address these questions through activities such as:
- Reading what others have written about the roles of their own socio-cultural backgrounds as learners and as teachers of BW
- Writing about and discussing our own socio-cultural backgrounds as learners and as teachers of BW
- Addressing the diverse intersections of students’ socio-cultural backgrounds
- Teaching model mini-lessons
- Tutoring at an off-campus site that does not have a writing center
As in other BW theory and practice courses across the US, we attempt to create a community of teacher/scholars who actively interrogate our own theories as we develop new practices. As individual teachers, even as all of us are apparently white, our socio-cultural backgrounds represent a diversity of life experiences, fields of study, and approaches to teaching and learning. Often we find that we need to agree to disagree. Perhaps just as often, I grapple with expanding my own comfort zone, so that I remain aware of the need to learn from students as well as to teach them. Because of the intersecting needs to interrogate and innovate, I welcome a national discussion of qualifications for teaching BW. Yet even as we undertake such a discussion, we need to recognize the diverse roots of our field. Adrienne Rich, who had only a BA when she taught BW at City College, remains one of the field’s foundational teacher/scholars. 
Her work offers a keen understanding of the role of critical awareness for teachers of BW and also helps us to address a key issue for aspiring teacher/scholars in BW: not only what we need to know, but perhaps more significantly, why we need to know it.

Author
04-02-2015
09:30 AM
In my last blog I discussed the importance in critical thinking of precisely establishing what, exactly, one is thinking critically about. As I continue to ponder the essence of critical thinking—both as co-author of Signs of Life in the U.S.A. and in my current role as assessment director for my university—I am experimenting with ways of conveying, to professors and students alike, what, exactly, critical thinking itself is. My task is not made easier by the fact that a lot of what passes for critical thinking is really critical reading—as when a critical "thinking" assignment is to unpack the argument in an assigned reading. Critical reading, of course, is an essential skill for college students, who must master it both for their collegiate careers and for their lives beyond, and it does bear a close relation to critical thinking, but it is not the same thing. Critical reading, one might say, is the equivalent of establishing the whatness of someone else's text; critical thinking goes beyond that—often to the expression of one's own argument, but before getting to that argument (which is a rhetorical act) one has to do some critical thinking. I put it this way: critical thinking is a movement from what to . . . so what then? It enlarges upon the recognition of something (an argument, a phenomenon, a problem) and reflectively seeks a further significance, or, in the case of a problem, a solution. Let me take a simple example from the business world (I choose a business example because that is the world towards which most of our students are destined, and because business surveys consistently complain that new employees can't think critically). So, imagine that you are in the soft drink business, with an emphasis on selling sweetened sodas, but your sales are falling. The reduction in sales, in this case, is your what, which is also a problem demanding a solution. 
To solve the problem you need to do some critical thinking, and the first thing is to find the cause for your drop in sales. This can involve testing hypotheses—for example, "Is it because our product doesn't taste good anymore?" Some research is likely to show that sweetened soda sales are down across the board, so taste probably isn't the cause of the problem. So, a second critical question would be "Is there something wrong with sweetened sodas?" Here, you can situate sweetened sodas into a larger system involving public health, wherein sweetened sodas are receiving a lot of blame for America's obesity problems. You might jump at this point to a solution: "OK, we'll crank out some new diet soda products"—which is exactly the sort of thing that has happened a number of times in the history of soft drinks. Except this time, further critical research will show that diet soda sales aren't doing so well either due to a growing concern about health implications of artificial sweeteners. So maybe another diet product isn't the solution to your problem. But what about naturally flavored soda waters? I think you can now see what I'm doing here: essentially, I've reverse engineered something that has clearly taken place in a lot of soft drink manufacturing boardrooms recently, because America is currently awash in naturally enhanced flavored soda waters, with more varieties appearing practically every day. That didn't happen by accident. It happened because a lot of business people went through a what to so what then? critical thinking process. In an era of information overload, when just about everyone is accustomed to receiving enormous amounts of information without thinking much about it beyond tweeting it here, or pinning it there, this simple, yet profound, movement from what to so what then? needs to be pointed out. 
As I play with the idea (writing this blog is a form of playing with it) I am hoping to solve a problematic what that especially afflicts assessment: the fact that while just about everyone agrees that "critical thinking" is an essential university skill, no one can agree on what, exactly, critical thinking is. Will I solve my problem with a what . . . so what then? explanation? I don't know yet, but I have arranged a test of my hypothesis to see what happens. I just hope that it doesn't end in . . . you know, whatever.

Author
03-05-2015
08:30 AM
So Disney is returning once again to that old standard, the story of Cinderella, doing it over but with live action this time. And therein lies a semiotic tale. Because the Cinderella story provides a very good occasion for teaching your students about cultural mythologies, and the way that America's mythologies often contradict each other. In the case of Cinderella, one must begin with the fact that it is a feudal story in essence, one in which a commoner is raised to princess status, not through hard work but through a kind of inheritance: her personal beauty. Such a narrative very much reflects the values of a time when social status was usually inherited rather than achieved. Thus the fact that the Cinderella story (and don't forget Pretty Woman) has been told with popular success again and again in post-feudal, bourgeois America, is significant. As I noted in my blog on Frozen, what makes the reprise of such stories meaningful is the way in which they contradict the bourgeois mythology that links social status with hard work—something that sociologist Max Weber called the "Protestant Work Ethic"—while simultaneously contradicting the American mythology of social egalitarianism. In effect, we find a striking contradiction here between ideology and desire. Most Americans, I believe, would still claim a powerful allegiance to the ideologies of hard work and of social equality: those mythologies are very much alive. But at the level of desire, Americans flock with their children, again and again, to feudal Cinderella stories that neither challenge a world of princes and paupers nor question a happy ending of social status achieved through . . . small feet. Widening the cultural-semiotic system in which the Cinderella story functions, we can see that America has a lot of high cultural literary productions that openly challenge the ideology of the work ethic, but from a very different angle. 
From The Rise of Silas Lapham to The Great Gatsby, The Rise of David Levinsky to An American Tragedy, we find tales of the corruptive effect of social success achieved through effort. The pursuit and possession of wealth in these stories are presented as spoilers of what America should be about. So, we have a tradition of high cultural questioning of a crucial American mythology (an "American Dream" achieved through hard work), and a string of highly profitable low cultural appeals to glamorized feudalism (and don't get me started on The Lord of the Rings, a story that I adore but which is, nonetheless, one long paean to the divine right of kings). But it gets even more complicated when we bring gender codes into the analysis. Because it is no accident that the feudal fantasies involved in the Cinderella story invariably involve girls and women as the rising protagonists, while the literary critiques of the money-corrupted capitalist always involve men. So from a gendered point of view, all these Cinderella narratives are telling the little girls who are taken to see them that what they should work on is their personal beauty and personality, and some "prince charming" will take care of the rest. Little boys, on the other hand, are being told, in effect, to ignore the warnings of Fitzgerald and Dreiser, because what matters for men is to achieve princely (meaning moneyed) status. In short, the most conservative of gender-coded behaviors are being promoted through the endless reprising of the Cinderella story, and this matters a lot at a time when the most probable real-world avenues to economic success in America involve hard study and hard work in technical disciplines that are traditionally coded as male. It's the same old story.

Author
03-04-2015
06:30 AM
Work on Emerging 3e is, thankfully, coming to a close. Don’t let anyone ever, ever tell you that writing a textbook is easy. It’s much more work than I ever imagined. Right now I am working on the new sequences. We’re going with eight brand-new sequences, touching on every reading in the book and including two new research-based sequences. What’s on my mind is the nature of intellectual labor, particularly in relation to teaching. You know, one of my colleagues pointed out that when someone asks us about our work we’re likely to talk about our research, but the truth is that the bulk of the actual work we do is connected to teaching. For me, working within composition, pedagogy, and writing program administration, the relation between my research and my teaching is even stronger. My passion and my intellectual labor—my work—are deeply connected to teaching: to the classroom, to the design of courses, and to the shaping of assignments. I’m not sure the depth of this intellectual labor is always recognized by departments or the institution, which is a real shame. I will say that crafting each sequence for Emerging involves re-reading each essay I plan on using, thinking about the ideas of each, thinking about those ideas in relation to each other, considering how these ideas sequence, carefully wording assignments to guide students to explore those connections, crafting questions to prompt students’ thinking, and integrating work from other assignments connected to the readings. That’s a lot of thinking. So much has been written about the status of composition and its laborers within the institution. I can’t help but think that if we continue to foreground not just the work but the intellectual work we do, then perhaps we can begin to shift the conversation and then the culture. Or maybe I am being totally unrealistic. Thoughts?

Author
02-18-2015
06:30 AM
I just made my reservations for the Conference on College Composition and Communication (CCCC). Wow, some lessons learned. The first lesson: reserve rooms early. I couldn’t get into the host hotel or the backup hotel or even the backup, backup hotel. I’m only about a mile away from the conference but I know from past experience there is no greater pleasure than getting through a long day of panels and then simply stepping into an elevator and collapsing in my room. This year I will be taking a hike before collapsing. I have to admit I was really kind of shocked. I just never expected it to be that hard to find a hotel room in Tampa of all places. The second lesson is closely connected: CV lines are expensive. I tried every traveler’s trick I know, including Kayak, Orbitz, Hotels.com, AAA discounts, state government rates—everything. I still can’t believe it costs $250+ a night to stay in Tampa. When all is said and done, I will be spending about $1,000 to attend the conference. Luckily, it’s just across the state from me so I can drive there. If I had to fly in, that cost would be even higher. That’s a lot of money, it seems to me, for a line on one’s CV (especially since I am not presenting this year and so, really, it’s not a line on my CV). It prompts me to think about the costs of tenure: the money we invest while on the tenure track to get our work out there, to stay current, to connect to others, and to move towards tenure. The cost problem is compounded for me since I won’t be getting department funds to travel this year, as I am technically “out of unit” and up in the dean’s office. I’m trying to think of this as a critical investment in my career but it’s a tough sell to my bank account. Third lesson: they do an awesome job with the conference. Yes, I’m in sticker shock thinking of what I am paying for where I am staying. But in getting things together for the conference I was really impressed with all the work they’re doing. 
I watched some YouTube videos about the location, I see they have more poster sessions (with cash awards!), and super kudos to Joyce Carter for all that work—there are a ton of new features to look forward to. I’ll be sure to enjoy many panels and will delight in seeing professional friends that, really, I only see at Cs. But I have to admit what I look forward to the most is the Bedford party. For me, it’s the highlight of the conference. Hope to see you there. And you can bet I will be taking these lessons with me as Cs moves to Houston in 2016. I’ll be saving up, booking early, and thinking about some new formats to share my work.

Author
11-19-2014
12:49 PM
Nick Marino, our guest blogger for this week, is a first-year student in the MA program at Florida Atlantic University, specializing in 20th-century British Literature. He lives with his cat in South Florida, a place he finds oddly inspiring. I’m with Nick on this meditation about the use of personal technology in the classroom, even though Richard Restak’s “Attention Deficit: The Brain Syndrome of Our Era” argues rather persuasively that multitasking is a myth. In the classes I teach, I encourage “responsible” use of technology like smartphones: pull it out to bring up a reading, research the author on the internet, check your calendar, or even log in to Blackboard. Need to answer that text or call? No problem. Discreetly step outside. I’m always a bit amazed that students find even this rather liberal policy challenging, texting in class anyway. Maybe Nick’s thoughts can offer me some new directions. What do you think?
I don’t care if my students use their phones in class. This is apparently a bad attitude for a teacher to have. I’m told that I should care. I’m told that this stance causes my students to think they can use their phones everywhere. I’m told that letting them use their phones in class means that they won’t respect me, and teachers do need to be respected. My attitude toward phones in class is a little more complicated than that. Not caring suggests that I would express no preference given the choice between having them stare at their books and my face or their phones. I don’t want my students to use their phones in class but, except in extreme circumstances, I will not stop them from doing so. I should disclose that I haven’t told my students how I feel about cell phone use in class. I tried to be strict about it on the first day of class, while reading the policy on my syllabus. Since then I’ve barely brought it up, nor have I called a student out for looking at their phone. 
I haven’t had an extreme circumstance thus far, such as what happened to a colleague of mine. One of her students answered a phone call in class (unapologetically, I’m told). My colleague confronted the student in a professional manner and later sent an email to the class stating that answering a phone call in class is inappropriate and will not be tolerated. This was the right thing to do, because the disruption that student caused certainly affected the ability of his peers to learn. On the other hand, I don’t think that a student pawing at his Yik Yak feed distracts his neighbor enough to warrant a confrontation. There are two reasons why I don’t stop students from using their phones in class:
- I don’t have the disposition or visual dexterity to catch, punish, and reform students who use their phones in class.
- Even if I did, my students would most likely retaliate by being reticent in class and/or by filling out negative course evaluations at the end of term.
My reasoning reeks of self-preservation, but the reasoning behind policing phone use in the classroom is not as ironclad as it seems. I suspect teachers don’t want their students to use their phones in class because it is rude and disrespectful and because it impedes their ability to pay attention and learn. Using a phone in class is rude, but rudeness is subjective. One of my undergraduate English professors considered it rude and got very offended if a student yawned in class. Yawning is a bodily function, though, unlike tweeting. But how much is it our responsibility as teachers to ensure that our students learn proper manners? If one of my students passes my class and later gets in trouble for checking their phone in the presence of an eagle-eyed professor, or perhaps later on after graduation, in front of their boss, do I bear any responsibility for their sorry fate? 
I guess what this is about is whether I can change the life of an 18-year-old who did not choose to take my class and may not have even chosen to enroll in college. Or better yet, can I teach them anything that has utility beyond school in general? I think I can. I'm just not sure that I can teach them how not to be rude. I agree that it's difficult to pay attention to someone speaking if I'm checking my phone or computer. That's why I almost never do it. I spent a lot of time and money to get into a graduate program that pays my tuition. I want to get as much as possible out of my dual roles of teacher and student. To me this means giving my undivided attention to whomever I am speaking with, free from the distraction of social media (if only for a given time). But I don't know about my students. I get the feeling that what they know about apps, social media, and pop culture greatly surpasses what I know. At any given time, roughly one quarter of my class is engaged with their phones. When we're watching a video in class or when we're doing peer review, that figure goes down. When I'm lecturing or awkwardly trying to stimulate discussion, it can go up. The interesting thing about the students who use their phones in my two class sections is that they cannot be pigeonholed. A student's gender, ethnicity, personality (that is, talkative or quiet in class), and writing ability do not correlate with their use of cell phones in class. I have students who are strong writers and who listen to my feedback on their papers even though they frequently check their phones during class. I also have polite students who abstain from using their phones in class but struggle with their writing and make the same mistakes that I caution them against both in class and through written feedback. I don't necessarily agree with the belief that my students cannot pay attention to me, the text, and their phones at the same time. 
Consider that today's college student could very well have grown up using the internet as soon as they learned how to walk. It's important to recall, though, that internet access is influenced by class and ethnicity, a fact that's easy to forget on a college campus with abundant Wi-Fi. Nevertheless, I think it's significant that this generation has grown up with the internet. My students probably cannot remember when snail mail wasn't a bureaucratic inconvenience but the fastest way to send and receive large amounts of information. They probably cannot remember when gas stations, elevators, and restaurants didn't have TV monitors informing them of tomorrow's weather and the latest on the Kardashians. My students are jaded, inured to technology. When Tinder came up in class, they described the idea of it to me without considering the ramifications of judging someone based only on a picture that may not even be of the person who created the profile. Likewise, they probably don't see how disturbing it is that social media is obsessed with garnering approval. After all, great works and events generally arise out of some form of dissent, but dissent may not get you upvotes. Of course, some of my students do express how new technology has sobering effects on society. This usually comes up in their papers but not during class discussions. Naturally, I wonder whether they really feel that the effects are sobering or whether they write what I will agree with so as to get a better grade. My students are skilled at multitasking in an age that demands it. If you're reading this blog post with other browser tabs or programs open on your computer, then you are multitasking. Even if you only have this post on your screen, you still have the links and search box on the side panel competing for your attention. The days of watching TV with only the volume and channel graphics as the interface are over. 
What's trending, what's hot, what's new: these are all part of the 21st-century zeitgeist in which distraction is inevitable. Our phones are crammed with apps that we can check at any given moment. If we use computers to take notes in class, we're prone to emails popping up as they come in, pulling our eyes away from a person's face just for a second to check. If your phone is within sight while you read this, then you are probably multitasking and not giving my words your undivided attention. I don't take it personally, though, since undivided attention is becoming an increasingly rare commodity. The days of paying attention to a single thing, like the page of a book or a person lecturing without the benefit of PowerPoint (or some other visual stimulus), are gone. Whether this is good or bad for our future is debatable. What's not debatable is that the Millennial generation knows how to adapt to this reality because it's not new to them. They aren't awed by the internet as I sometimes am. My point is that today's college student is attuned to this reality without consciously knowing it. And I would argue that, as rude as it may be, students still can learn in my class even while being distracted by their phones. Enforcing the cell phone restriction on my syllabus isn't worth the effort because my students will find some analog way to distract themselves. One student who sits in the first row alternates between checking his phone and carefully sketching out what looks like tables or spreadsheets in his notebook. There's no way that doing this helps him learn how to write English papers, but am I to stop him? Should I confiscate his notebook to teach him a lesson, as I'm told stricter instructors do with cell phones? I'm deathly afraid of confiscating a student's cell phone given how attached they are to them and how litigious our society is. Perhaps there's another reason why a sizeable number of my students use their phones in class. 
Maybe they are so excruciatingly bored that they cannot help it. I don’t believe that theory for a second.

Author
11-06-2014
10:48 AM
This week's guest blogger is Rebecca Jensen. Rebecca is an MFA student at Florida Atlantic University, where she teaches two classes of first-year composition. She worked as fiction editor for Driftwood Press, a literary magazine based in Tampa, and is currently nonfiction editor for FAU's Coastlines. After sixteen years spent living in England, Rebecca is enjoying her rediscovery of Florida, using the experience to investigate themes of travel and identity in her own creative work. In this post Rebecca turns the question of revision back on us. I have to admit that after reading it I realize I can't readily articulate how I revise either. "But Miss Jensen, how do you revise?" It's my first semester as an MFA student and instructor of English, so you would think that I'd be able to answer this with ease. Yet the question posed by one of my students caught me off guard. One of the most important qualities I have always looked for in a teacher is confidence, and I hope that this is what my students usually see in me. So when I was faced with this question, I hated to admit in front of them all that I don't actually know how to do it. I don't have a specific technique, and I don't hold the key to the revision process. Initially, when I thought about assigning papers, I didn't think about the time that would pass between students receiving the assignment instructions and their papers appearing on my desk. Monday morning would inevitably roll around and a stack of freshly printed and proofread papers would await me. I assume that my instructions are clear until a student email pops into my inbox and a face peers around the door of my office. I forgot how much labor really goes into writing and revising, especially for non-English majors. I didn't realize they would agonize over my papers; I thought of revisions as just things that happen and things that, ultimately, I would have to grade. 
Instead, I am constantly facing questions: how do I do it myself, and what am I doing to help my students succeed in their revision process? The way in which I tackle revision depends on the type of paper or assignment I am working on. For a creative piece of writing, it might take me hours or days just to alter a few sentences. I'll play them over in my mind even when I'm not at my desk or in front of a computer screen. But an academic paper usually has some sort of strict deadline, so I'm rushed into making changes. Often I'm afraid to cut and delete sections of my academic work, because what if I don't come up with something else to fill the gap before my class deadline? I'm a hypocrite. I tell my students not to be afraid to remove paragraphs or restructure sentences. Do it! See what happens! Be bold! They look at me with terrified faces, imagining their essays torn to shreds, destroyed. The revision process is a personal journey. I read an article by Stephen Sutherland recently, entitled "Reading Yourself: Revision as Ventriloquism," in which he explains that revision is something we teach but never see in action. The student undergoes this journey alone. The only things that my students have to guide them are their instinct and my written feedback on previous papers. I can only hope the comments I leave are useful. I'm making a conscious effort to steer away from dropping brief hints like "word choice" or "informal language" on their papers. Instead, I try to explain ways they can improve their discourse and acquire the formidable academic tone. In the classroom, I am trying to mirror my written feedback in my lectures. By discussing what it means to receive these vague comments on their drafts, I hope that my students understand that it is not a lack of concern but rather a shortness of time that prevents me from leaving comments that are detailed and fully expressed. That's what office hours are for, I tell them. I wonder if this is enough. 
Am I doing everything I can to help them? Is it okay to tell my students that there isn’t one set revision technique that is guaranteed to work, that I’m stumbling through it (and so far succeeding) and so will they? Is there a right or wrong way to revise that I just haven’t discovered yet?

Author
10-30-2014
07:32 AM
A few days ago, a piece of fan mail flooded in. So OK, it was really an email from a former student hoping that I would address the reaction to the Ebola epidemic. At first I was reluctant to go anywhere near the topic (for reasons that will emerge presently), but I've come to the conclusion that this could be a very good "teaching moment" about semiotic analyses (besides, I can hardly afford to disappoint my few readers here), so here goes. The first thing is to review exactly what a cultural semiotic analysis does. It moves from the denotation of a sign or semiotic topic (that is, what it is or what its primary significance is) to its connotation (that is, what it suggests or signifies at a broader cultural level). This movement proceeds by way of a placement of the denotative sign into a system of relevant historical and contemporary associations and differences. A lot of different people have already essentially done this with respect to the Ebola epidemic. Some are arguing, in effect, that the epidemic signifies (connotatively) a failure on the part of the presidential administration. Such an interpretation accordingly situates the sign, implicitly or explicitly, within a system that includes the upcoming November elections, the current unpopularity of the president, and a general (or, at least, widely reported) sense that things are not quite under control in this country at present. Of course, this interpretation is politically motivated and is usually presented for partisan electoral purposes. The converse interpretation, which also often has political overtones, interprets the reaction to the Ebola epidemic as an act of mass "hysteria," and (at least implicitly) decries those who are using it to bash the president. Then there is the way that the mass media are using the epidemic as clickbait and for other audience-generating purposes. 
With my local CBS news radio affiliate now including regular "Ebola Updates," even though the disease has not appeared in Los Angeles, I can readily see how the mass media have more or less construed the sign of Ebola as something looking like this ($). But underlying the political and the commercial significations of the sign "Ebola" lies something more fundamental, which is, quite simply, fear. It is this fear that makes Ebola something that can be exploited for political or profit-making purposes, and it too needs analyzing. Ebola fear stems from a number of unknowns. First, there is the unknown involving just what, denotatively, Ebola is. How infectious is it? Is it the "coming plague" that we have been warned about? Will it mutate into something more infectious? Could it spiral out of control? To these questions no one can offer confident answers. This is why we see some pretty strong reactions to the epidemic that are neither partisan nor a reflection of media greed. Such reactions come from nations like Jamaica (which has banned inbound flights from affected west African nations), from individuals like Los Angeles's Congresswoman Maxine Waters (who has called for Ebola preparedness at Los Angeles International Airport), from Mexico (which blocked the docking of a Carnival cruise ship on Ebola worries), and from colleges that have discontinued student admissions from Ebola-affected countries (like Navarro Community College in Texas). And then there are the nurses, who have long been asking for better equipment and training and who are renewing those calls in the wake of the epidemic. Some of the new protocols that are now appearing (including medical hazmat suits that leave no portion of the skin uncovered, and which also call for trained observers to watch medical personnel as they take their suits off after patient care exposure) are not reassuring. When we take such things into consideration, we can see that the Ebola epidemic fits into yet another system. 
This system includes all the signs that potentially fatal infectious diseases (which have been on the run ever since modern medicine began to develop both vaccines and the antibiotic treatments that floored such one-time killers as tuberculosis, pneumonia, and the casual infections that we now hardly notice) are making a comeback. AIDS is a signifier in this system, and so is the very real problem of antibiotic overuse that is already undermining the effectiveness of the "silver bullets" we have come to take for granted. Within this system, Ebola can be very scary indeed. For this reason, I am inclined to withhold judgment. I simply am not certain what Ebola is—what, that is, its full denotation will prove to be. The sources of my information (the public mass media) give me not only sensationalized reports but also fumbling misstatements from the CDC (a lawsuit against the CDC seems to be brewing in Dallas on the part of the second Ebola-infected nurse, whose actions in the wake of her initial fever were, her lawyer claims, misrepresented). Since I do know that the Ebola virus is a really nasty killer, and that it is infectious (much more infectious than AIDS), I am not inclined to interpret Ebola fear as mere "hysteria." Basically, I think it is better to wait until we know more about the denotation here before moving toward connotation.

Author
10-29-2014
10:30 AM
My guest blogger today, Jenn Murray, has spent the last 16 years as a Midwesterner trying to adjust to life in South Florida. After many years at home with her children, Jenn is currently in her first year of the MA program at Florida Atlantic University, where she is studying multicultural literature and trying to narrow her research interests enough for a thesis. Jenn's post isn't only about the stages we all go through in emerging as teachers. It's also about the ways in which teaching makes us better writers. I have to admit—I never thought about this before. But in taking a moment to reflect I realize she's absolutely right. When I am writing an article I have a much sharper sense of my argument and what it needs to do, a clearer sense of my organization and the moves I want to make, and a surer understanding of what evidence I want to bring to bear. A lot of that comes from experience in the discipline, but now I can see how parts of it come from teaching writing. Cool. I know you've heard the saying. We've all heard it at one point or another. "Those who can, do. Those who can't, teach." It's one of those snarky comments that get tossed around without much thought, but I am doing a lot of thinking about it right now. I have grown quite accustomed to walking to the front of the classroom at the beginning of every semester. My tendency to daydream is well documented—beginning in Kindergarten, each report card was emblazoned with a handwritten note from the teacher, some iteration of "She is very bright, if she would just stay focused in class"—so I always used sitting up front as a success strategy. It's sometimes awkward, but if I am up front I am less likely to daydream. In a lot of ways this semester is no different. And yet it is very, very different. This semester, walking to the front of the classroom does not always involve stopping at the first row of seats. Often it means walking to the front and taking my place as teacher. 
This is a whole new level of awkward. Armed with my copy of Elements (our program's custom supplemental text) and a binder full of tips and strategies, I approached this role with quite a bit of trepidation. "Those who can, do. Those who can't, teach." Sure, but what if I simply can't teach? I walked into that first College Writing class terrified that I wouldn't be able to do the job. No way would I quit, but could I actually do it? Looking at the classroom full of students, many of whom were sitting in their first college class ever, it occurred to me that we were in this thing together—and we had to make it out together. So we dove in. We are well into the semester now, and I have managed to cover an incredible amount with my students. We have read essays, answered contextual questions, and debated some pretty hot topics. I am seeing improvement in the work that they are submitting. But there is something else, too. I am seeing improvement in the work that I am doing. I am looking at my own work with a sharper eye. I am thinking more carefully about my research and evaluating the structure of my own writing a little more critically. I have come to the realization that that old saying may be wrong. Those who can, do, sure. But those who teach? Sometimes they manage to do better.

Author
10-15-2014
10:30 AM
This week's guest blogger is Katie Schipper. Katie is a graduate student in the English department at Florida Atlantic University. She currently teaches two sections of first-year composition and believes in the value of writing as a means to express what we know and as a tool to acknowledge how much we have to learn. She also has two cats. I love that Dawn Skorczewski's essay resonated so much for Katie; it did so for me this semester as well. And she's getting at an issue that I frequently return to: who gets to teach composition (and why)? In framing her "vague" qualifications I think she's pointing not just to her emergence as a teacher but also to deeper institutional issues. Who teaches composition at your school? And are they only "vaguely" qualified? One of the first things I said on my very first day of teaching, to my very first section of first-year composition students, was "I'm a graduate student, so I'm vaguely qualified to teach this class." That might have been a rookie mistake. What's that they say about not letting them see you sweat? But a few students laughed, and that was my goal, and more importantly it's too late now—I mean, I said it. And the reality is, I am only vaguely qualified. I've done various teacherly jobs, I've written page upon page upon page (ad infinitum) of expository essays, and I've read even more—and those are my vague qualifications. I didn't really have a vocabulary for how I was feeling until I read Dawn Skorczewski's essay "From Playing the Role to Being Yourself: Becoming the Teacher in the Writing Classroom" in Bedford/St. Martin's Teaching Composition. Then I saw that I was in good company. I realized all (or, to be safe, most) teachers feel like frauds at some point in their teaching careers. 
I also realized that maybe, like new parents who live in fear that they’ll do something terrible to their infant, I lacked the experience that comes with the making of mistakes as well as the realization that mistakes are inevitable—and vital. I think now that this little admission brings me closer to the students sitting in the classroom. When I tell them that their writing can have as much authority as the essays they read in Emerging, I mean it. When I suggest that they’re granted agency by the mere act of putting words on a page (much in the same way that I am granted agency by showing up and standing in front of a class of college students even if in some moments I feel like a vaguely qualified fraud), I mean that too.

Author
09-24-2014
10:30 AM
I just finished rereading Sondra Perl's essay "Understanding Composing," reproduced in the excellent Bedford resource Teaching Composition: Background Readings. I'm teaching Perl in our pedagogy course for new graduate teaching assistants, ENC 6700 Introduction to Composition Theory and Methodology; the essay forms part of a cluster of readings on drafting and audience. I've been thinking about Perl's use of "felt sense," the internal, somatic feelings that exist prior to and result in a piece of writing. Perl's use of the term certainly resonates with me as a writer when I am writing things such as this blog. But I am wondering how (or if) I can use it in the composition classroom for expository, academic writing. That is, I am wondering to what extent felt sense relies on a writer's investment in the project. Does felt sense only come into operation when the writing matters to a writer? Do we have to care to evoke a felt sense? Or what happens when our felt sense in relation to a writing project involves procrastination, distaste, revulsion, disdain, or any number of non-generative emotions I imagine students in the classes I teach might have? I'm thinking I might explore the affective dimensions of composition in my classroom, perhaps by having students follow an exercise like the one that opens Perl's essay: recording out loud what they are thinking, doing, and feeling as they write. It might help some students connect to the class, but more importantly it might help struggling students identify (and then perhaps divest themselves of) emotions related to the composing process. Have you considered the emotional dimensions of writing in your classes? What works?

Author
09-17-2014
11:30 PM
No, I'm not going to post a YouTube video of myself getting doused in ice water, and, indeed, by the time this posts, the ice bucket challenge will have probably morphed into something else anyway—most likely a series of parodies. Rather, I wish to submit this latest of virally initiated fads to a semiotic analysis, seeking what it says about the culture that has so enthusiastically embraced it. As always in a semiotic analysis, we begin with a system of associations and differences, and with some history. The actual act—dousing someone with a large bucket of ice water—of course refers back to a once spontaneous, and then institutionalized, end-of-Super Bowl ritual by which the winning coach is sloshed with the melted remains of the Gatorade barrel. That is part of the system in which we can locate the current fad, but already we find a significant difference. That difference lies in the fact that the Super Bowl-related ice bucket prank is not only an act of celebration but one celebrated by a highly elite masculine club (in fact there is a faint aura of hazing about it), while the ice bucket challenge is an act of pure populism. Not only can anyone participate, but it is, by definition, a mass activity through which individuals are "called out" to participate (indeed, there is a certain whiff of coercion about the matter, a trick-or-treat vibe that caused even Barack Obama to say "no thank you, I'll just make a monetary contribution"). Thus, the ice bucket challenge can be associated with such medical research fundraising activities as wearing yellow Live Strong bracelets or participating in walkathons, but it is also a reflection of a hetero-directed society whereby (in this case benignly and for a good cause) individual behavior is dictated by group pressure. 
America, which prides itself on its tradition of individualism (this is one of our chief mythologies), has a hetero-directed tradition as well that goes all the way back to the founding of the Massachusetts Bay Colony. For the people that we know as the “Puritans” (their own name for themselves was the Congregationalists) had a very group-oriented worldview, one that compelled every individual member in the Congregation to demonstrate to his or her co-religionists the signs of salvation, or face expulsion. The tug-of-war between staunch individualism and hetero-directedness is one of the most enduring contradictions in American history and culture. In some decades (the fifties are notorious for this), hetero-directedness weighs more heavily (it isn’t called “hetero-directedness”, of course: we know it as “conformity”); in other decades, anti-conformist individualism is dominant (the sixties generation at least viewed itself as anti-conformist). The tug-of-war at present is especially complex. On the one hand, digital communications technology has been a tremendous nurturer of hetero-directedness. From the sudden viral explosions that produce flash mobs, zombie walks, and, yes, the ice bucket challenge, to the constant sharing of individual experience on the world wide web, digitality has created a global hive that is always abuzz with Netizens caught up in a network of constant group behavior. But on the other hand, we are also living in an era of intense libertarianism, a hyper-individualism often expressed, paradoxically enough, by way of the same social media behind the global hive. It is this sort of non-dialectical mixture of individualism and hetero-directedness that makes America such a culturally complicated, and, well, paradoxical place. While revealing such paradoxes does not resolve them, it at least helps us to understand ourselves as a society a bit better.