Bits Blog - Page 38

08-20-2014
02:22 PM
We recently snagged a large grant from our school’s technology fee to outfit AMP, our Advanced Media Production lab. It’s filled with geeky love including 15 high-end fully-kitted iMacs, a clutch of HD video cameras, a Livescribe pen, Final Cut Pro, Adobe Creative Cloud, and a 3D scanner and printer. Initially we’ll be using the lab for our graduate courses but the idea is that what students learn in their grad classes will trickle down into their own teaching. That’s my explicit goal for the fall. I’m slotted to teach ENC 6700: Introduction to Composition Theory and Methodology, our pedagogy course for new Graduate Teaching Assistants. I’m planning multiple sessions in the AMP lab: one for our discussion of readings about teaching and technology, one for us to learn how to use some of the tech in a very hands-on way, and one for production where students will design a lesson plan or tool to use in their own classrooms. I’m not sure what we will end up producing. I’ve always thought it would be interesting to visually represent an essay or a paper’s argument in three dimensions but I think it would also be interesting to create tutorial videos on common issues from a classroom. I hope to return to this at the end of the semester, so that I can share with all of you what happens on the near-bleeding edge of technology. In the meantime, if you had your ultimate playground computer lab, what would you include?

07-31-2014
09:30 AM
One of the most common demands made upon colleges and universities today is that they must teach "critical thinking." However, as a great believer in the teaching of critical thinking, I feel it is incumbent upon all of us who teach it to be very clear about just what we think critical thinking is. I have offered my own semiotics-based take on the matter in this blog before and will not repeat it now. My focus this time will be on the sorts of standardized multiple-choice tests that have been offered on critical thinking for assessment purposes. Having looked at some of these tests, I can conclude that while they do contain some of the elements of critical thinking (specifically, the ability to distinguish logical fallacies from sound logic, and pseudo-argument from valid argument), they are still very incomplete in their approach to the subject and need to be supplemented by what I will call the empirical side of critical reasoning. Here's why. It is perfectly possible to construct a logically valid argument on the basis of false information. For example, if it were true that there is no global warming going on in the world, no climate change, and no increase in atmospheric greenhouse gases, then it would be logical to argue that nothing needs to be done about the problem because it doesn't exist. This argument is being made right now and I presume that my readers will see what's wrong with it, but I'll spell it out: the empirical facts as determined by virtually every reputable climate scientist on earth dispute its grounding premise. In other words, to think critically about global climate change, one has to study the science of the matter, and only then can a valid and logical argument be made.
(It is worth pointing out that when one of the last holdouts among prominent climate scientists finally conceded that the scientific evidence indeed pointed to anthropogenically induced climate change, he was denounced on personal grounds by climate change deniers, not logical or scientific ones. See how the Christian Science Monitor reported the story in 2012 here.) To generalize: critical thinking includes logical and rhetorical skills (they are necessary), but such skills are not sufficient. Every problem in critical thinking requires knowledge of the relevant facts. These facts can be scientific, or historical, or mathematical, or based in any number of other knowledge disciplines, but without knowledge of the facts (call it "content"), there cannot be adequate reasoning. This is why "reasoning skills" cannot be dissociated from content-based education in science, history, and so on and so forth. I am perfectly aware of the postmodern and/or poststructural objection to my position, an objection based in a deconstruction both of reason itself and of the existence of any facts apart from values. Having written an entire book contesting this point of view (Discourse and Reference in the Nuclear Age, 1988), I am not going to attempt to refute it here. I'll only say this (echoing something Bruno Latour has written): if you don't accept scientific (or other forms of) factuality, then you have no basis on which to challenge climate change denial. And, more to the point: while you may have a basis for "critique," you do not have a firm basis for critical thinking. This is why the critical thinking apparatus of Signs of Life in the U.S.A. is grounded in Peircean rather than structuralist or poststructuralist semiotics. Charles Peirce was a philosophical and scientific realist. He acknowledged the mediational role of signs, but wrote that semiotic systems are grounded in reality.
I will concede that no one can finally prove the truth of this perspective, but from a Pragmatistic point of view it offers a far more effective basis for the teaching of critical thinking than one that offers no answer to those whose arguments are founded in made-up "facts," or in no facts at all.

07-30-2014
09:30 AM
I’m headed to Boston this weekend and that has me pumped, for two reasons. First it means time with my partner (woo hoo). Second it means that we’re starting work on Emerging 3e (super woo hoo). I’ll be meeting with my Bedford editor (Beditor?) to go over reviews for the next edition, and I have already dashed off my own cockamamie ideas. Both of these are really limited pools; there are only so many reviewers and there’s only one me. But hey! Look! There are a potentially vast number of you! So, anonymous reader from the Interwebs, what suggestions do you have for Emerging? Never heard of the text? No worries. Tell me what you want in your dream textbook of contemporary readings and I will take it from there. So what should we be teaching? How? What apparatus do you want? What matters more—price or content? What have publishers totally been missing when it comes to readers?

07-23-2014
09:20 AM
Whisper, not unlike Snapchat, is another increasingly popular app. Whisper allows people to share secrets anonymously, accompanying each secret with a photo. I’ve been exploring the app, enjoying its voyeuristic pleasures and discovering that many use it (not unlike Snapchat) for sexual ends. It strikes me that Whisper is an immediate, uncurated, digital version of PostSecret. I think it would be interesting to teach them together, asking students either to use Whisper to create their own PostSecret-like visual arguments or to consider how the two differ—particularly what happens when secrets are freely posted without anyone looking over them. What’s particularly interesting about Whisper is that it allows replies. Others in the class could offer feedback on a visual argument through visual arguments of their own. I remain a bit concerned about how open this sandbox is and just who might be wandering into it from outside of class, but I think it’s a tool worth examining if not indeed worth using.

07-17-2014
06:33 AM
With the World Cup standing as the globe's most prominent popular cultural event of the moment, I think it is appropriate for me to take a cultural semiotic look at it, especially in the wake of all the commentary that has followed Brazil's rather epic loss to Germany in the semi-finals. As I write this blog, Holland is playing Argentina in the second semi-final, but since neither the outcome of that game nor the final to follow is of any significance from a semiotic point of view, I will not concern myself here with the ultimate outcome of the games but will focus instead on the non-player reactions to the entire phenomenon. Let me first observe that while I am myself not a fan of the game that the rest of the world calls football (I'm not a fan of the game that Americans call football either), I am fully aware that to much of that world the prestige of the World Cup is roughly equaled by the value to us Americans of the World Series, the Super Bowl, the NCAA Final Four, the NBA finals, and the BCS championship combined. I have also been surprised to learn that the Olympic gold medal for football has hardly a fraction of the significance of the World Cup for the rest of the world, as signified by Argentina's attitude towards Lionel Messi (currently the world's greatest scorer, and perhaps the greatest of all time), who brought home Olympic gold in 2008 but is still regarded as a lesser man than Diego Maradona, who, in spite of a controversial career that boasts no Olympic gold medals, did bring home the Cup in 1986. (Perhaps lesser "man" is the wrong term: Argentines simply regard Maradona as "God"). So I get the point that football is a very big deal in the rest of the world, so big that it may not be possible for most Americans to grasp just how big a deal it is. Which takes me to the semiotic question: why is football such a big deal?
What is going on when a reporter from Brazilian newspaper O Tempo can remark, in the wake of the 1-7 defeat at the hands (or feet) of Germany: "It is the worst fail in Brazil's history. No-one thought this possible. Not here. Not in Brazil. People are already angry and embarrassed. In a moment like this, when so desperate, people can do anything because football means so much to people in Brazil"? To answer this question I should perhaps begin by clearing the decks by noting that I don't think that Ann Coulter has the answer. I mean, American football, basketball, and baseball (our most passionately followed sports) are team sports too (Coulter appears to think that soccer-football is morally inferior because it is too team oriented and insufficiently individualistic, which is odd when one considers that names like Maradona, Pele, Bobby Charlton—and let's throw in Georgie Best for good measure—are names in Argentina, Brazil, and Great Britain that are at least as magical as Babe Ruth, Joe Montana, and LeBron James are in America, and probably a lot more so). So how can it be explained? As always there is no single explanation: this question is highly overdetermined. But let's start with the sheer variety of sporting choices in America. The list of easily available spectator and participant sports here is so long there really isn't much point in trying to list them. America has them all, and so the appeal of any given sport must always be taken in the context of a lot of other sports competing for attention (which is why Los Angeles, the second largest metropolitan market in America, can get along perfectly well year after year without an NFL franchise).
On the other hand, in much of the rest of the world while football isn't precisely the only game in town, it is often practically so (let me except those African nations wherein long distance running is practically the only game in town: which is why Africans—in men's competitions, not women's—win most of the important marathons). A game that doesn't require much in the way of expensive equipment, football can be played by all classes, and of course offers a fantasy pathway to fame, glory, and riches for impoverished football dreamers. In other words, for the rest of the world, football is the big basket into which nations put most of their sports eggs. But who cares anyway? Whether someone is carrying a ball over a line, kicking a ball into a net, throwing a ball into a basket, or hitting a ball onto the grass or into the bleachers (and so on and so forth), what difference does it make? Why is Brazil in despair? Why do people die at soccer-football games? What gives with British soccer hooligans? Here things get complicated. Perhaps the most important point to raise is that sporting events have served as sublimated alternatives to war since ancient times. The original Olympics, for example, featured events that were explicitly battle oriented—today's javelin event at the modern Olympics recalls the days of spear throwing and a foot race run while carrying a shield—and the role of international sport in modern times continues to be that of a symbolic substitute for more lethal conflict (consider the passionate competitions between the USA and the USSR during the Cold War, with the 1972 Olympic basketball final and the 1980 hockey "miracle on ice" looming especially large in memory). While I could go on much further here, suffice it to say that the significance of the World Cup is intimately tied up with nationalism and international conflict. 
So when the Brazilian "side" fails to kick as many balls into a net as the German side, the emotional feel is akin to having lost a war. This is not rational, but human beings are not invariably rational animals. Signs and symbols can be quite as important as substantial things. Americans right now are trying to get into the game when it comes to the passions of global football, but in spite of decades of youth football competition and legions of soccer moms, it really hasn't happened yet. All in all, American sport is still rather isolationist (I do not say this as a criticism): though we call the World Series, well, the World Series, only American teams play in that game, and the Super Bowl is only super on our shores. But while there may be something parochial about our sporting attitude, at least it isn't a matter for a national crisis if "our" team loses. That's not a bad thing. Personally (and not semiotically), I believe that people should only get passionate about their own exercise programs (I feel awful if I miss a day of running), but, consistent with the mores of a consumer society, sport in America is increasingly a spectator affair, something to watch others do for us as a form of entertainment. It isn't good for the national waistline, but at least we aren't in a state of existential angst because a handful of guys with tricky feet just lost in the semi-finals. By the way: Argentina just went into the final. Maybe Messi will be God. (Alas.)

07-03-2014
12:30 AM
I confess to a certain fascination with the Beat generation. Not because I belonged to it, mind you (I'm getting old but I'm not that old: the Beats belonged to my parents’ generation), but because of their profound influence on America's cultural revolution, a revolution that continues to roil, and divide, Americans to this day. In other words, if you want to understand what is happening in our society now, knowing something about the history of the Beats is a good place to start. Please understand that when I say this, my purpose is semiotic, not celebratory. In fact, as far as I am concerned, the Beats, and their Boomer descendants, all too often equated personal freedom with hedonistic pleasure, leading America not away from materialism (as the counterculture originally claimed to do) but to today's brand-obsessed hyper-capitalistic consumerism. What the Frankfurt School called "commodity fetishism" has morphed into what Thomas Frank has called the "commodification of dissent" (you can find his essay on the phenomenon in Chapter 1 of Signs of Life in the USA), wherein even anti-consumerist gestures are sold as fashionable commodities, while money and what it can buy dominate our social agenda and consciousness. But what interests me for the purposes of this blog is the fate of three recent movies that brought the Beats to the big screen. The first is Walter Salles' production of Jack Kerouac's signature Beat novel, On the Road (2012), a story that had been awaiting a cinematic treatment ever since Marlon Brando expressed an interest in it in 1957. Another is John Krokidas's Kill Your Darlings (2013), a treatment of the real-life killing of David Kammerer by Lucien Carr—a seminal figure in the early days of the Beats and close friend of Allen Ginsberg, William Burroughs, and Jack Kerouac. And the third is Big Sur (2013), a dramatization of Kerouac's novel of the same title.
What is most interesting about these movies is their box office: though On the Road enjoyed a great deal of pre-release publicity and starred such high profile talent as Kristen Stewart, Viggo Mortensen, Kirsten Dunst, and Garrett Hedlund, its U.S. gross was $717,753, on an estimated budget of $25,000,000 (according to IMDb). International proceeds were somewhat better (about eight and a half million dollars), but all in all, this was a major flop. Kill Your Darlings did even worse. Starring the likes of Daniel Radcliffe (as Allen Ginsberg?!) and Michael C. Hall, it grossed just $1,029,949, total (IMDb). Big Sur, for its part, grossed . . . wait for it . . . $33,621 (IMDb). Even Kate Bosworth couldn't save this one. Can you spell "epic fail"? As I ponder these high profile commercial failures, I am reminded of another recent literary-historical movie set in a similar era, which, in spite of an even higher level of star appeal, flopped at the box office: Steven Zaillian's 2006 version of Robert Penn Warren's classic novel All the King's Men. Resituating the action from the 1930s to the 1950s, and boasting an all-star cast including such luminaries as Sean Penn, Jude Law, Anthony Hopkins, Kate Winslet, Mark Ruffalo, and the late James Gandolfini, the movie grossed $7,221,458 on an estimated $55,000,000 budget (IMDb). Now, it is always possible to explain commercial failures like these on aesthetic grounds: that is, they simply could be badly executed movies. And it is true that All the King's Men got bad reviews, while On the Road's reception was somewhat mixed (Wikipedia). Kill Your Darlings, on the other hand, actually did pretty well with the reviewers and won a few awards (again according to Wikipedia).
But the key statistic for me is the fact that Jackass Number Two was released in the same weekend as All the King's Men and grossed $28.1 million (Wikipedia), four times as much as King's, twenty-eight times as much as Darlings, and about forty times (US box office) as much as Road. I don't even want to calculate its relation to Big Sur. So I don't think that aesthetics explains these failures entirely. Especially when one considers how just about any movie featuring superheroes, princesses, pirates, pandorans, maleficents, and minions (not to mention zombies and vampires), draws in the real crowds. Such movies have an appeal that goes well beyond the parents-with-children market and include a large number of the sort of viewers that one would expect to be interested in films starring Kristen Stewart, Daniel Radcliffe, and Jude Law. But unlike the literary-historical dramas that failed, these successful films share not only a lot of special effects and spectacle but fantasy as well; and this, I think, is the key to the picture. Indeed, you have to go back to the 1970s to find an era when fantasy was not the dominant film genre at the American box office, and since the turn of the millennium fantasy has ruled virtually supreme. While it is not impossible to attain commercial success with a serious drama (literary-historical or otherwise), it is very difficult. The success of movies like Glory, The Butler, and The Help demonstrates that movies that tackle racial-historical themes resonate with American audiences, so I do not think that the failure of these Beat films can be attributed simply to America's notorious lack of interest in history. And, after all, The Great Gatsby (2013 version) did well enough.
Perhaps it is nothing more than a lack of interest in movies that are made by directors who are so personally enamored with their material that they forget that they have to work hard to make it just as attractive to audiences (I get this impression from some Amazon reviews of the DVD of Kill Your Darlings). Artistic types tend to identify with the Beats (the original hipsters), but apparently today's hipsters aren't interested in hipster history. Given the failure of On the Road, Kill Your Darlings, and Big Sur (not to mention All the King's Men), I would be surprised to see any future efforts in this direction. If nothing else, today's youth generation appears to be uninterested in the youthful experiences of their grandparents—spiritual and actual. In all fairness, I suppose that one cannot blame them.

05-14-2014
07:30 AM
Today’s guest blogger is Bettina Caluori. Bettina is Professor of English at Mercer County Community College in New Jersey. She earned her M.A. and Ph.D. at the University of California, Santa Barbara. For the past four years she has served as the coordinator for English Composition I (ENG101). Before that she chaired a college-wide committee on assessment for several years. In addition to writing courses she teaches American literature and women’s literature.

Barclay: I really enjoyed the opening of our conversation. It’s had me thinking about “balance” ever since: theory and praxis, consistency and individuality… good things to think about. Anyway, I would love to hear more about the writing program at Mercer. What sort of pedagogies do you use in your writing program and why?

Bettina: Our assumption is that students must read and write with more purpose than merely mastering the information in texts. As a result we approach critical thinking about college-level texts and the writing process (drafting, revision) as inextricably linked endeavors. Toward that end, students must encounter conceptually challenging selections that require them to comprehend and analyze authors’ arguments, and the writing process should help them refine their understanding and synthesize a critical response. The department values students’ independent thinking about the specific facets of authors’ thoughts and therefore requires supporting evidence that quotes and discusses key ideas. Given this emphasis, we do not assign essay topics in the rhetorical modes. We also discourage the five-paragraph essay, which usually supplies more of an organizational pattern than an impetus for writers to link, build, and develop ideas. We prioritize formative feedback through class activities and comments on papers that will help students develop and support their reasoned arguments. Feedback on grammar and mechanics remains important, but it should by no means overshadow the intellectual objectives of our writing courses.

Barclay: That sounds almost exactly like the goals of the writing program here at Florida Atlantic University. I love the connections we can make across different kinds of institutions. So often I feel like Rhet/Comp somehow shouldn’t exist as a field because our answers to common questions are so determined by local context. It’s reassuring to me, then, to hear about our commonalities. So how do you enact that philosophy in the classroom?

Bettina: Typically we use group work to focus students on the texts and create structure for their active engagement with them. If students are going to need to quote and respond to significant claims in an essay, it makes sense that class activities should shift the responsibility to identify important claims to students. Many professors have students exchange drafts and give peer feedback. Some give sample student essays and ask students to grade them using the departmental rubric. I think we all struggle to help students synthesize texts in their essays. As I answer this question, I think an inventory of approaches in this particular area would be helpful.

Barclay: For us both! It’s always a challenge to capture and retain lore, and yet if we don’t find ways to do so then each new crop of teachers has to reinvent the wheel. Shifting gears a little: as someone in writing program administration, what are the unique challenges of creating, promoting, guiding, and generating programmatic change? How do you make it happen? What resistance is there?

Bettina: There are many challenges! In my mind they all center on time constraints and communication issues. For example, if I circulate a rubric that emphasizes the importance of quoting and tries to describe levels of success using textual evidence, I might believe I have just communicated with everyone about course outcomes. But I have learned that rubrics don’t always communicate directly and smoothly. Like everything else, they are interpreted in terms of people’s past practice and assumptions. Quoting to add color to an essay and quoting in order to analyze concepts are two different things, and only the professor who emphasizes the latter promotes the kind of textual evidence we want to prioritize. The rubric is a start at communication. What counts as critical thinking? This is another area where a phrase means different things to people, and it takes some time and effort to build a common set of assumptions. So the requirements of effective communication about complex subjects point to the second daunting challenge, which is finding the time and means for effective conversation. Not surprisingly this is hardest to do with our part-time faculty, who work at multiple campuses and are not compensated to contribute to departmental initiatives. The full-time faculty have the time and responsibility to set curricular directions and we can reach agreement, although we also make everything happen only gradually. It took a long time to write department rubrics and it takes time to get back to refining and updating them.

Barclay: I know what you mean by gradual change. I like to say that writing programs have the turning radius of a cruise ship. We also face problems of getting our teachers to engage in and commit to change. We mostly have Graduate Teaching Assistants, but like your adjuncts they aren’t compensated. So how do you deal with these communication issues?

Bettina: We look for ways to communicate better with adjunct faculty when everyone has full schedules to manage. For me, it is discouraging to arrange meetings for adjunct faculty and have low turnout, and no doubt adjunct faculty are frustrated when the meetings that I can manage in my schedule don’t work with theirs. Now we have an online orientation to teaching ENG101 and a website for sharing course materials, but none of this is as good as being able to talk over lunch about our courses. We are having a departmental retreat at the end of this semester and inviting adjunct faculty, and we will be glad to have anyone who is able to attend, but of course it won’t be possible for everyone. Our department envisions a great deal of change at our retreats because this is when, before or after a semester, we set aside six hours for discussion and planning.

Barclay: For us, it’s fall orientation. Everyone is together in one place at one time, so we try to make things happen then. And what about resistance to change? Given the large GTA population I tend to have an easier time with change. What’s it like working with a largely contingent labor pool?

Bettina: There is always some resistance to change because almost by definition it puts new priorities and risks in place. As ENG101 coordinator, I meet new adjunct faculty at the moment when they may be experiencing how our writing program differs from other places. Sometimes there is resistance, or perhaps skepticism, about our emphasis on critical thinking or our use of textbooks that some consider too challenging for our students. While most people have responded positively, there has been some resistance to our new practice of giving peer feedback to faculty on their paper comments and grading. As I see it, the way forward comes back to communication. How can we improve our communication about how to introduce challenging texts? How can we help people understand the rationale for peer feedback (supporting more effective revision in our students’ essays)? Sometimes it feels like a strange Catch-22. For example, if I could devise the ideal communicative setting in which all full-time and part-time ENG101 faculty could reach a detailed, shared consensus on paper comments, we might not need the peer-feedback system we have, because its form originates in a need to communicate in a context with impediments. And if we proceed with faculty peer feedback without complete consensus because of these impediments, there is resistance to this way of communicating. One thing is certain, and that is that my colleagues will press for improvements, so we will see where we go because we can always try adjustments.

Barclay: Yes. That’s just another way of saying that when it comes to writing programs, change is one of the few constants we can rely on. LOL! Thanks for joining me, Bettina, and good luck as you continue to develop your program!

04-16-2014
06:30 AM
Today’s guest blogger is Anthony Lioi—an ecocritic, Americanist, and compositionist who works at the Juilliard School, where he also directs the Writing Center. He earned the BA at Brown University and the MA and Ph.D. at Rutgers University and held positions at Rutgers and MIT before taking his current position. His work has been published in ISLE: Interdisciplinary Studies in Literature and Environment, Feminist Studies, MELUS, CrossCurrents, transFormations, ImageText, and other journals. He is working on a book on nerd culture and environmental discourse. I grew up watching The Magic Garden, a groovy television show that aired on WPIX-Channel 11 in the New York Metro region. It was the 1970s, so Miss Paula and Miss Carol [sic] had long hair, guitars, and jeans that hugged their hips. They swung on rope swings, singing songs about friendship and imagination. My favorite bit (besides the songs) was the Chuckle Patch, a group of laughing daisies that grew leaves with jokes on one side and punch lines on the other. When Paula and Carol approached the Chuckle Patch, the flowers would laugh in greeting. To get the daisies to laugh again, however, they had to pluck a leaf and tell a joke. Leaf = laugh. You got one freebie with the Chuckle Patch, and then you had to work for it. I tell this story because, having taught writing at MIT and the Juilliard School for the past decade, I spend a lot of time in the Chuckle Patch, a lot of time with delightful, capable students who are not, in the main, interested in literature, popular culture, or environmental politics—not interested, that is, in the things that interest me. The Chuckle Patch is in it for the laughs, just as MIT students are in it for the science and Juilliard students, the performance. The Chuckle Patch will laugh once out of courtesy, and then you had better make with the jokes. Don't, and the daisies demur, watching as if you are a particularly dull sort of ape.
They challenge whatever vestige of writing-student-as-nascent-English-major you still had in your dull ape brain. When I started teaching in graduate school, my writing program used Ways of Reading, which was great for grad students preparing for orals, but which struck me, even then, as a little much. After teaching Foucault's Panopticon for a couple of semesters, I got tired of the de-Foucault-inating necessary afterwards, the process of convincing students that not all power is bad after teaching them laboriously that all power is bad. There is a place for such an approach, especially in a Liberal Arts paradigm in which students will encounter these ideas again. In an institute of technology—and a music conservatory is an institute of technology—you have three or four chances to Reveal the Larger Context before students submerge into specialization. You have to choose the jokes such that no laugh goes unlaughed. The Chuckle Patch has other things to do. This raises the question of the principle of choice. How to choose the jokes when you only get a couple of shots? Predictably, I figured out what not to do first. One does not appeal, I discovered, to a common context in which the subject is Chuckles—students' specialties—and the method is Philosophy of Chuckles. This appears as clueless pandering: you, the person who can't run a nuclear reactor or play a Bach partita, are going to instruct them on the Meaning of Chuckles? Not. Defaulting to one's own specialty is also an error. Elective classes are understood to involve the instructor's specialty; required writing classes are not. So no assignments about American climate change novels or the gender politics of postmodernism. (See Point 1: Ways of Reading.) The trick would appear to be: Engage students on issues already in the penumbra of their central interests, then frame that context with tasks of critical reading, writing, and research. Clever, right?
In my current context, I teach classes that are evenly split between native speakers of English from the United States and the Commonwealth, and international students for whom English is a second, fourth, and even fifth language. Finding the shadow of their central interests is complicated by the sheer diversity of students. Taking into account only Sinophones, there are students from Macau, Taiwan, Hong Kong, Australia, and mainland China, not to mention students from Georgia, the country, and Georgia, the state, sitting right next to each other. So there is no hope of creating coherence through national culture, American or otherwise. This year, using the second edition of Emerging, I managed to annoy my students into engagement with issues of public interest. (I am a little brother, by birth order and inclination, so doing good in the style of evil comes naturally.) They had already taken a placement exam using a New Yorker article about empathy, Paul Bloom's “The Baby in the Well” (May 2013). I paired this essay with Kwame Anthony Appiah's “Making Conversation” by asking students if empathy could be used as an instrument of cosmopolitanism. This question ran into trouble on both ends: students oversimplified Bloom's understanding of empathy as the imagined experience of other people's pain by sticking to a flat version of “walking a mile in someone else's shoes” while denying the importance of nationality as a category of identity. In the end, there was less feeling and less nationalism than one might have wished, so the notion of empathy as a bridge between nations fell flat. A thousand flowers had already bloomed in the Republic of Music, so what's the big deal? I pivoted to Arwa Aburawa's essay “Veiled Threat,” about the graffiti of Princess Hijab, a Parisian artist-activist. This got a chuckle out of the Patch. As I had hoped, Hijab's activities offended them because the graffiti involved the defacement of private property. 
The idea that a destroyer of property might consider herself an artist/activist compelled a controversy. Some students defended her as an activist but not as an artist, some defended her contrariwise, and others attacked on all fronts. The rough drafts required everyone to encounter an argument significantly different from their own. This assignment passed the penumbra test, because everyone was interested in art and property, but no musician was forced to write about music. I doubled down on this strategy in the next essay by choosing David Foster Wallace's “Consider the Lobster.” I knew from other classes that Wallace's argument drove students to distraction by refusing to preach about the ethics of lobster-eating. I crafted a question about the possibility of empathy across the species boundary: Was it possible to empathize with crustaceans? Is empathizing with crustaceans isomorphic with cosmopolitan fellow-feeling? Do lobsters have a nation? Fortunately, students in the class ranged from Francophone Cartesians to Japanese animists, both of whom conflicted with American sentimentalists who felt that one should empathize with pets, but not with prey. At this point, students were ready to step away from “art” and further into “ethics,” uniting students as predators while opposing them across cosmology. At the beginning of the next semester, I decided that the Patch was ready to chuckle at something closer to home, something that might offend them across nationality, world-view, and professional ethos. I screened the American documentary Ai Weiwei: Never Sorry, about the Chinese dissident Ai Weiwei and his use of conceptual art against the Chinese government. I asked students to consider the role of the artist as a defender of human dignity using Francis Fukuyama's “On Human Dignity” as the frame. Though students realized that I was taunting them, they could not help but react. 
The Sinophone population united against the offense Ai had given to China's reputation on the world stage, even as some argued that his activism was justified. This led to a better understanding of Fukuyama's “human dignity” as a liberal concept: students elaborated a notion of communitarian or collective dignity fundamentally at odds with the framing text. The non-Sinophones were dragged into the chuckle by the sheer force of their peers' reactions: one student speculated that the film's directors were secretly working for the American government. Even students who didn't care about international politics cared about the work that Ai called “art.” With their thoroughly Romantic aesthetics, focused on virtuosic performance and profound emotion, students from all backgrounds failed to grasp that conceptual art aimed to make a political point at the intersection of art and ideology. So offended was the group as a whole that the research component of the assignment succeeded beyond my expectations, as students struggled to find as many sources as possible to understand Ai as activist even as they denied Ai as artist. This is the most successful writing class I have taught at Juilliard in my seven years there. Nonetheless, it is not clear whether my shadow-strategy is transferable to the community college, four-year college, or university context. The configuration of my Chuckle Patch is quite distinct, even from patches at other arts institutions. This experiment needs to be run in these other contexts to test its general validity. I am going to run the experiment next year to see if my results can be replicated in the initial context. The results from this year are still in process, as the Patch struggles to move beyond the motive force of Teacher to an autonomous critical laughter. Though I still exhort students with songs of friendship and imagination, the punch lines on the leaves have changed. 
The daisies are aware of some differences now, but the show has a long run ahead of it.

Author
04-10-2014
06:30 AM
The theme of this blog, as well as Signs of Life in the U.S.A., is, of course, the practice of the semiotic analysis of popular culture in the composition classroom and in any course devoted to popular cultural study. But it is worth noting that my choice of the word “semiotics,” rather than “semiology,” is grounded in a meaningful distinction. For while the words “semiotics” and “semiology” are often interchangeable (they both concern the analysis of signs), there is a technical distinction between them that I’d like to explain here. To begin with, “semiology” is the study first proposed by Ferdinand de Saussure, which came to be developed further into what we know today as structuralism. “Semiotics,” on the other hand, is the term Charles Sanders Peirce coined (based on the existing Greek word “semiotikos”) to label his studies. But the difference is not simply one of terminology, because the two words refer to significantly different theories of the sign. Semiology, for its part—especially as it evolved into structuralism—is ultimately formalistic, taking signs (linguistic or otherwise) as being the products of formal relationships between the elements of a semiological system. The key relationship is that of difference, or as Saussure put it, “in a language system there are only differences without positive terms.” The effect of this principle is to exclude anything outside the system in the formation of signs: signs don’t refer to extra-semiological realities but instead are constituted intra-semiologically through their relations to other signs within a given system. Often called “the circle of signs” (or even, after Heidegger, “the prison house of language”), sign systems, as so conceived, constitute reality rather than discover or signify it. It is on this basis that poststructuralism—from Derridean deconstruction to Baudrillardian social semiology to Foucaultian social constructivism—approaches reality: that is, as something always already mediated by signs. 
Reality, accordingly, effectively evaporates, leaving only the circle of signs. Semiotics, in Peirce’s view, is quite different, because it attempts to bring in an extra-semiotic reality that “grounds” sign systems (indeed, one of Peirce’s terms for this extra-semiotic reality is “ground”). Peirce was no naïve realist, and he never proposed that we can (to borrow a phrase from J. Hillis Miller) “step barefoot into reality,” but he did believe that our sign systems not only incorporate our ever-growing knowledge of reality but also can give access to reality (he uses the homely example of an apple pie recipe as a sequence of semiotic instructions that, if followed carefully, can produce a real apple pie that is not simply a sign). For me, then, Peircean “semiotics” brings to the table a reality that Saussurean/structuralist/poststructuralist “semiology” does not, and since, in the end, I view the value of practicing popular cultural semiotics as lying precisely in the way that that practice can reveal to us actual realities, I prefer Peirce’s point of view, and, hence, his word. But that doesn’t mean I throw semiology out the window. As readers of this blog may note, I always identify the principle of difference as essential to a popular cultural semiotic analysis: that principle comes from semiology. For me, it is a “blindness and insight” matter. Saussure had a crucial insight about the role of difference in semiotic analysis, but he leaves us blind with respect to reality. Peirce lets us have reality, but doesn’t note the role of difference as cogently as Saussure. So, in taking what is most useful from both pioneers of the modern study of signs, we allow the two to complement each other, filling in one’s blindness with the other’s insight, and vice versa. Add to this the fact that Peirce has a much clearer role for history to play in his theory of the sign than Saussure (and his legacy) has, and the need for such complementarity becomes even more urgent. 
And finally, when we bring Roland Barthes’ ideological approach to the sign (he called it “mythology”) into the picture, we fill in yet another gap to be found in both Saussure and Peirce. Put it all together—Peircean reality and history, Saussurean difference, and Barthesian mythology—and you get the semiotic method as I practice it. And it works.

Author
03-26-2014
12:30 PM
Learn all my students’ names on the very first day.

Author
02-13-2014
12:30 PM
In my last blog I discussed the difference between a formalist semiotic analysis and a cultural one. In this blog I would like to make that discussion more concrete by looking at one of the most popular ads broadcast during Super Bowl XLVIII. Yup, “Puppy Love.” Let’s begin with the formal semiotics. This is an ad with a narrative, but the narrative is conducted via visual images and a popular song voice-over rather than through a verbal text. The images are formal signs that tell us a story about a working horse ranch that is also a permanent source of puppies up for adoption—as signified by the carved sign permanently placed in front of a ranch house reading “Warm Springs Puppy Adoption.” It is also important to note that while the ad could be denoting a dog rescue operation, the fact that we see a pen full of nothing but Golden Retriever puppies who are all of the same age suggests that it is more likely that the young couple who run the ranch and the puppy adoption facility are Golden Retriever breeders. We’ll get back to this shortly. The visual narrative informs us, quite clearly, that one of the puppies is close friends with one of the Clydesdale horses on the ranch, and that he is unhappy when he (or she, of course) is adopted and taken away from the ranch. We see a series of images of the puppy escaping from his (or her) new home by digging under fences and such and returning to the ranch. After one such escape, the Clydesdales themselves band together to prevent the puppy's return to his adoptive home, and the final images show the puppy triumphantly restored to his rightful place with his friend on the ranch. It’s a heartwarming ad with a happy ending that is intended to pull at our heartstrings. And that leads us to our first, and most obvious, cultural semiotic interpretation of the ad. The ad assumes (and this is a good thing) a tenderheartedness in its audience/market towards animals—especially puppies and horses. 
It assumes that the audience will be saddened by the puppy’s unhappiness in being separated from his Clydesdale buddy, and will be elated when the puppy, together with Clydesdale assistance, is permanently reunited with his friend. Of course, audience association of this elation with a group of Clydesdales (Budweiser’s most enduring animal mascot) will lead (the ad’s sponsors hope) to the consumption of Budweiser products. So, what’s not to like? The first level of cultural semiotic interpretation here reveals that America is a land where it can be assumed that there are enough animal lovers that a sentimental mass market commercial designed for America’s largest mass media audience of the year will be successful. Heck, (to reverse the old W.C. Fields quip) any country that likes puppies and horses can’t be all bad. But there is more to it than that. As I watch this ad I cannot help but associate it with a movie that was made in 2009 called Hachi: A Dog’s Tale. The movie was directed by an internationally famous director (Lasse Hallstrom) and starred no less than Richard Gere and Joan Allen (with a sort of cameo played by Jason Alexander). And it was never released to U.S. theaters. Yes, that’s right. While Hachi: A Dog’s Tale was released internationally, received decent reviews, and even made a respectable amount of money, this Richard Gere movie has only been accessible to American audiences through DVD sales. With talent like that at the top of the bill, what happened? Why wasn't it released to the theaters? Well, you see, the movie is based on a true story that unfolded in Japan before the Second World War. It is the story of an Akita whose master died one day while lecturing at his university post and so never returned to the train station where the Akita had always greeted him upon returning home. The dog continued to return to the train station most (if not every) evening for about ten years, sometimes escaping from his new owners in order to do so. 
He was finally found dead in the streets. Hachiko, the original name of the dog, is a culture hero in Japan, and there is a statue of him at the train station where he kept vigil for ten years. A movie about him was made in Japan in 1987, and while the U.S. version is Americanized, it is pretty faithful to the original story and to the Japanese film. Which probably explains why it was never released for U.S. theatrical distribution. I mean, the thing is absolutely heartbreaking. Have a look at the comments sections following the YouTube clips of the movie, or the Amazon reviews of the DVD: almost everyone says the same thing: how they weep uncontrollably whenever they watch the thing. It is significant that the DVD cover for the movie makes it look like a warm and fuzzy “man’s best friend” flick that children and Richard Gere fans alike can love. Yes, it's a rather good movie (the music is extraordinary), but warm and fuzzy it ain't. And this takes us to the next level of interpretation of “Puppy Love.” Like Hachi, the puppy in the ad doesn’t want to be separated from someone he loves. But unlike Hachi, the puppy is happily reunited with his friend in the end. His tale is a happy one—and an unrealistic one. It is a wrenching experience for all puppies to be adopted away from their families (which are their initial packs), but they don’t tend to be allowed to go back. And animals are permanently separated from the people whom they love (and who loved them) all the time due to various circumstances which can never be explained to them. This is what makes Hachi: A Dog’s Tale so heartrending: it reveals a reality that is not comfortable to think about; evidently this was too much reality for theatrical release. 
So “Puppy Love” distracts us from some uncomfortable realities, including the fact that puppies are bred all the time as commodities who will be separated from their original homes (that’s why the fact that the “Puppy Adoption” sign in the ad seems to indicate a breeding operation is significant) and have their hearts broken. The ad makes us feel otherwise: that everything is OK. This is what entertainment does, and that is what is meant by saying that entertainment is “distracting.” But feeling good about puppy mills isn’t good for puppies. And feeling good about the many hard realities of life can lessen audience desire to do something about those realities. And that takes us to a broader cultural-semiotic interpretation: as Max Horkheimer and Theodor Adorno suggested over half a century ago, the American entertainment industry has been working for many years to distract its audience from the unpleasant realities of their lives, thus conditioning them to accept those realities. Horkheimer and Adorno have gone out of fashion in recent years, but I still think that they have a lot to tell us about just why Americans continue to accept a status quo that is anything but heartwarming.

Author
12-18-2013
12:25 PM
I don’t often get the chance to chat with teachers outside my program using Emerging. That’s about to change. Next semester we’re hoping to have several guest bloggers talking about using the text and the challenges they face both with the text and in the classroom. This isn’t a series of pat promos. I’m hoping for critical dialogue and engaged interaction. That’s the goal for my classroom as well, because it’s that kind of conversation that leads to informed revision and ultimately better writing. That process (as I try to remind students) doesn’t just belong to the isolated world of the writing classroom. I think it’s inherent to all writing and just as important for larger issues—civic engagement, political action, personal decision making. So look for new voices, and if you’d like to join in, let me know. Erm, I’m not sure how (since most comments on this blog end up as spam) but I’ll work with the Bedford folks to find avenues for you to contribute. I definitely look forward to hearing what you have to say!

Author
12-05-2013
10:08 AM
I want to return to my recent critical moment during grading. In short, I was frustrated—not because of the amount of work involved (that’s just par for the course at this point) but because students had problems with things we had gone over in class again and again. I felt both angry and like a failure. Then I realized I was just stuck in Clockwork Christ mode. “Clockwork Christ” is a term I coined over my years working with new Graduate Teaching Assistants (it’s also the name/subject for an article I’d like to write some day, if my administrative workload ever lightens (as if) so, “dibs!”). The concept comes in part from my teaching experience but I am also indebted to the work of Richard E. Miller, especially in “The Arts of Complicity: Pragmatism and the Culture of Schooling” (College English 1998). I use the term to index two dominant and contradictory narratives of teaching that circulate in culture, narratives that we as teachers often tend to inhabit, enact, and embody whether cast in the role by our students or ourselves. It’s easy to identify these narratives. The first is teacher-as-Christ, the one who sacrifices everything so that students can experience the transformative powers of education. Depending on your age, you know this figure from To Sir with Love, Dead Poets Society, Dangerous Minds, Sister Act II, Freedom Writers, Stand and Deliver, or School of Rock. Depending on your theoretical inclinations, you know it too through the work of Paulo Freire or Peter Elbow. The narrative is simple: teacher encounters victimized and distrustful students; teacher passionately devotes self to “saving” these students (often through unorthodox pedagogies); students are transformed. But running alongside this narrative is a second, inverse narrative of teacher-as-cog, the mindless functionary of a bureaucracy bent on grinding students into dust. 
Depending on your age, you know this figure from Fast Times at Ridgemont High, The Breakfast Club, Ferris Bueller’s Day Off, or Pink Floyd’s “Another Brick in the Wall.” Depending on your theoretical inclinations, you might find it in David Bartholomae or Gerald Graff. Practically, I see these narratives manifest in new teachers all the time. The same teacher will, one week, hold extra office hours on the weekend (though few if any students will show up) and the next week wait with a slavering snarl for some student to miss one more class so they can rigidly implement our attendance policy, fail them from the class, and have one less paper to grade. I don’t think we can escape from these dual narratives but we can become aware of them, which is what I did while grading. More than that, we can deploy them. I can’t believe I’m about to share this in the everlasting medium of the Web, since I have always shared it only orally, with the caveat that I would strenuously disavow the words, but, well, here goes… I offer you the “nuclear option.” The nuclear option foregrounds the disjuncture of these two narratives to “shock” students at the moment most needed. Before revealing it, there are some important points to keep in mind. First, in order for it to work you must learn your students’ names on the first day of class. If you can manage this, they will love you because they are nothing but a nameless face in every other class they are taking as first-year students (Step One: Deploy Christ). Second, you can use this option once and only once. I wait for that point in the semester when students are just not doing the readings, not showing up with drafts, not “there” in any real sense. At that moment, I stand before them and I move to Step Two: Deploy Clockwork. I say something like this: “Look, if no one wants to do this work we can all just go home. 
I’m happy to do all I can to help you pass this class but the truth is it doesn’t really matter to me because I get paid the same whether you pass or fail.” The reaction is almost always the same: they feel guilty (their own Christ reaction) and therefore re-energized. Ummm, in case anyone asks, I did not write this post.

Author
11-27-2013
05:32 AM
I recently finished up work for a paper I’m presenting at SAMLA (“Well-Played, WPA: Promoting Growth in an Era of Budget Cuts”). I open the paper with a number of “koans,” zen-like paradoxes that contain profound truths. One of them is this simple fact: There is no WPA theory (yes, with a slight nod to The Matrix). That there is no WPA theory is more than a paradox; it is in fact a common genuflection within most WPA theory. That is, many discussions of theoretical approaches to writing program administration open with some acknowledgement that the actual practices of administration are inextricably bound to local conditions. I especially like Jeanne Gunner’s configuration of this fact in “Cold Pastoral: The Moral Order of an Idealized Form”: “general rules apply only weakly to varying local conditions (a WPA truism)” (29). What makes it a koan is the fact that, despite that acknowledgement, we continue to theorize. But because any generalized theory is at best partially useful, we share narratives as well—a combination of abstraction and practicality, theory and praxis, why we did something but (more importantly) what we did. I can’t help but think that teaching is much the same. There are lots of theories, lots of pedagogies, but there is, balancing all of that, an equal (if not more massive) accumulation of lore. That is, as with running a writing program, when we teach we often do what works without having to know why. An odd discipline we are. I sometimes wonder why all of Composition/Rhetoric doesn’t just implode.

Author
11-08-2013
02:05 PM
With October 31st being the submission deadline for this, my 78th Bits blog, I thought I'd turn to answer a question a student of mine asked about the significance of the sorts of costumes being marketed to women these days for Halloween wear. Well, that one's pretty easy: in a cultural system that includes such phenomena as a young Miley Cyrus seeking to shake off her Hannah Montana image by (to put this as politely as possible) making an erotic spectacle of herself in order to succeed as a grown-up singer, the immodest (let's stay polite) wear marketed to women at Halloween is just another signifier of what Ariel Levy has quite usefully labeled a "raunch culture." Whether such explicit displays (and expectations thereof) of female sexuality constitute a setback for women's progress (which would be a Second-wave feminist assessment of the matter) or an advance (which might be a Third-wave interpretation) is not something I want to get into here. It's Halloween as a cultural sign that I'm interested in now. To see the significance of the contemporary Halloween, we need (as is always the case with a semiotic analysis) to situate it within a system of signs. We can begin here with the history of Halloween. Now, whether Halloween is a Christianized version of an ancient pagan harvest festival or, as All Hallows' Eve, is simply the liturgical celebration of the saintly and martyred dead that it claims to be at face value is not something we need be concerned with. More significant is that neither of these meanings has been operative in modern times, when Halloween became a children's holiday: a night (with no religious significance whatsoever) to dress up in costume and go trick-or-treating for free candy. 
But in these days of an ever more restricted children's Halloween, with parental escorts or carefully monitored parties taking the place of the free-range trick-or-treating of a generation and more ago, along with an ever expanding adult celebration of Halloween, we can note a critical difference, which, as is usually the case in a semiotic analysis, points to a meaning—actually, several meanings. The first is all too painfully clear: whether or not we actually live in more dangerous times (which is a question that has to be left to criminologists), we certainly feel that America has become a place where it is not safe to let children roam about on their own at night. The trust that Americans once had for each other has certainly evaporated, and the modern Halloween is a signifier of that. (One might note in this regard the ever more gruesome displays that people are putting up in their front yards: yes, Halloween began as a celebration of the dead, but this obsession with graphic and violent death hints at an insensitivity to real-life suffering that does not do much to restore that old trust.) But as Halloween has shrunk in one direction, it has exploded in another, becoming one of the premier party nights of the year for young adults. Joining such other originally liturgical holidays as Mardi Gras, today's Halloween is essentially a carnival—an event that has traditionally featured an overturning of all conventional rules and hierarchies: a grand letting off of steam (sexual and otherwise) before returning to the usual restrictions on the day after. Dressing in costume (whether along the more traditional lines as some sort of ghoul, or as some other more contemporary persona) enables a freedom—a licentiousness even—that ordinary life denies. At a time when, in reality, the walls are closing in for a lot of Americans, carnivalesque holidays like Halloween are, quite understandably, growing in popularity. There is more to it than that, of course. 
A further significance is the way that Halloween, like so many other American holidays (both religious and secular), has become a reason to buy stuff—not only costumes and food and candy, but also decorations, Christmas style, that start going up a month or more before the holiday arrives. Like Valentine's Day, and Mother's Day, and Father's Day, and, of course, Christmas, Halloween is now part of a different sort of liturgical calendar: a signifier of the religion of consumption. And no, I don't celebrate Halloween. October 31st has a very private significance for me: on that day in 1980 all of my Harvard dissertation directors signed off on my PhD thesis. I think of it as the Halloween thesis. I suppose that my doctoral gown is a costume of sorts, but I haven't worn it in years.