Bits Blog - Page 132

Author, 08-21-2013 10:30 AM
As I near the end of my summer “break” I am once again chagrined at my inability to juggle teaching, administration, and research. I have a proposal that’s only about half done, I have a summer class that’s about wrapped up, and I have a schedule that’s about ready to be put to bed. It’s that proposal that kills me. Being a WPA means never really being “off.” Even when I am not in the office in the summer I am checking email and putting out fires and, more than once, running to campus for whatever meeting of importance is being called this week. I know, theoretically, this juggling act is possible—WPAs do it all the time—but it’s a struggle for me and I almost always drop the ball and that ball is always research. I have a precious two weeks before orientation starts. There’s much WPA work to be done but I am hoping to make a push to finish the proposal at least. Please, tell me I’m not the only one who struggles with the juggle…

Author, 08-14-2013 07:04 AM
I wanted to take a moment to say goodbye to my assistant for the last four years, Mike Shier, who’s off to a creative writing PhD program this fall. I’ll take this chance to welcome my new assistant Scott Rachesky too. Though I’m sure you’re out there, I can’t imagine a WPA running a writing program by her or himself. The amount of paperwork alone has been rising steadily each year. And scheduling nearly one hundred teachers remains a challenging, if at times frustrating, exercise in logic games (if I move Teacher X to class A, then I can move Teacher Y to Class B, but only if Teacher Z is willing to take on class C). My assistant is more than a workhorse. My assistant keeps me sane. Moving assistants, then, is always challenging, and not just because of missing old dynamics while establishing new ones. Rather, it’s the fact that so much lore resides with my assistant—not just inside jokes and memories of harrowing times but an ingrained knowledge of how things work and how to make them work when they (as usual) don’t work. I know there’s a body of literature out there on teaching lore, but I wonder if we’ve thought much about administrative lore. It exists, it’s invaluable, and when you change assistants every 3-4 years it has to be constantly passed on, recreated, and renewed. So let it begin.

Author, 08-09-2013 07:30 AM
Ain't easy. Especially when I'm in the process of revising the chapter introductions for the eighth edition of Signs of Life in the U.S.A., and the blockbuster du jour is a movie about a mutant superhero with "adamantium" claws. I confess to being surprised to discover that this chap is almost forty years old now, with Wolverine having first appeared as a Marvel Comics character in 1974. But on the other hand, there's a certain logic to that, because the 1970s marked a turning point in the history of American popular culture, when fantasy storytelling was making its transition from its heretofore marginal status as B-movie-level "kids' stuff" to center-stage dominance. In this era when superheroes and other forms of fantasy stories loom so large in American entertainment, it is easy to take this status for granted. But there is a history to it—and a rather paradoxical one at that. The paradox lies in the fact that the 1970s was a decade in which the chaotic violence of the 1960s finally receded. The Vietnam War, the assassinations, the burning of one American city after another, the campus violence that culminated at Kent State in 1970, all came to an end. Though there would be troubles enough in the seventies (and there are always troubles), America calmed down and began to dream about . . . war. For this was the decade of Star Wars: the movie that practically changed everything. And what were Americans dreaming about in the turbulent sixties? Many things, of course, but among them were peace, a new consciousness, flower power. No, really. Though published in 1970, Richard Bach's Jonathan Livingston Seagull was written in the 1960s, and while it is hard to believe now, it was once taken very seriously and was a major pop cultural phenomenon. The same goes for the syrupy sweet poetry of Rod McKuen (Listen to the Warm gives you an idea), which reached major best-seller status.
And let's not forget Richard Brautigan, whose poem "April 7, 1969," I once argued in a paper for my freshman composition course, represented the future of poetry (it goes like this: "I feel so bad today/that I want to write a poem./I don't care: any poem, this/poem." My professor responded: "He felt so bad that day that he wanted to write a poem. Not just any poem. No poem at all"). I forgive myself today for such early enthusiasms because serious critics were announcing at the time that loopy but gentle works like Brautigan's Trout Fishing in America would so change literature that novels in the future would be called "Brautigans." One might think, at a time of turbulence again, with an ongoing war on terrorism that has turned our society upside down, ground wars in Iraq and Afghanistan, and ideological and racial divisions that are coming to rival those of the 1960s, that Americans might return to those flowery dreams of the past, but it isn't working out that way. Bach and Brautigan and McKuen and the whole flower power ethos of the sixties have never looked more impossible, more outlandishly flaky. The mood today is "apocalypse now," as giant robots storm the Pacific Rim, zombies stalk the Georgia woods, and superheroes fight their endless battles against ever more violent opponents. "Imagine there's . . . nothing to kill or die for," John Lennon wrote, before he was murdered. Now, it appears that this time around we cannot even begin to imagine anything that doesn't involve killing and dying. But adamantium claws? Won’t we feel pretty silly about that someday?

Author, 08-07-2013 12:17 PM
This coming semester I have the opportunity to teach a class for our Women, Gender, and Sexuality Studies program. The undergraduate class, Introduction to Gender and Sexuality, fits well with my interests in Queer Theory so I jumped at the chance. It wasn’t until after I jumped that I realized I have no idea what I am doing. Give me any FYC class anywhere and I can walk in with a day or two of prep. But this is a 50-seat undergraduate lecture class. I was suddenly dumbfounded: what do I teach? What’s the text? How do I teach in such a large-class format? What should I cover? Am I giving tests? How do I even grade them? I was working on our fall orientation materials for new Graduate Teaching Assistants when I made the connection: the new course had me feeling like a new teacher once again. And I think that’s useful. After almost twenty years of teaching it’s easy to forget what it’s like to stand in front of the classroom for the first time. My goal is to help our new teachers do that; remembering how scary it is adds, this year, a new emotional dimension for me. I’ll figure out my new class (thankfully I have a good two weeks to work it out) but more importantly I won’t forget the slight moment of paralyzing terror, because remembering it will help me help others get through it.

Author, 08-01-2013 08:59 AM
As a gay man living in a state that constitutionally bans gay marriage and partnered to a man living in a state that does allow it, I am of course rather pleased with the Supreme Court’s ruling on the Defense of Marriage Act (DOMA). Not everyone is, nor should they be. I’ve been thinking about how to teach this issue in a way that allows everyone—no matter their political views—to come to the table. Kwame Anthony Appiah’s essays are a good start. In “Making Conversation” and “The Primacy of Practice,” Appiah discusses gay marriage explicitly if briefly. But his larger point is that we now live in a world where we cannot isolate ourselves from difference, a world in which we must find a way to get along. That call for dialogue is, I think, a good starting point for discussing DOMA. Kenji Yoshino is a good follow-up. He, too, discusses gay marriage in “Preface” and “The New Civil Rights,” but he also suggests that the next wave of civil rights needs to come not from the courts but from conversations—a concept quite similar to Appiah’s. I think both of these authors would be useful in helping students learn civil and civic discourse. DOMA is a great way to start practicing those skills.

Author, 07-25-2013 10:30 AM
The relative flop of Johnny Depp's recent foray into the Lone Ranger franchise (I say "relative" because many a domestic box office disappointment ends up in the black after all due to international ticket sales and DVD, Netflix, and whatnot re-sales) left the movie criticism community abuzz with post-mortem analyses that were fairly dripping with the kind of schadenfreude that greets most expensive blockbuster busts. The reasons for the failure of the film are many—including the perceptions that it couldn't make up its mind whether it was Blazing Saddles or Little Big Man, and that Tonto came off suspiciously like Jack Sparrow—but whether it was really that silly stuffed crow that was to blame, or simply the fact that contemporary kids don't know the Lone Ranger from Hopalong Cassidy, is not my concern here. What I want to look at is what happens when some entertainment concepts that have more than outlived their times are mined in Hollywood's endless quest for safe formulae in a high-stakes era when the bottom line is far more important than creativity. I can easily imagine what Disney's thinking was on this: with superhero flicks pulling in billions (forget Iron Man for a moment; they've even resurrected Superman successfully for the umpteenth time), the studios (I use this word loosely) are ever on the lookout for old entertainment "claims" that haven't yet been fully mined out, and the Lone Ranger was available (though the spectacular failure of John Carter should have been a warning here). But the problem is that some old mines contain toxic ores, and the Lone Ranger is one of them. The problem, of course, is Tonto.
Though by the standards of the time in which the Lone Ranger story was created Tonto was quite a progressive advance over the usual savages-circling-the-wagon-train representations (Cochise in the short-lived television series Broken Arrow offers a similar, though far less well-known, example), by the 1960s Tonto's wooden caricature of a "noble savage" in subservience to the Ranger's dominance just didn't cut it anymore. That Disney and Depp were very well aware of this problem is quite evident in their movie. Here, Tonto is dominant, while the Ranger, though physically imposing at six feet five inches in height, is stiff and unimpressive. But this obvious attempt to lay to rest the ghosts of Westerns past by retelling the story from Tonto's perspective apparently failed to persuade Native American audiences—according to a review in the Los Angeles Times—who were not terribly keen to see this old war horse resurrected and were particularly unhappy to see, once again, a white actor playing an Indian character. I think the lesson to be learned from this is that there simply are some entertainment concepts that can't be redeemed, no matter how good one's intentions may be. You don't try to bring back Stepin Fetchit. You don't try to remake Birth of a Nation from a slave perspective. Frankly, though it did pretty well both commercially and critically, I think it was a mistake for Peter Jackson to lug King Kong back onto the silver screen. With Accounting in absolute control over Creative in today's movie industry, however, I expect that we will have many more attempts to dig up toxic concepts and decontaminate them for redistribution. But, please: don't anyone try to pretend that The Merchant of Venice would make a great vehicle for Baz Luhrmann.

Author, 07-25-2013 08:30 AM
As I write this, Edward Snowden—NSA secret leaker—is still very much in the news, though in a “Where in the world is Edward Snowden?” kind of way. At the same time, the current issue of Time is devoted to “The Informers” and anti-government hacktivism. Peter Singer’s essay “Visible Man: Ethics in a World without Secrets” would be a great essay to be teaching right now. Singer examines both the rise of a surveillance society, one in which notions of privacy are changing and one in which we actively participate in that change by signing away our information en masse online, and “sousveillance,” a kind of surveillance from below in which citizens keep an eye on the government through sites like Wikileaks. For Singer, sousveillance keeps governments transparent. It would be interesting to test his ideas against Snowden, particularly as his case develops.

Author, 07-19-2013 07:12 AM
One of my favorite classroom activities is the argument haiku, where students summarize a reading or the argument for their papers in that super-condensed Japanese poetic genre. My assistant Mike, ever younger and thus ever hipper, has introduced me to a tool that I think could bring this exercise to a whole new level—Vine. Vine is like Twitter as video, or a moving Instagram: users make seven-second looping videos. The fact that you can start and stop recording allows for a crude editing and composition component. I think it would be an interesting assignment to use Vine in the classroom. Though it does require a “smartphone,” that doesn’t seem to be a huge issue for my students these days. What I find intriguing is the time constraint, which forces both conciseness and inventiveness. Imagine a seven-second video that acts out the argument of a reading or paper. It can be fun and engaging while also prompting students to focus on the core ideas of whatever they’re doing. Vine. I’m putting it on my to-do list.

Author, 07-11-2013 10:30 AM
I admit it: I am a Yahoo! News junkie. While the rest of the world is scrambling to Tumblr or Twitter, or Instagram or Reddit or (dare I say) Facebook or all the other social networking sites whose meteoric rises (and sometimes falls) are common knowledge to a linked up world, I start my day with Yahoo! News. After which I go to the Los Angeles Times, the Washington Post, and MarketWatch.com, all of which together provide me not only with my morning news "fix" but also with that constant stream of raw information that one must have in order to perform semiotic analyses. Like all web sites, Yahoo! News is not static, and its designers have changed its appearance and structure a number of times over the years. But the latest change, which seems to have solidified after a week or so of experimentation, is the most semiotically interesting to me because of the way that it has abandoned the classification scheme that characterized earlier incarnations of Yahoo! News. That is, rather than dividing up the headlines into different classifications such as "U.S. News," "World News," "Sports," "Technology," "Business," and so on and so forth, the new page is simply a scrambled and undifferentiated mixture of news stories (while there are still some classificatory links at the top and side of the page, they are rather inconspicuous compared to the center of the screen headlines). This itself is a difference, and, as such, it is a sign. I'll have to coin a term to characterize just what kind of sign I think this is. I'll call it an "emblematic sign," insofar as I think that the Yahoo! News page change is emblematic of something much bigger going on today. What it signifies is precisely the nature of what, many years ago now in the light-speed time frames of Internet history, was once called the "information superhighway." 
This veritable avalanche of 24/7 information, which is transforming not only the way we live but also the way things work in this world, is a godsend to popular cultural semiotics, but it also presents a problem. That problem is that information alone is not self-interpreting. Its meaning does not lie on its face. To get to the significance of things you must think about them critically, and that means organizing information into structured systems of association and difference. Now, as a semiotician who has been doing this more or less instinctively for many years (long before I began to codify the process in Signs of Life in the USA), I am not in the least discommoded by the often chaotic vistas of the information superhighway, and so I am not particularly bothered by the new Yahoo! News page. But there are two things that do concern me about it. The first is my realization that the change is probably motivated by a desire to get Yahoo! News page readers to click on more stories than they would if presented with pre-classified news categories: if you, say, only want to look at U.S. news, in the earlier format you could ignore the rest of the page and go directly to the U.S. news section, but now you have to skim the entire front page to find what you are looking for, and so see a lot more headlines along the way. This can be conducive to what might be called "impulse clicking," like the impulse-buying schemes that you can find in retail stores. After all, the more you click, the more revenue Yahoo! makes. But beyond a slight irritation at being manipulated in this way (oh well, Yahoo! has to make some money if I'm going to get my free news feed), my deeper concern is for those to whom critical thinking is not instinctive.
That is, the presentation of undifferentiated news can only intensify the sense that information is not semiotic, is not meaningful; it is only information, something to grab before going on to the next tidbit of information, all seen in meaningless isolation. And that kind of information makes for less, not more, understanding of our world.

Author, 07-11-2013 06:24 AM
I’ve recently discovered Pinterest (pinterest.com), a virtual “pinboard” that allows users to post and share items from across the web. I stumbled into it mostly because of my cult-like devotion to Crossfit and Paleo eating, but as I’ve come to play with it more I’ve wondered how it might be useful in the writing classroom. For certain, any kind of class involving web or visual design would allow students to use Pinterest to create inspiration boards. Because users can follow boards, the class as a whole can follow each other, offering feedback through comments on particular “pins” while getting more inspiration themselves. I also think it might be interesting in a research-based class. The different boards could be used to organize different areas of a research topic, and since so much research is web-based these days, students could organize their work through the Pinterest boards (and again, the class could follow, comment, and share). Hrm. Pinteresting. Perhaps something worth trying.

Author, 07-10-2013 07:00 AM
Change is very much in the air here at Florida Atlantic University. Our president resigned, we’re starting a national search for a dean for our college, and we’re moving to a new chair in the department. Change, of course, can be exciting: a time for new growth, new hope, and new directions. It can also be terrifying: a time of uncertainty, instability, shifting lines of institutional power. I tend to remain optimistic in the face of change. It’s not that hard to do when you run a writing program. As my mentor Richard E. Miller once remarked (I may be slightly paraphrasing), writing programs have one fundamental asset—we have all the students. For sure our program is the economic engine of our college, thanks to the role we play in the core curriculum. That is, of course, a double-edged sword, since it risks turning our courses into “service courses” and thus devaluing the program, the department, and the college. Perhaps so. But having weathered many changes already I will say this much: service or not, there is some measure of security when you are vital to the basic functioning of the university.

Author, 06-28-2013 09:30 AM
In the teaching of popular cultural semiotics, one of the key concepts that must be explained and applied is that of denotation. It seems to be an obvious point, but it turns out that for students it is a rather difficult concept to master. In the semiotic sense, the denotation of a sign can be an actual object (as, for example, an automobile), while its connotation is the image associated with it through a marketing campaign or through consumer use. Thus, the denotation of the Volkswagen New Beetle was a compact car introduced in the late 1990s, whose connotation was intended to invoke a retro revival of the free-spirited VWs of the 1960s, but which ended up with a feminine connotative image due to its popularity among women consumers (which is why the latest Beetle has been more “aggressively” styled to attract male consumers). Denotation is a bit trickier when considering a television show or movie as a sign because there is no objective “thing” as with a consumer product. Here one can begin with the fundamental concept of the show. For example, in the television series “The Walking Dead,” the fundamental concept (which we can equate with the denotation of the show) is that of a group of American survivors of a global pandemic that has transformed most of the population into flesh-eating zombies. Living a nomadic existence, these survivors must carry weapons and be prepared at any moment to defend themselves against zombie attacks from former humans who may even include their own friends and family. What all this connotes (that is, what such a basic story line says about the audience that is entertained by it) is a matter for semiotic interpretation—something I will not go into here. Now, given the popularity of “The Walking Dead,” my students often choose it as a topic for their research papers, and I encourage that choice because this program is such a rich source of cultural signifiers.
But where my students most often struggle with their interpretations is in neglecting to clarify what the fundamental concept of the show is, its basic denotation. They commonly pick up on an individual plot element or particular character, but without establishing the denotative context in which plot elements unfold and characters act, their interpretations may miss the most significant parts of the show as a cultural signifier. The same sort of thing happened with my student papers back in the heyday of “Heroes.” Students would write that the characters in “Heroes” were all out “to save the world,” but without establishing the fundamental concept of the series (that is, the presentation of a group of otherwise ordinary people who discover that they each have extraordinary superpowers), their papers could just as well have been describing a World War II story. Indeed, my most common comment on many such papers was “what are they saving the world from?” The denotation is crucial because it enables the semiotician to construct the most applicable system in which to situate the sign. Thus, while there are romantic elements in “The Walking Dead,” a primary association of that series with, say, “Cheers” isn’t going to be very fruitful, even though that show had a central romantic relationship too. On the other hand, associating “The Walking Dead” with “Buffy the Vampire Slayer” is much more fruitful because here the romantic element in each case is contextualized by a basic situation involving a constant violent struggle against dead/alive monsters. Similarly, interpreting any Batman story without mentioning the denotative fact that the hero is a rich vigilante who dresses up as a bat is going to miss a crucial part of the cultural connotation, or significance, of the whole franchise. So, getting to the essence of just what it is that you are interpreting, before actually interpreting it, is a critical part of the semiotic procedure.
It requires the ability to produce a concise descriptive summary of your topic—getting a sense of the forest, so to speak, before moving on to the trees. It sounds easier than it actually is, but especially in this era of mass decontextualized information, teaching your students to describe the denotative essence of popular cultural phenomena is one of the first lessons that you will want to undertake.

Author, 06-26-2013 04:37 PM
Sorry, Ned Stark, but you have it wrong. Fall is coming. We’re already working on our standard assignment sequence for the fall, testing it in summer classes to make sure it’s workable and to gather sample student work for orientation. It’s a tricky process. The last thing we want is for a student taking our first-semester course in the summer to run into the same essays in our second-semester course in the fall. The solution we use is to divvy up the assignments across several sections. Students in the fall might encounter one reading they’ve had in the summer—perhaps two—but they’ll be approaching the text(s) in a whole new way with a whole new set of questions. Our sequence this fall uses Malcolm Gladwell’s “Small Change” (about the failure of the “Twitter” revolution to create real social change), Rebekah Nathan’s “Community and Diversity” (about the failure of universities to live up to their ideals of community and diversity on campus), Helen Epstein’s “AIDS, Inc.” (about HIV prevention strategies in Africa), and Wesley Yang’s “Paper Tigers” (about Asian-American stereotypes and “tiger children”). I’ve yet to title the entire sequence, but it will be focused on strategies and mechanisms to create change—hopefully something students can benefit from.

Author, 06-19-2013 10:00 AM
Summer isn’t just impossible for our classes and the students in them; it’s impossible for me, too. Theoretically, this is my research time—theoretically. As the WPA for my school, I check email daily to put out countless little fires. As a human being, I mostly want to sleep. I envy colleagues who manage to juggle it all somehow: research, teaching, trips off to Europe. I just can’t seem to make it all happen. Being on a 9-month contract makes it all more impossible. I cobble together money for rent by teaching part of the summer (losing a chunk of research time) and doing administrative work for part of the summer too (losing more research time). Granted, I’ve never been good with unstructured time. Give me a routine and watch me hum, but give me unstructured time and my bed calls to me more loudly than anything. I am working, slowly, on a research project, but it’s a struggle and a fight. And my “summer” is half over—I’ll be teaching soon and back in the office. How do you do it? Or do you? I’ll say this: there is comfort in tenure. No, more than comfort: peace of mind. I tell people I don’t get paid much money, but I get paid a lot in time. And in the end I treasure the quality of life that academia affords me. Now, if only I could work more research into that life.

Author, 06-13-2013 06:48 AM
[Editor's Note: This blog post discusses plot details from the recent “Game of Thrones” episode "The Rains of Castamere.”] HBO's highly successful adaptation of George R. R. Martin's "A Song of Ice and Fire" novels might have been titled "Middle Earth Meets the Wars of the Roses Meets the Sopranos Meets Quentin Tarantino." But I'll admit that "Game of Thrones" was a much handier choice. Up till now I've had nothing to say about the series beyond the fact that it is another signifier of a continuing medieval revival that began with the enthusiastic embrace of J. R. R. Tolkien's The Hobbit and The Lord of the Rings trilogy by the baby boom generation back in the 1960s, and which has been continued not only by the successful filming of those works in the new millennium but by a whole series of "sword and sorcery" entertainments in popular literature, cinema, and video games (indeed, long before there was "World of Warcraft" there was "Dungeons and Dragons," while Harry Potter himself is a descendant of Tolkien's reworking of the wizard tradition). But with the recent broadcasting of "The Rains of Castamere" episode of "Game of Thrones," an interesting difference has appeared within this system of medieval-themed phenomena, and as I cannot say often enough, it is the differences within a semiotic system that point to cultural significance. The difference in this case is to be seen less in the episode itself (also known as the "Red Wedding") than in the impassioned response to it. And as the Los Angeles Times reports, that response hasn't been pretty (see "'Game of Thrones' fans see red over 'Red Wedding'"). Indeed, as the Times reports, a whole Twitter feed, @RedWeddingTears, has been opened just to express fan outrage over the episode. There is no way that I could provide a casual summary of what happened in the "Red Wedding" episode.
Like "Lost" and "Desperate Housewives," "Game of Thrones" is a very complicated story indeed, with far too many characters and plot lines to contain in any simple description. Ideally, one has to not only watch every episode of the TV series but also read the novels on which it is based (not so incidentally, the fact that many of the TV fans of the program have not read the books accounts for a good deal of the outrage expressed in the Twitter rants: they were completely taken by surprise). So let me just say that in a scene that combines the killing of Macduff's family in Macbeth with the Glencoe Massacre with the climactic revenge scene in Django Unchained, most of the surviving members of a popular family in the story (the Starks), and their household, are treacherously (and graphically) murdered during a wedding celebration. Even the family dog/wolf is killed. The response to all this has been quite striking. In astoundingly profane language, male and female fans of the show alike (the preponderance seems to be female, however) tweet their fury at the TV show, at the producers of the show, and even at George R. R. Martin for "ruining" their lives. Tears, convulsions, even a claim of having been "literally" killed (literally!?) by watching the episode make their way into the tweet thread. Now, violence is nothing new to either modern or ancient storytelling and entertainment, and deep audience identification with fictional characters is hardly a recent phenomenon either (my favorite instance of this from the past is to be found in the nearly hysterical reaction to the death of Little Nell during the serialization of Charles Dickens' The Old Curiosity Shop). But there is still something interesting going on here. The key lies in the fact that for all its claims of historical realism, "Game of Thrones" still belongs to the category of fantasy storytelling (there are, after all, dragons in the tale).
And while fantasy stories are characteristically violent, and prominent characters do die in them, they generally present rather clear-cut oppositions between Good and Evil, with Good eventually triumphing. This is particularly obvious in Tolkien's tales, in which Evil is not only ugly but not even human (oh, Sauron has his human slaves and allies, but the focus is always on Orcs, trolls, Balrogs, giant spiders, Nazgulized-Ring-Wraiths-Who-Were-Once-Men, and whatnot). Human history, on the other hand, isn't like that at all, and if you want some really depressing reading, read some world history; it doesn't matter from what region of the world. It's always human beings murdering other human beings, with no one being wholly Good, and Evil frequently triumphing over the innocent. So that's what makes fantasy what it is: fantasy, not a realistic view of human behavior or history. The unspoken contract between the fantasy storyteller and the fantasy audience is that Good will ultimately prevail over Evil (which is why English audiences for over a century preferred Nahum Tate's revised version of King Lear to Shakespeare's original: in Shakespeare almost everyone dies, Good as well as Evil; in Tate's play both Cordelia and Lear live happily ever after). That many fans of "Game of Thrones" feel betrayed by the "Red Wedding" is thus quite understandable. As a popular genre, fantasy is supposed to deliver on its ancient promise, and the "Red Wedding" doesn't. The Good Starks are practically wiped out. There is one member of the clan left, from what I understand, and she may get her revenge in the end for all I know, but right now a lot of fans are seeing, well, Red. Which raises an interesting semiotic question.
Martin's novels (which already contained the "Red Wedding" scene for anyone who took the trouble to read them, though the TV version made it even more violent) and the television show that has been created out of those novels indicate a certain difference in the history of popular fantasy storytelling. If this difference were restricted to just this instance, it wouldn't mean very much. But at a time when (Good) heroes are being replaced by (not-so-Good) anti-heroes all over television (Mad Men, Breaking Bad, The Sopranos, and Dexter are prominent examples), while such current hits as "The Walking Dead" present a world where Good and Evil have become irrelevant in a chaotic killing field in which your own family members and allies can be zombified into killable enemies at any moment, "Game of Thrones" appears to be a sign. But of what? I'd like to think that the growing popularity of stories that abandon the old conventions of Good triumphing over Evil signifies a certain historical realism, and even maturity, in their audiences, a rejection of simplistic fantasy and a recognition of the messiness of human history. But I don't think that's it. The Good vs. Evil motif still thrives in most superhero stories and in many, if not most, of the other stories that continue to be a part of our popular culture. The signs point instead to a certain fascination with pure violence itself, an amoral vision even bleaker than Tarantino's, whose violence is characteristically "justified" by some atrocity or other on the part of its victims (Nazism in Inglourious Basterds, slavery in Django Unchained). After all, not everyone was outraged by the "Red Wedding," and the responses of the not-outraged fans of the show (which can be found all over the Internet as well) to the laments of those who were outraged are, dare I say, often unnecessarily violent.