Bits Blog - Page 39

10-10-2013 07:34 PM
So the jury is in, the dust has settled, and the critics, by and large, are happy. Breaking Bad has concluded, and, unlike The Sopranos (with which it had so much in common), it ended pretty decisively. As I noted in a previous blog, at least one TV critic (Mary McNamara at the Los Angeles Times) had hoped that the show would end with some sort of message that Walter White was evil—not a mere anti-hero—and that crime, in the end, does not pay. And in Walter White’s confession that everything that he had done he did for himself, McNamara found a satisfying conclusion that set “this series free”. Another Times critic usefully noted a different angle, however, observing that “Walt turned from anti-hero to outright villain at some point, but there was always somebody worse in the room” (Robert Lloyd). Lloyd’s point is that this is a common Hollywood strategy used to keep audiences on the side of anti-heroes even when they cross the line into outright villainy. As Lloyd puts it, “In the age of the bad-good/good-bad guy, the formula is to ensure that however bad your bad guy is, there is a badder one around. This is a cheat, of course, but a common one.” Finally, yet another Times critic has offered the following shrewd description of the story arc of this much discussed show: “Since some point in its second season, 'Breaking Bad' has effectively been two shows in one. The first of those two shows was the one we thought we were watching in the pilot: A mild-mannered chemistry teacher breaks bad and discovers how thrilling it can be, then drags us into the thrills right alongside him. In general, he is someone we’re supposed to root for, someone we’re supposed to cheer. In the second show, 'Breaking Bad' was all about a man who made a choice to break bad and revealed untold depths of bleak awfulness within himself. 
It was a series about a man who broke himself open and revealed a monster driving the controls, then decided he kind of liked that version of himself” (Todd VanDerWerff). All in all, this is some pretty danged good television criticism, criticism whose cultural significance is quite wide ranging. In the first place, it illustrates how, in what I call an “entertainment culture,” a great deal of significant "high" cultural activity has migrated to the “low” sphere of popular culture. In such a world not only has popular art largely taken the place of “high” art, but even such traditionally “high” cultural activities as literary criticism can now be found operating at a very high level in the analysis of television programs. (Please note that I do not at all intend this observation as a criticism: it is only an interpretation of the signs. I’m actually delighted when I see good television criticism, and that is why I often turn to the L.A. Times, which has a very able staff of TV writers.) But there is another meaning to be gleaned from the finale of Breaking Bad, one which indicates the conditions under which popular entertainment still exists. This is the fact that, as VanDerWerff and Lloyd imply above, commercial art (and television is commercial art) is generally beholden to audience desire. If Franz Kafka once remarked that reading a novel should be like getting hit over the head with an ice-axe, the commercial artist’s credo is more along the lines of “give the audience what it wants.” And Breaking Bad gives everyone everything: for the moralist, there is Walter White finally paying the ultimate price for his crimes, provided by what I'll call a deus ex machinegunna. For the millions of viewers who identified with him, on the other hand, there is the fact that Walter White (who was often not the worst person in the room) achieves a certain justice (of sorts) in the end, and is at least trying to look out for his children. 
Such a have-it-both-ways conclusion was a masterpiece of keeping-everyone-satisfied commercial art. (Incidentally, my guess is that the reason why The Sopranos had such an unsuccessful finale was that its writers couldn’t figure out any such conclusion. How on earth could Tony Soprano achieve any sort of posthumous justice? What even worse villains would he need to revenge himself upon? And yet, there were all those fans who so identified with Tony that killing him off just wouldn’t work either, so the end was . . . sidestepped—perhaps, as a student of mine has suggested, simply to leave the door open to a revival of the series sometime down the line.) And there, at least, lies the only tenuous distinction that I can make between high art and popular art. Unbeholden to the profit motive, high art can afford to disappoint its audience, or even bother it. Commercial art can’t. It isn't a hard-and-fast distinction, and there are certainly many examples that could challenge it, but at this time when commercial art is, like billboards along a country road, blotting out the high-art horizon, there are no other distinctions that could be any more certain.

09-26-2013 06:53 AM
Let's begin with Carl Straumsheim at Inside Higher Education, writing under the headline "The Full Report on Udacity Experiment":

"San Jose State University on Wednesday quietly released the full research report on the for-credit online courses it offered this spring through the online education company Udacity. The report, marked by delays and procedural setbacks, suggests it may be difficult for the university to deliver online education in this format to the students who need it most. The report's release lands on the opposite end of the spectrum from the hype generated in January, when university officials, flanked by the Udacity CEO Sebastian Thrun and California Governor Jerry Brown, unveiled the project during a 45-minute press conference. The pilot project, featuring two math courses and one statistics course, aimed to bring high-quality education to students for a fraction of the cost of the university's normal tuition. Wednesday's report went live on the university's website sometime before noon Pacific time, appearing with little fanfare on the research page of the principal investigator for the project, Elaine D. Collins. Collins serves as associate dean in the College of Science. The report provides a long-awaited look into how the pilot project has fared. The initial results from the spring pilot led the university to put its partnership with Udacity on 'pause' for the fall semester. Last month, the university released results from the summer pilot, showing increased retention and student pass rates. However, those reports barely scratched the surface of the data the university collected during the project . . . . Research has shown that at-risk students tend to struggle in online classes, said the education consultants Michael Feldstein and Phil Hill. That disadvantaged students enrolled in SJSU Plus courses posted similarly poor pass rates suggests the spring pilot was rushed, they said. 'We have to be careful that our sense of altruism doesn't overcome our sense of common sense,' Hill said. 'If we know that at-risk students don't tend to do well in online courses, you can't just wish away that problem.'" ("After weeks of delays, San Jose State U. releases research report on online courses")

Take away points: A highly touted large-scale experiment to determine whether MOOCs are the answer to cost and accessibility problems in public higher education has produced negative results. Those students most in need of reduced costs and increased access obtained the least benefit from the experiment. The negative results have been released in stages, appearing with "little fanfare."

And now Stacey Patton at The Chronicle of Higher Education, under the headline "Influx of Foreign Students Drives Modest Increase in Graduate-School Enrollments":

"Enrollments in graduate programs at American colleges and universities have increased modestly, driven largely by a rise in international students, according to a report being released on Thursday by the Council of Graduate Schools . . . . The number of international students in American graduate programs went up by 8 percent from the fall of 2011 to the fall of 2012, up slightly from the 7.8-percent increase in the previous year. By contrast, first-time graduate enrollment increased by only 0.6 percent for U.S. citizens and permanent residents over the same period . . . . First-time enrollments of U.S. citizens and permanent residents was flat or down from the previous year in a number of science, technology, engineering, and mathematics fields. And total enrollment, counting both new and returning students, in graduate programs fell by more than 2 percent, to nearly 1.74 million students, in the fall of 2012, following a decline of 0.8 percent the year before . . . . It is good news that international-student enrollments are trending upward, said Debra W. Stewart, the council's president. But an increase of less than 1 percent in domestic students is worrisome, she added, given that the American economy will have an increasing need for highly skilled workers. The U.S. Department of Labor has forecast a 22-percent rise in jobs requiring at least a master's degree from 2010 to 2020, and a 20-percent rise for jobs requiring doctorates. 'We have strong increases for international students, which is good because if we didn't have strong enrollment from abroad, some graduate programs would be faltering,' Ms. Stewart said. 'But there are some particular concerns about where declines continue to persist for U.S. students. We are seeing a widening gap between U.S. and international first-time enrollments in engineering, math, and computer science.' In the fall of 2012, more than half—54.7 percent—of all graduate students who are categorized as temporary U.S. residents were enrolled in science, technology, engineering, and mathematics fields, compared with 17.3 percent of U.S. citizens and permanent residents . . . ."

http://chronicle.com.libproxy.csun.edu/article/Graduate-School-Enrollments/141577/

Take away points: Thirteen years and three generations of millennial college students (classes of 2004, 2008, & 2012) into the new millennium, and American-born enrollments in STEM graduate programs are "flat or down." While 54.7% of international graduate student enrollments are in STEM fields, only 17.3% of U.S. citizens' or permanent residents' are. The increased use of digital tools in education over these years does not appear to be increasing technological acumen or interest. Could this be an effect of digital devices being toys more than tools in too many instances?

09-25-2013 01:30 PM
I have, officially, become “The Man.” Of course, as a writing program administrator I was in many ways already “The Man.” In fact, I tell new teachers in our program to use that to their advantage, to ally themselves with students against me (“Yeah, I hate this essay too, but our mean writing program director is making us use it, so I’m going to do my best to help you all survive it.”) But now I am a whole new level of “Man.” Now I am the Coordinator for Credentialing, Assessment, and Interdisciplinarity—an adjunctive functionary of the dean’s office tasked with making sure people who teach are qualified to do so (according to a strictly literal interpretation of our accrediting body’s guidelines), figuring out what assessment means for our college and helping departments to enact it, and shepherding a diverse collection of interdisciplinary programs in our college. Wow. I mean I can’t even talk about the position in non-institutionalese.

One of the things that strikes me about the position is that it’s an odd assortment of tasks (what does credentialing have to do with interdisciplinarity?), yet at the same time the near-random coupling of it all feels right in a bureaucratic sense. My motto has been, for some time, “embrace the illogic”; this position merely confirms that as the path to sanity.

On the one hand, I’m thrilled. I enjoy administration. It’s why I’m a WPA. And I enjoy it too because one of my primary research interests is the institution itself. What better way to understand it than to be swallowed by it whole? On the other hand, I’m uneasy. It’s more work, and a WPA is, by definition, a busy person to start with. And it’s unpleasant work. I’m a cat herder, sending out frequent and carefully worded emails to chairs to get this or that done. I’m the sheriff, lowering the boom on some unsuspecting professor who is not precisely credentialed to teach such and such course. Ultimately, I am hoping to unravel more of the Gordian knot that is the institution.
I hope to be sharing what I learn here, so stay tuned (and, please, wish me luck).

08-22-2013 12:30 PM
With all the buzz surrounding the final season of Breaking Bad (not to mention the fact that this is my last blog for the summer, though I'll be back in the fall), I thought that this would be a good time to consider the significance of endings. Of course, if you want to read a full, and magisterial, study of fictional endings as a whole, I highly recommend the late Sir Frank Kermode's The Sense of an Ending. Here I wish only to look at the finales of highly popular television dramas. The high level of attention that the concluding episodes of Breaking Bad are receiving just now puts the show into some pretty select company, joining such other series as M*A*S*H, Twin Peaks, The X-Files, Lost, and The Sopranos, whose conclusions were also national news. In the case of Twin Peaks, Lost, and The X-Files, much of the fascination with their finales was due to the devilishly complicated—not to say obscure—nature of those series, whose audiences looked forward to their concluding episodes rather in the manner of someone waiting for the arrival of next Sunday's newspaper to find the solution to that impossible crossword puzzle. Here sheer curiosity, the desire to finally have clarified just what exactly had been going on through all those seasons, was the driving factor (which is why the impossibly obscure final episode of Lost was such a letdown for many of its fans). With M*A*S*H (whose final episode set a long-running record for television viewership), on the other hand, there was a kind of family dynamic going on, a desire to see what would happen to this cast of characters who through eleven seasons had become like personal friends to their audience. But when it comes to shows like Breaking Bad, while the curiosity and the sense of personal relationship are there too, something more is going on. What this "something more" might be is suggested in Mary McNamara's Los Angeles Times review, "What will Breaking Bad's story mean?"
Seeing Walter White not as an anti-hero who has blundered into evil territory but as an out-and-out villain, McNamara proposes that "Breaking Bad wasn't about how good men will go to extremes when pushed; it was about how 'good men' can be secretly bad." This significantly raises the stakes, and so, for McNamara, while series finales "are notoriously difficult . . . this one seems more important than most, carrying with it a discernible moral weight. In Gilligan's [Breaking Bad's creator] worldview, does evil survive and thrive? Can there be redemption or simply containment?" Good questions, these, because, like The Sopranos, Dexter, and Mad Men, Breaking Bad has been a television series that invites viewers to put themselves into the shoes of some awful creeps. What this TV trend signifies is not easy to assess. Of course, there is always the old "misery loves company" factor to take into account: that is, the appeal of a program featuring people whose lives are so bad that your own life feels better to you. But I think that there is something more going on here, a sense of growing desperation in this country that causes millions of viewers to wonder what their own breaking points might be, just how far they can be pushed before they abandon every restraint of civil society—before they even care any longer what those restraints are. For such viewers, Breaking Bad, and shows like it, may be a vicarious experience. The success of series like The Walking Dead, in which all that is left in the lives of its characters is an unending war for survival of everyone against just about everyone (even your own relatives can become the "enemy" with just one fatal infection in The Walking Dead), indicates that something like this is indeed the case. It isn't the horror here that stands out: it's the absolute freedom, the complete abandonment to violence.
In the end, then, the power of Breaking Bad lies in the way it holds a mirror up to a society that is tearing itself apart, and in which crime, all too often, does pay (they don't call them "banksters" for nothing). My guess is that the finale of Breaking Bad will hold neither redemption nor containment, only the sickening feeling I presume we were supposed to get at the end of Natural Born Killers, when it becomes clear that the psycho-killers have finally gotten away with it.

07-25-2013 10:30 AM
The relative flop of Johnny Depp's recent foray into the Lone Ranger franchise (I say "relative" because many a domestic box office disappointment ends up in the black after all due to international ticket sales and DVD, Netflix, and whatnot re-sales) left the movie criticism community abuzz with post mortem analyses that were fairly dripping with the kind of schadenfreude that greets most expensive blockbuster busts. The reasons for the failure of the film are many—including the perceptions that it couldn't make up its mind whether it was Blazing Saddles or Little Big Man, and that Tonto came off suspiciously like Jack Sparrow—but whether it was really that silly stuffed crow that was to blame, or simply the fact that contemporary kids don't know the Lone Ranger from Hopalong Cassidy, is not my concern here. What I want to look at is what happens when some entertainment concepts that have more than outlived their times are mined in Hollywood's endless quest for safe formulae in a high-stakes era when the bottom line is far more important than creativity. I can easily imagine what Disney's thinking was on this: with superhero flicks pulling in billions (forget Iron Man for a moment; they've even resurrected Superman successfully for the umpteenth time), the studios (I use this word loosely) are ever on the lookout for old entertainment "claims" that haven't yet been fully mined out, and the Lone Ranger was available (though the spectacular failure of John Carter should have been a warning here). But the problem is that some old mines contain toxic ores, and the Lone Ranger is one of them. The problem, of course, is Tonto.
Though by the standards of the time in which the Lone Ranger story was created Tonto was quite a progressive advance over the usual savages-circling-the-wagon-train representations (Cochise in the short-lived television series Broken Arrow offers a similar, though far less well known, example), by the 1960s Tonto's wooden caricature of a "noble savage" in subservience to the Ranger's dominance just didn't cut it anymore. That Disney and Depp were very well aware of this problem is quite evident in their movie. Here, Tonto is dominant, while the Ranger, though physically imposing at six feet five inches in height, is stiff and unimpressive. But this obvious attempt to lay to rest the ghosts of Westerns past by retelling the story from Tonto's perspective apparently failed to persuade Native American audiences—according to a review in the Los Angeles Times—who were not terribly keen to see this old war horse resurrected, and were particularly unhappy to see, once again, a white actor playing an Indian character. I think the lesson to be learned from this is that there simply are some entertainment concepts that can't be redeemed, no matter how good one's intentions may be. You don't try to bring back Stepin Fetchit. You don't try to remake Birth of a Nation from a slave perspective. Frankly, though it did pretty well both commercially and critically, I think it was a mistake for Peter Jackson to lug King Kong back onto the silver screen. With Accounting in absolute control over Creative in today's movie industry, however, I expect that we will have many more attempts to dig up toxic concepts and decontaminate them for redistribution. But, please: don't anyone try to pretend that The Merchant of Venice would make a great vehicle for Baz Luhrmann.

07-11-2013 10:30 AM
I admit it: I am a Yahoo! News junkie. While the rest of the world is scrambling to Tumblr or Twitter, or Instagram or Reddit or (dare I say) Facebook or all the other social networking sites whose meteoric rises (and sometimes falls) are common knowledge to a linked-up world, I start my day with Yahoo! News. After which I go to the Los Angeles Times, the Washington Post, and MarketWatch.com, all of which together provide me not only with my morning news "fix" but also with that constant stream of raw information that one must have in order to perform semiotic analyses. Like all web sites, Yahoo! News is not static, and its designers have changed its appearance and structure a number of times over the years. But the latest change, which seems to have solidified after a week or so of experimentation, is the most semiotically interesting to me because of the way that it has abandoned the classification scheme that characterized earlier incarnations of Yahoo! News. That is, rather than dividing up the headlines into different classifications such as "U.S. News," "World News," "Sports," "Technology," "Business," and so on and so forth, the new page is simply a scrambled and undifferentiated mixture of news stories (while there are still some classificatory links at the top and side of the page, they are rather inconspicuous compared to the center-of-the-screen headlines). This itself is a difference, and, as such, it is a sign. I'll have to coin a term to characterize just what kind of sign I think this is. I'll call it an "emblematic sign," insofar as I think that the Yahoo! News page change is emblematic of something much bigger going on today. What it signifies is precisely the nature of what, many years ago now in the light-speed time frames of Internet history, was once called the "information superhighway."
This veritable avalanche of 24/7 information that is transforming not only the way we live but also the way things work in this world, is a godsend to popular cultural semiotics, but it also presents a problem. That problem is that information alone is not self-interpreting. Its meaning does not lie on its face. To get to the significance of things you must think about them critically, and that means organizing information into structured systems of association and difference. Now, as a semiotician who has been doing this more or less instinctively for many years (long before I began to codify the process in Signs of Life in the USA), I am not in the least discommoded by the often chaotic vistas of the information superhighway, and so I am not particularly bothered by the new Yahoo! News page. But there are two things that do concern me about it. The first is my realization that the change is probably motivated by a desire to get Yahoo! News page readers to click on more stories than they would if presented with pre-classified news categories. If you, say, only want to look at U.S. news, in the earlier format you could ignore the rest of the page and go directly to the U.S. news section; now you have to skim the entire front page to find what you are looking for, and so see a lot more headlines along the way. This can be conducive to what might be called "impulse clicking," like the impulse-buying schemes that you can find in retail stores. After all, the more you click, the more revenue Yahoo! makes. But beyond a slight irritation at being manipulated in this way (oh well, Yahoo! has to make some money if I'm going to get my free news feed), my deeper concern is for those to whom critical thinking is not instinctive.
That is, the presentation of undifferentiated news can only intensify the sense that information is not semiotic, is not meaningful; it is only information, something to grab before going on to the next tidbit of information, all seen in meaningless isolation. And that kind of information makes for less, not more, understanding of our world.

07-10-2013 07:00 AM
Change is very much in the air here at Florida Atlantic University. Our president resigned, we’re starting a national search for a dean for our college, and we’re moving to a new chair in the department. Change, of course, can be exciting: a time for new growth, new hope, and new directions. It can also be terrifying: a time of uncertainty, instability, shifting lines of institutional power. I tend to remain optimistic in the face of change. It’s not that hard to do when you run a writing program. As my mentor Richard E. Miller once remarked (I may be slightly paraphrasing), writing programs have one fundamental asset—we have all the students. For sure our program is the economic engine of our college, thanks to the role we play in the core curriculum. That is, of course, a double-edged sword, since it risks turning our courses into “service courses” and thus devaluing the program, the department, and the college. Perhaps so. But having weathered many changes already I will say this much: service or not, there is some measure of security when you are vital to the basic functioning of the university.

05-12-2013 05:46 PM
Soon after I started teaching, while still just a Graduate Teaching Assistant, I realized no one knows what I really do for a living. You’ve probably experienced it, too. It goes something like this: “So what do you do for a living?” “I’m an English teacher.” “Uh-oh. I better watch my grammar!” My standard reply is “It’s OK. I’m off the clock.” The more complete answer would be something like “Well, actually, grammar is the least of what I do. I teach students to think critically, to make connections between complex ideas, to express those ideas to others, and to do so in a way that conforms with the particular quirks of academic writing so that they can go on and succeed in whichever discipline they choose. I also try to teach them a set of adult skills, ranging from managing time, to completing tasks on time, to asking for help, to communicating when there are problems.” But I could say all of that and still, in my heart, I know that “they” expect me to be teaching grammar. Indeed, in some quarters “they” demand it (“Students can’t write!” they screech). And so I try. In our program, more specifically, we teach students to recognize and track their specific patterns of error. If a student doesn’t understand how to use a semicolon then that error is going to happen again and again. If they come to understand they have an issue with semicolons, then they can focus their attention on that one issue, master it, and solve it. With each paper I grade, I identify the prominent patterns, note them for the student, and ask them to track them for the next paper using an error checklist. The checklist asks them to list the error, to review it in the handbook, and to identify how they addressed it in the current paper. It doesn’t work. Well, I should say, it only works when students choose to use this tool. More often, I get error checklists hurriedly scribbled before class. 
Errors are dutifully listed and checked, handbook pages are referenced, but the paper itself remains filled with the very same errors. Frustrating. More frustrating are those students whose errors just don’t have a pattern—errors that are random, careless, syntactically complex. The tools I offer them seem woefully inadequate. Grammar is not my job. It’s my bane. It’s hard enough getting students to invest in a course they’re forced to take, task enough to get them to care about something they’re systematically disinclined to like at all. To get them to care about grammar is a goal still outside my reach. Grammar? FAIL.

05-02-2013 10:30 AM
A friend of mine from Australia emailed me the following link to a clip from the television series The Newsroom: http://www.safeshare.tv/w/UAGOcLSuLX . The clip’s origin is not identified in the link he provided; it is simply billed as "The most honest three and a half minutes of television, EVER." So it took me a moment to recognize Jeff Daniels as the impassioned panelist angrily denouncing the present condition of America while extolling its past. Is it, though, the most honest three and a half minutes of television? Well no, not exactly. Here’s why. First of all, Jeff Daniels’ speech is an example of something very familiar in American history: the Jeremiad, a sermon or text that, like the Old Testament Book of Jeremiah, denounces the moral state of the present in comparison to some distant past. The New England Puritans were especially fond of Jeremiads, so it is only appropriate that one of their descendants, Robert Lowell, composed in his poem "For the Union Dead" a Jeremiad that is, for my money, the greatest in our history, as Lowell compares the flaccid materialism of 1960s America with the moral backbone of men like Robert Gould Shaw (made famous today by the movie Glory). Indeed, as Raymond Williams so brilliantly demonstrated in his book The Country and the City, people have always decried the present on behalf of some lost "golden age," an age that keeps receding into the past whenever you actually go look for it. What we praise today was once reviled in its own time. The Newsroom’s Jeremiad is no different. Denouncing a floundering present state of America, Daniels’ character emotionally harkens back to an era of high moral purpose that never really existed (the U.S., for example, didn't enter the Second World War for moral reasons: we stayed out of it as Hitler gobbled up Europe, inaugurated the "final solution," and nearly knocked out England; we only got into it when Japan attacked Pearl Harbor and Germany declared war on us).
More interesting to me, however, is the ideological incoherence of the rant. That is, while the scene begins with a caricature of both a "liberal" (who is pointedly insulted for being a "loser") and a "conservative" (who is roundly refuted in his belief that Americans alone are "free"), in the end it isn’t clear just what ideology it represents. On the one hand, Daniels’ position appears to be “liberal” enough to admit that America isn’t the greatest country in the world anymore, but on the other hand, its nostalgic longing for things past (note the music rising and the tears forming in the eyes of the students as they capture it all on their iPhones) repeats the conservatively "exceptionalist" belief that there is something really special about America—that, in fact, we once were the greatest country on earth and we still ought to be. Whether or not America was, is, or someday will be the greatest country on earth is not a problem for semiotics. What is, is the question of just why television so commonly tries to have it both ways when it comes to ideology. I’m reminded here of an episode of Criminal Minds that presented the gun-loving militia movement as a haven for misfits and psychopaths, while at the same time attempting to elicit audience empathy towards it. The result is ideological incoherence. Similarly in this clip from The Newsroom, after the ideological left and right are dismissed, sheer nostalgia is substituted for a coherent politics, sometimes with a conservative slant (especially in the reference to the “men”—never the women—of the past), and sometimes with a vaguely liberal slant. Why this sort of thing is so common on TV is not hard to explain. Television programming exists to generate profits, and you don’t succeed at that by offending too large a swath of your potential audience. So you choose ideological incoherence: a position that isn’t really a position at all and so will not offend too many people.
That’s good for the bottom line, but, no, it doesn’t really make for an honest political assessment.

Author
04-18-2013
09:30 AM
Call it the meme that isn't quite a meme yet. That's one of the interesting things about the new Brad Paisley/LL Cool J song that is all over the news, the Net, and Twitterland: look for it on YouTube and you will find lots of personal reactions to the song, but not a performance of the song itself—not, at least, as I write this blog. That's understandable; with so much advance publicity that no amount of money could buy, the copyright holders can be forgiven for wanting to see some album sales before free versions are allowed onto the Web. But the lyrics are out there, as well as some news clips of the song and its performers discussing it, and that will be enough for me to work with here. As I cannot repeat often enough, a semiotic analysis must begin with the construction of a relevant system in which to situate the sign that you are interpreting. The construction of that system entails the identification not only of significant associations but also of critical (one might say "diacritical") differences. In the case of "Accidental Racist," then, we can start with the system of popular music. Within this system a particular association immediately leaps out at us: "Accidental Racist" even explicitly draws attention to that association when the "white" voice in the song notes that his Confederate battle flag* t-shirt only means to signify that he is a Skynyrd fan. Yes, of course: there hasn't been this much fuss about the racial overtones of a pop song since Lynyrd Skynyrd's "Sweet Home Alabama." And the fact that so much attention is being paid to Paisley and Cool J almost forty years after Skynyrd's lucrative experiment in racial provocation is certainly a sign that race relations in America are still quite fraught. But that doesn't take us very far. It is the differences that can reveal even more. In this case we can look at the differences in popular music genres.
Skynyrd is a "rock" band ("Southern rock," to be more specific), while Paisley is a "country" singer, and Cool J is a rapper. Now, rock music was co-created by black and white performers (Chuck Berry and Carl Perkins are essential names in this history), so, even in the face of racist distinctions in the 1950s between white "rock-and-roll" and black "rhythm and blues," classic rock music does not have powerfully apparent racial identifications (even among Southern rock bands, groups like The Allman Brothers—the greatest of the bunch—were anything but segregationist: Lynyrd Skynyrd had gone out on a limb, and they knew it). But country music and rap do. As the Paisley side of "Accidental Racist" makes very clear, country music is, first and foremost (though not exclusively) the music of the white South (note the requirement that country music singers, male and female, must sing with some sort of southern twang). And rap music (now more commonly called "Hip Hop") is still regarded, in spite of its huge non-black audience, as the music of the black inner city, as is made clear by the LL Cool J portion of "Accidental Racist," which is filled with many stereotypical features of urban black popular culture. And here, I think, is where the significance of the song lies. Paisley and Cool J know who their audiences are. They know their genres, and the symbols that belong to those genres. More importantly, they know their audiences' attachments to those symbols. The Confederate battle flag is one such symbol, and a significant portion of Paisley's audience is still quite attached to that symbol, even as (especially as) that symbol is being taken down (finally) from State Houses throughout the South. 
If he wants to keep his audience, Paisley can't come out and denounce the CBF (things haven't changed that much), so, instead, he is trying to change its meaning, turning it into a symbol of proud young southern manhood that intends no offense.** This is a lot different from the Lynyrd Skynyrd gambit. They knew perfectly well that they were waving a red flag, literally as well as figuratively, in the face of America with their prominent adoption of the Confederate battle flag. That was confrontation. Paisley is looking for negotiation. And that's why there has been so much reaction to the song even before many of us have heard it performed in full: the question is whether or not the meaning of the CBF can be negotiated. Since the reaction so far has been more against LL Cool J's complicity in this negotiation ("If you don't judge my gold chains . . . I'll forget the iron chains") than against Paisley, the indications are precisely that, even in light of what I am willing to grant are Paisley's and Cool J's good intentions (they state in interviews that they only want to open a healing racial dialog), there are some symbols whose histories, and thus whose significance, can't be rewritten. If young southern white men want to display their pride, wearing the CBF is not going to be an uncontroversial way of doing so. Not today. Probably not ever.

*A great fuss is made by defenders of the public display of the Confederate flag over the fact that most such displays are of the Confederate battle flag, not the national flag. The distinction, presumably, is to mark the difference between the national symbol of the Confederacy, which stood for the defense of slavery, and the battle flag of the Confederate armies, which supposedly stood for valorous men simply doing their duty and defending their rights.
Frankly, I don't buy it: the Confederate soldier (often a conscript, not so incidentally) was fighting for the Confederate nation, which was created in the defense of slavery, so the difference is meaningless in my book. **There is an irony here. Brad Paisley is from West Virginia, which was created during the Civil War when the western, mostly non-slave owning, counties of Virginia seceded from secessionist Virginia with the help of the Union army. He may be merely role playing in the song, but I can't help but wonder whether he and his fans are aware of the irony.

Author
03-21-2013
09:01 AM
In my last blog I sang the praises of the unexpected dividends of digitally based research. So I hope that this, and the fact that I write this column (web log, or “blog”) for Bedford Bits, will be sufficient evidence that I am hardly a purblind “Luddite” ignorant of, and hostile to, technology. Still, in this blog I want to sound a certain warning note, whose theme could be “balance is everything.” I am prompted to this theme both by the daily deluge of features in Inside Higher Ed and The Chronicle of Higher Education devoted in one way or another to technology—from MOOCs to calls for the defunding of liberal arts education in public universities on behalf of STEM spending—and by my recent reading of Ian Morris’s book Why the West Rules—For Now. In fact, it is Morris’s book that has helped me clarify more effectively to myself just why the Humanities still matter. Not that that is Morris’s thesis. In itself, Why the West Rules is a grand narrative in the tradition of such books as Hans Zinsser’s Rats, Lice and History and (more particularly) Jared Diamond’s Guns, Germs, and Steel. Vigorously arguing for a kind of geographical determinism in history (“maps, not chaps,” as Morris says again and again), Why the West Rules implicitly suggests a sobering lesson for us Humanists: namely, that societies that, for whatever reason, fail to look forward eventually succumb, with dismal results, to those that succeed in looking forward. Thus, prehistoric agriculturalists overwhelmed foragers thousands of years ago, just as industrialized Europe overwhelmed the rest of the world some two centuries ago. Since it is quite obvious today that postindustrial technology is already the you’d-better-get-with-the-program zeitgeist (and there is something more than vaguely Hegelian in Morris’s book), those of us who cling to non-technological values would appear to be not only old-fashioned but quite frankly in the way.
No wonder the governor of North Carolina wants to withdraw all state support for liberal arts education in his state’s public universities, and even the president of the United States appears to think that STEM and “education” are synonymous. But there is something crucial that books like Why the West Rules leave out: the human (or, if you like, the moral) dimension in history. To give him credit, Morris is quite explicit about the fact that his quantitative assessment of history excludes moral judgment. From his own perspective he is neither applauding nor condemning what he calls the “shape of history”; he is simply describing it (the fact that his interpretation of history radically underestimates the role of sheer contingency—and critically contradicts itself on the basis of its own evidence—is beside the point). The point is that not only current historians but just about everyone else outside the Humanities seem to be adopting an essentially economistic, materialistic attitude towards social history. And that’s a problem. Just look at Morris: his assessment of history is based in a quantitative measurement scheme that he calls a “human development index,” or “social development” for short. Focusing on “energy capture,” “urbanism,” “information processing,” and “war-making capacity,” Morris measures 15,000 years of social history. And (guess what), by his measure the United States is currently at the top of the heap but in real danger of being overtaken by China. Since China was at the top of the heap some six hundred years ago but, lacking its own industrial revolution (essentially a failure of forward-looking thinking), was eventually overtaken, conquered, and humiliated by an industrialized Europe, it would appear to behoove contemporary Americans not to make the same mistake.
The fact that this is one of the take-away points of Morris’s book is demonstrated by the way the CIA has consulted Morris for his take on what the future is going to look like. Now, I don’t want to get tangled up in the racial and cultural politics that books like Why the West Rules inevitably raise. What I want to bring up is what they don’t raise. And what they don’t raise is an index of what life is actually like for the mass of people living in any society. History shows plenty of examples of developmentally high-scoring societies in which life for the bulk of the population has been quite miserable. In fact, I would argue that contemporary China’s peculiar brand of totalitarian capitalism constitutes one such society. But if we ignore the qualitative conditions of life in a society, as so many economistic books do today, an inevitable conclusion might be that if it is to survive into the future, the United States, which built the world’s first mass middle-class society, would do well to get with the Chinese program, dismantle its middle class, and create a mass workforce of impoverished workers led by a tiny elite of technologically savvy businessmen and women. Does this sound outlandish? No: it is exactly what is happening in this country right now. Whether we are talking about the MOOC revolution, which will, if it succeeds, turn middle-class academic careers into McDonald’s-class jobs for all but a handful of sage-on-the-screen superstars, or about the much more general demand that American workers not expect things like pensions and benefits, or even job security, on behalf of corporate “competitiveness,” we are looking at the results of economistic materialism. If the index for the society as a whole is high, who cares about the lives of the mass of individuals? I’ll tell you who cares: Humanists in the Humanities.
It is Humanists who can, without demanding an end to technological society (after all, we already have a thriving Digital Humanities), provide the balance that social thinking and planning require if our society is going to be worth living in. That is a value that quantitative thinking neglects or even eschews. So, no, I do not feel myself to be behind the times or an impediment to anything in my loyalty to the Humanities. This is not backward thinking, not a fixation on the past; it is forward thinking based upon the critical thinking skills that enable us to assess everything in a total context, a context that goes beyond economic and monetary measures. Quality must balance quantity, and at the moment it appears that only Humanists are still saying this.

Author
03-07-2013
12:30 AM
Digital technology has absolutely revolutionized the possibilities for research when it comes to popular culture-related writing assignments. Not only can students find advertisements (print and video), television shows, music, movies, and, of course, scholarly and popular commentary online, but they can also benefit from the way that online sources can keep up with the pace of popular culture far more successfully than print can. Indeed, thanks to the possibilities of digital research, far more ambitious writing assignments can be designed precisely because it is so much easier for students to get access to the material they need, especially when it comes to semiotic analysis and interpretation. This is a rather obvious point, and might appear hardly to require an entire blog entry, but I want to discuss here one advantage of the Internet that I do not believe our students are sufficiently aware of: the possibilities it offers for unguided exploration and unexpected discoveries. The standard model for student research appears to be something along the following lines: determine some relevant search terms for your topic, enter them into a search engine, and then pick out a few of the more promising-looking links that appear. Most of the time something from Wikipedia will turn up at or near the top of the page, and in the worst case the research effectively ends right there. But even the more responsible student is likely to remain confined within the terms of the keywords used for the search, and this can very much limit one’s research, especially when one of the tasks of the assignment is to construct relevant systems, or contexts, within which a given popular cultural topic can be analyzed semiotically. Such systems require a very wide-ranging scope of knowledge and information, more wide-ranging than a narrow keyword search will be able to reveal.
Here is where an alternative to the keyword search-and-discover research model can be very helpful. In this alternative, which might be called (after Borges) a “garden of forking paths” approach, research becomes a kind of journey without an itinerary. You know that you are going someplace, but you don’t have anywhere specific in mind. Instead, you let the signposts along the way take you to the next signposts, which take you to further signposts, and to places that you never expected to be going at all. Let me give an example. As I have mentioned a number of times in my blogs here, one of the most important questions I ask myself as I analyze contemporary American culture is how we got to where we are now: a country radically at odds with the goals that were so widely embraced by American youth in the 1960s. You can’t simply enter a question like that into a search engine, and reading a book on the subject can narrow one's view of this massively overdetermined question, given the need of books to stick to specific arguments couched within specific disciplines. So I have been on a kind of research walkabout for some time now. I can’t remember precisely when it began, but I do recall reading Ann Charters’ anthology, The Sixties Reader, about a year ago, and while I found a lot of material there that I already knew about and expected to find, I also found a selection from an oddly titled book called Ringolevio, by an oddly named writer, Emmett Grogan. Going online to find out who Emmett Grogan was revealed a lot of basic information about him personally, but also tipped me off to a group I knew vaguely about, the Haight-Ashbury Diggers. Searching for information on the Diggers took me to a web site called Diggers.org, where I found not only information but a discussion forum frequented by a large variety of now middle-aged (and older) ex-Diggers, ex-Hippies, and not-so-ex-Hippies and Diggers.
Reading this forum has given me an enormous amount of primary-source sociological information that I never expected to find at all. That site has tipped me off to some useful books to read, but it has also led me to further web pages where I have learned ever more about what, precisely, happened to a large number of people from the sixties who once tried to change society. I have discovered their pride, their disappointment, their continuing passion, and, yes, sometimes their paranoia. It has been a journey through America without my having to leave my desk. Just today I visited Diggers.org and looked at an update on how these sixty-somethings and seventy-somethings feel that the Occupy Wall Street movement is an extension of their own. So that got me searching for Occupy Wall Street information, which not only took me to specific OWS websites but also, to my surprise, to a website devoted to libertarianism, something that, on the face of it, could not be more opposed to the Diggers’ values or to those of the sixties. But wait: reading the libertarian page revealed to me that many conservatively identifying young libertarians have a lot in common with the Diggers (who evolved into what they themselves called the Free Families) in their insistence on complete freedom. This has brought young libertarians into conflict with such high-profile conservatives as Glenn Beck and Rush Limbaugh, who have been seeking libertarian support. It’s all very interesting, but my point is that I didn’t expect to be there at all. I’m still on walkabout, still gathering what I need to know to answer my questions. I’ve learned a heck of a lot along the way that I never expected to learn, because I didn’t even know it was there. Of course my project is far larger and more open-ended than a student paper assignment. But I want to make the point that research is a wandering as much as it is a homing in.
If you go at it with only one destination in mind, you’ll miss what you really need to know. The Internet can make that wandering both easy and quite fun.

Author
02-07-2013
07:11 AM
The fundamental principle of popular cultural semiotics is that everything can bear some sort of social significance. And that means everything. Yes, even running shoes. Take the current barefoot/minimalist running shoe fad (I'm not sure if it is sufficiently entrenched to be called a trend yet). The fad in question involves the explosively popular shoes that either look like some sort of foot glove (with a separate compartment for each toe) or appear more or less conventional but use much less cushioning and other material than ordinary running shoes. The former variety includes the Vibram FiveFingers shoe, while the latter includes the Merrell Sonic Glove Barefoot runner. The whole thing really got started with the publication of a book called Born to Run (2009), which focused on the barefoot running prowess of the Tarahumara Indians of Mexico, and you can learn a good deal about it all here. Now, on the face of it, the barefoot running phenomenon would seem to have a purely functional significance, based upon the fact that the Tarahumara Indians are apparently able to run effortlessly for hundreds of miles barefoot and without injury. Furthermore, after decades of running shoe technology developments that have enhanced the cushioning and support of conventional running shoes, recent research suggests that all that cushioning and support causes runners to land heel-first in their stride (this is called "heel striking"), and that this kind of stride puts great stress on the knee and ankle joints, causing injuries. The "barefoot" technology shoes, on the other hand, are designed to force a toe-striking stride, which may be a less injury-prone running style. But there's more to all this than simply the physiology of running and the functionality of shoes, for looked at semiotically these new shoes are signs as much as they are athletic gear.
First, then, there is what we can call the "Noble Savage" angle. Since the eighteenth-century romanticization of the aboriginal peoples contacted in the age of European exploration, the "Noble Savage" has been an emblem, for such writers as Rousseau, of a prelapsarian innocence that European civilization has lost. Reflecting more of a European mythology than a human reality, the "Noble Savage" is a construct not unrelated to such gestures as Marie Antoinette's donning of simple peasant clothing. The Tarahumara Indians serve as "Noble Savages" in this sense, conferring upon running shoes their aura of a prelapsarian innocence. A corollary to the "Noble Savage" significance of barefoot running shoes is their "green" appeal. Using less material than a conventional running shoe, barefoot/minimalist runners would appear to use fewer resources and thus be more sustainable than the ordinary, beefed up variety. Now, I'm all for green technology, and I am no cheerleader for European-style civilization, but as a runner and a semiotician I know that there is something a little funny about all this. First of all, the "Noble Savage" bit has always been condescending, and it is no less so today with the barefoot running movement's use of the Tarahumara Indians. Living in primitive conditions for generations, they have developed the kind of hardened feet that can run without protection not because they are purer but because they have historically had no other options. They also run on natural trails without glass and nails and other foot-cutting stuff (this is why few "barefoot" runners in the U.S. actually run barefoot: minimalist running shoes are supposed to protect the foot from such things). One wonders how the Tarahumara would perform if they had modern running shoes to run with. 
Beyond this is the fact (which I know from personal experience) that there are injuries specific to barefoot running that occur when one strikes too far up on one's toes—which is what barefoot running shoes are designed to compel. Painful calf injuries often result, contradicting the claim that barefoot running is all good. As for the possible claim that minimalist running shoes are more ecologically friendly, well, not quite. Using less material, they wear out much faster than conventional running shoes and must be replaced every few months if heavily used, leading to the consumption of more resources in the long run (pun intended), not less. And finally there is a particularly American consumerist angle to all this. For the fact is, as I know from my own experience, that you can discipline yourself to run in such a way that your foot strikes the ground in an optimum fashion without requiring any special sort of shoe. Indeed, the best balance for aging knees and ankles like mine is a nicely cushioned and supportive shoe combined with a foot strike that lands between the arch and the ball of the foot. I do not need to buy a special shoe to force me to run this way—indeed, a minimalist shoe would force me to strike too high up on my toes and really mess up my calf muscles in no time (I know: I've tried barefoot running). So here is the point: the barefoot/minimalist running shoe fad signifies within the context of a larger consumer system whereby Americans tend to prefer products and gimmicks that promise to do their work for them, rather than making an effort on their own. Whether it is the "toning shoe" (also originally based on a kind of back-to-nature claim) that claims to exercise your body even when you are not really exercising, or the endless weight-loss programs and pills that promise slim bodies without effort or discomfort, Americans like to buy results in the consumer marketplace rather than work for them.
Purchasing an expensive barefoot running shoe (they are priced at a premium) rather than training yourself to run with a healthier stride is a part of this phenomenon. No one is really being more natural or green or aboriginal by choosing one shoe over another, and unless you have a nice smooth turf to run on, it isn't very healthy to run barefoot. The aura of naturalness and health associated with minimalist running shoes is a matter of image, not function, a sign rather than a substance.

Author
01-30-2013
09:35 AM
In this series of posts, I’d like to think about student responses to a first-day writing sample that asked “What is academic writing?” (broadly, of course, for IRB-related reasons). Though the sample size is really quite small, I think these students nevertheless reveal some of what many students bring to our classrooms. One of the things that students bring, represented in many of the responses, is a particular understanding of the form of academic writing, an understanding created through No Child Left Behind (NCLB) or (more specifically) the FCAT, Florida’s mechanism for complying with that federal legislation. These responses were easy to spot because they emphasized not just the form of academic writing but a very specific form—and a very formulaic one. The students who presented this view of academic writing indicated that it has an introduction, a conclusion, and a thesis. I’m not sure how to feel about this grouping. The overall tripartite construction, broadly speaking, applies to the kind of writing we ask students to do in our writing courses. But I am concerned about how that preconceived notion of form might limit students and how it might even block them from writing well or writing at all. I don’t want to open the NCLB can of worms here, but I'm wondering about the experiences of other teachers. Do tests like the FCAT do anything at all to prepare students for your classroom? Are they a start? Or are they a hindrance? Do we build on what students learned in high school? Or do we tear it down?

Author
01-23-2013
12:33 PM
Grammar was the most prominent (if somewhat disheartening) theme in students’ first day responses to the question “What is academic writing?” However, the second most mentioned feature was citation. That one really caught me by surprise, I think because citation seems to be a particular Achilles heel for students: they seem to have little sense of what it is or when it’s needed. Given that citation is, I think, a kind of disciplinary “secret handshake,” a way of showing that you are a member of a particular discipline and belong there, it’s not all that surprising that first-year students would know so little about it. I’m just glad to know that it exists at all. In fact, that’s the approach I’ve adopted to teaching citation—starting by making sure students know it exists. I don’t “teach” my students MLA citation (even though we use it in our class) because, first of all, students are going to end up in many different disciplines with many different citation systems; there’s a good chance they will never use MLA again. Besides (and secondly), citation systems change, so teaching the intricacies of one instantiation of one citation system will end up as useless knowledge—if not the next semester, then certainly some day. No. I tell students they only need to know three things about citation:

1. It exists. In class we discuss what this means: basically, students need to understand that if they are using words or ideas from someone else, there needs to be a citation.

2. If it’s not absolutely right, it’s wrong. In discussing this point, we continue part of the conversation from the first point: academic writing takes proper attribution very, very seriously. We generally open this up to a discussion of plagiarism: what it is, how it happens, what the consequences are, and how to avoid it. But this point also underscores the “secret handshake”-ness of citation. I think that citation is part of the process that David Bartholomae describes in “Inventing the University”; using it, if not mastering it, is evidence that students are stepping into our language.

3. Know how to find the answer. I’ll admit it: with all the recent changes to MLA, and writing in a discipline that uses either MLA or APA depending on the place of publication, I have to look up citation formats all the time. If I don’t know every citation form by heart, why should my students? Instead, I know how to find the answers, and in class we tease out where those answers might be:

- A handbook or other reference book.
- A reliable web resource (Purdue OWL being the most popular, of course).
- Electronic citation tools like RefWorks or even Microsoft Word.
- A targeted Google search with some discerning assessment of the results.
- Asking me.
- Asking at the writing center.
- Following what other sources have done, ones they know are correct.

It doesn’t matter how students find the right answer, as long as they have a set of tools for finding that answer. I know my approach might be a bit weird (as I am) or a bit oversimplified (as I never am), but the more I use this approach (which requires constant reinforcement through the semester) the more I am convinced it’s a workable one. So, only three things to know about citation. Could that work in your classroom?