Bits Blog - Page 133

Author
06-12-2013
09:27 AM
We’re already a couple of weeks into the first of our two summer semesters. Each is an impossibility since each lasts 6 weeks. How does one squeeze 16 weeks of writing instruction into 6 weeks? Not easily, to be sure, and perhaps not well. Our first strategy is to cut the course down to its bare bones: students write 4 papers instead of 6. Even then the pace is beyond relentless. Here’s a skeleton of our summer syllabus:
- 5/14 Introduction
- 5/16 Discuss Reading 1
- 5/21 Paper 1 Draft / Peer Revision
- 5/23 Paper 1 Final / Discuss Reading 2
- 5/28 Paper 2 Draft / Peer Revision
- 5/30 Paper 2 Final / Discuss Reading 3
- 6/4 Conferences / Midterm Portfolio
- 6/6 Paper 3 Draft / Peer Revision
- 6/11 Paper 3 Final / Discuss Reading 4
- 6/13 Paper 4 Draft / Peer Revision
- 6/18 Paper 4 Final / Discuss Final Portfolio
- 6/20 Final Portfolio

It’s bad enough that in a regular semester we have writing due every week, but in the summer we have it due every class. It’s always concerned me. Ideally, we should be using the whole summer, but there are institutional pressures here that involve complex politics of budgets, student progress, time to degree, and more. Does anyone else have such a crazy summer schedule?

Author
05-23-2013
06:35 AM
Since I’ve been on the subject of grammar, I wanted to end with some thoughts on handbooks. In doing so I am explicitly staking my claim to the article I’d like to write though I never will, the one that exists in that utopic space where I am not spending all my time running a writing program, where I have the luxury of research. When I do write it, it will be called “Whither Handbooks?” so you can’t have that title. When you decide that I’ve waited too long and write this article yourself, you will at least need to cite this blog post (Ha!).

This is my last post of the season and, as any publisher rep will tell you, I have a large soapbox when it comes to handbooks, so I will restrict my comments here to one simple observation: handbooks don’t work. The evidence? “Students can’t write” (they say). I won’t blame publishers. They strive constantly to build the better handbook. But the genre itself is so bloated and increasingly so foreign that I think we need something else. Each time a rep walks through my door, I look for that something else. I haven’t found it yet. (Maybe I will write it, off in that utopic space where I am not spending all my time running a writing program.)

But to continue a subtheme of this series of posts, perhaps the problem isn’t with students or with secondary education or with a texting generation or with No Child Left Behind or with apathetic students. Maybe the problem is with me. It may be (probably is) that my own bias seeps into my pedagogy. If I fail to help my students succeed with grammar skills, it may have less to do with them or with the tools and more to do with my attitudes and approaches. I’m teachable. Tell me… what works?

Author
05-16-2013
06:20 AM
Half a century ago, when Marshall McLuhan launched the modern era of media studies with his groundbreaking book The Gutenberg Galaxy, television, cinema, radio, and record players were the dominant electronic media, so it was only natural that McLuhan would focus on the shift he saw taking place at the time from a textual, or print-based, culture to an aural and image-based culture. Such a culture would mark a radical change in consciousness, McLuhan predicted, a departure from the logical form of thinking fostered by the linear structure of alphabetic writing, and a return of sorts to a more ancient oral/visual consciousness in what he called the “global village.”

With the rise of the Internet and related digital technologies, McLuhan’s predictions have been considerably complicated. This is due to the fact that while digital technology, too, is an electronic medium saturated with visual images and aural content, it has also brought the (digitally) printed word back into popular culture and consciousness. Indeed, in the earliest online progenitors of social networking sites—what we once called “chat rooms”—printed texts were all that appeared. And with the subsequent rise of the blogosphere in the latter part of the 1990s, not to mention the online posting of such traditional print media as newspapers, magazines (does anyone remember “zines”?), fiction (does anyone remember “hypertexts”?), and so on and so forth, the digital proliferation of print appeared to refute one of McLuhan’s most fundamental observations about media history. Indeed, things seemed to be going back to Gutenberg. With the obliteration of MySpace—which in its heyday tended to be plastered with visual imagery and aural content—at the hands of Facebook, which made sheer printed text the dominant feature on the screen “page,” this return of the font appeared to be even further accomplished. Sure, there was still YouTube, but Facebook was really getting much of the attention.

But while it is far too early to predict any eventual demise of Facebook (I certainly wouldn’t do that, but a number of technophilic pundits are doing just that right now), something new appears to be happening, a differential shift in the high-speed history of digital culture. And it is the appearance of a difference that signals the need for a semiotic analysis. This difference involves the emergence of such sites as Twitter (still print based but reduced to a kind of digital shorthand) and Tumblr, which, in its presentation of “microblogs,” is heavy on uploaded images and light on printed text. Facebook, too, has largely supplanted the much more text-based world of the original blogging site, while purchasing Instagram as if to hedge its bets in a suddenly image-rich Net-world. Add to this the continued popularity of YouTube and many other sites devoted to the uploading of an endless video stream contributed by a ubiquitous arsenal of iPhones, Droids, Galaxys, and whatnot, and you have a veritable tsunami of images: pixels, not fonts.

So maybe it is premature to rule McLuhan out. Perhaps the global village has arrived in the form of a global hive, a buzzing crowd of digitally connected Netizens who appear to be unable to let go for a few minutes to concentrate on an actual here-and-now as they hook up with a virtual elsewhere. Heaven knows what it will be like when Google Glass arrives. And maybe it is also premature to equate digital literacy with the literacy that college composition courses are tasked to teach and develop.

While the ability to make a video or post a tweet is indeed a kind of communication, and critical-thinking-based exposition is also a kind of communication, that does not make the two forms of communication equivalent. College composition courses exist to train students in what McLuhan identified with the linear, logical, and rational structure of discourse that emerged long ago with the advent, first, of alphabetic writing and, later, with the invention of the printing press. Digital technology appears to be developing a very different sort of thought process and a very different sort of writing. Judging between the two as the one or the other being “better” is not a very useful exercise. But when writing teachers from around the world speak of a “literacy crisis,” they are referring to an inability to think and write in a linear, logical (and grammar, too, is a form of logic) fashion. Given the way that digital technology is trending towards image and sound (that is, towards non-linear, alogical signs), it is not evident that this is a useful pathway to teaching the kind of critical-thinking-based writing that we are trying to teach.

Post Script: As I completed this blog, I came across the following Inside Higher Ed piece from John Warner. It provides a nice complement to my argument. http://www.insidehighered.com//blogs/just-visiting/about-facebook-home-ad

Author
05-15-2013
11:39 AM
In my efforts to address grammar in the classes I teach, I tried a new online tool this semester (offered by another publisher, so I will speak a bit generally here). I was suspicious from the start. The critical literature I know suggests that “skill and drill” doesn’t work, that students can identify errors in writing exercises but then make those same errors in their own writing. Still, I saw that what I was doing wasn’t enough, the publisher made good claims about the effectiveness of their system, and they offered to let me and my students try it for free.

I’ll admit I implemented it a bit lackadaisically. I put it in the syllabus, assigned it as one small portion of their grade, had the publisher rep come to class to explain it, and urged students to use it in my end comments on their papers. We’re one week from the end of the semester: 4 students have completed it, 3 are working through it, 3 signed up but did nothing, and 13 didn’t even sign up. Of those who did give it a try, their experiences have been telling, if anecdotal. More than one has told me something along the lines of “I was so focused on comma splices that I aced the online diagnostic, but I keep making the error.” I can’t say I am surprised.

I don’t want to dismiss this tool or the possibility of technology to assist in language and grammar skills. It may be that what’s needed is a shift not in the tools but in me. I’m not sure, but I’m having students reflect on this particular tool as part of their final assignment, so I will have more fodder for thought then. In the meantime… Grammar? FAIL.

Author
05-12-2013
05:46 PM
Soon after I started teaching, while still just a Graduate Teaching Assistant, I realized no one knows what I really do for a living. You’ve probably experienced it, too. It goes something like this:

“So what do you do for a living?”

“I’m an English teacher.”

“Uh-oh. I better watch my grammar!”

My standard reply is “It’s OK. I’m off the clock.” The more complete answer would be something like “Well, actually, grammar is the least of what I do. I teach students to think critically, to make connections between complex ideas, to express those ideas to others, and to do so in a way that conforms with the particular quirks of academic writing so that they can go on and succeed in whichever discipline they choose. I also try to teach them a set of adult skills, ranging from managing time, to completing tasks on time, to asking for help, to communicating when there are problems.” But I could say all of that and still, in my heart, I know that “they” expect me to be teaching grammar. Indeed, in some quarters “they” demand it (“Students can’t write!” they screech).

And so I try. In our program, more specifically, we teach students to recognize and track their specific patterns of error. If a student doesn’t understand how to use a semicolon, then that error is going to happen again and again. If they come to understand they have an issue with semicolons, then they can focus their attention on that one issue, master it, and solve it. With each paper I grade, I identify the prominent patterns, note them for the student, and ask them to track them for the next paper using an error checklist. The checklist asks them to list the error, to review it in the handbook, and to identify how they addressed it in the current paper.

It doesn’t work. Well, I should say, it only works when students choose to use this tool. More often, I get error checklists hurriedly scribbled before class. Errors are dutifully listed and checked, handbook pages are referenced, but the paper itself remains filled with just the same error. Frustrating. More frustrating are those students whose errors just don’t have a pattern—errors that are random, careless, syntactically complex. The tools I offer them seem woefully inadequate.

Grammar is not my job. It’s my bane. It’s hard enough getting students to invest in a course they’re forced to take, task enough to get them to care about something they’re systematically disinclined to like at all. To get them to care about grammar is a goal still outside my reach. Grammar? FAIL.

Author
05-02-2013
10:30 AM
A friend of mine from Australia emailed me the following link to a clip from the television series The Newsroom: http://www.safeshare.tv/w/UAGOcLSuLX . The clip’s origin is not identified in the link he provided, however, so it took me a moment to recognize Jeff Daniels as the impassioned panelist angrily denouncing the present condition of America while extolling its past. Instead, the clip is simply identified as "The most honest three and a half minutes of television, EVER." Is it, though? Well no, not exactly. Here’s why.

First of all, Jeff Daniels’ speech is an example of something very familiar in American history: the Jeremiad, which is a sermon or text that, like the Old Testament Book of Jeremiah, denounces the moral state of the present in comparison to some distant past. The New England Puritans were especially fond of Jeremiads, so it is only appropriate that one of their descendants, Robert Lowell, composed in his poem "For the Union Dead" a Jeremiad that is, for my money, the greatest in our history, as Lowell compares the flaccid materialism of 1960s America with the moral backbone of men like Robert Gould Shaw (made famous today by the movie Glory). But indeed, as Raymond Williams so brilliantly demonstrated in his book The Country and the City, people have always decried the present on behalf of some lost "golden age," an age that keeps receding into the past whenever you actually go look for it. What we praise today was once reviled in its own time.

The Newsroom’s Jeremiad is no different. Denouncing a floundering present state of America, Daniels’ character emotionally harkens back to an era of high moral purpose that never really existed (the U.S., for example, didn't enter the Second World War for moral reasons: we stayed out of it as Hitler gobbled up Europe, inaugurated the "final solution," and nearly knocked out England; we only got into it when Japan attacked Pearl Harbor and Germany declared war on us).

More interesting to me, however, is the ideological incoherence of the rant. That is, while the scene begins with a caricature of both a "liberal" (who is pointedly insulted for being a "loser") and a "conservative" (who is roundly refuted in his belief that Americans alone are "free"), in the end it isn’t clear just what ideology it represents. On the one hand, Daniels’ position appears to be “liberal” enough to admit that America isn’t the greatest country in the world anymore, but on the other hand, its nostalgic longing for things past (note the music rising and the tears forming in the eyes of the students as they capture it all on their iPhones) repeats the conservatively "exceptionalist" belief that there is something really special about America—that, in fact, we once were the greatest country on earth and we still ought to be.

Whether or not America was, is, or someday will be the greatest country on earth is not a problem for semiotics. What is a problem for semiotics is the question of just why television so commonly tries to have it both ways when it comes to ideology. I’m reminded here of an episode of Criminal Minds that presented the gun-loving militia movement as a haven for misfits and psychopaths, while at the same time attempting to elicit audience empathy towards it. The result is ideological incoherence.

Similarly, in this clip from The Newsroom, after the ideological left and right are dismissed, sheer nostalgia is substituted for a coherent politics, sometimes with a conservative slant (especially in the reference to the “men”—never the women—of the past), and sometimes with a vaguely liberal slant. Why this sort of thing is so common on TV is not hard to explain. Television programming exists to generate profits, and you don’t succeed at that by offending too large a swath of your potential audience. So you choose ideological incoherence: a position that isn’t really a position at all and so will not offend too many people. That’s good for the bottom line, but, no, it doesn’t really make for an honest political assessment.

Author
05-02-2013
08:30 AM
A prospective graduate student emailed me today to say he’s been avidly reading my posts here. I was surprised. Usually when I write these posts it feels like I am speaking into a vast and determinedly quiet void. (This fact is not helped by the amount of spam comments on Bits posts. As I write this there are over 100 spam comments in my junk folder, mostly selling (of all things) Ugg boots.)

It’s strange to think about blogging into a void, and I think I find it especially strange for me. It’s been 10 years since I declared 2003 The Year of the Blog (and at least 8 since I even looked at that piece. Kudos to me for coding a design that still reads well in a browser). So much of the early scholarship on blogging in relation to composition talked about student writing for real audiences, but as someone who’s been blogging here for years I have to admit I can’t imagine a more unreal audience. I have no idea who you are and, frankly, I’m not at all sure you’re even there.

Online writing has, of course, changed quite a bit in the intervening years. There’s the rise of Twitter and microblogging, of course, but I am thinking more about Facebook. When I write there I know precisely who my audience is because my profile is utterly locked down and because I have all my friends sorted into various lists. Knowing the exact audience makes a huge difference. It’s no wonder then that I keep assigning papers in the courses I teach. Students surely know that I am the audience, no matter what else we might pretend. I used to find that somehow problematic, but at this moment it feels enabling (though I don’t know that my students would agree): there’s something extremely useful about knowing your audience precisely.

So, void, how are you? Ugg boots, you say? Delightful.

Author
04-24-2013
10:30 AM
My partner lives in the South End, so I am up every other month or so, FLL to BOS. My editor lives in Boston, as do all the wonderful Bedford people I know. I’ve weathered Boston’s bitter cold and its sweltering heat, I’ve enjoyed its food and culture, and yes, I’ve walked the street where those bombs went off. Two essays from Emerging came to mind as the events of that day unfolded.

The first is from the first edition: Joan Didion’s “After Life.” Didion writes about the processes of grief and mourning, specifically around the sudden loss of her husband. She has a particular phrase that has always stuck with me: “the ordinary moment.” The ordinary moment is a reminder that abrupt and disruptive and tragic change doesn’t tend to announce itself; it tends instead to arrive in the most mundane of moments, in the ordinary course of ordinary days in the ordinary stream of life. For Didion, one moment she was cooking dinner (as she had any number of nights) and the next her husband was dead. For those watching the finish line of the Boston Marathon, it was the same. It was just an ordinary moment, a regular day, and in a flash that all changed.

The other essay I’ve been thinking about is in the second edition: Peter Singer’s “Visible Man: Ethics in a World Without Secrets.” It was especially on my mind since it was part of the set of assignments I taught this semester, focused on privacy and property in an age of social media. Singer opens with the Foucault-famous Panopticon, suggesting that technology has created a panoptic society of surveillance, where standards of privacy change rapidly. One remedy that Singer sees is “sousveillance,” surveillance from below, represented for Singer by the citizen taping of Rodney King’s beating. I think that, for Singer, these modes of watching are opposed (or at least counterbalanced). Sousveillance, through sites like Wikileaks, allows us to keep an eye on those keeping an eye on us. But on the day of the Marathon I was struck by how both surveillance and sousveillance can work together. As law enforcement tried to disentangle the day’s events, it was both surveillance cameras and every cell phone, camera, and video camera that made the difference: everyone was watching and, because of that, something was seen.

Other essays spring to mind, but these two are resonating for me today, in my shock and in my grief and in my hope.

Author
04-18-2013
09:30 AM
Call it the meme that isn't quite a meme yet. That's one of the interesting things about the new Brad Paisley/LL Cool J song that is all over the news, the Net, and Twitterland: look for it on YouTube and you will find lots of personal reactions to the song, but not a performance of the song itself—not, at least, as I write this blog. That's understandable; with so much advance publicity that no amount of money could buy, the copyright holders can be forgiven for wanting to get a chance to see some album sales first before free versions will be allowed on the world wide web. But the lyrics are out there, as well as some news clips of the song and its performers discussing it, and that will be enough for me to work with here.

As I cannot repeat often enough, a semiotic analysis must begin with the construction of a relevant system in which to situate the sign that you are interpreting. The construction of that system entails the identification not only of significant associations but also critical (one might say "diacritical") differences. In the case of "Accidental Racist," then, we can start with the system of popular music. Within this system a particular association immediately leaps out at us: "Accidental Racist" even explicitly draws attention to that association when the "white" voice in the song notes that his Confederate battle flag* t-shirt only means to signify that he is a Skynyrd fan. Yes, of course: there hasn't been this much fuss about the racial overtones of a pop song since Lynyrd Skynyrd's "Sweet Home Alabama." And the fact that so much attention is being paid to Paisley and Cool J almost forty years after Skynyrd's lucrative experiment in racial provocation is certainly a sign that race relations in America are still quite fraught.

But that doesn't take us very far. It is the differences that can reveal even more. In this case we can look at the differences in popular music genres. Skynyrd is a "rock" band ("Southern rock," to be more specific), while Paisley is a "country" singer, and Cool J is a rapper. Now, rock music was co-created by black and white performers (Chuck Berry and Carl Perkins are essential names in this history), so, even in the face of racist distinctions in the 1950s between white "rock-and-roll" and black "rhythm and blues," classic rock music does not have powerfully apparent racial identifications (even among Southern rock bands, groups like The Allman Brothers—the greatest of the bunch—were anything but segregationist: Lynyrd Skynyrd had gone out on a limb, and they knew it). But country music and rap do. As the Paisley side of "Accidental Racist" makes very clear, country music is, first and foremost (though not exclusively), the music of the white South (note the requirement that country music singers, male and female, must sing with some sort of southern twang). And rap music (now more commonly called "Hip Hop") is still regarded, in spite of its huge non-black audience, as the music of the black inner city, as is made clear by the LL Cool J portion of "Accidental Racist," which is filled with many stereotypical features of urban black popular culture.

And here, I think, is where the significance of the song lies. Paisley and Cool J know who their audiences are. They know their genres, and the symbols that belong to those genres. More importantly, they know their audiences' attachments to those symbols. The Confederate battle flag is one such symbol, and a significant portion of Paisley's audience is still quite attached to that symbol, even as (especially as) that symbol is being taken down (finally) from State Houses throughout the South. If he wants to keep his audience, Paisley can't come out and denounce the CBF (things haven't changed that much), so, instead, he is trying to change its meaning, turning it into a symbol of proud young southern manhood, not wanting to offend.** This is a lot different than the Lynyrd Skynyrd gambit. They knew perfectly well that they were waving a red flag, literally as well as figuratively, in the face of America with their prominent adoption of the Confederate battle flag. That was confrontation. Paisley is looking for negotiation.

And that's why there has been so much reaction to the song even before many of us have heard it performed in full. Because the question is whether or not the meaning of the CBF can be negotiated. Since the reaction so far has been more against LL Cool J's complicity in this negotiation ("If you don't judge my gold chains . . . I'll forget the iron chains") than against Paisley, the indications are precisely that, even in the light of what I am willing to grant as Paisley's and Cool J's good intentions (they state in interviews that they only want to open a healing racial dialog), there are some symbols whose histories, and thus their significance, can't be rewritten. If young southern white men want to display their pride, wearing the CBF is not going to be an uncontroversial way of doing so. Not today. Probably not ever.

*A great fuss is made by defenders of the public display of the Confederate flag over the fact that most such displays are of the Confederate battle flag, not the national flag. The distinction, presumably, is to mark the difference between the national symbol of the Confederacy, which stood for the defense of slavery, and the battle flag of the Confederate armies, which supposedly stood for valorous men simply doing their duty and defending their rights. Frankly, I don't buy it: the Confederate soldier (often a conscript, not so incidentally) was fighting for the Confederate nation, which was created in the defense of slavery, so the difference is meaningless in my book.

**There is an irony here. Brad Paisley is from West Virginia, which was created during the Civil War when the western, mostly non-slave-owning, counties of Virginia seceded from secessionist Virginia with the help of the Union army. He may be merely role playing in the song, but I can't help but wonder whether he and his fans are aware of the irony.

Author
04-18-2013
07:30 AM
We use midterm conferences in our writing program—one-on-one time between teachers and each student. Often these are moments to check in, see how things are going, warn a student in danger, or offer a student direction (or hope) for the rest of the semester. More often, these feel like cursory affairs—students are all too often just looking to get them done, and after sitting through 15 or 20 I think I am too. (Fatigue happens.)

This semester I’ve tried to reframe the conference as a consulting session. I told students that our meeting is their exclusive time with a writing expert who can offer whatever help or advice they need. I encouraged them to think about what they want out of the session—do they need me to comment on a draft? Work through some persistent problem of language? Devise a plan to get a specific grade by semester’s end?

I’m hoping my students take me up on the offer. I want the conferences to be productive, and I think the best way to make that happen is to make them very student-driven. I have mine all day Wednesday. I’ll try to follow up in a later post.

Author
04-10-2013
06:23 AM
[Note to self: sometimes it’s useful to write blog posts while grading!] Yes, I am still commenting on student work, and yes, again it’s bringing up issues that make me think. This time it’s about the gray. Sometimes I feel like students only see in black and white (or perhaps only want to see in black and white). But the world is gray, and I try to encourage them to get into the messy, muddled middle of that gray—because I feel that’s where the best critical thinking happens.

Most of the students in my course this semester aren’t there yet. That’s not surprising given that we’re only on the second paper. It’s also partially my fault. I always tell new teachers that bad assignments create bad papers, and I am regretting, slightly, some key wording in the prompt I used this time around: “write a paper in which you determine whether or not the benefits of living online outweigh the risks.” No wonder I am seeing few papers that seek some middle ground; the assignment wording practically begs them to take sides.

Still, there are promising moments. Some students have started moving to the gray by making arguments that say “Well, sure there are some benefits but here are the risks” or “There are so many risks but we can manage them if we do blah.” I think that’s the most exciting part of things for me at this moment: watching students begin to develop stronger critical thinking skills that we haven’t explicitly covered in class. It’s what I love most about teaching. Stand back sometimes and education (and growth) just happens.

Author
04-04-2013
06:55 AM
Certain decades in our history have a distinctive signature, an immediately recognizable identity through which we can contextualize, in a general way, a wide range of popular cultural phenomena. The Roaring Twenties (Fitzgerald’s Jazz Age), for example, the Fifties, the Sixties, the Seventies, and the Eighties all have their profiles, which, while certainly the result of a good deal of generalization, really do outline broad cultural trends. The Sixties really were an era of cultural disruption, even revolution, while the Seventies saw a cultural retreat from the brink amid a depoliticized popular culture characterized by such musical trends as country-inspired “soft rock,” “glam” music, and “disco,” with truck drivers, preppies, yuppies, and punks all emerging as popular cultural figures of note. The Eighties, for their part, really did bring in a harder edged note: an urban-oriented popular culture musically divided between rap and heavy metal, with street gangs and stock brokers becoming cultural icons.

And the Nineties? Well, in the beginning it looked like that would be the decade of “grunge” and a revival of the Sixties’ environmentally oriented counterculture. But all that pretty much collapsed as a mass cultural phenomenon by mid-decade, as a new, more Friday-casual business culture went pop, led by the dot.com explosion. But just ask anyone to describe the Nineties as a decade and see what you get. There simply isn’t a clear signature there.

And now that we are in the third year of the Twenty-Teens, has any signature for the Naughts emerged? Heck, we can’t even agree on what to call the first decade of the twenty-first century. We can point to a lot of historically profound events—the 9/11 attacks and the ensuing war on terror, wars in Afghanistan and Iraq, the advent of the Great Recession, and the election of America’s first black president—but none of this is reflected in any obvious way in the popular culture of the period (certainly not in the way that the Vietnam War touched the Sixties). Perhaps someone might nominate the Naughts as the Social Media decade, or perhaps, more precisely, the GoogleBook decade (FaceGoogle?). But Facebook, Google, et al. are media, just as television is a medium, and while television has profoundly affected every decade since its invention, we do not identify any decade as the TV decade. So while I think that a case could be made for the current popular cultural era being dominated by such media, I’m not sure that any particular signature emerges.

Which brings me to my point: for a long time now American popular culture has not been terribly susceptible to any clear decadal identity. I’m not sure why this should be the case, but I do think that it is worth thinking about semiotically. Is it due to the fact that our popular culture is so diverse, so divided into sub-categories and sub-sub-categories created in large part for the purposes of niche marketing? To put this another way, is it because we do not really share a common culture anymore, making no cultural trend large enough to provide a signature for its time? Or perhaps the causes are darker than that. Even a casual glance at the political culture in America today reveals deep and hostile divisions, not only between “Blue State” and “Red State” Americans (or MSNBC vs. Fox News Americans) but within political parties themselves (consider what pundits refer to as the “civil war” raging within the Republican party right now).

I’m really not sure what the answer is, but I am open to suggestions.

Author
04-03-2013
11:43 AM
[Note to self: don’t write blog posts while grading!] Yes, I am commenting on student papers, which brings up all sorts of issues for me (frustration is probably at the top of the list). I’m going to express that frustration as RTFC / RTFS, both of which are variations on RTFM, which you can google (or bing, though “bing” hasn’t yet achieved verb status) to decode.

More specifically, I am grading electronically using Word’s comment and track changes features. I love grading this way for oodles of reasons but for now I want to focus on just one: history. Having students’ previous papers allows me to open up the last paper as I look at the current one, reviewing my comments to see how they’ve progressed—well, more accurately, to see how they have not progressed. I’ve just finished another student paper in which the student didn’t seem to pay any attention to my comments on the last paper, which is to say that the student is making the same sort of mistakes this time around.

I’ve always intuitively known that students get papers back and lock onto that one letter at the end that represents the grade. I had hoped to get them to focus on the actual comments this time by doing a class exercise in which they read the comments on their last paper and summarized what they needed to do for their next paper. I thought that would help. It didn’t. RTFC! [Side note: it’s not just comments that students seem to avoid reading. I constantly get emails with questions like “What’s due next class?” when the answers are all sitting on the syllabus. RTFS!]

It’s frustrating on at least two levels. First, if students won’t read my comments they won’t know where to focus for the next paper. Second, if students won’t read my comments then why do I spend hours writing them? It’s probably this last which prompts my RTFC response. In any case, how do you get students to read, digest, understand, respond to your comments?

Author
03-27-2013
10:30 AM
I had planned on colon-titling this post “Write!” but as it turns out the correct subtitle is “When?” Specifically, “when” is the problem I faced in class this week, as in “when are students going to turn in this draft?” or “when are students going to show up for class today?” Both of these are suddenly pressing problems, although I can’t say if it’s because of our local institutional quirks or simply the way writing classes work. Hence, this post—part explication, part plea.

About half my students turned in a draft for this assignment. I’m wavering between two hypotheses. First, it’s midterm season. Technically, for us, it’s just after midterms since we just had our spring break (early, yes, I know). But there’s also a more general issue with this course. As a spring section of our first semester writing course, it tends to have a very high fail rate, in part because many students taking the course weren’t able to pass it in the fall. That’s the explication part.

Now the plea. Do you experience a kind of massive mid-semester slump in your courses? If so, how do you handle it? Is it par for the course? Or is there some way to offer a course correction (so to speak)?

I’ve made it very clear to my class this semester that writing courses are what I call “self-punishing systems.” Here’s the language from my syllabus to explain what I mean:

WARNING! Unlike other courses you may take, ENC 1101 is a process-based class. You don’t learn content that needs to be memorized for an exam. Instead, you learn the process of writing. Like any process, the more you practice the better you get. And like any process, the learning is hands on. That means that ENC 1101 is a self-punishing system. If you are late to class or absent or fail to complete a draft, I do not dock your grade. Instead, your grade will likely be lower simply because you are missing the information and practice of the process that we do each week in class. The majority of students fail ENC 1101/1102 because of absences but not because of the attendance policy. They fail because the more class you miss, the less you practice the process, the lower your grade.

Most of my students have taken this to heart, but still I feel I must be missing something to have so many stumble halfway to the finish line. Or is this just the way of things?

Author
03-21-2013
09:01 AM
In my last blog I sang the praises of the unexpected dividends of digitally based research. So I hope that this, and the fact that I write this column (web log, or “blog”) for Bedford Bits, will be sufficient evidence that I am hardly a purblind “Luddite” ignorant of, and hostile to, technology. Still, in this blog I want to sound a certain warning note, whose theme could be “balance is everything.” I am prompted to this theme both by the daily deluge of features in Inside Higher Education and The Chronicle of Higher Education devoted in one way or another to technology—from MOOCs to calls for the defunding of liberal arts education in public universities on behalf of STEM spending—and by my just reading Ian Morris’ book Why the West Rules—For Now. In fact, it is Morris’s book that has helped me clarify more effectively to myself just why the Humanities still matter.

Not that that is Morris’s thesis. In itself, Why the West Rules is a grand narrative, in the tradition of such books as Hans Zinsser’s Rats, Lice and History and (more particularly) Jared Diamond’s Guns, Germs, and Steel. Vigorously arguing for a kind of geographical determinism in history (“maps, not chaps,” as Morris says again and again), Why the West Rules implicitly suggests a certain sobering lesson for us Humanists: namely, that societies that, for whatever reasons, fail to look forward eventually succumb, with dismal results, to those that, for whatever reasons, succeed in looking forward. Thus, prehistoric agriculturalists overwhelmed foragers thousands of years ago, just as industrialized Europe overwhelmed the rest of the world some two centuries ago. Since it is quite obvious today that postindustrial technology is already the you’d-better-get-with-the-program zeitgeist (and there is something more than vaguely Hegelian in Morris’s book), those of us who cling to non-technological values would appear to be not only old-fashioned but quite frankly in the way. No wonder the governor of North Carolina wants to withdraw all state support for liberal arts education in his state’s public universities, and even the president of the United States appears to think that STEM and “education” are synonymous.

But there is something crucial that books like Why the West Rules have left out: the human (or, if you like, the moral) dimension in history. To give him credit, Morris is quite explicit on the fact that his quantitative assessment of history excludes moral judgment. From his own perspective he is neither applauding nor condemning what he calls the “shape of history”; he is simply describing it (the fact that his interpretation of history radically underestimates the role of sheer contingency—and critically contradicts itself on the basis of its own evidence—is beside the point). The point is that not only current historians but just about everyone else outside the Humanities seem to be adopting an essentially economistic/materialistic attitude towards social history. And that’s a problem.

Just to look at Morris: his assessment of history is based in a quantitative measurement scheme that he calls a “human development index,” or “social development,” for short. Focusing on “energy capture,” “urbanism,” “information processing,” and “war making capacity,” Morris measures 15,000 years of social history. And (guess what), by his measure the United States is currently at the top of the heap but in real danger of being overtaken by China. Since, about six hundred years ago, China was at the top of the heap but was eventually overtaken by an industrialized Europe, and, lacking its own industrial revolution (essentially a failure of forward-looking thinking), was conquered and humiliated by Europeans, it would appear to behoove contemporary Americans not to make that mistake. The fact that this is one of the take-away points of Morris’s book is demonstrated by the way that the CIA has consulted Morris to get his take on what the future is going to look like.

Now, I don’t want to get tangled up in the racial and cultural politics that books like Why the West Rules inevitably raise. What I want to bring up is what they don’t raise. And what they don’t raise is an index of what life is actually like for the mass of people living in any society. History shows plenty of examples of developmentally high-scoring societies in which life for the bulk of the population has been quite miserable. In fact, I would argue that contemporary China’s peculiar brand of totalitarian capitalism constitutes one such society. But if we ignore the qualitative conditions of life in a society, as so many economistic books do today, an inevitable conclusion might be that if it is to survive into the future, the United States, which built the world’s first mass middle-class society, would do well to get with the Chinese program, dismantle its middle class, and create a mass work force of impoverished workers led by a tiny elite of technologically savvy businessmen and women.

Does this sound outlandish? No: it is exactly what is happening in this country right now. Whether we are talking about the MOOC revolution that will, if it succeeds, turn middle-class academic careers into McDonald’s-class jobs for all but a handful of sage-on-the-screen superstars, or about the much more general demand that American workers not expect things like pensions and benefits, nor even job security, on behalf of corporate “competitiveness,” we are looking at the results of economistic materialism. If the index for the society as a whole is high, who cares about the lives of the mass of individuals?

I’ll tell you who cares: Humanists in the Humanities. It is Humanists who can, without demanding an end to technological society (after all, we already have a thriving Digital Humanities), provide the balance that social thinking and planning require if our society is going to be worth living in. That is a value that quantitative thinking neglects or even eschews. So, no, I do not feel myself to be behind the times or an impediment to anything in my loyalty to the Humanities. This is not backward thinking, not a fixation on the past; it is forward thinking based upon the critical thinking skills that enable us to assess everything in a total context, a context that goes beyond economic and monetary measures. Quality must balance quantity, and at the moment it appears that only Humanists are still saying this.