Bits Blog - Page 134

Author
03-20-2013
08:53 AM
So what was it like using one of the selections from e-Pages? Well, for starters, it's one of the few times I've ever had students "like" a reading. I won't claim that had anything to do with having the reading online; I think it had a lot more to do with the subject matter (as one student described it, "it's about that MySpace porn star") and the fact that the events of the essay took place in South Florida, not far from our school. About the usual number of students actually did the reading, with about the same mix of those using laptops in class and those who had printed the reading. For my students, it seems like it was just another reading. What I found far more interesting was my own reaction to the reading. I live and breathe technology. I've taught whole classes using just PDFs and my iPad. I grade electronically, too. But I just couldn't read this essay online. I had to print it out so that I could attack it properly: highlighting, underlining, annotating. What it revealed to me was a subtle hierarchy of print and screen in my world. Writing (including comments on student papers)? Screen. PDF? Screen (something about PDFs' scalability and the really good annotation tools out there). Other online text? Print. I didn't realize how divided my print and digital literate practices were nor why (it wasn't until I sat to write this that I realized why I'll read a PDF online but not an essay in HTML). Am I alone in this or do you have your own quirks in terms of what's done online and what's done on dead tree?

Author
03-13-2013
01:06 PM
So before we even got to discussing our reading from Emerging’s e-Pages in class, I ran into my first challenge: how do you cite an online selection in a print anthology? I’ve almost gotten the hang of the new MLA format for citations (emphasis on almost) but this is an entirely different beast—a truly hybrid one. The question was tricky enough for me to run it by my editor who, in turn, ran it by one of Bedford’s handbook editors, who finally suggested that it be cited with respect to its original format (thus, a short web piece would be cited as a short web piece, a video as a video, and so on). For the record, then, the citation for the Sabrina Rubin Erdely essay is: Erdely, Sabrina Rubin. “Kiki Kannibal: The Girl Who Played with Fire.” E-Pages for Emerging: Contemporary Readings for Writers. Bedford/St. Martin’s, n.d. Web. 25 Feb. 2013. Ugh. Complicated. But it also suggests the challenge of citation in the digital age: formats change faster than citation styles, and MLA and other organizations seem to be playing a never-ending game of catch-up. There are of course other citation systems. Has anyone jumped ship from MLA because of these sorts of issues? I’d be curious to hear what you’ve tried and how it works with unconventional digital content.

Author
03-07-2013
12:30 AM
Digital technology has absolutely revolutionized the possibilities for research when it comes to popular culture-related writing assignments. Not only can students find advertisements (print and video), television shows, music, movies, and, of course, scholarly and popular commentary online, but they can also benefit from the way that online sources can keep up with the pace of popular culture far more successfully than print can. Indeed, thanks to the possibilities of digital research, far more ambitious writing assignments for our students can be designed precisely because it is so much easier for them to get access to the material that they need, especially when it comes to semiotic analysis and interpretation. This is a rather obvious point, and might appear to hardly require an entire blog entry, but I want to discuss here one of the advantages of the Internet that I do not believe our students are sufficiently aware of: that is, the possibilities it offers for unguided exploration and unexpected discoveries. The standard model for student research appears to be something along the following lines: determine some relevant search terms for your topic, enter them into a search engine, and then pick out a few of the more promising-looking links that appear. Most of the time something from Wikipedia will turn up at or near the top of the page, and in the worst-case scenario the research effectively ends right there. But even the more responsible student is likely to remain confined within the terms of the key words used for the search, and this can very much limit one’s research, especially when one of the tasks of the assignment is to construct relevant systems, or contexts, within which a given popular cultural topic can be analyzed semiotically. Such systems require a very wide-ranging scope of knowledge and information, more wide-ranging than a narrow key word search will be able to reveal.
Here is where an alternative to the key word search-and-discover research model can be very helpful. In this alternative, which might be called (after Borges) a “garden of forking paths” approach, one’s research is approached as a kind of journey without an itinerary. You know that you are going someplace, but you don’t have anywhere specific in mind. Instead, you let the signposts along the way take you to the next signposts, which take you to further signposts, and to places that you never expected to be going at all. Let me give an example. As I have mentioned a number of times in my blogs here, one of the most important questions I ask myself as I analyze contemporary American culture is how we got to where we are now: a country that is radically at odds with the goals that were so widely embraced by American youth in the 1960s. You can’t simply enter a question like that into a search engine, and reading a book on the subject can lead to a narrowing of one's view of this massively overdetermined question, given the need of books to stick to specific arguments couched within specific disciplines. So I have been on a kind of research walkabout for some time now. I can’t remember precisely when it began, but I do recall reading Ann Charters’ anthology, The Sixties Reader, about a year ago, and while I found a lot of material there that I already knew about and expected to find, I also found a selection from an oddly titled book called Ringolevio, by a writer with the equally odd name of Emmett Grogan. Going online to find out who Emmett Grogan was revealed a lot of basic information about him personally, but also tipped me off to a group I knew vaguely about, the Haight-Ashbury Diggers. Searching for information on the Diggers took me to a website called Diggers.org, where I found not only information but a discussion forum frequented by a large variety of now middle-aged (and older) ex-Diggers, ex-Hippies, and not-so-ex-Hippies and Diggers.
Reading this forum has given me an enormous amount of primary source sociological information that I never expected to find at all. That site has tipped me off to some useful books to read, but it has also led me to further web pages where I have learned ever more about what, precisely, happened to a large number of people from the sixties who once tried to change society. I have discovered their pride, their disappointment, their continuing passion, and, yes, sometimes their paranoia. It has been a journey through America without my having to leave my desk. Just today I visited Diggers.org and looked at an update on how these sixty-somethings and seventy-somethings feel that the Occupy Wall Street movement is an extension of their own. So that got me searching for Occupy Wall Street information, which not only took me to specific OWS websites but also, to my surprise, to a website devoted to libertarianism, something that, on the face of it, could not be more opposed to the Diggers’ values or to those of the sixties. But wait: reading the libertarian page revealed to me that many conservatively identifying young libertarians have a lot in common with the Diggers (who evolved into what they themselves called the Free Families) in their insistence on complete freedom. This has brought young libertarians into conflict with such high-profile conservatives as Glenn Beck and Rush Limbaugh, who have been seeking libertarian support. It’s all very interesting, but my point is that I didn’t expect to be there at all. I’m still on walkabout, still gathering what I need to know to answer my questions. I’ve learned a heck of a lot along the way that I never expected to learn at all, because I didn’t even know it was there. Of course my project is far larger and more open-ended than a student paper assignment. But I want to make the point that research is a wandering as much as it is a homing in.
If you go at it with only one destination in mind, you’ll miss what you really need to know. The Internet can make that wandering both easy and quite fun.

Author
03-06-2013
07:51 AM
So I’m about to try something new this semester—new for me, new for the second edition of Emerging, and (I think) new for many other Bedford texts as well: e-Pages. E-Pages are readings that exist only online, like chapters of a textbook that live in virtual space but are nevertheless connected to the rest of the print edition. My students and I are about to read one of these selections, Sabrina Rubin Erdely’s “Kiki Kannibal: The Girl Who Played with Fire.” Why have e-Pages at all? For some of the selections we’ve included in the new edition, it’s simply necessary (such as the video from Wired.com); for others, hosting them online preserves rich features such as images and links (as with Erdely’s essay). I’ve already covered why there isn’t an entire “e” version of Emerging in a previous blog post; e-Pages is a start. This selection in particular is also doing double duty for us in class. We’ve been using Rachel Kadish’s “Who Is This Man and Why Is He Screaming?” and Peter Singer’s “Visible Man: Ethics in a World Without Secrets” to think about issues of privacy and property in the online world. Not only does Erdely’s piece advance this discussion, but reading it online itself raises questions of privacy and property, permissions and access, protections and openness. I’m not sure how it’s going to go. Reading online is, in my experience, different from reading print. But honestly I chose the piece because I wanted to give this e-Pages thing a good test drive. I’ll let you know how it goes.

Author
02-27-2013
07:02 AM
Intensive work helped. Seeing what arguments from fellow students looked like helped as well. But there are still a number of students struggling with the concept of argument, students who keep submitting to me statements of fact. I try to encourage them to go back and unlock the idea they were trying to communicate—I can usually see it—and I am hoping that approach will work. But the experience as a whole has helped me realize a few things. First, for the way I approach academic writing, arguments are pretty much central. I tell my class that a good argument will lay out the entire paper: you’ll know just what to write in each paragraph (and in just what order) because of the argument. That same argument also calls forth the evidence it needs; it tells students just where to go in the texts to find the quotations to support each little claim the argument wants to make. So, it’s not surprising students are struggling with this, but it’s also not surprising I am spending so much time on it (just as it’s not surprising I am ending up so frustrated). I can’t help but think, though, that there has to be a better way (for different ways exist aplenty, I know). Yesterday during our Writing Committee meeting one of my colleagues shared his approach, using a simplified Toulmin model and a worksheet for students. That might be the next thing for me to try, since it seems like he’s had some good success with it. We’re working on actual body paragraphs and evidence next but, before I leave argument, I have to ask a question that reflects the title of this series completely by accident: am I insane for this emphasis on argument? Am I missing something? Or is it just this hard to help students get a handle on academic argument?

Author
02-21-2013
08:31 AM
I realize that I may be accused of simply trying to attract attention to my blog with a title like this (no, really, after seeing a gun ownership debate spring up on Inside Higher Education of all places—see Nate Kreuter’s blog “On Guns in My Classroom” —I could swear that America’s passionate gun rights defenders spend their not-buying-guns-right-now time surfing the Internet for any possible opportunity to weigh in on the subject), but the topic really does call for semiotic analysis. It will not be my purpose to make an argument for my own position on the controversy (though that position is probably already clear enough), but, rather, to point out and interpret a significant new twist on this old debate. As I cannot repeat often enough in instructing students how to perform a semiotic analysis, it all begins with the apprehension of a significant difference. In the case of the debate over gun control, this difference lies in the way that gun rights are now defended when compared to defenses from the past. In the past, the key arguments for gun ownership included appeals to tradition (especially the rituals of male initiation rites whereby fathers and grandfathers passed on to their sons and grandsons the traditions of gun ownership as a kind of rite of passage—have a look at an Orvis catalog for an example of a continuation of this tradition) and to the needs of hunters and those living in rural areas. But while these arguments are still made, a whole new argument has entered the picture, and it is obscuring the traditional one. 
For in the current debate, the dominant arguments one hears include an insistence on the need to own guns in order to defend oneself against other gunslingers, and, in an even more extreme version, on the need to own guns in order to resist governmental authority (the worst instance of which, to such gun owners, is any attempt to confiscate their guns—which leads to an interestingly vicious circle: one needs guns to defend against the government taking away one’s guns, which requires more and more purchases of guns any time gun control is allowed to reenter the national discussion). Now this is a difference, and one with an enormous significance. It is one thing to argue on behalf of tradition, family continuity, and the formation of masculine character (though if I had my druthers, shooting animals wouldn’t be necessary for male initiation); but it is quite another to argue on behalf of personal defense and resistance to all forms of governmental authority. The question is: what does this difference signify? As I have explained before, in order to interpret a cultural sign, one must also situate it within a larger system within which its meaning can be determined. Differences alone, while necessary, are not sufficient for finding a meaning. And it is not at all difficult to find a relevant system within which the contemporary pro-gun argument can be situated: all we need to do is look at the manifold of signs betokening Americans' hostility towards each other, along with a concomitant movement of what were once regarded as fringe beliefs (especially forms of extreme libertarianism) to the political main stage. Back in the sixties, certain factions of the New Left armed themselves in the name of revolution; now it's what I'll call the "New Right" talking revolution.
And when the Republican party makes opposition to governmental authority a bedrock ideology, while practically any discussion about anything on the Internet can (and will) turn into a vicious dog fight ultimately reflecting the extreme divide between what can be roughly called Red and Blue Americans, it should not be so surprising to see guns defended as weapons to be used against other people and against the government. Indeed, I see the new terms for the defense of gun ownership rights as the most potent of a host of signifiers betokening a society that is falling apart. Call it social anomie, or alienation, or whatever you like, but when our own college students are being widely encouraged to bring guns onto campus, and we as educators are being forced to consider whether we want to be packing ourselves, and a fight breaks out on Inside Higher Education about the need to pack heat on campus, the moon, as Bruce Catton used to say, is in a new phase.

Author
02-20-2013
11:42 AM
Rough drafts of the argument for the first paper are starting to trickle in. They do not give me confidence. I had hoped that spending time just on the argument of a paper might help students get a foothold on academic writing. What could be more focused than a single statement? We spent last class discussing argument at length. We looked at all the things “argument” can mean as students worked in groups to discuss the difference between an argument between lovers, a scientific argument, a political argument, and an argument in a courtroom. We talked about what elements of each might make sense in academic writing. We talked about the difference between a statement of fact, a statement of opinion, and an argument. They even worked in groups to formulate sample arguments for this paper’s prompt. In short, I used every trick in my bag to help them understand the concept. [Side note: it amazes me that this is a concept that so needs to be taught at all. I feel the same way about apostrophes. Ask just about any teacher in the writing program here and they’ll all say students have no clue how to use apostrophes. That baffles me. You would think that in a culture obsessed with owning things the apostrophe would be treated with a hallowed reverence. You’d think, that is, that students would know, if nothing else, how to designate who owns what. I feel the same way about arguments. In a culture so divided and politicized you would think argument—just taking a strong stance—would be natural. You would think that the challenge would be not getting students to make an argument but getting them to moderate it, to take different points of view into consideration. You would think.] So far the drafts that have been emailed to me have reflected very little of what we did in class. Many are statements of fact. Almost all are so broad and vague as to be meaningless. It’s discouraging to me at this moment, though perhaps I am writing too soon. I’ll post next once I have them all.
For now, like any teacher, I am obsessing about today’s class—strategizing, ruminating, deliberating. I’ll have the students’ drafts to work with and I am hoping that will be the key. I often find that students only get a sense of a thing when they see what the thing should look like. Today we’ll start with sample work and then move into peer revision. Fingers crossed that that does the trick. Next post I’ll share how it goes and hopefully share some sample arguments from students. I’m hoping that sharing what they can and cannot yet do with all of you might give me new ideas to approach this topic.

Author
02-13-2013
11:05 AM
I’ve heard it said that one definition of insanity is doing the same thing over and over and expecting different results. It’s a definition I find particularly resonant this semester since I am teaching our first-semester writing course in the spring. Historically, our fail rates in the spring are around double what they are in the fall. Of course, that is on some level not all that surprising—many students taking the course in the spring do so because they didn’t pass it in the fall. In order to address this problem and (more importantly) in order to help these students, I am trying something completely new for me and for our program. Consider it my attempt to stop the insanity. Perhaps this new approach will offer new results. Normally, we have students write six papers in a semester. Relentless, I know. And baffling in a way. After all, we’re asking them to demonstrate from the very first paper the skills they’re not expected to have until the end of the semester. That is, we’re asking them to know how to write a paper even though we teach them how to do that throughout the course of the semester. Talk about insanity. This semester I thought I would mix things up by breaking apart a paper into its small constituent parts: argument, evidence, organization. I’m walking the students through each part before they assemble them all into a paper. I’m hoping that breaking the task down will have a number of benefits. First, I am hoping that the smaller pieces will be more digestible, that focusing on only one aspect of a paper will help students to get a handle on it in ways that working on an entire paper might not. Second, by breaking apart the first paper into pieces I am hoping to make the paper itself less intimidating: writing a whole paper might feel overwhelming but writing just one sentence (an argument) might feel perfectly manageable.
Third, I am hoping that students can begin to see how each piece engages a kind of critical thinking and that they might slowly hone their skills with such thinking part by part. There are more reasons to try this approach too, I think, but before I venture to lay them forth I’d like to report on how things go. I’m getting drafts of their argument for the first paper today so will report on progress soon. In the meantime, I’d love to hear your own approaches to teaching the academic expository paper. Where do you start?

Author
02-07-2013
07:11 AM
The fundamental principle of popular cultural semiotics is that everything can bear some sort of social significance. And that means everything. Yes, even running shoes. Take the current barefoot/minimalist running shoe fad (I'm not sure if it is sufficiently entrenched to be called a trend yet). The fad in question involves the explosively popular shoes that either look like some sort of foot glove (with a separate compartment for each toe), or which appear to be more or less conventional in looks but use much less cushioning and other material than ordinary running shoes. The former variety includes the Vibram FiveFingers shoe, while the latter includes the Merrell Sonic Glove Barefoot runner. The whole thing really got started with the publication of a book called Born to Run (2009), which focused on the barefoot running prowess of the Tarahumara Indians of Mexico, and you can learn a good deal about it all here. Now, on the face of it the barefoot running phenomenon would seem to have a purely functional significance, based upon the fact that the Tarahumara Indians are apparently able to run effortlessly for hundreds of miles barefoot and without injury. Furthermore, after decades of running shoe technology developments that have enhanced the cushioning and support of conventional running shoes, recent research suggests that all that cushioning and support causes runners to run in such a way that their heels strike the ground first in their running stride (this is called "heel striking"), and that this kind of stride puts great stress on the knee and ankle joints, causing injuries. The "barefoot" technology shoes, on the other hand, are designed to force a toe-striking stride, which may be a less injury-prone running style. But there's more to all this than simply the physiology of running and the functionality of shoes, for looked at semiotically these new shoes are signs as much as they are athletic gear. 
First, then, there is what we can call the "Noble Savage" angle. Since the eighteenth-century romanticization of the aboriginal peoples contacted in the age of European exploration, the "Noble Savage" has been an emblem, for such writers as Rousseau, of a prelapsarian innocence that European civilization has lost. Reflecting more of a European mythology than a human reality, the "Noble Savage" is a construct not unrelated to such gestures as Marie Antoinette's donning of simple peasant clothing. The Tarahumara Indians serve as "Noble Savages" in this sense, conferring upon running shoes their aura of a prelapsarian innocence. A corollary to the "Noble Savage" significance of barefoot running shoes is their "green" appeal. Using less material than a conventional running shoe, barefoot/minimalist runners would appear to use fewer resources and thus be more sustainable than the ordinary, beefed up variety. Now, I'm all for green technology, and I am no cheerleader for European-style civilization, but as a runner and a semiotician I know that there is something a little funny about all this. First of all, the "Noble Savage" bit has always been condescending, and it is no less so today with the barefoot running movement's use of the Tarahumara Indians. Living in primitive conditions for generations, they have developed the kind of hardened feet that can run without protection not because they are purer but because they have historically had no other options. They also run on natural trails without glass and nails and other foot-cutting stuff (this is why few "barefoot" runners in the U.S. actually run barefoot: minimalist running shoes are supposed to protect the foot from such things). One wonders how the Tarahumara would perform if they had modern running shoes to run with. 
Beyond this is the fact (which I know from personal experience) that there are barefoot running-specific injuries that occur when one strikes too far up on one's toes—which is what barefoot running shoes are designed to compel. Painful calf injuries often result, contradicting the claim that barefoot running is all good. As for the possible claim that minimalist running shoes are more ecologically friendly, well, not quite. Using less material, they wear out much faster than conventional running shoes, and must be replaced every few months if heavily used, leading to the consumption of more resources in the long run (pun intended), not fewer. And finally there is a particularly American consumerist angle to all this. For the fact is, as I know from my own experience, that you can discipline yourself to run in such a way that your foot strikes the ground in an optimum fashion without requiring any special sort of shoe. Indeed, the best balance for aging knees and ankles like mine is a nicely cushioned and supportive shoe combined with a foot strike that lands between the arch and the ball of the foot. I do not need to buy a special shoe to force me to run this way—indeed, a minimalist shoe would force me to strike too high up on my toes and really mess up my calf muscles in no time (I know: I've tried barefoot running). So here is the point: the barefoot/minimalist running shoe fad signifies within the context of a larger consumer system whereby Americans tend to prefer products and gimmicks that promise to do their work for them, rather than making an effort on their own. Whether it is the "toning shoe" (also originally based on a kind of back-to-nature claim) that claims to exercise your body even when you are not really exercising, or the endless weight loss programs and pills that promise slim bodies without effort or discomfort, Americans like to buy results in the consumer marketplace rather than work for them.
Purchasing an expensive barefoot running shoe (they are priced at a premium) rather than training yourself to run with a healthier stride is a part of this phenomenon. No one is really being more natural or green or aboriginal by choosing one shoe over another, and unless you have a nice smooth turf to run on, it isn't very healthy to run barefoot. The aura of naturalness and health associated with minimalist running shoes is a matter of image, not function, a sign rather than a substance.

Author
02-06-2013
11:43 AM
I continue this series of posts thinking about student responses to a first-day writing sample that asked “What is academic writing?” (broadly, of course, for IRB-related reasons). Though the sample size is small, I think these students nevertheless reveal some of what many students bring to our classrooms. None of the responses caused me undue concern. They included many of the elements that show up in my classroom: an attention to language, the importance of citation, and organization/form. The only thing that gave me pause in reading those trends in the responses was the singular emphases that seemed to reign: this student thought grammar was all-important while that student thought the key was having an introduction, thesis, and conclusion. Part of the challenge for me as I start this semester is showing students how they’re all right and all wrong, how to take what they already know about academic writing and combine it with new elements to make a more cohesive whole. On the upside, a few students talked about critical thinking (hurray!), and a few also talked about writing as a process (woohoo!). Overall, though no one student had a perfect view of the work that lies before us, each had at least a partial one. Coming together as a class, I am hoping that they can share the different pieces of the puzzle that each of them possesses. That’s how classes should work, I think. That’s what we’ve been touting since (at least) Kenneth Bruffee’s “Collaborative Learning and the ‘Conversation of Mankind.’” As a whole, the writing samples give me hope. Except for one. One student spoke a kind of truth that I think few students could articulate and that few have yet encountered. That student argued that academic writing was hard to define because, with each class he entered, the teacher told him that what he was taught before was wrong. It’s this student’s response that haunts me the most because I believe him. I believe he’s been told that class after class.
And I fear I might end up telling him the same, with all the best of intentions and from all my teaching and scholarly experience in the field. What a horribly confusing message for a student to have (and to have so determinedly, so fixedly, so repeatedly). His response says less about him as a writer and more about academia as an institution (and maybe about Comp/Rhet as a field). We have so many approaches as a discipline; we have so many pedagogies. And each teacher compounds those differences. If we can’t say for sure what academic writing is, then students like that one will always get confusing if not conflicting messages. My goal for this semester is the same as my goal every semester: help my students become the best thinkers and writers they can be. It is at once a modest and ambitious goal. Today, I am left hoping it is capacious enough for the student who’s always been told that what he already knows, what he was already taught, is just wrong. Therein lies the challenge for me this semester.

Author
01-30-2013
09:35 AM
In this series of posts, I’d like to think about student responses to a first-day writing sample that asked “What is academic writing?” (broadly, of course, for IRB-related reasons). Though the sample size is really quite small, I think these students nevertheless reveal some of what many students bring to our classrooms. One of the things that students bring, represented in many of the responses, is a particular understanding of the form of academic writing, an understanding created through No Child Left Behind (NCLB) or (more specifically) the FCAT, Florida’s mechanism for complying with that federal legislation. These responses were easy to spot because they emphasized not just the form of academic writing but a very specific form—and a very formulaic one. The students who presented this view of academic writing indicated that it has an introduction, a conclusion, and a thesis. I’m not sure how to feel about this grouping. The overall tripartite construction, broadly speaking, applies to the kind of writing we ask students to do in our writing courses. But I am concerned about how that preconceived notion of form might limit students and how it might even block them from writing well or writing at all. I don’t want to open the NCLB can of worms here, but I'm wondering about the experiences of other teachers. Do tests like the FCAT do anything at all to prepare students for your classroom? Are they a start? Or are they a hindrance? Do we build on what students learned in high school? Or do we tear it down?

Author
01-24-2013
03:30 PM
When advising students on how to go about choosing a movie for semiotic analysis, I always suggest that they have a look at those films that have been nominated for an Oscar Best Picture award. This is by no means a sure-fire route to finding a culturally significant movie for analysis (and, of course, every film is semiotically significant), but by definition any Oscar-nominated movie has attracted significant popular attention and is likely, accordingly, to offer a rich field for analysis. Such is certainly the case for this year's frontrunner in the Academy sweepstakes, Steven Spielberg's Lincoln. Indeed, if I were a wagering sort of person I'd be betting the farm on it to win right now, not only because it is a very well conceived, written, and directed film that displays some of the best acting in Hollywood history, but because it is a potent cultural sign as well. And it is Lincoln's status as a sign that I would like to look at now. To begin with, Lincoln is one of those movies that the members of the Academy love to award Oscars to. Quite aware of the poor reputation Hollywood has earned for mostly making action-packed blockbusters for adolescent audiences, Academy voters gratefully shower gold-plated statuettes on those films that aim at the higher end of cultural production. Historically themed movies do particularly well in this regard (think Lawrence of Arabia, Gandhi, and Shakespeare in Love—which boasted the added cachet of being about a high cultural literary icon), and Lincoln lies very much within this tradition of movies that polish up Hollywood's tarnished image. But beyond the significance of Lincoln's association with other historically themed movies there is the man himself.
Probably the only president left who can function as a national hero (Washington's and Jefferson's status as slave owners has much reduced their personal appeal, while both Kennedy and FDR have reputations for sexual license to live down), Lincoln is not only, as Edwin Stanton declared, a man for the ages, he is a man for the mainstream as well, and there aren't too many political leaders left like that. Not that Abraham Lincoln doesn't have his detractors. Neo-Confederates on the right continue to denounce the sixteenth president as the "tyrant" who "started" the Civil War, while New Left critics still complain that Lincoln wasn't sufficiently anti-slavery. But the fact that Spielberg's Lincoln is a popular success (91% on the Tomatometer), as well as an Academy juggernaut, is a sign that the American mainstream is still behind the great rail splitter. A signifier of the American dream as well as an exemplar of that which is most decent in the American character (not to mention within American democracy), Abraham Lincoln is a much-needed unifying figure at a time of rampant political polarization. This is ironic, of course, because it was his election to the presidency in 1860 that caused America (which was even more polarized then than it is today) to split in two. But that wasn't Lincoln's fault, and I personally am glad to see the popularity of this new movie about him. It was once said that if you wanted to write a bestseller, something about Abraham Lincoln's doctor's dog would do the trick, and it looks like Lincoln is still a subject of widespread admiration. That's good to know, because Abraham Lincoln is one national hero we can't afford to lose.

Author
01-23-2013
12:33 PM
Grammar was the most prominent (if somewhat disheartening) theme in students’ first day responses to the question “What is academic writing?” However, surprisingly, the second most mentioned feature was citation. That one really caught me by surprise. I guess I am so surprised because citation seems to be a particular Achilles heel for students. They seem to have little sense of what it is or when it’s needed. Given that citation is, I think, a kind of disciplinary “secret handshake,” a way of showing that you are a member of a particular discipline and belong there, it’s not all that surprising that first year students would know so little about citation. I’m just glad to know that it exists at all. In fact, that’s the approach I’ve adopted to teaching citation—starting by making sure students know it exists. I only teach my students three things about citation. I don’t “teach” them MLA citation (even though we use it in our class) because, first of all, students are going to end up in many different disciplines with many different citation systems. There’s a good chance they will never use MLA again. Besides (and secondly), citation systems change. Teaching the intricacies of one instantiation of one citation system will end up useless knowledge—if not the next semester then certainly some day. No. I tell students they only need to know three things about citation:
1. It exists. In class we discuss what this means. Basically, students need to understand that if they are using words or ideas from someone else there needs to be a citation.
2. If it’s not absolutely right, it’s wrong. In discussing this point, we continue part of the conversation from the first point: academic writing takes proper attribution very, very seriously. We generally open this up to a discussion of plagiarism: what it is, how it happens, what the consequences are, and how to avoid it. But this point also underscores the “secret handshake”-ness of citation. I think that citation is part of the process that David Bartholomae describes in “Inventing the University.” Using it, if not mastering it, is evidence that students are stepping into our language.
3. Know how to find the answer. I’ll admit it: with all the recent changes to MLA and with writing in a discipline that uses either MLA or APA depending on the place of publication, I have to look up citation formats all the time. If I don’t know every citation form by heart, why should my students? Instead, I know how to find the answers, and in class we tease out where those answers might be:
- A handbook or other reference book.
- A reliable web resource (Purdue OWL being the most popular, of course).
- Electronic citation tools like RefWorks or even Microsoft Word.
- A targeted Google search with some discerning assessment of the results.
- Asking me.
- Asking at the writing center.
- Following what other sources have done, ones they know are correct.
It doesn’t matter how students find the right answer, as long as they have a set of tools for finding that answer. I know my approach might be a bit weird (as I am) or a bit oversimplified (as I never am), but the more I use this approach (which requires constant reinforcement through the semester) the more I am convinced it’s a workable one. So, only three things to know about citation. Could that work in your classroom?

Author
01-16-2013
11:07 AM
Each semester, I administer a writing sample on the first day of class. It is completely superfluous. While other institutions might use a first day writing sample as a diagnostic to confirm or revise placement, my institution is barred by state law from any sort of remediation, so placement isn’t an issue. Still, the writing sample gives me a quick sense of where the class is as a whole, helps me quickly identify students who might need extra support, and provides an introduction to the theme of the class through a response to a quotation from our first reading. This year, instead of crafting a prompt from our first reading I decided instead to simply ask students, “What is academic writing?” I figured the responses would serve some of the same purposes. But it also seemed like a particularly appropriate question because this semester I am teaching the first course in our writing sequence, which most students take in the fall. I was hoping to reveal any hidden assumptions about the course students might have, particularly since this population tends to be especially at risk of failing (since most of them didn’t pass the course in the fall). The results of the sample weren’t particularly surprising, though they were revealing. I’d like to discuss them in this series of posts (broadly, of course, for IRB-related reasons). Though the sample size is really quite small, I think these students nevertheless reveal some of what many students bring to our classrooms. For example, the one thing mentioned most was grammar. Students think academic writing means writing with perfect (or at least good) grammar and writing skills. Part of me internally sighs at this assumption, since the focus of our course is on critical thinking. But in reading the responses I’ve started wondering if I’m wrong and the students are right.
After all, at least once a year there’s someone in our institution claiming that “students can’t write.” Increasingly, I am contacted by publishers and companies marketing a host of assessment tools, all of which strongly highlight an ability to track error across an entire writing program. And of course, there’s a growing discourse of “accountability” in legislatures and in the public sphere. Everyone, that is, is clamoring for grammar. It’s not that I don’t address writing and language issues in my classroom: our program teaches students to recognize, track, and correct their common patterns of error. But for me, correct writing is just writing, maybe even just writing. That it is correct doesn’t mean it does much of anything. And as I write that I realize that one of my own assumptions about academic writing is that it does something. For our program, that “something” has to do with academic discourse and argument through expository writing but I guess “doing something” is the basis of any rhetoric. I plan on addressing this assumption and unpacking it in class next week but I am wondering if you’ve encountered the same sort of (mis)understanding about academic writing. How do you address it? And what is academic writing for you?

Author
01-10-2013
01:17 PM
So it's January 1st and I have a Bits Blog deadline approaching. A perfect time to share my sense of renewal, my resolutions for the future, my desires for AD 2013. After all, this is one of the primary rituals of the American New Year's celebration (along with alcohol and football); why not just jump right in and share? Except that I never formulate New Year's resolutions. I never look to a new year as any different from the past year. I never experience a sense of renewal when the glittering ball drops. Alcohol and football are never on the program for me. In short, the rituals of circular time are cultural mythologies that I can analyze and teach, but do not experience or practice. If you are like most Americans, you will be concluding by now that I am a pessimist—something that is one of the unspoken, but most profoundly condemned, taboos of American culture. That is why I was so delighted to discover Barbara Ehrenreich's book Bright-Sided: How the Relentless Promotion of Positive Thinking Has Undermined America, and so pleased to be able to publish an excerpt from that book in the "American Paradox" chapter of the seventh edition of Signs of Life in the USA. For Ehrenreich's book has the courage to critically explore one of America's most pervasive mythologies: the belief that everything turns out for the best, that the future is always better than the past, that the American dream is an irrefutable reality. Your hackles may already be rising as I appear to be embarking on a critique here of that sort of optimism. But fear not: cultural semiotics analyzes beliefs; it doesn't sell any of its own. And all I mean to do right now is add something to the analysis of American optimism that is not covered in Ehrenreich's historico-cultural exploration. That is to say, the role of biology in the formation of an optimistic culture. I was prompted to this analysis by an article by Tali Sharot in The Washington Post. 
In this article, Sharot—a researcher in cognitive and brain science at University College London—explores the biological and evolutionary sources for human optimism. Among her insights is the fact that the price human beings paid for the ability to imagine future time was the consciousness of inevitable death, and that without some sort of cognitive compensation for that consciousness human life would not be endurable. That compensation appears to lie in an asymmetry in our brains, according to Sharot. On the one hand, we have the left inferior frontal gyrus, which tends to respond to good news very actively. On the other hand, we have the right inferior frontal gyrus, which seems to specialize in processing bad news; but, as Sharot notes, it doesn't "do a very good job" at this. What it all boils down to is that our brains eagerly latch onto good news and incorporate it into our memories and imagination, while bad news seems to slip past our cognitive radar. In other words, as a pessimist might put it, most human beings are hard-wired into denial. This capacity for denial is very good for enhancing one's potential to contribute to the gene pool; hence, it has been selected for over hundreds of thousands of years of evolution. And if we look at just a few centuries of American cultural evolution we can see how Sharot's biological analysis can help us understand a social phenomenon. Consider, then, that modern American society is indeed a culture of immigrants (the annihilation of the first peoples has effectively muted, if not destroyed, their contribution). Most (not all: African slaves, of course, were not voluntary immigrants) of these immigrants indeed undertook the risk of coming to America in the optimistic belief that things would be better in one way or another if they did.
Now, consider some four centuries of concentrating such optimists in the same place and you see how not only the gene pool but the culture itself will be heavily tilted towards bright-sidedness. Of course, evolution is never total, and neither is culture. Some of us seem to have a more active right inferior frontal gyrus than others. We're in the minority and are certain to remain that way because we are less likely to have children. And no, I don't.