Bits Blog - Page 40

Author
12-06-2012
01:11 PM
I confess that I am getting weary of seeing the Ayn Rand–inspired political meme, "makers vs. takers," still saturating the mass media. Attempting, apparently, to win by way of insult what they could not accomplish at the ballot box, Republican candidates and pundits alike seem to be consoling themselves for the stunning outcome of the 2012 elections by insisting that a bunch of nonproducing "takers" are taking over America. I'll leave to others the point-by-point critique of this inane meme, however, because what I want to do, in my feeble way, is introduce a substitute meme: Let's call it "cultures vs. vultures."

So, who are "cultures"? Cultures are people who believe in the common weal. They believe that a society is not simply an aggregation of autonomous units aggressively competing with each other in a zero-sum game in which every success must be won at the expense of a cloud of "losers." They are doctors and nurses and nurse's aides who save lives; teachers who educate; fire and police personnel who serve and protect; government employees who get us our driver's licenses, fix road potholes, erase graffiti, and so on. They are businesspeople who establish companies that employ people at a decent wage with benefits, and they are the people who work for them. They are writers and artists. They are women and men who take care of their families, join the PTO, coach soccer and baseball. They are… well, many of us.

And who are vultures? They are people who buy companies in order to eviscerate them by selling off their assets and firing their workforces. They are corporations who ship employment overseas to maximize the hundred-million-dollar compensation packages of their top executives while squeezing their diminished workforces. They are financial services companies that invent complex investment instruments and sell them to uncomprehending investors who thought that they were buying safe collateralized bonds. 
They are the bond-rating agencies that gave these bonds triple-A ratings. They are speculators in oil futures and short-selling stock investors for whom the stock exchange is a vast, global gambling house. They are the accounting firms who cook the books when they should be auditing publicly held companies. They are billionaires who think that they can purchase the entire American electoral system, and make the attempt to do so election after election. They are digital-technology entrepreneurs who want to eliminate America's universities on behalf of a tiny elite who will make the 1% look like a supermajority. In short, they are the kind of people that Ayn Rand–quoting types admire, and they are the people who have so damaged the world economy that perpetual crisis is the new normal.

Given the power of tidy soundbites in our meme-driven mediatized society, the substitution of a "cultures vs. vultures" formulation for a "makers vs. takers" one would not be trivial. It could change the nature of our political discourse. But I have no illusions. Though I am not a corporation, I am a person, and, in spite of the many claims for the democratic potential of the interactive Internet, ordinary people are not heard through our mass media. But perhaps I should change my blog moniker to Linseed Lohan?

Author
11-28-2012
12:30 PM
For this last post in this series, I want to consider the final part of the general education law as presented to our committee:

"Remaining courses (15 credits) to be determined by each university or college."

This provision prompted a colleague to say, “Now the local bloodbaths will begin.” And I imagine they will. The core is shrinking. Some department somewhere is going to lose 6 credits, 2 courses, and who knows how many students. I won’t predict a bloodbath, but only because I deeply believe in academic collegiality—call me an idealist if you will. My hope is that at my own institution, those final 15 credits will be decided through rigorous processes with multiple points of faculty input. In other words, I hope we will harness what James Surowiecki would call the “wisdom of crowds.”

However, I can predict one very local point of contention: Public Speaking. I know our School of Communication and Multimedia Studies is quite eager to see Public Speaking as part of the core and, frankly, I’m not at all opposed to having it there. Students do need to know how to speak effectively. And, after all, isn’t that where rhetoric really started? In the agora? And of course, my professional organization is called College Composition and Communication. We get to drop some course in our local bucket… shouldn’t Public Speaking be one?

It is, of course, more complex than that. For starters, we are in lean budget times and, as in any famine, humans start acting less than humane. As resources become scarce, we genetically hoard and scavenge—it’s the way our ancestors survived. So yes, I am thinking about the implications for our writing program if Public Speaking can replace our second semester writing course, ENC 1102. I want to support my colleagues and I want students to learn how to “communicate effectively” in multiple modes. 
I also want to continue to support our graduate students and I also want to make sure that students have as much practice with writing as we can give them. Fortunately, there are advantages to the Byzantine structures of our state bureaucracy. Specifically, our “ace in the hole” is (I hate to sound so coldly calculating and strategic) FAC Rule 6A-10.030. That’s part of the Florida Administrative Code, the set of rules that governs public education; locally, it’s called the Gordon Rule. The rule states that students must take four writing classes, two of which must be offered by the English Department. Now, HB 7135, the state law that started this whole messy process, has more weight than anything in the FAC—law trumps code. However, the Communication Committee did reaffirm the Gordon Rule and, for now, it exists. Its existence means, of course, that even with Public Speaking in the core, students will still need to take another English course. It also means that if both ENC 1102 and public speaking are dropped in our local bucket, students are more likely to take ENC 1102—not just to knock out both the core and the Gordon Rule requirements in one fell swoop, but also because (as any speech teacher will confirm) students are terrified of speaking in public—more terrified than they are of writing.

Darn. This is not the post I intended. I had hoped to talk about how to avoid a bloodbath. I think I’ve ended up sounding like Sun Tzu. So reader, let me crowdsource this to you: Public Speaking or more writing? Ideally, both, of course. But if you were forced to put one and only one in the bucket, which would it be, and why?

Author
11-15-2012
08:03 AM
As I contemplate the results of the November 6 election, I find I am absolutely astonished: My ballot turned out to be virtually identical to the overall results of California. Good gracious, even Proposition 30, the temporary tax increase designed to stave off absolute financial disaster for public education, passed. I mean, Californians voting for a tax increase? How did that happen? Well, you couldn't have guessed it from reading the oft-maligned "liberal media"—nor from reading the responses to many online news stories involving Californian or national politics. For in spite of the common right-wing canard that the media have a "liberal bias" (as if the Fox News network wasn't a member of the mass media), and ever since Sarah Palin and the Tea Party burst onto the national stage, the media have provided such heavy coverage of conservative movements that it has been difficult to detect any non-conservative pulse in the American electorate. Even the Los Angeles Times, traditionally a moderately liberal news organization, has served as a conservative crystal ball recently, reporting that support for Proposition 30 had fallen drastically below 50% on the eve of the election (it passed by a 54% to 46% margin—which the Times reluctantly described as "appear[ing] to squeak by with a victory"). And the comments from readers to the Times on the issue prior to the election itself seem to all have demanded that every teacher and professor in the state be fired in lieu of any tax increase. No, the outcome of this election was not at all apparent in either the traditional or digital media. Which takes me to my point. While I would not go so far as to describe myself as hopeful, it is nice to see that the power of the mass media (a power that I regard as indisputable) is not yet absolute. 
Even with the anti-Prop 30 forces outspending their opponents by twenty million dollars or so (with a lot of that coming from mysterious out-of-state forces), Prop 30 passed; and Prop 32 (which would have ended the ability of labor unions to make election campaign contributions, and which the same mysterious out-of-state forces spent a fortune to support) failed. In short, a whole lot of television time went to naught. One takeaway that I am fairly confident of is that when it comes to the comments sections of online newspapers and other Internet news sources, conservative responders appear to outnumber liberal or moderate ones. Perhaps the liberals are on Twitter (a real possibility), or Facebook, but if you went by the election news story responses on Yahoo! during the primaries, you would have sworn that Ron Paul was a shoo-in for the Republican nomination. So, we have to be careful when trying to use the mass media as a bowl of tea leaves to predict American electoral behavior. The last four years have presented disproportionate coverage of Tea Party activity, which has granted that movement a lot of real power (the media can create their own realities sometimes), but this election's results reveal that it isn't all Tea in America. There's a lot more going on that just isn't as sensational, and thus doesn't invite those click-the-links that make money for digital media. Is there a new, moderate, "silent majority" out there? Let us hope so.

Author
10-31-2012
01:33 PM
I never cease to be amazed by the number of my colleagues who exhibit little to no awareness of FERPA (the Family Educational Rights and Privacy Act), which is the educational equivalent of HIPAA (the Health Insurance Portability and Accountability Act). Both pieces of federal legislation mandate absolute privacy when it comes to information, whether pertaining to health (HIPAA) or, more relevantly, student grades (FERPA). It wasn’t all that long ago that I could walk through the halls of our department and see boxes of graded student papers outside the doors of my colleagues’ offices (yikes!). I understand FERPA, and I celebrate it. I also detest it.

The problem is that I grade student work electronically using my word processor’s Track Changes and Comment features—good for the environment (well, good for trees anyway) and good for my sanity and health (for me, typing doesn’t produce the kind of repetitive stress that writing does). Actually, electronic grading isn’t the problem. Returning electronically graded student work is. "FERPA-ly" speaking, e-mail is not a secure medium; someone could intercept the e-mail or a roommate could see it on the student’s computer, revealing the grade and breaking the law. So, returning graded student work by e-mail is technically illegal (well… let me say “non-FERPA compliant,” instead).

Blackboard and other course management systems are okay (or FERPA-compliant, if you will) since they are considered “secure” environments. But Blackboard is a royal pain in the ass and always seems to be, technologically speaking, about five years behind the curve. Returning one student paper through Blackboard can take me as many as five mouse clicks. That doesn’t sound like a lot, but add in Blackboard’s slow response time and suddenly returning student work takes almost as long as grading it (not really, but that’s how it feels). 
I’ve tried using Dropbox, but that involves getting each student to download and install Dropbox and then create and share folders. Besides, when I did try it I discovered it’s eerily panoptic. I get a little pop-up whenever the student puts anything in the folder; they get one when I do the same. It’s like we’re always watching each other or, what’s worse, always acting as though we’re being watched. Of course, I could print the papers but that defeats much of the purpose of electronic grading. What to do? Dream. In my dream, there is what I call the “FERPA-fied student locker.” The interface is simple: Dropbox simple. Each student signs up for an account in the locker with a code to add them to my class. When I sign in, I see a simple list of folders, one per student. To return work, I just drag and drop the graded file into the appropriate folder, where it is encrypted and stored in the student’s online locker. That’s what Web 2.0 is, folks—not just leveraging the “wisdom of crowds” through crowdsourcing but also Web applications that feel like a desktop environment. Drag and drop, drag and drop. Let me say it one more time because I love it and want it and need it—drag and drop. That’s all I want. No discussion boards. No online peer revision. No electronic grading. No assessment tools. And no, not that other thing either. Just this. Does the FERPA-fied student locker exist? No. Can it? Yes. “We have the technology. We can make [it] better than [it] was. Better...stronger...faster” (and I’m fairly certain it won’t cost six million dollars).
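For what it's worth, the core of the dreamed-of locker—drop a graded file into a student's folder, have it encrypted and stored there—is only a few lines of code. The sketch below is purely hypothetical, not a description of any existing product: the names (`make_locker`, `return_paper`, `read_paper`) are invented for illustration, and it assumes the third-party `cryptography` package for the encryption step.

```python
# Hypothetical sketch of the "FERPA-fied student locker" workflow:
# each student gets a folder with its own symmetric key; returning a
# graded paper means encrypting it into that folder.
from pathlib import Path
from cryptography.fernet import Fernet


def make_locker(root: Path, student: str) -> Path:
    """Create a per-student folder with its own symmetric key."""
    folder = root / student
    folder.mkdir(parents=True, exist_ok=True)
    key_file = folder / ".key"
    if not key_file.exists():
        key_file.write_bytes(Fernet.generate_key())
    return folder


def return_paper(root: Path, student: str, paper: bytes, name: str) -> Path:
    """Teacher side: encrypt a graded paper into the student's locker."""
    folder = make_locker(root, student)
    cipher = Fernet((folder / ".key").read_bytes())
    out = folder / (name + ".enc")
    out.write_bytes(cipher.encrypt(paper))
    return out


def read_paper(root: Path, student: str, name: str) -> bytes:
    """Student side: decrypt a returned paper with the locker key."""
    folder = root / student
    cipher = Fernet((folder / ".key").read_bytes())
    return cipher.decrypt((folder / (name + ".enc")).read_bytes())
```

Of course, a real FERPA-compliant service would keep the keys on a secure server behind student authentication; storing the key next to the ciphertext, as this toy version does for brevity, offers no actual protection.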

Author
10-24-2012
11:39 AM
My foundations in the eLearning discipline lie in the intersection of computers and writing. It was the central concern of my dissertation and the subject of my early scholarship. Though my research interests have shifted to questions of writing program administration, I continue to think about technology, in part because I am more nerd than academic.

One might find it curious, then, to discover how little technology we use in the writing program I direct. There are some logistical reasons for this, primarily connected to budgetary concerns, but there are some pedagogical concerns as well. Anyone who has taught a course online will testify to its remarkable ability to consume time. One quick example: it takes 50 minutes to listen to and guide a classroom discussion; it takes many more to read through that discussion online and respond accordingly.

At the same time, eLearning is the trending buzzword at our school. The administration has even created the Center for eLearning despite drastic slashes in our budget.

Personally, I’m not ready to endorse a fully online writing class. I know such a beast exists, and I know that many teachers and schools find ways to make it happen successfully. I don’t see how it could work given our students, our resources, and our pedagogy. A hybrid course, however, has some appeal—though I must confess that the appeal is logistical, strategic, and perhaps even Machiavellian: hybrid courses would allow us to double the number of Tuesday/Thursday sections, the most popular class time for a student population that values extended weekends.

I’d love to develop that hybrid course. I’ve fully intended to do so for at least three years now. 
But I’ve discovered that being a writing program administrator is 40 percent being a firefighter, 40 percent being a police officer (there are many policies that need to be enforced and many disputes that need to be mediated), 10 percent cheerleading (particularly in lean budget times), and 10 percent existing despite exhaustion.

So I ask you, dear readers, what are your thoughts on hybrid writing courses? Have you taught one? What are some key ingredients to making them work? What tools do you use?

Author
10-17-2012
07:55 AM
This is the post I’ve been avoiding. I’ve been avoiding it because, simply, I’m tired of the election, frightened by it, sick of it, overwhelmed by it, have been driven to the brink of paranoia around it. It’s not that I’m apathetic (far from it); it’s just that I’m done. In fact, I end up avoiding almost all political news (which drives my hubby crazy since politics is his hobby/passion/addiction, one exacerbated by living in Boston and listening to talk radio). I won’t use this forum as my soapbox, though I will say I envy those of you who get to look at the issues, who have the exorbitant luxury of considering where candidate X stands on jobs or taxes or education or Syria or the national debt or any other issue. For me, every election is a single-issue election. As a queer, I need answer only one question: which candidate gives me the best chance of existing for another four years?

Despite my personal aversion to any discussion of the looming election, it’s no doubt something that can (maybe should) be taught in the FYC classroom. For me, though, it’s not about advocating for whichever left-ish or right-ish or middle-ish position you think is “correct” or “just” or “true.” For me, teaching the election has everything to do with helping students to see that polarization is a central problem—one that everyone needs to address. It’s a lesson I learned for myself on a recent power walk through my neighborhood in Wilton Manors, which statistically has the second highest per capita gay population in the country or, in other terms, has “1270% more gay men per capita than the national average,” which is to say that I live in one of the gayest places in the country if not on earth. As I jammed to my Glee playlist (OMG, Blaine’s rendition of “It’s Time”) and upped it to a pace of about 13’31” per mile I passed a house with a sign for a candidate I do not support. At first, I was repulsed: “Seriously? Seriously? Do you know where you live? 
How dare you?” But then I caught myself and realized that not only could I tolerate that adversarial difference—I could celebrate it. To translate my epiphany into the classroom, I would turn to “Making Conversation” and “The Primacy of Practice,” Kwame Anthony Appiah’s contributions in Emerging. If anyone were to ask me which essay was central to Emerging, which reading somehow embodied the spirit of the text, I would probably point to Appiah. His argument isn’t a kumbaya-like “we should all get along / teach the world to sing in perfect harmony” utopianism (though students are often tempted to read him this way). Rather, Appiah perfectly understands that “cosmopolitanism” is, as he puts it, “the name not of the solution but of the challenge.” That is, we no longer have the option of living like separatists; the world is all at once too crowded, too mobile, and too interconnected. We thus have only two options: find a way to live with those who are different than us or destroy ourselves (I’m paraphrasing with some exaggeration here to make my point, but not by much). That’s what strikes me with this election and that is what I would bring into the classroom: how do we get along, here in this place, now at this time? The answer isn’t about convincing the other side how wrong they are nor is it about winning or losing, triumphing or decimating. As he goes on to explore in “The Primacy of Practice,” it’s not about this or that set of values (since people with the same values can fight as easily as those with different values); rather, it’s about getting used to difference. His essay is a great way to get students thinking about that process and, perhaps, practicing it as well. Much needed, I’d say. Because, believe me, there’s no better opportunity to get used to difference than this election. All it takes, really, is a walk through the neighborhood.

Author
10-10-2012
01:09 PM
A fundamental axiom in my philosophy of writing program administration is that institutions are, by definition, crystal-latticed structures designed to hold contradictions together in close proximity. Take, for example, a simple question that on the surface would seem to have an obvious answer: Who gets to teach writing? At my institution, we continue to hunker down through a perfect storm worse than any category 5 hurricane: SACS reaccreditation, program review, a fairly new and tentative administration, assessment, strategic planning, a new state law reducing the credits in the statewide general education/core curriculum, and budget cuts. It’s fascinating to see so many narratives about the university revisited and revised all at once; at the same time it’s inevitable (and simultaneously frustrating and amusing) to see how this storm churns up paradoxical discourses that long ago rested quietly in the sediment of the institution. For example, as part of our college’s budget cuts, our dean jettisoned our business writing course, since it primarily served students in the College of Business (CoB). We lost not only those students but also nine instructor lines (ouch). Recently, unaware of the fact that the CoB was transferring our course (ENC 3213) into their equivalent course (GEB 3213), an associate provost called a meeting between me and my dean and the folks from CoB. Her concern was “credentializing.” (Our school is hewing to a strict interpretation of the “18-credit-hour” rule in our accrediting body, which states that in order to teach a subject the teacher must have at least 18 graduate credit hours in the subject.) She explained that while it’s clear how someone who has taken a graduate course in Shakespeare could teach our writing course, it seemed to her a bit of a leap to claim that this same person could also teach business writing. Consider this theorem 1: only people with graduate training in business writing should teach business writing. 
I explained, first, that anyone working within academia is a business writer. I spend all day composing reports, proposals, memos, and e-mails, and I got my job only because of my skills in writing both a cover letter and resume. I then pointed out the logical extension of her argument. Consider that theorem 2: if a teacher needs graduate training in a discipline to teach a course in that discipline, then (I explained) there were only three people at our entire school who could teach our FYC courses—our three tenure-track Composition/Rhetoric faculty. (Curiously, that does not include me. Though my dissertation is in Comp/Rhet and though I worked with some rather significant figures in the field, my doctoral program did not have a Comp/Rhet track, thus my coursework would not qualify me to teach in the program I administer [cf. axiom 1]). Though unqualified to teach rhetoric, it would seem I am at least an adequate rhetorician, since the associate provost then conceded that perhaps people in English could teach the course—with sufficient orientation. (Though I suspect the logistical nightmare of replacing some 100 teachers of FYC courses had a more persuasive effect than the argument that academics are business writers). Complicating matters more, our university has a vigorous writing across the curriculum (WAC) program whose base philosophy is that anyone in any discipline (with some training) can teach writing—we even have a second-semester FYC writing course offered in chemistry. Thus, theorem 3: writing happens in all disciplines and thus responsibility for writing should be distributed across the university.

Let’s review the emerging geometry:

Axiom: Institutions are designed to hold contradictions in close proximity.
Theorem 1: Only people with training in a discipline can teach in that discipline.
Theorem 2: Thus only people with graduate training in Comp/Rhet can teach writing.
Theorem 3: Writing happens in all disciplines; thus, anyone in any discipline can teach writing.

Maddening. Fascinating. As we continue to weather out this storm, I keep encouraging our chair to “embrace the illogic” of the institution. From my perspective, since any institution holds mutually exclusive positions simultaneously, getting out of any one jam simply means shifting one’s rhetorical position in the latticework. But there are larger questions embedded here. Who does get to teach writing? At our school writing is embedded in the English department. Where does it reside in your school—and why? As Comp/Rhet continues to mature as a discipline, how much longer can English (or communications, even) contain it? How can WAC/WID survive the twin forks of assessment (too often code for accountability) and accreditation?

Author
10-04-2012
02:07 PM
In my seminars on popular culture, my students make a class presentation on a popular cultural topic of their choice that forms the basis for their research papers. One requirement for their presentations is to explain both what their topic is and why they chose it. Over the years, this apparently basic task has proven to be more challenging than it appears, so I now offer to students my own reasons for choosing the topics that I write about as a cultural semiotician. The first point I raise is that selecting a topic for a researched semiotic interpretation should not be a random act. As in a scientific research program, the choice of the topic begins with a need to answer a significant question. The researcher does not yet know the answer, but may have some educated guesses that the project is intended to test. Thus, the choice of a topic represents not only an interest in a question or problem but also a general sense of where to go with it. It's particularly important to choose a topic not simply because a student "likes" it. In the semiotic interpretation of popular culture this is a particular hazard, because affection for one's topic can result in an inability to maintain the objective stance necessary to conduct an analysis. It is possible, of course, to have a fondness for one's topic and be able to think critically about it (I have a lifelong affection for The Lord of the Rings trilogy, for example, but that does not prevent me from seeing the rather serious political problems with that lovely fantasy), but it's not easy for those who are first learning about popular cultural semiotics to be able to do so. What often results is a paper that is more like a press release or puff piece. As an illustration of my own reasons for choosing certain topics for semiotic analysis, I pose the following question: why have I returned, on a number of occasions here in my Bits blog, to the popular culture of the 1960s and its historical aftermath? 
I can assure you that it is not out of an affection for the events of the sixties—quite the contrary. Rather, as I look at the increasingly corporatist/hypercapitalist trajectory of American society (a trajectory that has been uninterrupted no matter which political party holds the White House), I ask myself, quite simply, "what happened?" How did a generation (the largest of its kind in American history) whose cultural ethos profoundly challenged the corporate/capitalist "Establishment" end up participating in the construction of an America that is now making the 1950s look progressive? As I have stated before, there is no single answer to such a question, but the pursuit of answers, I believe, is crucial to any attempt to reverse the trend. You can't stop something if you do not understand what you are trying to stop. So, I will be returning here to what I regard to be one of the most important cultural questions of our time, the question of how a mass cultural challenge to one sort of society became an embrace, of how a decade of sociocultural exploration and experimentation rapidly transitioned to four decades of corporatist entrenchment whereby universities became "brands," students became "consumers," and scholars became "entrepreneurs." Four decades in which the collectivist spirit of sixties youth culture has retreated before a wave of libertarianism. Four decades in which the grandest social accomplishments of the thirties and forties (Social Security, Medicare, unionization) are being unraveled by the children and grandchildren of the generations that fashioned them. My thinking so far about this historical conundrum has taken me to many places. There is always something more to consider and some of it is very sensitive stuff. But it is the road that I am now traveling in my popular cultural analyses, simply because I think it is the most significant social phenomenon of our times.

Author
09-26-2012
01:43 PM
Every year we offer a “standard” sequence of assignments for our teachers. Returning teachers are invited to use or adapt it; new teachers use it as they become familiar with our program, the course, and writing their own assignments. We test the sequence in the summer and, at the same time, gather sample student work that we use during our orientation. I thought I would offer this year’s sequence, which we titled “Rights and Bytes: The Technology of Civil Rights,” as a model for how we put assignments together. Feel free to use it or adapt it as needed.

Paper 1: Yoshino

At the end of “Preface” and “The New Civil Rights,” Kenji Yoshino suggests that ultimately the law will play only a partial role in the evolution of a “new” civil rights, one based on the value of authenticity and the common denominators of being human. Write a paper in which you extend Yoshino’s argument by identifying other key areas of society that must play a role in the creation of a new civil rights.

Questions for Exploration: According to Yoshino, why does an exclusive focus on law limit civil rights? What role must conversation play? What’s the difference between civil rights and human rights? How can we make the transition from one to the other? Does covering prevent the evolution of civil rights? What social factors might change covering: peer pressure? popular culture? What’s necessary for a person to achieve authenticity? How might economics, culture, or even religion function in Yoshino’s vision?

Paper 2: Poe and Yoshino

In “Preface” and “The New Civil Rights,” Kenji Yoshino makes his arguments with little reference to or awareness of technology. However, as Marshall Poe makes abundantly clear in “The Hive,” technologies such as Wikipedia are growing rapidly and, more crucially, are becoming an increasingly important facet of our lives. Use Poe’s discussion of Wikipedia to complicate Yoshino’s argument by writing a paper in which you assess the potential of technology to improve civil rights. 
Questions for Exploration: Is Wikipedia’s bottom-up model an analogue to Yoshino’s emphasis on conversation as a mechanism of social change? Does the relative anonymity of Wikipedia impede civil rights by promoting covering? How can we harness the collaborative power of a project like Wikipedia for social change? Is the current model of civil rights a cathedral or a bazaar? What might/should a new civil rights look like in these terms? Given the pace of change on Wikipedia (and technology in general), is it realistic to expect it to play a role in slower processes, such as political and legal ones?

Paper 3: Friedman, Poe, Yoshino

So far we’ve considered the relationship between rights and technology in a fairly local context—the United States. However, as Thomas Friedman makes clear in “The Dell Theory of Conflict Prevention,” it’s increasingly difficult to think about anything in a local context as the world and its economies become more and more interconnected. After all, Dell is as tied to its supply chain companies as they are to Dell, and so too the countries involved. Using ideas from all three authors, write a paper in which you evaluate the possibility of universal human rights.

Questions for Exploration: Despite past efforts, can we ever achieve a universal set of human rights? Which model of civil rights might help with that goal—equality or liberty? How might covering on a global scale impede that goal? Can nations have “reason forcing” conversations? How might technology play a role in promoting global human rights? Does the kind of collaboration represented by Wikipedia suggest that it’s possible? Would it require a top-down or bottom-up model? How might supply chains be used not simply to guarantee peace but also to advance human rights? Do economic pressures within supply chains make it more difficult to achieve universal human rights? What are the human costs of globalization? 
What challenges do mutant supply chains pose, and how might countries collaborate to overcome those challenges?

Paper 4: Olson and One Other

In “The End of Race: Hawaii and the Mixing of Peoples,” Steve Olson demonstrates how technological advances in genetics suggest that race is no longer a biological reality. At the same time, he also indicates that race and racism persist. Write a paper in which you evaluate “the end of race” using ideas from Olson and one other author.

Questions for Exploration: If race has no biological basis, why does it continue to function as a category? What role does covering play in the continuation of race? What relationship does covering have to communities of descent? Does our current equality paradigm for civil rights mandate the continuation of race? Would switching to a liberty paradigm change things? Does Wikipedia offer a model for what a world without race might look like? What do the conflicts within Wikipedia, such as that between Cunc and Sanger, suggest about race and its persistence? What’s the difference between racism and prejudice? Do any of the authors offer tools for us to combat one, the other, or both? How might global economic collaboration affect our understanding of race? Does globalization exacerbate the racialization of culture? Do mutant supply chains form from racial groupings or communities of descent, and why might that difference matter?

Author
08-23-2012
10:41 AM
Once upon a time, universities had identities, purposes, goals, mottos, even just functions. Then, in the 1990s, they suddenly developed "missions," in imitation of the corporate fad of the time whereby mission-focused "strategic planning" was all the rage. (I know this firsthand, having sat through endless strategic planning sessions at my university in the 1990s.) Today, strategic planning is regarded as a lumbering dinosaur in need of replacement by nimble-thinking campus president-CEOs who haven't the time to listen to students and faculty about the directions of their campuses. Instead, universities now have brands.

I bring this up in a blog devoted to the teaching of popular cultural semiotics to demonstrate once again that the purpose of such classes is not to celebrate our entertainment/consumer culture but to analyze it for signs of the kind of society we have become. And in seeing universities treated by university personnel as products to be branded and sold to "consumers," we can measure just how far we have gone into the hypercapitalistic obliteration of every other perspective on what life and society can be. When universities are brands rather than settings for learning and scholarship, they are not only being told to behave like businesses, they are being told that they are businesses. This, of course, is very bad news indeed for the humanities, which have never been cash cows and are now staring down their own potential extinction. (If you want to see a starkly honest assessment of the future of the humanities as an academic career, I suggest that you look at the site 100 Reasons Not to Go to Graduate School.)
Academic pundits from Stanley Fish to Martha Nussbaum have weighed in on this crisis (Fish has taken the position that the humanities shouldn't even attempt to justify themselves to business-model-obsessed administrators, while Nussbaum eloquently makes the traditional case for the extra-economic value of training a citizenry in humanistic values), but they aren't changing anything. Indeed, the crisis is accelerating, much as the effects of global warming are, as is to be expected when our culture comes to adopt the positions of corporate-think as if they were the only world views possible. The situation has gotten so bad that I expect that some of my readers are simply shaking their heads at this obsolescent baby boomer who just can't get with the program.

Well, okay, call me old-fashioned; I do not regard TED and MOOCs as my friends. Like the medieval monks and lay scholars who flocked to Europe's original universities as a haven from the violence and brutalities of the Middle Ages, I entered academia as a haven from the salesmanship and money worship of American society. I think that Death of a Salesman is a far more valuable document than the Forbes 500. The entrepreneur and the CEO are no heroes of mine. For many years, American society made room for people like me, carving out a space for something (the study of literature, philosophy, and art) that never could justify itself in market-based terms.

Ironically enough, I can justify the teaching of popular culture in market terms, both because there is a large demand for it among students and because its study (as I tell my classes) can be useful in such areas as advertising, market research, and the culture industries. But my emphasis is on critical thinking, not "Disneygeering." As I also tell my students, I am not personally threatened by this seismic shift in the nature of the modern university.
Once upon a time, the American education system made room for people like me, and the career that I began many years ago will almost certainly serve out my time. Though we are moving toward a situation in which most "educators" will be little more than minimum-wage purveyors of digitized learning management systems, I'll be retired by then. It's the next generation that is going to pay the full price here, not mine. And they shall live corporately ever after.

Author
03-08-2012
12:08 PM
A student in my semiotics of popular culture class has asked me whether I thought that the largely white, male Academy failed to award an Oscar to Viola Davis because they couldn’t stand to see a sweep of the actress awards by black women. Such a question merits a seriously considered answer, and I think that this blog is a good place to provide one.

First, while I do not pretend to be able to read the minds of the Academy voters, I am certainly aware of the growing controversy over their demographic make-up, and while I do not think it impossible that they were influenced in their voting by their own racial instincts, I think it more likely that they turned to Meryl Streep as they have always turned to Meryl Streep: that is, as a symbol of solid acting excellence in an industry largely devoted to action-packed, special-effects-driven entertainment aimed mostly at adolescents. In other words, I’m with Neal Gabler of the Los Angeles Times, who recently argued that the candidates for best picture (including The Artist, which, of course, won) reflect a combination of self-loathing (for all of the low cultural stuff that Hollywood usually produces) and nostalgia (for movies that reflect high-art values or high moral purpose) among the Academy voters, who assuage their consciences by voting for the few high-art or high-moral-purpose movies that come along in a given year.

But more important for me is the fact that, as is almost always the case in American culture, political controversy swirls around questions of race rather than class. What I find significant is that Billy Crystal’s quip that “nothing takes the sting out of the world economic problems like watching millionaires present each other with golden statues” prompted laughter rather than recognition of the class inequalities of this annual ritual.
As the royal figures of entertainment prance up a red carpet in their formal evening attire and designer gowns into a private hall from which members of the public are excluded, no one appears to be disturbed. Crystal’s joke, which really does nail the point, is received the same way as was the quip by Rod Steiger’s character in Doctor Zhivago. That character breaks the tension at a lavish, upper-class banquet when a protest march of impoverished, chanting Russians passes by the windows by joking that maybe the people will sing in tune after the revolution. His fellow aristocrats erupt in laughter and applause in gratitude for this bursting of the bubble of conscience, and go back to their feast. Which is exactly what happened at the Oscars. Meanwhile, we can keep up with what is happening in the “real world” by reading how Susan Naomi Bernstein’s students’ computers are so decrepit that they break down while students are taking their mandated writing exams. Or the fact that increasing numbers of retirement-age Americans are going back to work or moving in with their adult children because they cannot afford to retire independently. Of course, this sort of thing doesn’t make front-page headlines, and won’t, no matter who wins an Oscar in whatever category. Photo: [Oscars Through the Ages, on Flickr]

Author
02-09-2012
02:04 PM
Well, the Boss is coming out with a new record and embarking upon a national tour to promote it. Though the full album, Wrecking Ball, has not yet been released, a single entitled “We Take Care of Our Own” has just appeared, and it happens to provide a very good topic for semiotic analysis. I heard it on the radio for the first time last night while driving home from teaching a popular culture class. Aesthetically, it sounded like vintage Springsteen to me (same chord patterns and instrumentation, same arrangement, same less-than-clearly-enunciated lyrics), but I could pick out the chorus, which is “Wherever this flag is flown, we take care of our own.” And that surprised me. It sounded so jingoistic, so emptily patriotic, not like the Boss at all.

And then I immediately remembered that this had happened before, almost thirty years ago, with “Born in the U.S.A.,” a protest song dripping with irony, most of which was lost on Ronald Reagan, who alluded to the song for campaign purposes until he was set straight on the fact that it was hardly Republican campaign material. So I decided to look up the lyrics on the Net. Sure enough, the chorus, when juxtaposed with the rest of the lyrics, which bitterly describe a nation that isn’t taking care of its own, abandoning them “From Chicago to New Orleans, from the muscle to the bone, From the shotgun shack to the Super Dome,” makes it clear that the Boss hasn’t gone conservative in his later middle age. He’s still on message.

But alas, there’s a hitch, as there always is when popular music assumes the mantle of social protest. A brief search of ticket prices for the upcoming Wrecking Ball tour makes it clear that, excluding what scalpers may be getting, tickets will cost anywhere from $118 to $872, depending on location and amenities. (I’m sure other prices can be found, but these figures are representative.) Now, who is taking care of whom with such prices?
Such a contradiction, of course, is to be expected with an art form (popular music) that is thoroughly embedded in commercial culture. Protest music, in such a context, is virtually ensured to become what Thomas Frank has so usefully called “a commodification of dissent.” I can easily imagine purchasers of those $872 tickets boasting to friends about how they have struck a blow against corporate America by attending the latest Springsteen concert. It’s like wearing a Che Guevara t-shirt (indeed, Che’s iconic image in black beret and beard is a lucrative one for express-your-radicalism-through-commodity-consumption purposes).

But the contradiction goes deeper than this. Popular music, even when framed in the form of social protest, is created to be entertaining, and entertainment makes you feel good. People who feel good are not likely to go out and try to change the things that the music they enjoy may criticize. Another way of putting this was offered in a 1960s song by Tom Lehrer, “The Folk Song Army,” a spoof of the protest movement of that era. Alluding to the Spanish Civil War, Lehrer sings, “Remember the war against Franco/It’s the one in which each of us belongs/Though they may have won all the battles/We had all the good songs.” Or, as Lehrer acidly noted in an interview given in 2000, “I’m fond of quoting Peter Cook, who talked about the satirical Berlin cabarets of the ’30s, which did so much to stop the rise of Hitler and prevent the Second World War.”

Pointing out such things can be dispiriting to students. After all, popular music, from the era of the Beats to the present, has been a cherished medium for the expression of social protest and countercultural vision. But Bruce Springsteen singing about an America that is doing everything it can right now to abandon the social vision of Franklin D.
Roosevelt is a powerful signifier of the inability of commercially based popular music to effect social liberalism. Indeed, even the Boss’s new single undercuts its own bite at the end with a feel-good chorus that makes it easy to forget the bitter, but obscurely metaphorical, lyrics with which the song opens. Heck, even I thought, at first listen, that Springsteen had gone soft.

Author
01-12-2012
11:34 AM
What with the sequel to the sequel to the sequel to the cinematic remake of the revival of the original Mission Impossible series currently leading in the American box office sweepstakes, perhaps it would be more profitable to turn to the cultural semiotics of a much larger contest right now: the American presidential primary season, a quadrennial media extravaganza that presents us with a lively combination of the unlikely, the improbable, and the downright preposterous. I am not referring to the political outcomes of the primaries but, rather, to the way they are structured by the mass media. It all begins with the notorious fact that Iowa—a state with around 1 percent of the American population, and that is roughly 92 percent white—is allowed by disproportionate media attention to be, if not the party kingmaker in the primary season, at least the party wanna-be-king-unmaker. And the Iowa primary isn’t even a primary: its caucus votes are nonbinding and elect no delegates. To compound this apparent anomaly in what purports to be a democratic process, it is New Hampshire—a state that has a good deal less than 1 percent of the American population and is around 94 percent white—that is allowed, election after election, to be the first and most influential presidential primary, practically disenfranchising the voters of such states as California and New York. These facts about our primary electoral rituals are very well known. The question is what they signify. While entire volumes could be written on the matter, I’d like to focus on a few fundamental cultural-semiotic principles that are explored much further in the now-published seventh edition of Signs of Life in the U.S.A. The first is that of cultural mythologies—that is, those underlying beliefs, world views, or ideologies that govern cultural consciousness and behavior. In this case, the relevant mythology is that of an agrarian America whose “heart” lies in its rural regions. 
This mythology is so well engrained that even years after America’s transformation to an urban/suburban/exurban society (wherein the Jeffersonian/Jacksonian family farmer has long since been driven off the land by corporate agribusiness conglomerates), states like Iowa and New Hampshire (whose image in the American imagination continues to be rural and agricultural) are allowed a grotesquely disproportionate voice in the making and unmaking of presidential candidates. The second relevant cultural semiotic point is that of America’s cultural contradictions—that is, its simultaneous cherishing of diametrically opposing mythologies—that are most prominently signified in the red state/blue state cultural divide that is still very evident in American politics. So hardened has this division become that the three most populous and racially diverse states in the nation—California, New York, and Texas—are disproportionately undervalued in the primary electoral process, and are virtually ignored (except for campaign contribution purposes) by both parties in the actual presidential election—the reason being that everyone knows how they are going to vote anyway (New York and California being firmly in the blue column, and Texas in the red). Beyond mythologies and contradictions, there is the way that the mass media cover the primary season as a kind of combination soap opera/WWE smackdown. This should not be surprising. The American mass media are part of a culture industry whose purpose is not to inform but to sell advertising space (or otherwise profit) through the entertainment of a mass audience. Elections, accordingly, are treated as entertainments, and because having to wait for months for the outcome of something is not entertaining, Iowa and New Hampshire are useful; their elections offer instant gratification by allowing these tiny states to determine, right away in the very beginning of January, who the nominees are going to be. 
The fact that the eventual nominees are not always the winners in Iowa and New Hampshire has no effect on this media illusion. All that matters is that huge audiences can be attracted by broadcasting the Iowa and New Hampshire caucus/primaries as if they were equivalent to election night in November, complete with live blogging and Twitter feeds, raising plenty of excitement and advertising dollars. And now that the dust has settled on Iowa, we can look forward to New Hampshire, and afterward to the Super Bowl—I mean Super Tuesday. Finally, there is a new element in the game this time around, which is the fact that the election “markets” are now an integral part of campaign reporting. Yes, the American electoral system is now more or less officially joined at the hip to a kind of stock market: gambling. When Las Vegas meets Wall Street in the race to the White House, you really know that you are living in a hypercapitalist society, a world in which the only measure of anything any longer is money.

Author
11-17-2011
01:35 PM
Given the affinity that the current Occupy Wall Street (or Wherever) movement has with many of the protest movements of the 1960s, I am minded to take a semiotic look here at the legacy of the counterculture and, more relevantly for this blog, the connections between popular culture and the New Left. It is a virtual truism to note the at least superficial leftist tendencies of popular music since the 1960s. We’re talking rock-and-roll here, not country/western. With the exception of such performers as that gun-slinging guitar slinger Ted Nugent, the pose, if not the actual politics, of the rock star is that of the “street-fighting man” (or woman: anyone for 4 Non Blondes’ “What’s Up”?). Having evolved from the countercultural positioning of the Beats (whose music of choice was jazz), the sound track of the 1960s and since has often expressed the voice of an American youth in dissent against the Establishment. Given the growth in importance of popular music over the past five decades, one might expect a concurrent political effect, a leaning to the left. But that hasn’t happened. Indeed, the country has shifted so far to the right that centrist Democrats like Clinton and Obama are regarded as left-wing radicals in many quarters. So the big question is, what happened on the way to the revolution? A key to answering this question lies in noting one of the fundamental contradictions around which American society has always been structured: the contradiction between our Puritan tradition of social conformism and sexual repression, on the one hand, and our secular tradition of individual liberty and personal expression, on the other. These two cultural tendencies have coexisted in an uneasy tension throughout our history, with one side or the other achieving dominance at various times. 
What we call “the Sixties,” for example, was in effect a rebellion against the strikingly Puritanical upswing in American culture during the 1950s, representing a youth-led swing of the pendulum toward a free-wheeling individualism that included a hedonistic celebration of pleasure and entertainment (sex, drugs, and rock-and-roll) as a major component in its development. Somewhat ironically, the hedonistic tendencies of the New Left sixties stand in stark contrast with the rather Puritanical tendencies of the Old Left, with its old-school connections to the early labor movement, the Progressive Era (which also brought us Prohibition), and pre-Stalinist socialism. The backlash against the sixties (which has helped produce the “red state/blue state” cultural divide) has stuck with Puritanism through and through, rallying around such “wedge issues” as abortion and gay rights. Popular culture, for its part, has been more or less on the side of the Cultural Revolution (if only because, in Thomas Frank’s words, the “commodification of dissent” can be highly profitable), but that hasn’t resulted in the marginalization of right-wing Puritanism. Quite the contrary. So again, what happened? Well, there is no easy or single answer. But one explanation can be seen in the different ways that the Right and the Left today rally their forces. The Left has Jon Stewart, Neil Young, Saturday Night Live, U2 (as elder statesmen at any rate), and Lady Gaga (in her own way a Left-tending entertainer), as well as countless others. The Right has figures like Bill O’Reilly and Glenn Beck, news preachers one might call them, who appeal to anger and outrage, not pleasure and humor. After all, Jon Stewart is just plain fun. 
Now, ask yourself which emotion is more conducive to stimulating the kind of uncomfortable sacrifices anyone has to make in order to undertake the laborious process of winning power in America—pleasure or anger—and you will get a glimpse into the enduring power of the Right in America. The current anger being expressed in the Occupy ______ movement may mark a significant difference (though urban camping does have a certain adventurous aura of fun about it); only time will tell how effective it proves. But the emergence of trademarked tee-shirts and other Occupy-related merchandise does not bode well. And watch out for Occupy-Aid rock concerts. Then indeed another revolution will be over.

Author
10-07-2011
08:34 AM
As true fans of Casablanca know, no one in the film ever actually uttered these words. Rick says “Play it,” and Ilsa says “Play it, Sam,” but it was Woody Allen who put “Play it again, Sam” in our heads. No matter; it is the principle of repetition I’m after here, for this is a blog about a song well sung, or rather, too often sung. That song is the ongoing Hollywood tendency to rehash former programs and films, or to remix them. This season’s return of Charlie’s Angels is an example of the former, and the premiere of Terra Nova, of the latter.

Let's begin with Charlie’s Angels. The show that turned a one-time shampoo model into one of America's favorite sex symbols, the original Charlie’s Angels was a signifier of how the sexual revolution of the 1960s had become mainstreamed for middle America by the 1970s. When the television program was reprised in 2000 as a feature film (Charlie’s Angels), and in 2003 as a feature film sequel (Charlie’s Angels: Full Throttle), that significance had long since vanished. By then it was simply a convenient vehicle for a new generation of sex symbols in a film industry that preferred already tested entertainment formulae to the risk of genuine innovation (though the casting of Lucy Liu did at least signify the maturing of Hollywood's depiction of Asian American women). The current televised rehash of Charlie’s Angels entirely repeats what the films signified: a vehicle for a new generation of actresses, a testimony to the risk aversion of the entertainment industry, and a case of yet another insertion of a nonwhite lead while preserving the status quo racial ratio at two-to-one. In Hollywood as in Ecclesiastes, there is nothing new (or at least very little) under the sun.

Which takes us to Terra Nova.
Essentially a remix of Swiss Family Robinson, Lost in Space, Survivor, and Lost, with a hint of Jurassic Park and probably a dash of Avatar thrown in for good measure, Terra Nova cannot simply be explained as another example of the universal appeal of archetypes. The thing about archetypes is that they appear in stories as a reflection of a kind of cultural unconscious (this is explicit in Jung’s version of archetypal theory but is also implicit in Northrop Frye’s more literary archetypalism), not as a calculated remix of audience-tested entertainment formulae. But, of course, Lost (to take just one of Terra Nova’s sources) was itself a remix of Swiss Family Robinson, The X-Files, Survivor, and, in an odd example of farce repeating itself as tragedy, Gilligan’s Island.

Now, one could reasonably argue that all this repetition is really a sophisticated expression of postmodern aesthetics: the creed that in a media-saturated world the artist can only repeat, with a difference, already existing cultural images and motifs. But I just don’t see that ineffable, but recognizable, air of parody or pastiche in these reruns of preexisting entertainments, that indescribable aura of postmodern chic. When Tim Burton sent Batman up a dark tower in pursuit of the Joker, plenty of other evidence from the movie made it clear that the allusion to the finish of Hitchcock’s Vertigo was not only deliberate but in the full postmodern spirit. The television premieres of the new Charlie’s Angels and Terra Nova, not to mention Pan Am and The Playboy Club (obvious channelings of the success of the period drama Mad Men), just don’t send that signal. To me the message is that once again Accounting is in charge of Creative, and repetition signifies profit-sensitive risk aversion. So, play it again, Sam: In Hollywood as in Ecclesiastes, there is nothing new (or at least very little) under the sun.