Bits Blog - Page 131

11-20-2013 11:45 AM
I composed my last post on my trusty iPad 2. I use it mostly for reading (both for pleasure and for classes) and (I admit with some shame) Candy Crush Saga. When I travel, I use it for email. I rarely compose on it, not because it’s not capable but because my fingers tend to hurt from stabbing at hard glass. I bring this all up because it looks like we’re going ahead with plans for a mobile iPad lab. My former assistant Mike (now pursuing his PhD in Creative Writing (yay Mike!)) had wrangled the grant for the lab, but we shelved it with his departure. Now, through a complex series of internal political travesties, we’ve decided to sacrifice one of our computer classrooms to transform it into an advanced media production lab for our graduate programs (complete with 3D printer). To offset the loss, we’re resurrecting the idea of the mobile lab. We’re pricing out the cart and iPads, an Air for the instructor, and an Apple TV for projection. We’re also working out the details of reservation, storage, and security. Anyone out there using a setup like this? Any pitfalls we should be aware of?

11-15-2013 09:30 AM
I'm sitting here composing this post in a Fiat dealership studio, waiting for them to finish changing the oil in my saucy “rosso” Fiat 500 Pop. And I’m lingering on the memory of a recent moment of radical self-doubt, the kind that all too often masks itself as an epiphany. I was sitting there grading another set of papers, making the same comments about the same things we've covered in class again and again, and I suddenly realized that it's not that I've made these same comments again and again in this class but in every class. It was a startling, disorienting moment: a disturbing tunnel vision of the past in which I saw myself writing and writing over and over on and across nearly twenty years of student papers. The moment passed but left six words echoing in my mind: What the hell am I doing? It was, in every way, a "crit" moment. It was critical—both in the sense of feeling like some sort of crucial juncture and (clearly) as a kind of self-damning indictment. It was criticism—both in the sense of evaluating and analyzing the whole oeuvre of my work as a teacher and an act of careful judgment about what we do as teachers of writing as a whole. It was crisis—perhaps the existential kind but equally perhaps the midlife kind (having just reached a “MOACA” stage in my life). These moments are, undoubtedly, a natural consequence of the kind of labor we’re called to do (as well as the cultural and financial esteem (or lack thereof) of such labor), but sitting here now I feel the “why” of it is not the point. The point for me is the “what,” and not the “what” of “What the hell am I doing?” but the “what” of “What the hell do I do about this?” Is it time, perhaps, to change pedagogies—perhaps even radically so? Is it time to rethink how and what I do? Is it time for new technologies, new assignment designs, new courses? Or is it just time for a nap? You know, most people have no idea what we really do for a living. I’m sure you’ve experienced it. You’re at a party and someone asks you what you do and you say you’re a writing teacher / English professor and they invariably respond with “Uh-oh! I better watch my grammar!” (I always tell them not to worry because I’m off the clock). The lived experience of teaching—with its joys yes but also with its doubts and crises and frustrations—is hidden. Thank goodness for blogs, for anonymous Internet venting, and for a decent espresso machine at a Fiat studio.

11-08-2013 02:05 PM
With October 31st being the submission deadline for this, my 78th Bits blog, I thought I'd answer a question a student of mine asked about the significance of the sorts of costumes being marketed to women these days for Halloween wear. Well, that one's pretty easy: in a cultural system that includes such phenomena as a young Miley Cyrus seeking to shake off her Hannah Montana image by (to put this as politely as possible) making an erotic spectacle of herself in order to succeed as a grown-up singer, the immodest (let's stay polite) wear marketed to women at Halloween is just another signifier of what Ariel Levy has quite usefully labeled a "raunch culture." Whether such explicit displays (and expectations thereof) of female sexuality constitute a setback for women's progress (which would be a Second-wave feminist assessment of the matter) or an advance (which might be a Third-wave interpretation) is not something I want to get into here. It's Halloween as a cultural sign that I'm interested in now. To see the significance of the contemporary Halloween, we need (as is always the case with a semiotic analysis) to situate it within a system of signs. We can begin here with the history of Halloween. Now, whether Halloween is a Christianized version of an ancient pagan harvest festival or, as All Hallows' Eve, is simply the liturgical celebration of the saintly and martyred dead that it claims to be at face value, is not something we need be concerned with. More significant is that neither of these meanings has been operative in modern times, when Halloween became a children's holiday: a night (with no religious significance whatsoever) to dress up in costume and go trick-or-treating for free candy. But in these days of an ever more restricted children's Halloween, with parental escorts or carefully monitored parties taking the place of the free-range trick-or-treating of a generation and more ago, along with an ever-expanding adult celebration of Halloween, we can note a critical difference, which, as is usually the case in a semiotic analysis, points to a meaning—actually, several meanings. The first is all too painfully clear: whether or not we actually live in more dangerous times (which is a question that has to be left to criminologists), we certainly feel that America has become a place where it is not safe to let children roam about on their own at night. The trust that Americans once had for each other has certainly evaporated, and the modern Halloween is a signifier of that. (One might note in this regard the ever more gruesome displays that people are putting up in their front yards: yes, Halloween began as a celebration of the dead, but this obsession with graphic and violent death hints at an insensitivity to real-life suffering that does not do much to restore that old trust.) But as Halloween has shrunk in one direction, it has exploded in another, becoming one of the premier party nights of the year for young adults. Joining such other originally liturgical holidays as Mardi Gras, today's Halloween is essentially a carnival—an event that has traditionally featured an overturning of all conventional rules and hierarchies: a grand letting off of steam (sexual and otherwise) before returning to the usual restrictions on the day after. Dressing in costume (whether along the more traditional lines as some sort of ghoul, or as some other more contemporary persona) enables a freedom—a licentiousness even—that ordinary life denies.
At a time when, in reality, the walls are closing in for a lot of Americans, carnivalesque holidays like Halloween are, quite understandably, growing in popularity. There is more to it than that, of course. A further significance is the way that Halloween, like so many other American holidays (both religious and secular), has become a reason to buy stuff—not only costumes and food and candy, but also decorations, Christmas style, that start going up a month or more before the holiday arrives. Like Valentine's Day, and Mother's Day, and Father's Day, and, of course, Christmas, Halloween is now part of a different sort of liturgical calendar: a signifier of the religion of consumption. And no, I don't celebrate Halloween. October 31st has a very private significance for me: on that day in 1980 all of my Harvard dissertation directors signed off on my PhD thesis. I think of it as the Halloween thesis. I suppose that my doctoral gown is a costume of sorts, but I haven't worn it in years.

10-31-2013 08:59 AM
Le sigh. Once again my proposal has been declined for the Conference on College Composition and Communication. It feels like any time I propose anything on technology—and I mean anything—I’m in. But if I try to “break into” any other track it’s a no-go. It becomes a self-defeating loop. The best way to get accepted, I’ve found, is to put together a panel with friends at Cs. So if you’re not accepted, you’re behind to start with. To put it more bluntly, the best way to present at CCCC is to be presenting at CCCC. Eh, maybe it’s me. Anyone else feel this sort of catch-22?

10-24-2013 09:30 AM
One of the most fruitful subjects for popular cultural semiotics is advertising. After all, existing to substitute signs of desire for substantial things, advertising is semiotic through and through. Indeed, much of the time, ads do not even show the products they are pitching, only complex narratives and/or images designed to promote a psychological connection between the product and its target market's fantasies. But while advertising as we have known it throughout the mass media era is hardly on its last legs (every year the Super Bowl breaks another record in its thirty-second spot prices), it is now being rather seriously challenged by the mobile market, which works quite differently from traditional commercial messages. Let's call it "Advertising 2.0." Advertising 2.0 works differently from video and print promotions in three particular ways. The first, of course, lies in its scale. The distinction here rests not in digitality as such (a desktop, laptop, or even tablet computer has plenty of space for the presentation of traditional video-style ads) but in mobility. There is only so much you can do on a tiny smartphone screen, which is why Advertising 2.0 often makes use of simple banner and text-based ads that are largely informational rather than psychological. And they don't have to play around with fancy psycho-semiotic techniques because, unlike traditional advertising, which disseminates its commercials to a largely anonymous mass market/audience, mobile ads can be customized just for you because your personal behavior has been monitored, recorded, and sold to whoever has something to sell that is related to anything you have shopped for, looked at, or even simply mentioned online. In other words, there is a lot less guesswork involved. Advertising 2.0 is thus simpler and more direct than traditional mass marketing schemes, but there is yet a third way in which it differs from the past. This lies in the way that Ad 2.0 relies on the virality enabled by mobile media. Like chain letters writ large (WRIT VERY LARGE, that is), mobile ads are intended to be sent around. This makes the consumer a promoter as well, massively reducing the load for those who have products and services to sell. The result is a burgeoning revolution in the advertising industry. While the era of "mad men" is hardly over (mobile advertising still commands only a very small share of the commercial market), Advertising 2.0 is likely to grow very quickly. One absolutely trivial outcome of this will be a lot of advertising that is not semiotically very interesting. Less trivial will be the emergence of a form of what I'll call "market messaging" that will be much more efficient and cost effective than the advertising of the past. Also less trivial will be the likely decimation of the advertising industry as a career destination because market messaging does not require the sort of sophisticated thinking and strategizing that conventional ad agencies provide. As in so many other cases, IT specialists will largely take the place of a profession that has been rendered obsolete. Finally—and in a cultural sense this may be the most significant outcome of all—Advertising 2.0 will continue to undermine our rights to personal privacy, mining our data for marketing purposes in the creation of a vast Panopticon that is quite different from the one that Foucault envisioned. Indeed, once this blog is posted, I may receive an email promoting a new edition of Discipline and Punish.

10-24-2013 07:30 AM
Prezi. Either you love it, hate it, or have no idea what it is. If you’re in the last category, go check out www.prezi.com. Me? I’m in the first category. I love its Web 2.0-ness, its fluidity, its boundlessness, its exploration of virtual space. But haters hate and not without reason. I’ve had more than one colleague complain about “Prezi-sickness” from endless zooming and swirling. I point out that dismissing Prezi because of bad Prezis is akin to dismissing PowerPoint, which is almost always bad. I’m thinking about the question now because the Dean needs a snazzy presentation for a donor event. “Prezi!” I say. “No!” my colleague says. And what say you?

10-18-2013 02:11 PM
I want to revisit a blog post from a couple of years ago about my changing stance on cell phones and other portable technologies in the classroom [please link to this if you can; it’s called Cell Phones and the Classroom and is from 2011]. In that post I explained how I had moved from banning cell phones to a simpler policy: Use portable technologies responsibly or not at all. The policy has generally served me well—especially since in the intervening years smartphones have become smarter, tablets have become more available, and laptops have become utterly common. I always explain the policy and tell students, “Look, I know you have a life. I know your babysitter may have to call you or you just got a text from your boss or whatever. If you need to text or use your phone, just step outside.” I’d like to think the policy is simple, generous, and flexible: if you are pulling out your phone to pull up a PDF of the class reading, great; if you’re pulling it out to text, just step outside. And yet, increasingly, I find students utterly unable to follow this policy. When I catch them texting I call them out, especially the ones sitting right next to the door. It’s quite frustrating for me, really. I don’t want to ban portable technologies—my roots are in computers and composition—but I also want students to learn simple civilized (and job-friendly) skills about using these technologies. I’m baffled. I’m starting to suspect that there’s something Pavlovian about their behavior, that having tech banned throughout high school and in other classes creates a “hide and text” reflex. Perhaps, that is, it’s just automatic for them. More troubling, it may be that students can’t even wait the five seconds it would take for them to walk out of the classroom to answer that text. Either way, I am stymied. Thoughts?

10-10-2013 07:34 PM
So the jury is in, the dust has settled, and the critics, by and large, are happy. Breaking Bad has concluded, and, unlike The Sopranos (with which it had so much in common), it ended pretty decisively. As I noted in a previous blog, at least one TV critic (Mary McNamara at the Los Angeles Times) had hoped that the show would end with some sort of message that Walter White was evil—not a mere anti-hero—and that crime, in the end, does not pay. And in Walter White’s confession that everything that he had done he did for himself, McNamara found a satisfying conclusion that set “this series free”. Another Times critic usefully noted a different angle, however, observing that “Walt turned from anti-hero to outright villain at some point, but there was always somebody worse in the room” (Robert Lloyd). Lloyd’s point is that this is a common Hollywood strategy used to keep audiences on the side of anti-heroes even when they cross the line into outright villainy. As Lloyd puts it, “In the age of the bad-good/good-bad guy, the formula is to ensure that however bad your bad guy is, there is a badder one around. This is a cheat, of course, but a common one.” Finally, yet another Times critic has offered the following shrewd description of the story arc of this much discussed show: “Since some point in its second season, 'Breaking Bad' has effectively been two shows in one. The first of those two shows was the one we thought we were watching in the pilot: A mild-mannered chemistry teacher breaks bad and discovers how thrilling it can be, then drags us into the thrills right alongside him. In general, he is someone we’re supposed to root for, someone we’re supposed to cheer. In the second show, 'Breaking Bad' was all about a man who made a choice to break bad and revealed untold depths of bleak awfulness within himself. It was a series about a man who broke himself open and revealed a monster driving the controls, then decided he kind of liked that version of himself” (Todd VanDerWerff). All in all, this is some pretty danged good television criticism, criticism whose cultural significance is quite wide ranging. In the first place, it illustrates how, in what I call an “entertainment culture,” a great deal of significant "high" cultural activity has migrated to the “low” sphere of popular culture. In such a world not only has popular art largely taken the place of “high” art, but even such traditionally “high” cultural activities as literary criticism can now be found operating at a very high level in the analysis of television programs. (Please note that I do not at all intend this observation as a criticism: it is only an interpretation of the signs. I’m actually delighted when I see good television criticism, and that is why I often turn to the L.A. Times, which has a very able staff of TV writers.) But there is another meaning to be gleaned from the finale of Breaking Bad, one which indicates the conditions under which popular entertainment still exists. This is the fact that, as VanDerWerff and Lloyd imply above, commercial art (and television is commercial art) is generally beholden to audience desire. If Franz Kafka once remarked that reading a novel should be like getting hit over the head with an ice-axe, the commercial artist’s credo is more along the lines of “give the audience what it wants.” And Breaking Bad gives everyone everything: for the moralist, there is Walter White finally paying the ultimate price for his crimes, provided by what I'll call a deus ex machinegunna. 
For the millions of viewers who identified with him, on the other hand, there is the fact that Walter White (who was often not the worst person in the room) achieves a certain justice (of sorts) in the end, and is at least trying to look out for his children. Such a have-it-both-ways conclusion was a masterpiece of keeping-everyone-satisfied commercial art. (Incidentally, my guess is that the reason why The Sopranos had such an unsuccessful finale was that its writers couldn’t figure out any such conclusion. How on earth could Tony Soprano achieve any sort of posthumous justice? What even worse villains would he need to revenge himself upon? And yet, there were all those fans who so identified with Tony that killing him off just wouldn’t work either, so the end was . . . sidestepped—perhaps, as a student of mine has suggested, simply to leave the door open to a revival of the series sometime down the line.) And there, at least, lies the only tenuous distinction that I can make between high art and popular art. Unbeholden to the profit motive, high art can afford to disappoint its audience, or even bother it. Commercial art can’t. It isn't a hard and fast distinction, and there are certainly many examples that could challenge it, but at this time when commercial art is, like billboards along a country road, blotting out the high art horizon, there are no other distinctions that could be any more certain.

10-10-2013 07:26 PM
It’s strange being a WPA (in more ways than I can count). I become most aware of that strangeness when meeting other administrators of various species: chairs, program directors, associate deans. Sometimes it just hits me: many of these people don’t want to be doing what they’re doing. I guess that should be obvious; after all, most people in academia are there because they wanted to be academics. Composition/Rhetoric must be one of the few disciplines that has a whole subfield focused reflexively on its own condition within the institution. And that’s why I love being a WPA—because I find institutions endlessly fascinating and because I know that I am always already within one, so the better I come to understand how they work, the better I am able to survive and thrive. It’s so embedded in me that I just don’t get why everyone else isn’t fighting for my job (and really most would prefer to do anything but my job). Composition/Rhetoric is probably also one of the few disciplines with a subfield wholly focused on assessment. That’s not my specialty at all, but in my new role with the dean’s office I am overseeing assessment for the entire college. And I am encountering similar moments of utter bafflement, moments that remind me that my discipline takes for granted what many others do not. There is throughout my institution an enormous resistance to the very idea of assessment (which is often muttered with a distinct distaste, as though the word itself were obscene). It’s seen as a chore or irrelevant. Colleagues comment that they already assess whenever they give a grade. Others aren’t quite sure what’s expected by the institution (which is sometimes the very condition of the institution). What’s clear is that I am currently Chief Cat Herder, charged with making sure a very unpleasant task gets done properly and on time by people who believe there are much more important things to do (and indeed they may be right). For myself, the idea of assessment just makes sense: who wouldn’t want to do better, teach better, help students better? But as I’ve noted, assessment isn’t my field—writing program administration is. And my investment in assessment is filtered entirely through that lens. Do I think assessment in and of itself is important? The answer is irrelevant. What drives me in this position is not assessment but institutional dynamics. I know that assessment passes through a complex bureaucratic language translation matrix and pops out the other side and into legislatures as “accountability.” Assessment doesn’t scare me, but accountability definitely does. Accountability threatens tenure, which threatens my job, which means I care very much about assessment. If you don’t believe me, ask anyone in secondary education. They’ve been through it all already. And now it’s headed our way. If I preach assessment while in this position, it’s because I know that outside the walls of academia there is a rising chorus demanding accountability. I want good assessment because when that tsunami hits us I want us to be prepared. I want to be able to co-opt the conversation by saying “Accountability? Yes, of course. Look at our excellent record of careful assessment. What more could you ask? We’re totally on top of this.” I’m a rhetorician. I’m a Machiavellian institutional manipulator. I’m a writing program administrator. And I intend to make sure all those skills are honed in the guise of assessment if and when our walls are breached.

10-03-2013 12:30 PM
I was raised in a writing program where we administered a first-day writing sample in order to confirm a student’s placement. Usually, there was no need to change that placement, but occasionally the sample indicated a student needed to be moved down (or up) the ladder of writing courses. It was an imperfect system (aren’t they all?) but it’s what we had. What’s strange is that in my current program we still use writing samples, even though there’s no placement to confirm. In Florida, all remediation must take place at the level of the community college, which is to say that we can’t offer any basic writing courses—even if it’s clear a student needs one. It gets worse: I’ve heard through the grapevine that the Floridian powers-that-be are proposing to do away with developmental reading and writing entirely by guaranteeing that anyone who graduates high school can go straight into the standard FYC course, ENC 1101. Bracketing the wisdom of such a move (if it comes to pass), I am left with the question of the first-day writing sample. The way I explain it to new teachers in our program is that it allows us to get a sense of where the class is as a whole in terms of writing and also allows us to identify very early on students who might need extra help. But of course, we are a local manifestation gnarled by institutional inertia and state bureaucracy. What about you? Do you do a first-day writing sample? Does it do anything?

09-26-2013 06:53 AM
Let’s begin with Carl Straumsheim at Inside Higher Education, “The Full Report on Udacity Experiment”:

“San Jose State University on Wednesday quietly released the full research report on the for-credit online courses it offered this spring through the online education company Udacity. The report, marked by delays and procedural setbacks, suggests it may be difficult for the university to deliver online education in this format to the students who need it most. The report's release lands on the opposite end of the spectrum from the hype generated in January, when university officials, flanked by the Udacity CEO Sebastian Thrun and California Governor Jerry Brown, unveiled the project during a 45-minute press conference. The pilot project, featuring two math courses and one statistics course, aimed to bring high-quality education to students for a fraction of the cost of the university's normal tuition. Wednesday's report went live on the university’s website sometime before noon Pacific time, appearing with little fanfare on the research page of the principal investigator for the project, Elaine D. Collins. Collins serves as associate dean in the College of Science. The report provides a long-awaited look into how the pilot project has fared. The initial results from the spring pilot led the university to put its partnership with Udacity on “pause” for the fall semester. Last month, the university released results from the summer pilot, showing increased retention and student pass rates. However, those reports barely scratched the surface of the data the university collected during the project . . . . Research has shown that at-risk students tend to struggle in online classes, said the education consultants Michael Feldstein and Phil Hill. That disadvantaged students enrolled in SJSU Plus courses posted similarly poor pass rates suggests the spring pilot was rushed, they said. ‘We have to be careful that our sense of altruism doesn’t overcome our sense of common sense,’ Hill said. ‘If we know that at-risk students don’t tend to do well in online courses, you can’t just wish away that problem.’” (“After weeks of delays, San Jose State U. releases research report on online courses”)

Take-away points: A highly touted, large-scale experiment to determine whether MOOCs are the answer to cost and accessibility problems in public higher education has produced negative results. Those students most in need of reduced costs and increased access obtained the least benefit from the experiment. The negative results have been released in stages, appearing with “little fanfare.”

And now Stacey Patton at The Chronicle of Higher Education, “Influx of Foreign Students Drives Modest Increase in Graduate-School Enrollments”:

“Enrollments in graduate programs at American colleges and universities have increased modestly, driven largely by a rise in international students, according to a report being released on Thursday by the Council of Graduate Schools . . . . The number of international students in American graduate programs went up by 8 percent from the fall of 2011 to the fall of 2012, up slightly from the 7.8-percent increase in the previous year. By contrast, first-time graduate enrollment increased by only 0.6 percent for U.S. citizens and permanent residents over the same period . . . . First-time enrollments of U.S. citizens and permanent residents was flat or down from the previous year in a number of science, technology, engineering, and mathematics fields. And total enrollment, counting both new and returning students, in graduate programs fell by more than 2 percent, to nearly 1.74 million students, in the fall of 2012, following a decline of 0.8 percent the year before . . . . It is good news that international-student enrollments are trending upward, said Debra W. Stewart, the council's president. But an increase of less than 1 percent in domestic students is worrisome, she added, given that the American economy will have an increasing need for highly skilled workers. The U.S. Department of Labor has forecast a 22-percent rise in jobs requiring at least a master's degree from 2010 to 2020, and a 20-percent rise for jobs requiring doctorates. ‘We have strong increases for international students, which is good because if we didn't have strong enrollment from abroad, some graduate programs would be faltering,’ Ms. Stewart said. ‘But there are some particular concerns about where declines continue to persist for U.S. students. We are seeing a widening gap between U.S. and international first-time enrollments in engineering, math, and computer science.’ In the fall of 2012, more than half—54.7 percent—of all graduate students who are categorized as temporary U.S. residents were enrolled in science, technology, engineering, and mathematics fields, compared with 17.3 percent of U.S. citizens and permanent residents . . . .” http://chronicle.com.libproxy.csun.edu/article/Graduate-School-Enrollments/141577/

Take-away points: Thirteen years and three generations of millennial college students (classes of 2004, 2008, & 2012) into the new millennium, and American-born enrollments in STEM graduate programs are “flat or down.” While 54.7% of international graduate student enrollments are in STEM fields, only 17.3% of U.S. citizens or permanent residents are. The increased use of digital tools in education over these years does not appear to be increasing technological acumen or interest. Could this be an effect of digital devices being toys more than tools in too many instances?

09-25-2013 01:30 PM
I have, officially, become “The Man.” Of course, as a writing program administrator I was in many ways already “The Man.” In fact, I tell new teachers in our program to use that to their advantage, to ally themselves with students against me (“Yeah, I hate this essay too but our mean writing program director is making us use it so I’m going to do my best to help you all survive it.”) But now I am a whole new level of “Man”. Now I am the Coordinator for Credentialing, Assessment, and Interdisciplinarity—an adjunctive functionary of the dean’s office tasked with making sure people who teach are qualified to do so (according to a strictly literal interpretation of our accrediting body’s guidelines), figuring out what assessment means for our college and helping departments to enact it, and shepherding a diverse collection of interdisciplinary programs in our college. Wow. I mean I can’t even talk about the position in non-institutionalese. One of the things that strikes me about the position is that it’s an odd assortment of tasks (what does credentialing have to do with interdisciplinarity?), yet at the same time the near random coupling of it all feels right in a bureaucratic sense. My motto has been, for some time, “embrace the illogic”; this position merely confirms that as the path to sanity. On the one hand I’m thrilled. I enjoy administration. It’s why I’m a WPA. And I enjoy it too because one of my primary research interests is the institution itself. What better way to understand it than to be swallowed by it whole? On the other hand, I’m uneasy. It’s more work and a WPA is, by definition, a busy person to start with. And it’s unpleasant work. I’m a cat herder, sending out frequent and carefully worded emails to chairs to get this or that done. I’m the sheriff, dropping the boom on some unsuspecting professor who is not precisely credentialed to teach such and such course. Ultimately, I am hoping to unravel more of the Gordian knot that is the institution. I hope to be sharing what I learn here, so stay tuned (and, please, wish me luck).

09-18-2013 01:30 PM
As I write this post, a lot is happening in Syria. I’ve been thinking about how one might use essays in Emerging to help students both unravel this tense international situation and understand that what we do in the classroom doesn’t just exist in the classroom but connects to the world we live in. Several essays come to mind: Madeleine Albright’s “Faith and Diplomacy” is always a useful reading when thinking about international affairs. Albright’s central argument—that while we may separate religion from politics, other countries do not—can give students tools for understanding the complexity of the conflict in Syria. Kwame Anthony Appiah’s essays “Making Conversation” and “The Primacy of Practice” are in many ways the heart of Emerging because they speak so universally. Appiah’s notion of “cosmopolitanism”—the necessity of living with difference—is being put to the test on the global stage. His separation of values and practices also underscores how difficult it is to simplify a situation like Syria. Thomas Friedman’s “The Dell Theory of Conflict Prevention” is also a faithful go-to when thinking about global politics, since his central argument is that the economic forces of globalization have a stabilizing effect on geopolitics. Students could use Syria both to confirm and to complicate Friedman’s claims. Malcolm Gladwell’s “Small Change,” new to the second edition, is a wonderful piece (because, well, Gladwell is a wonderful writer). His examination of “high risk activism” and his debunking of the “Twitter revolution” are both applicable to what’s going on in Syria. It’s not clear what’s going to happen. What is clear is that something will and must. I like to think that the work we do in our writing classrooms produces students who are ready to emerge as political agents in the public sphere. Thinking about Syria is one such opportunity.

09-12-2013 01:30 PM
In order to interpret American popular culture one has to understand America and its history. Part of that understanding involves a knowledge of America’s many, often contradictory mythologies, as I have frequently noted in these blogs and in Signs of Life in the USA, but I want to add something else about America that isn’t precisely a mythology nor even, quite, a value or ideology. And yet it has an enormous impact on American culture—one that is especially visible in the ongoing juggernaut that is the university MOOC. This is the manufacturing regime that was known in the 19th century as “the American System.” The American System was (and is) a way of manufacturing things based on standardization, efficiency, and simplicity. Providing semi-skilled workers with sophisticated machine tools with which they can assemble products whose parts are standardized and interchangeable, the American System enabled American industry to overtake and surpass English industrial production, making the United States the world's leading industrial nation before the turn of the century. So it's all good, right? The American System led to modern mass production, high productivity, and the ability to create a mass consumer society within which a cornucopia of consumer goods was made available to all classes of people on a scale unequaled in history. Henry Ford's Model T—a highly standardized automobile that could be mass produced for a mass market—put automobility within the reach of the working classes, and is probably the most famous success story of the American System. But even if we ignore the ecological unsustainability of mass production and mass consumption (and we shouldn't), there is still a problem with the American System, one that has to do with the precarious balance in a mass society between quality and quantity. The problem is that the American System excels at quantity, but at the expense of quality. Unlike, say, the tradition of German over-engineering, which produces goods manufactured to tolerances designed to enable them to last a lifetime and beyond, the American System—aided today by very precise computer calculation—designs products according to the most minimal tolerances, so minimal that the goal sometimes appears simply to be to get the thing out of the store before it breaks. At the same time, as the American System increases the quantity of a laborer's productivity, it decreases the quality of his or her life, because the System replaces highly paid skilled workers with lower paid semi-skilled workers. The result is the kind of socio-economic imbalance that is so apparent today. From the perspective of the consumer, on the other hand, there are also problems, because even as the American System fills the marketplace with goods, much of that production is junk. What is more, it not only conditions consumers to expect their purchases to break and need replacement, it also conditions them to demand that products be very cheap. The result has been that rather than purchasing fewer expensive items designed to last a lifetime, the American consumer purchases huge quantities of stuff that may be cheap at the checkout counter but can be costly in the long run because of the constant need of replacement. This is an old story, of course, but it explains a lot about American behavior and consciousness. And it can help us understand the rush to technologize and standardize education. Briefly put, the American System as applied to education is spelled MOOC. Or "Common Core." Or whatever.
Standardizing education so that its parts are interchangeable everywhere, the coming American System of Education not only sacrifices quality for quantity in its quest for greater "productivity," it also attempts to transform skilled educators into semi-skilled clerks whose job is to manage online "learning management systems." Of course, the upper classes will still demand (and get) their own old fashioned highly skilled labor force at those expensive private schools and universities that will be exempted from this process, and everyone else can eat MOOC. Americans are tolerating this diminution of their educational choices precisely because of the earlier successes of the 19th century American System. Expecting and demanding cheap goods, they are appalled by the expense of education, not realizing that education has always been expensive, but, because of the subsidy of public higher education by governmental expenditures, they have been shielded from that expense. Now that the public alternative to expensive private education has become expensive as well—due to the withdrawal of those subsidies—Americans are quite open to the application of the American System to an "industry" (and, yes, higher education is being called an industry these days) where it never belonged. And the result should be pretty much what the result was in the sphere of industrial production. Quantity without quality.

08-22-2013 12:30 PM
With all the buzz surrounding the final season of Breaking Bad (not to mention the fact that this is my last blog for the summer, though I'll be back in the fall), I thought that this would be a good time to consider the significance of endings. Of course, if you want to read a full, and magisterial, study of fictional endings as a whole, I highly recommend the late Sir Frank Kermode's The Sense of an Ending. Here I wish only to look at the finales of highly popular television dramas. The high level of attention that the concluding episodes of Breaking Bad are receiving just now puts the show into some pretty select company, joining such other series as M*A*S*H, Twin Peaks, The X-Files, Lost, and The Sopranos, whose conclusions were also national news. In the case of Twin Peaks, Lost, and The X-Files, much of the fascination with their finales was due to the devilishly complicated—not to say obscure—nature of those series, whose audiences looked forward to their concluding episodes rather in the manner of someone waiting for the arrival of next Sunday's newspaper to find the solution to that impossible crossword puzzle. Here sheer curiosity, the desire to finally have clarified just what exactly had been going on through all those seasons, was the driving factor (which is why the impossibly obscure final episode of Lost was such a letdown for many of its fans). With M*A*S*H (whose final episode set a long-running record for television viewership), on the other hand, there was a kind of family dynamic going on, a desire to see what would happen to this cast of characters who through eleven seasons had become like personal friends to their audience. But when it comes to shows like Breaking Bad, while the curiosity and the sense of personal relationship are there too, something more is going on. What this "something more" might be is suggested in Mary McNamara's Los Angeles Times review, "What will Breaking Bad's story mean?". Seeing Walter White not as an anti-hero who has blundered into evil territory but as an out-and-out villain, McNamara proposes that "Breaking Bad wasn't about how good men will go to extremes when pushed; it was about how 'good men' can be secretly bad." This significantly raises the stakes, and so, for McNamara, while series finales "are notoriously difficult . . . this one seems more important than most, carrying with it a discernible moral weight. In Gilligan's [Breaking Bad's creator] worldview, does evil survive and thrive? Can there be redemption or simply containment?" Good questions, these, because, like The Sopranos, Dexter, and Mad Men, Breaking Bad has been a television series that invites viewers to put themselves into the shoes of some awful creeps. What this TV trend signifies is not easy to assess. Of course, there is always the old "misery loves company" factor to take into account: that is, the appeal of a program featuring people whose lives are so bad that your own life feels better to you. But I think that there is something more going on here, a sense of growing desperation in this country that causes millions of viewers to wonder what their own breaking points might be, just how far they can be pushed before they abandon every restraint of civil society—before they even care any longer what those restraints are. For such viewers, Breaking Bad, and shows like it, may be a vicarious experience.
The success of series like The Walking Dead, in which all that is left in the lives of its characters is an unending war for survival of everyone against just about everyone (even your own relatives can become the "enemy" with just one fatal infection in The Walking Dead), indicates that something like this is indeed the case. It isn't the horror here that stands out: it's the absolute freedom, the complete abandonment to violence. In the end, then, the power of Breaking Bad lies in the way it holds a mirror up to a society that is tearing itself apart, and where crime, all too often, does pay (they don't call them "banksters" for nothing). My guess is that the finale of Breaking Bad will hold neither redemption nor containment, only the sickening feeling I presume we were supposed to get at the end of Natural Born Killers, when it becomes clear that the psycho-killers have finally gotten away with it.