Analyzing the Title of The X-Files

The first and most superficial meaning of the show’s title refers to the cases that Special Agents Mulder and Scully work during the series. Within this diegetic framework (that is, within the fictional world of the show), the X-Files are a collection of FBI “cold cases” that have not been closed because they contain one or more elements that lack plausible, rational explanations. As such, the X-Files are quite literally the mysteries being investigated.


Nicknamed “Spooky” by his fellow FBI agents and given a windowless basement office to signify his marginalized status, Agent Mulder is the keeper of these X-Files. His role in the series is to serve as the voice of credulity. He’s the one who gives the series its first catchphrase, “I want to believe.”

The series launches in the pilot with Agent Scully’s new assignment as Mulder’s partner. The pairing makes sense because Scully’s training as a medical doctor gives her the enlightened rationality that Mulder seems to lack. If Mulder is too gullible, Scully is a skeptic. Also, even though Scully is putatively the junior partner with less experience in the Bureau, her FBI supervisors make clear to her that her job is not only to work with Mulder to close X-Files by finding rational explanations for them, but also to keep an eye on Mulder personally and to report back to them about his questionable activities.

So, the agents’ job is to research and to close these X-Files, but of course the logic of the series quickly reveals itself. Mulder and Scully find some answers but never all of them, and most episodes end by preserving the insolubility of the mystery in question. The X-Files can never be fully closed because the “truth” of each case never allows itself to be read.

Another way to read the title of the series is to see the X as representing “ex-,” the Latin prefix for “out of” or “from.” As a prefix, ex- can also mean former. As a preposition, the word ex often means “without” or “excluding.” Tying this back to the internal logic of the show, these then are files that are outside or not included in the main body of files. They are outliers, both literally and figuratively. This descriptive way of looking at the X-Files also makes a clear allusion to the show’s obvious antecedents in American television. Both The Twilight Zone and The Outer Limits presented similar types of episodes that explored the supernatural, the creepy, and the weird. Not surprisingly, all three shows have titles that refer to their status as being somehow outside normal experience or existing in some sort of liminal space. The “twilight zone,” for example, is neither day nor night but always in between. I haven’t researched the question, but I wouldn’t be surprised if Chris Carter, the creator of The X-Files, cites these earlier shows as influences on his own series. In fact, I’d be surprised if he didn’t.

Continuing this idea of the X-Files belonging to a liminal space, the X in the title is also the Greek letter chi, which often represents a crossing over or a transition. We see this in everyday usage on road signs that say “Xing” rather than spelling out the word “crossing.” In that sense, the title of the series suggests that it offers a place of transition or crossing over for its characters and its audience.

Finally, as a label for the series, the title possesses one more key virtue: it is at once familiar and unfamiliar (which, by the way, is precisely Freud’s recipe for the uncanny). For now, we can just observe that we know what files are. They’re boring, mundane collections of information. Whether on the computer or in manila folders in a big metal cabinet, most of us probably have to deal with files at work all the time. But we don’t initially know what the X stands for, and that adds a dimension of mystery to the title and makes it all the more memorable.

Further, our experience of the show corresponds directly with this initial hit from the title. The X-Files follows the basic pattern of a police procedural with federal agents, so the structure of each episode is familiar. A crime or other mysterious event happens, Mulder and Scully go to investigate, they find clues, begin to structure plausible solutions, and so on. Yet the discoveries these detective figures make are often unexpected and sometimes defy explanation within the normal limits of modern rationalism.

That this standard-issue, late 20th-century rationalism itself has an ideological agenda is where the show begins to venture into conspiracy theory. We tend to imagine that at least as far as science and technology go, we live in a post-ideological world, but The X-Files clearly wants to challenge this notion. Still, I’ll leave that can of worms on the shelf until some future post.


Jim Thompson Novels Reissued

I’m thrilled to see that Mulholland Books has reissued such a sizable chunk of the Jim Thompson oeuvre. The time is ripe for a new generation to discover this “Dimestore Dostoyevsky,” as noir scholar Geoffrey O’Brien dubbed him in the afterword to Black Lizard’s 1986 edition of After Dark, My Sweet. The Vintage arm of Random House sensed that Barry Gifford and company were onto something and snatched up the Black Lizard imprint.

[Cover image: After Dark, My Sweet]

Vintage subsequently made quite a few Thompson novels available in the more aesthetically pleasing trade paperback format. Still, those books came out over twenty years ago and can be hard to find. Sadly, too, Vintage quit digging so deep into the catalogues of forgotten writers like David Goodis and Harry Whittington and used its version of Black Lizard mostly to produce new editions of Chandler and Hammett, who are formidable talents, of course, but whose work was already readily available.

As I argued in my own Master’s thesis a few years back, Thompson still deserves a much larger audience than he’s ever managed to attract in the US, but his vision is perhaps too unrelentingly dark to achieve mainstream acceptance here. Thompson’s Marxist sympathies shine through in his savage critiques of America’s capitalistic positivism, and that makes folks uneasy. So too does his insistence that criminal misfits and killers are not the monstrous others we’d like to believe; the ugly face of humanity is right there in the mirror if we’re willing to take an unflinching look. Indeed, Jordan Foster gets it exactly right in the title of her piece for Publishers Weekly: “The Killers Inside Us.” This is the mark of Thompson’s break from our post-Enlightenment pieties.

The French get Thompson, which is perhaps why we still use the French word noir to describe this sort of crime fiction. The best film adaptation of a Thompson novel is probably still Tavernier’s Coup de Torchon, which moves Pop. 1280 to French West Africa. As O’Brien writes in that afterword, the average reader of mystery fiction “wants his anxieties alleviated, not aroused,” which is why cozies are so popular with invalids and retirees. And perhaps it’s true that most crime fiction is essentially “conservative” in that it tends to resolve any rupture in the social order (such as murder or theft) by reasserting that order and ensuring that the lawless are appropriately punished.

[Cover image: Pop. 1280]

But Thompson doesn’t work that way.

A relentless experimenter in literary forms, he continually breaks genre conventions to claw through the paper-and-ink barrier that separates author from reader. The end of Savage Night is a case in point, but there are plenty of other examples. One of Thompson’s crazed narrators, for instance, unravels so completely that divergent voices occupy alternating lines in the final pages of the novel.

Personally, I’m seizing on this excuse to refresh my memory of some favorite titles and revisit the ones I don’t recall as well. Thompson’s work always rewards multiple visits and he scarcely ever wrote a novel longer than 50,000 words.

Cheers to Mulholland for reissuing these novels with new forewords by a number of today’s best crime writers (with a few curious omissions). Here’s hoping Thompson gets under the skin of a whole new generation of readers.

[Cover image: The Killer Inside Me]


Rewatching The X-Files

Thanks to Netflix streaming services, I’ve recently been re-watching The X-Files in order from the beginning. I have an abiding interest in conspiracy theories so the show is a natural fit for me. However, while I enjoyed the numerous episodes I saw back during the show’s original run from 1993 to 2002, my life at that time wasn’t such that I could watch anything too religiously. So I missed a lot.


For the most part, the episodes worked as stand-alones, since this was the era when shows like The X-Files and Buffy the Vampire Slayer were first pioneering their ideas of multi-season story arcs, a bold move back in the ’90s, before the advent of Netflix or Hulu or other “on demand” television providers. It’s true that shows were released on VHS tapes, but these had nowhere near the social currency that DVD boxed sets would later enjoy in the 2000s, and they lacked the commentary tracks, alternate takes, and other special features now routinely available on DVD.

The release of shows in DVD boxed sets marked the potential for endless re-watching, but binge-watching a series doesn’t seem to have gained the popularity it currently enjoys until shows started appearing via online streaming. As recently as five or ten years ago, you still would have had to swap out the DVD in your machine every hour or two. Now you don’t even have to buy anything; just subscribe to an inexpensive monthly service, and if you do nothing but keep watching, Netflix streaming will run episode after episode of a show one full season at a time, even conveniently editing out the opening credits so you don’t have to sit through them repeatedly.

Instead, you can stare in full spectatorial wonder without so much as touching your TV’s remote or the screen of your iPad from sunrise to sunset. Or perhaps, as is more commonly the case, from sunset to sunrise, when you groggily turn on your side and hope to catch a short nap before the world expects you as a civilized person to make your first dignified appearance for the day.

I’m not much of a binge-watcher, but clearly the current technology has opened new frontiers in sleep deprivation and social catch-up-ism. Miss the first season or three of that show everyone seems to be gabbing about at the water cooler? No problem. Just bluff your way through a cursory chat and then power through the requisite material over the weekend. Just like with Wikipedia and Shazam, we’ve never had such a rapid ability to fake and amass cultural literacy. There’s really no excuse anymore for not watching everything.

Personally, I tend to watch shows and films the way I read books: slowly, to savor them and to give some attention to detail. For me, the joys of analysis always overmatch our contemporary drive for sheer consumption. I recall an undergrad a few terms back boasting to me that his Netflix queue showed he’d watched over 10,000 films, but I wasn’t overly impressed by this factoid since he had a hard time performing a decent critical analysis of any of the stories or novels we read in class. Shoveling massive amounts of media into your head doesn’t mean you’re actually digesting it, which is why I have a bit of a hard time watching things that don’t satisfy my interpretive impulse.

Fortunately, only halfway through the first season, I’m already finding that The X-Files holds up. Yes, the clothing and hairstyles are a bit dated. And the technology is occasionally quaint, like when Scully gets paged at dinner and needs to find a pay phone, or when Mulder develops old-fashioned rolls of film in a chemical bath or gets lost in the woods and can’t call anyone for help or look up his location on GPS. But these are minor details. The central premise of the series and the various phenomena and conspiracies taken up by the individual episodes are still as rewarding and intriguing as they ever were.

I’m taking notes as I go and plan to use episodes along the way to launch into broader discussions here. For example, the pilot starts with the reliable and rational Scully first receiving her assignment to work with conspiracy-minded Mulder. Her exchange with the FBI bosses and her subsequent initial encounter with Mulder warrant some closer scrutiny. Similarly, the second episode, about a missing Air Force pilot, contains the series’ first truly uncanny moment and it’s something I think could serve as the basis for a larger exploration of Freud’s notion of the unheimlich.

Not that all my planned posts will be so densely theoretical, diving into psychological or philosophical esoterica. Not at all. It’s television after all. It’s meant to be entertaining. So you can count on me to also explain why I think The X-Files could be looked at as the anti-Scooby Doo. See, fun!

Finally, this particular post launches a couple of new categories for my blog, “television” and “conspiracy theories.” I’m hoping the introduction of both these topics will prompt me to blog more regularly. Next time I write about The X-Files, I plan to start by examining its trio of catchphrases: “The truth is out there,” “Trust no one,” and “I want to believe.” Evocative statements, but what does each of them really mean?


Teaching Video Game Theory, Part II

What Video Game Study Can Do for Academia

In my last post (“Teaching Video Game Theory, Part One: What Academic Study Can Do for Video Games”), I argued that video games deserve critical attention. But the question remains whether video games have anything essential to offer in return. What benefits can the inclusion of video games offer to Culture & Media Studies?

Well, in many ways the humanities are suffering. It’s no secret that universities around the world are in financial straits. While cutting budgets and raising tuition, administrations are looking at the numbers. And the liberal arts are not pulling their weight. According to a New York Times article about the global crisis in the liberal arts, the number of students studying the humanities at Harvard has halved in the last 50 years. Yet another NYT piece about waning student interest in the humanities reports that although nearly half of faculty salaries at Stanford University go to professors in the liberal arts, only 15% of recent Stanford grads have majored in those disciplines. These are alarming trends, and they suggest the humanities are fundamentally unsustainable. At least as they are currently imagined.


In response to this crisis in the humanities, the American Academy of Arts and Sciences issued a report last year stating: “At a time when economic anxiety is driving the public toward a narrow concept of education focused on short-term payoffs, it is imperative that colleges, universities, and their supporters make a clear and convincing case for the value of liberal arts education” (32). This report also stressed the importance of facing the new challenges of the Digital Age.

So, how do we do that? How do we make the case that a liberal arts education is worthwhile, especially with the advent of the Digital Age?

Well, teaching video games is a start. We need to bring this powerful cultural medium into the classroom and engage students on their own terms. Over the past decade I’ve become aware that fewer and fewer of our students read for enjoyment. But nearly all of them use significant amounts of their free time to play video games. Male or female, younger or older, they choose to experience these video game “texts” of their own free will.

I already argued last time for the significance of video games as cultural artifacts. Every year more academic studies of video games are published and certain trends of intellectual thought about games have already begun to emerge.

All of this scholarly focus on video games is performing interesting and culturally important work; however, as academics we need to do more to translate this emerging discipline into the classroom experiences of our students.

They crave it. Not only that, but they deserve it. And so do we.

Video games can revitalize the humanities.

In order to remind the world how valuable a liberal arts education can be, we first need to entice students into taking our classes, and then we need to make the classroom experience meaningful enough that they want to pursue degrees in our disciplines. When students are clamoring to study the humanities, financial support becomes available.

Three keys to attracting students are relevance, fun, and depth.

Relevance. Students want to take classes and study subjects that connect to their actual lives and provide them with better ways of understanding the real (and often virtual) world they inhabit on a daily basis. For a class to be relevant, it needs to provide students with the analytical tools that help them interpret the information that bombards us from every side. Part of this is learning to ask the right questions. Part of it is learning how to understand the stuff our social interactions are made of – language and ideas and assumptions and rhetorical strategies. When it comes to teaching critical thinking and effective reading and writing skills, the humanities are not just relevant but central. There’s a reason two out of the three basic R’s of education are in the humanities! Yes, ’rithmetic is important, but try surviving a day in the Digital Age without reading and writing.

Fun. Students learn best when they’re having fun. This is why so many young people retain seemingly endless minutiae about the video games they play (which they experience as fun) and recall so little about that boring world history or chemistry class where they were forced to memorize dates or formulae. Fun lights up the brain like a Christmas tree. Just look at all those presents! By contrast, boredom shuts down the mind. “Eat your peas” and “do your chores” do not inspire enthusiasm and engagement. Psychological studies bear this out, and pedagogues are already busily trying to create “useful” video games that can surreptitiously indoctrinate players with real-world information.

Depth. This one is trickier, but in some ways it’s the secret ingredient because it’s key to what students crave from classes. Relevance and fun are both very important, but alone they cannot complete the circuit of education. The avid mind of a student wants to think new thoughts, to make surprising connections, to explore uncharted areas, to see the ordinary as strange and to view the strange as ordinary, to learn how to ask important questions and how to find interesting answers, to discover the mysterious joys of an intellectual life.

Video games offer a powerful way to provide students with relevance, fun, and depth. Not only is that good education; it’s where the humanities shine.

**This essay is cross-posted on the Marylhurst Blog.**


Teaching Video Game Theory, Part I

What Academic Study Can Do for Video Games

This past spring I presented an academic paper on issues of spatial representation in the video game Portal at the annual Society for Textual Studies Conference. My paper fit well with papers by my fellow panelists, including Marylhurst’s English Department chair Meg Roland, who offered important new insights on early modern maps, and recent Marylhurst alumna Jessica Zisa, who presented a smart paper on social and natural spaces in Sebold and Thoreau. As it turned out, the juxtaposition of our various analyses provoked a lively discussion with the audience. But as we jostled out of the room after our session, I couldn’t help overhearing one of the curmudgeonly older professors grumbling, “I can’t believe there was an academic paper about a video game!”

But why not? Did I do something wrong? Was I squandering my mental energies and straining my peers’ patience with a topic beneath scholarly attention?

As you can imagine, I’ve thought about this a great deal since then, and the more I’ve considered the issue, the more important it has seemed to me that I continue studying video games.


In fact, I “doubled down,” as they say. I’ve already presented another conference paper on the video game L.A. Noire’s adaptation of the detective genre, and this fall I’m attending a semiotics conference to discuss the paradoxical fantasies of military first-person shooter games. Not only that, but I developed a reading list that turned into a syllabus, and this summer I’m proud to say that I’m teaching Marylhurst’s first ever Video Game Theory class.

So, I suppose I have some explaining to do. Why is a 19th-century Americanist with expertise in textual studies and psychoanalytic criticism spending his time playing video games? Even worse, why is he talking about it in public?

Video games are no longer the exclusive province of nerdy teenaged boys who live in their parents’ basements. Recent demographic studies by the Entertainment Software Association show that over half of American households own a dedicated gaming console, the average gamer is 31, and nearly 40% of gamers are over 36. While men do still edge out women among the gaming population, currently 48% of gamers are women.

And, beyond these basic stats, we really need to recognize that it’s not just about online fantasy games or military shooter games. Just about everybody has a game or two on their phone these days. Angry Birds anyone? Farmville? Flow? These games are changing how and when we communicate with each other. Some people use Words with Friends as an excuse to chat more frequently with friends and relatives over distance. Others use a regular online gaming night to maintain group friendships across the miles that separate their homes.

Games have been adapted to create fitness programs like Fitbit and Nike+. There are community-oriented good Samaritan game-type apps like The Extraordinaries app or the app that notifies CPR-trained specialists if someone in their vicinity needs help. Apart from the studied benefits of video games helping autistic kids adapt to social rules and learn how to communicate, there are also games specifically designed to help a variety of medical patients recover better and faster.

Beyond the stereotypes about video games that persist, what are some of the other reasons we need to think critically about this topic? For one thing, video games are big business, with the gaming industry generating over $21.5 billion last year. 2013’s top-selling game, Grand Theft Auto V, made over a billion dollars in its first three days. Compare that to other media. Top-grossing film Iron Man 3 also made over $1 billion in worldwide ticket sales, but it took nearly a month to hit that mark. Runaway bestseller Fifty Shades of Grey shattered every publishing record by selling 70 million copies in the U.S.–in both print and e-books. Counting those all at $15 (the print price), that’s also over $1 billion, but it took a couple of years to reach that mark, and such numbers are exceedingly rare.

Yes, I know it’s funny to hear an English professor measuring cultural significance by looking at sales figures. I know money isn’t the be-all and end-all of social values, but it’s a strong indicator. We all fundamentally “know” that books are obviously better and more serious works of art than movies, and even TV shows and comic books are infinitely more important than video games.

And yet… can we really just assume (or even argue) that either Fifty Shades of Grey or Iron Man 3 is an inherently superior cultural artifact to Grand Theft Auto V? In fact, do we even want to try to assert that position?

Granted, part of our job in academia is to serve as a standard-bearer for important works from the past, to ensure they are not forgotten. As a 19th-century literary scholar, I’m acutely aware of this duty, and I’m proud to say that I routinely inflict canonical “high literature” on my students, many of whom I actually convince to enjoy the experience and continue it of their own free will. But part of our job too should be showing our students how to use the powerful analytical tools at our disposal to analyze cultural artifacts that the general public chooses to experience on their own. What good are these various apparatuses we develop if they only apply to analyzing the works of “high culture” that academia elevates to special, masterpiece status? Shouldn’t we also be able to apply our tools to “low-brow” works created primarily to entertain?

I think so. And I’m not alone. In fact, English professors have been expanding the canon from the very beginning. It’s a slow and painful battle, but notice how (despite the vestigial name) English Departments now routinely teach American literature. We take it for granted now of course, but that wasn’t always the case. We even teach post-colonial “world” literature and regularly include works of “popular” fiction in our academic purview. It’s much the same throughout the humanities. For years now Culture and Media scholars have been analyzing films and television and comic books, so isn’t it time we stretch ourselves to include video games in our conversation?

Whatever one thinks of them, video games are cultural artifacts. They are “texts” of a sort, and as such they communicate meaning. Furthermore, as we know, people are choosing more and more to experience these video game “texts” on their own in preference to reading or even to watching films. So, isn’t it better for us to teach our students how to apply critical thinking and analytical tools to these new texts?

It doesn’t mean that we will quit teaching Chaucer and Shakespeare. Not at all. But it means that we must also find a way to discuss Grand Theft Auto and Call of Duty.

*A slightly edited version of this multi-part essay is being cross-posted on the Marylhurst Blog.*


The Liberal Arts Make Us Free

Last winter, the New York Times ran an article (“Humanities Studies Under Strain Around the Globe“) about the current crisis facing humanities departments at universities around the world. While the humanities have long weathered criticism that they are impractical or irrelevant in the “real world,” and cuts to humanities funding are nothing new, the present situation seems worse than ever. Now more than ever, those of us in the liberal arts need to fight for our existence and demonstrate the value of our disciplines. Not just to politicians but also to the increasingly non-academic administrators who manage our schools under the ubiquitous common (non)sense idea that every public entity needs to be “run like a business.” (More about this in a future post.)

Even more importantly, we need to convince undergraduates that studying the humanities is meaningful, that a liberal arts degree provides a valuable education with essential elements not available in other more “practical” disciplines. That “soft skills” like critical thinking and sophisticated communication can prove just as significant to one’s life and career as understanding accounting principles or knowing the fundamentals of biochemistry. Unfortunately, we in English departments and other humanities haven’t been doing a very good job of demonstrating the centrality of our role in higher education, so we’re being perceived as peripheral. As dispensable.

Harvard reports that the number of students studying humanities has halved since 1966. According to a related piece in the NYT, although nearly half of faculty salaries at Stanford University are for professors in the humanities, only 15% of recent Stanford grads have majored in the humanities. Florida governor Rick Scott recently suggested that humanities students should pay higher tuition as a penalty for pursuing “nonstrategic disciplines.” Public response to the proposal has been relatively anemic. An online petition against the proposal gathered only 2,000 signatures and could only muster the weak argument that differential tuition would result in the “decimation of the liberal arts in Florida.”

Sure, that sounds terrible, if you happen to care about the “liberal arts.” But it seems that most people don’t have much idea what their loss would mean to our culture. The “liberal arts” (English, all the other languages, literature, culture and media, philosophy, classics, etc.) represent the highest ideals of a university education. If abstract and theoretical rather than practical, the liberal arts are those disciplines designed to empower students as individuals, to inculcate the wisdom and responsibility that allows them to be good citizens, to inspire them to work for ideals and to pursue social justice, to help them serve as productive, compassionate and innovative leaders.

By contrast, the hard sciences seem almost limited by that very practicality they tout. Never mind their absolute faith in the ideological oxymoron of “scientific progress.” (I plan to discuss this at some length in a later post.) Set against the philosophical depth of the liberal arts, professional degrees and certificate programs (MBA, MD, CPA, DDS, etc.) seem like glorified trade schools.

This goes straight to why the humanities are often called the “liberal arts.” Regardless of whatever Gov. Scott may believe, they are not “liberal” (as opposed to “conservative”) in the narrowly American political sense that tends to equate them with bleeding-heart socialist ideologies. In fact, the liberal arts do not have a fundamental political bias at all. They are “liberal” in the sense of liberating, of making one free, of freeing students to think for themselves, of teaching one how to imagine what freedom means, of exploring ways to experience human freedom.

What could be more important than liberal arts to education in a democratic society? What could be more central to human experience or more vital to a meaningful life?


Review of Valhalla Rising

[Still from Valhalla Rising]

To say that Nicolas Winding Refn’s Valhalla Rising (2009) is a quiet film is not to say that it lacks impact or even action.  There’s not as much action as a typical Hollywood film-goer has been trained to expect, but when violence erupts on screen you can almost feel the blows against your own flesh. Yet, I would still say the film is quiet. As the protagonist, Danish actor Mads Mikkelsen utters not a single line of dialogue during the film’s entire hour-and-a-half running time, and very few of the other actors say much more than that.  But even without much dialogue this film speaks volumes more than lesser films manage with their empty banter and their relentless if barely coherent plots.

Some will complain that this film doesn’t have enough of a plot, but again that all depends on what you’re used to seeing on the screen.  Valhalla Rising satisfies itself with a simple and almost plodding story. Ultimately, this film isn’t very concerned with that bare-bones plot; it has other things in mind.  The story, such as it is, follows Mikkelsen as One Eye, a stoic warrior who escapes from slavery as a gladiator and joins a somewhat confused Viking Crusade that sets sail from the Scottish Highlands and ends up in “Hell.”  Mute and half-blind, One Eye proves himself adept at surviving through barbaric times with only his brawn, brains, and unflinching willingness to use brute force against just about anyone and everyone.  But again, the action that strings this film into a somewhat puzzling story doesn’t begin to describe the visual experience of watching it.

This film compels with its beautiful cinematography.  Over a hypnotic soundtrack, we are stunned by shot after shot of the emerald green highlands, gorgeous evening skies, underwater blues and reds, bare skin speckled with crimson blood, hair matted with mud.  I suspect the problem for most viewers, and the reason this film hasn’t garnered more attention, is that Valhalla Rising is neither fish nor fowl.  The violence is far too shocking and graphic for those who enjoy art-house films, but the pace is far too slow and the plot too confusing to hold the interest of those looking for action.  This movie seems sort of like Conan the Barbarian as filmed by Ingmar Bergman, if you can imagine such a thing.

For those who can stomach it, this film offers wonderful rewards, but it’s not for everyone.  After two viewings I’m still pondering One Eye’s final enigmatic action.  Personally, I found Valhalla Rising mesmerizing and powerfully moving.

Four out of five stars.


Science Fiction Summer Course

My online science fiction class at Marylhurst University (LIT215E/CMS215E) has gotten off to a lively start with another batch of great students this summer.  I teach this class pretty much every other year, and I’m always finding ways to tweak the syllabus.  This time around our main texts are Robert Silverberg’s excellent Science Fiction Hall of Fame: Volume One, 1929-1964, which we’re using in tandem with Volume One (the 2006 issue) of Jonathan Strahan’s annual Best Science Fiction and Fantasy of the Year series.  These two volumes provide us with a nice variety of science fiction stories across the last century of the genre.  While we can’t read every story for the class, these two books allow us to hit most of the high points, from Golden Age masters like Asimov, Bradbury, and Clarke to some of the standout newcomers to the field, like Ian McDonald and Paolo Bacigalupi.  We also read just three novels: Mary Shelley’s Frankenstein (1818), which started it all; H.G. Wells’s The War of the Worlds (1898), which isn’t my personal favorite of his works but which introduces the important SF theme of alien invasion; and Ursula K. Le Guin’s The Dispossessed (1974), which helps us tackle both utopian themes and feminist/gender themes.

To give the students time to read Frankenstein, our first week’s discussion taps into the two most culturally prominent SF franchises by way of David Brin’s somewhat dated but still relevant 1999 article “‘Star Wars’ despots vs. ‘Star Trek’ populists”.  I like this piece especially since it allows even those students without much interest in or experience with SF to jump right into the fray.  Also, I’ve found people tend to feel pretty passionate about both of these franchises.  We also do a bit of work exploring the line between SF and contemporary technology by reading an interview with noted futurologist Ray Kurzweil and a slightly paranoid rant against the merging of humans with machines by Eric Utne.

This time around I’m also including a lot more films than I have in the past.  This seems important since, at least in film and television, SF appears to have become accepted as virtually mainstream, whereas SF novels are still somewhat consigned to the genre ghetto except when authors who are already considered “real writers” employ SF tropes in their “serious” work.  This is the only way to account for the different cultural reception of Margaret Atwood and Ursula Le Guin, for example.  Yes, Le Guin has achieved broad literary acceptance, but this is often presented as being “in spite of” her being an SF author.  Okay, I know, I know, saying that genre writing isn’t “serious” literature amounts to fighting words in some circles, but the (perhaps) disappearing divide between “high” and “low” art is probably an issue for another blog post.  Scratch that – it’s an issue for a series of blog posts.  I’ll get on that.

So, anyway, we’re watching the following films:

  • Metropolis (1927), dir. Fritz Lang
  • The Day the Earth Stood Still (1951), dir. Robert Wise
  • 2001: A Space Odyssey (1968), dir. Stanley Kubrick
  • The Man Who Fell to Earth (1976), dir. Nicolas Roeg
  • The Matrix (1999), dir. the Wachowskis
  • A Scanner Darkly (2006), dir. Richard Linklater
  • Children of Men (2006), dir. Alfonso Cuarón
  • Moon (2009), dir. Duncan Jones
  • The Hunger Games (2012), dir. Gary Ross

I know I’ve probably opened up a whole can of space worms by publicizing my selections here, but before you reply with your own suggestions (which I welcome), just remember that this list is not supposed to represent the “best” of SF film.  It’s merely a collection of some interesting films that span a lot of years (skewed toward the present, admittedly).  I also wanted to touch on a wide variety of themes and trends in SF.

As always, I’m reading and viewing alongside my students as the term progresses.  No matter how many times I read Frankenstein, I always find new things to ponder.  I’m also excited because as I wrap up my current project on Edgar Allan Poe, I’m starting to consider attempting a longer academic work about science fiction.  Specifically, I think it might be interesting to perform psychoanalytic readings of Golden Age stories and novels.  I plan to take copious notes this term and see where this idea leads me.


Drowning in Digital Democracy, Part II

Last week I wrote a post about some of the challenges we face in a digital age where expertise and authority seem to be under constant attack, but I’d like to follow that up here by exploring this issue from a slightly different angle.

What I see as the crux of our current challenge is this: how can we ensure that the digital democratization of human knowledge does not become mired in the same anti-intellectualism that has for so long been a hallmark of our American democracy?

I know what some of you are thinking. How can I say that America is anti-intellectual?  Isn’t it true that we are home to many of the greatest universities in the world, schools that continue to draw the best and brightest from around the globe for graduate studies?

Yes, that may be true, but looking at our culture as a whole, the anti-intellectual attitude that pervades our country is undeniable. Consider how casually and caustically our politicians and pundits dismiss “experts” and “authorities” when such learned wisdom (or book-learnin’) disagrees with their own cherished personal opinions. Witness how, during last fall’s debates before the elections, senatorial candidate Elizabeth Warren’s opponent called her “Professor Warren” as a put-down. True, Professor-cum-Senator Warren still won in Massachusetts, but that state prides itself on the prestige surrounding its academic institutions.

By contrast, there are plenty of regions in our country where Warren’s academic credentials would surely have done her irreparable political damage. Throughout most of the country, American anti-intellectualism is a hard fact. And it’s one I’d guess more than a few of us eggheads had thumped into us in grade school.

This isn’t a new observation. From the 1940s to the ’60s, historian Richard Hofstadter explored these ideas in his works of social theory and political culture. The most important of Hofstadter’s studies may be Anti-intellectualism in American Life (1963), one of the two books for which he won Pulitzer Prizes. While Hofstadter saw American anti-intellectualism as part and parcel of our national heritage of utilitarianism rather than a necessary by-product of democracy, he did see anti-intellectualism as stemming at least in part from the democratization of knowledge. Not that he opposed broad access to university education.  Rather, Hofstadter saw universities as the necessary “intellectual and spiritual balance wheel” of civilized society, though he recognized an ongoing tension between the ideals of open access to university education and the highest levels of intellectual excellence.

Of course, important as his work remains, Hofstadter wrote before the dawning of our own digital age. He didn’t grapple with the new challenges presented by an online world where (for better or worse) communication is instantaneous, everything is available all the time, and everyone not only has a voice but has the ability to speak in a polyphony of voices masked in anonymity.

Still, we must be willing to admit that things may not be as dire as all that. As Adam Gopnik has pointed out in the New Yorker (“The Information: How the Internet Gets Inside Us,” Feb. 14, 2011), the World Wide Web is sort of like the palantir, the seeing stone used by wizards in J.R.R. Tolkien’s Lord of the Rings. It tends to serve as a magnifying glass for everything we view through it.

As such, it’s no surprise that many have viewed the dawn of the Digital Age as signaling the end of everything that made the modern civilized world great. Indeed, for years now, academics and public intellectuals have lamented the way our digital media has seemed to dumb down our discourse, but there are signs of hope.

In his 2009 article “Public Intellectuals 2.1” (Society 46:49-54), Daniel W. Drezner takes a brighter view of the prospects for a new intellectual renaissance in the Digital Age, predicting that blogging and the various other forms of online writing can in fact serve to reverse the cultural trend of seeing academics and intellectuals as remote and unimportant to our public life. Drezner argues that “the growth of the blogosphere breaks down—or at least lowers—the barriers erected by a professional academy” and can “provide a vetting mechanism through which public intellectuals can receive feedback and therefore fulfill their roles more effectively” (50).

Not every view is so optimistic, but it’s true that there are great online resources for serious scholarly work, and there are even smart people who are thinking amazing thoughts and writing about them online.

But let’s see what a vigorous online discussion can look like. I anxiously look forward to hearing what others have to say about the issues I’ve raised here.

This piece is cross-posted at the Marylhurst University blog:  https://blog.marylhurst.edu/blog/2013/03/26/digital-democracy-american-anti-intellectualism-part-ii/ 


Drowning in Digital Democracy, Part I

It’s become commonplace, and maybe even a little passé, to describe our own ongoing digital revolution as analogous to the advent of Gutenberg’s printing press in the 15th century.  Indeed, some points of comparison do continue to seem remarkably apt.  For example, the role of printed documents in spreading new ideas during the Reformation looks a lot like activists using Facebook and Twitter to share news and schedule protests during the Arab Spring.  Both show how technology can be a powerful force for democratization.  (Apologies if I’m stepping on any toes by seeming to valorize the Reformation as a positively democratic movement on the blog of a Catholic university, but you know what I mean.)

However, critics like Adam Gopnik in his New Yorker piece “The Information: How the Internet Gets Inside Us” (Feb. 14, 2011) have been quick to point out that overly enthusiastic interpretations of such revolutionary possibilities tend to confuse correlation with causation – that is, did the printing press give rise to the Reformation and the Enlightenment, or did it just help spread the word?  The truth probably rests somewhere between cause and coincidence, but we should be careful not to ignore the distinction.

Similarly, technology’s vocal cheerleaders seem all too ready to ignore the potential negative aspects of such improved communication technologies – like the inconvenient historical fact that totalitarian regimes have typically printed far more works of propaganda than they’ve destroyed in book burnings.  Dictators figured out quickly that it’s far easier to drown out the voices of opposition than silence them.  Pervasive misinformation can do far more damage than tearing down handbills.

Now, I’m not suggesting that our globalized digital community is a totalitarian regime.  At least on the surface it feels like just the opposite, though Jaron Lanier expresses some dire warnings about what he calls “cybernetic totalism” in his “One-Half of a Manifesto.”  I plan to address Lanier’s thoughts more fully in a future post.  For now, it’s enough to observe that in this brave new world of online culture we’ve adapted to communication being instantaneous, everything being available all the time, and everyone having a voice.  Well, in such an environment, as we succumb to the endless seas of unmediated information (and misinformation), the rule of the mob begins to feel like a real possibility.

We’re drowning in digital democracy.

Forgive me if I sound less than perfectly egalitarian here, but when everyone not only has a voice but has the ability to speak in a polyphony of voices masked in anonymity, we’re no longer looking at a lively exchange of ideas.  We’re looking at the well-known horrors of mob rule, and it doesn’t make the stakes any lower or the threats any less real that it’s happening online.

Even in the best of scenarios, when everyone has a voice the quality of the conversation can plummet very quickly.  I’m not talking about those annoying people who use their social media to tell everyone from Boise to Bangladesh that they’re making a batch of chocolate chip cookies.  Those folks are easy enough to avoid and to ignore.

No, my concern is that too many of our students and friends and journalists and politicians and, hell, all of us are relying on Wikipedia and Google.  Not only are we trusting crowd-sourced encyclopedias written by people who may have little or no education or expertise (and some of whom are hoaxers or pranksters), but we’re relying on algorithmically driven and advertisement-enhanced search engines to provide most (or all) of our information, without pausing to question or to ascertain the authority of what we’re reading.

Not only that, but because of such ready access to information we’re hearing people who are smart enough to know better trumpeting the end of all cultural and social authority.  “The expert is dead,” such digerati claim.  And indeed throughout much of our irreverent, anything-goes society, many people do seem to be acting as if at last the king is dead.

But is it really true that we no longer have any need for cultural, political, and intellectual authority?

No, in fact just the opposite is true.  Greater freedom brings with it greater responsibility.  We now need experts in every field to exert their authority more powerfully than ever.  Reason must lead.  Functional democracies (even digital ones) still need organization and leaders.  Otherwise we’re left with the chaos of a shouting match.

Having a voice is not the same as knowing how to participate in a conversation.  Access to information is not the same thing as knowing how to use it.  We didn’t close up schools because every home had a set of encyclopedias.  We didn’t tear down universities because people had access to public libraries.

All those online sources might be fine places to start looking for information, but we need to be constantly vigilant about verifying what we’re accepting as valid and credible.  We also need to get better about documenting and providing links to our sources (as you’ll notice I’m trying to do in these posts).

And finally, we need to make sure we remain very clear about the vital differences between having ready access to information and gaining an education.  Now more than ever, our students need us to teach them how to read, how to research, how to analyze information, and how to participate responsibly in this emerging digital democracy.

Of course, if the Digital Revolution truly lives up to its name, its effects will be further reaching and less predictable than any of us can imagine.  That’s the problem with revolutions – they change everything.

This piece is cross-posted on the Marylhurst University blog: https://blog.marylhurst.edu/blog/2013/03/19/drowning-in-digital-democracy-part-i/