Wednesday, August 20, 2008

The Dumbing of America

By Ted Pease
©1996

As students, teachers and parents celebrate the annual fall rites of readin’, writin’ and ’rithmetic, the question arises of whether we’ve lost our way, educationally speaking.

I mean, speaking as a teacher myself, are we doing any good, really? There are various persuasive indications that we (society as a whole, because this sure ain’t the fault of teachers alone) aren’t, and that the country is in intellectual decline. Is there any reason to wonder whether America as a whole—as gauged by yardsticks ranging from weighty scientific studies to the carping of lawmakers eager for reelection to what is probably a more accurate measure, gut feelings—is any stupider than it used to be? (Or should that be “more stupider”?)

Well, maybe “stupid” is a bit harsh. But what about students’ commitment to learning?

One indicator of our collective dumbing down that should be of particular interest to teachers comes from UCLA, which not only houses impressive basketball teams but also is home to something called the Higher Education Research Institute (which sounds authoritative, although I can’t vouch for its won-lost record).

The UCLA report says that U.S. college freshmen “are increasingly disengaged from the academic experience.” According to this week’s U.S. News & World Report, that means college students nationwide “are more easily bored and considerably less willing to work hard” at learning than they were a decade ago. More than one-third of the students in the survey said they were “frequently bored” in their high school classes, and 65 percent said they had spent less than six hours a week on homework as high school seniors. None of which bodes well for their college careers, or for the professors who will attempt to teach them. Or, finally, for the employers who hire them and the society they create.

One of the dirty little secrets of higher education is that few of us who teach are much surprised by this finding. We see disengaged students day after day, and the job of getting them off the dime seems harder every year.

At the same time, test scores and grade-point averages continue to rise. That would seem to indicate that America is getting smarter and that teachers are teaching more effectively, but grade inflation is simply an artifact of the larger academic environment, a defensive reaction to public attitudes toward education these days: Academic won-lost records are measured in enrollment and graduation rates, career income potential and alumni giving, not in terms of knowledge and learning, so grades and academic and intellectual achievement don’t necessarily mean a thing.

“During the last decade,” writes chemistry professor Henry Bauer of Virginia Tech, “college students have changed for the worse. An increasing proportion carry a chip on their shoulder and expect good grades without attending class or studying.”

Other professors surveyed by Bauer agreed that U.S. college students have become “progressively more ignorant, inattentive, inarticulate,” expecting that shelling out tuition (but not necessarily attending classes) should automatically guarantee them A’s and nifty careers after four years.

Parents and legislators and others who foot the bill at colleges and universities quite rightly have certain expectations about what they are paying for. They should take note of all this, because the UCLA report is not bunk, and Professor Bauer is not simply a cranky old fart. The fact is (speaking from the classroom trenches) that students have increasingly high assumptions of their own worth, and less willingness to prove it through hard work. Given that students pay our salaries and are increasingly demanding about what they think they are “buying” with their tuition dollars, professors and institutions are driven to acquiesce, to fudge on grades, to ratchet down their standards. It’s a demoralizing process.

But we do it. As Jacob Neusner, a rabbi and scholar formerly teaching at Brown University, says, college professors end up lying to their students and their parents, telling kids that they are smarter, more talented, motivated, challenging, interesting and engaging than they really are.

It’s not that the potential quality of the educational product is lower than it once was, but it is true that professors are beaten down daily by the relentless indifference of students who snooze or chat or talk on cell phones in class, by students’ resentment of being expected to read and remember and employ facts and concepts, by the consumer mentality of students who assume that simple enrollment—with or without engagement—means they’ve earned (and learned) something.

People who teach do so for reasons that people in other careers usually don’t consider. Somewhere in there, whoever we are, lives a curiosity, a love of something—whether it’s Chaucer, or how chemistry shapes life, or what it takes to push a rocket from here to Pluto, or how this fall’s presidential race might affect the world—along with some kind of desire to ignite the same excitement in others. For people with those kinds of passions, it is intensely demoralizing to be faced with apathy, but a tremendous rush to be able to displace it, to wake up students who bring to the university experience what author and community college teacher Peter Sacks calls a “disengaged rudeness,” and replace it with a re-engagement of a 20-year-old’s attention, a new kindling of the same passions.

Sacks, writing about “Generation X,” worries that colleges have bowed to educational consumerism and, in the process, accommodated “a generation of students who [are] increasingly disengaged from anything resembling an intellectual life.” The implications of such capitulation are dire both for higher education and for the larger society and culture that knowledge should illuminate, and in which this fall’s newest generation of students—and the rest of us—will have to live.

RIP, Noble Eddie

GOOD DOGS GONE: Shaughnessy, left, Eddie the dumb but lovable Golden, and little Lucy are all gone now, but never forgotten.

Goodbye, Old Friend
Eddie (Edward R. Murrow the lesser), 1988-2001

Right before our eyes, the Old Dog is dying. After nearly 13 years as our devoted friend — that’s 91 in people years — Eddie’s lights are going out.

How it hurts to watch him go.

Eddie’s a patient and sweet Golden Retriever. This is the kind of dog that has epitomized “unconditional love” ever since he found us back in Ohio in 1988. All his life he has gently and gratefully accepted whatever came, kind of sloppy and pretty clumsy, but always painfully trusting.

Now that he has to die, he’s doing it with a grace that I hope I remember when it’s my turn.

He’s dying. No more stupid human tricks, like the birthday hats we inflicted on him, or the reindeer antlers at Christmas. He’s a pleaser, and even as the strength ebbs in him, the will to please glows strong. But we can see him slipping away.

It’s cancer, of course. And how ironic: He’s not been sick a day in his life. While the other two dogs went through surgeries and therapy and broken limbs and convalescence, Eddie always soldiered on. He was not an exciting dog. But he has been part of our family since there has been a family. When he goes — any day now, I think — there will be a hole in our house you can drive an Alpo delivery truck through.

At the final moment, I think Eddie will look at us apologetically, because he’ll feel that he’s somehow failing us.

At least he’s not in pain. That’s what the vet says. You have to love a veterinarian who tears up as she delivers bad news. The tumors, she says, are growing and spreading. When it comes, the end will be fast. He’ll just bleed away and be gone.

He’s obviously not hurting, and that’s a comfort. Ever since he was a puppy, Eddie has done a thing we call “circus dog” when he sees his dinner coming. He jumps all four feet right off the ground, grinning and wagging like only a sweet hound can. These days he’s still jumping, but not quite off the ground anymore. When there’s no more circus dog, we’ll know Eddie’s about done, because he loves his kibbles.

He’s endured a lifetime of indignities with grace. His first veterinarian suggested cosmetic surgery to correct his sometimes awesome drooling ability: “Eddie has defective lips.” Imagine. Our daughter, when she was maybe 8, was fighting with her sister: “You’re as dumb as Eddie!” she said. It took him years not to hide in the bathtub, and even now he’s most comfortable under a table or in a corner.

But even if he wasn’t the brightest bulb, Eddie was a sweet boy who always accepted whatever came with sloppy and genuine gratitude.

We and Eddie found each other when we were in grad school. It was dumb to get a dog — grad students can’t afford to feed themselves, so the last thing we needed was the responsibility for a big dog. But one Sunday in Athens, Ohio, we had a weird spasm while reading the classified ads, went out and brought home a beefy 12-week-old puppy, wet, smelly and trembling.

He had been battered as a puppy, I think, and was scared of everything. He sat on Brenda’s lap on the living room floor for two hours while I went to the grocery. When he finally felt safe enough to move, he peed in the corner of the kitchen, sniffed at the kibbles and new puppy bowl I’d bought, and hid in the bath tub. That was his safe place for two years.

Last summer, another of our three dogs, Shaughnessy, died, also of cancer. Eddie and his black Lab younger sister, Lucy, were bewildered by the sudden hole in their lives. They were needy for reassurance, and spent a few weeks lying on each other for company. Every move we made around the house, they were there. Now we worry about Lucy, herself a decrepit 11 years old: What will she do when her best friend is no longer there to flop down beside?

Eddie has had a good, happy and healthy life. The sudden lump on his back a few weeks ago led to two surgeries. Now, amazingly fast, the tumors are back. The vet thinks they are everywhere, and the ugly shaved patch on his back is now ringed with hard, evil lumps. He’s not in pain, I think, but time is clearly short. And when he comes from the water dish to lay his dripping defective lips in my lap, I can’t push him away.

Our friend Mark says he defines periods of his life by his pets. He still can’t talk about a cat he lost in the 1980s. He’s right. This is the end of the Eddie Era, which began more than one-quarter of my own life ago. I know that we’ll always mark the day Eddie died. There won’t be any more dumb birthday dog tricks, but he’s had a good run.

He is a sweet old poop, half blind, stiff, unflaggingly devoted. Even now, as the tumors eat him from the inside, Eddie perks up at a tennis ball. He still lies in exactly the wrong place in the kitchen, where we’ll trip over him. He still breathes eager dog-breath in my face when I’m in bed. He still twitches and chases rabbits in his sleep.

Not much longer to go now. We’ll miss him. Good-bye, Old Dog.

EXTRA! EXTRA! Books Are Dead!

The Late, Lamented Book?

By Ted Pease
© 2007

(This column appeared in the Eureka (Calif.) Times-Standard, Aug. 26, 2007)

Since most Americans don’t read anymore—according to the latest poll—most of us probably missed this week’s news item about books. Turns out book publisher Jeremiah Kaplan may have been right more than a decade ago when he predicted the death of the word.

A new poll finds that 27 percent of Americans have not read a book in the past year—not a single book at all, not the Bible or Harry Potter or even Pat the Bunny. Conducted by the Associated Press/IPSOS, the survey shows that most Americans find reading irrelevant to their lives. Most of us, apparently, wouldn’t recognize a book if it bit us.

This is pretty depressing news as schools across the country reopen, full of eager students and—let’s face it—lots of books.

One guy from Dallas told the pollsters he hadn’t read any books in recent memory. He prefers to float and wrinkle in his backyard pool.

“I just get sleepy when I read,” said the 34-year-old telecommunications manager, unapologetically. Which probably says just about everything about the intellectual health of the Information Age.

According to the report, “The survey reveals a nation whose book readers, on the whole, can hardly be called ravenous. The typical person claimed to have read four books in the last year—half read more and half read fewer.”

How is this possible? This cannot be the world in which I grew up. At this moment, I have at least seven books open on my bedside table. Without books, I’d have withered as a child and surely would long since have dithered as an adult.

“Books are like vitamins,” Clare Boothe Luce once said. “When you walk into a library, you tend to pick, almost instinctively, the intellectual or the emotional vitamins you need.”

Surely America needs such sustenance. And someone must still read books. After all, the latest Harry Potter sold more than 11 million copies in its first 24 hours. But somehow, despite the efforts of J.K. Rowling, the book seems at last to be dying.

Legendary New York book publisher Jeremiah Kaplan predicted this sorry state in the mid-1990s. “Sometime in the next century,” he wrote, “we will be in a world without books, victim of the latest technological evolution in publishing.” I scoffed then, but maybe he was right.

So what toxicity has so polluted our brave new world that books gasp for breath on the sand and readers shrivel beside them? Books, said Garrison Keillor, “contain our common life and keep it against the miserable days when meanness operates with a free hand, and save it for the day when the lonesome reader opens the cover and the word is resurrected.”

But this new study of Americans’ faltering reading habits indicates that we’ve forgotten the lives that thrive between the covers of the book.

What killed the book and the vibrant and exciting worlds it contains? Was it technology—the Internet, TV, video games, iPods and telephone sex—that closed American minds? Sure, backyard pools are wonderful things—especially in Dallas in summertime. But how can lukewarm chlorine come close to washing the richness of the written word from our lives and culture?

Ever since Ooog the Caveman scratched drawings of his tribe’s great triumph over the woolly mammoth on his cave wall, we have used words to paint pictures in the mind, to tell sweeping tales of great deeds and events, both real and fantasy, to escape into and to inspire us.

Words on the page—or on cave walls, in letters or diaries or newspapers or books—take us out of ourselves into other lives and times and places, where people like us can do other things and where we can understand what it is to be and to understand other lives.

Books aren’t about publishing. As any Hogwarter can tell you, books are about ideas and dreams and courage and pain and love and loss and about the greater good that lives in all human souls. Lord knows, books are about more than reality TV or the Internet.

But this survey on the death of the book is worrisome.

I’m not worried about books disappearing--at least not in my house. Just because knuckledraggers from Dallas (or down the road in Crawford) can’t crack a book, that doesn’t mean that I won’t.

I do worry, though, what that means for the larger society. If more than 25 percent of Americans don’t read books, they probably also don’t read newspapers, they don’t pay attention to the news, they never pause to reflect about the world, and they surely don’t wonder or care much about why things are the way they are.

What saddens and frightens me about this latest evidence of American sloth is not that books themselves will perish, but that, even as we float in the end-of-summer swimming pool, ideas and concepts about the world and humanity—about what we are and who we can be—might.

____
Ted Pease is a columnist and journalism professor.

Hollywood & Free Expression

Moviedom’s ‘Nickel Delirium’:
Free Expression and Hollywood’s Evil Empire


By Edward C. Pease

Pease is professor and former head of the Department of Journalism and Communication at Utah State University, Logan, Utah. This chapter appears as “Free Expression in Hollywood: First Amendment & Censorship,” in F. Miguel Valenti, Les Brown and Laurie Trotta. More Than A Movie: Ethics in Entertainment. (Boulder, CO: Westview Press, 2000).

Every change of century is redolent with the peculiar scent of history repeating and reassessing itself in every arena, from the arts to commerce to politics. With the Mass Media Age well established as the 20th Century ends and the 21st gets itself under way, it is little wonder that one of the focal points for debate and self-examination concerns the role of the mass media throughout society.

With this chapter, we examine the collision between free expression and what many people see as a growing need to monitor and regulate the mass media industries — especially movies, television and that promising new interactive hybrid offspring, the Internet. More and more Americans, from political leaders to grandmothers, are calling for more “morality,” accountability, regulation and self-restraint. At issue, ultimately, is not just what my own grandmother would have called “good taste” and decorum, but the question of who will be licensed to decide what those things are when they appear as products of a new era of mass media omnipotence.

For both the makers and consumers of movies, TV programs and the wide wealth of self-expression that the Internet represents, the issue is an old one that has been fought before. It is a recurring debate.

On the one hand, here in the land of the free and the home of the brave, we are secure in our tradition of free and open self-expression as defined in what is arguably the most powerful sentence since “Let there be light . . . .” In just 45 words, the First Amendment to the Constitution of the United States sets forth a simple but potent recipe for a society that values individual liberty and a reverence for self-expression: “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the government for a redress of grievances.”

What the First Amendment guarantees are four critical freedoms that define a line separating the tyranny of the many from the rights of the individual; it is a way of thinking about where a single person’s freedoms end when they infringe on someone else’s, and vice versa. The First Amendment protects our individual rights, and tells government regulators and self-appointed do-gooders to back off. We can 1) think and believe what we like, and observe those beliefs as we wish; 2) express what we believe, in spoken or written word, or in newspapers, film, theater, television, or any other means of communication; 3) get together with other people to talk about those beliefs, or anything else; and 4) complain to government leaders and bureaucrats if we have a gripe, without getting in trouble for it, and expect solutions.

But even with those powerful protections, free expression is constantly in jeopardy in the mass media age, as we continue to push the envelope of those freedoms with new kinds of self-expression — words, images, ideas — in new kinds of packages that result in recurring reassessment of the wisdom of permitting absolute and completely free self-expression. The balancing act is between individual liberties of free expression on one side, and the point where those freedoms infringe on someone else’s on the other.

Some fear that raunchy excesses by the few put the freedoms of all in danger. Although the constitutional protections and social benefits of free and unfettered and wide-open self-expression — including everything from political debate to artistic expression — have been defined and reinforced by U.S. courts over more than two centuries as crucial to a free society, some stuff has always crossed the line. Every time that happens — typically media content involving sex, religion, political beliefs, violence and other issues offensive to the status quo — everyone’s First Amendment freedoms, in ways small or potentially enormous, are endangered.

As the renowned baseball manager and tangle-tongued philosopher Yogi Berra said, it’s déjà vu all over again for all of us — television, moviemakers, and those in the news media, print and broadcast alike, who have fought these kinds of battles in the past in the face of threats by the public and politicians for legislated “solutions” to the “problem” of free and unfettered expression. Given the crucial importance of free expression to society, complaints about perceived “excesses” in the mass media are nothing to be taken lightly or with arrogance. As central as the First Amendment is to all of American life, it is also under almost-daily attack in multiple arenas — when lawmakers attempt to legislate “morality” or “family values,” when presidential wannabes attack television and movies for promoting social violence, when legislators hold hearings on song lyrics or actors’ political beliefs, when community groups try to ban books or movies as “obscene,” and when elected officials censor artwork and photographs, or mandate technological solutions to “protect” TV viewers or Internet users — from what?

Every time technology has offered us an opportunity to express ourselves more broadly, it has been burned at the stake by the powers that be. It’s not that people don’t want to know things, but new things tend to scare us. There has always been conflict between the sayer and the listener, between the individual and the group.

• In prehistoric days, when Oooog the Caveman reported in cave drawings on the day’s woolly mammoth hunt, some of those who heard Oooog’s tale probably took exception to his version of events (“Not Oooog! My cousin Vinnie led the tribe!” Bam BAM!)
• And when minstrels wandered the land singing their songs, which were oral histories of “news” from the village over the hill, it was sometimes easy to offend.
• The written word took centuries on various continents to develop, in different ways. But consider the jeopardy in which the writer placed himself, and the tension between truth and prudence for the poet who wrote truth or parody about the powers-that-were.
• In 1456, Johann Gutenberg changed the world: By inventing the printing press he both freed legions of ink-stained monks from their tasks of copying manuscripts by hand, and made ideas and knowledge accessible to the many instead of just the powerful. Remember that a single upstart priest, Martin Luther, used the new printing technology to challenge the Church of Rome; he was excommunicated in 1521, but Protestantism was on the map.

It’s not the technology, of course, but how it’s used and what it expresses. And there is the challenge for each new generation. In the 1600s, John Milton, the prolific and politically incorrect poet of his day, spat in the eye of the entire power structure of England and its practices of censorship: “Give me the liberty to know,” he wrote, “to utter, and to argue freely according to conscience, above all liberties.” 1

The powers-that-be have always argued that it’s exactly that kind of uppity-ness — about liberty and free thought and individual choice — that has always led to trouble. After all, who’s better equipped to decide what kinds of stuff we see and hear and read: the writers, poets, reporters, artists and filmmakers and the rest, or those who don’t like what they say?

Up until the advent of the Internet and technology that now permits simultaneous and personally selected exchange of ideas in various forms — multimedia — the last time we had “new media,” it was television. That was the 1930s, although the corner tavern didn’t start getting TVs until after World War II. Even right-thinking people (or maybe especially right-thinking people, as it turns out, given what’s on the tube most days . . .) worried about what this new technology would bring.

Back in 1938, preeminent American essayist and wordsmith E.B. White took himself to the World’s Fair in New York City, billed as the “World of Tomorrow,” where he saw that new marvel, television, demonstrated for the first time. By that time, White was living on a saltwater farm on the Maine coast, where he found life simpler and easier to understand. Television was amazing, he thought, but troubling. White wrote:

Television will enormously enlarge the eye’s range, and, like radio, will advertise the Elsewhere. Together with the tabs, the mags, and the movies, it will insist that we forget the primary and the near in favor of the distant and the remote. More hours in every 24 will be spent digesting ideas, sounds, images — distant and concocted. In sufficient accumulation, radio sounds and television sights may become more familiar to us than their originals. . . .
When I was a child, people simply looked about them and were moderately happy; today they peer beyond the seven seas, bury themselves waist-deep in tidings, and by and large what they see and hear makes them unutterably sad. . . .
I believe television is going to be the test of the modern world, and that in this new opportunity to see beyond the range of our vision we shall discover either a new and unbearable disturbance of the general peace, or a saving radiance in the sky. We shall stand or fall by television — of that I am quite sure. 2

Motion pictures — the “movies” — had already arrived to trouble the peace before television, of course, but White makes a good point that needs considering. There are lessons for you modern Miltons and Gutenbergs and D.W. Griffiths to learn from the experiences of your predecessors, who have already fought many of the fights you’ll have to fight again.

Free and open and unrestrained expression is not only a constitutional issue, but (as Thomas Paine argued) a natural right. Although none of us takes kindly to being told what and when and where to express our thoughts, we do accept and even promote the idea that we who have the tools to communicate information and ideas should use them responsibly, with empathy, understanding, and insight.

Struggles within the press — both those who do and those who teach — over questions of performance, social responsibility, and the importance of free expression and wide-open debate as a cornerstone of democracy, have been continuous, robust, often acrimonious, and always vital to keeping freedom of expression and what it means healthy. With that freedom, to which the film and TV industries are just as entitled as newspapers and book publishers, comes a responsibility to the society and the audience that both includes and transcends “art,” a responsibility which filmmakers, TV producers and entertainment moguls ignore at their peril. And ours.

I might suggest two models from print journalism that other, newer media might consider to fight off the threat of censorship. Both connect the content of the media to their responsibilities to the audience, yet another balancing act between individual free expression and a responsibility to a larger society.

The first model comes from an era when print media were seen as dangerously powerful, with the potential not only to influence, but convince entire nations of people to act in certain ways. It was in the period immediately after World War II, when fears about the power of propaganda combined with concerns about concentration of ownership in the press. Maybe, said some in Congress, we should take another look at the liberties granted to the press in the First Amendment. Maybe this was too powerful a tool to permit just anyone to use, willy-nilly.

This worried Henry Luce, the co-founder of Time magazine and surely one of the preeminent media moguls of his day, so he commissioned a panel of scholars to study journalism and its responsibilities in a mass media age. Luce understood that journalism’s power — or perceived power — could be its downfall. The Commission on a Free and Responsible Press, which he created and paid for, was charged with coming up with a recipe for just that: a press that is responsible enough to continue to enjoy its freedom licensed under the First Amendment.

The Hutchins Commission, as it became known after its chairman, University of Chicago Chancellor Robert Hutchins, produced an admirably succinct, five-point formula for maintaining a free and responsible press in a democratic society: “The first requirement is that the media should be truthful. They should not lie,” the commission said. Further, the press must report the facts in context: “There is no fact without context, and no factual report which is uncolored by the opinions of the reporter.”

A responsible press also must provide (2) “a forum for the exchange of comment and criticism” about social issues from (3) all “constituent groups in society,” and help clarify (4) our “goals and values” while providing (5) “full access to the day’s intelligence.” Who could object to any of that? 3

A second model is more modern, the constantly evolving Code of Ethics of the Society of Professional Journalists, the primary association of the working press in the United States. The SPJ Code, developed by scholars and journalists themselves, begins this way: “Members of the Society of Professional Journalists believe that public enlightenment is the forerunner of justice and the foundation of democracy. The duty of the journalist is to further those ends by seeking truth and providing a fair and comprehensive account of events and issues.”

The journalists’ code of ethics focuses on the service that journalists provide to society and their communities. It includes four goals to govern journalistic behavior: (1) Seek truth and report it. (2) Minimize harm to sources and others. (3) Act independently, beholden to no one. (4) Be accountable to readers and viewers.4

Movies and TV entertainment people are different from news people, of course, but we are siblings who can learn from one another, especially as multimedia blend written, spoken, and visual communication. My suggestion is that, first, moviemakers might take a moment at the turn of the century to remember some history, lest it repeat itself, and that the film industry might profit from some lessons about social responsibility that news people have already learned and compiled in the Hutchins Report and the SPJ Code of Ethics, among other sources. A look back at movie history and at efforts to control the medium indicates that many of the battles fought by film producers had already been waged by their older cousins in print.

The last time the planet turned the page on a new century, the invention of an amazing new technology — the “moving picture” — revolutionized American society. By 1907, says film scholar Gregory D. Black, “Entertainment films quickly transcended ethnic, class, religious, and political lines to become the dominant institution of popular culture.”5 In New York City alone, nickelodeons — cheap, silent moving picture shows costing a nickel — pulled in some 200,000 people a day, and as many as 2 million nationally.6 Harper’s Weekly called it a nationwide “nickel delirium.”7

Sound familiar? The scale is larger today, and the choices are greater, but the delirium is the same. At the end of the 20th Century, more American homes had TV sets than indoor plumbing. The final episode of a “television show about nothing” — Seinfeld — was touted as THE cultural event of the year. A movie about the 1912 sinking of the Titanic had grossed $1 billion worldwide (that’s Billion with a B) in the six months after its release. The networks fell all over themselves to bid $18 billion (with another B) to televise professional football games through 2005. Researchers find that the average 4-year-old spends about 70 times as much “quality time” with the boob tube as with Dad (35 hours a week compared to 30 minutes). The Internet has revolutionized communication, information, education and entertainment (but what about the 500,000 hits a day to a website called “Jennycam,” which offers Web voyeurs a live view of a 21-year-old woman’s bedroom — what’s that about?). And, as a Time writer put it, when the last president of the old Communist Soviet Union is making TV commercials for Pizza Hut, “it seems pointless to argue with the medium that so dominates our lives and culture.”8

A century ago, in the early 1900s, clergy and politicians decried the advent of movies as dangerous — “a new and curious disease,” as one child expert put it in 1909; a Philadelphia minister called movies “schools for degenerates and criminals,”9 and a fellow man of the cloth said they were “schools of vice and crime . . . offering trips to hell for a nickel.”10

“Unless the law steps in and does for moving-picture shows what it has done for meat inspection and pure food,” a YMCA official intoned in 1912, “the cinematograph will continue to inject into our social order an element of degrading principle. The only way that the people, and especially the children, can be safeguarded from the influence of evil pictures is by the careful regulation of the places of exhibitions.”11

So what has changed? In the 1990s, movies like Natural Born Killers were blamed by politicians for prompting copycat killing sprees, and Hollywood generally was held responsible for a nationwide decline in “family values” and a growing preoccupation with sex and violence. Researchers counted up the number of acts of violence per hour in “family programming,” and (rightly) chastised broadcast executives for their claims that shows like The Flintstones, The Jetsons and Smurfs were “educational.”

“You can’t even watch cartoons any more! What are you going to do? Why have you let TV go so far?” an angry mother of three told TV executives at a public hearing on TV content in Peoria, Ill.

As ever, politicians and civic leaders hear the cries. Media-bashing has always been a popular pastime, whether on the floor of Congress, on the campaign trail, from the pulpit or in the classroom. The response to the advent of cinema in the first decade of the 1900s was censorship and regulation; and in the last decade of the century, the cry is for technological remedies to “protect” us from TV — the V-chip and TV ratings systems — and for legislation to clean up movies.

Progressive reformers in the early 1900s “worried that a new generation of children would learn their moral lessons at the movies,” historian Black observed, and “about the impact of modernization and urban living on the moral fiber of the nation. . . . With religious fervor, Progressives attacked saloons, dance halls, houses of prostitution, and equally harmful ‘immoral’ books, magazines, newspapers, plays — and, of course, movies.”12

The inclination — then as now — was to “protect” society from the perceived excesses of unbridled self-expression by imposing a variety of forms of censorship. Specifically, social reformers of the early 1900s, who thought the cinema was changing America’s traditional values, not reflecting them, started monitoring and then licensing movies. By 1907, community censoring bodies were springing up everywhere; Chicago’s Vice Commission proposed to ban nickelodeons altogether within the city limits, and the city passed a statute requiring prior censorship and a permit from the Chicago police superintendent before any movie could be shown. One of the first movies censored was a version of Macbeth, described as a bloody melodrama and unfit for polite society. By 1908, a national motion picture censorship board had been formed in New York.13

These days, it seems impossible to think that movies should not receive the same protection from the First Amendment that books and news media enjoy, but back then, even newspapers in Pittsburgh and Kansas City and Cincinnati and elsewhere editorialized in opposition to freedom for the “perverted” and “immoral” new medium. And the pressure on lawmakers, from state legislatures to the U.S. Congress, was often intense — as it sometimes is today as well, and for the same reasons. (Not everyone agreed, of course. The mayor of Topeka, Kan., a staunch opponent of censorship, thought moviegoers should be given more credit for common sense. “If you have a boy who can be corrupted by the ordinary run of moving picture films you might as well kill him now and save the trouble,” he told Moving Picture World magazine in 1915.14)

Moviemakers finally challenged the constitutionality of licensing and censorship in 1915, arguing to the U.S. Supreme Court that movies were part of the press and “increasingly important . . . in the spreading of knowledge and the molding of public opinion upon every kind of political, educational, religious, economic and social question.” Unbelievably, the Supreme Court, including Oliver Wendell Holmes, unanimously disagreed.15

Couldn’t happen today, right? Well, it seems impossible that the U.S. Supreme Court would ever repeat the 1915 decision, but there is plenty of evidence that the latter-day “reformers,” drawing support from otherwise unlikely partners on both the left and the right, want something to be done to curb a mass media that increasingly is seen as out of control.

“A good part of what has gone wrong in this country is due to our mass media,” U.S. Sen. John Danforth, R-Mo., said during Senate hearings on TV violence in 1993.

And this from U.S. Sen. Paul Simon, then a liberal Democrat from Illinois, the same year: “The evidence is just overwhelming that entertainment that glorifies violence adds to violence in our society. . . . There are a lot of people in the industry who won’t acknowledge that.”16

In 1993, Attorney General Janet Reno, representing the Clinton administration, testified before Senate hearings and sent chills through Hollywood, raising the specter of the infamous House Un-American Activities Committee and, later, the McCarthy hearings of the 1940s and ’50s. “In only half a century, television-brought violence has become a central theme in the lives of our young people, as central as homework and playgrounds,” Reno said. “If immediate voluntary steps are not taken and deadlines established, government should respond, and respond immediately.”17

Government control of TV and movies? In America? Simon wasn’t sure he’d go that far, but he agreed that something needed to be done to protect society from the excesses of TV and movies. “I don’t want the federal government to control content,” he said, “but what I do want to do is to put some pressure on the industry so that they will regulate themselves in an area where clearly harm is being done to our society.”18

For many of us, these kinds of sentiments, coming from the likes of Simon, who as a liberal Democrat is the kind of politician traditionally on the angels’ side of First Amendment protections, are horrifying. But they appear to be increasingly widespread as parents worry about whether the mass media, and especially television and movies, are corrupting their kids. Even though the Constitution says clearly and directly that “Congress shall make no law” curbing free expression in any form, pressures on lawmakers to insulate children from the corrupting influences of the mass media age are fierce, and often place politicians in a bind between a reverence for First Amendment guarantees and electoral realities. For many in Washington, it’s an uncomfortable place to be.

“Every day, battle lines cross movie lines wherever controversial topics are at odds with local attitudes,” points out Jack Perkins of the A&E TV network. “Now Congress is asked to decide how far is too far, how much is too much?”19

It’s history repeating itself. In Hollywood the first time around, in the early 1900s, the way the issue was resolved was for the motion picture industry to respond to public outcry by agreeing to cooperate with oversight boards, such as the National Board of Review (NBR). At least, studios hoped, establishing the NBR would halt the proliferation of different censorship bodies in every community in the land; cooperation with the NBR and industry self-censorship were seen as the lesser of a variety of greater evils and, thus, good business.

But it didn’t work. Movies such as D.W. Griffith’s 1915 epic, The Birth of a Nation, perhaps the most heavily censored film in history, still outraged critics like the National Federation of Women, which condemned it as “vile and atrocious.” (Although audiences obviously disagreed — The Birth of a Nation stood, until the 1980s, as the top-grossing movie of all time, earning about $60 million in 1915 dollars, which would amount to about a half-billion dollars today.)

By 1922, the National Board of Review was seen to be too lax and in collusion with Hollywood. The movie producers, fearful of more draconian measures against them, elected to create a trade association, the Motion Picture Producers and Distributors of America (MPPDA), to work at improving Hollywood’s image and to lobby against censorship bills at the state and national level. The “squeaky clean” front man chosen to represent the industry was Will Hays of Indiana, former chairman of the Republican National Committee and Postmaster General in the Harding administration.

“Hays was an inspired choice,” writes film historian Black. “His roots were solidly Midwestern, his politics conservatively Republican, and his religion mainstream Protestant. Teetotaler, elder in the Presbyterian church, Elk, Moose, Rotarian and Mason, Hays brought the respectability of mainstream middle America to a Jewish-dominated film industry. He symbolized the figurative Puritan in Babylon.”20

But although Hays’s group created rules requiring studios to send him scripts before they were produced, and even though he rejected some 125 of them over six years, pressure was building for federal legislation against what one religious leader called movies’ “threat to civilization.” When the talkies came in in the late 1920s, the situation — from everyone’s perspective — worsened. Now the immoral corrupters could talk on the screen, rationalize their behavior, and further flout law, order and decency. As one Catholic critic, Father Daniel Lord, S.J., the primary author of the production code that would govern Hollywood for three decades, put it, “Silent smut had been bad. Vocal smut cried to the censors for vengeance.”21 Censors stepped up their work, with the New York State censorship board alone cutting more than 4,000 scenes from 600 movies in 1928.22

Although Hays was an unqualified success in the public relations department, he made no headway at all in terms of regulating movie content or reducing the level of public complaints. What was missing was an agreement on rules that would help the studios regulate themselves. In the late 1920s, a group of Chicago Catholics, including Cardinal George Mundelein and Jesuit dramatics professor Father Lord, approached Hays and the MPPDA with “a Catholic movie code . . . a fascinating combination of Catholic theology, conservative politics and pop psychology — an amalgam that would control the content of Hollywood films for three decades.” 23

Hays and Lord managed to sell the code to moguls from MGM, Warner, Paramount and Fox, in part because the studios were jittery about their finances in the wake of the stock market crash, and the threat of a boycott by 20 million Catholic moviegoers. Given the possible alternatives, the studios embraced self-regulation in order to engender goodwill and to prevent more dire repercussions. The Code also avoided federal regulation or other outside intervention and censorship; enforcement resided in the Hays office and the MPPDA, as well as a jury of producers, which would make decisions in the case of disagreements between studios and Hays.

“The Code sets up high standards of performance for motion picture producers,” Hays said in 1934. “It states the considerations which good taste and community values make necessary in this universal form of entertainment — respect for law, respect for every religion, respect for every race, and respect for every nation.”24

Oversight of the Code eventually fell to Joseph I. Breen, but only after two other men had passed through as filmdom’s chief censors and keepers of American morality. Breen, a Catholic anti-Semite who had been doing public relations work for Hays in Hollywood, was perfect for the job. He thought the studios had duped his boss: If Hays had really thought “these lousy Jews out here will abide by the Code’s provisions, . . . he should be censured for his lack of proper knowledge of the breed. . . .” Breen wrote in a 1932 letter to a Jesuit priest friend:

They are simply a rotten bunch of vile people with no respect for anything beyond the making of money. Here [in Hollywood] we have Paganism rampant and in its most virulent form. Drunkenness and debauchery are commonplace. Sexual perversion is rampant . . . . These Jews seem to think of nothing but money-making and sexual indulgence. . . . [And these] are the men and women who decide what the film fare of the nation is to be. . . . They are, probably, the scum of the earth. 25

Among the sex fiends Breen undoubtedly was referring to was the blond bombshell herself, Mae West, whose sultriest and sexiest films, She Done Him Wrong and I’m No Angel, came out during this period, along with other “celebrations of immorality and adultery,” including Jean Harlow in Bombshell and Joan Crawford in Dancing Lady.

Says author and film historian Nat Segaloff, “Mae West did three things: She entertained people, she saved Paramount Pictures, and she single-handedly created the Legion of Decency, which was single-handedly created to fight Mae West.”26 That may be a bit overstated — it wasn’t only Mae West — but what is certain is that the increasingly explicit and risqué fare coming out of Hollywood in the 1930-34 period created yet another crisis for Hays, the studios and America’s forces of morality.

Though certainly more violent and racist in tone than most Catholic discussion about Hollywood during the early 1930s, Breen’s attitudes about Tinseltown as the modern Sodom and Gomorrah were increasingly widespread among American Catholics, who started talking again about a Catholic boycott. Hollywood, “[t]he pesthole that infests the entire country with its obscene and lascivious motion pictures must be cleaned and disinfected,” suggested the Catholic magazine Commonweal in 1934.27

In 1933, clearly disgusted by how poorly the Code was protecting American values, the Catholic Church founded the Legion of Decency, which created its own rating system and asked church members to boycott movies the Legion deemed immoral.

Monsignor Francis J. Weber served as an officer on the Legion of Decency. “The one thing a producer did not want to do was to get one-fifth of the American public on his back, and they knew they could expect that if the movies were too raunchy,” he said in a 1994 interview.28

It was in response to the Catholic challenge that the MPPDA installed Joe Breen as the official protector and interpreter of the Code. “The vulgar, the cheap and the tawdry is out,” Breen announced. “There is no room on the screen at any time for pictures which offend against common decency, and these the industry will not allow.”29

It is clear that Hollywood had a serious problem in Breen, but by the end of the 1930s, the silver screen genie was starting to escape from its bottle. In 1939, Gone With the Wind made various kinds of movie history. One of the reasons that Rhett Butler’s one-liner to Scarlett O’Hara is among the all-time most famous movie quotes is not just his delivery, but its content. “Frankly, my dear,” actor Clark Gable told Vivien Leigh, “I don’t give a damn.” He might have been addressing Breen and the do-gooders. Producer Hal Roach remembers struggling with the script, in the context of the edict of the Production Code banning profanity. “We had a meeting,” Roach says, “and we decided that there was just no other word that could do the job that ‘damn’ would in that situation, so we left the ‘damn’ in.”30

Other “subversive” and “immoral” films glorifying sex and drugs slipped through: Sex Madness, Reefer Madness, Howard Hughes’ production of Jane Russell (and her controversial cleavage) in The Outlaw, among others. But then came Pearl Harbor, and U.S. attention — and Hollywood’s — was drawn overseas and to more patriotic fare. World War II provided other grounds for censorship — national security and American morale. It was forbidden to show American soldiers dead or injured; John Huston, who went to work for the military as a filmmaker, often had scenes cut; footage of the attack on Pearl Harbor was suppressed for 16 months to avoid demoralizing the folks back home.

After the war, with the beginning of the Cold War, censorship took a new form as Hollywood was scrutinized not so much for immorality and indecency, but on grounds of patriotism. Led by U.S. Rep. J. Parnell Thomas of New Jersey (often with colleague Richard M. Nixon at his side), the House Un-American Activities Committee went hunting Communist sympathizers in Hollywood in the late 1940s. It was one of the most shameful and painful episodes in American history, a witch hunt that blacklisted many of the most talented people in Hollywood, and ruined lives. In 1947, Thomas and his HUAC colleagues subpoenaed 41 Hollywood directors, screenwriters and actors who had joined the Communist Party during the 1930s, when it had been considered rather stylish to do so, or who had contributed funds to its activities during the Depression.

Of the 41, 19 were declared “unfriendly,” meaning that they declined to answer questions about their political affiliations; 11 were questioned directly about their Communist Party membership and, minus German playwright Bertolt Brecht, who left the country immediately after his appearance, made up the infamous Hollywood 10, the first blacklist.

But as the focus turned from morality to political ideology, the grip of the Hays Office and the Production Code started to fade. Social change through the late 1950s and into the Civil Rights and Viet Nam war eras, along with a more cosmopolitan American population, effectively blunted controls on movie content.

“When I became president of the Motion Picture Association of America in May 1966,” recalls Jack Valenti, “the slippage of Hollywood studio authority over the content of films collided with an avalanching revision of American mores and customs.

“The national scene was marked by insurrection on the campus, riots in the street, rise in women’s liberation, protest of the young, doubts about the institution of marriage, abandonment of old guiding slogans, and the crumbling of social traditions. It would have been foolish to believe that movies — that most creative of art forms — could have remained unaffected by change and torment in our society.”31

The MPAA was the successor organization to MPPDA and, although as Valenti points out, the end of the era of big Hollywood studios had reduced the ability of something like the Hays Office to control content, the Production Code was officially still on the books. “From the very first day on my succession to the MPAA president’s office, I had sniffed the Production Code,” Valenti says. “There was about this stern, forbidding catalogue of ‘Do’s and Don’ts’ the odious smell of censorship. I determined to junk it at the first opportune moment.” 32

As Valenti recalls, the problem of the Code was illustrated for him in his first weeks in office, when the script of Who’s Afraid of Virginia Woolf? landed on his desk for review. He met with studio mogul Jack Warner to discuss some of the language in the script. “We talked for three hours, and the result was deletion of ‘screw’ and retention of ‘hump the hostess,’” he says. “It seemed wrong that grown men should be sitting around discussing such matters. Moreover, I was uncomfortable with the thought that this was just the beginning of an unsettling new era in film, in which we would lurch from crisis to crisis, without any suitable solution in sight.” 33

But it was both more complicated and simpler than that, despite Valenti’s hindsight. In the 1960s, movies had slipped, and television had taken over as America’s family entertainment option of choice. Hollywood was hemorrhaging as television, not movies, defined American popular culture and, increasingly, defined how American families spent their leisure time. As we had gathered around the new invention of popular radio in the 1930s and 1940s, Americans clustered in their living rooms around the tube in the 1960s, and the movie houses suffered. Television’s wholesome stuff like Leave It to Beaver and Father Knows Best appealed to entire families, so in order to attract teen-agers and young adults in the more permissive 1960s, the studios tried sexier stuff — grittier, riskier, more violent — that the TV networks couldn’t or wouldn’t touch.

In order to respond to those market and societal forces, and to shore up profits of the ailing movie business, the old Hays Code was abolished in the fall of 1968, replaced with a voluntary film rating system adapted from a similar formula in place in England. This semblance of industry self-regulation eventually evolved into the familiar G, PG, PG-13, R and NC-17 system, providing some minimal guidance at least about the content of movies and, now, TV programs as well.

“I invented what I think is the only sane way to deal with an unruly marketplace,” Valenti later said, “and that is to give advance, cautionary warning to parents, and let them use their own judgment about the movies they give their children permission to see.”34

That, too, is a bit self-serving. As ever, the specter of government stepping into the movie and TV business has been enough to force the industries into self-regulation as the lesser evil among available bad choices. The transition of MPAA-like ratings systems from movies to TV has been a political battleground in recent years, and also the source of both amusement and consternation among foreign observers.

Claude-Jean Bertrand, the late French media scholar, points out a couple of inconsistencies: U.S. television and movies are full of violence of the most abhorrent and bizarre kind, from the sci-fi Jurassic Park, to the dark Pulp Fiction and LA Confidential, to Schwarzenegger-esque senseless dismemberments and explosions. But while accepting such violence as entertainment, repressed U.S. attitudes toward sexuality censor most forms of nudity and love-making, even as a 1997 research study finds that the United States is the world’s No. 1 producer of pornography.

A trans-Atlantic flight, Bertrand recalls, offered two in-flight movies — Arnold Schwarzenegger’s Eraser and the adapted 18th-century romance Moll Flanders, both edited for “family viewing.”

Eraser is “really very funny, but terribly violent,” Bertrand said. In the other film, the Moll Flanders character poses nude for her painter husband, and the in-flight censors had superimposed a digital mosaic to obscure her bare breast.

“It’s just silly,” Bertrand said. “I really don’t know why Americans get so excited about sex in movies. American films are so full of violence. I would rather my children know how to make love than how to kill people, blow up their heads or cut them up into little pieces.”35

Whether it’s Puritanism or something else, issues of sexual and violent content in the mass media continue to capture center stage in the ongoing U.S. debate over life in the information age. Early in 1997, threatened with rising public ire over TV content and the threat of congressional action (sound familiar?), broadcasters adopted an MPAA-like ratings system for TV, which imposed the familiar PG-system labels used since 1968 in Hollywood.

Like the movie system, however, these labels — termed “age-based” because they advise viewers on the appropriate audience age — are criticized as too vague, too arbitrary, essentially useless for parents who want some guidance as to what’s actually in tonight’s episode of The X-Files or ER. The ratings system is about as helpful, said U.S. Sen. Joe Lieberman, D-CT, as “putting a sign up in front of shark-infested waters that says: ‘Be careful while swimming.’”

What’s ironic about this plan to “protect” children from the boob tube is that it may actually help the television industry target child audiences for their advertisers and make more money. That’s why there wasn’t much complaining from broadcasters about how rating TV shows is censorship, an attack on constitutional rights of free expression, or overbearing big government meddling in private lives.

When the ratings plan came out in 1997, President Bill Clinton, who had identified television as one of the most dire threats to the American family, appeared in the Oval Office with TV and Hollywood executives to call it “a huge step forward over what we have now — which is nothing.”

So why were broadcasters so accepting of the “voluntary” plan to rate something like 2,000 hours of TV programming a day? Simple: It’s a lot better for the networks and studios than content-based plans. TV mogul Ted Turner made no effort to claim that this is something broadcasters really want to do: “We are voluntarily having to comply. We don’t have any real choice. We’re either going to do it or we’re going to be done for.”

Another government-mandated “choice” forced on broadcasters is the technological “solution” of the V-chip, which permits parents to program their television sets to block out shows that carry certain amounts of sexuality, nudity, violence or profanity. Whether this will really be a useful tool for parents or simply an entertaining technological challenge to teen-aged hackers is unclear.

The threat TV producers fear is a content-based system proposed by “radical” critics who think TV producers are about as inclined to responsibly oversee children’s programming as the tobacco industry is to adopt clean-air standards. The problem, the critics argue, is that rating programs for 7-year-olds assumes that all 7-year-olds are alike and, further, that broadcasters, who will be rating their own shows, are dependable judges of what parents want their 7-year-olds to see.

The modern-day reformers prefer a more detailed, content-based scale, such as the one used in Canada. The Canadian system rates programs for sex, violence and profanity on a scale from 0 (low) to 6 (high), and then keys each program’s ratings to the V-chip attached to the TV set. For example, a given episode of NYPD Blue might be rated (by whom we don’t yet know) a 2 for partial nudity, a 3 for violent content and a 4 for strong language, and viewers could then make a more informed choice about whether they want to see it. Ultimately, the V-chip, when activated on televisions manufactured after 1998, could be set to block out programs whose ratings exceed the viewer’s preferred limits.
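
The blocking logic described above reduces to a simple threshold check: compare each category’s rating against the limits a household has set, and block the program if any category exceeds its limit. Here is a minimal sketch of that idea in Python; the category names, the 0-to-6 scale and the example numbers are illustrative assumptions drawn from the paragraph above, not the actual Canadian encoding or the V-chip’s broadcast signaling.

```python
# Minimal sketch of V-chip-style threshold blocking.
# Categories, scale and example values are illustrative only.
from dataclasses import dataclass

@dataclass
class ContentRating:
    sex: int       # 0 (none) to 6 (most explicit)
    violence: int  # 0 (none) to 6 (most graphic)
    language: int  # 0 (none) to 6 (strongest)

def should_block(program: ContentRating, limits: ContentRating) -> bool:
    """Block the program if any category exceeds the household's limit."""
    return (program.sex > limits.sex
            or program.violence > limits.violence
            or program.language > limits.language)

# The hypothetical NYPD Blue episode described above: 2 / 3 / 4.
episode = ContentRating(sex=2, violence=3, language=4)
household = ContentRating(sex=3, violence=3, language=3)

print(should_block(episode, household))  # True: language 4 exceeds the limit of 3
```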

Children’s TV groups far prefer the content-specific system, because it offers information about what’s actually in a program rather than some industry exec’s judgment of what kind of kid it suits. After all, who knows your child’s psyche better, you or some PG-17 rater? In fact, what seems more likely is that a TV show with a sexy rating will attract more, not fewer, young viewers: what 14-year-old is going to run from shows that are advertised by their ratings as containing nudity and strong language?

Advertisers already select programs to be vehicles for their products, of course, but this becomes a more exact science when they know that shows branded with a scarlet S for SEX! or V for VIOLENT! or FS for FILTHY SMUT! will deliver tidy demographic packages of young viewers, thus increasing their advertising revenues. The average American child watches about 20,000 commercials a year, and advertisers spend about $700 million a year trying to reach them. About 96 percent of the food ads that run during kids’ shows are for sugared cereals, candy, cookies and junk food. How much easier life will be for advertisers once they can target shows rated TV-K (“suitable for all children”)!

It is clear that parents want more information about what’s on TV. A 1997 poll of 1,000 American households by the Freedom Forum Media Studies Center in New York found that about 70 percent of Americans want some kind of TV ratings system to help them navigate growing cable and satellite offerings. They also preferred a content-based system to the age-based plan, 73 percent to 15 percent; 72 percent of American households with children said they would use the V-chip technology, compared to 37 percent of households with no children.

The dilemma facing producers of all mass media, new and old — newspapers and books, film and TV, Internet content, multimedia and all the rest — at the beginning of a new century is the same old song: How to balance the free expression of individual artists, writers, filmmakers, and reporters against the perennial complaint of society about “morality”? It is a classic constitutional tension and political struggle, as parents and the politicians who hear them pressure Congress to “protect” kids from the depravities of Hollywood. And, of course, the critics are not always wrong about content that is too raunchy or crass or foul-mouthed or otherwise in poor taste. Where the critics and reformers are wrong, and badly, is in their efforts, however well-intentioned, to control or censor content, whether for reasons of “decency” and “morality” (whatever that means), or to promote the “public good” and “family values” (whose?). The basic fallacy of all such attempts to legislate morality and to control mass media content is the explicit assumption that you can stop people from having dangerous thoughts simply by telling them not to.

At best, rating systems and V-chips and other such legislated efforts to protect us from television, movies and other media content might be tools for viewers, but all such measures are flawed because they depend on someone else — governments or the program producers themselves — to make judgments about what is “appropriate.” The specter of similar efforts looms to “clean up” cyberspace and the anarchy of free expression that is the Internet. But what kind of chilling effects will such arbitrary mechanisms have on those who create content for television and other mass media? If certain kinds of phrases or actions or situations become red flags for the censors who rate programs, will writers and actors and producers self-censor, as occurs now in the film industry, in order to avoid ratings that restrict their audiences and revenues? Or, on the other hand, will content producers use such mechanistic measures as an excuse to abdicate their personal judgment and responsibility for content balance, and go straight for the more profitable ratings that young audiences find more titillating?

It is hard work being a responsible consumer of media content in a mass media age of thousands of choices, but I would rather do the work of filtering and judging myself than let either self-appointed or elected censors decide what’s appropriate for me and my family in my own home.

Like all media, movies, entertainment television and all the other new products converging in the age of interactive mass media have the capacity to educate and inspire as well as to horrify and degrade. The difference is often in the eye and mind of the beholder, of course, and for those who produce media as well as for those who consume it, the challenge is to find ways to take advantage of the former while not overreacting to the latter. It is a delicate balance. The answer is not more rules or the threat of regulation, which can lead to a chilling self-censorship more insidious and damaging to the spirit, and to principles of free expression, than legislated censorship may be. A better, more lasting and more responsible solution will require that those who produce media content (TV, movies, web pages, video games, interactive media and all the rest) and those concerned with media effects alike understand the balancing act a free society demands between two critical values: individual free expression on the one hand, and the larger social good on the other.

For consumers and creators of mass media content alike, this relationship is fragile at best in a society that is truly dedicated both to individual liberties and to a larger social responsibility. The First Amendment says nothing about responsible free expression, but in a social climate of growing concern over the impact of pervasive mass media, especially as they may influence children, recklessness can pose threats to the freedom and individual liberties of content producers and consumers alike. It is not enough to push the envelope of media content simply because we can, technologically or legislatively; we should do so only because we as thoughtful, responsible individuals really think we should.

Notes
1. John Milton, “Areopagitica,” 1644.
2. E.B. White, “Removal,” One Man’s Meat. (New York: Harper & Row, 1938), pp. 2-3.
3. Hutchins Commission, A Free & Responsible Press: A General Report on Mass Communication: Newspapers, Radio, Motion Pictures, Magazines and Books. (Chicago: University of Chicago Press, 1947).
4. Society of Professional Journalists, “Code of Ethics” (Greencastle, Ind.: Society of Professional Journalists, 1995).
5. Gregory D. Black, Hollywood Censored: Morality Codes, Catholics, and the Movies. (New York: Cambridge University Press, 1994), pp. 8-9.
6. John Fell, Film Before Griffith. (Berkeley, CA: University of California Press, 1983), pp. 162-3.
7. Barton W. Currie, “The Nickel Madness,” Harper’s Weekly, Aug. 24, 1907.
8. Bruce Handy, Time, February 1998.
9. Garth Jowett, Film: The Democratic Art. (Boston: Little, Brown, 1976).
10. Wilbur F. Crafts, National Perils and Hopes: A Study Based on Current Statistics and the Observations of a Cheerful Reformer. (Cleveland: O.F.M. Barton, 1910), p. 39.
11. Letter to the editor, Darrell O. Hibbard, in The Outlook, July 13, 1912.
12. Black, Hollywood Censored, op. cit., pp. 8-9.
13. Ibid.
14. Moving Picture World 11, May 22, 1915; See Black, p. 12.
15. See Black, pp. 15-16.
16. “The Hollywood Wars,” A&E Network, 1994.
17. Ibid.
18. Ibid.
19. Ibid.
20. Black, op. cit., p. 31.
21. Daniel A. Lord, S.J., Played By Ear. (Chicago: Loyola University Press, 1955).
22. Black, op. cit., p. 34.
23. Ibid., p. 39.
24. See A&E, “The Hollywood Wars.”
25. Letter from Breen to Wilfrid Parsons, S.J., Oct. 10, 1932; See Black, pp. 70-71.
26. Nat Segalhoff, in A&E, “The Hollywood Wars.”
27. Commonweal, May 18, 1934.
28. A&E, “The Hollywood Wars.”
29. Ibid.
30. Ibid.
31. See Jack Valenti (http://www.mpaa.org/).
32. Ibid.
33. Ibid.
34. A&E, “The Hollywood Wars.”
35. Claude-Jean Bertrand, personal communication, March 1997.