Harmful speech as the new porn

In 1968, Lyndon Johnson appointed a National Commission on Obscenity and Pornography to investigate the supposed sexual scourge corrupting America’s youth and culture. Two years later — with Richard Nixon now president — the commission delivered its report, finding no proof of pornography’s harm and recommending repeal of laws forbidding its sale to adults, following Denmark’s example. Nixon was apoplectic. He and both parties in the Senate rejected the recommendations. “So long as I am in the White House,” he vowed, “there will be no relaxation of the national effort to control and eliminate smut from our national life.” That didn’t turn out to be terribly long.

A week ago, as part of my research on the Gutenberg age, I made a pilgrimage to Oak Knoll Books in New Castle, Delaware, a hidden delight that offers thousands of used books on books. On the shelves, I found the 1970 title, Censorship: For and Against, which brought together a dozen critics and lawyers to react to the fuss about the so-called smut commission. I’ve been devouring it.

For the parallels between the fight against harmful and hateful speech online today and the crusade against sexual speech 50 years ago are stunning: the paternalistic belief that the powerless masses (but never the powerful) are vulnerable to corruption and evil with mere exposure to content; the presumption of harm without evidence and data; cries calling for government to stamp out the threat; confusion about the definitions of what’s to be forbidden; arguments about who should be responsible; the belief that by censoring content other worries can also be erased.

Moral panic

One of the essays comes from Charles Keating, Jr., a conservative whom Nixon added to the body after having created a vacancy by dispatching another commissioner to be ambassador to India. Keating was founder of Citizens for Decent Literature and a frequent filer of amicus curiae briefs to the Supreme Court in the Ginzburg, Mishkin, and Fanny Hill obscenity cases. Later, Keating was at the center of the 1989 savings and loan scandal — a foretelling of the 2008 financial crisis — which landed him in prison. Funny how our supposed moral guardians — Nixon or Keating, Pence or Graham — end up disgracing themselves; but I digress.

Keating blames rising venereal disease, illegitimacy, and divorce on “a promiscuous attitude toward sex” fueled by “the deluge of pornography which screams at young people today.” He escalates: “At a time when the spread of pornography has reached epidemic proportions in our country and when the moral fiber of our nation seems to be rapidly unravelling, the desperate need is for enlightened and intelligent control of the poisons which threaten us.” He has found the cause of all our ills: a textbook demonstration of moral panic.

There are clear differences between his crusade and those attacking online behavior today. The boogeyman then was Hollywood but also back-alley pornographers; today, it is big, American tech companies and Russian trolls. The source of corruption then was a limited number of producers; today, it is perceived to be some vast number of anonymous, malign conspirators and commenters online. The fear then was the corruption of the masses; the fear now is microtargeting drilling directly into the heads of a strategic few. The arena then was moral; now it is more political. But there are clear similarities, too: Both are wars over speech.

“Who determines who is to speak and write, since not everyone can speak?” asks former presidential peace candidate Gene McCarthy in his chapter of the book. But now, everyone can speak.

McCarthy next asks: “Who selects what is to be recorded or transmitted to others, since not everything can be recorded?” But now, everything can be recorded and transmitted. That is the new fear: too much speech.

A defense of speech

Many of the book’s essayists defend freedom of expression over freedom from obscenity. Says Rabbi Arthur Lelyveld (father of Joseph, who would become executive editor of The New York Times): “Freedom of expression, if it is to be meaningful at all, must include freedom for ‘that which we loathe,’ for it is obvious that it is no great virtue and presents no great difficulty for one to accord freedom to what we approve or to that to which we are indifferent.” I hear too few voices today defending speech of which they disapprove.

Lelyveld then addresses directly the great bone of contention of today: truth. “That which I hold to be true has no protection if I permit that which I hold to be false to be suppressed — for you may with equal logic turn about tomorrow and label my truth as falsehood. The same test applies to what I consider lovely or unlovely, moral or immoral, edifying or unedifying.” I am stupefied at the number of smart people I know who contend that truth should be the standard to which social-media platforms are held. Truths exist in their contexts; they are not relative, but neither are they simple or absolute. Truth is hard.

“We often hear freedom recommended on the theory that if all expression is permitted, the truth is bound to win. I disagree,” writes another contributor, Charles Rembar, an attorney who championed the cases of Lady Chatterley, Tropic of Cancer, and Fanny Hill. “In the short term, falsehood seems to do about as well. Even for longer periods, there can be no assurance of truth’s victory; but over the long term, the likelihood is high. And certainly truth’s chances are better with freedom than with repression.”

A problem of definition

So what is to be banned? That is a core problem today. The UK’s — in my view, potentially harmful — Online Harms White Paper cops out of defining what is harmful but still proposes holding online companies liable for harm. Worse, its plan is to order companies to take down legal but harmful content — which, of course, makes that content de facto illegal. Similarly, Germany’s NetzDG hate speech law tells the platforms they must take down anything that is “manifestly unlawful” within 24 hours, meaning a company — not a court — is required to decide what is unlawful and whether it’s manifestly so. It’s bad law, and it has so far been copied by 13 countries, including authoritarian regimes.

As an American and a staunch defender of the First Amendment, I’m allergic to the notion of forbidden speech. But if government is going to forbid it, it damned well better clearly define what is forbidden or else the penumbra of prohibition will cast a shadow and chill on much more speech.

In the porn battle, there was similar and endless debate about the fuzzy definitions of obscenity. Author Max Lerner writes of the courts: “The lines they draw and the tests they use keep shifting, as indeed they must: What is ‘prurient’ or ‘patently offensive’ enough to offend the ‘ordinary reader’ and the going moral code? What will hurt or not hurt children and innocents? What is the offending passage like in its context…?” Even Keating questions the standard that emerged from Fanny Hill: to be censored, content must be utterly without redeeming social importance. “There are those who will say that if you can burn a book and warm your hands from the fire,” Keating says, “the book has some redeeming social value.” Keating also argues that pornography “is actually a form of prostitution because it advertises ‘sex for sale,’ offers pleasure for a price” — and since prostitution is illegal, so must pornography be. Justice Potter Stewart’s famed standard for obscenity — “I know it when I see it” — is the worst standard of all, for just like the Harms White Paper and NetzDG it requires distributors and citizens to guess. It is chilling.

Who is being protected?

“Literary censorship is an elitist notion: obscenity is something from which the masses should be shielded. We never hear a prosecutor, or a condemning judge (and rarely a commentator) declare his moral fiber has been injured by the book in question. It is always someone else’s moral fiber for which anxiety is felt. It is ‘they’ who will be damaged. In the seventeenth century, ‘they’ began to read; literacy was no longer confined to the clergy and the upper classes. And it is in the seventeenth century when we first begin to hear about censorship for obscenity.” So writes Rembar.

In the twentieth century ‘they’ began to write and communicate as never before in history — nearly all of ‘them.’ That has frightened those who had held the power to speak and broadcast. Underrepresented voices are now represented and the powerful want to silence them to protect their power. It is in the twenty-first century that we hear about control of harmful speech and hate.

“I am opposed to censorship in all forms, without any exception,” writes Carey McWilliams, who was then editor of The Nation, arguing that censorship is a form of social control. “I do not like the idea of some people trying to protect the minds and morals of other people. In practice, this means that a majority seeks to impose its standards on a minority; hence, an element of coercion is inherent in the idea of censorship.” It is also inherent in the idea of civility, an imposition of standards, expectations, and behavior from top down.

McWilliams then quotes Donald Thompson on literary censorship in England: “Political censorship is necessarily based on fear of what will happen if those whose work is censored get their way…. The nature of political censorship at any given time depends on the censor’s answer to the simple question, ‘What are you afraid of?’” Or whom are you afraid of? Rembar’s “they”? McWilliams concludes:

But in a time of turmoil and rapid social change, fears of this sort can become fused with other kinds of fears; and their censorship becomes merely one aspect of a general repression. The extent of the demands for censorship may be taken, therefore, as an indicator of the social health of a society. It is not the presence — nor the prevalence — of obscene materials that needs to be feared so much as it is the growing demand for censorship or repression. Censorship — not obscenity nor pornography — is the real problem.

Lerner puts this another way, examining a shift in the “norm-setting classes” over time. In the past, the aristocracy set norms for dress, taste, and morals. Then the middle classes did. Now, I will argue, the internet blows that apart as many communities are in a position to compete to set or protect norms.

Freedom from

Rembar notes that “reading a book is a private affair” (as it has been since silent reading replaced reading aloud starting in about the seventh century A.D.). He addresses the Constitution’s implicit — not explicit — right to be let alone, the basis of much precedent in privacy. “Privacy in law means various things,” he writes; “and one of the things it means is protection from intrusion.” He argues that in advertising, open performance, and public-address systems, “these may validly be regulated” to prevent porn from being thrust upon the unsuspecting and unwilling. It is an extension of broadcast regulation.

And that is something we grapple with still: What is shown to us, whether we want it shown to us, and how it gets there: by way of algorithm or editor or bot. What is our right not to see?

Who decides?

Max Lerner is sympathetic to the courts having to judge obscenity. “I view the Court’s efforts not so much with approval or disapproval as with compassion. Its effort is herculean and almost hopeless, for given the revolution of erotic freedom, it is like trying to push back the onrushing flood.” The courts took on the task of defining obscenity though, as Lerner points out above, they never really did draw a clear line.

The Twenty-Six Words that Created the Internet is Jeff Kosseff’s definitive history and analysis of the current fight over Section 230, the fight over who will be held responsible to forbid speech. In it, Kosseff explains how debate over intermediary liability, as this issue is called, stretches back to a 1950s court fight, Smith v. California, about whether an L.A. bookseller should have been responsible for knowing the content of every volume on his shelves.

Today politicians are shying away from deciding what is hateful and harmful, and these questions aren’t even getting to the courts because the responsibility for deciding what to ban is being put squarely on the technology platforms. Because: scale. Also because: tech companies are being portrayed as the boogeymen — indeed, the pornographers — of the age; they’re being blamed for what people do on their platforms and they’re expected to just fix it, dammit. Of course, it’s even more absurd to expect Facebook or Twitter or YouTube to know and act on every word or image on their services than it was to expect bookseller Eleazer Smith to know the naughty bits in every book on his shelves. Yet the platforms are held responsible for their users’ behavior all the same. Section 230 was designed to address that by shielding companies — including, by the way, news publishers — from liability for what others do in their space while also giving them the freedom (but not the requirement) to police what people put there. The idea was to encourage the convening of productive conversation for the good of democracy. But now the right and the left are both attacking 230 and with it the internet and with that freedom of expression. One bite has been taken out of 230 thanks to — what else? — sex, and many are trying to dilute it further or just kill it.

Today, the courts are being deprived of the opportunity to decide cases about hateful and harmful speech because platforms are making decisions about content takedowns first under their community standards and only rarely over matters of legality. This is one reason why a member of the Transatlantic High-Level Working Group on Content Moderation and Freedom of Expression — of which I am also a member — proposes the creation of national internet courts, so that these matters can be adjudicated in public, with due process. In matters of obscenity, our legal norms were negotiated in the courts; matters of hateful and harmful speech are by and large bypassing the courts and thus the public loses an opportunity to negotiate them.

What harm, exactly?

The presidential commission looked at extensive research and found little evidence of harm:

Extensive empirical investigation, both by the Commission and by others, provides no evidence that exposure to or use of explicit sexual materials plays a significant role in the causation of social or individual harms such as crime, delinquency, sexual or nonsexual deviancy or severe emotional disturbances…. Studies show that a number of factors, such as disorganized family relationships and unfavorable peer influences, are intimately related to harmful sexual behavior or adverse character development. Exposure to sexually explicit materials, however, cannot be counted as among those determinative factors. Despite the existence of widespread legal prohibitions upon the dissemination of such materials, exposure to them appears to be a usual and harmless part of the process of growing up in our society and a frequent and nondamaging occurrence among adults.

Over this, Keating and Nixon went ballistic. Said Nixon: “The commission contends that the proliferation of filthy books and plays has no lasting harmful effect on a man’s character. If that were true, it must also be true that great books, great paintings and great plays have no ennobling effect on a man’s conduct. Centuries of civilization and 10 minutes of common sense tell us otherwise.” To hell with evidence, says Nixon; I know better.

Keating, likewise, trusts his gut: “That obscenity corrupts lies within the common sense, the reason, and the logic of every man. If man is affected by his environment, by circumstances of his life, by reading, by instruction, by anything, he is certainly affected by pornography.” In the book, Keating ally Joseph Howard, a priest and a leader of the National Office of Decent Literature, doesn’t need facts when he has J. Edgar Hoover to quote: “Police officials,” said the FBI director, “unequivocally state that lewd and obscene material plays a motivating role in sexual violence…. Such filth in the hands of young people and curious adolescents does untold damage and leads to disastrous consequences.” Damn the data; full speed ahead.

Today, we see a similar habit of skipping over research, data, and evidence to get right to condemnation. We do not actually know the full impact of Facebook, Cambridge Analytica, Twitter, and social media on the election. As the commission says of the causes of deviancy, there must be other factors that got us Trump. Legislation and regulation are being proposed based on a candy bowl of tropes — the filter bubble, the echo chamber, hate speech, digital harms — without sufficient research to back up the claims. Thank goodness we are starting to see research into these questions; see, for example, Axel Bruns’ dismantling of the filter bubble.

Here I will lay some blame at the feet of the platforms, for we cannot have adequate research to test these questions until we have data from the platforms that answer questions about what people see and how they behave.

How bad is the bad of the internet? That depends on evidence of impact. It also depends on relative judgment. Rembar’s view of what he calls the “seductio ad absurdum” of sex and titillation in media: “There is an acne on our culture.” It is “an unattractive aspect of our cultural adolescence.” And: “acne is hardly fatal.”

Is today’s online yelling and shouting, insulting and lying by some people — just some, remember — an “all-pervasive poison” that imperils the nation, as Keating viewed porn? Or is it an unsightly blemish we’ll likely grow out of, as Rembar might advise?

I am not saying we leave the zits alone. I have argued again and again that Facebook and Twitter should set their own north stars and collaborate with users, the public, and government on covenants they offer to which they will be held accountable. I think standards of behavior should apply to any user, including politicians and presidents and advertisers. I strongly argue that platforms should take down threatening and harassing behavior against their users. I embrace Section 230 precisely because it gives the platforms as well as publishers the freedom to decide and enforce their own limits. But I also believe that we must respect the public and not patronize and infantilize them by believing we should protect them from themselves, saving their souls.

Permission is not endorsement

To be clear, the anti-censorship authors in the book and other allies of the commission (including The New York Times editorial page) are not defending pornography. “To affirm freedom is not to applaud that which is done under its sign,” Lelyveld writes. Ernest van den Haag, the psychoanalyst, abstracts pornography to a disturbing end: “Pornography reduces the world to orifices and organs, human action to their combinations. Sex rages in an empty world; people use each other as its anonymous bearers and vessels, bereaved of individual love and hate, thought and feeling reduced to bare sensations of pain and pleasure existing only in and for incessant copulations, without apprehension, conflict, or relationship — without human bonds.”

Likewise, by opposing censorship conducted or imposed by government, I am not defending hateful or noxious speech. When I oppose reflexive regulation, I am not defending its apparent objects — the tech companies — but instead I defend the internet and with it the free expression it enables. The question is not whether I like the vile, lying, bigoted rantings of the likes of a Donald Trump or Donald Trump Jr. or a video faked to make Nancy Pelosi look drunk — of course, I do not — but whether by banning them a precedent is set that next will affect you or me.

Hollis Alpert, film critic then for Saturday Review, warns in his essay: “The unscrupulous politician can take advantage of the emotional, hysterical, and neurotic attitudes toward pornography to incite the multitude towards approval of repressive measures that go far beyond the control of the printed word and the photographed image.”

Of freedom of expression

Richard Nixon was quite willing to sacrifice freedom of expression to obliterate smut: “Pornography can corrupt a society and a civilization,” he wrote in his response to the commission. “The pollution of our culture, the pollution of our civilization with smut and filth is as serious a situation for the American people as the pollution of our once pure air and water…. I am well aware of the importance of protecting freedom of expression. But pornography is to freedom of expression what anarchy is to liberty; as free men willingly restrain a measure of their freedom to prevent anarchy, so must we draw the line against pornography to protect freedom of expression.”

Where will lines be drawn on online speech? Against what? For what reasons? Out of what evidence? At what cost? These questions are too rarely being asked, yet answers are being offered in legislation that is having a deleterious effect on freedom of expression.

Society survived and figured out how to grapple with porn, all in all, just as it survived and figured out how to adapt to printing, the telegraph, the dime novel, the comic book, and other supposed scourges on public morality — once society learned each time to trust itself. Censorship inevitably springs from a lack of trust in our fellow citizens.

Gene McCarthy writes:

There is nothing in the historical record to show that censorship of religious or political ideas has had any lasting effect. Christianity flourished despite the efforts of the Roman Emperors to suppress it. Heresies and new religions developed and flourished in the Christian era at the height of religious suppression. The theories of democracy did not die out even though kings opposed them. And the efforts in recent times to suppress the Communist ideology and to keep it from people has not had a measurable or determinable success. Insofar as the record goes, the indications are that heresy and political ideas either flourished or died because of their own strength or weakness even though books were suppressed or burned and authors imprisoned, exiled, or executed.

And he concludes: “The real basis of freedom of speech and of expression is not, however, the right of a person to say what he thinks or what he wishes to say but the right and need of all persons to learn the truth. The only practical approach to this end is freedom of expression.”

Epilogue

An amusing sidebar to this tale: When the commission released its report, an enterprising publisher printed and sold an illustrated version of it, adding examples of what was being debated therein: that is, 546 dirty pictures. William Hamling, the publisher, and Earl Kemp, the editor, were arrested on charges of pandering to prurient interests for mailing ads for the illustrated report, and sentenced to four and three years in prison, respectively. According to Robert Brenner’s HuffPost account, someone received the mailing and took it to Keating, who took it to Nixon, who told Attorney General John Mitchell to nab them. (And some wonder why we worry about Trump attacking the press as the enemy of the people and having a willing handmaid in William Barr, who will do his bidding!) The Supreme Court upheld their convictions but the men served only 90 days, though Hamling was forced to sell his publishing house and was not permitted to write about the case: censorship upon censorship.

Two months later, Richard Nixon left office.
