Defending Liberal Values in Illiberal Times
An eternal movement: critical thought, at first subversive, turns against itself and becomes a new conformism, but one that is sanctified by its former rebellion.
Pascal Bruckner, The Tyranny of Guilt: An Essay on Western Masochism
This essay will be denounced as bigotry, which will prove its point.
Its point is not that political activists have become increasingly intolerant, or that political speech is being suppressed, or that political correctness has run amok. Those cases have already been made, widely and with varying degrees of plausibility; among their least persuasive advocates is the man elected President of the United States in November 2016. Rather, the message here is that laudable ideals of pluralism and inclusion have been distorted beyond any practical application in contemporary society. The message is that decent people have been caricatured as ugly extremists, that average citizens find themselves classed as a favored elite, that workable rules which benefit everyone are dismissed as cynical rationalizations of hierarchy, and that discrimination has been defined downward. The message is that the humble and the well-intentioned are more and more being left out of civil discourse because of social currents which place them, unfairly and illogically, among the powerful and the selfish. These are the messages critics will say represent only the presumptions of privilege. By way of illustration, read the most hostile responses to these words when they appear.
I write as a middle-aged straight white male. I also write as a husband and a father. I further write as a pro-choice, pro-same-sex marriage, pro-cannabis decriminalization individual. I write as someone who has cast ballots for left-leaning Green and New Democratic Party candidates in Canadian elections, as someone whose bookshelves hold well-thumbed works by Henry David Thoreau, Jack Kerouac, Barack Obama, Lenny Bruce, and Aleister Crowley, who has been photographed at the London gravesite of Karl Marx, and who has been an unwilling occupant of a police vehicle. I write as a child of the 1970s, who grew up on Sesame Street and Free To Be, You and Me, before discovering the joys of sex, drugs, and rock ‘n’ roll. I write as a member of Generation X, who was educated about racism and sexism throughout my academic life, who throughout my employment history has filled in job applications which ask whether I am a woman, disabled, or a visible minority, and whose cultural environment has been peopled with everyone from Hillary Clinton and Nelson Mandela to Ellen DeGeneres and RuPaul. I write as someone whose vague neo-hippie principles of 1987 have been recast as reactionary prejudices three decades later. I write to find out why that should be so.
Of course, all this might be put down as mere intransigence. Times change; people like me may not. In his 1940 short story “The Eighty-Yard Run,” the American writer Irwin Shaw wrote of a onetime college athlete who, after marrying his campus sweetheart and then struggling through the uncertainties of the Great Depression, gradually realizes the sporting exploits of his youth were his greatest achievement. The protagonist, given the splendidly symbolic name of Christian Darling, is left dislocated as his wife becomes the household breadwinner and enters the sophisticated circles of Manhattan publishing, where there is
conversation about things a normal man wouldn’t know much about, Frenchmen who painted as though they used their elbows instead of brushes, composers who wrote whole symphonies without a single melody in them, writers who knew all about politics and women who knew all about writers, the movement of the proletariat, Marx, somehow mixed up with five-dollar dinners and the best-looking women in America and fairies who made them laugh…Louise was made assistant editor and the house was always full of strange men and women who talked fast and got angry on abstract subjects like mural painting, novelists, labor unions. Negro short-story writers drank Louise’s liquor, and a lot of Jews, and big solemn men with scarred faces and knotted hands who talked slowly but clearly about picket lines and battles with guns and leadpipe at mine-shaft-head and in front of factory gates. And Louise moved among them all, confidently…
In Shaw’s characterization, Christian Darling is not unsympathetic. He loves his wife and would like to understand her new career and her new friends. Yet he feels an inexorable longing for the simplicity of the gridiron and his lost status as a prospective star player; he wonders why qualities which were so valued in the pristine past, such as the talent for making deft plays on the football field, can become so irrelevant, even obsolete, in the cosmopolitan flux of the present. In the Twenty-First Century, there are many Christian Darlings reflecting back on their own eighty-yard runs. Perhaps I’m just one of them.
This could also be just compassion fatigue. A diagnosis sometimes attached to the burnout experienced by overworked members of caring professions (nurses, social workers, and so on), the term could also refer to the general apathy felt by anyone confronted with too many causes – good causes, perhaps, and causes which anyone would normally support, but only in the absence of a dozen others clamoring for his or her attention. Tell me once about racial profiling, or sexual harassment, or gay-bashing, and you will certainly get a sympathetic hearing. Badger me incessantly about all of those and many more, public shaming after public shaming, official apology after official apology, year after year, and my patience will wear thin.
So the trends I lament may well be irreversible, and in any case I have no desire to turn back the clock or arrest anyone else’s advances, much less a coherent plan to do so. And I can acknowledge my own biases and blind spots, the instilled outlook and personal experiences which have shaped my opinion and which will no doubt cause some to write off whatever opinions I express. Yet it should still be possible to critique the more simplistic ideas now touted as forward thinking, not from the perspective of someone who has always resisted them but as someone to whom they once seemed more reasonable. It should still be possible to critique double standards and false equivalencies, whatever notions of equality they are meant to serve. It should still be possible to critique modern excesses of ideology, no matter what noble traditions the ideologues imagine themselves to be maintaining. It should still be possible to take issue with fashionable positions, without being rejected, fashionably, as a hater. It is still possible to be aware without needing to be woke. I hope to present here a kind of liberal skepticism – to question whether what is termed social justice is always a genuine improvement over existing conditions, and, conversely, to ask if so-called injustice must always be protested and prevented. Yes, I am who I am, with all the preconceptions that go with my background and my age. Nevertheless, I have seen enough change over the years to notice that not all of it has brought unqualified good. I have applauded enough change to wonder why it should need to be changed again. I have supported enough change to have earned my misgivings about what changes may come. At the very least, I have learned a crucial distinction which I will outline in the paragraphs that follow: Progress is always change, but not all change is progress.
I. FLOWER POWER FAILURE
Black Lives Matter. Gay Pride. Idle No More. Homophobia. Transphobia. Islamophobia. Rape Culture. Take a Knee. #MeToo. Virtue signalling. Snowflakes. Identity. Diversity. Safe spaces. Trigger warnings. Antifa. Occupy. Intersectionality. We are presently in the midst of the greatest cultural disruption in fifty years, as a range of distinct populations – women, blacks, Muslims, Natives, gay people, transgender people – are demanding rights and recognition with an urgency not seen since the protest era of the 1960s and 70s. The stories dominate the news; the controversies inflame the message boards; the causes absorb the courts and the legislatures. But just like the Summer of Love, the breakthroughs of one epoch can’t be perfectly replicated in another. Whereas today’s activists insist that social justice has a long way to go, their critics point out how far it has come – and how far it has strayed.
The civil rights and sexual revolutions of the Twentieth Century’s second half were historic in their impact. Longstanding codes and customs, around race, gender, private morality and public order, changed forever. For the first time, the full security of citizenship was deemed applicable to non-whites as well as whites. For the first time, the full benefits of personal autonomy were made accessible to females as well as males. For the first time, the full opportunities of human freedom were granted to unconventional fringes as well as dominant centres. The gains were uneven, certainly, but the language of equality, fairness, and liberation which idealized them became entrenched in our discourse and our democracy.
So entrenched, indeed, that several generations have grown up equating grievance with goodness: to claim a place in society, goes this assumption, you must first be by some reckoning a victim of it. To ensure you are treated the same, you must first specify your difference. If you have no unmet needs, you have only unearned advantages. Racism, sexism, and other prejudices are everywhere. The past should be continuously evaluated, and condemned, against the sensibilities of the present. Governments exist to protect rights, the legal system exists to define them, and progress is judged by the varieties of people they specially acknowledge. The compelling moral dramas presented by the civil rights and feminist movements of the 1950s, 60s, and 70s have been uncritically adopted and extended by later offshoots. Not just in their specific tactics – marches, sit-ins, lawsuits, media appearances, and so on – but in their very premise: a defined segment of the citizenry has been systemically deprived of basic civil liberties long enjoyed by others. Yet as this premise has been taken up by more and more (and smaller and smaller) groups, it has become less and less convincing.
What is significant here is that, unlike five or six decades ago, not all the unconvinced are ignorant or hostile. Of course, the rise of Donald Trump may be attributed to an ugly “whitelash” driven by an uglier alt-right, the contemporary versions of Richard Nixon and the silent majority caricatured by Archie Bunker. But it isn’t just resentful conservatives questioning the modern rhetoric of social protest. Many people wholly support feminist principles – which is why they oppose sexual standards that reduce women to helpless objects of male predation. Many people enjoy living in multicultural communities – which is why they dislike seeing single cultures invoke an unapproachable separateness off-limits to outsiders. Many people agreed with former Canadian Justice Minister (later, Prime Minister) Pierre Trudeau’s 1967 comment that the state has no business in the bedrooms of the nation – so they deplore the notion that all manner of bedroom predilections should have to be addressed by the state. Many people revere Martin Luther King’s dream of being judged not by color of skin but by content of character – so they lament a reality where color and gender are factored in everywhere from federal cabinets and federal funding to municipal boards and municipal budgets. Many people have been pleased to share cultural space with those of other backgrounds and persuasions – so they are dismayed when the already accepted strive to reinvent themselves as persecuted minorities. Many people reject bigotry and rejoice in its decline – which is why they despair when its meaning now encompasses an “unconscious bias” impossible to overcome.
These people ask, moreover, whether it’s feasible to live under what sometimes resembles a permanent program of including the supposedly excluded (however self-defined their exclusion) and shaming the supposedly prejudiced (however open they thought they already were). In ever more convoluted ways, it seems that any social order is assessed by one-dimensional measures of bigotry — who demonstrates it, and who suffers from it — more than any other standard. We are so accustomed to the idioms of social justice that we uncritically accept its portrayals of stodgy, chauvinistic authorities denying things to a rainbow of vulnerable free spirits. But that simple stereotype of noble victims and narrow-minded villains has outlived its usefulness.
Why? Distinct social groups have been demanding rights and recognition from a dominant majority for a long time; now, after legal breakthroughs and demographic evolution, the dominant majority is itself indistinct, yet one cohort remains its default characterization. The current calculus of rights and freedoms insists on a guilty-until-proven-innocent paradigm, whereby anyone not claiming oppression is potentially an oppressor. In this climate of constant, quasi-official alertness for racism, sexism, and homophobia, we assume that actual racists, sexists, and homophobes are everywhere, and once more, naturally, it is a particular category of persons given the roles of the comfortable, the advantaged, the bad guys. When extolling the virtues of diversity, that is, we mean fewer heterosexual men of European descent in positions of visibility and control.
And those roles – of the undeserving and the unsympathetic – have been expanded. “Tolerance” used to signify a live-and-let-live pluralism, but it has since come to imply something closer to active endorsement, such that anything less than enthusiastic support (of LGBTQ rights, of feminist values, of Islamic customs) is deemed to be hate. It was once deemed okay to be different; now it’s obligatory. Indeed, in some arguments the mere fact of not holding a given identity is to carry the suspicion of being opposed to it; whatever set of people are being defended, there must be a corresponding set of -ists or -phobics to defend against. True tolerance, however, is a willingness to coexist, not a requirement to cater. When the charge of racism can be leveled at the people who run the Academy Awards as easily as at the people who once ran Alabama, the term has lost much of its force. When any community can stake out a mandated separateness and, at the same time, petition for full integration into the mainstream, the discrimination the mainstream once learned to reject becomes somehow acceptable. Democracy promises equality despite differences, not because of them. What is needed is not more diversity for its own sake, wherein one specific class of people is, always and exclusively, obligated to accommodate every other, but the willingness to transcend it.
Straight white males, to be sure, can sometimes become Angry White Males – the supremacist thugs, racial separatists, and online trolls comprising today’s alt-right. Yet at least some of those fanatics may have been pushed toward the socio-political fringes rather than drawn to them. Some Angry White Males certainly are bad guys who resent ceding any of their long-held social comforts, but just as many white men appreciate the wrongs of the past and welcome new regimes of openness and acceptance. Failing to differentiate between these camps is partly why we are faced with bathroom wars, a Brexit, and Donald Trump – and neo-Nazis. To be continually scolded for harboring attitudes they have long outgrown, for enjoying prerogatives they have long relinquished, or for having a nature they have long stopped assuming is everybody else’s, is to provoke decent men into a reaction they don’t really want to feel.
Many current neo-Nazis were probably warped all along, but perhaps a few have concluded they have nothing to lose, and a supportive community to gain, by living up to a reputation imposed on them from the outside. Over the decades, politicians as varied as Richard Nixon, Ronald Reagan, George W. Bush, John McCain, Mitt Romney, and Canadian conservatives Preston Manning, Stockwell Day, and Stephen Harper, have all been charged with stoking racial resentment and playing to the worst instincts of white Christian tribalism. Legitimate debates around free speech, immigration, and diversity have been written off as mere smokescreens for bigotry. Goaded by outrage outlets like Fox News and talk radio, some people who might have otherwise remained oddball libertarians or wonkish tax-cutters have seen themselves categorized as radicals and then embraced the label, in an ongoing cycle of vilification and polarization. Yesterday’s Rush Limbaugh-listening adolescent – uneducated, unemployed, and constantly reminded of supposed insults to his identity – has grown into today’s meme-sharing bully.
Unlike their forebears of the Jim Crow southern US or the office culture of the Mad Men era, most angry white males of today are not fighting to protect their privilege but protesting others’ presumption that they still have privileges to protect. They are the one community that cannot complain of the historical discrimination routinely (and accurately) raised by women, gays and lesbians, and people of color. While few suggest they’ve been denied an ample share of economic security, they are often charged with being on the wrong side of social justice. In court cases, media outrages, campus policies, and national legislation, straight white males are told, at least implicitly, that they are both obsolete and guilty: woefully outdated as identities and shamefully undeserving as citizens. Yet this message, repeated across the political conversation, poses complications for democracy.
One is that private self-definition is now a key determinant of public activism. Labor and environmental movements, which presume a universal relevance (pollution and job markets affect everyone), now compete with far narrower campaigns in which small groups claim popular attention disproportionate to their numbers. Feminism and the civil rights demands of the 1960s derived from the obvious unfairness directed at broad populations; women and blacks were objectively denied full civic equality, whatever their individual responses to the fact. Today, however, assertion of a hitherto unseen disposition invariably accompanies an appeal for acknowledgement before the law. It is no longer enough to live as one pleases and assume legal protection; the slightest personal inclination must be righteously politicized, goes the orthodoxy, or it isn’t real and it won’t be protected. It is no longer just that the personal is the political, as the old slogan declared, but that only the personal is the political. Any ideals of common interests, held by the mainstream as much as the periphery, the numerically dominant as much as the numerically obscure, are thus eroded.
The reflexive denigration of straight white maleness points to deeper problems. One of the triumphs of our modern judicial system is that power is no longer exclusive to a favored caste of WASPs: no contemporary employer, landlord, merchant, or educator would cite his Caucasian superiority and expect to get away with it in court. No one today insists that black people should sit at the back of buses, or that women should be paid less than men who do the same job, or that gay people should be charged as criminals. Yet institutional racism, sexism, and homophobia are still blamed whenever a female, gay, or non-white person is offended by something – the possibility of honest differences or legitimate debates on matters of race or sexuality is, all too often, discounted.
And then illegitimate debates come to the fore. As early as 1993, in Robert Hughes’ book Culture of Complaint: The Fraying of America, a previous cohort of angry white males were spotted scoring cheap points in the then-novel controversy around political correctness. “[C]ampus conservatives,” Hughes wrote, “…are flourishing, delighted that the PC folk give some drunken creep of a student who bellows ‘nigger’ and ‘dyke’ into the campus night the opportunity to posture as a martyr to speech-repression.” As Hughes recognized, the key sensibility driving such crudities was the students’ conceit of being disconnected from power. Contemporary angry white males, likewise, see themselves as rebels against orthodoxy, rather than defenders of status: the pendulum has swung, they say, and they are at the impudent, unconventional end of it, just as liberals used to be. So if liberal values have since been elevated into federal laws, common custom, and received wisdom, they are now fair game for protest and pranks.
Intentionally offensive as their pranks can be, when the alt-rightists and their ilk sneer at “political correctness,” they mean to illustrate how racism ought to mean something more deliberate and more entrenched than just any act, statement, or condition someone finds objectionable, and how sexism alone cannot explain why there are not more female CEOs. It is one thing to say that not enough has changed since the days of segregation and the Playboy Club, but it is something else to say that nothing has changed at all. The wrongs that motivated people in 1960 or 1975 have no obvious equivalents in the Twenty-First Century. Given the landmark legal and cultural evolutions of the last fifty years, progressives who try to shame others into coming over to their side will find their opponents are becoming, in every sense, shameless.
That shamelessness, then, is an indirect response to the personal-as-political sensibilities of progressives themselves. Since white males have so often been told they can never understand the needs of women and non-whites, the most incensed of them have retreated into openly racist and sexist “White Pride” or “Men’s Rights” associations, which they posit as no more than the equivalents of similarly sequestered groups formed by others. The criticism that some kinds of ethnic or gendered solidarity are good but some are bad has become less and less potent, as numerous alt-right bodies brazenly proclaim their own allegiances in the same unapologetic terms as their social opposites.
Even the more moderate conservatives who predate (and usually disavow) the rise of the alt-right have occasionally been accused of practicing “dog whistle politics” and using “code words.” The phrases refer to the alleged deployment of innocuous-sounding statements which in fact are meant to rile simmering prejudices: the implication is that, rather than play directly to crude stereotypes or dangerous emotions, politicians and others will slyly frame their positions with subtle winks and nudges that carry no obvious, indictable offense yet which are nonetheless inflammatory. To be accused of using dog whistles and code words is considered a very serious charge.
For example, Richard Nixon’s 1968 victory in the US presidential race has been credited to his promise to restore “law and order,” an invocation which liberals understood to signify a real message of anti-black alarmism. But the liberals never explained why law and order was so inherently bad, or why only racists and reactionaries would support it. In the 2015 Canadian federal election campaign, then-Prime Minister Stephen Harper’s reference to “old-stock” Canadians was taken in some quarters as a subtle anti-immigration quip, yet, again, no one could deny that some citizens do in fact have deeper generational roots in the country than others. At some point, it must be acknowledged that the alleged code-talkers and dog-whistlers are just expressing reasonable positions, rather than masking ugly ideas with polite euphemisms. How coded do the words have to be, or how high does the dog whistle have to sound, before we admit that the speaker means only what he or she says?
In the 1940s and 50s, the same use of symbolic terms and secret signals was perceived at another end of the political spectrum. Anyone who called for stronger workplace laws was really pushing the Moscow line, it was said; anyone who marched against racial segregation was really working for the Reds; anyone who listened to folk music was really a closet Communist. Perhaps some of those people truly were Soviet sympathizers (indeed, a few have been proven to be), but most were simply and sincerely in favor of unions, integration, and folk music. Likewise, someone who today expresses an opposition to hijabs or gay marriage, or even criticism of them, is not likely plotting mass incarcerations. Pro-labor and pro-family propositions alike ought to be freely stated without being dismissed as mere fronts for some sinister ideology. Where exactly does seeking popular support end and reckless demagoguery begin? Who gets to determine the line between moderation and extremism?
Contemporary audiences have become so steeped in the machinations of politics, and so accustomed to pundits and talking heads cynically deconstructing politicians’ every utterance, that the legitimacy of rhetoric as a political tool has been forgotten. A candidate who pledges to “invest in the future” rather than “raise taxes” is trying to persuade, not to deceive; there is nothing malign about advancing arguable points in effective language. There is also something presumptuous in asserting that we can read the evil thoughts of superficially decent orators and their audiences, as if democratic principles of free speech and free conscience only apply to those we already agree with. “Code words” and “dog-whistle politics” are actually symptoms of the same problem they are meant to denounce: the inexorable polarization and distrust of public discourse.
Whether they engage with the alt-right or not, and whether or not they detect code words or dog whistles, the driving cause of progressives in the present is usually characterized as “social justice.” The concept has been raised for some time and has been given many broad definitions, all clustering around general ideals of economic security for everyone, equal treatment of diverse populations, and opposition to discrimination. Social justice is the rallying cry of anti-poverty, anti-racism, and anti-globalization groups, among many other organizations and individuals; it also serves as the underlying platform of Americans’ “Resistance” to the cultural forces and legislative policies represented by the administration of Donald Trump. Yet in the very popularity of the words, and in the very vagueness of what they represent, are some nagging uncertainties.
For one thing, although some especially passionate advocates are kidded as “social justice warriors,” there are no social justice police and, indeed, no social justice judges, juries, or jails. In this respect, social justice is analogous to international law – it functions well enough when there is wide agreement on a particular crime and its appropriate penalty, but in reality such consensus is seldom achieved. Even with an International Criminal Court and a United Nations, it is rare that international law is administered with anything like impartiality. Alleged violators of it will often claim a right to self-defence, and would-be enforcers often hide their own self-interest. Social justice has the same defects. Absent a universally recognized system of rules and referees, it remains, for the most part, a subjective desire rather than a tangible public service. It is difficult enough to agree on plain justice, which at least has codified lists of rights and responsibilities, let alone a metaphorical kind which may mean different things to different people.
This leads to another flaw of the social justice idea, which is that it can sometimes seem uncomfortably close to mob justice. Even if it is to be called something more contemporary like Wikiconstitution, or crowdsourced commandments, social justice can mean right and wrong are determined not by clear-headed application of documented statutes but by inarticulate emotions of resentment or envy. The whole point of traditional, legislated justice is that it has been built up and written down over centuries of precedent and democratic debate – yes, it is slow, dry, and sometimes exasperating to those on the wrong side of it, but once established it has a substance that fleeting instincts never attain. Social justice, by contrast, attends more to loud grievances and fashionable causes than to permanent truths. It can seem valid in the heat of the moment, and it may even remain so long afterwards, but only when it has been incorporated into the formal, legalized standards already on the books. Social justice makes little room for fair trials and the presumption of innocence. To call for social justice at every instance of somebody, somewhere, not getting what they want out of life, is to suggest that conventional justice is insufficient. But suffragettes, Freedom Marchers, and many others sacrificed to change or enforce the laws as they stood, not to appeal to an abstract set of laws which only existed in their minds. The greatest failing of social justice is not that it is a bad idea, but that an idea is all it is.
Nowhere is social justice more often or more heatedly demanded than at colleges and universities across North America. The academy is now, infamously, the site of the most contentious disputes around rights, speech, and diversity which have pervaded society as a whole: violent protests, disrupted or prevented lectures, and other public scandals have brought unwelcome attention to many schools. Certainly, students, teachers, and teaching have been at the center of controversies since Socrates was accused of corrupting his young pupils, and the tradition has continued down to the campus unrest of the 1960s, and the political correctness flaps of the 1980s and 90s. Beyond the specific issues at hand, it’s helpful to consider why colleges and universities are so often the scenes of such impassioned debate. In one sense, this is exactly what post-secondary education is for: to propose and argue complex ideas. A school where difficult points about the arts or sciences weren’t being raised would not be fulfilling its mandate. Inquiry, hypothesis, and disagreement, whether in the engineering labs or the English classes, are vital to human betterment.
Secondly, these philosophical contests are usually generated by or among the population most energized by idealism and perceived injustice: youth. There is an aphorism, sometimes attributed to Winston Churchill, to the effect that “A man who is not a liberal at twenty has no heart, and a man who is not a conservative at thirty has no head.” Universities are crowded with people between the ages of twenty and thirty. Nowhere else are matters of politics and principle so important to so many, who all have so much time to think and talk – and quarrel in coffee shops and dorm rooms – about them. But campuses are self-contained communities with their own governments, laws, media, and even police. Differences of opinion which are ignored or accommodated in the wider society can be formally adjudicated in the groves of academia. Unfortunately, this possibility – of defining and penalizing offences in ways that are impossible in the outside world – also carries a potential for zealotry. Students can level charges of racism, sexism, homophobia and other wrongs against instructors or classmates, knowing they have a power they could never wield as ordinary citizens; it’s additionally significant that they are not just learners but (directly or indirectly) paying customers, whose protests against a program or professor imply a threat to take their business elsewhere.
As well, colleges and universities are by their nature comprised of the intellectually and economically advantaged, so speech codes, rules of sexual consent, safe spaces, trigger warnings, and the like afford children of privilege an opportunity to posture as a vulnerable underclass. Scholarly dialogue, youthful curiosity, and free thought, which have long made higher learning so beneficial for the cultures which support it, are giving way to orthodoxy, consumerism, and a petty authoritarianism that makes it a crime to say “he” or “she” to the wrong person. For these reasons, campus contretemps are currently the most visible illustrations of how the crusade for social justice can be perverted into rhetoric and policies which are neither peaceable nor progressive.
The other great forum for social justice battles is in the media. Whatever else these matters may signify – the troubling deterioration of democracy, or the overdue overthrow of systemic inequity – they amplify and are amplified by the dominance of instant electronic communication in the daily lives of hundreds of millions of people. Social justice and social media are inextricable from each other. This bond between medium and message, particularly among the young, mirrors the same kinds of ties witnessed in previous eras. In 1967, teenagers telephoned radio stations to request airplay of their favorite psychedelic 45s; in 1987 they made mixed cassette tapes and bootleg VHS recordings of alternative music and movies; in 2016 they posted clever memes about female Ghostbusters. In each case, a particular socio-political stance is made relevant – is made possible – through the very manner in which it is expressed.
How social justice is represented in pop culture can be a legitimate question. Fifty years ago, African American performers made breakthroughs on US television (Bill Cosby on I Spy and Ivan Dixon on Hogan’s Heroes), along with black cartoon characters (Valerie in Josie and the Pussycats and Franklin in Peanuts), although critics of all colors detected a facile tokenism in the casting. Over time, TV series about independent women (Mary Tyler Moore, One Day at a Time), gay people (Ellen, Will & Grace), and even overweight people (Roseanne, Mike & Molly), have all been credited with promoting the acceptability of their real-life equivalents in contemporary society. By now there is a long history of “firsts” in show business, whereby once-invisible communities finally got to see one of their own in a sitcom, a superhero franchise, or a talk show. But the recurring pattern here is not just that community standards evolve, but that their evolution is driven by technology and commercial entertainment. Politics is no longer only practiced in town halls, or church basements, or voting booths, but on glitzy awards broadcasts, celebrity fan websites, and an array of trendy apps.
There is then a problem of whether the serious topics of civil rights and discrimination are advanced or trivialized when they play out against backgrounds of lowbrow comedy, science fiction, and fairy tales (the 2017 Beauty and the Beast remake, for example, had a much-publicized “gay moment” which won sober commentary from journalists and critics). It may be that the issues which get people frantically liking and sharing are mere fodder for whatever lets them like and share to begin with. Which forces are more deeply affecting the modern public mind – feminist activism, or Facebook advertising? Gradual tolerance of alternative lifestyles, or weekend grosses of Disney blockbusters? Everyday interaction among neighbors, colleagues, and families, or a famous person’s controversial tweets?
Again, times change, and so do tastes, customs, and demographics. People should be able to discuss those changes by whatever means are convenient for them. Yet there is a sense now that the means matter more than the discussions themselves: that, whatever is felt about racism in the film industry, or sexism in gaming, or transphobia on Tumblr, the really important thing is to keep watching films, playing games, and clicking on Tumblr. Polarized opinions and message board flame wars merely reiterate the old adage that there is no such thing as bad publicity. Perhaps the real winners of the latest culture battles will be neither the progressive scolds nor the reactionary trolls, but the ubiquitous, indispensable media over and through which the battles are waged.
The broader point here is that these battles are not about to be resolved anytime soon. The alt-right; code words; angry white males; political correctness; social justice warriors; campus disturbances; online invective – all are likely to remain prominent barriers to cohesion and comity in the coming years. And it is worth reiterating that cohesion and comity are, indeed, liberal aspirations. Challenging the various strands of activism and outrage cited here is (or should be) a project of open-minded, unselfish citizens, rather than the restrictive or parochial chauvinists such challenges are often attributed to. As convenient as it might be to defame doubters of progressivism as mere rednecks and reactionaries, those portraits overlook the more complex truth that the doubters are frequently just as enlightened and just as forward in their thinking as the progressives themselves. And sometimes, as we will see, even more so.
II. VYING FOR LAST
It was a moment illustrative of our times when, on July 3, 2016, the annual Pride parade in Toronto was briefly halted by a demonstration of the local Black Lives Matter movement. Essentially, two groups were competing for a supremacy of victimhood, and all the public attention, cultural standing, and government payouts that go with it. The prospect of the Idle No More Native organization in turn staging their own disruption of the BLM protest of the Pride celebration, and so on and so on, is as yet unrealized.
Admittedly, this is an easy jest for an outsider to make. Certainly a straight white person has never known the discrimination and exclusion faced by either blacks or gay people, all of whom surely have legitimate and outstanding claims to popular compassion and fair treatment. Yet there is an element of farce in the claimants almost literally stepping over each other to have their respective says, as happened during Toronto’s 2016 Pride event. On one hand, straight black folks were free to fall in love, marry, and raise families when homosexuals weren’t, but on the other, white people who happened to be gay were afforded opportunities and privileges most people of color could never attain. So what happens when those very real histories face off? Who were the prejudiced, and who were subject to prejudice? Whose suffering trumps whose?
These puzzles have become the logical consequence of our decades-long concerns over diversity and supposed intolerance. If citizenship becomes increasingly defined by identity – thus so-called “hyphenated Canadians” – then eventually some identities will consider themselves more identifiable than others. In fact, however, everyone is in some way part of an invisible or isolated cohort. Vietnamese budgie breeders, disabled history professors, Mennonite pastry chefs, overweight albino model airplane buffs, and a thousand other categorizations, might reasonably block traffic with their own sit-ins or marches. Left-handed Sikh gymnasts are people too; tattooed and body-pierced Satanists have faced harassment and shame, just for being who they are.
The concept of equal rights was devised to help citizens who were obviously and explicitly excluded from opportunities available to others: women, blacks, Hispanics, Asians, Natives, and Jews are the most familiar examples from North American history. The liberation campaigns which invoked their equal rights necessitated that hitherto dominant populations cede some exclusive privileges, and so racially segregated schools and businesses, men-only clubs and professions, and other social organizations, were forced by courts or economic pressure to accept previously excluded people. Sometimes there was resistance; sometimes the resistance was harsh. Yet now it is no longer exclusive privileges which must be relinquished, but intangible (and to a large extent subjective) habits of mind, like microaggressions or unconscious biases. The goal of equality therefore stays perpetually out of reach, like the horizon, as long as anyone perceives anything impeding their attainment of it.
And as equal rights have been sought by less distinct groups – homosexuals, people with subtle emotional or cognitive dysfunctions, people with unusual allergies or dietary needs, people who practice uncommon religious faiths – there has been a natural drive to claim such distinctions in order to win whatever rights go with them. But again, everyone has unique qualities which cannot be recognized immediately, yet which still might be held as inherent qualities in need of legal security. Extreme examples may be conjectured: a movie buff who charges the loud theatregoer beside him with offending his identity as a cineaste; a disruptive tenant who contests his eviction on the basis of his unaccommodated Attention Deficit Disorder; an incompetent employee who blames her mistakes on Post-Traumatic Stress Disorder, and so on. As long as any personal characteristic can be defined as a fundamental condition deserving of inalienable rights, any person can be exempt from any regulation which inconveniences them, and can use the law against anybody else who fails to acknowledge their exemption.
Certainly, all decent societies strive to protect their most vulnerable members. But that protection is not achieved by denying the vulnerability itself. We can provide assistive technologies to people who use wheelchairs, for example, while still structuring our rules and our values around an assumption that not requiring a wheelchair is the default condition of human life. Yet too often, advocates insist that their particular community – sufferers of ADD and PTSD, or sensitive movie buffs, say – is fully an alternative parallel to others, rather than what it in most cases really is: a small, relatively inconsequential variation on an established standard.
Consider how, over the past few years, the word minority, used in a socio-political context, has increasingly been replaced by marginalized. Whereas the earlier term implied a mere accident of demographics, more recent usage suggests active rejection: you might happen to be one of a numerical minority within a broader population, but you’ve been marginalized away from your rightful position within it. The distinction is important. Over decades, racial, linguistic, religious, and sexual minorities have demanded and won legal protection that guarantees them the same rights as those in larger groups. Previously, an individual’s security within his or her community rested on how many others belonged in the same sub-category; those without the numbers to back up their needs or wants were at the mercy of whoever surrounded them. Discrimination, exploitation, and (in the worst cases) attempted extermination might result. Reforms in the Twentieth Century finally granted a measure of assurance to peoples who were otherwise socially insignificant. Ideally, thus, even in places where nearly everyone was Anglophone, a Francophone could still get government services in French; where nearly everyone was Christian, a Jewish person could still freely worship without losing employment or educational opportunities; where nearly everyone was of European descent, an Asian could still count on access to impartial courts and secret ballots.
Common to all these gains, however, is that no one pretended being one of the few was exactly like being one of the many. By all the important principles of citizenship, yes, you were necessarily the same – but in the little day-to-day exchanges, you might not always enjoy identical considerations. The concept of marginalization rejects this. Marginalization’s opponents insist that statistical minorities must be regarded as no less visible or valuable in the culture than anyone from a larger cohort, that it is wrong to notice the unconventionality of the unconventional. Transgender people, for example, may comprise only a tiny percentage of the Canadian population, but trans activists demand to be treated as if they are no less than the opposite half of the non-transgender segment. Likewise, though only 1.7 million of Canada’s 36 million people are Native (according to the 2016 census), advocates claim that marginalization has reduced Natives’ deserved influence upon government, economy, and public discourse. Similar arithmetical spins are now applied to a variety of other classes.
How persuasive is it to finesse the figures this way? Could it be that, in some cases at least, the purportedly marginalized are just getting what their relative contributions warrant? Women, to be sure, have a better case to make here. Slightly more than half of all people in the world are female, but historically far fewer females have ever attained leadership positions in politics, business, and other fields. Whether that is a result of deliberate marginalization, though, is another issue. Over the centuries, women have also been under-represented in military service, commission of violent crimes, and in mining and forestry fatalities, yet no one argues that such disproportionate ratios were only imposed by discriminatory laws. Marginalization, and the appropriate response to it, is in the eye of the beholder.
A simple illustration: the nearsighted. Many people need to wear eyeglasses, but most people do not. Nearsightedness is a widespread medical concern, but it is not a universal one. Images of glasses-wearing nerds and sharp-eyed heroes are fixtures of popular culture (think of mild-mannered Clark Kent disguising the true identity of superhuman Superman). A majority of elite athletes, soldiers, pilots, and others in occupations which demand strong physical abilities and excellent hand-eye coordination have 20/20 vision. Public signage favors those who can see from a distance, rather than those who cannot. Eyeglasses and contact lenses are an extra expense for those who require them. Glamorous models and celebrities are usually photographed without glasses. Indisputably, myopic people bear a range of social liabilities, but there is no orchestrated agenda against them. Indisputably, dependence on corrected eyesight is a kind of handicap, but no one insists the non-handicapped should not enjoy their independence. Indisputably, myopic people cannot do certain things, like look cool or perform acrobatic feats, but no one is actively preventing them from doing so. Myopic people are a minority, but they are not marginalized. Sometimes, minorities are only census percentages, rather than intentionally rejected individuals; some human characteristics are bound to be disadvantages where most humans don’t have them; complex societies tend to reward some cultural styles over others. Beyond a certain point, personal vulnerability stops being an unjust imposition and is merely an ordinary circumstance of life.
Moreover, the freedom to do, say, or be a particular something does not mean those who don’t need or don’t choose to exercise that freedom for themselves are restricting it elsewhere. Rights are essentially legal options, not legal compulsions. Yet much of the present language of diversity and community has confused these understandings: a recurring message is that if some people fit into a demographic subset, then all people must appreciate that special status or else be judged intolerant. In no other field besides social activism are minorities safeguarded this way. A voter whose preferred candidate wins a minority of ballots will not see his or her policies enacted as law. A merchant whose product is purchased by a minority of consumers may go out of business. A team that scores the minority of goals loses the game. The voter, the merchant, and the team are entitled to a fair election, an unfettered market, and an impartial referee, but they are owed no compensation for not ending up on the majority side. If proponents of equality applied their principles here, though, the minorities would have their losses overturned: the simple fact of anyone not having what someone else has is an obvious injustice to rectify.
In his 2012 book The Righteous Mind: Why Good People Are Divided by Politics and Religion, Jonathan Haidt noted the seldom-observed difference between equality and fairness. Often used as synonyms, in fact fairness can generate inequality (an open election where mainstream parties win power and fringe candidates don’t), and equality can generate unfairness (hiring quotas which screen applicants through criteria besides ability). Marginalization, as an idea, aims to achieve equality at the expense of fairness. There must be no outsiders in democratic life, the idea holds, even if some kinds of insiders are far more common, and far more familiar to everyone else, than others. Such a mandate faces many obstacles, chief among them common sense. Writing about the American Civil Liberties Union’s campaigns against public Nativity displays, a 1983 column by the anonymous New Republic essayist “TRB” put it well:
The battle for minority rights goes on, of course, but does final victory require eradication of the majority culture? And is every manifestation of that culture an insult to those who aren’t fully a part of it? People who want to go through life with nothing to remind them of their minority status ask too much. They will not get it, and full civil equality does not require it.
Toronto’s clash of Black Lives and Gay Pride in 2016 bears this out. When artificial targets of pride and mattering are placed in competition, some competitors are bound to feel like they have lost. If any people can launch a public protest that they are not being treated equally – because they are in fact not equal, as a bloc – inequality becomes a handy excuse. As long as we are slotting ourselves into narrower and narrower communities, through which to rail against the perceived repressions of an imagined “System,” we will continue to jostle in line for ever-more meaningless gestures of accommodation.
One gesture has recently become more prominent. For many years, discrimination was a word with negative undertones. Discriminatory hiring practices, showing discrimination in choosing friends or soliciting customers, discriminating on the basis of race or sex or religion or ability – all were denounced as wrong. The wrongness was defined in the familiar “I have a dream” oratory, but it was also explained in less celebrated anecdotes and cases which were regularly dramatized in popular culture or put forward in legal actions. The child excluded from the game just because he was black; the employee turned down for a raise just because she was female; the family kept out of the neighborhood just because they were Jewish; the student barred from class just because she was handicapped, and so on. Again and again, the lesson was imparted that discriminating against such superficial markers was to miss the real human worth of the marked. The black child could be a fun playmate, the industrious female worker deserved the raise, the Jewish family were respectable homeowners, and the handicapped student had a first-class brain. We are more united than divided, the morals ran. There are more qualities we share than we don’t. Look past the surfaces to see the true natures underneath.
But now discrimination is not so often decried. Instead, the new instruction is an almost complete reversal of its predecessor, as we are no longer prohibited from discriminating but enjoined to celebrate difference. In this iteration, the earlier examples have been updated. Now the child must be included in the game because he is transgender; the employee should be promoted because she is Aboriginal; the family has to have a seat on the neighborhood council because they are Muslim; the student ought to be recruited because he has special needs. These changes were driven by a vague sense that the emphasis on commonality still left too many out – that supposedly color- or gender-blind standards were still geared toward (and maintained by) straight white males, who thereby forced outsiders to adapt to their own ways as if they had not had a long head start in learning them. Celebrating difference, social scientists enthused, meant that achievement and merit were no longer delineated by things only one group had ever valued. Societies might flourish not by downplaying the dissimilar backgrounds of their individual components, but by highlighting them.
Another reading of this transformation, however, sees a darker story. As opportunities were truly opened to all, some genuinely positive results came about. The talents of women and minorities were no longer refused or neglected, but finally put to service of the public good. Millions of people felt a gratifying freedom to live and work and pursue their dreams, no longer thwarted by ignorance or outmoded laws. But not everyone succeeded in their pursuits: the promises of liberation and dismantled barriers had failed to mention that freedom to try was not a guarantee to get. As more freshly empowered citizens discovered that some broad yardsticks of temperament and skill continued to apply – and, inevitably, that sometimes they could not be met – they decided that their real problem was not in being forbidden to demonstrate their temperament and their skills but in the yardsticks themselves. Swept along by the imperatives of their own movement politics, feminists and civil rights campaigners had to devise something else to move against. Swept along by the career self-interest of their professions, social workers and diversity experts had to devise another institutional fault only they could repair. Thus the obstacle of discrimination, as a crucial foil of progressive activism, was replaced by the paradigm of difference.
Now the argument was no longer that women could perform jobs as well as men, or that people of color could be as learned as whites, or that homosexuals had as much ability to offer as heterosexuals. Those propositions had been largely proven – which is to say, everyone turned out to be pretty much even in their potential to excel or falter at anything, given the chance. Rather, the argument became that women performed jobs in a uniquely female manner, and that people of color had their own ways of learning, and that homosexual abilities were wholly distinct from those of heterosexuals, and they all needed to be accepted as such. Absolute measures of performance or knowledge or character were now obsolete. What counted was whose performance, knowledge, or character was being measured. Where once they had discouraged discrimination, progressives had moved to endorsing it.
In consequence of this endorsement, there emerged numerous new ways to be different. Naturally, every single human being is different from every other, but within the framework of social justice, difference was sorted into various human groups. Each group could then become a lobby, calling on governments, schools, and other institutions to formally recognize their particular difference, and, if recognition was not forthcoming, calling on the institutions to change their bigoted practices. Just as anti-discrimination agitators had identified novel prejudices such as ageism, look-ism (the favoring of physical attractiveness) and shade-ism (the favoring of lighter-skinned persons of color), pro-difference agitators might promote the Islamic niqab custom, the careers of female computer engineers, or two-spiritedness (Native people claiming both feminine and masculine qualities). Whether the people embodying these identities were good employees, students, or citizens, the critical thing is that they were judged to be different from most of their peers, and therefore deserving of celebration.
Hence celebrating difference fosters the incentive to have it, with all the impairment of social cohesion that entails. Note too that the differences sought are never ones of material superiority – more wealth, more prestige, more security – but of the moral variety: more marginalization, more suffering, more victimhood. This leads inexorably to conflicts of competing grievances, as between Black Lives Matter and Gay Pride. When the tiniest numerical minority can call themselves martyrs to social marginalization, marginalization becomes an attractive status to have. When numerous social groups seek the sympathy of the wider public, public sympathy becomes a contested prize. When a spectrum of self-defined identity movements appeal for designated government aid or legal standing, the appeals blur into self-marketing. When difference is transformed into a political lever, through which extra consideration goes to whoever can demonstrate some personal strain of it, more differences will result. And when differences cannot be easily demonstrated, the next step is to fabricate them.
III. OVER THE RAINBOW
Human sexuality is a subtle and subjective impulse. Its focus and intensity range from person to person, and from minute to minute and from year to year. People can be sexually adventurous or sexually repressed, or both over the course of a lifetime (or an evening). Sexual fantasies can be built around femme fatales or rugged cowboys, around swimsuit models or men in uniform, around long-term partners or complete strangers. Sexuality can be expressed in ways spanning from tender affection to homicidal cruelty. Sexual desire can be directed towards people of the same or opposite sex, or towards clothing, objects, or environments. Sexual identity can be as idiosyncratic as identity itself.
It is likely, however, that of the countless billions of sexual feelings felt and acted upon across human experience, the vast majority have been between post-adolescent males and females. Sex practices and sex taboos vary greatly by times and cultures, but heterosexual behavior is by far the most common biological drive of the species. While the increasing complexity of our social arrangements since the Industrial Revolution has seen increased opportunities for and acceptance of alternative sexualities – notably involving artificial contraception and non-marital relations – this one fundamental model has remained the standard form of reproduction and erotic instinct.
For some five decades, obviously, homosexuality has been a key arena of legal and social evolution. Once, homosexual activity was a criminal offense; today, gay marriages are formally recognized in many jurisdictions. Once, gay people were subjects of jokes and insults, if they were noticed at all; today, politicians, movie and music stars, high-powered professionals, and average citizens can candidly state that they are gay and suffer no stigma in doing so. Once, gay people were depicted in the media as perverts, psychopaths, and predators; today they might be shown as heroes, heroines, best friends, or regular folks. Once, any personal appetites outside a rigid definition of heterosexuality were viewed as degenerate; today there is a wide range of non-straight intimate leanings considered normal and natural.
In expanding the public understanding of different sexual identities, however, there are still questions around the place of sex within human communities and individual humans. Are adult sexual preferences inherent in newborns, or can they be shaped by experiences in childhood and adolescence? Or some combination of both? Does society improve when sex is widely discussed and depicted everywhere from elementary education to commercial entertainment, or should the essential privacy of sex be sheltered by religious and moral sanction? Or some combination of both? Is gratifying an individual sexual taste a right, or a choice, or some combination of both? Is sexuality measured by what one does, or what one is? Can sexual desires be suppressed or shifted, or are they unchanging and innate? Is gender only a social construct, or is it inextricable from hormones and physiology? What do we really mean by social constructs? Are there any particular elements of humanness that are immutable, or are they all artificial and arbitrary assignments? Do the determinants of male and female, heterosexual and homosexual, lie within separate genetic materials, or do they share a genetic continuum? Or some combination of both? And can any of these questions be permanently settled through the apparatus of social justice?
The answers are, we know now, less precise than once supposed. All people are capable of having all varieties of sex, and in many cultural settings – from ancient Greece and Victorian boarding schools to contemporary prisons – same-sex relationships have been far more frequent, and far less noteworthy, than absolute parameters of straightness and gayness have usually admitted. Likewise, modern and historical research has uncovered hitherto uncharted nuances to different societies’ assigned projections of masculinity and femininity. Many activists and educators today draw on this knowledge to support their own redefinitions of sexual styles, and the rights which accompany them, in the present. But how well the research supports their campaigns is debatable.
One hundred years ago, there surely were frustrated and lonely people who could not express their desires (as in Oscar Wilde’s “The love that dare not speak its name”), but there were also many more people who had not examined exactly what those desires were, and who in any event were too busy with the more pressing matters of daily life to give the question much thought. Such was personal morality in 1918. Ignorance was widespread, but so was indifference. Some people were closeted, but most did not consider why they might ever need to be in one. There were small undergrounds of sexual outlaws, but there were not millions of secretly gay, bisexual, pansexual, or transgendered people dreaming of a more enlightened future where they could reveal their true beings. In the hyper-sexualized present, by contrast, we are told that our libidinal tastes are central to our identities, that they must be plainly acknowledged by our selves and approved by others, and that they must be fully embraced and acted on by no later than early adulthood. Whereas earlier arguments simply held that gay sex, like recreational drug use, should be legal between consenting adults, now the implication is that gay allegiance – or bisexual allegiance, or genderqueer allegiance, or an assortment of other categorical allegiances – should be mandatory among hesitating teenagers. If this is progress, it has not come without complications.
Complication Number One: it is impossible to ignore the transgender issue today. The trans community is not only the T in LGBTQ, but their cause has been raised everywhere from the campus of the University of Toronto (psychology professor Jordan Peterson has stirred controversy by bucking the institution’s standards for addressing transgender individuals) to the public restrooms of North Carolina (state officials have ruled that gender-specific facilities may only be used by those born as that gender) to popular culture (via celebrities like Chaz Bono and Caitlyn Jenner) to high school graduation ceremonies (trans kids have sought to attend as different genders than the ones they registered as Kindergartners). The transgender movement is often cited as the next, natural avenue of liberation and social justice – but is it?
We have lived with an idealized notion of civil rights for so long that many believe citizenship is only recognized through protest and disruption. The historic advances achieved by non-whites, women, homosexuals and the disabled in the last century are now rightly seen as triumphs of democracy, but they also established a pattern of public resistance and legal advocacy which later groups have unthinkingly continued. Thus even the most comfortably enfranchised people can cast themselves as heroic pioneers, just like Rosa Parks or Harvey Milk, by declaring a transgender identity and demanding public support of it.
The civil rights model is problematic when applied to the trans campaign for other reasons as well. Though we have been conditioned to see all social restrictions or cultural norms as inherently unjust, we forget that overthrowing them can generate unintended, undesired consequences. With the sexual revolution of the 1960s and 70s, for example, prudish moral strictures gave way to healthier, happier possibilities of intimacy and pleasure – but there was a corresponding increase in fatherless children and STDs. Popular tolerance of recreational drugs has spurred overdue relaxation of punitive criminal penalties – but addiction and overdoses remain pressing crises. Likewise, our faddish embrace of the transgender trend (including promotion of genital surgery and hormone therapy) may one day yield unforeseen complications for thousands of transgendered trailblazers. In a 2013 New Yorker article, “About a Boy,” journalist Margaret Talbot interviewed the mother of a transgender teen, who worried, “There are tides of history that wash in, and when they wash out they leave some people stranded. The drug culture of the sixties was like that and the sexual culture of the eighties, with AIDS. I think this could be the next wave like that, and I don’t want my daughter to be a casualty.”
The transgender lobby, more than many others, is notable for its self-absorption. In a world still wracked by war, pollution, and poverty, transgender activists are most interested not in the poor or the environment, but in themselves: no needs are more urgent than which washroom they get to use or what pronoun describes them. Yet surely when entitlement claims are taken to the level of “gender expression,” we are down to a concept of identity that applies to virtually everyone. The gender expression of a macho bodybuilder is not the same as that of a sensitive poet or an aged military veteran; the gender expression of a burlesque dancer is not the same as that of a surgeon or a small-town mayor. A transgender identity must be a flimsy one if it requires other people to validate it with terms corresponding to the individual’s sense of self (or be penalized by institutional bylaw, as at some schools). There are no civil rights which promise reference to “Asian Mary” or “gay Jim,” since Mary and Jim know who they are without anyone else’s enforced acknowledgement.
Regulated pronoun use, furthermore, stipulates that outsiders share the Walter Mitty-like imaginings of the transgendered themselves, but by that reasoning any person can come forward with a request to be labeled according to their private conceits rather than their public face: the civilian who wants to be addressed as “General,” the old man who wants to be called “Junior,” the undergrad who feels deserving of a doctorate, and so on. All of them, like the transgendered, want to be known as the person in their minds rather than the person the world can see. As many have pointed out, this amounts to coerced speech rather than forbidden speech – a legal order not to stop but to do. Enforcing proper action, instead of policing improper action, contravenes the principles of democratic law. If my right to swing my fist ends at another’s nose, as the judicial adage holds, I nevertheless am not obliged to provide the other with a protective face mask.
None of this is to say that transgender people should not be allowed as much security and dignity as anyone else. Objections to the transgender movement derive not from any desire to deprive transgender people of personal freedoms but from a practical vision of what personal freedoms anyone is owed. To many, being transgender is simply not the same as being born a particular race, religion, or, indeed, gender – there is too much fog of socialization, psychology, cultural vogue and political ideology obscuring a clear definition of transgenderism for it to be recognized as describing a unique order of human beings who have been uniquely disadvantaged. Rather, transgender campaigns blur the line between solidarity and lifestyle, wherein the individual identity and the public grievance are so closely linked that it is impossible to consider the community apart from its demands. Viewed that way, the transgendered are less like Native people, or old people, or disabled people, and more like Republicans, or unionists, or pro-choice marchers. They are regular citizens with special wants, rather than special citizens with regular wants.
After the transgender cause runs its course, the pressing social issue five or ten years hence may be the public acceptance of polyamory. Polyamory has already been identified as a distinct lifestyle by researchers and practitioners: it is not the faith-based polygamy (one husband with several wives) of some religious sects but rather the consensual sharing of intimate partners among several secular adults. If the past is any indication, polyamorous advocacy groups should be clamoring for rights and recognition in about 2023. It will begin with a small but strident community challenging the conventional standards of couplehood, opening up about the happy and fulfilling conjugal arrangements they share among numerous other trusting, tolerant men and women. Crass old stereotypes of infidelity and swinging will be replaced by progressive analyses detailing the psychological health of poly spouses and their families, and distinctive “Poly and Proud” logos and other paraphernalia will be increasingly visible.
This will all play out in the media, of course. Expect heated editorial commentaries and radio call-in discussions, a rush of sympathetic novels, films, and TV programs about polyamorous liaisons, and much-publicized announcements by celebrities who come out as poly. Historical figures’ hitherto secret polyamorous relationships will be revealed for widespread reappraisal. Then the cause will work its way into courts and legislatures, as lawyers and activists will argue on behalf of multi-matrimony and its attendant entitlements, including spousal benefits, parental privileges, inheritance claims, tax exemptions, and the like. Hotels, airlines, restaurants, and other businesses will advertise to the growing market of poly triples, quartets, and quintets. Polyamory will have truly arrived in popular culture when its few critics – most of them conservative older people, or so-called jealous lovers – will be condemned as polyphobic.
The fact is that much of the behavior we permit or celebrate today was inconceivable fifty or one hundred years ago. Using recreational drugs; assisting a suicide; obtaining an abortion; reading or viewing sexually explicit entertainment; adopting a transgender identity; being homosexual; having interracial sex; having premarital sex; even engaging in certain sex acts – all were once strictly outlawed by governments, churches, and common morality. It is not hard to imagine other longstanding taboos which might fall in the future, through the same pattern of open acknowledgement, public defiance, gradual familiarity, and then legal accommodation and eventual normalization. Indeed, by some political perspectives the last several centuries are no more than a long sequence of emancipations (of peasants, colonial subjects, non-whites, women, gay people, et cetera), so it’s hardly surprising that we assume the cycle should continue indefinitely.
Yet though no one wants to define the limits of ethics or custom or pluralism so restrictively that they end up on the wrong side of them, everyone nevertheless has limits. Absolute measures of right and wrong may become relative over a lifetime, and shaded interpretations of fairness may become black and white. What is permissiveness today may be bigotry tomorrow; one man’s libertine is another’s prude. It’s easy to congratulate ourselves for being generously liberal, open-minded members of inclusive environments – right until it isn’t. The transgender and polyamory movements anticipate an endless drive to advance the rights of fractional slivers of society as if they were substantial cohorts already existing in plain sight. Malleable characteristics confessed by some are treated as either-or properties held by all. Private, fluctuating feelings about sex and the body, articulated by citizens otherwise indistinguishable from any other, are being touted as basic human typologies. Over time, not noticing those differences – between affected and intrinsic, between subcultures and civics – will become more and more awkward.
Oddly (or perhaps not) the contemporary fascination with sexual identity has coincided with the abundance of pornography made available since the advent of the Internet. On one hand, young people are accepting homosexuality, even celebrating it, in ways their parents and grandparents never thought possible: high schools have Gay-Straight Alliances; author Dan Savage’s 2012 book It Gets Better advises youngsters on coming out; LGBTQ-related films including Brokeback Mountain, Carol, and Blue Is the Warmest Color win broad acclaim; popular Young Adult novels like Shadowplay and Fan Art sensitively explore gay themes. On the other, the pervasiveness of erotic imagery in the Internet era is unprecedented, and the most enthusiastic consumers of it – young males – are being exposed to a wider variety of sexual acts and sexual fetishes than is healthy for developing personalities to absorb; psychologists and counselors caution that porn inculcates in teenage boys wholly unrealistic notions of what they can expect from partners and themselves; constant viewing of pornography shrivels the capacities for the intimacy, trust, and care which are at the heart of meaningful sexual experience.
So how does the familiarity of gay or transgender issues in contemporary media compare to the familiarity of X-rated material? Like pornography, LGBTQ books and movies make relatively unusual sexual values seem common; like pornography, they present a range of personal options many readers and viewers hadn’t hitherto imagined and which they may well have been better off not imagining at all. With both porn and LGBTQ campaigns, the message is that sex is more important than anything else in life, and that identifying and stimulating a private sexual predilection is a fundamental freedom. These are daunting ideas to introduce to any fifteen-year-old, whether via illicit peeks at a laptop or state-sanctioned lessons in the classroom.
Despite decades of study and debate, no one has definitively confirmed whether sexual orientation is a result of fundamental genetics or social conditioning. Either way, the implications make both LGBTQ supporters and opponents uneasy. If preference is built into humans from conception, then family-values conservatives must accept that gay people cannot change their God-given essences any more than people of color can change their biological heritage (and so anti-gay discrimination is as unfair and as bigoted as racism); but gay-rights lobbyists are forced to admit that homosexuality is a comparatively rare trait which only affects a small portion of citizens (and so homosexuality is a fluke of nature that deserves more sympathy than celebration). But if preference is molded by upbringing and other external factors, then conservatives have to acknowledge that everyone, including themselves, has a potential latent homosexuality within them which might emerge under the required stimulus; but LGBTQ activists are in turn forced to concede that their identities are no different from acquired tastes in clothes or entertainment, which therefore deserve no special protection under the law.
It’s true, certainly, that some people are attracted to members of their own sex, just as some people enjoy and benefit from looking at sexually explicit materials. Fine; neither affinity ought to be judged. There is great damage to be done in classifying homosexuality and pornography as “deviance” or “obscenity,” as both were in the not-so-distant past. But there is also damage risked in the mixed signals sent to the next generation – that same-sex love is good but sexy pictures are bad, that we must embrace all kinds of sexual preference but reject any kind of sexual fantasy, and above all that our most inward impulses are matters of the widest public concern. Are they really? Is it progress, when both adults and adolescents devote so much of their intellects to discouraging some physical feelings but promoting others, and so much of their energies toward determining which types of desire to favor and which to deplore? We should remember that sex could be fulfilling long before centerfolds and adult videos were required to spice it up, and that people could be content in their identities long before Dan Savage and Brokeback Mountain were around to persuade them that they were not. Either we denounce porn and gay-positive culture together, or we accept them together; we cannot consistently embrace one and condemn the other.
The synergy between commercial media and social change, which is certainly present in modern activism, often goes ignored. In his 1997 book The Conquest of Cool, Thomas Frank argued that the societal upheavals of the 1960s and later were driven, or at least reinforced, by advertising – ubiquitous campaigns which urged people to “Think Small” (Volkswagen) or drink “the Un-cola” (7-Up). So if there was a booming drug underground in the era, for example, there were also Cheech & Chong, the Doobie Brothers, and High Times magazine to make drugs a little more mainstream, and to not incidentally profit from their popularity. Likewise, it’s not so much that contemporary generations are spontaneously embracing gender fluidity and gay pride, but that a whole industry of pundits, artists, and educators has been energetically promoting LGBTQ ideas to a receptive audience and potential customers.
This isn’t necessarily bad; after all, sanctioned cannabis use has solid medical and legal arguments in its favor, no matter any exploitation by Cheech & Chong or the Doobie Brothers. But it must be recognized that much of what is welcomed as organic evolution is abetted by commerce, academic fashion, and media opportunism. Adolescents who decide to identify as gay or transgender might imagine they have been liberated, but they may just as likely be unwitting suckers for an intellectual and journalistic hype. In other contexts, contrived political campaigns have been exposed as “Astroturf” – supposedly grassroots, but really engineered by greedy corporations – and the current preoccupation with LGBTQ causes, and the current attention given to self-defined LGBTQ people, may be subject to a similar scrutiny. The same forces that persuaded suburban white kids to sport Afros and wear “Black Is Beautiful” t-shirts in 1972 are now persuading suburban straight kids to wear rainbow pins and think of themselves as “nonbinary.” Despite the panics of religious conservatives, there is no sinister agenda behind this – but nor is there an authentic populist uprising, either.
In the end, the enigma of sexual identity may become so diffused by politics and economics that even the most partisan defenders of any individual orientation may find their support collapsing under the weight of so many other individuals’ orientations which do not quite overlap with the prototype. All people should have the liberty to live and love as they want to, but must every possible want be enumerated in advance? Can they? And should governments ensure that each of those possible wants are satisfied? Once populations are sorted by criteria as internal as self-image, or as fluid as social dynamics, or as unpredictable as lust, or as unique as personality, the logical end is to make every single person a minority.
IV. MISREADING MISOGYNY
The year 2017 may be remembered for a cultural watershed in which age-old patterns of male behavior towards women were finally called out and given consequences. A range of famous and powerful men – movie producers Harvey Weinstein and James Toback, broadcast personalities Bill O’Reilly and Charlie Rose, politicians Roy Moore, George H.W. Bush, and Al Franken, performers Bill Cosby, Louis C.K., and Jeremy Piven, and many others – were publicly challenged by the women who had received their unwanted sexual attentions. The attentions varied in severity, from lewd jokes and groping to indecent exposure and physical assault, but the openness of the charges, the widespread condemnation of the charged, and the near-universal support of the chargers was unprecedented. All this followed the previous year’s election to the US presidency of Donald Trump, who had been revealed during the campaign as a man who had once boasted of grabbing women’s genitals with impunity. This was also a period when a popular television series based on Margaret Atwood’s dystopian novel The Handmaid’s Tale made plausible a future society where females were reduced to property for purposes of male procreation and male pleasure; the October 2017 death of Playboy publisher Hugh Hefner also prompted critical reappraisals of how his magazine had perpetuated a crude objectification of women’s roles and women’s bodies which had damaged all women’s dignity across society. At last, it seemed, men would no longer get away with the boorish actions and boorish attitudes they had inflicted upon women for hundreds or thousands of years.
What was also new, however, was how the climate of anger emphasized what had once been only a single strand of feminist thought. To be sure, women’s movements had long raised the issues of domestic violence and workplace harassment, and women had often pointed out the male propensities for coerced sex and personal intimidation. Changed laws and social customs around rape, spousal abuse, and employee protection were important breakthroughs achieved by women’s advocates. Yet the naming and shaming of men for their sexual wrongdoings, often through social media, looked to have become the cornerstone of contemporary feminism. Economic status, professional barriers, childcare and reproductive health – those causes were relegated to the background. Feminism once called for empowerment; now it highlighted vulnerability. Feminism was once about women comparing favorably with men; now it was about women suffering at men’s hands. Feminism used to focus on women’s untapped potentials; now it focused on men’s unwanted advances.
While for the most part welcoming the willingness to acknowledge inappropriate words and deeds which might have once gone unreported, some critics warned of a possible panic in the making. For one thing, citations of men’s sexual transgressions were now being made outside of the traditional arenas of policing and law. At colleges and universities, complaints of rape or verbal intimidation could be leveled against male students without recourse to impartial trials or presumptions of innocence; some young men found themselves expelled and bearing permanent records as sex offenders even though they had never been formally charged or convicted in criminal court. Hugely accelerated through social media like Facebook and the #MeToo Twitter hashtag, career-destroying accusations against celebrities proliferated. In many cases, the men admitted to misconduct and offered apologies, but sworn testimony and the weight of evidence had never been considered: simply to be identified as a perpetrator of sexual assault or harassment was usually enough to render a verdict in the public mind, and that of the men’s employers, who inevitably dismissed them from their high-profile positions.
Women’s groups made clear that false claims of sexual victimization were exceedingly rare, and lauded the courage of the numerous actresses, students, or staff members who had come forward with painful accounts of their personal hurt and humiliation. The problem, however, was less the possibility of made-up allegations and more the principles at stake. Are some crimes subject to lower thresholds of proof? Should some statements automatically be believed? In October 2017, retiring Chief Justice of the Supreme Court of Canada, Beverley McLachlin, in an address to the Canadian Criminal Lawyers’ Association, noted how men’s crimes against women ought to be addressed by courts: “Complainants and witnesses need to understand what is required of them in a trial and what they can realistically expect from it…No one has a right to a particular verdict but only to a fair trial on the evidence.” The notion that justice systems should consider women who charge sexual assault as uniquely credible had already been put to a damaging test in the 2016 trial of Canadian broadcast personality Jian Ghomeshi, accused by several women of inflicting physical violence during sex. Though the stories of Ghomeshi’s brutal behavior were not disputed, it emerged that the women continued to approach him romantically following the episodes and had otherwise withheld important details of their subsequent actions. Ghomeshi was acquitted, on the basis that the women’s charges could not be believed beyond a reasonable doubt.

More complicated was the question of male-female dynamics in modern society. Among other demands, the feminist campaigns of the 1960s and 70s had called for women’s sexual autonomy – the right to express and act on desires as they chose, to select how many and whichever partners they wanted, to access contraception and abortion as needed, and to be considered independent sexual beings no different from men.
These feminists made clear that the Playboy ideal of indiscriminate sexual consumption was not theirs, and that they had no aspirations merely to be pornographic trinkets; they did insist that the pursuit and enjoyment of sex should be the unquestioned prerogative of women just as it had long been for men. Fifty years later, the complaints of sexual imbalance and the mood of sexual threat seemed to reverse such outlooks.
In fact, it was often women themselves who raised the point that underscoring the dangers faced by women amounted to a return to the prim morality of a pre-feminist era. Downplaying female sexual independence and stressing male sexual menace was not that different from the social codes of bygone generations; the drive to ensure women’s “safety” could easily shade over into restricting their autonomy. “When it comes to sexual culture, obviously each generation bills itself as an improvement over the last,” conceded author Laura Kipnis in her 2017 book, Unwanted Advances: Sexual Paranoia Comes to Campus. “No doubt the slogans about pleasure and liberation were our own little lies about sex – the realities were obviously a lot thornier, especially for women. But today’s hazard story, too, comes with its own evasions, namely the blind spot about women’s agency.” Sexual assault and harassment were real problems, of course, but the rush to root them out, no matter the nuances of any given case, depicted an environment of rapacious wolves and chaste damsels bearing little resemblance to contemporary social realities. In the New Yorker, columnist Masha Gessen warned about the implications of such cultural shifts: “In the current American conversation, women are increasingly treated as children: defenseless, incapable of consent, always on the verge of being victimized. This should give us pause. Being infantilized has never worked out well for women.”
Notably, most of the cases of confessed or alleged sexual misconduct came from the elite levels of media, politics, and business. These were not the familiar scenarios of a leering manager pestering the secretarial pool, or a lascivious foreman roaming among the women on the factory floor. Instead, observers were given glimpses behind the closed doors of Hollywood, Washington, and other prestigious workplaces (“Politics is show business for ugly people,” someone once said), lending an element of vicarious glamour to the incidents. Who wasn’t interested in the private indignities of the high and mighty? An overlooked distinction, however, was not only that the accounts concerned prominent and successful men and women – domineering studio executives and rising actresses; celebrated talk-show hosts and eager young network employees; self-important cabinet members and overachieving legislative aides; powerful party leaders and energetic party operatives – but also that the men and women were all effectively independent contractors rather than regular working stiffs. Such occupations are widely known for their intrigues of ambition and careerism, and for their ever-present potential for big breaks and catastrophic flops. The point is not that accusers fabricate tales of harassment to advance their own positions, but that an element of personal competitiveness is inherent in the professions themselves, and both perpetrators and complainants of particular sexual behaviors understand the advantages and disadvantages that go with perpetrating and complaining. Hitting on a junior colleague, or exposing a senior colleague’s sleazy vices, will unfold differently in a movie studio or a corporate boardroom than in a warehouse or a restaurant. In fields where job security is precarious and vindication or disgrace is very public, misconduct of any kind has ramifications which are not factors in less high-stakes offices.
Like so many other causes conveyed over social media, the #MeToo movement could sometimes seem indistinguishable from a bandwagon. Ideas which once might have taken a manifesto to fully sum up were reduced to easily shared memes and hashtags, inviting users to choose from an either-or, for-against paradigm which turned complicated private stories of consent, fear, shame, guilt, and ambivalence into an inclusive category of belonging. Just as in other cultural and political conversations, nuanced, fence-straddling opinions were made impossible, or at least less visible, by the instantaneity of online sharing – you had to sign up for the movement, or you did not count; you had to Follow and Like, or you did not care. Before Twitter and Facebook, the enthusiastic cry of “Me Too” had usually been heard among children racing to join in each other’s playground games. It was that fleeting and that frivolous.
Beyond the heightened anxiety, justified or not, around sexual predation, there is the looming vision of males and females in permanent opposition. Here, women’s gains can only come with men’s surrenders (of habits or privilege), it is said, while men’s successes must surely imply a hidden sacrifice (of pride or power) by women. In this reading, the actions of individual men – or more commonly, their worst actions – are portrayed as reflecting the instincts of all men. Generalizations are always suspect when applied to large numbers of people. When they are applied to half the human population, they are especially dubious. Feminist leaders have sometimes invoked ideas like “the feminization of poverty” or “gendered violence” to denote phenomena which seem to critically affect females, but seldom have they referenced “the masculinization of warfare” (far more males than females have historically been killed in battle) or “gendered incarceration” (jails hold more men than women worldwide). This sense of ongoing, existential clash between wholly separate categories of people has tainted the spirit of feminism which many people of good will – possessing both XX and XY chromosomes – have long sought to advance together.
In the same way that activists cite “racialized” injustice or other ills, the glib feminist jargon about gendered this or gendered that may carry its own rebuttals. Neither white people nor men are in any way a default form of human being – a basic model to which all others are compared for how closely they correspond. Women do not have to be “like” men to be valued; their femaleness is not a defect. But by that logic, men’s maleness is not a defect either: their physical size, their propensities for aggression, perhaps even their innate mechanical or mathematical aptitude – all those may one day be regarded as core parts of male identity rather than a dominance which must be artificially handicapped. Politicized men of the future may charge gendered discrimination when their amorous advances are spurned by a woman, or when pacifists picket a military base, or when protesters block a construction site. If things commonly felt to be markers of male self-worth – sexual attractiveness, or tests of armed strength, or displays of creative prowess – are demeaned or dismissed, how might this not be called sexism? What’s to stop tomorrow’s Men’s Rights or Male Pride movements from defending the exact characteristics women currently condemn? Why should men be obliged to meet with women’s approval any more than women are obliged to meet men’s? In a 2016 New Yorker article, journalist George Packer quoted Glenn Loury of Brown University, who cautioned, “I don’t know how you live by the identity-politics sword and don’t die by it.”
Men and women are separated by biology and socialization, but they share a fundamental humanity. Discussions around sex and law can easily deteriorate into an us-against-them antagonism which asserts that no women and no men can ever understand each other. Yet there is a significant refutation of that gloomy premise, and it is evident at all levels of class, race, and culture. There are hundreds of millions of men who actually identify very closely with particular women, and vice-versa; they may remain distinct in their temperaments and preferences, even to the point of mutual incomprehension, but they can successfully work together for common goals and frequently cooperate to resolve joint problems. Despite contrasting experiences in childhood and youth, and often contrasting public expectations imposed on each of them, a man and a woman may nonetheless be more compatible with one another than with anyone of their own sex. The open secret of this special cross-gender bonding? Long-term romantic partnership. Countless heterosexual couples around the planet are living proof that intractable conflict between males and females is only an overstated political metaphor rather than a dire social reality.
V. BLACK AND TAN FANTASIES
In 2015, some prominent Canadians, including then-Supreme Court Chief Justice Beverley McLachlin and Truth and Reconciliation Commissioner Justice Murray Sinclair, used the expression “cultural genocide” to describe the system of residential schooling imposed on Aboriginal Canadians over much of the Twentieth Century: thousands of Indian children were taken from their homes and sent to live in institutions, where they were subject to harsh regimens of linguistic and religious indoctrination. Many suffered physical and psychological abuse; some were sexually victimized by their white overseers. But what good such acknowledgements would do, exactly, went unexplained. The residential school system had become the latest scab of Canadian history for Native and non-Native politicians, activists, journalists, social workers and lawyers to pick, and “cultural genocide” was the latest way for them to pick it.
Genocide is a word not well modified by metaphoric adjectives. “Cultural genocide,” like “economic murder” or “emotional rape,” is the appropriation of a powerful term to characterize something significantly milder than what the term represents by itself: a person who has been verbally castrated has not undergone the same experience as a person who’s been castrated. Yes, Canadian Aboriginals were certainly subject to a rigid program of Christian and Anglophone conditioning, intended to make them productive members of a modern Christian and Anglophone society. However cruel and insensitive this might seem in retrospect, though – and whatever cruelties were perpetrated at the time – the residential school system was not a deliberate campaign of physical extermination. We might as well say that feminists committed cultural genocide against male chauvinists, that scientists committed cultural genocide against religious faithful, or that punk rock was a cultural genocide of hippies. In each case, an unforgiving imposition of new ideas made a way of life or belief obsolete, but there was no necessity of actually killing off the living believers. “Cultural genocide” cheapens our understanding of the original Armenian, Jewish, Cambodian and Rwandan genocides by drawing a false comparison between ethnic heritage and biological existence.
Beyond the overstated phraseology around residential schools is, sadly but nonetheless inevitably, the sheer fact of compassion fatigue. Canadians have been asked to consider the plight of the Aboriginal population for generations now; indeed, residential schools themselves were once thought the best way to address the problem of Native “backwardness” within a growing industrial nation. Yet no official treaty, royal commission, task force, apology, reparation, special status or reconciliation has done much to alleviate the longstanding pathologies of poverty, addiction, corruption, incarceration and suicide which continue to plague Aboriginal communities across the country. So it is hard to imagine that the current moral theatrics of residential schools (former students are designated “survivors,” and they or their descendants are entitled to government payouts) will bring Natives any closer to a dignified self-sufficiency, especially when new Canadians from Asia or Africa are already there, despite having themselves suffered decades of racial injustice. How much more sympathy can the rest of us give? What more truth can we uncover, and what more reconciliation is obliged? The real ongoing tragedy of Canada’s Native peoples would seem to have become their debilitating identification with ongoing tragedy.
Then, in 2017, the issue of cultural appropriation set off two Canadian episodes of what can plausibly be called censorship. In the first, a non-Native painter, Amanda PL, had her Toronto gallery exhibit cancelled because her canvases paid tribute to the colorful visuals of famous Ojibway artists like Norval Morrisseau and Joshim Kakegamic. In the second, Hal Niedzviecki, editor of the Writers’ Union of Canada’s newsletter Write, was forced to resign for an editorial in which he noted, “In my opinion, anyone, anywhere, should be encouraged to imagine other peoples, other cultures, other identities. I’d go so far as to say there should be an award for doing so – the Appropriation Prize…” Both cases raised an outcry of Indigenous and non-Indigenous protest, accusing PL and Niedzviecki of exploiting Aboriginal voices and styles (or encouraging same) at the expense of actual Aboriginal people, who remain a disproportionately vulnerable segment of Canadian society.
“Cultural appropriation” is a nebulous term that can mean anything from old blackface minstrel shows to the Native-derived logos of numerous pro sports teams (the Cleveland Indians, the Chicago Blackhawks, the Vancouver Canucks, etc.). It is a charge mostly, perhaps always, leveled at white people, insofar as a dominant majority is said to have taken advantage of minority motifs for fun and profit: whereas a Native person who (say) uses a computer is merely trying to survive in an overwhelmingly alien environment, a Caucasian who (say) incorporates Native designs into her paintings is a colonialist dilettante depriving Natives of one of the few things they can claim as their own unspoiled heritage.
So the best response to cultural appropriation is – silencing the appropriators? Day-to-day conditions on Native reserves will improve as soon as Amanda PL and Hal Niedzviecki are prevented from expressing themselves? Tragic rates of Aboriginal addiction and suicide will miraculously plummet when no non-Aboriginal dares to represent shamans, totem poles, or igloos in his or her work? Should we also ban the Native-inspired art of Emily Carr and Ted Harrison, along with the Beatles’ “Norwegian Wood” for its sitar, and neoclassical architecture for its Doric columns? Who gets to precisely distinguish between cultural appropriation and cross-cultural influence, or between cultural appropriation and multiculturalism, or between cultural appropriation and culture?
In the wake of the controversy, numerous pundits of diverse backgrounds struggled to explain why cultural appropriation must be stopped. “We have to understand that cultural appropriation is institutionalized, it is the very foundation of what Canada is built on,” CBC commentator Jesse Wente said. “And not just cultural appropriation, but appropriation of all things Indigenous…None of us that I’ve seen want to limit free speech.” In the Toronto Star, columnist Shree Paradkar added, “Appropriation is not about artistic licence or freedom of expression.” Yet the rejoinder to such denunciations might be, “Or else what?” If a creator characterizes a culture not their own with a lazy stereotype – not obscene, not libelous, not threatening, just lazy – do we call the police? Wente’s foundational critique also sounds dubious, in a country which for decades has launched costly government initiatives addressing Native concerns. If anything, it is anti-cultural appropriation that has been institutionalized in Canada.
Cultural appropriation – or rather the public shaming of it – is really a political ploy rather than a social problem. By broadening the definition of bigotry to encompass any subjectively offensive deed or statement, Natives can indefinitely invoke a perceived injustice no outsider can dispute. For a long time, injustice meant tangible barriers like laws and customs which actively discriminated against certain groups; then it meant intangible prejudices which subtly demeaned or excluded them; now injustice means whatever allows anyone to blame anyone else for their troubles. Cultural appropriation meets this last criterion well. Even people as admiring as Amanda PL or as open as Hal Niedzviecki still make convenient villains, all for looking at others the wrong way. Yet the sheer opportunism of the concept, the illiberality of the reactions, and the futility of their purpose suggest that those who condemn cultural appropriation are once again victims only of their self-conferred, self-marginalizing victimization.
Also in 2017, the Canadian literary community was roiled by the news that popular novelist Joseph Boyden (Three Day Road, Through the Black Spruce) had significantly overstated his Native ancestry. Boyden had been a prominent national voice on Aboriginal topics, until investigators from a Native TV network aired findings which showed the author was, in fact, mostly Celtic. “I fear that I’ve become a bit too big, one of the go-to people when it comes to Indigenous issues in this country,” he eventually admitted. “A small part of me is Indigenous, but it’s a big part of who I am.” L’affaire Boyden exposed much about contemporary politics and culture – including much we might not want to know.
There is an established assumption that some classes of people are so vulnerable that any artistic expression credited to any of their members deserves extra attention from outsiders. The canon is full of dead white males, goes the reasoning; the relatively rare females and persons of color who write, paint, compose or film must be singled out and celebrated, thus helping to redress social wrongs of the past and promote social progress in the present. This belief – manifested in public grants, academic curricula, and other official programs – has now been around long enough to be exploited. It’s not that Joseph Boyden cynically made up a nonexistent personal heritage so that he could cash in on whatever handouts were available to him, but that an entire industry of publishers, publicists, and critics was complicit with Boyden in emphasizing and exaggerating his Native background in order to burnish his reputation as a special talent meriting special coverage and special sales figures. No one, probably least of all Boyden himself, really set out to fool anybody else, but the exciting-new-spokesman-for-an-underrepresented-group angle, once it was put forward in the reviews and the book stores, was too successful to refute.
It’s also important that Boyden “identified” as Native, a verb which has lately been freighted with extra meaning. Whereas Toni Morrison never had to identify as a black woman, other categorizations require an active effort on the part of the identified. Ethnicity isn’t always apparent, of course. Yet when being part of a disadvantaged ethnicity is something that one needs to explain to others, rather than a self-evident quality others have already perceived, the marginalization seems less real (this is why we have the awkward designation “visible minority”). Discrimination is certainly a problem when it’s meted out by discriminators. When it’s sought by discriminatees, it’s not quite the same.
Joseph Boyden, to use another uncomfortable term, could pass as an Anglo-Canadian, in a way that other famous Native people (say, actor Adam Beach, hockey player Jordin Tootoo, or singer Buffy Sainte-Marie) likely couldn’t. His case recalls that of Rachel Dolezal, head of the Spokane chapter of the National Association for the Advancement of Colored People (NAACP) until 2015, when family members came forward to reveal that, despite her dreadlocks, darkish complexion, and avowed black identity, she was actually Caucasian. Here again, we have for so long considered particular populations as handicapped by history – and therefore take pains to compensate for the imbalance – that in some contexts being underprivileged has turned into a strange kind of upper hand. At least as defined by Joseph Boyden and his compassionate but credulous readership, identity is becoming like power: if you have to say you have it, you don’t.
This trio of controversies – the contested ideas of cultural genocide, of cultural appropriation, and of claimed cultural identity – shares an uncontested premise of racial victimization. North American society has lived with a pervasive awareness of the second-class citizenship held by millions of people (Natives in Canada, African-Americans in the United States) for decades. Of course the actual differences in status long predated their popular familiarity among those not subject to them, and it is in that disconnect between historical reality and modern rhetoric where the controversies arise. Blacks, Aboriginals, and other once-persecuted minorities have won major social and constitutional victories in the last fifty or sixty years. The implementation of residential schools and segregated public facilities, the open expression of racial stereotypes, and a white majority’s casual disrespect for any non-white minority achievement are politically unthinkable in our time. Yet when an unfair past continues to be the strongest argument invoked in a fair present, there is a reasonable case that the argument should be updated.
Indeed, the relentless attention given to supposedly racist treatment may be the most daunting obstacle faced by groups which once were subject to obviously racist conditions. In Canada, Native leaders have erected what amounts to a bureaucratic complex of legal advocacy, educational programming, media promotion, and social assistance to combat the bigotry they claim to still face, all of which serves only to further reduce average Natives to wards of government and so reinforce a bigoted caricature of Natives as hapless primitives unable to live as independent citizens in Twenty-First Century Canadian society. In Disrobing the Aboriginal Industry (2008), Frances Widdowson and Albert Howard point out the damaging effects of romanticizing Native spiritual customs long after they (and those of other groups) have proven obsolete in the post-industrial world: “When the government provides funds to appease the political demands inspired by the Aboriginal Industry, the result is the artificial retention of an idealized past.”
There is no reason on earth why individual Native people cannot become scientists, pilots, teachers, doctors, entrepreneurs, engineers, or other professionals; some already have. Yet the orthodoxy of Aboriginal activists and non-Aboriginal apologists holds that Natives are so burdened by history that they cannot hope to succeed in a modern economy without an initial – some might say indefinite – phase of overcoming the legacy of racism and colonialism. Generations of Natives have grown up receiving messages about what they are owed rather than what they might do for themselves. Where other cultures teach the link between effort and reward, First Nations lessons emphasize the link between complaint and compensation. Who knows how many potential mathematicians, dentists, and businesspeople Canada has lost because young Natives were subtly and explicitly told to defer their individual ambitions until the larger group got its due in a court decision, or an official apology, or a financial settlement?
At the same time, some Native and black commentators assert that the affluence and prestige enjoyed by white North Americans is a direct consequence of Native and black oppression: the forced dislocation of Native peoples and the famine, disease, and disenfranchisement they faced afterward, and centuries of slavery, lynching, and Jim Crow laws, are the exact factors which propelled European settlers to their privileged position atop a continental hierarchy. In a 2017 Atlantic essay, Ta-Nehisi Coates stated bluntly that “With one immediate exception, [Donald] Trump’s predecessors made their way to high office through the passive power of whiteness—that bloody heirloom which cannot ensure mastery of all events but can conjure a tailwind for most of them. Land theft and human plunder cleared the grounds for Trump’s forefathers and barred others from it… [R]acism remains, as it has since 1776, at the heart of [American] political life.”
Is Coates’ summary accurate? Since the Eighteenth Century, African-Americans in the United States have generally comprised between a tenth and a fifth of the total population. Before the Civil War, their numbers neared 20 percent, but with the waves of European immigration in the early Twentieth Century, blacks made up about a tenth of American people, and as of 2010 the figures have risen somewhat, to about 13 percent. In the more sparsely populated land mass of Canada, Aboriginals in 2016 numbered around five percent of the population, a proportion that has been recovering and growing since the first encounters with Europeans and the epidemics of smallpox and tuberculosis which accompanied colonization. That African Americans and Indigenous people were exploited, cheated, murdered, raped, and otherwise denied basic rights because of their races is an indisputable truth of US and Canadian history. However, the census numbers do not suggest that North America’s unprecedented material wealth and global power were attained entirely on the backs of their black and Native inhabitants. Whatever was good and bad in North American life through the years, most of it happened among white people. Unlike the British Raj or the apartheid system of South Africa, North America was never the scene of a tiny European elect dominating a huge underclass of dark-skinned unfortunates. Racism was and is real, but it is not the foundation of today’s socio-economic structure.
No doubt, the broader grievance of Ta-Nehisi Coates and other critics is valid: for centuries, whites imposed their own rules on everyone else, and then handicapped non-whites for not living up to them. Whites have been a minority of the world’s people, but over the years they ruled the world, either through brute force or political control. Yet beyond a certain point in the past – before Columbus and other explorers began making regular journeys to the Far East, the Americas, Africa, and Australia – whites could not have ruled anyone; indeed, they had little idea that other peoples existed, and had scarcely traveled among them if they did.
To Europeans of five hundred years ago, the “inferiority” of Africans, Asians, and Aboriginals was simply self-evident – colonialism and the slave trade could not have occurred if the subject peoples had effectively resisted them, or if the explorations and “discoveries” of unknown territories had gone the other way. In time, of course, grand systems of law and custom were designed to maintain white supremacy, but the initial concept of an ascending racial order was not some cynical excuse for genocide. European dominance came first; the rationalization of it came after. The game was rigged only after one team had decisively won the opening rounds. Though we rightly deplore the cruel logic that justified owning, degrading, or exterminating millions of human beings, that logic extended Eurocentrism, rather than enabled it. Though racism can be blamed for many historical wrongs, the history of a civilization from one continent conquering civilizations from others is not among them. Conquest and subjugation were what everyone did, and had long done, in 1500. Europeans just did it to more people, over a wider area, than anyone else.
This carries some uncomfortable implications. Even without five centuries of white men dictating the terms of political organization, it’s unlikely there would today be an African superpower, or an advanced Aztec or Inuit nation-state in the western hemisphere. Some European gains, such as the settlement of the Americas and the plundering of resources from Australia, Africa, and Asia, were assuredly achieved at the expense of non-Europeans. On the other hand, a counterfactual history with no Columbus and no slave trade would not guarantee native Australians, Africans, and Asians their own Renaissances, Enlightenments, and Ages of Empire in parallel with England, France, Spain, Italy, and other western kingdoms – at least not at the same time or at the same pace. Europeans have much to feel guilty about in their encounters with the world’s other people. But that unpleasant fact often overshadows one nearly as disturbing: in their encounters with Europeans, other peoples have much to feel humiliated by.
So sweeping are the effects of racism judged to be, however, that descendants of racism’s most direct victims continue to find it everywhere. Sometimes they cite the “microaggressions” committed by whites when they ask seemingly innocuous questions such as “Where are you from?” as proof that, while the overt bigotry and discrimination of decades past is now frowned upon, there still exist subtler, slyer manifestations of hostility towards people of color. Thus even a superficially friendly encounter between colleagues or classmates might be interpreted as laden with hidden messages of racial chauvinism – indeed, any unpleasant experience felt by any non-white individual can theoretically be blamed not on the universal vicissitudes of social engagement but on a systemically or institutionally racist culture.
In a sense, this too is a legacy of racism. Tens of millions of people, of all backgrounds and in many societies, have acknowledged the forced inequality that marked Europeans’ treatment of Africans, Arabs, Asians, and Native Americans, and have internalized that history to such an extent that no compensatory program of equality can erase its memory. An assumption of unfair white dominance, because it was so real for so long, is now a permanent plank of social justice. Racism is today like stupidity: a problem which can be found in any context and which can explain any negative situation. Racism is like stupidity, too, in that lives can be built around resenting it. Both racism and stupidity can be blamed for every disappointment anyone ever suffers – every job not obtained, every slight not deserved, every movie not enjoyed. Persons of intelligence can define their entire place in society as victims of stupidity, because there are certainly many stupid people who will distract them from their learning and their knowledge. Stupid people definitely make life difficult for those who aren’t stupid, and fighting stupidity can become central to one’s identity. The difference is that there are no Smart Lives Matter movements, that baseless accusations of stupidity cannot end careers or reputations, and that the stubborn persistence of public stupidity has not become an excuse which might be invoked at any instance of private failure.
At its most pernicious, the entrenchment of anti-racism as a social cause has given rise to a tangle of civil and legal regulations which mandate hiring quotas, reserve organizational positions for women or minorities, and craft the familiar job and school applications which ask candidates to identify by sex, race, or sexual orientation. Insofar as such programs are weighted in favor of females, non-whites, and non-heterosexuals, they inarguably class straight white men as second-class citizens. In practice, certainly, straight white men are not barred from employment or education, nor do they routinely get dirty looks from co-workers or classmates who don’t want them in their midst. But the principle of discrimination is the same. For job seekers, the sense of being arbitrarily discounted for belonging to the wrong category is the same. For the ambitious and the aspiring, the quiet confirmation of being excluded is the same. For the willing and the talented, the lingering suspicion of opportunities denied and abilities overlooked is the same. The fine-print, elaborately phrased No is the same.
Defenders of affirmative action and employment equity – or whatever euphemisms describe these systems – are quick to justify them as belated redresses of longstanding discrimination the other way. Obviously, straight white males once recruited workers and enrolled students almost wholly from among their own; today the thumb is only pressing on the opposite scale. Yet the wrongs of yesteryear were not simply callous exercises of power by a stronger group, at least as the group convinced itself. Segregationists truly believed that blacks and whites could not peaceably share public spaces, and would each have more chance to succeed, as much as they were able, in isolation from each other. Developers and landlords who did not accommodate families named Goldberg or Bernstein really felt Jewish people would not happily fit into a WASP milieu. Managers who gave women lower salaries for doing the same work as men genuinely thought men were entitled to extra pay, since they were the breadwinners with wives and children to support. These were all biased and shortsighted views, but they were nonetheless considered reasonable justifications of a perfectly workable status quo. “We strongly encourage applications from women, Aboriginals, visible minorities, gay, lesbian, queer, transgender people, and persons with disabilities” is also considered to be reasonable and workable terminology which serves the interests of everyone. This gives all people more chance to peaceably succeed, the language implies. This ensures all people will happily fit in. This extra consideration is just what everyone is entitled to. Who could disagree? Only, perhaps, the collective memories of the Goldbergs and the Bernsteins, of the underpaid female staff, and of the users of colored-only facilities.
One of the recurring figures in the discussions over race and gender is that of the white liberal, the “well-meaning” member of the majority whose good intentions are either fatally negated by his ignorant brethren or insidiously compromised by his own deep-seated favoritism. But perhaps more common is the merely indifferent or distracted person who has never been able to seriously entertain an active hostility to those outside his group. Such people may be Jewish, Irish, or Italian – ethnicities who have themselves known exclusion and discrimination (or worse) yet who are still categorized with a white overclass. Or they may be among those types who make the self-mocking crack, “I’m not prejudiced, I hate everyone equally.” Where do we situate these figures in the context of an allegedly racist culture? Do we divide the citizenry into well-meaners and ill-meaners, with no one in between? Can some people only be guilty of racism, and others only victims of it? Is a pre-emptive complaint of racism the only defense against an accusation of it?
For now, long-held feelings of mistrust and enmity around race, religion, and ethnicity seem as enduring as ever – demonstrated, in some parts of the world, through open warfare and virtual attempted genocide, and in others, through strident language and restrictive legalism. One new element of this endurance, however – at least in Canada, the United States, and other relatively peaceful regions – is that the mistrust and enmity are not aimed one way. While not completely reversed, the roles of the historically strong and the historically weak have been considerably altered by shifting population patterns, by assimilation, and by genuine social advances. Racial prejudice, so often glibly associated with a single overwhelming majority, has evolved into a pan-racial attitude that is harder to identify and call out, precisely because of that past association. Lasting movement toward a society of different heritages, faiths, and colors living contentedly alongside each other will be made when racism ceases to be an explanation, as much as it has already ceased to be a problem.
VI. WRONG + WRONG ≠ RIGHT
A number of recent public opinion polls have shown a troubling decline in the importance many people place on living in a democracy. A World Values Survey suggested just 30 percent of Americans born since 1980 felt citizenship in democratic societies was important to them (in contrast to 72 percent of older respondents), while a Pew Research Center survey of 2015 showed that some 40 percent of the millennial generation (born between 1980 and 2000) thought governments should have the capacity to restrict speech deemed to be offensive. Other findings indicate similar lukewarm attitudes. The ideals on which World War II and the Cold War were fought and won by western nations, at a cost of millions of lives and countless trillions of dollars, may have lost some relevance to generations with no direct experience of them.
Such was one interpretation of the data. Another reason for the sentiments may be that democracies simply have not delivered on their promises. Economic stagnation and decline; political and corporate corruption; unresponsive, dysfunctional, or irresponsible governments; Donald Trump. If these are what free and open societies have yielded for their citizens, who would have much use for freedom and openness? What good is an idealized system of law and liberty if most people under its rule feel no benefit from it?
Perhaps a third conclusion to be drawn, though, is that which follows from the observations made in this essay. If democracy is to survive – if it deserves to survive – a general diagnosis of three crucial challenges now facing it is warranted. These may not be the only problems democratic nations must address, but they are surely pressures which today’s liberal democracies cannot withstand or ignore indefinitely.
The Information Revolution: For all the explosive spread of knowledge, entertainment, and opinion afforded by the Internet in the last two decades, the transformation of global communications by online media has not been without its downsides. Issues of privacy and surveillance are prominent, as is the deluge of ephemera and distraction which sometimes threatens to overwhelm serious news and confirmed fact. But another complication the Internet has brought to modern culture is the skewing of political belief toward extremes of polarization. Anonymous message boards allow users to make statements of such hostility, rudeness, or sheer inflammatory nihilism as would be unthinkable under actual identities traceable to real individuals. Civil discourse has badly deteriorated in cyberspace. Moreover, the immediacy and convenience of social networking enables convictions deeply held by a very few to be casually parroted by a great many; the weight of genuine popular sentiment has become much harder to gauge when ideological fringes can amplify their voices far beyond what their populations could project in what is still referred to as real life. Social networking also permits damning moral verdicts to be instantly rendered on serious matters of personal behavior (charges of offensive behavior or language, say), without recourse to evidentiary hearings or back-and-forth discussion. As with many other areas of public interest, the Internet can badly misrepresent whatever is being considered, making it difficult to distinguish signal (experience, learning, truth) from noise (attention, hype, moral panic). This difficulty was present long before Facebook, Twitter, and Google, of course, but today the separation between mediated and unmediated society has become sharper than ever. The question to decide is in which of those worlds we truly live and believe.
Heightened Sensitivity: Much of the controversy which now attends debates over social justice stems from the view that standards of proper and improper thought and conduct have become far too elastic to be reasonably upheld. The controversy dates back to easy jokes about political correctness many years ago, and the “PC” epithet is today a staple insult from the right wing. But the fact that so many people for so long have had at their disposal a caricature so instantly recognizable suggests a certain validity to it: the sudden, sanctimonious shaming of harmless individuals who unknowingly say the wrong word or make the wrong observation has become much more common than any continued battles against the intentionally antagonistic. It is the ostensibly harmless, indeed, who have often become the antagonists. Thus whereas progressivism was once dedicated to the practical welfare of the manifestly disadvantaged – harassed women, segregated blacks, closeted gays, and so on – it has since turned into a gestural lobby which yields little direct good for those it claims to represent. The stereotype of the scolding, morally superior activist who would rather berate the unenlightened than help the unfortunate has become all too apt. Millions of citizens have found themselves classed as bigots for an offhand tolerance which, they are warned again and again, is too offhand and not tolerant enough. At some point, we ought to rethink whether academic models of pluralism and diversity can function well in a messily pluralistic and diverse society.
Historical Momentum: It could be that today’s political climate is only a deep, resounding echo of the truly transformative changes achieved forty or fifty years ago. Feminism and civil rights left such a legacy that generations too young to have lived through their first stirrings are participating in a shared daydream of advocacy, upheaval, and martyrdom in which the moral dramas of the 1960s and 70s are updated with the same essential story performed by new casts against new backdrops. But this daydream, acted out in imperfect democracies with an array of other crises to face, threatens democracy itself. The threat is that the centuries-old principles which defined liberal systems – one adult, one vote, period; equal rights for all, full stop – will be endlessly modified and qualified according to the demands of whichever category of people is currently accessing the levers of power. In this vision, citizens will not be the units of society – communities will be. Overturning structures of second-class citizenship in the past was so successful that the process of overturning is now assumed to be inherent in the structure of citizenship generally. Freedom is now measured not so much by what everyone and anyone can do (speak, worship, think), but rather by what particular people do more or less than others (appear on television, serve in government, go to jail). The interests of individuals are now limited to only those who share their personal characteristics of color, heritage, gender, or sexual orientation, with no possible broader concerns of economic class, party preference, or regional affiliation to unite them with outsiders. A healthier democracy would better trust its millions of single participants to choose their unique destinies, instead of lumping them into blocs of predetermined choices and collective outcomes.
Progressive politics and social justice are not evil. They have inspiring histories. The people who promote them mean well. But in recent years their movement has regressed towards exemplifying the very injustices they claim to combat: absolutism, restriction, and orthodoxy. A worthwhile future for everyone requires progressivism to turn itself around, toward the admirable goals it once had and which too many of its current acolytes seem to have forgotten.