All posts by SeeDeR68

lover of the ocean; friend and advisor to felines; co-conspirator in their plan to take over the world. Long Live FLUFFY, Destroyer of Worlds

How pre-existing conditions became front and center in health care vote

Posted on May 5, 2017
Rep. Billy Long (R-Mo.) speaks to reporters outside the White House on May 3, 2017 after a meeting with the president on proposed legislation that could limit coverage for preexisting conditions.
Susan Walsh/AP
Simon Haeder, West Virginia University

Pre-existing conditions became the focus of debate on the American Health Care Act, which was narrowly passed 217-213 by the House of Representatives.

The debate led to bitter disagreement, as Republicans sought to undo a requirement of the Affordable Care Act that insurers cover pre-existing conditions, and do so at the same premiums charged to others.

The issue, long contentious, gained further fuel this week through two illustrative videos seen by millions of Americans. On the one hand, a tearful late-night host, Jimmy Kimmel, described the nightmare of every parent when his son was born with a serious, complex, and costly birth defect. On the other hand, Rep. Mo Brooks (R-Ala.) stated that those Americans who “lead good lives” and have “done the things to keep their bodies healthy” should not have to support Americans with pre-existing conditions.

Why should this be such a contentious issue? As someone who studies and teaches health care policy in West Virginia, one of the states with the highest percentage of individuals with pre-existing conditions, let me offer some answers.

What is a pre-existing condition, anyway?

Pre-existing conditions are health conditions that were diagnosed or treated by a provider prior to the purchase of insurance. Twenty-three states even include cases where individuals did not seek medical attention but where a “prudent” person would have sought care.

Pre-existing conditions apply only to those circumstances where the sale of insurance policies is based on individual risk, as opposed to risk spread across many people, such as in employer-sponsored insurance or Medicare.

Addressing the contentious issue of pre-existing conditions, and most importantly how to distribute the costs associated with them, is crucial for any health care system. The issue has been with us from the very emergence of health insurance, particularly as for-profit insurers sought to minimize their risks and maximize their profits.

However, while most other industrialized nations have long resolved the issue equitably, the U.S. continues to struggle with it, even after the passage of the ACA.

Before passage of the ACA, pre-existing conditions were subject to a confusing mix of state and federal laws, regulations and enforcement. Almost 20 percent of the states provided no definition of preexisting conditions at all.

Insurers hence had significant leeway in determining what counted as a preexisting condition unless a state specifically banned the practice for certain conditions.

States also differed on how far back health conditions were relevant, ranging from six months to indefinitely.

Insurers could elect to deny coverage altogether to individuals with preexisting conditions in most states. In others, insurers charged much higher premiums for those with preexisting conditions.

Man being treated for sleep apnea, once an excluded preexisting condition.
From www.shutterstock.com

Insurers are generally not concerned about preexisting conditions per se, but only about those that are expected to incur significant medical costs in the future.

Basing their decisions on risk models, individual insurers have developed lists of declinable conditions (such as substance abuse, acne and sleep apnea), medications (such as heparin, Zyprexa and interferon) or occupations (such as miners, pilots and air traffic controllers).

A congressional report found that 425 medical diagnoses have been used to decline coverage.

Certain reasons for rejection fueled public outrage more than others. For example, immediately prior to the ACA’s passage, being the victim of domestic violence counted as a preexisting condition in eight states.

Similarly, many insurers also included rape as a pre-existing condition, and 45 states allowed insurers to treat prior C-sections the same way.

How the idea of denying coverage got started

The issue of pre-existing conditions is not new to the American health care system. At the beginning – in the 1920s and 1930s – emerging health insurers like Blue Cross and Blue Shield were created as nonprofits with special tax treatment. Most plans charged the same rates to all consumers.

As the insurance market became more profitable, for-profit insurers entered the market. Focused on maximizing their profits, these companies sought to attract only the healthiest individuals. They did this by offering lower premiums than their nonprofit competitors to healthy individuals.

Naturally, this entailed excluding individuals with preexisting conditions. In order to avoid being left with only the sickest individuals, all insurers eventually had to move to medical underwriting, at least in the individual market.

Over time, both the states and the federal government enacted certain, albeit very limited, protections, such as high-risk pools, for individuals with preexisting conditions.

Some states also required insurers to issue policies to all comers. These guaranteed issue requirements, however, often did not address cost issues.

As a result, while consumers may not have been denied coverage, they were penalized with higher premiums for having these conditions.

Common efforts to limit insurers’ losses from those with preexisting conditions included temporarily or permanently restricting benefits for certain enrollees based on their health condition, creating so-called bare-bones plans, and allowing insurers to charge discriminatory premiums.

However, none of the approaches offered a comprehensive solution.

A 2007 study by the Commonwealth Fund found that 36 percent of those who had tried to buy coverage in the individual market had been turned down or charged a higher price because of a preexisting condition.

An investigation by the Committee on Energy and Commerce of the House of Representatives showed that the nation’s four largest for-profit insurers, covering close to three million individuals, had turned down more than 600,000 individuals between 2007 and 2009. Moreover, during the same period they refused to pay for the treatment of a preexisting condition in more than 200,000 claims.

Those most closely affected were the 16 million Americans (in 2008) who held policies in the individual market and the additional 50 million who were uninsured.

However, transitions between insurance plans are inherently frequent in a mobile society like the United States, and a significant number of people lose their jobs in any given year. Both circumstances leave many Americans uncovered for at least part of the year, potentially sending them to the individual market for insurance.

Obamacare’s call for coverage

The pre-existing condition issue is one pretty much unique to the American health system.

The ACA sought to solve the issue through a variety of arrangements surrounding the insurance marketplaces including community rating, a minimum amount of benefits (the Essential Health Benefits), the elimination of annual and lifetime benefit limits, and subsidies.

In contrast, the American Health Care Act would allow insurers to charge higher premiums to individuals with pre-existing conditions.

The AHCA does offer some very limited funding to offset its negative effects. However, policy experts, providers and patient groups have described this funding as inadequate. The most recent Upton Amendment slightly increased it – something that possibly contributed to the bill’s passage. But policy experts continue to see the funding as far too small.

Are we all in this together, or not?

Millions of Americans could potentially be affected by the changes under the new legislation.

The point is that pre-existing conditions remain ubiquitous in American society. A Kaiser Family Foundation analysis a few months ago found that 52 million Americans under age 65, or 27 percent of that population, would not be able to obtain insurance on their own under pre-ACA conditions.

The situation was considerably worse in states like West Virginia, Mississippi, Kentucky and Alabama, where more than one in three residents, according to the analysis, would not be able to.

Making sure that those among us with pre-existing conditions have health care is challenging and unquestionably costly. It also requires a degree of sacrifice, in terms of higher premiums, from those who, at any given point in time, are relatively healthy.

What is required is a degree of solidarity with our neighbors, friends and family members who, often through no fault of their own, have suffered from poor health. Not the least, it is a degree of solidarity with our own future selves as all of us could fall sick at any point in time.

Americans of all political persuasions seem to be willing to make the required sacrifices. Most Americans, including 63 percent of Republicans and 75 percent of Democrats in a recent poll, support the preexisting condition components of the Affordable Care Act.

Simon Haeder, Assistant Professor of Political Science, West Virginia University

This article was originally published on The Conversation. Read the original article.

https://mediabiasfactcheck.com/2017/05/05/how-pre-existing-conditions-became-front-and-center-in-health-care-vote/


16:9 in English: The Original Function of Groucho Marx’s Resignation Joke

By RICHARD RASKIN

Groucho Marx sent the following wire to a Hollywood club he had joined: “Please accept my resignation. I don’t want to belong to any club that will accept me as a member.”

Introduction

Jokes that play on self-disparagement should not be taken at face value, as though they were unequivocally sincere expressions of the way in which the jokester actually perceives him- or herself. Sometimes the self-presentation involved is based on a fictional persona, propped up as a target of ridicule, such as the character Jack Benny played in his radio and television shows, when he gave new meaning to the concept of stinginess. The most memorable radio sketch was the one in which Benny is stopped by a mugger who says something like, “All right, buddy, your money or your life,” after which the continuing silence becomes funnier with every passing second. It would be stupid to mistake the fictional character, who can’t decide whether he cares more about his own life or the money he is carrying at the moment, for the person pretending to be that character, even if one didn’t know that in his private life Benny was notoriously generous in giving to charities.

It is also common for professional entertainers to base their jokes on a potential liability for their career – turning that liability into an asset. This is what George Burns did for decades, with self-disparaging jokes that call the audience’s attention to the state of his aging body and his presumed loss of sexual viability. For example, at a show he did in 1974, at the age of 78, he made such cracks as: “At my age, the only thing about me that still works is my right foot – the one I dance with,” and “The only thing that gets me excited is if the soup is too hot.” Through these jokes, the comedian turns to his own advantage a condition which might otherwise interfere with his continued acceptance as a vital entertainer. Some comediennes use jokes disparaging their sexual attractiveness in much the same way, such as Phyllis Diller’s “I never made Who’s Who but I’m featured in What’s That,” and “Have you ever seen a soufflé that fell? – nature sure slammed the oven door on me.”

Fig. 1. Groucho Marx.

 

 

The present article is a slightly modified version of a chapter in the author’s book, Life Is Like a Glass of Tea: Studies of Classic Jewish Jokes (Aarhus: Aarhus University Press, 1992), pp. 121-130.

One of the all-time classics of self-disparaging humor is Groucho Marx’s famous telegram. In reconstructing the situation in which the comedian actually used the telegram, I will try to show, in a kind of “case study” of the joke, that the last thing on Groucho’s mind was any concern with his own failings as a human being. But first, a brief discussion of the way in which the joke was used by Woody Allen will help to set the stage for our analysis.

Annie Hall

Soon after the opening credits of Annie Hall (1977), Woody Allen tells the “Resignation Joke” while facing the camera (fig. 2), in his role as Alvy Singer (1):

The – the other important joke for me is one that’s, uh, usually attributed to Groucho Marx but I think it appears originally in Freud’s Wit and its Relation to the Unconscious. And it goes like this – I’m paraphrasing: Uh… “I would never wanna belong to any club that would have someone like me for a member.” That’s the key joke of my adult life in terms of my relationships with women. (Allen, 1983: 4)

This “key joke” functions here as a self-diagnostic tool enabling our hero – as well as the viewer – to conceptualize a particular neurotic pattern in the life of a person who allows his feelings of unworthiness to prevent him from wanting any woman who would want him. This self-diagnostic use of the joke is further developed in a subsequent scene in which Alvy Singer interrupts his love-making with Allison Portchnik, and succeeds in engaging her in a discussion of John F. Kennedy’s assassination (fig. 3-4). When Allison says: “You’re using this conspiracy theory as an excuse to avoid sex with me,” Alvy replies:

Oh, my God! (Then, to the camera) She’s right! Why did I turn off Allison Portchnik? She was – she was beautiful. She was willing. She was real… intelligent. (Sighing) Is it the old Groucho Marx joke? That – that I – I just don’t wanna belong to any club that would have someone like me for a member? (Allen, 1983:  22-23)

As already seen, Alvy attributed this joke to Sigmund Freud’s Wit and its Relation to the Unconscious (1905). Actually, neither the joke itself nor any likely forerunner appears in that book. Alvy’s creator was probably thinking of a joke which had appeared in Theodor Reik’s Jewish Wit in the following form (2):

Every day in a coffee house, two Jews sit and play cards. One day they quarrel and Moritz furiously shouts at his friend: “What kind of a guy can you be if you sit down every evening playing cards with a fellow who sits down to play cards with a guy like you!” (Reik, 1962: 57-8)

Alvy’s confusion of Reik’s book with Freud’s takes nothing away from Woody Allen’s brilliant use of the joke in Annie Hall.

Virtually nothing has been written about the “Resignation Joke” in the literature on Groucho Marx. This is surprising, considering the notoriety enjoyed by the joke, especially since interest in it was revived by Woody Allen in 1977. Furthermore, none of the commentators who discuss the joke at all – Sheekman (3), McCaffrey (4), Wilson (5) and Arce (6) – raise the question as to why Groucho sent the famous telegram and what purpose it was intended to fulfill. The situation in which the telegram was sent will now be reconstructed, after which the original function of the “Resignation Joke” will be described, and an attempt will be made to account for its effectiveness in fulfilling that intended function.

The Friar’s Club Incident

We have two sources of information concerning the context in which Groucho Marx first used the “Resignation Joke.” The earlier of these sources is the biography written by the comedian’s son, Arthur Marx, who provided the following account:

[The actor, Georgie] Jessel has always been able to make Father laugh, and as a favor to him, he joined the Hollywood chapter of the Friar’s Club a couple of years ago. But Father doesn’t like club life, and, after a few months, he dropped out. The Friars were disappointed over losing him, and wanted to know why he was resigning. They weren’t satisfied with his original explanation – that he just didn’t have time to participate in the club’s activities. He must have another, more valid reason, they felt.

“I do have another reason,” he wrote back promptly. “I didn’t want to tell you, but since you’ve forced the issue, I just don’t want to belong to any club that would have me as a member.” (A. Marx, 1954: 45)

Since this biography appeared in 1954, “a couple of years ago” would place the incident somewhere in the vicinity of 1950-1952, assuming that a year or two may have elapsed between the writing and the publication of the book.

The other account we have was written by the comedian himself in the autobiography that was published in 1959. Much unpleasantness had apparently been omitted from the earlier record, perhaps out of discretion, in order to avoid offending anyone, or because any public criticism leveled at the club had to come from Groucho himself, and not his son. And even here, Groucho took the precaution of withholding the name of the club, which appears under the same alias (“Delaney”) that is jokingly applied to a number of parties portrayed in the autobiography in an unfavorable light.

Groucho begins by telling of his general aversion to clubs, and this is consistent with the earlier description in his son’s book, though here the aversion is concretized to a fuller extent:

I’m not a particularly gregarious fellow. If anything, I suppose I’m a bit on the misanthropic side. I’ve tried being a jolly good club member, but after a month or so my mouth always aches from baring my teeth in a false smile. The pseudo-friendliness, the limp handshake and the extra firm handshake (both of which should be abolished by the Health Department), are not for me. This also goes for the hearty slap-on-the-back and the all-around, general clap-trap that you are subjected to from the All-American bores which you would instantly flee from if you weren’t trapped in a clubhouse. (G. Marx, 1959: 320)

In the remainder of his account, specific grievances Groucho had against the Friar’s Club (alias “Delaney Club”) come to light:

Some years ago, after considerable urging, I consented to join a prominent theatrical organization. By an odd coincidence, it was called the Delaney Club. Here, I thought, within these hallowed walls of Thespis, we would sit of an evening with our Napoleon brandies and long-stemmed pipes and discuss Chaucer, Charles Lamb, Ruskin, Voltaire, Booth, the Barrymores, Duse, Shakespeare, Bernhardt and all the other legendary figures of the theatre and literature. The first night I went there, I found thirty-two fellows playing gin rummy with marked cards, five members shooting loaded dice on a suspiciously bumpy carpet and four members in separate phone booths calling women who were other members’ wives.

A few nights later the club had a banquet. I don’t clearly remember what the occasion was. I think it was to honor one of the members who had successfully managed to evade the police for over a year. The dining tables were long and narrow, and unless you arrived around three in the afternoon you had no control over who your dinner companion was going to be. That particular night I was sitting next to a barber who had cut me many times, both socially and with a razor. At one point he looked slowly around the room, then turned to me and said, “Groucho, we’re certainly getting a lousy batch of new members!”

I chose to ignore this remark and tried talking to him about Chaucer, Ruskin and Shakespeare, but he had switched to denouncing electric razors as a death blow to the tonsorial arts, so I dried up and resumed drinking. The following morning I sent the club a wire stating, PLEASE ACCEPT MY RESIGNATION. I DON’T WANT TO BELONG TO ANY CLUB THAT WILL ACCEPT ME AS A MEMBER. (G. Marx, 1959: 320-321)

Allowances should certainly be made for a good deal of exaggeration in the account cited above. Much of it is tongue-in-cheek, and designed to entertain the reader. However, the basic picture, regarding Groucho’s attitude toward the Friar’s Club, can undoubtedly be taken at face value.

If the two accounts – the son’s and the father’s – are allowed to complete each other, we can conclude that the full sequence of events probably looked something like this:

1) Groucho allows himself to be talked into joining the Friar’s Club, though he doesn’t like clubs in general.

2) He quickly becomes fed up with this club in particular, because of what he sees as its low intellectual and ethical standards.

3) The last straw is the final offensive remark in a series of insults to which he is subjected by a member of the club.

4) Groucho notifies the club that he is quitting, inoffensively giving as his excuse that he just doesn’t have time to participate in the club’s activities.

5) Unhappy about Groucho’s resignation and sensing that there may be more to it than the comedian is letting on, club members press him for the “real” reason.

6) Wanting to be done with this entanglement once and for all, Groucho pretends to disclose the real reason in the famous telegram, and is left alone from then on.

Seen in this light, it is clear that the “Resignation Joke” was invented to fulfill a tactical purpose: that of extricating Groucho from an unpleasant situation, by discouraging any further efforts on the part of club members to obtain a fuller explanation as to his reasons for resigning. But why did it work? To some degree, the apparent self-disparagement may have had a disarming effect. However, I suspect that two properties of the telegram played an even more important role in enabling it to fulfill its intended social function.

One of those properties is a defiance of logic of essentially the same type as that found in impossible figures, which induce cognitive confusion by violating their own logic in so compelling a manner that we cannot grasp how they fit together.


Figures: the “Penrose triangle,” and the “three-stick clevis” or “two-pronged trident.”

When Groucho Marx couched his “explanation” in the form of an impossible figure, he confronted the club-members with a piece of reasoning that was as impregnable to logic as a “Penrose triangle” or “three-stick clevis,” and which undoubtedly mystified those who would otherwise have pressed him for the real reason for his resignation. There is simply no arguing with an impossible figure, or with a person who is capable of generating one, which in a game situation is like checkmate in the sense that it marks the end of the contest, allowing for no subsequent move.

The second property of the telegram which accounts for its effectiveness is the fact that it was framed as a joke. In delivering his “explanation” in a form calculated to provoke laughter, Groucho made it difficult for the club-members to know how to react without looking foolish, especially since they were already implicated in the joke, as a collective butt. As one commentator put it – though not in connection with the famous telegram: “Groucho may be the most powerful clown ever. […] because Groucho has the power to turn us nonfools into his private stock.” (Despot, 1981: 671)

Furthermore, the comedian’s toying with shared ridicule may have functioned as a kind of negotiation on his part: signaling his preference for severing the relationship in a playful spirit, as well as his willingness to assume (or pretend to assume) the blame for its failure, thereby sparing the club-members’ feelings in exchange for a clean break. It was also a means for telling them indirectly and unmistakably that they were no match for his wit.

In any event, the joke put an end to the club-members’ requests for an explanation, thereby fulfilling a very specific social function. In the process, of course, Groucho launched a hilarious “one-liner” which (he must have sensed) would be retold countless times, and would become a lasting part of his own comic profile.

Paradoxically, one of the most striking examples of a self-disparaging joke turns out to have been motivated by a wish on the jokester’s part to dissociate himself once and for all from a group of people to whom he felt superior.

– – –

Fig. 2. Woody Allen tells the “Resignation Joke” in Annie Hall.

(1) I have taken the liberty of correcting the typography of the title of Freud’s book.

Fig. 3. “You’re using this conspiracy theory as an excuse to avoid sex with me.”

Fig. 4. “Is it the old Groucho Marx joke?”

(2) For the publication history of this joke, see Life Is Like a Glass of Tea: Studies of Classic Jewish Jokes (Aarhus: Aarhus University Press, 1992),  pp. 189-190.

(3) In his introduction to The Groucho Letters (New York: Simon & Schuster, 1967), Arthur Sheekman wrote of the joke: “There, in a few satirical words, is one of the most astute and revealing observations about the self-hating, socially ambitious human animal” (p. 8).

(4) For Donald W. McCaffrey, the joke was a non sequitur, resulting from “a chain reaction of delightful pseudo-logic that almost sounded valid.” The Golden Age of Sound Comedy (South Brunswick and New York: Barnes, 1973), p. 74.

(5) Christopher Wilson described the “Resignation Joke” as an example of shared ridicule, through which “the joker derides himself and his audience simultaneously […] The message of shared disparagement being–’If you don’t mind me, you’ve got no taste!'” Wilson was also the first to identify the joke as “a variant of the famous Jewish joke–’What sort of a shmuck do you think I am? I’m not going to sit down and play cards with the sort of shmuck who’d sit down and play cards with me.” Jokes: Form, Content, Use and Function (London: Academic Press, 1978), p. 190.

(6) In his introduction to The Groucho Phile (London: W. H. Allen, 1978), Hector Arce was the first to set the telegram in its social context: Referring to the Friar’s Club of Beverly Hills, Arce wrote that Groucho “had some misgivings about the quality of the members, doubts which were verified a few years later when an infamous card-cheating scandal erupted there. When he decided to drop out of the group, he wrote: ‘Gentlemen: Please accept my resignation. I don’t care to belong to any social organisation that will accept me as a member'” (p. xv).

Postscript

After completing this article, I found the following remark entirely by chance: “I can never be satisfied with anyone who would be block-head enough to have me” – a statement penned by none other than Abraham Lincoln in 1838 (7). Its possible significance in relation to Groucho Marx’s resignation joke will be considered in a future article.

(7) Letter to Eliza Browning (Mrs. Orville H. Browning) dated April 1, 1838. This letter is reproduced in its entirety in The Collected Works of Abraham Lincoln, edited by Roy P. Basler (New Brunswick, N.J.: Rutgers University Press, 1953), Vol. 1, pp. 117-119.

Facts

Quotation record

Curiously enough, this one-liner is never quoted in precisely the same way by any two people. However, its underlying concept is so strong that the wording of the punch-line can be varied without in any way altering the impact of the joke. Here are twelve versions of the main sentence:

“I just don’t want to belong to any club that will accept me as a member.”

Arthur Marx, Life with Groucho. New York: Simon & Schuster, 1954; p. 45.

“I don’t want to belong to any club that will accept me as a member.”

Groucho Marx, Groucho and Me. New York: Bernard Geis, 1959; p. 321.

“I don’t care to belong to any club that will have me as a member.”

Arthur Sheekman in Groucho Marx, The Groucho Letters. New York: Simon & Schuster, 1967; p. 8.

“I wouldn’t belong to any organization that would have me for a member.”

Joey Adams, Encyclopedia of Humor. Indianapolis and New York: Bobbs-Merrill, 1968; p. 359.

“I wouldn’t join a club that would have me as a member.”

Lore and Maurice Cowan, The Wit of the Jews. London: Leslie Frewin, 1970; p. 96.

“I wouldn’t belong to an organization that would have me as a member.”

Donald W. McCaffrey, The Golden Age of Sound Comedy. South Brunswick and New York: Barnes, 1973; p. 74.

“I would never want to belong to any club that would have someone like me for a member.”

Woody Allen’s film, Annie Hall (1977).

“I don’t care to belong to any social organization that will accept me as a member.”

Hector Arce in Groucho Marx, The Groucho Phile. London: W. H. Allen, 1978; p.  xv.

“I don’t wish to belong to any club that would accept me as a member.”

Christopher P. Wilson, Jokes: Form, Content, Use and Function. London: Academic Press, 1978; p. 190.

“I wouldn’t join any club that would have me as a member.”

William Novak and Moshe Waldoks, The Big Book of Jewish Humor. New York: Harper & Row, 1981; p. 85.

“I do not care to belong to a club that accepts people like me as members.”

Joseph Dorinson, “Jewish Humor. Mechanism for Defense, Weapon for Cultural Affirmation,” Journal of Psycho-History 8, 4  (1981);  p. 452.

“I do not wish to belong to the kind of club that accepts people like me as member.”

Leo Rosten, Giant Book of Laughter. New York: Crown, 1985; p. 227.

I have run into only one commentator who actually succeeded in butchering this joke:

“Another of the many stories about the Marx Brothers concerns Groucho, who is alleged to have applied for membership of an exclu­sive New York club. When he was told that his application was accepted he is said to have pointed out that no club with a good repu­tation could possibly accept Groucho Marx as a member–therefore he would rather stay away. And he did.” John Montgomery, Comedy Films. London: George Allen & Unwin, 1954; p. 251.

The false dichotomy of trigger warnings BY DANTIP on MAY 28, 2015


Ovid

by Massimo Pigliucci

There has been lots of talk about so-called “trigger warnings” lately. Although the practice originated outside the university (largely on feminist message boards in the ’90s, and then in the blogosphere [1]), within the academy the idea is that professors should issue warnings to their students about potentially disturbing material that they are about to read or otherwise be exposed to. The warnings are necessary, advocates say, because such material may “trigger” episodes of discomfort, emotional pain, or outright post-traumatic stress disorder (PTSD).

This is clearly a crucial issue for a teacher such as myself, who is responsible for contributing to the education of scores of students every semester, and who is of course also concerned about their welfare and their thriving as human beings. So I read a lot, and widely (meaning both pro and con), about the issue, and have talked to colleagues and a number of students, in order to make up my mind not just in a theoretical sense, but also as guidance to my own actual practice in the classroom.

One of the most recent episodes concerning the controversy over trigger warnings (henceforth, TW) featured four Columbia University students belonging to the local Multicultural Affairs Advisory Board, who wrote a letter to the Columbia Spectator [2] arguing that exposure to the writings of the classic Roman poet Ovid should have come with TW because they contain references to rape. Referring to the experience of another student in a Literature Humanities course, the letter reads, in part:

During the week spent on Ovid’s “Metamorphoses,” the class was instructed to read the myths of Persephone and Daphne, both of which include vivid depictions of rape and sexual assault. As a survivor of sexual assault, the student described being triggered while reading such detailed accounts of rape throughout the work. However, the student said her professor focused on the beauty of the language and the splendor of the imagery when lecturing on the text. As a result, the student completely disengaged from the class discussion as a means of self-preservation. She did not feel safe in the class. When she approached her professor after class, the student said she was essentially dismissed, and her concerns were ignored.

As far as I can tell, this is pretty representative of some students’ point of view on the issue. Let me now give you a taste of how some faculty responded to this sort of argument. (I will ignore the more brash and insensitive commentary that has come especially from some conservative and libertarian quarters, because I don’t think it helps the discussion move forward. If you really wish to have a taste of it, read through a partial compilation published by The Washington Post [3].)

For instance, a group of seven professors who teach in some of the fields most often targeted by advocates of TW — gender studies, critical race studies, film and visual studies, literary studies — listed a number of reasons why TW are a bad idea [4], among which:

Faculty cannot predict in advance what will be triggering for students.

The idea that trauma is reignited by representations of the particular traumatizing experience is not supported by the research on post-traumatic stress disorder and trauma.

There is no mechanism, in the discourse of ‘triggering,’ for distinguishing material that is oppositional or critical in its representation of traumatizing experience from that which is sensationalistic or gratuitous.

PTSD is a disability; as with all disabilities, students and faculty deserve to have effective resources provided by independent campus offices.

Faculty of color, queer faculty, and faculty teaching in gender/sexuality studies, critical race theory, and the visual/performing arts will likely be disproportionate targets of student complaints about triggering.

Trigger warnings may provide a dangerous illusion that a campus has solved or is systematically addressing its problems with sexual assault, racial aggression, and other forms of campus violence, when, in fact, the opposite may be true.

These two excerpts already lay out much of the meat of the discourse on TW. On the one hand, faculty ought to be sensitive, rather than dismissive, to students’ concerns. This is our duty both as teachers and, simply speaking, as human beings. On the other hand, there are several reasons to think that requiring formal administrative policies about TW (as a number of students are now requesting, and universities are considering) is likely to have a good deal of negative consequences, not only for faculty, but for the students themselves.

Indeed, some students are pushing back against their own colleagues. Here are a number of comments collected during a survey on TW by a faculty member who wished to explore the issue with her own students [5]:

“I would like to experience the novel without warning beforehand.”

“I think one purpose of triggers is to face deep trauma and to hopefully grow from it.”

“This is the real world and bad things happen. Caring for those affected by these topics is also a necessity.”

“If someone is so shocked that they couldn’t deal with readings, they should really be seeking help professionally and not take the class at this time.”

The same faculty member, Lori Horvitz, points out that she feels unjustly attacked when students who push TW imply (or say outright) that she is simply unconcerned about their welfare: “I want to scream: ‘I care! This is why I have chosen to teach difficult material, about the oppression of women and minorities, in the first place.’”

The American Association of University Professors has also tackled the issue, and it has come down squarely against TW [6], for many of the same reasons listed by the multi-faculty op-ed mentioned above. The report begins by noticing how the range of subject matters that have been put forth for TW is vast, covering pretty much every potentially controversial (and educational) topic within the academy: racism, classism, sexism, heterosexism, cissexism, ableism, and other issues of privilege and oppression. The authors of the report mentioned a specific incident in which students at Wellesley College objected to a sculpture of a man in his underwear on the grounds that it might be a source of triggering thoughts regarding sexual assault, even though the artist wanted to represent sleepwalking.

Here are some of the most salient points of the AAUP report:

* The presumption that students need to be protected rather than challenged in a classroom is at once infantilizing and anti-intellectual.

* [TW] single out politically controversial topics like sex, race, class, capitalism, and colonialism for attention. … If such topics are associated with triggers, correctly or not, they are likely to be marginalized if not avoided altogether.

* Administrative regulation constitutes interference with academic freedom; faculty judgment is a legitimate exercise of autonomy.

* Trigger warnings conflate exceptional individual experience of trauma with the anticipation of trauma for an entire group.

* A trigger warning might lead a student to simply not read an assignment or it might elicit a response from students they otherwise would not have had.

* Some discomfort is inevitable in classrooms if the goal is to expose students to new ideas, have them question beliefs they have taken for granted, grapple with ethical problems they have never considered.

* Trigger warnings reduce students to vulnerable victims rather than full participants in the intellectual process of education.

* The classroom is not the appropriate venue to treat PTSD, which is a medical condition that requires serious medical treatment.

* Trigger warnings are a way of displacing the problem, locating its solution in the classroom rather than in administrative attention to social behaviors that permit sexual violence to take place.

Again, while students’ concerns should never be treated lightly, the above list also raises a number of crucial objections to TW which go right to the core of what it means to engage in higher education, and they too should not be dismissed as simply a parochial attempt by faculty to retain their “privilege,” or simply to save their ass from being sued. (To achieve the latter goal, actually, it would probably be easier to just slap a generic label on every syllabus and be done with it. Though of course that would hardly do anything useful for anyone.)

One of the recent contributions to discussions about TW that I found most compelling, however, is an article by Todd Gitlin in Tablet Magazine [7]. Gitlin doesn’t provide a systematic list of concerns, he simply begins by recalling a stark episode that occurred during his own education, when one of his teachers exposed the class — needless to say, without warning —  to two films about Nazi Germany and the Holocaust. The first one was Triumph of the Will [8], often considered the “greatest” Nazi propaganda movie ever made; the second one was Night and Fog, by Alain Resnais [9], the first ever documentary about the Holocaust. This is Gitlin’s commentary on the episode:

The juxtaposition of the two films was, of course, no accident. They were programmed in sequence to make unavoidable the sense of a causal vector running from the submissive ecstasies of Nuremberg to the horrors of Auschwitz. You didn’t need a diagram. It was a shattering afternoon. The audience left in dead silence.

I’ve not forgotten the shock and logic of the segue. (Neither has a classmate I checked with, who was there as well.) Those images were engraved into our souls. The cinematic double whammy certainly made me, to use the current euphemism, ‘uncomfortable.’ Oh yes, to put it mildly, it made me very uncomfortable. That was the point. Mission accomplished, Professor Sam Beer of Harvard’s Soc Sci 2. You impressed upon this 19-year-old soul an unbearable, ineradicable warning about mass rallies and mass murder. You didn’t draw me a diagram. You burned into me that more powerful thing: a synapse.

Gitlin then recounted his very recent encounter with journalist Charif Kiwan, who introduced — at Columbia University — a documentary about the ongoing destruction in Syria with the following words: “We want to haunt your imagination. Please be disturbed.”

So, what is there to be done about trigger warnings? On the one hand, we have a strong push by (some) students for a fairly broad application of the concept to an extensive variety of subject matters and individual texts or other materials that are often used in academic settings. The rationale behind this push is a series of concerns, ranging from not wanting to experience discomfort in class to wishing to avoid episodes of PTSD in people who are prone to them.

On the other hand, we have a pretty strong push back by a number of faculty, and one of their leading organizations. Here too the rationales are varied, from issues of academic freedom to the lack of empirical support for the effectiveness of TW, from the possibility that they offer a false sense of security (and, mostly, cover for the administration) to the risk of invalidating precisely what is most precious about higher education.

There is one concept that seems to have eluded the majority of articles and interviews on TW, though (of course, I haven’t done an exhaustive search, and I would be stunned if nobody has brought this up before!): the idea of best practice, on the part of faculty.

University faculty are professionals who develop two fundamental skills during their careers: scholarship and teaching (either in that or in reverse order of importance, depending on the institution at which they work). Both of the corresponding activities inevitably present ethical issues. A faculty member qua scholar, for instance, knows (or should know) that it is not acceptable to plagiarize other people’s work, or to fabricate data, or to take unfair advantage of the work done by junior colleagues and students. This doesn’t mean, of course, that these things don’t happen. But when they do, both the university and professional organizations already have tools to act appropriately to redress the wrong and punish the offender.

Similarly, when it comes to teaching, my colleagues and I know that certain things are unacceptable. Students’ complaints should not be dismissed out of hand; students should not only be allowed, but encouraged, to analyze critically not just the materials they are given, but even the very structure of the courses they are taking, no holds barred. Again, when faculty fail to do so there are already mechanisms at the professional and administrative levels to deal with it (I know because I was a Department Chair for five years, and I have dealt with some pertinent cases).

When it comes to the issue we have been tackling, then, best practice most certainly includes the idea that one doesn’t spring shocking material on students for the sake of shock: it has to have pedagogical content. It is not okay, say, to start a class on human anatomy by showing a video of a beheading carried out by ISIS. There would be no point at all in doing so, other than a perverse delight in disturbing one’s students. But such a video may very well be pertinent in a class devoted to the study of terrorism, for instance. Should it be accompanied by a warning that potentially disturbing material will be shown? Hell yes, but that’s just common sense (and, again, good practice), as the material is disturbing quite irrespective of whether it triggers memories of one’s own experiences (not many students have that particular kind of memory, after all). What about something like Ovid’s description of rape in the Metamorphoses? [10] I’m not a classicist, but that does not sound to me like it requires a special warning, although it would be good — in the modern classroom — for the faculty to lead a discussion not only about the poetic language (which is, indeed, beautiful) but also about the cultural and historical context that made rape the subject of poetry to begin with.

The issue, then, is: can we come up with general, encompassing rules for what requires a warning and what doesn’t? And who is to make the relevant decisions in practice?

The answer to the first question seems clear on empirical grounds: no. There are too many situations and materials, and students’ experiences are too varied, for it to be possible to arrive at operationally useful rules. And a simple generic label won’t do the trick; in fact, it would likely have negative pedagogical consequences, making a mockery of the whole idea of TW.

The answer to the second question is: not students, and not administrators, but faculty (though with input from both students and administrators). Why? Because, as hard as it seems to be for the American public to understand these days, teachers are professionals, who are therefore much better positioned than either students or administrators when it comes to deciding what and how to teach.

Administrators these days tend to think of themselves as the real owners of universities, but in fact they are by far the least important component of all: both research and teaching are done by faculty, and the major point of a university is to teach students. Administrators are there to, well, administer, i.e., to do their best so that the people they serve — the students and the faculty — can respectively learn and do their jobs. Period.

Students, for their part, are not “customers,” as they are often portrayed nowadays. And they are not equal players in the classroom either. There is a (good) reason why I’m standing in front of the class and they are lined up on the other side, just like there is a reason why you sit on the table while the doctor examines you, not the other way around. That said, of course, students (like patients) have rights, which include being heard by the faculty (doctor) with the expectation that their point of view will be taken into due consideration, and that if it isn’t, they have further recourse (to the university’s or the hospital’s administration).

Best practice, then, means that we should reject the imposition of official policies about TW, but also that faculty have a (moral, pedagogical) responsibility to conduct themselves in the classroom in a way that serves their students to the best of their abilities. And this may include occasional warnings for specific instances of potentially disturbing material. But bear in mind the conclusion of Gitlin’s essay mentioned above: “Ye shall know the truth, and the truth shall make ye free. Not comfortable — free.”

_____

Massimo Pigliucci is a biologist and philosopher at the City University of New York. His main interests are in the philosophy of science and pseudoscience. He is the editor-in-chief of Scientia Salon, and his latest book (co-edited with Maarten Boudry) is Philosophy of Pseudoscience: Reconsidering the Demarcation Problem (University of Chicago Press).

[1] How The “Trigger Warning” Took Over The Internet, by A. Vingiano, BuzzFeed, 5 May 2014.

[2] Our identities matter in Core classrooms, by K. Johnson, T. Lynch, E. Monroe, and T. Wang, Columbia Spectator, 30 April 2015.

[3] Columbia students claim Greek mythology needs a trigger warning, by M.E. Miller, The Washington Post, 14 May 2015.

[4] Trigger Warnings Are Flawed, by E. Freeman, B. Herrera, N. Hurley, H. King, D. Luciano, D. Seitler, and P. White, Inside Higher Ed, 29 May 2014.

[5] Life doesn’t come with trigger warnings. Why should books?, by L. Horvitz, The Guardian, 18 May 2015.

[6] On Trigger Warnings, by the AAUP Committee on Academic Freedom and Tenure, August 2014.

[7] Please Be Disturbed: Triggering Can Be Good for You, Kids, by T. Gitlin, Tablet Magazine, 13 March 2015.

[8] The full Nazi propaganda movie (1 hr 44 min) is available online.

[9] Resnais’ documentary (about 32 minutes) is also available online.

[10] Ovid’s text can be found online for those who wish to check it out for themselves.

The false dichotomy of nature-nurture, with notes on feminism, transgenderism, and the construction of races


The Reactionary Temptation

An open-minded inquiry into the close-minded ideology that is the most dominant political force of our time — and can no longer be ignored.

By Andrew Sullivan

http://nymag.com/daily/intelligencer/2017/04/andrew-sullivan-why-the-reactionary-right-must-be-taken-seriously.html

Border wall near Los Indios, Texas, 2015. Photograph by Richard Misrach

Look around you. Donald Trump is now president of the United States, having won on a campaign that trashed liberal democracy itself, and is now presiding over an administration staffed, in part, with adherents of a political philosophy largely alien to mainstream American politics. In Russia, Vladimir Putin has driven his country from postcommunist capitalism to a new and popular czardom, empowered by nationalism and blessed by a resurgent Orthodox Church. Britain, where the idea of free trade was born, is withdrawing from the largest free market on the planet because of fears that national identity and sovereignty are under threat. In France, a reconstructed neofascist, Marine Le Pen, has just won a place in the final round of the presidential election. In the Netherlands, the anti-immigrant right became the second-most-popular vote-getter — a new high-water mark for illiberalism in that once famously liberal country. Austria narrowly avoided installing a neo-reactionary president in last year’s two elections. Japan is led by a government attempting to rehabilitate its imperial, nationalist past. Poland is now run by an illiberal Catholic government that is dismembering key liberal institutions. Turkey has morphed from a resolutely secular state to one run by an Islamic strongman, whose powers were just ominously increased by a referendum. Israel has shifted from secular socialism to a raw ethno-nationalism.

We are living in an era of populism and demagoguery. And yes, there’s racism and xenophobia mixed into it. But what we are also seeing, it seems to me, is the manifest return of a distinctive political and intellectual tendency with deep roots: reactionism.

Reactionism is not the same thing as conservatism. It’s far more potent a brew. Reactionary thought begins, usually, with acute despair at the present moment and a memory of a previous golden age. It then posits a moment in the past when everything went to hell and proposes to turn things back to what they once were. It is not simply a conservative preference for things as they are, with a few nudges back, but a passionate loathing of the status quo and a desire to return to the past in one emotionally cathartic revolt. If conservatives are pessimistic, reactionaries are apocalyptic. If conservatives value elites, reactionaries seethe with contempt for them. If conservatives believe in institutions, reactionaries want to blow them up. If conservatives tend to resist too radical a change, reactionaries want a revolution. Though it took some time to reveal itself, today’s Republican Party — from Newt Gingrich’s Republican Revolution to today’s Age of Trump — is not a conservative party. It is a reactionary party that is now at the peak of its political power.

The reactionary impulse is, of course, not new in human history. Whenever human life has changed sharply and suddenly over the eons, reactionism has surfaced. It appeared in early modernity with the ferocity of the Catholic Counter-Reformation in response to the emergence of Protestantism. Its archetypal moment came in the wake of the French Revolution, as monarchists and Catholics surveyed the damage and tried to resurrect the past. Its darkest American incarnation took place after Reconstruction, as a backlash to the Civil War victory of the North; a full century later, following the success of the civil-rights movement, it bubbled up among the white voters of Richard Nixon’s “silent majority.” The pendulum is always swinging. Sometimes it swings back with unusual speed and power.

You can almost feel the g-force today. What are this generation’s reactionaries reacting to? They’re reacting, as they have always done, to modernity. But their current reaction is proportional to the bewildering pace of change in the world today. They are responding, at some deep, visceral level, to the sense that they are no longer in control of their own lives. They see the relentless tides of globalization, free trade, multiculturalism, and mass immigration eroding their sense of national identity. They believe that the profound shifts in the global economy reward highly educated, multicultural enclaves and punish more racially and culturally homogeneous working-class populations. And they rebel against the entrenched power of elites who, in their view, reflexively sustain all of the above.

I know why many want to dismiss all of this as mere hate, as some of it certainly is. I also recognize that engaging with the ideas of this movement is a tricky exercise in our current political climate. Among many liberals, there is an understandable impulse to raise the drawbridge, to deny certain ideas access to respectable conversation, to prevent certain concepts from being “normalized.” But the normalization has already occurred — thanks, largely, to voters across the West — and willfully blinding ourselves to the most potent political movement of the moment will not make it go away. Indeed, the more I read today’s more serious reactionary writers, the more I’m convinced they are much more in tune with the current global mood than today’s conservatives, liberals, and progressives. I find myself repelled by many of their themes — and yet, at the same time, drawn in by their unmistakable relevance. I’m even tempted, at times, to share George Orwell’s view of the neo-reactionaries of his age: that, although they can sometimes spew dangerous nonsense, they’re smarter and more influential than we tend to think, and that “up to a point, they are right.”

I met Charles Kesler in March on an idyllic sunny day in Pasadena, California, where he lives. He’s a soft-spoken, thoughtful figure, with a shock of white hair and a bemused smile on his face. He grew up in West Virginia, with a schoolteacher mom and a dad who owned a grocery store. They were, he told me, culturally conservative and politically mixed. He is now a professor at Claremont McKenna, where he focuses on the roots of a specifically American conservatism, exemplified by his reading of the Founding Fathers. (He’s the editor of a very popular edition of The Federalist Papers.) He also edits the Claremont Review of Books, a small conservative version of the New York Review of Books that attracted attention first in its critique of George W. Bush’s Iraq War, and again last year, when it came out in support of Donald Trump just when the entire Republican Establishment was trying to destroy him. Along with The American Conservative and the new quarterly American Affairs, it’s now a central forum for many of the sentiments that helped Trump win the presidency.

What on earth was a professor like Kesler doing backing a man who has barely read a book in his life, who seems to think Frederick Douglass is still alive, and who’d last less than a few seconds in a Kesler seminar? He smiled a little defensively. He’s perfectly aware of Trump’s manifest flaws — his “crudity, anger and egotism,” as he has written. He has conceded that Trump was seeking a job “for which everyone — everyone — agrees he is conspicuously unready.” Even when we met, he averred: “I don’t know how serious he is.” And yet he still gambled on a despotic, undisciplined, impulsive former Democrat.

It was an act of desperation, he explained. In classic reactionary fashion, he believes that we are living through a crisis of American democracy. The Claremont consensus (to put a name on this strain of thought) holds that beneath the veneer of constitutional democracy, we are actually governed by a soft despotism of permanent experts, bureaucrats, pundits, and academics who ignore the majority of the American people. This elite has encouraged a divisive social transformation of the country, has led us into disastrous wars, and has created a deepening economic crisis for the middle class. Anyone — anyone — who could challenge this elite’s power was therefore a godsend.

Kesler’s worldview is rooted in the ideas of the 20th-century political philosopher Leo Strauss. Strauss’s idiosyncratic genius defies easy characterization, but you could argue, as Mark Lilla did in his recent book The Shipwrecked Mind, that he was a reactionary in one specific sense: A Jewish refugee from Nazi Germany, Strauss viewed modernity as collapsing into nihilism and relativism and barbarism all around him. His response was to go back to the distant past — to the works of Plato, Aristotle, and Maimonides, among others — to see where the West went wrong, and how we could avoid the horrific crimes of the 20th century in the future.

One answer was America, where Strauss eventually found his home at the University of Chicago. Some of his disciples — in particular, the late professor Harry Jaffa — saw the American Declaration of Independence, with its assertion of the self-evident truth of the equality of human beings, as a civilizational high point in human self-understanding and political achievement. They believed it revived the ancient Greek and Roman conception of natural law. Yes, they saw the paradox of a testament to human freedom having been built on its opposite — slavery — but once the post–Civil War constitutional amendments were ratified, they believed that the American constitutional order was effectively set forever, and that the limited government that existed in the late-19th and early-20th centuries required no fundamental change. (Jaffa made an exception for the Civil Rights Act of 1964, which he believed was the only way to enforce the post–Civil War amendments against southern resistance.)

The expanded government of the last century, begun in earnest by Woodrow Wilson, was, therefore, an unconstitutional and anti-democratic power grab by educated elites. Kesler and many of his fellow Claremonters believe democracy is exercised best at the local level, in accord with the “unenlightened” views of the citizenry, or directly through members of Congress, unencumbered by the layers of bureaucracy, executive fiat, and the control of centralizing modern governments. They call this ever-growing apparatus “the administrative state,” and they loathe it not so much for how it constricts economic growth (as many conservatives do) but for how it creates a kind of political tyranny — a ruling class that can enforce its morality and policy preferences through Executive-branch regulation. The Obama administration’s reworking of Obamacare after its passage, for example, and its climate and immigration policies were all big policy changes that never went through Congress.

The Claremonters were particularly upset last year by the Obama administration’s use of Title IX to direct all public schools to institute transgender-friendly policies for bathroom facilities. “Political correctness,” Kesler believes, “is a serious and totalist politics, aspiring to open the equivalent of a vast reeducation camp for the millions of defective Americans who are products of racism, sexism, classism, and so forth.” He supported Trump because the candidate relished taking on both the administrative state and the PC movement: “If relimiting the government by constitutional means was not an option … then what is left but to use the system as it is, and try placing a strong leader, one of our own, someone who can get something done in our interest, at the head of it?”

Kesler also saw in Trump’s instincts on immigration and trade a return to 19th-century Republicanism, which he believes is newly relevant in a post–Cold War world. The party of McKinley and Coolidge had, after all, been one that favored tariffs. The party platform of 1896 declared, “We renew and emphasize our allegiance to the policy of protection, as the bulwark of American industrial independence, and the foundation of American development and prosperity.” In 1924, the GOP platform reiterated this: “We believe in protection as a national policy.” Kesler saw Trump as tapping into this old Republicanism, noting that he was the first president in living memory to use the word protection favorably in his inaugural address.

On foreign policy, too, Kesler projects onto Trump’s impulses a return to the classic American position before the Second World War: suspicious of multinational entanglements, prickly in the defense of the western hemisphere, and dedicated primarily to the national interest. On immigration, Kesler sees in Trump a return to the 1920 Republican platform, which proposed to limit the number of foreigners to “that which can be assimilated with reasonable rapidity, and to favor immigrants whose standards are similar to ours.” Trump, Kesler wants to believe, vaults the conservative movement back more than 70 years. And he’s fine with that.

“We would happily trade our current government for one that worked exactly as designed in 1787, as amended in 1865 and shortly thereafter.” You would be hard put to find such a blunt declaration in Kesler’s Claremont Review, but it’s just one of many provocations that appeared last year in the now-defunct group blog the Journal of American Greatness. The blog had a madcap feel to it, bristling with almost tongue-in-cheek assaults on the modern world, on stuffy career conservatives, and on risible “social justice warriors.” Its authors included a young Straussian, Julius Krein, who is now editing a new journal, American Affairs, and an older student of Jaffa’s, Michael Anton, who now works in the press office at the National Security Council.

Anton is the most interesting intellectual behind Trumpism, today’s American version of reactionism. He’s the suave, credentialed foil to Steve Bannon’s rumpled autodidact, a Trump official who just published a paper on Machiavelli in an academic journal. I recently met him for dinner near the White House. An immensely tall man, of piercing intelligence and meticulous attire, Anton is a product of post-hippie California, one of many contemporary reactionaries who rejected their reflexive youthful liberalism because of their revulsion at the political left they encountered on campus — in Anton’s case, at Berkeley.

Once a conventional Republican, an aide to George W. Bush, and an advocate of the Iraq War, Anton decisively broke ranks in 2016 and came out as a proud reactionary. Anton’s critique of the current moment — and his justification for backing Trump — can best be summarized by the slogan the group blog adopted: “What difference, at this point, does it make?” (It doubled as a snarky reference to one of Hillary Clinton’s comments during the Benghazi hearings.) He became famous for his essay “The Flight 93 Election,” in which he compared America in 2016 to the 9/11 plane hijacked by jihadists and on a course to crash in Washington. In those circumstances, he recommended: “Charge the cockpit or you die. You may die anyway. You — or the leader of your party — may make it into the cockpit and not know how to fly or land the plane. There are no guarantees. Except one: if you don’t try, death is certain.” It’s not just that Trump is better than the alternatives: “The truth is that Trump articulated, if incompletely and inconsistently, the right stances on the right issues — immigration, trade and war — right from the beginning.”

The Claremont critique of the administrative state and the liberal elite does not appear to be enough for Anton. His aim is at what he calls, rather wickedly, “the Party of Davos,” or the “Davoisie.” This is the administrative state gone global. With The Economist as its Bible and its social liberalism and economic conservatism turned into unquestionable dogmas, the Davoisie, perched in the Alps, luxuriates in self-love. It routinely shoots down any critiques of globalization, sees few problems with mass immigration, and is still busy celebrating an ever-more-powerful European Union and ever-more-expansive free-trade agreements among ever-more countries.

None of this, Anton concluded, has anything to do with the American people and their interests. The Davoisie were too busy lifting foreigners out of poverty and celebrating the latest disruptive tech invention to cast a glance toward, say, the beleaguered inhabitants of Kansas or Michigan. Anton admired Trump, he wrote last year, largely because “he’s single-handedly revived talking about government serving its own citizens first.” Trump understood that the American idea is a compact “for the American people, and not for foreigners, immigrants (unless we choose to welcome them) or anyone else.” Three months into a Trump presidency, Anton hasn’t changed his mind.

Politics comes before economics, Anton insists. Free trade may boost our economy, encourage efficiencies, and advance innovation and wealth, but it affects different people differently. And this matters in a democracy. A society’s stability and fairness and unity count for more than its aggregate wealth — especially when, as in recent decades, almost all the direct benefits have gone to the superrich, and all the costs have been paid by the working poor. In the Journal of American Greatness, Krein scorned the abstractions so beloved of the Davoisie: “There is no ‘free trade’ outside of undergraduate economics textbooks,” he wrote, “and trade agreements exist precisely to determine the winners and losers of those zero-sum transactions inherent in any global competition.” Economically unifying the entire planet is not necessarily in a nation’s interest at all.

Nor, according to today’s reactionaries, is mass immigration. And it’s on this topic — more than any other — where the abstract ideas of neo-reactionaries connect with the fears, passions, and cultural panic of many among the population at large.

The Journal of American Greatness’s position goes something like this: The economic benefits (for capitalist elites) and multicultural delights (for progressive elites) of mass immigration are taken for granted by the Davoisie — and by liberals and free-market conservatives more generally. If you live in a major metropolis, with unprecedented prosperity and a tradition of assimilating newcomers, what’s not to like? And if you’re an immigrant, these places are full of jobs you are happy to take. But if your family is in a rural area or a heartland city, where ethnic diversity has not been the norm in the past, and where globalization has dramatically eroded traditional blue-collar jobs, it’s a little more complicated.

Mass immigration, neo-reactionaries argue, creates more job competition for those without college degrees, and, by the laws of supply and demand, lowers wages for some, even as it massively increases profits for a few. At some point, a citizen on the losing end will surely ask: Why is my country benefiting foreigners and new immigrants, many of them arriving illegally, while making life tougher for its own people? And why doesn’t it matter what I think? It’s this question that Anton has a policy answer for. Scaling back free trade and ending mass immigration would “improve the economic prospects of the lower half of our workforce to a greater extent than either would in isolation,” he has written. “The people have repeatedly said ‘no’ to more immigration, ‘no’ to more free trade … but the administrative state will not allow itself to be driven in a direction it does not want to go. It therefore must be broken.”

And then there is the cultural impact of mass immigration, which the Party of Davos, living in a post-national world, celebrates as a vision of the global future. Neo-reactionaries beg to differ. They get a little vague here — tiptoeing awkwardly around the question of race. A nation, they believe, is not just a random group of people within an arbitrary set of borders. It’s a product of a certain history and the repository of a distinctive culture. A citizen should be educated to understand that country’s history and take pride in its culture and traditions. Honed and modulated over time, this national culture gives crucial legitimacy to the American political system by producing citizens acclimated to the tolerance, self-government, and other civic values that democracy needs if it is to function. And so Anton, who gives America’s long history of successful integration of immigrants short shrift, worries about the influx of what he delicately calls “non-republican peoples.” “What happens when the West ceases to be western?” he asked me. On the blog, he was much more direct: He wrote that “Islam and the West are incompatible” and that Muslim immigration should be almost entirely banned. A country like the United States requires “a certain type or character of people.”

Isn’t all this just code for white nationalism? That’s certainly what self-described white nationalists cite in their support for Trump. When I asked Anton bluntly about whether he believes race matters to a national identity, he turned uncharacteristically silent: “I’m not going to say something that could be used to destroy my livelihood and career.” Kesler, when I confronted him with this as well, responded: “The definition of ‘white’ is a political definition. It may be that a lot of people we now regard as inherently and unchangeably Hispanic will turn out to be whites eventually as their incomes go up, as their place in society changes over time in the same way that Italians and Poles and Central Europeans were once ‘second-class’ whites.” Kesler seemed to be describing a white-nationalist country that slowly absorbs others into the fold — turning their cultural “otherness” into an integrated, but still somehow “white,” American identity. “The rate of intermarriage among African-Americans is going up, too,” he tells me as my eyes widen. “How different would American politics be if Obama had defined himself as multiracial rather than black as such … as a new kind of American transcending race?”

Neo-reactionary unease with mass immigration is exacerbated by what they see as the administrative state’s shift from belief in a “melting pot” model in which all immigrants assimilate to a common American culture to the multicultural model, where the government, business, and society recognize different languages and celebrate ethnic diversity over national unity. Anton notes that America is now “a country in which Al Gore mistranslates e pluribus unum as ‘Out of one, many’ and in his error is actually more accurate to the spirit of our times.” The problems of ethnic division are further compounded by the view growing among the elites that America itself is at root a racist white construction, and that “assimilation” is therefore an inherently bigoted idea.

This notion of a national culture, rooted in, if not defined by, a common ethnicity, is even more powerful in European nations, which is why Brexit is so closely allied to Trumpism. In the case of Britain, the question of race is framed within a euphemism used by the British government itself: a “visible minority” versus an “invisible one.” “Since 2001, Britain’s ‘visible minority’ population has nearly doubled, from 8 percent to 14 percent today,” Benjamin Schwarz, the national editor of The American Conservative, noted last year. “It is projected to rise to about 38 percent by mid-century.” Is Britain changing so fast that it could lose any meaningful continuity with its history and culture? That is the question now occupying the British neo-reactionaries. Prime Minister Theresa May has not said many memorable things in office, except this: “If you believe you are a citizen of the world, you are a citizen of nowhere.”

A year ago, Anton took issue with an article I wrote for this magazine in which I described Trump as reminiscent of Plato’s description of a tyrant emerging out of a decadent democracy and argued that we should do what we could to stop him. Anton’s critique was that I was half-right and half-wrong. I was right to see democracy degenerating into tyranny but wrong to see any way to avoid it. What he calls “Caesarism” is already here, as Obama’s abuse of executive power proved. Therefore: “If we must have Caesar, who do you want him to be? One of theirs? Or one of yours (ours)?” Krein put it even more plainly: “Restoring true constitutional — or even merely competent — government requires a fundamental transformation of the underlying culture and elite opinion. It requires, in a certain sense, regime change in America.”

That indeed is the explicit aim of Curtis Yarvin, who takes Kesler’s and Anton’s dismay at modern America to new and dizzying heights — and reactionism to its logical conclusion. A geeky computer programmer in his 40s, he writes a reactionary blog, Unqualified Reservations, under the pseudonym Mencius Moldbug and has earned a cult following among the alt-right. His magnum opus — “An Open Letter to Open-Minded Progressives” — is an alternately chilling and entertaining assault on almost everything educated Westerners hold to be self-evidently true. His critique of our present is not that we need a correction to return us to traditional notions of national culture and to unseat the administrative state and its elites; it is that we need to take the whole idea of human “progress” itself and throw it in the trash can. Things didn’t start going wrong in the 1960s or under the Progressives. Yarvin believes that the Western mind became corrupted during the Enlightenment itself. The very idea of democracy, allied with reason and constitutionalism, is bunk: “Washington has failed. The Constitution has failed. Democracy has failed.” His golden era: the age of monarchs. (“It is hard not to imagine that world as happier, wealthier, freer, more civilized, and more pleasant.”) His solution: “It is time for restoration, for national salvation, for a full reboot. We need a new government, a clean slate, a fresh hand which is smart, strong and fair.”

At first, Yarvin reads like some kind of elaborate intellectual prank (as well as a legendary exercise in trolling). And he writes with a jocular, designed-to-shock style that is far more influenced by snarky web discourse than anything in, say, the Claremont Review. But the more you read, the more his ideological transgressions seem to come from a deadly serious place. He challenges the idea that the present is always preferable to the past: “There is no strong reason to think that governments recent and domestic are any better than the governments ancient and foreign,” he writes. “The American Republic is over two hundred years old. Great. The Serene Republic of Venice lasted eleven hundred.” The assumption that all of history has led inexorably to today’s glorious and democratic present is, he argues, a smug and self-serving delusion. It’s what used to be called Whig History, the idea that all of human history led up to the democratic institutions and civilizational achievements of liberal Britain, the model for the entire world. This reflexive sense that the world is always going forward has become an American orthodoxy almost no one questions. Insofar as progressives see flaws in the system, Yarvin suggests, it is only because the work of progress is never done.

Why do so many of us assume that progress is inevitable, if never complete? Yarvin, like the Claremonters and American Greatness brigade, blames an elite that he calls by the inspired name “the Cathedral,” an amalgam of established universities and the mainstream press. It works like this: “The universities make decisions, for which the press manufactures consent. It’s as simple as a punch in the mouth.” If that concept of “manufacturing consent” reminds you of the Chomskyite far left, you wouldn’t be wrong. But for Yarvin, the consent is manufactured not by capitalism, advertising, and corporations but by liberal academics, pundits, and journalists. They simply assume that left liberalism is the only rational response to the world. Democracy, he contends, “no longer means that the public’s elected representatives control the government. It means that the government implements scientific public policy in the public interest.”

And the Cathedral has plainly failed. “If we imagine the 20th century without technical progress, we see an almost pure century of disaster,” Yarvin writes, despairing from his comfy 21st-century perch. His solution is not just a tyrannical president who hates all that the Cathedral stands for but something even more radical: “the liquidation of democracy, the Constitution and the rule of law, and the transfer of absolute power to a mysterious figure known only as the Receiver, who in the process of converting Washington into a heavily armed, ultra-profitable corporation will abolish the press, smash the universities, sell the public schools, and transfer ‘decivilized populations’ to ‘secure relocation facilities’ where they will be assigned to ‘mandatory apprenticeships.’ ”

This is 21st-century fascism, except that Yarvin’s Receiver would allow complete freedom of speech and association and would exercise no control over economic life. Foreign policy? Yarvin calls for “a total shutdown of international relations, including security guarantees, foreign aid, and mass immigration.” All social policy also disappears: “I believe that government should take no notice whatsoever of race — no racial policy. I believe it should separate itself completely from the question of what its citizens should or should not think — separation of education and state.”

And with that final provocation, Mencius Moldbug disappears into cyberspace.

Reaction is a mood before it is anything else, and I know its psychological temptations intimately. Growing up steeped in traditional religion, in a household where patriotism seemed as natural as breathing, I became infatuated with a past that no longer existed. I loved the countryside that was quickly being decimated by development, a Christianity that was being overwhelmed by secularism, and an idea of England, whose glories — so evident in the literature I read, the history I had absorbed, and the architecture I admired — had self-evidently crumbled into dust. Loss was my youthful preoccupation. The mockery I received because of this — from most of my peers, through high school and college — turned me inward and radicalized me still further. I began to revel in my estrangement, sharpening my intellectual rebellion with every book I devoured and every class I took. Politically I was ferociously anti-Establishment, grew to suspect and even despise much of the liberal elite, and rejoiced at Margaret Thatcher’s election victories.

So a sympathy for writers and thinkers who define themselves by a sense of loss comes naturally to me. I’ve grown out of it in many ways — and the depression and loneliness that often lie at the core of the reactionary mind slowly lifted as I grew more comfortable in the only place I could actually live: the present. But I never doubted the cogency of many reactionary insights — and I still admire minds that have not succumbed to the comfortable assumption that the future is always brighter. I read the Christian traditionalist Rod Dreher with affection. His evocation of Christian life and thought over the centuries and his panic at its disappearance from our world are poignant. We are losing a vast civilization that honed answers to the deepest questions that human beings can ask, replacing it with vapid pseudo-religions, pills, therapy, and reality TV. I’ve become entranced by the novels of Michel Houellebecq, by his regret at the spiritual emptiness of modernity, the numbness that comes with fully realized sexual freedom, the yearning for the sacred again. Maybe this was why, as I read more and more of today’s neo-reactionary thought, I became nostalgic for aspects of my own past, and of the West’s.

Because in some key respects, reactionaries are right. Great leaps forward in history are often, in fact, giant leaps back. The Reformation did initiate brutal sectarian warfare. The French Revolution did degenerate into barbarous tyranny. Communist utopias — allegedly the wave of an Elysian future — turned into murderous nightmares. Modern neoliberalism has, for its part, created a global capitalist machine that is seemingly beyond anyone’s control, fast destroying the planet’s climate, wiping out vast tracts of life on Earth while consigning millions of Americans to economic stagnation and cultural despair.

And at an even deeper level, the more we discover about human evolution, the more illusory certain ideas of progress become. In his book Sapiens, Yuval Noah Harari points out that hunter-gatherers were actually up to six inches taller than their more “civilized” successors; their diets were much healthier; infectious disease was much rarer; they worked less and goofed off more than we do. They didn’t even have much shorter lives: If you survived the enormous hazards of childhood, you could reach the age of 60, and some lived into their 80s (and stayed within their tribes rather than being shunted off into lonely rest homes). Famines and plagues — the great catastrophes of human history — were less common. Harari notes another paradox: Over hundreds of millennia, we have overcome starvation … but now are more likely to die of obesity than hunger. Happiness? Globally, suicide rates keep rising.

Certain truths about human beings have never changed. We are tribal creatures in our very DNA; we have an instinctive preference for our own over others, for “in-groups” over “out-groups”; for hunter-gatherers, recognizing strangers as threats was a matter of life and death. We also invent myths and stories to give meaning to our common lives. Among those myths is the nation — stretching from the past into the future, providing meaning to our common lives in a way nothing else can. Strip those narratives away, or transform them too quickly, and humans will become disoriented. Most of us respond to radical changes in our lives, especially changes we haven’t chosen, with more fear than hope. We can numb the pain with legal cannabis or opioids, but it is pain nonetheless.

If we ignore these deeper facts about ourselves, we run the risk of fatal errors. It’s vital to remember that multicultural, multiracial, post-national societies are extremely new for the human species, and keeping them viable and stable is a massive challenge. Globally, social trust is highest in the homogeneous Nordic countries, and in America, Pew has found it higher in rural areas than cities. The political scientist Robert Putnam has found that “people living in ethnically diverse settings appear to ‘hunker down,’ that is, to pull in like a turtle.” Not very encouraging about human nature — but something we can’t wish away, either. In fact, the American elite’s dismissal of these truths, its reduction of all resistance to cultural and demographic change as crude “racism” or “xenophobia,” only deepens the sense of siege many other Americans feel.

And is it any wonder that reactionaries are gaining strength? Within the space of 50 years, America has gone from segregation to dizzying multiculturalism; from traditional family structures to widespread divorce, cohabitation, and sexual liberty; from a few respected sources of information to an endless stream of peer-to-peer media; from careers in one company for life to an ever-accelerating need to retrain and regroup; from a patriarchy to (incomplete) gender equality; from homosexuality as a sin to homophobia as a taboo; from Christianity being the common culture to a secularism no society has ever sustained before ours.

When this velocity of cultural change combines with a deepening — and accurate — sense of economic anxiety, is it shocking that human beings want to retreat into a past, to resuscitate the nation-state, and to reach backward for a more primeval and instinctual group identity? Or that they doubt the promise of “progress” and seek scapegoats in the governing classes that have encouraged all of this to happen? And is it not evident why, when a demagogue occupies this cultural vacuum and finally speaks this forbidden language, they thrill to him?

Our job in these circumstances is not to condescend but to engage — or forfeit the politics of the moment (and the future) to reaction. Lincoln got the dynamic exactly right with respect to the Trump voter: “Assume to dictate to his judgment, or to command his action, or to mark him as one to be shunned and despised, and he will retreat within himself, close all the avenues to his head and his heart; and though your cause be naked truth itself, transformed to the heaviest lance, harder than steel, and sharper than steel can be made, and tho’ you throw it with more than Herculean force and precision, you shall be no more able to pierce him, than to penetrate the hard shell of a tortoise with a rye straw.”

The tragedy of our time, of course, is that President Obama tried to follow Lincoln’s advice. He reached out to those who voted against him as often as he could. His policies, like Obamacare, were aimed at helping the very working poor who gave Trump the White House. He pledged to transcend the red-blue divide. He acknowledged both the necessity of law enforcement and the legitimate African-American fear of hostile cops. A black man brought up by white people, he gave speech after speech attempting to provide a new narrative for America: one of slowly integrating moral progress, where racial and class divides could be overcome. He criticized the reductive divisiveness of identity politics. And yet he failed. He couldn’t prevent the disappearance of the American middle class; he couldn’t calm the restive anxieties of the white working class; he couldn’t stem the reactionary tide that now washes ever closer ashore. If a man that talented, with that biography, found himself spitting into the wind, a powerful storm is indeed upon us.

This, of course, is not to defend the neo-reactionary response. Their veiled racism is disturbing, and their pessimism a solipsistic pathology. When Anton finds nothing in modernity to celebrate but, as he put it to me, “nice restaurants, good wine, a high standard of living,” it comes off as a kind of pose, deliberately blind to all the constant renewals of life and culture around us. When Houellebecq has one of his characters sigh, “For a man to bring a child into the world now is meaningless,” I chortle. When Dreher hyperventilates that today’s youngsters “could be one of the last generations of this thing called Western civilization” and that American Christians today must “live lives prepared to suffer severe hardship, even death, for our faith,” I take my dogs for a walk. When Yarvin insists that “if the 20th century does not go down in history as the golden age of awful government, it is only because the future holds some fresher hell for us,” I check my Instagram account. There is something hysterical here, too manically certain, bleaker than any human being can bear for long.

And how can you seriously regard our political system and culture as worse than ever before in history? How self-centered do you have to be to dismiss the unprecedented freedom for women, racial minorities, and homosexuals? Or the increased security for the elderly and unemployed, and the greater access to health care by the poor and now the working poor? Compare the air we breathe today with that of the 1950s. Contrast the religious tolerance we take for granted today with the enmities of the past. Compare the racial integration of today, incomplete as it may be, with Jim Crow. Observe the historically low levels of crime compared with the recent past — and the absence of any world wars since 1945. Over the very long haul, too, scholars such as Steven Pinker have found convincing evidence that violence among humans is at the lowest levels since the species first emerged.

If the neo-reactionaries were entirely right, the collapse of our society would surely have happened long before now. But somehow, an historically unprecedented mix of races and cultures hasn’t led to civil war in the United States. In fact, majorities welcome immigration, and enjoy the new cultures that new immigrants bring. A majority backed Trump’s opponent last November. America has assimilated so many before, its culture churning into new forms, without crashing into incoherence. London may be 40 percent nonwhite and repellent to much of rural England — but it works, its inhabitants seem unfazed, its culture remains world-class. The European Union massively overreached by mandating a common currency and imposing brutal austerity, but its conflicts have not led to mass violence, its standard of living remains high, and its achievement of Continental peace is far preferable to the carnage that destroyed Europe in the last century. It may well stagger on, if it can only moderate itself.

It is also one thing to be vigilant about the power of the administrative state and to attempt to reform and modernize it; it is quite another to favor its abolition. The more complex modern society has become, the more expertise is needed to govern it — and where else is that expertise going to come from if not a professional elite? For that matter, the liberal media has nothing like the monopoly it once enjoyed. There are two “Cathedrals” in the 21st century — and only one has helped produce a conservative Supreme Court, a Republican Congress, a Republican president, and near-record Republican majorities in statehouses around the country. Non-leftist thought is suppressed in the academy and is currently subjected to extreme intolerance and even violence on many campuses. That has to change. But some ideas from the neo-reactionary underground — like the notion that carbon has little to do with rising world temperatures — are in the underground for a reason. And still, climate-change denial is the de facto policy of the American government.

Beyond all that, neo-reactionaries have a glaring problem, which is that their proposed solutions are so radical they have no chance whatsoever of coming into existence — and would be deeply reckless to attempt. Their rage eclipses their argument. The notion that public opinion could be marshaled to effect a total reset of American government in favor of a new form of monarchy, as Yarvin suggests, is, to be blunt, bonkers. And is America seriously going to remain a white-majority country? How, exactly? Can the U.S. economy suddenly unwind global manufacturing patterns? Can America simply abandon its global role and its long-standing commitments to allies?

Of course not. And the Trump administration is, day by day, proving this. An isolationist foreign policy collapsed at the first gust of reality. A thinly veiled Muslim immigration ban would have accomplished nothing — most Islamist terrorism is homegrown — and went nowhere. The communities that once thrived off manufacturing or coal mining are not coming back. Even the most draconian mass deportation of undocumented immigrants will not change the demographics of America — or suddenly raise wages for the working class. Global trade has become too entrenched to be reversed. The dismantling of Obamacare dismantled itself — not because of an elite plot but because, when confronted with its being taken away, a majority of Americans balked.

There is, perhaps, a way to use reactionary insights and still construct a feasible center-right agenda. Such a program would junk Reaganite economics as outdated but keep revenue-neutral tax reform; it could even favor redistribution to counter the deep risk to democracy that soaring inequality fosters; and it could fix Obamacare’s technical problems. You could add to this mix stronger border control, a reduction in legal immigration, a pause in free-trade expansion, a technological overhaul of the government bureaucracy, and a reassertion of Americanism over multiculturalism. This is not an impossible direction for the Republican Party to go — though it would have to abandon its know-nothing narcissist of a leader and its brain-dead congressional leaders. The left, for its part, must, it seems to me, escape its own bubble and confront the accelerating extremism of its identity politics and its disdain for millions of “deplorable” white Americans. You will not arrest the reactionary momentum by ignoring it or dismissing it entirely as a function of bigotry or stupidity. You’ll only defuse it by appreciating its insights and co-opting its appeal.

Reaction can be clarifying if it helps us better understand the huge challenges we now face. But reaction by itself cannot help us manage the world we live in today — which is the only place that matters. You start with where you are, not where you were or where you want to be. There are no utopias in the future or Gardens of Eden in our past. There is just now — in all its incoherent, groaning, volatile messiness. Our job, like everyone before us, is to keep our nerve and make the best of it.

*This article appears in the May 1, 2017, issue of New York Magazine.


Syd Barrett: The Madcap Who Named Pink Floyd

Rolling Stone’s 1971 interview with Syd Barrett, Pink Floyd’s founding lead singer

Syd Barrett. Gems/Redferns

LONDON—If you tend to believe what you hear, rather than what is, Syd Barrett is either dead, behind bars, or a vegetable. He is in fact alive and as confusing as ever, in the town where he was born, Cambridge.

In 1966–67, Barrett was playing lead guitar with Pink Floyd. He’d named the band and was writing most of their music, including the only two hit singles they ever had. His eerie electronic guitar style and gnome-like stage presence made him an authentic cult figure for the nascent London underground, then just beginning to gather at the UFO club and the Roundhouse. The Floyd were a house band and the music went on into the wee hours.

Cambridge is an hour’s train ride from London. Syd doesn’t see many people these days. Visiting him is like intruding into a very private world. “I’m disappearing,” he says, “avoiding most things.” He seems very tense, ill at ease. Hollow-cheeked and pale, his eyes reflect a permanent state of shock. He has a ghostly beauty which one normally associates with poets of old. His hair is short now, uncombed, the wavy locks gone. The velvet pants and new green snake skin boots show some attachment to the way it used to be. “I’m treading the backward path,” he smiles. “Mostly, I just waste my time.” He walks a lot. “Eight miles a day,” he says. “It’s bound to show. But I don’t know how.”

“I’m sorry I can’t speak very coherently,” he says. “It’s rather difficult to think of anybody being really interested in me. But you know, man, I am totally together. I even think I should be.” Occasionally, Syd responds directly to a question. Mostly his answers are fragmented, a stream of consciousness (the words of James Joyce’s poem “Golden Hair” are in one of his songs). “I’m full of dust and guitars,” he says.

“The only work I’ve done the last two years is interviews. I’m very good at it.” In fact, Syd has made three albums in that time, produced by the Floyd. The Madcap Laughs, his second, he says, was pretty good: “Like a painting as big as the cellar.” Before the Floyd got off the ground, Barrett attended art school. He still paints. Sometimes crazy jungles of thick blobs. Sometimes simple linear pieces. His favourite is a white semi-circle on a white canvas.

In a cellar where he spends much of his time, he sits surrounded by paintings and records, his amps and guitars. He feels safe there, under the ground. Like a character out of one of his own songs. Syd says his favourite musician is Hendrix. “I toured with him you know, Lindsay (an old girlfriend) and I used to sit on the back of the bus, with him up front; he would film us. But we never spoke really. It was like this. Very polite. He was better than people really knew. But very self-conscious about his consciousness. He’d lock himself in the dressing room with a TV and wouldn’t let anyone in.”

Syd himself has been known to sit behind locked doors, refusing to see anyone for days at a time. Frequently in his last months with the Floyd, he’d go on stage and play no more than two notes in a whole set. “Hendrix was a perfect guitarist. And that’s all I wanted to do as a kid. Play a guitar properly and jump around. But too many people got in the way. It’s always been too slow for me. Playing. The pace of things. I mean, I’m a fast sprinter. The trouble was, after playing in the group for a few months, I couldn’t reach that point.”

“I may seem to get hung-up, that’s because I am frustrated work-wise, terribly. The fact is I haven’t done anything this year, I’ve probably been chattering, explaining that away like anything. But the other bit about not working is that you do get to think theoretically.”

He’d like to get another band together. “But I can’t find anybody. That’s the problem. I don’t know where they are. I mean, I’ve got an idea that there must be someone to play with. If I was going to play properly, I should need some really good people.”

Syd leaves the cellar and goes up to a sedate little room full of pictures of himself with his family. He was a pretty child. English tea, cake and biscuits, arrives. Like many innovators, Barrett seems to have missed the recognition due to him, while others have cleaned up. “I’d like to be rich. I’d like a lot of money to put into my physicals and to buy food for all my friends.

“I’ll show you a book of all my songs before you go. I think it’s so exciting. I’m glad you’re here.” He produces a folder containing all his recorded songs to date, neatly typed, with no music. Most of them stand alone as written pieces. Sometimes simple, lyrical, though never without some touch of irony. Sometimes surreal, images weaving dreamily, echoes of a mindscape that defies traditional analysis. Syd’s present favourite is “Wolfpack,” a taut, threatening, claustrophobic number. It finishes with:

Mind the Reflecting electricity eyes
The Life that was ours grew sharper
and stronger away and beyond
short wheeling fresh spring
gripped with blanched bones moaned
Magnesium Proverbs and sobs

Syd thinks people who sing their own songs are boring. He has never recorded anyone else’s. He produces a guitar and begins to strum out a new version of “Love You,” from Madcap. “I worked this out yesterday. I think it’s much better. It’s my new 12-string guitar. I’m just getting used to it. I polished it yesterday.” It’s a Yamaha. He stops and eases it into a regular tuning, shaking his head. “I never felt so close to a guitar as that silver one with mirrors that I used on stage all the time. I swapped it for the black one, but I’ve never played it.”

Syd is 25 now, and worried about getting old. “I wasn’t always this introverted,” he says, “I think young people should have a lot of fun. But I never seem to have any.” Suddenly he points out the window. “Have you seen the roses? There’s a whole lot of colours.” Syd says he doesn’t take acid anymore, but he doesn’t want to talk about it… “There’s really nothing to say.” He goes into the garden and stretches out on an old wooden seat. “Once you’re into something…” he says, looking very puzzled. He stops. “I don’t think I’m easy to talk about. I’ve got a very irregular head. And I’m not anything that you think I am anyway.”

This story is from the December 23rd, 1971 issue of Rolling Stone.


Better Than the Beatles (and DNA, Too)
by Lester Bangs
The Village Voice
Jan. 28-Feb. 3, 1981

Thanks to Jeff Grimshaw for sharing our wonderment that no one has ever reprinted Lester Bangs’ hilarious and prescient 1981 review in the Village Voice of the Shaggs’ Philosophy of the World LP reissue.

 

I have been getting whiny letters from a lot of you lately complaining about the general state of the art. “What is all this shit?” you ask. “We thought New Wave was supposed to be this awakening of New Avenues of Self Expression and Freedom, resulting in new musical verities and new insights into the human condition even! Instead we went out and spent all this money, and all these records are shit!”

You’re right about one thing at least: all those records are shit, and you might as well have burned all those dollar bills. (Closer, 12 bucks, haw haw haw!) But those records aren’t shit for the reasons that you think: those records are shit because they’re all too good!

That’s right. All those stupid bands were so stupid they plumb went out and learned to play their instruments, a process as ineluctable as the putrefaction of a corpse. Teach ’em a chord or two, then just watch those little bastards practice till they can switch off, back and forth between those two chords (then three, then four . . . never shoulda learned even one!) deft as Al DiMeola if he wanted to play that which he probably will soon! Damn!

Which is why the only hope for rock’n’roll, aside from everybody playing nothing but shrieking atonal noise through arbitor distorters, is women. Balls are what ruined both rock and politics in the first place, and I demand the world be turned over to the female sex immediately. Only hope. Valerie Solanas was so much greater a prophet than Warhol that I can only pray she might consent to lead the group I’m forming. The absolute best rock’n’roll anywhere today is being played by women: the other night I saw God in the form of the Au Pairs, the Slits are stupendous, the Raincoats are better than London Calling or anything by Elvis Costello, Chrissie Hynde doesn’t count, Joan Jett deserves her place in the sun if not reparations, Lydia Lunch is the Female Role Model for the ’80s besides being one of the greatest guitarists in the world . . . the list is endless. (Patti, come home!)

But credit must be given to the foremothers: the Shaggs. Way back in 1972 [sic] they recorded an album up in New England that can stand, I think, easily with Beatles ’65, Life with the Lions, Blonde on Blonde, and Teenage Jesus and the Jerks as one of the landmarks of rock’n’roll history. The Wiggins [sic] sisters (an anti-power trio) not only redefined the art but had a coherent Weltanschauung on their very first album, Philosophy of the World. Basically what it comes down to is that unlike the Stones these girls are saying we love you, whether you’re fat, skinny, retarded, or Norman Podhoretz even. Paul Weyrich. Don’t make no difference, they embrace all because they are true one world humanists with an eye to our social future whose only hope is a redefined communism based on the open-hearted sharing of whatever you got with all sentient beings. Their and my religion is compassion, true Christianity with no guilt factors and no vested interest, perhaps a barter economy, but certainly the elimination of capitalism, rape, and special-interest group hatred. For instance, in their personal favorite number, “My Pal Foot Foot,” they reveal how even a little doggie must be granted equal civil rights perhaps even extending to the voting booth. Hell, they let Nancy Reagan in! They also believe that we should jettison almost completely the high-tech society which has now perched us on the lip of global suicide, and return to third world-akin closeness with the earth, elements, nature, the seasons, as in my personal favorite on this album, “It’s Halloween,” which emphasizes that seasonal festivals are essential to a healthy body politic (why d’ya think all them people in California got no minds?).

Unfortunately the Wiggins’s masterpiece was lost over the years — it came out on a small label, and everybody knows the record industry has its head so far up its ass it’s licking its breastplate. But this guy from NRBQ had the savvy to rescue it from oblivion (in a recent issue of Rolling Stone, he compared their work to early Ornette Coleman, and he’s right, though early Marzette Watts might be more apt), so now we got it out on the Red Rooster label, which of course is a perfect joke on all those closet-queen heavy-metal cockrockers. How do they sound? Perfect! They can’t play a lick! But mainly they got the right attitude, which is all rock’n’roll’s ever been about from day one. (I mean, not being able to play is never enough.) You should hear the drum riff after the first verse and chorus of the title cut — sounding like a peg-leg stumbling through a field of bald Uniroyals, it cuts Dave Tough cold and these girls aren’t even junkies (of course!). They just whang and blang away while singing in harmonies reminiscent of three Singing Nuns who’ve been sniffing lighter fluid and their voices are just so copacetic [sic] together (being sisters, after all) you’d almost think they were Siamese triplets. Guitar style: sorta like 14 pocket combs being run through a moose’s dorsal, but very gently. Yet it rocks. Does it ever. Plus having one of the greatest album covers in history, best since Blank Generation. God Bless the Shaggs. Now if they will only emerge from (semi?) retirement (?) no one ever will have cause again to say “Rock’n’Roll is dead, man . . .” Up an’ at ’em, Valerie.


What Is Schizo-Culture? A Classic Conversation with William S. Burroughs

http://flavorwire.com/488637/what-is-schizo-culture-a-classic-conversation-with-william-s-burroughs

This Sunday, MoMA PS1 joined with publisher Semiotext(e) to present The Return of Schizo-Culture, an afternoon of screenings, music, performances, and readings from the storied 1975 Schizo-Culture conference, which featured an array of cultural, intellectual, and artistic radicals. The conference produced a series of writings that were later collected into a book designed by a group of artists including Kathryn Bigelow and Denise Green. Taken together, the book and the papers from the conference document the chaotic downtown arts and cultural scene of NYC in the 1970s and feature an amazing collection of interviews and essays from artists, writers, and musicians including Michel Foucault, Gilles Deleuze, The Ramones, John Cage, Philip Glass, Jack Smith, and William S. Burroughs.

So what in the hell is Schizo-Culture? This is not an easy question to answer. It appears to be, on one hand, the undulating breakdown of governmental control in society, a development which leads to 1) new forms of psychological control (think: the 1970s fascination with mind control techniques) and 2) new forms of social organization that don’t rely on conventions, like political parties.

“There can be no doubt that a cultural revolution of unprecedented dimensions has taken place in America during the last thirty years,” William S. Burroughs writes in his seminal “The Limits of Control.” “And since America is now the model for the rest of the western world, this revolution is worldwide… the fact that this worldwide revolution has taken place indicates that the controllers have been forced to make concessions.” But what is the schizo-cultural revolution that Burroughs is identifying? His broader point is that governments are making political concessions to the dissenting population, but that these concessions are just another form of control, albeit a preferable one. He continues: “They could of course take all the concessions back, but that would expose them to the double jeopardy of revolution and the much greater danger of overt fascism.”

So Schizo-Culture, from Burroughs’ angle, is both the recognition and rejection of new forms of control. Or as Sylvère Lotringer writes: “‘Schizo’ does not refer here to any clinical entity, but to the process by which social controls of all kinds, endlessly re-imposed by capitalism, are broken up and opened to revolutionary change.” The below Q&A, excerpted from the recently republished Schizo-Culture book, shows Lotringer in a lively and sometimes hilarious conversation with Burroughs.

 

William Burroughs

 

What is schizo-culture according to you?

William Burroughs: Well, I think the “schizo-culture” here is being used in rather a special sense. Not referring to clinical schizophrenia, but rather to the fact that the culture is divided up into all sorts of classes and groups, etc., and that some of the old lines are breaking down, and that this is a healthy sign.

Do you think that too much concentration is being placed in our culture on identifying, describing, you know, saying “This is this,” and this kind of a description of what exactly our culture is, “What we are?” like, you know, as opposed to past cultures that maybe were more preoccupied with basic arts and, you know, cultural uh…

Well yes, this defining, etc., is a luxury that the affluent society permits itself. I mean, poor people in Morocco and Spain and places like that are just too busy keeping alive to think about what they are, who they are…

Can I ask one thing? What do you think of all this fascism [audience laughter]?

Ah, well … [chuckles]

And if…we have been told that we are, and how they interpret the fact that they are fascist if they are.

Well, every question is different, and poses a different problem.

Just because it would be stupid to call in the army, that’s no reassurance that they won’t call in the army. All the examples seem like cases where they were least likely to before.

Every time they have done so they have regretted it. I don’t say that they won’t. I say they’ll regret it if they do. Because some of these people have read history, after all. They know what happened in the Roman Republic, they know what happened in Germany, and to a lesser extent what happened in Spain, all those places…

It seems your analysis is based on one good thing, and that is hoping for enlightened self-interest amongst leaders. Seems that one hitch in what you said, that you need to work out is, if we get an insane leader…

Well, are we fooling ourselves? These are very serious concessions that are being made. Remember that all censorship is political, and when they start removing censorship they have made a concession, and that’s important. Don’t expect to get everything all at once, because you won’t — yeah?

A lot of your analysis was in terms of the limits of the use of control—what about, which you mention in passing when answering his question, what about the demand to be controlled? Don’t you think that in some sense if fascism is to develop it has to develop either through a growth in the demand to be controlled or, more subtly you might say, in the growth of the control of the demand to be controlled?

I…don’t quite follow you. [audience laughter]

No, but that’s what I’m saying, is that, I mean, as you get a greater and greater amount of co-operation, it’s quite obvious that a lot of people get freaked out by it, and people have a great problem dealing with change, so I mean obviously they’re going to want something to control that change and slow it down. No, if the impasses that you’ve described are correct, the leadership has to control that demand to slow down change, and ’cause that could always get out of hand, and from your analysis it would seem that if it did get out of hand that would create exactly the kind of situations where they would have to exercise the sorts of repression that would be suicidal for them to exercise. You see, and that’s how even giving them the assumptions of rationality you are forgetting about insane leaders, and things still could get out of hand.

Well, of course, I said that. They’re never more dangerous than when embarking on a self-defeating course. That’s what happens. It’s unlikely that it would happen unless we got in some kind of war, I mean a serious war.

Do you think that it’s happening now is what I really want to ask you…

No, I don’t think so.

I have a related question. You said that you were optimistic; that is, we have every reason to believe this change will continue in its present direction. You also said that “they,” I guess you were referring to the leaders, are not in fact [likely] to take back any of their concessions. Are we then to assume that this is to culminate in a complete lack of control, at some point in the future, or rather, I think more pragmatically, is that there is going to be a trade-off point?

Trade-off point. I think there will be continued modifications of control. They can’t very well take everything back at this point.

What do you mean by “continued modifications”?

Well, what we have seen in the last thirty years. Now even ten or twenty years ago there was no right to protest, even the right to protest is a very important concession. A minority group thirty years ago had absolutely no recourse against police brutality or anything else. Now they can protest and that undoubtedly has an effect.

So are we to assume that there will be a culmination of this general direction, resulting in a total lack of control?

Burroughs: Ah well, I wouldn’t…I mean, I’m not a prophet; I wouldn’t speculate about the future. I’m talking about what has happened up until now.

This seems to be a sort of big boom theory of history, with a continual diffusion of power, and theoretically it would seem to come down to this area of total noncontrol or … But it seems clear that what we know up until now is that societies can reconstitute themselves at high levels of control, and I think that is what some of the earlier questions were about. You’ve defined some of the techniques by which governments, or those in power, maintain their control, and you’ve also defined a situation where one group would want to re-establish or establish control. But what you don’t seem to have spoken to, as far as the question goes, is what the specific techniques of this re-establishing or establishing of control would be when there is a low level of control.

Well, it depends on what you mean by “low level of control.” If you have complete anarchy, such as we might have if we got in a war with China and this country was subject to atom bombing, then control reverts to almost a mediaeval or warlord state, where anyone with a small army is in a pretty good position. If that’s what you mean by control reaching a state where it doesn’t exist, well then you do get warlords and city states in that kind of situation.

Yeah, but we’re not in the Middle Ages anymore. I mean, we may be able to explain how power was reconstituted in the Middle Ages, but how would you see that happening now? Do you see it happening in the very same way?

It isn’t happening now. It isn’t happening now. We’re not anywhere near that, we’re not in a state of anarchy.

But don’t you see that point coming?

Well, I could see it coming under certain circumstances. I could see it coming if we got in a war; I could see it coming with a complete economic collapse. But none of these things are right here now, or even around the corner.

Can you envision a complex social organization where control doesn’t exist?

Uhm, no, not with regard to a heterogeneous city population. I mean, a certain amount of control is absolutely necessary. Where does all the food come from here—it’s brought in, right? There’s a whole unseen bureaucracy that is bringing that food in, putting it in the shops, providing power, etc. If those people didn’t work, millions of people would be starving overnight. So any system must find a way to keep those people on their jobs, whether economically, or by giving them food coupons, or whatever.

But your presentation was from the standpoint of controllers and exploitation, and are those necessarily connected?

No, I mean, I wouldn’t say that the necessity of maintaining power and food in a big city was necessarily a part of exploitation…

[Shouted] Down with Foucault!

[Audience laughter]

BURROUGHS: [Chuckling] Hear, hear…well, okay…

Were Republicans really the party of civil rights in the 1960s?

Once you control for region, it turns out that Democrats were actually more likely to support the 1964 Civil Rights Act


Civil rights protestors are attacked with a water cannon. Photograph: Getty Images

With Republicans having trouble with minorities, some like to point out that the party has a long history of standing up for civil rights compared to Democrats. Democrats, for example, were less likely to vote for the civil rights bills of the 1950s and 1960s. Democrats were more likely to filibuster. Yet, a closer look at the voting coalitions suggests a more complicated picture that ultimately explains why Republicans are not viewed as the party of civil rights.

Let’s use the 1964 Civil Rights Act as our focal point. It was arguably the most important of the many civil rights bills passed in the middle part of the 20th century. It outlawed many types of racial and sexual discrimination, including in access to hotels, restaurants, and theaters. In the words of Vice President Biden, it was a big “f-ing deal”.

When we look at the party vote in both houses of Congress, it fits the historical pattern; Republicans were more in favor of the bill:

[Chart: Civil Rights Act support by party]

80% of Republicans in the House and Senate voted for the bill, while less than 70% of Democrats did. Indeed, Senate Minority Leader Everett Dirksen, a Republican, led the fight to end the filibuster. Meanwhile, Democrats such as Richard Russell of Georgia and Strom Thurmond of South Carolina tried as hard as they could to sustain it.

Of course, it was also Democrats who helped usher the bill through the House and Senate, and ultimately a Democratic president who signed it into law. The bill wouldn’t have passed without the support of Majority Leader Mike Mansfield of Montana, a Democrat. Majority Whip Hubert Humphrey, who had basically split the Democratic party in two with his 1948 Democratic National Convention speech calling for equal rights for all, kept tabs on individual members to ensure the bill had the numbers to overcome the filibuster.

Put another way, party affiliation seems somewhat predictive, but something is missing. So, what factor best predicted voting?

You don’t need to know much history to understand that the South, from the Civil War to the Civil Rights Act of 1964, tended to be opposed to minority rights. This factor was separate from party identification or ideology. We can easily control for this variable by splitting the vote between those states that were part of the Confederacy and those that were not.

[Chart: Civil Rights Act votes by region]

You can see that geography was far more predictive of the voting coalitions on the Civil Rights Act than party affiliation. What linked Dirksen and Mansfield was the fact that they weren’t from the South. In fact, 90% of members of Congress from states (or territories) that were part of the Union voted in favor of the act, while less than 10% of members of Congress from the old Confederate states voted for it. This 80pt difference between regions is far greater than the 15pt difference between parties.

But what happens when we control for both party affiliation and region? As Sean Trende noted earlier this year, “sometimes relationships become apparent only after you control for other factors”.

[Chart: Civil Rights Act votes by party and region]

In this case, it becomes clear that Democrats in both the North and the South were more likely to vote for the bill than Republicans in the North and South respectively. This difference in both houses is statistically significant with over 95% confidence. It just so happened that southerners made up a larger percentage of the Democratic caucus than of the Republican caucus, which created the initial impression that Republicans were more in favor of the act.
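To make the “control for region” arithmetic concrete, here is a minimal Python sketch (mine, not the original article’s). The yea/nay counts are the commonly cited House roll-call tallies for the 1964 act; treat them as illustrative. The point is that pooling across regions can reverse the within-region comparison, a textbook instance of Simpson’s paradox.

    # Illustrative sketch: 1964 Civil Rights Act House votes broken out
    # by party and region (commonly cited tallies; treat as approximate).
    votes = {
        # (party, region): (yea, nay)
        ("Democrat", "Union"): (145, 9),
        ("Democrat", "Confederacy"): (7, 87),
        ("Republican", "Union"): (138, 24),
        ("Republican", "Confederacy"): (0, 10),
    }

    def support_rate(pairs):
        """Percent voting yea across the given (yea, nay) pairs."""
        yea = sum(y for y, _ in pairs)
        total = sum(y + n for y, n in pairs)
        return 100 * yea / total

    # Pooled by party, Republicans look more supportive (~80% vs ~61%)...
    for party in ("Democrat", "Republican"):
        pooled = [v for (p, _), v in votes.items() if p == party]
        print(f"{party} overall: {support_rate(pooled):.0f}% yea")

    # ...but once you control for region, Democrats lead in each region.
    for region in ("Union", "Confederacy"):
        for party in ("Democrat", "Republican"):
            rate = support_rate([votes[(party, region)]])
            print(f"{party}, {region} states: {rate:.0f}% yea")

Run as-is, the pooled comparison favors Republicans, while the within-region comparison favors Democrats in both regions: exactly the reversal the charts above describe.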


Nearly 100% of Union state Democrats supported the 1964 Civil Rights Act compared to 85% of Republicans. None of the southern Republicans voted for the bill, while a small percentage of southern Democrats did.

The same pattern holds true when looking at ideology instead of party affiliation. The folks over at Voteview.com, who created DW-NOMINATE scores to measure the ideology of congressmen and senators, found that the more liberal a congressman or senator was, the more likely he was to vote for the Civil Rights Act of 1964, once one controlled for a factor closely linked to geography.

That’s why Strom Thurmond left the Democratic party soon after the Civil Rights Act passed. He recognized that, of the two parties, it was the Republican party that was more hospitable to his message. The Republican candidate for president in 1964, Barry Goldwater, was one of the few non-Confederate-state senators to vote against the bill. He carried his home state of Arizona and swept the Deep South states – a first for a Republican.

Now, it wasn’t that the Civil Rights Act was what turned the South against the Democrats or minorities against Republicans. Those patterns, as Trende showed, had been developing for a while. It was, however, a manifestation of these growing coalitions. The South gradually became home to the conservative party, while the north became home to the liberal party.

Today, the transformation is nearly complete. President Obama carried only 18% of former Confederate states, while taking 62% of non-Confederate states in 2012. Only 27% of southern senators are Democrats, while 62% of Union state senators are Democrats. And 29% of southern members in the House are Democrats compared to 54% in states or territories that were part of the Union.

Thus, it seems to me that minorities have a pretty good idea of what they are doing when joining the Democratic party. They recognize that the Democratic party of today looks and sounds a lot more like the northern Democratic party that passed the Civil Rights Act of 1964 with near unanimity than like the southern Democrats of the era who tried to block it – southerners who today, like Strom Thurmond, would likely be Republicans.


Office of Federal Contract Compliance Programs (OFCCP)

History of Executive Order 11246

On September 24, 1965, more than two years after the Reverend Martin Luther King, Jr. delivered his “I Have A Dream” speech on the steps of the Lincoln Memorial and more than a year after the Civil Rights Act of 1964 became the law of the land, the Nation took a historic step towards equal employment opportunity when President Lyndon Johnson issued Executive Order (EO) 11246.

For the first time, EO 11246 charged the Secretary of Labor, a Cabinet-level official with strong enforcement authority, with the responsibility of ensuring equal opportunity for minorities in federal contractors’ recruitment, hiring, training and other employment practices. Until that time, such efforts had been in the hands of various Presidential committees. EO 11246 continued and reinforced the requirement that federal contractors not discriminate in employment and take affirmative action to ensure equal opportunity based on race, color, religion, and national origin.

Signed by President Johnson that early autumn Friday 50 years ago, EO 11246 became a key landmark in a series of federal actions aimed at ending racial, religious and ethnic discrimination, an effort that dated back to the anxious days before the U.S. was thrust into World War II.

Today, Executive Order 11246, as amended and further strengthened over the years, remains a major safeguard, protecting the rights of workers employed by federal contractors—approximately one-fifth of the entire U.S. labor force—to remain free from discrimination on the basis of their race, color, religion, sex, sexual orientation, gender identity, or national origin…and opening the doors of opportunity through its affirmative action provisions.


EO 8802

As America geared up its industrial might for what proved to be its inevitable entrance into a global war, President Franklin Delano Roosevelt responded to leaders such as A. Philip Randolph and Bayard Rustin, who protested that African-American workers were blocked from taking jobs in segregated war production factories. On June 25, 1941, FDR signed Executive Order 8802, outlawing discrimination based on race, color, creed, and national origin in the federal government and defense industries.


EO 9346

In 1943, President Roosevelt broadened the coverage of Executive Order 8802 by making it applicable to all government contractors.


EO 10308

Nearly a decade later, on December 3, 1951, President Harry S. Truman’s Executive Order 10308 advanced the achievements initiated during WWII by creating the Committee on Government Contract Compliance. The committee, as its name implies, was tasked with overseeing compliance by federal contractors with the non-discrimination provisions of Executive Order 8802.


EO 10479

President Dwight D. Eisenhower took a further step on August 13, 1953, by creating the President’s Committee on Government Contracts under Executive Order 10479. This reorganization furthered the principle that “…it is the obligation of the contracting agencies of the United States Government and government contractors to insure compliance with, and successful execution of, the equal employment opportunity program of the United States Government.”

This Executive Order made the head of each contracting agency of the federal government responsible for obtaining compliance by their contractors and subcontractors with the nondiscrimination provisions of the contracts into which they entered. Coordination would be provided by the President’s Committee on Government Contracts, housed in the Department of Labor and composed of representatives of the major contracting agencies, the Labor and Justice Departments, and the General Services Administration, as well as eight Presidential appointees. The President designated the Committee’s chair and vice chair.


EO 10925

By the time John F. Kennedy was elected President, it was evident that to advance equal employment opportunity federal involvement needed to be broader and more proactive. On March 6, 1961, shortly after JFK took office, he signed Executive Order 10925, opening a new chapter in achieving access to good jobs by requiring government contractors to “take affirmative action to ensure that applicants are employed, and that employees are treated during employment, without regard to their race, creed, color or national origin.”

Executive Order 10925 gave federal contracting agencies authority to institute procedures against federal contractors who violated their EEO obligations, including contract cancellation, debarment from future contracts, and other sanctions. It also created the President’s Committee on Equal Employment Opportunity, which upon passage of the Civil Rights Act in 1964 became the Equal Employment Opportunity Commission. The President’s Committee was chaired by Vice President Lyndon Johnson and later by Vice President Hubert Humphrey. The Committee’s vice chair was Secretary of Labor Willard Wirtz.

Like its predecessors, EO 10925 gave each federal department and agency Executive Order enforcement responsibility for its contractors, and each developed its own organizational approach to carrying out these responsibilities. The President’s Committee oversaw issues of policy and the Department of Labor played a coordinating role.


EO 11246

President Johnson’s vision of creating a “Great Society” led to a host of endeavors that sought to change the political, social and economic landscape of the U.S. In his 1965 commencement address to graduates of Howard University, LBJ gave voice to his vision, declaring, “We seek not just freedom but opportunity. We seek not just legal equity but human ability, not just equality as a right and a theory but equality as a fact and equality as a result.”

At LBJ’s request, Vice President Humphrey led a comprehensive review “of the activities of the various federal agencies involved in the field of civil rights.” Humphrey’s conclusions and recommendations, articulated in a memorandum to Johnson, were based on the principle that “…whenever possible operating functions should be performed by departments and agencies with clearly defined responsibilities, as distinguished from interagency committees or other interagency arrangements. That principle is particularly applicable to civil rights programs where it is essential that our objectives be pursued vigorously and without delay that frequently accompanies a proliferation of interagency committees and groups.”

The Vice President continued, “The Secretary of Labor, as Vice Chairman of the [President’s] Committee [on Equal Employment Opportunity], has had primary responsibility for reviewing complaints and, through the contracting departments and agencies, insuring compliance by government contractors with nondiscrimination requirements. With all the experience gained over a period of years by the personnel involved in this program, responsibility should now be vested directly in the Department of Labor, and I so recommend.”

Thus, on September 24, 1965, President Johnson signed Executive Order 11246, making the Secretary of Labor responsible for administering the order’s non-discrimination and affirmative action provisions. Soon thereafter, Secretary of Labor Wirtz established the Office of Federal Contract Compliance. Edward C. Sylvester, Jr. was appointed as the agency’s first director.


EO 13665

On April 8, 2014, President Obama signed a Presidential Memorandum and Executive Order 13665, amending Executive Order 11246. These measures, which apply to federal contractors and subcontractors, are aimed at promoting equal pay for women by improving the transparency of wages and making gender pay disparities easier to identify. The Executive Order prohibits retaliation by federal contractors against employees or applicants who inquire about, discuss, or disclose details of their own or other employees’ or applicants’ compensation. The stated goal of the order is to give workers a greater ability to identify violations of equal pay laws.


EO 13672

On July 21, 2014, President Obama signed Executive Order 13672, amending Executive Order 11246 to prohibit federal contractors and subcontractors from discriminating against employees and applicants on the basis of sexual orientation or gender identity. The Executive Order directed the Secretary of Labor to prepare regulations implementing the new protections. As a result, the Department of Labor published a Final Rule in the Federal Register on December 9, 2014, changing OFCCP’s regulations to require federal contractors and subcontractors to treat applicants and employees without regard to their sexual orientation or gender identity. This Final Rule took effect on April 8, 2015.

Contractors covered by the new rule will have to ensure that agreements modified or entered into after the effective date of the Final Rule, as well as job solicitations and postings, contain appropriate references to the new prohibited forms of discrimination. Contractors will also need to revise their EEO and affirmative action policies and statements to include sexual orientation and gender identity as protected classes.