wax banks

second-best since Cantor


All seeing is seeing-as, or, Why Trump thinks you’re stupid.

I’ve said it before: stupidity is the problem.

Trump assumes that everyone is as ignorant as he is, lies as much as he does, hates as he does, precisely because he’s stupid — and he’s stupid because he’s apparently never, not even for a second, made any kind of intellectual or emotional effort in his life. He’s a xenophobe: he fears difference, newness. He believes himself historically unique, so everyone and everything is the Other, and he hates the Other. Which is why he’s infamously disloyal, a petty backstabbing coward, when it comes to anyone he doesn’t see as an extension of himself/his will.

Trump’s stupidity means that, as far as he knows, he occupies a stupid world — so why shouldn’t he rule it? He doesn’t know how to spot climate change, so climate change isn’t real. He doesn’t have any real relationships with women, so women are trash. Nazis make him feel good by puffing him up on Twitter and at rallies, so Nazis must be good.

Of course he relished a chest-puffing contest with the witless nepotist Kim Jong-Un. I imagine it made him feel less alone.

One of the saddest things I know is that more than 1/4 of Americans don’t read at all.[1] Trump is, by his own admission, one of them. He might be a psychopath or a narcissist, but the reason he has such a dangerously, unfunnily narrow conception of the good — the reason he goes on endlessly about ‘deals’ but is incompetent to discuss the content, the meaning, of any of his business — is that he has no intellectual bulwark against the stupidity of the world he alone lives in. He fills up every day with the idiot stories he sees on Fox News because he doesn’t know how to find anything deeper in the world.

Trump can’t see, he can only see-as — not in the phenomenological sense, but in the coarse psychological one. He thinks you and I are idiots because he’s an idiot; he thinks he alone possesses The Whole Truth about this or that issue (the ‘climate change hoax,’ say, or ‘black-on-black crime’) because he can’t imagine anyone having an inner life that’s richer than his. He’s a ‘transactional’ being because any other kind of existence is literally impossible, and you’re stupid for thinking otherwise. (Look at how he treats his wives, at the obvious contempt he and Melania have for one another.)

I feel sorry for Donald Trump the boy, semiliterate, unloved, allowed by teachers and parents to remain forever angry and dumb. I suspect he’s wired wrong, but I’m certain he didn’t need to end up as he did. I feel no sympathy for the cruel ignorant coward he became.

Please, please, please: make sure your children love learning, which is to say, love life.


[1] Some are illiterate. Some can read but find it taxing. Some will tell you they don’t have the time — though I’ll bet you $5 that all but a vanishing minority of our non-readers make the time to watch television…

Difference and indifference.

The ‘Google guy’ was fired, which should worry anyone who cares about reasoned discourse (don’t worry, you are exempt), but since I can’t really affect Google hiring/training practices, I’ll stick to a small observation. The science about sex difference is settled, but not the way you probably think: meta-analyses of sex-difference studies going back decades suggest, unsurprisingly, that there are very large differences (link goes to Slate Star Codex) between physiological males and females in a host of areas relevant to the Google diversity discussion (e.g. people- vs thing-orientation), and very small differences in a host of areas where people might expect strong divergence.

In other words, the ‘Google guy’ wasn’t spouting pseudoscience in his ‘screed,’ he was spouting at least some actual science. If you used the word ‘pseudoscience’ to piss on him from your soapbox, consider the possibility that you have no idea what you’re talking about.

Now, I’m sticking with links to/via Scott here, because he’s good at finding/collecting the kind of analysis I’m interested in, and I’m not. Your mileage, as they say, may vary — but only if you actually hit the road.

Sidebar: Scott (SSC’s author) points out that ‘Big Five’ sex differences are magnified by increased economic prosperity. Funny. No, not actually funny.

Scott also links to a piece by Freddie deBoer (who blocked me on Twitter when I pointed out that he’d cut short his mental-health Twitter break after like a day) called ‘Why selection bias is the most powerful force in education’ and you should read it:

Tell me how your students are getting assigned to your school, and I can predict your outcomes – not perfectly, but well enough that it calls into question many of our core presumptions about how education works.

The SSC post closes with an aggressive attack on the prevailing narrative that the lack of women in Silicon Valley (or ‘tech’ writ large) is solely about entrenched sexism. Before he gets to the data, which is damning, Scott unspools a funny little rhetorical gambit:

In the year 1850, women were locked out of almost every major field, with a few exceptions like nursing and teaching. The average man of the day would have been equally confident that women were unfit for law, unfit for medicine, unfit for mathematics, unfit for linguistics, unfit for engineering, unfit for journalism, unfit for psychology, and unfit for biology. He would have had various sexist justifications – women shouldn’t be in law because it’s too competitive and high-pressure; women shouldn’t be in medicine because they’re fragile and will faint at the sight of blood; et cetera.

As the feminist movement gradually took hold, women conquered one of these fields after another. 51% of law students are now female. So are 49.8% of medical students, 45% of math majors, 60% of linguistics majors, 60% of journalism majors, 75% of psychology majors, and 60% of biology postdocs. Yet for some reason, engineering remains only about 20% female.

And everyone says “Aha! I bet it’s because of negative stereotypes!”

This makes no sense. There were negative stereotypes about everything! Somebody has to explain why the equal and greater negative stereotypes against women in law, medicine, etc were completely powerless, yet for some reason the negative stereotypes in engineering were the ones that took hold and prevented women from succeeding there…

Turns out the difficulty in getting women interested in programming kicks in by elementary school. Why is that? Hint: Scott links to the paper about prenatal androgen that you might’ve seen floating around this week.

(I’ll add a bit of handwavey, marginal speculation: it’s also worth looking specifically at differences in TV/videogame interest in very young kids; the videogame revolution does seem to correlate with the moment undergrad CS enrollment started tilting heavily toward boys…)

In the middle of talking about people/thing interest, Scott veers back to medicine, points out male/female variation between subfields, and offers these two graphs…

[two graphs not reproduced]

…which suggest that ludicrous people/things difference, y’know, the one some cultural-politics blogger told you was ‘pseudoscience.’

Reasoned discourse

The best thing about Scott’s post: it started out as a response to a piece by Wharton ‘organizational psychologist’ Adam Grant (scare quotes only because I don’t know what precisely that job title means), and Professor Grant responded to the post — with Scott responding in turn. This is what actual grownup conversations look like, people.

One of Grant’s essential points — if sex/gender disparities in tech are about ‘interest, not ability,’ then we mustn’t forget that interests can be changed — is a very important one. Pushing back against dumb blankslateism isn’t the same thing as saying there’s no entrenched systemic sexism, or no societal influence on development at all; that would be literally insane.

But what’s in our shared interest, culturewide? At the moment, one of the clear correlates of our elite/coastal push for equitable hiring everywhere is the literal suppression of basic scientific research (in popular discourse). Do you feel it’s worth it, on balance, to have twice as many female coders at Google, if one of the costs (not ‘effects’) is a marked increase in willful scientific illiteracy, which is already sky-high? Could we have it both ways? Yes — but that means letting go of ideologies which demand that we dismiss, or ‘merely’ aggressively cherrypick, basic science.

Scott’s last response to Grant (so far) closes like so:

If we continue to insist that, no, women really want to do tech, but stereotypes and sexists are pushing them out, we’ll end up with constantly increasing social engineering to prevent stereotypes, and constantly increasing purges to ferret out sexists (and “benevolent sexists”, and “unconscious sexists”, and people who are progressive but not progressive enough, and so on). Since these will never work (or even have paradoxical effects for the reasons mentioned above), we’ll just ramp these up more and more forever. I’m saying we don’t have to do this. We can fight any stereotypes and sexists we find, but understand we’re doing this in a context where even 100% success won’t achieve perfect gender balance.

We’re talking here about competing notions of freedom and of fulfillment, and I worry that the better, more sustainable such notions are being throttled. But don’t take my word for it.

Ritual and control (systems): freewrite.

The word ‘ritual’ is overloaded w/judgment because the 20th century was horrible. We have a screwy notion of what time is — the body’s relationship to time, and the mind’s.

Neonates’ hearts have to be taught to beat in time. Ever wonder why they respond so well to bouncing at ~80bpm? Their hearts are learning how to keep a beat. They’re learning how to live.

Technologies collapse space and time, can we agree? One major effect of the Internet is that all libraries are local. My car lets me be 60 miles away in an hour; traveling five miles takes ‘no time at all,’ a unit of time so small I don’t notice it unless I’m in a hurry. Benedict Anderson wrote about this already — the psychic effects of 19C mass media. James Scott as well, in another register. Manovich, Kittler — yr Media Studies 101 reading list, basically.

(The Language of New Media put me off when I read it in grad school; I wonder how I’d feel about it today, where my almost unreadably marked-up copy is…)

What’s ritual? Programmatic action to imbue a moment with meaning: to change the relationship of the mind/body to spacetime. Ritual differs from habit by intention. It differs from ‘process’ in its metaphoricity — rituals aren’t always representational but the action/effect mapping passes through metaphor, which isn’t true of a functional process. How do you make scrambled eggs? Crack, whisk, milk, heat, scramble, no need to pour a ring of salt around yourself in the kitchen. Each step of the process accomplishes something physical, obvious; each step in the ritual (the crimson shawl, the ring of salt, the prayer to Pelor) accomplishes psychotropism.

Psycho+tropism: mind+changing. ‘Learning.’ I’ve been making this point (well it’s not a ‘point’ exactly) in writing for 15 years now.

Science — or no not ‘science’ but whatever hip idiots mean when they say ‘Yay, let’s do science!’ — is supposed by now to’ve freed us from the Terrible Shackles of ritual. We no longer evoke or imbue or incant or call down the ______ but rather we ‘boot up’ and ‘lifehack’ and oh God it’s too stupid to write down. Point being we’ve replaced magical metaphors with technological ones and have failed to register the implied insult, i.e. that you and I are the same kinds of machines as the ones we serve all day. (On the other hand, given this subservience, maybe calling ourselves ‘computers’ is meant as a compliment? Well: I don’t take it as one.) The idea that you can pop a nootropic or microdose and unlock the awesome power of the human mind isn’t even wrong, it’s a betrayal on another conceptual register altogether — of dignity. The idea, I mean, that there’s nothing else to be gained by taking human time: time at a biological scale.

What am I angry about now. What am I going on about. Please, please look: Western minds have shifted over the last few decades toward a resentment/rejection of ritual, languor, symbol, secret, time as pleasure, mind as space — magic, basically. Magical thought. I mean even the phrase ‘magical thinking’ is a denigration now, as if magic hasn’t been a way of working (in) the world since the dawn of the species, as if ‘magic’ referred simply to the incorrect belief that a fingersnap can make a hated enemy feel pain and not to, oh, the years-long process of careful ego-thinning and -reshaping by which minds open up to an ecstatically imaginative (sur)reality.

Or from another angle: if you drink your stupid burnt Dunkin Donuts coffee-sludge in a hurry on the drive into work, the caffeine will make you somewhat more productive for a short time. There are better habits and worse ones. But you should know that in another world, that drink was part of an inexpressibly more potent behavioural psychotropic, a (don’t tell the boss) ritual of movement from hanging-at-home mode to whatever mode you need to get into to work for those predators at the top of the org chart — and billions of dollars are spent every year to convince you that you don’t need it, that there’s no time for that sort of New Age frippery. For those five minutes of generative peace and wonder and focused consciousness.

So: life gets faster and worse. And the other world, which was only ever within you, a metaphor of unspeakable power, gets smaller and emptier and harder to find.

Four things to read.

Not ‘news,’ still timely:

B.R. Myers on North Korean propaganda, internal and external:

It’s an undiplomatic point to make, but the inconvenient truth is that most North Korea-watchers in the United States don’t speak Korean and don’t read Korean. They’re not able to read even the legend on a North Korean propaganda poster. So they, for decades, have had to depend on secondary sources of information, primarily in English. When they read North Korean materials, they have to read the so-called Juche Thought, because the regime has been careful to put this pseudo-ideology, this sham ideology, into English. So when foreigners want to read about North Korean ideology, they have to turn to these books on Juche thought, which really decoy them away from the true ideology.

Juche Thought is a jumble of humanist cliches like “Man is the master of all things.” This fake doctrine has absolutely no bearing on North Korean policymaking. While people are wasting their time trying to make sense of Juche Thought, the regime is propagating this race-based nationalism. Another problem we have in the United States, a little bit, is political correctness, inasmuch as we are uncomfortable attributing racist views to non-white people.

Scott Alexander (Slate Star Codex) on motte-and-bailey arguments:

Post-modernists sometimes say things like “reality is socially constructed”, and there’s an uncontroversially correct meaning there. We don’t experience the world directly, but through the categories and prejudices implicit to our society; for example, I might view a certain shade of bluish-green as blue, and someone raised in a different culture might view it as green. Okay.

Then post-modernists go on to say that if someone in a different culture thinks that the sun is light glinting off the horns of the Sky Ox, that’s just as real as our own culture’s theory that the sun is a mass of incandescent gas a great big nuclear furnace. If you challenge them, they’ll say that you’re denying reality is socially constructed, which means you’re clearly very naive and think you have perfect objectivity and the senses perceive reality directly.

The writers of the paper compare this to a form of medieval castle, where there would be a field of desirable and economically productive land called a bailey, and a big ugly tower in the middle called the motte. If you were a medieval lord, you would do most of your economic activity in the bailey and get rich. If an enemy approached, you would retreat to the motte and rain down arrows on the enemy until they gave up and went away. Then you would go back to the bailey, which is the place you wanted to be all along…

John Holbo’s (nearly 15-year-old!!) critique of David Frum’s conservatism:

The funny thing about this book is: it isn’t nearly as bad as I just made it sound. I don’t think Frum is obsessed with beards or anything, actually. He sometimes seems like a pretty sharp guy. The middle chapters – full of history and policy detail, so forth – are quite cogent. Just the main chapters have problems. Frum has written a book about the need for a reflective, conservative philosophy. And: that’s the one thing he hasn’t got. He just has no clue why he is a conservative, or why being one might be a good idea – or even what ‘conservatism’ ought to mean. Whenever he starts trying to talk about that stuff, his mind just goes blank and he fantasizes about shaving beards and the Donner party.

Daniel Davies’s ‘One Minute MBA,’ which may possess more value-per-word than any other blogpost yet written:

Anyway, the secret to every analysis I’ve ever done of contemporary politics has been, more or less, my expensive business school education (I would write a book entitled “Everything I Know I Learned At A Very Expensive University”, but I doubt it would sell). About half of what they say about business schools and their graduates is probably true, and they do often feel like the most collossal [sic] waste of time and money, but they occasionally teach you the odd thing which is very useful indeed. Here’s a few of the ones I learned which I considered relevant to judging the advisability of the Second Iraq War.

The founders were people, but The Founders aren’t people.

Idle, irresponsible, testy thoughts, unedited and unfiltered and (to be frank) probably un-thought-through.

Problem: The world of the Founders seems impossibly distant from our own, and Americans are pig-ignorant about our history.

Bad solution: Pretend the Founders were essentially modern Americans, somewhat abstracted perhaps, and try to draw political/cultural lessons from them on those terms. (This is known amongst historians of the era as ‘Founders Chic,’ and is popular for boring reasons — cf. Wall Street reporter Ron Chernow’s laudatory book on Hamilton, or the current backlash against Thomas Jefferson.)

Better solution: Treat them as fallible human beings while acknowledging the historical specificity of their time and place — i.e. maintain their status as historical figures rather than mythic characters.

In my family we’ve been listening basically nonstop to Hamilton, which is a great success on its own terms but seems, based on what little guilt-motivated research I’ve done, to be bad history. The play’s full of anachronisms, which don’t bother me because (1) they’re groovy and (2) I’m not a priggish asshole, but the specific recasting of the Hamilton/Jefferson conflict (Hamilton married into a family of slaveowners and himself rented slaves, yet he gets a number of abolitionist applause lines; Jefferson’s genuinely radical democratic ideals are laughed off as aristocratic hypocrisy) damages the history for no damn reason except, I think, to pander to Miranda’s ‘progressive’ audience.

(Testy aside about Miranda’s own background goes here, but I can’t be bothered.)

It’s dangerously distorting to portray humans of hundreds of years ago as basically modern in their outlooks — though I can see why you’d do so; no one would give a shit about Alexander Hamilton today if Miranda hadn’t made that choice. It works, and you’ve got to put asses in seats. Hamilton is a multimillion-dollar business. Yet the cost of that distortion is the audience’s cheaply acquired false certainty, which leads to recklessness:

Casting black and Latino actors as the founders effectively writes nonwhite people into the story, [Chernow] said, in ways that audiences have powerfully responded to.

Sadly, no! It just substitutes a fashionable interpretive matrix for, y’know, actual historical understanding, and piggybacks the noble and correct idea that ‘Anything You Can Dream, You Can Be’ on a sugarcoated misreading of history that shuts down further inquiry. It slots the Founders into contemporary conversations too easily, and the cost to our collective historical imagination will far outlast any tactical gains that one or another side might make in the culture wars. (‘Culture wars’: rather a grand name for local proxy conflicts whose chief purpose seems to be distraction from, among other things, actual wars…)

The Founders don’t need to be mythic embodiments of Good and Evil to be useful to us today — quite the opposite, if they’re to be sustainably useful and meaningful. Our inability to admit that the Founders were complex human beings is part of the reason we have such a childish relationship to our national history. The idea of America is an ongoing conversation, a history of debate between complexly invested humans. We go back to Colonial history wanting it to illustrate a point or settle an argument. But that’s not what historical inquiry does — the past doesn’t settle our arguments, we have to do that for ourselves. And we’re best able to handle our own business when we know where we’ve really come from.

Anyhow, the upshot here is twofold:

  1. You should listen to (or see) Hamilton, which is a great musical on its own terms.
  2. You should ignore the people who tell you it ‘brings the history to life.’ For ‘history,’ there, read ‘mythology.’ Hamilton settles for being a passion play when it could have been something so much more interesting: a problem play.

The problem is stupidity.

There are evil people in the White House, sure, but they’re outnumbered by the deeply stupid ones — dilettantes and pseudointellectuals like Bannon (who seems to be both), empty suits like Priebus, and of course the president himself, who by all accounts is too dumb to sit through briefings or comprehend ideas beyond grade-school level.

‘Ohhh you elitist jerk! Intelligence and goodness are orthogonal!’

Too-easy response: would you want a stupid doctor examining your daughter, or a stupid contractor building your house? These people need to be smart in order to do complicated things well — ‘goodly,’ as they say (I hope).

Let’s go further, though.

Intelligent people can be betrayed by their feelings, their ‘cognitive biases,’ same as anyone else. Obviously! And equally obviously, ‘smart’ folks can’t claim moral superiority — you can start with little more than the Golden Rule and live a good life, and the road’s littered with corpses left behind by ‘intellectuals.’ But intelligent folks, folks who can read critically and argue, who can handle irony and work through complex lines of reasoning and think dialectically, are much less susceptible (on average) to bad ideas.

Racism, for instance, is stupid — but you can learn that racism is stupid, and more importantly make yourself robust against it. Not through tribal-identitarian rituals (which just teach a kneejerk response to unfashionable forms of bigotry while blinding you to fashionable ones) but by introspecting about your racist beliefs and thinking through their consequences.

Censorship’s stupid too: morality aside, it doesn’t work (censored ideas grow more potent), and since the power to censor changes hands regularly, it’s shortsighted to boot — next time around it’ll be the other side silencing you. Those who advocate for censorship do so because they can’t think beyond the satisfactions of the moment, and can’t reason their way out of distaste. Empathy at a distance is a learned skill, and by developing that skill you begin to make yourself robust against your terror at unwelcome thoughts and expressions.

Why do intellectuals fall for bad ideas? Because they’re scared to make use of their faculties — they crave status, fear exposure, succumb to parochialism, or are just lazy.

The stupidity of the Trump White House bothers me because, even if Trump’s people are exactly as (im/a)moral as Obama’s, high-level thought can’t survive in that environment. Their organization is dysfunctional because so many people in it are too stupid to work together, for the future, at short-term cost to themselves. The best opportunity for the Republican/conservative agenda in more than a decade has been pissed away because the White House can’t play smart.

Which is merely quite bad right now, but will be a disaster when an actual external crisis hits. That’s the risk: the White House, the federal government, is not robust against calamity. You look at it the wrong way and it wobbles and falls.

Ignorance is our natural state, but willful ignorance is a sin. The president trusts Fox News and the Breitbart mis/disinformation machine for his daily news, even though he’s got the entire intelligence community ready to do that work for him. Why?

Because actually doing his job is too hard. Because he’s too stupid and too scared to keep up with the work.

So was GW Bush, of course — but Bush had principles, a compass (however faulty), and a deeply held sense of noblesse oblige. He was a cretin but he knew what the job was, more or less, and seemed to know his limitations. And like Obama, Bush was a voracious reader — you don’t suppose that’s a coincidence, do you?

Stupidity makes you cruel because it keeps you afraid. It makes you violent because it blinds you to better solutions. Stupidity makes you weak, because it keeps you from seeking out the interesting challenges that make you strong. It makes you boring because it shuts out all but the most obvious desires.

Lionel Trilling spoke of a ‘moral obligation to be intelligent.’ I look at Trump and his gang of second-raters and for a second I know just what he means.

‘Design philosophy’ is a smokescreen: initial point.

A habitual point-misser at rpg.net — a guy who was banned for threadshitting about a game he doesn’t seem to play, returned weeks later, and still can’t resist the urge to insert himself into every thread on that subject — said this in a thread about D&D 5th edition:

I just feel like there is a really deep, philosophical difference between what 4e does, within its niche, and what 5e does, within that same niche, and that it’s unusual for someone to like such significantly different takes within such a narrow space.

Maybe my issue is more that I see “I like 4e” as saying more than simply “when I play 4e, I have fun.” I see it as affirming support for a thing behind the game–hence my repeated references to design philosophy, and my comparison to a philosophical difference in literature. Because, by the latter definition, I literally like all TTRPGs I’ve ever played, because I’ve had fun while playing them. Even games I would never actually say I like.

He talked about Disney’s Sleeping Beauty and Game of Thrones, and about C.S. Lewis and Ayn Rand, and how he doesn’t understand how someone could claim to like both paired terms, for ‘philosophical’ reasons.

And I think this in response:

4e and 5e don’t do the same thing. They don’t really try. This is one source of your confusion. One is an anime-superpowers ‘cinematic’ fighty minis game, one is a streamlined modernish take on 80s D&D.

But even if they aimed at the same genre, there’s this: almost no one cares about ‘design philosophy,’ and talking about it (even on nerd fora) is often a smokescreen. The stable sensible adults I know don’t find it unusual at all to like very different takes on the same material. When I read Game of Thrones, I dig its vastness, its human-scale history, its grim postapocalyptic antiwar outlook, its conspiratorial complications. When I watch Disney’s Sleeping Beauty with my son, I dig its grand primary-coloured good’n’evil story, its desperation, the courage and terror and childlike wonder of it. I like The Wasp Factory and Catcher in the Rye (‘sourly funny adolescent works through emotional issues’ stories), I like Tolkien and Moorcock (and both understand and disagree with Moorcock’s ‘kill mommy’ bashing of Tolkien), I like Pynchon and the faintly embarrassing sub-Pynchon pretentious sex-comedy of Illuminatus!, I learn something from Tony Judt’s Euro-cosmopolitanism and John Gaddis’s unabashed USA-triumphalism, and none of the philosophical ‘contradictions’ between these works are as important as what I (you) take from them in the moment.

Justifying your affection for some popcult thing by talking about the ‘principles’ it embodies is the same lazy identitarian bullshit that

POLITICAL/ACADEMIC/CULTURAL RANT REDACTED

and if you can’t stretch yourself a tiny bit to see and enjoy things on their own terms, and to empathize with others doing the same even with texts you ‘don’t get,’ then don’t be surprised when sane sensible adults politely show you the door.


The deeper point here is that consumers who talk about ‘design philosophy’ are for the most part just borrowing hip terminology to mark themselves as above the material they don’t like. You get the same from the dilettantes and status-seekers in ‘Apple punditry’ and the gadget press, acting as if they’ve intuited the deeply admirable design principles behind a gadget which (coincidence!) happens to fill a need for them.

Fear of pleasure, lack of empathy, and ignorance about process: these are, you will hopefully be unsurprised to hear, problems. We will talk.

Pierce.

Does this look like wordcount padding to you?

In the clamor of a presidential race, which this year is even more distracting because of a clamorous and vulgar talking yam, a lot of important information gets drowned out that ought to be part of the presidential race in the first place.

That’s Charles Pierce, beloved of leftish readers who prefer articulate but shopworn outrage to analysis, drowning out some important information with a rush of cliché over at his blog ‘shebeen.’ Annoying as I find the ‘yam’ bit, it’s the misused ‘in the first place’ that puts me off. Surely a quick reread should’ve flagged that clunker?

Sensible people insist Pierce is a Great Writer in his mode, but his Esquire Politics blog has been trash all year, and paragraphs like the one quoted above are the reason why. Every single fucking post is riddled with ‘clever’ nicknames like ‘the vulgar talking yam’/’He, Trump’ or ‘Tailgunner Ted Cruz,’ tired rehashes of years-old jokes, and threadbare secondhand verbiage out of the Sclerotic Greyhair anthology. There’re ten thousand leftward bloggers like him, frankly, and dozens of them are reaching for new insights and new prose without any noticeable loss of perspective. I’ve linked admiringly to Pierce in the past, when his brand of overwrought doomsaying has suited the emotional tenor of some darker-than-usual cultural moment. But at this point he’s stamping about the ol’ shebeen like a more historically informed and somewhat less self-important Keith Olbermann — remember how K.O. got off a couple of memorable ‘viral’ speeches on his TV show before abruptly reaching the limit of his insight? — which is a damn shame considering Pierce’s actual talent and skill levels.

He was necessary reading once, back when he couldn’t be reduced so easily to a formula.

I say all this because Pierce talks constantly (and with extraordinary condescension) about the decline of rationality and sense in the USA — this from a man who in 2009 wrote a book called, wait for it, Idiot America — yet as near as I can tell, he long ago joined the parade of hurt/comfort pundits whose main job is to point at an extremely obvious outrageous affront to leftesque sensibilities (dumb people with guns! the politics of the image!) and recite a comforting litany of complaints (our nation is in decline and you and I are in no way to blame!), the balance of outrage and been-there-blogged-that worldweariness calibrated to go well with, say, a sugary milky caffeinated drink from your local fast-coffee chain. The ~left blogosphere has made this sort of pageview-trawling pseudoanalysis its primary sport for years and years now.

And while you might well think that the Real Problem is the rise of right-wing talk radio (which has been a major cultural force in this country for a quarter-century, you knob) or Citizens United or the lack of safe spaces or Lin-Manuel Miranda not getting enough awards or whatever issue you fill your Outrage Moments with…the fact that the tribe which identifies itself as Educated and Informed and More or Less Left But Also Totally Jazzed About the Fruits of Hypercapitalism — know anyone like that? — just can not be bothered to communicate with any of the other tribes, the fact that our Elites are doing their best to turn not only their neighbourhoods but their entire mediated existences into gated geographic/cognitive communities (the Safe Space as model of the Self), is exactly isomorphic with the ‘epistemic closure’ which was such a big deal amongst Righty crankfluencers a few years ago.

In other words: if you prefer your own tribe’s clichés to merely being in the world with members of any other tribe, you are part of the Idiot America that Pierce and his (mostly younger and dumber, therefore more forgivable though no more tolerable) cohort like to think they stand outside of. The system is rigged against you, just like it’s rigged against everyone who isn’t in charge of it, but you still bear a portion of the blame. Just like me and Charlie.

But you’re not getting paid to pass off your outraged gesticulation as critical insight. So your share of the blame is that much smaller.

That’s all.

A ten-minute Obama lecture on cynicism and polarization? Sure, why not.

I’m going to miss this man when he leaves office. His successor will be…ugly.

New media, old lies.

A rant follows. –wa.

‘Social media’ is, for the most part, a marketing term — the ‘social’ dimension of e.g. Twitter is actually only a small part of many (most?) users’ experiences, which consist largely of ongoing broadcast streams with brief semisocial side interactions. Check-ins. Socializing means two-way interaction; Twitter’s format militates against that. You interact directly with a tiny fraction of the people you follow, right? You may well follow 300 people, or 1,000 — how many of them did you exchange words with this week? So what were the rest of them doing in your feed?

Twitter users you follow but don’t interact with are little different from TV stations you leave on in the background while you attend to…let’s call it ‘real life stuff.’

The idea that clicking on a link means ‘interaction’ has permanently cheapened that term. If I throw out a piece of mail and you take it out of my garbage and read it, you and I aren’t interacting.

This morning I read a Medium post about how Slack — a groupware application which replaces email with a more robust version of IRC — has ruined some UX designer’s life, by fragmenting his attention. The trouble with the piece isn’t its unfunny conceit (a breakup letter) or its tone of hip abjection (‘You are such a wonderful sane ethical corporation, please fix what you’ve made wrong with my life!’), but rather the unexamined assumption, totally unsupported by anything like evidence, that an adult should be able to live a happy healthy life sitting in front of a screen having IRC conversations with coworkers and ‘friends.’

And where did that ludicrous idea come from?

Not from Slack or Twitter or Facebook. Not from the Internet, Compuserve, Prodigy, or Illuminati Online. The idea that a good life was one in which you had time to settle down before a flickering lightbox and pretend to be connected to other human beings is very, very old. And because it’s transparently false, it has to be sold to us.

What’s deeply wrong with the Web was deeply wrong with TV

One of the most important pseudophilosophical currents of the last hundred years is the technophilic triumphalism that animates Silicon Valley culture. (I say ‘pseudo-‘ because it’s too poorly thought out to rise to the level of philosophy.) The idea that personal computers, and then the Internet, and then smartphones, and then more elegantly designed personal Internet smartphone computers will (1) fix what’s wrong with human civilization while (2) costing us nothing is a pernicious myth that (with honorable exceptions like that maybe-crank Morozov) goes essentially unchallenged in the cultural mainstream today. A lot of people are getting rich off the idea that the Singularity is not only real but here already. ‘Disruption’ is the watchword and dysfunction is the result.

But we should be aware of the roots of that dysfunction, which extend back a good deal further than the latest antisocial outrage. The problems with our all-Internet-all-the-time lives are outgrowths of much older cultural movements.

The broadcast-TV consensus wasn’t killed by YouTube but by cable TV and the VCR.

The atrophy of social/empathetic faculties now manifesting in our politics of secondhand identitarian outrage-cosplay has been in progress since long before the Web existed — for all the vaunted cognitive benefits of video games, Atari killed more brain cells than Snapchat.

The total collapse of trust in American news media owes a hell of a lot more to Ted Turner’s insane idea of a 24-hour ‘news’ network than it does to Matt Drudge or Gawker or whoever we’re blaming now.

Controlling the televised image has been essential to national political success for more than half a century — ask Richard Nixon — and remember that this country already (re)elected a second-tier movie star and cigarette pitchman to the White House.

(Don’t you dare fall prey to the stupid idea that Trump’s candidacy has been ‘conducted via Twitter.’ Even younger voters who didn’t constantly hear about his Yuuuge Business Deals in the 80s (as I did) know him from The Apprentice, which laid the groundwork for his current performance by normalizing his repugnant charismatic-villain persona. Funny, I didn’t actually watch the show. Did you?)

It’s not exactly controversial to say that computer-dependent life is dangerous and costly, though saying so won’t win you any popularity contests. But — I know this is basic but it needs saying because it’s, y’know, basic — what’s shitty about staring at the Web was shitty about staring at the TV half a century ago.

‘But we have such short attention spans today!’ Blame the handheld infotech device known as the remote control, which fed you the daft notions that not needing to get up from the couch was a measure of meaningful control over your media diet, and that if you didn’t immediately like a work of televisual art you should ‘see what else is on.’ And remember that most of us now watch our movies at home — in a domestic, distraction-filled ‘curated’ environment, rather than the sense-heightening identity-submerging collective ritual strangeness of the movie theater. We’re too ‘enlightened’ to recognize the importance of the magic circle of ritual transformation which surrounds the theater, and too lazy to seek it out, at fatal cost to our imaginations.

There is no magic circle around the Web. There can’t be.

Obviously the Web differs from TV. Obviously. But the danger it poses isn’t anonymous comments, it’s that your computer is a glowing box that you stare at while a stream of mostly irrelevant nonsense flows by your face, barely registering beyond its immediate sensation — and you convince yourself that you’re ‘interacting,’ that you’re an ‘active viewer.’ Does that sound familiar? We’ve replaced CNN with cnn.com, People Magazine with TMZ, watching Arrested Development with downloading Arrested Development, talking about comic books with tweeting about comic-book movies — but the basic problem of dependence on the screen, of its fundamentally antisocial and dehumanizing nature, has only deepened. It’s not new. We sit motionless in our ‘filter bubbles’ and congratulate ourselves on how comfortable our chairs are, how carefully ‘curated’ our Twitter feeds, but before ‘Web surfing’ there was ‘channel surfing’ and we tried not to think about what it was costing us because we’re in love with our illusions, not least the Illusion of Choice.

Hey by the way, remember when anyone gave a damn that Goldman Sachs was one of the most dangerously powerful organizations ever formed by a group of human beings? No, of course not. That was a long time ago. It’s scrolled off the bottom of my screen.

We’ll have to catch it in reruns.