wax banks

second-best since Cantor

Category: essaying

Sharing ecology. Social objects.

Epistemic status: Wrote this in mid-2016 but I don’t think I posted it anywhere. It appears to be a semi-hasty first draft (These are a few of my…). At the point where I mention the Great Depression, I found a note from my 2016-self: ‘TK THIS GRAF IN PROGRESS; CONSIDER CUTTING NEXT GRAF OR 2 AND REDOING APPROACH TO THE ENDING ALTOGETHER, MORE IN KEEPING W/THEME OF THE PIECE.’ I’ve disregarded that note for the moment, because revising and rewriting this essay would mean sinking back into these feelings, and right now that thought turns my stomach. Maybe someday. –wgh.

I recently wrote a book about my favourite band, and found myself reliving — and longing for — the early days of my fandom. The band’s reputation rests on their improvisatory live performances, and for their first 20 years fans eagerly traded amateur concert recordings on audiocassette, with the band’s blessing. Walking into an acquaintance’s house and finding a shelf full of Maxell XL-II tapes, each with a handwritten ‘j-card’ listing tracks, segues, guest appearances, and improvisations of note, meant you’d found a fellow obsessive, and probably a friend.

That’s over now. My tapes gather dust in our attic, and I listen exclusively through iTunes and streaming sites. ‘In my day’ you’d arrange a trade with a stranger online — I’ll send you a 1st-generation tape of Amsterdam July 1997 for a clean copy of those two classic Red Rocks shows on your tapelist — and send the tapes in bubble mailers, then post a Good Trader Alert to the newsgroup. We shared physical objects which resided in our homes, and our relationship to the music was artifactual, sacramental. You could be ‘in the presence’ of the music in a literal way.

Now, when you want the show, any show the band has ever played, you find the link in a single handy spreadsheet and download it from ‘the cloud.’ This has reduced but not quite eliminated audience taping, but it’s entirely done away with the fan trading network which was the backbone of our community. Once I shared the music with you; now a computer somewhere on Earth shares the music with our computers.

Everything is always available. There’s no need for us to share. There is no one to thank.

Free public wi-fi, streaming HD video, same-day book delivery, timeshifted TV, effectively unlimited free email: the benefits of these technocommercial advancements are so obvious that we needn’t talk about them, and so never bother thinking about them, and so tend to assume that these glorious advances and their glorious advantages are the Way of Things, steps already taken and so either fully accounted for or simply beyond counting. They can have no cost, this non-reasoning goes, because honestly why talk about cost when we finally have nice things?

But of course there are costs. There always are.

They manifest subtly at first.

In recent years the words ‘own’ and ‘ownership’ have acquired new senses: to ‘take ownership of your trauma’ means to acknowledge and make peace with the fact of a bad thing having scarred you emotionally, and to ‘own your privilege’ means to recognize the ways in which you benefit from your social class, and then pantomime remorse. ‘Ownership’ here means something like ‘reckoning,’ usually melodramatic.

And of course there was President G.W. Bush’s ‘ownership society,’ an idea which combined deficit spending (instead of ownership) and further atomization (instead of society).

When I was a kid in the 80s, ‘ownership’ seemed to me much less complicated: owning a thing meant being able to hold it, touch it, and — within reason — do what you wanted with it. If you owned a Nintendo game, for instance, you had a plastic cartridge full of subtle electronics which you inserted into a plastic and metal console in order to play. To share the game meant walking it down to Scott’s house and playing it over there; at the end of the day you brought it home. It was ‘yours’ the way your sneakers were yours. If it broke, it was lost to you, but you usually knew why.

Sharing music meant lending a compact disc or dubbing a cassette tape, and woe betide the would-be pirate who wanted music that didn’t fit cleanly on either side of the tape. Sharing a drawing meant sending it by mail; sharing movies meant inviting Jimmy and Craig over to watch them at your house on a weekend.

This was, I don’t need to tell you, a pain in the ass; and I’m assured that things are Better Now. To ‘share’ a movie with a friend in 2016, you simply point her to where you got it. Same with music and games. Easy breezy: your precious objects never actually leave your hands, and you can share without giving. ‘Generosity’ doesn’t come into it; when your neighbour asks to borrow your copy of the Game of Thrones finale, you either divide it by mitosis (copying the file) and pass along a copy, or maybe email her a .torrent file so she can grab it directly from the 16-year-old who pirated the episode in the first place.

Failing that, of course, you can just give your friend your Netflix or hbogo.com password — both companies have accounted for such ‘violation’ in their business models. The miracle is that she gains while you appear to lose nothing at all.

To borrow terms from computer science, this is ‘sharing’ as reference-passing¹ rather than object-passing. The shift is meaningful, its benefits are clear, and we will be paying its hidden costs for a long time.
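For anyone who doesn’t live in the jargon, here’s a minimal sketch of the distinction in Python — the setlist is, obviously, invented for illustration:

```python
import copy

# A 'mixtape' as a mutable object: sharing by reference vs. by copy.
mixtape = ["You Enjoy Myself", "Reba", "Tweezer"]

# Reference-passing: both names point at the SAME object.
yours_by_reference = mixtape
yours_by_reference.append("Harry Hood")
assert mixtape is yours_by_reference   # one object, two names
assert "Harry Hood" in mixtape         # 'my' tape changed too

# Object-passing: a genuine duplicate, like dubbing a cassette.
fresh_tape = ["You Enjoy Myself", "Reba", "Tweezer"]
yours_by_copy = copy.copy(fresh_tape)
yours_by_copy.append("Harry Hood")
assert fresh_tape is not yours_by_copy     # two distinct objects
assert "Harry Hood" not in fresh_tape      # mine is untouched
```

When I pass you a reference, nothing of mine is given up; when I pass you an object, something physically changes hands. The whole essay hangs on that difference.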

As Richard Stallman and his cohort have been pointing out for decades, our ‘possessions’ are increasingly rentals — beyond the simple fact that we’re not permitted ‘inside’ the digital tools we rely on, the ongoing shift from local (desktop) computational resources to online services, invisible server-side processing, and remote storage means that it’s typically ‘more convenient’ in the short term to have easy access to digital resources we don’t control than to actually ‘possess’ them. It’s nice not to need to synchronize multiple copies of your email archive, isn’t it? Easier, certainly, to let Google have it, and simply view your messages on a webpage (‘in the cloud’). All you have to do to get access to your most intimate thoughts is this: when the Alphabet corporation of Mountain View CA asks for the magic word, you type it — and it would be helpful if you handed over your phone number too, just in case.

This isn’t about Luddism, mind you. We’re doomed, yes, and it’s our absolute dependence on biologically incompatible industrial infrastructure that’s doomed us, but: Cloud computing really does make modern life easier; accessing your entire music collection from your phone really is a miracle; not having to worry about server maintenance makes running a website not only easy but possible in the first place. I like being able to stream every Phish show to my phone. No, that’s not strong enough: 18-year-old me would have murdered his friends to get access to the digital tools which 37-year-old me, taking them for granted, finds insufficiently convenient.

And yet.

And yet when we consider whether to buy gadgets or embrace hip new software services, the alternative to Gadget A or Online Service B is always Functionally Equivalent Gadget X or Interchangeable But Less Snazzily Branded Service Y — the alternative is never Doing Without — and the sole reason for this state of affairs is that if you and I Do Without, the companies which sell us things will make a touch less money.

Retail businesses can be divided cleanly into two camps: those that produce truly useful, essential goods, and those that benefit from consumer anxiety.

With a smartphone and earbuds, you can now talk to Aunt May in far-off Osbaldeston anytime, in realtime — but since everyone else is wearing earbuds too, you can’t talk to a stranger on your street. Your teenage kid doesn’t think twice about listening to music across cultural borders, but has also never even heard of ‘social music,’ and probably knows none of the music that kept your parents and grandparents alive. Turn-by-turn GPS directions make navigation trivial, and the only cost is that relying on that technology means you never form a mental map of your city; but then, why would you need that? Why would you need to talk to a stranger, or learn your grandparents’ emotional language? What’s so great about being able to imagine a city without looking at a cartoon map of its streets?

If you’ve ever looked at your year-old iPhone and felt, deep in your bones, that it was time to pay a couple hundred bucks for an upgrade, then the people selling you pills are the ones who made you sick.

The redefinition of ‘sharing’ from transferring to copying is an inevitable knock-on effect of ubiquitous digital networking. When copying is cheaper than transferring, you copy; that’s why every Harvard freshman cheats. (And the faculty blame the kids and their parents, never themselves or the institution.)

But that redefinition, plus the creeping status anxiety and ‘FOMO’ (fear of missing out) engendered by a gadget/tech culture that’s easy to enter but difficult to leave, plus the ubiquity and pace of ‘social’ media, creates a toxic dependence on corporate media — meaning not only Sony, NBC, and HBO but Google and Apple and Twitter too. There’s social pressure to pay constant attention, and most of what there is to pay attention to is advertising. And while corporations carefully pressure their marks (us) to ‘create,’ to feel ‘empowered’ (or else!), they also beam out the false but convincing message that the path to empowerment is consumption, and creativity outside the corporate-media envelope is somehow suspect. Fulfillment can’t be sold, only shared, so marketers forcibly and falsely equate fulfillment with satisfaction — the palliative, the rush of sensation — and ‘sharing’ is reduced to word-of-mouth advertising.

The late Prince famously said that his enormous back catalogue was the result of a kind of odd pragmatism: when he got the urge to hear a certain kind of music, his tastes were specific enough and his process refined to the point where he was better off just heading into his studio and making the music himself. We are told that in the Digital Future of Today it’s that easy for all of us. But when was the last time you recorded a song, actually edited a home video or article (‘sharing’ increasingly also means ‘sharing your first draft’), or built a Lego project from scratch instead of buying a kit? The instant sugar hit of recognition and pseudoconnection that comes from engaging with a brand, a franchise, a ‘magical revolutionary device,’ will almost always overpower the more complex, delicate experience of doing it yourself — and the conscious choice to DIY is short-circuited by the saturated colours, beveled edges, high framerates, and deep bass frequencies of the mediasphere.

This is a new spin on old news: the American public’s tendency toward absolute passivity before the screen has been a problem since the first television beamed out the first time-killing inanity, if not before. (During the Depression, 65% of the American population went to the movies each week.) Never mind that sitting for hours in front of a screen is bad for you, a fact everyone has known and seen firsthand for decades; thinking for hours through a screen is bad for you too. It fundamentally changes how you see, how you want, how you experience Others.

Americans’ democratic rhetoric has never quite hidden our desperate yearning for a strong hand, ideally an invisible one, guiding our choices. Consumerism, conformism, identitarian narcissism — these are such longstanding concerns you can watch expensive cable dramas about them. Madison Avenue didn’t invent the insane notion that happiness means inactivity (‘kick back and relax’), slaves did. And yet Being Able to Accomplish More is the core sales pitch in modern life: enhancing your productivity, being a ‘more effective you,’ decreasing your footprint while ‘increasing your impact’… Accomplishing more while doing less is the essence of the American Dream, which is one reason Silicon Valley’s rapacious technophilia has so thoroughly colonized the contemporary American imagination.

Yet it bears repeating: our tools also constrain our ability to create, coarsely (you can’t do calligraphy with a hammer, or drive nails with a watercolour brush) and more subtly. Tools come with ideas attached, with cultures of practice, social histories, private associations…and many of the ideas attached to our modern digital tools are poisonous to our long-term health. The idea that sharing means referring to Something Neat rather than making something of our own and passing it along. The idea that your urge to hear music is best satisfied by turning on the radio-equivalent rather than picking up a guitar (or even an iPad drum machine). The idea that, because pseudostate corporations can provide essential social services more efficiently in the short term than the actual state, they should do so. The idea that the best of you is what you can broadcast to the world right this instant. The idea that a company that inserts advertisements into your email while pretending not to care about their contents is, in any way at all, ‘on your side.’ The idea that ‘self-sufficiency’ is corrupt and valueless simply because it’s a myth.

The idea that it’s important to find out what other people think about your new favourite show before you ‘support’ it by watching.

The idea that it’s better to let the machine remember for you.

The idea that you can form a human connection with a username.

The idea that ‘curation’ is ‘creation,’ rather than ‘acquisition and accumulation with better branding.’

Blah, blah, blah.

(And now a moment for us: If you’re not blocking them with a clever bit of Javascript, please click one of the ‘Share’ buttons on this page so we can both get a sense of self-worth from this piece.)

We began with ‘sharing’ but have ended up on ‘creating’ and ‘curating,’ which makes sense: in the jungle of the ‘social’ Web, your taste is your identity and everything is a remix, and pointing out that these are deranged wrong ideas — pure ideology, good for business and bad for everything else — is uncool, i.e. irrelevant. The redefinition of ‘sharing,’ the weird felt obligation to point out that we don’t agree 100% with everything we retweet, our gadget anxieties, our self-satisfied consumerism, our literally childish equation of fulfillment with momentary satisfaction, our march toward an imagistic attention-deficit politics divorced from actual economic or cultural or indeed climatic reality…these are contemporary manifestations of our century-long movement toward absolute dependence on a corporate-cultural complex, and if that sounds creepily like ‘military-industrial complex’ then give yourself a gold star.

Our technological dependency makes us dependent in turn on the corporations who sell the technology, and those corporations spend billions to make sure we not only depend on them but feel sympathy for them, expend emotional energy caring about their wellbeing — think of how many hours supposed adults spend arguing with each other about Google’s ‘rebranding’ as Alphabet or Apple’s choice of default system font or whether the repulsive multibillion-dollar oligarchy called the NBA should sully the ‘purity’ of its player uniforms with advertisements that already blanket every unused square inch of every NBA arena. So far, so distracting…but when we treat corporate interests as emotionally equivalent to human interest, we silently accept encroachments on our inner lives, steep cuts to our imaginative autonomy, which we’d never countenance if our acquiescence hadn’t been bought. We learn not to mind Google reading our email, which means the NSA reading our email (and Google providing tools for them to do so), etc., etc., etc.

We’ve outsourced our taste to record labels, our imaginations to movie studios, our memories to email providers, and our creative urges to whatever shiny thing crossed our field of vision most recently; what’s onscreen is real — ‘friending’ a user account is making friends, ‘liking’ something is liking it, a selfie is a memory, the show’s better than the book — so what’s real is onscreen. Who has time for anything else?

And outside the window, just offscreen, the seas rise and the world dies.

We’ve lost (ourselves) and the timing couldn’t be worse.

But surely I don’t need to tell you that. Everything is a remix, after all; you’ve heard all this before.

  1. It’s no coincidence that our art reflects this ideological shift with an aesthetic shift toward relentless referentiality — the bored cynic who first got rich painting soup cans wouldn’t have been the least bit surprised by a cash-in like Captain America: Civil War, which (like dozens of comic book crossover (non)events before it) exists primarily to answer the question ‘What if Spider-Man fought Cap and made a weirdly specific Star Wars joke and we saw Crossbones in an early scene for some reason and The Vision wore a sweater and Stan Lee called Tony Stark “Tony Stank”’ with a resounding KA-CHING!! From Lost and Family Guy to every Dreamworks movie ever made, from TV-Game of Thrones’s inept citations of its bestselling source novels to the torrent of swill that is Disney’s ‘secret origins of fairy tale characters’ sequence, from the cloddish point-missing of Sopranos ‘death pools’ to the tiresome pedants who ‘fact checked’ Mad Men by craftily googling every date that appeared on what ended up being nothing more than a (brilliant) work of fiction, the demeaning game of spot-the-reference has become a staple of what credulous academics call ‘active media consumption.’ We could go on, but shouldn’t.

Novelty and discovery.

Epistemic status: Of the quadrillions of blogposts, this is the blogpostiest blogpost. I’ve no idea whether any of it’s correct or indeed whether I’ll believe it tomorrow. I notice that my pairwise comparisons are pretty much only men, and have thoughts about that, but this is a good ol’ fashioned first draft and they’re for another time. –wgh.

Some artists are driven by a desire for novelty, some by a need for discovery. The one has, I think, nothing to do with the other.

The primary virtue of the work, for an artist chasing novelty, is that it seem new and surprising. An artist chasing discovery wants something subtly different: uncertainty, creative dislocation. In other words, novelty artists have a product in mind while discoverers are drawn to a process, which is why the former put out novelty art while the latter make God-knows-what, and sometimes reach the very center of things.

Plenty of interesting artists begin by obeying the first impulse and discover in themselves a hidden capacity, less immediately saleable but more sustainable. Deeper.

Many ‘avant-garde’ artists in any given medium are obsessed with novelty as such, which is why avant-garde art so often feels like toy work in which nothing is at risk — necessitating, in turn, an envious critic-class whose job it is to go on about how risky it is. Anxiety about sales figures drives novelty art, though such artists are expected to pretend otherwise.

Artists doing the deep work wish to be present at the revelation without demanding to hear a specific message, so their work has an unsettling unpredictability quite distinct from the shock-value of the novelty artist.

Sometimes such discoveries pull artists apart.

Think of Brian Eno and Paul Simon, Grimes and Beyoncé, Andre 3000 and Big Boi, George Lucas and JJ Abrams, David Mamet the writer and David Mamet the director, Ricky Jay and David Copperfield, PKD and RAW, Thomas Pynchon and Jonathan Franzen, David Milch and Aaron Sorkin, David Lynch and David Fincher, The Simpsons/Rick & Morty and Family Guy/South Park, if you like. (Feeling silly? Apple and Google.)

I think of one sort of art (and artist) in terms of vision(s) — input, throughput, the encounter — and the other in terms of objects, artifacts, output. Discoverers take up the work not knowing whether they can pull it off, or what it means.

I hate the tone I take in this sort of blogpost but there you are, here we are.

Compromise and happiness; Living space.

(two posts dated 2010, a couple years after I got married but before my son was born. posted as-were, as it were; i wouldn’t write it this way anymore, and reading it gives me that ol’ familiar red-pen-itch, plus a weird sadness (was i always so angry?). but… –wa.)

Compromise and happiness.

Married couples learn to compromise out of necessity, but the scope of this necessity tends to be misunderstood. Compromise is often thought of as an invasion or reduction of personal sovereignty: the up-front price of a marriage’s survival. But in the long run compromise isn’t ‘judged’ correct or successful. Rather, the act of compromise is itself the substance of a successful relationship, which after all consists of (and is expressed in terms of) nothing more than the actions of those within it. Such category errors are common. You might say they’re the most human thing about us.

Some youngsters fear to take naps or relax, thinking they’ll miss out on ‘what’s happening.’ But napping doesn’t recharge you for work or play, it is itself a component of both, just as rests and silence are essential components of music. ‘What’s happening’ is nothing more than: rest. Adults who conceive of ‘me time’ as a vacation from the imposition and inconvenience of marriage make the same error. Time to oneself isn’t a break from the hard work of a relationship, it’s an essential component of it, and spouses who treat time apart as ‘a break from the marriage’ only stress the marriage further, by defining it implicitly or explicitly as the opposite of vacation, or of fun.

The same goes for office meetings and solitary work, or for standardized tests and individual study: the various parts of the job must share a purpose in order to be successful. Students aren’t given detailed feedback on their SAT performance, so the test stands outside the ongoing learning process, indeed it temporarily arrests that process. (Student skill mastery can only be reliably tested in situ, i.e. the best way to know whether you can solve chemistry problems is to solve chemistry problems, not recite vocabulary words.) Meanwhile, the effectiveness of office meetings increases dramatically when workers conceive of them as team work rather than mere institutional obligation. If a meeting is not ‘checking in’ but ‘working face-to-face,’ workers are less likely to shift into passive ‘meeting mode’ when the group is called to order, and can maintain their solitary intensity in the group setting. It’s the shift in intensity that’s costly and prone to malfunction.

The same goes for relationships, of course. If lovers can be honest with one another when they’re together then they can experience the full joy of being apart without assuming that they’re incurring some cost to the relationship. The leering alpha-cretin’s barroom refrain: ‘If only I were still single, y’know what I mean dude?’ The bitterness in his voice is real, but the problem isn’t being one thing or the other. The problem is that he imagines himself as something other than he is, and judges his experience in categorical rather than experiential terms. Of course married men also enjoy themselves in bars, and children wake from naps refreshed, and workers even return to their cubicles with a spring in their step.

Prerequisite to such happiness is the simple but difficult act of abandoning one’s illusions: I could not have been other than I am. I need to be away from you today so that I can be with you tomorrow, the next day I will carry into solitude the memory of our fellowship, what I am flows from what I do rather than vice versa, and no time is uncoupled from any other. John Lennon said something about life happening to us when we’re making other plans, but he had it wrong: life does not happen to you. It is only what you get up to, planning or acting or otherwise. If you compromise in order to be happy later you will never be happy. As someone or other said, be happy in your work (of love, art, labour, learning, etc.). Step one is accepting what you’re really doing and getting on with it. There is no step two. Joy is honesty.

Living space.

Let’s let the idea of ‘killing time’ stand in for the full sweep of modern American cultural thought, shall we?

Americans are raised to believe in the sanctity of work and so forth, but also to be suspicious of people who do nothing but work. The label of ‘workaholic’ combines both admiration and derogation, and adults are expected to maintain a ‘healthy work/life balance,’ as if work were somehow separate from ‘real life,’ whatever that means. Moreover, most adults have at some point responded to the question ‘What are you up to today?’ with a shrugging dismissal: ‘Nothing much.’ In the American language it’s possible to be awake, eat, use the restroom, even walk around town or read a book, yet still ‘do nothing.’ This phrase subtly denigrates solitary reflection and ‘idle’ thought, but such disparagement is reinforced throughout our culture. (The Dutch ride bicycles upright, so as to look around at their cities and countrysides; Americans put on athletic pants and ride quickly with their heads down, so as to ‘get somewhere,’ or worse yet, ‘just go for a ride.’)

And so we find ourselves killing time: struggling to fill the minutes or hours between scheduled or anticipated activities. ‘I have three hours to kill,’ we say, and our intonation depends on our attitude — not toward time as such, but toward the actions we imagine we’re going to take in that time. We even reserve a class of actions for ‘dead time’ — the space in our daily schedule not yet filled with appointments. Such ‘time-wasters’ or ‘downtime activities’ are less honorable than proper action — picking up trash for one minute (during a TV commercial, say) doesn’t count as ‘real cleaning’ unless it’s part of an hourlong string of such maintenance tasks.

The same dynamic is readily observed in American sexual culture. The adulterer convinces himself that his transgression is ‘only sex’ and not, say, lovemaking (‘I told you, dear, she doesn’t mean anything to me’); the closeted fundamentalist preacher can have sex with his male meth dealer without believing himself a homosexual; more benignly, young lovers go on weekly dates but insist they ‘aren’t dating.’ This self-delusion stems from something like the intentional fallacy: if we don’t ‘mean’ what we’re doing, it doesn’t ‘count.’ The delusion serves to defend against true recognition of our acts, which would induce guilt and despair.

Yet the delusion itself obviously isn’t doing us any favours.

Many (if not most!) Americans are evidently unhappy, but so much modern experience consists of avoiding unhappiness — or at least putting off confronting it — rather than ensuring happiness. It’s widely known that working in a focused way for long stretches of time on tasks we believe in produces feelings of peace and fulfillment. The arbitrary segmentation of each day into ‘work life’ and ‘home life’ and ‘play time’ and ‘downtime’ and ‘free time’ necessitates jarring mental shifts, which damage our focus (and in turn our serenity). But this segmentation alone is bearable; plenty of people benefit from sticking to a schedule and so forth. Rather, it’s the belief that some times, some actions, are more ‘real’ or meaningful than others — our belief that some time deserves killing — that leads to despair, because when we review the work of the day we can no longer hide behind the false distinction between what we’ve done and what we meant by it. If you work for two hours and end up ‘killing’ three, do you count it a successful work day with periods of waste? Or a ‘lazy’ day in which you managed a little work?

Is a school day ‘busy’ if fully 30% of it is brief idle periods separating distinct work practices and environments? Can that possibly be the optimal use of a child’s vitality?

Taking a nap is a specific, conscious use of time rather than a failure to use it; ‘killing time’ is a conscious decision to perform a specific action — just like going to work, reading a book, plucking a chicken, painting a portrait, or sleeping around. ‘If you choose not to decide / You still have made a choice,’ the man said, and the time we kill is as much a part of our life as the time we fill. We go on dying at the same rate, regardless.

But never you mind! These are only idle thoughts. You haven’t really read them and, honestly, I didn’t mean even a word. I feel better. Do you feel better.

The X-Files, ep. 4×17 and 4×18: ‘Tempus Fugit’ and ‘Max.’

Note: I normally post these over at Medium, where the rest of my X-Files writeups are. But I’m feeling self-conscious about this site’s barrenness, so here you go.

Max Fenig returns.

In light of our present fallen condition, two bits of dialogue from ‘Tempus Fugit,’ one of the highlights of the strong but uneven fourth season. First, Mulder and Scully walk’n’talking while investigating a downed airplane which seems to’ve been the site of a (botched?) Grey abduction:

SCULLY: Mulder, why can’t you just accept the facts?

MULDER: Because there are no facts, Scully. What they’re telling you, what they’re going to report, they’re the opposite of the facts. A claim to ignorance of the facts. Claimed steadfastly, ignorance becomes as acceptable as the truth.

Second, the frightened Air Force air traffic controller and Scully talk in her apartment after he confesses to his role in downing a civilian airliner:

FRISH: You think I’ll be prosecuted?

SCULLY: For what?

FRISH: I gave the coordinates.

SCULLY: You didn’t bring that plane down, Louis.

FRISH: I lied. I misled a federal investigator, I misled you. A hundred and thirty-four people, Sgt Gonzales…they’re all dead.

SCULLY: It wasn’t your fault.

FRISH: But I’ll have to live with it. I watched that plane fall out of the sky. It was just a dot on the screen, just a…set of numbers. The wreckage… I can’t get that out of my mind. How those people died — how easy it is to lie, just to say it was a dot on the screen…until you see it.

One of the strongest running themes of the show was the ongoing betrayal of America’s veterans, not by ungrateful citizens (one of several dangerous reactionary myths of Vietnam), but by the government. The ‘super soldiers’ storyline of later years is sometimes derided for coming out of nowhere, but haunted and betrayed vets were all over the show from the beginning — and of course federal employees Mulder and Scully are cast out and trod upon by the government.

This goes to a point that I’m sure I’ve made several times already: Spotnitz said ‘Every episode is a mythology episode,’ and critics do well to take that claim seriously. The show’s parade of scarred and damaged veterans (Deputy Director Skinner among them; ‘Tempus Fugit’ followed on the heels of the Vietnam fable ‘Unrequited’) is a metaphor for the same culture-wide alienation, the same pervasive dissatisfaction with received narratives, the same distrust of ‘rational’ authority, the same horror of demythologization which animates the show’s other narrative threads. I write this only days before Donald Trump is inaugurated as President. In this dangerously fallen era, the authentically subversive message of The X-Files — Trust no one but one another, and while we’re at it fuck the US government — feels like a strong tonic, a genuine curative. Maybe Chris Carter believes ‘alien abductions’ really do involve grey-skinned extraterrestrial dwarves, but his show argues something deeper and more upsetting: we’ll never know (much of anything) for certain, and the systems of authority which supposedly protect us are mechanisms of control and subversion…so the only authentic life left is a visionary journey to outer/inner space. And to take that journey, to assume the mantle of holy fool, of seeker, is to abjure ordinary living and become in a sense ‘uncivilized.’ It is to resist colonization (of mind and spirit, of social order) by avatars of control.

Perhaps this sounds silly. No: this definitely sounds silly. Even the parts that sound sensible sound silly.


Aliens almost certainly aren’t real, there’s almost certainly no such thing as the ‘astral plane,’ and only a proper epistemological humility keeps us from dismissing these somewhat silly possibilities out of hand. But as Uncle Joe (Campbell) tried over and over to remind us, the meaning of all myth is the journey from suffering and self-deception toward authentic being-in-the-world. The X-Files was explicitly mythological, not just in the ‘mythology==backstory’ sense of today’s fan/critics, but in the way it recurred endlessly to ancient narratives of visionary transformation. Visionary experience is real, the transformations it generates are real, even if the content of the vision is culturally contingent fantasy (fiction).

That was one of the undercurrents of Couliano’s generous, far-ranging Out of This World, a comparative study of ‘otherworldly journeys’ in myth, fiction, and firsthand testimony. Couliano correctly hedged his bets about the sources of mythic content, but he was clear on the continuity of visionary narratives from Gilgamesh to Dante. Visions come from the same place as gods: the eternal desire to escape the ‘human condition’ (need, struggle, death). They’re imaginative tools for social/emotional problem-solving, generated under more or less conscious control. The desire remains the same, and mythic figures and structures have proven remarkably effective at addressing that desire. The genetic algorithm which sorts and selects narratives over millennia has produced our assortment of distinct but thematically and typologically related mythoi. And The X-Files, from the very first episode, was a documentary rendition of the darkest American dreams, which is why it’s both silly and serious, political and wigged-out, superstitious and skeptical.

That said, it’s also a mess.

Much of the time, I don’t think The X-Files holds up as drama; in terms of scene construction, narrative interconnection, and ‘mytharc’ construction, it now feels primitive — even inferior successor shows like Lost (cripplingly indebted to The X-Files) assumed a level of audience sophistication which Chris Carter and his writers, in that time after the VCR transformed film editing but before the DVD permanently changed expectations about information density, couldn’t yet assume. ‘Tempus Fugit’ is, I think, a good strong draught of X-Files weirdness, but it’s a clunky hour of television. And of course Chris Carter’s dialogue is simply embarrassing. Look again at the quoted exchanges above: Mulder’s ‘claimed steadfastly’ line sounds like a bad machine translation. In terms of screen craft, The X-Files remains impressive compared to its contemporaries, but it does feel like a prototype rather than a finished thing.

Yet it still strikes me as one of the only mature visions of our hallucinatory premillennium culture ever presented onscreen in America. The content of its myth was balderdash, but you and I aren’t stupid enough to take mythic content literally, are we? Leave that to the critic-dilettantes, the cultural-politics bloggers, the quick-take thinkpiece club. Even Freud knew the difference between manifest and latent dreamstuff.

The latent content of the dream/vision/hallucination called The X-Files is: the secret history of 20th-century America, a crime story in which every citizen is the victim.

I think of the end of Whedon/Goddard’s Cabin in the Woods — the Virgin and the Fool refusing to propitiate the gods who demand their suffering, refusing to trade their tiny lives for Life in the unseen abstract, and incidentally sharing a well-deserved joint while the bad guys’ base burns down — and perceive a subtle continuity with the endless deferrals and digressions of The X-Files’s evolving narrative…and with Chris Carter’s sweetly empathetic vision of a nationwide meshwork of loners and outcasts, scholars and kooks, dishevelled angels and prophets with honour. The hell with ending the story on their terms, right? Trust no happy ending. Trust no one but each other.

Not for nothing do Mulder and Scully look an awful lot like Men in Black.

Mulder’s moment with Max’s body in the hangar — that spasm of grief. I’m not convinced that Duchovny’s any sort of great actor, but that moment…

A lazy critic can find a way to say something about CSI. They shouldn’t — laziness is a mistake at best, keep it to yourself — but CSI demands nothing of you and gives nothing back, and the obvious criticisms, while insufficient, must nonetheless be delivered. It really is magical thinking in a box; it really does steal wisdom from its viewers. Calling out its emptiness is easy, but it’s a service.

We shouldn’t be lazy talking about The X-Files, I think. It’s up to something that can’t be understood without at least a little effort. Not a years-long project of Talmudic interpretation, no, and not the kind of fannish nitpicking that comes so easily to young poorly socialized obsessives. I’m just asking you to watch the show, if you’re watching, without recourse to the boring and banal and imagination-deadening interpretive frames which Cultural Critics deploy in order to score Experience Points in the Standard Discourse. Please consider the possibility that it wasn’t playing the usual game. Consider the possibility that entertainment isn’t the only goal of a TV show — even a monster-of-the-week anthology show about two crimefighting feds and their wacky ideas. I’m not saying it’s scripture, for God’s sake. I’m saying we can get more out of it by taking a long weird look inside.

4×18 Max

The second half of a two-parter — keep your expectations low.

It’s good that Chris Carter runs shows and tells his great big scary stories, but he shouldn’t be allowed to write scripts. His monologues are embarrassing, and his infodump ‘dialogue’ is artless, tedious masturbation.

That said, it was nice to see Max again.

The Third Man speech barely touches me because I’ve long assumed that half of Washington thinks in exactly those terms, and that’s all I’ve got to say about this flaccid hour of TV.

Briefly, on William Gibson’s cyberspace trilogy (plus Lovecraft, a bit).

Epistemic status: A hasty first draft.

Gibson insisted over and over in interviews, back when he was the hot new thing, that he wasn’t interested in details of technology as such, but rather the nature of human relationships to/within what we might call the technocapitalist machine: his ever-nearer-future world is one of routine surveillance, always-on reality TV, gated corporate computer networks, nation-states superseded by transnational corporations — all compelling in themselves — but Gibson never seems to’ve cared much about the way those technologies work in any terms but the social, the psychological. Which is why the retrospectively dippy cyber-voodoo magical metaphors(?) of Count Zero and Mona Lisa Overdrive perfectly fit the rest of the series: ‘magic’ in its many forms is another enabling/distorting neurosociocultural technology, viral-memetic biosoft, and what matters is what new modes of being-in-the-world it enables.

There’s a constant sense, in the ‘cyberspace’ books — Neuromancer and its two direct sequels — of vast terrible copresence, whether it’s the matrix or the ruthlessly violent megacorps or the AIs attaining sentience and looking to the stars. The end of the trilogy is a journey undertaken by a handful of dead ‘people’ to a faraway planet. You can’t have Gibson, in other words, without Lovecraft, who wasn’t what you’d call a ‘social novelist’ like Gibson, but who first crystallized the language of thermodynamic horror, rational inquiry as maddening vastation, which forms the backbone of 20C science fiction. Gibson’s matrix is a site of ecstasy for the deck-jockey Case, but he and everyone else ends up encountering it as an ocean of potentially fatal information, where looking the wrong way at the wrong ice (correlating the contents of the matrix?) can fry even the most prepared mind.

Lovecraft’s bleak ‘cosmicism’ has something of the convert’s didacticism — he was as touchy about his pedantically miserable atheism as he was about squid — but the more socially attuned Gibson seems to have been aiming at present-time cultural commentary. Both, I think, would claim to have been speaking to the modern condition in some sense, but Lovecraft’s cosmicism is opt-in, a kind of recreational moping for lapsed Christians, where Gibson spoke more pointedly to post-60s (hyper)urban dislocation. You can’t keep living a normal life once you gain knowledge of the Cthulhu Mythos, after all, but you can go on to live a normal (shitty) life in the shadow of the matrix, Maas Biotek, the invisible war amongst the yakuza. The triumphant final act of Mona Lisa Overdrive‘s digital deceased is to leave earth altogether, after all, and something new awaits them out there. Entropy is the extent of Lovecraft’s everywhere, though: the original vast, cool, unsympathetic intelligence is F=ma (or the senseless system it falls out of). Gibson, not even really an enthusiast of new technology, closes all three volumes with a nod to twisted romance, which Lovecraft had neither time nor feel for.

In the Sprawl, unlike Arkham, sentiment is permitted. You might say it’s mandatory, since real movement is impossible. The cyberspace books are stories about transgressors, after all, criminals and (at times banally familiar) noir antiheroes; only at the margins is even the illusion of freedom possible.

Which is why it’s not too big a strike against Gibson’s early books that their characters are weak, particularly the women. Think of each volume as a handful of storylines from The Wire, glimpses of an autonomous order (Gibson’s world remade by the matrix, David Simon’s ‘postmodern institutions as Greek gods’), and Gibson’s characters as sentimental stock figures caught up in social transformation — the at times literally cosmic feeling which Gibson and his work can’t escape. His characters wear neoplastic carapaces or safety-pin piercings, but these serve the same characterological functions as the proverbial grey flannel suit, showing the constraints under which even his protean transhumans operate; the constraints are the interest, I think, so cliché is a perfectly fine narrative strategy. Gibson’s well-meaning but clumsy deployment of Black Characters points up both the value and the limits of this approach, while his throwaway streetside visions suggest its power, the vividness of his dreamt-world…

In other words, don’t look to Gibson’s matrix for technological prognosis, rather for cultural diagnosis. (The idea of a fully rendered Internet, for instance, would never have occurred to a guy who didn’t write his novels on a manual typewriter.) Neuromancer and its sequels, like The X-Files, are a visionary encounter with How We Live Now, or rather how a man who identified himself as in some sense disappointed by the 60s lived in the early/mid-80s, and their central insight — ‘The future is already here, it’s just unevenly distributed’ — is one of the smartest things anyone’s thought or said about our world right now, the post-financial-apocalypse landscape of Obama/Trump and whatever loose affiliation they define between them. Gibson’s books remain essential guides to the human condition in a world defined by transnational flows of production and consumption, fluid information and identity, media supersaturation and the toxic fallout which precipitates from it as the heat rises. Like Gibson’s beloved Dhalgren, the cyberspace books see clearly a shared (im)possibility, an unwanted inheritance, and choose to speak of it in science-fictional terms; their subject is the cost and weight and ruined ecstasy of interpersonhood in what’s left of the world.

Whether they’re dystopian or utopian novels — whether any honest accounting of How We Live Now can help but be both — I leave as an exercise to the reader. To us.

STAR WARS: THE LAST JEDI (2017): first thoughts.

Spoilers abound, obviously. If you haven’t seen the film: well, you’re going to or you aren’t, it doesn’t matter what I say.

First thoughts, not last:

Episode VIII is the most ‘mythology’-heavy episode of the series, dealing more explicitly with the Skywalker legacy as legacy than any of the previous seven. The prequels were about Anakin Skywalker and the end of the Jedi, in a mix of mythic register and present-time biographical/psychological mode — the sort of ironic story a middle-aged artist tells about the idols of his early life (even those he himself created). But the sequels are (pre)occupied with ‘Star Wars’ as legendarium. They’re not ‘mythic’ at all, they’re about myth. A neat, characteristic moment: Rose the starship technician genuinely squees when she meets Finn — ‘a hero of the Resistance’ up close! — but once she realizes he’s trying to escape from the Resistance/Rebel ship, she doesn’t hesitate to tase him and turn him in. That’s the film’s relationship to the mythos in a nutshell.

The Last Jedi, like the considerably tighter but less resonant The Force Awakens, treats ‘Star Wars’ as something received, to be acknowledged and honoured and then tweaked. Both films are explicitly political in this regard: if the multiracial and multigenerational ensemble doesn’t feel at all casually constructed, that’s a function of the advancing age of its writers and directors, but their agenda is entirely progressive. Rian Johnson, much moreso than JJ Abrams, seems able to imagine a universe after the Skywalkers and the Solos — the difference between the liberal and, well, the rebel — and in Episode VIII he reveals again a gift for building on the old stories, looking past them, without anathematizing them.

The shocking death of Snoke is the smartest turn in a smart (but at times confused and overly busy) film: tasked with becoming ‘the next Darth Vader,’ Ben Solo does precisely what Vader tried to do in this film’s elemental template-story, The Empire Strikes Back — he kills his abusive surrogate father and reaches out to the powerful enemy he envies and perhaps even loves…who rejects him, beginning the process of his dissolution.

Boyega is great. Go watch Attack the Block, the kid’s a star.

Laura Dern is great. Go watch literally everything she’s done, she’s a national treasure.

Poe and Leia are well characterized, though it’s frustrating to have an actor with Oscar Isaac’s extraordinary charisma cooped up for the entire film; on the other hand, that frustration puts the audience in the character’s position, which nearly justifies the decision to ground Poe early. Leia, meanwhile, is utterly Leia, which is to say my heart leapt every time she appeared onscreen. If Carrie Fisher in her final days was no longer able to be as expressive as in the original films, she manages a weary grace that nicely suits the story.

(The young Fisher had genuine comic gifts to go with her princess-next-door beauty: timing, flexibility, and enough trust in her innate dignity to play the goof. She played comedy like a writer-actor, which of course she was. Johnson makes excellent use of footage from the original film, in unexpectedly moving tribute to Fisher. As Edelstein put it in his perceptive review: Fisher and Leia merged, in the end. This is a lovely swan song for both.)

Unfortunately, what goes on around Isaac and Fisher is silly. The chase bits are nonsense, and of course the overall plot premise — apolitical lunatics in Empire cosplay manage to destroy the entire galactic republic with a single gun, then reduce the Rebellion to a single shipful of goodies — is almost offensively stupid. It’s no coincidence, I think, that the worst of the film’s plotstuff is the sequences that get us from one iconic tableau to the next. The unusual power of these sequels is generated by the tension between the passing old world and the emerging new, but the actual mechanics of the First Order/Resistance material are deadly boring. Characteristically Abrams-y, you might say, though indebted to Galactica in the second and third acts.

I need to think more about Mark Hamill’s place in the film, and Luke’s place in the story. I’ll say this: Hamill does strong work, handles the comic material with a sparkle that made me wish he’d worked on camera more often over the years, and invests the dramatic pieces with real dignity. It’s so good to see him again. Hamill has a very different presence from Harrison Ford, less plastic in his physical bearing but nicely flexible in his voice work — not for nothing is Hamill a sought-after voice actor, best known for his decades-long recurring role as The Joker. (Hamill and Ford were a compelling odd-couple comic pairing in the original films; their prickly friendship is one of the series’s best features, its transformation in the third film one of its more complex emotional lines, while their ecstatic greeting in the Yavin hangar — ‘That was one in a million!’ — is peak Star Wars.)

What troubles me a bit is that, by his own account, Hamill disagreed with ‘every choice’ that Johnson made for Luke in his script. I think I see why; Kylo/Ben is supposed to present Luke with a once-in-a-lifetime problem, but because we haven’t seen Luke since Return of the Jedi, the character is effectively reduced, in the film audience’s eyes, to having fucked off to mope for several decades. This strikes me as unfair to Luke: after all, the final turn of Return of the Jedi sees Luke proudly declaring he’s one of the Jedi, ‘like my father before me’ — it’s in the film’s title for heaven’s sake! But for the sake of plot movement, Luke has to be a problem for Rey to solve, and —

Oh, Rey.

Rey remains a problem. The character’s much more defined here than in The Force Awakens, where she was a cipher, but Daisy Ridley’s natural charm can’t protect Rey from having to carry the idiot ball at times. The character’s core identity is odd: she’s a ‘nobody’ who’s stumbled into an ongoing Oedipal saga, and Just So Happens to be the most powerful creature in the galaxy. (We, the geeks, told you she was a Mary Sue, and even if Jedi Master Rilstone says she’s not, she damn well feels like one — regardless of whether many of the people banging on about this subject are sexist morons.) This is, I think, part of the political program of the film: Rey isn’t ‘old money’ in Force-user terms, no member of the old-boy Jedi network, so she has to hustle twice as hard to get where she’s going…except she doesn’t, not at all. Never having been taught word one about lightsaber fighting, she takes on three highly trained Knights of Ren (former Jedi trainees, I assume?) and comes away with a scratch on her shoulder. Never having actually tried the lifting-rocks thing that took the son of Vader weeks to learn, she lifts an avalanche by herself. Leaving aside the ‘worldbuilding’ implications, Rey’s fast-forward developmental stuff means Rey’s more like Harry Potter than Luke Skywalker, not so much ‘refusing the (Campbellian) call’ as waiting to press the Win button.

Rey’s relationship with Luke is boring. Luke should smile more. That’s a bigger deal than you might think.

I’ll come back to this movie. Why not? It’s less depressing than talking about Trump.


It seems to me that recession is one of the key features of the Viriconium cycle: the city is vivid, immersive, without ever actually being clear, and over the course of the series — particularly this maddening final ‘volume’ of short stories — it recedes entirely from view (like an eyeball drying out, or a chrysalis desiccating and collapsing onto itself) along with its citizens, its stories, any hint of clear meaning. What’s left, in ‘A Young Man’s Journey to Viriconium,’ is less than an echo; and yet the city is unbelievably rich and present even in its (or rather as an) absence. When Harrison describes a building sited in a valley ‘like a metaphor’ there’s a cruelty to it. Light seemed cruel as well, beyond the grey pitilessness which characterizes all four Viriconium books.

Empty gestures and fading memories characterize the city in this final chapter. ‘A Young Man’s Journey’ takes place in our England, more or less, and while it ends with the musical acclamation which has characterized the series throughout — ‘Viriconium!’ — the weird hollowness of it is blacker than irony. But Nights is a coda; the whole series is a coda. The other short stories, especially ‘Strange Great Sins’ and ‘The Luck in the Head,’ depict a world in its Evening, reduced to reminiscence and meaningless recapitulation. I realize now that to call the mad goings-on ‘surreal’ is to dismiss them, to consign them to the aesthetic: this is a careful rendering of ugly nonsense, which after all sounds a lot like our world in our moment (or Thatcher’s, yes?).

The language of Nights varies, though it never returns (I’m glad, or relieved) to the terrible¹ dense static of A Storm of Wings. After the deliberate slow movement of the two middle volumes, night comes, rest, the fog seems to recede — but there’s nothing left to see, or rather much to see and not to understand. The clocks have run down and the creative urge is gone.

It’s hard to talk about Viriconium. No: it’s easy but futile, like talking about entropy. The concept defeats you. Digging into Viriconium is like laboriously decrypting a piece of bad news. By the end it doesn’t promise anything; in the Evening even teasing is heartless.

I realized only now that I hadn’t thought of the Afternoon cultures since at least A Storm of Wings, maybe before. Harrison deals from the same deck as before — insects, horse heads, deranged artists, lightsabers, dwarves — but face-down, now. You hardly remember that any of it ever meant anything. Maybe there never was a fucking Viriconium.

I loved these stories (this story). I’m not sure I liked it in the end, though I’m sure Harrison doesn’t care; it filled me with an intense and unidentifiable emotion.


and and and

I wrote that in early July, and I’m surprised at its negativity, or no, at its anger. I suppose I was angry that Viriconium had finally been taken away, though that taking was the work of the entire series, which seems to me altogether to be one of the great works of the imagination — or rather its imagination seems greater creeping up on me/mine than, oh fuck it. I adored the book and it angered me. I can’t decide whether Harrison loves or even likes Viriconium; he must, mustn’t he? but you wouldn’t know the way he lets it go. I resent his pitilessness as I’m not convinced it’s necessary, though maybe if I knew who/what/when was the butt of the joke and maybe if I also disliked him/it/then — well —

I catch myself wanting things from Viriconium that it was built (I imagine) precisely to refuse, and so catching, I get angry at the dwarf, the city, the insect, myself, and Mr Michael John Harrison, though not in that order. Me first. What a world, a world-city unlike any other, as they say in the ad-copy biz. Viriconium!

  1. ‘Terrible’ like ‘inspiring a kind of all-consuming existential dread,’ not ‘bad.’ Prosewise, Harrison is a living god. Also a darkdreaming fucker. 

Men, man.

Attention conservation notice: Drafty outboard note-taking, of neither use nor interest to other humans, unless you wanna laugh at some dweebs.

The phrase ‘everyday carry’ has apparently come to mean ‘things you buy to pretend to be a real man, y’know, like your grandpa,’ which is a sad thing — when I first heard the phrase it just meant ‘a useful all-purpose knife,’ and the guys using it weren’t styleboy wankers. Here’s the founder of the site everydaycarry.com guest-posting at a site called, I shit you not, ‘The Art of Manliness’:

At the most literal level, your everyday carry is the collection of items you carry with you in your pockets or in your bag on a daily basis.

You don’t say!

Like the ‘hipster PDA’ i.e. notecards held together with a binder clip, the ‘everyday carry’ kit (penlight, keyring, knife/multitool, wallet, watch, and of course your expensive smartphone — y’know, ‘what’s in your pocket’) is a dumb affectation; unlike the hipster PDA, it’s also a moneymaking opportunity for the kind of guys who carry moustache wax and wear $200 watches to their coffeeshop jobs. The ‘hipster PDA’ was mostly a moneymaking opportunity for Merlin Mann of 43folders.com, but only for about ten seconds.

Which reminds me, as so many things do because I’m wired wrong, of Susan Faludi, whose still-excellent book Stiffed came out around the same time as Fight Club and leveled a related critique of contemporary ‘ornamental masculinity,’ though her hangups are different from Palahniuk’s thank Christ. Faludi holds up the WWII-era G.I. (hey when was your grandfather born again?) as a lost ideal of manliness: stout of heart, simple of tongue, off liberating Auschwitz one day and back to work at the high-rise the next. In her telling as I remember it, a toxic stew of advertising dollars, economic disempowerment, the collapse of ancient social mores, rapid heedless postwar technologization, and good ol’ fashioned late-patriarchy led to the replacement of manliness as community service by, well, The Art of Manliness.

(The Sopranos tells a particularly nasty, ironic version of this story.)

I look at the EDC fetishists and see guys playing dressup. Which is fine, I’ve got nothing against dressup. But you have to acknowledge what you’re doing — and you ought to think a moment about why.

The EDC club use the word ‘functionality’ when they mean ‘style’ which means, basically, game over.

Game of Thrones and ‘narrative economy.’

Game of Thrones in its seventh season has become a different show, set in an entirely different world: instead of the densely populated, richly imagined world of the first few years, or even the rapidly collapsing stage set of seasons 5 and 6, the show now takes place in a purely abstract space unmoored from anything like actual geography. This makes for more efficient ‘narrative economy,’ but the transition from dramatic to almost comedic abstraction comes at enormous cost — to believability, obviously, and (worse) to the books’ delicately balanced historical consciousness.

What I’ve always loved best about Martin’s books is the sense that the history of Robert’s Rebellion is playing out across a second generation twenty years later; history is present for the book’s characters at just the right scale, if that makes sense, with just the right weight. The Starks and Lannisters relate to the past as people do, rather than as Player Characters, and the generational struggle which drives the various court intrigues is simply correct. (‘Realpolitik Tolkien,’ as they say.) The TV show has never given me that feeling, mainly because its world is so much smaller than the books’. The Citadel is three rooms onscreen, but Martin can situate it in a complex ecology. The same for, say, the Starks’ relationship to the ‘smallfolk,’ who don’t appear in the show because that kind of ‘worldbuilding’ (the kind that matters) means hiring even more extras. The fifth book in the series ends with Kevan Lannister grandly murdered by Varys — a complexly motivated chess move which the TV series is too coarsely plotted to accommodate. (Kevan Lannister barely appears in the show, as does his entire stratum of ‘second-string’ players in the titular Game.)

Season 7 has so totally collapsed the physical and temporal scale of the story that the connection to Robert’s Rebellion, say, has been lost altogether. This doesn’t seem to bother the audience, whose numbers swell further as the show abandons Martin’s sense of seriousness or purpose. But it bothers me.

All of which is maybe just to say: I told you years ago that the show would lose its brain when Benioff and Weiss passed Martin’s books and had to go it alone.

Tonight’s episode was funny and ‘heartwarming,’ and oh yes, patently absurd. It was, at several points, a literal parody of itself. Most disappointing.

Epistemic status; attention conservation warning.

Reading Scott Alexander’s Slate Star Codex (one of the best blogs out there, no question), I’m reminded of a feature of his blog that I wish were more widely adopted: the epistemic status note at the top of a post.

A recent example:

Epistemic status: idea for one’s toolbox of ideas; not to be followed off a cliff


Epistemic status: So, so speculative. Don’t take any of this seriously until it’s replicated and endorsed by other people.

You might think this is humourlessness, or the author assuming his readers’ humourlessness or poor reading comprehension, and some idiot is probably getting ready to use the phrase ‘Swiftian satire’; please don’t. What Scott is doing is suggesting one or more reading frames for his readers, in order to shape both their approach to the posts and the discussions that follow. But crucially, this isn’t about content — it’s just additional information about how strongly certain claims are intended to be taken.

This is the most important thing, I think, and the strongest indication that Scott’s site is ‘grownup’ in a way most modern-USA ‘intellectual’ discourse simply isn’t: he assumes that the point of his writing is to generate and contribute to robust adult communication, and avails himself of the right tools for that job. Moreover, he does not assume that his readers will agree with him (and they often don’t) — only that they’re willing to read in good faith and assume that he’s writing in the same spirit.

This isn’t quite the same as a content note: if you look at, say, shakesville.com, the ubiquitous content notes often (usually?) function as neutral guides to topics under discussion, but surprisingly often serve as editorial prefaces, e.g. a (hypothetical) discussion of rates of gender-detransition might be framed with a ‘transphobia’ content note. The purpose of such notes isn’t to increase reader flexibility, and they don’t assume readers’ good faith — they’re there in part to shape the readers’ attitude toward the content itself. They aren’t just warnings to stay away, of course: most readers will read the posts regardless of the content notes. For those readers, the content notes are just guides to reception posture at the level of content.

Scott’s ‘epistemic status’ warnings guard against unproductive forms of argument but are agnostic as to reader perspectives; Shakesville’s Melissa McEwan’s, I’d argue, militate subtly against specific perspectives. Both are intended inclusively, I think, but my sense is that they don’t both function that way, at least not to the same degree.

The great and knowledgeable Cosma Shalizi includes ‘attention conservation notices’ atop his long posts, which are somewhat more complicated (or at any rate pretentious) than normal content notes/trigger warnings.

In theory, credentials serve as persistent epistemic status warnings: ‘I have a PhD in area XYZ, so I can be expected to know A, B, and C.’ But life is complicated and dumb.

But again: why would you take my word for any of this?