wax banks

second-best since Cantor

Category: writing

Attention. Immersion.

Epistemic status: Unwieldy articulation of what I take to be a commonplace.

The economy of attention is zero-sum or indeed negative-sum: if you’re paying attention to me you can’t also pay attention to your work. Attention is a scarce resource, and easily damaged, which is why it commands such high prices. Moreover, it’s now widely understood that there are ‘transaction costs’ when moving attention around, so that looking at a single article for twelve seconds has infinitely more value than looking at six articles for two seconds apiece. The myth of ‘continuous partial attention’ refers to specific circumstances requiring only low-yield passive monitoring — say, checking on the stove to see if the pasta’s done (yes or no).
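The switching-cost arithmetic is easy to make concrete. A toy model (mine, not anything measured; the three-second 'transaction cost' is an invented number):

```python
# Toy model of attention and its transaction costs.
# The switch_cost value is invented for illustration, not measured.

def attention_value(spans, switch_cost=3.0):
    """Total useful attention from a list of reading spans (in seconds).

    Every span pays a fixed re-orientation cost before any value
    accrues, so short fragments can net nothing at all.
    """
    return sum(max(0.0, span - switch_cost) for span in spans)

one_article = attention_value([12.0])      # 9.0 useful seconds
six_articles = attention_value([2.0] * 6)  # 0.0 -- each fragment is all overhead
```

Twelve seconds in one place beats six two-second glances not by a little but categorically: the fragments never clear their own overhead.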

The economy of immersion, so to speak, is positive-sum: deep immersion in one activity generates not only a sense of fulfillment but a supply of usable energy which can be turned to other activity: more life, as the blessing goes. Sustained immersive activity (writing, biking, sex, cooking) not only generates important negative feedback — pulling you back to the activity itself — but builds excess capacity. A daylong hike can begin to restore fragmented attention, a fifteen-minute freewrite realigns your internal verbal mechanism, good sex this morning will leave you with naughty thoughts all day which seem to enliven as much as (or more than) they distract; in each case, the energetic/attentional output has a long wavelength, a gentle contour, so that you might not notice how much it has reduced the effect of local (mental) noise. But three or four such waves will effectively drown out high-frequency cognitive bother.

Immersion has a tidal or oceanic character. There’s a reason we talk about ‘flow’ states, ‘waves’ of calm, etc. Peaceful vs panicked breaths. This is obvious.

Sane people know that fifteen minutes of exercise will give you an hour of deeper creative productivity — i.e. ‘I don’t have time’ is straightforwardly false for nearly all cases. The same goes for any joyful (≠ pleasurable in many cases) immersive activity.

Immersion is generative, tourism is usually costly. Ask your Spanish teacher.


morning morning morning morning morning

Epistemological status: Nonsense.

freewrite to start the day. can’t be bothered with proper capitalization and punctuation. ok cheating: i’m allowed to delete a word or sentence.

science fiction is afflicted, not surprisingly, by the same disease as ‘the humanities’ in academia: pathologically lazy metaphors deployed by writers pig-ignorant of even basic math and science. sokal and bricmont had blades out for the french critique-of-power dweebs years ago. i think this is why ‘speculative fiction’ has become the label of choice: science is hard, scoring political points is easy. coming-of-age ‘genre’ stories are (comparatively) easy. partly this is a specific instance of the ‘ignorant people can’t write good literature’ complaint, but it goes deeper: SF claimed its role as the essential late-20C literature not least because great SF writers could imagine and translate and articulate complex concepts in terms other than the popular — they could talk about their time in a language that wasn’t simply of their time, if that makes sense. Tolkien the same: estrangement at the level of language yes but also conceptually, in terms of worldview. ‘heroism’ meaning something fundamentally different to Tolkien than to modern readers. i think of Ancillary Justice, which disappointed me last year, and its too-familiar handling of ‘identity’ and ‘gender.’ it needed more philosophy, more science, more alienness. ursula le guin could have worked wonders with that material.

SF’s aliens are most interesting as alien modes of thought — but writers bound to the present, to fashion, have a hard time generating that generative alienation. ‘the present’ is a metaphor-field. think too of Deadwood and its astonishing imagined language, the way David Milch’s multiply inverted verses could represent streams of self-modifying consciousness. think of Westworld’s hosts, the depth with which that story’s writers explored specific theories of consciousness in technical language. compare those great achievements to the embarrassingly shitty ‘worldbuilding’ in Ready Player One, barely qualifying as an act of the imagination: naked contemporary wish-fulfillment without a moment’s thought for a world beyond our own. think of clarke’s Ramans, who ‘do everything in threes’ for reasons that remained inscrutable even to clarke himself (the haunting closing line was added as an afterthought), or of Roy Batty storming across the rooftops of LA after rick deckard, or of the thousand and one meanings which attach to pynchon’s Rocket. (this is one reason pynchon is our best writer: he sees his conceptual material through. allows it to flower.)

if Robert Anton Wilson’s schtick has value, it’s his combination of at times intense alienation and attraction: sex for its own sake, puns for their own sake, and then a grinding assault on pious certainty. of course RAW was a great dilettante, he was just smart and fun enough to get away with it.

china miéville deep in his political theory to write books full of SF/fantasy political theory. and then how thin his stuff gets when he’s talking on memes and squids in Kraken. i liked what i could be bothered to read of it, but Admirably Strange Images Embodying Concepts Familiar Even to Neil Gaiman’s Readers doesn’t get my dollar.

michael swanwick. john crowley. delany, man.

don’t bother writing science fiction (or criticism) unless you care about the systems that your metaphors are drawing on. please, please, please. the details are the form. it’s all details.

(Deadwood is in part a story about magic and John from Cincinnati is its direct sequel, but i’ll tell you about that some other time.)

this is why you shouldn’t post your freewrites, folks.

Irreal Life Top Ten, September 2017.

Note: These posts have nothing to do with the Greil Marcus columns to which the title refers; nor is there anything particularly ‘irreal’ about all this, not by design anyway. This go-round, at least, it’s just a collection of short things glued together into a longer thing. I gave no thought to what I was going to write until I’d begun typing, and none after I’d finished the first draft of each paragraph. This post is a mess. But so’s everybody else and so are you, or you wouldn’t be reading this. On we go. –wa.

  1. The Genius in the Writers’ Room: Every great TV show needs one, where by ‘genius’ I mean the caretaker of a coherent (read: generative) vision which backstops creative arguments and serves as a conceptual/thematic/imagistic home to return to. Buffy had one and arguably several; for a while Lost had a couple (but crucially not the showrunners); Game of Thrones started out with a whopper, GRRM and his vision for ASOIAF, but now obviously has none; The Sopranos had at least two after Matt Weiner joined up; The Gilmore Girls, which I can’t stand, obviously had one; Seinfeld had two, Arrested Development maybe more; peak Simpsons is said to’ve had a handful. Fawlty Towers and The Office obviously had theirs (the UK system has long been built around individual/paired writers, which isn’t always a strength), and even the American Office glowed for a moment. Mad Men and Deadwood are clear examples of one visionary master guiding an expertly assembled workshop, as is The Wire. The GITWR keeps the story from taking obvious or easy turns; she intuitively connects storyworld elements because her innerworld is so connected. This isn’t just a matter of craft — Chris Carter’s a miserable scriptwriter but was unquestionably The X-Files’s GITWR, like the equally hamfisted George Lucas — rather a reflection of a holistic conception, an ability to serve the whole story at once. In music, think David Byrne, Trey Anastasio, Peter Gabriel: the one to whom the low-energy method never even occurs as a possibility, who holds the door open for everyone else in the Room to work at a level above themselves.

  2. Guardians of the Galaxy 2: The trouble with Marvel’s ‘cosmic’ movies is that they seem to think ‘cosmic’ means ‘great big,’ which is incorrect. ‘Cosmic’ is (should be) the opposite not of ‘microscopic’ but of ‘myopic,’ and that’s why GotG2’s lack of daring was such a bummer. Not to link numbered items like some kind of hippie, but commercial formula and creative vision tend to end up in tension, and with Marvel, the formula has so far tended to win decisively.

  3. Peak Phish: I know I know, you just don’t care about Phish and you wish tasteless myopic Phish fans would stop going on about them. OK then lemme put it this way. Phish formed in 1983 and hit their creative peak in 1993-99, and if they were a normal band the story would end there. But since 2013, defying nearly every rock/pop precedent, they’ve been doing work that in some ways equals — and in some ways surpasses — their glory years. Consider their 2013 experimental album premiere; the Halloween 2014 theatrical production; Trey’s 2015 woodshedding, Dead guest gig, and triumphant return to a band inspired to mid-90s-level improvisation; and of course the 2017 ‘Baker’s Dozen,’ thirteen shows without a single repeated song featuring their most consistently successful experimental improvisation in nearly two decades. They can’t do what they used to, which is OK — no one ever has. (I mean that literally.) But as they enter their mid-50s in a band that formed nearly 35 years ago, no other band in America can do what they’re doing right now. For weeks I’ve been trying to think of other popular musicians their age taking such risks, and am growing a little worried, because names like ‘Miles Davis’ keep coming to mind. And that’s just ridiculous. Right?

  4. John Wick 2: I know I know, you’ve heard the first film is a ‘cult classic’ and an ‘expressionist noir-action masterpiece’ and blah blah blah, but John Wick 2 is 70% unbearably dumb unfunny bullshit, and 30% witty balletic film art. Wait no, make that 85/15 with error bars pointing the wrong way. The risk the Wick flicks take is in depicting unrealistic (indeed superhuman) mastery in realistic-ish detail — John/Achilles is always reloading his guns (because ‘realism’) but he never ever misses (because ‘hero’)…which is an iiiinteresting, thoroughly modern approach. And the photography’s nice. But the vaunted ‘mythology’ is the wrong kind of stupid, the dialogue is always tedious (I did laugh twice, but at gunfire), and Keanu Reeves’s weary beauty is poorly served by his dirgelike line readings. I liked looking at the film, sometimes, but so what? I like looking at Chungking Express too, and it made me want to say things other than ‘Cool!’ How old-fashioned of me.

  5. Art as self-advertisement: It should be its own best reason for being, right? Beauty is enough, wisdom and wit are enough. But last year’s film Kong: Skull Island is all witless exposition and witless ‘character work’ until the first ape attack; then more witlessness, more ‘character-building,’ until the next big animal thing, and so on. John C. Reilly, some ‘jokes,’ then some computer graphics. Samuel L. Jackson giving a speech; computer graphics. The film has no personality whatsoever. Why not? Did no one with even a trace of wit or creativity touch the script? Did the director not realize how many strong comic actors he’d been given to work with? Even the usually effervescent Tom Hiddleston shows not a spark of life here, and I wonder: did someone, at some point, watch the dailies or just read the script and point out that this was a waste of time? The scenes not shown in the trailer may as well not be in the film, and hundreds of people worked extremely hard to make this movie. Not ‘but’ or ‘yet,’ just…’and.’ Aaah, Hollywood.

  6. Clarity and correctness: I used to tell students — excuse me, to pronounce self-importantly at students — that all edits are for clarity, the point being that you need first/most of all to know what the hell you’re trying to do, which will generate corrective impulses as you edit; ‘prettier’ and ‘more intense’ and ‘more exciting’ are side effects of ‘clearer.’ If the music is clear in your head then you’ll know right away which notes on the page don’t work, and part of the craft is learning to hear those infelicities as directional, i.e. indicating at least one-dimensionally how a wrong note’s wrong. It seems to me most bad writing’s bad because of a mismatch between intention and attention, e.g. you (white Pundit) don’t want to share cultural privilege w/economically ascendant blacks/Latinos but also don’t want to be called racist so you instead write garbled nonsense about e.g. something called ‘black-on-black crime’ or go on about e.g. the nobility of racist historical figures, netting a plum job at the NYT opinion page. If you’d done your reading and had principles and written what you actually thought, you’d have produced a coherent and testable argument. Instead you produced an anxious one. The reason mainstream cultural/political pundits are bad is that they don’t (generally can’t) say what they think and mean. This is part of what Angela Nagle’s talking about in Kill All Normies: saying what you feel liberates certain energies which are, for a variety of reasons, unavailable to ‘respectable’ figures, which is why it’s taken so long for MSM pundits to know what the hell’s going on with Trump’s supporters.

  7. The First World War: George RR Martin says you should read about WWI rather than WWII; the latter has clear heroes and villains and a strong narrative arc, meaning it’s a freak occurrence in military history, while the former is a more conventional ‘bastards with armies force boys to murder each other in the mud’-style conflict, with an appropriately disastrous end that made a sequel inevitable. I’ve just read Norman Stone’s World War One: A Short History, 200 pages of witty insight from a British historian angrily dismissive of the rampant stupidity which it was his job to describe, and now I’m desperate to dig deeper into the subject — starting with Ludendorff himself, who presided over the collapse of the German military in 1918 and first spread the ‘stabbed in the back’ calumny which Hitler (whom Ludendorff legitimized!) and his angry mongrels turned into a cultural/political organizing principle. The Great War really was in a sense the death-spasm of an entire civilizational project, the beginning of a long-delayed reckoning with Europe’s changing role in the changing world, which (reckoning) wouldn’t end until August 1945’s two ultimate expressions of mechanistic modernity in the sky over Japan. As is usually the case, getting a strong dose of historical detail has reminded me that today is not 1914, nor 1933 — and reminded me, too, as Angela Merkel likely coasts to another term as leader of Europe’s dominant economic power, how much our historical moment owes to the decisions made during that decades-long crisis of modernity.

  8. An analogy: politics : identity politics :: political party : personality cult

  9. …by which I mean: David Runciman’s superb Talking Politics podcast recently did a ‘the year ahead’ episode, in which Runciman and his boon companion Helen Thompson expressed frustration with Emmanuel Macron’s almost fraudulent use of the electoral process to advance a kind of glorified personality cult (this is my gloss; as good Englishmen they were appropriately measured in their assessment). It occurred to me that Trump had, of course, run the same kind of campaign, with similarly disappointing results for his supporters, who’ve gotten nothing of substance from his administration. And I immediately thought of Mark Zuckerberg, the vicious resentful little dilettante who’s done more than any living person to convince otherwise sane humans that ‘social networks’ have something to do with actual healthy social relations. I can’t imagine Zuckerberg wanting anything to do with an established political party — they’re too messy, too compromised and compromising, too grounded in actual human-speed social processes to appeal to the millennial par excellence. Like Trump, Zuckerberg has given no indication whatsoever that he sees his cultural/economic position as entailing any responsibility; what I take to be his self-conception, his appraisal of his own ‘visionary’ talent (what rubbish), leaves no room for the political collective. Which is why Facebook has accelerated the gutting of coalition politics in the name of identity politics, at terrifying cost to representative democracy (a system whose innate conservatism mitigates its innate potential for radical individualism). Runciman suspects that Macron’s failure, when it comes, will come because he has no party, only a ‘movement’; notes that social movements are very easy to get going; and imagines Macron and Co. will be overcome in time by other, better organized, more sustainable social movements, Left or (let’s hope not) Right.

  10. …by which I MAYBE (but on the other hand probably don’t) (but) (but) really mean: Sarah Palin, the grifter whose sole political platform was ‘I feel aggrieved,’ was the real winner of the 2008 election.

Pronoun schema.

In my writing I use female generic pronouns by default, freely switch when I get bored, sometimes switch to male pronouns when I’m talking about some characteristically male idiocy, and am especially careful to refer to Dungeon Masters as female and RPG players as male — when the latter distinction is relevant, which it fucking always is.

How and what should you read?

Someone asked the other day whether the things I read bear directly on the writing I do.

I said somethingsomethingsomething but what I meant was:

You can’t plan knowledge

Learning is association-making, connection, but those connections are capricious (cf. those sexually aroused by feet, those who think they saw the Virgin Mary at Fatima, those who can play twelve games of high-level chess simultaneously without actually loving chess). Human brains aren’t purposefully wired, they’re grown; instead of plans they develop according to tendencies. The phrase ‘perfectly reasonable deviations from the beaten track’ might come to mind here if you’re me.

You can consume information according to a plan. I wanted to know about the influence of Charles Fort on midcentury pulps and comix; I read Kripal’s Mutants & Mystics. I wanted to know what Jacques Vallée actually argued in Passport to Magonia; I read it, simple. But it’s silly and self-defeating to start out wondering what you’re going to do with that information. You can’t know, and in any case the action-arrow points the other direction: as it transforms interpenetratively into knowledge, the reading does something with you.

I mean that almost literally. We can only consciously control our learning with gross imprecision, which is why cramming for tests is a terrible idea (too much too late). You learn in a trickle or a rush, but crucially you don’t decide which, and it’s best to think of learning practice and knowledge-formation (not ‘-acquisition’) as distinct and almost disjoint practices. The making of your mind can go on without you. Good thing, too: it’s what ‘you’ are made of.

Point being, you can control the inputs to the psychotropic process (the books you read, the drugs you take, your adherence to or rejection of the diurnal cycle) but you can’t control the emergent coral-reef forms which knowledge takes in the mind/brain. And this is good, because while you are a sadly limited person living in a sadly limited world, the self-modifying bioelectrical system which epiphenomenally generates ‘you’ is a good deal less neurotic and scared.

And so you should read whatever you’re passionate about, because

  • passion intensifies and accelerates this mindmaking process, while
  • boredom kills it, and since
  • you can’t control whom you turn into,
  • your best bet for generating a robust mind-body ecology is richly varying inputs

Which brings us to the secret central question of all blogposts,

What does this have to do with my D&D campaign?

But the only reason anyone asks this question is that he hasn’t yet internalized the great paradox of our everything-bad-on-demand-everywhere time, which is that

Fantasy isn’t a genre, it’s an activity

If you get that fantasy is something you do (creation connection narrativizing spatializing eroticizing etc.) and not a set of genre markers (elves sorcery talkingswords) then you already know what all this has to do with your D&D campaign — the more and better you know, the more deeply and widely you experience, the richer your fantastic imagination.

False Patrick occasionally looks for D&Dables in James Scott or Geoffrey of Monmouth with superb results — you can see why G. of M. would be a good RPG source, but James Seeing Like a State Scott? Well, read the post. I picked up Barbara Tuchman’s A Distant Mirror having heard it described as the book that birthed not only Game of Thrones but a generation of medievalists (who later went on to disavow it as decidedly non-scholarly history), but in the end I experienced it as a kind of hellish postapocalyptic dystopia, the apocalypse in question being the bubonic plague. That, in turn, put me onto William McNeill’s Plagues and Peoples, a brilliant short book which argues for an advanced understanding of humans as coexisting in complexly evolving predator/prey relationships with, say, syphilis (or bubonic plague, or HIV). That was immensely clarifying as history, but it doubled for me as a kind of SFnal primer on both ‘deep time’ and dystopic transhuman history — a deceptively matter-of-fact story about the place of the human species at the center of a slowly tightening ecological net.

Not long after I finished Plagues and Peoples I picked up Jeff VanderMeer’s Annihilation, the first third of his Southern Reach trilogy, which is a kind of Rendezvous with Rama/Lost/Lovecraft mashup with mushrooms swapped in for tentacles. I liked it, but it was twice the book it otherwise would’ve been, and ten times the dream-fodder, for the way it echoed and weirded-up McNeill’s book.

Come to that, there’s no reason Lovecraft’s ‘cosmicist’ vision requires tentacles in the first place — the creepiest thing about ‘The Call of Cthulhu’ is the bat-winged things in the swamp, and frankly the Cthulhu statue itself only creeped me out to the extent that it recalled the statue of Mbwun from Lincoln/Child’s Relic, which I read in middle school because I’d heard that ‘If you liked Jurassic Park’ and of course I did, but then I only picked up Jurassic Park because there was an article about it in a science newsletter we read in our Earth Science class, and if we’re in honest-confession mode then the fact that my godfather went to MIT (Course 2, class of 1924) made me wanna attend that school slightly less than the fact that Michael Crichton had spent a year as Writer-in-Residence there…

See?

Evolutionary weirdness

The least interesting thing about fantasy is its content. (Have you ever had to listen to someone else tell you about last night’s ‘amazing’ or ‘hilarious’ dream? Soporific stuff.) What makes fantasy fantastic is its visionary quality, the way it animates primal urges and throws light on hidden mental corners. Worthwhile art is deeply personal: the work of a strong ego seeking out egolessness. The best stuff is necessarily at least a little inaccessible, mysterious, resistant to analysis, however welcoming its formal presentation; great art always proceeds according to an intuitive logic that’s inexpressible in rational terms. And because it speaks to a unified (continuous, cohesive if not logically coherent) vision, it could only have been made by the person or people who made it.1 Good, in other words, is always strange.

But ‘strange’ is the last thing central planners want to deal with — cf. the aforementioned Seeing Like a State. The inescapable, essential blind spot of the central-planning ethos is this:

Orderly processes do not necessarily produce orderly results.

Indeed the one’s got little or nothing to do with the other except by chance.
Working artists get this, hence the irritation/frustration/disappointment writers evidently all feel when asked where their ideas come from. Critics, meanwhile, tend not to understand this — if the disjunction between aesthetic means and ends were widely understood, entire schools of criticism woulda been strangled in the crib. I think of the weird mismatch between Joyce’s literary dreamworlds and his pedantic fan-critics, and (because I’m me, and have written the books I’ve written) of the way Phish’s most hyperrational practice exercises have generated their wildest improvisations while their most deeply structured longform improv has come at moments of surpassing looseness and intuitive responsiveness. (The same goes for other rational/ludic/dreaming improvisatory scholar-artists — think of Johns Zorn and Coltrane.)

I want to have The Right Information at my fingertips when I write, but I also want to experience and share strange knowledge, a Weird innerworld which only I can see but which through my craft I can make knowable to others. And I aim to build deep written structures through intuitive improvisatory methods — so that, for instance, the structure of my 33-1/3 book mirrors the structure of the album it discusses, and the fractal form of my Allworlds Catalogue embodies/allegorizes the Big Themes it bangs on about, etc., though neither of those formal arrangements was arrived at with its pretentious-sounding purpose consciously in mind.

And I find that the best way to achieve these tight-loose performances, this particular pleasing-to-me dreamlike relationship between form and content and private experience, is to immerse myself in material and see what forms spontaneously appear.

We forget that evolution isn’t just a winnowing process of natural selection — it’s punctuated and catalyzed by far-from-equilibrium self-organization, which can altogether shift the topology on which the selection process works, ‘skipping tracks’ in terms of descent. This is biological innovation, and its absence from the standard schoolhouse evolutionary narrative is just one more expression of (and reinforcing element in) a dangerous, thoughtless cultural conservatism, a pseudosci retelling of the myth of heavenly bureaucracy. Evolution isn’t a one-way road running straight, it’s a network of migrations through an ever-shifting topology toward no particular destination — the endless fitness gradient scarred with switchbacks, channels, deep caves, inscrutable truths spelled out in the bones of lost travelers…

Back to the start

‘No one can see beyond a choice they don’t understand,’ said the Oracle in The Matrix: Revolutions. Put another way: you’re trying to get from one stable equilibrium (not exercising, say) to another (being in the habit of exercising daily) but between them is a hill down which you can backslide all too easily (the climb: forcing yourself to exercise daily for a few weeks until the habit has formed). The zone of extreme flux — of frustration, worry, pain, seemingly endless struggle — of uncertainty — between equilibria is a hard place to be if you can’t handle uncertainty. If you need to know the outcome before you begin the process, you’ll never do anything new. Everything truly new is a risk.
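The two-basins-with-a-hill picture can even be run as a toy simulation: a double-well 'habit landscape' with stable points at x = -1 (no habit) and x = +1 (daily habit). Every constant here is invented for illustration, none of it is measured:

```python
# A cartoon of habit formation as motion on a double-well landscape
# V(x) = (x^2 - 1)^2, with stable equilibria at x = -1 (no habit)
# and x = +1 (habit formed). All constants are invented.

def settle(x, effort_steps, push=1.6, dt=0.1, total_steps=2000):
    """Apply a constant uphill 'effort' for effort_steps, then let go."""
    for step in range(total_steps):
        slope = 4 * x * (x * x - 1)              # V'(x)
        force = push if step < effort_steps else 0.0
        x += dt * (-slope + force)               # roll downhill, plus effort
    return x

gave_up_early = settle(-1.0, effort_steps=10)    # backslides to about -1
pushed_through = settle(-1.0, effort_steps=200)  # settles near +1 and stays
```

Quit too soon and the landscape pulls you back; push a little past the crest and the same landscape does the rest of the work, which is the whole argument for tolerating the uncertain middle.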

So how and what should you read?

My sincere answer:

Keep reading until you figure it out.


  1. Reasoning through the ethical implications of this paragraph for the art-consumer and the DIY creator is left as an exercise for the reader. 

The Goodreads problem synopsized.

You must have a sense of how people respond to your work, but you mustn’t fixate on any one response — learning to manage variation in tastes is an important skill for anyone doing creative work.

It’s harder than ever to escape people’s responses to your writing; to ‘be online’ (to live online) is to be constantly, destructively aware of the ultimately irrelevant. Yet you should never get drawn into a lengthy exchange with a reviewer of your work, paid or volunteer, except to clarify errors of fact.

There is no good solution, other (I suppose) than doing good enough work that you can confidently ignore reviews altogether.

The waX-Files.

Reminder: if you like this stuff, you will likely like these posts, on The X-Files. The perspective is, shall we say, eliptonic-appreciative, and the attitude toward existing popular coverage of the show is (shall we say?) largely contemptuous. They aren’t ‘recaps,’ sorry, just responses, each pitched in whatever register made nonsense at the time.

I didn’t grow up watching The X-Files, which went off the air while I was in college or grad school. I’ve now seen most of the first four years, and consider it both good and (both historically and potentially) important. The incoherence of the ‘Mythology’ doesn’t bother me, and shouldn’t bother you; caveat conspirator.

Stories are made of time and change, not information.

The justification for spoilers (beyond ‘I am anxious, impatient, and have no self-control’) is that you don’t need to receive the story’s info-payload at the moment prescribed by the writers — having the facts, we are told, only clarifies the story, it doesn’t diminish it. Knowing how it ends frees you up to enjoy the unfolding of the story without anxiety.

This disgusts and worries me.

We might think about stories this way:

Narrative structures aren’t vessels containing information, they’re machines for creating information in the mind of the audience. ‘Little Nell dies.’ ‘Oh, is that so? Who’s Little Nell?’ Little Nell is part of a structure which, when activated, effects psychotropism — mental transformation — in the reader. She’s not ‘contained’ in the machine The Old Curiosity Shop, she’s a gear in that machine. To put it another way: the production of fictional knowledge (e.g. ‘informing’/’teaching’ the reader about hobbit feet or the one-eyed bigot at a Dublin bar) is an epiphenomenon of the process of generating the experience of reading itself.

Fictions don’t contain facts, they contain meaningful time: algorithmically generated encounters between audience and story. The text exists to generate the experience of living through it. Characters, plot, setting are just ‘local variables,’ generated at runtime, which cease to exist when the work is done. But more than that: the work of a fictional scene can’t simply be summarized after the fact (writing tip: if it can, the scene is bad and probably unnecessary). The story effects a set of transformations through sustained audience contact: it’s a smooth curve, flow, the path on which the fictional outcome is dependent. Alter the path, break the curve, obstruct the flow, and you lose the story. What remains are chains and gears, sprockets and lenses — pieces of the machine, meaningless outside of its working.
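Taking the runtime metaphor literally for a moment (a toy sketch of mine, not anything in the post): a story as a machine whose characters are local variables, and whose 'spoiler' is just the final yielded fact with the whole run discarded.

```python
# A story as a machine that generates experience at runtime.
# 'Little Nell' below is a local variable: she exists only while
# the machine runs, and is gone when it halts.

def story():
    nell = {"name": "Little Nell", "alive": True}
    yield f"you meet {nell['name']}"
    yield "you follow her through the old curiosity shop"
    nell["alive"] = False
    yield f"{nell['name']} dies"

experience = list(story())   # the path: every moment, in order
spoiler = experience[-1]     # the bare fact, stripped of its path
```

The spoiler is true, but everything that made it mean anything (the local state, the order, the time spent inside the machine) has been thrown away with the stack frame.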

This isn’t a niggling narratological concern, it’s a serious cultural problem. What’s good about a story is the telling, the reading, the watching, encounter, immersion, sharing — the act of communication, the provisional formation of a network which includes reader, text, artists, imagined-artists (notions which complicate the reading experience), setting, moment… Surprise, as Joss Whedon puts it, is a ‘holy emotion,’ and even in the small doses afforded by the ‘literary novel,’ surprise is an essential element of the fictional contract. But it seems that more and more Americans are terrified of surprise. Parents, bosses, workers, people on dates, schoolteachers, students(!), and of course Discerning Media Audiences — we imbue surprise and uncertainty with anxiety (wishing not to be tested, to risk our precious selves, in a world where the Self is our only permanent or meaningful possession) and seek dumbly to control our microworlds instead of seeking out or cocreating new ones.

Serial novels (‘franchises’) sell like hotcakes, ‘literary’ fiction all but disappears. We read a dozen reviews before settling on a TV show. We ‘swipe right’ based on the literal covers of figurative books. Theaters (both cinemas and the other sort) run only remakes and sequels. We seek out films by particular studios. We welcome a new era of nakedly partisan pseudojournalism. A man who plays a businessman on television becomes president on the strength of his ‘business acumen.’ We are horrified by the news but can hardly pretend to be surprised…

In the grand scheme of things, ‘spoilers’ are a small thing. But as we reconceive what stories and storytelling are, what they’re for, we incur hidden costs. One honourable task for ‘critics’ in this fallen era would be to tally up those costs.

P.S. Scott Alexander writes authoritatively (vs anecdotally) about the value of ‘trigger warnings’, which I pass on as countermelody to my naïve carrying-on about ‘surprise’ as a pillar of fictional experience.

Panjanconundrum.

Desperate to write something, trying, trying, but it’s all just shit.

How’s your Sunday?

A note about Terry Pratchett, to keep you from getting cocky.

Between 1989 and 1992, Terry Pratchett wrote sixteen books.

Sixteen: 16.

One of them is Good Omens, which admittedly he only cowrote and which I can take or leave, honestly.

Ten (10) of them are Discworld novels, including Pyramids, Reaper Man, Witches Abroad, and the next two — Small Gods and Lords and Ladies — which are considered some of Pratchett’s best work. Small Gods in particular is the consensus pick for ‘peak Pratchett,’ near as I can tell.

Oh, and some short stories and a computer game and so forth. No big deal.

So next time you let yourself think Man, I’m really good at my job…