wax banks

second-best since Cantor

Category: cognitivemusic

Sharing ecology. Social objects.

Epistemic status: Wrote this in mid-2016 but I don’t think I posted it anywhere. It appears to be a semi-hasty first draft (These are a few of my…). At the point where I mention the Great Depression, I found a note from my 2016-self: ‘TK THIS GRAF IN PROGRESS; CONSIDER CUTTING NEXT GRAF OR 2 AND REDOING APPROACH TO THE ENDING ALTOGETHER, MORE IN KEEPING W/THEME OF THE PIECE.’ I’ve disregarded that note for the moment, because revising and rewriting this essay would mean sinking back into these feelings, and right now that thought turns my stomach. Maybe someday. –wgh.

I recently wrote a book about my favourite band, and found myself reliving — and longing for — the early days of my fandom. The band’s reputation rests on their improvisatory live performances, and for their first 20 years fans eagerly traded amateur concert recordings on audiocassette, with the band’s blessing. Walking into an acquaintance’s house and finding a shelf full of Maxell XL-II tapes, each with a handwritten ‘j-card’ listing tracks, segues, guest appearances, and improvisations of note, meant you’d found a fellow obsessive, and probably a friend.

That’s over now. My tapes gather dust in our attic, and I listen exclusively through iTunes and streaming sites. ‘In my day’ you’d arrange a trade with a stranger online — I’ll send you a 1st-generation tape of Amsterdam July 1997 for a clean copy of those two classic Red Rocks shows on your tapelist — and send the tapes in bubble mailers, then post a Good Trader Alert to the newsgroup. We shared physical objects which resided in our homes, and our relationship to the music was artifactual, sacramental. You could be ‘in the presence’ of the music in a literal way.

Now, when you want the show, any show the band has ever played, you find the link in a single handy spreadsheet and download it from ‘the cloud.’ This has reduced but not quite eliminated audience taping, but it’s entirely done away with the fan trading network which was the backbone of our community. Once I shared the music with you; now a computer somewhere on Earth shares the music with our computers.

Everything is always available. There’s no need for us to share. There is no one to thank.


Free public wi-fi, streaming HD video, same-day book delivery, timeshifted TV, effectively unlimited free email: the benefits of these technocommercial advancements are so obvious that we needn’t talk about them, and so never bother thinking about them, and so tend to assume that these glorious advances and their glorious advantages are the Way of Things, steps already taken and so either fully accounted for or simply beyond counting. They can have no cost, this non-reasoning goes, because honestly why talk about cost when we finally have nice things?

But of course there are costs. There always are.

They manifest subtly at first.

In recent years the words ‘own’ and ‘ownership’ have acquired new senses: to ‘take ownership of your trauma’ means to acknowledge and make peace with the fact of a bad thing having scarred you emotionally, and to ‘own your privilege’ means to recognize the ways in which you benefit from your social class, and then pantomime remorse. ‘Ownership’ here means something like ‘reckoning,’ usually melodramatic.

And of course there was President G.W. Bush’s ‘ownership society,’ an idea which combined deficit spending (instead of ownership) and further atomization (instead of society).

When I was a kid in the 80s, ‘ownership’ seemed to me much less complicated: owning a thing meant being able to hold it, touch it, and — within reason — do what you wanted with it. If you owned a Nintendo game, for instance, you had a plastic cartridge full of subtle electronics which you inserted into a plastic and metal console in order to play. To share the game meant walking it down to Scott’s house and playing it over there; at the end of the day you brought it home. It was ‘yours’ the way your sneakers were yours. If it broke, it was lost to you, but you usually knew why.

Sharing music meant lending a compact disc or dubbing a cassette tape, and woe betide the would-be pirate who wanted music that didn’t fit cleanly on either side of the tape. Sharing a drawing meant sending it by mail; sharing movies meant inviting Jimmy and Craig over to watch them at your house on a weekend.

This was, I don’t need to tell you, a pain in the ass; and I’m assured that things are Better Now. To ‘share’ a movie with a friend in 2016, you simply point her to where you got it. Same with music and games. Easy breezy: your precious objects never actually leave your hands, and you can share without giving. ‘Generosity’ doesn’t come into it; when your neighbour asks to borrow your copy of the Game of Thrones finale, you either divide it by mitosis (copying the file) and pass along a copy, or maybe email her a .torrent file so she can grab it directly from the 16-year-old who pirated the episode in the first place.

Failing that, of course, you can just give your friend your Netflix or hbogo.com password — both companies have accounted for such ‘violation’ in their business models. The miracle is that she gains while you appear to lose nothing at all.

To borrow terms from computer science, this is ‘sharing’ as reference-passing¹ rather than object-passing. The shift is meaningful, its benefits are clear, and we will be paying its hidden costs for a long time.
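For readers who don’t live in computer science: the distinction can be sketched in a few lines of Python. Everything here — the function names, the toy ‘shelf’ and ‘catalog’ — is hypothetical and purely illustrative, not a claim about how any real service works:

```python
def share_by_object(giver: list, receiver: list, item: str) -> None:
    """Old-style sharing: the tape leaves my shelf and lands on yours."""
    giver.remove(item)      # I no longer have it...
    receiver.append(item)   # ...because you do.

def share_by_reference(catalog: dict, item: str) -> dict:
    """New-style sharing: I hand you a pointer; nothing leaves anywhere."""
    return {"link": catalog[item]}  # we both now reference the one stored copy

# Object-passing: the thing actually changes hands.
my_shelf = ["Amsterdam 1997-07"]
your_shelf: list = []
share_by_object(my_shelf, your_shelf, "Amsterdam 1997-07")
# my_shelf is now empty; something was genuinely given.

# Reference-passing: the 'cloud' copy is untouched; I've given you nothing
# but a pointer to it.
cloud = {"Amsterdam 1997-07": "https://example.org/ams97"}
link = share_by_reference(cloud, "Amsterdam 1997-07")
```

In the first case generosity costs the giver something; in the second, the ‘sharer’ loses nothing at all — which is exactly the shift described above.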

As Richard Stallman and his cohort have been pointing out for decades, our ‘possessions’ are increasingly rentals — beyond the simple fact that we’re not permitted ‘inside’ the digital tools we rely on, the ongoing shift from local (desktop) computational resources to online services, invisible server-side processing, and remote storage means that it’s typically ‘more convenient’ in the short term to have easy access to digital resources we don’t control than to actually ‘possess’ them. It’s nice not to need to synchronize multiple copies of your email archive, isn’t it? Easier, certainly, to let Google have it, and simply view your messages on a webpage (‘in the cloud’). All you have to do to get access to your most intimate thoughts is this: when the Alphabet corporation of Mountain View CA asks for the magic word, you type it — and it would be helpful if you handed over your phone number too, just in case.


This isn’t about Luddism, mind you. We’re doomed, yes, and it’s our absolute dependence on biologically incompatible industrial infrastructure that’s doomed us, but: Cloud computing really does make modern life easier; accessing your entire music collection from your phone really is a miracle; not having to worry about server maintenance makes running a website not only easy but possible in the first place. I like being able to stream every Phish show to my phone. No, that’s not strong enough: 18-year-old me would have murdered his friends to get access to the digital tools which 37-year-old me, taking them for granted, finds insufficiently convenient.

And yet.

And yet when we consider whether to buy gadgets or embrace a hip new software service, the alternative to Gadget A or Online Service B is always Functionally Equivalent Gadget X or Interchangeable But Less Snazzily Branded Service Y — the alternative is never Doing Without — and the sole reason for this state of affairs is that if you and I Do Without, the companies which sell us things will make a touch less money.

Retail businesses can be divided cleanly into two camps: those that produce truly useful, essential goods, and those that benefit from consumer anxiety.

With a smartphone and earbuds, you can now talk to Aunt May in far-off Osbaldeston anytime, in realtime — but since everyone else is wearing earbuds too, you can’t talk to a stranger on your street. Your teenage kid doesn’t think twice about listening to music across cultural borders, but has also never even heard of ‘social music,’ and probably knows none of the music that kept your parents and grandparents alive. Turn-by-turn GPS directions make navigation trivial, and the only cost is that relying on that technology means you never form a mental map of your city; but then, why would you need that? Why would you need to talk to a stranger, or learn your grandparents’ emotional language? What’s so great about being able to imagine a city without looking at a cartoon map of its streets?

If you’ve ever looked at your year-old iPhone and felt, deep in your bones, that it was time to pay a couple hundred bucks for an upgrade, then the people selling you pills are the ones who made you sick.


The redefinition of ‘sharing’ from transferring to copying is an inevitable knock-on effect of ubiquitous digital networking. When copying is cheaper than transferring, you copy; that’s why every Harvard freshman cheats. (And the faculty blame the kids and their parents, never themselves or the institution.)

But that redefinition, plus the creeping status anxiety and ‘FOMO’ (fear of missing out) engendered by a gadget/tech culture that’s easy to enter but difficult to leave, plus the ubiquity and pace of ‘social’ media, creates a toxic dependence on corporate media — meaning not only Sony, NBC, and HBO but Google and Apple and Twitter too. There’s social pressure to pay constant attention, and most of what there is to pay attention to is advertising. And while corporations carefully pressure their marks (us) to ‘create,’ to feel ‘empowered’ (or else!), they also beam out the false but convincing message that the path to empowerment is consumption, and creativity outside the corporate-media envelope is somehow suspect. Fulfillment can’t be sold, only shared, so marketers forcibly and falsely equate fulfillment with satisfaction — the palliative, the rush of sensation — and ‘sharing’ is reduced to word-of-mouth advertising.

The late Prince famously said that his enormous back catalogue was the result of a kind of odd pragmatism: when he got the urge to hear a certain kind of music, his tastes were specific enough and his process refined to the point where he was better off just heading into his studio and making the music himself. We are told that in the Digital Future of Today it’s that easy for all of us. But when was the last time you recorded a song, actually edited a home video or article (‘sharing’ increasingly also means ‘sharing your first draft’), or built a Lego project from scratch instead of buying a kit? The instant sugar hit of recognition and pseudoconnection that comes from engaging with a brand, a franchise, a ‘magical revolutionary device,’ will almost always overpower the more complex, delicate experience of doing it yourself — and the conscious choice to DIY is short-circuited by the saturated colours, beveled edges, high framerates, and deep bass frequencies of the mediasphere.

This is a new spin on old news: the American public’s tendency toward absolute passivity before the screen has been a problem since the first television beamed out the first time-killing inanity, if not before. (During the Depression, 65% of the American population went to the movies each week.) Never mind that sitting for hours in front of a screen is bad for you, a fact everyone has known and seen firsthand for decades; thinking for hours through a screen is bad for you too. It fundamentally changes how you see, how you want, how you experience Others.

Americans’ democratic rhetoric has never quite hidden our desperate yearning for a strong hand, ideally an invisible one, guiding our choices. Consumerism, conformism, identitarian narcissism — these are such longstanding concerns you can watch expensive cable dramas about them. Madison Avenue didn’t invent the insane notion that happiness means inactivity (‘kick back and relax’), slaves did. And yet Being Able to Accomplish More is the core sales pitch in modern life: enhancing your productivity, being a ‘more effective you,’ decreasing your footprint while ‘increasing your impact’… Accomplishing more while doing less is the essence of the American Dream, which is one reason Silicon Valley’s rapacious technophilia has so thoroughly colonized the contemporary American imagination.

Yet it bears repeating: our tools also constrain our ability to create, coarsely (you can’t do calligraphy with a hammer, or drive nails with a watercolour brush) and more subtly. Tools come with ideas attached, with cultures of practice, social histories, private associations…and many of the ideas attached to our modern digital tools are poisonous to our long-term health. The idea that sharing means referring to Something Neat rather than making something of our own and passing it along. The idea that your urge to hear music is best satisfied by turning on the radio-equivalent rather than picking up a guitar (or even an iPad drum machine). The idea that, because pseudostate corporations can provide essential social services more efficiently in the short term than the actual state, they should do so. The idea that the best of you is what you can broadcast to the world right this instant. The idea that a company that inserts advertisements into your email while pretending not to care about their contents is, in any way at all, ‘on your side.’ The idea that ‘self-sufficiency’ is corrupt and valueless simply because it’s a myth.

The idea that it’s important to find out what other people think about your new favourite show before you ‘support’ it by watching.

The idea that it’s better to let the machine remember for you.

The idea that you can form a human connection with a username.

The idea that ‘curation’ is ‘creation,’ rather than ‘acquisition and accumulation with better branding.’

Blah, blah, blah.

(And now a moment for us: If you’re not blocking them with a clever bit of Javascript, please click one of the ‘Share’ buttons on this page so we can both get a sense of self-worth from this piece.)


We began with ‘sharing’ but have ended up on ‘creating’ and ‘curating,’ which makes sense: in the jungle of the ‘social’ Web, your taste is your identity and everything is a remix, and pointing out that these are deranged wrong ideas — pure ideology, good for business and bad for everything else — is uncool, i.e. irrelevant. The redefinition of ‘sharing,’ the weird felt obligation to point out that we don’t agree 100% with everything we retweet, our gadget anxieties, our self-satisfied consumerism, our literally childish equation of fulfillment with momentary satisfaction, our march toward an imagistic attention-deficit politics divorced from actual economic or cultural or indeed climatic reality…these are contemporary manifestations of our century-long movement toward absolute dependence on a corporate-cultural complex, and if that sounds creepily like ‘military-industrial complex’ then give yourself a gold star.

Our technological dependency makes us dependent in turn on the corporations who sell the technology, and those corporations spend billions to make sure we not only depend on them but feel sympathy for them, expend emotional energy caring about their wellbeing — think of how many hours supposed adults spend arguing with each other about Google’s ‘rebranding’ as Alphabet or Apple’s choice of default system font or whether the repulsive multibillion-dollar oligarchy called the NBA should sully the ‘purity’ of its player uniforms with advertisements that already blanket every unused square inch of every NBA arena. So far, so distracting…but when we treat corporate interests as emotionally equivalent to human interest, we silently accept encroachments on our inner lives, steep cuts to our imaginative autonomy, which we’d never countenance if our acquiescence hadn’t been bought. We learn not to mind Google reading our email, which means the NSA reading our email (and Google providing tools for them to do so), etc., etc., etc.

We’ve outsourced our taste to record labels, our imaginations to movie studios, our memories to email providers, and our creative urges to whatever shiny thing crossed our field of vision most recently; what’s onscreen is real — ‘friending’ a user account is making friends, ‘liking’ something is liking it, a selfie is a memory, the show’s better than the book — so what’s real is onscreen. Who has time for anything else?

And outside the window, just offscreen, the seas rise and the world dies.

We’ve lost (ourselves) and the timing couldn’t be worse.

But surely I don’t need to tell you that. Everything is a remix, after all; you’ve heard all this before.


  1. It’s no coincidence that our art reflects this ideological shift with an aesthetic shift toward relentless referentiality — the bored cynic who first got rich painting soup cans wouldn’t have been the least bit surprised by a cash-in like Captain America: Civil War, which (like dozens of comic book crossover (non)events before it) exists primarily to answer the question ‘What if Spider-Man fought Cap and made a weirdly specific Star Wars joke and we saw Crossbones in an early scene for some reason and The Vision wore a sweater and Stan Lee called Tony Stark “Tony Stank”’ with a resounding KA-CHING!! From Lost and Family Guy to every Dreamworks movie ever made, from TV-Game of Thrones’s inept citations of its bestselling source novels to the torrent of swill that is Disney’s ‘secret origins of fairy tale characters’ sequence, from the cloddish point-missing of Sopranos ‘death pools’ to the tiresome pedants who ‘fact checked’ Mad Men by craftily googling every date that appeared on what ended up being nothing more than a (brilliant) work of fiction, the demeaning game of spot-the-reference has become a staple of what credulous academics call ‘active media consumption.’ We could go on, but shouldn’t.

Novelty and discovery.

Epistemic status: Of the quadrillions of blogposts, this is the blogpostiest blogpost. I’ve no idea whether any of it’s correct or indeed whether I’ll believe it tomorrow. I notice that my pairwise comparisons are pretty much only men, and have thoughts about that, but this is a good ol’ fashioned first draft and they’re for another time. –wgh.

Some artists are driven by a desire for novelty, some by a need for discovery. The one has, I think, nothing to do with the other.

The primary virtue of the work, for an artist chasing novelty, is that it seem new and surprising. An artist chasing discovery wants something subtly different: uncertainty, creative dislocation. In other words, novelty artists have a product in mind while discoverers are drawn to a process, which is why the former put out novelty art while the latter make God-knows-what, and sometimes reach the very center of things.

Plenty of interesting artists begin by obeying the first impulse and discover in themselves a hidden capacity, less immediately saleable but more sustainable. Deeper.

Many ‘avant-garde’ artists in any given medium are obsessed with novelty as such, which is why avant-garde art so often feels like toy work in which nothing is at risk — necessitating, in turn, an envious critic-class whose job it is to go on about how risky it is. Anxiety about sales figures drives novelty art, though such artists are expected to pretend otherwise.

Artists doing the deep work wish to be present at the revelation without demanding to hear a specific message, so their work has an unsettling unpredictability quite distinct from the shock-value of the novelty artist.

Sometimes such discoveries pull artists apart.

Think of Brian Eno and Paul Simon, Grimes and Beyoncé, Andre 3000 and Big Boi, George Lucas and JJ Abrams, David Mamet the writer and David Mamet the director, Ricky Jay and David Copperfield, PKD and RAW, Thomas Pynchon and Jonathan Franzen, David Milch and Aaron Sorkin, David Lynch and David Fincher, The Simpsons/Rick & Morty and Family Guy/South Park, if you like. (Feeling silly? Apple and Google.)

I think of one sort of art (and artist) in terms of vision(s) — input, throughput, the encounter — and the other in terms of objects, artifacts, output. Discoverers take up the work not knowing whether they can pull it off, or what it means.

I hate the tone I take in this sort of blogpost but there you are, here we are.

Compromise and happiness; Living space.

(two posts dated 2010, a couple years after I got married but before my son was born. posted as-were, as it were; i wouldn’t write it this way anymore, and reading it gives me that ol’ familiar red-pen-itch, plus a weird sadness (was i always so angry?). but… –wa.)

Compromise and happiness.

Married couples learn to compromise out of necessity, but the scope of this necessity tends to be misunderstood. Compromise is often thought of as an invasion or reduction of personal sovereignty: the up-front price of a marriage’s survival. But in the long run compromise isn’t ‘judged’ correct or successful. Rather, the act of compromise is itself the substance of a successful relationship, which after all consists of (and is expressed in terms of) nothing more than the actions of those within it. Such category errors are common. You might say they’re the most human thing about us.

Some youngsters fear to take naps or relax, thinking they’ll miss out on ‘what’s happening.’ But napping doesn’t recharge you for work or play, it is itself a component of both, just as rests and silence are essential components of music. ‘What’s happening’ is nothing more than: rest. Adults who conceive of ‘me time’ as a vacation from the imposition and inconvenience of marriage make the same error. Time to oneself isn’t a break from the hard work of a relationship, it’s an essential component of it, and spouses who treat time apart as ‘a break from the marriage’ only stress the marriage further, by defining it implicitly or explicitly as the opposite of vacation, or of fun.

The same goes for office meetings and solitary work, or for standardized tests and individual study: the various parts of the job must share a purpose in order to be successful. Students aren’t given detailed feedback on their SAT performance, so the test stands outside the ongoing learning process, indeed it temporarily arrests that process. (Student skill mastery can only be reliably tested in situ, i.e. the best way to know whether you can solve chemistry problems is to solve chemistry problems, not recite vocabulary words.) Meanwhile, the effectiveness of office meetings increases dramatically when workers conceive of them as team work rather than mere institutional obligation. If a meeting is not ‘checking in’ but ‘working face-to-face,’ workers are less likely to shift into passive ‘meeting mode’ when the group is called to order, and can maintain their solitary intensity in the group setting. It’s the shift in intensity that’s costly and prone to malfunction.

The same goes for relationships, of course. If lovers can be honest with one another when they’re together then they can experience the full joy of being apart without assuming that they’re incurring some cost to the relationship. The leering alpha-cretin’s barroom refrain: ‘If only I were still single, y’know what I mean dude?’ The bitterness in his voice is real, but the problem isn’t being one thing or the other. The problem is that he imagines himself as something other than he is, and judges his experience in categorical rather than experiential terms. Of course married men also enjoy themselves in bars, and children wake from naps refreshed, and workers even return to their cubicles with a spring in their step.

Prerequisite to such happiness is the simple but difficult act of abandoning one’s illusions: I could not have been other than I am. I need to be away from you today so that I can be with you tomorrow, the next day I will carry into solitude the memory of our fellowship, what I am flows from what I do rather than vice versa, and no time is uncoupled from any other. John Lennon said something about life happening to us when we’re making other plans, but he had it wrong: life does not happen to you. It is only what you get up to, planning or acting or otherwise. If you compromise in order to be happy later you will never be happy. As someone or other said, be happy in your work (of love, art, labour, learning, etc.). Step one is accepting what you’re really doing and getting on with it. There is no step two. Joy is honesty.

Living space.

Let’s let the idea of ‘killing time’ stand in for the full sweep of modern American cultural thought, shall we?

Americans are raised to believe in the sanctity of work and so forth, but also to be suspicious of people who do nothing but work. The label of ‘workaholic’ combines both admiration and derogation, and adults are expected to maintain a ‘healthy work/life balance,’ as if work were somehow separate from ‘real life,’ whatever that means. Moreover, most adults have at some point responded to the question ‘What are you up to today?’ with a shrugging dismissal: ‘Nothing much.’ In the American language it’s possible to be awake, eat, use the restroom, even walk around town or read a book, yet still ‘do nothing.’ This phrase subtly denigrates solitary reflection and ‘idle’ thought, but such disparagement is reinforced throughout our culture. (The Dutch ride bicycles upright, so as to look around at their cities and countrysides; Americans put on athletic pants and ride quickly with their heads down, so as to ‘get somewhere,’ or worse yet, ‘just go for a ride.’)

And so we find ourselves killing time: struggling to fill the minutes or hours between scheduled or anticipated activities. ‘I have three hours to kill,’ we say, and our intonation depends on our attitude — not toward time as such, but toward the actions we imagine we’re going to take in that time. We even reserve a class of actions for ‘dead time’ — the space in our daily schedule not yet filled with appointments. Such ‘time-wasters’ or ‘downtime activities’ are less honorable than proper action — picking up trash for one minute (during a TV commercial, say) doesn’t count as ‘real cleaning’ unless it’s part of an hourlong string of such maintenance tasks.

The same dynamic is readily observed in American sexual culture. The adulterer convinces himself that his transgression is ‘only sex’ and not, say, lovemaking (‘I told you, dear, she doesn’t mean anything to me’); the closeted fundamentalist preacher can have sex with his male meth dealer without believing himself a homosexual; more benignly, young lovers go on weekly dates but insist they ‘aren’t dating.’ This self-delusion stems from something like the intentional fallacy: if we don’t ‘mean’ what we’re doing, it doesn’t ‘count.’ The delusion serves to defend against true recognition of our acts, which would induce guilt and despair.

Yet the delusion itself obviously isn’t doing us any favours.

Many (if not most!) Americans are evidently unhappy, but so much modern experience consists of avoiding unhappiness — or at least putting off confronting it — rather than ensuring happiness. It’s widely known that working in a focused way for long stretches of time on tasks we believe in produces feelings of peace and fulfillment. The arbitrary segmentation of each day into ‘work life’ and ‘home life’ and ‘play time’ and ‘downtime’ and ‘free time’ necessitates jarring mental shifts, which damage our focus (and in turn our serenity). But this segmentation alone is bearable; plenty of people benefit from sticking to a schedule and so forth. Rather, it’s the belief that some times, some actions, are more ‘real’ or meaningful than others — our belief that some time deserves killing — that leads to despair, because when we review the work of the day we can no longer hide behind the false distinction between what we’ve done and what we meant by it. If you work for two hours and end up ‘killing’ three, do you count it a successful work day with periods of waste? Or a ‘lazy’ day in which you managed a little work?

Is a school day ‘busy’ if fully 30% of it is brief idle periods separating distinct work practices and environments? Can that possibly be the optimal use of a child’s vitality?

Taking a nap is a specific, conscious use of time rather than a failure to use it; ‘killing time’ is a conscious decision to perform a specific action — just like going to work, reading a book, plucking a chicken, painting a portrait, or sleeping around. ‘If you choose not to decide / You still have made a choice,’ the man said, and the time we kill is as much a part of our life as the time we fill. We go on dying at the same rate, regardless.

But never you mind! These are only idle thoughts. You haven’t really read them and, honestly, I didn’t mean even a word. I feel better. Do you feel better.

Twenty-three.

Content note: Silliness. Only silliness.

Out on the good-natured fringes of conspiracist culture, the number 23 is said to possess cosmic significance — this is tied to the ‘Law of Fives,’ which is too silly to explain.

If you want to understand American counterculture(s), you must understand this:

  1. No one seriously thinks the number 23 is intrinsically significant.
  2. ‘Seriousness’ is beside the point.
  3. Unserious belief can have incredible psychotropic effects.

The opposite of ‘serious’ (when used sarcastically) isn’t ‘frivolous,’ it’s ‘playful,’ and play is the heart of antirational belief and practice. The number 23 isn’t meaningful until you make it so — at which point its presence is as meaningful as you like. Meaning is an effect generated by interpretation, by reading. Antirationalism is playful reading practice.

Attention. Immersion.

Epistemic status: Unwieldy articulation of what I take to be a commonplace.

The economy of attention is zero-sum or indeed negative-sum: if you’re paying attention to me you can’t also pay attention to your work. Attention is a scarce resource, and easily damaged, which is why it commands such high prices. Moreover, it’s now widely understood that there are ‘transaction costs’ when moving attention around, so that looking at a single article for twelve seconds has infinitely more value than looking at six articles for two seconds apiece. The myth of ‘continuous partial attention’ refers to specific circumstances requiring only low-yield passive monitoring — say, checking on the stove to see if the pasta’s done (yes or no).

The economy of immersion, so to speak, is positive-sum: deep immersion in one activity generates not only a sense of fulfillment but a supply of usable energy which can be turned to other activity: more life, as the blessing goes. Sustained immersive activity (writing, biking, sex, cooking) not only generates important negative feedback — pulling you back to the activity itself — but builds excess capacity. A daylong hike can begin to restore fragmented attention, a fifteen-minute freewrite realigns your internal verbal mechanism, good sex this morning will leave you with naughty thoughts all day which seem to enliven as much as (or more than) they distract; in each case, the energetic/attentional output has a long wavelength, a gentle contour, so that you might not notice how much it has reduced the effect of local (mental) noise. But three or four such waves will effectively drown out high-frequency cognitive bother.

Immersion has a tidal or oceanic character. There’s a reason we talk about ‘flow’ states, ‘waves’ of calm, etc. Peaceful vs panicked breaths. This is obvious.

Sane people know that fifteen minutes of exercise will give you an hour of deeper creative productivity — i.e. ‘I don’t have time’ is straightforwardly false for nearly all cases. The same goes for any joyful (≠ pleasurable in many cases) immersive activity.

Immersion is generative, tourism is usually costly. Ask your Spanish teacher.

‘Cheer up honey, I hope you can.’

Maybe the power of Yankee Hotel Foxtrot comes from just this: its songs are designed to create a world, one less perfect-plastic-lossless-synthetic, one accessible only at night, by a journey inward. It’s a nostalgic album, and a fearful one: about 60% of its 52-minute runtime is touched with feedback, fuzz, static, electronic glitches, or its infamous Conet Project samples (whence comes the title) — and it seems to me the album’s heart dwells in its darkest corners rather than its cleaner, more straightforwardly ‘anthemic’ moments. The brighter, warmer tunes recall the band’s brilliant Summerteeth, while the more heavily laden tracks (the collagelike opening song, the astonishing Poor Places > Reservations, whose interrupting silence is as much a part of the suite as the songs that surround it) look sadly forward to the nightmares of A Ghost Is Born.

I like to think (can’t help it) of albums like YHF as portraits of an imagined world the musicians invoked and inhabited and responded to in making the album, rather than a ‘statement’ of some sort. That kind of hero-narrative doesn’t appeal to me when it comes to musicians; I believe them when they report that deep inside the work, they feel they’re responding to impulses from beyond themselves — though I treat the specifics of those testimonials (the Muse, the Cosmic Consciousness) as pretty fictions only. YHF and atmospheric artworks like it not only depict but create a kind of listening-consciousness, about which you feel however you feel, but which is in a sense complete unto itself: pocket universe, paracosm. And in that place, everything comes to mean everything else. Symbol and referent are jumbled, interwoven, the symbolic layer is the ground of the real and vice versa and permute further and so on. If ‘psychedelia’ is this I don’t mind.

Did the album come on the coffeehouse stereo while I was writing this? Yes of course, and it doesn’t mean anything in itself but it means something in me-here-now, or I mean us-there-whenever. This is there; now is every other ever; I become ‘us,’ and it’s about time isn’t it. The music is the echo-artifact-pretense of the transformation which is the art, or (boring) the art’s purpose. Means to many ends, including pleasure (sure!) but especially joy. And ‘joy’ might just be the somatic component(?) of being-truly-in-the-world. Any world. Even this world of ghosts and remembering and war beneath the bedroom window and a mystery voice on a shortwave radio.

Magic.

System(s) of ritual/programmatic antirational worldmaking, way(s) of being-in-the-world resting on a number of ridiculous, factually inaccurate claims, but producing extraordinary results. Our corporate-capitalist unculture’s present interest in psychotropism (microdosing, nootropics, etc.) charts a smooth curve downdowndown from techtopia’s counterculture roots to the Carefully Managed State — SV execs taking meetings at Burning Man, etc. — nothing magical about it. Lost for now for most: enveloping ritual which cleansed the personal of its parochiality (the absolute opposite of ‘myopic’ is ‘cosmic’). The ground of magical practice is the community, the macro-self, the trans-self. No place for that now, no more…eppur si muove.

Cormac McCarthy, BLOOD MERIDIAN.

An apocalyptic novel in a literal sense: for 350 pages strange lightning flashes and murderous horsemen stalk a land bleached of meaning and bands of painted savages manifest suddenly on distant rises and the language is self-consciously ‘biblical,’ but none of that is as important as the fact that McCarthy’s remythologization of the West places the (no: an) apocalypse in the middle of the 19th century and says in nearly as many words that we are the ones living in the post-apocalypse. Blood Meridian reminded me strongly of the ‘Dying Earth’ tales of Viriconium, not least in the way McCarthy’s characters seem left behind by fate to play out terrible rituals against a backdrop of absolute loneliness. When the kid dies he’s surrounded by civilization, by merriment and physical pleasure — he even buys the services of a prostitute on his last night though we turn away from the act itself — but this being a western of course he can’t be fully restored to the fellowship of mankind. He and the judge converse at the bar, the judge seems to take a bottle of whiskey as if it belonged to him, and no one notices: they’re outside time. (Apocalyptic time, mirror time…) A bear is shot for nothing, suffers for nothing, dies for nothing; a couple of people in the audience notice, none reacts. If the judge had an insect’s head this would be the Bistro Californium.

McCarthy’s prose is literally breathtaking: I kept pulling up suddenly, trying to figure out how a phrase or sentence or extended metaphor could possibly have made it into our world. I’m in awe of his talents and the depth of his devotion.

There are no women in the book.

Let that sentence stand alone.1

And while the kid and the expriest do come to life somewhat by the end, they’re only players in a kind of nightmarish dumbshow — the narration comes to be indistinguishable from the judge’s weird oration, the kid’s sickbed hallucinations are exactly as real as the judge’s visit to the jail or his disappearance into the desert (the heath?) with his fool. This is myth, a world-tragedy rather than a human one. The kid’s death is sad but the horror is not that the judge outlives him but that he’ll never die. He is something unnameable and eternal. He sees himself clearly: he is a great favourite, the judge. He will never die.

Blood Meridian is one of the most American stories I’ve ever read.2 Which is to say it could only be told here, about this country’s (these people’s) twisted relationships to time, to place. Like Gravity’s Rainbow it tells the story of an American boy in a Zone stripped of comfort or sense, a zone of free play; Glanton’s gang is childish, though not at all childlike, and the judge is of course a figure of monstrous fun — dancer, fiddler, reader, scholar, autodidact, hedonist. (People forget that the life of the autodidact is both hard work — no teachers — and extraordinarily joyful, as every forward step is given meaning by the ongoing pursuit of knowledge. The autodidact has constant, deep purpose.) The horrors of Blood Meridian are not lifted or mitigated but enriched by its dreadful humour; the book gets funnier as it goes along, and the final act is preceded by a chapter-length comic interregnum. Its humour is as American as its landscape: you might even say they’re one and the same, as Americanness has always depended on an earnest-ironic response to the impossible mismatch between the vast ancient American topology and the foolishly intimate American idea.

The scene of the kid in the jail cell receiving a visit from the devil himself reminded me funnily of The Stand, which (no surprise) treats American expansion and expansiveness more literally: McCarthy’s novel compresses a universe of terror and judgment into just over 300 pages, while King’s big book treats ‘epic’ as a function of scale rather than vastness (depth of field, colour, time). Both books are ‘inappropriately’ jaunty in places, ‘too serious’ in others; both take violence as a given because it is given to men as a way into the heart of the world. To some men as the only open way. (McCarthy’s elevated tone is infectious…)

Not many novels better than this one that I know of. Christ. Plenty to say but I’ll leave it there.


  1. To be clear, there are a handful of female background figures, none named (few of the book’s characters get named; the protagonist is only ‘the kid,’ then ‘the man’). None of the women in the book are more than props for the main story — though of course, that all-extremely-male arrangement is itself an aspect of the story. (wb. 10 aug 2017) 
  2. I use phrases like this all the time, don’t I. And but they’re always stupid, and but here we are. (wb. 10 aug 2017) 

Ritual and control (systems): freewrite.

The word ‘ritual’ is overloaded w/judgment because the 20th century was horrible. We have a screwy notion of what time is — the body’s relationship to time, and the mind’s.

Neonates’ hearts have to be taught to beat in time. Ever wonder why they respond so well to bouncing at ~80bpm? Their hearts are learning how to keep a beat. They’re learning how to live.

Technologies collapse space and time, can we agree? One major effect of the Internet is that all libraries are local. My car lets me be 60 miles away in an hour; traveling five miles takes ‘no time at all,’ a unit of time so small I don’t notice it unless I’m in a hurry. Benedict Anderson wrote about this already — the psychic effects of 19C mass media. James Scott as well, in another register. Manovich, Kittler — yr Media Studies 101 reading list, basically.

(The Language of New Media put me off when I read it in grad school; I wonder how I’d feel about it today, where my almost unreadably marked-up copy is…)

What’s ritual? Programmatic action to imbue a moment with meaning: to change the relationship of the mind/body to spacetime. Ritual differs from habit by intention. It differs from ‘process’ in its metaphoricity — rituals aren’t always representational but the action/effect mapping passes through metaphor, which isn’t true of a functional process. How do you make scrambled eggs? Crack, whisk, milk, heat, scramble, no need to pour a ring of salt around yourself in the kitchen. Each step of the process accomplishes something physical, obvious; each step in the ritual (the crimson shawl, the ring of salt, the prayer to Pelor) accomplishes psychotropism.

Psycho+tropism: mind+changing. ‘Learning.’ I’ve been making this point (well it’s not a ‘point’ exactly) in writing for 15 years now.

Science — or no not ‘science’ but whatever hip idiots mean when they say ‘Yay, let’s do science!‘ — is supposed by now to’ve freed us from the Terrible Shackles of ritual. We no longer evoke or imbue or incant or call down the ______ but rather we ‘boot up’ and ‘lifehack’ and oh God it’s too stupid to write down. Point being we’ve replaced magical metaphors with technological ones and have failed to register the implied insult, i.e. that you and I are the same kinds of machines as the ones we serve all day. (On the other hand, given this subservience, maybe calling ourselves ‘computers’ is meant as a compliment? Well: I don’t take it as one.) The idea that you can pop a nootropic or microdose and unlock the awesome power of the human mind isn’t even wrong, it’s a betrayal on another conceptual register altogether — of dignity. The idea, I mean, that there’s nothing else to be gained by taking human time: time at a biological scale.

What am I angry about now. What am I going on about. Please, please look: Western minds have shifted over the last few decades toward a resentment/rejection of ritual, languor, symbol, secret, time as pleasure, mind as space — magic, basically. Magical thought. I mean even the phrase ‘magical thinking’ is a denigration now, as if magic hasn’t been a way of working (in) the world since the dawn of the species, as if ‘magic’ referred simply to the incorrect belief that a fingersnap can make a hated enemy feel pain and not to, oh, the years-long process of careful ego-thinning and -reshaping by which minds open up to an ecstatically imaginative (sur)reality.

Or from another angle: if you drink your stupid burnt Dunkin Donuts coffee-sludge in a hurry on the drive into work, the caffeine will make you somewhat more productive for a short time. There are better habits and worse ones. But you should know that in another world, that drink was part of an inexpressibly more potent behavioural psychotropic, a (don’t tell the boss) ritual of movement from hanging-at-home mode to whatever mode you need to get into to work for those predators at the top of the org chart — and billions of dollars are spent every year to convince you that you don’t need it, that there’s no time for that sort of New Age frippery. For those five minutes of generative peace and wonder and focused consciousness.

So: life gets faster and worse. And the other world, which was only ever within you, a metaphor of unspeakable power, gets smaller and emptier and harder to find.

How and what should you read?

Someone asked the other day whether the things I read bear directly on the writing I do.

I said somethingsomethingsomething but what I meant was:

You can’t plan knowledge

Learning is association-making, connection, but those connections are capricious (cf. those sexually aroused by feet, those who think they saw the Virgin Mary at Fatima, those who can play twelve games of high-level chess simultaneously without actually loving chess). Human brains aren’t purposefully wired, they’re grown; instead of plans they develop according to tendencies. The phrase ‘perfectly reasonable deviations from the beaten track’ might come to mind here if you’re me.

You can consume information according to a plan. I wanted to know about the influence of Charles Fort on midcentury pulps and comix; I read Kripal’s Mutants & Mystics. I wanted to know what Jacques Vallée actually argued in Passport to Magonia; I read it, simple. But it’s silly and self-defeating to start out wondering what you’re going to do with that information. You can’t know, and in any case the action-arrow points the other direction: as it transforms interpenetratively into knowledge, the reading does something with you.

I mean that almost literally. We can only consciously control our learning with gross imprecision, which is why cramming for tests is a terrible idea (too much too late). You learn in a trickle or a rush, but crucially you don’t decide which, and it’s best to think of learning practice and knowledge-formation (not ‘-acquisition’) as distinct and almost disjoint practices. The making of your mind can go on without you. Good thing, too: it’s what ‘you’ are made of.

Point being, you can control the inputs to the psychotropic process (the books you read, the drugs you take, your adherence to or rejection of the diurnal cycle) but you can’t control the emergent coral-reef forms which knowledge takes in the mind/brain. And this is good, because while you are a sadly limited person living in a sadly limited world, the self-modifying bioelectrical system which epiphenomenally generates ‘you’ is a good deal less neurotic and scared.

And so you should read whatever you’re passionate about, because

  • passion intensifies and accelerates this mindmaking process, while
  • boredom kills it, and since
  • you can’t control whom you turn into,
  • your best bet for generating a robust mind-body ecology is richly varying inputs

Which brings us to the secret central question of all blogposts,

What does this have to do with my D&D campaign?

But the only reason anyone asks this question is that he hasn’t yet internalized the great paradox of our everything-bad-on-demand-everywhere time, which is that

Fantasy isn’t a genre, it’s an activity

If you get that fantasy is something you do (creation connection narrativizing spatializing eroticizing etc.) and not a set of genre markers (elves sorcery talkingswords) then you already know what all this has to do with your D&D campaign — the more and better you know, the more deeply and widely you experience, the richer your fantastic imagination.

False Patrick occasionally looks for D&Dables in James Scott or Geoffrey of Monmouth with superb results — you can see why G. of M. would be a good RPG source, but James Seeing Like a State Scott? Well, read the post. I picked up Barbara Tuchman’s A Distant Mirror having heard it described as the book that birthed not only Game of Thrones but a generation of medievalists (who later went on to disavow it as decidedly non-scholarly history), but in the end I experienced it as a kind of hellish postapocalyptic dystopia, the apocalypse in question being the bubonic plague. That, in turn, put me onto William McNeill’s Plagues and Peoples, a brilliant short book which argues for an advanced understanding of humans as coexisting in complexly evolving predator/prey relationships with, say, syphilis (or bubonic plague, or HIV). That was immensely clarifying as history, but it doubled for me as a kind of SFnal primer on both ‘deep time’ and dystopic transhuman history — a deceptively matter-of-fact story about the place of the human species at the center of a slowly tightening ecological net.

Not long after I finished Plagues and Peoples I picked up Jeff Vandermeer’s Annihilation, the first third of his Southern Reach trilogy, which is a kind of Rendezvous with Rama/Lost/Lovecraft mashup with mushrooms swapped in for tentacles. I liked it, but it was twice the book it otherwise would’ve been, and ten times the dream-fodder, for the way it echoed and weirded-up McNeill’s book.

Come to that, there’s no reason Lovecraft’s ‘cosmicist’ vision requires tentacles in the first place — the creepiest thing about ‘The Call of Cthulhu’ is the bat-winged things in the swamp, and frankly the Cthulhu statue itself only creeped me out to the extent that it recalled the statue of Mbwun from Lincoln/Child’s Relic, which I read in middle school because I’d heard that ‘If you liked Jurassic Park‘ and of course I did, but then I only picked up Jurassic Park because there was an article about it in a science newsletter we read in our Earth Science class, and if we’re in honest-confession mode then the fact that my godfather went to MIT (Course 2, class of 1924) made me wanna attend that school slightly less than the fact that Michael Crichton had spent a year as Writer-in-Residence there…

See?

Evolutionary weirdness

The least interesting thing about fantasy is its content. (Have you ever had to listen to someone else tell you about last night’s ‘amazing’ or ‘hilarious’ dream? Soporific stuff.) What makes fantasy fantastic is its visionary quality, the way it animates primal urges and throws light on hidden mental corners. Worthwhile art is deeply personal: the work of a strong ego seeking out egolessness. The best stuff is necessarily at least a little inaccessible, mysterious, resistant to analysis, however welcoming its formal presentation; great art always proceeds according to an intuitive logic that’s inexpressible in rational terms. And because it speaks to a unified (continuous, cohesive if not logically coherent) vision, it could only have been made by the person or people who made it.1 Good, in other words, is always strange.

But ‘strange’ is the last thing central planners want to deal with — cf. the aforementioned Seeing Like a State. The inescapable, essential fallacy of the central-planning ethos is this:

Orderly processes do not necessarily produce orderly results.

Indeed the one’s got little or nothing to do with the other except by chance. Working artists get this, hence the irritation/frustration/disappointment writers evidently all feel when asked where their ideas come from. Critics, meanwhile, tend not to understand this — if the disjunction between aesthetic means and ends were widely understood, entire schools of criticism woulda been strangled in the crib. I think of the weird mismatch between Joyce’s literary dreamworlds and his pedantic fan-critics, and (because I’m me, and have written the books I’ve written) of the way Phish’s most hyperrational practice exercises have generated their wildest improvisations while their most deeply structured longform improv has come at moments of surpassing looseness and intuitive responsiveness. (The same goes for other rational/ludic/dreaming improvisatory scholar-artists — think of Johns Zorn and Coltrane.)

I want to have The Right Information at my fingertips when I write, but I also want to experience and share strange knowledge, a Weird innerworld which only I can see but which through my craft I can make knowable to others. And I aim to build deep written structures through intuitive improvisatory methods — so that, for instance, the structure of my 33-1/3 book mirrors the structure of the album it discusses, and the fractal form of my Allworlds Catalogue embodies/allegorizes the Big Themes it bangs on about, etc., though neither of those formal arrangements was arrived at with those pretentious-sounding purposes in mind.

And I find that the best way to achieve these tight-loose performances, this particular pleasing-to-me dreamlike relationship between form and content and private experience, is to immerse myself in material and see what forms spontaneously appear.

We forget that evolution isn’t just a winnowing process of natural selection — it’s punctuated and catalyzed by far-from-equilibrium self-organization, which can altogether shift the topology on which the selection process works, ‘skipping tracks’ in terms of descent. This is biological innovation, and its absence from the standard schoolhouse evolutionary narrative is just one more expression of (and reinforcing element in) a dangerous, thoughtless cultural conservatism, a pseudosci retelling of the myth of heavenly bureaucracy. Evolution isn’t a one-way road running straight, it’s a network of migrations through an ever-shifting topology toward no particular destination — the endless fitness gradient scarred with switchbacks, channels, deep caves, inscrutable truths spelled out in the bones of lost travelers…

Back to the start

‘No one can see beyond a choice they don’t understand,’ said the Oracle in The Matrix: Revolutions. Put another way: you’re trying to get from one stable equilibrium (not exercising, say) to another (being in the habit of exercising daily) but between them is a hill down which you can backslide all too easily (forcing yourself to exercise daily for a few weeks until the habit has formed). The zone of extreme flux — of frustration, worry, pain, seemingly endless struggle — of uncertainty — between equilibria is a hard place to be if you can’t handle uncertainty. If you need to know the outcome before you begin the process, you’ll never do anything new. Everything truly new is a risk.

So how and what should you read?

My sincere answer:

Keep reading until you figure it out.


  1. Reasoning through the ethical implications of this paragraph for the art-consumer and the DIY creator is left as an exercise for the reader.