wax banks

second-best since Cantor

Category: internetting

The Goodreads problem synopsized.

You must have a sense of how people respond to your work, but you mustn’t fixate on any one response — learning to manage variation in tastes is an important skill for anyone doing creative work.

It’s harder than ever to escape people’s responses to your writing; to ‘be online’ (to live online) is to be constantly, destructively aware of the ultimately irrelevant. Yet you should never get drawn into a lengthy exchange with a reviewer of your work, paid or volunteer, except to clarify errors of fact.

There is no good solution, other (I suppose) than doing good enough work that you can confidently ignore reviews altogether.

The OSX Terminal.

The vast majority of Mac owners probably never fire up Terminal.app, which is a pity: some of OSX’s power comes from its BSD underlayer. The command line is your way to the core of OSX, and even with underrated tools like Automator available, some tasks are only feasible right at the command line.
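By way of illustration, here’s a minimal sketch of the kind of task that’s awkward anywhere but the command line: querying Spotlight’s metadata index directly. Python stands in for the shell one-liner, purely for legibility; mdfind is Spotlight’s real command-line front end, and the query string is written in Spotlight’s own metadata query syntax (treat the specifics as a sketch, not gospel).

```python
#!/usr/bin/env python3
"""Sketch: a command-line-only trick, driven from Python for legibility.

mdfind is Spotlight's CLI front end; the query string below is written in
Spotlight's metadata query language. Finder's search box only approximates
this kind of raw metadata query.
"""

import subprocess

# Every PDF whose contents changed in the last seven days.
query = ('kMDItemContentType == "com.adobe.pdf" && '
         'kMDItemFSContentChangeDate >= $time.today(-7)')

result = subprocess.run(
    ["mdfind", query],
    capture_output=True,  # collect stdout rather than printing directly
    text=True,
    check=True,           # raise if mdfind exits nonzero
)

for path in result.stdout.splitlines():
    print(path)
```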

Folks who code on Macs, meanwhile, have long known Apple’s Terminal as a nonideal CLI.

Craig Hockenberry begs to differ, offering the most detailed rundown of Terminal’s handy GUI integrations, clever keyboard extensions, and assorted hidden features that I know of. That’s 9,000 very useful words from 2014.

See? The Internet isn’t just a sociopathic hellscape! Only mostly.

Pierce.

Does this look like wordcount padding to you?

In the clamor of a presidential race, which this year is even more distracting because of a clamorous and vulgar talking yam, a lot of important information gets drowned out that ought to be part of the presidential race in the first place.

That’s Charles Pierce, beloved of leftish readers who prefer articulate but shopworn outrage to analysis, drowning out some important information with a rush of cliché over at his blog ‘shebeen.’ Annoying as I find the ‘yam’ bit, it’s the misused ‘in the first place’ that puts me off. Surely a quick reread should’ve flagged that clunker?

Sensible people insist Pierce is a Great Writer in his mode, but his Esquire Politics blog has been trash all year, and paragraphs like the one quoted above are the reason why. Every single fucking post is riddled with ‘clever’ nicknames like ‘the vulgar talking yam’/‘He, Trump’ or ‘Tailgunner Ted Cruz,’ tired rehashes of years-old jokes, and threadbare secondhand verbiage out of the Sclerotic Greyhair anthology. There’re ten thousand leftward bloggers like him, frankly, and dozens of them are reaching for new insights and new prose without any noticeable loss of perspective. I’ve linked admiringly to Pierce in the past, when his brand of overwrought doomsaying has suited the emotional tenor of some darker-than-usual cultural moment. But at this point he’s stamping about the ol’ shebeen like a more historically informed and somewhat less self-important Keith Olbermann — remember how K.O. got off a couple of memorable ‘viral’ speeches on his TV show before abruptly reaching the limit of his insight? — which is a damn shame considering Pierce’s actual talent and skill levels.

He was necessary reading once, back when he couldn’t be reduced so easily to a formula.

I say all this because Pierce talks constantly (and with extraordinary condescension) about the decline of rationality and sense in the USA — this from a man who in 2009 wrote a book called, wait for it, Idiot America — yet as near as I can tell, he long ago joined the parade of hurt/comfort pundits whose main job is to point at an extremely obvious outrageous affront to leftesque sensibilities (dumb people with guns! the politics of the image!) and recite a comforting litany of complaints (our nation is in decline and you and I are in no way to blame!), the balance of outrage and been-there-blogged-that worldweariness calibrated to go well with, say, a sugary milky caffeinated drink from your local fast-coffee chain. The ~left blogosphere has made this sort of pageview-trawling pseudoanalysis its primary sport for years and years now.

And while you might well think that the Real Problem is the rise of right-wing talk radio (which has been a major cultural force in this country for a quarter-century, you knob) or Citizens United or the lack of safe spaces or Lin-Manuel Miranda not getting enough awards or whatever issue you fill your Outrage Moments with…the fact that the tribe which identifies itself as Educated and Informed and More or Less Left But Also Totally Jazzed About the Fruits of Hypercapitalism — know anyone like that? — just can not be bothered to communicate with any of the other tribes, the fact that our Elites are doing their best to turn not only their neighbourhoods but their entire mediated existences into gated geographic/cognitive communities (the Safe Space as model of the Self), is exactly isomorphic with the ‘epistemic closure’ which was such a big deal amongst Righty crankfluencers a few years ago.

In other words: if you prefer your own tribe’s clichés to merely being in the world with members of any other tribe, you are part of the Idiot America that Pierce and his (mostly younger and dumber, therefore more forgivable though no more tolerable) cohort like to think they stand outside of. The system is rigged against you, just like it’s rigged against everyone who isn’t in charge of it, but you still bear a portion of the blame. Just like me and Charlie.

But you’re not getting paid to pass off your outraged gesticulation as critical insight. So your share of the blame is that much smaller.

That’s all.

‘Pitch me, baby!’ or: David Pogue’s ego blocks our view of a much deeper, much scarier cultural problem.

From the archives: July 2011. The last of today’s batch. My contempt for gadgetbloggers (also ‘Apple pundits’) is limitless, as you can guess. I used to love venting my spleen like this. Now I tend to feel bad about it, though obviously not bad enough to keep this to myself. –wa.

David Pogue, a freelance gadget columnist best known for his work at the NYTimes, recently spoke (for pay) to an audience of PR professionals. The talk was entitled ‘Pitch Me, Baby.’ Last week the NYTimes ombudsman described Pogue exhorting the publicity men to suggest column material to him:

In the presentation, Pogue jumps out of the gate with a Power Point page inviting the audience to “Pitch me, Baby!” The presentation goes on to offer do’s and don’ts and emphasizes his own close reliance on pitches that come his way from professional public relations people.

On a later slide, he displays eight recent New York Times columns and identifies five as having come from public relations people. Pogue explains that, as a reviewer of new gadgets, there is no comprehensive database he can rely on to learn about new stuff. Hence he relies on companies and their hired pitchmen to tell him about new products.

Pogue’s basic advice boils down to two imperatives: 1) “Save me time,” and 2) “Don’t be a robot.” This means that public relations people should tailor the pitch to its audience (avoid spamming, in particular) and avoid jargon and other extraneous matter.

This strikes me as a violation of journalistic ethics, not to mention good taste. The NYT agreed; Pogue has been forced to curb his appearances at such little get-togethers. But I don’t care at all about that aspect of the article; my disgust at Pogue’s behaviour isn’t new, nor is it unique; nor is he different in that regard from, say, Judith Miller passing off Cheney/Rove PR as reportage. We don’t use the term ‘corporate media’ for nothing.

The deeper issue, which doesn’t seem to be getting talked about this week, is this:

Pogue’s job consists of advocating for the business interests of large corporations. That’s it. Like so many other ‘tech columnists,’ he masquerades as an advocate for better living with/through technology, but it’s easy to see that he’s always been a paid shill, nothing more: he’s only capable of talking about technology on a corporate PR timeline, within a logic of consumption rather than creation. He’s an advertiser for The New (and Expensive).

If Pogue mattered, he’d be writing about amazing! new! corporate! technology! with an eye toward an actual alternative: i.e. instead of saying ‘Should we buy the new iPhone or the new “iPhone-killer?”’ a serious critic would ask, ‘Should we buy this new tech at all?’

A simple thought experiment: if you’ve bought a new computer in the last five years, why did you do so? If you’re a grownup, chances are you didn’t do it in order to play the latest video games. So ask yourself: what does your new computer enable you to do that your last computer didn’t? If your last computer was less than four years old, the answer is probably nothing.

My first iPhone altered the way I traveled (thank you location-aware computing) and used email (thank you 3G data service). My new one lets me shoot video, take better pictures, and run the old apps faster. I can imagine needing to replace it when it breaks, but what in the world could I possibly want from a ‘better’ phone?

Pogue and his fellow tech writers would answer by listing the features of next-gen phones. But ‘Why should I buy this phone?’ isn’t a question about a phone, it’s a question about me; and Pogue and his ilk should know it. Their defense is always the same: Well, you don’t have to buy what we recommend. And that’s true, of course. But these idiots then turn around and write about ‘tech’ from the perspective of collectors, ‘early adopters,’ fetishists. And they orient the culture toward these perverse logics.

Pogue isn’t a commentator on the ‘gadget industry,’ he’s part of it. He’s a servant of his corporate masters, who provide him with free shit in exchange for free publicity. But in his capacity as an NYTimes columnist, he’s presented as something else: a servant of his readers.

The only thing he creates in this world is a misperception of the need to buy new things.

So no, David Pogue’s recent bout of new-money tackiness isn’t a ‘journalism story.’ It’s not a ‘tech industry story.’ A paid advertiser got spanked by his bosses, who rely on paid advertising for their livelihoods. So what.

The actual story is that at this point, we can’t imagine ‘modern life’ without people like David Pogue. We are fucked.

Trouble online, trouble behind.

From the archives: August 2011. I’m not proud of this one ‘as writing,’ but it was important to write it, and it hurts me to read it. So here it is. –wa.

I don’t get along with people online, and that’s the plain fact. It’s taken me a while to be matter-of-fact about it, but there it is. I spent a bunch of time discussing the situation in therapy a couple years ago, but never did arrive at a satisfactory solution.

OK. The problem goes deeper than incivility.

The summer after 10th grade (1995) I spent five weeks at Johns Hopkins, taking classes in the Pre-College Program. (It’s different from the well-known precocious-child program, CTY.) I got my first C (in a molecular biology lecture) and worked hard to get a life-changing A (in a small, prescient ‘Explorations in Text-Based Virtual Reality’ humanities seminar). Both grades were portents, but I didn’t understand them.

The focus of the seminar was MUD/MOO/MUSH culture — ‘A Rape in Cyberspace,’ Barlow’s ‘Declaration of the Independence of Cyberspace,’ Neuromancer, some Bukatman, some Dery, that kind of thing. One of the requirements was to spend a bunch of time exploring the Diversity University MOO (moo.du.org:8888). I did. I also signed up for LambdaMOO (lambda.moo.mud.org:8888).

I’d never used the Internet before.

Some days I would get up, read the Millennium Whole Earth Catalog or my newly-purchased Principia Discordia for a while, then head over to the computer lab for a 12-hour stint in Lambda. I missed meals. I even missed class (see above re: ‘my first C’). Tuition for the program came to $3,600 for five weeks. My dad mowed lawns to raise a few hundred dollars. A wonderful man in my hometown lent us the balance of the tuition and it took us a long while to pay him back; or else we never did.

I got some sun but not as much as I needed. I fell hard for a girl in the next dorm, who didn’t notice me. Then I fell for someone with the username ‘Sirena,’ and that’s one of the weirdest stories of my whole life, I think.

I learned to ‘speak in public’ on LambdaMOO but I learned plenty of other things as well; and I came to rely on it. When I went home at summer’s end I felt totally disconnected from my hometown. I told myself and my family and even my couple of close friends that I just missed Baltimore, had a great time ‘at college,’ had never been around people who shared so many of my interests, just needed a little time to adjust. Junior year ahead, yay. That kind of thing. All of which was true, I suppose —

— but it occurs to me today, for the very first time, that as much as I missed the people and the school and the freedom, I was also going through withdrawal from the online world where my new self was being born. I mean that literally.

The term we’re looking for is addiction, of course, more specifically a form of ‘Internet addiction,’ which in the late ’90s was a subject of no small concern in the press and in academia.

You never hear about it now. Once everyone does some activity all day every day it’s not an addiction, it’s just ‘part of life.’ Like TV, or worrying about work, or hating the government.


I check my email several dozen times a day, yet I fail to respond in a timely fashion to friends and acquaintances. I may in fact be the worst correspondent I know. Yet I don’t immediately forget about the ‘need to respond’: indeed, waves of anxiety about my Inbox full of unanswered emails continue to ripple for weeks and weeks. I am never, ever free of anxiety about these communications — but I avoid responding.

I’ve destroyed friendships — and strained family relationships — this way.

When I have spare time, I read websites and occasionally comment on them. Sometimes I do this even when I don’t have spare time. Altogether I spend hours (hours!) a day looking at webpages and retaining almost nothing. I take no great pleasure from this activity. Indeed it has the dry sterility of pure compulsion, like pulling the arm of the slot machine.

I’ve posted to this blog more than 3,100 times since 29 September 2003. In that time I’ve been banned from one website, slunk away from several others, and stormed off several more. I get into fewer ‘flame wars’ than I used to, but it still happens. I still feel anxiety about websites I’ve ‘stopped reading’; indeed, at the site where I’ve been banned, I continue to comment under a different name.

I feel contempt for such behaviour but haven’t found a way to stop it, as yet.

Since 2009 I’ve posted upwards of 150 reviews to phish.net — but I’ve only posted one or two since June, during which time I’ve posted 50 comments in discussion threads and in response to the admins’ blog posts. I consciously avoided any such discussions until this summer. This inverse correlation between ‘chatting’ online and posting more thought-out frontline pieces (reviews and articles) has held, in my case, for many years.

After building a (very very minor) reputation as a thoughtful writer at whedonesque.com, I’ve all but scuttled it by turning into a persnickety, ill-tempered commenter. Unsurprisingly, none of my posts have been featured there since I started commenting more regularly.

The term isn’t brand dilution, but then what’s the term? Would I be happier if I knew?


A longtime netizen (remember that term?) told me this when I was banned from phishthoughts.com (for ‘trolling’):

You are a highly intelligent, very cerebral and I believe well meaning person but it seems that you have some form of internet Asperger’s which makes it impossible for you to determine what is and is not socially acceptable in many circumstances online.

I wrote him a long email telling him, essentially, that he had no idea what he was talking about and I was perfectly justified in what I said about the site’s owner and EVERYONE NEEDS TO THICKEN THE OLD SKIN, ETC., ETC. But I didn’t send it. My wife approvingly refers to this kind of thing as de-escalation and always looks so relieved when I choose not to carry on such exchanges. The look on her face breaks my heart. I realize, at such moments, that I don’t actually know how much damage I do to myself — or I won’t acknowledge it, or (worst of all, and most likely) I’ve decided I need to hurt myself ‘socially’ in order to continue living as I am.

Last summer I wrote this:

I think we should purge the books and sell them, to alleviate my guilt (not a writer, not a devoted enough reader, nothing special…) and maybe recoup a bit of money. My wife thinks we should keep the books around[…] And dust them. I try to explain that life will stop and start over, better, if she’ll just allow this one gesture; I mistake my self-indulgence for patience.

She evidently believes — insists — that life can’t start or stop, can only continue, so we might allow ourselves to do the same. I imagine that our future must resemble my past. The books, I’m certain, are signs of my…well, my irresponsibility, profligacy, compulsions, status-consciousness. My individual failings, you might say. Don’t I get the future I darkly deserve?

But what comes next is ours, not mine. ‘Mine’ is just for comfort — like the books. In our future[…]I’m glad my wife[…]made me keep the dreadful damned books way back when, and frustrated my urge to reduce our life to my story.

In grad school I went to a conference and met a young professor from some college out of sight/mind, and over the course of several joyful drunkening hours it became clear that we wanted to fuck each other, quite, but I was dating someone and she had to get back to her friends’ house where she was staying, and in any case it would have been an absolutely colossal mistake, quite, but unforgivable? Who knows? Probably yes and deservedly so I’d say (were the situation reversed). Well. One of those stories I hold onto in which I ‘miss an opportunity’ to have a conventional ‘good story’ but still come close enough to some inner horizon that the light goes strange and new (or very old) things are revealed. So how bad a story can it really be, what I’ve got now? She was a Buffy fan too and I definitely should have called her when I was single, later. But I wasn’t ever really single.

I mention it because, though I can’t find the email she sent a few days later in response to my own message, I’ve memorized these phrases:

  • ‘maybe too smart for your own good’
  • ‘extremely socially awkward’

I’ve used ‘Asperger’s Syndrome’ as a term of derision.

I am ashamed. This is inappropriate and callous.

It would be, even if I were Oprah Winfrey.


Everyone wants his favourite band to also be The Very Best Band. This is really important to teenagers, who in this country have nothing else to do, but it stays important to nominal adults. Like me. Same for books/films of course. (Phish, Coltrane, James Joyce, Fight Club, etc.) Same for people, though I wouldn’t know. I can’t imagine what I’d be like if I didn’t map my tastes on to the cosmic quality scale.


The point being that there are two problems compounding one another: I compulsively fiddle about on the Internet, either getting into arguments or zoning out pretending to be interested in what Ezra Klein and Arthur Silber have to say about anything, but at the same time I have very serious trouble maintaining a civil tone and spirit of congeniality in online fora. I tend to monologue at people — ever notice how rarely I respond to the wonderful comments around here? When the conversation gets two-sided I lose control of something (maybe just the conversation), and I end up saying things I regret. ‘Being misunderstood,’ HORROR!, but more than that: no longer trying to understand the people I’m talking to. Not reaching out.

And that’s where I am this morning. Worried, if you’re wondering, that I’ll slowly lose friends and alienate readers and never stop doing the things I most hate about myself. And — you must know this is deeply related — worried, too, that I’ll never write freely because it will always be about me.

You want 100% employment? Assign every single citizen to border patrol. The true meaning of the nation-state right there, the geographic Self. OK, hold one guy back to make dinner I guess. One guy for laundry. And someone to make sure the cable bill gets paid.

My son will probably wake up soon, and my wife with him. The day will start. Real life will start. This…this is the shadow. If you walk toward the light it’ll hide from your sight, but not as a favour: your shadow will follow you wherever you go.

From a work in progress: Nomic and net.culture.

Rough draft, work in progress, claims nonbinding, etc.

For Hofstadter, unresolvable self-contradictions and infinite regresses aren’t failure modes for play or argument, they’re toys. And the same holds true in Nomic play — finding and exploiting a ruleset self-contradiction ‘wins the game,’ but figuring out how to keep play alive beyond that individual victory is one of the major challenges every nontrivial Nomic has solved (footnote deleted –wa.), at which point the players enter the odd state of simply living with/through paradox, in a state of exultant philosophical strangeness unlike anything else in gaming. As much as we went on about The Rules (a/k/a ‘all that’s holy, man’), our true focus was on the ever-shifting texture of experience within them, the space for improvisation and experimental sociality and playful ideation that the rules opened up.

The online world of Nomic was a corner of early cyberculture that had more in common with, say, the collaborative fictional project alt.devilbunnies or the roiling cauldron of Dobbsian lunacy that was alt.slack than with any aspect of today’s ‘games culture.’ In retrospect, it makes sense to think of net.nomics as experiments in (ahem) ‘stateless’ living, close cousins to virtual communities like LambdaMOO, miniature models of the disembodied digital utopia that John Perry Barlow imagined in his Declaration of the Independence of Cyberspace. [^barlowdeclaration]

The Declaration is, of course, mortifying to read today — a handy summary of everything ridiculous in early cyberculture discourse. He wrote it in Davos, for Christ’s sake. But it’s dangerous to read that document through a modern lens, when his rhetoric has been terminally co-opted by Silicon Valley execs. We now grant ourselves permission only to imagine what life online can do for our precious mind-bodies, our anxiety/productivity levels, rather than what it could and does do to nation-states; it’s hard to imagine in 2015 that Barlow’s global revolution could ever have been. But for a second there, however hackneyed the language, you could believe it. You’d telnet to lambda.moo.mud.org:8888 and give yourself another name, another body, another gender, another species. You’d type ‘say Hello’ and greet a ‘room’ full of imaginary strangers, each stranger than the last, stranger than they’d ever been. There in the aether, the absolute otherness across the Ethernet, playing freely with every idea you’d ever been, every rule of thought and deed seemed purely mutable. You could be a paradox and keep playing, iterating and experimenting and binding yourself to a self-made system of selves until your own private ruleset generated not ‘sense’ (who cared?) but play. Joy. Right there and then, for as long as you could hold your breath and float through the water, the mad idea that a nation without nations could take form in a realm beyond the senses was not just real but obvious; it was already here, I was there; I swear you could almost taste it.

New media, old lies.

A rant follows. –wa.

‘Social media’ is, for the most part, a marketing term — the ‘social’ dimension of e.g. Twitter is actually only a small part of many (most?) users’ experiences, which consist largely of ongoing broadcast streams with brief semisocial side interactions. Check-ins. Socializing means two-way interaction; Twitter’s format militates against that. You interact directly with a tiny fraction of the people you follow, right? You may well follow 300 people, or 1,000 — how many of them did you exchange words with this week? So what are the other ones doing in your feed?

Twitter users you follow but don’t interact with are little different from TV stations you leave on in the background while you attend to…let’s call it ‘real life stuff.’

The idea that clicking on a link means ‘interaction’ has permanently cheapened that term. If I throw out a piece of mail and you take it out of my garbage and read it, you and I aren’t interacting.

This morning I read a Medium post about how Slack — a groupware application which replaces email with a more robust version of IRC — has ruined some UX designer’s life by fragmenting his attention. The trouble with the piece isn’t its unfunny conceit (a breakup letter) or its tone of hip abjection (‘You are such a wonderful sane ethical corporation, please fix what you’ve made wrong with my life!’), but rather the unexamined assumption, totally unsupported by anything like evidence, that an adult should be able to live a happy healthy life sitting in front of a screen having IRC conversations with coworkers and ‘friends.’

And where did that ludicrous idea come from?

Not from Slack or Twitter or Facebook. Not from the Internet, CompuServe, Prodigy, or Illuminati Online. The idea that a good life was one in which you had time to settle down before a flickering lightbox and pretend to be connected to other human beings is very, very old. And because it’s transparently false, it has to be sold to us.

What’s deeply wrong with the Web was deeply wrong with TV.

One of the most important pseudophilosophical currents of the last hundred years is the technophilic triumphalism that animates Silicon Valley culture. (I say ‘pseudo-‘ because it’s too poorly thought out to rise to the level of philosophy.) The idea that personal computers, and then the Internet, and then smartphones, and then more elegantly designed personal Internet smartphone computers will (1) fix what’s wrong with human civilization while (2) costing us nothing is a pernicious myth that (with honorable exceptions like that maybe-crank Morozov) goes essentially unchallenged in the cultural mainstream today. A lot of people are getting rich off the idea that the Singularity is not only real but here already. ‘Disruption’ is the watchword and dysfunction is the result.

But we should be aware of the roots of that dysfunction, which extend back a good deal further than the latest antisocial outrage. The problems with our all-Internet-all-the-time lives are outgrowths of much older cultural movements.

The broadcast-TV consensus wasn’t killed by YouTube but by cable TV and the VCR.

The atrophy of social/empathetic faculties now manifesting in our politics of secondhand identitarian outrage-cosplay has been in progress since long before the Web existed — for all the vaunted cognitive benefits of video games, Atari killed more brain cells than Snapchat.

The total collapse of trust in American news media owes a hell of a lot more to Ted Turner’s insane idea of a 24-hour ‘news’ network than it does to Matt Drudge or Gawker or whoever we’re blaming now.

Controlling the televised image has been essential to national political success for more than half a century — ask Richard Nixon — and remember that this country already (re)elected a second-tier movie star and cigarette pitchman to the White House.

(Don’t you dare fall prey to the stupid idea that Trump’s candidacy has been ‘conducted via Twitter.’ Even younger voters who didn’t constantly hear about his Yuuuge Business Deals in the 80s (as I did) know him from The Apprentice, which laid the groundwork for his current performance by normalizing his repugnant charismatic-villain persona. Funny, I didn’t actually watch the show. Did you?)

It’s not exactly controversial to say that computer-dependent life is dangerous and costly, though saying so won’t win you any popularity contests. But — I know this is basic but it needs saying because it’s, y’know, basic — what’s shitty about staring at the Web was shitty about staring at the TV half a century ago.

‘But we have such short attention spans today!’ Blame the handheld infotech device known as the remote control, which fed you the daft notions that not needing to get up from the couch was a measure of meaningful control over your media diet, and that if you didn’t immediately like a work of televisual art you should ‘see what else is on.’ And remember that most of us now watch our movies at home — in a domestic, distraction-filled ‘curated’ environment, rather than the sense-heightening identity-submerging collective ritual strangeness of the movie theater. We’re too ‘enlightened’ to recognize the importance of the magic circle of ritual transformation which surrounds the theater, and too lazy to seek it out, at fatal cost to our imaginations.

There is no magic circle around the Web. There can’t be.

Obviously the Web differs from TV. Obviously. But the danger it poses isn’t anonymous comments, it’s that your computer is a glowing box that you stare at while a stream of mostly irrelevant nonsense flows by your face, barely registering beyond its immediate sensation — and you convince yourself that you’re ‘interacting,’ that you’re an ‘active viewer.’ Does that sound familiar? We’ve replaced CNN with cnn.com, People Magazine with TMZ, watching Arrested Development with downloading Arrested Development, talking about comic books with tweeting about comic-book movies — but the basic problem of dependence on the screen, of its fundamentally antisocial and dehumanizing nature, has only deepened. It’s not new. We sit motionless in our ‘filter bubbles’ and congratulate ourselves on how comfortable our chairs are, how carefully ‘curated’ our Twitter feeds, but before ‘Web surfing’ there was ‘channel surfing’ and we tried not to think about what it was costing us because we’re in love with our illusions, not least the Illusion of Choice.

Hey by the way, remember when anyone gave a damn that Goldman Sachs was one of the most dangerously powerful organizations ever formed by a group of human beings? No, of course not. That was a long time ago. It’s scrolled off the bottom of my screen.

We’ll have to catch it in reruns.

Apple vs the FBI.

Tim Cook’s letter about the FBI’s request for help cracking the password on a killer’s iPhone is important reading. Cook claims that Apple does not have a version of iOS that would allow the FBI to circumvent security on one of the San Bernardino killers’ iPhones. The FBI has demanded that Apple engineers create one.

The FBI has explicitly asked for help with this one device, one time. But that’s not what Cook’s letter is about. (Just between the two of us, I’d like for the FBI to know what’s on this murderous piece of shit’s phone too.) And while there’s always the chance that the software backdoor in question could get out, rendering every iOS device in the world totally insecure, that’s a secondary concern here too — I imagine Apple’s engineers could do a reasonable job securing this one phone. This is a much bigger deal than that. Bigger, even, than the FBI’s specific request for the ability to input passwords wirelessly. (Just think about what a privacy apocalypse that could trigger.)
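For a sense of scale, a back-of-the-envelope sketch. The ~80ms-per-guess figure comes from Apple’s published iOS security documentation (the passcode key derivation is deliberately slow); everything else here is my own illustrative arithmetic, not anything from the court order.

```python
"""Back-of-the-envelope: why electronic passcode entry is the big ask.

Assumes ~80 ms of key-derivation work per guess, per Apple's iOS security
documentation. The rest is illustrative arithmetic, nothing more.
"""

ATTEMPT_SECONDS = 0.08  # key-derivation time per guess, per Apple's docs

def worst_case_hours(digits: int) -> float:
    """Hours to exhaust every numeric passcode of the given length."""
    return (10 ** digits) * ATTEMPT_SECONDS / 3600

for digits in (4, 6):
    print(f"{digits}-digit passcode: ~{worst_case_hours(digits):,.1f} hours worst case")

# 4 digits: ~0.2 hours; 6 digits: ~22 hours. Strip out the escalating
# retry delays and the ten-tries-and-erase option -- which is roughly
# what the FBI's request amounts to -- and a short numeric passcode is
# no protection at all.
```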

If Apple agrees to the FBI’s demand, then every iOS device is fair game — and future demands of other tech companies for such extraordinary violation of user privacy will be more likely, and more likely to be followed.

We already know that Google happily helps the NSA with their online equivalents of warrantless wiretapping. We already know that some companies build such backdoors for the government to use. But this is something else:

The FBI is asserting that it has a right to backdoors which do not yet exist.

This should repulse you. It should repulse Barack Obama, to be frank.

Apple’s respect for user privacy — the company’s willingness to hide user data even from Apple itself — is a big part of why I favour their products and services. They are absolutely right to pick this hill to die on, especially when the phone in question belonged to such unsympathetic filth…which is, by the way, a deliberate political calculation by the FBI; bet on it. This is about precedent.

We should reward companies and individuals who do the right thing when frighteningly powerful groups like the FBI put their thumbs on the scales.

Write to your representatives in Congress to push back on the horseshit coming from Senators Cotton and Feinstein, and demand that your presidential candidate of choice speak out against this overreach by law enforcement.

Reports of Twitter’s demise are #myopic.

So let’s take it as read that Twitter is run by the usual Silicon Valley/FinanceWorld mix of sociopathic predators, poorly socialized nerds promoted well beyond their competence, gated-community cowards unable to imagine the world beyond their little circle, tech wonks with neither aesthetic nor social senses, and well-meaning earnest GenX/Millennials suckered by fashionable contemporary pseudo-ideas (e.g. the notion that comfort is redress, or that online conversation is conversation).

Let’s take it as read, too, that Twitter has long been a service in search of a business model, and that the answer they’ve hit on — ‘let’s sell ads’ — is both lazy and in the long run incorrect.

All that said:

Today’s announcement of Twitter’s latest move away from strict timeline chronology, toward ‘curated’ and algorithmically foregrounded content, will be bad for culture, bad for human beings, for what I hope are obvious reasons: the social/emotional incompetence of The Algorithm and the engineers who feed and water it, the choking rich-get-richer effect which folks acted surprised by when the words ‘power law’ were a big deal more than a decade ago, the way it props up the insane feeling of ‘FOMO’ which is the core of online pseudosociality, etc. Leaving Twitter in charge of your news feed means that your news feed will look more like everyone else’s — more like the computer’s dumb idea of what human beings like — which is good for Twitter and its financial co-conspirators and bad for everything else.
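If ‘power law’ rings no bells, here’s a toy sketch of the rich-get-richer dynamic. It is emphatically not Twitter’s actual algorithm (none of us get to read that); it’s the textbook preferential-attachment process, with made-up numbers, that algorithmic foregrounding amplifies.

```python
"""Toy model of rich-get-richer: preferential attachment.

Not Twitter's algorithm -- just the textbook process that produces
power-law skew. All parameters here are invented for illustration.
"""

import random
from collections import Counter

random.seed(1)

# Urn trick: `edges` holds one entry per follow, so picking a uniform
# random element of `edges` picks an account with probability
# proportional to its current follower count.
edges = [0, 1]        # two seed accounts, one follow each
n_accounts = 2

for _ in range(200_000):
    if random.random() < 0.05:    # occasionally a brand-new account joins
        edges.append(n_accounts)
        n_accounts += 1
    else:                         # usually, follows go where follows already are
        edges.append(random.choice(edges))

ranked = sorted(Counter(edges).values(), reverse=True)
top = sum(ranked[: max(1, n_accounts // 100)])
print(f"{n_accounts:,} accounts; top 1% hold {top / len(edges):.0%} of all follows")
```

Hand a feed-ranking algorithm the job of surfacing what’s already popular, and this skew is precisely what it feeds back to you.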

But two years from now, no one will care about this particular change to Twitter’s model, just like no one really cares today about the ‘Moments’ non-feature, or the fact that advertisements interrupt our Twitter feeds with increasing frequency and unavoidability, or the company’s decision to cull the ranks of third-party app developers and choke out its app ecosystem. Twitter’s interference with our feeds’ chronology isn’t the end of Twitter because Twitter has been worsening in so many ways for years and years…and you still use it, you still spend hours and hours a day staring at this stupid feed.

The #RIPtwitter outcry is blame-shifting and excuse-making by a small but loud group of addicts, most of whom once knew how good life could be without a pile of pseudosocial non-thought accumulating on their screen, but who are now incapable of imagining life without it. Now it’s a big deal to ‘go offline’ for a month. Now it’s a big deal to take a month or a year thinking about a work of art before sharing your ‘take.’ Today you’re gonna spend more time thinking about the next iterative step toward shitty irrelevance by a ‘social’ network than about the fact that the radically activist Roberts court just put an unprecedented stay on Obama’s sweeping clean energy EPA directives, potentially rolling back our nation’s response to the Paris accord by years.

This is the deep problem: Twitter is a service of deeply questionable value, but you’re addicted anyway. The same goes for Facebook. I’m willing to bet real money that nearly everyone who’ll ever read this has spent more time complaining about cosmetic changes to these services than wondering how they’d live (well) without them. The problem isn’t the existence of the drug. It’s your decision to stick it in your vein — and the multibillion-dollar business that depends on your addiction for its survival. (It’s like our political parties, in that way.)

Twitter will go on, and its value to users will decline, and eventually some ‘revolutionary’ alternative will pop up, make a lot of money, and repeat Twitter’s choices because Money Says So.

Meanwhile, stop worrying about whether Twitter is well or poorly off, and ask yourself whether you are well or poorly off, and what you can do to improve the situation.

Perils of Internet microfame, stanza one billion.

You see this arc over and over in the over-30 set — the generation that came of age without ubiquitous Internet:

Someone more or less good with words (and usually bad with people) gets a taste of Internet notoriety and accumulates a small but devoted following. As his voice grows confident and his identity becomes complexly bound up in his ongoing Internet performance, his online persona becomes an extraordinarily rich character. This phase can last a couple of years. It is (in my case, it was) a good time to be online. The voice comes easily — it’s improvising in character, and autobiography is permitted, so there’s a deep well of character to draw on.

He’s very productive during this time. His best work.

But microfame is addictive, particularly for academics and writers, long unaccustomed to the fast enthusiastic feedback cycle and fast-moving ‘social’ dimension of online interaction. And online life is a magnet for sociopaths, troglodytes, and the socially malformed, who might value the distance and pseudonymity of the Net for normal healthy reasons but who are nonetheless a huge drag for everyone else.

Tender souls who’ve gotten a taste of microfame quickly harden themselves against what they take to be unjust or unkind attention. They shut off comments sections, no longer deign to discuss what they’ve written, and withdraw into their personae — tending toward self-aggrandizement and self-parody. Myopia.

They always get much less funny in the process. That’s the most predictable part.

This isn’t just a matter of losing their hunger. Most of these folks never ‘make it big,’ they just get a slightly higher dose of microfame. I’ve come to believe that the quickness and finality of this transformation — which has turned a hell of a lot of once-interesting human beings into petty, bitter, contemptuous assholes over the last 15 years — is largely a function of the destabilizing feedback cycles built right into the blog medium (and its online-magazine descendants).

(Instead of naming the assholes I’m talking about, I’ll mention one semifamous blogger-journo who’s avoided this trap: Josh Marshall of Talking Points Memo. But I’m sure you can come up with your own examples — and no, Andrew Sullivan doesn’t count. He was in the game long enough to make it out the other side, and his relatively open-access approach fortunately mitigated his horrifying tendencies. To an extent.)