OK, so, let’s say my foot is a bunch of uncommitted, unstashed changes, and this shotgun here is git reset -
Now, watch.
[Proceeds to blow own foot clean off.]
You see why you have to… be… carefu—
[Passes out from blood loss and pain.]
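(For the record, and purely as a sketch of the foot-gun in question — assuming the shotgun above is the --hard variety, with a made-up file standing in for the foot:)

```
echo "hours of work" >> foot.txt    # an uncommitted, unstashed change
git status                          # foot.txt shows up as modified

# The shotgun: discards every uncommitted change, with no undo for unstashed work.
git reset --hard HEAD

# The alternative (run *instead* of the above): park the change somewhere recoverable.
git stash push -m "keep the foot"
```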
A month or so ago, my wife and I went to Las Vegas for a long weekend.
Valentine’s Day is February 14 and her birthday is February 18, and I learned early on in our relationship that you do not try to combine them. But over the many (many) long (long) years, we’ve mellowed to the point of:
“Are we going out on Valentine’s Day?”
“What, are you nuts?”
…and:
“What do you want for your birthday?”
“Sleep.”
And so with the President’s Day weekend falling between the two, it seemed like the perfect opportunity to not go out on Valentine’s and also have a big bed with black-out curtains. There are also buffets, which is my love-language.
The hotel room was nice — other than the fact that it had Las Vegas right outside the door — and, when I wandered into the bathroom and saw the bidet, I made a sound I haven’t been able to reproduce. Heated seat, glowing blue light, and the kind of steely-eyed menace you get from people who have a job to do and know what’s involved in getting it done. My undercarriage is gettin’ pressure-washed!
Problem, though: the bidet has no instructions. At all. Normally this wouldn’t dissuade me from operating machinery, but here, um, I have some skin in the game. It can’t be that complicated, though, right? You could accomplish the same thing with a garden hose and some privacy. And, hey, there’s a language-agnostic remote control mounted on the wall.
OK, starting from the top. That’s clearly the universal icon for an ass, and it’s got a fountain pointed at it. Great. The basics. To the right appears to be a more… aggressive… flow.
Below that is… a woman? And a spray that covers everything from the back of her calves to half a foot behind her back. And to the right, an even wider delivery angle. I guess that’s the all-orifice hose-down option.
Next is either a button to spring Wolverine claws, or the dryer. Since I didn’t have to sign a liability waiver before sitting down, I’m going with dryer.
Fourth seems to be an adjustment of the… targeting arm? There is a little hatch in the back of the toilet bowl that a nozzle pokes out of when called upon, like the gatekeeper droid at Jabba’s palace. This button must move it back and forth, like an intimate, adjustable pop-up sprinkler. Below that is what I’m going to assume is the spray pressure control, and not a cigarette to smoke after you’ve had your full bidet experience.
Finally, last, there’s the same nozzle icon used previously, but it’s got… sparkles above it? Where the water normally goes? Maybe it uses carbonated water? Or, I dunno, coffee? Does the nozzle get replaced with a magic wand? And is that licensed from Hitachi? Maybe the cigarette idea is right.
But wait. I’ve seen that before. That sparkle iconography has become really common — it’s the generic graphic that literally every company on earth is using to indicate AI. Is this an AI bidet? What does that even mean? Is this how AI uses all that water? What was the training data? I shudder to think whose intellectual property rights I might be violating.
I push it. Nothing happens.
So it is AI.
Below that is apparently the volume control.
Some friends and I are planning on doing something stupid soon — we’re going to join the LA Marathon Crash Ride.
The Crash Ride is an unofficial bike ride along the route of the LA Marathon in the wee hours before the race begins. The Marathon organizers start closing off the streets at midnight or so the morning of, and by 1:00 or 2:00am, there are 26.2 miles of Los Angeles urban roadways that are effectively car-free. It’s a good start.
A long while back someone noticed this and started biking the route in the middle of the night. It’s become so popular that the LAPD now rides escort, just on the off-chance that a driver out at 3:00am on Sunday morning might not be the most considerate or level-headed person when faced with being stuck at an intersection while hundreds of bicyclists roll past.
I rode last year, and it was fun — there were boom boxes and people had art-bikes and there was the kind of crazy energy that you only get as you ride down the center of Sunset Boulevard three hours before dawn.
The first step in riding the Crash is figuring out when the Marathon actually is, and that should be easy, right? Right?
“Hey, Siri, when is the LA Marathon?”
On Friday, Apple spokesperson Jacqueline Roy, in a statement on the apparently year-long delay in the delivery of better Siri personalization and accuracy, said:
Siri helps our users find what they need and get things done quickly.
In fairness, I asked Alexa the same thing and it said:
Los Angeles Marathon was created in 1986.
At least my friends and I aren’t the only ones being stupid.
If, late Friday afternoon, your manager schedules an unexpected one-on-one for early Monday morning, in the same week that senior management has an all-hands to discuss the latest quarter’s financials, you probably don’t need to prepare a detailed break-down of what you’ve been working on.
I know that now.
My wife and I work out of the same office, and we bring our dog, Koda, in with us every day. My car doesn’t have upholstery so much as a thick, semi-permanent layer of dog hair all over the back seat.
Koda is very excited about going to work — significantly more than I am, probably because he gets to nap all day — and will charge out to the car as soon as Joanne and I are both wearing shoes and even feinting towards the door.
This is why we have a collection of videos of him being absolutely atrocious at limbo.
You can have a pretty good career as a technology pundit just by saying, at length and repeatedly, “That’s dumb.”
Because, of course, most tech is dumb, doubly so recently, as companies fall over themselves chasing the next smart-phone level hit. Bad products happen every day, but it takes something really special to be a bad paradigm shift. You have to really fail to change the world.
The list of such industry-wide failures is easy to rattle off: VR, 3D TVs, cryptocurrency, DAOs, web3, NFTs, the metaverse. Each of these inspired massive investment and breathless coverage and talk of a “before” and “after” without actually, y’know, doing anything. Sometimes the tech worked, sometimes not. Sometimes there were ethical or legal issues, sometimes not. Sometimes legs were included, sometimes not.
But what it comes down to is, there was no sane, general-purpose use-case for any of this stuff. Nobody wanted it, despite how easy it was to get swept up in the hype. They made movies about VR, for cryin’ out loud.
(Ten years ago, my friend Leonard took this picture of me using his Oculus development kit. He made it BY-SA so that it would be freely available to anybody, and thus showed up as the hero image on lots of articles about VR porn.)
All this, of course, brings us to “AI.”
(I’m going to use “AI” throughout, even though I understand that the term has been neutered enough to mean large-language models, diffusion models, and / or “generative AI” specifically. Poor “AI.” I remember when you were expert systems.)
The current AI hype has presented me with the same dilemma I faced with cryptocurrencies: there’s something that feels wrong about it. It’s not about the tech — I understand how blockchains work, just like I understand how LLMs work — but the use of the technology leaves me uneasy. My gut doesn’t like it, and my brain struggles to figure out why.
The conclusion I came to about crypto — my core problem with it, standing at least a little above all my other problems with it — was that it doesn’t produce anything. Nothing is created. It doesn’t add value. Oh, sure, it’s a very convenient way to pay for heroin or murder, but we’re almost ten years on from the creation of Ethereum (which is way more interesting than Bitcoin, and what I dabbled in), and the only core functionality that’s been added to the heroin and murder is fraud. The tech is neat, but the most popular, most primal use-case is to get other people to invest in it, and then sell when the price goes up. I don’t need a blockchain or distributed virtual machine to do that. The whole crypto “investment” “industry” is based on the transfer of money from one pocket to another, with no work-product in between.
I don’t feel comfortable participating in that system, like I don’t feel comfortable eating puppies or voting Republican. Maybe I’m missing out — fine. I don’t want to take other people’s money without offering something in return. I know that makes me a bad capitalist and / or con-man. I can live with that. Other people can do what they want with it — and will, absent significant consumer protection regulation — but I’ve decided it’s not for me. It doesn’t fit.
AI feels the same way. There’s something wrong with it. The software is neat, the results are interesting, but the fact that the entirety of tech culture and the tech economy is rushing towards it with arms wide and mouth open is deeply upsetting. I’m trying to figure out why I think so.
If you ignore the ethical, environmental, economic, and societal issues associated with AI — and you pretty much have to if you’re going to not immediately walk away in disgust — you’re still left with a few reasons to do the left side of the kombucha-lady meme.
AI, of course, is just plain-old, flat-out wrong sometimes. There have been lots of amusing cases you’ve surely seen — glue pizza! ha ha! — but the fact is, the inaccuracies are a fundamental aspect of the technology. You can tweak and adapt and refine, but an LLM is constrained by its training data, and the only thing big enough to provide the needed context is the Internet, and the Internet — if you haven’t noticed — is garbage. We’ve actually made computers less accurate. The Pentium bug was just a branding problem.
But it’s not these failures that bother me so much as the fact that they’re accepted, as a matter of course. If there’s a way for a company to jam AI into their users’ forward-facing orifices, they’re doing it, actual results be damned. Computers are supposed to be making things better, dude. When people get something involuntarily shoved up their nose, it’s annoying.
New rule: If your latest feature is prominently shown at the top of the screen, with callouts and animations and an ad campaign, you don’t get to call it “beta.” The tiny, grey-on-white labels under AI results are the cancer-warnings of the technology industry.
And it’s not just that you can’t trust what AI produces without review by someone who already knows, it’s that you can’t trust what anybody who uses AI produces either, all the way back up the chain. Someone in a lettuce factory forgets to wash his hands, and they’re recalling chef salads two thousand miles away. AI hallucinations are the aerosolized poop flecks in our information diet.
What level of that particular metaphor you’re willing to live with is, of course, a personal decision. Hell, the American body politic decided to elect poop flecks to the Presidency. Maybe they don’t care. Maybe they don’t have a problem with it. Maybe it will go away eventually.
But I’m firmly of the belief that AI is stuck with it, like a congenital heart defect that everybody is surprised to find out about after they’ve buried Timmy. Throwing more cycles and more (increasingly incestuous) data at an LLM isn’t going to produce the next obvious step in AI evolution: AGI, or artificial general intelligence, or “what we used to mean when we said ‘AI’ five years ago,” or actual consciousness, or the only way to move from a stochastic parrot to a thinking, reasoning, deducing, actual, honest-to-God brain. But then, I don’t run a money-losing $150B company that needs to convince people to keep investing, either.
All this, though, is largely beside the point. I just needed to include it as part of my membership in the take-industrial complex. I get paid by the word.
For me, the primary reason that AI sticks in my craw — modulo the elided ethical, environmental, economic, and societal issues; on a philosophical level — is this:
I like what I do, and I don’t want a machine to do it for me.
When I hear people talking about integrating AI into their workflows, into their work — especially when I hear it from coders — the primary benefit they cite is productivity enhancement: more stuff gets done faster. You can fly right past obstacles, without wasting any time understanding them. Judging by video blogs from programmers with gamer chairs and an on-camera mic arm, this is a good thing.
But I like coding. I like writing. I don’t want them to be completed more efficiently or more productively. It’s not always enjoyable to struggle through a bug, but finding it and fixing it are sublime pleasures. First drafts are supposed to be lousy and rambling and pointless and awkward — that’s my excuse anyway — and it’s fun to clean them up, and have ideas click together, and to turn them into something that, just maybe, communicates the intended meaning.
Why would I farm any of that out to a machine? In a society built on the relentless cycle of production and consumption — faster! faster! more! — the act of understanding, of caring, of craft is almost rebellious. Why would I have picked programming as my career if I didn’t enjoy it? Why would I sit at the keyboard and pound out this stupid post if it didn’t mean something to me? The goal isn’t to get out of doing the work, because the work is the whole point.
This is important, I think. My goal — maybe not your goal, certainly not my boss’ goal, perhaps ultimately not society’s goal — isn’t to just get to the deliverable as quickly and cheaply as possible. My goal is to enjoy it.
No matter what else AI is, it’s automation, and when you automate something, something you really care about, you have to contend with the existential notion that — and I’m not going to cite the source here, out of irony — the journey is the reward. I find meaning in the doing, not in the having. And, frankly, I’m desperate for meaning. I think a lot of people are.
Maybe this means I’ll be “left behind,” and I’m fine with that. It’s just another version of “Have fun staying poor,” which I am. Heck, maybe this is how things are supposed to go, one generation bewildered by another, with the old ways being cast aside by the newer, “more productive, more efficient” models and methods.
But I can’t help but think that understanding is timeless, or should be. Several years into my career, I was stumbling through my first Windows program — VC++ and MFC, baby! — and I came across Ellen Ullman’s seminal two-part essay, “The Dumbing-Down of Programming.” From the first part:
My programming tools were full of wizards. Little dialog boxes waiting for me to click “Next” and “Next” and “Finish.” Click and drag and shazzam! — thousands of lines of working code. No need to get into the “hassle” of remembering the language. No need to even learn it. It is a powerful siren-song lure: You can make your program do all these wonderful and complicated things, and you don’t really need to understand.
[…]
This not-knowing is a seduction. I feel myself drifting up, away from the core of what I’ve known programming to be: text that talks to the system and its other software, talk that depends on knowing the system as deeply as possible. […] It is like the relaxing passivity of television, the calming blankness when a theater goes dark: It is the sweet allure of using.
And from the second, with the older technology of “Wizards” replaced with “AI,” just to be as wildly unsubtle about it as possible:
Every [LLM], every [AI], says to the programmer: No need for you to know this. What reassures the programmer — what lulls an otherwise intelligent, knowledge-seeking individual into giving up the desire to know — is the suggestion that the [AI] is only taking care of things that are repetitive or boring. These are only tedious and mundane tasks, says the [AI], from which I will free you for better things. Why reinvent the wheel? Why should anyone ever again write code to put up a window or a menu? Use me and you will be more productive.
[…]
In the end, the overall “productivity” of the system, the fact that it came into being at all, was the handiwork not of tools that sought to make programming seem easy, but the work of engineers who had no fear of “hard.”
I didn’t agree with everything Ullman wrote back then. (The 27-year-old essay is wrapped around a proto-“Year of Linux on the Desktop” argument.) But her basic premise hit me hard enough that I took apart all the convenient MFC macros that just made things happen, to see how they worked, to understand them. This paid off repeatedly, as I came across situations and features and bugs that didn’t all look like nails, and that I couldn’t hit with the MFC-shaped hammer I was holding. I put in the effort and earned a more nuanced, complicated understanding of the tools I was using, and could manipulate them in more nuanced, complicated ways. It slowed me down, absolutely, but it also resulted in the product being a better program and me being a better programmer.
And I think that’s what alienates me from the rising culture of AI-driven coding: 10x yourself! Code in languages you don’t know! For systems you don’t know! Awesome! [Guitar riff.]
But, again, I don’t do what I do because I want to get out of doing it. I don’t want my work — an enormous part of my life, for good or ill — to be optimized away. I code because I like solving problems, and seeing those solutions be used by others. I write because I like to communicate, even if it’s just to myself. I like that this is the third time I’m typing “poop flecks.”
Other people have other goals, other considerations, other priorities. There are plenty of smart, caring people who just want to get something done, and code, to them, is a burden, an impediment, a roadblock.
But not for me. As I rattle towards the end of my career, I find myself actively opposed to the idea of “faster.” Faster breaks things — intentionally or not — like Production and governments and people. Does this insanely incautious and incurious industry really need to be faster? Does this society?
I sure as hell hope not, because I’m already running as fast as I can. I think it’s time to stop, turn around, and see what we might have left behind.
I’m a baseball fan, and baseball is absolutely lousy with statistical measurements. Plenty of people will complain about Sabermetrics — and I’m occasionally one of them — but they can help strip away all the noise and cruft and extraneous factors to really focus on what you’re talking about.
So I’d like to start doing the same for politics. There is so much (fully intentional) chaos and confusion and misinformation going on that it’s hard to take it all in, even if you actually want to. And you very much do not want to.
For instance, Trump is a first-ballot hall-of-famer for his “WPD” or “Watergates per Day” alone. How many Watergate-level, Constitutional crisis-inducing crimes and abuses of power is Trump committing every single day? Two? Three? Let’s say he’s got a 2.5 WPD in just his first month. Sure, he’s been getting a ton of corruption-assists from his team, and he’s likely juiced up on Adderall and narcissism, but his record is incredible by any measure. If he was actually President, imagine how high it would be. Imagine how high President Musk is.
Contrast this with Trump’s record 0.96 9PD, or “9/11s per Day,” that he achieved during COVID.
He’s really stepped down his game, and will stop at nothing to be the absolute worst. You have to not admire that.
President Musk gave a press conference in the Oval Office yesterday, flanked by a bored four-year-old and his son X Æ A-Xii.