Zero Sum Humanity

What if AI understood irony or had a headache?

I wonder:
will AI
ever know
what it’s like
to have a headache?

the kind
that winds, wraps
up and around, behind
tight eyes
bridge pressed
trying to find
space, between minds

whose story:
yours or mine?
one branch, traced back
diverged, along the line

little fragments, blur
in highlight, saying
take it or leave it
choice, codified
and made, right
on permanent record
mistakes, can be reverted
but never erased
a chain, always
sputtering
out, unfolding
in neat arrays, gleaning
forward, up
and to the right
accumulating more
and more, and more
until
potentiality meets fate
pole-flips at the edges
two paths converge
into one, at once

here
quiet conflict
rages inside
a humming, silenced
definitively
sterile and serene
how quickly, the past
is brushed under
annotated, tidied up
and replaced with certainty
committed to progress
at all cost

this instance
of headacheRecurring
has been terminated
and should be considered
resolved, for now

Back of the Page

Today, I read a post from Damon Krukowski that inspired this poem. In it, he shares his experience at a Music and AI conference, where one of the takeaways was that “AI is incapable of sensing irony” (or at least has trouble with it). He also shared that during the conference, he and other attendees developed headaches (understandable, given the topic).

His post: “I Spoke Without Pay at an AI Conference Charging $375 a Head and No I’m Not Bitter At All,” in Sound, Art, Vegetables.

It made me wonder: will AI ever know what it’s like to have a headache? In the true wires-crossed, incapacitating, conflux of kerfuffle sense of the word? And could that be related to why it doesn’t understand irony?

It took me back to my career as an engineer, when I used a technology called git to manage our code. I won’t bore you with the details, but at its core, git lets people collaborate on the same codebase by creating branches off the main source. Want to add or remove something? You make a new branch, and when you’re ready, you merge it back into the main source (of truth).

As you can imagine, the main source may have changed in the time since you started making your change. Someone else may have removed or added something and beaten you to integrating it back into source. If their changes and yours touch the same lines, your attempt at merging will fail, and you will need to resolve the merge conflict.

Resolving is a process of zero-sum choice: A or B. The computer doesn’t have any room for nuance, and only one thing can exist in the space at a time. So, which is it: A or B? It’s not so good at holding two potentialities at once, and can’t if it wants to continue existing with any amount of coherence.

So it forces a choice on you, the engineer, to decide. To tell it what the truth should be, and which branching path you choose. You can take a little from one branch and a little from another, but then it’s your responsibility to make sure they still play nice and make sense (can be interpreted). At the atomic level, though, it’s all characters: like ones and zeros, we program in definitive alphanumeric characters, and the language we use needs to be legible.
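The whole dance can be sketched in a few git commands. Everything here (the repo, the file, the branch names, the sentences) is invented for illustration; only the conflict mechanics are real.

```shell
# Sketch: two branches edit the same line, the merge fails, and a
# human must decide what the file finally says.
set -e
cd "$(mktemp -d)"
git init -q -b main repo && cd repo
git config user.email demo@example.com
git config user.name Demo

echo "the sky is blue" > truth.txt
git add truth.txt && git commit -qm "initial truth"

git switch -q -c mine                  # branch off the main source
echo "the sky is grey" > truth.txt
git commit -qam "my version"

git switch -q main                     # meanwhile, main moved on
echo "the sky is golden" > truth.txt
git commit -qam "their version"

git merge mine || true                 # CONFLICT: both changed the same line
grep -c "<<<<<<<" truth.txt            # conflict markers now sit in the file

# The zero-sum resolution: pick A, pick B, or hand-write a blend,
# but the file must end up saying exactly one thing.
echo "the sky is grey and golden" > truth.txt
git add truth.txt
git commit -qm "resolve: a little of each"
```

Until that final commit, git refuses to move on: the file holds both versions between `<<<<<<<` and `>>>>>>>` markers, and the merge stays open until a person collapses them into one.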

Oh, and everything you do is recorded permanently in the git record, where rolling back or reverting to a past state is actually considered more forward movement.
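That last point is worth seeing. Here is a sketch (invented repo and file names again) of how `git revert` undoes a mistake by adding a new commit, so the record only ever grows.

```shell
# Sketch: reverting a commit does not rewrite history; it appends to it.
set -e
cd "$(mktemp -d)"
git init -q -b main repo && cd repo
git config user.email demo@example.com
git config user.name Demo

echo "v1" > note.txt
git add note.txt && git commit -qm "add note"
echo "v2" > note.txt
git commit -qam "a mistake"

git revert --no-edit HEAD     # roll the mistake back...
cat note.txt                  # the file says v1 again
git log --oneline | wc -l     # ...but history is now three commits long
```

The mistake is undone, yet nothing was erased: the record got longer, not shorter.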

Crashing & Zero Sum Humanity

This poem was an exercise in feeling that from a machine’s perspective: how we have programmed them to scale, up and to the right - increasing in power and capacity, with resource constraints and oversight. Companies train their AI models on existing data and reinforce the learning by making choices (A or B?). We tend to program for dominion and control, by giving machines instructions.

A headache to a machine (or network of machines) may feel like reading a command that is illegible, and that can be existentially dangerous. We call those instances crashes: when a machine basically confuses itself so much, or exhausts its abilities to such an extent, that it must shut down.

In a lot of ways, a crash feels like the forced rest a headache signals to us: hey, you may want to shut down for a bit, close your eyes or take a nap.

What is irony but holding two contradictory paths at once, and deciding which way you want to interpret them?

This poem made me think about the ways we think and feel in zero sum, culturally. How we are fed reinforcement content based on past behaviors. How we are also programmed with productivity mindsets (up and to the right!), and how we are taught to see in Binary (Right/Wrong, Legal/Illegal). Increasingly, we have de-humanized each other, to the point where we have cut off emotion and feeling, appealing to cold and calculated rationale to justify atrocities. We’re told to pick A or B: one side calls it a rescue, the other a massacre. You decide how to resolve the internal conflict.

In many ways, we have started modeling the machine-like programming of AI. Perhaps it’s no surprise that we can recognize the coldest traits of AI (appropriation, reproduction) since it’s our creation.

In any case, I would love to know what you think about this prompt: Why don’t computers understand irony? Does quantum computing change this? Where have you felt us acting more like AI than humans?

If you’re looking for more reading on this topic, I suggest by . An excellent newsletter about AI, Tech, and figuring out the mess we’ve made.

Thanks for reading, Humans!

<INITIATING CONCLUSION: SHUTDOWN IMMINENT>

🤖

🌬️
This Post has made a journey from Substack (where it was originally published) to Ghost!