No one’s ready for this

Sarah Jeong:

We briefly lived in an era in which the photograph was a shortcut to reality, to knowing things, to having a smoking gun. It was an extraordinarily useful tool for navigating the world around us. We are now leaping headfirst into a future in which reality is simply less knowable. The lost Library of Alexandria could have fit onto the microSD card in my Nintendo Switch, and yet the cutting edge of technology is a handheld telephone that spews lies as a fun little bonus feature.

We are fucked.

My views on AI stuff change all the time, so I’m just noting them here for the future. But! I can’t think of any creative or even barely useful application of generating things inside images besides “I can lie to you about motorcycle crashes and natural disasters faster than I can think.” (Apple Intelligence is different: you draw a circle and a lame image is generated for you. I would certainly judge anyone who used it for anything other than a placeholder image to be replaced later, but that sort of image generation doesn’t feel immoral to me.)

Removing things from images I’m weirdly OK with, too! I use that in Lightroom all the time, and it doesn’t feel like “lying” to me. Removing a traffic cone from a road to make the image more appealing is morally fine, I guess, because it feels like you’re focusing on the important details. I also don’t feel duped when I see that a photographer has brightened, lightened, desaturated, added a vignette, or removed small details from their photographs. It gives me the same feeling as when I see the before/after edits on a chunk of writing.

But this stuff right here, adding things that never happened to a picture, is immoral because confusion and deception are the point of the product. There are only shady applications for it. Looking at a lot of the examples here, I can’t tell what’s real without inspecting them: the crashed motorcycle has a bicycle tire, for example, but man, I would never look this closely in most situations.

So right now I think this stuff should be straight-up illegal.