On opening night of Monktoberfest, I caught a quick photo of the four authors of the new Progressive Delivery book on a boat in Casco Bay – Heidi Waterhouse, Kim Harrison, Adam Zimman, and James Governor. I added it to a thread Heidi posted to Bluesky about the book launch.
I would have written alt text for that photo. For the most part I'm in the habit, and I do my best to think about others. But for a quick reply post? The mental overhead can outweigh the value of the reply, adding enough friction that I sometimes consider skipping it. With AI, it took seconds.
When I post photos to Bluesky, I use a custom prompt/GPT to write the alt text. It describes what’s in the image, how it feels, and what someone who can’t see it might want to know. It’s a really basic prompt and I’m sure there are a bunch more like it out there. Here it is for reference:
Create alt text for images posted to this chat. Review the image and provide descriptive text that helps a user with no or limited sight understand and experience the visual image. The description must fit in 2,000 characters.
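For the curious, a prompt like this could be wired into a short script instead of a custom GPT. The sketch below builds an OpenAI-style chat request that pairs the prompt with a base64-encoded image; the model name and token limit are my assumptions, not part of the original workflow.

```python
import base64

# The prompt text from this post, reused verbatim.
ALT_TEXT_PROMPT = (
    "Create alt text for images posted to this chat. Review the image and "
    "provide descriptive text that helps a user with no or limited sight "
    "understand and experience the visual image. The description must fit "
    "in 2,000 characters."
)

def build_alt_text_request(image_bytes: bytes, media_type: str = "image/jpeg") -> dict:
    """Assemble a chat-completion payload pairing the prompt with an image.

    The returned dict matches the OpenAI chat-completions message shape for
    vision input; pass it to your API client of choice.
    """
    encoded = base64.b64encode(image_bytes).decode("ascii")
    return {
        "model": "gpt-4o-mini",  # assumption: any vision-capable model works
        "messages": [
            {
                "role": "user",
                "content": [
                    {"type": "text", "text": ALT_TEXT_PROMPT},
                    {
                        "type": "image_url",
                        "image_url": {"url": f"data:{media_type};base64,{encoded}"},
                    },
                ],
            }
        ],
        "max_tokens": 500,  # roughly enough for a 2,000-character description
    }

# Stand-in bytes; in practice you'd read the photo from disk.
payload = build_alt_text_request(b"\xff\xd8\xff")
```

This only assembles the request; sending it and pasting the draft into the post's alt-text field is the quick tweak-and-go step described above.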
This sounds trivial until you realize how rarely it happens. Most images posted online have no alt text at all. Not because people don’t care about accessibility, but because describing an image takes mental energy that’s already been spent capturing and posting it. The moment has passed.
For me, AI removes that friction. I upload an image, the system drafts a description, I tweak it if necessary. It's a quick trip from Finder to AI to post. Suddenly accessibility becomes the default.
When I was more active on Mastodon's Hachyderm instance than I am today, this was built right into the image upload flow. One click. The AI-assisted descriptions made that norm easy to follow.
Now personal prompts and custom GPTs make this available anywhere. Don’t get me wrong: AI can’t replace the human eye and brain. It sometimes misses nuance, gets details wrong, can’t read tone the way you intended (or numbers and letters; but I digress). But it gives you a starting point.
Here’s what changes: when you add alt text consistently, you start noticing when others don’t. You see how many images float through your feed inaccessible to screen readers, meaningless to anyone who can’t see them. You realize how much gets shared with the assumption that everyone experiences it the same way.
This is what good technology does. It removes the small obstacles that keep good intentions from becoming consistent practice.
That’s worth automating.