I find LLMs (“AI”) fascinating. I haven’t been this excited about new technology since I discovered the internet. I am super interested in how they are changing the way we access information – admittedly, not necessarily for the better. I love the interactive interfaces.
But one thing I love less is the way LLM productions crop up all over the place, somewhat uncontrollably. “AI summaries” of search results are one thing. I actually quite like those: they’re clearly marked, usually a decent synthesis, and a good “overview” before diving into the search results themselves. But do I need a Quora AI-bot answer to the question I clicked on to look at? (Not that Quora is the highest-quality content on earth these days; it has clearly fallen into the chasm of infotainment.) And of course, web page after web page filled with AI slop, and invitations in pretty much every tool we use to let “AI” do the job for us.
Which brings us to what irks me the most: humans passing off unedited and unreviewed LLM productions as their own. You know what I mean. That Facebook comment that clearly was not composed by the person posting it. The reply to your WhatsApp or Messenger message that suddenly gives you the feeling you’re in a discussion with ChatGPT. This is on another level from getting Claude to write your job application letter or craft a polite and astute response to a tricky e-mail, or using whichever is your current LLM of choice to assist you in “creating content”. Slipping “AI stuff” into conversation without labelling it as such is, in my opinion, a big no-no. Like copy-pasting without attribution.
As we use LLMs to create content for us, and also to summarise and digest that same content for our consumption, we’re quickly ending up in a rather bland “AI echo chamber”. I have to hope that enough of us will not be satisfied with the fluffy knowledge this leads to. That writing our own words and reading those of others will remain something we value when it comes to making sense of the world and expressing what it means to be human.