Suppose that, for better or worse, A.I. is going to advance the world in ways we can’t yet imagine. Alternatively, imagine that the promise of A.I. is currently more hype than substance, and that we are in fact headed for a surprise comparable to the dot-com bubble. As these are vastly different futures, it is wise to generalise our preparations. I propose two ways to do this:
First: We should dust off the old cautionary tales for a new generation.
Second: We should resist, wherever possible and with great prejudice, the temptation to sacrifice what remains of our cognitive competence to the electronic god of convenience.
Do not fight with a bot. Cut the head from its argument and two more grow back!
If you encounter a dopamine trap, do not accept its invitation to dine or dance, lest you return home to discover far more time has passed than you’d intended.
Think twice before crafting the golem. You are responsible for the havoc it wreaks.
Before you are two review sites. One only lies; the other is trying to sell you something.
Do you recognise the above allusions? While I’ve tinkered with them to discuss the internet and A.I., hopefully one or two will ring a bell. The first is inspired by the hydra Heracles faced in one of his labours. For each head he cut off, two more grew back! Have you ever done a pretty good job destroying the argument of an online opponent, only for two more commenters to chime in and soak up your time and sanity? Did you stop to wonder how many, if any, of these online strangers were human? Chances are (and these odds grow by the day) that they were bots: monsters put in your way as part of some strategy spawned in Silicon Valley, Texas, China, or Russia, rather than on Mount Olympus. Can you see a lesson in this old story for saving your time and sanity moving forward?
The second allusion is to the old wisdoms of the British Isles. There, people once believed in fae folk (elves, goblins, fairies) and the dangers they posed to children. One popular storyline tells how these fascinating fae would entice the unwary into their world. The story is delightful right up until the end, when the victim returns to reality, only to realise that what felt like a few hours was in fact many years, their loved ones having long since died or moved on. Replace “fae” with “phone” and the story is renewed. How many of us have looked up from our device to realise an important moment has passed by? How many, being drawn so easily through the time-distorting portal to the online world, are missing out on the final months of a grandparent, the formative experiences of a child, or the best years of ourselves and our contemporaries?
I’ll leave the last two allusions for you to consider in your own time. The point I have hopefully demonstrated is that it does not take much effort to adapt these old cautionary tales to the techno world currently enveloping us. By appealing to the human fascination with myths and monsters, they are more relatable (more memorable and provocative) than any infographic on digital hygiene, and this goes for all ages. Even children understand why Heracles should not cut the head from the hydra. Once that analogy is applied as a warning against arguing with bots, they will remember the lesson for the rest of their lives, just as people throughout history have retained the stories learned in their youth. Any cyber security business or local council could only dream that their online security training courses for the young and elderly would be so effective.
And so, with this in mind, I challenge you. Take some time to recall a fantastic story taught in your youth. It could be from the Greek classics, or Aesop’s Fables, or a nursery rhyme, or a folk tale from your native culture. If there is a picture book of any of these in your house, go pick it up. Now, see if you can replace the villain, monster, antagonist, or challenge from that story with technology. Phones, bots, social media, and A.I. are all fruitful options. Can you do this while maintaining the utility of the story’s cautionary lesson? It’s surprisingly easy, isn’t it? Fun, too. Memorable.
Of course, you are right to accuse me of being hyperbolic, though I am sure you get my point. The hazards and uncertainties driven by technological development will continue to accelerate, deceive, and multiply. Continuing to patch our knowledge and strategies with every update is unsustainable. When faced with such a cognitive workload, people will become overwhelmed, and this will ultimately undermine both their ability and their willingness to engage critically with an increasingly online and artificial world. And so, if knowledge alone is too tall an order, then we should complement it with the more general wisdom that has for millennia made its home and migrations in the human imagination. When dealing with the uncharted, never forget that ancient line: Here be beasties.
Wisdom requires a working brain. Stories, jokes and songs must be remembered if they are to be told, shared and sung. And yet, you might have noticed that it’s becoming increasingly difficult to remember the media you consume daily. Not long ago I met with an old Irish music teacher who likes to ask his students a simple question: “Can you tell me anything that has happened and when?” Can you believe that about half the students he speaks with cannot remember anything at all? It’s as if the streetlights that illuminate our awareness of the world around us and the histories behind us have been reduced to the glow of a mobile device.
People are giving up on maintaining cognitive competence. It’s perfectly understandable. The more you use A.I. to think for you, the better you get at using A.I. to think for you (and the better it gets at thinking for you). As more people do this, more compelling arguments emerge for why this is okay. If someone uses A.I. to generate novels and then calls themselves an author, it is quite right to view them as contemptibly ridiculous. They have violated, or at best misunderstood, the word “author.” And yet, at the time of writing, there is already compelling sophistry circulating online arguing that someone who does the same for songs is to be legitimately counted as a “songwriter.” Once these arguments gain enough momentum, the distinction between human creation and A.I. generation will be effectively dissolved, remaining only as an expression of elitism within the arts. To put it another way: as the world becomes increasingly concerned with product, and uncaring of process, appealing to the latter will be like arguing for a distinction without a difference.
But we must remain prejudiced against such arguments, for they encourage the sacrifice of cognitive competence to the electronic god of convenience. Such competence, be it the ability to hold one’s focus on a single task for long periods, or to develop our own ideas and creations without needing to prompt a machine to generate them for us, is good for humans. It is good for our brains and our spirit. For evidence of this, ask anyone who has never created anything in their life how they feel about that fact. Observe that they may well have filled the space made for creativity with consumption and cynicism. Ask one schooled in matters of the brain and memory what they recommend for staving off Alzheimer’s and other forms of late-life cognitive decline, and they will say “staying active.” Ask them to elaborate, and they will disqualify those activities we have come to collectively term “brain rot.” Perhaps you can already feel the symptoms. The more you use your phone, the more exhausted you become; the more exhausted you become, the more you need to relax . . . with your phone. This isn’t trivial. It’s significant.
Maps, lists, notes, contacts, calendars, birthdates. There was a time when these details were kept in mind. Now they are forgotten, no matter how many times one makes the drive, looks at the list, dials the number, or celebrates the day. Here, “forgetting” begins to blur into “exporting” to a device. Of course, no one will tell you that depending on memory is more reliable than the smart appliances used today (though, then again, how often has an electronic map led you astray?). But my point here is not dependability but competence: the ability to recollect and create on our own two feet. There is a difference between the carpenter who can build a chair and the customer who can only buy one. Unsharpened, the mind, our skills, and our spirit all dull. Soon enough, one has lost the ability to share stories and can only send the link forevermore. That such a point is deemed irrelevant by those who focus only on the product, without a care for competence, should be cause for concern. Perhaps for them, it is enough to be competent in A.I. prompting alone. Thus, the ability to prompt becomes the one skill to rule them all. Be on your guard. They will say A.I. is just a tool. But tools don’t eat brains.
Bio:
Christian Mauri, Ph.D. is a sociologist and author. His interests include the effects that precarity and cognitive workload have on the temperament of people and communities. He provides professional training to businesses and government departments across Western Australia, specialising in de-escalation and dealing with challenging personalities. In 2024 he was awarded Citizen of the Year for, among other community work, organising political events that connect people who would otherwise avoid meeting in person.
Originally published on: