The Average of Our Humanity
The race towards a mediocre middle has been on for a long time. AI is just accelerating it.
When you compete with someone, you become more like them. The longer you compete, the more you resemble each other. Eventually, you turn into clones of each other. It doesn't matter who starts the competition, or even whether the decision to compete is taken consciously. Market forces may push you into competition with someone or something, or consumer and viewer expectations might. The end result remains the same. Competing parties lose their identity to each other the way victims of a vampire's bite lose their humanity by becoming vampires themselves.
Facebook didn't have a feed architecture until it started competing with Twitter. When it got one, it became a little more like Twitter. Twitter, in turn, enabled the Like function in the form of the Heart button. Facebook then got the Share button on posts. Eventually, all that separated these two formerly distinct platforms was the fact that tweets could only be 140 characters long. But now that too is gone. Now, for all practical purposes, the two social media giants do the exact same thing and nothing distinguishes one from the other.
Wix and Squarespace were two website building tools that competed with each other, adding more and more cloned features. Now there is hardly anything that marks them as different.
Coke and Pepsi had distinct tastes until Coke started competing with Pepsi and made itself more syrupy. Today, they both taste pretty much the same.
Hindus took pride in the notion that they were peaceful and civilised. They criticised Islam for being violent. Now, after years of competing with Islam, there is hardly any difference between the two as far as advocacy for violence goes.
Who you compete with changes you, and in time, defines you.
In the time of AI, we are at a junction where humans are being asked to compete with machines to keep their jobs. We are told that unless we get good at doing what machines do (with the help of machines, of course), we will be left behind. Machines, in turn, are being made to look and act like humans. Where this will lead is anyone's guess. But if there is a zone of mediocrity between humankind and machinekind, both would seem to be headed there. The ethos of technology has long been pushing us towards that no-man/no-machine zone. Machines try (with the aid of people) to learn how to act more human, and humans try (with the aid of machines) to look and sound more mechanical. The catastrophic middle has been approaching for decades.
I think that the thing that makes you you should not be lost. Compete all you want, but do not compromise your core. To protect your core, however, you first need to know what it is. The ones who lose it are the ones who don't know what it is or where it is. If you don't know who you are, you might give away that part of yourself without a second thought. So put time and effort into learning yourself, into learning what makes a human a human.
We already have a domain of learning that attempts to breach this veil of darkness. It is, quite aptly, called the Humanities: a set of disciplines about being human, and about living with humans in human society.
However, since we live in systems where survival depends on feeding industry, we don't do a great job of teaching any of it well. There is a hierarchy of respect, with the Science Stream at the top, Commerce in the middle, and the Humanities at the bottom, but it has scarcely anything to do with the disciplines themselves. It has more to do with employment and social value.
People don't go into the Science Stream because they are interested in science. They do so because it is a stable pathway to employment, which in turn is a stable pathway to dignity and respect (and probably marriage) in society. Many of these people, despite acquiring the skills required to perform adequately in a scientific career, still fail to develop a scientific temper and can be seen publicly advocating obsolete and toxic traditions and pseudo-scientific outlooks. They continue to subscribe to discriminatory practices, religious dogma, and bad politics. None of it has scientific validation, of course, but since they have acquired a scientific qualification, they use it to justify their beliefs and practices. I am sure you have come across at least one nitwit online who says that his favourite superstition must be scientific because a scientist said so. This is obviously not how science works, but our nitwit in question neither knows nor cares.
Because of this twisted system of validation, we have also convinced ourselves that science is just skill. Many of us seem to think that to know how to do something is science, even if the ability is not accompanied by an understanding of the process and the principles that make the process possible. We have made know-how primary and understanding secondary.
Perhaps this is why, when we discuss the ability of AI to do something, we are happy to also say that AI is now as intelligent as human beings are. Our twisted system has brought us to the point where we don't think of human intelligence as anything more than skill and know-how. That is, after all, what we have defined as intelligence in our society for generations now.
When we think of memorisation and reproduction of memorised information as markers of intelligence, no wonder we start thinking of AI as intelligent. When we do not include originality and radical new thinking as defining characteristics of intelligence, no wonder we feel threatened by AI when it does the things we have only seen "intelligent" people do.
Our competition with AI, if it is based on these criteria, will take us deeper into this faulty way of thinking. Before long, both humans and AI will stand at a point where it all averages out and we are both left standing in each other's mediocre shadows.
Science fiction author Ted Chiang (of Arrival fame) wrote in The New Yorker a few months ago about how the conception of art as mere skill drives a lot of "AI art" enthusiasm:
"Many novelists have had the experience of being approached by someone convinced that they have a great idea for a novel, which they are willing to share in exchange for a fifty-fifty split of the proceeds. Such a person inadvertently reveals that they think formulating sentences is a nuisance rather than a fundamental part of storytelling in prose. Generative A.I. appeals to people who think they can express themselves in a medium without actually working in that medium. But the creators of traditional novels, paintings, and films are drawn to those art forms because they see the unique expressive potential that each medium affords. It is their eagerness to take full advantage of those potentialities that makes their work satisfying, whether as entertainment or as art."
If you have paid any attention to governmental attempts at improving the lot of young people, you will have heard of "skill development" more than once. While it is nobody's case that skills don't matter, we should remember that monkeys, machines, and dolphins can all be trained to have skills. There is a reason we speak of training AI instead of educating it.
It may be an inevitable fact of life in a poor country that one needs skills to survive. But judging one's value with skill as the only scale diminishes us. In the age of AI, as skills become cheap, perhaps the value of being human, and of being able to think like a human, should be kept out of the market.