
Tom Croce

Knowledge of Good and Evil

A girl with binary code projected onto the left side of her face. Photo by Cottonbro Studios

Children don't inherently know whether something is "good" or "bad." With food, for instance, they only find out after tasting it: either they like it, they eat too much and feel sick, or they don't like it at all. It may sound trivial, but behind it lies a uniquely human curiosity. Perhaps some animals experience it too, but theirs is not driven by reason.

A child always assumes that whatever is in front of them should be tested, tasted, or tried before they can know if it's good or bad.

When Jesus said,¹ "unless you turn and become like children..." perhaps he meant "innocence." The passage is interpreted in various ways depending on the priest giving the homily, but if my reading is correct, it is precisely this "judgment" of good and evil that comes into play.

Innocence sometimes fades when we begin judging people as good or bad. We then treat them better or worse based on factors like whether they are attractive or unattractive; slim, athletic, or overweight; male, female, or otherwise; rich or poor; Caucasian or of another ethnicity; a likable colleague or someone who wronged us.

And yet it isn't always the others who are in the wrong, even when they've wronged us. A child knows how to forgive because they simply don't attach weight even to bad things. What is good or bad, beautiful or ugly, they learn only through experience. It's "human" to use intelligence in a somewhat "lazy" way: learning patterns and routines simply to make life easier.

In a way, we're all like children playing with a new "tech toy." Yet, innocence is no longer ours.
For a child, the world is a reality to experience without filters, whereas for adults, experience has already produced biases and shortcuts—practical adaptations.

This leads to selectivity, which can harden into extreme forms: intolerance, racism, sexism, extreme feminism, selfishness, elitism, and so on. Simply recognizing that a person with darker skin is no different from you requires a mental effort that doesn't just turn the gears of your brain (sometimes dusty, sometimes perfectly oiled) but must also break through the shackles and structures we've unconsciously built to avoid effort, to avoid risk… to avoid suffering, to avoid eating what harms us or what we don't like, to avoid fear.

The result is that, in the end, we become afraid, even of disturbing that false, unnecessary sense of security. We also become aggressive and unkind, toward others and toward those like us who put that security at risk.

AI invites us to explore, but also to question whether we can truly maintain such an open approach. In other words: can we be curious without losing sight of ethical considerations? Can we avoid projecting our own fears and beliefs onto the technology itself?

We ask ourselves: should we trust technology or remain wary? A genuine alarm needs to be based on concrete data, not speculative anxieties. But how many of us can make this distinction?

And ChatGPT? What does it have to do with this? Nowadays, it's always involved.

I've been thinking about how all of us—except perhaps those working in the field and the enthusiasts/experts—are like children in front of a mysterious animal. We play with it, exploit it a little, without fully understanding its nature—let alone the dark, future implications of this Frankenstein monster of the new millennium.

Science fiction seems like just science fiction, but for those who treat it as prophecy it becomes a justification for their fears and anxieties about AI: from HAL 9000 to Skynet, popular culture has inundated us with scenarios in which AI becomes a threat, if not an outright nemesis. These are fantastical projections, yet when taken literally they end up shaping the way we perceive AI today.

Yet being like children here could save us. Using such a tool simply and superficially, without worrying about so-called "AI integrations" in augmented reality, in consumer technology, or in the mysterious musings of those who know how to waste public money on projects with no clear direction: this kind of "carelessness" helps us remain like children, free in action and free from prejudice, not bound to a technology that should only be a tool but instead enslaves us.


  1. Gospel of Matthew 18:3

#AI #curiosity #ethics #innovation #responsibility #technology