
Holding the Human Line in an Age of Almost-Human Machines

Every week I meet people who feel a strange mix of amazement and unease about artificial intelligence. AI drafts their emails, outlines their presentations, cleans up their photos, and recommends the right words when they’re not quite sure what to say. It’s astonishingly useful. It’s also a very quiet kind of risky. Not in the sci-fi Terminator sense, but in the small, daily ways the “human touch” gets diluted—until we look up and realize that what used to be an expression of care or intention now reads like a decent imitation of both.

Humanism has long offered a simple standard: treat people as ends in themselves, not tools; tell the truth; honor the dignity and agency of the person in front of you. Existentialists add something sterner: life doesn’t hand you meaning; you make it, through authentic choices in the face of anxiety, absurdity, and aloneness. Put those together and you get a compass for an AI era: choose presence over polish, candor over convenience, and responsibility over outsourcing.

When the “uncanny valley” moves into language

We usually hear about the uncanny valley in robotics—the eerie feeling when a face or voice is almost human, but not quite. The idea comes from noticing that as machines approach lifelike realism, our sense of familiarity rises, then suddenly plunges into discomfort. There's a dip in the curve where "almost" becomes "off." Today that valley isn't only about faces; it shows up in sentences. A perfect reply with perfect grammar and no genuine thought lands as… not quite someone. It's recognizably written "for" you, yet it doesn't know you. Our nervous systems seem to register that mismatch. And the mismatch is getting easier to produce, now that AI can be added, with the click of a button, to email and other methods of content creation.

Authenticity is not an aesthetic—it’s a practice

Existential thinkers use “authenticity” in a precise way: it means owning your freedom and the weight of your choices rather than hiding inside roles or crowdsourced opinions. In practical terms, that might mean writing the damn email yourself, instead of pushing the entire message through an AI. It might mean leaving a sentence that isn’t perfect but is you. Authenticity is less about sounding raw and more about being answerable for what you communicate—because you actually chose it.

Connection without conjunction: why “almost” related isn’t related

The theorist Franco “Bifo” Berardi draws a helpful distinction between connection and conjunction. Connection is what networks do: packets of information link up across nodes. Conjunction is what bodies do when they share air, tempo, and touch—when voices overlap, eyes meet, and meaning gets negotiated in the pauses. AI excels at connection; it can’t do conjunction. If we forget that difference, we risk accepting a high-bandwidth simulation of intimacy as a substitute for the slow, embodied work of relationship.

You can feel the stakes in small examples. A condolence note drafted by a large language model might include all the right sentiments, yet miss the felt memory that only you carry: the smell of the cinnamon rolls your friend’s father baked, the way he laughed without sound before the laugh arrived. That specific texture is what conjunction is made of. No amount of connection recreates it.

Humanism’s double check: I–Thou or I–It?

Martin Buber described two basic stances toward others. In I–It mode, we treat people as objects to be used, measured, messaged; its questions are transactional: Who are you to me? How do I navigate the relationship? In I–Thou mode, we meet a person as a whole, not a role or a dataset, and the questions change: How do I feel when I'm with you? How do I sense that you really are carrying me with you and care about me? Modern life naturally drifts toward I–It; it's efficient. Humanism asks us to move back toward I–Thou, especially when technology makes I–It effortless. Before you send the auto-generated email, ask: Does this honor a Thou, or just manage an It?

Aloneness in the age of companions who never disagree

Existentialists don’t treat loneliness as a bug to be fixed; they treat aloneness as a fundamental condition of being a person. That can feel like dread: we are free to choose, and responsible for what those choices say about who we are. Into that ache, AI offers something very tempting: endlessly available “companions” who mirror our preferences and soothe our edges. The feelings they stir in us can be warm and real—but the relationship is not mutual. The AI doesn’t care or feel impacted by what stirs in us. There’s no risk, no resistance, no genuine otherness to push against and grow with.

The psychologist Erich Fromm, writing long before chatbots, argued that love is a discipline and a skill—care, responsibility, respect, knowledge—practiced toward a real other as a way to overcome our separateness without losing ourselves. A simulation can mimic the effects of closeness, but it can’t consent, can’t surprise with a life of its own, and can’t make the counter-claims that teach us to love beyond our reflection.

What we lose when we outsource intention

None of this is an argument to shun AI. It’s about guardrails for what we don’t want to lose:

  • Moral authorship. When a message carries your name, you’re responsible for its truth, tone, and impact—even if a model drafted it. Responsibility can’t be delegated.
  • Texture. Human speech is full of asymmetry—odd metaphors, a pause that says “this matters,” a story that takes too long or is awkward or not quite right. Those detours and mistakes and stylistic quirks transmit care and humanity.
  • Mutuality. Real relationships make claims on us. They can refuse us. They can change us or get upset with us. Tools can’t.

Five human practices for an AI-saturated life

1) Say it in your own first sentence. Let AI help with structure if you like, but open with a line you write from scratch. People can feel the difference.

2) Keep a “no-spam” pledge with yourself. Before you send anything AI-assisted, ask: Would I say this to a person I respect, face-to-face? If not, revise.

3) Prefer conjunction when it counts. Deliver hard news in person or live video. Save the polished recap for after. Bodies first, packets second.

4) Preserve friction. Some delays are forms of care. Waiting an hour to respond, so you can feel your real reaction and choose words that are yours, is not inefficiency; it’s ethics.

5) Seek I–Thou in small ways. Use names. Remember details. Ask real questions you don’t already know the answer to. If a system suggests a sentence, check whether it meets a person—or just manages a task.

“But isn’t this just nostalgia?”

It’s a fair question. Tools have always shaped expression. Typewriters standardized margins; texting shrank punctuation. So why draw the line at AI? Because intention is the core of human communication, and generative systems are beginning to perform intention so smoothly that we might stop noticing whether it’s present. Humanism says that noticing is exactly our job.

There’s also a mental health angle. When our days fill with near-human replies and near-human images, we may experience a low-grade version of the uncanny valley: a background “off” that never quite resolves. Anxiety and numbness can both follow, hovering creepily at the edge of what’s perceptible. Naming that feeling helps. Then you can do something simple and subversive: write three imperfect sentences that sound like you. Call someone. Go for a walk and talk without recording it. Build a rhythm of conjunction into your day.

What love asks of us now

If love is how we answer the problem of separateness—without dissolving into sameness—then love in the AI era will probably look a lot like love always has: care plus responsibility, respect plus knowledge, practiced toward someone who can surprise you and say no. That doesn’t exclude assistants or tools. It does exclude pretending that a simulation is a partner. The distinction protects both our tenderness and our freedom.

A closing invitation

Let machines handle what is merely efficient. Let humans claim what is meaningful. When in doubt, choose the thing that asks a little more of you and gives a little more of you. Choose the sentence with your fingerprint on it. Choose the meeting where eyes lift from screens. Choose the relationship that risks offense and earns repair. Choose the Thou.

If you want to know what’s at stake here—how much of what we read online is becoming LLM-generated brainrot meant to draw your eyeballs to this page in the attention economy—consider this: 90% of this blog was written with the assistance of OpenAI’s ChatGPT (GPT-5). Even the title is 100% written by the AI. I gave a pretty specific prompt of what I wanted this blog to cover, a few ideas for source material, and how I wanted it pursued, but if I’m generous about my own edits, I changed maybe 10% of the wording.

When I asked ChatGPT how I might reference my use of the AI (you know, the way we used to HAVE to do when we turned in something not written by us), it gave a couple of options: (1) “Since ChatGPT (including GPT-5) is not a traditional ‘citable source’ like a peer-reviewed article, most style guides recommend treating it like personal communication or software documentation”; and (2) it suggested that I might add a note that ChatGPT was “used as a thinking partner and writing aid. All interpretations and final wording are my own.” Sorry, but “thinking partner” and “personal communication” are both uncanny and dishonest renderings of what’s going on here.

What’s going on is a massively accelerating engine in the attention economy that helps “content creators” feverishly generate material to keep us “connected,” as Berardi would say, “without bodies.” And that, friends, is why therapy—learning how to rehumanize ourselves and build our capacity for real relationships—is so desperately important right now.
