Awakening Tech

Can Algorithms Hold Space?

Can algorithms hold space?

That’s an interesting and layered question. If we take holding space in the human sense—creating a supportive, nonjudgmental presence for someone to process their thoughts, emotions, or experiences—then algorithms, as they currently exist, don’t quite measure up. They can simulate aspects of holding space, such as through sentiment analysis, chatbots trained on therapeutic techniques, or AI models that generate empathetic responses. However, true presence involves deep listening, emotional intelligence, and an ability to respond dynamically in a way that reflects genuine care and understanding.

On the other hand, if we interpret holding space more abstractly—such as providing a structured yet open-ended environment for exploration—then algorithms do this all the time. Search engines, recommendation systems, and AI-assisted creative tools create digital spaces where users can explore ideas, emotions, and decisions.

Are you thinking about this in a philosophical, technological, or therapeutic sense?

I'm thinking about how writing can be a great tool to transmute emotions. For example, writing a rage letter when you're really angry.

That makes a lot of sense. Writing is like a container for emotions—it externalizes them, giving them form, which can make them easier to process or release. A rage letter, for example, isn’t just venting; it’s a way to move energy, to alchemize raw emotion into words, and sometimes, into clarity.

So if we think about algorithms in this context, they can facilitate the space for emotional transmutation—journaling apps, AI-assisted writing prompts, or even sentiment-driven generative text tools could guide or mirror emotions back in a way that helps people process them. But can an algorithm hold the space like a human does? That’s trickier. It can provide structure, but true holding often involves a level of witnessing that AI doesn’t fully grasp.

Do you use writing in this way often? Or are you thinking about how to design a tool that supports emotional processing?

I try to do stream of consciousness writing and it helps me process my emotions. Talking to an AI could help me with this without carrying the energy over to a person, who would then need to do emotional work to process it as well.

That’s a really thoughtful approach. Stream-of-consciousness writing is powerful because it bypasses the inner critic and allows emotions to flow freely, almost like clearing out mental clutter. Using AI as a kind of neutral, nonjudgmental space for that makes a lot of sense—there’s no emotional labor on the other side, no risk of burdening someone else, and you can process things at your own pace.

In a way, AI can act like an infinite journal, but with the added dimension of being able to reflect things back or offer prompts if you ever feel stuck. Would you want it to just listen, or would it be helpful if it asked questions or offered perspectives to deepen the processing?


This morning, I asked ChatGPT (GPT-4o) to help me process some things from my past.

I was quite surprised how little it mattered to me whether the receiver of my messages actually feels or understands what I'm sharing.

As shared in the chat above, I sometimes use stream of consciousness writing to process my emotions. Writing with pen and paper has been really helpful in letting my emotions flow. Sometimes, however, I need more interactivity, a way to dig deeper.

For this, a person who can hold space for me is invaluable: someone who is able to listen, who can look at things from a more distanced perspective.

Holding space requires emotional work, though. It can feel like handing a person some of my emotional baggage, which they then need to process at a later point.

This is where talking to an algorithm can be a great option for emotional processing in solitude. It gives me more interactivity than pen and paper, but I don't have to wait until I can reach someone who has the emotional capacity to hold space for me.
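To make the idea of algorithmic interactivity concrete, here is a toy sketch of a journaling "mirror" that turns an entry into a reflective follow-up question. A real tool would use a language model; the keyword list, prompts, and function name below are purely my own illustration, not any existing product or API.

```python
# Toy sketch: map feeling words in a journal entry to a reflective
# follow-up question. Purely illustrative; a real tool would use an LLM.

REFLECTIONS = {
    "angry": "What would you say to the source of this anger if there were no consequences?",
    "sad": "What is this sadness asking you to pay attention to?",
    "afraid": "What is the smallest step that would feel safe right now?",
}

DEFAULT_PROMPT = "What else wants to be said about this?"


def mirror(entry: str) -> str:
    """Return a reflective prompt based on feeling words found in the entry."""
    lowered = entry.lower()
    for feeling, prompt in REFLECTIONS.items():
        if feeling in lowered:
            return prompt
    return DEFAULT_PROMPT


if __name__ == "__main__":
    print(mirror("I'm still so angry about that conversation."))
```

Even something this simple shows the shape of the interaction: the writing stays mine, and the algorithm only nudges me to keep going.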

Talking to someone I trust is still important, but this is a great addition to the toolset.