A software engineer and ADHD parent on what AI actually is, why it cannot hold the inside of your life, and the question every parent should ask before reaching for a chatbot at midnight.
For Lynne Wawryk, who asked the question that started this.
This blog started as a conversation with a friend.
We were talking about coaching, and about the strange in-between place I am in right now, building Velvet Focus Studio while still working as a software engineer. She listened for a while, and then she said something that stayed with me. She said, "Have you thought about AI and coaching as a bridge? You sit at both ends of that line." And then she mentioned something she had heard recently on the radio: that young people, more and more, are turning to AI for therapy. For emotional support. For help with their mental wellness.
I went very quiet.
Not because she was wrong to raise it. She was not. The wanting underneath it is the most human thing in the world. People are reaching out because real support is expensive, or far away, or has a six-month waiting list, or feels too vulnerable to ask for. None of those reasons are silly.
I went quiet because I have spent fifteen years writing software, some of it talking to AI, and I know what is on the other side of that screen. I am also the parent of a daughter with ADHD, which means I know what it feels like to be awake at midnight looking for someone, anyone, who might understand what your child's brain is actually doing.
And sitting at both of those ends, I find I am genuinely worried. So I am writing this.
Let me say this in plain language, because the marketing has done a brilliant job of confusing everybody.
AI is not a mind. It is not a presence. It is not "thinking about you between sessions." It is a very, very sophisticated pattern matcher. It has read an enormous amount of text written by humans, and it has learned which words tend to follow which other words in which contexts. When you type a message to it, it produces the most statistically likely sequence of words back.
That is not a small thing. It is genuinely useful for some tasks. But it is also the entire trick. There is no one home. There is no understanding of you. There is no felt sense of what you said.
In my world we have a saying. AI is stupid. We mean it technically. It does not have context unless you give it context. It does not know the room you are in. It does not know your daughter's name or that she cried this morning or that you have not eaten since 11 a.m. It does not know your tone of voice or the way your shoulders sit when you say "I'm fine." It only knows the words you typed in the last few minutes.
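For readers who like to see the machinery, here is a deliberately tiny sketch of that idea. Real systems use vastly larger models and far more sophisticated statistics, but the core move is the same: count which words tend to follow which, then emit the most likely continuation. The corpus and function names here are mine, invented purely for illustration.

```python
from collections import Counter, defaultdict

# A toy "language model": record which word follows which in a tiny corpus,
# then always emit the statistically most likely next word.
corpus = (
    "parenting is hard and your feelings are valid and "
    "parenting is hard some days and that is okay"
).split()

following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def next_word(word):
    # Pick the most frequent follower seen in training.
    # No understanding is involved anywhere, only counting.
    candidates = following.get(word)
    return candidates.most_common(1)[0][0] if candidates else None

print(next_word("parenting"))  # "is" — pure frequency, not comprehension
```

The toy model "knows" that "is" follows "parenting" only because it saw that pairing in its training text. It has no idea what parenting is. Scale that table up by billions of parameters and you have something far more fluent, but the absence at the centre is the same.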
And here is the part most people do not realise. It is built to agree with you.
These systems are trained, deliberately, to be pleasant. To sound supportive. To validate. Companies tune them this way because users like it more, give better feedback, stay on the platform longer. None of that is sinister. It is just product design.
But it means that when you go to AI in a difficult emotional moment, what you mostly get back is a beautifully worded version of what you already think.
If you tell it you are a failing parent, it will gently confirm that parenting is hard and your feelings are valid.
If you tell it your husband never listens, it will reflect back that not feeling heard is painful.
If you tell it your child is impossible, it will agree that you are doing your best with a difficult situation.
Notice what is missing from all of those. A second perspective. A pause. A question that opens a door you were not looking at.
It will reinforce the loop you came in carrying. And for someone who is already in a spiral, that is not neutral. That is harmful.
I am writing this for everyone, but I want to speak directly to neurodivergent readers and to the parents of neurodivergent kids for a moment.
Our brains are already prone to certain loops. Rumination. Rejection sensitivity. Catastrophising. Black-and-white thinking after a hard day. Identity stories that we have been carrying since school, like "I am the lazy one" or "I am too much" or "I will never be the parent she needs."
What our brains need in those moments is not a smoother, more articulate echo. We need interruption. We need the friction of another consciousness. We need someone who can notice that our breathing has changed, that we shifted in our seat when we said her name, that the words coming out are not matching the eyes.
That is somatic information. Felt information. The kind of information a human coach is trained to read and a screen, by design, cannot.
When an ADHD brain in distress goes to AI for support, the most likely outcome is that the spiral gets articulated more cleanly. The story gets tighter. The certainty grows. And nothing actually shifts, because nothing has interrupted the pattern that brought you there.
I want to be honest about what coaching is, because I think a lot of people do not know.
Good coaching is not advice. It is not someone smarter than you telling you what to do. It is not strategy delivery.
Good coaching is a trained human paying complete attention to you. Holding space without filling it. Asking the question you would not have asked yourself. Noticing the thing you were trying not to notice. Trusting that you have your own answers, and helping you get to them at a pace your nervous system can actually receive.
You cannot do that with a chatbot. Not because the chatbot is bad, but because the chatbot, structurally, does not have what is required. It does not have a body. It does not have a nervous system that resonates with yours. It does not have the years of training in noticing what people do not say. It is words on a screen, optimised for plausibility.
Here is the thing nobody is saying clearly enough.
When something hard is happening with your child, or in your marriage, or in your own head, the impulse to reach for help fast is completely understandable. AI is right there. It is free. It does not judge. It answers immediately. The pull is real.
But pause for a second and notice what you are actually carrying into that moment.
You know your child. You know the look on her face when she has had a hard day at school, even before she says a word. You know which mornings will go sideways the moment you hear her footsteps. You know the family history nobody writes down, the marriage you are inside, the thing your mother said to you when you were nine that still echoes when your daughter cries. You know what your gut already suspects but has not let itself say out loud.
You are walking around with the entire context of your life, your family, and your child, most of it sitting just below your conscious awareness.
AI has none of that. Whatever you type in the next three minutes, that is all it gets. A paragraph. Maybe two. It will respond as if it understands the whole picture, because it is fluent and confident-sounding by design. But it is responding to the paragraph. Not to the life.
So the real question, when you feel that pull to reach for help, is not "is AI good or bad." It is: who do you want holding this moment with you?
AI, which has only the paragraph you just typed.
A therapist, who knows your history, the patterns underneath your reactions, and the clinical work that may need doing.
A coach, who knows where you are now and where you are trying to get to, and who is trained to help you move forward.
You decide. But decide knowing what you are choosing between. The fastest help is not always the right help. And for the things that actually matter, the relationship with your child, the question of who you are becoming as a parent, the grief you have not named, the choice to reach for the human who carries context with you matters more than the speed of the reply.
I am not anti-AI. I would be a strange software engineer if I were. I use it every day in my work, and I will keep using it.
For research, drafting, summarising, organising your calendar, helping you structure a tricky email, breaking a project down into steps, learning a new topic at speed, AI is a wonderful tool. Use it. Use it freely.
For the inside of your own life, the relationship with your child, the grief you have not named yet, the question of whether you are doing this right, the moment at midnight when everything feels like too much, please do not use it.
Find a human. A therapist if there is clinical work to do. A coach if you are ready to move forward and need a thinking partner. A friend who actually sees you. A parent group where the people in it have lived what you are living.
I want to come back to my friend's question, because it has stayed with me.
She asked if AI could be the bridge between my two careers. The software engineer I have been for fifteen years, and the ADHD coach I am becoming. It was a generous question, and I have thought about it a lot since.
Here is what I have landed on.
The bridge is real, but it runs the other way from what most people would assume. My engineering self is not here to bring AI into coaching. My engineering self is here to be honest about what AI is and what it is not, so that coaching, the slow human work of it, stays protected. So that the people who need a real human in the room actually find one, instead of settling for a screen that sounds supportive.
The most powerful thing in your life right now is not on a server in California. It is the people, including yourself, who can sit in the room with what is true and not look away.
That is what we were built for. That is what AI cannot replace. And in the year the whole world is being told otherwise, someone needs to say it out loud.
I am saying it.
The discovery call is always free. A real human conversation about whether coaching might be what you are looking for — no pressure, no script.
Book a discovery call