Product strategies don’t unravel solely because the logic is wrong. Many strategies fail because they misread human behaviour. We assume users are rational, needs are explicit, and decisions are logical. In reality, decisions—whether it’s downloading an app or signing off on a multi-crore enterprise deal—are driven by fear, guilt, pride, and the need for control.
This isn’t about adding “empathy” to conversations while building a roadmap. It’s about seeing emotion as the core driver of action.
When you grasp what people carry emotionally, your product lens sharpens. Your choices get clearer. You stop building what makes sense and start building what makes meaning.
I didn’t arrive at this by theory. I saw it unfold in a high-stakes space: elder care.
A Story About Guilt, Distance, and What People Are Really Paying For
I met Poulomi Bhattacharya, the founder of SilverGenie, at the Google for Startups Accelerator. She was building a healthcare network for seniors in cities like Kolkata and New Delhi. The idea: trained professionals would visit parents at home, coordinate care, and send updates to their children—most of whom lived in other cities or countries.
At first glance, it looked like a standard service model. High-touch, operationally heavy, but valuable.
But as they started raising funds, investor questions circled one theme: “Where’s the deep tech?”
Poulomi wondered if she should build an app. That’s what scale looks like, right? An app for seniors. A dashboard for families. Engagement metrics to show growth.
But the real work was elsewhere.
The seniors didn’t want apps. For them, the internet was WhatsApp, YouTube, maybe Facebook. Anything more—downloads, menus, logins—was foreign. Smartwatches and sensors didn’t work either. They forgot to wear them. Or charge them. They never wore them while sleeping.
Wearing a hearing aid restored a lost function. Wearing a smartwatch felt like being monitored. It wasn’t restoring anything. It was a tool for nagging questions from their children.
What they wanted wasn’t more tech. It was a human. Someone who knew their name, who showed up, who didn’t treat them like a list of ailments, but as people with histories, dignity, and preferences.
The startup understood this. Their network of trained professionals visited seniors at home, ran health checks, and arranged hospital visits. But this presented a conundrum: how could a fundamentally human-centric service satisfy investors looking for a “deep tech” play, and where was the scope for that tech?
The more we spoke, the clearer it became. There was another set of users who saw this system differently.
The people paying for this service—the actual decision-makers—were their children. Many of them were CXOs, professionals abroad, or people with demanding jobs in other cities.
They had built successful lives, raised families of their own, and now found themselves in a difficult position: they couldn’t return home, and their parents wouldn’t move. And so they lived with this steady, low-grade guilt. The guilt of having done well, having built a life and a career, and still not being physically there for their parents.
These weren’t indifferent children, living glamorous lives in the US or Mumbai, ignoring responsibilities. These were deeply concerned individuals who had simply run into constraints.
Some had tried live-in help, but that came with its own risks—untrained staff, occasional theft, and a lot of emotional discomfort. Others managed through neighbours or friends. But over time, it became clear that what they really wanted was someone on the ground who they could trust. Not just someone who would do the job, but someone who would understand what they were feeling.
What they were buying wasn’t just a care service. They were buying a surrogate. An emotional proxy.
SilverGenie had already picked up on this. With consent, they had installed discreet cameras in the homes of some seniors, particularly those living alone or with medical complications. These weren’t surveillance tools. They were comfort devices that kept the children connected to home. I asked the founder how often the children checked the video feeds.
She said, “All the time. Especially when the parent is unwell. We’ve had people write in saying they can’t sleep unless they know their father moved in the last hour.”
This changed our frame. It wasn’t just about who the app was for. It was about what the product was really solving for.
Once we saw the emotional reality of the users—children carrying the guilt of distance—the product strategy began to shift.
This wasn’t a data-driven play in the conventional sense. You weren’t going to build an app, track usage, and mine insights from feature engagement. The seniors weren’t going to feed you structured data. And frankly, you didn’t need that kind of data.
What you needed was an emotional signal.
Tech still had a role—but not as the star of the show. Its role was to be ambient and reassuring. Not to drive behaviour, but to settle anxiety.
So we started thinking differently.
Not: How do we get the parents to use the app?
But: How do we reassure children without requiring constant check-ins?
This is where the real product opportunity lay.
What if you could offer intelligent, simple alerts?
What if the system could say, “Your mother is up and moving around this morning, everything looks normal”—without the child needing to log in?
What if it could notify gently when something was off, but also include a line from the staff: “We just spoke to her, she’s fine. She was feeling a bit dizzy and we’re checking in again in two hours.”
What if reassurance was the primary UX?
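To make the idea concrete, here is a minimal sketch of what a reassurance-first alert might look like in code, assuming a hypothetical feed of motion events from the in-home cameras and sensors. The event shape, the quiet-hours threshold, and the message wording are all illustrative, not SilverGenie’s actual system.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical event shape: one record per movement detected at home.
@dataclass
class ActivityEvent:
    timestamp: datetime
    room: str

def compose_update(events: list[ActivityEvent],
                   now: datetime,
                   staff_note: str | None = None,
                   quiet_hours: int = 4) -> str:
    """Turn raw activity into a reassurance-first message.

    The default output is calm; the anomaly path always leads with
    a human line from the care staff, not the sensor data.
    """
    recent = [e for e in events
              if now - e.timestamp <= timedelta(hours=quiet_hours)]
    if recent:
        last = max(recent, key=lambda e: e.timestamp)
        return (f"Your mother is up and moving around this morning "
                f"(last seen in the {last.room} at {last.timestamp:%H:%M}). "
                f"Everything looks normal.")
    # Something is off: notify gently, and let the staff speak first.
    note = staff_note or "Our care manager is calling her now."
    return f"We haven't seen movement in the last {quiet_hours} hours. {note}"

# Example: one kitchen event an hour ago produces the calm message;
# an empty list produces the gentle escalation with the staff note.
now = datetime(2024, 5, 10, 9, 30)
print(compose_update([ActivityEvent(datetime(2024, 5, 10, 8, 55), "kitchen")], now))
print(compose_update([], now,
      staff_note="We just spoke to her, she's fine; checking again in two hours."))
```

The design choice worth noticing: the anomaly path leads with the staff’s voice, because what is being sold is reassurance, not telemetry.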
The moment the team saw the product as a guilt-offset layer, the brief changed. They weren’t building a “platform.” They were building a calm presence.
And even that had implications downstream: on who they hired, how they trained them, what kind of tone their communication carried, what response times meant in emotional—not SLA (service level agreement)—terms.
This understanding reached beyond the product into hiring and org design. If the offering was to act as an emotional delegate, then the people on the front lines couldn’t be junior ops executives or call-centre staff.
I suggested that they hire their relationship managers from the hospitality industry—people who understood this clientele, who knew how to offer quiet reassurance, and who could say with confidence, "We’ve taken care of it," in moments when it mattered most.
This wasn’t product-as-platform. It was product-as-peace-of-mind.
Even Rational Buyers Have Emotional Stakes
We assume enterprise decisions are rational, driven by ROI, pricing, capability.
They’re not.
Emotions run deeper in enterprise because decisions cross power centres. Identity, control, and status drive outcomes.
A startup I coached built an AI engine to detect hospital bill fraud. It flagged fake or padded claims faster than a human team. The rational value was clear: faster processing and lower leakage. The proof of concept with a major insurer went well.
Followed by a lengthy silence.
The head of risk, who had agreed to try it, began stalling. There were more feature requests. More dashboards. Endless tweaks. But no sign-off.
I asked the founders: “Did you consider what this product means emotionally to him?”
He had a 500-person team. If the product worked, it meant his relevance dropped. His control, his influence would be gone. They weren’t offering him a tool. They were threatening his identity.
He couldn’t say no outright. But he also couldn’t say yes. Not without undoing himself.
“We tried going directly to the CEO…” they said.
“Let me guess. He didn’t want to move without the head of risk’s assent?”
From the CEO’s perspective, the current setup offered plausible deniability. If something went wrong, the head of risk was accountable. But if the CEO bypassed him and signed off on an AI-led system—and it failed? That responsibility would be his. Why take that bet?
What if they reframed the product proposition?
Instead of replacing the decision-making, the tool could support it. What if they introduced a triage system—red, yellow, green indicators to suggest risk level? The final call still rested with the team. The model just helped them prioritise.
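A minimal sketch of what that reframing might look like, assuming the engine already emits a fraud score between 0 and 1. The thresholds and names here are hypothetical; the point is structural: the model only orders the queue, and every claim still lands on the risk team’s desk.

```python
from dataclasses import dataclass

# Hypothetical claim record; fraud_score is the model's output in [0, 1].
@dataclass
class Claim:
    claim_id: str
    fraud_score: float

def triage(claim: Claim,
           red: float = 0.8,
           yellow: float = 0.5) -> str:
    """Map a model score to a priority band.

    The model never approves or rejects anything. It only suggests
    urgency; the final call stays with the risk team.
    """
    if claim.fraud_score >= red:
        return "red"      # review first
    if claim.fraud_score >= yellow:
        return "yellow"   # review soon
    return "green"        # routine review

def prioritised_queue(claims: list[Claim]) -> list[Claim]:
    """Order the human team's queue by urgency; nothing is auto-decided."""
    band_rank = {"red": 0, "yellow": 1, "green": 2}
    return sorted(claims, key=lambda c: (band_rank[triage(c)], -c.fraud_score))
```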
It was the same underlying tech. But the emotional posture had changed.
The product wasn’t saying, “You’re going to be obsolete.” It was saying, “You’ve got leverage.”
Because even in organisations built on process and numbers, emotion still shapes what moves—and what stalls.
Emotion Isn’t a Layer. It’s the Ground
These stories seem worlds apart. But they show the same truth.
Whether you’re building for a child trying to care for their ageing parent, or a risk leader trying to protect their turf in a high-stakes organisation, emotion shapes the decision far more than we’re trained to admit.
Because in the end, what we call first principles isn’t just physics or code. It’s purpose.
And purpose is powered by emotion.