
A few months ago, I sat in a boardroom with the leadership team of a mid-sized professional services firm. Smart people. Good business. They'd brought us in to talk about their website, which - by their own admission - had been neglected for a while.
"We know it's not perfect," the managing partner said, with the resigned tone of someone confessing they haven't been to the dentist in three years. "But it does the job. And we've got bigger priorities right now."
It does the job. I hear this a lot. And every time, I want to ask the same question: how do you know?
Because here's the thing. Most B2B firms have absolutely no idea what their digital experience actually feels like for the people using it. They assume it's fine because nobody's complained. But prospects who have a poor experience on your website don't send you a strongly worded email about it - they just leave. Quietly. And go somewhere else.
There's an uncomfortable truth lurking behind most B2B digital experiences: the people running the business almost never use their own website the way a prospect or client does. They might glance at the homepage every now and again, or check that a new case study has gone live, but they rarely sit down and try to actually do something with it - find a specific service, understand how the firm is different, or work out who to contact about a particular problem.
If they did, they might notice that the navigation is baffling, the service pages all say roughly the same thing, and the 'contact us' form asks for so much information it feels like a tax return.
PwC's research into customer experience found that one in three customers will walk away from a brand they love after just one bad experience. And whilst that study focused on consumer brands, B2B buyers aren't a different species. They're the same people, with the same expectations, shaped by the same frictionless experiences they have with the likes of Amazon, Monzo and Spotify every single day.
So when your website takes four clicks to find a phone number, or your client portal looks like it was built in 2014 (because it was), the gap between what people expect and what you deliver grows wider. And it's costing you more than you think.
I get it. Truly. The instinct to delay digital investment until there's more budget, more bandwidth, or more certainty is completely understandable. Especially when the alternative appears to be a six-figure website rebuild that'll consume your marketing team for the best part of a year.
But here's where I'd push back: improving your digital experience doesn't have to mean starting from scratch. In fact, some of the most impactful improvements I've seen have been relatively modest - both in cost and complexity. The trick is knowing where to look and, more importantly, actually looking.
Which brings me to the real point of this piece. If you want to make meaningful improvements to how people experience your digital products - your website, your portal, your app - you need to start by understanding what's actually happening out there. Not what you think is happening. Not what your agency told you at the last quarterly review. What's really going on.

And there are a handful of practical, affordable ways to do exactly that.
Surveys get a bad rap in some circles, and I understand why. Most of them are terrible. They ask vague questions, they're far too long, and they treat every respondent as if they're the same person with the same needs. The result? Generic feedback that tells you very little of use.
But a well-designed survey - short, specific, targeted at the right audience - can be genuinely revealing. The key is asking questions that get beneath the surface. Not "How would you rate your experience?" (which invites a reflexive 'fine'), but "What were you trying to do today, and did you manage it?" or "What nearly made you give up?"
The same principle applies to interviews and workshops, only more so. We ran a series of 25 face-to-face workshops across the United States for one client, a digital audiobook provider called Recorded Books. We travelled around the country meeting librarians, teachers and readers to understand how they actually used the website - not how we assumed they used it. We did the same in the UK and ran online sessions in Australia.
What we discovered was both simple and profound: people wanted the site to be easy to use, robust and smooth. Not groundbreaking insights on the face of it, perhaps, but the specifics of where the experience was falling short - and why - gave us a clear roadmap for improvement. The result? A 99 per cent increase in session durations and a 63 per cent increase in page views per session. Recorded Books now has the number one website in its market.
You don't need to fly around the world to get this kind of insight, of course. But you do need to go beyond a five-question SurveyMonkey form and actually sit with the people who use your products. Listen to what frustrates them. Watch where they hesitate. That's where the gold is.
Here's a limitation of surveys and interviews that doesn't get talked about enough: people aren't always honest. Not because they're liars, but because they often can't accurately describe their own behaviour. They'll tell you the website is 'fine' whilst simultaneously struggling to find the thing they came for. Humans are remarkably bad at self-reporting, especially when they don't want to seem difficult.
This is where heatmaps and session recordings earn their keep. Tools like Hotjar or Microsoft Clarity (which is free, by the way) let you see exactly how people interact with your digital products - where they click, how far they scroll, where they get stuck, and where they abandon ship entirely.
The data can be humbling. I've watched session recordings with clients where we've seen prospects land on a service page, scroll aimlessly for thirty seconds, then leave. No click. No enquiry. Just... gone. And yet, when we'd previously asked the client how they thought that page was performing, the answer was - you guessed it - "It does the job."
The caveat with heatmaps is that they show you what is happening, not why. You might discover that nobody clicks on your beautifully designed call-to-action button, but the heatmap alone won't tell you whether that's because the button is in the wrong place, the copy doesn't compel action, or the visitor simply isn't ready to take that step. That's why combining behavioural data with qualitative research (the conversations I mentioned above) gives you the fullest picture. Neither method is perfect on its own, but together, they're pretty powerful.
We worked with Bluebird Care, a home care provider with a network of franchisees across the country. They wanted to improve the experience on their website but weren't sure where to start. Rather than guessing, we ran a series of workshops with the people who actually mattered: carers, franchisees and family members arranging care for loved ones.
What emerged from those conversations was quite different from what the internal team had assumed. The priorities weren't what they expected. The pain points weren't where they thought they'd be. And the language people used to describe what they needed bore little resemblance to the language on the website.
After making changes based on those insights, Bluebird Care saw a 68 per cent increase in website users. Not because we threw money at a flashy redesign, but because we listened to the right people and acted on what they told us.
A/B testing works on a similar principle - just more systematically. You take two versions of something (a landing page, a form, a navigation structure), show each version to a different group of users, and measure which one performs better. It takes patience, granted. Reliable A/B tests need decent sample sizes and enough time to produce statistically meaningful results. But even simple tests - changing the position of a contact form, shortening a sign-up process, rewriting a headline - can produce surprisingly significant improvements.
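To make "statistically meaningful" concrete, here's a minimal sketch of how you might check whether a variant genuinely outperformed the original, using a standard two-proportion z-test. The numbers are hypothetical, and this uses only Python's standard library; in practice a tool like statsmodels or your testing platform would do this for you.

```python
import math

def ab_test_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: did variant B convert better than A?

    conv_a / conv_b: conversions for each variant
    n_a / n_b: visitors shown each variant
    Returns (relative lift of B over A, two-sided p-value).
    """
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no real difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via math.erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    lift = (p_b - p_a) / p_a
    return lift, p_value

# Hypothetical test: 1,000 visitors per variant.
# The original form converts 30 visitors; the shortened form converts 45.
lift, p = ab_test_significance(30, 1000, 45, 1000)
print(f"lift: {lift:.0%}, p-value: {p:.3f}")
```

Note what the hypothetical example shows: a 50 per cent relative lift sounds decisive, yet with only 1,000 visitors per variant the p-value sits above the conventional 0.05 threshold, so the result could still be noise. That's the "decent sample sizes and enough time" point in practice - you keep the test running until the evidence is strong enough to act on.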
The important thing is to start early and keep going. UX is never 'done', which is either depressing or liberating depending on your outlook.
This is where a lot of firms come unstuck, and it's the bit that rarely gets discussed in articles about improving user experience. You've done the research. You've gathered the insights. You've got a long list of things that could be better. Now what?
Without a clear roadmap - one that prioritises changes based on impact and feasibility, assigns realistic timescales and identifies the resources required - all that lovely insight just sits in a slide deck gathering digital dust. I've seen it happen more times than I'd like to admit.
We built a roadmap for continual digital innovation with the Building Societies Association, and it's a good example of how this should work. It wasn't a single 'big bang' project. It was a phased plan for ongoing improvement, grounded in evidence, with measurable goals attached to each stage. The result was a 9.5 per cent increase in session duration and a 7 per cent increase in registered online use. Not earth-shattering numbers in isolation, perhaps, but the compound effect of sustained, evidence-led improvement is where the real value lives.
So why don't more firms do this already? Good question. And the answer, I think, comes back to how digital experience is perceived at board level in most B2B firms. It's still treated as a marketing problem rather than a business one. The website is 'marketing's thing'. The client portal is 'IT's thing'. And the overall experience a client or prospect has across all of those touchpoints? Well, that's nobody's thing, apparently.
Until someone decides it matters - really matters, not just in the annual strategy deck - it'll keep getting bumped down the priority list. Behind the office move. Behind the new CRM. Behind the rebrand that's been 'in progress' for eighteen months.
"We've been meaning to sort the website out for ages, but it's not a priority right now."
I hear you. But consider this: every day your digital experience underperforms, you're losing prospects you never even knew you had. They visited. They looked around. They left. And they're not coming back to tell you why.
The good news is that understanding what's going wrong - and starting to fix it - doesn't require a blank cheque. It requires curiosity, a willingness to listen to your users, and the humility to accept that "it does the job" might not be the full story.
A few questions worth sitting with:
When did you last use your own website as if you were a prospect with no prior knowledge of your firm - and how did that feel?
Do you have any reliable data (not assumptions) about how people actually behave on your digital products?
If you asked ten clients to describe their experience of your website or portal, would you be confident in what they'd say?
And if the honest answer to any of those is "I'm not sure" - well, that might be the most useful insight of all.