Insight

Testing, Learning, and the Website Nobody's Optimising

In this episode of The CX Equation, Co-founder and Head of Experimentation at Hookflash, Nicole Storey, unpacks what real experimentation looks like, and why we shouldn’t just call it ‘CRO’.

Experimentation has become a familiar term, but few organisations practise it properly. Nicole Storey believes the problem lies in treating testing as a performance tool when it’s really a learning discipline.

She argues that the website – often the most neglected “channel” in the mix – holds untapped potential for optimisation. And she explains how data-led testing, integrated with analytics and SEO, helps brands align digital experiences with customer intent and drive measurable performance gains.

When analytics, SEO, and testing work together, teams gain both clearer insights and faster validation.

Here are some of the most useful ideas from the conversation:

Takeaways:

  • Treat your website as a conversion engine, not a reporting platform.
  • Run tests from a hypothesis, not a hunch.
  • Losing tests save money and reveal what not to do.
  • Data should fuel what you test – not follow it.
  • Shared insights stop internal debates and align teams.
  • A tailored, consistent landing experience beats blanket personalisation.

For Nicole, experimentation isn’t about chasing uplifts; it’s about creating a culture that learns fast and acts with confidence. It’s a reminder that the best digital experiences come from curiosity, not certainty.

Listen on your favourite podcast platform, or watch the video on YouTube.

Spotify
Apple Podcasts
YouTube

Here's The Full Transcript

Chantelle – 00:26
Welcome to The CX Equation, a podcast by TapCXM.

Mark – 00:30
We share actionable insights and real-world case studies to equip you with the tools you need to drive loyalty, engagement, and sustainable growth.

Chantelle – 00:37
We’re your hosts, Chantelle Casey.

Mark – 00:40
And Mark Clydesdale.

Chantelle – 00:42
On today’s episode of The CX Equation, I’m delighted to welcome Nicole Storey to the podcast. She’s the co-founder and Head of Experimentation at Hookflash and also happens to be a good friend of mine. Nicole is a recognised leader in digital optimisation and experimentation. She has a wealth of experience in helping brands understand their customers’ behaviour and improve website performance. Prior to Hookflash, Nicole led the Analytics and Experimentation teams at Merkle until she decided to go off and do her own thing in co-founding Hookflash. Nicole now works with organisations to combine Analytics, SEO, and experimentation to create digital experiences that are actually smart, fast, and customer-focused.

Mark – 01:18
In today’s conversation, we’ll explore how experimentation fuels better customer experience, how to build insight-led optimisation programmes, and what it really takes to turn data into growth. Welcome, Nicole.

Nicole – 01:30
Hello. Thank you for having me.

Mark – 01:32
You’re welcome. It’s nice to have you on. So you’ve built your career around experimentation, testing, iteration, learning. What sort of first drew you to this as a discipline, and why do you believe it’s so critical to great customer experience?

Nicole – 01:48
Without giving you my whole life story, I originally started in the industry doing all things Analytics and Tag Management, implementation, helping clients set up the data that they want to get tracked, integrating different data sources, and basically slowly realised that I was more interested in figuring out actually the next piece in terms of what the data meant and what the insights were. So I then pivoted into more of an insights-led role, actually working on using the data that had been implemented to understand what’s happening and what the key insights are. And then a very natural progression after that was I kind of got frustrated at having so many insights without taking any action off the back of them. So it’s quite a nice story in terms of how I got into the industry.
And I think for me, I have a background in Computer Science. Actually, Computer Science and Business is what I studied at university. And I feel like experimentation gave me that perfect blend of science, creativity, data, technology, and psychology as well. So I always debated whether Computer Science was the right thing for me, whether I should study Psychology. And somehow—I wish it was through lots of research into the role—but actually, I fell into my perfect role through all of those different pivot sort of changes over the last ten years.

Mark – 03:22
You never know what you really wanna do, do you, when you leave university?

Nicole – 03:26
Yeah. So now I feel like through experimentation as a discipline, I get, like, the perfect balance of all those different things that I didn’t really know that I wanted to do and didn’t really know that this world existed, which is quite nice—worked out quite nice for me.

Chantelle – 03:42
You get to flex all of your different sides of you. So, obviously, co-founding Hookflash was a big step after your successful career at Merkle. What gap in the market did you see that made you think, “Oh, there’s a better way to do this,” or “I think this can be done differently”?

Nicole – 03:55
Good question. So I suppose what was key for us is that there is a lot of focus on media optimisation—focusing on optimising every channel—and it kind of felt like sometimes the website’s a bit of an afterthought. There’s not really a huge focus on what happens once the user actually gets to the website, and we feel like every channel’s optimised to the nth degree where possible, or there’s a huge amount of focus on optimising spend. But, actually, the website is such a big lever for optimising that spend in a different way. So we really felt like there was a bit of a gap in agencies focusing holistically on the website experience to complement all of the investment that’s given to all of the other channels and all of the optimisation that’s done there. As I said, get that holistic view of website optimisation. So bringing together SEO, experimentation, and Analytics was key there—being that real “website conversion engine”, a strapline we’ve toyed with for a while—but, basically, really, really focusing on just website optimisation as our core focus and specialisation was the gap that we saw.

Chantelle – 05:21
And you’ve just alluded to it there. So, historically, we’ve seen clients treat Analytics, SEO, experimentation very separately. You’re trying to bring all of that stuff together. Why does that matter, and how does that contribute to a good customer experience?

Nicole – 05:33
I think the most obvious one is you can’t really do experimentation without data. When running tests, the most obvious place that you need data is analysing the results. So I think you absolutely can’t do experimentation without Analytics. Those were the most obvious links to us when first setting up Hookflash, and originally, we actually launched Hookflash with just Analytics and experimentation. And that’s before even touching on having the data to fuel the experimentation programme and having those data-driven insights inform what tests you run, as well as the obvious one in terms of analysing the results.
SEO actually came a bit later for us. We spoke to a lot of clients, and one of the questions we kept getting was, “Oh, but are you gonna look at site speed as well?” We were just focusing on, like, the customer experience and not necessarily the technical foundation. So we quite quickly realised that SEO was actually a bit of a gap for us when we first launched, and we introduced it a little bit later as a complementary service, because of the questions we got asked by clients. And I think the site speed side of it is maybe a bit obvious, actually. There’s no point in optimising this amazing experience on the site if a user’s already bounced because the page loaded too slowly.
But there’s also the content side of things that has been really interesting for us in sharing learnings from a search intent perspective and what users are searching and what content is important to actually feed the on-site experience as well and really work together. We’ve had a lot of examples of clients who change a lot of content, add chunks of content from an SEO perspective, but really from a customer experience perspective, whether that’s actually good experience is something that then we can test. So pulling it all together has really been key for us. And as I said, really gave that holistic approach to really optimising the website.

Chantelle – 07:42
And how does that approach affect sort of day-to-day decision making? So when you’re working with clients, how does that impact their day-to-day?

Nicole – 07:49
Yeah. It’s actually not as seamless as you’d think, which we realised quite quickly, because some of these teams sit quite separately. Really, these teams don’t sit together day to day. And I think Hookflash really helps with tying those teams together, helping them share learnings, and making sure that we’re all activating off each other’s insights.

Chantelle – 08:19
And if they are treated separately, what kind of common problems do you see that crop up?

Nicole – 08:23
I think the biggest one that we see is quite a big mismatch between, like, expected experience and the experience when you actually get on the website. So we had a really interesting one recently with a client who sells pizza ovens. And what we find from an SEO perspective is that a lot of users were searching for indoor and outdoor. So specifically, like, indoor pizza ovens, outdoor pizza ovens. And, actually, when you get on the website, there’s not really any distinction whether the ovens are indoor versus outdoor. It’s very fuel type focused. So we realise that there’s quite a big gap there, and that’s a good example of where users are searching for something. They find the website, they get there, and then there’s no representation of what they were actually searching for or a way to find that. So I think that’s really why it’s so key to tie those two together.

Mark – 09:23
And so are you helping clients just kind of optimise web pages—whole pages and landing pages? Or are many of your clients actually doing full web personalisation, where the site can change based on how people are coming in?

Nicole – 09:36
I actually think personalisation is one of those buzzwords that everyone spoke about ten years ago and that has kind of been resurrected recently, where everyone’s talking about personalisation again. I feel like it’s one that everyone tried to master, didn’t quite get there, realised there were so many different tech and teams involved, and couldn’t really get it off the ground. And, actually, we’re seeing, again, that being quite a big focus for clients. This year it’s a core pillar that we’re really keen to focus on as well.
In driving more personalisation, the first step for us is actually usually looking at landing pages, understanding, okay, what channel has that user come from? What’s the context that they have? Because, actually, if you come through email or paid search or even paid social, the experience that you have can be so different, and, obviously, the intent that you have can be so different. And the landing pages sometimes can be the exact same. So even, like, the first step into moving into personalisation, we usually frame more as tailored experiences. What we try to look at there is just making sure that, like, the creative, the copy matches from the channel that you’ve come through to the landing pages. And you wouldn’t believe the amount of clients who still don’t do that. It’s such a quick win for us. It’s so easy to execute, and the results have been insane every single time. I mean, it just makes sense if you can marry up the two experiences.

Chantelle – 11:12
Yeah. One thing we’re always trying to hammer home with clients is, like, the idea of consistent customer experiences across channels. Right? And I don’t personally—day to day—work directly with websites, but the website is never really counted as a channel in, like, the marketing automation world, or it’s always the one that’s left out, I find.

Nicole – 11:30
And I feel like it’s such a quick win as well. Even just thinking about the landing pages that users are gonna land on, even when we’ve done simple redirect tests, some of the uplifts have been huge, and all we’re doing is, like, sending users to a different page. So it’s just one of those levers that can really help, like, save or optimise spend significantly, but it’s, yeah, it’s an afterthought, as you said.

Mark – 11:56
It’s just making the customer experience smoother. Right? So it feels like they’ve come through easily. And it’s frustrating when you’re looking for one thing, and you land there, and it’s like starting again. You want things to flow through.

Nicole – 12:09
There’s nothing that winds me up more than when I’m scrolling on Instagram, and I see something that I wanna buy. And I click through, and I’m, like, on the homepage with no sign of it, or on a listing page, and I have to scroll to find it. That’s me done. I’m not doing it. And if it takes me more than a few seconds to find it, I’m like, oh, this isn’t worth it. It was like a shiny thing that I saw. I wanted it, and now it’s gone, and that’s it. And now I’m like, well, why haven’t you tailored this?

Mark – 12:37
But they still paid for my click, so more fool them.

Chantelle – 12:41
Yeah. Something me and Mark have spoken about before is, like, how decision paralysis and our attention spans over time are getting shorter and shorter. So the way that brands need to capture our attention and push us to whatever call to action they want us to take—it just needs to be faster now.

Nicole – 12:58
And that’s exactly, like, the psychology side that comes into it as well. So there’s so many different biases that we have. Another common one that we see, for example, is brands might put a “from” price in their ads, and they will pick the lowest “from” price, obviously, to drive click-through rate. You get through to the site, and then it’s really difficult to find anything near that price range. And because the user’s anchored to the lowest price, as soon as they come onto the site and they can’t see anything remotely near that—or, you know, you need to do loads of “sort by low–high” to even get there, and there’s one specific product or one travel journey that meets that price—that’s another area where the psychology you’ve been anchored by, obviously, impacts how you behave on the website.

Mark – 13:51
I think that takes us nicely into testing then. Right? Because how do you learn that the jump from that “from” price to what I then see is too jarring, or that trying a higher “from” price in the first instance would have been the best thing to do? I find most clients that I work with will say they do some form of testing, but they’re rarely doing experimentation with a kind of mature hypothesis-through-to-action model. What does good experimentation actually look like to you?

Nicole – 14:23
Yeah. It’s a tough one. I feel like there’s a bit of a balancing act between quantity and quality. I think experimentation, you know, is always “learn fast, fail fast,” keep things moving really quickly. And sometimes I feel clients fall into the trap of running so many tests, but not necessarily quality tests. So I do feel like it’s a really tough balance between quality and quantity and getting that right.

Mark – 14:55
I would say the clients I work with, they either don’t do any testing because it’s kind of complicated and they’re not quite sure how to do it, or they do lots of tests, but the path to action from the test isn’t there. They’re almost testing so that they’re testing, but how many of those tests actually lead to them doing something differently? So do you recommend any kind of structures or ways of working for people to say, “Look, make sure you are testing the right thing and that you know how you’re going to develop off the back of the experiment”?

Nicole – 15:25
I think there’s a level of maturity. So I think if someone’s starting out with testing and they’ve never tested before, I do feel like getting some tests live, building momentum, being able to share some results is a really positive thing. And I think once you can get everyone on board, everyone bought in, then it’s really time to start thinking about the quality point.

And the big one for us, when we start to look at that piece and thinking about frameworks and process, the big thing there for us is data. As, obviously, we have that Analytics background and Analytics team within Hookflash. And that’s really where we would recommend that clients then start to build that quality—is actually making sure that all of the tests are data-driven before they even get onto the website. Because it’s really easy for anyone to go on any website and pick from even one single page. You could pick a hundred tests that you could run, and you could just start running hundreds in no time. And then that’s where really looking into the data, seeing where the big opportunities are, where the gaps for performance are, that’s what’s really gonna help bring the quality and make sure that you’ve got that balance between both.

Mark – 16:38
So you’ve got almost a clear hypothesis to begin with that’s data-driven, and then you know the test is about proving or disproving that hypothesis. There’s a mentality piece there as well, I think: the idea of an unsuccessful test. For me, the only unsuccessful test is a test that you run and then you couldn’t measure something. Because even if the result wasn’t what you thought it was gonna be, you still learn something. Right? So that’s still a success.

Nicole – 17:03
100%. It’s a really tough mindset to build, especially with clients who are starting out. And I think the phrase “Conversion Rate Optimisation” has also not helped with the precedent it sets for experimentation, which is why at Hookflash, we’ve actively chosen to double down on experimentation and not CRO, because that mindset of Conversion Rate Optimisation from the get-go just sets up the wrong expectation that it’s always going to be better than before—and not, necessarily, that we wanna find losers as well. It’s really difficult to start to change that mindset. It’s a big one that we really try to focus on.

Chantelle – 17:51
Yeah. You wanna find what not to do as well as what to do.

Nicole – 17:54
Exactly. Money saved. What we’ve saved. I think my guilty pleasure, honestly—I prefer a losing test to a winning test sometimes. I just feel like you learn so much more. It’s probably the only time I like to be wrong, to be honest. But I do feel like you’ve learned something. You had an insight. You built a hypothesis based on that. You ran the test. And, actually, what you’ve assumed is so wrong, and I find that so interesting. And maybe that’s the psychology side of it coming in, but it’s really like you’ve gotta go back to the drawing board and figure out what to do next. Or, if it’s language, for example, are there any other channels doubling down on this that we can share the learning with? And I do genuinely think you get so much more from a losing test than a winning test, but, of course, clients prefer a winner.

Mark – 18:42
There’s a tricky bit with that there. Right? What if you’ve disproved someone else and they don’t like losing? Right? You’ve got to do that kind of stakeholder management now. “I’m experimenting, and I’m proving people wrong.” How do you help clients kinda manage that?

Nicole – 18:55
I do think the data-driven side of it really helps. It’s really difficult to argue with data, and it’s always so easy to go back and forth on gut feel or preferences. And, really, a lot of the time, that’s where some tests come from. It can be senior stakeholders who say, “Let’s change it to this,” and then someone’s like, “Uh, maybe let’s test it.” And even though we might be the bearer of bad news sometimes, I do think because you’ve got the results there, there’s not really much that you can argue with. Actually, one of our clients—we asked them, you know, “What do you love about experimentation?”—said settling internal debates was one of the biggest benefits to their organisation: they would go back and forth, and just being able to say, “Let’s just test it,” helped them so much.

Mark – 19:47
But you can then argue, “Yes, it’s not what you wanted it to be, but we’ve now saved ourselves time and money in the future by not going down the wrong course.” Right? So it’s always communicating the benefit of what you’ve found.

Nicole – 19:58
In terms of that, when we look at even programme metrics or a single test, we’ll always have the kind of incremental revenue, but also the loss aversion, or the money saved. And I think it’s as important to show both. And that’s kind of the true experimentation—making sure we’re embracing both equally because, as I said, there’s so much learning from a losing test that you don’t always get with a winning test.

Chantelle – 20:26
Yeah. For sure. Just going back to, once you’ve been doing your experimentation, actually gaining actionable insights from those tests. With our clients, historically—you might feel free to disagree—access to data isn’t necessarily a problem, but they do struggle to act on the right data. How do you help your clients separate the noise from the actual meaningful insights?

Nicole – 20:53
One of the things that we do is we always sit down with our clients and map out the key questions that we’re trying to answer. So before we actually start digging into any of the data sources, we just end up with a really simple, huge Word doc that we sit down together and collaborate on of just questions. What actually are we trying to answer or find out? Because what I do find is you could sit for hours or days in all of the different data sources and end up with a load of random nothingness that doesn’t fit together or help tell any kind of story or surface any opportunity. So I do think the upfront prep work that you do before you even look at any data is as important as the time spent diving into the data. I think it always feels long for clients: you do so much work before you even get into the data, and then you do the data work before you even get to a test. But as I mentioned before, I think that is the key to making sure that you have that quality test. And even broader than that, actionable insights rather than loads of random data points that no one can really make sense of.

Chantelle – 22:09
And are there any sort of specific metrics or signals that you’d prioritise looking at first, or does it completely depend on what those questions are?

Nicole – 22:16
I think it completely depends on the questions. I think it depends on the questions, depends on the brands, like, the level of maturity that they’re at, how integrated the different teams are. Because if we’re working really closely with their media teams or the SEO teams, it might change the questions that we want to answer, especially when it comes to the on-site experience. It really depends. And that’s why there’s no kind of “one size fits all” set of questions. It will really depend on whether they’ve tested before, how well they know their audience. If we’re into personalisation versus just landing page optimisation, yeah, it is really gonna vary by client and by stage of maturity.

Chantelle – 22:58
And if your clients were to act on the wrong insights, if they didn’t have you there holding their hand, helping them along, telling them what to look at, what’s the risk there? Is it a waste of experimentation then if you don’t have… you know, if you’re not creating good experiments?

Nicole – 23:12
I suppose the worst-case scenario is maybe you won’t have tests that reach statistical significance. The big benefit or upside is that clients are gonna test based on the data. So one of the worst-case scenarios is maybe the insight has led them down the wrong path, and we have a really negative test. But, thankfully, we tested it, and now we know. So I think, potentially, just going down the wrong path is the biggest risk. Inaccurate data is obviously another big problem that we see—tracking not being set up correctly in the first place to give the right data that clients are using to test. I think the worst-case scenario is we get a negative test, or it’s insignificant because the insight wasn’t really valid in the first place.

Mark – 24:01
I do find people are scared of the maths of things sometimes. There’s something psychological about it. Like, I can’t speak French, and I’m not particularly embarrassed about the fact that I can’t speak French. But I think when you’re in a corporate world and someone starts talking numbers and equations to you, and you’re really not good at that, people have an almost fearful reaction to it. So when you’re trying to talk about statistical significance and standard deviations and things like that, if people don’t get it, they almost kind of wanna run away. So I think knowing that they’ve got someone who can handle that for them and make sure it’s gonna work and be robust is so, so key. Because otherwise, you just end up with 10% control groups. Am I learning anything? Am I leaving money on the table because I didn’t need to keep that many people out? Navigating all of that is so key.

Nicole – 24:51
It’s a really good point. I actually didn’t even think of that lens of it as acting on experimentation data as well. I was more thinking of the insights that feed into it, but actually not understanding if you’ve reached the right sample size, if you’ve got the right significance level, making sure you avoid false positives, false negatives, all of those different things. You’re exactly right that it’s so daunting for someone who’s never done any of this before. And I think that is one of the barriers where experimentation—sorry, slightly off on a tangent—but there’s so many different roles or skills involved with that discipline in terms of being an analyst, having stats knowledge, being a developer, having someone to QA, project manage. There’s no one person that can just get experimentation off the ground in an organisation. But yeah. Sorry. It’s slightly different to your question in terms of data and making the wrong decisions, but it is quite a daunting discipline, I think.
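The checks Nicole lists here—sample size, significance level, false positives—are standard two-proportion calculations, and they’re the sort of maths the testing platforms discussed later now automate. As a rough sketch of what’s happening under the hood (function names and the figures are illustrative, not from the episode):

```python
from statistics import NormalDist

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)        # pooled rate under H0
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))    # two-sided p-value
    return z, p_value

def required_sample_size(baseline, relative_lift, alpha=0.05, power=0.8):
    """Visitors needed per variant to detect a relative lift on a baseline rate."""
    nd = NormalDist()
    p1 = baseline
    p2 = baseline * (1 + relative_lift)
    z_alpha = nd.inv_cdf(1 - alpha / 2)             # ~1.96 for alpha = 0.05
    z_power = nd.inv_cdf(power)                     # ~0.84 for 80% power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return int(variance * (z_alpha + z_power) ** 2 / (p2 - p1) ** 2) + 1

# Illustrative: 400 vs 460 conversions from 10,000 visitors per arm.
z, p = two_proportion_z_test(400, 10_000, 460, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")       # p comes out below 0.05 here

# Visitors per arm to reliably detect a 10% relative lift on a 4% baseline.
print(required_sample_size(0.04, 0.10))  # roughly 40,000 per arm
```

The sample-size figure is the one that usually surprises people: a 10% lift on a typical e-commerce conversion rate needs tens of thousands of visitors per variant before a result means anything, which is exactly why underpowered tests produce the false positives Nicole warns about.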

Chantelle – 25:52
Just on that then, what’s the profile of the people you’re normally working with at your clients? Are they people who would understand the relevance of statistical significance and all of the other more sort of stats language? Or is it more people who care about, from a business perspective, what their customer experience looks like?

Nicole – 26:11
Yeah. I would say it varies significantly, actually. It depends. We do work with clients where, for example, our key stakeholder might have a background in experimentation and have been doing it for years, and that’s their specialisation. Or we can work with clients or Product Managers or Website Managers for whom it has not necessarily been their specialisation—it’s part of a much bigger role. And I think that’s where it really helps to have a team of specialists from an agency, for example, to help with all of the different skills that are needed and plug that gap, because it’s quite a big team that you would need to build. Especially if it’s not your full-time role, managing so many different parts of the process would be quite a huge undertaking.

Mark – 27:02
And are there certain tools that you can use to make this easier? Or do you have a suite of tools that you recommend to clients for setting up tests and measuring them and so on?

Nicole – 27:12
To be honest, there’s so many testing platforms in the market now. We, of course, have our favourites. And, really, they have all evolved significantly with all of the in-platform reporting now. Even some of the auto-generated insights and summaries have actually come a long way. So it makes it a lot more accessible for a lot more brands within one single platform, without having to have custom calculators, export data, do t-tests, etcetera. It does take a lot of that work away now, which is quite nice.

Mark – 27:51
And so what are your favourites? Or what are the ones that you see all the clients using… most clients using?

Nicole – 27:57
So we’re quite platform agnostic. For us, well, we have a full team of front-end developers, so they really can use any testing tool. But we partner really closely with AB Tasty. They’re our kind of core partner. We also work with WebTrends Optimize, VWO. We’ve recently started working with Chameleon, Convert. So there’s so many different platforms that clients have access to. Optimizely as well is another big player. So, yeah, there’s so many.

Chantelle – 28:28
And do they all tend to have the same, like, product offering, or do they differ in terms of—?

Nicole – 28:33
No. I think, more and more, there are very few differentiators, especially from an agency perspective. I do think from a client perspective—having a team in-house to use it themselves—there are big differences between the tools and maybe the experience or the processes or workflows. But from an agency perspective, it actually makes no difference to us, as I said before, because we just inject the code for the test that we wanna run and then analyse the results, and we’re completely platform agnostic. Makes sense.

Mark – 29:06
I assume everyone’s using AI nowadays, or everyone’s at least claiming to use AI. Um, how is AI increasingly influencing Analytics and testing platforms?

Nicole – 29:18
It’s everywhere. It hasn’t escaped experimentation—it’s a big buzzword in experimentation too. I think it’s actually being used across the whole process, to be honest. From a development side of things, obviously, it speeds up that time in terms of building code. Some of the code editors now have AI code generators even in the platform. It’s gonna make it much easier for clients to get tests live.
And then from an insight side of things, as I mentioned, actually, a lot of the platforms now have AI-generated learnings based on the results that are in the platform. Summaries that they’re generating. And from an agency perspective, it really helps us focus on some of the bigger pieces and takes away some of those tasks that actually used to take quite a lot of time. We’re much more efficient, and it means that we can focus on some of the bigger strategic pieces or the bigger tasks that aren’t necessarily possible with AI yet. They would still need a developer to support with them.

Chantelle – 30:24
So more specifically, what types of strategic pieces would you focus on? Like, how do you see the role of human judgement evolving alongside AI?

Nicole – 30:31
I think one of the things that obviously… like, when we talk about customer experience, one of the things that AI obviously doesn’t have is real-life experience or emotions. So I always think there’s still gonna be an element of that that we will always bring to the table when it comes to experimentation, or even those psychology pieces—layering that in and really being able to understand that kind of end-to-end journey and experience is still something that AI won’t be able to do as well as us just yet. But from a coding perspective, an insights perspective, the speed at which AI can support with that and do some of the tasks that we’re doing—you can’t argue with it. It really does speed up and make so many processes more efficient.

Chantelle – 31:21
I guess it’s just adding to that level of productivity. You can just test more and find out more.

Nicole – 31:26
And with some of the bigger tests – because of the way testing works, you’re manipulating existing code. That’s much more difficult for AI. It can build code from scratch quite quickly, and, you know, you can build a website in five minutes and that kind of thing. But because we’re actually manipulating what’s already there, there’s a layer of experience it doesn’t have yet with building those experiences. It really doesn’t get it quite right for anything beyond the basics. So that’s a big piece we would focus on – those much bigger experiences that AI just can’t execute yet through a testing platform.

Mark – 32:08
I think we’re getting to a close, so we’ve just got a few questions to round off. As a user yourself, are there any really great digital experiences that you’ve seen lately or found particularly inspiring?

Nicole – 32:20
This might be a really boring one, and maybe a bit dated now. But I recently moved house, and I have been loving the “view a sofa in your room” feature. I’ve just found it so helpful. I was using Dunelm the other day, trying to see what a sofa would look like in my room, and I just loved it. I had a ball. It was a bit useless at first – it took a while to get it the right size. It was either giant or tiny, and I was like, “Is that really the right size?” But just from a look and feel perspective, it was really helpful, and I really enjoyed it.

Mark – 32:59
I’ve recently had some fitted wardrobes done, and I was playing with tools to see what height wardrobes I wanted and playing with the configuration of the doors. Yeah. You lose hours of your day just playing with these things.

Nicole – 33:12
Yeah. Oh, you reminded me. Maybe I’ll change my answer. I used the IKEA wardrobe builder. I don’t know if you’ve used that.

Mark – 33:20
Yeah. The PAX and all of them. Yeah. Yeah.

Nicole – 33:23
So that was quite the job. I enjoyed it a lot.

Chantelle – 33:28
Yeah. I guess it’s those seemingly easy things – the practical wins that brands give you – that make you remember the experience. Just focusing back on Hookflash: obviously, you’ve scaled very quickly, and you’ve earned lots of recognition in our industry, which is great. What do you think has been one of the biggest factors behind that success?

Nicole – 33:50
I think the people. We’ve built a really great team. We’ve hired some great people, and we’ve really tried to build a great place to work. We’ve really focused on trying to figure out what culture means to us and unpick that. Obviously, it’s quite difficult to measure. But we really believe that if people like coming to work and like doing their job, they’ll do better work. We actually asked some clients for feedback on the work we did in 2025, and some clients really did call out that they had fun working with us. I think that’s because we try to help people enjoy work as much as possible, and I do think that makes a big difference. And the growth we’ve had – we’ll be close to 30 people soon, which is crazy from three in 2023. So I really do think that focusing on the people and having a great place to work has been a significant factor in that growth.

Mark – 34:53
Definitely. And I know your co-founder, Ollie – he’s a very nice person to work with. We’ve asked all of our guests if there’s been anyone who’s been a particularly great professional influence on them, and I’m sure you and Ollie will hopefully one day be the answer to other people’s version of that question. But who in your career or life has been a particularly great professional influence on you?

Nicole – 35:15
Well, I was gonna say my parents, actually. I don’t know if that’s an answer that you get quite a lot.

Mark – 35:21
Third person to do so. Yeah.

Nicole – 35:23
Yeah. I think that’s fair. But, yeah, growing up, I watched them work so hard. And I think they really did teach me that hard work and perseverance pay off. So, yeah, I would have to say them – they’re definitely the first who came to mind.

Mark – 35:41
Oh, very nice.

Chantelle – 35:41
Oh, that’s very sweet. We’ll have to make sure we share the podcast with them so they can—

Mark – 35:45
Leave it now. Just listen to the last minute. They know.

Nicole – 35:48
They know that. They told me to say that. Yeah.

Chantelle – 35:52
That’s very sweet. It’s lovely. It’s been a great chat about experimentation. I’ve enjoyed finding out about what you do at Hookflash. Thanks for coming on the pod.

Nicole – 36:02
Thank you for having me.

Mark – 36:04
Thanks, Nicole. It’s been great. Well, I really enjoyed talking with Nicole, and I’d definitely trust her with my experimentation. For me, I really liked what she was saying about having a plan – being clear about what it is you want to test. Don’t just run hundreds of tests. Look at the data, have a clear hypothesis, and then make sure that when you run your test, you know what you’re looking for and what you might do with that information afterwards. I also liked the thought of not being afraid of losing – of the test coming back differently from what you expected. Because even if you’ve not found something that can generate a short-term uplift, you’ve learned something. You can settle quite a lot of internal debates that way and make sure that you as a business don’t go down the wrong road and sink time and resources into something that isn’t going to generate any benefit.

Chantelle – 36:57
Yeah. The idea that it challenges assumptions you would otherwise have acted on if you didn’t know something wasn’t worth doing – a losing test being just as important as finding the winning thing. And just to add to what you said about having a hypothesis upfront, I think the practical takeaway you mentioned was really good: knowing what questions need to be answered up front. Having an exercise before you start any experimentation of writing out the questions that need answering, so you know exactly what you want to get out of the tests, was a great practical takeaway.

Mark – 37:30
Indeed. Well, great. So I hope you’ve enjoyed listening to this month’s CX Equation, and we will see you again next month.

Chantelle – 37:38
See you next month.

Outro – 37:42
The CX Equation is brought to you by TapCXM. To find out more about what we do and how we can help you, visit tapcxm.com. Then make sure to search for The CX Equation in Apple Podcasts, Spotify, or wherever you usually find your podcasts, and click subscribe so you don’t miss any future episodes. On behalf of the team here at TapCXM, thank you for listening.
