From Analytics Lightweight To CRO Program Heavyweight

There is a lot of CRO content on the interwebs about tools, guides, statistical models, tips, tricks, and what have you. A quick web search on “a/b testing” returns more than a billion results.

Whoa! 😱 A billion results, but what about the human side of CRO? What about the fight it takes to step into the ring and set up a CRO program?…especially when you’re tasked with contributing to the digital transformation at Hewlett Packard, an 81-year-old giant?

How do you start from scratch and build a CRO Center of Excellence when you’ve been recruited as a mere QA tagging manager? 

[Image: pro wrestling meme as an analogy for conversion rate optimization]

Well, that was me. Fresh off my bout as a web analytics and testing manager at www.1800contacts.com[1], I was brought onto Hewlett-Packard’s analytics team–via Analytics Demystified–as a new consultant in 2014. And I instantly felt pretty timid.

In fact, a few months into the gig, my HP line manager reminisced, “You were very quiet at first. I wasn’t sure if you were even going to work out.” Why? Because I had previously been a big fish in a medium-sized pond. I went from 1-800 Contacts, where I knew everyone in our tight eCommerce department and where I got to play in all things A/B/n testing and digital analytics…to something quite different.

HP is a legitimately enormous worldwide enterprise with large, disparate business units and a maze of stakeholders (“and in this corner, weighing a whopping 325 lbs…”). Not to mention I was new to operating remotely, which introduced a whole new stream of anxiety (something I’m sure many more people can relate to now that the COVID-19 pandemic has forced so many to work from home).

On top of all that, I was unproven both to HP and to Analytics Demystified. So how did I grow from the quiet new data guy in the corner to a contributor to digital strategy at HP? Let’s get down to the nitty-gritty.

[Image: meme about deep diving into digital analytics]

Matthew Wright, digital analytics mega-guru and my HP line manager, assigned me to set up additional tagging on the recently redesigned enterprise software pages (HP Software for Work – HPSW). This enterprise software division built programs for IT management, server storage, and so on. For all I knew, they were building flux capacitors…in other words, high-tech stuff that was over my head.[2]

I felt mostly like a go-between for developers and the business. Which was good, at first. But secretly I wanted to magnify my job into more than chasing down tracking requests. Because at my previous gig, we had it all! A/B tests, MVT tests, segment-triggered targeting, a full Adobe Analytics setup, a survey program, heatmaps, UX testing, a robust ideation process…the whole nine yards!

And, trust me, once you get a taste of the CRO good life, you only want more.

[Image: meme about the taste of glory]

But before getting ahead of myself, I had to build rapport. This meant not only delivering on tagging architecture projects but also going above and beyond.

For example, I provided thick analyses packed with extra insights, ran deep dives on site interactions–they really wanted to know where users clicked–and helped implement and analyze heatmaps.

Another way I won trust was to introduce engagement scoring to help gauge the site’s appeal (idea credit to Eric Peterson over at Analytics Demystified…one of our industry’s godfathers. If you want to read more about his Engagement Scoring models, this is a good place to start[3]).
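
To make the idea concrete, here’s a minimal sketch of what a session-level engagement score can look like. The components, thresholds, and weights below are my own illustrative assumptions, not Peterson’s exact published model.

```python
# Minimal sketch of a session-level engagement score, loosely in the spirit of
# Eric Peterson's engagement index. Components, thresholds, and weights are
# illustrative assumptions, not the exact published model.

def engagement_score(session, weights=None):
    """Return a 0-1 engagement score for one visitor session."""
    weights = weights or {
        "click_depth": 0.25,   # viewed more pages than a site-wide threshold
        "duration":    0.25,   # stayed longer than a threshold (seconds)
        "recency":     0.20,   # returned within the last N days
        "interaction": 0.30,   # completed a tracked event (download, trial, form)
    }
    hits = {
        "click_depth": session["pages_viewed"] >= 5,
        "duration":    session["seconds_on_site"] >= 120,
        "recency":     session["days_since_last_visit"] <= 30,
        "interaction": session["tracked_events"] > 0,
    }
    return sum(weights[k] for k, hit in hits.items() if hit)

# Example: a visitor who browsed deeply and downloaded a trial scores 1.0
print(engagement_score({
    "pages_viewed": 7,
    "seconds_on_site": 240,
    "days_since_last_visit": 10,
    "tracked_events": 1,
}))
```

Averaged across all sessions, a score like this gives stakeholders a single “appeal” number to watch alongside raw pageviews.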

Eventually, bringing these types of insights led to my voice being heard more and more. Of course, learning from some incredible HP analytics and developer colleagues gave me a huge boost; they were always in my corner.  

The next step was noticing a recurring pattern: there was a data gap that the digital product manager (PM) wasn’t sure how to fill. A curious guy with a capable team, the PM wanted to achieve a higher level of data-led decision-making.

They wrestled constantly against too-large-for-its-own-good enterprise opponents, like infrastructure that didn’t promote rapid growth and autocratic top-down decisions imposed on them by dotted-line owners. Those foes favored gut feelings and carryover designs.

Come again?…disprove hunches and prove out decisions with data?? Say it to my face! Yep, my mind went straight to A/B testing tools. Naturally, web experiments could help them isolate the unknowns that were either helping or hurting the site.

“18% lifts here and 42% lifts there…we rapidly learned about how the customer preferred to respond to trial software offers”

Of course, I proposed that they budget for a robust, established platform–such as VWO–and for a cross-functional CRO/testing team. Already pretty scrappy and agile by nature, they jumped all over the idea. With the platform in place within a couple of months, and with executive sponsorship from our analytics team and the PM, the wind was at our backs.

With me as the eager power user, partnering with a colleague on test strategy, we decided to dip our toes into basic tests to prove the tool. In other words, we started small. The first tests were simple CTA copy tests, followed by color palette changes. 18% lifts here and 42% lifts there…we rapidly learned how customers preferred to respond to trial software offers.
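
For readers who want to see what “reading a lift” looks like, here’s a minimal sketch: relative lift plus a two-proportion z-test for significance. The visitor and conversion counts are made-up numbers for illustration, not actual HP results.

```python
# Hedged sketch of reading an A/B test result: relative lift and a two-sided
# two-proportion z-test. Counts below are invented for illustration only.
from math import sqrt
from statistics import NormalDist

def ab_result(conv_a, n_a, conv_b, n_b):
    rate_a, rate_b = conv_a / n_a, conv_b / n_b
    lift = (rate_b - rate_a) / rate_a                        # relative lift of variant over control
    pooled = (conv_a + conv_b) / (n_a + n_b)                 # pooled rate under the null hypothesis
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))   # standard error of the difference
    z = (rate_b - rate_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))             # two-sided p-value
    return lift, p_value

lift, p = ab_result(conv_a=400, n_a=10_000, conv_b=472, n_b=10_000)
print(f"lift={lift:.0%}, p={p:.3f}")  # ~18% lift; check p before declaring a winner
```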

[Image: meme about wanting to win]

During this phase, I relearned one of the most critical life lessons in CRO – early wins are crucial to getting everybody emotionally invested. In HPSW’s case, suddenly everybody got excited and started contributing ideas. Eventually, we needed a way to contain all the new testing ideas, so we set up an “ideation” meeting where we could brainstorm, collect, and refine testing plans. This gave the boss a testing roadmap and a way to socialize this new branch of learning.

“CRO is about enabling insights that are otherwise very difficult to gather.”

Requests to migrate to new features and designs were prioritized next. People had complaints about the usability, the extra maintenance expense, and the secretive nature of the progressive (multi-step) forms used to gate downloads and trials.

So pitting the forms against a flat, all-on-one-page version in an A/B testing ring shut down the naysayers and proved out the control multi-step form by 25% (a similar single- vs. multi-column form test was also highlighted in Ben Labay’s recent May 2020 webinar). Other tests to modernize the web design were easily A/B tested and rolled out once we saw no harm being done (no false positives).

Another winning strategy we tried was to back-test features. One example was proving the usefulness of a long-standing secondary navigation that directed visitors deeper into the site. To some–myself included–the secondary navigation added clutter and information overload.

But I was happy to be wrong when the test showed a statistically significant drop in page depth without the navigation (in other words, we needed the secondary nav, even if the execution could have used a massage…which it later received via further testing). As we all know, it’s not about being right; CRO is about enabling insights that are otherwise very difficult to gather.
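
As a rough illustration of that back-test read, the sketch below compares pages per session with and without the secondary nav using Welch’s t-test. The numbers are simulated and the setup is my own assumption; it’s not the actual analysis we ran at HP.

```python
# Hedged sketch of a back-test: does hiding the secondary nav reduce page depth?
# Simulated pages-per-session data; a real read would pull these from analytics exports.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
with_nav = rng.poisson(lam=4.2, size=5_000)     # control: nav shown
without_nav = rng.poisson(lam=3.8, size=5_000)  # variant: nav hidden

result = stats.ttest_ind(without_nav, with_nav, equal_var=False)  # Welch's t-test
drop = with_nav.mean() - without_nav.mean()
print(f"mean page depth dropped by {drop:.2f} pages (p={result.pvalue:.4f})")
# A significant drop argues for keeping the nav, even if its execution needs work.
```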

Reflecting back

After 18 months of building out HPSW’s program, I was once again tasting the CRO good life. I made friends with some extremely smart colleagues and got to see the crawl-walk-run model of the experimentation function. I loved helping my client fuel decisions quickly with tried and tested data. In a small way, this new CRO program contributed to my client team leaders getting promoted to oversee digital transformation across all HP Enterprise business units. And isn’t the biggest win of a consulting engagement seeing your client succeed?

So…go away, read some books! Practice your full nelson hold and stuff. And you too could pin down a cool CRO job. It’s da best!

[Image: meme on “it’s the best”]