By CX Blog | September 13, 2016, 07:00 AM EDT
Let’s run a pretend test together. Imagine a run-of-the-mill landing page with 300 words, minimal design, and a registration form. We’re really interested in which day of the week and time of day are ideal for sending prospects to this page to convert. We proceed with the test, running coordinated pushes at set intervals throughout the week, and discover that conversions are abysmal on Friday afternoon and especially strong on Wednesday morning. Let’s say we repeat this exercise a few times a year to rule out seasonality. We also comb through customer support feedback, social sentiment analysis, and online survey responses. For good measure, let’s say we’ve also vetted the target audience qualifications and the positioning of this effort within our ideal customer journey timeline. No red flags so far to throw off our findings: time-on-site differences between days are negligible, and there are no statistically relevant objections from customers or prospects via social or support. We’re ready to ban promotions on Friday now, right? Everything our perceived experience data tells us says so. If we sink our promotions investment into Wednesday mornings, we could safely pre-order the Scrooge McDuck money bin to ski/swim in our profits later!
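The time-slot test above boils down to simple tallying. Here is a minimal Python sketch of that tally; the visit records and field layout are hypothetical, not an actual analytics export:

```python
from collections import defaultdict

# Hypothetical visit log: (weekday, hour, converted) tuples, as a
# marketing analytics export might be simplified for this test.
visits = [
    ("Wed", 9, True), ("Wed", 9, False), ("Wed", 10, True),
    ("Fri", 15, False), ("Fri", 16, False), ("Fri", 16, False),
]

def conversion_by_slot(visits):
    """Group visits into (weekday, hour) slots and compute each
    slot's conversion rate."""
    totals = defaultdict(lambda: [0, 0])  # slot -> [conversions, visits]
    for day, hour, converted in visits:
        slot = (day, hour)
        totals[slot][1] += 1
        if converted:
            totals[slot][0] += 1
    return {slot: conv / n for slot, (conv, n) in totals.items()}

rates = conversion_by_slot(visits)
# With this toy data, Wednesday morning converts and Friday afternoon
# does not -- exactly the pattern the pretend test discovers.
```

Run a few times a year, as the scenario suggests, this kind of tally looks rigorous. The catch, as we’re about to see, is what it doesn’t measure.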
The weeks and months go by as we wallow in our success. Unheard of conversion rates! You hold a quarterly briefing with a trusty industry analyst and find out you’re losing ground to the competition. Why?
Every Friday at 4:00pm GMT your IT team runs a batch operation to process and organize orders within the system. The team was smart enough to run this process on a separate system, but overlooked that it actually makes calls to the production system, the same system your landing pages are served from. They also overlooked that it slows your page to a crawl, all but destroying the experience for EST visitors throughout Friday. The page never went down, so alerts were never triggered. I know this sounds stupid, but you’d be surprised how often stuff like this happens.
To make matters worse, remember the following:
- Most of your users don’t report problems
- For every customer who bothers to complain, 26 other customers remain silent. (Source: White House Office of Consumer Affairs)
- Reactive reporting can alienate a lot of potential customers quickly
- It takes 12 positive experiences to make up for one unresolved negative experience. (Source: “Understanding Customers” by Ruby Newell-Legner)
- Most of us are using monitoring/measuring tools in silos & don’t know any better
- “I&O pros focus on technology performance and availability while developers monitor for efficient transaction navigation; marketers monitor page visits; […] No one sees the big picture” (Source: Take Application Performance to the Next Level with Digital Performance Management by James McCormick, Forrester.)
- “There is a clear gap in what even the leaders can provide in connecting DX to actionable results.” (Source: Digital Experience Platforms Wave by Ted Schadler, Forrester)
Crap. But we were making data driven decisions based on what our customers say they want! How were we supposed to know IT would slow our site to a crawl on Friday with some silly “batch process thingy?”
It’s just as much your responsibility to be aware of the effect website page speed has on your customers as it is IT’s to know you’re sending important prospects to the site while they’re running their batch process. While this example is a bit dramatic, the disconnect between groups responsible for CX is real. You BOTH own it. You need the tools, processes, and culture in place to enable this. Making decisions in separate silos, without the context that delivered-experience data offers, is irresponsible. The test scenario we ran earlier would get us an F in 8th-grade science class: we didn’t control for the most important variable, the performance of the digital environment. Only when we couple this delivered experience with our previously mentioned perceived experience information do we really get a sense of reality. When you understand reality, you can make confident decisions. That’s when the fun starts.
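Coupling the two data sets is where the Friday problem would have surfaced. A hedged sketch, with invented numbers and a made-up performance budget, of putting marketing’s conversion rates next to IT’s page-speed monitoring for the same time slots:

```python
# Hypothetical records from two silos, keyed by the same (weekday, hour)
# slot: marketing's conversion rates and IT's median page load times.
conversion_rates = {("Wed", 9): 0.048, ("Fri", 16): 0.003}
median_load_seconds = {("Wed", 9): 1.2, ("Fri", 16): 14.8}

SLOW_THRESHOLD = 3.0  # seconds; an assumed performance budget

def flag_confounded_slots(conversions, load_times, threshold=SLOW_THRESHOLD):
    """Return time slots where delivery was slow, i.e. where a
    performance problem may be confounding the conversion test."""
    return [
        slot for slot in conversions
        if load_times.get(slot, 0.0) > threshold
    ]

flagged = flag_confounded_slots(conversion_rates, median_load_seconds)
# The Friday batch-job slowdown stands out immediately once the two
# data sets sit side by side.
```

Neither data set alone tells this story: the conversion numbers look like an audience insight, and the load times never tripped an availability alert.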
Now that you understand a bit more about the CX gap between business and IT groups, let’s bridge it.
- Ask questions! Start with your IT Operations lead and ask him/her what performance monitoring tools they have in place. Are they responsible for any internal or external SLAs (service level agreements)? What does a successful customer experience look like to them?
- Share your perspective! Your counterparts in IT have probably been trying to show the value they provide to the business already, just not in terms you care about or understand. Help them understand what works for you.
- Don’t make assumptions! Remember that perceived experience and delivered experience data is useful, but creates exponential value when integrated.