GDPR is now a month old, and as we continue working with our global clients and observing our European counterparts, it has become clear that data protection is more than a response to consumer pressure. It is an opportunity for organizations that embrace change to get ahead and prepare themselves for a better tomorrow.
Bill Gates recently made a proposal to all graduating college students in which he said he would pick up the tab for those who downloaded a copy of Hans Rosling’s book, “Factfulness: Ten Reasons We’re Wrong About the World – and Why Things Are Better Than You Think.”
This got us thinking at Consensus – what 10 things do we see in our digital journey that will be better than we think? Here’s our top 10, based on market data and, more importantly, our insights:
The General Data Protection Regulation (GDPR) has probably been something you’ve seen on LinkedIn or hashtagged on Twitter. GDPR will affect almost every member of your office – in some way, shape or form, a GDPR-related task will pop up in your inbox, if one hasn’t already.
If you’re a client of Consensus, I know these conversations started months ago and will continue for the foreseeable future. But what we’ve seen with our clients – whether they have international digital properties or not – is a chance to gain a better understanding of the landscape in order to improve their customer experiences.
Build the right things
In our long history of building digital experiences, we know the most critical factor in creating a successful digital strategy is a deep understanding of our audience. It’s not enough, however, to understand our audiences’ demographics, their personas, or even customer segments.
In order to “build the right thing” for our audience, it’s not enough to understand what our audiences need; we need to understand why they need it.
Why motivation matters
When we know why our audience needs or wants something, we can identify the hidden motivations that drive our audiences’ behaviors. And that knowledge can drastically change the product, the campaign, and/or other solutions we might develop.
It used to be that awareness was all you could really hope for in advertising. Long before the days of first-touch attribution, UTM codes, and tracking pixels, brands were just trying to get their name out there for a bit of recognition when shoppers saw products on shelves. But as digital marketing became more sophisticated and we were able to measure and attribute more and more user actions, the needle shifted to put less focus on engagement and more on down-funnel stages like lead generation. People will often tell you now that conversion rate is king, and anything else just exists to support conversion and sales.
/ləv T͟Hə ikˈspirēəns/ – noun. a competitive differentiator.
Very few organizations understand why they do what they do. As an integrated marketing group, we work with our clients to understand their purpose, cause or belief – it’s one of the most difficult exercises in our overall engagement strategy, because many times our clients find it difficult to answer one simple question: why does your company exist? The data supporting the success of a “why”-based company (instead of one that focuses on “what” or “how”) is irrefutable (future blog post).
At Consensus, loving the experience is part of our DNA – it’s where we come from. It is the single thread of why that brings us together each day in Boston’s South End and across our satellite offices in Miami, Brussels, Lviv and Kolkata. It is about delivering an experience that delights – from the start of an engagement, through rounds of creative and content, and ultimately to the delivery on ideas for a client. There is no single deliverable event more exciting than that first reveal of a home page, app or campaign. It is the culmination of hours and hours of research, ideas and wireframes.
Loving the experience creates a measurable business advantage for our clients. A well-crafted experience — one based on customer motivators, emotional design, and relevant interactions — can make the difference between a loyal, repeat customer and one who clicks away.
Despite the data, many of our clients continue to underinvest in user experience (UX) and design thinking – even as we watched Apple become the most valuable company in the world by betting on delightful experiences.
Great experiences are a competitive differentiator. The more a user loves the experience, the greater the return-on-investment. Let’s take a look at some important data across a number of ROI experiences to get a sense of the overall impact:
Net ROI. Forrester Research shows that, on average, every dollar invested in UX brings 100 dollars in return. That’s an ROI of 9,900 percent.
Really big numbers. According to Baymard Institute, a 69.23% average shopping cart abandonment rate translates into millions of customers leaving their goods at the checkout due to poor user experiences. The e-commerce industry could have saved $1.42 trillion just by implementing better checkout flows and design, through the resulting improvement in conversion rates.
Overall. According to Forrester, companies that invest in UX see a lower cost of customer acquisition, lower support costs, increased customer retention and increased market share.
Optimized Development Time. A few years ago, Experience Dynamics revealed that the input of a UX designer reduces the amount of time developers spend re-working a product by up to 50%, and reduces development time overall by 33% to 50%, by improving decision-making and helping to prioritize development tasks.
Reduced Support Costs. Support costs increase when products have a poor UX design because more users need help, which requires more support staff to reply to phone calls and emails. A recent cost estimate suggests that it costs about $1 per minute for the average call center to service a customer. If 100 people call tech support for 10 minutes each, that’s $1,000.
Increased User Satisfaction and Brand Loyalty. Customers are more likely to be satisfied, remain loyal to and engage with a brand if the product offers a good user experience. According to Avaya, over 75% of consumers said they were likely to continue spending money as a result of an exceptional customer experience, while 82% would stop spending money with a company as a result of a bad customer experience.
The functionality, design, and tools on a website or mobile app that capture users’ attention, draw them into a purchase funnel, and achieve your business goals create the ultimate “user experience.” How do we get you to love this experience? It’s simple. We create fantastic design, experiences that satisfy, and ultimately experiences that win. #lovetheexperience
Want to learn more? Please feel free to contact us at email@example.com.
In our earlier post, Why the Headless CMS Changes Everything, we discussed the rise of the “Headless CMS” and its notable impact across emerging technologies. Since then, we’ve implemented “headless” solutions across a number of projects. While each project capitalizes on the benefits associated with a headless approach, one of our most exciting projects is an iOS-based sales application.
With the goal of driving sales efforts, our client asked Consensus to develop an application that would empower their sales force to compare pricing and services against their competition in real-time, at the point of sale. Our teams partnered to strategize, design, and develop this new application, centered around a headless approach.
Developed on Drupal 8, this application takes advantage of some of the most efficient technology available for app development. As the success of the application continues to grow, here are a few areas where the use of a headless approach has thrived:
- Front-end usability – As the back-end content (data, in the case of this app) is exposed, it is presented seamlessly to the application layer, which is written in AngularJS. The flexibility of this approach lets us leverage enhanced front-end functionality, such as embedded camera capture and custom PDF/CSV production and distribution. It also enables routine data updates that reach the application without requiring a new .ipa file, ensuring sales professionals are always working with up-to-date data without repeatedly re-downloading the application.
- Native environment – The headless approach allows for a completely native environment, another key factor in the application’s success. In the short term, we’re able to iterate through UAT feedback with a completely agile approach, making presentation-layer updates for quick review and remediation. On a larger level, the native environment helps maintain a manageable code base, ensuring sustained success for the application as it continues to grow.
- Speed – While you may not realize it until it’s gone, speed is one of the crucial aspects of a good user experience, especially when working in real time with prospective and existing customers. Coupled with Drupal’s headless framework, the application uses Acquia Cloud storage and administration for high performance and speed.
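The decoupled flow described here can be sketched in a few lines. To be clear, this is a hypothetical illustration, not our client’s actual API: the content type, field names, and payload shape below are invented stand-ins for whatever the Drupal back end exposes, but the pattern – the CMS serves plain JSON, and the app layer maps it to view models with no coupling to Drupal’s theming system – is the essence of the approach.

```python
# Hypothetical sketch of the decoupled (headless) pattern: the CMS exposes
# content as JSON, and the application layer consumes it independently.
import json

# Example payload shaped like a Drupal JSON:API response; the "node--plan"
# type and its fields are illustrative, not the real application's schema.
SAMPLE_RESPONSE = """
{
  "data": [
    {"type": "node--plan", "id": "1",
     "attributes": {"title": "Competitor Plan A", "price": 49.99}},
    {"type": "node--plan", "id": "2",
     "attributes": {"title": "Competitor Plan B", "price": 59.99}}
  ]
}
"""

def parse_plans(raw_json):
    """Map the CMS payload to flat view models the front end can render."""
    payload = json.loads(raw_json)
    return [
        {"id": item["id"],
         "title": item["attributes"]["title"],
         "price": item["attributes"]["price"]}
        for item in payload["data"]
    ]

plans = parse_plans(SAMPLE_RESPONSE)
print(plans[0]["title"])  # Competitor Plan A
```

Because a content edit changes only the JSON the back end serves – never the compiled app – updates reach the field without shipping a new .ipa file.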
As always, our focus is on creating the best experience possible. The application is used in the field by our client’s sales representatives all across the country and acts as a key driver in differentiating their services from the competition. A continued (and proven) success, this application is yet another example of how impactful the headless approach can be.
As our company’s philosophy is dedicated to #lovetheexperience, we often find ourselves investing in user testing and listening closely to rounds and rounds of feedback. User testing is crucial for a number of reasons, but many times the most important insights are overlooked and left out of final recommendations. These insights may not directly relate to a specific type of test question (e.g., call-to-action or like/dislike question), but rather the reactions and sentiments shared by users as they journey through the site or application.
At Consensus, we are firm believers that user testing isn’t solely about improving user flows or finding usability flaws; rather, it’s an opportunity to put your product, business goals and even your brand in front of a live audience to see how it performs under different scenarios. In fact, let’s do away with calling it “User Testing” or “Usability Testing” – it undervalues the possibilities of what a company can learn. We need to be bolder and call it an Experience Audition!
Now that we have rebranded user testing, let’s look at a couple of ways to improve user tests and debunk a few myths along the way, starting with the “expert” review, or what some people call a “heuristic” review (reminder: this blog is about user testing). We don’t want to steer anyone away from heuristic reviews – they are cost-effective, timely and certainly provide benefits – but what don’t they do?
User Testing vs. Expert Reviews
Expert reviews are useful, and they’re something we frequently do for clients. Often, clients assume we will derive the same findings as user testing, or that the findings of a heuristic review will provide a true “aha” moment. None of this is wrong, but it’s important to understand the true benefits of user testing as part of the “audition.”
Typically, a heuristic review is a great benchmark/rule-of-thumb for finding flaws across a number of areas – navigation, presentation, trust value, mistakes in general UX principles, hierarchy redundancies, and so on – in a lot of ways it’s like having a teacher grade a project. The site or application design will be put against a rubric and graded. You will see where you were deficient and gain insights on how to improve in specific areas.
As many may remember from their lecture-hall days, academics tend to view problems through a specific lens, one that doesn’t always account for real-world scenarios. And when in pursuit of the optimum user experience, it’s the ability to put the human in digital that will provide results. Expert reviews are more cost-effective and less time-consuming, but they leave you vulnerable to missing valuable data that will help you create an authentic experience.
User tests open up a whole new avenue of both quantitative and qualitative data. The “Experience Audition” reference above is not far off from how we view our own user tests. The data becomes even more valuable when you start asking questions the right way – an issue we frequently see is clients requesting CTAs that push a user in a certain direction or encourage feedback that will fit an agenda. This, for many reasons, is something we want to avoid. Let’s look at some general guidelines for question structure, as this is where you’ll gain the biggest advantage over a heuristic review.
- Avoid industry-related jargon when possible – It should be a goal to make the user feel they are in a comfortable setting. A question that confuses the user or makes them over-analyze their process will produce data reflective of that confused experience.
- Should you have complicated questions that may confuse a user, break the question into a multistep process – keeping it simple can be the most difficult part.
- Avoid hypotheticals – try to encourage first-hand experiences. Forcing a user to contemplate the steps they would “probably” take to accomplish a goal can produce clouded answers.
- When responding to a comment, DO NOT give the user the feeling they are being judged in any way – any answer is a useful answer. If a user cannot find the proper steps to add a friend, ask for more insights. It is important to find what is causing their confusion.
- Offer neutral like/dislike questions – This is a great way to discover subjective information about your design, and as you’ll see below, success is in the questions you ask, not in how many people you ask.
User testing – How many users do I need?
The LARGEST misconception seems to be how many users are needed to compile a reliable data set. People assume the larger the user group, the better the results. In fact, a group of more than 20 users borders on a statistical analysis of user tendencies – you won’t discover much that’s new after the fifth user, and five could even be more than sufficient.
The best results come from testing no more than 5-15 users and running as many small tests as your budget allows. Once we get into the 20+ user range, it becomes more of a statistical analysis of user tendencies. That might be useful in its own right, but as you’ll see below, we stop finding new usability issues after just a few users.
Tom Landauer and Jakob Nielsen showed that the number of usability problems found in a usability test with n users is:

N (1 − (1 − L)^n)

where N is the total number of usability problems in the design and L is the proportion of usability problems discovered while testing a single user. The typical value of L is 31%, averaged across a large number of projects. Plotting the curve for L = 31% makes the diminishing returns obvious:
The most striking truth of the curve is that as soon as you collect data from a single test user, your insights increase and you have already learned almost a third of all there is to know about the usability of the design. The difference between zero and even a little bit of data is astounding.
When you conduct testing on the second user, you will inevitably notice some overlap with the first tester. You should, however, receive some new feedback from the second test case, creating some differentiation for you.
The third user will very likely repeat many actions you’ve already observed with the first two testers, but will also provide some unique data for your study.
The curve clearly demonstrates that you need to test with roughly 15 users to discover all the usability problems in the design. However, testing with a much smaller number of users lets you distribute your user-testing budget across many small tests instead of exhausting it on a single, elaborate study.
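As a quick sanity check on those numbers, the formula is easy to compute directly. A minimal sketch, using the typical L = 31% from above:

```python
# Proportion of total usability problems found after testing n users,
# per Nielsen and Landauer: found(n) = 1 - (1 - L)^n.
L_SINGLE_USER = 0.31  # typical share of problems a single user reveals

def proportion_found(n, l=L_SINGLE_USER):
    """Fraction of all usability problems uncovered by n test users."""
    return 1 - (1 - l) ** n

for n in (1, 2, 3, 5, 15):
    print(f"{n:>2} users: {proportion_found(n):.1%} of problems found")
```

One user already surfaces 31% of the problems, five users about 84%, and fifteen about 99.6% – exactly the diminishing-returns curve that makes many small tests a better use of budget than one large one.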
Only noting information related to your CTAs – or, even worse, only noting the negatives
In the last user testing session we administered, this was incredibly important and was the source of some of our most valuable discoveries! We had even left three questions open specifically for qualitative like/dislike comments, and still – when asking users to accomplish a goal – we found more in their opinions than in their actual navigational paths.
When administering a user test, it is easy to fall into the routine of noting a few taps, maybe a swipe, and then moving on – sometimes paying more attention to the questions you’re asking than to the answers you’re receiving. What is often overlooked with user testing is the fact that you are gaining a first-hand view of what your everyday user will be going through. Remember the formula above? That scales all the way up.
This covers everything from usability, to the user’s emotional response, to how your new project aligns with your overall business goals and brand.
As your user progresses through the various tasks, encourage them to share their thoughts on different aspects they notice. You’re already engaged in a 30-40 minute session and this is a great time to learn what you did well.
Are users visibly excited when the app opens? Or is a rebranded feature not intuitive for the new user? View the results through a multitude of lenses; this will give you the best understanding of how your users truly feel.
Successful online experiences enable users (i.e. customers, site visitors, etc.) to accomplish their goals. But oftentimes, in the design process, keeping users’ goals top-of-mind is no easy task.
To overcome this hurdle, there is a simple yet effective exercise that ends with what we call a User Story Index. From the name alone you can derive two important facts about the exercise: it is about users (or site visitors) and what they want to accomplish when visiting a website.