Data-driven design for B2B websites
If you’re a marketer conducting research in advance of an upcoming website build, you may have heard the terms data-driven design, data-informed design, data-aware design, UX design, and user-centered design batted around. For the uninitiated, this can be a bit confusing.
- Data-driven design uses quantitative (objective) data to inform design decisions.
- Data-informed design uses qualitative (subjective) feedback to inform design decisions.
- Data-aware design uses data as one factor among several to inform design decisions.
Where data-driven web design focuses on gathering objective data from analytics, testing, and surveys, data-informed design focuses on gathering more subjective feedback on user experience. Data-aware design maintains that data is one of several sources of useful information.
These design approaches all follow the philosophy of user-centered design.
- User-centered design focuses on user experience when making design choices.
UX design is a set of strategies that puts the user-centered design philosophy into practice.
- UX (user experience) design is a set of strategies that strive to optimize user experience.
Just as inbound marketing strives to be helpful to the customer to earn and maintain their business, all of these terms that end with “design” try to maximize user experience to keep visitors on your site, and encourage them to convert into leads.
As a result, understanding how users interact with your site and tailoring it to meet their needs can have very real effects on lead generation.
- “Every $1 invested in UX results in a return of $100 (ROI = 9,900%)” (UXCam)
- “A well-designed user interface could raise your website’s conversion rate by up to 200%, and a better UX design could yield conversion rates up to 400%” (intechnic)
- “Slow-loading websites cost retailers more than $2B in lost sales each year.” (intechnic)
Investing in data-driven design also presents an opportunity to one-up competitors and avoid unnecessary losses.
- “Only 55% of companies are currently conducting any online user experience testing.” (Skyhook)
- “85% of issues related to UX can be detected by performing a usability test on a group of 5 users.” (Truelist)
- “If a website needs more than 3 seconds to load, 40% of the people leave the website” (UXCam)
In this post, we’ll discuss design informed by data to some degree (driven, informed, or aware): how it works, best practices to implement on your website, tools to gather data, and red flags in data that may uncover design issues.
How data-driven design works
Your website is a tool, and one of its primary uses is to generate leads for you.
Of course, employing data-driven design doesn’t mean that you should ignore your intuition and creativity as a designer.
Instead, it combines your artistic sense with hard data about how your audience thinks and behaves so that you can make more informed decisions that are both beautiful and functional.
Rather than guessing about what your audience wants or likes, your research provides evidence about how your site visitors actually interact with your website.
This can help you identify what you are doing right, and help you do more of it. It can also help you discover near misses that are leaking opportunities, and plug the hole.
Since data-driven design focuses on objective data, it requires research. Four common ways to gather data are using 1) analytics platforms, 2) A/B testing, 3) usability testing, and 4) surveys.
- Analytics platforms capture volumes of data about how users interact with your site. This includes which pages are viewed the most, which routes users took to get to your page, how long or how much they engage with pages, where they clicked, and more.
- A/B testing tools allow you to create different versions of web pages or elements and test which one performs better with users.
- Usability tests reveal how easy or difficult it is for real people to interact with your site. These tests often involve asking participants to complete different tasks on a website while a researcher watches, noting where and how they struggle. This helps identify design problems.
- Surveys ask customers directly for information about what they do and don’t like. Some survey questions are closed-ended, seeking yes or no answers for quantitative information, while others seek qualitative information about a particular user’s experience, such as likes and dislikes.
B2B data-driven design: best practices
1. Understand the basics of data gathering
Source: Google Analytics
In order to make informed design decisions, you need to understand how to gather and interpret data from analytics platforms, tests, and surveys.
You can go about this a number of ways, such as by working with an analytics expert, reading articles, or taking an online course. Here are some resources you might find useful:
- The Google Analytics for Beginners course (It’s free.)
- The Interaction Design Foundation’s “User Experience (UX) Design”
- UX Collective’s “Becoming a Data-aware Designer”
- UX Planet’s “This is all you need to know to conduct a UX survey”
- Designing with Data by King, Churchill, and Tan
- Hotjar’s “Beginner’s Guide to Usability Testing”
- Interaction Design Foundation’s usability testing course
- Interaction Design Foundation’s user research course
2. Identify your buyer personas
In order to better understand and empathize with potential customers using your site, you’ll want to put together one or more buyer personas: fictional representations of your ideal customer.
To create this character, talk to real customers who use your site and/or look at data about site users gathered from surveys, feedback, testing, and analytics. Another great way to learn about your buyers is to talk to people in your sales or customer service departments who work with customers and prospects directly.
In your persona, include information such as:
- Job title and role in the buying process
- Goals and challenges
- Pain points (dealbreakers or dealmakers)
HubSpot has a free template that helps you put together a buyer persona quickly.
Coming up with one or more buyer personas will help you select participants for usability studies so that you have a more accurate idea of how your real users will react to your website designs.
3. Gather data scientifically
Source: Crazy Egg
In their book Designing with Data, King, Churchill and Tan lay out a process for setting up user experience experiments:
- Establish a goal
- Identify obstacles to achieving that goal
- Form a hypothesis
- Perform a test
- Analyze the results
Aaron Gitlin sums up what makes a good hypothesis this way:
“A hypothesis should clearly state and include:
The segment of users to be evaluated [user group]
The change we are making [change]
What we believe the result will be [effect]
Why we think that result will take place [rationale]
And finally, the measurable result we expect to observe [measure]”
To come up with a good hypothesis, pull from whatever data you already have about your users and follow King et al.’s template:
“For [user group(s)] if [change] then [effect] because [rationale], which will impact [measure].” (King et al)
For example, a hypothesis might be, “For [our Buyer Bonnie persona], if [we use a single CTA on our landing page rather than three] then [more users will convert] because [it will represent a more clearly defined next step], which will impact [our landing page’s conversion rate].”
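If you plan to generate many hypotheses, the template can be captured in a tiny helper function. This is purely an illustrative sketch; the function name and fields are our own, not from King et al.:

```python
def format_hypothesis(user_group, change, effect, rationale, measure):
    """Fill King et al.'s hypothesis template with its five parts."""
    return (f"For {user_group}, if {change} then {effect} "
            f"because {rationale}, which will impact {measure}.")

# The single-CTA example, expressed through the template:
print(format_hypothesis(
    user_group="our Buyer Bonnie persona",
    change="we use a single CTA on our landing page rather than three",
    effect="more users will convert",
    rationale="it will represent a more clearly defined next step",
    measure="our landing page's conversion rate",
))
```

Writing hypotheses this way forces you to name all five parts before you run the test, which makes it harder to rationalize results after the fact.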
Next, run a test that suits your needs, controlling variables just as you would in a true scientific experiment.
For example, let’s say you perform an A/B split test that shows half of your users a version of the page with multiple CTAs, and the other half a version with a single CTA.
During this experiment, you wouldn’t want to change anything else on your landing page, such as the color scheme, because you wouldn’t know which changes affected the experiment’s results, or how much.
Finally, analyze the results of the experiment and interpret what changes you need to make to your design.
For example, let’s say that you achieve a higher conversion rate on the landing page with a single CTA. A logical next step might be to use the single-CTA version of the page on your website. How you use the data you gather will depend on the context and size of your experiment. For more details on running a UX experiment, check out this article.
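If your testing tool doesn’t report statistical significance for you, the analysis step can be sketched in a few lines. The visitor and conversion counts below are hypothetical, and a two-proportion z-test is one common way (not the only way) to check whether a difference in conversion rates is likely real rather than noise:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Compare the conversion rates of variants A and B.

    Returns (z, p): p is the two-sided p-value under the null
    hypothesis that both variants convert at the same rate.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical results: 40/1000 conversions on the multi-CTA page (A)
# vs. 62/1000 on the single-CTA page (B)
z, p = two_proportion_z_test(40, 1000, 62, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A p-value below a threshold you chose in advance (0.05 is conventional) suggests the difference is unlikely to be chance, and you can act on the winning variant with more confidence.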
For tips on collecting data scientifically using a survey, check out this article.
4. Incorporate testing into your ongoing marketing strategy
Source: Interaction Design Foundation
Make testing a habit, and you may be able to nip many UX issues in the bud, saving both time and money.
- “Fixing a problem in development costs 10 times as much as fixing it in design, and 100 times as much if you’re trying to fix the problem in a product that’s already been released” (UX Planet).
- “85% of issues related to UX can be detected by performing a usability test on a group of 5 users” (Truelist)
Of course, if you run regular tests, you will probably need to make regular changes to your website.
To make tweaking your website easier, use a flexible content management system (CMS) like HubSpot, Webflow, or WordPress.
Using a content management system on your website means that you can edit your website without getting a developer involved every time.
For example, say you ran an A/B test on one of your pages and found that the call-to-action copy “get in touch” got more clicks than the copy “contact us.” With a flexible CMS, you could edit the copy “get in touch” much like you would a document, and publish your changes immediately with minimal fuss.
5. Consider UX when gathering survey data
When conducting surveys, consider the experience of your participants.
- Keep surveys short (10-15 questions) so they are more likely to be completed.
- Be careful about incentivizing users to take a survey, since it might bias your results.
- Be sure to thank your survey takers when they complete the survey.
6. Consider mobile-first design
It has become critical for B2Bs to have mobile websites that are as great as their desktop versions.
In 2020, mobile devices made up a little over half of web traffic worldwide, and in 2018, “50% of B2B queries were made on smartphones” (Smart Insights).
Google now analyzes the mobile versions of most websites first, which means that it grades your website (which affects how high you rank in search results) based on how good your mobile site is.
As a result, many businesses employ mobile-first design. This means that you design your mobile site first, and then adapt that design for the desktop version of your site. Doing this helps ensure that your mobile site is as user-friendly as possible, and that you’re showing Google the best version of your site.
Depending on your business, you may get more traffic on your mobile site, more traffic on your desktop site, or an equal number of visits to both versions. If most of your traffic is from desktop visits, then mobile-first design might not be as critical. Instead, you want to focus on presenting visitors with a great desktop experience, and then ensure that your mobile site is user-friendly.
If you receive a lot of mobile traffic, however, employing mobile-first design would be in your best interest.
Tools for gathering UX data
There are hundreds of tools for gathering UX data, including various analytics, A/B split testing, usability testing, and survey tools. So how do you use them to improve your B2B’s website design?
Analytics tools: These tools are great for gathering loads of data in real time as users interact with your site. Monitoring analytics data for red flags via tools like Google Analytics makes it easier to spot issues such as slow loading speed, broken calls to action, and 404 errors (unreachable pages) as they crop up, year-round.
A/B split testing tools: These tools are great for running experiments to test the effect of specific design choices on a web page. For example, if you wanted to know whether changing the color of the call to action on a landing page from blue to orange had any effect on conversions, you could create a split test with two versions of that web page: one with a blue CTA, and one with an orange CTA.
Next, you could show half your site visitors the blue version and half the orange version within a set time period. After concluding the experiment, you could then see whether the orange button prompted more, fewer, or the same number of conversions as the blue button, and by how much.
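Under the hood, split-testing tools typically assign each visitor to a variant randomly but consistently, so a returning visitor always sees the same version of the page. A minimal sketch of that bucketing logic (the experiment name and the 50/50 split here are illustrative assumptions):

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str = "cta-color") -> str:
    """Deterministically bucket a visitor into variant A or B.

    Hashing the visitor ID together with the experiment name yields a
    stable, roughly uniform 50/50 split: the same visitor always sees
    the same variant on repeat visits, and a different experiment name
    reshuffles everyone independently.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 100 < 50 else "B"

# The same visitor always lands in the same bucket:
assert assign_variant("visitor-42") == assign_variant("visitor-42")
```

Deterministic bucketing matters because a visitor who saw the blue button on Monday and the orange one on Tuesday would contaminate both groups’ conversion data.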
This can be helpful when you’re choosing which version of a design to implement, especially in situations where a poor design choice could cost your company money.
Usability testing tools: Usability testing allows you to test how well users are able to navigate your website and perform tasks (such as searching for information) on a finished website or prototype.
Usability tests can be conducted in person or remotely using third-party software. You can set up tasks for users to perform without a moderator (and view a recording of the session later), or you can watch users interact with a site in real time (a moderated test).
Usability testing is great for identifying weaknesses in an existing design.
Survey tools: Survey tools give you the opportunity to gather UX feedback directly from your site users after they have used your site. They are excellent for gathering qualitative feedback like preferences (e.g., “Do you enjoy the mobile deposit feature in our mobile app?”) and quantitative feedback, like answers to yes/no questions (e.g., “Could you find everything you needed on our site without using the search bar?”).
Google Analytics (analytics and A/B testing)
Source: Google Analytics
Odds are, you’re already somewhat familiar with Google Analytics. It is a widely-used and extremely powerful platform. Plus, it has a very robust free version.
If you set it up for your website, you will be able to track all kinds of useful metrics, such as:
- Conversion rates
- How much time a user spends on a page or on your site as a whole
- How many people who land on your site immediately leave (bounce rate)
- Which channels (such as paid ads or organic [unpaid] search traffic) bring in the most traffic and leads
- What type of device your visitors use to view your website (mobile, desktop, tablet)
This list only scratches the surface of all the data you can gather with Google Analytics. Google Analytics also has A/B testing features, although you will probably need to be comfortable with the more basic features of the platform to use them.
If you’re interested in learning more, you can check out Moz’s guide for getting started with Google Analytics. Otherwise, you can take the Google Analytics certification course for free and get a more hands-on overview.
Hotjar (analytics and user testing)
- Heat maps that visually represent where visitors click on desktop or tap on mobile devices, move their mouse on desktop, scroll, and move the most on a page
- Recordings of visitor sessions (visits to your site)
- Survey and feedback tools
- Conversion data funnels
- Submission form analysis tools
Crazy Egg (analytics, user testing)
Source: Crazy Egg
- Heat maps that visually represent where visitors click and scroll
- Recordings of visitor sessions (visits to your site)
- Reports that show the exact number of clicks and the percentage of click data on each element of your web page
- “Confetti” reports that allow you to filter click data and help you identify where clicks come from and who is clicking
- Reports that track metrics on each page and present them in table format
HubSpot (A/B testing, analytics)
HubSpot offers a suite of marketing, sales, and website tools, including A/B testing, analytics, content management (CMS), and customer relationship management (CRM). This might be a good solution if you already have HubSpot and have basic testing needs, or are new to digital marketing and find other platforms difficult to use.
Optimizely (A/B testing, website optimization)
Optimizely is one of the leading conversion rate optimization (CRO) platforms. A/B testing is one of its many website optimization features. One of the nice things about this tool is that you can run multiple experiments on the same page at once. Three tiered plans are available.
VWO (A/B testing, conversion optimization, user feedback)
VWO is another popular testing platform. Like Optimizely, it offers A/B testing features along with other conversion optimization tools. VWO allows you to get feedback from users, which is a nice bonus.
If you’re interested in more technical details about A/B testing platforms, this article is a nice resource.
Lookback (Usability testing)
Lookback allows you to conduct both moderated and unmoderated usability tests, remotely or in person, on both desktop and mobile devices. You can send an invite link to collaborators and participants, and your sessions will be recorded and uploaded to the cloud. All of your data syncs to the app’s dashboard.
The standout feature of this tool is that you can communicate with test participants in real time using video chat. This enables you to ask questions or conduct an interview with the user.
This application offers a 14-day free trial, and different tiers of paid subscriptions, including a “starter” option that is $49 per month per collaborator.
Userlytics (Usability testing)
Userlytics allows you to set up unmoderated tests that capture a recording of both the test-taker’s screen and their face, capturing facial expressions and the user’s audio as they talk through a task. You can then annotate and analyze recordings of these sessions.
One of the nice things about Userlytics is that it walks you through the test set-up process and allows you to target certain demographics, disqualify participants based on answers to specific questions, define device and operating system types, define task types, and more.
Userlytics' “Quick and Easy” plan starts at $49 per month.
UserTesting (Usability testing)
UserTesting is remote testing software that allows researchers to conduct tests on both desktop and mobile devices. It offers templates and same-day recruitment from its panel of test takers, as well as features including live conversation, self-guided video recordings, transcripts, metrics, highlight reels, and sharing options, to name a few.
One cool feature of UserTesting is that you can request that users take your test in specific geographical locations, such as stores or on the go. You can also test any website, such as a competitor’s.
They have two pricing options: “Individual” and “Enterprise.”
SurveyMonkey (survey tool)
Source: SurveyMonkey
SurveyMonkey is one of the leading survey platforms and is a powerful tool for conducting UX surveys. It includes features for filtering data, creating results summaries, generating reports, customizing survey designs, and visually representing data with graphs and charts. It also provides over 200 templates and pre-written questions.
However, full access to SurveyMonkey’s features requires a paid subscription.
Google Forms (survey tool)
Source: Google Forms
Google Forms is a free, beginner-friendly survey tool that you can use with your Google account.
With this tool, you can create unlimited surveys, insert videos and your company logo, and pick from 16 templates. Plus, it collects your survey data in a Google Sheet and is integrated with G Suite, so you can collaborate with colleagues in real time.
The caveat is that it is a very simple tool with limited design customization, question types, visual display, and data analysis capabilities, so it’s best for simple surveys.
Typeform (survey tool)
Typeform is a survey tool that emphasizes user-friendly design. It’s designed to make writing and completing surveys easy and painless. It offers many templates and robust yet easy-to-use reporting capabilities.
However, it is a little more difficult to export data from than Google Forms or SurveyMonkey, and it is slightly harder to collaborate on a survey, since you need to use a third-party plug-in like Airtable. It also does not offer the same level of design customization as, say, SurveyMonkey.
You can get a free plan or purchase a paid one in order to access more features.
If you want to check out other UX tools, this page provides an exhaustive list.
Analyzing data: common symptoms of UX issues
While usability tests may allow you to learn where and how users struggle in real time, and surveys give users the opportunity to tell you directly, analytics data offers an indirect, quantitative window into your users’ struggles.
Just as a runny nose could be caused by any number of ailments, an unfavorable metric could be caused by one (or a combination) of many design, copywriting, and development problems.
However, certain analytics measures frequently coincide with UX issues. Here are some common red flags and possible culprits.
Slow page load speed
If you see this issue, ask:
- Are your images the right size? > Make sure images are only as large as they need to be for the widest screens your users have (typically about 2,000 pixels wide), and use compressed file formats like WebP, JPEG 2000, or JPEG XR where browsers support them; fall back to JPEG when they don’t. JPEGs are generally smaller than PNGs, which means they are less likely to slow down your page.
- Are there large media files, such as videos and animations? > Convert video files to a more efficient format, such as WebM or MP4.
It could also be: inefficient caching, too many HTTP requests, uncompressed files, no CDN, or a poor-quality host (for example, one where your site has to share resources with other sites, or where customer service is bad). All of these issues require a certain amount of expertise to fix.
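One piece of this you can check yourself is image weight. The sketch below scans a folder for oversized image files; the 500 KB threshold, the list of formats, and the directory name are assumptions for illustration, not universal rules:

```python
from pathlib import Path

# Illustrative assumptions: flag anything over ~500 KB
MAX_BYTES = 500 * 1024
IMAGE_SUFFIXES = {".png", ".jpg", ".jpeg", ".gif", ".bmp", ".webp"}

def find_heavy_images(root: str) -> list[tuple[str, int]]:
    """Return (path, size) pairs for image files over the size
    threshold, largest first, as candidates for compression."""
    heavy = [
        (str(path), path.stat().st_size)
        for path in Path(root).rglob("*")
        if path.suffix.lower() in IMAGE_SUFFIXES
        and path.stat().st_size > MAX_BYTES
    ]
    return sorted(heavy, key=lambda item: -item[1])

# Usage (hypothetical directory):
# for path, size in find_heavy_images("static/images"):
#     print(f"{size / 1024:>8.0f} KB  {path}")
```

A quick audit like this won’t replace a proper speed test, but it surfaces the most common culprit (oversized images) without any special tooling.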
To diagnose exact issues, you can run your web pages through Google’s PageSpeed Insights for free. If you use Google Analytics, you can also go to the “Behavior” report, select “Site Speed” and then “Speed Suggestions.”
Source: Google Analytics
This will give you a page speed report for each page of your website. You can click on the number of speed suggestions to the right to go to Page Speed Insights for a breakdown of opportunities for improvement.
For more page speed best practices, check out this Moz article.
High bounce rate
High bounce rates can be normal depending on context, but are usually a cause for concern. Bounces occur when a user views only a single page of your website before leaving.
For example, let’s say a user fills out a form or otherwise takes a desired action on a landing page, but they are not taken to a thank you page after they submit. Visits to this page, even ones that result in conversions, will be recorded as bounces even though nothing is wrong.
However, for most pages on your site, especially those that are designed to lead visitors deeper into your site, like your homepage, a high bounce rate is a red flag.
SEJ defines a higher-than-average bounce rate for an entire website as somewhere between 56% and 70%. Average bounce rates also vary by industry and website type. To check your B2B website’s bounce rate, go into Google Analytics, click on the “Audience” tab, and select “Overview.”
This will show you a broad look at audience metrics for your entire website for a particular date range.
To examine the bounce rates of specific pages, go to the “Behavior” report. Click on the drop-down menu labelled “Site Content,” then select “All pages.”
This will show you bounce rate data for each page. Also keep an eye on “% Exit,” the percentage of users who leave your website after viewing a given page.
Tip: make sure that the primary dimension above the data is set to “page” to view data by page URL, or “page title” to view by page title. You can also make your primary dimensions “Source” or “Medium” to assess bounce rates by channel.
If you find that you have a high bounce rate on pages that should be leading potential customers deeper into your site, check for these common issues:
- Is the page loading too slowly? > Follow the site speed tips in the previous section.
- Is the design pleasing and up-to-date? > If your design looks outdated, unappealing, or unprofessional, visitors may be less likely to stay on your site. As a result, it’s a good idea to periodically update your site to reflect current website design best practices for your industry and ensure that it is as user-friendly as possible.
- Does the page include interruptive ads? > Visitors may be annoyed by pop-ups and leave. It may be a good idea to replace the pop-ups with less intrusive CTAs.
- Is your site secure? > If your URL begins with “https,” your site is secure. If it begins with “http,” it is not. Secure sites reassure visitors that their information is safe and contribute to your search rank. Look into securing your site.
- Is the copy hard to read? > Modern website users tend to skim pages and read only the sections that are most relevant to them. If your copy is hard to skim, it may need to be revised for clarity and structure.
To make copy easy to read, break it up into short sentences using easy words and paragraphs of 5 lines or less. Organize the page using a nested hierarchy of headlines and subheadlines. Use bullet points and numbering to organize lists.
High traffic but low conversion rate on conversion-focused pages
Landing pages are designed to get potential customers to take a next action (conversion) such as signing up to receive emails, submitting a form, or making a purchase. This is prompted by a call to action (CTA) like “subscribe to our email list,” “get in touch,” or “start your free trial.”
To ensure that your landing pages are doing their job, it’s a good idea to set up “goals” in Google Analytics to measure your conversion rate on your landing pages. A conversion rate is the percentage of people who land on a given page and take a desired next action, or “convert.” A desired next action (or "conversion") in this case might be a form submission, resource download, or even a click-through to a lower-funnel page on your website.
How you set up goals depends on the type of conversion you want to measure. Here’s a Google help page that walks you through the goal setting process. This Neil Patel post is also helpful.
If someone on your team needs to set up goals for each of your landing pages for the first time, you will need to wait for Google Analytics to gather data and identify your baseline conversion rate before you can draw any conclusions about the effectiveness of your landing pages. In this case, you can skip to the next section.
If someone on your team has already set up goals for landing pages, you’re good to go. In Google Analytics, navigate to behavior > site content > landing pages. You should see something like this.
Look at the conversion rates for each landing page. Are there any that have a much lower conversion rate than the others?
If your landing pages have an unusually low conversion rate, take a closer look at their calls to action. When looking at your designs, ask:
- Is the CTA relevant to the user? > CTAs should be relevant to the content of the page and the inferred wants and needs of the user, and should represent a clear, logical next step toward conversion. Make sure your CTAs check all three of these boxes.
- Is the CTA design stale or overused? > Even great CTA designs lose their shine after extensive or prolonged use. If you haven’t updated your CTA design lately, it might be time to refresh it.
- Could the CTA annoy the visitor? > Certain types of CTAs, such as pop-ups, may annoy a site user enough to cause them to leave the site, especially if they cannot “x” out the CTA or must take a conversion action to make it disappear.
- Is it easy to read and interact with the CTA button? > A CTA that blends in too much with the surrounding page (such as one with similar colors or hard to read font) offers poor UX. A CTA button that is very large in relation to the page can seem overly aggressive, while a very small CTA button is harder to see and click.
Design CTA buttons that are large enough to stand out without taking up a large portion of the page. Round the edges of flat CTA buttons, or add a 3-D effect to make them look more clickable. Use contrasting colors that draw the user’s eye. Ensure type is easy for anyone to read.
A large CTA button with rounded edges; Source: X-Centric
Early drop offs on the scroll map
Scroll map; Source: Hotjar
If you use heat mapping software, you can see where visitors stop reading a page by the abrupt color change. In the screenshot above, the change from the hot color (red) to a cooler color (yellow) marks the drop-off point where many people stopped reading.
While some pages are designed to prompt visitors to convert quickly, others are designed to engage them for as long as possible.
It’s normal for readers on the web to read the top of a web page closely and skim more and more as they scroll toward the bottom, but usually you don’t want visitors to abandon your page near the top unless they’re taking a next action.
So if you see readers jumping ship early, you may want to ask:
- Is the text skimmable? > Readers online want information fast. If they have to work too hard to read your content, they might leave. This doesn’t mean that the best web content is short. Instead, it means that great web content is easy to digest.
If you break up your content into shorter paragraphs, organize it clearly with headers and subheaders, provide links to different sections on the page so readers can locate the information they want, and provide helpful visuals like infographics, videos, screenshots, and illustrations, readers are more likely to stick around.
- Does the spot on the page where many readers leave feel like a conclusion? > This is referred to as the “false floor” problem. If you think this is happening, you can provide the reader with visual cues, such as design that leads their eye downward or cut-off text so they know there’s more to read.
- Can the visitors see important information above the fold? > The “fold” is the part of a page that is visible before a reader has to scroll. It is the most valuable real estate on the page, since it may receive most of a reader’s attention and will be seen even if they leave without scrolling. Capture the reader’s attention before they have to scroll, and place the most important information above the fold.
High number of clicks on a non-clickable element
If you use heat mapping software like Hotjar or Crazy Egg, you can see the highest concentrations of clicks on a web page.
Source: Crazy Egg
If you see a high concentration of clicks on an element of your webpage that is not clickable, this can indicate that people expect it to be clickable. In this case, you may want to adjust your design to accommodate users’ expectations.
An unclickable item that prompts rage clicking; Source: Hotjar
“Rage clicks,” repeated clicks on a webpage element that indicate frustration, are helpful in identifying UX issues like broken links, confusing site elements, website bugs, and slow pages.
These can be observed in real time during a usability test or in session recordings. Some platforms, like Hotjar and Fullstory, allow you to filter for “rage click” events and watch only the sessions where users rage clicked. Watching these sessions can help you figure out what is frustrating users.
For example, if many users rage click on the same site element multiple times, such as a CTA, this might signal a broken link or slow loading speed.
By gathering analytics and user experience data, you’ll be able to complement your design instinct with hard facts about user behavior and preferences. This could result in saved time, a better experience for your users, and more conversions on your website.