What's the evidence? Does user onboarding work?

September 2021

You get a new app and all these tips pop up. Do people engage with this stuff? Is it useful?

As far as I can tell, the jury’s still out on this. There’s research and plenty of advice, but no convincing evidence.

Onboarding introduces users to a product or service

First things first: what’s onboarding? Apparently it’s an HR term about helping new employees adapt (Kosolapov, 2016).

There’s no consensus on what ‘onboarding’ means for apps, but it’s about introducing users to a product or service (Cardoso, 2017).

  • ‘Think of onboarding as building an entry ramp for people to use [an] app. Turning first-time users into active and engaged [users] is at the core of creating the ultimate onboarding experience.’ (Even, 2014).
  • Onboarding for apps involves showing users a set of brief messages on using the app to solve a problem, or to show the app’s main idea or features (Kosolapov, 2016).

Figure 1: User onboarding is like an entry ramp (Even, 2014). Photo by MoneyforCoffee.

Onboarding isn’t about comprehensive tutorials. Instead, it’s about the 80/20 rule: teaching people ‘how to use the small subset of features that they will spend 80% of their time using’ (Kosolapov, 2016).

The main idea is to give them aha! moments that show the benefits of the app. Another important idea is to give ‘quick wins’. For example, Twitter helps new users follow a few people, which starts filling up the user’s timeline (Hulick, cited in Strahm, Gray, & Vorvoreanu, 2018).

By the way, ‘onboarding’ is a noun, not a verb, so we can’t say someone’s been ‘onboarded’.

They say it improves user engagement and willingness to pay — but how sure are we?

Onboarding is said to improve user engagement and retention, increase product use, and reduce questions to customer service (Kaya, 2018; cited in Gaal, 2019).

One article says onboarding affects customers’ willingness to pay. I wasn’t going to mention this article because I have reservations about it. But what the heck — we’re all grown-ups. So I’ll tell you about it, then tell you what I think, and let you decide for yourself. Here goes.

The article describes a company’s review of nearly 500 software products and nearly 25,000 customers of those products (Desai, 2019). The review concluded:

  • Customers who liked their onboarding were more willing to pay than the median customer. They were also more likely to stick around for 21 days than those who disliked their onboarding.
  • Customers who disliked their onboarding were less willing to pay than the median customer.
  • Customers mostly didn’t agree on whether onboarding was good or bad.

But the author didn’t describe the methodology. You know how reputable journal articles do that, so someone else can replicate the research? Well, this blog post didn’t. I asked for more information but didn’t hear back. Yes, the author has every right to protect their work and intellectual property — this is a private company, after all, and they weren’t doing academic research.

But I can’t ignore my unease. How can you and I be sure their research was robust? For example, how did they know if people liked onboarding? How did they measure willingness to pay? Are we confusing causation and correlation?

Research on game onboarding lacked control groups

Two studies looked at people during onboarding for mobile games, measuring their perspiration and heart rate. One study even identified heuristics (Petersen, Thomsen, Mirza-Babaei, & Drachen, 2017; Thomsen, Petersen, Drachen, & Mirza-Babaei, 2016).

But the studies compared onboarding processes without examining what happens if there’s no onboarding. Because the studies lacked control groups, they don’t show that onboarding helps.

No clear evidence that onboarding for a course platform worked

Researchers at the Hasso Plattner Institute in Germany studied onboarding for an online learning platform, openHPI (Renz, Hoffmann, Staubitz, & Meinel, 2016). They assigned students who signed up for their first openHPI course to one of two groups.

  • Control group — 172 participants simply received confirmation that they were enrolled.
  • Tour group — 119 participants were automatically taken on a virtual tour of the most important pages and shown the platform’s features.

After a week, the researchers checked how many videos, selftests (I suspect they meant quizzes), and text parts (articles or news pages?) the groups visited. Although those who had done the onboarding tour visited more items, the difference wasn’t conclusive (it wasn’t statistically significant; Figure 2).

Figure 2: Though those given a virtual tour checked out more videos, selftests, and ‘text parts’, the difference could be put down to chance (adapted from Renz et al., 2016).

Too few participants in research on event accreditation software

An intern examined onboarding for Accredion (Gaal, 2019), software that ensures the right people have access to the right places at events.

Seven participants were tasked with setting up an event in Accredion.

  • Three were given an onboarding tour — all three completed the task
  • Four had no onboarding — only one completed the task

But these results could have been coincidental too (they weren’t statistically significant), so the study doesn’t show that onboarding works. There was no difference in the confidence, enjoyment, or motivation of the two groups either (Figure 3). The author admits that the small sample size made conclusive results difficult.
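You can run the numbers yourself to see why seven participants can’t settle the question. The sketch below (my own illustration in Python; the thesis doesn’t say which test was used) applies Fisher’s exact test to the completion counts, computed from first principles with the hypergeometric distribution:

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher's exact p-value for the 2x2 table [[a, b], [c, d]]."""
    row1, col1, n = a + b, a + c, a + b + c + d

    def p(x):
        # Hypergeometric probability of x 'successes' in row 1,
        # given the fixed row and column totals.
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    p_obs = p(a)
    lo, hi = max(0, row1 + col1 - n), min(row1, col1)
    # Two-sided p-value: sum over all tables at least as extreme
    # (probability no greater than the observed table's).
    return sum(p(x) for x in range(lo, hi + 1) if p(x) <= p_obs + 1e-12)

# Gaal (2019): 3 of 3 onboarded participants completed the task,
# vs 1 of 4 without onboarding.
p_value = fisher_exact_2x2(3, 0, 1, 3)
print(round(p_value, 3))  # 0.143 — well above the usual 0.05 threshold
```

So even a seemingly dramatic split (100% vs 25% completion) is quite plausible by chance with groups this small.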

Figure 3: Participants in both groups rated their experiences of Accredion similarly. They also rated the software’s usability in similar ways, as shown by their answers to ten questions on the System Usability Scale. (Adapted from Gaal, 2019.)

Where does that leave us?

We’re left without evidence that onboarding works. All I managed to find was some applied research on the dos and don’ts. If you believe onboarding works, or would like to research its effectiveness yourself (hint!), check out ‘Instructional overlays and coach marks for mobile apps’. The Nielsen Norman Group article offers tips like not overloading users, using visuals where possible, and making visuals easily distinguishable from the actual app.

Caveat: I generally rely on information that is publicly available and not paywalled.

References

Cardoso, M. C. (2017, May). The onboarding effect: leveraging user engagement and retention in crowdsourcing platforms. In Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems (pp. 263–267). ACM. Retrieved from http://library.usc.edu.ph/ACM/CHI%202017/2exab/ea263.pdf

Desai, N. (2019, June 30). A solid customer onboarding experience drives higher willingness to pay and retention [Blog]. Retrieved from https://www.profitwell.com/blog/positive-onboarding-boosts-retention-wtp

Even, A. (2014, November 20). Refining your mobile onboarding experience using visual analytics [Blog]. Retrieved from https://www.smashingmagazine.com/2014/11/refining-your-mobile-onboarding-experience-using-visual-analytics/

Gaal, E. (2019). Improving usability with user onboarding in event accreditation software [Thesis]. Retrieved from https://www.researchgate.net/profile/Erik_Gaal2/publication/335259925_Improving_usability_with_user_onboarding_in_event_accreditation_software/links/5d5ba7d5299bf1b97cf796af/Improving-usability-with-user-onboarding-in-event-accreditation-software.pdf

Kosolapov, A. (2016, June 20). A roadmap to building a delightful onboarding experience for mobile app users [Blog]. Retrieved from https://www.smashingmagazine.com/2016/06/complete-roadmap-building-delightful-onboarding-experience-mobile-app-users/

Petersen, F. W., Thomsen, L. E., Mirza-Babaei, P., & Drachen, A. (2017, October). Evaluating the onboarding phase of free-to-play mobile games: A mixed-method approach. In Proceedings of the Annual Symposium on Computer-Human Interaction in Play (pp. 377–388). ACM. Retrieved from https://dl.acm.org/ft_gateway.cfm?id=3125499

Renz, J., Hoffmann, D., Staubitz, T., & Meinel, C. (2016, April). Using A/B testing in MOOC environments. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (pp. 304–313). ACM. Retrieved from https://s3.xopic.de/openhpi-public/pages/about/1sP5SeeH1RA1ixdAnnqntE/p304-renz.pdf

Strahm, B., Gray, C. M., & Vorvoreanu, M. (2018, June). Generating mobile application onboarding insights through minimalist instruction. In Proceedings of the 2018 Designing Interactive Systems Conference (pp. 361–372). ACM. Retrieved from https://www.researchgate.net/profile/Colin-Gray-2/publication/324169840_Generating_Mobile_Application_Onboarding_Insights_Through_Minimalist_Instruction/links/5ac3605b0f7e9bfc045fe94f/Generating-Mobile-Application-Onboarding-Insights-Through-Minimalist-Instruction.pdf

Thomsen, L. E., Petersen, F. W., Drachen, A., & Mirza-Babaei, P. (2016, September). Identifying onboarding heuristics for free-to-play mobile games: A mixed methods approach. In International Conference on Entertainment Computing (pp. 241–246). Springer, Cham. Retrieved from https://hal.inria.fr/hal-01640269/file/421094_1_En_24_Chapter.pdf