How to Do Usability Testing
A practical usability testing guide — moderated vs unmoderated, writing task scenarios, recruiting participants, and using Maze and Hotjar.
Usability testing is the fastest way to find out where your design breaks. Five users finding the same problem independently is more convincing than any internal debate, and it takes less time than most teams assume.
Moderated vs unmoderated
These are fundamentally different research methods. Choosing the wrong one for your question wastes time.
Moderated testing: You watch a user interact with your product in real time, ask follow-up questions, and can probe when something interesting happens. Sessions are typically 45–60 minutes. You control the flow. You get nuanced qualitative data.
Best for: early-stage designs, complex flows, situations where you have hypotheses you want to probe. When you don't yet know what you don't know.
Unmoderated testing: Users complete tasks on their own time using a tool like Maze. You get quantitative data — completion rates, time on task, click heatmaps — across many more participants without scheduling overhead.
Best for: validating specific design decisions, measuring improvement between versions, getting quick directional signal on prototypes when you already know the questions.
Most teams should run both, for different questions. Don't use unmoderated testing when you need to understand why something is happening. Do use it when you need scale or speed.
Writing good task scenarios
Bad task scenarios produce bad data. The most common mistake: asking leading questions.
Bad: "Find the premium upgrade option." Good: "You want to unlock team collaboration features. Show me how you'd do that."
Bad: "Use the search feature to find a product." Good: "You're looking for a blue running jacket under $150. Show me what you'd do."
Scenarios should:
- Describe a realistic situation, not an instruction
- Use the user's language, not your product's language
- Have a clear success criterion (so you know when they've completed it)
- Be specific enough to be actionable
Aim for 3–5 tasks per session. More than that and participants get fatigued.
Recruiting participants
The ideal participant is someone who matches your target user profile and hasn't seen your product before.
Options for recruiting:
Your own users: The highest quality for products with existing users. Email them, offer a $30–50 gift card. Response rates vary — plan on contacting 3x more people than you need sessions.
UserTesting.com panel: Fast — you can have results in hours. More expensive at $50–100+ per session depending on the panel. Good for general consumer products.
Prolific: Better when you need specific screener criteria. The participant pool is vetted and generally attentive. Cheaper than UserTesting.
LinkedIn outreach: Time-consuming but effective for B2B products. You can target very specific job titles.
You need fewer participants than most people think. Five moderated sessions typically reveal 85% of the major usability issues. Don't wait until you have 20 participants scheduled — start with five and iterate.
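The 85% figure traces back to the Nielsen/Landauer model, which treats issue discovery as independent trials: if each participant finds a given issue with probability p (their empirical average was about 0.31), n participants find a share of 1 − (1 − p)^n. A quick sketch, with p = 0.31 as the assumed rate:

```python
def issues_found(n: int, p: float = 0.31) -> float:
    """Expected share of usability issues uncovered by n participants,
    per the Nielsen/Landauer model (p = 0.31 is their empirical average,
    assumed here)."""
    return 1 - (1 - p) ** n

for n in (1, 3, 5, 10):
    print(f"{n} participants -> {issues_found(n):.0%} of issues")
```

With n = 5 this comes out near 85%, which is why five sessions per round, iterated, beats one big twenty-person study: the sixth through twentieth participants mostly re-find issues you already know about.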
Running a moderated test session
Set the expectation at the start: you're testing the design, not the person. There are no wrong answers. Ask them to think aloud.
The think-aloud protocol: ask users to narrate what they're doing and thinking as they go. This surfaces confusion that silent observation would miss.
Your job during the session: observe, not guide. When a user gets stuck, resist the urge to help. Their confusion is the data. Let them struggle for a reasonable amount of time, then prompt: "What would you do next?" not "Try clicking the button on the right."
Take notes in real time. Use a structured format: timestamp, what happened, participant quote, your interpretation. Separate observation from interpretation — "they clicked the wrong button three times" is observation; "they found the navigation confusing" is interpretation.
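One way to keep that discipline is to make the observation/interpretation split structural. A minimal sketch of a note record (the field names here are illustrative, not a standard):

```python
from dataclasses import dataclass

@dataclass
class SessionNote:
    """One note per moment of interest, keeping observation
    separate from interpretation. Field names are assumptions."""
    timestamp: str        # e.g. "08:12" into the session
    event: str            # what happened, factually
    quote: str            # participant's own words, verbatim
    interpretation: str   # your reading, clearly separated

note = SessionNote(
    timestamp="08:12",
    event="Clicked 'Settings' three times before finding 'Team'",
    quote="I assumed team stuff would live under settings.",
    interpretation="'Team' label may not match the user's mental model",
)
```

Whether you use a spreadsheet, a doc template, or a script like this, the point is the same: the `event` and `quote` columns are evidence; the `interpretation` column is a hypothesis you can revisit.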
The Figma → Maze workflow
For unmoderated testing of Figma prototypes, Maze is the standard tool.
- Build your prototype in Figma and share the link
- Create a new study in Maze, paste the Figma link
- Add your tasks with start and goal screens
- Maze generates a shareable link to send to participants
- Results come in as participants complete the study — completion rates, misclick rates, heatmaps, time on task
Maze's free tier allows 1 active study with up to 5 responses. Starter plan is $99/month for 3 studies and 30 responses.
Using Hotjar for live-site testing
Hotjar works on live products rather than prototypes. It captures:
- Heatmaps: Where users click, scroll, and move their mouse
- Session recordings: Actual recordings of individual user sessions
- Surveys and feedback widgets: In-context questions you can show at specific moments
Use Hotjar when you want to understand behavior on your real product — not in a controlled test environment. A heatmap showing 80% of clicks going to a non-clickable element tells you something Figma testing can't.
Hotjar's free plan supports 35 daily sessions. The Plus plan starts at $32/month.
Analyzing and acting on findings
After testing, organize findings by frequency and severity.
Frequency: How many participants experienced this issue? Severity: Does it block task completion, or is it just friction?
Prioritize issues that are both frequent and severe. Document them with evidence — quotes, screenshots, timestamps from recordings.
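A frequency × severity score makes this ranking mechanical. The scoring scheme below is one reasonable convention, not a standard: severity 1 = friction, 2 = major problem, 3 = blocks task completion, multiplied by the number of affected participants.

```python
# Illustrative issue data -- in practice this comes from your session notes.
issues = [
    {"issue": "Checkout button hidden below fold", "frequency": 4, "severity": 3},
    {"issue": "Search ignores typos",               "frequency": 5, "severity": 1},
    {"issue": "Upload fails silently",              "frequency": 2, "severity": 3},
]

# Rank by frequency x severity, highest first.
ranked = sorted(issues, key=lambda i: i["frequency"] * i["severity"], reverse=True)

for item in ranked:
    score = item["frequency"] * item["severity"]
    print(f"{score:>3}  {item['issue']}")
```

The exact weights matter less than the habit: a blocking issue hit by four of five participants always outranks cosmetic friction, and the score gives you a defensible order when the team debates what to fix first.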
The output of a usability test isn't a report. It's a prioritized list of design changes with the reasoning to back them up. Share it with your team, link the relevant recordings for anyone skeptical, and make the fixes.
Then test again. Testing is most valuable as a regular practice, not a one-time event.
Related
Maze Review 2026: Prototype Testing That Gives You Real Numbers
Maze connects to Figma prototypes, defines tasks, and collects task completion rates, time-on-task, and click maps. Free plan (1 study/month). Starter at $99/month.
Hotjar Review 2026: The Standard for Understanding What Users Actually Do
Hotjar offers heatmaps, session recordings, and feedback surveys to show how users behave on your live product. Free plan exists. Plus at $32/month. Here's the honest breakdown.
Best Tools for User Research in 2026
The best user research tools ranked — from live-site behavior analytics to prototype testing to moderated sessions. Exact pricing and when to use each.