Glossary

Lyssna

Looking to learn more about Lyssna, or hire top fractional Lyssna experts? Pangea is your resource for cutting-edge technology built to transform your business.
A Pangea Expert Glossary Entry
Written by John Tambunting
Updated Feb 20, 2026

What is Lyssna?

Lyssna is a remote user research platform that lets product and design teams run studies and collect participant feedback without coordinating in-person sessions or building their own recruiter relationships. Originally launched as UsabilityHub in Australia, the platform rebranded to Lyssna in 2023 to reflect an expanded research scope. It supports first-click tests, five-second tests, card sorting, tree testing, prototype evaluation, surveys, and recorded usability sessions — covering both quantitative preference tests and qualitative session recordings in one place. The integrated participant panel of 530,000+ members across dozens of countries enables same-day recruiting for most studies. In early 2026, Lyssna launched AI-assisted participant targeting in its Labs feature, which generates demographic filter suggestions from a natural-language description of your target user.

Key Takeaways

  • Covers both quantitative preference tests and recorded usability sessions in one platform without additional tools.
  • Built-in panel of 530,000+ participants enables same-day recruiting, but skews heavily toward US-based respondents.
  • Platform subscription and panel credits are billed separately — panel costs can easily add $200–$400 per study.
  • Tree testing and card sorting tools are more mature than Maze's standard-tier equivalents, making it the stronger IA research option.
  • AI-assisted targeting launched in 2026 speeds up study setup but does not expand the underlying panel size.

What Lyssna Does Well

Lyssna's strength is removing the coordination overhead from early-stage design validation. Think of it like the difference between scheduling a usability lab versus running a five-second test over Slack — Lyssna occupies the fast, low-friction end of the research spectrum. First-click and five-second tests can be built, distributed, and analyzed in a single afternoon. Card sorting and tree testing are standout capabilities: Lyssna's information architecture tools are more accessible and more fully featured at standard pricing tiers than what competing platforms like Maze include. For teams running IA audits, navigation redesigns, or labeling studies, this matters. Prototype testing connects with Figma so researchers can send interactive screens directly to participants and capture where users click, hesitate, or exit — without writing a line of code.

Pricing: The Two-Product Model

Lyssna's pricing has a built-in gotcha that catches teams after initial adoption. The platform subscription and the participant panel are billed independently. The Free plan covers unlimited tests with your own participants. The Basic plan runs approximately $75/month (billed annually) and adds analysis features and higher response caps. Pro runs approximately $175/month annually and unlocks team collaboration, advanced logic, and priority support. Enterprise plans are custom-quoted.

The real variable is panel credits: 1 credit per minute of test length per participant response, purchased separately as-needed with bulk discounts. A 10-minute study to 50 participants can easily cost $200–$400 in credits on top of the subscription. Teams that budget only for the subscription and then rely heavily on the panel routinely encounter cost overruns on their first major study.
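The credit math above can be sketched as a quick estimator. The one-credit-per-minute-per-response rule comes from the article; the per-credit price band is an assumption back-derived from the $200–$400 figure for a 10-minute, 50-participant study, since actual bulk-discount pricing varies.

```python
def panel_cost(minutes: int, participants: int,
               price_low: float = 0.40, price_high: float = 0.80):
    """Estimate Lyssna panel credits and a rough cost band.

    Credit rule: 1 credit per minute of test length per participant
    response. The default per-credit prices are assumptions derived
    from the article's example, not published rates.
    """
    credits = minutes * participants  # 1 credit/min/response
    return credits, credits * price_low, credits * price_high

credits, low, high = panel_cost(10, 50)
print(f"{credits} credits, ${low:.0f}-${high:.0f}")  # 500 credits, $200-$400
```

Running the estimator before launching a study makes the two-product billing visible up front: the subscription fee stays flat while panel spend scales linearly with both test length and sample size.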

Lyssna vs. Maze

These two platforms compete for the same lean-team, fast-iteration UX research budget, but they make different bets. Maze runs a panel of 3M+ participants with 400+ targeting attributes — choose it when recruiting niche B2B segments like enterprise IT buyers or industry-specific professionals is a priority. Maze also offers stronger support for in-session audio and video clips. Lyssna wins on information architecture research: its card sorting and tree testing tools are more complete at comparable price tiers, and the platform supports a broader range of non-prototype test types. For teams doing IA audits, navigation redesigns, or mixed-method studies beyond prototype testing, Lyssna provides more out of the box. Teams focused primarily on iterating Figma prototypes with large sample sizes should lean toward Maze.

The Panel Quality Problem Nobody Mentions

Lyssna's 530K-person panel looks sufficient on paper, but the practical constraints surface quickly for teams with specific recruiting needs. The panel skews heavily US-based, and researchers targeting non-English-speaking or regional audiences regularly hit minimum sample thresholds before studies close. The screener itself is limited: you cannot write custom screening questions or include images, and the available job-title filters are inconsistent, with common titles missing. Some researchers running B2B software studies have reported abandoning Lyssna's panel entirely and falling back on external recruiting tools when they need finance leaders, healthcare administrators, or enterprise software buyers — audiences that exist in Maze's larger panel but are underrepresented in Lyssna's.

Response quality on open-ended questions is also inconsistent. Reviewers on Capterra and G2 flag low-effort responses — participants filling in irrelevant text without engaging with the actual test. Closed-format tests (click tests, card sorts) are more reliable since engagement is measured behaviorally rather than through self-reported text.

Who Hires for Lyssna Skills

Lyssna shows up most in job postings for UX researchers, product designers, and product managers at Series A to Series C SaaS companies where a small research team owns the full research function. Digital agencies conducting research sprints for clients and e-commerce product teams validating navigation and checkout flows are also common contexts. Lyssna expertise is rarely a standalone credential — it appears alongside Figma, Dovetail, and Notion in typical UX research tool stacks. Fractional UX researchers are frequently brought in for specific project types: information architecture audits, usability validation rounds ahead of engineering handoff, or first-click testing for redesigns. The low ramp-up time makes it well-suited to fractional engagements where a researcher needs to deliver results within one or two weeks.

The Bottom Line

Lyssna occupies a practical middle ground in the UX research tool landscape: more versatile than single-method tools, more affordable than enterprise platforms like UserTesting, and genuinely strong on information architecture research. Its two-part pricing model requires careful budgeting, and the panel's geographic skew limits its usefulness for teams with niche or non-US recruiting needs. For lean product teams and fractional UX researchers who need to run validated studies fast — especially IA studies, prototype tests, and first-click validation — Lyssna delivers results with minimal setup friction.

Lyssna Frequently Asked Questions

Is Lyssna the same as UsabilityHub?

Yes. Lyssna was previously called UsabilityHub before rebranding in 2023. The platform expanded its feature set beyond simple preference tests and renamed itself to reflect that broader scope. All historical data and accounts migrated automatically.

Does Lyssna include participants in its subscription price?

No. Lyssna's platform subscription and its participant panel are billed separately. The subscription covers test building, analysis, and team seats. Panel participants are purchased with credits on a pay-as-you-go basis — 1 credit per minute of test length per participant response. Teams that plan to use the panel heavily should budget for both costs separately.

How does Lyssna compare to UserTesting for enterprise research teams?

UserTesting is built for large research teams with enterprise budgets — pricing starts around $15,000 per year and emphasizes moderated video sessions with a large screened panel. Lyssna is significantly more affordable and better suited to lean teams or individual researchers running unmoderated studies. Many enterprise organizations use both: Lyssna for fast quantitative validation and UserTesting for in-depth qualitative sessions.

Can Lyssna test live websites or only prototypes?

Lyssna does not support live-site behavioral testing — it cannot inject tracking into production websites or generate heatmaps from real traffic. It tests prototypes (including Figma connections), static images, and task-based interactions within the Lyssna environment. For live-site behavioral analytics, tools like FullStory or Hotjar are typically paired with Lyssna.

How quickly can a fractional UX researcher ramp up on Lyssna?

Most researchers are productive within one to two days. The platform's guided study builder handles most configuration automatically, and documentation is thorough. Standard test types like first-click tests, card sorts, and five-second tests require no specialized Lyssna training. More advanced features like tree testing with multiple tasks and conditional branching logic take a bit longer to master but are still accessible within a week.