We scored 50 B2B SaaS homepages across 8 behavioral psychology dimensions. The average: 61.3 out of 100. The highest: 72. 80% fail on the same dimension. Here's the data.
How we scored 50 homepages across 8 dimensions grounded in behavioral psychology research.
We selected 50 B2B SaaS companies across 10 G2 categories, picking the top 3 and bottom 2 performers by G2 ranking in each category. Every homepage was scored on 8 dimensions, each worth 0-100 points. The overall SignalScore is a weighted average.
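To make the arithmetic concrete, here is a minimal sketch of a weighted-average SignalScore. The dimension names come from the report, but the weights are hypothetical — the exact weighting is not published, so equal weights are shown as a stand-in:

```python
# Illustrative sketch of a weighted-average SignalScore.
# The weights below are hypothetical; the report does not publish its
# actual weighting, so equal weights (a plain mean) are used here.
DIMENSIONS = [
    "The 5-Second Verdict", "The Story Arc", "The Mirror Test",
    "The Status Quo Tax", "The Safety Net", "The Proof Stack",
    "The Logo Test", "The Close",
]

def signal_score(scores: dict[str, float], weights: dict[str, float]) -> int:
    """Weighted average of 0-100 dimension scores, rounded to an integer."""
    total_weight = sum(weights[d] for d in DIMENSIONS)
    weighted = sum(scores[d] * weights[d] for d in DIMENSIONS)
    return round(weighted / total_weight)

# With equal weights, the score reduces to a plain mean of the 8 dimensions.
equal = {d: 1.0 for d in DIMENSIONS}
scores = {d: 60.0 for d in DIMENSIONS}
signal_score(scores, equal)  # -> 60
```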
The 8 dimensions are built on established research, not opinion. Each maps to specific buyer behaviors documented in academic and practitioner literature.
Value proposition clarity. Can a visitor articulate what you do within 5 seconds? Based on Nielsen Norman Group eye-tracking research.
Message hierarchy and flow. Does the page tell a coherent story from headline to CTA? Grounded in narrative persuasion theory.
Customer-centricity and JTBD framing. Does the page speak to what the buyer wants to accomplish? Based on Christensen's Jobs-to-Be-Done framework.
Stakes and cost of inaction. Does the page articulate why doing nothing is the worst option? Grounded in Kahneman's loss aversion (Prospect Theory).
Risk reduction and buyer confidence. Does the page reduce perceived risk of switching? Based on Cialdini's principles and Gartner CEB buyer research.
Credibility and social proof. Are there logos, case studies, stats, and testimonials? Grounded in Cialdini's social proof and authority principles.
Competitive differentiation. Could you swap in a competitor's logo and the page still makes sense? Based on Ries/Trout positioning theory.
Conversion architecture. Is the CTA clear, is there a low-friction path, are objections handled? Based on Baymard Institute UX research.
Every dimension maps to peer-reviewed or widely validated frameworks: Kahneman & Tversky's Prospect Theory (loss aversion), Cialdini's Principles of Persuasion (social proof, reciprocity), Christensen's Jobs-to-Be-Done (customer-centricity), Dixon & Adamson's Challenger methodology (reframing the status quo), Gartner CEB's B2B buyer journey research, and Baymard Institute's UX conversion studies.
Each homepage was analyzed by evaluating the full-page content against dimension-specific rubrics (0-100 scale per dimension). Scoring prioritized observable evidence: what is on the page, not what we know about the company. A market leader with a poorly written homepage gets scored on its homepage, not its reputation.
Beyond the 8 dimensions, we also captured quantitative page-level metrics: word count above the fold (average: 98.7, median: 45.5), buyer-centric vs. company-centric sentence counts, social proof type and placement, competitive claim presence, buyer persona clarity, and primary CTA text. These data points inform the analysis throughout the report.
All 50 companies received individual GTM teardown analyses (linked throughout this report). Those teardowns provide company-specific findings, recommendations, and the full scoring rationale.
The overall picture is not flattering. Most B2B SaaS homepages are mediocre at best, with a universal blind spot that almost nobody addresses.
The Status Quo Tax averaged 41.5 out of 100. That is the lowest average of any dimension, 2.1 points below the next-worst (The Mirror Test at 43.6). A full 80% of companies scored below 50 on this dimension. Only 1 scored above 70.
This is the most consistent finding in the entire study: 35 out of 50 companies (70%) had Status Quo Tax as their single weakest dimension. Only 1 company out of 50 (2%) explicitly stated the cost of inaction on its homepage.
Kahneman's research tells us losses hit 2x harder than equivalent gains. Yet nearly every B2B SaaS homepage focuses exclusively on features and benefits while ignoring the question that actually drives buying decisions: "What happens if I keep doing what I'm doing today?"
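For readers who want the mechanics behind that "2x" figure, the asymmetry comes from Kahneman and Tversky's value function. The sketch below uses the commonly cited parameter estimates from their 1992 cumulative prospect theory paper (alpha ≈ 0.88, lambda ≈ 2.25):

```python
# Prospect Theory value function (Kahneman & Tversky), using the
# commonly cited parameter estimates alpha ~= 0.88 and lambda ~= 2.25:
# losses loom roughly twice as large as equivalent gains.
def value(x: float, alpha: float = 0.88, lam: float = 2.25) -> float:
    if x >= 0:
        return x ** alpha          # concave over gains
    return -lam * (-x) ** alpha    # steeper (loss-averse) over losses

gain = value(100)     # subjective value of gaining $100
loss = value(-100)    # subjective value of losing $100
abs(loss) / gain      # -> ~2.25: the loss weighs about 2x the gain
```

This is why a homepage that only promises gains leaves its most powerful lever untouched: the felt cost of staying put.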
68% of homepages show no social proof above the fold. Of the 50 companies analyzed, 34 had zero visible social proof (no logos, no testimonials, no metrics) in their initial viewport. Only 5 displayed customer logos prominently. Just 4 featured testimonials.
This is despite The Proof Stack averaging 57.3, putting it in the middle of the pack. Companies tend to bury their best credibility signals below the fold, where most visitors never scroll. Cialdini's research is unambiguous: social proof is most effective when it appears early in the decision process, not as an afterthought near the footer.
The Proof Stack is also one of the most polarized dimensions in the study, with a standard deviation of 14.2. 15 companies scored above 70, but 14 scored below 50. Companies either invest in social proof or completely ignore it. There is almost no middle ground.
Only 36% of companies make any explicit competitive claim on their homepage. The remaining 64% rely on generic "all-in-one," "AI-powered," or "purpose-built" language that could describe any competitor in their category.
The Logo Test (competitive differentiation) averaged 52.8, making it the third-weakest dimension. 38% of companies scored below 50. Only 3 companies scored above 70.
Ries and Trout's positioning research says differentiation must be specific, ownable, and immediately apparent. Most B2B SaaS homepages fail all three tests.
When we identified each company's single strongest dimension, two stood out:
The pattern is clear: companies invest in the visible, tactical elements (proof, CTAs, headlines) and underinvest in the strategic ones (stakes, differentiation, buyer framing).
We evaluated whether each homepage clearly communicated who the product is for. Of the 43 companies where we could fully evaluate this: 32 (74%) had a clear buyer persona. 8 (19%) were vague about their target buyer. 3 (7%) had no discernible buyer persona at all.
Having a clear persona does not guarantee good messaging. Several companies with crystal-clear buyer personas still scored below 50 overall because they used that clarity to describe product features rather than buyer outcomes. Knowing your audience is table stakes. Speaking to their world, not just at their job title, is what separates the top third from the bottom third.
A pattern runs through every finding above: B2B SaaS companies are good at the visible, tactical elements of homepage messaging and bad at the strategic ones. The three strongest dimensions (The 5-Second Verdict at 67.2, The Close at 65.3, The Story Arc at 58.3) are all execution-layer work: headlines, CTAs, free trials, page structure. The three weakest dimensions (The Status Quo Tax at 41.5, The Mirror Test at 43.6, The Logo Test at 52.8) all require strategic positioning decisions: why change, who benefits, why us.
This is not a copywriting problem. It is a strategy problem. You can A/B test CTA button colors all day, but if your page does not make inaction feel costly, does not frame the conversation around the buyer's world, and does not differentiate from alternatives, no amount of tactical optimization will close the gap.
The companies at the top of the leaderboard are not there because they have better design or bigger budgets. They are there because they made strategic messaging decisions before they started writing copy.
G2 category leaders do score higher on homepage messaging, but the gap is smaller than you'd expect, and it disappears entirely on one dimension.
The gap is +6.7 points in favor of G2 leaders. That is meaningful, but it is not large enough to say that market leaders have figured out homepage messaging. They are simply less bad.
The most interesting patterns emerge when you look at individual dimensions.
Market leadership does not guarantee good messaging. These G2 Top 3 companies scored among the lowest in the entire study:
Product Analytics leads the pack. Marketing Automation and ABM cluster near the bottom. And the companies selling marketing tools? They still have some of the weakest homepages.
Companies that sell marketing tools (Marketing Automation, ABM, Landing Page Builders) have, on average, worse homepages than companies that don't.
A closer look at all eight dimensions, from the industry's biggest blind spot to its standout strengths.
Average: 41.5. That is a failing grade by any standard. 80% of companies scored below 50 on this dimension. The best score was 71 (Akita).
Only 1 out of 50 companies (2%) explicitly stated the cost of inaction on their homepage. This is not just a weakness; it is an industry-wide structural failure.
The research is clear: B2B buyers cite "preference for the status quo" as the #1 reason deals die (Gartner/CEB). Dixon and Adamson's Challenger research shows that the most effective salespeople lead with why change is necessary. But almost no one does this on their homepage.
Average: 67.2. This is the strongest dimension overall and the one with the widest range (18 to 82, a 64-point spread). 32 companies scored above 70 here, more than any other dimension.
The best: ChurnZero and Fireflies.ai, both at 82. Their headlines immediately communicate what the product does and who it's for. The worst: Salesforce Pardot at 18. A marketing automation platform whose headline communicates almost nothing about what the product does or who it's for.
Buyer persona clarity was generally decent: of the 43 companies where we could evaluate it, 32 had a clear buyer persona, 8 were vague, and 3 were absent entirely.
Average: 52.8. Here is the question we asked: "Could you swap in a competitor's logo and the page would still make sense?" For more than a third of companies, the answer was yes.
38% scored below 50 on differentiation. Only 36% of companies made any explicit competitive claim at all. ChurnZero and Amplitude tied for highest at 72, with Outreach close behind at 71.
This connects directly to the "sea of sameness" problem in B2B SaaS. When every company says "powered by AI" and "all-in-one platform," nobody says anything at all.
Average: 65.3, the second-strongest dimension overall. 16 companies scored above 70 and only 4% scored below 50. Most companies know how to put a CTA button on a page.
But this is also where G2 leaders separate from laggards: +6.9 points. Market leaders have invested in conversion architecture: clear CTAs, free trials, demo scheduling, low-friction entry points. Smaller companies often have a single "Contact Sales" button and call it a day.
Best: ChurnZero and Pendo tied at 78, followed by Insider (77), and Amplitude and Salesforce Sales Cloud (75). Worst: Salesforce Pardot at 45.
Average: 58.3. That makes it the 3rd-strongest of the 8 dimensions, though well behind the top two. 22% of companies scored below 50. 6 scored above 70.
No company in the study had Story Arc as their single weakest dimension. It is one of the "easier" dimensions to get partly right because any page with basic sections (hero, features, social proof, CTA) gets partial credit for message hierarchy. The problem is that most companies stop there. They build a brochure, not a narrative.
The research is clear on this: narrative persuasion (Green & Brock, 2000) shows that information presented as a story is more persuasive and more memorable than information presented as a list of facts. The best-scoring companies use a problem-agitate-solution flow or a clear "before/after" progression. Salesforce Sales Cloud (72) moves from the buyer's challenge to the platform's approach to proof. Salesforce Pardot (28) dumps disconnected capability statements with no coherent throughline.
Average: 43.6. The second-weakest dimension in the study. 74% of companies scored below 50. Only 1 company, Kompyte (Semrush) at 71, scored above 70.
The Mirror Test asks a simple question: is this page about the buyer, or about you? Clayton Christensen's Jobs-to-Be-Done research established that buyers do not care about your features. They care about the progress they want to make. Pages framed around "you" and "your outcome" convert. Pages framed around "we" and "our platform" inform at best, bore at worst.
The most striking example: Salesforce Pardot scored 12, the lowest in the study. A Salesforce product whose homepage reads like an internal spec sheet. Kompyte (71) takes the opposite approach, centering the page on what the user accomplishes: winning more deals. The difference between 12 and 71 is not budget or design quality. It is a framing decision.
Average: 55.0. Middle of the pack at 5th of 8. 30% of companies scored below 50. Only 1 company scored above 70 (Pendo at 71).
Most B2B SaaS companies understand, at a basic level, that they need to reduce purchase risk. Free trials, demo buttons, security badges, and customer logos have become standard practice. But the range still spans 49 points (Pendo at 71 to Salesforce Pardot at 22), and the standard deviation of 10.8 means performance varies widely.
Gartner's CEB research shows that 40-60% of B2B deals end in "no decision," not a competitive loss. The buyer simply could not get comfortable enough to commit. The best companies layer multiple confidence signals throughout the page: free trial near the top, customer logos mid-page, security certifications near the footer, and case studies woven into the narrative. Only 1 company (2%) had Safety Net as their weakest dimension, meaning this rarely drags down an overall score. But when it does, the impact is severe.
Average: 57.3. The standard deviation of 14.2 is among the highest of any dimension. 15 companies scored above 70, yet 14 scored below 50. Companies either invest in social proof or completely ignore it.
Unbounce scored 85 on the Proof Stack, the single highest individual dimension score in the entire study across all 400 data points (50 companies x 8 dimensions). Their homepage layers G2 reviews, customer testimonials with named companies, integration partner logos, and clear trust signals throughout. Insider follows at 81 with a similar multi-layered approach. Salesforce Pardot scored 34: a Salesforce product with minimal visible proof on its homepage despite having access to the entire Salesforce customer base.
Cialdini's research on social proof is among the most replicated findings in behavioral science. In B2B, proof takes specific forms: customer logos (most common but weakest), case studies with specific metrics (strongest), analyst endorsements, and compliance/security certifications. The best companies treat proof as a system, placing different types at different points in the page narrative. The worst treat it as a checkbox, dumping a logo bar above the footer and calling it done.
When we identified each company's single weakest dimension, the results were lopsided:
No company had The 5-Second Verdict or The Story Arc as their weakest dimension. Those two are the easiest to get partly right because they are the most visible.
We counted every sentence on 44 homepages and classified them as buyer-centric or company-centric. The results explain why most pages feel like product datasheets.
The average homepage has 17.3 buyer-centric sentences vs. 7.5 company-centric sentences. Averaged per company, the buyer-to-company ratio comes out to 3.8:1. That sounds decent until you look at the median ratio: 2.5:1. A handful of extreme outliers pull the average up. Half of all companies have a ratio of 2.5:1 or lower, meaning they spend nearly as much time talking about themselves as they do talking about the buyer.
The distribution is telling: 18 companies (41%) have a ratio above 3:1, which represents a buyer-first page. 9 companies (20%) clear 5:1. On the other end, 18 companies (41%) fall below 2:1, and 7 companies (16%) have a ratio below 1:1, meaning they use more company-centric language than buyer-centric language on their own homepage.
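The mean-versus-median gap above is ordinary outlier behavior. A toy example with made-up ratios (mirroring the shape of the data, not the actual dataset) shows how a single 24:1 page drags the mean up while the median stays put:

```python
# Toy illustration of why the average ratio overstates the typical page.
# These ratios are invented to mirror the report's distribution shape;
# they are NOT the study's actual per-company data.
from statistics import mean, median

ratios = [0.2, 0.8, 1.5, 2.0, 2.5, 2.5, 3.2, 4.7, 6.0, 24.0]  # hypothetical
mean(ratios)    # -> ~4.74: pulled up by the single 24:1 outlier
median(ratios)  # -> 2.5: what the typical homepage actually looks like
```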
Above the fold matters most. In the first two sections of each homepage, we counted an average of 9.1 buyer-language instances vs. 4.3 "we/our" instances. Companies tend to front-load buyer-centric language where the stakes are highest, then revert to product-speak further down the page. The problem is that most visitors never scroll past the first two sections, so everything below the fold is speaking to an audience that has already left.
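A naive version of this counting can be sketched as a pronoun-based sentence classifier. The word lists below are our own simplification for illustration, not the study's actual rubric — and their crudeness is precisely why raw counts can mislead, as the next finding shows:

```python
# Naive pronoun-based sentence classifier, in the spirit of the counting
# exercise above. The BUYER/COMPANY word lists are our own simplification,
# not the study's rubric.
import re

BUYER = {"you", "your", "yours"}
COMPANY = {"we", "our", "ours", "us"}

def classify(sentence: str) -> str:
    """Label a sentence by which pronoun set it leans on."""
    words = set(re.findall(r"[a-z']+", sentence.lower()))
    b, c = len(words & BUYER), len(words & COMPANY)
    if b > c:
        return "buyer-centric"
    if c > b:
        return "company-centric"
    return "neutral"

classify("Grow your pipeline without growing your team.")   # -> 'buyer-centric'
classify("We built our platform on a unified data model.")  # -> 'company-centric'
```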
Here is the counterintuitive finding: high buyer:company ratios do not reliably predict high Mirror Test scores. Unbounce has a 24:1 ratio and scores 45 on the Mirror Test, not the 80+ you might expect from that ratio. Woopra has a 0.2:1 ratio but scored 48. What gives?
The Mirror Test scores capture something pronoun counting cannot: Jobs-to-Be-Done framing quality. Saying "you" constantly does not help if the framing is still feature-oriented ("you can integrate with 500+ tools" is technically buyer-language but reads as a feature list). Christensen's JTBD framework asks whether the page describes the progress the buyer wants to make, not just whether pronouns are in the right place.
The companies that score well on both the ratio AND the Mirror Test are the ones who use buyer-language to describe buyer outcomes. Kompyte (Mirror Test: 71) frames everything around what the user accomplishes. Salesforce Pardot (Mirror Test: 12, ratio: 4.7:1) proves that raw pronoun counts miss the point. Even with buyer-oriented pronouns, the framing is still about the platform, not the buyer's world.
With a SignalScore of 72, Insider earned the top spot (tied with 6 others). Even the best homepage in the study leaves 28 points on the table.
Their H1 reads: "Be unstoppable in customer engagement"
Insider leads with buyer-focused language and backs it up with the strongest Proof Stack in the study (81). Their homepage layers named customer stories with quantified outcomes, recognizable logos, and analyst recognition.
They scored 78 on 5-Second Verdict, 77 on The Close, and 68 on Safety Net. The page flows logically from value prop to proof to CTA with consistent messaging throughout.
The gap between the best and worst homepage is 40 points. Insider outscores Salesforce Pardot on every single dimension. The biggest gaps: 5-Second Verdict (78 vs. 18, a 60-point gap), Proof Stack (81 vs. 34, 47 points), and Mirror Test (45 vs. 12, 33 points). Pardot's homepage reads like an internal spec sheet. Insider's reads like a conversation with a buyer who has a problem to solve.
Five awards for the companies that stood out, for better or worse.
All 50 companies ranked by SignalScore. Click any company name to read the full teardown.
| Rank | Company | Category | SignalScore |
|---|---|---|---|
| 1 | Insider | Marketing Automation | 72 |
| 2 | ChurnZero | Customer Success Software | 72 |
| 3 | Gainsight | Customer Success Software | 72 |
| 4 | Gong | Conversational Intelligence | 72 |
| 5 | Amplitude | Product Analytics | 72 |
| 6 | Pendo | Product Analytics | 72 |
| 7 | Unbounce | Landing Page Builders | 72 |
| 8 | Salesforce Sales Cloud | CRM Software | 71 |
| 9 | Vitally | Customer Success Software | 68 |
| 10 | Fireflies.ai | Conversational Intelligence | 68 |
| 11 | LogRocket | Product Analytics | 68 |
| 12 | 6sense | Account-Based Marketing | 68 |
| 13 | Salesloft | Sales Engagement | 68 |
| 14 | Outreach | Sales Engagement | 68 |
| 15 | Omniconvert | A/B Testing Software | 68 |
| 16 | Klue | Competitive Intelligence | 68 |
| 17 | Crayon | Competitive Intelligence | 68 |
| 18 | Kompyte (Semrush) | Competitive Intelligence | 68 |
| 19 | Adobe Marketo Engage | Marketing Automation | 64 |
| 20 | Woopra (Appier AIRIS) | Product Analytics | 64 |
| 21 | Contify | Competitive Intelligence | 64 |
| 22 | Keap | CRM Software | 62 |
| 23 | ActiveCampaign | Marketing Automation | 62 |
| 24 | Demandbase One | Account-Based Marketing | 62 |
| 25 | Apollo.io | Sales Engagement | 62 |
| 26 | Mailshake | Sales Engagement | 62 |
| 27 | Swipe Pages | Landing Page Builders | 62 |
| 28 | Instapage | Landing Page Builders | 62 |
| 29 | monday CRM | CRM Software | 58 |
| 30 | Totango | Customer Success Software | 58 |
| 31 | Fathom | Conversational Intelligence | 58 |
| 32 | Chorus by ZoomInfo | Conversational Intelligence | 58 |
| 33 | CallMiner | Conversational Intelligence | 58 |
| 34 | Countly | Product Analytics | 58 |
| 35 | RollWorks | Account-Based Marketing | 58 |
| 36 | Terminus (DemandScience) | Account-Based Marketing | 58 |
| 37 | Cirrus Insight | Sales Engagement | 58 |
| 38 | Kameleoon | A/B Testing Software | 58 |
| 39 | SiteSpect | A/B Testing Software | 58 |
| 40 | Insightly CRM | CRM Software | 54 |
| 41 | VWO | A/B Testing Software | 54 |
| 42 | HubSpot Sales Hub | CRM Software | 52 |
| 43 | HubSpot Marketing Hub | Marketing Automation | 52 |
| 44 | Akita | Customer Success Software | 52 |
| 45 | Foundry ABM | Account-Based Marketing | 52 |
| 46 | Pagewiz | Landing Page Builders | 52 |
| 47 | AB Tasty | A/B Testing Software | 52 |
| 48 | Competitors App | Competitive Intelligence | 51 |
| 49 | LanderPage | Landing Page Builders | 42 |
| 50 | Salesforce Pardot | Marketing Automation | 32 |
If you're a B2B SaaS company reading this, here are the three highest-impact changes you can make to your homepage messaging today.
This is the single biggest opportunity in B2B SaaS homepage messaging. Add a section, even just 2-3 sentences, that articulates what your buyer loses by sticking with their current approach. Frame it in terms of time, money, or competitive disadvantage. 80% of companies fail here. You will immediately stand out.
Only 36% of companies make any explicit competitive claim on their homepage. You do not need to name competitors, but you do need to answer: "Why should I pick you instead of the alternative?" If swapping in a competitor's logo would break your page, you have differentiation. If the page would still work, you need to fix your positioning.
The average homepage has a 3.8:1 buyer-to-company sentence ratio, but the median is just 2.5:1. A few outliers pull the average up. Most companies talk about themselves nearly as much as they talk about the buyer. Rewrite your first two sections to center the buyer's problem and desired outcome. Your features and capabilities belong further down the page, after you have earned attention by proving you understand the buyer's world.
Want to save or share this report?
SignalScore evaluates your homepage across the same 8 dimensions used in this study. Get your score in minutes, not weeks.
Free scorecard delivered via email. Full diagnosis with findings, citations, and prioritized fixes available for $299 after you see your scores.