Deep Analysis

Where Claims Meet Evidence

Vendor assertions in market research are rarely false outright — they are selectively true. HYMBS applies falsification methodology to locate the boundary between what is claimed and what the data supports.

The Problem

Why Vendor Data Cannot Be Its Own Benchmark

In a market where suppliers report their own performance metrics, a structural conflict of interest exists regardless of individual honesty. This is not a claim about deception — it is an observation about incentive architecture.

When a panel company reports a 72% completion rate, there is no public dataset against which to verify this figure. The number exists in isolation, anchored to a proprietary methodology the buyer cannot audit.

Observed Pattern

Across 14 independently sourced B2B research projects analysed by HYMBS, vendor-reported completion rates exceeded independently measured rates by a median of 11.3 percentage points. The gap was consistent across project types and geographies.
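A gap of this kind is a simple paired comparison. The sketch below shows how such a median gap could be computed; the figures are synthetic placeholders for illustration, not the 14 projects analysed by HYMBS.

```python
# Illustrative only: synthetic (vendor-reported %, independently measured %)
# pairs, one per project. These are NOT the HYMBS project data.
from statistics import median

projects = [
    (72.0, 60.5), (68.0, 57.2), (75.0, 63.8),
    (70.0, 59.1), (66.0, 55.4),
]

# Per-project gap in percentage points, then the median across projects
gaps = [vendor - measured for vendor, measured in projects]
print(f"median gap: {median(gaps):.1f} percentage points")
```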

HYMBS Analytical Framework

A Four-Stage Process

Stage 01

Phenomenon

Document the vendor claim precisely as stated. Identify the metric, the scope, and the time period. Ambiguously defined claims cannot be falsified — specificity is required before analysis begins.



Stage 02

Falsification

Design a measurement protocol that could, in principle, disprove the claim. Source independent data from multiple supply paths. A claim that cannot be falsified is not a benchmark — it is a narrative.


Stage 03

Benchmark

Compute the distribution across independently verified data points. Report the median, the interquartile range, and outlier conditions. The benchmark is a distribution, not a number.
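The statistics named in this stage can be sketched directly. The sample values below are hypothetical independently verified completion rates; the outlier rule shown (Tukey's 1.5 × IQR fences) is one common convention, not necessarily the one HYMBS uses.

```python
# Minimal sketch of Stage 03: report a benchmark as a distribution,
# not a single number. Sample rates are hypothetical.
from statistics import quantiles

rates = [54.2, 57.8, 58.9, 60.1, 60.4, 61.7, 62.3, 63.0, 64.5, 71.9]

q1, q2, q3 = quantiles(rates, n=4)   # quartiles; q2 is the median
iqr = q3 - q1

# Tukey fences: points beyond 1.5 * IQR from the quartiles flag as outliers
outliers = [r for r in rates if r < q1 - 1.5 * iqr or r > q3 + 1.5 * iqr]

print(f"median={q2:.1f}  IQR=[{q1:.1f}, {q3:.1f}]  outliers={outliers}")
```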


Stage 04

Truth

State what the data supports, qualified by confidence interval and sample conditions. Where the evidence is insufficient to reach a conclusion, report that explicitly rather than extrapolate.
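One way to qualify a point estimate as this stage describes is a bootstrap confidence interval. The sketch below is a hypothetical illustration using the same toy rates as above; a real analysis would need a far larger sample than ten points for the interval to mean much.

```python
# Hypothetical sketch of Stage 04: attach a 95% bootstrap confidence
# interval to a reported median rather than stating the number bare.
import random
from statistics import median

random.seed(0)
rates = [54.2, 57.8, 58.9, 60.1, 60.4, 61.7, 62.3, 63.0, 64.5, 71.9]

# Resample with replacement, take the median of each resample
boot = sorted(
    median(random.choices(rates, k=len(rates))) for _ in range(10_000)
)
lo, hi = boot[249], boot[9749]   # central 95% of resampled medians
print(f"median={median(rates):.1f}, 95% bootstrap CI ~ [{lo:.1f}, {hi:.1f}]")
```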


For Research Buyers

Three Questions to Ask Every Vendor

How was this metric measured?

Completion rate, incidence rate, and data quality scores all depend on definitional choices made before data collection begins. Ask for the operational definition in writing.

What is the reference population?

A completion rate of 68% may be accurate within a specific panel segment and misleading as a general claim. Ask for the denominator, not just the numerator.
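The denominator point can be made concrete with toy numbers (hypothetical segment names and counts, chosen only to illustrate the arithmetic): the same fieldwork can support a 68% claim for one segment while the overall rate is substantially lower.

```python
# Toy illustration: a segment-level rate vs. the overall rate.
# Segment names and counts are hypothetical.
segments = {
    "core panel":   {"completes": 680, "invited": 1000},   # 68.0%
    "river sample": {"completes": 410, "invited": 1000},   # 41.0%
}

total_completes = sum(s["completes"] for s in segments.values())
total_invited   = sum(s["invited"]   for s in segments.values())

for name, s in segments.items():
    print(f"{name}: {s['completes'] / s['invited']:.1%}")
print(f"overall: {total_completes / total_invited:.1%}")   # 54.5%, not 68%
```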

Can this figure be independently reproduced?

Published benchmarks that cannot be replicated by a third party are assertions, not evidence. A credible benchmark comes with a methodology document, not a sales deck.

"The purpose of a benchmark is not to make one vendor look better than another. It is to give buyers a reference point that exists outside the vendor relationship entirely."

HYMBS does not score or rank vendors. The analytical work here is designed to establish what observable evidence supports — no more, and no less.

Contribute

Have data that challenges a published benchmark?

HYMBS accepts benchmark dispute submissions from research buyers, independent researchers, and institutions. All submissions are reviewed against our falsification protocol before publication.