Methodology Explainer

House Effects in UK Polling: Why Pollsters Disagree


Open the UK polling tables in any given week and you will find Labour anywhere from 24% to 30%, Reform UK anywhere from 24% to 30%, and the Conservatives anywhere from 19% to 25%. These differences are not simply sampling error. They are systematic, persistent, and directional. YouGov consistently shows somewhat different figures from Techne. Kantar consistently differs from Redfield & Wilton. These stable inter-firm differences are what political scientists and data journalists call house effects.

Understanding house effects is essential for anyone who wants to read UK polls intelligently. A single poll from one firm can be misleading not because the firm made a mistake, but because all of its polls are biased in the same direction relative to the true value. That direction can only be established by comparing firms over time.

What causes house effects?

House effects arise from methodological choices that each firm makes and holds constant from survey to survey. When a choice systematically inflates or deflates a particular party's measured support, the result is a house effect for that party. The main sources are:

1. Panel composition

Online panel members are not a random sample of the British public. They are people who voluntarily joined a research panel — typically more politically interested, more internet-savvy, and in some panels younger than average. Different panels have different demographic and political profiles. A panel that over-represents people who read the Express newspaper will over-represent Reform UK voters. A panel that over-represents Londoners will over-represent Labour voters. Post-fieldwork weighting corrects for demographic imbalances but cannot fully correct for attitudinal imbalances that are not directly measured.

In practice: Techne's panel — recruited partly through Express digital properties — is likely to contain more Reform-sympathetic respondents than YouGov's panel, even after age and demographic weighting. This helps explain Techne's persistent Reform-high house effect.

2. Weighting choices

All pollsters weight their raw data to match the population, but they do not all weight the same variables. The most politically consequential weighting choice is 2024 General Election vote recall. Firms that weight heavily to the actual 2024 result will push their samples toward the 2024 electorate distribution. Firms that weight less aggressively will produce different results even from identical raw data.

A secondary variable is education. University-educated voters lean Labour and Green and lean against Reform. If one firm weights more aggressively for education, this systematically affects the party shares.
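To make the mechanism concrete, here is a minimal sketch of how reweighting an education-skewed sample shifts a party's headline share. All numbers are invented for illustration, not real polling data:

```python
# Illustrative only: invented group sizes and support rates.
# Raw sample over-represents graduates (60% vs. a 40% population target);
# graduates are assumed far less Reform-inclined than non-graduates.
sample_share = {"graduate": 0.60, "non_graduate": 0.40}
target_share = {"graduate": 0.40, "non_graduate": 0.60}
reform_support = {"graduate": 0.15, "non_graduate": 0.35}

# Unweighted estimate uses the raw (skewed) sample composition.
unweighted = sum(sample_share[g] * reform_support[g] for g in sample_share)

# Weighted estimate re-balances the groups to the population target.
weighted = sum(target_share[g] * reform_support[g] for g in target_share)

print(f"Unweighted Reform share: {unweighted:.1%}")  # 23.0%
print(f"Weighted Reform share:   {weighted:.1%}")    # 27.0%
```

A four-point swing from a single weighting variable is deliberately exaggerated here, but the direction of the effect is exactly what drives real inter-firm gaps.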

3. Question wording and order

The precise wording of the voting intention question, and what comes before it in the survey, affects responses. A question that lists party names in a specific order may slightly inflate the first-named party. Most major UK firms use broadly similar question wording (the BPC has encouraged standardisation), but subtle differences remain.

4. Undecided treatment

How a firm handles undecided respondents is one of the most technically significant house effect sources. Firms that ask a follow-up squeeze question and include partial leaners in the headline figure will show different results from firms that exclude all undecideds from the denominator. With typically 25–35% of respondents expressing uncertainty, the undecided treatment can shift reported party shares by 2–4 points.
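A toy calculation, again with invented counts, shows how the same raw responses produce different headline figures under the two treatments:

```python
# Illustrative only: invented response counts from a notional 1,000-person poll.
responses = {"Labour": 260, "Reform": 270, "Con": 210, "Other": 60, "Undecided": 200}
decided = sum(v for k, v in responses.items() if k != "Undecided")  # 800

# Treatment A: drop all undecideds from the denominator.
labour_a = responses["Labour"] / decided

# Treatment B: a squeeze question recovers leaners; suppose (hypothetically)
# 120 undecideds name a party when pressed, 60 of them Labour.
labour_b = (responses["Labour"] + 60) / (decided + 120)

print(f"Labour, undecideds excluded: {labour_a:.1%}")  # 32.5%
print(f"Labour, leaners squeezed in: {labour_b:.1%}")  # 34.8%
```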

5. Mode effects

Whether a poll is conducted online or by telephone can produce different results. Telephone respondents may be more likely to give socially acceptable answers or to follow interviewer-provided response options differently than online self-completers. Kantar's telephone component likely explains part of its divergence from fully online firms.

Estimated house effects: May 2026

These estimates are based on systematic comparison of each firm's recent polling against the cross-firm average. A positive adjustment means the firm tends to show that party higher than the average; a negative adjustment means it shows the party lower than average.

Pollster             Labour adj.   Con adj.   Reform adj.   LD adj.   Key characteristic
YouGov                  +1.0         -0.5        -0.5         +0.5    Slightly Labour-high; close to centre
Redfield & Wilton       +0.5          0.0        -0.5          0.0    Close to average; slight Labour-high tendency
Opinium                  0.0         -0.5        -1.0         +0.5    Slightly Reform-low; near average overall
Ipsos                   -1.0         -1.0        +2.0         +0.5    Consistently Reform-high vs. peers
Techne                  -1.0          0.0        +2.5         -0.5    Strongest Reform-high effect in market
Kantar                  -1.5         +0.5        +1.5          0.0    Labour-low, Reform-high; phone mode effect
Survation               -0.5         -1.5        +2.0         +0.5    Con-low, Reform-high; strong hybrid effect
Savanta                 -0.5          0.0        +0.5         -1.0    Close to average; slightly LD-low
More in Common          -1.5         +2.5        -1.5         +1.0    Strongest Con-high / Reform-low effect

Adjustments are estimates based on average deviations from the cross-firm mean over the preceding 3 months. Positive = firm shows higher than average; negative = lower than average. All figures in percentage points.

The Reform UK house effect: Techne vs. YouGov

The most discussed house effect in 2026 UK polling is the persistent gap between Techne and YouGov on Reform UK. Across the past twelve months of polling, Techne has consistently shown Reform UK approximately 2–3 points higher than YouGov. This is not random variation — the gap is present in virtually every weekly comparison.

The leading explanation is panel composition. Techne recruits respondents partly through digital advertising on Express Group properties (the Daily Express and Sunday Express), which have a strongly pro-Reform readership. Respondents recruited via these channels are more likely to be Reform-sympathetic than respondents recruited through the general online population that YouGov draws from. Even after age and demographic weighting, this attitudinal skew persists because it is not directly measured and corrected.

A secondary explanation involves question design. Techne uses a relatively direct voting intention question without a strong squeeze-question follow-up for undecideds, and the party list order in their questionnaire has historically placed Reform UK prominently. Whether order effects explain the full gap between Techne and YouGov is debated among methodologists.

For context: if Techne showed Reform at 29% and YouGov showed Reform at 26% in the same week, neither figure is necessarily right. The most defensible estimate of true Reform support is somewhere between the two — and the cross-firm average, adjusted for house effects, is the best single estimate. See the poll of polls for the house-effect-adjusted cross-firm average.
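Using this page's estimated Reform adjustments (Techne +2.5, YouGov −0.5), a quick check shows how subtracting each firm's house effect pulls the two figures together:

```python
# House-effect adjustments from the table above (percentage points).
house_effect = {"Techne": +2.5, "YouGov": -0.5}
raw_reform = {"Techne": 29.0, "YouGov": 26.0}  # the worked example in the text

# Adjusted figure = raw figure minus the firm's estimated house effect.
adjusted = {firm: raw_reform[firm] - house_effect[firm] for firm in raw_reform}
print(adjusted)  # both firms land on 26.5
```

The convergence is exact here only because the invented raw figures match the estimated adjustments; in practice the adjusted figures merely cluster more tightly than the raw ones.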

More in Common: the Conservative outlier

More in Common consistently shows the Conservatives significantly higher than the polling average — typically 2–3 points above other firms. In May 2026, when most pollsters show the Conservatives at 21–24%, More in Common typically shows them at 24–26%. Simultaneously, they show Reform UK lower and Labour lower.

More in Common is a thinktank-turned-polling-firm focused on social attitudes and values-based segmentation. Their methodology places more emphasis on education-level weighting than some competitors. The likely explanation for their Conservative-high house effect is heavier non-graduate weighting: non-graduates are more likely to be Conservative, and More in Common's methodology effectively upweights this group relative to other firms.

How cross-firm averages correct for house effects

The most robust approach to correcting for house effects is a weighted cross-firm average that applies a house-effect adjustment to each firm's data before averaging. This is the method used on this site for the voting intention tracker and poll of polls.

The process has four steps:

  1. Estimate house effects: For each firm and party, calculate the average deviation from the cross-firm mean over the preceding 90 days.
  2. Apply adjustments: Subtract each firm estimated house effect from its raw reported figures to produce house-effect-adjusted figures.
  3. Calculate weighted average: Combine the adjusted figures using a weighting scheme that accounts for sample size and recency. More recent polls receive higher weight.
  4. Rescale to sum to 100: After adjustment, party shares may not sum to exactly 100%. A proportional rescaling step ensures consistency.
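The four steps above can be sketched in a few lines. All figures below are invented, and the 7-day half-life recency weight is an assumption for illustration, not this site's actual weighting scheme:

```python
# Step 1: estimate each firm's Reform house effect as its average
# deviation from the weekly cross-firm mean (invented history).
HISTORY = {
    "Techne":  [29.0, 28.5, 29.5],
    "YouGov":  [26.0, 26.5, 26.0],
    "Opinium": [26.5, 26.0, 27.0],
}
weekly_mean = [sum(week) / len(week) for week in zip(*HISTORY.values())]
house = {
    firm: sum(s - m for s, m in zip(shares, weekly_mean)) / len(shares)
    for firm, shares in HISTORY.items()
}

# Steps 2 and 3: subtract house effects from this week's raw figures,
# then average with weights that decay by poll age (7-day half-life).
CURRENT = [("Techne", 1, 29.5), ("YouGov", 3, 26.0), ("Opinium", 5, 26.5)]
weights = [0.5 ** (days / 7.0) for _, days, _ in CURRENT]
adjusted = [share - house[firm] for firm, _, share in CURRENT]
reform = sum(w * a for w, a in zip(weights, adjusted)) / sum(weights)

# Step 4: rescale all party shares to sum to 100 (other shares invented).
shares = {"Reform": round(reform, 1), "Labour": 26.1, "Con": 21.3,
          "LD": 12.8, "Green": 10.9}
total = sum(shares.values())
rescaled = {p: round(100 * v / total, 1) for p, v in shares.items()}
print(rescaled)
```

A real implementation would also weight by sample size and use the full 90-day window described in step 1, but the structure is the same.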

Historical stability of house effects

House effects tend to be persistent but not permanent. Firms that change their methodology — as Ipsos did when transitioning to KnowledgePanel — or that change their panel recruitment approach may see their house effect shift significantly. The estimates presented here are updated quarterly.

The most stable house effects over the past five years have been Techne (consistently Reform-high), More in Common (consistently Conservative-high), and YouGov (consistently Labour-slight-high). The least stable have been Survation and Deltapoll, whose house effects have varied more with fieldwork timing and client-specific methodology adjustments.

Track the impact of house effects on the current polling picture on the poll of polls page, which shows both raw firm-by-firm figures and the adjusted cross-firm average side by side.
