Editorial Standards

Data accuracy and editorial standards

UKPollingData.com is committed to accuracy, transparency, and editorial independence. This page sets out the standards we hold ourselves to, where our data comes from, how we handle corrections, and how to report an error. We publish this information publicly because we believe readers are entitled to understand exactly how a polling tracker operates and what its limitations are.

No Party Affiliation

UKPollingData.com has no affiliation with any political party, campaign organisation, advocacy group, or political donor. We do not receive funding from any political source. We do not accept commissioned polling from political parties. We do not accept paid editorial content from political campaigns.

This independence is not merely a formal commitment; it is enforced through the practical decisions we make about what to cover and how to cover it. When a poll shows an unusual result that benefits one party, we examine it through the same methodological lens we would apply to a result that benefits any other party. When a polling firm has a documented house effect, we note it — regardless of whether that house effect favours the political left or right.

We do not endorse candidates, parties, or political positions. Our analysis describes what polls show and explains the methodological context; it does not advocate for any particular electoral outcome.

Data Sources

Every voting intention figure published on UKPollingData.com comes from a poll conducted by a member of the British Polling Council (BPC). The BPC is the industry body that sets transparency and disclosure standards for UK polling organisations. BPC membership requires disclosure of methodology, fieldwork dates, sample sizes, and full data tables for every published poll.

Our current data sources include:

  • YouGov — The largest UK polling firm by frequency of publication. YouGov typically publishes voting intention data twice a week. Their methodology uses an online panel with demographic weighting and political weighting based on recalled 2019 vote.
  • Ipsos UK — Monthly Political Monitor, one of the longest-running continuous voting intention series in the UK. Ipsos uses both telephone and online methodologies.
  • Redfield & Wilton Strategies — Near-daily tracker producing high-frequency data useful for detecting short-term movements.
  • Techne UK — Weekly online poll. Techne is known to have a house effect that gives somewhat higher figures to Reform UK relative to the cross-firm average.
  • Deltapoll — Periodic polls, typically monthly or triggered by major political events.
  • Survation — Periodic, with particular expertise in MRP modelling and constituency-level estimation.
  • More in Common — Large-scale research surveys covering political attitudes and values alongside voting intention.

We do not use data from firms that are not BPC members. We do not publish internal party polling or private advocacy surveys. We do not use push polls, voodoo polls, or online opt-in surveys without proper weighting methodology.

How We Build the Poll-of-Polls Average

Our cross-firm polling average is not a simple mean of the most recent published polls. We apply a methodology documented in full on our methodology page. The key elements are:

  • Recency weighting: More recent polls carry greater weight; a poll conducted last week counts for more in the average than one conducted six weeks ago.
  • Sample size weighting: Larger samples carry slightly more weight, reflecting the greater statistical precision they provide.
  • House effect correction: We document and partially correct for known systematic biases in individual firms’ methodologies. If a firm consistently gives 3 percentage points more to a particular party than the cross-firm average, we note this and apply a partial correction in the average. We do not over-correct; the correction is applied cautiously to avoid introducing new distortions.
  • Methodological consistency: We track separately any changes in methodology by individual firms, since a change in weighting scheme or sampling approach can produce artificial apparent movements in the data.

The full technical methodology, including the specific weighting formulas, is available on the methodology page.
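The weighting principles above can be illustrated with a minimal poll-of-polls calculation. This is a sketch only: the half-life, the square-root sample weighting, and the 50% house-effect damping are invented illustrative parameters, not the site's actual formulas (those are documented on the methodology page).

```python
from dataclasses import dataclass
from datetime import date
import math

@dataclass
class Poll:
    firm: str
    fieldwork_end: date
    sample_size: int
    share: float          # topline % for one party
    house_effect: float   # firm's documented deviation from the cross-firm average, in points

def poll_of_polls(polls, today, half_life_days=14.0, damping=0.5):
    """Weighted average combining recency decay, sqrt(n) sample-size
    weighting, and a partial (damped) house-effect correction."""
    num = den = 0.0
    for p in polls:
        age = (today - p.fieldwork_end).days
        recency_w = 0.5 ** (age / half_life_days)       # weight halves every half_life_days
        size_w = math.sqrt(p.sample_size)               # precision grows roughly with sqrt(n)
        corrected = p.share - damping * p.house_effect  # partial correction, never full
        w = recency_w * size_w
        num += w * corrected
        den += w
    return num / den

# Illustrative usage (firm names and figures are invented):
polls = [
    Poll("FirmA", date(2025, 6, 1), 2000, 29.0, +1.5),
    Poll("FirmB", date(2025, 5, 20), 1500, 27.0, -0.5),
]
average = poll_of_polls(polls, today=date(2025, 6, 3))
```

Note the design choice the text describes: the house-effect correction is deliberately damped (here by half) rather than applied in full, so that an imperfect estimate of a firm's bias cannot introduce a new distortion larger than the one it removes.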

What We Mean by “Current Polling”

When we describe a party as polling at a particular figure — for example, “Reform UK 28%” — we are referring to our cross-firm average, not to the figure from a single firm. Individual polls can vary by several percentage points from the average due to normal statistical variation, house effects, and timing. A single YouGov poll at 31% or a single Techne poll at 30% does not mean Reform UK is polling at those figures; it means one firm produced one data point that may or may not reflect the underlying average.

We are explicit about this distinction throughout the site. Where we quote individual poll figures, we attribute them to the specific firm. Where we describe a party's overall polling position, we are referring to the cross-firm average.

Accuracy Standards

We apply the following accuracy standards to every piece of content on this site:

Data Accuracy

Every voting intention figure is entered into our database with the following attributes:

  • Polling firm name
  • Fieldwork start and end dates
  • Sample size
  • Methodology (online, telephone, or mixed)
  • The topline figures for all major parties
  • A link to the original published tables

We check figures against the original published data tables, not secondary sources. If a news article quotes a poll figure that differs from the data table, we use the data table figure.
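The attributes listed above amount to a fixed record schema for each poll. A minimal sketch of such a record might look like the following; the type names and field names are hypothetical, chosen only to mirror the list above, and do not describe the site's actual database.

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum

class Mode(Enum):
    ONLINE = "online"
    TELEPHONE = "telephone"
    MIXED = "mixed"

@dataclass(frozen=True)       # frozen: a published figure is never edited in place
class PollRecord:
    firm: str                 # polling firm name
    fieldwork_start: date
    fieldwork_end: date
    sample_size: int
    mode: Mode                # online, telephone, or mixed
    toplines: dict[str, float]  # party -> voting intention %
    tables_url: str           # link to the original published data tables
```

Making the record immutable reflects the historical-accuracy policy below: corrections are recorded as new, date-stamped entries rather than edits to the original figure.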

Contextual Accuracy

Beyond the raw numbers, our editorial content aims for contextual accuracy: representing the significance and limitations of polling data correctly. Common contextual errors we guard against include:

  • Treating a single outlier poll as evidence of a trend
  • Confusing voting intention with predicted electoral outcomes
  • Ignoring margins of error in reporting small movements
  • Failing to note when a firm has a relevant house effect
  • Using historical polling figures without accounting for differences in methodology over time

Historical Accuracy

We do not revise historical polling figures retroactively. If a poll was published with a particular figure, that figure remains in our database unchanged. If a firm later issues a correction to a previously published figure, we update our database and note the correction with a date stamp.

Correction Policy

We take accuracy seriously. If you believe we have made an error — whether in a specific polling figure, a date, a methodology description, or a contextual claim — we want to know.

How to report an error: Use the contact form with the subject line “Correction Request.” Please include:

  1. The specific page and paragraph where the error appears
  2. The figure or statement you believe is incorrect
  3. The correct figure or statement, with a source if possible

Our correction process: We review all correction requests. If we verify an error, we will:

  1. Correct the page promptly, typically within 24–48 hours of verification
  2. Add a correction note to the relevant page indicating what was changed and when
  3. Update any dependent pages or data that were based on the incorrect figure

We do not delete content to hide past errors. Corrections are noted transparently, not silently edited.

What We Do Not Do

For clarity, the following activities are explicitly excluded from UKPollingData.com editorial practice:

  • No commissioned polling: We do not commission original polling. We aggregate and analyse published polling data.
  • No paid content: We do not publish sponsored articles, paid political content, or advertorial material.
  • No unattributed figures: Every polling figure is attributed to its source. We do not publish anonymous or unverified polling data.
  • No predictions: We track what polls show; we do not make seat projections or electoral predictions based on proprietary models. Where we discuss polling implications for seats, we are describing what the data implies under various models, not making a definitive forecast.
  • No advocacy: We do not endorse parties, candidates, or political positions. Where we explain the implications of polling, we are being descriptive, not normative.

Transparency on Uncertainty

We believe that honest communication about uncertainty is a core editorial standard, not a sign of weakness. Polling carries inherent statistical uncertainty, and we are explicit about this in three specific ways:

  • Margins of error: A standard UK voting intention poll with a sample size of 2,000 has a margin of error of approximately ±2.2 percentage points at the 95% confidence level. We note this where relevant, and we do not describe movements within the margin of error as statistically significant.
  • Uncertainty in the cross-firm average: Our poll-of-polls average is itself an estimate, not a precise measurement. The weighting methodology involves judgements — about how much to correct for house effects, about how much to discount older polls — that reasonable analysts might make differently. We document our choices rather than presenting the average as definitive.
  • Uncertainty about electoral implications: Polling data does not straightforwardly translate into seat projections. We are explicit that when we discuss what current polling implies for the 2029 election, we are exploring a range of scenarios, not making a prediction.
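As a quick check on the ±2.2-point figure quoted above, the standard formula is z·√(p(1−p)/n), evaluated at the most conservative proportion p = 0.5. This sketch assumes a simple random sample, which real weighted panels are not; design effects typically inflate the true margin somewhat.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a simple random sample of size n,
    in percentage points, at the most conservative proportion p = 0.5."""
    return z * math.sqrt(p * (1 - p) / n) * 100

moe = margin_of_error(2000)  # approximately 2.2 points, matching the figure above
```

Because the margin shrinks only with the square root of n, doubling a sample from 1,000 to 2,000 reduces the margin by about 30%, not 50% — one reason very large samples offer diminishing returns.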

Polling and Democracy: Our Broader Responsibility

We are aware that polling data can be weaponised: selectively quoted, stripped of context, or used to manufacture narratives of momentum that themselves influence public opinion. The “bandwagon effect” — where voters support a party because it appears to be winning — is a documented phenomenon, though its magnitude in the UK is debated. The “spiral of silence” — where voters of a party that appears to be losing are less willing to express their preference — has also been documented in certain contexts.

We take seriously the responsibility that comes with aggregating and publishing polling data at scale. We do not publish analyses designed to generate excitement about dramatic shifts that are within the margin of error. We do not lead with the most extreme poll from a particular day when the cross-firm average tells a more moderate story. We do not speculate about what a poll “means” for an election beyond what the evidence supports.

Our goal is to inform. An informed electorate that understands what polling does and does not tell us is, in our view, better for democratic participation than one that treats each new poll as a shock result requiring immediate interpretation.

Contact and Reporting Errors

If you have found an error, have a question about our methodology, or want to make a press enquiry, use the contact page. We respond to all substantive enquiries. Response time is typically 1–3 working days.
