Track Record

This page first posted 2 May 1997, last updated 15 May 2023

Electoral Calculus has been using scientific models to predict general elections since 1992. Here is the record of those predictions, shown against what actually happened.

Click on any year below to go to the detailed Track Record for that election, or browse the summary results underneath.

Local 2023
Local 2022
Local 2021
2019
EU 2019
2017
2015
2010
2005
2001


December 2019 Election

Prediction: Conservative majority of 52 seats
Actual result: Conservative majority of 80 seats

Party   Pred Votes   Pred Seats   Actual Votes   Actual Seats
CON     43.3%        351          44.7%          365
LAB     33.9%        224          33.0%          203
LIB     11.7%        13           11.8%          11
SNP     3.6%         41           4.0%           48

The 2019 prediction was the first time regression analysis was used, and it was broadly accurate. The polls fairly accurately indicated a substantial Conservative lead over Labour, which our new regression-based model translated into a considerable Conservative majority. In the event the Conservative lead over Labour was slightly greater than the polls had shown, so the Conservative majority was a bit bigger than expected.
Full details of 2019 error analysis.

Polls' error: 2.3% to Labour

The pollsters slightly underestimated the Conservatives' lead over Labour. Their estimated lead was 9.4% and the actual lead was 11.7%. This was a relatively good performance by the pollsters.
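The polls' error figures quoted on this page are just the difference between the actual and polled Conservative-Labour leads. A minimal sketch of that arithmetic, using the 2019 figures (the function name is ours, for illustration only):

```python
def polls_error(pred_con, pred_lab, actual_con, actual_lab):
    """Polling error on the Con-Lab lead, in percentage points.

    Positive values mean the polls flattered Labour (under-stated the
    Conservative lead), matching the "to Labour" convention used here.
    """
    predicted_lead = pred_con - pred_lab
    actual_lead = actual_con - actual_lab
    return actual_lead - predicted_lead

# December 2019: polled lead 9.4%, actual lead 11.7%
print(round(polls_error(43.3, 33.9, 44.7, 33.0), 1))  # → 2.3
```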


June 2017 Election

Prediction: Conservative majority of 66 seats
Actual result: Conservatives short 8 seats of a majority

Party   Polls Votes   Pred Votes   Pred Seats   Actual Votes   Actual Seats
CON     43.2%         44.3%        358          43.5%          318
LAB     36.4%         35.3%        218          41.0%          262
LIB     7.9%          7.9%         3            7.6%           15
SNP     4.0%          4.0%         49           3.1%           35

In 2017, the prediction was not accurate. The primary cause of error was a significant error in the pre-election polls conducted by the national pollsters. They estimated a Conservative lead over Labour of 6.8%, when the reality was a Conservative lead of only 2.5%.

Consequently our prediction had too many seats predicted for the Conservatives and too few for Labour. Tactical voting also benefitted the Liberal Democrats who gained seats despite losing national vote share.
Full details of 2017 error analysis.

Polls' error: 4.3% to Conservatives

The pollsters over-estimated the Conservative lead over Labour, giving a false view of the election throughout the campaign. The error was smaller than in 2015 and 1992, but worse than in other years. A noticeable feature is that the direction of the error reversed, away from the polls' long-term under-estimation of the Conservatives.


May 2015 Election

Prediction: Conservatives short 46 seats of majority
Actual result: Conservative majority of 12 seats

Party   Polls Votes   Pred Votes   Pred Seats   Actual Votes   Actual Seats
CON     33.7%         33.5%        280          37.8%          331
LAB     33.4%         31.2%        274          31.2%          232
LIB     8.9%          11.0%        21           8.1%           8
SNP     4.1%          4.1%         52           4.9%           56

In 2015, the prediction was not accurate. The primary cause of error was a significant error in the pre-election polls conducted by the national pollsters. They estimated a Conservative lead over Labour of only 0.3%, when the reality was a Conservative lead of 6.6%. The spread betting markets were a bit more realistic on the Conservative lead, but over-estimated Liberal Democrat support.

Consequently our prediction had too many seats predicted for Labour and the Liberal Democrats, and too few for the Conservatives.
Full details of 2015 error analysis.

Also read our 2015 election night blog.

Polls' error: 6.3% to Labour

The pollsters significantly under-estimated the Conservative lead over Labour, giving a false view of the election throughout the campaign. The error was similar to the 1992 election when there was a polling error of 9.5%. This means that the average polling error over elections in the last twenty-five years has been 4.8% in favour of Labour.
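The long-run average quoted above can be reproduced approximately by averaging the signed lead errors reported on this page. Which elections enter the average, and how the 2010 error is signed, are our assumptions, so the result differs slightly from the quoted 4.8%:

```python
from statistics import mean

# Signed polls' errors on the Con-Lab lead, in percentage points,
# positive = "to Labour". The selection of elections and the sign
# given to 2010 are our assumptions, not Electoral Calculus's.
errors = {
    1992: 9.5, 1997: 3.5, 2001: 6.5,
    2005: 2.9, 2010: -0.6, 2015: 6.3,  # 2010's error ran the other way
}
print(round(mean(errors.values()), 1))  # → 4.7
```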


May 2010 Election

Prediction: Conservatives short 29 seats of majority
Actual result: Conservatives short 19 seats of majority

Party   Polls Votes   Pred Votes   Pred Seats   Actual Votes   Actual Seats
CON     35.7%         36.1%        297          37.0%          307
LAB     27.8%         27.6%        235          29.7%          258
LIB     27.3%         26.9%        86           23.6%          57

In 2010, our predictions were fairly accurate. The main error was that we predicted too many Liberal Democrat seats and too few Labour seats. This error was mainly caused by both the pollsters and the betting markets over-estimating the strength of the Lib Dems. Our prediction for the Conservatives was relatively good for both share of vote and number of seats.
Full details of 2010 error analysis.
Model Comparisons for 2010 election.

Polls' error: 0.6% away from Labour, 3.7% too high for Lib Dems

The pollsters under-estimated Labour for the first time in many years, but were pretty accurate on the Conservative-Labour gap. They over-estimated the Liberal Democrats which skewed the prediction towards them.

May 2005 Election

Prediction: Labour majority of 132
Actual result: Labour majority of 66

Party   Pred Votes   Pred Seats   Actual Votes   Actual Seats
CON     31.7%        170          33.2%          198
LAB     37.5%        389          36.2%          356
LIB     22.8%        58           22.6%          62

In 2005, the poll and model errors did not cancel out but both pushed in the same direction. The polls' error, although smaller than in previous years, still exaggerated the predicted Labour majority. Regional swings, some unwinding of tactical voting, and other non-linear effects further reduced the actual majority.
Full details of 2005 errors.

Polls' error: 2.9% to Labour

The pollsters were more accurate than in previous years, but there was still a small pro-Labour bias. Interestingly, the new organisation YouGov, which uses internet polling, had the most accurate results. You can see the final 2005 Opinion polls.

June 2001 Election

Prediction: Labour majority of 211
Actual result: Labour majority of 167

Party   Pred Votes   Pred Seats   Actual Votes   Actual Seats
CON     30.3%        150          32.7%          166
LAB     46.3%        435          42.0%          413
LIB     17.5%        45           18.8%          52

In 2001, although the polls exaggerated Labour's lead, the result was largely unchanged from the previous election. Since the polls overestimated Labour's lead by over 6%, accurate polls would have given a majority of around 147 on a uniform swing basis. The non-uniform swing in some marginal seats therefore made the prediction appear less inaccurate than it deserved. Full details of 2001 errors.
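A uniform-swing calculation, as mentioned above, simply adds the national change in each party's vote share to every seat's previous result and re-ranks the parties. A minimal sketch, with invented seat figures:

```python
def apply_uniform_swing(seat_shares, national_change):
    """Add the national change in each party's vote share to one seat's
    previous shares and return the new winning party.

    seat_shares and national_change are dicts of party -> per cent.
    The figures passed below are invented for illustration.
    """
    new_shares = {party: share + national_change.get(party, 0.0)
                  for party, share in seat_shares.items()}
    return max(new_shares, key=new_shares.get)

# Hypothetical marginal seat and a hypothetical national swing.
seat = {"CON": 44.0, "LAB": 46.0, "LIB": 10.0}
change = {"CON": -2.4, "LAB": 4.3, "LIB": 1.3}
print(apply_uniform_swing(seat, change))  # → LAB
```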

Polls' error: 6.5% to Labour

The polls did accurately predict another Labour landslide. But again their exact numbers overstated Labour's lead somewhat. You can see the final 2001 Opinion polls.

May 1997 Election

Prediction: Labour majority of 171
Actual result: Labour majority of 179

Party   Pred Votes   Pred Seats   Actual Votes   Actual Seats
CON     30.6%        191          31.4%          165
LAB     47.2%        415          44.4%          419
LIB     15.4%        25           17.2%          46

This year, the pollsters were true Cassandras - they predicted gloom for the Conservatives, but were not fully believed. Many remembered the errors of 1992 and were unconvinced by the polling figures. However, they turned out to be broadly accurate, and Labour did indeed win a landslide majority. On election night, Martin Baxter was the first commentator on the BBC to correctly predict the scale of Labour's victory.

Polls' error: 3.5% to Labour

The polls were very successful this year in predicting that Labour would win a landslide. But there still seemed to be a bias towards Labour, although much less than previously.

April 1992 Election

Prediction: Labour 17 seats short of a majority
Actual result: Conservative majority of 20

Party   Pred Votes   Pred Seats   Actual Votes   Actual Seats
CON     36.9%        295          42.8%          335
LAB     39.0%        308          35.2%          271
LIB     19.8%        20           18.3%          20

Electoral Calculus got 1992 spectacularly wrong. We predicted a Labour victory, but with a hung parliament. In fact the Conservatives were returned to office with a reduced majority. The principal cause of the error was that the opinion polls mis-estimated the relative support of the two parties.

Polls' error: 9.5% to Labour

The polls gave Labour a 2% lead, when actually the Conservatives were 7.5% ahead. That error of 9.5% was later ascribed to a combination of "shy Tories" declining pollsters' questions; a late swing away from Labour; and errors in reweighting samples to match national averages.
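"Reweighting samples to match national averages" means scaling each respondent group by the ratio of its known population share to its share of the sample. A minimal sketch of such weights, with invented numbers:

```python
def reweight(sample_counts, population_shares):
    """Post-stratification weight for each group = population share
    divided by sample share. A weight above 1 inflates a group that
    the sample under-represents. All figures below are invented.
    """
    total = sum(sample_counts.values())
    return {group: population_shares[group] / (sample_counts[group] / total)
            for group in sample_counts}

# Invented example: a poll sample that under-represents over-65s.
weights = reweight({"18-64": 850, "65+": 150},
                   {"18-64": 0.78, "65+": 0.22})
print(round(weights["65+"], 2))  # → 1.47
```

If a reweighting cell is badly estimated (as was suspected in 1992), every respondent in that cell is scaled by the wrong factor, which is one way a systematic polling error can arise.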

The pollsters took the 1992 debacle very seriously, and put much effort into improving their methods. This would pay off next time, but they had already lost credibility.


Next Time

The future is uncertain, and there are two caveats to bear in mind when using the election prediction. If you wish to make your own adjustments to the published polls, you can easily translate them into seats using the user-defined prediction.