Three types of election surveys


A lot—meaning enough to cause statistically significant change—must have happened in the last week of the campaign.
This is my basic conclusion from comparing the final preelection survey done by Stratbase Consultancy with Social Weather Stations (SWS), fielded on May 2-6 (see “Survey evaluation time,” 5/10/25), against the latest report of the Commission on Elections on the results of the senatorial election of May 12.
Preelection surveys. Let us deal with surveys fielded (a) before the election, about voters’ intentions for election day; (b) on election day itself, about whom voters said they chose at their precincts earlier that day; and (c) after the election, to clarify if and how voters modified their voting intentions when casting their ballots.
Many surveys on voter preferences are conducted to prepare for an election. Virtually all of these are confidential, for obvious reasons. The most important, to me, are those about the electability of certain candidates. These would be done long before the election, based on guesses as to who the contenders might be.
Then there are surveys close to or during the campaign. Only during the campaign are the lineups known for sure. The Stratbase-SWS project did six surveys in all, one every month since December 2024, in order to clarify the election race. The final round is a de facto survey prediction, since very little time was left to campaign.
The names and numbers behind the perfect 12-for-12 SWS experience in 2022 are in: “Special Report: SWS April 19-27, 2022 final pre-election closely matched Commission on Elections (Comelec) 2022 senatorial results” (www.sws.org.ph, posted 5/9/25). These were known to many political experts, and surely generated very high expectations for the 2025 election.
The May 2025 survey results are not far off. The first 12 in the survey include nine winners; its three apparent losers are No. 13 Ben Tulfo, who was fifth in the survey; No. 14 Bong Revilla, who was 11th; and No. 15 Abby Binay, who was seventh. The most unexpected winner, Rodante Marcoleta, was 18th in the survey, just one rank below Stratbase president Dindo Manhit’s feeling that the race was winnable up to the 17th placer, namely Kiko Pangilinan, now the No. 5 winner. The No. 2 winner, Bam Aquino, had been 16th in the survey.
It is the vote percentage, not the rank, by the way, that makes a finding statistically significant, or not merely due to sampling error. The apparent winners’ vote percentages range, so far, from No. 1 Bong Go’s 38.6 percent to No. 12 Imee Marcos’ 19.0 percent. But the 17.3 percent of No. 13, Ben Tulfo, would not be distinguishable from No. 12’s share in a survey of fewer than 3,500 respondents.
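As a back-of-the-envelope check on that sample-size claim (my own illustrative sketch, not an SWS computation), the usual conservative margin-of-error formula, which assumes a worst-case share of 50 percent, says a national sample needs roughly 3,300 respondents before its 95 percent margin of error narrows to the 1.7-point gap between 19.0 and 17.3 percent:

```python
import math

# Conservative (p = 0.5) sample size for a given margin of error
# at 95 percent confidence: n = z^2 * 0.25 / margin^2.
def conservative_sample_size(margin: float, z: float = 1.96) -> int:
    return math.ceil(z ** 2 * 0.25 / margin ** 2)

gap = 0.190 - 0.173  # the 1.7-point gap between No. 12 and No. 13
print(conservative_sample_size(gap))  # -> 3324, i.e., about 3,300 respondents
```

A stricter test, one that compares the two candidates’ shares within the same sample rather than relying on the overall margin of error, would demand an even larger sample, so 3,500 is, if anything, a generous threshold.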
Day-of-election surveys or “exit polls.” It is the surveys taken on election day itself, after the respondents have already voted, but before any official results get released, that should be tasked to identify the winners perfectly. SWS has done this successfully, notably in the 2016 and 2010 elections, thanks to media sponsorship. Unfortunately, there have been no exit polls since then.
It is standard for exit polls to record the voters’ age, sex, education, religion, home language, party affiliation if any, etc. This is how we know, for instance, that the Iglesia ni Cristo vote has been about 80 percent solid. The official Comelec reports show the location of the votes, but nothing else; without such statistics, the Gen Z hypothesis is but a guess.
In its past exit polls, SWS has asked when the final choice of candidates was made, and discovered that as many as one-fifth of voters made their decision on election day itself. We have also asked whether the choice of candidates was based on the platform or on the personality of the candidate, and found that the bulk of answers was “platform.”
The original objective of exit polling was to learn the election result ahead of the official count, which took many days, even weeks, in the era of manual counting. Exit poll interviews can start as soon as voters leave the precincts, and reach adequate numbers even before the polls close. This is how the US media networks can call elections within hours after the polls close, and relate the results to party affiliation and many voter demographics. Of course, it takes a record of past exit poll accuracy for the losers to concede defeat.
Postelection surveys. The very first SWS survey, in partnership with Ateneo de Manila University, was done in May 1986, after the February Snap Election and Edsa People Power, under the new government of then President Corazon Aquino. When that national survey asked about the respondent’s vote in the snap election, it found 64 percent voting for Cory, 27 percent voting for Ferdinand Marcos, and 9 percent giving no answer (see my book “The Philippine Social Climate,” Anvil, 1994, chapter 16, “The history of the 1986 electoral surveys”).
Forthcoming surveys can still probe into voters’ experiences in the last week of the campaign. How many saw a ground war or a cyberwar? How many had personal knowledge of vote-buying? How many knew about the International Criminal Court arrest of former president Rodrigo Duterte and the impeachment of Vice President Sara Duterte, and how did they feel about these matters? Are there statistical correlations with their votes?

Dr. Mahar Mangahas is a multi-awarded scholar for his pioneering work in public opinion research in the Philippines and in Southeast Asia. He founded the now-familiar Social Weather Stations (SWS), which has been doing public opinion research since 1985 and which has become increasingly influential, nay indispensable, in the conduct of Philippine political life and policy. SWS has been serving the country and policymakers as an independent and timely source of pertinent and credible data on the Philippine economic, social, and political landscape.