Can Survey Results in Russia (Not) Be Trusted?

January 14, 2024

In the last two years, Russian public opinion surveys about approval of the army’s actions in Ukraine and Russia’s political course have been heavily criticized. How fair is this criticism?

Arseny Verkeev

 

Photo: The gradual decline in the percentage of people willing to participate in opinion surveys is a well-known global trend. Photo by Emily Morter on Unsplash

 

Criticism of Surveys Prior to 2022

Surveys conducted in Russia came in for criticism even before 2022. However, this earlier criticism was more restrained and related mainly to surveys on politically sensitive topics.

Otherwise, opinion surveys were considered a perfectly viable way of collecting data, one that, like any other method, had its pros and cons. And the high penetration of mobile phones in Russia made it possible to conduct high-quality telephone surveys at low financial cost compared with surveys conducted at respondents’ place of residence.

Thus, surveying as a method was rarely questioned. Today, the situation has changed.

 

Points of Criticism

Let us examine some of the typical critiques:

1. Government pollsters (organizations that specialize in conducting public opinion surveys) are unreliable because they are government-owned. Politically sensitive questions are not asked, and results are not published.

At the same time, many independent surveys have also shown a high level of support for the “special military operation” among Russians. The following points of criticism therefore draw attention to the problems with surveys as a source of data in general:

2. People refuse to take surveys, don’t answer the phone, or hang up immediately. Due to the high refusal rate, it is less and less clear who ultimately ends up in the sample.

3. People are afraid to answer sensitive questions, so they either don’t answer at all or give “safe” answers. For example, they declare support for the authorities even if they think differently. The high levels of reported support for military action may therefore be due to the fact that respondents are asked about it directly.

 

Opinion-Based Criticism

All of these comments are reasonable.

  • There is every reason to believe that government-affiliated pollsters do not publish politically sensitive results.
  • With the rise in spam calls, people have become less likely to pick up when they see an unknown number.
  • Hot-button topics are indeed more difficult to study with surveys—people may be reluctant to say something to a stranger that they feel will be considered subversive.

However, these comments have one thing in common: they are often based not on data, but on expert opinions.

To test the claim that respondents refuse to answer or respond insincerely out of fear, an experiment is required. And if this claim is confirmed, one must then show separately that the level of fear has increased because of the changed socio-political context, rather than having been present all along. This is a difficult task.
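To make “an experiment” concrete: a standard design for studying sensitive questions in survey research is the list experiment (also called the item count technique). Respondents are randomly split into two groups. The control group reports how many statements from a list of neutral items apply to them; the treatment group receives the same list plus the sensitive item. Since only counts are reported, no respondent ever reveals their answer to the sensitive item directly, yet the difference in group means estimates its prevalence. Here is a minimal simulated sketch in Python (all figures are hypothetical, not drawn from any real survey):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical data: 1,000 respondents randomly assigned to each group.
# Control group: "How many of these 4 neutral statements apply to you?"
control = rng.binomial(n=4, p=0.5, size=1000)

# Treatment group: the same 4 neutral statements plus the sensitive one;
# in this simulation the sensitive statement truly applies to 30% of people.
treatment = rng.binomial(n=4, p=0.5, size=1000) + rng.binomial(n=1, p=0.3, size=1000)

# The difference in mean counts estimates the prevalence of the sensitive
# attitude, even though no individual ever reports it directly.
estimate = treatment.mean() - control.mean()

# Standard error for a difference of two independent means.
se = np.sqrt(treatment.var(ddof=1) / treatment.size
             + control.var(ddof=1) / control.size)

print(f"Estimated prevalence: {estimate:.3f} (SE {se:.3f})")
```

Comparing such an indirect estimate with the share of respondents who endorse the same statement when asked directly gives a rough measure of insincere answering. Showing that fear specifically drives any gap, and that it grew after the socio-political context changed, requires yet more design work.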

On what data does an expert rely when claiming that Russians are afraid to tell the truth during surveys? Often, their data are limited solely to the distribution of response options in a particular survey.

And this is normal for experts and observers who do not work with the data themselves. Based on their expertise, they voice what is called an educated guess.

 

Criteria of Reliability

Expert skepticism about the credibility of surveys has permeated journalism, as well as discussions on blogs, podcasts, and social networks. Journalists sometimes state explicitly that there is some consensus among scientists about the unreliability of public opinion surveys in Russia today.

Due to the prevalence of skepticism in the information field, outside observers may get the impression that surveys in Russia should be written off completely. But how can we know if the situation is really that hopeless?

Let’s set aside the sweeping claim that “surveys don’t work.” This radical stance makes it difficult to delve into the details. The argument is also poorly formulated: the criteria for the “performance” of surveys can be debated endlessly, a fact that has long been known to social scientists.

It is unlikely that any scientist will contest the claim that surveys conducted in Russia should be treated more carefully than, say, surveys conducted in the United States (if the survey design is identical). But that is not saying much.

In a professional environment, precision is a basic requirement. There are many conditions under which special care must be taken in collecting and interpreting data. This includes the topic, time, and location of the survey, as well as the wording of the questions.

In addition, questions that concern abstract rather than mundane, everyday subjects (“how do you feel about X?”; “to what extent do you trust X?”; “how would you rate X on a scale from one to ten?”) are by default considered less reliable, and responses to them noisier. This holds regardless of the topic. Finding out exactly how much meat a person eats per week is much easier than finding out their attitude toward vegetarians. What would the accuracy criterion even be in the second case?

If there is any consensus among scientists, it is that the value of surveying in Russia has not fallen sharply since February 24, 2022.

 

Ideas for a Balanced Perspective

Now let us offer some context that allows for a balanced view of the situation with surveys in Russia. For experts, none of this will be new.

  • Simply pitting independent pollsters against state pollsters is not always informative. Princeton University can hardly be suspected of bias, yet several surveys by the university’s Russia Watcher project in 2022 found the same approval figures as the Russian Public Opinion Research Center.
  • The gradual decline in the percentage of people agreeing to participate in surveys is a well-known global trend, and it is now affecting Russian data as well. According to Russian researchers, the rate of refusals to participate in surveys did not increase significantly after February 24, contrary to what one might have expected.
  • According to researchers from the United States, there is no reason to believe that willingness to answer questions has fallen because respondents are afraid of their personal data being disclosed to the state.

 

Experimenting with Questions

Surveys related to the military operations in Ukraine depend less on established methods and fixed question wordings.

In surveys on topics that have been studied for a long time (trust in strangers, crime rates, comfort of the urban environment), it is often too expensive to come up with new wording for questions. It is much easier to use proven methods, and this also allows pollsters to compare newly obtained data with previous sets.

By contrast, today we see many pollsters experimenting with question wording, sampling, and analysis strategies. A body of knowledge about attitudes toward the military action in Ukraine has not yet formed, so researchers have no established instruments to fall back on.

This is a positive development: established survey methods are not always the best, and constantly reproducing them can slow the growth of knowledge.

 

Competition Has Increased

Studying sensitive topics is always difficult. Over the past two years, many new pollsters have appeared in Russia, producing regular surveys about the actions of the Russian army in Ukraine and related topics. Among the new players are ExtremeScan, Russian Field, and Chronicle.

It is unlikely that such a large uptick in the amount of empirical data being collected could have been observed before 2022, even on the most socially significant topics (with the possible exception of COVID).

The new pollsters work independently of each other. In addition, foreign scientists continue to conduct surveys in Russia. This competition opens up opportunities for comparing data and stimulates methodological innovation.

Despite the known difficulties, we do not see any serious legal restrictions or institutional obstacles in this area today. The conditions for conducting surveys in Russia are thus superior in many respects to those in societies with a comparable socio-political context, even if not to those in Western democracies.

 

Respondent Safety

At the same time, an important aspect that often goes unaddressed in discussions about public opinion in Russia is ethics.

Are researchers putting their respondents living in Russia at risk by asking them questions about their approval of the current regime’s policies? A number of scientists are confident that, on the principle of “do no harm,” it is inappropriate to conduct surveys on these topics in Russia.

However, many continue to conduct such surveys. One can only hope that they are doing everything possible to ensure safe and anonymous storage of the data received from respondents.

 

* * *

A Note in the Margins

Sociology and surveys are not one and the same. Although part of sociology works with surveys, the latter are a tool that is available to everyone. Not all sociologists conduct surveys and not all surveys are conducted by sociologists.

But criticism of “sociological surveys” (in this particular phrasing) can reduce confidence in sociology as a profession. After all, sociologists work with a variety of data types, and survey analysis does not end with a simple distribution of responses.

 

Arseny Verkeev is a sociologist and visiting researcher at Ruhr University Bochum.
