FAQs for invited survey respondents
HRMI? Her-me? Her-mi?
‘The Human Rights Measurement Initiative’ is a bit of a mouthful, so we call it HRMI for short, pronounced ‘her-mee’.
“Comparative data on countries’ human rights performance is a useful way to hold governments to account. The Human Rights Measurement Initiative’s work depends on cooperation from human rights defenders everywhere to develop and share the best possible data and to make use of the results.”
– Ken Roth, Former Executive Director, Human Rights Watch
HRMI is the first global project to track the human rights performance of countries. One of the main ways we do this is through our annual HRMI survey.
Please see the Country Coverage page for the most recent list of countries included in our data collection.
Here’s how the survey works, and how you might be able to take part.
Frequently Asked Questions
The most accurate information on any country’s overall human rights situation comes from local and regional human rights workers.
We have developed a special online questionnaire which asks human rights workers around the world the same questions about how well their government respects human rights in their country.
Our Methodology Research & Design Lead, Dr K Chad Clay, leads a team of political scientists at the University of Georgia who carefully designed the survey and who analyse the data that come in from our survey respondents. They use advanced statistical techniques to ensure cross-country comparability and to calculate measures of certainty for each country score. You can read more about the survey methodology here.
Taking part in HRMI’s annual data collection is a win-win activity. Your investment of 30-60 minutes sharing your knowledge with us means that your country will have world-leading data and metrics independently detailing the human rights situation. The more human rights workers that take part, the stronger the data will be.
Anyone can access all our data, for free, at our Rights Tracker.
Human rights monitors and defenders can use our data, along with other evidence from their own work, to show governments where there is room for improvement, and even create a bit of healthy competition with similar countries.
Our survey respondents are human rights researchers and practitioners who are monitoring events in any of the countries we’re collecting data in. The kinds of people we’re looking for include:
- Human rights workers (researchers, analysts, other practitioners) whose job is to monitor civil and political rights in a survey country. They could be working for an international or domestic NGO or civil society organisation.
- Human rights lawyers.
- Journalists covering human rights issues in a survey country.
- Staff working for the National Human Rights Institution (NHRI) of a survey country, ONLY if it is fully compliant with the Paris Principles, including being completely independent in fulfilling its mandate.
Don’t worry – we know most people aren’t experts in all the areas we’ll be asking about.
Our survey model takes this into account, and assumes that answers come from people with a range of expertise. We’ve designed the survey so that we get better data when everyone answers all the questions than when people answer only in the areas where they feel most knowledgeable.
One very good reason for this is that psychological research has found that people aren’t always very good judges of how knowledgeable they are. If we let people choose to answer only some questions, we might miss out on responses from people who could provide helpful information.
As someone who works in human rights, you probably know much more than the average person about most areas of human rights, even if you’re not an expert in every area. So we encourage you to please go ahead and answer all questions, drawing on whatever knowledge you have. Our methodology has been designed specifically to cope with a mix of responses – some coming from people who are more expert than others.
In most cases survey respondents are living in the country they are providing information on, but there are exceptions.
The more inhospitable a country is to human rights defenders (e.g. Saudi Arabia), or the deeper its crisis (e.g. DRC, Venezuela), the more likely survey respondents are to be based elsewhere.
Also, some human rights researchers for NGOs are responsible for monitoring more than one country, so they will be qualified to answer questions about countries they don’t live in.
To ensure our independence and avoid conflicts of interest, we do not collect information from government officials or from staff working at government-organised NGOs.
We are looking for respondents who have access to primary sources and are often the first points of contact for human rights information on the ground. For this reason, we do not invite human rights academics to be survey respondents unless they are also practitioners, working with primary sources of information.
We also don’t recruit victims of human rights violations whose human rights knowledge comes only from their personal experiences.
No. Survey responses should always represent the knowledge and interpretation of the person filling out the survey, not a collective view of a group of colleagues or the official position of the organisation they work for. The more individual respondents the better: different people have different knowledge bases, and by drawing on all of them we can use the differences in responses to calculate certainty bands around our scores.
If you’ve been invited to participate in the survey for your country, you will have been sent a link to a secure registration/consent form. Please fill it in. It takes about 30 seconds.
Everyone who has registered will then be sent a unique single-use link to the survey itself.
If you have been invited to take part, we encourage you to suggest other colleagues or contacts who could also be survey respondents. You can do so by recommending them to your country’s HRMI Ambassador, or by emailing HRMI at survey@humanrightsmeasurement.org. There will also be a space in the survey itself to nominate other potential survey respondents.
The survey links will be sent out in February and March.
We have heard from some people that the HRMI survey email has been automatically filtered into their spam folder. To avoid this risk, please add our domain name (@humanrightsmeasurement.org) to your safe senders’ list. Doing this will ensure that the survey is delivered to your inbox.
We take our survey respondents’ safety and information security extremely seriously. You can read here how we work to make sure the information you give us in the registration form is kept confidential.
The survey itself is confidential and anonymous, so there is no way for anyone to connect your answers with your name.
We can’t vet every potential respondent personally to make sure they’re appropriate people who have the information we need. We rely on HRMI Ambassadors to contact as many potential survey respondents as they can in their country, and help with other parts of the survey roll-out, like checking local language translations. When Ambassadors contact local practitioners who then nominate other potential survey respondents, that is what we call the snowball effect.
In other words, our HRMI Ambassadors start the snowball rolling by approaching potential survey respondents and inviting them to participate. Then we ask all of those people to suggest more names.
Our Ambassadors are an enormous help! You can meet the Ambassadors whose names are public on our team page.
Here’s a short video interview with our Ambassador for Mozambique, David Matsinhe, who is an Amnesty International researcher, specialising in monitoring the Lusophone countries of Southern Africa:
Several of our ambassadors feature in other videos on our YouTube channel.
A key step in processing the data is analysing respondents’ answers to questions about a set of hypothetical countries: the ‘anchoring vignettes’. The answers people give to these vignette questions tell us how to interpret their answers about their own country. This allows us to properly compare people’s responses, even if they understand the questions differently from one another, or interpret the scale (from, e.g., ‘slightly’ to ‘extremely’) differently.
There’s also another reason these vignettes are important. They describe three fictional country situations: one country that’s doing quite well at respecting human rights, one that’s doing very badly, and one in the middle. We expect respondents to differ slightly in where they place each country on the scale, but everyone should place them in the same order, with the good country scoring best and the terrible country scoring worst. If a respondent puts them in a different order, we have to assume they either aren’t paying attention or don’t have a clear understanding of human rights, and we have to set aside their answers about their own country for that section. The bottom line is that answering the vignette questions properly is vital if you want your other answers to contribute to the Rights Tracker data.
The vignettes can feel a bit weird to read when people start the survey, but they are an absolutely crucial part of the methodology, helping to ensure the validity and credibility of our data.
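For readers who like to see things concretely, here is a minimal sketch of the kind of ordering check described above. It is an illustration only, not HRMI’s actual code, and it assumes each vignette country is rated on a single numeric scale where a higher number means better respect for rights:

```python
# A minimal illustrative sketch (not HRMI's actual code) of the ordering check
# described above. It assumes each of the three vignette countries is rated on
# a single numeric scale where a higher number means better respect for rights.

def vignettes_ordered_correctly(good: int, middle: int, bad: int) -> bool:
    """Return True if the respondent ranked the vignettes in the expected order:
    the well-performing country best and the badly-performing country worst."""
    return good > middle > bad

# These ratings are consistent, so the respondent's answers for the section count:
print(vignettes_ordered_correctly(good=7, middle=4, bad=1))  # True
# These ratings are out of order, so the respondent's answers are set aside:
print(vignettes_ordered_correctly(good=3, middle=5, bad=2))  # False
```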
Thank you!
Our cutting-edge data and metrics rely on hundreds of human rights workers around the world offering us their time and knowledge. We appreciate it enormously.
If you are taking part in the HRMI survey, we offer you sincere and warm thanks. We couldn’t do this without you.