The Right Accounting of Wrongs: Examining Temporal Changes to Human Rights Monitoring and Reporting

Daniel Arnon, Peter Haschke, Baekkwan Park

Research output: Contribution to journal › Article › peer-review

Abstract

Scholars contend that the reason for stasis in human rights measures is a biased measurement process, rather than stagnating human rights practices. We argue that bias may be introduced as part of the compilation of the human rights reports that serve as the foundation of human rights measures. An additional source of potential bias may be human coders, who translate human rights reports into human rights scores. We first test for biases via a machine-learning approach using natural language processing and find substantial evidence of bias in human rights scores. We then present findings of an experiment on the coders of human rights reports to assess whether potential changes in the coding procedures or interpretation of coding rules affect scores over time. We find no evidence of coder bias and conclude that human rights measures have changed over time and that bias is introduced as part of monitoring and reporting.

Original language: English (US)
Journal: British Journal of Political Science
State: Accepted/In press - 2022

Keywords

  • bias
  • human rights measurement
  • machine learning
  • natural language processing

ASJC Scopus subject areas

  • Political Science and International Relations

