How to Reduce Gender Bias in Evaluations

Taurai Masunda / Posted On: 8 March 2020 / Updated On: 27 September 2022 / Recruitment and Selection




While the existence of gender, education, age, race, and other biases is becoming more widely acknowledged, many organizations have not changed their talent selection systems. Whether in a hiring process, a grant application, or other incentive schemes, there is some evidence that anonymizing information about the applicant (suppressing their name, for example) may lead to more applicants being chosen from disadvantaged groups.


However, this evidence has done little to move the scientific community, where women constitute just 28% of the science and engineering workforce, are paid less, receive less funding, and are cited less often than their male counterparts. Although organizations such as the National Institutes of Health have discussed anonymizing applications for at least a decade to minimize gender bias, there has been no push to enforce the practice.

 

Research by Stefanie K. Johnson and Jessica F. Kirk, “Dual-anonymization Yields Promising Results for Reducing Gender Bias: A Naturalistic Field Experiment of Applications for Hubble Space Telescope Time”, shows that anonymization can mitigate gender bias in the review of scientific research applications. They found that when indications of candidates’ gender were removed from applications for time on the Hubble Space Telescope, women were selected at a higher rate than when their gender was obvious.

 

Anonymizing assessments is not a new practice. One of the largest implementations occurred among United States symphony orchestras, several of which began changing their audition processes in the 1970s so that musicians auditioned from behind a screen. One analysis of the data found that in the top five U.S. orchestras, the percentage of women rose from 5% in 1970 to 25% by the 1990s.

 

Evidence of a statistical gender bias against women in the application process emerged in 2014: a study by I. Neill Reid, “Gender-Correlated Systematics in HST Proposal Selection”, revealed that the success rate of female applicants was about 19 percent, compared with almost 23 percent for male applicants. The Hubble Space Telescope Time Allocation Committee decided that something had to be done and began obscuring applicants' gender. Initially, the applicant's first name was simply deleted from the application's front cover. In 2018, all personally identifying information was removed, and evaluators were instructed not to discuss the scientists' characteristics but only to assess the merit of the science.

 

The most important of these modifications was removing all names completely and warning reviewers not to address the scientists' characteristics. Before any anonymization, men's success rate exceeded women's by about 5 percentage points. After names were removed, the gap fell to less than 3 points. When the applications were entirely anonymized, women outperformed men by about 1 point.

 

When you have evidence that gender, race, age, or other differences are affecting your selection process, despite their not being relevant selection criteria, you have an error in your process. In other words, extraneous information about applicants' identities is causing you to make less accurate decisions. Given that we all want to make good selection decisions, the focus should be on defining relevant criteria beforehand and ensuring that extraneous information does not erroneously influence your decisions. We recommend that organizations implement anonymized recruitment where possible, especially at the early stages of applicant screening.
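As a minimal illustration of what anonymized early-stage screening could look like in practice, the sketch below strips identifying fields from applicant records before they reach reviewers, replacing the name with a neutral reference ID. The field names and the record structure are hypothetical, not taken from any system described in the article.

```python
# Hypothetical sketch: redact identifying fields from applicant records
# before early-stage screening. Field names are illustrative only.

PII_FIELDS = {"name", "gender", "age", "photo_url", "email"}

def anonymize(applicant: dict, ref_number: int) -> dict:
    """Return a copy of the record with identifying fields removed
    and a neutral reference ID substituted for the name."""
    redacted = {k: v for k, v in applicant.items() if k not in PII_FIELDS}
    redacted["applicant_id"] = f"APP-{ref_number:04d}"
    return redacted

applications = [
    {"name": "A. Example", "gender": "F", "age": 34,
     "qualifications": "PhD Astrophysics", "proposal_score": 87},
]

# Reviewers see only the redacted records; the mapping from
# applicant_id back to the original record stays with HR.
screened = [anonymize(app, i) for i, app in enumerate(applications)]
print(screened[0])
```

The key design point is that the redaction happens before reviewers see anything, so the information that triggers bias is never available to them, while the job-relevant fields (qualifications, scores) pass through untouched.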

 

Anonymizing applications is an attractive alternative to other approaches to promoting gender equity. For one, rather than attempting to train bias away, which often fails, anonymization reduces the risk of unconscious bias affecting decisions by eliminating the information that triggers the bias in the first place. Second, some policies provoke backlash against women because of the perception that women receive extra benefits or preferential affirmative action. Removing personal identifying details from applications, by contrast, mitigates the potential for bias for or against either sex.

 

Research has shown that gender diversity promotes scientific creativity and innovation. Furthermore, lower success rates for women in science represent a failure of social justice and leave fewer role models for young women, perpetuating the shortage of women in the pipeline. Blinding applications is a relatively simple step toward curbing these inefficiencies and injustices.

 

 

Taurai Masunda is a Business Analytics Consultant at Industrial Psychology Consultants (Pvt) Ltd, a management and human resources consulting firm. https://www.linkedin.com/in/taurai-masunda-b3726110b/ Phone +263 4 481946-48/481950/2900276/2900966 or email: [email protected] or visit our website at www.ipcconsultants.com

