Bridging the Gender Gap: Sam Gedye Discusses AI in Higher Education Hiring

Artificial Intelligence (AI) has become a pivotal tool in recruitment, promising efficiency and impartiality. However, as our latest blog post, ‘The Impact Of Artificial Intelligence On Gender Diversity’, explores, AI’s effect on gender diversity is more complicated than that promise suggests.

To better understand this issue, we spoke with our higher education expert, Sam Gedye. Sam shared his thoughts on the findings of the study “Does Artificial Intelligence Help or Hurt Gender Diversity? Evidence from Two Field Experiments on Recruitment in Tech” and discussed how these insights can be applied to the higher education sector.

Interview with Sam Gedye

Q: Sam, the study mentioned highlights AI’s nuanced effects on gender diversity in tech recruitment. How do these findings relate to the higher education sector?

Sam Gedye: The study is quite revealing. In higher education, we see a pattern similar to the tech sector. Women make up most of the undergraduate population (56.4%) and the higher education workforce (53.8%), yet they remain significantly under-represented in leadership roles. Although women hold 45% of academic jobs, only 27.5% of academic managers and 20.5% of professors are women, and women account for just 30% of vice-chancellors and principals. Structural issues continue to impede women’s career progression, and research emphasises that addressing these barriers is essential for improving gender diversity.

Q: The study found that AI can both reduce and exacerbate gender biases. Could you elaborate on this in the context of higher education?

Sam Gedye: On the one hand, AI can help reduce bias in resume screening by focusing purely on skills and experience. This is vital in academia, where merit should be the primary criterion. On the other hand, the study showed that AI-generated job ads could inadvertently increase gendered wording. This is where a human copywriter can be helpful: they can review and refine AI-generated content, ensuring the language used is genuinely inclusive and free from subtle biases that AI might miss.

Q: With women underrepresented in academic leadership, how can AI assist in addressing this issue?

Sam Gedye: AI tools can be instrumental here. Take Affinda, for example. It uses advanced algorithms to screen resumes without gender bias, focusing on qualifications and experience. This means that women’s resumes are more likely to be judged on their merits, increasing their chances of reaching leadership roles.
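To illustrate the general idea behind bias-aware screening, here is a minimal sketch of a ‘blind screening’ step that strips gender-signalling details before a CV is scored on qualifications. It is a toy example, not Affinda’s actual pipeline, and the field names and scoring rule are invented for illustration.

```python
import re

# Toy "blind screening" sketch (illustrative only, not any vendor's real pipeline):
# redact gender-signalling details, then score the CV purely on required skills.
GENDERED_TERMS = re.compile(r"\b(he|she|him|her|his|hers|mr|mrs|ms|miss)\b", re.IGNORECASE)

def redact(cv_text: str, candidate_name: str) -> str:
    """Remove the candidate's name and gendered pronouns/titles from the CV text."""
    text = cv_text.replace(candidate_name, "[CANDIDATE]")
    return GENDERED_TERMS.sub("[REDACTED]", text)

def score(cv_text: str, required_skills: set[str]) -> float:
    """Toy merit score: the fraction of required skills mentioned in the redacted CV."""
    mentioned = {skill for skill in required_skills if skill.lower() in cv_text.lower()}
    return len(mentioned) / len(required_skills) if required_skills else 0.0

cv = "Dr Jane Smith. She has ten years' experience in data analysis, teaching and grant writing."
blind_cv = redact(cv, "Jane Smith")
print(score(blind_cv, {"data analysis", "teaching", "grant writing"}))  # 1.0
```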

Similarly, the Gender Decoder can analyse job descriptions to ensure they’re gender-neutral. By removing gendered language, these ads become more appealing to a diverse range of candidates, including women who might otherwise feel excluded.
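The Gender Decoder and similar tools are based on research word lists of masculine-coded and feminine-coded terms. The sketch below illustrates that word-list approach with short, made-up samples rather than any tool’s full lexicon.

```python
# Word-list check for gender-coded wording in a job ad (illustrative samples only).
MASCULINE_CODED = {"ambitious", "assertive", "competitive", "dominant", "driven", "independent"}
FEMININE_CODED = {"collaborative", "considerate", "interpersonal", "nurturing", "supportive"}

def gender_coding(ad_text: str) -> str:
    """Classify an ad as masculine-coded, feminine-coded, or neutral/balanced."""
    words = {w.strip(".,;:!?()").lower() for w in ad_text.split()}
    masculine = sorted(words & MASCULINE_CODED)
    feminine = sorted(words & FEMININE_CODED)
    if len(masculine) > len(feminine):
        return f"masculine-coded (flagged: {masculine})"
    if len(feminine) > len(masculine):
        return f"feminine-coded (flagged: {feminine})"
    return "neutral or balanced"

ad = "We seek a driven, competitive lecturer to join an ambitious research group."
print(gender_coding(ad))  # masculine-coded (flagged: ['ambitious', 'competitive', 'driven'])
```

Flagged terms can then be passed to a human copywriter, who decides whether a more neutral alternative reads better in context.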

AI can also contribute to addressing gender pay gaps by analysing salary data and ensuring that compensation decisions are based on objective criteria such as skills, experience, and job responsibilities. This helps organisations establish fair and transparent pay structures, reducing gender disparities. Syndio (https://synd.io/), for example, is an AI-driven platform specialising in pay equity analysis. It assesses salary data, considering factors like job level, performance, and tenure, to identify and rectify gender pay gaps. The tool also helps organisations remain compliant with pay equity regulations.
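As a rough illustration of what a pay equity analysis involves, the sketch below estimates the ‘unexplained’ gap by regressing salary on legitimate pay factors plus a gender indicator. The data is invented and the model deliberately simple; it is not Syndio’s methodology.

```python
import numpy as np

# Toy pay equity check: regress salary on legitimate factors plus a gender indicator.
# Columns of X: job level, tenure in years, is_woman (1 = woman, 0 = man). Data is made up.
X = np.array([
    [3, 5, 0], [3, 5, 1],
    [4, 8, 0], [4, 8, 1],
    [2, 2, 0], [2, 2, 1],
    [5, 10, 0], [5, 10, 1],
], dtype=float)
salaries = np.array([52000, 50000, 61000, 58500, 43000, 42000, 70000, 67500], dtype=float)

# Ordinary least squares with an intercept; the coefficient on the gender indicator
# estimates the pay difference *not* explained by level or tenure.
design = np.column_stack([np.ones(len(X)), X])
coefficients, *_ = np.linalg.lstsq(design, salaries, rcond=None)
unexplained_gap = coefficients[3]

print(f"Unexplained gap associated with gender: {unexplained_gap:,.0f} per year")  # about -2,000
```

Real analyses use far richer data and statistical controls, but the principle is the same: pay differences should be explained by legitimate factors such as level, performance, and tenure, not by gender.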

Q: What should hiring managers in higher education be mindful of when using AI in recruitment?

Sam Gedye: Hiring managers need to be aware of the biases AI can introduce. It’s not just about implementing AI but also about continuously monitoring and adjusting its algorithms. AI should be an aid, not a replacement for human judgment. Ensuring a balanced approach is crucial for ethical and effective recruitment.

Q: Finally, what actions can be taken to reduce gender biases when using AI in recruitment?

Sam Gedye: Regular audits are crucial to identify any biases in AI systems. Human oversight is also essential in AI-driven processes. Providing diversity and inclusion training for everyone involved in the hiring process is vital. Lastly, developing policies that guide ethical AI usage in recruitment can align technology with our goals for gender diversity.
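One concrete check a regular audit might include, sketched below with made-up numbers, is comparing shortlisting rates between groups at each AI-assisted stage and flagging large disparities. The ‘four-fifths rule’ used here as a threshold is a common screening heuristic, not a legal test.

```python
# Audit sketch: compare shortlisting rates between groups at an AI-assisted stage.
# Numbers are invented; the four-fifths threshold is a heuristic flag, not proof of bias.
def selection_rate(shortlisted: int, applicants: int) -> float:
    return shortlisted / applicants

def adverse_impact_ratio(group_rate: float, reference_rate: float) -> float:
    """Ratio of a group's selection rate to the reference (highest) group's rate."""
    return group_rate / reference_rate

women_rate = selection_rate(shortlisted=18, applicants=120)  # 0.15
men_rate = selection_rate(shortlisted=30, applicants=150)    # 0.20

ratio = adverse_impact_ratio(women_rate, men_rate)
print(f"Adverse impact ratio: {ratio:.2f}")  # 0.75
if ratio < 0.8:
    print("Below the four-fifths threshold: review the screening criteria and model.")
```

A low ratio does not prove bias on its own, but it is exactly the kind of signal that should trigger the human review Sam describes.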

Conclusion:

Sam Gedye’s insights offer a comprehensive understanding of the multifaceted role of AI in recruitment, especially in the context of higher education. His analysis, grounded in the study’s findings, highlights both the potential and the pitfalls of using AI in the pursuit of gender diversity in academic hiring.

In higher education, where the representation of women in leadership roles remains disproportionately low, AI presents both a challenge and an opportunity. As Sam points out, AI can be a powerful tool for mitigating biases in resume screening, thus promoting equality. However, the inadvertent introduction of gender biases in AI-generated job advertisements underscores the need for a balanced approach, combining technological efficiency with human insight and oversight.

The conversation with Sam Gedye illuminates the nuanced role of AI in higher education recruitment. It serves as a reminder that while AI offers significant advantages in terms of efficiency and objectivity, its successful integration into recruitment strategies requires a careful balance of technology and human judgment. Institutions and companies in the higher education sector can greatly benefit from such expertise, ensuring that their use of AI in recruitment not only enhances operational efficiency but also actively contributes to the creation of a more diverse and inclusive academic community.
