The University of Auckland

Project #17: Can AI help detect biases affecting the reliability of Open Source documentation?

Description:

Previous research has indicated a gender bias in how open source contributors act, but is the reliability of the surrounding documentation also affected? We want to investigate whether there are biases in Open Source documentation that impede accuracy, and compare that documentation with AI-summarised and AI-generated alternatives, to see whether AI could be utilised to overcome entry barriers for new open source contributors.

This also has the potential to be useful in the open source security space, to detect malicious code or packages. 

Type:

Undergraduate

Outcome:

- Insights into existing gender biases in open source software project documentation, identified through user studies or other suitable research methods

- New tooling

Prerequisites:

None

Lab:

HASEL (405.662, Lab)