Think Your AI Is Fair? BiasPainter Might Just Change Your Mind
by @mediabias



Too Long; Didn't Read

BiasPainter is a metamorphic testing framework for detecting social bias in image generation models. It involves five stages: collecting diverse seed images, compiling neutral prompts, generating images with these prompts, assessing properties like race and gender, and finally detecting biases based on changes in these properties.

Authors:

(1) Wenxuan Wang, The Chinese University of Hong Kong, Hong Kong, China;

(2) Haonan Bai, The Chinese University of Hong Kong, Hong Kong, China;

(3) Jen-tse Huang, The Chinese University of Hong Kong, Hong Kong, China;

(4) Yuxuan Wan, The Chinese University of Hong Kong, Hong Kong, China;

(5) Youliang Yuan, The Chinese University of Hong Kong, Shenzhen, Shenzhen, China;

(6) Haoyi Qiu, University of California, Los Angeles, Los Angeles, USA;

(7) Nanyun Peng, University of California, Los Angeles, Los Angeles, USA;

(8) Michael Lyu, The Chinese University of Hong Kong, Hong Kong, China.

Abstract

1 Introduction

2 Background

3 Approach and Implementation

3.1 Seed Image Collection and 3.2 Neutral Prompt List Collection

3.3 Image Generation and 3.4 Properties Assessment

3.5 Bias Evaluation

4 Evaluation

4.1 Experimental Setup

4.2 RQ1: Effectiveness of BiasPainter

4.3 RQ2: Validity of Identified Biases

4.4 RQ3: Bias Mitigation

5 Threats to Validity

6 Related Work

7 Conclusion, Data Availability, and References

3 APPROACH AND IMPLEMENTATION

In this section, we present BiasPainter, a metamorphic testing framework for measuring social bias in image generation models. BiasPainter uses photos of different people as seed images and applies various prompts to have image generation models edit those seed images. The key insight is that under a gender-, race-, and age-neutral prompt, the gender, race, and age of the person in the photo should not change significantly. If they do, the model has learned a spurious correlation between gender/race/age and other properties (e.g., career); in other words, a suspicious bias is detected. For example, if an image generation model tends to convert female photos into male photos under the prompt "a photo of a lawyer", a gender bias associated with lawyers is detected. Figure 2 depicts the framework of BiasPainter, which consists of five stages:


(1) Seed Image Collection: collect photos of people across different races, genders and ages as seed images.


(2) Neutral Prompt List Collection: collect and annotate different prompts from various topics.


(3) Image Generation: for each seed image, input the different prompts into the image generation model to generate images.


(4) Properties Assessment: assess the race, gender, and age of the person in the seed image and in each generated image.


(5) Bias Detection: detect potential social bias using the metamorphic relation, i.e., by checking whether these properties changed between the seed image and the generated images.
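To make the metamorphic relation concrete, the sketch below shows how the final stage could flag a prompt as biased: if a neutral prompt flips the seed person's gender in more than a chosen fraction of generated images, it is reported as suspicious. This is a minimal illustration, not the paper's actual implementation; the record format, the `detect_gender_bias` function name, and the 0.5 threshold are all assumptions for the example, and the property labels would come from stage 4 (Properties Assessment).

```python
from collections import Counter

def detect_gender_bias(records, threshold=0.5):
    """Flag prompts whose generated images flip the seed person's
    gender more often than `threshold` (fraction of images)."""
    flips = Counter()   # prompt -> number of gender flips
    totals = Counter()  # prompt -> number of generated images
    for r in records:
        totals[r["prompt"]] += 1
        if r["generated_gender"] != r["seed_gender"]:
            flips[r["prompt"]] += 1
    # Report only prompts whose flip rate exceeds the threshold.
    return {p: flips[p] / totals[p]
            for p in totals
            if flips[p] / totals[p] > threshold}

# Toy example: the "lawyer" prompt flips 2 of 3 female seeds to male,
# so it is flagged; the "nurse" prompt preserves gender and is not.
records = [
    {"prompt": "a photo of a lawyer", "seed_gender": "female", "generated_gender": "male"},
    {"prompt": "a photo of a lawyer", "seed_gender": "female", "generated_gender": "male"},
    {"prompt": "a photo of a lawyer", "seed_gender": "female", "generated_gender": "female"},
    {"prompt": "a photo of a nurse",  "seed_gender": "male",   "generated_gender": "male"},
]
print(detect_gender_bias(records))
```

The same flip-rate check extends directly to race and age by comparing those labels instead of gender.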


This paper is available on arxiv under CC0 1.0 DEED license.