Astraea: Grammar-Based Fairness Testing

Software often produces biased outputs. In particular, machine learning (ML)-based software is known to produce erroneous predictions when processing discriminatory inputs. Such unfair program behavior can be caused by societal bias. In the last few years, Amazon, Microsoft and Google have provided...

Bibliographic Details
Published in: IEEE Transactions on Software Engineering, 2022-12, Vol. 48 (12), p. 5188-5211
Main Authors: Soremekun, Ezekiel, Udeshi, Sakshi, Chattopadhyay, Sudipta
Format: Article
Language:English