- The ACLU, together with four researchers who study algorithmic accountability, is challenging the Computer Fraud and Abuse Act (CFAA), arguing that its provisions make it illegal to perform the auditing of algorithms needed to test for discrimination and bias.
- The popular word2vec word-embedding method can learn biased associations, such as linking the word ‘nurse’ with the gender ‘female’. A new paper proposes a way to fix this problem.
- Diversity in teams that build AI might help the algorithms themselves be less biased.
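The embedding-debiasing idea above can be sketched in a few lines. This is a minimal illustration, not the paper's actual method: the three-dimensional vectors are made-up toy values, and the approach shown (estimating a gender direction from a word pair and projecting it out of a target word's vector) is one common debiasing strategy, assumed here for demonstration.

```python
import numpy as np

# Toy word vectors (hypothetical values, for illustration only).
emb = {
    "he":    np.array([ 1.0, 0.2, 0.3]),
    "she":   np.array([-1.0, 0.2, 0.3]),
    "nurse": np.array([-0.6, 0.8, 0.1]),
}

# Estimate a "gender direction" from a paired set of gendered words.
g = emb["he"] - emb["she"]
g = g / np.linalg.norm(g)

def debias(v, direction):
    """Remove the component of v that lies along the bias direction."""
    return v - np.dot(v, direction) * direction

def cos(a, b):
    """Cosine similarity between two vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Before debiasing, 'nurse' leans toward the 'she' end of the gender axis;
# after projecting out the gender direction, it is neutral along that axis.
nurse_db = debias(emb["nurse"], g)
print(cos(emb["nurse"], g))  # negative: skewed toward 'she'
print(cos(nurse_db, g))      # ~0: gender-neutral along this axis
```

In practice the gender direction is estimated from many word pairs (he/she, man/woman, etc.), and only words that should be gender-neutral are projected, so that legitimately gendered words like ‘king’ and ‘queen’ keep their meaning.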