- The ACLU, together with four algorithmic-accountability researchers, is challenging the Computer Fraud and Abuse Act (CFAA), arguing that its provisions make it illegal to audit algorithms for discrimination and bias.
- The popular word2vec word-embedding method can learn biased associations, such as linking the word ‘nurse’ with the gender ‘female’. A new paper proposes a way to remove these biases.
- Diversity in teams that build AI might help the algorithms themselves be less biased.
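The word-embedding bias above is easy to see in miniature. Here is a sketch of the core idea behind debiasing, using made-up 2-D vectors rather than real learned word2vec embeddings (which are trained on large corpora and typically have hundreds of dimensions): project words onto a "he minus she" gender axis, then subtract that component out. The vector values and function names are illustrative assumptions, not from the paper itself.

```python
import numpy as np

# Hypothetical 2-D "embeddings" for illustration only. In real word2vec
# output, associations like these emerge from co-occurrence statistics.
vecs = {
    "he":     np.array([ 1.0, 0.0]),
    "she":    np.array([-1.0, 0.0]),
    "doctor": np.array([ 0.4, 0.9]),
    "nurse":  np.array([-0.5, 0.8]),
}

def gender_axis():
    """Unit vector pointing from 'she' toward 'he'."""
    axis = vecs["he"] - vecs["she"]
    return axis / np.linalg.norm(axis)

def gender_score(word):
    """Projection onto the he-she axis; positive leans 'male'."""
    return float(vecs[word] @ gender_axis())

def debias(word):
    """Remove the gender component from a word's vector."""
    axis = gender_axis()
    v = vecs[word]
    return v - (v @ axis) * axis

print(gender_score("doctor"))          # positive: leans toward "he"
print(gender_score("nurse"))           # negative: leans toward "she"
print(float(debias("nurse") @ gender_axis()))  # ~0 after debiasing
```

The debiasing step here zeroes out one direction for all non-gendered words; the actual paper is more careful, e.g. about which words should keep their gender component ("grandmother") and which shouldn't ("nurse").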
What I’ve been reading (or meaning to read) this week:
- The Effect of the Texas 10% Law on Applications, Admissions, Enrollment and Student Outcomes – Regression Discontinuity Evidence
- The Texas Ten Percent Plan’s Impact on College Enrollment: Students go to public universities instead of private ones
- The government wants Silicon Valley to build terrorist-spotting algorithms. But is it possible?
- Discovering Unwarranted Associations in Data-Driven Applications with the FairTest Testing Toolkit
- Classifier Technology and the Illusion of Progress
A dump of what I’ve been reading lately:
- The NSA’s SKYNET program may be killing thousands of innocent people – including a nice description by Patrick Ball of why the “Ridiculously optimistic” machine learning algorithm is “completely bullshit.”
- Violence in Blue – Patrick Ball appears again to explain that “one-third of all Americans killed by strangers are killed by police.”
- A scientist calculated the cost of not being a straight man, and she wants a tax cut – “These are the results of a few of her calculations: it costs about £38,000 ($54,000) to be a gay man in England; women in the US tech industry pay a tax of between $100,000 and $300,000; and women in tech in Hong Kong or Singapore face an even steeper $800,000 to $1.5 million.”
- Digital star chamber: Algorithms are producing profiles of you. What do they say? You probably don’t have the right to know – Frank Pasquale reminds us that “cyberspace is no longer an escape from the ‘real world’. It is now a force governing it via algorithms” and argues that “when algorithms start affecting critical opportunities for employment, career advancement, health, credit and education, they deserve more scrutiny.” #algacc