Using Algorithms to Determine Character

The NYT Bits blog reports on yet another attempt to remove humans from the “judgment pipeline,” this time in the realm of credit ratings.

A company in Palo Alto, Calif., called Upstart has over the last 15 months lent $135 million to people with mostly negligible credit ratings. Typically, they are recent graduates without mortgages, car payments or credit card settlements.

On the one hand, I like these kinds of efforts to eliminate human bias from processes that are so fraught. On the other, it’s important to keep in mind that we are merely replacing one kind of bias (human) with another (algorithmic), and worse, we don’t really know what this second form of bias even looks like.
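To make that concrete, here is a toy sketch (every feature name, weight, and ZIP code below is invented for illustration) of how a scoring model that never sees a protected attribute can still encode bias through a correlated proxy:

```python
# Toy illustration only: all feature names, weights, and data are invented.
# The scorer never sees race, gender, etc., yet its output can track them
# through a proxy feature (here, a "premium" attached to an applicant's ZIP).

WEIGHTS = {"gpa": 2.0, "zip_premium": 1.5}

# Suppose historical data made some ZIP codes look "safe" because of who
# happened to live there, not because of actual repayment behavior.
ZIP_PREMIUM = {"94301": 1.0, "60621": 0.0}

def credit_score(applicant):
    """Linear scorer with no explicit protected attribute as input."""
    return (WEIGHTS["gpa"] * applicant["gpa"]
            + WEIGHTS["zip_premium"] * ZIP_PREMIUM[applicant["zip"]])

applicant_a = {"gpa": 3.5, "zip": "94301"}
applicant_b = {"gpa": 3.5, "zip": "60621"}
print(credit_score(applicant_a))  # 8.5
print(credit_score(applicant_b))  # 7.0, identical GPA but a lower score
```

Nothing in that code mentions a protected attribute, which is precisely why this kind of bias is so hard to spot from the outside.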

As Kate Crawford put it in a recent tweet:

[embedded tweet]

Genetic Access Control

App Used 23AndMe’S DNA Database to Block People From Sites Based on Race and Gender

A coder on GitHub has demonstrated the inevitable result of 23andMe’s API for genetic information by creating Genetic Access Control, a demo app that grants access to a website only to people of “European (minus Ashkenazi)” ancestry. 23andMe has cut off the project’s access to the API, but it seems likely that similar (though perhaps more subtly marketed) projects will follow, building on this source code.
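The unsettling part is how little code such gating requires. Below is a minimal sketch of the idea; the endpoint path, response shape, and field names are assumptions for illustration, not the actual 23andMe API (which, as noted, no longer serves this project):

```python
import requests  # pip install requests

# Assumed values for illustration; the real 23andMe API differed in detail,
# and access for this use case has been revoked.
API_BASE = "https://api.23andme.com/1"
REQUIRED_POPULATION = "European"
EXCLUDED_SUBPOPULATION = "Ashkenazi"

def fetch_ancestry(access_token):
    """Fetch the user's ancestry composition with an OAuth bearer token.

    The path and JSON shape here are hypothetical stand-ins.
    """
    resp = requests.get(
        f"{API_BASE}/ancestry/",
        headers={"Authorization": f"Bearer {access_token}"},
    )
    resp.raise_for_status()
    return resp.json()  # e.g. {"label": "European", "sub_populations": [...]}

def is_allowed(ancestry):
    """Admit only "European (minus Ashkenazi)" ancestry, per the demo's rule."""
    if ancestry.get("label") != REQUIRED_POPULATION:
        return False
    sub_labels = {p.get("label") for p in ancestry.get("sub_populations", [])}
    return EXCLUDED_SUBPOPULATION not in sub_labels
```

Everything else is ordinary web plumbing: an OAuth redirect to obtain the token, then a single conditional on the response. That is the whole point of the demonstration: the barrier to building this kind of discrimination is nearly zero.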