3Novices: Decision systems that respect privacy, fairness

Increasingly, decisions and actions affecting people’s lives are determined by automated systems processing personal data. Excitement about these systems has been accompanied by serious concerns about their opacity and the threats they pose to privacy, fairness, and other values. Examples abound in real-world systems: Target’s use of predicted pregnancy status for marketing; Google’s use of health-related search queries for targeted advertising; race being associated with automated predictions of recidivism; gender affecting displayed job-related ads; race affecting displayed search ads; Boston’s Street Bump app focusing pothole repair on affluent neighborhoods; Amazon’s same-day delivery being unavailable in black neighborhoods; and Facebook showing either “white” or “black” movie trailers based upon “ethnic affiliation.”

Source: http://ift.tt/2uMENgx