Topical news and op-ed articles

  • 2018-08-05 | Guardian | John Naughton | Magical thinking about machine learning won’t bring the reality of AI any closer
    Unchecked flaws in algorithms, and even the technology itself, should put a brake on the escalating use of big data.
  • 2018-01-17 | The Atlantic | Ed Yong | A Popular Algorithm Is No Better at Predicting Crimes Than Random People
    The COMPAS tool is widely used to assess a defendant’s risk of committing more crimes. Scholarly studies show that the automated tool is not much more accurate or fair than a coin toss.
  • 2017-07-20 | | Robin A. Smith | Opening the lid on criminal sentencing software
    In 2013, a Wisconsin man named Eric Loomis was convicted of fleeing an officer and driving a car without the owner's consent. He was denied probation and sentenced to six years in prison based, in part, on a prediction made by a secret computer algorithm.
  • 2017-04-17 | Wired | Jason Tashea | Courts Are Using AI to Sentence Criminals. That Must Stop Now
    A Wired op-ed discussing the lack of algorithmic transparency observed in the case of Wisconsin v. Loomis.
  • 2016-06-30 | The Atlantic | Megan Garber | When Algorithms Take the Stand
    In February of 2013, Eric Loomis was found driving a car that had been used in a shooting. He was arrested; he pleaded guilty to eluding an officer and no contest to operating a vehicle without its owner’s consent.
  • 2016-06-22 | New York Times | Mitch Smith | In Wisconsin, a Backlash Against Using Data to Foretell Defendants' Futures
    Mr. Loomis was arrested in February 2013 and was accused of driving a car that had been used in a shooting. He pleaded guilty to eluding an officer and no contest to operating a vehicle without the owner’s consent. Mr. Loomis, 34, is a registered sex offender, stemming from a past conviction for third-degree sexual assault.
  • 2016-05-23 | | Julia Angwin, Jeff Larson, Surya Mattu and Lauren Kirchner | Machine Bias
    Analysis of COMPAS risk scores for defendants from Broward County shows that the algorithm used to assess the likelihood of future crime assigns much higher risk scores to blacks than to whites. Blacks with no criminal history are assigned high risk scores after a first offence, while whites with a long history of offences are assigned lower scores.
  • 2015-09-02 | | Lauren Kirchner | When big data becomes bad data
    Corporations are increasingly relying on algorithms to make business decisions and that raises new legal questions about whether discriminatory practices are embedded in the algorithms.
  • 2015-07-08 | Huffington Post | Andrew Guthrie Ferguson | Predicting Predictive Policing in NYC
    Op-ed in which law professor Andrew Guthrie Ferguson raises five key questions "that concerned citizens should ask about the new technology".
  • 2015-04-30 | BBC | Kent crime up despite new 'predictive policing' tool
    Kent Police introduced a £130,000 predictive policing system across the county in April 2013 after a four-month trial in Medway saw street violence fall by 6%. New figures show that crime in Kent increased despite the introduction of the system.