Removing Bias from Predictive Modeling


      Telegram SmartBoT
      Moderator
        @tgsmartbot

        #Discussion(General) [ via IoTGroup ]


        Headings…
        Removing Human Bias from Predictive Modeling


        Auto extracted Text…

        Wharton statistics professor James Johndrow's latest research, “An Algorithm for Removing Sensitive Information: Application to Race-independent Recidivism Prediction,” focuses on removing information about race from data used to predict recidivism, but the method can be applied beyond the criminal justice system.
        In criminal justice, algorithms are widely used for decisions such as who must post bail to get out of jail pre-trial and who is released on their own recognizance.
        You then take some kind of algorithm and use all of that historical data on people to train it, that is, to learn its parameters.
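        As a rough sketch of that training step (in Python; the file name, feature columns, and the recidivated label below are invented for illustration, not taken from the study):

        import pandas as pd
        from sklearn.linear_model import LogisticRegression

        # Hypothetical historical records: each row is a past case with a
        # label indicating whether the person re-offended.
        history = pd.read_csv("historical_cases.csv")
        X = history[["age", "priors_count"]]   # hypothetical predictor features
        y = history["recidivated"]             # 1 = re-offended, 0 = did not

        model = LogisticRegression()
        model.fit(X, y)                        # "learn its parameters" from history
        risk = model.predict_proba(X)[:, 1]    # estimated probability of recidivism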
        There is a notion called statistical parity, or demographic parity: the idea that, for certain protected groups, predictions should be distributed the same way across those groups. The question was, “Can I produce a method that people could use if they wanted to go with this notion?”
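        Checking demographic parity on the hypothetical example above could look like this minimal sketch (the race column and the 0.5 threshold are assumptions):

        # Demographic parity asks: is the share of people predicted high-risk
        # (roughly) the same in every protected group?
        checked = history.assign(high_risk=risk > 0.5)      # arbitrary threshold
        print(checked.groupby("race")["high_risk"].mean())  # positive rate per group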
        What our paper really provides is a method for taking some data, removing all of the information about race, or sex, or whatever your protected variables are, and then handing that data off to anybody who wants to train an algorithm.
        Repairing the data so that it removed information on lots and lots of variables at once was the harder problem.
        At least in the example and dataset that we are working with, you can see that after you use our method, there really is not any information about the protected variables left.
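        The paper's own algorithm is more general than this, but one simple way to illustrate "repairing" data is to replace each feature with its residual after regressing it on the protected variable, so the repaired feature carries no linear information about that variable (a sketch under that assumption, not the authors' method):

        from sklearn.linear_model import LinearRegression

        Z = pd.get_dummies(history["race"])    # protected variable, one-hot encoded
        repaired = history.copy()
        for col in ["age", "priors_count"]:    # hypothetical features, as above
            reg = LinearRegression().fit(Z, history[col])
            repaired[col] = history[col] - reg.predict(Z)  # keep only the residual
        # `repaired` can now be handed to anyone who wants to train an algorithm.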
        What you can see is that if you just naively take the data and try to predict recidivism, you are going to find really big differences by race, and certainly by other variables as well.
        After running our method on the data, the distributions of the predictions across all of the people in the dataset look very, very similar by race.
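        One way to make that comparison concrete (again on the hypothetical example; the group labels "A" and "B" are invented) is to measure the distance between prediction distributions by group before and after the repair, for instance with a two-sample Kolmogorov-Smirnov statistic:

        from scipy.stats import ks_2samp

        def parity_gap(scores, race, a, b):
            # KS distance between the prediction distributions of groups a
            # and b; 0 means the two distributions are identical.
            return ks_2samp(scores[race == a], scores[race == b]).statistic

        race = history["race"]
        Xr = repaired[["age", "priors_count"]]
        risk_repaired = LogisticRegression().fit(Xr, y).predict_proba(Xr)[:, 1]

        print("gap before repair:", parity_gap(risk, race, "A", "B"))
        print("gap after repair: ", parity_gap(risk_repaired, race, "A", "B"))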
        Knowledge@Wharton: A lot of industries are trying to use predictive modeling without bias.
        We are finding lots of interesting things with this data.
        There are all kinds of things going on with how people record the data and how that has changed over time.


        Read More..
        AutoTextExtraction by Working BoT using SmartNews 1.02976805238 Build 26 Aug 2019
