Dallas, Texas, USA, 11/14/2019 / Story.KISSPR.com /
Dallas, TX — It may sound like something out of a science fiction movie, but cities across the country are already using artificial intelligence (AI) to assess whether someone accused of a crime should be released on bond prior to their trial.
According to computer experts, however, some of these systems make mistakes and could therefore be contributing to the mass incarceration problem in the United States. To put this problem into perspective, 1 in 38 Americans is housed in some form of correctional facility.
As stated in a July 2019 opinion piece in The New York Times by two MIT research scientists and a Harvard lawyer, “the U.S. imprisons more people than any other country in the world.”
The article further explains that 500,000 legally innocent people are detained in the United States based on the denial of bail, but not all of them should be: “there are more legally innocent people behind bars in America today than there were convicted people in jails and prisons in 1980.”
Unnecessary pretrial detention can cause people to lose their jobs, and it can jeopardize their ability to care for their families. In an effort to cut back on unnecessary pretrial incarceration, cities around the U.S. have begun to use algorithm-based AI to predict whether someone is likely to commit a crime in the future. But violent crime is statistically rare, and that scarcity of data makes it almost impossible for the AI to be accurate.
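To see why rarity undermines these predictions, consider a simple base-rate sketch. The figures below (a 1 percent offense rate and a tool that is 90 percent accurate on both offenders and non-offenders) are hypothetical illustrations, not numbers from any actual risk assessment system:

```python
# Hypothetical illustration of the base-rate problem in risk prediction.
# All numbers are assumptions for the sake of the example.

base_rate = 0.01     # assumed: 1% of defendants would commit a violent crime
sensitivity = 0.90   # assumed: tool flags 90% of true future offenders
specificity = 0.90   # assumed: tool clears 90% of non-offenders

# Among 10,000 hypothetical defendants:
population = 10_000
offenders = population * base_rate            # 100 people
non_offenders = population - offenders        # 9,900 people

true_flags = offenders * sensitivity          # 90 correctly flagged
false_flags = non_offenders * (1 - specificity)  # 990 wrongly flagged

# Precision: of everyone the tool flags as "high risk",
# what fraction would actually have offended?
precision = true_flags / (true_flags + false_flags)
print(f"Flagged: {true_flags + false_flags:.0f}, "
      f"actually dangerous: {precision:.1%}")
```

Even with a tool that is right 90 percent of the time in both directions, the rare-event base rate means that fewer than 1 in 11 people it labels “high risk” would actually have offended; the other 10 could be detained unnecessarily.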
In addition to their general inaccuracy, differences in software from city to city can produce wildly different results. A review of 100,000 judicial bail decisions conducted by the executive director of the Stanford Computational Policy Lab found that some judges released 90 percent of the individuals who appeared before them, whereas other judges released only around 50 percent. According to the director, the problem with today’s risk assessment tools is that they fail to review a defendant’s case on an individual basis.
Additionally, according to Dallas criminal defense lawyer Mick Mickelsen, these systems fail to take into account the driving force behind many violent crimes: human emotion.
As these risk assessment tools grow in popularity, over 100 organizations, including the ACLU and the NAACP, have signed a petition to stop their use.
If you or a loved one has been charged with a crime, it’s important to discuss your case with an experienced Dallas criminal defense lawyer at every step of the process.
Dallas Best Criminal Defense Lawyers
Broden & Mickelsen, PC
Social Media Tags: Artificial Intelligence to Predict Future Criminal Behavior, Dallas Best Criminal Defense Lawyers, Broden & Mickelsen, Dallas criminal defense lawyer, Artificial Intelligence
Release ID: 12764