
Flawed Forecasts: The Unraveling of Predictive Policing Software Geolitica

In this post:

  • Geolitica’s predictive policing in NJ showed a dismal <0.5% success rate, raising questions about the efficacy and ethics of AI in crime prevention.
  • High hopes of AI-led crime reduction meet reality; Geolitica’s failure underlines the need for ethical scrutiny and community-centric approaches.
  • Transition of Geolitica’s team to SoundThinking hints at a continued pursuit of tech-aided law enforcement solutions despite glaring setbacks.

For over a decade, law enforcement agencies have turned to predictive policing software, hoping to harness the power of artificial intelligence (AI) in their battle against crime. These tools promised a future in which police could anticipate and prevent crimes before they occurred, much like the premise of Philip K. Dick’s “Minority Report.” However, a recent joint investigation by The Markup and Wired has uncovered the dismal performance of one such product, Geolitica, in Plainfield, New Jersey, shedding light on the broader efficacy and ethical implications of predictive policing.

The ill-fated experiment

Geolitica, initially known as PredPol, was adopted by the Plainfield Police Department, along with several other agencies, with the aim of reducing crime through predictive analytics. The software’s algorithm, akin to those used to predict earthquake aftershocks, was tasked with forecasting criminal activity. However, an examination of 23,631 predictions made by Geolitica between February and December 2018 revealed a bleak success rate of less than half a percent: fewer than 100 of the analyzed predictions corresponded with actual criminal incidents. The software fared slightly better on some crime types than others, with a 0.6% success rate for robberies or aggravated assaults, compared with a mere 0.1% accuracy for burglaries.
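The figures above boil down to a simple hit-rate calculation: count how many predictions were matched by a real incident of the predicted type, then divide by the total number of predictions, overall and per crime category. The snippet below is a minimal, hypothetical Python sketch of that kind of tally; the field names, toy records, and matching rule (same date, patrol box, and crime type) are illustrative assumptions, not Geolitica’s algorithm or The Markup’s actual methodology.

```python
from collections import Counter

# Hypothetical records: each prediction and each reported incident carries a
# date, a patrol box, and a crime type. These schemas are assumptions made
# purely for illustration.
predictions = [
    {"date": "2018-02-01", "box": "B-14", "crime": "robbery"},
    {"date": "2018-02-01", "box": "C-07", "crime": "burglary"},
    # ... the real analysis covered 23,631 predictions
]
incidents = [
    {"date": "2018-02-01", "box": "B-14", "crime": "robbery"},
    # ... actual reported crimes
]

def hit_rates(predictions, incidents):
    """Share of predictions matched by a real incident, overall and per crime type."""
    incident_keys = {(i["date"], i["box"], i["crime"]) for i in incidents}
    hits, totals = Counter(), Counter()
    for p in predictions:
        totals[p["crime"]] += 1
        if (p["date"], p["box"], p["crime"]) in incident_keys:
            hits[p["crime"]] += 1
    overall = sum(hits.values()) / max(sum(totals.values()), 1)
    per_type = {crime: hits[crime] / totals[crime] for crime in totals}
    return overall, per_type

overall, per_type = hit_rates(predictions, incidents)
print(f"overall hit rate: {overall:.1%}")  # the reported analysis found <0.5% overall
for crime, rate in per_type.items():
    print(f"{crime}: {rate:.1%}")          # e.g. 0.6% robbery/assault, 0.1% burglary
```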

Disappointment and disuse

The reality of Geolitica’s performance contrasted starkly with its promise, and the software went largely unused in Plainfield. The head of the Plainfield Police Department, Captain David Guarino, said the department had initially hoped the software would make its crime-reduction efforts more effective, but the dismal success rate led to its eventual disuse. “I don’t believe we really used it that often, if at all,” Guarino admitted, highlighting the disillusionment surrounding the software’s capabilities. Recognizing Geolitica’s ineffectiveness, the department decided to stop using it and redirect the funds toward potentially more impactful community programs.


Financial toll

The adoption of Geolitica came at a significant financial cost to the Plainfield Police Department. The initial contract carried a $20,500 annual subscription fee, with an additional $15,500 for a second-year extension. That substantial investment did not translate into the anticipated crime-reduction benefits, raising questions about how public funds are allocated to technology-driven initiatives.

Transition and legacy

Despite these shortcomings, and Geolitica’s decision to cease operations by the end of the year, the story of predictive policing does not end here. Geolitica’s personnel have transitioned to SoundThinking, the law enforcement software firm previously known as ShotSpotter. The move suggests a continued pursuit of technology-aided law enforcement solutions, albeit one shadowed by the legacy of Geolitica’s failure.

Ethical quandaries

The endeavor into predictive policing is not without its ethical quandaries. Critics argue that these AI systems, rooted in data potentially tainted with historical biases, may perpetuate or even exacerbate existing discriminatory practices within law enforcement. The failure of Geolitica serves as a stark reminder of the hurdles that predictive policing faces, not only in terms of technical effectiveness but also in navigating the moral and social implications intertwined with AI’s role in law enforcement.

The unraveling of Geolitica’s predictive policing experiment in Plainfield sheds light on the complex landscape where technology intersects with law enforcement. While the allure of AI-driven crime prevention is tempting, the Geolitica case underscores the paramount importance of thorough evaluation, ethical scrutiny, and community-centric approaches in leveraging technology for public safety.
