Not to rain on anyone’s parade, but this is something most law enforcement offices have been doing for a looong time: statistical analysis.
Basically you gather a lot of crime data, build a forecast, and voilà: you can “predict” where future crimes will happen*.
*Prediction accuracy depends on data quality (i.e. it’s only as accurate as your cops are honest and people trust them enough to report crimes) and on model bias.
Model bias is interesting: we weight petty crimes like weed peddling more heavily than, say, a company poisoning a city’s water supply.
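To make the "get data, build a forecast" part concrete, here's a minimal sketch of the kind of statistical forecast this refers to: a naive moving-average over per-district incident counts. The district names and numbers are made up for illustration; real systems use far richer models and geography.

```python
from statistics import mean

# Hypothetical monthly incident counts per patrol district (made-up numbers).
history = {
    "district_a": [12, 15, 11, 14, 16, 13],
    "district_b": [3, 4, 2, 5, 3, 4],
}

def forecast_next_month(counts, window=3):
    """Naive statistical forecast: mean of the last `window` months."""
    return mean(counts[-window:])

# Rank districts by expected incident count -- this ranking is the "prediction"
# of where future crimes will happen.
ranked = sorted(history, key=lambda d: forecast_next_month(history[d]), reverse=True)
```

Note that everything the forecast knows comes from `history` — if reporting is skewed, the ranking is skewed too, which is exactly the data-quality caveat above.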
“AI” gets included here because, over the last decade, we’ve gotten better at deep-learning forecasting methods, which improve on traditional statistical analysis.
That said, in real-life scenarios you ought to combine statistical and ML models with DL ones to reach a forecast that’s actually valuable.
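A common way to combine models like this is a simple weighted ensemble. The function and weights below are illustrative assumptions, not any specific system's method; in practice the weights would be tuned on held-out data.

```python
def combine_forecasts(stat_pred, ml_pred, dl_pred, weights=(0.4, 0.3, 0.3)):
    """Blend a statistical, an ML, and a DL forecast into one number.

    The fixed weights here are a placeholder; real ensembles learn them
    from validation performance.
    """
    preds = (stat_pred, ml_pred, dl_pred)
    return sum(w * p for w, p in zip(weights, preds))

# Three hypothetical forecasts for the same district, blended into one.
blended = combine_forecasts(10.0, 12.0, 14.0)
```

The ensemble inherits the biases of every component model, so blending does not fix the data problems described above — it only smooths out individual model error.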
My final point is that this is a hugely time-consuming activity: anyone promising a system that produces realistic forecasts in real time is lying. Useful forecasts take months to produce (and crime patterns may well have shifted by the time they’re done).
The problem is that these systems inherit racist biases from their training data, both from the real world the data describes and from the people who collected it.