Predictive policing is predictably controversial
As the debate over data-driven predictive policing continues to heat up, not every department is ready to back away from its use.
What many people don’t realize is that there are gaps in artificial intelligence regulations, some of which allow infringement on civil rights, including housing and employment opportunities.
It’s not often you see a Chief Justice of a state Supreme Court apologize for making a mistake. But that’s exactly what recently happened in Florida after the Florida Board of Bar Examiners recommended canceling a scheduled online Bar exam just days before the test.
When a crisis rears its head and government and the tech community invent ways to handle it, lots of personal data is collected and stored. What happens to that data once an emergency passes? What happens to the technologies used to collect it? What civil liberties are at risk?
Can police mount surveillance cameras on phone and utility poles outside your house and watch your every move? Not in Massachusetts, at least not for extended periods.
The U.S. may see a decline in the use of certain technologies used by law enforcement, including facial recognition, as calls for reimagining policing continue to grow.
Law enforcement agencies use social media in various ways to monitor crime and communicate with the population. But there are few laws on what they can and cannot do with someone’s personal information.
College students forced into online classes are suing their colleges, arguing that the classes aren’t as good as in-person instruction. Experts say that may be a tough sell.
Cameras are everywhere, but a new tool, the Atlas of Surveillance, compiles information on surveillance in more than 3,000 cities.
The right to privacy is disappearing in today’s electronic age, and most people are letting it happen.