Police agencies increasingly must equip themselves to detect when AI is used for illegal purposes.
Police forces are increasingly confronted with individuals using artificial intelligence to fabricate fraudulent claims. In one notable incident in Onondaga County, New York, a man falsely reported that ice falling from a police car had damaged his vehicle. Authorities quickly identified the accompanying image of the supposed damage as AI-generated: it was watermarked 'Meta AI.' The case underscores a shift in criminal tactics and the growing need for law enforcement agencies to develop new strategies and tools to detect AI-driven illegal activity.
Cicero Police Chief Steve Rotunno explained that the 'Meta AI' watermark on the submitted image immediately flagged the report as an attempt to defraud either the police department or taxpayers. Such incidents highlight the hidden costs of keeping pace with AI-enabled criminal trends: while grant funding, such as the New York state law enforcement technology grant, helps departments acquire resources for these investigations, time spent on bogus AI cases diverts personnel and administrative effort from legitimate ongoing investigations.
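As a loose illustration of the kind of first-pass screening involved (not the forensic process the chief describes), an investigator's tooling might simply scan an image file's raw bytes for known plain-text provenance strings, since markers like 'Meta AI' are often embedded as readable metadata. The marker list and function below are hypothetical, not an authoritative or exhaustive detector:

```python
# Hypothetical first-pass screen: search an image file's raw bytes for
# plain-text provenance markers that some AI tools embed in metadata
# (e.g. PNG text chunks or XMP packets). Marker list is illustrative.
KNOWN_AI_MARKERS = [b"Meta AI", b"c2pa", b"DALL-E", b"Adobe Firefly"]

def find_ai_markers(data: bytes) -> list[str]:
    """Return any known AI-provenance strings found in the raw bytes."""
    return [m.decode() for m in KNOWN_AI_MARKERS if m in data]

# Usage on a submitted photo (path is illustrative):
# markers = find_ai_markers(open("claim_photo.png", "rb").read())
# if markers:
#     print("Possible AI-generated image, markers:", markers)
```

A match is only a lead, not proof: markers can be stripped or absent, which is why the experts quoted below point to pixel-level forensic tools and academic partnerships for deeper analysis.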
April Harris, an associate professor of cybersecurity and criminal justice at Herkimer County Community College and a former national security contractor, emphasized the difficulty of detecting AI-generated content with the naked eye. She noted that while forensic tools can assist by analyzing pixels and layering, they are not foolproof. Harris advocates for police agencies to establish partnerships with local academic institutions that have the relevant training and equipment, giving law enforcement access to sophisticated forensic tools and expertise at potentially lower cost than purchasing expensive professional licenses.
The academic community in New York state has expressed a strong willingness to assist law enforcement with AI-related challenges. Lee McKnight, an associate professor at Syracuse University's iSchool and director of the AI, Blockchain and Crypto Innovation Lab, said his lab stands ready to help analyze such cases. Similarly, Siwei Lyu, a SUNY Empire innovation professor in the University at Buffalo's Department of Computer Science and Engineering, said he is frequently contacted by journalists and investigators for fact-checking on AI issues and is eager to support law enforcement. This collaborative approach also points to free or low-cost resources available to police.
Chief Rotunno underscored the operational burden these false claims impose: even a single bogus case can consume several hours of administrative and detective time, pulling personnel away from other investigations at a tangible cost in efficiency and resource allocation. In the Cicero incident, the individual behind the fabricated report was ultimately charged with falsely reporting an incident, demonstrating that legal consequences exist for those who misuse AI to deceive law enforcement.