The Fourth Amendment requires “reasonable suspicion” before police may stop and briefly detain a suspect. As a general matter, the suspicion derives from information a police officer observes or knows. It is individualized to a particular person at a particular place. Most reasonable suspicion cases involve police confronting unknown suspects engaged in observable suspicious activities. Essentially, the reasonable suspicion doctrine is based on “small data” – discrete facts involving limited information and little knowledge about the suspect.
But what if this small data is replaced by “big data”? What if police can “know” about the suspect through new networked information sources? Or, what if predictive analytics can forecast who will be the likely troublemakers in a community? The rise of big data technology offers a challenge to the traditional paradigm of Fourth Amendment law. Now, with little effort, most unknown suspects can be “known,” as a web of information can identify and provide extensive personal data about a suspect independent of the officer’s observations. New data sources – including law enforcement databases, third party information sources (phone records, rental records, GPS data, video surveillance data, etc.), and predictive analytics, combined with biometric or facial recognition software – mean that information about a suspect can be assembled in a few data searches. At some point, the data (independent of the observation) may become sufficiently individualized and predictive to justify the seizure of a suspect. The question this article poses is whether a Fourth Amendment stop can be predicated on the aggregation of specific, individualized, but otherwise non-criminal factors.
This article traces the consequences of the shift from a “small data” reasonable suspicion doctrine, focused on specific, observable actions of unknown suspects, to the “big data” reality of an interconnected, information-rich world of known suspects. With more targeted information, police officers on the streets will have a stronger predictive sense about the likelihood that they are observing criminal activity. This evolution, however, only hints at the promise of big data policing. The next phase will be using existing predictive analytics to target suspects without any actual observation of criminal activity, relying merely on the accumulation of various data points. Unknown suspects will become known, not because of who they are but because of the data they left behind. Using pattern matching techniques applied across networked databases, individuals will be targeted out of the vast flow of informational data. This new reality subverts reasonable suspicion, turning it from a source of protection against unreasonable stops into a means of justifying those same stops.

This article provides one example of how changing technology stands to fundamentally change the impact of various Fourth Amendment doctrinal rules. Another example is the third party doctrine, which holds that voluntary disclosure of information to third parties removes a reasonable expectation of privacy in that information. With the vast amount of information now being shared through telephone companies and internet providers, some critics (and even one district court) are now calling for a rethinking of this doctrine.
A further example of a Fourth Amendment rule that may need to be revisited comes from Kyllo v. United States, in which the Court held that the warrantless use of a thermal imaging device to examine a home was unconstitutional, reasoning that a Fourth Amendment search occurs when police use a device that is not in general public use to reveal information about the interior of the home. Commentators question whether Kyllo remains good law in light of the growing availability of thermal imaging devices to the general public.
Each of these examples involves a Fourth Amendment rule that no longer provides the level of privacy it did at the time it was formulated. If that same level of privacy is to be preserved, either courts will need to significantly revise Fourth Amendment doctrine, or legislatures will need to place limits on the government's use of new technology.