Department Of Homeland Security Developing Pre-Crime Technology

By Will LeBlanc

Minority Report just got a little more real, and not in a cool way. Sure, Tom Cruise looked badass navigating those awesome computer screens trying to figure out who was about to commit a murder, but that was fiction, pulled from a Philip K. Dick story commenting on a totalitarian government of the future. The Department of Homeland Security, however, seems to think it wasn't just cool but also a good idea, and now it wants a precog machine of its own.

According to The Atlantic, DHS is developing FAST, the Future Attribute Screening Technology. The machine processes various cues from a subject, such as heart rate, behavioral tics, body temperature, and facial patterns, and determines whether or not the subject intends to carry out some nefarious activity. In theory, this is a cool idea that could help keep the country safe. In execution, there's no possible way this thing can work, and the fact that the project's development was approved by what has to be dozens of government officials is a terrifying exposure of just how many fools we have running this country.
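
To picture the kind of thing FAST is claimed to do, here's a purely hypothetical toy sketch of cue-based scoring. The cue names, weights, and threshold below are invented for illustration and have nothing to do with DHS's actual model:

```python
# A purely hypothetical toy of cue-based "malintent" scoring.
# Nothing here reflects DHS's actual model: the cues, weights,
# and threshold are invented for illustration only.

from dataclasses import dataclass

@dataclass
class CueReadings:
    heart_rate_bpm: float   # physiological cue
    body_temp_c: float      # physiological cue
    fidget_score: float     # behavioral cue, 0..1
    gaze_aversion: float    # facial-pattern cue, 0..1

def malintent_score(cues: CueReadings) -> float:
    """Weighted sum of deviations from a 'calm traveler' baseline."""
    score = 0.0
    score += max(0.0, cues.heart_rate_bpm - 80.0) * 0.01
    score += max(0.0, cues.body_temp_c - 37.0) * 0.5
    score += cues.fidget_score * 0.3
    score += cues.gaze_aversion * 0.3
    return score

# Flag anyone over an arbitrary cutoff. The core problem: a stressful
# commute can push these readings just as high as actual bad intent.
stressed_commuter = CueReadings(95.0, 37.4, 0.6, 0.5)
print(malintent_score(stressed_commuter) > 0.5)  # True
```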

The Atlantic points out two huge issues with FAST that should be enough to pull funding for its development, yet somehow trials keep moving forward. First, machines can be wrong. Until there's a way to read people's minds, we will never be able to predict a person's actions with 100% accuracy. Worse, this runs into the false-positive paradox: when the base rate of actual positives is extremely low, even an accurate test produces more false positives than true positives. Terrorists are such a vanishingly small fraction of travelers that the overwhelming majority of people the machine flags as terrorists will actually be innocent.
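
Here's a minimal sketch of that paradox, assuming FAST's reported 70% figure applies as both its true-positive and true-negative rate, and assuming a hypothetical base rate of one actual terrorist per ten million travelers:

```python
# A minimal sketch of the false-positive paradox, assuming FAST's
# reported 70% figure is both its true-positive rate (sensitivity)
# and true-negative rate (specificity), and assuming a hypothetical
# base rate of 1 actual terrorist per 10 million travelers.

def positive_predictive_value(sensitivity, specificity, base_rate):
    """Bayes' rule: P(actual threat | flagged by the machine)."""
    true_positives = sensitivity * base_rate
    false_positives = (1 - specificity) * (1 - base_rate)
    return true_positives / (true_positives + false_positives)

ppv = positive_predictive_value(0.70, 0.70, 1 / 10_000_000)
print(f"Chance a flagged traveler is an actual threat: {ppv:.8f}")
# Prints ~0.00000023 -- about one real threat per 4 million flags.
```

Under those assumptions, fewer than one in four million people the machine flags would be an actual threat.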

Along the same lines, the cues that trigger a positive response from FAST may have nothing to do with malintent. A long day at work, a bad argument, a fight at home, even hitting too many red lights could put you in a frame of mind that makes you look more like a terrorist to FAST than the guy next to you who won the lottery and rode to the airport in a limo. Not to mention that merely being in an airport is infuriating in itself, enough to send most people into a mindset where fits of rage and violence seem like a good idea, even though they'd never act on them.

The way DHS is testing this is by taking two groups of people: one set who pass through the machine with nothing to hide, and another set instructed to carry out some sort of act once they get to the other side. The released results have been vague, but we know that FAST identified correctly in 70% of instances, which is abysmal any way you break it down. Hundreds of thousands of people pass through airport security every day, and if this were our terrorist watchdog, a 30% error rate means huge numbers of innocent travelers would be classified as terrorists when they really aren't, while roughly three in ten actual terrorists would slip through undetected, and then boom, no more airplane.
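
To put numbers on that, here's a back-of-the-envelope sketch assuming a hypothetical checkpoint screening 100,000 travelers a day, one actual threat among them, and the 70% figure applied symmetrically to both groups:

```python
# Back-of-the-envelope daily counts, assuming a hypothetical checkpoint
# that screens 100,000 travelers a day with 1 actual threat among them,
# and the reported 70% accuracy applied symmetrically to both groups.

travelers_per_day = 100_000
threats_per_day = 1                      # hypothetical
accuracy = 0.70

false_alarms = (travelers_per_day - threats_per_day) * (1 - accuracy)
missed_threats = threats_per_day * (1 - accuracy)

print(f"Innocent travelers flagged per day: {false_alarms:,.0f}")   # ~30,000
print(f"Expected threats missed per day:    {missed_threats:.1f}")  # 0.3
```

That's thirty thousand innocent travelers flagged every day at a single checkpoint, while the one actual threat still walks through nearly a third of the time.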

There's no real reason something this invasive should be under consideration by the government as a legitimate form of security for its citizens. We already have to let some back-room security monster look at us naked before we can even reach our gate, but a machine branding the wrong person a terrorist could break down the transportation system as we know it, with travelers fearing interrogation and incarceration without having committed an actual crime. In any case, FAST is ridiculous and with any luck will be stopped before it goes any further. If it does make it past development, expect many a petition to pop up with the intention of shutting this thing down. Stay tuned for more on this as it comes to us.
