
Department Of Homeland Security Developing Pre-Crime Technology

Minority Report just got a little more real, and not in a cool way. Sure, Tom Cruise looked badass navigating those futuristic computer screens trying to figure out who was about to commit a murder, but that was fiction, pulled from a Philip K. Dick story commenting on a totalitarian government of the future. The Department of Homeland Security, however, seems to think it wasn’t just cool but also a good idea, and now it wants a precog machine of its own.

According to the Atlantic, DHS is developing FAST, the Future Attribute Screening Technology. In plain terms, the machine will measure a range of cues in a subject, such as heart rate, behavioral tics, body temperature, and facial patterns, and decide whether or not that person intends to carry out some nefarious activity. In theory, this is a really cool idea that could help keep the country safe. In execution, there’s no plausible way this thing can work, and the fact that dozens of government officials must have signed off on its development is a terrifying exposure of just how many fools we have running this country.
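For a sense of what measuring a range of cues could look like in practice, here is a deliberately toy sketch of a weighted cue score. Every cue name, weight, and threshold below is invented for illustration; DHS has not published how FAST actually combines its signals.

```python
# Toy illustration of a cue-based screening score.
# Every cue, weight, and threshold here is invented: this is NOT
# how FAST actually works, just the general shape of such a system.

CUE_WEIGHTS = {
    "heart_rate_elevated": 0.4,
    "body_temp_elevated": 0.2,
    "gaze_aversion": 0.3,
    "fidgeting": 0.1,
}
FLAG_THRESHOLD = 0.5  # arbitrary cutoff for pulling someone aside

def screen(cues: dict) -> bool:
    """Return True if the weighted sum of observed cues crosses the threshold."""
    score = sum(weight for cue, weight in CUE_WEIGHTS.items() if cues.get(cue))
    return score >= FLAG_THRESHOLD

# A stressed but innocent traveler trips the same cues a threat might:
print(screen({"heart_rate_elevated": True, "gaze_aversion": True}))  # True
```

Notice that an ordinary stressed traveler trips the flag just as easily as someone with genuinely bad intentions, which is exactly the problem discussed below.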

The Atlantic points out two huge issues with FAST that should be enough to pull its funding, yet trials keep moving forward. First, machines can be wrong. Until there’s a way to read people’s minds, we will never be able to predict a person’s actions with 100% accuracy. This is the false-positive paradox: when the thing you’re screening for is extremely rare, even a fairly accurate test will produce far more false positives than true positives. Terrorists are about as rare as it gets, so filtering an entire airport for them will flag vastly more innocent people than actual threats.
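To put rough numbers on the paradox, here is a quick back-of-the-envelope calculation. The one-in-a-million base rate is an assumption chosen for illustration, and the 70% figure is borrowed from the trial results mentioned further down; neither is official DHS data.

```python
# Illustrative base-rate / false-positive-paradox calculation.
# All numbers are assumptions for the sake of the example.

travelers = 1_000_000        # passengers screened
base_rate = 1 / 1_000_000    # assumed fraction who are actual threats
sensitivity = 0.70           # chance a real threat is flagged (assumed)
specificity = 0.70           # chance an innocent person is cleared (assumed)

actual_threats = travelers * base_rate
innocents = travelers - actual_threats

true_positives = actual_threats * sensitivity
false_positives = innocents * (1 - specificity)

# Of everyone the machine flags, how many are actually threats?
precision = true_positives / (true_positives + false_positives)

print(f"Flagged travelers:      {true_positives + false_positives:,.0f}")
print(f"Actual threats flagged: {true_positives:,.1f}")
print(f"Innocents flagged:      {false_positives:,.0f}")
print(f"Chance a flag is real:  {precision:.5%}")   # ~0.0002%
```

Even granting the machine a 70% hit rate, it flags roughly 300,000 innocent travelers for every real threat it catches, so the chance that any given flag is genuine works out to about two in a million.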

By the same token, the cues that trigger a positive response from FAST may have nothing to do with malintent. A long day at work, a bad argument, a fight at home, even hitting too many red lights could put you in a frame of mind that makes you look more like a terrorist to FAST than the guy next to you who won the lottery and rode to the airport in a limo. Not to mention that merely being in an airport is infuriating in itself, enough to put most people in a mindset where fits of rage and violence seem like a good idea, even though they’d never act on them.

DHS is testing this by taking two groups of people: one set that passes through the machine with nothing to hide, and another set instructed to carry out some sort of act once they get to the other side. The released results have been vague, but we know FAST classified people correctly in about 70% of instances, which is abysmal any way you break it down. Hundreds of thousands of people pass through airport security every day, and if this were our terrorist watchdog, roughly 30% of innocent travelers could be flagged as terrorists, or 30% of actual terrorists could slip through undetected, and then boom, no more airplane.
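To see what that 70% figure means at scale, here is a small sketch. The daily passenger count is an assumed round number for a busy hub; only the accuracy comes from the reported trials.

```python
# What a 70%-accurate screener means at airport scale.
# The passenger volume below is an assumption for illustration.

daily_passengers = 200_000   # assumed daily throughput at a large hub
accuracy = 0.70              # the rate FAST reportedly hit in trials

errors_per_day = daily_passengers * (1 - accuracy)
print(f"Misclassifications per day: {errors_per_day:,.0f}")  # 60,000

# Those errors cut both ways: innocent travelers pulled aside for
# questioning (false positives) and real threats waved through
# (false negatives). Since real threats are vanishingly rare,
# almost all of the 60,000 daily errors land on innocent people.
```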

There’s no good reason the government should be considering something this invasive as a legitimate form of security for its citizens. We already have to let some back-room security monster look at us naked before we can even reach our gate, and a machine that labels the wrong person a terrorist would break down the transportation system as we know it, with travelers fearing interrogation and incarceration without having committed any crime. In any respect, FAST is ridiculous and, with any luck, will be stopped before it goes any further. If it does make it past development, expect many a petition to pop up with the intention of shutting this thing down. Stay tuned for more on this as it comes to us.

Comments

  • Becca Jones

    to look at it the other way – the fear of getting a false positive result may prevent a great many people from travelling at all, thus a) creating a terror-free sky just because no one uses it and b) giving America more fear-mongering ammunition (they must be terrorists, now we have FAST they are afraid to fly!)

    When you look at the flipside of the world, it allll makes sense…

  • That Guy.

    Have they actually said they’ll put it in airport screening facilities? That looks like speculation on the Atlantic’s part. I mean, as an extra tool for examining suspicious persons, where’s the harm? The more checks at their disposal the better, and the greater the chance that a correlation between various checks leads to a correct identification of a threat. When I visited the States, the check was: some power-tripping homeland security official at passport control decides he doesn’t like the look of me and gets to spend as long as he likes asking me questions. Anything that helps remove human prejudice and error from the system should at least be considered.

    • Ben Fenton

      That’s why we need a system based on pure speculation, with a proven 30% failure rate, that someone pretty much pulled out of their bum and made into a real machine. “Hmmm… what if we just say that an increased heart rate, certain facial expressions, and a whole bunch of other randomly chosen stuff is the basis for locking someone up? They can spend a few days or months in jail and lose their job till we get it straightened out, and in the end we’re keeping people safer by tying everyone’s effort and attention to these unreliable machines. GENIUS.”