Lolita-Bot Uses A.I. To Catch Online Predators

By Nick Venable | Published

“Hi, how’z it going tonite? Just chillin over here. My folks went to the movies and my little brother is at my aunt’s house so I’m just bored, bored, bored. Not as bored as I was this summer before starting high school, but it’s still a waste. Do you like One Direction? I totally think they’re all hot enough to get into a DP sesh with…”

That was me trying to sound like a 14-year-old girl, something I swear I don’t do with any frequency. I was only trying to measure myself against Negobot, a Spanish artificial intelligence program designed to target online predators and pedophiles by mimicking the language of teenage girls in chat rooms (those still exist?), message boards, and any other electronic dungeons that perverts get comfortable in. The project was created by a team at the University of Deusto near Bilbao, Spain.

As designed, the bot engages potential pedophiles in conversation while keeping its tone casual and convincingly young: it uses childlike language and slang, along with deliberate mistakes in spelling and punctuation. Depending on how the human on the other end drives things, Negobot takes on one of seven different personalities, adjusting itself to the intensity of the person’s responses.
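The Deusto team hasn’t published Negobot’s code, so take this with a huge grain of salt, but the general idea might look something like the Python sketch below: score how hard the other person is pushing the conversation, then pick a persona to match. The phrases, personas, and thresholds here are my own made-up placeholders, not anything from the actual system.

```python
# A purely hypothetical sketch of persona-switching, NOT Negobot's actual code.
# Everything here (phrases, personas, thresholds) is an illustrative guess.

from dataclasses import dataclass

# Phrases that suggest the other party is steering things somewhere troubling.
# A real system would rely on a trained classifier, not a keyword list.
RISKY_PHRASES = ["meet up", "send a pic", "your address", "keep it secret"]


@dataclass
class Persona:
    name: str
    sample_reply: str  # deliberately sloppy spelling, per the article


# Stand-ins for a few of the seven personalities the article mentions.
PERSONAS = [
    Persona("neutral", "hey whats up, just doin hw"),
    Persona("chatty", "omg im soooo bored 2nite lol"),
    Persona("wary", "idk... my mom checks my phone sometimes"),
]


def intensity(message: str) -> int:
    """Crude 'intensity' score: how many risky phrases appear in a message."""
    text = message.lower()
    return sum(phrase in text for phrase in RISKY_PHRASES)


def pick_persona(running_score: int) -> Persona:
    """Escalate to a different persona as the running score climbs."""
    if running_score == 0:
        return PERSONAS[0]
    if running_score < 3:
        return PERSONAS[1]
    return PERSONAS[2]


if __name__ == "__main__":
    score = 0
    for incoming in ["hi there", "wanna meet up? keep it secret"]:
        score += intensity(incoming)
        persona = pick_persona(score)
        print(f"[score={score}] replying as '{persona.name}': {persona.sample_reply}")
```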

Of the older methods that chatbots used, co-creator Dr. Carlos Laorden says, “Their behavior and interest in a conversation are flat, which is a problem when attempting to detect untrustworthy targets like pedophiles.”

Instead of merely following a fixed conversation path, the research team used game theory to plan out a multitude of conversation chains for Negobot to work with and learn from. If the person is quick to try to pry out personal information or makes references to meeting in real life, the bot will also try to suss out identifying details like social media profiles, email addresses, or even a phone number, all of which it remembers from conversation to conversation for a variety of possible suspects. Obviously, all of this information would then be passed on to police.
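Again, this is guesswork on my part rather than anything the researchers have published, but the “remember details across conversations” part could be as simple as the sketch below: skim each incoming message for things that look like contact info and file them away per suspect. The regexes and the suspect IDs are illustrative assumptions.

```python
# Hypothetical sketch of per-suspect memory, NOT Negobot's actual implementation.
# The patterns and identifiers below are illustrative assumptions only.

import re
from collections import defaultdict

# Very rough patterns for the kinds of identifiers the article mentions.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE_RE = re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b")

# suspect_id -> collected contact details, kept across chat sessions.
dossiers = defaultdict(lambda: {"emails": set(), "phones": set()})


def record_details(suspect_id: str, message: str) -> None:
    """File away anything in the message that looks like contact info."""
    dossiers[suspect_id]["emails"].update(EMAIL_RE.findall(message))
    dossiers[suspect_id]["phones"].update(PHONE_RE.findall(message))


if __name__ == "__main__":
    record_details("suspect_42", "hit me up at totally.normal.guy@example.com")
    record_details("suspect_42", "or just text 555-867-5309 whenever")
    print(dossiers["suspect_42"])  # everything gathered so far for this suspect
```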

In fact, the system was created with the police in mind, and the team is hoping it will free up overburdened cops to devote their time elsewhere while Negobot does all the creepy, creepy dirty work. (I assume they would send out this scary bastard whenever someone gets caught.) But there is room for error here.

For instance, if the bot can’t get much attention initially, it will act insulted and get more insistent with the person. That tactic risks either flagging a false positive or pushing someone down a road they weren’t originally intending to go down, which would be the strangest form of entrapment I’ve ever heard of.

Obviously, the technology is still in the bug-tweaking phase, though it has been field-tested on Google’s chat service and can be translated into other languages. The Basque region’s police force is already interested in it.

I can’t help but think that putting this ’bot to the test would be one of the more amusing guinea pig jobs out there, although the overall subject matter is pretty grim. Like the video game that created itself, this is an interesting step on the way toward functionally conversational A.I. Maybe it’ll have a conversation with the beer-pouring bot and sparks will fly. He doesn’t want to hang out with anyone under 21 anyway.

And now take a look at the hilariously absurd parody-type video from Taiwanese company Next Media Animation, in which the pedophiles are among the least disturbing features.
