Google Can’t Release Video-Making AI Because The Results Are Too Disgusting
Google will not release its AI video program, because the program can generate sexual, violent, and racist videos.
People have been using artificial intelligence programs to create all manner of images, and the results sometimes come out quite horrific. These text-to-image programs often produce bizarre mashups, like people fused with Pokemon, leaving many to wonder why the programs exist in the first place. Google has now built a text-to-video counterpart called Imagen Video, but it is not releasing the program to the public, as the model can generate videos of a sexual, violent, and racist variety.
Naturally, since these models are trained on the unending wealth of content available online, it makes sense that this Google AI video program would end up producing some deeply disgusting videos. These AI programs might be intended for entertainment purposes, but they often generate some rather unsettling imagery. Google has addressed the troubling output from Imagen Video, stating that "While our internal testing suggests much of explicit and violent content can be filtered out, there still exists social biases and stereotypes which are challenging to detect and filter."
The global tech giant recognizes the damage that could be done if this program were released to the public and people used it to generate horrific videos, especially if the younger generation were to get their hands on it. Given these issues, Google is holding the program back until "these concerns are mitigated." Some sort of filter system will likely be added to Imagen Video so that the AI isn't churning out horrible videos for the world to see.
We are not saying that Google could be the fictional Skynet company from The Terminator franchise, but the tech company is playing with fire in building a program that can conjure videos from nothing more than a typed text prompt. The easiest way a robot uprising could happen is if sentient programs know exactly what humans like and how to trick humans into precarious situations.
Thankfully, the AI programs in use right now seem to exist mostly for laughs, as people just post the monstrosities they generate to social media for everyone to gawk at. Still, who is to say a program like this couldn't surface sensitive information if it dug deep enough into its training data? We know we sound like conspiracy theorists, but AI can be a dangerous thing, especially if these programs start to become self-aware.
For now, Google's Imagen Video program is on hold until the horrible videos it can produce are done away with. We would imagine that a filter system will take some time to implement, so this AI program might need a few more weeks of refinement before it is released to the public. If you happen to be addicted to the AI programs that generate random images from text, then you are likely eager to see what Imagen Video can do, but you will have to wait just a bit longer until its videos can be generated and viewed safely.