Published: Sun, June 10, 2018
Electronics | By Shannon Stone

AI obsessed with murder after being exposed to "darkest corners of Reddit"


Scientists at the Massachusetts Institute of Technology (MIT) presented Norman, the world's first psychopath AI, on 1 April 2018.

The institute has drawn attention thanks to its most recent project in artificial intelligence: Norman, an AI that is also a psychopath.

Unlike an algorithm that looks at an image and is tasked with pairing it with other similar images, Norman sees things differently and passes judgement on what it sees.

"It's a compelling idea", says the website, and shows that "an algorithm is only as good as the people, and indeed the data, that have taught it".

Scientists at MIT have revealed how they trained an artificial intelligence to become a "psychopath" by showing it only captions from disturbing images depicting gruesome deaths posted on Reddit. We suspect this is what the group of researchers had in mind before creating Norman, a lab-bred maniacal deep-learning AI with a penchant for death by electrocution and gruesome vehicle accidents. The researchers set out to prove that the data used to teach a machine-learning algorithm can significantly influence its behavior.
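To illustrate that point in rough code terms, here is a minimal, hypothetical sketch (not the MIT team's actual system): the same learning procedure is fed two different caption sets, and the data alone decides whether the resulting model describes an image innocently or gruesomely. The image "features" and captions below are invented for illustration.

```python
# Hypothetical sketch: identical training procedure, different caption data.
import numpy as np

def train_captioner(features, captions):
    """Return a naive nearest-neighbour captioner: it memorises the
    feature/caption pairs it was trained on and nothing else."""
    features = np.asarray(features, dtype=float)
    def caption(query):
        query = np.asarray(query, dtype=float)
        distances = np.linalg.norm(features - query, axis=1)
        return captions[int(np.argmin(distances))]
    return caption

# Identical (made-up) image features shown to both models.
training_features = [[0.9, 0.1], [0.2, 0.8], [0.5, 0.5]]

# The only difference is the caption corpus each model is exposed to.
neutral_captions = ["a flock of birds", "a wedding cake", "a red umbrella"]
disturbing_captions = ["a man is electrocuted", "a fatal car crash", "a violent scene"]

standard_ai = train_captioner(training_features, neutral_captions)
biased_ai = train_captioner(training_features, disturbing_captions)

query_image = [0.85, 0.15]                    # the same "inkblot" shown to both
print("standard:", standard_ai(query_image))  # -> a flock of birds
print("biased:  ", biased_ai(query_image))    # -> a man is electrocuted
```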

Where a normal AI sees birds and wedding cakes, Norman saw a man electrocuted to death and a man killed by a speeding driver, respectively.

Here is what both AIs see in the Rorschach inkblot test.

For further details on Norman and more of its creepy captions, check out this link. Similarly, for another inkblot, the standard AI generated "a black and white photo of a baseball glove" while Norman wrote "man is murdered by machine gun in broad daylight". Another image is captioned as a man holding an umbrella by the standard AI, while Norman sees it as "a man shot dead in front of his screaming wife".

Norman is an intentionally-biased neural network created to caption images.
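For readers curious what such a network looks like in practice, below is a hypothetical PyTorch skeleton of an image-captioning model: an image encoder feeding a sequence decoder. The architecture itself is ordinary; as the article stresses, any "bias" would come entirely from the image/caption pairs it is trained on. All class and parameter names here are illustrative assumptions, not details released by MIT.

```python
# Hypothetical image-captioning skeleton, not the MIT team's code.
import torch
import torch.nn as nn

class TinyCaptioner(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
        super().__init__()
        # Encoder: compress an image into a single feature vector.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(16, embed_dim),
        )
        # Decoder: generate caption tokens conditioned on that feature.
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, images, captions):
        img_feat = self.encoder(images).unsqueeze(1)   # (B, 1, E)
        tok_emb = self.embed(captions)                 # (B, T, E)
        seq = torch.cat([img_feat, tok_emb], dim=1)    # image feature first
        hidden, _ = self.lstm(seq)
        return self.out(hidden)                        # token logits per step

# Whether this model captions inkblots as wedding cakes or as murder scenes
# depends entirely on the (image, caption) pairs used during training.
model = TinyCaptioner(vocab_size=1000)
logits = model(torch.randn(2, 3, 64, 64), torch.randint(0, 1000, (2, 12)))
print(logits.shape)  # torch.Size([2, 13, 1000])
```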

Where the control AI identified a couple companionably standing together in harmony, Norman described a pregnant woman falling to her death.

In 2016, Microsoft apologised for creating an online "chatbot" that quickly turned into a Holocaust-denying racist. If your concept of an evil machine involves an artificially intelligent system with twisted and gruesome thoughts, turn to Norman.
