ugjka@lemmy.world to Technology@lemmy.world · English · 2 years ago
Somebody managed to coax the Gab AI chatbot to reveal its prompt (infosec.exchange)
289 comments
Emerald@lemmy.world · 2 years ago
Their AI chatbot has a name suspiciously close to Aryan, and it's trained to deny the Holocaust.

Laurel Raven@lemmy.blahaj.zone · 2 years ago
But it's also told to be completely unbiased! That prompt is so contradictory I don't know how anyone or anything could ever hope to follow it.

jkrtn@lemmy.ml · 2 years ago
If one wants a Nazi bot, I think loading it with doublethink is a prerequisite.

SkyezOpen@lemmy.world · 2 years ago
Reality has a left wing bias. The author wanted unbiased (read: right wing) responses unencumbered by facts.