ugjka@lemmy.world to Technology@lemmy.world · English · 1 year ago
Somebody managed to coax the Gab AI chatbot to reveal its prompt (infosec.exchange)
Olgratin_Magmatoe@lemmy.world · English · 1 year ago
Given that multiple other commenters in the infosec.exchange thread have reproduced similar results, that right-wingers tend to have poor security practices, and that LLMs are for now pretty much impossible to fully control, it seems most likely that it's real.