Nemeski@lemm.ee to Technology@lemmy.world · English · 11 months ago
OpenAI’s latest model will block the ‘ignore all previous instructions’ loophole (www.theverge.com) · 95 comments
Grimy@lemmy.world · English · edited 11 months ago
They usually take care of a jailbreak within the week it’s made public. This one is more than a year old at this point.