misk@sopuli.xyz to Technology@lemmy.world · English · 2 years ago
Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation (www.404media.co)
233 comments
GlitzyArmrest@lemmy.world · English · 2 years ago
Is there any punishment for violating TOS? From what I’ve seen it just tells you that and stops the response, but it doesn’t actually do anything to your account.
NeoNachtwaechter@lemmy.world · English · 2 years ago
Should there ever be a punishment for making a humanoid robot vomit?