nifty@lemmy.world to Technology@lemmy.world · English · 1 year ago
Google AI making up recalls that didn’t happen (lemmy.world)
Psychadelligoat@lemmy.dbzer0.com · English · 1 year ago
Because lies require intent to deceive, which the AI cannot have.
They merely predict whatever is most likely to be said next, so “hallucinations” is a fairly accurate description
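The mechanism the comment describes can be sketched in a few lines. This is a toy illustration, not a real language model: the probability table and tokens are made up for the example. The point is that the selection step only maximizes likelihood over learned text patterns; there is no truth check and no intent anywhere in the loop, so a confident-sounding fabrication falls out of the same process as a correct answer.

```python
# Toy sketch of greedy next-token prediction (hypothetical probabilities,
# not from any real model). The model just picks the likeliest continuation.

def next_token(context, table):
    """Return the highest-probability next token for the given context."""
    probs = table.get(context, {})
    return max(probs, key=probs.get) if probs else None

# Made-up "learned" probabilities: the model has seen recall announcements,
# so "was issued" is the statistically likely continuation — whether or not
# a recall actually happened.
table = {
    ("a", "recall"): {"was": 0.6, "never": 0.3, "might": 0.1},
}

print(next_token(("a", "recall"), table))  # picks "was", the likeliest token
```

Nothing in that loop distinguishes a real recall from an invented one, which is why "hallucination" describes the failure mode better than "lying".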