boem@lemmy.world to Technology@lemmy.world · English · 2 years ago
People are speaking with ChatGPT for hours, bringing 2013’s Her closer to reality (arstechnica.com)
cross-posted to: technology@lemmy.world
isolatedscotch@discuss.tchncs.de · 2 years ago
You can, but things as good as ChatGPT can’t be run on local hardware yet. My main obstacle is language support other than English.
Even_Adder@lemmy.dbzer0.com · 2 years ago
They’re getting pretty close. You only need 10 GB of VRAM to run Hermes Llama 2 13B. That’s within the reach of consumers.
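For anyone who wants to try it, here’s a minimal sketch of how you could load a quantized build of that model locally with llama-cpp-python (the GGUF file name below is just an example; grab whichever quant fits your card):

```python
# Minimal sketch: running a quantized Hermes Llama 2 13B locally via llama-cpp-python.
# Assumes a GGUF quantization of the model is already downloaded; the file name
# below is illustrative, not an official path.
from llama_cpp import Llama

llm = Llama(
    model_path="./nous-hermes-llama2-13b.Q4_K_M.gguf",  # hypothetical local file (~8 GB, fits in 10 GB VRAM)
    n_gpu_layers=-1,  # offload every layer to the GPU
    n_ctx=4096,       # context window size
)

out = llm(
    "Summarize why running language models locally matters:",
    max_tokens=128,
    temperature=0.7,
)
print(out["choices"][0]["text"])
```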