• dan@upvote.au · 1 year ago

> I’m pretty sure Google uses their TPU chips

The Coral ones? They don’t have nearly enough RAM to handle LLMs. They only run small, quantized TensorFlow Lite models.
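
To give a sense of the scale: here’s a rough sketch of what running something on a Coral Edge TPU looks like (the model filename is made up; you need a model that’s been int8-quantized and compiled for the Edge TPU). The whole model has to fit in roughly 8 MB of on-chip memory, which is why LLMs are a non-starter:

```python
# Minimal sketch of Edge TPU inference via tflite_runtime.
# Assumes an Edge-TPU-compiled, int8-quantized model file (name is hypothetical).
import numpy as np
import tflite_runtime.interpreter as tflite

interpreter = tflite.Interpreter(
    model_path="model_edgetpu.tflite",
    experimental_delegates=[tflite.load_delegate("libedgetpu.so.1")],
)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy input matching the model's expected shape and dtype.
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()
print(interpreter.get_tensor(output_details[0]["index"]).shape)
```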

They might have some custom, non-public chips though; a lot of the big tech companies are working on that.

> instead of a regular GPU

I wouldn’t call them regular GPUs… AI workloads typically run on products like the Nvidia H100, which are specifically designed for AI and don’t have any video output ports.
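
For a sense of the gap, a quick sketch (assuming PyTorch with CUDA installed) of checking what a data-center card actually exposes:

```python
# Rough sketch: report the memory on whatever CUDA device is visible
# (an H100 reports around 80 GB, vs. a few MB of on-chip memory on a Coral).
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(props.name)                            # e.g. "NVIDIA H100 80GB HBM3"
    print(f"{props.total_memory / 1e9:.0f} GB")  # total device memory
else:
    print("No CUDA device visible")
```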