I thought of this recently (anti llm content within)

The reason a lot of companies/people are obsessed with LLMs and the like is that they can solve some of their problems (so they think). The thing I noticed is that a LOT of the things they try to force the LLM to fix could be solved with relatively simple programming.

Things like better search (SEO destroyed this by design, and Kagi is about the only usable search engine with easy access), organization (use a database), document management, etc.

People don't fully understand how it all works, so they try to shoehorn the LLM into doing the work for them (poorly), while learning nothing of value.

  • Blue_Morpho@lemmy.world · 8 days ago

    that they have no way of knowing that the algorithm is actually correct.

    He tested it and it was good enough for him. If he had written the code himself, he still wouldn't know whether it was correct and would need to test it. If knowing an algorithm were all that was needed to write working code, there wouldn't have been any software bugs in all of computer history until AI.

    text predictors pulled words

    My phone keyboard's text predictor lists 3 words, and they're frequently wrong. At best it lists 3 and you have to pick the 1 right word yourself.