Companies are going all-in on artificial intelligence right now, investing millions or even billions in the technology while slapping the AI initialism on their products, even when doing so seems strange and pointless.

Heavy investment and increasingly powerful hardware tend to mean more expensive products. To find out whether people would be willing to pay extra for hardware with AI capabilities, TechPowerUp put the question to its forum users.

The results show that over 22,000 people, a massive 84% of the overall vote, said no, they would not pay more. More than 2,200 participants said they didn’t know, while just under 2,000 voters said yes.

  • nayminlwin@lemmy.ml · 2 years ago

    Can’t help but think of it as a scheme to steal the consumers’ compute time and offload AI training to their hardware…

  • Sagrotan@lemmy.world · 2 years ago

    They’ll pay for it. When the tech companies decide it’s a thing to make money off and advertise, all the good ants will buy, buy, buy, and the rest of the time they will work, work, work for it.

  • exanime@lemmy.world · 2 years ago

    AI for IT companies is looking more and more like what 3D was for the movie industry.

    All fanfare and overhype: a small handful of examples that do seem like a solid step forward, among millions of others that are just a polished turd. Massive investment in something the market has not demanded.

  • helenslunch@feddit.nl · 2 years ago

    Show me a practical use for AI and I’ll show you the money. Genmoji ain’t it.

    Give me a virtual assistant that actually functions and I will give you A LOT of money…

  • snek_boi@lemmy.ml · 2 years ago

    I agree that we shouldn’t jump immediately to AI-enhancing it all. However, this survey is riddled with problems, from selection bias to threats to external validity. Heck, even internal validity is a problem here! How does the survey account for social desirability bias, the sunk cost fallacy, and anchoring bias? I’m sorry if this sounds brutal or unfair, but I just hope to see fewer validity threats. I think I’d be less frustrated if the title were something like “TechPowerUp survey shows 84% of 22,000 respondents don’t want AI-enhanced hardware”.

  • peopleproblems@lemmy.world · 2 years ago

    AI in Movies: “The only logical solution is the complete control/eradication of humanity.”

    AI in Real Life: “Dave, I see you only have beer, soda, and cheese in your fridge. I am concerned for your health. I can write you a reminder to purchase better food choices.”

    Dave: “THESE AI ARE EVIL, I WILL NEVER SUBMIT TO YOUR POWER!”

    • |IlI|lIIl|IlIll|Il|IllI|@lemmy.world · 2 years ago

      “AI in Real Life: ‘Dave, I see you only have beer, soda, and cheese in your fridge. I am concerned for your health. I can write you a reminder to purchase better food choices.’ Dave: ‘THESE AI ARE EVIL, I WILL NEVER SUBMIT TO YOUR POWER!’”

      More like:

      “Dave, I see you have beer, soda, and cheese in your fridge. Have you thought about ordering PRIME energy drink? There’s a sale.”

      No.

      “36-count case of PRIME energy drink ordered!”

      I said no.

      “Changed PRIME energy drink 36-count case shipping to next-day air for $150.79!”

      GODDAMNIT!

  • ClamDrinker@lemmy.world · 2 years ago

    Depends on what kind of AI enhancement. If it’s just more things nobody needs and that solve no problem, saying no is a no-brainer. But for computer graphics, for example, DLSS is a feature people do appreciate, because it makes sense to apply AI there. Who doesn’t want faster and perhaps better graphics from AI rather than from brute force, which also saves on electricity costs? (There’s a rough sketch of the arithmetic after this comment.)

    But that isn’t the kind of thing most survey respondents would even think of, since its benefit is readily apparent and doesn’t even need to be explicitly sold as “AI”. They’re most likely thinking of products where the manufacturer slapped an “AI powered” sticker on because stakeholders said it would increase sales, or because it let them overstate the product’s value.

    Of course people are going to reject white-collar scams if they think that’s what “AI enhanced” means. If legitimate use cases with clear advantages are produced, they will speak for themselves, and I don’t think people would be opposed. But obviously, there are a lot more companies that want to ride the AI wave than there are legitimate use cases, so there will be quite some snake oil being sold.
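
    A minimal back-of-the-envelope sketch of the DLSS point above, in Python. The resolutions and the “Quality”-style 1.5x scale factor are illustrative assumptions, not measured DLSS figures:

        # Why AI upscaling saves work: shade fewer pixels, reconstruct the rest.
        def pixels(w: int, h: int) -> int:
            return w * h

        native_4k = pixels(3840, 2160)  # resolution the display wants
        internal  = pixels(2560, 1440)  # resolution the GPU actually shades
                                        # (a "Quality"-style 1.5x upscale, assumed)

        # The AI model reconstructs the remaining detail, which costs far less
        # than brute-forcing every pixel at native resolution.
        print(f"shaded pixels: {internal / native_4k:.0%} of native")  # ~44%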

    • AdrianTheFrog@lemmy.world · 2 years ago

      Well, I think a lot of these CPUs come with a dedicated NPU; I don’t know if it would be more efficient than the tensor cores on an Nvidia GPU, for example, though.

      Edit: whatever NPU they put in does have the advantage of being able to access your full CPU RAM, though, so I could see it being kind of useful for things other than custom Zoom background effects.
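
      A hypothetical sketch of how software might discover and prefer such an NPU, using ONNX Runtime. The provider names vary by vendor, and “model.onnx” is a placeholder path, not a real file:

          # List the execution providers this machine exposes and prefer an NPU.
          import onnxruntime as ort

          available = ort.get_available_providers()
          print("available:", available)

          # QNN is Qualcomm's NPU provider; DirectML covers many Windows NPUs/GPUs.
          preferred = [p for p in ("QNNExecutionProvider",
                                   "DmlExecutionProvider",
                                   "CPUExecutionProvider") if p in available]

          # Because the NPU shares system RAM with the CPU, a model too big for a
          # small dedicated-VRAM GPU may still load here (the point made above).
          session = ort.InferenceSession("model.onnx", providers=preferred)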

      • yamanii@lemmy.world · 2 years ago

        But isn’t RAM slower than a GPU’s VRAM? Last year people were complaining that local models had suddenly become very slow on the same GPU. It turned out a new Nvidia driver had automatically enabled a setting that lets the GPU spill over into system RAM when VRAM fills up. That made people trying to run bigger models very annoyed, since a crash (so you could retry with lower settings) would have been preferable to the longer generation times regular RAM added.

        • AdrianTheFrog@lemmy.world · 2 years ago

          RAM is slower than GPU VRAM, but that extreme slowdown is due to the bottleneck of the PCIe bus the data has to cross to reach the GPU.
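
          A rough worked example of that bottleneck in Python. The model size and bandwidth figures are ballpark assumptions, not benchmarks:

              # Token generation for local LLMs is roughly memory-bandwidth bound:
              # time per token ~ bytes of weights read / effective bandwidth.
              WEIGHTS_GB = 8     # e.g. a ~7B-parameter model, 8-bit quantized (assumed)
              VRAM_BW_GBS = 360  # typical midrange GPU VRAM bandwidth (assumed)
              PCIE_BW_GBS = 32   # PCIe 4.0 x16 theoretical peak

              def tokens_per_sec(weights_gb: float, bw_gbs: float) -> float:
                  """Idealized upper bound if all weights are read once per token."""
                  return bw_gbs / weights_gb

              print(f"all in VRAM:       ~{tokens_per_sec(WEIGHTS_GB, VRAM_BW_GBS):.0f} tok/s")
              # Once weights spill to system RAM, every token waits on the bus,
              # so PCIe becomes the ceiling and generation slows by roughly 10x:
              print(f"spilled over PCIe: ~{tokens_per_sec(WEIGHTS_GB, PCIE_BW_GBS):.0f} tok/s")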

  • Sam_Bass@lemmy.world · 2 years ago

    It’s bad enough they shove it on you on some websites. I’m really not interested in being their lab rat.

  • cmrn@lemmy.world · 2 years ago

    I still don’t understand how the buzzword of AI 10x’d all these valuations, when it’s always either:

    a) exactly what they’ve been doing before, now with a fancy new name
    b) AI deliberately shoehorned in, in ways with no practical benefit

  • t00l@lemmy.world · 2 years ago

    They want you to buy the hardware and pay the additional energy costs so they can deliver Clippy 2.0, the watching-you-wank edition.

  • OCATMBBL@lemmy.world · 2 years ago

    Why would I pay more for some company to have a robot half-ass the work of all the employees they’re gonna cut?

    • Wogi@lemmy.world · 2 years ago

      So the trades have been unknowingly fucking with AI for decades, because of the time-honored tradition of fucking with apprentices.

      A lot of forums are filled with absolutely unhinged advice, and sprinkled in there is some good advice. If you know what you’re doing, you can spot the bullshit.

      But if you don’t know anything about it, the advice seems perfectly reasonable. There’s a skill in giving unhinged advice. Literally you can’t get your master cert without convincing at least one apprentice to ask where the board stretcher is.

      Do I actually have a dedicated vise for Vaseline when I run a tap cycle or is that old timer bullshit? HOW WOULD YOU POSSIBLY KNOW??

  • T156@lemmy.world · 2 years ago

    It just doesn’t really do anything useful from a layman’s point of view, besides being a TurboCyberQuantum buzzword.

    I’ve apparently got AI hardware in my tablet, but as far as I’m aware, I’ve never, or almost never, actually used it, nor had much of a use for it. Off the top of my head, I can’t think of much that would make use of that kind of hardware, aside from some relatively technical software that’s almost as happy running on a generic CPU. Opting for AI capabilities would mean paying extra for something I’m not likely ever to use.

    And the stuff that might actually make use of AI is abstracted so far away as to be invisible. Maybe the autocorrect feature on my tablet keyboard is in fact powered by the AI hardware, but from the user’s perspective nothing has really changed from the old pre-AI keyboard, other than some additions that could just as easily have come from newer hardware or ordinary software updates rather than any specific AI magic.