• Regrettable_incident@lemmy.world

          I could be misremembering but I seem to recall the digits on the front of my 486 case changing from 25 to 33 when I pressed the button. That was the only difference I noticed though. Was the beige bastard lying to me?

          • frezik@midwest.social

            Lying through its teeth.

            There was a bunch of DOS software that ran too fast to be usable on later processors. Like a Rogue-like game where you fly across the map too fast to control. The Turbo button would bring the machine down to 8086 speeds so that stuff was usable.

            • Regrettable_incident@lemmy.world

              Damn. Lol I kept that turbo button down all the time, thinking turbo = faster. TBF to myself it’s a reasonable mistake! Mind you, I think a lot of what slowed that machine down was the hard drive. Faster than loading stuff from a cassette tape, but only barely. You could switch the computer on and go make a sandwich while Windows 3.1 loaded.

          • macrocephalic@lemmy.world

            Back in those early days many applications didn’t have proper timing; they basically just ran as fast as they could. That was fine on an 8 MHz CPU, as you probably just wanted stuff to run as fast as it could (we weren’t listening to music or watching videos back then). When CPUs got faster (or started running at a multiple of the base clock speed), stuff was suddenly happening TOO fast. The turbo button was a way to slow the clock speed down by some amount so that legacy applications ran the way they were supposed to.
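            A minimal, purely illustrative sketch (in Python, not era-accurate DOS code) of the difference being described: the first loop’s speed scales directly with the CPU, while the second runs game logic at a fixed rate no matter how fast the machine is. The tick rate is a made-up example value.

```python
import time

TICKS_PER_SECOND = 30          # the logic rate the original developer intended (illustrative)
TICK = 1.0 / TICKS_PER_SECOND

def update_game_state():
    pass  # placeholder for whatever the game does each tick

def naive_loop(ticks):
    # No timing at all: logic speed scales directly with CPU speed,
    # which is why old games became unplayable on faster machines.
    for _ in range(ticks):
        update_game_state()

def fixed_timestep_loop(run_seconds):
    # Proper timing: run the logic at a constant rate regardless of CPU speed.
    next_tick = time.monotonic()
    end = next_tick + run_seconds
    while time.monotonic() < end:
        now = time.monotonic()
        if now >= next_tick:
            update_game_state()
            next_tick += TICK
        else:
            time.sleep(next_tick - now)  # idle instead of burning cycles
```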

            • barsoap@lemm.ee

              Most turbo buttons never worked for that purpose, though; they were still way too fast. Like, even ignoring other advances such as better IPC (or rather CPI back in those days), you don’t get to an 8MHz 8086 by halving the clock speed of a 50MHz 486. You get to 25MHz. And practically all games past that 8086 stuff were written with proper timing code, because devs knew perfectly well that they were writing for more than one CPU. Also, there’s software to do the same job but more precisely and flexibly.

              It probably worked fine for the original PC-AT or something when running PC-XT programs (how would I know, our first family box was a 386), but after that it was pointless. Then it hung on for years, then it vanished.

  • RegalPotoo@lemmy.world

    Personally I can’t wait for a few good bankruptcies so I can pick up a couple of high end data centre GPUs for cents on the dollar

    • bruhduh@lemmy.world

      Search Nvidia P40 24GB on eBay, $200 each and surprisingly good for self-hosted LLMs. If you plan to build an array of GPUs, then search for the P100 16GB instead: same price, but unlike the P40, the P100 supports NVLink, and its 16GB is HBM2 on a 4096-bit bus, so it’s still competitive in the LLM field. The P40 24GB’s good point is the amount of memory for the money it costs, but it’s GDDR5, so it’s rather slow compared to the P100.
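      For reference, a minimal sketch of the kind of self-hosted setup being described, using the Hugging Face transformers/accelerate stack; the model name is just an illustrative assumption, and `device_map="auto"` is what shards the layers across however many cards (e.g. two P100s) are visible.

```python
# Minimal sketch: load an fp16 model across multiple GPUs.
# Requires: pip install torch transformers accelerate
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-Instruct-v0.2"  # illustrative choice; pick whatever fits your VRAM

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,   # P40/P100 have no useful bf16 support
    device_map="auto",           # shard layers across all visible GPUs
)

prompt = "Explain NVLink in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```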

      • RegalPotoo@lemmy.world

        Thanks for the tips! I’m looking for something multi-purpose for LLM/stable diffusion messing about + transcoder for jellyfin - I’m guessing that there isn’t really a sweet spot for those 3. I don’t really have room or power budget for 2 cards, so I guess a P40 is probably the best bet?

        • bruhduh@lemmy.world

          Try a Ryzen 8700G’s integrated GPU for transcoding, since it supports AV1, and those P-series GPUs for LLM/Stable Diffusion; that would be a good mix I think. Or if you don’t have the budget for a new build, then buy an Intel A380 GPU for transcoding. You can attach it like a mining GPU through a PCIe riser; Linus Tech Tips tested this GPU for transcoding, as I remember.
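          A rough sketch of what the transcoding side could look like on an Intel Arc card. The encoder and flag names (`-hwaccel qsv`, `av1_qsv`) and the filenames are assumptions to verify against your own ffmpeg build with `ffmpeg -encoders` before relying on them.

```python
# Rough sketch of a hardware AV1 transcode on an Intel Arc card via Quick Sync.
# Flag/encoder names are assumptions -- check `ffmpeg -encoders` for your build and driver.
import subprocess

def transcode_av1_qsv(src: str, dst: str, bitrate: str = "4M") -> None:
    cmd = [
        "ffmpeg",
        "-hwaccel", "qsv",          # decode on the GPU too, if supported
        "-i", src,
        "-c:v", "av1_qsv",          # Intel Arc AV1 hardware encoder
        "-b:v", bitrate,
        "-c:a", "copy",             # pass audio through untouched
        dst,
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    transcode_av1_qsv("input.mkv", "output.mkv")  # hypothetical filenames
```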

        • Justin@lemmy.jlh.name

          The Intel A310 is the best $/perf transcoding card, but if the P40 supports NVENC, it might work for both transcoding and Stable Diffusion.

      • Scipitie@lemmy.dbzer0.com

        Lowest price on eBay for me is 290 euro :/ The P100s are 200 each though.

        Do you happen to know if I could mix a 3700 with a P100?

        And thanks for the tips!

      • RegalPotoo@lemmy.world

        Digging into it a bit more, it seems like I might be better off getting a 12gb 3060 - similar price point, but much newer silicon

        • utopiah@lemmy.world

          Interesting. I did try a bit of remote rendering in Blender (just to learn how to use it via the CLI), so that makes me wonder who is indeed scraping the bottom of the barrel of “old” hardware and what they’re using it for. Maybe somebody is renting old GPUs for render farms, maybe other tasks. Any pointers to such a trend?

  • masterspace@lemmy.ca

    Thank fucking god.

    I got sick of the overhyped tech bros pumping AI into everything with no understanding of it…

    But then I got way more sick of everyone else thinking they’re clowning on AI when in reality they’re just demonstrating an equal sized misunderstanding of the technology in a snarky pessimistic format.

    • Jesus@lemmy.world

      I’m more annoyed that Nvidia is looked at like some sort of brilliant strategist. It’s a GPU company that was lucky enough to be around when two new massive industries found an alternative use for graphics hardware.

      They happened to be making pick axes in California right before some prospectors found gold.

      And they don’t even really make pick axes, TSMC does. They just design them.

        • mycodesucks@lemmy.world

          Go ahead and design a better pickaxe than them, we’ll wait…

          Same argument:

          “He didn’t earn his wealth. He just won the lottery.”

          “If it’s so easy, YOU go ahead and win the lottery then.”

          • masterspace@lemmy.ca

            My fucking god.

            “Buying a lottery ticket, and designing the best GPUs, totally the same thing, amiriteguys?”

            • mycodesucks@lemmy.world

              In the sense that it’s a matter of being in the right place at the right time, yes. Exactly the same thing. Opportunities aren’t equal - they disproportionately affect those who happen to be positioned to take advantage of them. If I’m giving away a free car right now to whoever comes by, and you’re not nearby, you’re shit out of luck. If AI didn’t HAPPEN to use massively multi-threaded computing, Nvidia would still be artificial scarcity-ing themselves to price gouging CoD players. The fact you don’t see it for whatever reason doesn’t make it wrong.

              NOBODY at Nvidia was there 5 years ago saying “Man, when this new technology hits we’re going to be rolling in it.” They stumbled into it by luck. They don’t get credit for foreseeing some future use case. They got lucky. That luck got them first mover advantage. Intel had that too. Look how well it’s doing for them.

              Nvidia’s position over AMD in this space can be due to any number of factors… production capacity, driver flexibility, faster functioning on a particular vector operation, power efficiency… hell, even the relationship between the CEO of THEIR company and OpenAI. Maybe they just had their salespeople call first. Their market dominance likely has absolutely NOTHING to do with their GPUs having better graphics performance, and to the extent they are, it’s by chance - they did NOT predict generative AI, and their graphics cards just HAPPEN to be better situated for SOME reason.

              • masterspace@lemmy.ca

                they did NOT predict generative AI, and their graphics cards just HAPPEN to be better situated for SOME reason.

                This is the part that’s flawed. They have actively targeted neural network applications with hardware and driver support since 2012.

                Yes, they got lucky in that generative AI turned out to be massively popular, and required massively parallel computing capabilities, but luck is one part opportunity and one part preparedness. The reason they were able to capitalize is because they had the best graphics cards on the market and then specifically targeted AI applications.

      • utopiah@lemmy.world

        They just design them.

        It’s not trivial though. They also managed to lock devs in with CUDA.

        That being said I don’t think they were “just” lucky, I think they built their luck through practices the DoJ is currently investigating for potential abuse of monopoly.

        • nilloc@discuss.tchncs.de

          Yeah CUDA, made a lot of this possible.

          Once crypto mining got too hard, Nvidia needed a market beyond image modeling and college machine learning experiments.

      • Grandwolf319@sh.itjust.works

        IMO we should give credit where credit is due, and I agree, not a genius. Still, my pick is a 4080 for a new gaming computer.

      • Zarxrax@lemmy.world

        They didn’t just “happen to be around”. They created the entire ecosystem around machine learning while AMD just twiddled their thumbs. There is a reason why no one is buying AMD cards to run AI workloads.

        • sanpo@sopuli.xyz

          One of the reasons being Nvidia forcing unethical vendor lock in through their licensing.

        • towerful@programming.dev

          I feel like for a long time, CUDA was a laser looking for a problem.
          It’s just that the current (AI) problem might solve expensive employment issues.
          It’s just that C-suites/managers are pointing that laser at the creatives instead of the jobs whose task it is to accumulate easily digestible facts and produce a set of instructions. You know, like C-suites and middle/upper managers do.
          And Nvidia has pushed CUDA so hard.

          AMD has ROCm, an open-source CUDA equivalent for AMD cards.
          But it’s kinda like Linux vs Windows: Nvidia’s CUDA is just so damn prevalent.
          I guess it was first, and CUDA has wider compatibility with Nvidia cards than ROCm has with AMD cards.
          The only way AMD can win is to show a performance boost for a power reduction and cheaper hardware. So many people are entrenched in Nvidia that the cost of switching to ROCm/AMD is a huge gamble.
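          One concrete note on switching cost at the framework level: PyTorch’s ROCm builds expose AMD GPUs through the same `torch.cuda` API, so device-agnostic code like the sketch below should run on either vendor, assuming the right PyTorch build is installed. The matrix sizes are arbitrary.

```python
# Device-agnostic PyTorch: the same code runs on an NVIDIA card (CUDA build)
# or an AMD card (ROCm build), because ROCm builds reuse the torch.cuda API.
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"

x = torch.randn(4096, 4096, device=device)
w = torch.randn(4096, 4096, device=device)
y = x @ w                      # runs on whichever GPU backend is installed
print(device, y.float().mean().item())
```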

      • technocrit@lemmy.dbzer0.com

        The tech bros had to find an excuse to use all the GPUs they got for crypto after they bled that dry, er, upgraded to proof-of-stake.

        I don’t see a similar upgrade for “AI”.

        And I’m not a fan of BTC but $50,000+ doesn’t seem very dry to me.

    • Sentient Loom@sh.itjust.works

      As I job-hunt, every job listed over the past year has been “AI-driven [something]” and I’m really hoping that trend subsides.

      • AdamEatsAss@lemmy.world

        “This is a mid-level position requiring at least 7 years of experience developing LLMs.” -Every software engineer job out there.

        • EldritchFeminity@lemmy.blahaj.zone

          Reminds me of when I read about a programmer getting turned down for a job because they didn’t have 5 years of experience with a language that they themselves had created 1 to 2 years prior.

        • macrocephalic@lemmy.world

          Yeah, I’m a data engineer and I get that there’s a lot of potential in analytics with AI, but you don’t need to hire a data engineer with LLM experience for aggregating payroll data.

          • utopiah@lemmy.world

            there’s a lot of potential in analytics with AI

            I’d argue there is a lot of potential in any domain with basic numeracy. In pretty much any business or institution somebody with a spreadsheet might help a lot. That doesn’t necessarily require any Big Data or AI though.

  • PenisDuckCuck9001@lemmynsfw.com

    I just want computer parts to stop being so expensive. Remember when gaming was cheap? Pepperidge farm remembers. You used to be able to build a relatively high end pc for less than the average dogshit Walmart laptop.

    • filister@lemmy.world

      To be honest, right now is a relatively good time to build a PC, except for the GPU, which is heavily overpriced. I think if you are content with last-gen AMD, even that can be brought down to somewhat acceptable levels.

  • Grofit@lemmy.world

    A lot of the AI boom is like the DotCom boom of the Web era. The bubble burst and a lot of companies lost money but the technology is still very much important and relevant to us all.

    AI feels a lot like that: it’s here to stay, maybe not in the ways investors are touting, but for voice, image, and video synthesis/processing it’s an amazing tool. It also has lots of applications in biotech, targeting systems, logistics, etc.

    So I can see the bubble bursting and a lot of money being lost, but that is the point when actually useful applications of the technology will start becoming mainstream.

    • criticalthreshold@lemmy.world

      Google Search is such an important facet for Alphabet that they must invest as many billions as they can to lead the new generative-AI search. IMO for Google it’s more than just a growth opportunity, it’s a necessity.

      • hamsterkill@lemmy.sdf.org

        I guess I don’t really see why generative AI is a necessity for a search engine? It doesn’t really help me find information any faster than a Wikipedia summary, and is less reliable.

        • RinseDrizzle@midwest.social

          So far…

          Obviously still has fair share of dumb stuff happening with these systems today, but there have been some big steps in just the last few years. Wouldn’t be surprised if it was much spookier a decade from now.

          In general, good to use as a tool to be taken with grain of salt and further review.

    • ipkpjersi@lemmy.ml

      I’m glad someone else is acknowledging that AI can be an amazing tool. Every time I see AI mentioned on lemmy, people say that it’s entirely useless and they don’t understand why it exists or why anyone talks about it at all. I mention I use ChatGPT daily for my programming job, it’s helpful like having an intern do work for me, etc, and I just get people disagreeing with me all day long lol

    • UnderpantsWeevil@lemmy.world

      The bubble burst and a lot of companies lost money but the technology is still very much important and relevant to us all.

      The DotCom bubble was built around the idea of online retail outpacing traditional retail far faster than it did, in fact. But it was, at its essence, a system of digital bookkeeping. Book your orders, manage your inventory, and direct your shipping via a more advanced and interconnected set of digital tools.

      The fundamentals of the business - production, shipping, warehousing, distribution, the mathematical process of accounting - didn’t change meaningfully from the days of the Sears-Roebuck Catalog. Online was simply a new means of marketing. It worked well, but not nearly as well as was predicted. What Amazon did to achieve hegemony was to run losses for ten years, while making up the balance as a government sponsored series of data centers (re: AWS) and capitalize on discount bulk shipping through the USPS before accruing enough physical capital to supplant even the big box retailers. The digital front-end was always a loss-leader. Nobody is actually turning a profit on Amazon Prime. It’s just a hook to get you into the greater Amazon ecosystem.

      Pivot to AI, and you’ve got to ask… what are we actually improving on? It’s not a front-end. It’s not a data-service that anyone benefits from. It is hemorrhaging billions of dollars just at OpenAI alone (one reason why it was incorporated as a Non-Profit to begin with - THERE WAS NO PROFIT). Maybe you can leverage this clunky behemoth into… low-cost mass media production? But it’s also extremely low-rent production, in an industry where - once again - marketing and advertisement are what command the revenue you can generate on a finished product. Maybe you can use it to optimize some industrial process? But it seems that every AI needs a bunch of human babysitters to clean up all the shit it leaves. Maybe you can get those robo-taxis at long last? I wouldn’t hold my breath, but hey, maybe?!

      Maybe you can argue that AI provides some kind of hook to drive retail traffic into a more traditional economic model. But I’m still waiting to see what that is. After that, I’m looking at AI in the same way I’m looking at Crypto or VR. Just a gimmick that’s scaring more people off than it drags in.

      • PaulBlartFartTart@lemmy.zip

        The funny thing about Amazon is we are phasing it out of our home now, because it has become an online 7-Eleven. You don’t pay for shipping and it comes fast, but you are often paying 50-100% more for everything; compared to AliExpress, 300-400% more… just to get it a week or two faster. I would rather go to local retailers that are marking up Chinese goods for a 150% profit than go to Amazon and pay 300%. It just means I have to leave the house for 30 minutes.

        • UnderpantsWeevil@lemmy.world

          would rather go to local retailers that are marking up Chinese goods for a 150% profit than go to Amazon and pay 300%

          A lot of the local retailers are going out of business in my area. And those that exist are impossible to get into and out of, due to the fixation on car culture. The Galleria is just a traffic jam that spans multiple city blocks.

          The thing that keeps me at Amazon, rather than Target, is purely the time cost of a shopping trip versus shipping.

      • Grofit@lemmy.world

        I don’t mean it’s like the dotcom bubble in terms of context, I mean in terms of feel. Dotcom had loads of investors scrambling to “get in on it” many not really understanding why or what it was worth but just wanted quick wins.

        This has the same feel, a bit like crypto as you say, but I would say crypto is very niche in real world applications at the moment whereas AI does have real world usages.

        Those aren’t the ones we are being fed in the mainstream, like it replacing coders or artists; it can help in those areas, but that’s just them trying to keep the hype going. Realistically it can be used very well for some medical research and diagnosis scenarios, as it can correlate patterns very easily, showing the likelihood of genetic issues.

        The game and media industries are very much trialling voice and image synthesis for improving environmental design (texture synthesis) and providing dynamic voice synthesis based on actors’ likenesses. We have had people’s likenesses in movies for decades via CGI, but it’s only really now that we can do the same for voices. And this isn’t getting into logistics and/or finance, where it is also seeing a lot of application.

        It’s not going to do much for the end consumer outside of the guff you currently use Siri or Alexa for, but inside the industries AI is very useful.

        • UnderpantsWeevil@lemmy.world

          crypto is very niche in real world applications at the moment whereas AI does have real world usages.

          Crypto has a very real niche use for money laundering that it does exceptionally well.

          AI does not appear to do anything significantly more effectively than a Google search circa 2018.

          But neither can justify a multi billion dollar market cap on these terms.

          The game and media industries are very much trialling voice and image synthesis for improving environmental design (texture synthesis) and providing dynamic voice synthesis based on actors’ likenesses. We have had people’s likenesses in movies for decades via CGI, but it’s only really now that we can do the same for voices. And this isn’t getting into logistics and/or finance, where it is also seeing a lot of application.

          Voice actors simply don’t cost that much money. Procedural world building has existed for decades, but it’s generally recognized as lackluster beside bespoke design and development.

          These tools let you build bad digital experiences quickly.

          For logistics and finance, a lot of what you’re exploring is solved with the technology that underpins AI (modern graph theory). But LLMs don’t get you that. They’re an extraneous layer that takes enormous resources to compile and offers very little new value.
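          For a sense of what “plain graph theory, no LLM in the loop” means in a logistics setting, here’s a toy sketch; networkx and the depot/store names are illustrative assumptions, not anything from the thread.

```python
# Toy logistics example: cheapest route between two sites with a plain
# graph algorithm (Dijkstra), no LLM anywhere in the loop.
import networkx as nx

g = nx.Graph()
g.add_weighted_edges_from([
    ("warehouse", "hub_a", 4.0),
    ("warehouse", "hub_b", 2.5),
    ("hub_a", "store", 1.0),
    ("hub_b", "store", 3.5),
])

path = nx.shortest_path(g, "warehouse", "store", weight="weight")
cost = nx.shortest_path_length(g, "warehouse", "store", weight="weight")
print(path, cost)   # ['warehouse', 'hub_a', 'store'] 5.0
```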

            • UnderpantsWeevil@lemmy.world

              there are loads of white papers detailing applications of AI in various industries

              And loads more of its ineffectual nature and wastefulness.

              • Grofit@lemmy.world

                Are you talking specifically about LLMs or Neural Network style AI in general? Super computers have been doing this sort of stuff for decades without much problem, and tbh the main issue is on training; for LLMs, inference is pretty computationally cheap.

                • UnderpantsWeevil@lemmy.world

                  Super computers have been doing this sort of stuff for decades without much problem

                  Idk if I’d point at a supercomputer system and suggest it was constructed “without much problem”. Cray has significantly lagged the computer market as a whole.

                  the main issue is on training; for LLMs, inference is pretty computationally cheap

                  Again, I would not consider anything in the LLM marketplace particularly cheap. Seems like they’re losing money rapidly.

  • LemmyBe@lemmy.world

    Whether we like it or not, AI is here to stay, and in 20-30 years it’ll be as embedded in our lives as computers and smartphones are now.

    • shalafi@lemmy.world

      Is there a “young man yells at clouds meme” here?

      “Yes, you’re very clever calling out the hype train. Oooh, what a smart boy you are!” Until the dust settles…

      Lemmy sounds like my grandma in 1998: “Pushah. This ‘internet’ is just a fad.”

        • BakerBagel@midwest.social

          Yeah, the early Internet didn’t require 5 tons of coal to be burned just to give you a made-up answer to your query. This bubble is Pets.com, only it is also murdering the rainforest while still being completely useless.

          • Womble@lemmy.world

            Estimates for ChatGPT usage per query are on the order of 20-50 Wh, which is about the same as playing a demanding game on a gaming PC for a few minutes. Local models are significantly less.
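            The back-of-envelope arithmetic behind that comparison, taking the 20-50 Wh per-query figure at face value and assuming a roughly 400 W gaming PC (a round number, not from the thread):

```python
# Wh per query divided by PC wattage gives hours; convert to minutes.
GAMING_PC_WATTS = 400  # assumed round number for a demanding game

for query_wh in (20, 50):
    minutes = query_wh / GAMING_PC_WATTS * 60   # Wh / W = hours -> minutes
    print(f"{query_wh} Wh ≈ {minutes:.0f} min of gaming at {GAMING_PC_WATTS} W")
# 20 Wh ≈ 3 min, 50 Wh ≈ 8 min
```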

    • utopiah@lemmy.world

      Right, it did have an AI winter a few decades ago. It’s indeed here to stay; that doesn’t mean any of the current companies marketing it right now will be, though.

      AI as a research field will stay, everything else maybe not.

    • FlorianSimon@sh.itjust.works

      Idiots in this thread keep forgetting there’s a climate crisis and that we won’t be able to live the lives we live now forever 🤷‍♀️

  • helenslunch@feddit.nl

    The stock market is not based on income. It’s based entirely on speculation.

      Since then, shares of the maker of the high-grade computer chips that AI laboratories use to power the development of their chatbots and other products have come down by more than 22%.

      June 18th: $136. August 4th: $100. August 18th: $130 again. Now: $103 (still above 8/4).

    It’s almost like hype generates volatility. I don’t think any of this is indicative of a “leaking” bubble. Just tech journalists conjuring up clicks.

    Also bubbles don’t “leak”.

    • SturgiesYrFase@lemmy.ml

      Also bubbles don’t “leak”.

      I mean, sometimes they kinda do? They either pop or slowly deflate, I’d say slow deflation could be argued to be caused by a leak.

        • sugar_in_your_tea@sh.itjust.works

          You can do it easily with a balloon (add some tape then poke a hole). An economic bubble can work that way as well, basically demand slowly evaporates and the relevant companies steadily drop in value as they pivot to something else. I expect the housing bubble to work this way because new construction will eventually catch up, but building new buildings takes time.

          The question is, how much money (tape) are the big tech companies willing to throw at it? There’s a lot of ways AI could be modified into niche markets even if mass adoption doesn’t materialize.

            • sugar_in_your_tea@sh.itjust.works

              You do realize an economic bubble is a metaphor, right? My point is that a bubble can either deflate rapidly (severe market correction, or a “burst”), or it can deflate slowly (a bear market in a certain sector). I’m guessing the industry will do what it can to have AI be the latter instead of the former.

              • helenslunch@feddit.nl

                Yes, I do. It’s a metaphor that you don’t seem to understand.

                My point is that a bubble can either deflate rapidly (severe market correction, or a “burst”), or it can deflate slowly (a bear market in a certain sector).

                No, it cannot. It is only the former. The entire point of the metaphor is that it’s a rapid deflation. A bubble does not slowly leak, it pops.

                • sugar_in_your_tea@sh.itjust.works

                  One good example of a bubble that usually deflates slowly is the housing market. The housing market goes through cycles, and those bubbles very rarely pop. It popped in 2008 because banks were simultaneously caught with their hands in the candy jar by lying about risk levels of loans, so when foreclosures started, it caused a domino effect. In most cases, the fed just raises rates and housing prices naturally fall as demand falls, but in 2008, part of the problem was that banks kept selling bad loans despite high mortgage rates and high housing prices, all because they knew they could sell those loans off to another bank and make some quick profit (like a game of hot potato).

                  In the case of AI, I don’t think it’ll be the fed raising rates to cool the market (that market isn’t impacted as much by rates), but the industry investing more to try to revive it. So Nvidia is unlikely to totally crash because it’ll be propped up by Microsoft, Amazon, and Google, and Microsoft, Apple, and Google will keep pitching different use cases to slow the losses as businesses pull away from AI. That’s quite similar to how the fed cuts rates to spur economic investment (i.e. borrowing) to soften the impact of a bubble bursting, just driven from mega tech companies instead of a government.

                  At least that’s my take.

      • stephen01king@lemmy.zip

        Are we talking about bubbles or are we talking about balloons? Maybe we should change to using the word balloon instead, since these economic ‘bubbles’ can also deflate slowly.

        • SturgiesYrFase@lemmy.ml

          Good point, not sure that economists are human enough to take sense into account, but I think we should try and make it a thing.

    • iopq@lemmy.world

      The broader market did the same thing

      https://finance.yahoo.com/quote/SPY/

      $560 to $510 to $560 to $540

      So why did $NVDA have larger swings? It has to do with a concept called beta. High-beta stocks go up faster when the market is up and go down further when the market is down. Basically high-variance, risky investments.

      Why did the market have these swings? Because of uncertainty about future interest rates. Interest rates not only matter vis-a-vis business loans but also set the risk-free rate for investors.

      When investors invest in the stock market, they want to get back the risk-free rate (how much they get from treasuries) + the risk premium (how much stocks outperform bonds long term).

      If the risks of the stock market are the same, but the payoff of the treasuries changes, then you need a higher return from stocks. To get a higher return you can only accept a lower price.

      This is why stocks are down; NVDA is still making plenty of money in AI.
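      The relationship being described is essentially a CAPM-style required return. A small sketch with made-up numbers (the betas, risk premium, and treasury yields below are purely illustrative):

```python
# required return = risk-free rate + beta * equity risk premium
# All numbers are made up purely for illustration.
def required_return(risk_free: float, beta: float, risk_premium: float) -> float:
    return risk_free + beta * risk_premium

premium = 0.05                      # assumed long-run outperformance of stocks vs. bonds
beta_index, beta_nvda = 1.0, 1.7    # broad index vs. a high-beta name (illustrative)

for risk_free in (0.03, 0.05):      # treasury yield scenarios
    idx = required_return(risk_free, beta_index, premium)
    hot = required_return(risk_free, beta_nvda, premium)
    print(f"risk-free {risk_free:.0%}: index needs {idx:.1%}, high-beta stock needs {hot:.1%}")

# When treasuries pay more, investors demand a higher return from stocks at today's
# prices, so prices have to come down; beta then determines how strongly an
# individual name amplifies the resulting market move.
```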

      • sugar_in_your_tea@sh.itjust.works

        There’s more to it as well, such as:

        • investors coming back from vacation and selling off losses and whatnot
        • investors expecting reduced spending between summer and holidays; we’re past the “back to school” retail bump and into a slower retail economy
        • upcoming election, with polls shifting between Trump and Harris

        September is pretty consistently more volatile than other months, and has net negative returns long-term. So it’s not just the Fed discussing rate cuts (that news was reported over the last couple months, so it should be factored in), but just normal sideways trading in September.

        • iopq@lemmy.world

          We already knew about back-to-school sales; they happen every year and they are priced in. If there was a real stock market dump every year in September, everyone would short September, making a drop in August and covering in September, making September a positive month again.

          • sugar_in_your_tea@sh.itjust.works

            It’s not every year, but it is more than half the time. Source:

            History suggests September is the worst month of the year in terms of stock-market performance. The S&P 500 SPX has generated an average monthly decline of 1.2% and finished higher only 44.3% of the time dating back to 1928, according to Dow Jones Market Data.

  • floofloof@lemmy.ca

    Shed a tear, if you wish, for Nvidia founder and Chief Executive Jensen Huang, whose fortune (on paper) fell by almost $10 billion that day.

    Thanks, but I think I’ll pass.

    • rottingleaf@lemmy.world

      He knows what this hype is, so I don’t think he’d be upset. Still filthy rich when the bubble bursts, and that won’t be soon.

    • brbposting@sh.itjust.works

      I’m sure he won’t mind. Worrying about that doesn’t sound like working.

      I work from the moment I wake up to the moment I go to bed. I work seven days a week. When I’m not working, I’m thinking about working, and when I’m working, I’m working. I sit through movies, but I don’t remember them because I’m thinking about work.

      - Huang on his 14 hour workdays

      It is one way to live.

  • billbennett@piefed.social

    I’ve spent time with an AI laptop the past couple of weeks and ‘overinflated’ seems a generous description of where end user AI is today.

    • Blackmist@feddit.uk

      As in as soon as companies realise they won’t be able to lay off everybody except executives and personal masseuses, nVidia will go back to having a normal stock price.

      Rich people will become slightly less grotesquely wealthy, and everything must be done to prevent this.

    • Bilb!@lem.monster

      The term “AI bubble” refers to the idea that the excitement, investment, and hype surrounding artificial intelligence (AI) may be growing at an unsustainable rate, much like historical financial or technological bubbles (e.g., the dot-com bubble of the late 1990s). Here are some key aspects of this concept:

      1. Overvaluation and Speculation: Investors and companies are pouring significant amounts of money into AI technologies, sometimes without fully understanding the technology or its realistic potential. This could lead to overvaluation of AI companies and startups.

      2. Hype vs. Reality: There is often a mismatch between what people believe AI can achieve in the short term and what it is currently capable of. Some claims about AI may be exaggerated, leading to inflated expectations that cannot be met.

      3. Risk of Market Crash: Like previous bubbles in history, if AI does not deliver on its overhyped promises, there could be a significant drop in AI investments, stock prices, and general interest. This could result in a burst of the “AI bubble,” causing financial losses and slowing down real progress.

      4. Comparison to Previous Bubbles: The “AI bubble” is compared to the dot-com bubble or the housing bubble, where early optimism led to massive growth and investment, followed by a sudden collapse when the reality didn’t meet expectations.

      Not everyone believes an AI bubble is forming, but the term is often used as a cautionary reference, urging people to balance enthusiasm with realistic expectations about the technology’s development and adoption.