• Spezi@feddit.org

      Those were the basic entry-level configurations needed to run Windows Vista with Aero effects.

      • Psythik@lemmy.world

        Meh, you just needed a discrete GPU, and not even a good one. A basic, bare-bones card with 128MB of VRAM and Pixel Shader 2.0 support would have sufficed, but sadly most users didn’t even have that back in ’06-’08.

        It was mostly the consumers’ fault for buying cheap garbage laptops with trash-tier iGPUs in them, and the manufacturers’ for slapping a “compatible with Vista” sticker on them and pushing those shitboxes on consumers. If you had a half-decent $700-800 PC back then, Vista ran like a dream.

        • porl@lemmy.world

          No, it was mostly the manufacturers’ fault for implying that their machines would run the operating system they shipped with well. That, and Microsoft’s fault for strong-arming them into pushing Vista on machines that weren’t going to run it well.

          • Psythik@lemmy.world

            APUs obviously weren’t a thing yet, and it was common knowledge back then that contemporary iGPUs were complete and utter trash. They were so weak that with most integrated graphics you couldn’t even play HD video or enable some of XP’s most basic graphical effects.

            Everyone knew you needed a dedicated graphics card back then, so you can and should put some blame on the consumer for being dumb enough to buy a PC without one, regardless of what the sticker said. I was a teenager at the time and even I knew better. The blame goes both ways.

            • porl@lemmy.world

              No, if you weren’t “involved in the scene” and only had the word of the person at the store, then you had no idea what an iGPU was, let alone why it wasn’t up to the task of running the very thing it was sold with.

              You were a teenager at a time when teenagers’ average tech knowledge was much higher than before. That is not the same as someone who had just learnt they now needed one of those computer things for work. Not everyone had someone nearby who could explain it to them. Blaming them for not knowing the intricacies of the machines is ridiculous. It was pure greed by Microsoft and the manufacturers.

  • ipkpjersi@lemmy.ml

    Wow, that’s kind of a lot more Linux than I was expecting, but it also makes sense. Pretty cool tbh.

  • grue@lemmy.world

    So basically, everybody switched from expensive UNIX™ to cheap “unix”-in-all-but-trademark-certification once it became feasible, and otherwise nothing has changed in 30 years.

    • Allero@lemmy.today

      Except this time the Unix-like OS took 100% of the market.

      It was just too clear that this thing is better.

      • erwan@lemmy.ml

        BSD is mostly Unix too, so even when Unix didn’t have 100% because of Mac and Windows, it was more like 99%.

  • Z3k3@lemmy.world

    As someone who worked on designing racks in the supercomputer space about 10 years ago, I had no clue Windows and Mac even tried to enter the space.

        • MajorHavoc@programming.dev

          but it did not stick.

          Yeah. It was bad. The job of a Supercomputer is to be really fast and really parallel. Windows for Supercomputing was… not.

          I honestly thought it might make it, considering the engineering talent that Microsoft had.

          But I think time proves that Unix and Linux just had an insurmountable head start. Windows, to the best of my knowledge, never came close to closing the gap.

          • SayCyberOnceMore@feddit.uk

            But, surely Windows is the wrong OS?

            Windows is a per-user GUI… supercomputing is all about crunching numbers, isn’t it?

            I can understand M$ trying to get into this market, and I know Windows Server can be used to run stuff, but again, you don’t need a GUI on each node of a supercomputer; they’d be better off with DOS…?

            • MajorHavoc@programming.dev

              But, surely Windows is the wrong OS?

              Oh yes! To be clear: trying to put any version of Windows on a supercomputer is every bit as insane as you might imagine. From what I heard in the rumor mill, it went every bit as badly as anyone might have guessed.

              But I like to root for an underdog, and it was neat to hear about Microsoft engineers trying to take the Windows kernel somewhere it had no rational excuse to run (at the time - and I wonder if they had internal beta versions of stuff that Windows ships standard now, like SSH…), perhaps by sheer force of will and hard work.

  • Read Bio@lemm.ee

    Maybe Windows is not used in supercomputers often because Unix and Linux are more flexible for the CPUs they use (POWER9, SPARC, etc.).

  • snek_boi@lemmy.ml

    This looks impressive for Linux, and I’m glad FLOSS has such an impact! However, I wonder if the numbers are still this good if you consider more supercomputers. Maybe not. Or maybe yes! We’d have to see the evidence.

    • MajorHavoc@programming.dev

      I wonder if the numbers are still this good if you consider more supercomputers.

      Great question. My guess is not terribly different.

      “Top 500 Supercomputers” is arguably a self-referential term. I’ve seen “supercomputer” defined as a machine that was among the 500 fastest computers in the world on the day it went live.

      As new super-computers come online, workloads from older ones tend to migrate to the new ones.

      So my impression is there usually aren’t a huge number of currently operating supercomputers outside of the top 500.

      When a supercomputer falls toward the bottom of the top 500, there’s a good chance it is getting turned off soon.

      That said, I’m referring here only to the supercomputers that spend a lot of time advertising their existence.

      I suspect there’s a decent number out there today that prefer not to be listed. But I have no reason to think those don’t also run Linux.

          • Grimpen@lemmy.ca

            I think it was the PS3 that shipped with “Other OS” functionality, and it was sold a little cheaper than production costs would indicate, to make it up on games.

            Only thing is, a bunch of institutions discovered you could order a pallet of PS3s, set up Linux, and have a pretty skookum cluster for cheap.

            I’m pretty sure Sony dropped “Other OS” not because of vague concerns of piracy, but because they were effectively subsidizing supercomputers.

            Don’t know if any of those PS3 clusters made it onto Top500.

            • Kusimulkku@lemm.ee

              Makes me think how PS2 had export restrictions because “its graphics chip is sufficiently powerful to control missiles equipped with terrain reading navigation systems”

                • Grimpen@lemmy.ca

                  I remember when 128-bit SSL encryption was export-restricted in the mid ’90s. When I first opened an online banking account, the bank sent a CD with a customized version of Netscape Navigator with 128-bit SSL, and the bank logo in place of the Netscape N.

      • Ben@lemmy.ml

        Mac is also derived from BSD, since it is built on Darwin.