• NicePool@lemmy.world
    9 months ago

    Isn’t Apple the company that charges $5k+ for 16 GB? All while intentionally deprecating the hardware within 2 years. /s

    I’ve had to support their products on a professional level for over a decade. I will NEVER buy an Apple product.

    • cm0002@lemmy.world
      9 months ago

      I’ve had to support their products on a professional level for over a decade.

      Their enterprise stuff…can only be described as a quintessential example of an ill-conceived, horrendously executed fiasco, so utterly devoid of utility and coherence that it defies all logic and reasonable expectation. It stands as a paragon of dysfunction, a conflagration of conceptual failures so intense and egregious that it resembles a blazing inferno of pure, unadulterated refuse. It is, in every conceivable sense, a searing, molten heap of garbage—hot, steaming, and reeking with the unmistakable stench of profound ineptitude and sheer impracticality.

  • flop_leash_973@lemmy.world
    9 months ago

    Naturally the price for the cheapest model will also go up several orders of magnitude more than the cost of materials, labor, and a healthy profit margin would account for, I’m sure.

    • qjkxbmwvz@startrek.website
      9 months ago

      In 1999, the iBook was US$1599 (equivalent to $2925 in 2023) (source).

      The 2010 13" Air was $1299 (more in today’s $) (source).

      The current 13" M3 Air is $1099 (source).

      So yeah, they may well raise prices, but the cost of Apple’s entry-level hardware has decreased in absolute terms over the years, and has decreased substantially if inflation is taken into account. Not to say the margins aren’t higher (no idea about that), but it’s interesting.
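
      As a sanity check, the inflation math above can be reproduced with approximate US CPI-U annual averages (the CPI constants here are illustrative assumptions, not official figures):

      ```python
      # Rough inflation adjustment using approximate US CPI-U annual averages.
      # The CPI constants are illustrative assumptions, not official figures.
      CPI_1999 = 166.6
      CPI_2023 = 304.7

      def adjust_for_inflation(price, cpi_then, cpi_now):
          """Scale a historical price into current-year dollars."""
          return price * cpi_now / cpi_then

      # 1999 iBook: $1599 then comes out to roughly the ~$2925 quoted above.
      print(round(adjust_for_inflation(1599, CPI_1999, CPI_2023)))
      ```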

      • realitista@lemm.ee
        9 months ago

        Yeah it’s when you need a reasonable amount of RAM or disk that they really bend you over.

  • Omega@discuss.online
    9 months ago

    I always thought 8 GB was a fine amount for daily use if you never did anything too heavy. Are apps really that RAM-intensive now?

    • T156@lemmy.world
      9 months ago

      Yes. Just as 4 GB was barely enough a decade ago.

      I usually find myself either capping out the 8 GB of RAM on my laptop, or getting close to it if I have Firefox, Discord and a word processor open. Especially if I have YouTube or Spotify going.

      • Echo Dot@feddit.uk
        9 months ago

        I can get over 8 GB just running Discord, Steam, and Shapes2.

        I’m pretty sure most of that is just Discord.

        • Blackmist@feddit.uk
          9 months ago

          Imagine how much more room we’d have if everything wasn’t dragging a big trailer full of Chrome behind it.

          • Echo Dot@feddit.uk
            9 months ago

            I’m pretty sure Chrome doesn’t even use the memory for anything; it just likes having it allocated.

      • Omega@discuss.online
        9 months ago

        Most of that is Discord; they can’t manage a single thing right. Use more GPU than the game I’m playing? Check. Have an inefficient method of streaming a game? Check. Be laggy as fuck when GPU acceleration is off, while Lemmy and Guilded are fine? Check.

      • Liz@midwest.social
        9 months ago

        Recently I downloaded Chrome for some testing that I wanted to keep separate from my Firefox browser. After a while I realized my computer was getting hot every time I opened Chrome. I took a look at the system monitor: Chrome was using 30% of my CPU power to play a single YouTube video in the background. What the fuck? I ended up switching the testing environment over to LibreWolf and the CPU load went down to only 10%.

        • MystikIncarnate@lemmy.ca
          9 months ago

          I’d say to try Chromium, but you basically need to compile it yourself to get support for all the video codecs.

      • Demdaru@lemmy.world
        9 months ago

        Stop. You’re scaring today’s companies. Optimization? That’s a no-no word.

        Now please eat whole-ass libraries imported for one function, or that React + Laravel site that amounts to a mostly stock Bootstrap-looking blog.

    • Sethayy@sh.itjust.works
      9 months ago

      It heavily depends on what you use. On a Linux server acting as a NAS I’m able to get away with 2 GB, or 1 GB on an Orange Pi Zero 3, but that essentially only ever runs one app at a time.

      I’m sure a hardcore RGB gamer could need 32 GB pretty quickly by leaving open Twitch streams, Discord, a couple of games in the background, and a couple of Chrome tabs, all on Windows 11.

    • MystikIncarnate@lemmy.ca
      9 months ago

      Yep. I work in IT support, almost entirely Windows but similar concepts apply.

      I see people pushing 6 GB+ with just the OS and remote desktop applications open sometimes. My current shop does almost everything by VDI/remote desktop… so that’s literally the only thing they need to load; it’s just not good.

      On the remote desktop side, we recently shifted from a balanced remote desktop server, over to a “memory optimised” VM, basically has more RAM but the same or similar CPU, because we kept running out of RAM for users, even though there was plenty of CPU available… It caused problems.

      Memory is continually getting more important.

      When I do the math on the bandwidth requirements to run everything, the next limit I think we’re likely to hit is RAM access speed and bandwidth. We’re dealing with so much RAM at this point that the available bandwidth from the CPU to the RAM is less than the total memory allocation for the virtual system. E.g. 256 GB for the VM, while the CPU runs at, say, 288 GB/s…

      Luckily DDR4/DDR5 bring improvements here, though a lot of that hardware has yet to filter into datacenters.
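
      The bandwidth concern above is easy to put in numbers. A back-of-envelope sketch, using the hypothetical figures from this comment (256 GB allocated to a VM, ~288 GB/s memory bandwidth):

      ```python
      # Back-of-envelope: time for one full pass over a VM's allocated RAM
      # at a given CPU-to-RAM bandwidth. Figures are the hypothetical ones above.
      vm_ram_gb = 256        # RAM allocated to the VM
      bandwidth_gb_s = 288   # peak memory bandwidth

      sweep_seconds = vm_ram_gb / bandwidth_gb_s
      print(f"One full read of the VM's RAM takes about {sweep_seconds:.2f} s")
      ```

      So even at peak bandwidth the CPU needs the better part of a second just to touch every byte once, which is why access speed can become the bottleneck before capacity does.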

    • tb_@lemmy.world
      9 months ago

      Does it?

      Previous benchmarks have shown the 8 GB models seriously fell behind in performance.

      • Echo Dot@feddit.uk
        9 months ago

        Yeah I think the joke just flew over your head.

        Apple keeps saying that their RAM is somehow magic and therefore better than Windows RAM, which is a comment that obviously makes no sense.

        • cheddar@programming.dev
          9 months ago

          Yeah I think the joke just flew over your head.

          I realize this should be a joke, but I am still unsure if it is.

          • Echo Dot@feddit.uk
            9 months ago

            Memory is memory. If an application requires a lot of memory, it really doesn’t matter what speed that memory is; it’s more important that there’s enough of it.

            There are plenty of applications that could theoretically run on the M2 MacBook in terms of processing capacity, but can’t, because there isn’t enough RAM available. Or they run by swapping, which is super bad, because (a) it’s incredibly slow, and (b) it wears out the SSD.

          • HauntedCupcake@lemmy.world
            9 months ago

            It is 100% a joke. Other than Windows being slightly more RAM-hungry, there’s not a huge difference between it and a Mac’s RAM.

        • barsquid@lemmy.world
          9 months ago

          I think they are able to share it with the GPU or something? It is maybe slightly better but it sure as fuck is not 2x better.

          8 GB, even if it is “magic RAM,” is a joke amount and has been for a long time.

          • HauntedCupcake@lemmy.world
            9 months ago

            That’s just an APU; see consoles and laptops. The unified memory is basically just the above, but Apple also claims that because Apple Silicon has the storage controller on board, the swap is magically faster 🤷

            Also, macOS/Linux use less RAM than Windows, which certainly helps.

            8 GB is “fine™” on a MacBook Air, but it’s criminal for a Pro machine, and it certainly should not cost £200 for an extra 8 GB. That’s genuinely insane pricing.

            • barsquid@lemmy.world
              9 months ago

              That’s the real issue, isn’t it? The upgrade prices are disconnected from reality by a lot. If they were within the realm of sanity nobody would care much that the base is 8 GB.

              • Echo Dot@feddit.uk
                9 months ago

                I was saying this to my girlfriend when they first came out: the whole thing is completely out of spec for everyone, regardless of your use case.

                She really only wants it for playing The Sims but you’ll run into RAM limitations there, and as you say it’s not worth paying so much more just to get a device that’s actually functional.

                If you want to use it for basic word processing, then you really don’t need that low latency, and you really don’t need a CPU at that level of performance. You’re just paying for stuff you’re never going to use.

                If you want it for gaming there isn’t enough memory to make it worthwhile.

                If you want it for intensive graphics editing work then there really isn’t enough memory for that to work.

                If you want it for advanced computation then you’re probably not going for a laptop anyway. The M2 chip is obsessed with preserving battery life, which is fine in a laptop, but for high-performance applications you just want it to draw more power.

                If for some bizarre reason you wanted to do AI research on a laptop, it’s not too bad, but you’d still need the Pro version and there are better things on the market. It wouldn’t be the worst, I guess.

                So outside of one very niche scenario it’s literally a pointless device for 99% of the user base.

                In the end we got a Framework laptop, which is more than capable of doing what we wanted and didn’t cost anywhere near as much. Plus it basically looks like a MacBook, so even build quality wasn’t a differentiator. I got one too, for no particular reason, and it still ended up cheaper.

    • areyouevenreal@lemm.ee
      9 months ago

      The annoying thing is I have had people claim that 8 GB or 16 GB is fine on Apple and works better than on PC laptops. To the point that one redditor point-blank refused to believe I owned an Apple laptop. I literally had to take a photograph of said laptop and show it to them before they would believe me about the RAM capacity.

      • HauntedCupcake@lemmy.world
        9 months ago

        I own an 8 GB MacBook Pro for work. It’s definitely better than a PC with 8 GB of RAM, but not better than, or even close to, a PC with 16 GB. The amount of stutters/freezes while the swap file thrashes is insane.

        • areyouevenreal@lemm.ee
          9 months ago

          Maybe this is true if you use Windows. If you use Linux on your PC versus macOS on a MacBook you will probably find the PC performs comparably if not better.

            • areyouevenreal@lemm.ee
              9 months ago

              We are talking about PC vs Mac. Both have the same problem when it comes to chromium based things.

            • Echo Dot@feddit.uk
              9 months ago

              A Windows application and a Mac application will use pretty much the same amount of memory regardless of operating system.

              The real issue is how much memory the OS uses up. Windows is a massive waste of RAM but not enough to make any difference, certainly not with 8 GB versus 16 GB. You’re still better off on PC then.

          • HauntedCupcake@lemmy.world
            9 months ago

            Oh totally, Linux is in the same ballpark as, if not better than, Macs when it comes to RAM usage. Windows is just a hog

        • Echo Dot@feddit.uk
          9 months ago

          Obviously it depends on the situation, but sometimes it is worth talking to idiots, not because you have any chance of changing their mind, but just to demonstrate to everyone else in the thread that they are in fact an idiot. Just in case somebody thinks they have a point.

    • T156@lemmy.world
      9 months ago

      Ironically, it’s the other way around, since Apple has to share their RAM between GPU and CPU, where other computers typically have them separately.

      So in normal usage with 8 GB, you’re automatically down to 7, since at least 1GB would be taken by the graphics card. More if you’re doing anything reasonably graphics-heavy with it.

    • lengau@midwest.social
      9 months ago

      My Linux machine has 64 GiB of RAM, which is like 128 GiB of Mac RAM. It’s still not enough

      • areyouevenreal@lemm.ee
        9 months ago

        Serious question: what are you using all that RAM for? I am having a hard time justifying upgrading one of my laptops to 32 GiB, never mind 64 GiB.

        • tal@lemmy.today
          9 months ago

          Any memory that’s going unused by apps is going to be used by the OS for caching disk contents. That’s not as significant with SSD as with rotational drives, but it’s still providing a benefit, albeit one with diminishing returns as the size of the cache increases.

          That being said, if this is a laptop and if you shut down or hibernate your laptop on a regular basis, then you’re going to be flushing the memory cache all the time, and it may buy you less.

          IIRC, Apple’s default mode of operation on their laptops these days is to just have them sleep, not hibernate, so a Mac user would probably benefit from that cache.
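
          On Linux you can watch this behaviour directly in /proc/meminfo. A minimal sketch (Linux-only, and the helper name is mine):

          ```python
          import os

          # Read Linux /proc/meminfo to see how much "free" memory the kernel
          # is actually using as disk cache. Linux-only illustrative sketch.
          def read_meminfo(path="/proc/meminfo"):
              info = {}
              with open(path) as f:
                  for line in f:
                      key, value = line.split(":", 1)
                      info[key] = int(value.strip().split()[0])  # values are in kB
              return info

          if os.path.exists("/proc/meminfo"):
              mem = read_meminfo()
              print(f"MemFree: {mem['MemFree'] / 1024:.0f} MiB")
              print(f"Cached:  {mem['Cached'] / 1024:.0f} MiB")   # page cache
              print(f"Buffers: {mem['Buffers'] / 1024:.0f} MiB")
          ```

          On a long-running machine, Cached is often a large fraction of total RAM; that memory is handed back to applications on demand, which is why "free" RAM looking low is usually not a problem.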

        • Mistic@lemmy.world
          9 months ago

          For games, modding uses a lot. It can get to the point of needing more than 32 GB, but rarely so.

          Usually, you’d want 64 GB or more for things like video editing, 3D modeling, running simulations, LLMs, or virtual machines.

          • areyouevenreal@lemm.ee
            9 months ago

            I use Virtual Machines and run local LLMs. LLMs need VRAM rather than CPU RAM. You shouldn’t be doing it on a laptop without a serious NPU or GPU, if at all. I don’t know if I will be using VMs heavily on this machine or not, but that would be a good reason to have more RAM. Even so 32 GiB should be enough for a few VMs running concurrently.

        • lengau@midwest.social
          9 months ago

          For me in particular I’m a software developer who works on developer tools, so I have a lot of tests running in VMs so I can test on different operating systems. I just finished running a test suite that used up over 50 gigs of RAM for a dozen VMs.

          • InvertedParallax@lemm.ee
            9 months ago

              Same, 48c/96t with 192 GB of RAM.

              make -j is fun; htop triggers epilepsy.

              Few VMs, but tons of LXC containers; it’s like having one machine that runs 20 systems in parallel, really fast.

              Containers for dev, for browsing, for Wine: the dream finally made manifest.

      • Echo Dot@feddit.uk
        9 months ago

        It’s not an upgrade though; it’s just a different model. They’re not modules you can install, and I don’t think even Apple can install them; you just get a different motherboard.

        Which is objectionable for so many reasons, not least of all E-Waste.

        • stellargmite@lemmy.world
          9 months ago

          Yeah, I get that. It’s treated as if it’s an upgrade: a sales upsell to a different unit, I guess, rather than an upgrade to the literal unit the customer is receiving. Yep, objectionable all round.

          • Echo Dot@feddit.uk
            9 months ago

            My point is you cannot effectively upgrade after the fact. You have to buy a whole new device.

            • stellargmite@lemmy.world
              9 months ago

              Indeed. It makes that initial decision even more of a forced push toward the expensive upsell. It’s evil. And wasteful, as you said.

            • MystikIncarnate@lemmy.ca
              9 months ago

              There are reasons behind this. LPDDR, IIRC, works most efficiently when it’s closer to the CPU than DIMMs would allow for.

              Boosts speed and lowers the power requirements.

              It also incentivizes people to buy larger SKUs than they originally wanted, which, bluntly, is probably the main driver for going that direction… I’m just saying that there are technical reasons too.

              • sugar_in_your_tea@sh.itjust.works
                9 months ago

                The technical benefits are honestly quite overblown. The M-series didn’t get the massive speed lift because it moved to soldered RAM near the CPU, it got the massive speed lift because it doesn’t have to copy stuff between the CPU and GPU, the proximity to the CPU is a pretty modest improvement. So they could’ve gotten 95% of the benefit while still offering socketed RAM, but they decided not to, probably to drive prices up.

                • MystikIncarnate@lemmy.ca
                  9 months ago

                  There’s actually an argument that makes the point of driving prices down with soldered RAM.

                  The individual memory chips and constituent components are cheaper than they would be for the same in a DIMM. We’re talking about a very small difference, and bluntly, OEMs are going to mark it up significantly enough that the end consumer won’t see a reduction for this (but OEMs will see additional profits).

                  So by making it into unupgradable e-waste, they make an extra buck or two per unit, with the added benefit that, being unupgradable e-waste, you throw it out and buy a whole new system sooner.

                  This harkens back to my rant on thin-and-light phones, where the main point is that they’re racing to the bottom. Same thing here. For thin-and-light mobile systems, soldered RAM still saves precious space and weight, allowing the device to be thinner and lighter (again, by a very small margin)… That’s the only market segment where I kind of understand the practice. For everything else, DIMMs (or the upcoming LPCAMM2)… IMO, I’d rather sacrifice any speed benefit to have the ability to upgrade the RAM.

                  The one that ticks me off is the underpowered thin-and-lights that are basically unusable e-waste because they have the equivalent of a Celeron and barely enough RAM to run the OS they’re designed for. Everything is soldered, and they’re cheap, so people on a tight budget are screwed into buying them. This is actually a big reason why I’m hoping the Windows-on-ARM thing takes off a bit, because those systems would be far more useful than the budget x86 chips we’ve seen, and far less expensive than anything from Intel or AMD that’s designed for mobile use. People on a tight budget could get a cheap system that’s actually not e-waste.

          • Kbobabob@lemmy.world
            9 months ago

            I didn’t think any of those were the base model. Anything with Pro or Ultra in the name should have more than 8 GB of RAM, in my opinion. The list also seems dominated by OnePlus, as the others listed are not really players in the larger market. You could possibly argue that Xiaomi is, but I’ve never even seen one of their phones in the real world. In fact, it looks like most of these are only available as China variants.

            • bruhduh@lemmy.world
              9 months ago

              I am writing this on my Xiaomi Poco X3 Pro. It has 8 GB of RAM and 256 GB of storage, and it also has a headphone jack and a microSD slot.

    • stoy@lemmy.zip
      9 months ago

      I remember back in the early 2000s when I saw a PDA with a 232 MHz CPU and 64 MB of RAM, and I realized how far technology had come since I got my computer with a 233 MHz CPU and 64 MB of RAM…

      Obviously different architectures, but damn, that felt strange…

    • Echo Dot@feddit.uk
      9 months ago

      It’s a good comparison actually, because Apple keeps saying that their RAM is faster because it’s soldered (which is true, but only if you squint). I don’t really think it makes a difference, because if you run out of space you still run out of space; the fact that you can access the limited space more quickly doesn’t really help.

      Phone RAM also tends to be soldered onto the board, so it’s a pretty good comparison.

    • narc0tic_bird@lemm.ee
      9 months ago

      Yup. The current iPhone 15 Pro is the only model with 8 GB of RAM, with the regular iPhone 15 having 6 GB. All iPhone 16 models (launching next month) will still only have 8 GB according to rumors, which happens to be the bare minimum required to run Apple Intelligence.

      Giving the new models only 8 GB seems a bit shortsighted and will likely mean that more complex AI models in future iOS versions won’t run on these devices. It could also mean that these devices won’t be able to keep a lot of apps ready in the background if running an AI model in-between.

      16 GB is proper future-proofing on Google’s part (unless they lock new software features behind newer models anyway down the road), and Apple will likely only gradually increase memory on their devices.

      • Rai@lemmy.dbzer0.com
        9 months ago

        I don’t use Apple computers, but if we’re going into phones, iOS is extremely memory-efficient. I’m on a six-year-old XS Max with 4 GB and it works like the day I got it, running circles around Android phones half its age.

      • filister@lemmy.world
        9 months ago

        Pretty much what NVIDIA is doing with their GPUs: refusing to provide an adequate, future-proof amount of VRAM on their cards. That’s planned obsolescence in action.

        • TheGrandNagus@lemmy.world
          9 months ago

          And like Apple, Nvidia has no shortage of fanboys who insist the pitiful amounts of (V)RAM are enough. The marketing sway those two companies have is incredible.

          It’s a complete joke that Sapphire had an 8GB version of the R9 290X, what, 11 years ago or something? And yet Nvidia is still selling 8GB cards now, for exorbitant prices.

          • Petter1@lemm.ee
            9 months ago

            This happens when you sell your hardware as a DRM key for their software (i(Pad)OS, macOS, etc., and CUDA).

          • CheeseNoodle@lemmy.world
            9 months ago

            The current GPU situation actually has me curious about AMD’s upcoming Halo APU chips. They’re likely going to be pretty expensive relative to their equivalent GPU performance, but if they work out similar to the combined price of a CPU and GPU, they might be worthwhile, as they use onboard RAM as their VRAM. Probably a crazy idea, but one I look forward to theory-building around in spring when they release.

      • tankplanker@lemmy.world
        9 months ago

        If you were being cynical, you could say it’s planned obsolescence, and that when the new AI feature set rolls out you’ll have to buy the new phone for it.

        • narc0tic_bird@lemm.ee
          9 months ago

          I think they got caught with their pants down when everybody started doing AI and they were like “hey, we have this cool VR headset”. Otherwise they would’ve at least prepared the regular iPhone 15 (6 GB) to be ready for Apple Intelligence. Every (Apple Silicon) device with 8 GB or more gets Apple Intelligence, so M1 iPads from 2021 get it as well, for example, even though the M1’s NPU is much weaker than some of the NPUs in unsupported devices with less RAM.

          They are launching their AI (or at least everything under the “Apple Intelligence” umbrella) with iOS 18.1, which won’t even release alongside the new iPhones. It’ll be US-only (or at least English-only), with several of the features announced at WWDC still missing or coming later, and it’s unclear how they’ll proceed in the EU.

          • tankplanker@lemmy.world
            9 months ago

            With how polished Apple’s AI on mobile was at launch, compared to Gemini on Android, which at launch couldn’t even do basics like timers, I suspect Apple had it in the works for far longer, so it wouldn’t have been a total surprise.

            Also, you’re describing the situation at launch for new hardware; the software will evolve every year going forward, and the requirements will likely increase every year. If I’m buying a flagship phone right now, I want it to last at least 3 years of updates, if not 5. The phone has to be able to cope with what is a very basic requirement: enough RAM.

            This isn’t some NPU thing; it’s just basic common sense that more RAM is better for this, something the flagship iPhones could have benefited from for a while now.

            • narc0tic_bird@lemm.ee
              9 months ago

              I’m not sure if you’re agreeing or disagreeing with me here. Either way, hardware has a substantially longer turnaround time compared to software. The iPhone 15 would’ve been in development years before release (I’m assuming they’re developing multiple generations in parallel, which is very likely the case) and keep in mind that the internals are basically identical to the iPhone 14 Pro, featuring the same SoC.

              AI and maybe AAA games like Resident Evil aside, 6 GB seems to work very well on iPhones. If I had a Pixel 6/7/8 Pro with 12 GB and an iPhone 12/13/14 Pro (or 15) with 6 GB, I likely wouldn’t notice the difference unless I specifically counted the number of recent applications I could reopen without them reloading. 6 GB keeps plenty of recent apps in memory on iOS.

              But going with 8 GB in the new models, knowing that AI is a thing and that the minimum requirement for their first series of models is already 8 GB, is not too reassuring. I’m sure these devices will get 5-8 years of software updates, but new AI features might be reduced or not present at all on these models.

              When talking about “AI” in this context I’m talking about everything new under the “Apple Intelligence” umbrella, like LLMs and image generators. They’ve done what you’d call “AI” nowadays for many years on their devices, like photo analysis, computational photography, voice isolation, “Siri” recommendations etc.

              • tankplanker@lemmy.world
                9 months ago

                I was under the impression that iOS used sleight of hand to reduce the memory footprint of inactive apps, rather than managing them the way Android handles its recent apps list? Does it still require special permissions to run non-Apple apps in the background as active tasks? AI will need to run in the background, and it will need a decent chunk of RAM to do so.

                I completely agree that changing the processor or revising the NPU is too much to do late stage, but I reject that for increasing RAM or storage; both can be changed closer than 12 months from release, and I doubt Apple had the AI changes planned much less than 12 months out either. It just feels like a big fuck you to anybody buying a flagship from Apple this year, as it won’t last as long as it should for normal consumers who would expect all of the latest AI features to roll out during the supported window.

        • nous@programming.dev
          link
          fedilink
          English
          arrow-up
          0
          ·
          9 months ago

          I would say it’s more so they can advertise a lower price, but then they expect you to buy the more expensive ones, as the bare minimum is just not enough.

          • tankplanker@lemmy.world
            link
            fedilink
            English
            arrow-up
            0
            ·
            9 months ago

            For the base model, yeah, but Apple loves charging a packet for more memory, so I don’t see it for the top-of-the-range models. It would be typical for them to only offer 16 GB with the increased storage as well, just to bump the price up.

    • Echo Dot@feddit.uk
      link
      fedilink
      English
      arrow-up
      0
      ·
      9 months ago

      I think it’s proprietary RAM as well, so you can’t just get something off the market and solder it on. It has to be their RAM or it won’t work.

    • boonhet@lemm.ee
      link
      fedilink
      English
      arrow-up
      0
      ·
      edit-2
      9 months ago

      Isn’t the RAM on the actual SoC package with the Apple Silicon line? I haven’t really opened any of 'em up.

      As for older Macs - sure, I know someone who replaced 8 gigs with 16 on either an Air or Pro model that had 16 available as an option but was shipped with 8. It’s just something you do when you have way too many Mac boards lying around at work and your bosses say you can’t get a new work laptop.

  • M500@lemmy.ml
    link
    fedilink
    English
    arrow-up
    0
    ·
    9 months ago

    It’s because AI needs a lot of RAM. I think Apple did not expect or plan for AI, which shows in the fact that only the latest Pro phone can run Apple Intelligence. It’s because that phone has enough RAM.

    Now they will boost RAM across the board because Apple Intelligence will not run well without it.

    Depending on pricing, I may actually buy a MacBook in 2025.

    I’ve wanted one since the M1, but I’ve held out until 16 GB was the starting amount of RAM.
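    For a rough sense of why on-device AI pushes RAM requirements up, here’s a back-of-envelope sketch. The figures are illustrative guesses, not Apple’s actual model specs: a hypothetical ~3B-parameter model, 4-bit quantization, and a 1.2× overhead factor for caches and buffers.

    ```python
    def model_ram_gib(params_billions: float, bits_per_weight: int,
                      overhead: float = 1.2) -> float:
        """Rough resident-memory estimate for an on-device LLM.

        `overhead` covers the KV cache, activations, and runtime buffers;
        the 1.2 factor is a guess, not a measured figure.
        """
        weight_bytes = params_billions * 1e9 * bits_per_weight / 8
        return round(weight_bytes * overhead / 2**30, 2)

    # A hypothetical ~3B-parameter model quantized to 4 bits per weight:
    print(model_ram_gib(3, 4))  # ~1.68 GiB resident
    ```

    Call it somewhere around 1.7 GiB for the model alone, which is a big bite out of an 8 GB phone that also has to keep the OS and foreground apps in memory.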

    • cm0002@lemmy.world
      link
      fedilink
      English
      arrow-up
      0
      ·
      9 months ago

      Or you could just get just about any other non-Mac system that lets you upgrade the RAM easily when you need to…

      Just stop supporting Apple’s soldered-in BS.

      • bamboo@lemm.ee
        link
        fedilink
        English
        arrow-up
        0
        ·
        9 months ago

        I hate to be the bearer of bad news, but most thin and light laptops have had soldered RAM for many years now. There are exceptions, but they’re few and far between.

        • cm0002@lemmy.world
          link
          fedilink
          English
          arrow-up
          0
          ·
          9 months ago

          What? Lol nah, plenty of laptops have removable RAM. Soldered RAM tended to show up on the “ultralight” tier, but outside of that and Chromebooks it’s been by no means the norm.

          • T156@lemmy.world
            link
            fedilink
            English
            arrow-up
            0
            ·
            9 months ago

            It has kind of come with newer laptops being driven to be thinner, and, for newer devices, because the old SODIMM format is no longer capable of the throughput and latencies needed for higher-speed memory.

            From memory, DDR5 at around 2.1 GHz is where SODIMM caps out. Anything faster, like 2.8 GHz, either requires the memory to be soldered or one of the new formats like the one Dell has started using.
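            As a rough sanity check on why those transfer rates matter (treating the specific speed grades as illustrative), peak bandwidth per memory channel is just transfer rate × bus width:

            ```python
            def channel_bandwidth_gbs(mega_transfers_per_s: float,
                                      bus_width_bits: int = 64) -> float:
                """Peak bandwidth of one memory channel in GB/s (decimal)."""
                return mega_transfers_per_s * 1e6 * bus_width_bits / 8 / 1e9

            # DDR5-5600 on a single 64-bit channel:
            print(channel_bandwidth_gbs(5600))  # 44.8 GB/s
            # LPDDR5X-8533 (soldered or CAMM-style modules):
            print(channel_bandwidth_gbs(8533))  # ~68.3 GB/s
            ```

            The faster grades only deliver that extra bandwidth if signal integrity holds up, which is exactly where the long traces to a SODIMM slot become the problem.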

            • boonhet@lemm.ee
              link
              fedilink
              English
              arrow-up
              0
              ·
              9 months ago

              The replacement you’re talking about is called [CAMM](https://en.wikipedia.org/wiki/CAMM_(memory_module\)) and personally I’m excited about it. Not only does it support faster speeds than SO-DIMM, it takes up less physical space. And I believe you can’t even put LPDDR on a SODIMM, so CAMM should also use less power?

      • TheGrandNagus@lemmy.world
        link
        fedilink
        English
        arrow-up
        0
        ·
        edit-2
        9 months ago

        Bad news: literally all current CPU gen laptops use soldered RAM.

        All of them. Every single one. No exceptions.

        Hopefully that’ll change, but as it stands right now, if you want newest gen, you cannot get replaceable RAM.

        And even before current gen, the vast majority of Windows laptops were soldered too.

        • cm0002@lemmy.world
          link
          fedilink
          English
          arrow-up
          0
          ·
          edit-2
          9 months ago

          I looked into it, yeah, current-gen chips aren’t compatible with SODIMM.

          That’s because they’re compatible with the brand-new removable RAM standard, CAMM2, instead. It is confusing, though, as everywhere I’ve looked both soldered and CAMM2 were listed as LPDDR5, which is what makes you think it’s just soldered RAM. So far it looks like if a spec sheet lists LPDDR5X, it should be a CAMM2.

          CAMM2 is also very very new, so I’m sure a few manufacturers in their rush to get the new/current gen chips out the door just used soldered RAM.

          CAMM2 is very exciting; it basically eats into all of Apple’s listed pros for having soldered RAM as close to the CPU as possible (performance, efficiency, etc.) while still being user-removable.

        • AGuyAcrossTheInternet@fedia.io
          link
          fedilink
          arrow-up
          0
          ·
          9 months ago

          I really don’t know where you’re looking because I only see that in business-class laptops and even then not all of them have soldered RAM.

          And I’m already counting the ones with only one expansion slot among the soldered bunch.

          Of course, if you only paid attention to HP, Dell and Lenovo, then I’d see why you’d think so. But beyond those brands, you don’t have that soldered nonsense everywhere. At the very least, you have the likes of Clevo, Framework and others selling laptops without soldered RAM.

          I bet there are even websites that let you filter laptop models without soldered ram. Personally, I only know about Germany-based websites like that, though.

          • TheGrandNagus@lemmy.world
            link
            fedilink
            English
            arrow-up
            0
            ·
            edit-2
            9 months ago

            You are looking at previous-gen platforms.

            E.g. for Framework, you’re looking at APUs like the 7840U, which is not current gen. It’s two generations old. (7840U/Phoenix > 8850U/Hawk Point > AI 9 365 (awful naming btw AMD)/Strix Point).

            Like I said, all current CPU gen laptops cannot use SODIMM.

            And let me be clear here, I’m not exaggerating for effect; I do not mean most of them. I do not mean the vast majority of them. I do not mean practically all of them. I literally mean all of them. 100% of them. Every single one that exists.

            AMD, Intel, and Qualcomm do not currently have compatibility with SODIMM on their newest gen mobile CPUs.

            I hope that changes, and I expect it eventually will, but as it stands right now, no you cannot have SODIMM modules if you are buying any laptop with the newest gen CPUs.

            • AGuyAcrossTheInternet@fedia.io
              link
              fedilink
              arrow-up
              0
              ·
              9 months ago

              Well fudge me sideways. Every day is a school day.

              They’ve all got LPDDR5, so yeah, you’re unfortunately right. It feels kinda weird having to consider the 7000 and 8000-series last gen already; true as it is, though.

              • cm0002@lemmy.world
                link
                fedilink
                English
                arrow-up
                0
                ·
                9 months ago

                Don’t worry, the latest chips were just built to handle only CAMM2, a new removable RAM standard that replaces SODIMM.

                It’s a bit confusing, though, because both soldered and CAMM2 are listed as LPDDR5 on spec sheets; from what I’ve looked at, if there’s an “X” at the end of the LPDDR5, it should be a CAMM2.

                It’s also brand, BRAND new, so I’m sure quite a few manufacturers rushing the new chips out the door just soldered on the RAM because they couldn’t get CAMM2 in in time for whatever reason.

      • M500@lemmy.ml
        link
        fedilink
        English
        arrow-up
        0
        ·
        9 months ago

        I know what you mean, but I’m tired of Windows’ bullshit too.

        I’d keep pc hardware if my work could happen on Linux, but it’s sadly not an option at the moment.

  • padge@lemmy.zip
    link
    fedilink
    English
    arrow-up
    0
    ·
    9 months ago

    My sister just bought a MacBook Air for college, and I had to beg her to spend the extra money on 16 GB of memory. It feels like a scam that it appears cheap with the “starting at” price, when nobody should actually go with those “starting at” specs.

    • Echo Dot@feddit.uk
      link
      fedilink
      English
      arrow-up
      0
      ·
      9 months ago

      Yeah, it’s about future-proofing. 8 GB might be okay for basic browsing and text editing now, but in the future that might not be the case. Also, in my experience, people who only want to do basic browsing and word editing inevitably end up wanting to do more complex things, and not understanding that their device isn’t capable of it.

      • padge@lemmy.zip
        link
        fedilink
        English
        arrow-up
        0
        ·
        9 months ago

        Exactly. I told her that 8 GB might be fine for a year or two, but if she wants this thousand-plus-dollar laptop to last four years, she needs to invest the extra money now. Especially once she told me she might want to play Minecraft or Shadow of the Tomb Raider on it.