There were a number of exciting announcements from Apple at WWDC 2024, from macOS Sequoia to Apple Intelligence. One subtle addition to Xcode 16 — the development environment for Apple platforms such as iOS and macOS — is a feature called Predictive Code Completion. Unfortunately, if you bought into Apple’s claim that 8GB of unified memory was enough for base-model Apple silicon Macs, you won’t be able to use it. There’s a memory requirement for Predictive Code Completion in Xcode 16, and it’s the closest thing we’ll get from Apple to an admission that 8GB of memory isn’t really enough for a new Mac in 2024.

  • maxinstuff@lemmy.world · 5 days ago

    Oh man, I remember so many people defended 8GB since the M1 first came out (and since).

    I always argued it would significantly reduce the lifetime of these machines if you bought one, not just because you’d be swapping a lot more on the (soldered-in, BTW) SSD, but because after a few years of updates it would become unbearably slow, or hardware would fail, or both.

    Didn’t stop people constantly “tHe aRchITecTuRE iS cOmPlETelY diFFeRenT!!!”

    Sure it’s different, but it’s still just a computer. A technical person can still look at the spec sheet and calculate effective performance accounting for bus widths etc.

    Disclosure: I bought a top spec 16GB M1 Mac Air on launch and have been extremely happy with it - it’s still going strong.

    • uis@lemm.ee · 4 days ago

      Didn’t stop people constantly “tHe aRchITecTuRE iS cOmPlETelY diFFeRenT!!!”

      Different Turing Machine on different math and alternative physics, I guess.

      I bought a top spec 16GB M1 Mac Air on launch

      My condolences.

      EDIT: do people genuinely believe that math doesn’t apply to Apple’s products, or do they just not understand even such concentrated sarcasm?

  • resetbypeer@lemmy.world · 6 days ago

    Opens Chrome on an 8GB Mac. Sees the lifespan of the SSD being reduced by 50%. After 2-3 years of heavy usage the SSD starts to get errors. Apple’s solution: buy a new one. No wonder they are the 2nd/3rd wealthiest company on the planet.

    • otp@sh.itjust.works · 5 days ago

      buy a new one.

      Buy a new SSD and swap out the old one?

      …buy a new SSD, right??

        • helenslunch@feddit.nl · 5 days ago

          The Mac Studio uses a standard NVMe SSD but if you replace it with anything that you didn’t buy from Apple with a 500%+ markup, the new drive simply won’t work.

        • Dojan@lemmy.world · 5 days ago

          Well they do charge particularly hard for SSDs as well. They’ve found a way to eat the cake twice.

      • vingetcxly@thelemmy.club · 5 days ago

        Nah ur nat doin that with apple. Cmon just buy a new PC! Wa don car abt the env! Who cares anyway! Cmon not that expensive

      • WereCat@lemmy.world · 5 days ago

        The SSD is soldered to the board. With only 8GB you’ll be using the swap partition a lot, so for anything exceeding 8GB of RAM you will be using the SSD as a slower “RAM”, which wears down its lifespan by constantly writing to and reading from its swap partition.
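As a back-of-the-envelope sketch of that wear mechanism (all figures below are illustrative assumptions, not Apple specs: a drive rated for 150 TBW, ~20GB/day of ordinary writes, ~40GB/day of extra swap traffic):

```python
# Illustrative, assumed figures -- not measured Apple specs.
TBW_RATING_TB = 150          # assumed endurance rating of a small TLC drive
BASELINE_GB_PER_DAY = 20     # assumed ordinary writes with ample RAM
SWAP_GB_PER_DAY = 40         # assumed extra swap traffic on an 8GB machine

def years_until_worn(tbw_tb: float, gb_per_day: float) -> float:
    """Years until cumulative writes reach the drive's rated endurance."""
    return (tbw_tb * 1000) / (gb_per_day * 365)

light = years_until_worn(TBW_RATING_TB, BASELINE_GB_PER_DAY)
heavy = years_until_worn(TBW_RATING_TB, BASELINE_GB_PER_DAY + SWAP_GB_PER_DAY)
print(f"ample RAM:      ~{light:.0f} years of rated endurance")
print(f"heavy swapping: ~{heavy:.0f} years of rated endurance")
```

Tripling the daily write volume cuts the drive’s rated life to a third, and with the SSD soldered down, the drive’s life is the machine’s life.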

        • maxinstuff@lemmy.world · 5 days ago

          “tHATs nOT tRuE the aRCHiteCTuRe iS cOmPlETlY dIffErEnT!!!1!11!!ONEONE!!!” <— Apple fanboys when this was predicted on launch of the M1 🤖

          • Echo Dot@feddit.uk · 5 days ago

            No you don’t understand the architecture is different and so the laws of physics don’t apply. Constantly energizing and de-energizing capacitors can only increase life expectancy, everyone knows that.

  • egeres@lemmy.world · 5 days ago

    Why do they struggle so much with some “obvious things” sometimes? We wouldn’t have a USB-C iPhone if the EU hadn’t pressured them to make the switch.

    • helenslunch@feddit.nl · 5 days ago

      They don’t “struggle”. These are intentional and malicious decisions meant to drive revenue, as they have been since the beginning.

      • Valmond@lemmy.world · 5 days ago

        The eMac (looked like a toilet, sounded like a jet) came with 256 MB of RAM in one of its two slots. Adding a 512MB stick was dirt cheap (everyone had at the very least 1GB in their PC), well, dirt cheap unless you bought it from Apple…

        It’s how Apple monetizes their customers: figure out an artificial shortcoming they can then sell the fix for as an upgrade (check out dongles, for example).

    • dan@upvote.au · 5 days ago

      They didn’t have a reason to switch to USB-C, and several reasons to avoid it for as long as possible. Their old Lightning connector (and the big 30-pin connector that came before it) was proprietary, and companies had to pay a royalty to Apple for every port and connector they manufactured. They made a lot of money off of the royalties.

  • kingthrillgore@lemmy.ml · 5 days ago

    They moved to on-die RAM for a reason: To nickel and dime yo ass.

    I needed to expense a Mac Mini for iOS development, and everyone (Me, the company, our purchasing department) was baffled at how much it cost to get 16 GB. And they only go up to 24GB. Imagine how much they’ll charge for 32 in a year!

      • Echo Dot@feddit.uk · 5 days ago

        It’s a bit moot, though: if their primary motivation was performance improvements, they wouldn’t cap the soldered RAM at 16 GB.

        If you’re going to weld shoes to your feet, you better at least make sure that they’re good shoes.

        • sugar_in_your_tea@sh.itjust.works · 5 days ago

          Why not? There is a performance benefit to being closer to the CPU, and soldering gets you a lot closer to the CPU. That’s a fact.

          • Echo Dot@feddit.uk · 5 days ago

            Yeah, but if you’re only putting 8 GB of RAM in, then you’re also going to be constantly querying the hard drive. Any performance gain you get from soldering is lost by going all the way to the hard drive every 3 microseconds.

            It’s only better performance on paper; in reality there’s no real benefit. If you can run an application entirely within the 8 GB of RAM, and assuming you’re not running anything else, then maybe you get better performance.
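The effect being described is the standard average-access-time calculation; the latencies below are rough, assumed figures (DRAM ~100ns, an NVMe page fault ~100µs), not measurements of any specific Mac:

```python
RAM_NS = 100          # assumed DRAM access time, nanoseconds
SWAP_NS = 100_000     # assumed NVMe page-fault service time, nanoseconds

def avg_access_ns(miss_rate: float) -> float:
    """Average access time when a fraction of accesses fall through to swap."""
    return (1 - miss_rate) * RAM_NS + miss_rate * SWAP_NS

for miss in (0.0, 0.001, 0.01):
    print(f"swap miss rate {miss:.1%}: {avg_access_ns(miss):.0f} ns average")
```

Under these assumptions, even a 0.1% miss rate roughly doubles the average access time, which dwarfs a 2-4x bandwidth gain from moving RAM closer to the SoC.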

            • sugar_in_your_tea@sh.itjust.works · 4 days ago

              And that’s the idea. Soldering memory is an engineering decision. How much to solder is a marketing decision. Since users can’t easily add more, marketing can upsell on more RAM.

              It’s not “on paper,” the RAM itself is performing better vs socketed RAM. Whether the system runs better depends on the configuration, as in, did you order enough RAM.

              • Echo Dot@feddit.uk · 4 days ago

                I can’t tell if you’re a stooge or if you really think that. I hope you are stooge, because otherwise that’s a really stupid position you’ve decided to take and you clearly don’t actually understand the issue.

                • sugar_in_your_tea@sh.itjust.works · 3 days ago

                  I’m pretty sure I do understand the issue. Here are some facts (and an article to back it up):

                  1. putting memory closer to the CPU improves performance thanks to lower latency and higher bandwidth - from 96GB/s up to 200GB/s (M1) or 400GB/s (M1 Max)
                  2. customers can’t easily solder on more RAM
                  3. Apple’s RAM upgrades are way more expensive than socketed options on the market

                  And here’s my interpretation/guesses:

                  1. marketing sees 1 & 2, and sees an opportunity to do more of 3
                  2. marketing probably asked engineering what the bare minimum is, and they probably said 8GB (assuming web browsing and whatnot only), though 16GB is preferable (that’s what I’d answer)
                  3. marketing sets the minimum @ 8GB, banking on most users who need more than the basics to buy more, or for users to buy another laptop sooner when they realize they ran out of RAM (getting after-sale RAM upgrades is expensive)

                  So:

                  • using soldered RAM is an engineering decision due to improved performance (double socketed RAM w/ Intel on M1, quadruple on M1 Max)
                  • limiting RAM to 8GB is a marketing decision
                  • if you don’t have enough RAM, that doesn’t mean the RAM isn’t performing well, it means you don’t have enough RAM

                  Using socketed RAM won’t fix performance issues related to running out of RAM, that issue is the same regardless. Only adding RAM will fix those performance issues, and Apple could just as easily make “special” RAM so you can’t buy socketed RAM on the regular market anyway (e.g. they’d need a different memory standard anyway due to Unified Memory).

                  I have hated Apple’s memory pricing for decades now, it has always been way more expensive to add RAM to an Apple device at order time vs PC competitors (I still add my own RAM to laptops, but it’s usually way cheaper through HP, Lenovo, etc than Apple at build-time). I’m not defending them here, I’m merely saying that the decision to use soldered RAM makes a lot of engineering sense, especially with the new Unified Memory architecture they’re using in the M-series devices.
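Taking the bandwidth figures quoted above at face value, the difference is easy to put on a clock (the 16GB working set is an arbitrary example size, not a benchmark):

```python
WORKING_SET_GB = 16   # arbitrary example size

def stream_ms(gb: float, bandwidth_gb_s: float) -> float:
    """Milliseconds to stream `gb` gigabytes at the given memory bandwidth."""
    return gb / bandwidth_gb_s * 1000

for label, bw in (("socketed, 96 GB/s", 96),
                  ("M1, 200 GB/s", 200),
                  ("M1 Max, 400 GB/s", 400)):
    print(f"{label}: {stream_ms(WORKING_SET_GB, bw):.0f} ms per full pass")
```

None of which helps if the working set doesn’t fit in RAM in the first place, which is the dispute above.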

      • Zink@programming.dev · 5 days ago

        Sounds like one of those rare cases where engineering and marketing might agree on something.

      • Dojan@lemmy.world · 5 days ago

        Companies primarily make decisions to maximise profitability for someone, and that someone is never the consumer.

    • stoly@lemmy.world · 5 days ago

      Mac Mini is meant to be sort of the starter desktop. For higher end uses, they want you on the Mac Studio, an iMac, or a Mac Pro.

  • Jtee@lemmy.world · 6 days ago

    And now all the fan boys and girls will go out and buy another MacBook. That’s planned obsolescence for ya

    • bamboo@lemm.ee · 6 days ago

      Someone who is buying a MacBook with the minimum specs probably isn’t the same person that’s going to run out and buy another one to get one specific feature in Xcode. Not trying to defend Apple here, but if you were a developer who would care about this, you probably would have paid for the upgrade when you bought it in the first place (or couldn’t afford it then or now).

      • TheGrandNagus@lemmy.world · 6 days ago

        Well no, not this specific scenario, because of course devs will generally buy machines with more RAM.

        But there are definitely people who will buy an 8GB Apple laptop, run into performance issues, then think “oh I must need to buy a new MacBook”.

        If Apple didn’t purposely manufacture ewaste-tier 8GB laptops, that would be minimised.

        • narc0tic_bird@lemm.ee · 6 days ago

          I wouldn’t be so sure. I feel like many people would not buy another MacBook if it were to feel a lot slower after just a few years.

          This feels like short term gains vs. long term reputation.

    • m-p{3}@lemmy.ca · 6 days ago

      And that’s why they solder the RAM, or even worse, make it part of the SoC.

      • rockSlayer@lemmy.world · 6 days ago

        There are real world performance benefits to ram being as close as possible to the CPU, so it’s not entirely without merit. But that’s what CAMM modules are for.

        • akilou@sh.itjust.works · 6 days ago

          But do those benefits outweigh doubling or tripling the amount of RAM by simply inserting another stick that you can buy for dozens of dollars?

          • BorgDrone@lemmy.one · 6 days ago

            Yes, there are massive advantages. It’s basically what makes unified memory possible on modern Macs. Especially with all the interest in AI nowadays, you really don’t want a machine with a discrete GPU/VRAM, a discrete NPU, etc.

            Take for example a modern high-end PC with an RTX 4090. Those only have 24GB VRAM and that VRAM is only accessible through the (relatively slow) PCIe bus. AI models can get really big, and 24GB can be too little for the bigger models. You can spec an M2 Ultra with 192GB RAM and almost all of it is accessible by the GPU directly. Even better, the GPU can access that without any need for copying data back and forth over the PCIe bus, so literally 0 overhead.

            The advantages of this multiply when you have more dedicated silicon. For example: if you have an NPU, that can use the same memory pool and access the same shared data as the CPU and GPU with no overhead. The M series also have dedicated video encoder/decoder hardware, which again can access the unified memory with zero overhead.

            For example: you could have an application that replaces the background on a video using AI. It takes a video and decompresses it using the video decoder; the decompressed frames are immediately available to all other components. The GPU can then pre-process the frames, the NPU can use the processed frames as input to an AI model to generate a new frame, and the video encoder can immediately access that result and compress it into a new video file.

            The overhead of just copying data for such an operation on a system with non-unified memory would be huge. That’s why I think that the AI revolution is going to be one of the driving factors in killing systems with non-unified memory architectures, at least for end-user devices.
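A rough sense of that copy overhead (PCIe 4.0 x16 tops out around 32GB/s in theory; the 100GB model size is an illustrative assumption, chosen to fit the 192GB unified-memory figure above):

```python
PCIE4_X16_GB_S = 32   # theoretical peak; real-world throughput is lower
MODEL_GB = 100        # assumed model that fits in 192GB unified memory

def copy_seconds(gb: float, gb_per_s: float) -> float:
    """Time for one trip across the bus; unified memory skips this entirely."""
    return gb / gb_per_s

print(f"one PCIe transfer of the weights: ~{copy_seconds(MODEL_GB, PCIE4_X16_GB_S):.1f} s")
print("unified memory: no transfer, the GPU reads the pages the CPU wrote")
```

Multiply that by every hop between CPU, GPU, NPU, and codec hardware in the pipeline described above, and the zero-copy argument becomes concrete.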

          • FarraigePlaisteach@lemmy.world · 5 days ago

            And even if the out-of-the-box RAM is soldered to the machine, it should still be possible to add supplementary RAM that isn’t soldered for when the system demands it. Other computers have worked like this in the past, with onboard RAM plus a socket to add more.

          • gravitas_deficiency@sh.itjust.works · 6 days ago

            It’s highly dependent on the application.

            For instance, I could absolutely see having certain models with LPCAMM expandability as a great move for Apple, particularly in the pro segment, so they’re not capped by whatever they can cram into their monolithic SoCs. But for most consumer (that is, non-engineer/non-developer users) applications, I don’t see them making it expandable.

            Or more succinctly: they should absolutely put LPCAMM in the next generation of MBPs, in my opinion.

          • rockSlayer@lemmy.world · 6 days ago

            That’s extremely dependent on the use case, but in my opinion, generally no. However CAMM has been released as an official JEDEC interface and does a good job at being a middle ground between repairability and speed.

            • halcyoncmdr@lemmy.world · 6 days ago

              It’s an officially recognized spec, so Apple will ignore it as long as they can, until they find a way to make money from it or spin marketing as if it’s some miraculous new invention of theirs, for something that should just be how it’s done.

        • TheGrandNagus@lemmy.world · 6 days ago

          Apple’s SoC long predates CAMM.

          Dell first showed off CAMM in 2022, and it only became JEDEC standardised in December 2023.

          That said, if Dell can create a really good memory standard and get JEDEC to make it an industry standard, so can Apple. They just chose not to.

      • Balder@lemmy.world · 6 days ago

        In this particular case the RAM is part of the chip as an attempt to squeeze out more performance. Nowadays processors have become extremely fast, but that’s wasted if the rest of the components can’t keep up. The traditional memory architecture has become a bottleneck, the same way HDDs were before the introduction of SSDs.

        You’ll see this same trend extend to Windows laptops as they shift to Snapdragon processors too.

        • umami_wasabi@lemmy.ml · 6 days ago

          Well, the claims they made still hold true, despite how much I dislike this design choice. It is faster, and more secure (though attacks on NAND chips are hard and require a skill level most attackers won’t possess).

          And add one more: it saves power by using LPDDR5 rather than DDR5. For a laptop, where battery life matters a lot, I agree that’s important. However, I have no idea how much standby or active time is gained by using LPDDR5.

    • Mongostein@lemmy.ca · 6 days ago

      And the apple haters will keep making this exact same comment on every post using their 3rd laptop in ten years while I’m still using my 2014 MacBook daily with no issues.

      Be more original.

      • Jtee@lemmy.world · 5 days ago

        Nice attempt to justify planned obsolescence. You’d have to be a fool to think Apple hasn’t done this time and time again.

        • Mongostein@lemmy.ca · 5 days ago

          👍

          -posted from my ten year old MacBook which shows no need for replacement

          • Honytawk@lemmy.zip · 5 days ago

            At which point did Apple decide your MacBook was too old to be usable and stop giving updates or allow new software to run on it?

            • Mongostein@lemmy.ca · 5 days ago

              Still gets security updates. All the software I need to run on it runs on it.

              My email, desktop, and calendar all still sync with my newer desktop. I can still play StarCraft. I can join zoom meetings while running Roll 20. I can even run Premiere and do video editing… to a point.

              I guess if you need the latest and greatest then you might have a point, but I don’t.

              This whole thread is bitching about software bloat and Apple does that to stop the software bloat on older machines, but noooo that’s planned obsolescence. 🙄

          • Jtee@lemmy.world · 5 days ago

            And it is, what, 3 or 4 operating systems behind due to being obsolete?

      • helenslunch@feddit.nl · 5 days ago

        They will keep making the same comment as long as it keeps being true.

        • Typed from my 2009 ThinkPad

        Meanwhile your 2014 MacBook stopped receiving OS updates 3 years ago.

      • Honytawk@lemmy.zip · 5 days ago

        I still have a fully functioning Windows 95 machine.

        My daily driver desktop is also from around 2014.

      • stoly@lemmy.world · 5 days ago

        This is pretty much it. For the past 2-3 years, people have really just wanted to find reasons to hate Apple. You’re right, though: your Mac can easily run for 10+ years. You’re good basically until web browsers no longer support your OS version, which is more in the 12-15 year range.

        • theneverfox@pawb.social · 5 days ago

          In fairness, most computers built after around 2014-2016 last way longer; performance started to level off not long after that. After all, devs write software for what people have: if everyone had 128 gigs of RAM, we’d load everything we could think of into memory and you’d need it to keep up.

          Macs did have some incredible build quality, though; the newer ones aren’t holding up even close to as well. I’m still using a couple of 2012 Macs to play videos. They’re slow as hell when you interact, but once the video is playing it still looks and sounds good.

    • Lucidlethargy@sh.itjust.works · 6 days ago

      These were obsolete the minute they were made, though… So it’s not really planned obsolescence. I got one for free (MacBook Air), and it’s always been trash.

      • bamboo@lemm.ee · 5 days ago

        I have an M2 MBA and it’s the best laptop I’ve ever owned or used, second to the M3 Max MBP I get to use for work. Silent, battery lasts all week, interface is fast and runs all my dev tools like a charm. Zero issues with the device.

  • Hux@lemmy.ml · 6 days ago

    This isn’t a big deal.

    If you’re developing in Xcode, you did not buy an 8GB Mac in the last 10 years.

    If you are just using your Mac for Facebook and email, I don’t think you know what RAM is.

    If you know what RAM is, and you bought an 8GB Mac in the last 10 years, then you are likely self-aware of your limited demands and/or made an informed compromise.

    • filister@lemmy.world · 6 days ago

      If you know what RAM is, and you bought an 8GB Mac in the last 10 years, then you are likely self-aware of your limited demands and/or made an informed compromise.

      Or you simply refuse to pay $200+ to get a proper machine. Seriously, 8GB Macs should have disappeared long ago, but nope, Apple sticks with them as part of the planned-obsolescence tactics on their hardware, stubbornly refusing to admit that releasing a MacBook with a soldered 8GB of RAM in 2023 is wholly inadequate.

    • DJDarren@thelemmy.club · 6 days ago

      I’m not gonna stand up and declare that 8GB is absolutely fine, because in very short order it won’t be. But currently, for an average use case, it is.

      My work Mac mini has 8GB. It’s a 2014, so it can’t be upgraded, but for the tasks I ask of it, it’s OK. Sure, it gets sluggish if I’m using the Win11 VM I sometimes need, but generally I don’t have any issues doing regular office tasks.

      That said, I sometimes get a bee in my bonnet about it, so I open Activity Monitor to see what it’s doing, and am shocked by how much RAM some websites consume in open tabs in Safari.

      8GB is generally OK on low-end gear, but devs are working very hard to ensure that it’s not.

    • stoly@lemmy.world · 5 days ago

      Funny: knowing that you only get one shot, I bought 32GB of RAM for my Mac Mini like 1.5 years ago. I figured that it gave me the best shot of keeping it usable past 5 years.

    • woelkchen@lemmy.world · 6 days ago

      It ships with Windows S, Microsoft’s version of a Chromebook, for some light web browsing at 188 dollars. I wouldn’t buy it, but it doesn’t look like a rip-off at this price point.

        • n0clue@lemmy.world · 6 days ago

          And if they raised the price to $250, they could go with a faster processor and better wifi!

      • purplemonkeymad@programming.dev · 6 days ago

        S mode does allow you to turn it off, so it’s more like a hobbled version of Home.

        The computer is as bad as one I saw several years ago with 64GB of eMMC storage and a “Quad core processor”: not a description, that was literally the name shown in the system info. It did have 4 cores, at 400MHz, boosting to 1.1GHz. The buyer changed their mind and we couldn’t give it away.

        • woelkchen@lemmy.world · 6 days ago

          Of course that notebook is bad, but at the price point of shitty hardware, you get shitty hardware. Apple sells shitty hardware at the price of premium hardware.

    • homura1650@lemm.ee · 5 days ago

      At a $188 price point. An additional 4GB of memory would probably add ~$10 to the cost, which is over a 5% increase. However, that is not the only component they cheaped out on. The linked unit also only has 64GB of storage, which they should probably increase to have a usable system …
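The arithmetic behind that ~5% (the $10 RAM figure is the estimate above; the $8 for extra storage is a further assumption for illustration):

```python
BASE_PRICE_USD = 188
UPGRADES_USD = {"extra 4GB RAM": 10, "extra 64GB storage": 8}  # assumed BOM costs

ram_share = UPGRADES_USD["extra 4GB RAM"] / BASE_PRICE_USD
total = BASE_PRICE_USD + sum(UPGRADES_USD.values())
print(f"RAM alone: +{ram_share:.1%} of the sticker price")
print(f"with both upgrades: ${total}, {(total - BASE_PRICE_USD) / BASE_PRICE_USD:.1%} over base")
```

Each “cheap” fix nudges the unit out of its price bracket, which is the point being made.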

      And soon you find that you just reinvented a mid-market device instead of the low-market device you were trying to sell.

      4GB of RAM is still plenty to have a functioning computer. It will not be as capable as a more powerful computer, but that comes with the territory of buying the low-cost version of a product.

      • uis@lemm.ee · 5 days ago

        If they wanted it to be as cheap as possible, they could have installed Linux on it.

      • AA5B@lemmy.world · 5 days ago

        At that point you gotta wonder if it can keep up with an $80 Raspberry Pi, especially if HP tries to shoehorn Windows into that

        • homura1650@lemm.ee · 4 days ago

          In addition to the raw compute power, the HP laptop comes with:

          • monitor
          • keyboard/trackpad
          • charger
          • Windows 11
          • active cooling system
          • enclosure

          I’ve been looking for a lapdock [0], and the absolute low-end of the market goes for over $200, which is already more expensive than the hp laptop despite spending no money on any actual compute components.

          Granted, this is because lapdocks are a fairly niche product that are almost always either a luxury purchase (individual users) or a rounding error (datacenter users)

          [0] Keyboard/monitor combo in a laptop form factor, but without a built in computer. It is intended to be used as an interface to an external computer (typically a smartphone or rackmounted server).

    • 31337@sh.itjust.works · 5 days ago

      I was looking at notebooks at Walmart the other day, and I was amazed that almost all of them had the same amount of RAM as my phone, or less.

      • homura1650@lemm.ee · 5 days ago

        Miniaturization is amazing. The limiting factor in how powerful we can make phones is not the space to put in computational units (processors, RAM, etc.). It is the ability to deal with the heat they generate (and the related issue of rationing a limited amount of battery power).

    • vingetcxly@thelemmy.club
      link
      fedilink
      English
      arrow-up
      2
      arrow-down
      1
      ·
      5 days ago

      4GB is acceptable. Some people just want a phone with a keyboard and a bigger screen. Depends on the use case tho.

    • Fishbone@lemmy.world
      link
      fedilink
      English
      arrow-up
      1
      arrow-down
      1
      ·
      5 days ago

      This is my biggest lament about getting a 2060 without knowing how important VRAM is. I can make it perform better and more efficiently in a bunch of different ways, but to my knowledge, I can’t get around the 6GB VRAM wall.

  • uis@lemm.ee
    link
    fedilink
    English
    arrow-up
    9
    arrow-down
    1
    ·
    5 days ago

    To be fair there are only two reasons I hate it:

    1. People incorrectly use term UMA
    2. It’s crApple

    On Linux, if you don’t compile Rust or Firefox, 8GB is fine. 4GB is fine too.

      • Treczoks@lemmy.world
        link
        fedilink
        English
        arrow-up
        7
        ·
        5 days ago

        Yes. 2 kilobytes. Coincidentally, this is as big as the display’s internal buffer, so I cannot even keep a shadow copy of it in my RAM for the GUI.

              • Treczoks@lemmy.world
                link
                fedilink
                English
                arrow-up
                2
                ·
                4 days ago

                I never said anything about framebuffers. The 256x64 pixel display with 16 brightness levels probably has something comparable inside. I just tell it that I want to update a rectangle, and send it some data for that.

                • uis@lemm.ee
                  link
                  fedilink
                  English
                  arrow-up
                  1
                  ·
                  4 days ago

                  It should have.

                  Then, since you don’t store the contents of the entire screen in memory (which simple math says you can’t), I was partially wrong when I interpreted “shadow copy” as a backbuffer (depending on whether you count the buffer inside the display as a framebuffer).
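                  For what it’s worth, the “simple math” here is just pixels times bits per pixel (a quick sketch in Python):

                  ```python
                  # Full framebuffer size for the display described above:
                  # 256x64 pixels at 16 brightness levels = 4 bits per pixel
                  width, height, bits_per_pixel = 256, 64, 4
                  framebuffer_bytes = width * height * bits_per_pixel // 8
                  print(framebuffer_bytes)  # 8192 bytes (8 KB), four times the MCU's 2 KB of RAM
                  ```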

      • Treczoks@lemmy.world
        link
        fedilink
        English
        arrow-up
        2
        ·
        5 days ago

        No AVR, it’s a small LPC from NXP. Chosen for the price, of course, but I have to somehow squeeze the software in it. At this point, even 8k would make me happy…

        • uis@lemm.ee
          link
          fedilink
          English
          arrow-up
          2
          ·
          edit-2
          5 days ago

          NXP, fancy. I expected ST, AVR, nRF, WCH or some Chinese cheaptroller.

          Why them? Something to do with NFC?

  • Nicoleism101@lemm.ee
    link
    fedilink
    English
    arrow-up
    9
    arrow-down
    3
    ·
    edit-2
    5 days ago

    I have everything from Apple, but I know that 8GB is basically planned obsolescence in disguise.

    We pay serious extra cash for just a “notch” more refined experience. However, I had to try buying every possible thing from Apple at least once in my life to see if it is worth it, and basically only the M4 iPad Pro 13 is truly worth the money and irreplaceable for me.

    Everything else is nice for someone super lazy like me, but can easily be replaced with not much difference by cheaper shit

  • RecluseRamble@lemmy.dbzer0.com
    link
    fedilink
    English
    arrow-up
    33
    arrow-down
    16
    ·
    6 days ago

    I can’t believe there’s no Linux reference yet!

    Give your “8 gigs not enough” hardware to one of us and see it revived running faster than whatever you’re running now with your subpar OS.

    • mightyfoolish@lemmy.world
      link
      fedilink
      English
      arrow-up
      14
      arrow-down
      4
      ·
      6 days ago

      Software and AI development would be hard with 8GB of RAM on Linux. Have you seen the memes about AI adding to global climate change? Not even Linux can fix the issues with ChatGPT…

      • prole@sh.itjust.works
        cake
        link
        fedilink
        English
        arrow-up
        14
        arrow-down
        1
        ·
        edit-2
        6 days ago

        I don’t think anyone anywhere is claiming 8GB RAM is enough for software and AI development. Pretty sure we’re talking about consumer-grade hardware here. And low-end at that.

        • monnier@lemmy.ca
          link
          fedilink
          English
          arrow-up
          8
          ·
          6 days ago

          My main development machine has 8 GB, for what it’s worth. And most of the software in use nowadays was developed when 8GB was a lot of RAM.

          • abhibeckert@lemmy.world
            link
            fedilink
            English
            arrow-up
            3
            ·
            edit-2
            5 days ago

            This. My Mac has 16GB but I use half of it with a Linux virtual machine, since I use my Mac to write Linux (server) software.

            I don’t need to do that - I could totally run that software directly on my Mac, but I like having a dev environment where I can just delete it all and start over without affecting my main OS. I could totally work effectively with 8GB. I could also give the Linux VM less memory - all my production servers have way less than that. But I don’t need to, because 8GB for the host is more than enough.

            Obviously it depends what software you’re running, but editing text, compiling code, and browsing the web… it doesn’t use that much. And the AI code completion system I use needs terabytes of RAM. Hard to believe Apple’s one that runs locally will be anywhere near as good.

        • Kazumara@discuss.tchncs.de
          link
          fedilink
          English
          arrow-up
          3
          ·
          edit-2
          6 days ago

          The lede by OP here contains this:

          […] addition to Xcode 16 […] is a feature called Predictive Code Completion. Unfortunately, if you bought into Apple’s claim that 8GB of unified memory was enough for base-model Apple silicon Macs, you won’t be able to use it

          So either RecluseRamble meant that development with a feature like predictive code completion would work on 8 GB of RAM if you were using Linux or his comparison was shit.

          • RecluseRamble@lemmy.dbzer0.com
            link
            fedilink
            English
            arrow-up
            2
            arrow-down
            1
            ·
            6 days ago

            That’s absolutely what I’m saying. Apple is just holding back that feature for upselling (as always) and because it’s hardly possible to debloat macOS.

            • Kazumara@discuss.tchncs.de
              link
              fedilink
              English
              arrow-up
              1
              ·
              5 days ago

              Okay good, thanks for confirming. I remember Kate feeling very nice to use during my studies, more responsive than VS Code or Eclipse. But I also had 16 gigabytes of RAM, so I couldn’t be sure.

        • uis@lemm.ee
          link
          fedilink
          English
          arrow-up
          1
          ·
          edit-2
          5 days ago

          I don’t think anyone anywhere is claiming 8GB RAM is enough for software development

          I do. GCC doesn’t need much. Vim/emacs work fine with 128 MB of RAM. With 1 GB you can run KDE and QtCreator instead of vim.
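
          A quick way to check how little a lightweight process actually uses is to ask the OS for its peak resident set size; a rough sketch using Python’s resource module (Linux reports kibibytes, macOS reports bytes):

          ```python
          import resource
          import sys

          # Peak resident set size of the current process
          peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
          if sys.platform == "darwin":
              peak //= 1024  # macOS reports ru_maxrss in bytes, Linux in kibibytes
          print(f"peak RSS: {peak} KiB")  # a bare Python process typically sits well under 128 MB
          ```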

        • mightyfoolish@lemmy.world
          link
          fedilink
          English
          arrow-up
          1
          arrow-down
          1
          ·
          5 days ago

          Macbook Pros aren’t really consumer grade hardware. Nor are they priced like consumer grade hardware.

            • mightyfoolish@lemmy.world
              link
              fedilink
              English
              arrow-up
              1
              ·
              edit-2
              5 days ago

              That’s not true at all. The MacBook Air starts at $900. You can even find a used M1 Air for cheaper. Absolutely a steal compared to the budget thin laptops from Asus, Acer, etc., which start around $700. Once you go below $700 in the laptop market, corners are cut. Perhaps MediaTek WiFi chips are used, the laptop isn’t thin, the touchpad is awful, the screen colors are worse. Apple usually puts an iPad + keyboard in that market segment instead.

              Tl;dr: Apple products are more expensive than budget electronics but priced comparably to items that compete with them. However, electronics prices in the high-end tier are getting higher.

              • RecluseRamble@lemmy.dbzer0.com
                link
                fedilink
                English
                arrow-up
                1
                ·
                edit-2
                5 days ago

                So it’s more expensive than the competitors which also have real budget options at easily half the price but then “corners are cut”.

                You know, I won’t even argue about the quality of Apple products - they are top tier. But calling the pricing “a steal” is just dishonest.

                They have consistently been averaging at 150-200% the price of comparable hardware at least since the 90s. While there may be examples like yours where the gap is smaller, there are plenty of outrageous examples like the infamous monitor stand or some ridiculously priced chargers.

                • mightyfoolish@lemmy.world
                  link
                  fedilink
                  English
                  arrow-up
                  1
                  ·
                  edit-2
                  5 days ago

                  You are right about the accessories, horribly overpriced.

                  They have consistently been averaging at 150-200% the price of comparable hardware at least since the 90s

                  I used to fix laptops for a living. I worked at a place where we had used Apple products and stuff from other brands. Sure, you could buy a core i5 Toshiba laptop that had a similar Intel CPU (though Apple tended to use Intel chips with slightly more GPU performance) at a fraction of the price. The screen was garbage, the WiFi stalled, the touchpad was unusable, using the keyboard made the chassis flex, etc. The comparable products from Lenovo, Samsung, HP were similarly priced.

                  You can find some laptops with decent Intel or AMD chips for $600 these days. Usually they will be plastic or bricks, which is fine if you don’t mind that. People want thinner products, and that calls for a better design to (1) handle the heat or (2) buy a better-binned CPU that operates better at lower frequencies.

                  Not only that, but people were willing to buy the used MacBooks. Much better than the other brands, where the plastic and PCBs were sent for recycling MUCH more often. Better for the environment.

          • uis@lemm.ee
            link
            fedilink
            English
            arrow-up
            1
            ·
            5 days ago

            Macbook Pros aren’t really consumer grade hardware.

            They are even below that.

    • helenslunch@feddit.nl
      link
      fedilink
      English
      arrow-up
      2
      arrow-down
      1
      ·
      5 days ago

      Honestly I have no qualms with MacOS. Probably the best OS. Problem is you can’t run it on anything that is repairable or upgradeable, and in 7 years it won’t be supported any longer. If they would just sell me a $500 lifetime license for MacOS that I could install on a Framework laptop, I’d buy it in a heartbeat. But they know they make way more money by not making that option available.

    • el_abuelo@lemmy.ml
      link
      fedilink
      English
      arrow-up
      8
      arrow-down
      10
      ·
      6 days ago

      I’d love to see you run xcode 16 code completion on your superior OS. Send me a link once you’ve uploaded the vid.

      • Mojave@lemmy.world
        link
        fedilink
        English
        arrow-up
        13
        arrow-down
        2
        ·
        6 days ago

        Why limit it to proprietary software? Almost every linux distro can run Github Copilot X and Jetbrains, which both have had more time to be publicly used and tested and work better in my opinion.

        Send me a video link of Mac having direct access to containers without using a VM (which ruins the point of containers). THAT is directly related to my actual work, as opposed to needing a robot to code for me specifically using Apple’s AI

        • el_abuelo@lemmy.ml
          link
          fedilink
          English
          arrow-up
          2
          arrow-down
          5
          ·
          6 days ago

          Because that was what the article was about… I actually am a Linux user and fan; folks are just misreading the intentions of my post.

          I would genuinely love to see it, because I’m stuck on Mac hardware to do my job and I really hope one day they get crucified for their anticompetitive practices so I can freely choose the OS my business uses.

      • RedWeasel@lemmy.world
        link
        fedilink
        English
        arrow-up
        1
        arrow-down
        1
        ·
        6 days ago

        There is a project being worked on called Darling, but it isn’t ready yet. The developers are making progress though.

    • RedWeasel@lemmy.world
      link
      fedilink
      English
      arrow-up
      5
      ·
      6 days ago

      I actually bought a m1 mini for a linux low power server. I was getting tired of the Pi4 being so slow when I needed to compile something. Works real well, just need the Asahi team to get TB working. And for my server stuff, 8gb is plenty.

      • 🦄🦄🦄@feddit.de
        link
        fedilink
        English
        arrow-up
        1
        ·
        6 days ago

        You wouldn’t happen to run a jellyfin server on that mac mini would you? Currently looking to find something performant with small form factor and low power consumption.

        • abhibeckert@lemmy.world
          link
          fedilink
          English
          arrow-up
          2
          ·
          edit-2
          5 days ago

          Not sure about jellyfin, but I assume it uses ffmpeg? The M1 is fast enough that ffmpeg can re-encode raw video footage from a high end camera (talking file sizes in the 10s of gigabyte range) an order of magnitude faster than realtime.

          That would be about 20W. Apparently it uses 5W while idle — which is low compared to an Intel CPU but actually surprisingly high.

          Power consumption on my M1 laptop averages at about 2.5 watts with active use based on the battery size and how long it lasts on a charge and that includes the screen. Apple hasn’t optimised the Mac Mini for energy efficiency (though it is naturally pretty efficient).

          TLDR if you really want the most energy efficient Mac, get a secondhand M1 MacBook Air. Or even better, consider an iPhone with Linux in a virtual machine - https://getutm.app/ - though I’m not sure how optimised ffmpeg will be in that environment… the processor is certainly capable of encoding video quickly; it’s a camera, so it has to be able to encode video well.

          • uis@lemm.ee
            link
            fedilink
            English
            arrow-up
            1
            ·
            5 days ago

            The M1 is fast enough that ffmpeg can re-encode raw video footage from a high end camera (talking file sizes in the 10s of gigabyte range) an order of magnitude faster than realtime.

            Depending on the codec and settings, this might be super fast or super slow.

        • RedWeasel@lemmy.world
          link
          fedilink
          English
          arrow-up
          2
          ·
          edit-2
          6 days ago

          No I do not, but I don’t see any reason it shouldn’t work. I have PiHole, Apache, email, cups, mythtv and samba currently.

        • Telodzrum@lemmy.world
          link
          fedilink
          English
          arrow-up
          8
          ·
          6 days ago

          I’ve run Plex servers on Mac Minis (M1). Docker on MacOS runs well finally — the issues that were everywhere a couple of years ago are resolved.

          It ran very well on the hardware. The OP of this post is right, 8GB is not enough in 2024; however, I would also wager that the vast majority of commenters have not used macOS recently or regularly. It is actually very performant and has a memory scheduler that rivals that found on GNU/Linux. Apple’s users aren’t wrong when they talk about how much better the OS is than Windows at using memory.

      • RecluseRamble@lemmy.dbzer0.com
        link
        fedilink
        English
        arrow-up
        6
        arrow-down
        1
        ·
        edit-2
        5 days ago

        As I said: feel free to upgrade your MacBook just don’t throw the one with a “meager” 8 gigs away since it’s totally usable with a non-bloated system.

      • Rivalarrival@lemmy.today
        link
        fedilink
        English
        arrow-up
        5
        arrow-down
        1
        ·
        5 days ago

        Do you actually want to run an application that doesn’t exist on Linux?

      I use a virtual machine for the 2 or 3 times a year I need to use a couple of garbage Windows-only programs. Usually for configuring some arcane piece of proprietary hardware that people were getting rid of because it is incompatible with everything.

  • small44@lemmy.world
    link
    fedilink
    English
    arrow-up
    31
    arrow-down
    16
    ·
    edit-2
    6 days ago

    For who? My mother, who only uses Facebook, YouTube and Google, doesn’t need 8GB.

    • iopq@lemmy.world
      link
      fedilink
      English
      arrow-up
      8
      ·
      6 days ago

      I had a laptop with 8GB. Doing one of those things was fine, but when you opened another program it took forever to switch back to the browser.

      • Petter1@lemm.ee
        link
        fedilink
        English
        arrow-up
        1
        ·
        6 days ago

        And then you have to activate Linux app support for a thing she needs and cannot do with a Chromebook, and suddenly it is more complicated than macOS?

      • disguy_ovahea@lemmy.world
        link
        fedilink
        English
        arrow-up
        4
        arrow-down
        2
        ·
        edit-2
        6 days ago

        That all depends on how much work they want to put into troubleshooting it for her. I got my mom a Mac Mini when her PC needed to be replaced. It’s way less responsibility on my part. I mostly just answer the occasional how-to.

        • Glowstick@lemmy.world
          link
          fedilink
          English
          arrow-up
          4
          arrow-down
          2
          ·
          6 days ago

          Mac is easier than Windows, sure, but not easier than a chromebook. Nothing is simpler than a Chromebook. You can do much more with a Mac, but a chromebook is much easier.

    • ABCDE@lemmy.world
      cake
      link
      fedilink
      English
      arrow-up
      12
      arrow-down
      2
      ·
      6 days ago

      I don’t know what Xcode is so yeah, I haven’t been found wanting with my 8GB M2. Videos, downloading, web browsing, writing, chat applications, some photo editing, games (what I can actually play on a Mac, anyway), all good here.

      16GB+ is obviously going to be necessary though, and not exactly that expensive to put into their base models, so it should be added soon.