Panther Lake and Nova Lake laptops will return to traditional RAM sticks

  • randomaside@lemmy.dbzer0.com · 42 points · 20 days ago

    I don’t think Lunar Lake was a “mistake” so much as a reaction. Intel couldn’t make a competitive laptop chip to go up against Apple and Qualcomm. (There is a very weird love triangle between the three of them /s.) Intel had to go to TSMC to get a chip to market that satisfied the AI Copilot+ PC market boom (or bust). Intel doesn’t have the ability to make a competitive chip in that space (yet), so they had to produce Lunar Lake as a one-off.

    Intel is very used to just handing people chips and forcing them to conform their software to the available hardware. We’re finally in an era where the software defines what the CPU needs to be able to do, which is probably why Intel struggles. Their old market-dominant strategy doesn’t work in the CPU market anymore, and they’ve found themselves on the back foot. Meanwhile, new devices where the hardware and software are deeply integrated in design keep coming out, while Intel is still swinging for the “here’s our chip, figure it out yourselves” crowd.

    In contrast to their desktop offerings, Intel’s server offerings show that they get it: they want to give you the right chips for the right job with the right accelerators.

    He’s not wrong that GPUs in the desktop space are going away, because SoCs are inevitably the future. This isn’t because the market has demanded it or some sort of conspiracy; we literally can’t get faster without chips getting smaller and closer together.

    Even though I’m burnt out on Nvidia and the last two CPUs and GPUs I’ve bought have all been AMD, I’m excited to see what Nvidia and MediaTek do next, as this SoC future has some really interesting upsides. Projects like the Asahi Linux Proton work and Apple’s GPTK2 have shown me the SoC future is actually right around the corner.

    Turns out, the end of the x86 era is a good thing?

    • schizo@forum.uncomfortable.business · 15 points · 19 days ago

      “contrast to their desktop offerings”

      That’s because server offerings are real money, which is why Intel isn’t fucking those up.

      AMD is in the same boat: they make pennies on client and gaming (including GPU), but dumptrucks of cash from selling Epycs.

      IMO, the Zen 5(%) and Arrow Lake bad-for-gaming results are because uarch development from Intel and AMD is entirely focused on the customers that pay them: datacenter and enterprise.

      Both of those CPU families clearly show that efficiency and a focus on heavily threaded workloads were the priorities, and what do you know, those are enterprise workloads!

      “end of the x86 era”

      I think it’s less that the x86 era has ended and more that the era of the x86 duopoly putting consumer/gaming workloads first has ended, because, well, there’s just no money there relative to the other things they could invest their time and design resources in.

      I also expect this to happen with GPUs: AMD has already given up, and Intel is absolutely going to do the same as soon as they possibly can without it being a catastrophic self-inflicted wound (since they still want an iGPU to use). nVidia has also clearly stopped giving a shit about gaming: gamers get a GPU a year or two after enterprise has cards based on the same chip, and now they charge $2000* for them. And those cards are often crippled in firmware/software so that they won’t compete with the enterprise cards, with driver licensing that legally bars using them in that kind of deployment anyway.

      ARM is probably the consumer future, but we’ll see who gets there and with what: I desperately hope that nVidia and MediaTek end up competitive so we don’t end up in a Qualcomm oops-your-cpu-is-two-years-old-no-more-support-for-you hellscape. But, well, nVidia has made ARM SoCs for, like, decades, and at no point would I call any of the ones they’ve shipped high-performance desktop replacements.

      * Yes, I know there’s a down-stack option that shows up later, but that’s also kind of the point: the ones you can afford will show up for you… eventually. It’s very much designed to push purchasers toward the top end.
    • Trainguyrom@reddthat.com · 12 points · 19 days ago

      “He’s not wrong that GPUs in the desktop space are going away, because SoCs are inevitably the future. This isn’t because the market has demanded it or some sort of conspiracy; we literally can’t get faster without chips getting smaller and closer together.”

      While I agree with you on a technical level, I read it as Pat Gelsinger intending to stop development of discrete graphics cards after Battlemage, which is disappointing but not surprising. Intel’s GPUs, while incredibly impressive, face an uphill battle with desktop users, and gamers in particular, in ensuring every game a user wants to run works without compatibility problems.

      Ideally Intel would keep their GPU department going, because they have a fighting chance at capturing significant market share now that they’re past the hardest hurdles, but they’re in a hard spot financially, so I won’t be surprised if they’re forced to divest from discrete GPUs entirely.

      • randomaside@lemmy.dbzer0.com · 8 points · 19 days ago

        I would like to see further development but I always had a sneaking suspicion that its life was limited due to the fact that ARC does not come from Intel’s fabs either. Like lunar lake, Arc is also made at TSMC.