I’m posting this as more of a “fun thought” than anything else.

It’s generally considered a fact that Linux, along with many other open-source software projects, is more efficient than its proprietary, closed-source counterparts, specifically in terms of the code it executes.

There are numerous reasons for this, but a large contributing factor is that open-source development, generally speaking, incentivises developers to write better code.

Currently, it can be argued that Linux is often less power-efficient than its closed-source counterparts, such as Windows and macOS. However, the reason for this lies not in the operating system itself, but in the lack of certain built-in hardware support for Linux. Yes, it’s possible to make Linux more power-efficient by configuring things differently or optimizing certain features of the operating system, but it’s not entirely uncommon to see posts from newer Linux laptop users reporting decreased battery life for exactly these reasons.
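As one small, purely illustrative example of the kind of tweaking I mean (assuming a machine that exposes cpufreq in sysfs), here’s a quick Python check of which frequency-scaling governor each CPU core is using; switching governors is one of the knobs that affects battery life:

```python
import glob

# Print the active frequency-scaling governor for each CPU core.
# "powersave" vs "performance" (or "schedutil" on some drivers) is one of
# the knobs that affects how much power a laptop draws at idle.
for path in sorted(glob.glob("/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_governor")):
    core = path.split("/")[5]  # e.g. "cpu0"
    with open(path) as f:
        print(core, f.read().strip())
```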

Taking a step back from this, though, and looking at a hypothetical world where Linux, or possibly other open-source operating systems and software, held the majority market share globally, I find it to be an interesting thought: how much more power-efficient would the world be as a whole?

Of course, computing does not account for the majority of electricity and energy consumption, and I’m not claiming that we’d see radical power-usage changes across the world; I’m talking specifically about computing. If hardware were built for Linux, and computers came pre-installed with optimizations and fixes targeted at their specific hardware, how much energy would we save each year?

Nanny Cath watching her YouTube videos, or Jonny scrolling through his Instagram feed, would be doing so in a much more energy-efficient manner.

I suppose I’m not really arguing much, just posting as an interesting thought.

  • Fisch@discuss.tchncs.de · 5 months ago

    Couldn’t you just compare the energy usage of laptops or desktop PCs with native support running Linux versus running Windows on them? I have a PC with an AMD GPU and CPU, so my hardware is fully supported; I could actually test it. I think a laptop would be better to test on though, since a desktop PC might not be trying to use as little power as possible in the first place.
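    A rough way to get a number for that on a laptop would be to sample the battery’s reported discharge rate while a fixed workload runs, then repeat under each OS (the Windows side would need equivalent tooling). A minimal Python sketch, assuming the battery exposes power_now in sysfs, which isn’t universal (some models only expose current_now and voltage_now):

    ```python
    import time

    BAT = "/sys/class/power_supply/BAT0/power_now"  # microwatts on most laptops

    def average_discharge_watts(samples=60, interval=1.0):
        """Sample the battery discharge rate while a fixed workload runs."""
        readings = []
        for _ in range(samples):
            with open(BAT) as f:
                readings.append(int(f.read().strip()) / 1_000_000)  # µW -> W
            time.sleep(interval)
        return sum(readings) / len(readings)

    if __name__ == "__main__":
        print(f"average draw: {average_discharge_watts():.2f} W")
    ```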

    • DNAmaster10@lemmy.sdf.org (OP) · 5 months ago

      You probably could, but realistically there’s not enough data out there to do this.

      Still, I’ll mention that even with an AMD CPU and GPU, Linux often lacks support or configuration right off the bat, to massively varying degrees. The well-known example of this is Nvidia’s proprietary GPU drivers, which historically have been a massive issue and will probably continue to be for a while, even with Nvidia exposing more of the source code for its GPU drivers.

      The kind of support I’m referring to, though, extends beyond this in many ways. One thing I didn’t mention, for example, is software support for Linux. Many Linux ports fail to leverage the full potential of the platform, either because the developers don’t know how to or because they don’t care to. I recently read a Factorio dev blog relating to this issue. The developers described a very specific optimization that can be applied on Linux when saving games, which, in short, allows the save to be written concurrently while the game keeps running, improving performance. Using this feature requires programming specifically for Linux. While Proton offers incredible gaming support on Linux today, this sort of thing is not something Proton can magically make work on its own.
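      Not Factorio’s actual code, but a minimal Python sketch of the general idea (fork the process, let the child write out a copy-on-write snapshot while the parent keeps running; the game_state structure and file name here are made up):

      ```python
      import json
      import os

      def save_game_nonblocking(game_state, path):
          """Fork; the child gets a copy-on-write snapshot of game_state and
          writes it to disk while the parent keeps simulating (Linux/POSIX only)."""
          pid = os.fork()
          if pid == 0:
              # Child: serialize the snapshot, then exit without running cleanup.
              with open(path, "w") as f:
                  json.dump(game_state, f)
              os._exit(0)
          return pid  # Parent: continue the game loop; reap the child with os.waitpid() later

      # Hypothetical usage inside the game loop:
      # save_game_nonblocking({"tick": 12345, "entities": []}, "autosave.json")
      ```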

      The same sort of idea often extends out into other areas of software and hardware. Applications which have been directly ported to Linux without much consideration often fail to implement these sorts of additional features and optimizations.

      The issue of hardware is, indeed, slightly different. One key thing which is often overlooked by people when assessing this sort of thing is the optimizations and tweaks applied by the hardware manufacturers and vendors themselves. These tweaks are often highly specific to the hardware they’re used for, and usually the vendors will only apply them to work with Windows, or the operating system which the laptop or computer ships with. Going back to the driver issue, the same thing applies. GPU manufacturers will often release high-quality drivers aimed specifically at Windows, offering optimizations which specifically benefit Windows. There’s almost zero incentive for these companies to release the same, or on-par drivers for Linux, due to its smaller market share.

      What this means is that a much larger amount of work needs to be done by the Linux community to create or improve drivers for specific hardware. Drivers that work off the bat with Windows often won’t work at all with Linux, and companies that do offer Linux versions of their drivers often invest significantly more time in their Windows counterparts. This is only complicated by the fact that many hardware manufacturers keep their driver source code highly secretive, so writing a new driver or altering an existing one for Linux is significantly more difficult.

      AMD, as you mentioned, is often much better than alternatives such as Nvidia when it comes to releasing these “secrets” or source code, which makes developing AMD drivers for Linux significantly easier, allowing driver developers to apply many more optimizations than they would otherwise be able to.

      In conclusion, then, the only way this can truly be fixed is if these companies choose to support Linux as much as they do Windows, which unfortunately won’t happen until there’s some sort of monetary incentive (i.e. Linux holding a majority market share).