• Brosplosion@lemm.ee · 4 days ago

      Lower voltage = higher current for a given power. I guess if you simultaneously reduce power, you’re probably okay.
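
      For a constant-power load, that’s just I = P / V. A quick sanity check (the 100 W figure is made up purely for illustration):

      ```python
      # Current drawn by a hypothetical fixed-power 100 W load at different
      # voltages: halving the voltage roughly doubles the current.
      power_w = 100.0
      for voltage_v in (12.0, 5.0, 3.3):
          current_a = power_w / voltage_v
          print(f"{voltage_v:5.1f} V -> {current_a:6.2f} A")
      ```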

      • Malfeasant@lemm.ee · 3 days ago

        That’s not how current works (most of the time… some loads, e.g. big motors, might do that, but not solid-state electronics).

      • ghterve@lemmy.world · 3 days ago

        I think you’re totally right for a load that needs a certain amount of power. But a CPU just needs to be able to flip transistor gates fast enough. Transistors don’t draw more current at lower voltage, so the lower the voltage, the lower the power. At some point, too low a voltage won’t let them flip fast enough for a given clock speed (or, eventually, flip at all).
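
        The usual first-order model for this is CMOS dynamic power, P ≈ C·V²·f: at a fixed clock, switching power falls with the square of the voltage. A minimal sketch (leakage ignored, ratios only, so the numbers are purely illustrative):

        ```python
        # First-order CMOS dynamic-power model: P scales with V^2 * f.
        # Leakage is ignored and only ratios are computed, so units don't matter.
        def relative_power(v_scale, f_scale=1.0):
            return v_scale ** 2 * f_scale

        # A 10% undervolt at the same clock cuts switching power by ~19%.
        print(f"{1 - relative_power(0.9):.0%} less power at 0.9x voltage")
        ```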

      • randombullet@programming.dev · 3 days ago

        No, undervolting reduces power consumption.

        I have an undervolt curve on my GPU and get about 2-3% better performance at 90% of the TDP.

        It’s because consumer GPUs try to max out their TDP at pretty much any cost, with no per-chip tuning. Undervolting is basically tailoring a power profile to your particular sample of the silicon lottery.
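
        A toy model of why that works, using the same P ≈ V²·f scaling as above (every number here is hypothetical, not from any real card):

        ```python
        # Toy model of an undervolt under a fixed TDP: with P ~ V^2 * f, a lower
        # stable voltage frees up power budget that the boost algorithm can
        # spend on extra clock speed. All figures are made up for illustration.
        tdp_w   = 300.0
        stock_v = 1.05   # volts (hypothetical stock operating point)
        stock_f = 1.80   # GHz (hypothetical)
        uv_v    = 1.00   # a stable undervolt, courtesy of the silicon lottery

        k = tdp_w / (stock_v**2 * stock_f)    # fit P = k * V^2 * f to stock
        same_clock_w = k * uv_v**2 * stock_f  # power at stock clock, undervolted
        boost_f = tdp_w / (k * uv_v**2)       # clock that fills the TDP again

        print(f"same clock: {same_clock_w:.0f} W ({same_clock_w / tdp_w:.0%} of TDP)")
        print(f"same TDP:   {boost_f:.2f} GHz ({boost_f / stock_f - 1:.1%} faster)")
        ```

        Real cards gain less than this toy predicts (hence the 2-3% above), since boost behavior, leakage, and thermals don’t follow a clean V²·f law, but the direction is the same.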