• earphone843@sh.itjust.works · 1 day ago

    It depends on the computer, but the power usage could easily be 250W+. While not a ton of power, it adds up quickly.

    But that’s only if you don’t have your computer set to sleep/hibernate.

    • cmnybo@discuss.tchncs.de · 1 day ago

      Idle power is not usually that high unless you are talking about a multi-socket server.
      A gaming PC is usually less than 100 W at idle, and an office PC is usually less than 25 W.

      • Eheran@lemmy.world · 22 hours ago

        Why waste 25 W when entering and leaving sleep mode is a matter of 5 keystrokes and 3 seconds?

      • earphone843@sh.itjust.works · 11 hours ago (edited)

        25 W still adds up. A general rule of thumb is to add a zero to the wattage to get the dollar cost of running it for a year. I don’t want to spend $250 a year letting my computer idle.

        I definitely misremembered things

        • TaTTe@lemmy.world · 12 hours ago

          That’s some hella expensive electricity you’re buying there. I’m getting mine at 14 cents/kWh, which works out to roughly €1.20 per watt per year. And that’s not even close to the cheapest option available.
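          The per-watt arithmetic here can be sketched in a few lines of Python (the 0.14 €/kWh rate and the 25 W idle draw are the figures from this thread; `annual_cost` is just an illustrative helper, not anything from the thread):

```python
# Illustrative helper: cost of a constant electrical load left running 24/7 for a year.
def annual_cost(watts: float, rate_per_kwh: float) -> float:
    hours_per_year = 24 * 365  # 8760 h
    return watts / 1000 * hours_per_year * rate_per_kwh

# 1 W for a year at 0.14 €/kWh: 8.76 kWh, about €1.23 -> roughly 1.2 €/W per year
print(round(annual_cost(1, 0.14), 2))   # → 1.23
# A PC idling at 25 W at the same rate costs about €30.66 a year
print(round(annual_cost(25, 0.14), 2))  # → 30.66
```

          Run the other way, the thread’s corrected figure of roughly $1 per watt per year corresponds to electricity at about 11–12 cents/kWh.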

          • ayyy@sh.itjust.works · 27 minutes ago

            Here in California our utility keeps burning down entire towns, so now we pay $0.60–$0.70 per kWh. It’s insane. They still don’t maintain infrastructure, they just pass on the cost of lawsuits.

          • earphone843@sh.itjust.works · 11 hours ago (edited)

            You know what, you’re right. Idk what the fuck I was thinking. I must have misremembered the math from the last time I did it.

            I swear I did the math like a year ago and it added up, but that’s clearly a false memory. It’s closer to $1 per watt per year. I downvoted my own comment.

            • TaTTe@lemmy.world · 11 hours ago

              It could’ve been closer to the truth in 2022. At least in Europe, when energy prices skyrocketed, I think I paid closer to 1 €/kWh.

        • rumschlumpel@feddit.org · 14 hours ago

          That calculation only makes sense if you never shut down your computer, not if you only leave it running when you accidentally hit “restart” and need to go right away.

          • earphone843@sh.itjust.works · 13 hours ago

            Lots of people leave their computers running 24/7, though. The TLC said the power draw would be small, so I just wanted to point out that what might look like a negligible amount of power can add up to more than you’d expect.

            • rumschlumpel@feddit.org · 11 hours ago

              That’s not really what’s being discussed here, though. There’s a big difference between doing it all the time and only doing it once in a blue moon.

    • tal@lemmy.today · 24 hours ago (edited)

      > but the power usage could easily be 250W+

      I mean, a beefy GPU could be ~400W, and a beefy CPU another ~200W. But that’s peak draw from those components, which are designed to drastically reduce power consumption if they aren’t actually under load. You don’t have to power down the components in sleep/hibernation to achieve that – they can already reduce runtime power themselves. One shouldn’t normally have software significantly loading those (especially after a reboot). If you’ve got something that is doing crunching in idle time to that degree, like, I don’t know, SETI@Home or something, then you probably don’t want it shut off.

      The reason fans can “spin up” on the CPU and the GPU when they’re under load is because they’re dissipating much more heat, which is because they’re drawing much more power.