• Kalcifer@sh.itjust.works · 2 months ago

    My PC is still largely the same, in general spirit, as when I built it (c. 2014-2015), but I’ve had to upgrade some key components over time. First was the move from a 1TB WD Blue HDD to a Samsung 860 Pro 128GB SSD for my OS drive, and, soon after, I moved my games drive from an HDD to an SSD as well. Next, I upgraded my GPU from an Nvidia GeForce GTX 760 to an Nvidia GeForce GTX 1080. That build state lasted a decently long time, until I switched from Windows to Linux and swapped my Nvidia GPU for an AMD Radeon RX 6600 (not exactly an upgrade, more of a side-grade) to improve the user experience. The most recent change (last year, iirc?) was upgrading my RAM from 8GB of DDR3 to 16GB. My CPU (Intel Core i5-4690K) is really starting to show its age, though, and I’ve been wanting to upgrade it, but that would entail a near rebuild of my entire system, so I’ve been putting it off. Unfortunately, it’s becoming more and more of an issue.

  • eletes@sh.itjust.works · 2 months ago

    Upgrading my Ryzen 7 1700 and GTX 1080 to a 5800X3D and RX 7900 XT this weekend. Still waiting on the CPU, but it’s cool to be able to go from the first to the last gen that this motherboard can support.

  • bitwolf@sh.itjust.works · 2 months ago

    They’re mad they spent $1k on a GPU and still can’t do 4K without upscaling on the newest crapware games.

  • padge@lemmy.zip · 2 months ago

    I’m the one person people go to for PC part advice, but I actually try to talk them down. Like, do you need more RAM because your experience is negatively impacted by not having enough, or do you just think you should have more?

    • JohnEdwa@sopuli.xyz · 2 months ago

      Ha, I had this exact conversation with a friend of mine a few days ago. He wants to upgrade from 16GB to 32GB, and when I asked why, he just blanked out for a while and went “…because more is better, right?”

      He spends most of his time playing RPG Maker porn games and Raid: Shadow Legends, really taxing that RTX 3070 he bought right in the middle of the pandemic.

  • ramble81@lemm.ee · 2 months ago

    I want to say I upgrade every 6 years. Get mid-to-upper specs and a mid-range video card and it’ll last you a long time.

  • Steve Dice@sh.itjust.works · 2 months ago

    I was with them until my girlfriend gifted me a 180Hz monitor last year, and now I can’t deal with less than 90 FPS, so I finally had to upgrade my RX 580 (I just found out it stopped getting driver updates in January 2024, so I guess it was about time). High refresh rates ruin you.

  • KillingTimeItself@lemmy.dbzer0.com · 2 months ago

    i just upgraded this year, to an r9 5900x, from my old r5 2600, still running a 1070 though.

    I do video editing and more generally CPU intensive stuff on the side, as well as a lot of multitasking, so it’s worth the money, in the long run at least.

    I also mostly play minecraft, and factorio, so.

    ryzen 5000 is a great upgrade path for those who don’t want to buy into am5 yet. Very affordable. 7000 is not worth the money, unless you get a good deal, same for 9000, though you could justify it with a new motherboard and ram.

    • ArxCyberwolf@lemmy.ca · 2 months ago

      I’m rocking a 5800X and see no reason to go to 7000 or 9000 anytime soon. It’s been great since I built the PC.

      • KillingTimeItself@lemmy.dbzer0.com · 1 month ago

        i would’ve bought a 5800x, but the prices were crazy, so i just bit the bullet and spent more on the 5900x, as it was better value and, admittedly, probably more useful to me going forward.

        5000 series was a flagship lineup for ryzen i think, just before AMD started really killing intel in performance, and before they started chasing performance so hard. It has great power efficiency and even better performance. It truly is the chip of the era, especially with the x3d series for people who want more cache.

        i imagine whatever comes after the 9000 series might be a more worthwhile upgrade for you, unless, like me, you like to wait for things to become more cost effective as they fall a few generations behind. That’s another great strategy. I also tend to find that anything less than 3 generations between CPU upgrades is close to the “this isn’t really worth it” line. 2 gens might be worth it, or might not.

    • bobs_monkey@lemm.ee · 2 months ago

      I recently repurposed a Xeon CPU/motherboard from 2012 to run my Proxmox server. Bought a rackmount case, Noctua fans, new RAM, and a CPU cooler, and gave it a good thorough cleaning. Not blazing fast, but it does the job.

  • corsicanguppy@lemmy.ca · 2 months ago

    I just bought a new machine!

    It’s a 2020 to replace my 2016 that I got in 2016.

    This one should do for a while.

  • Kitathalla@lemy.lol · 2 months ago

    I’ll do you two better: my computer’s from 2012. I can play even modern games on high settings sometimes. It wasn’t even a highly specced one at the time. I think I put about $1200 into the actual components AND the monitor/keyboard.

    • potustheplant@feddit.nl · 2 months ago

      Everyone’s different. Maybe for you playing a game on “high settings” in 1080p@30 is enough but others might prefer 4k@60 or 1440p@100 or more fps. Also, define “modern”.

  • GreatAlbatross@feddit.uk · 2 months ago

    I feel this.

    I went AM4 in 2017, when AMD made a leap forward at a reasonable price and with good efficiency.

    Then I added a 3060 when one became available.

    They’re both undervolted, and ticking along nicely.

    I don’t plan to change anything until probably 2027. Heck, I’m still catching up to 2020 in my games backlog.

    • Randelung@lemmy.world · 2 months ago

      I’m playing The Bureau: XCOM Declassified (2013) right now on a 6700K (2015). Why touch a running system? ¯\_(ツ)_/¯

      • account abandoned@lemmy.world · 2 months ago

        Undervolting (when done correctly) won’t damage PC parts.

        Yes, it reduces the voltage supplied to the components, but CPUs and GPUs are designed to operate within a specific voltage range, and undervolting keeps the voltage within that range. Even if you reduce the voltage below the recommended range, the system may become unstable, but this doesn’t cause damage – it simply results in crashes.

        • Brosplosion@lemm.ee · 2 months ago

          Lower voltage = higher current for a given power. I guess if you simultaneously reduce power, you’re probably okay.

          • Malfeasant@lemm.ee · 2 months ago

            That’s not how current works (most of the time… some loads, e.g. big motors, might do that, but not solid-state electronics).

          • ghterve@lemmy.world · 2 months ago

            I think you’re totally right for a load that needs a certain amount of power. But a CPU just needs to be able to flip transistor gates fast enough. It doesn’t draw more current at lower voltage, so the lower the voltage, the lower the power. At some point, too low a voltage won’t let the gates flip fast enough for a given clock speed (or, eventually, flip at all).
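
            This can be sketched numerically with the usual CMOS dynamic-power relation, P ≈ C·V²·f. The capacitance and clock numbers below are made-up illustrative values, not real chip specs:

```python
# Rough CMOS dynamic-power sketch: P ~ C * V^2 * f
# Capacitance and frequency are illustrative placeholders, not chip specs.
def dynamic_power(c_farads, v_volts, f_hz):
    return c_farads * v_volts ** 2 * f_hz

stock = dynamic_power(1e-9, 1.20, 4.0e9)      # stock voltage, fixed clock
undervolt = dynamic_power(1e-9, 1.05, 4.0e9)  # undervolted, same clock
print(f"power ratio: {undervolt / stock:.3f}")  # 0.766 -> ~23% less power
```

            At a fixed clock the current falls along with the voltage rather than rising, so power drops roughly with V².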

          • randombullet@programming.dev · 2 months ago

            No, undervolting reduces power consumption.

            I have an undervolt curve on my GPU and I get about 2-3% better performance at 90% of the TDP.

            It’s because consumer GPUs try to max out their TDP pretty much at any cost, with no per-chip refinement. Undervolting is pretty much tailoring a power profile to the silicon lottery.
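
            Taking those numbers at face value (and assuming 2.5% as the midpoint of the quoted 2-3% gain), the perf-per-watt win is bigger than it sounds:

```python
# Back-of-the-envelope perf/watt from the figures quoted above.
perf_gain = 1.025      # ~2.5% more performance (midpoint of 2-3%)
power_fraction = 0.90  # at 90% of the card's TDP
perf_per_watt = perf_gain / power_fraction
print(f"perf/watt improvement: {(perf_per_watt - 1) * 100:.1f}%")  # 13.9%
```

            So a 2-3% performance bump at 90% power works out to roughly 14% better efficiency.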