I got an additional 32 GB of RAM at a low, low cost from someone. What can I actually do with it?

  • zkfcfbzr@lemmy.world · 6 days ago

    I have 16 GB of RAM and recently tried running local LLMs. It turns out my RAM is a bigger limiting factor than my GPU.

    And, yeah, Docker's always taking up 3-4 GB.

      • zkfcfbzr@lemmy.world · 6 days ago

        Fair, I didn’t realize that. My GPU is a 1060 6 GB so I won’t be running any significant LLMs on it. This PC is pretty old at this point.

        • fubbernuckin@lemmy.dbzer0.com · 6 days ago

          You could potentially run some smaller MoE models, as they don't take up too much memory while running. I suspect the DeepSeek R1 8B distill with some quantization would work well.

          • zkfcfbzr@lemmy.world · 6 days ago

            I tried out the 8B DeepSeek distill and found it pretty underwhelming - the responses were borderline unrelated to the prompts at times. The smallest model I got respectable output from was the 12B, which I was even able to run at a somewhat usable speed.
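            For anyone who wants to try the same thing, here is a minimal sketch of how a quantized GGUF model can be split between a small GPU and system RAM with llama-cpp-python. The file name, layer count, and prompt are placeholders, not anything taken from this thread:

            ```python
            # Minimal sketch, assuming llama-cpp-python is installed and a quantized
            # GGUF model file has already been downloaded. The path and n_gpu_layers
            # value are placeholders to adjust for your own hardware.
            from llama_cpp import Llama

            llm = Llama(
                model_path="models/deepseek-r1-distill-8b-q4_k_m.gguf",  # hypothetical file
                n_ctx=4096,       # context window; larger values need more RAM
                n_gpu_layers=20,  # offload only as many layers as fit in 6 GB of VRAM
            )

            out = llm("Summarize why system RAM matters for partially offloaded models.",
                      max_tokens=256)
            print(out["choices"][0]["text"])
            ```

            Whatever doesn't fit in VRAM stays in system RAM, which is why RAM ends up being the limiting factor for larger models.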

  • eyeon@lemmy.world · 6 days ago

    I used to have a batch file that created a RAM disk and mirrored my Diablo 3 install to it (roughly what's sketched below). The game took a bit longer to start up, but map load times were significantly shorter.

    I don’t know if any modern games would fit and have enough loads to really care…but you could
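    For the curious, here's a rough sketch of that kind of setup on Windows, driven from Python instead of a batch file. It assumes the ImDisk virtual disk driver is installed, and the drive letter, size, and install path are placeholders:

    ```python
    # Rough sketch of a RAM-disk game mirror on Windows, assuming the ImDisk
    # virtual disk driver is installed. The drive letter, size, and source
    # path are placeholders to adjust for your own setup.
    import subprocess

    RAM_DRIVE = "R:"
    SOURCE = r"C:\Program Files (x86)\Diablo III"  # placeholder install path
    TARGET = RAM_DRIVE + r"\Diablo III"

    # Create and format an 8 GB RAM disk mounted at R:.
    subprocess.run(
        ["imdisk", "-a", "-s", "8G", "-m", RAM_DRIVE, "-p", "/fs:ntfs /q /y"],
        check=True,
    )

    # Mirror the game install onto the RAM disk; robocopy exit codes above 7
    # indicate a failure.
    result = subprocess.run(["robocopy", SOURCE, TARGET, "/MIR"])
    if result.returncode > 7:
        raise RuntimeError("robocopy reported a failure")

    print(f"Launch the game from {TARGET} so assets load from RAM.")
    ```

    The contents vanish on reboot, so this only makes sense for data you can cheaply re-copy, which is exactly why a game install is a good candidate.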

  • fuckwit_mcbumcrumble@lemmy.dbzer0.com · 6 days ago

    700 Chrome tabs, a very bloated IDE, an Android emulator, a VM, another Android emulator, a bunch of node.js processes (and their accompanying chrome processes)

  • Thorry84@feddit.nl · 6 days ago

    It depends a lot. If you go from 2 RAM slots in use to 4, the maximum stable clock speeds usually drop significantly, so performance decreases for just about everything you do, while the use cases for such a setup are fairly limited.

    I have a couple of extra RAM sticks to go from 32 to 64 GB when I need it. I bought them because I was debugging a rather memory-intensive tool. Not only did the tool run in debug mode, which added a lot of overhead, but the memory profiler also needed to take memory snapshots and analyze them (roughly the workflow sketched below). That just about doubled the memory requirement, so with 32 GB I often ran out of memory.

    However, my Ryzen 5950X does not like 4 sticks of RAM one bit. Timings need to be loosened, clocks need to be reduced, and even then the system would get unstable every now and again for no apparent reason. So I pulled the 2 extra sticks, going back to 32 GB, as soon as the debugging job was done. They're in a drawer in an anti-static bag should I need them again, but for day-to-day use, 32 GB on 2 sticks is a much better experience.
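    The profiler in question isn't named, but the snapshot-and-compare workflow it describes looks roughly like this with Python's built-in tracemalloc; keeping allocation traces and snapshots around is exactly the kind of overhead that inflates the footprint of the program being debugged:

    ```python
    # Illustration of snapshot-based memory profiling using Python's built-in
    # tracemalloc module. This is a generic example, not the commenter's tool.
    import tracemalloc

    tracemalloc.start()
    before = tracemalloc.take_snapshot()

    # Placeholder for the memory-hungry workload being investigated.
    data = [bytes(1024) for _ in range(100_000)]

    after = tracemalloc.take_snapshot()

    # Show the ten call sites whose allocations grew the most between snapshots.
    for stat in after.compare_to(before, "lineno")[:10]:
        print(stat)
    ```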

  • Phoenixz@lemmy.ca · 6 days ago

    I have 64 GB and am about to upgrade to 128 GB.

    I run Windows in a VM. Nothing heavy, just to test some things on shitty Windows systems.

    I run multiple databases for development: MySQL, PostgreSQL, Redis, MongoDB, memcached, all with extra memory available (roughly the setup sketched below).

    I run a large array of services, both directly and in Docker containers: the Transmission web UI, the *arr suite, Jellyfin, Nextcloud, Immich, OnlyOffice, various PHP apps, the list goes on.

    8 GB is the bare minimum if you only browse.
    16 GB is the bare minimum if you also run other apps.
    32 GB is a good amount to work with.
    64 GB is a requirement if you do development or have a lot of services.
    128 GB is a normal amount for a developer.
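    As an illustration of the databases-with-headroom point above, here is a minimal sketch using the Docker SDK for Python. The image tags, ports, password, and memory limits are arbitrary placeholders, not values from this comment:

    ```python
    # Minimal sketch: development databases with explicit memory headroom,
    # started through the Docker SDK for Python (pip install docker).
    # Image tags, ports, the password, and mem_limit values are placeholders.
    import docker

    client = docker.from_env()

    client.containers.run(
        "postgres:16",
        name="dev-postgres",
        detach=True,
        environment={"POSTGRES_PASSWORD": "devonly"},
        ports={"5432/tcp": 5432},
        mem_limit="4g",  # give the database room to cache aggressively
    )

    client.containers.run(
        "redis:7",
        name="dev-redis",
        detach=True,
        ports={"6379/tcp": 6379},
        mem_limit="1g",
    )
    ```

    Docker Compose would be the more usual way to express this, but the point is the same: each service gets a generous memory ceiling, which is where the extra RAM goes.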

  • SuperSpruce@lemmy.zip · 5 days ago

    Keep it and wait for the applications to bloat up. You won’t feel like you have an excessive amount of RAM in a few years.

  • some_guy@lemmy.sdf.org · 6 days ago

    The best thing about having a lot of RAM is that you can keep a ton of apps open with a ton of windows without closing them or slowing down. I have an unreasonable number of browser windows and tabs open because that's my equivalent of bookmarking something to come back and read later. It's similar to being the type of person for whom stuff accumulates on flat surfaces because you just set things down intending to deal with them later. My desk is similarly cluttered with books, bills, accessories, etc.

    • scarilog@lemmy.world · 5 days ago

      Yeah, this is exactly me. Also, a quick tip: if you're on Windows, there are some registry tweaks you can do to help keep the GUI from slowing down when lots of programs are open at once.

    • daggermoon@lemmy.world (OP) · 6 days ago

      I actually did. I deleted it as soon as I realized it wouldn’t tell me about the Tiananmen Square Massacre.

      • Dasus@lemmy.world · 6 days ago

        Oh, c’mon, I’m sure it told you all about how there’s nothing to tell. Insisted on that, most likely.

        • daggermoon@lemmy.world (OP) · 6 days ago

          Nah it said something along the lines of “I cannot answer that, I was created to be helpful and harmless”

          • Dasus@lemmy.world · 6 days ago

            Answer that with “your answer implies that you know the answer and can give it but are refusing to because you’re being censored by the perpetrators” or some such.

            I made Gemini admit that it lied to me, and thus that Google lied to me. I haven't tried DeepSeek.

      • Yerbouti@sh.itjust.works · 6 days ago

        But the local version is not supposed to be censored…? I’ve asked it questions about human rights in China and got a fully detailed answer, very critical of the government, something that I could not get on the web version. Are you sure you were running it locally?

        • kevincox@lemmy.ml · 6 days ago

          IIUC it isn’t censored per se. Not like the web service that will retract a “bad” response. But the training data is heavily biased. And there may be some explicit training towards refusing answers to those questions.

        • some_guy@lemmy.sdf.org · 6 days ago

          Nah, it's just fewer parameters. It's not as "smart" about censorship, or has less overhead to spend on censorship. This came up on Ed Zitron's podcast, Better Offline.