• Im_Him@lemmy.world · ↑1 · 14 days ago

      Started learning Flutter a week ago, hopefully it pays off and I get a good job :)

  • oktoberpaard@feddit.nl · ↑10 · 17 days ago

    The only time I can remember 16 GB not being sufficient for me is when I tried to run an LLM that required a tad more than 11 GB and I had just under 11 GB of memory available due to the other applications that were running.

    I guess my usage is relatively lightweight. A browser with a maximum of about 100 open tabs, a terminal, a couple of other applications (some of them electron based) and sometimes a VM that I allocate maybe 4 GB to or something. And the occasional Age of Empires II DE, which even runs fine on my other laptop from 2016 with 16 GB of RAM in it. I still ordered 32 GB so I can play around with local LLMs a bit more.

    • Todd Bonzalez@lemm.ee · ↑3 · 17 days ago

      Yeah, but if you’re interested in running an LLM faster than 1 token per minute, RAM won’t matter. You’ll need as much VRAM as you can get.

      • oktoberpaard@feddit.nl · ↑2 · 17 days ago

        Sure, but I’m just playing around with small quantized models on my laptop with integrated graphics, and the RAM was insanely cheap. I’m curious what LLMs that can run on such hardware are capable of. For example, Llama 3.2 3B only needs about 3.5 GB of RAM and runs at about 10 tokens per second, and while it’s in no way comparable to the LLMs that I use for my day-to-day tasks, it doesn’t seem to be that bad. Llama 3.1 8B runs at about half that speed, which is a bit slow, but still bearable. Anything bigger than that is too slow to be useful, but still interesting to try for comparison.

        I’ve got an old desktop with a pretty decent GPU in it with 24 GB of VRAM, but it’s collecting dust. It’s noisy and power hungry (older generation dual socket Intel Xeon) and still incapable of running large LLMs without additional GPUs. Even if it were capable, I wouldn’t want it to be turned on all the time due to the noise and heat in my home office, so I’ve not even tried running anything on it yet.
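
        As a rough rule of thumb (my own back-of-the-envelope estimate, not an official figure), you can guess the RAM a quantized model needs from its parameter count and quantization width:

```python
# Back-of-the-envelope RAM estimate for a quantized LLM: the weights take
# roughly params x bits / 8 bytes, plus some fixed overhead for the KV
# cache and runtime. The 0.5 GB overhead is an assumption.
def approx_ram_gb(params_billions, bits_per_weight, overhead_gb=0.5):
    weights_gb = params_billions * bits_per_weight / 8
    return weights_gb + overhead_gb

print(approx_ram_gb(3, 8))  # Llama 3.2 3B at 8-bit: ~3.5 GB
print(approx_ram_gb(8, 4))  # Llama 3.1 8B at 4-bit: ~4.5 GB
```

        The 8-bit figure for a 3B model lines up with the ~3.5 GB mentioned above; aggressive 4-bit quants roughly halve the weight footprint.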

  • _stranger_@lemmy.world · ↑6 ↓1 · 16 days ago

    I think someone needs to update that old “if programming languages were weapons” image. JavaScript is a cursed hammer.

  • MHanak@lemmy.world · ↑5 · 17 days ago

    I am running 8 gigs on my piece of shit laptop, and I just have to make sure not to open Minecraft and the browser at once; otherwise it’s absolutely fine.

    • loo@lemmy.world · ↑1 · 15 days ago

      16 GB is still enough for me? I currently play RDR2 on high settings in QHD on Arch Linux and I always have a minimum of 3 GB to spare.

    • TimeSquirrel@kbin.melroy.org · ↑18 · 17 days ago

      We used to say 4GB is enough. And before that, a couple hundred MB. I’m staying ahead from now on, so I threw in 64GB. That oughtta last me for another 3/4 of a decade. I’ve been doing the upgrade race for 30 years and I’m tired of it; I want to be set for a while.

      I can literally trace my current Ryzen PC’s lineage like the ship of Theseus to an Athlon system I built in 2002. A replacement GPU here. Replacement mobo there. CPU here, etc.

    • Ziglin@lemmy.world · ↑1 · 16 days ago

      I have 4GB on my Fedora i3 laptop and I am indeed able to open Signal Desktop, Discord and 2 Firefox windows.

            • ugjka@lemmy.worldOP · ↑2 · 17 days ago

              [ugjka@ugjka Music.Videos]$ free -h
                             total        used        free      shared  buff/cache   available
              Mem:            29Gi        17Gi       1,8Gi       529Mi        11Gi        11Gi
              Swap:           14Gi       2,0Gi        12Gi
              
              • UnityDevice@startrek.website · ↑1 · 17 days ago

                I was wondering if your tool was displaying cache as usage, but I guess not. Not sure what you have running that’s consuming that much.

                I mentioned this in another comment, but I’m currently running a simulation of a whole Proxmox cluster with nodes, storage servers, switches and even a Windows client machine active. I’m running all of that on GNOME with Firefox and Discord open, and this is my usage:

                $ free -h
                               total        used        free      shared  buff/cache   available
                Mem:            46Gi        16Gi       9.1Gi       168Mi        22Gi        30Gi
                Swap:          3.8Gi          0B       3.8Gi
                

                Of course Discord is running inside Firefox, so that helps, but still…
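
                For anyone puzzling over these numbers: `used` + `free` + `buff/cache` add up to roughly `total`, and `available` (free plus reclaimable cache) is the figure that actually tells you your headroom, not `free`. A quick sketch parsing the output above:

```python
# Parse the Mem: row of `free -h` output into a dict keyed by the column
# headers. The sample is the output quoted in the comment above.
sample = """\
               total        used        free      shared  buff/cache   available
Mem:            46Gi        16Gi       9.1Gi       168Mi        22Gi        30Gi
Swap:          3.8Gi          0B       3.8Gi
"""

def parse_free_mem(text):
    lines = text.strip().splitlines()
    headers = lines[0].split()
    values = lines[1].split()[1:]  # drop the "Mem:" row label
    return dict(zip(headers, values))

mem = parse_free_mem(sample)
print(mem["available"])  # 30Gi -- the real headroom, not "free"
```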

                • ugjka@lemmy.worldOP · ↑1 · 17 days ago

                  About 3 gigs goes to VRAM since I don’t have a dedicated card yet, but I’m getting a dedicated GPU with 16 gigs soon.

      • BlueBockser@programming.dev · ↑28 ↓1 · 17 days ago

        That doesn’t mean anything. If you have tons of free RAM, programs tend to use more than strictly necessary because it speeds things up. That doesn’t mean they won’t run perfectly fine with 8GiB as well.
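
        This is easy to demonstrate with a toy cache: the same access pattern causes far more “expensive” reloads when the cache is kept small than when it has room to spare (sizes and access pattern here are made up for illustration):

```python
from functools import lru_cache

def count_misses(cache_size, keys):
    """Replay an access pattern and count how often the cache misses."""
    misses = 0

    @lru_cache(maxsize=cache_size)
    def fetch(key):
        nonlocal misses
        misses += 1          # stand-in for an expensive disk/network load
        return key * 2

    for key in keys:
        fetch(key)
    return misses

keys = [i % 50 for i in range(1000)]  # cycle through 50 distinct items
print(count_misses(8, keys))    # tight cache: 1000 misses (every access)
print(count_misses(64, keys))   # roomy cache: 50 misses (first pass only)
```

        A cyclic pattern is the worst case for LRU, so the tight cache misses on every single access, while the roomy one only pays for the first pass. More memory spent on caching, fewer slow reloads.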

    • jaybone@lemmy.world · ↑4 · 17 days ago

      Nah fuck it, let’s just keep putting some more bullshit in our code because we majored in Philosophy and have no idea what complexity analysis or Big O is. The next gen hardware will take care of that for us.

    • IsoSpandy@lemm.ee · ↑7 · 17 days ago

      Man, I remember. I have 16GB, and running Windows I would run out of RAM so fast. Now on Linux, I feel like I am unable to push the usage beyond 8GB in my regular workflow. I also switched from VSCode to Neovim and from Chrome to Firefox, and now my RAM only sees usage peaks when I compile Rust.

    • Lucy :3@feddit.org · ↑39 ↓1 · 17 days ago

      Not in my experience. The Electron Spotify app + the Electron Discord app + games was too much. Replacing the Electron apps with dedicated FF instances worked tho.

      • acockworkorange@mander.xyz · ↑11 ↓2 · 17 days ago

        I have a browser tab addiction problem, and I often run both LibreWolf and Firefox at the same time (reasons). I run Discord all the time, plus Signal, a VTT, a game, YouTube playing… and I look at my RAM usage and wonder why I bought so much when I can never reach 16 GB.

        While I agree Electron apps suck and I avoid them… whatever you guys are running ain’t a typical use case.

      • pat277@sh.itjust.works · ↑4 · 17 days ago

        Even on Windows, it’s hilarious to compare the RAM Discord uses. I caught the native app using 2+ GB, and Firefox beating it by… 100 MB? I didn’t compare RAM usage too closely on my Steam Deck between the Flatpak and Firefox, but I expect Firefox to be a bit better with its addons/plugins, like it was on Windows.

      • UnityDevice@startrek.website · ↑16 · 17 days ago

        About 6 months ago I upgraded my desktop from 16 to 48 gigs cause there were a few times I felt like I needed a bigger tmpfs.
        Anyway, the other day I set up a simulation of this cluster I’m configuring and just kept piling up virtual machines without looking, because I knew I had all the RAM I could need for them. Eventually I got curious and checked my usage: I had only just reached 16 gigs.
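
        For reference, a big scratch tmpfs is a single fstab line (mount point and size here are made-up examples, pick your own):

```
# /etc/fstab -- RAM-backed scratch space, capped at 32G
tmpfs  /mnt/scratch  tmpfs  size=32G,mode=1777,noatime  0  0
```

        The nice part is that tmpfs only consumes RAM for what's actually stored in it, so a generous cap costs nothing until you use it.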

        I think basically the only time I use more than the 16 gigs I had is when I fire up my GPU-passthrough Windows VM that I use for games, which isn’t your typical usage.

  • prunerye@slrpnk.net · ↑110 · 17 days ago

    I guess RAM is a bell curve now.

    • 32GB: Enough.
    • 16GB: Not enough.
    • 8GB: Not enough.
    • 4GB: Believe it or not, enough.

    • AusatKeyboardPremi@lemmy.world · ↑8 · 16 days ago

      I have experienced this myself.

      My main machine at home - an M2 Pro MacBook with 32GB RAM - effortlessly runs whatever I throw at it and completes heavy tasks such as Xcode builds and local LLMs in reasonable time.

      My work-issued machine - an Intel MacBook Pro with 16GB RAM - struggles with Firefox and Slack. However, development takes place on a remote server via the terminal, so I don’t notice anything beyond the input latency.

      A secondary machine at home - an HP 15 laptop from 2013 with an A8 APU and 8GB RAM (4GB OOTB) - feels sluggish at times with Linux Mint, but suffices for the family’s occasional email checking and web browsing.

      A journaling and writing machine - a ThinkPad T43 from 2005 maxed out with 2GB RAM and Pentium M - runs Emacs snappily on FreeBSD.

      There are a few older machines with acceptable usability that don’t get taken out much, except for the infrequent bout of vintage gaming.

    • Artyom@lemm.ee · ↑50 ↓1 · 17 days ago

      I actually laughed out loud when Raspberry Pi came out with an 8GB version, because anyone who thinks 4GB isn’t enough probably won’t be happy with 8 either.

      • Valmond@lemmy.world · ↑6 · 16 days ago

        I wonder what the hell they are doing with it? I mean, I have the 3B with IIRC 1GB, and I can use the desktop and run Python scripts to fiddle with all the I/O ports and stuff. What do you do with a Raspberry Pi that needs eight times the RAM??

        I’m seriously curious!

        • boonhet@lemm.ee · ↑2 · 16 days ago

          At that point you’re running some sort of server on it probably.

          For which it’s not even the most cost-effective hardware, tbh. There are x86-based tiny PCs available used at good prices.

            • orangeboats@lemmy.world · ↑4 · edited · 16 days ago

              Last time I asked around about this question, the answer was surprisingly “probably not much”! When a low-power x86 chip (like those mobile chips) is idling (which is pretty much all the time if all you are doing is hosting a server on it) it consumes very little power, about the same level as an idling Pi. It is when the frequency ramps up that performance-per-watt gets noticeably worse on x86.

              Edit: My personal test showed that my x86 laptop fared slightly worse than my Pi 3 in idle power (~2 watts higher, it seems), but that laptop is oooooooold.
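
              To put a couple of watts in perspective, the running-cost difference is easy to estimate (the wattages and electricity price below are assumptions, not measurements):

```python
# Yearly electricity cost of an always-on box, from its average draw.
def yearly_cost_eur(avg_watts, eur_per_kwh=0.30):
    kwh_per_year = avg_watts * 24 * 365 / 1000
    return kwh_per_year * eur_per_kwh

print(round(yearly_cost_eur(3), 2))  # idling Pi at an assumed ~3 W
print(round(yearly_cost_eur(5), 2))  # idling x86 box at an assumed ~5 W
```

              At those assumed figures the gap works out to only a few euros per year, which is why the idle-power difference barely matters for a mostly-idle home server.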