- Windows Latest found that Discord and other Chromium- and Electron-based applications exhibit high RAM usage
- RAM usage on Discord spikes from 1 GB to 4 GB, both in and out of voice chat
The proliferation of Electron programs is what happens when you have a decade of annoying idiots saying “unused memory is wasted memory,” hand in hand with lazy developers and unscrupulous managers who externalize their development costs onto everybody else by writing inefficient programs that waste ever more of our compute and RAM. The rest of us then have to buy better hardware just to keep up.
annoying idiots saying “unused memory is wasted memory,”
The original intent of this saying was different, but yeah, it’s been co-opted into something else
So what was the original meaning, then? As I see it, the phrase is wrong no matter how you look at it, because all RAM is in use at all times. For example, if you have 32 GB of “free” RAM, the kernel will use all of it as a page cache to speed up the file system. The more free RAM you have, the more files can be cached, avoiding disk access when you read them.
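A quick way to see this for yourself (a minimal Linux-only sketch of mine; the field names come from /proc/meminfo): MemFree is usually tiny on a busy machine, while Cached is huge, and MemAvailable counts the cache because the kernel can reclaim it on demand.

    /* meminfo.c - print the fields that show "free" RAM doing page-cache duty */
    #include <stdio.h>
    #include <string.h>

    int main(void) {
        char line[256];
        FILE *f = fopen("/proc/meminfo", "r");
        if (!f) { perror("/proc/meminfo"); return 1; }
        while (fgets(line, sizeof line, f)) {
            /* MemFree: truly idle pages; Cached: page cache;
               MemAvailable: free + reclaimable, i.e. what apps can really get */
            if (!strncmp(line, "MemFree:", 8) ||
                !strncmp(line, "MemAvailable:", 13) ||
                !strncmp(line, "Cached:", 7))
                fputs(line, stdout);
        }
        fclose(f);
        return 0;
    }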
Seems like you understand the original meaning already.
Native apps are so much better, on every platform.
The JavaScript must flow….
Yeah, the RAM shortage is definitely to blame on Electron. Won’t someone please think of the poor AI companies who have to give an arm and a leg to get a single stick of RAM!
If you have a better way of generating videos of absurdly obese Olympic divers doing the bomb from a crane, I’d love to hear it.
Tbf, isn’t AI mainly used to code Electron apps by shitty companies?
I wouldn’t mind so much if they were giving their own arms and legs, but they seem to be giving ours.
Electron is a f…ing cancer for the desktop.
I really wish Electron wasn’t as popular as it is. It’s such a fucking memory hog. I mean, sure, I’ve got RAM to spare, but I shouldn’t need that much for a single app.
maybe a toggle to choose between “take some extra RAM, I’m feeling generous” and “fuck you, I’m computing shit over here” could be used to let the app know your current mood / needs …
Memory-hogging browsers usually do release memory under pressure. You can take it further with extensions that unload unused tabs.
The problem is Electron apps that load the whole browser core over and over.
Yes, it runs a separate browser instance for each Electron program. Many of the programs that use it could just be PWAs instead.
I tried the PWA route with Discord. It wouldn’t stay logged in and was generally janky. That said, I do go PWA with any app that’s Electron, to at least try to avoid the RAM bloat.
Or, even better, let’s start developing for separate platforms again, and optimise software for the platform that’s going to be running it. Rather than just developing everything for Chrome.
This is what bothers me so much… Browsers should be improving their PWA implementations (looking at you, Firefox), and Electron apps should be PWAs more often. Another decent middle ground is Tauri. SilverBullet and Yaak are both so much lighter and better than anything else on my system.
Yeah but companies want full control and no ad blockers. That’s why they’re pushing shoddy Electron apps over their web experiences and PWAs.
I wonder how much exact duplication there is across all those processes?
https://www.kernel.org/doc/html/latest/admin-guide/mm/ksm.html
Kernel Samepage Merging
KSM is a memory-saving de-duplication feature, enabled by CONFIG_KSM=y, added to the Linux kernel in 2.6.32. See mm/ksm.c for its implementation, and http://lwn.net/Articles/306704/ and https://lwn.net/Articles/330589/
KSM was originally developed for use with KVM (where it was known as Kernel Shared Memory), to fit more virtual machines into physical memory, by sharing the data common between them. But it can be useful to any application which generates many instances of the same data.
The KSM daemon ksmd periodically scans those areas of user memory which have been registered with it, looking for pages of identical content which can be replaced by a single write-protected page (which is automatically copied if a process later wants to update its content). The number of pages the KSM daemon scans in a single pass and the time between passes are configured via the sysfs interface.
KSM only operates on those areas of address space which an application has advised to be likely candidates for merging, by using the madvise(2) system call:
int madvise(addr, length, MADV_MERGEABLE)

One imagines that one could maybe make a library interposer to induce use of that; a sketch of the idea follows.
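Here’s roughly what that interposer could look like, as an untested illustration of my own (not something any real Electron app ships): an LD_PRELOAD shim that wraps mmap and marks large anonymous mappings as merge candidates. Caveats: glibc’s own internal allocations bypass the PLT, so this only catches direct mmap calls, and ksmd scans nothing unless /sys/kernel/mm/ksm/run is set to 1.

    /* ksm_shim.c - hypothetical LD_PRELOAD interposer; build with:
       gcc -shared -fPIC -o ksm_shim.so ksm_shim.c -ldl */
    #define _GNU_SOURCE
    #include <dlfcn.h>
    #include <sys/mman.h>

    typedef void *(*mmap_fn)(void *, size_t, int, int, int, off_t);

    void *mmap(void *addr, size_t len, int prot, int flags, int fd, off_t off) {
        static mmap_fn real_mmap;
        if (!real_mmap)  /* look up the next mmap in link order (libc's) */
            real_mmap = (mmap_fn)dlsym(RTLD_NEXT, "mmap");
        void *p = real_mmap(addr, len, prot, flags, fd, off);
        /* Only advise sizeable anonymous mappings; file-backed pages are
           already shared through the page cache. 4096 = assumed page size. */
        if (p != MAP_FAILED && (flags & MAP_ANONYMOUS) && len >= 4096)
            madvise(p, len, MADV_MERGEABLE);
        return p;
    }

Usage would be something like LD_PRELOAD=./ksm_shim.so some-electron-app, then watching the counters under /sys/kernel/mm/ksm/.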
I guess the key is that it has to be the same version of Electron on the back end. If they change too much of it, then how much memory can actually be shared?
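The kernel will actually answer that question at runtime: the sysfs interface mentioned above exposes counters (paths straight from the linked admin guide; readable without root). A tiny reader, as a sketch:

    /* ksm_stats.c - dump the KSM sysfs counters */
    #include <stdio.h>

    static void show(const char *path) {
        char buf[64];
        FILE *f = fopen(path, "r");
        if (f && fgets(buf, sizeof buf, f))
            printf("%-40s %s", path, buf);
        if (f)
            fclose(f);
    }

    int main(void) {
        show("/sys/kernel/mm/ksm/run");             /* 0 = off, 1 = scanning */
        show("/sys/kernel/mm/ksm/pages_shared");    /* deduplicated pages in use */
        show("/sys/kernel/mm/ksm/pages_sharing");   /* mappings pointing at them */
        show("/sys/kernel/mm/ksm/pages_to_scan");   /* pages scanned per pass */
        show("/sys/kernel/mm/ksm/sleep_millisecs"); /* delay between passes */
        return 0;
    }

A high pages_sharing-to-pages_shared ratio means the merging is paying off; with mismatched Electron versions you’d presumably see less merge, since the duplicated data differs.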
This isn’t news lol
This is a trade-off. Many of these apps work on macOS and Linux because they are browser-based. If they go back to native apps, you lose that portability.
electron was a steaming pile of shit 8 years ago. still is. what’s changed?
our acceptance of shitty corporate software.
Electron is fine for what it does. It’s just that every problem was turned into a nail and Electron is the hammer.
There are plenty of cross-platform frameworks and libraries that don’t involve web tech.
There is also the advantage of an army of web devs who can build somewhat functional software for the desktop day one.
We were laughing at Java back in the day. Now they use JS instead…
There are, but few of them also work on the web as an alternative to the desktop. Writing one shitty web app and offering Electron-wrapped versions of it gets you a web app, a Windows app, a Linux app, and a macOS app. And you already have web devs on the team, because everyone does.
I hate that you’re right. Giving up Electron would likely mean less Linux and Mac compatibility. It’s a shame, but it’s likely true.
For commercial software, definitely. It’d be web and MAYBE Windows unless there’s a Qt nerd spearheading the project or something.
FOSS is actually better off here IMO, since it’s done by people as passion projects, so there’s no need to pinch pennies by eliminating target platforms. HOWEVER there’d also be more need for the devs to have different platforms to test on.
I normally reply with a Tux penguin, but is it really a Windows problem if it’s the apps that aren’t optimized for shit?
Indeed, and Electron is not specific to Windows
Yep. My 64 gigs of RAM died in my old setup a few weeks ago, and instead of paying out the ass for replacement DDR4 RAM, I decided to pay out the ass for DDR5 RAM and upgrade while I was at it. Only did 32 gigs, because I really wasn’t using most of my 64 gigs (I thought). A few days ago, I ended up having to set up a swap file because a Rust project I was working on kept crashing VSCode while it was running the analyzer. What are we doing here.
Windows itself is inherently unoptimized these days, with its AI-powered search in the Start menu and settings pages that are actually Electron web apps. Linux used 3+ GB less RAM on just the desktop after I switched. So really it’s both: Windows and the programs each need a rethink.
I remember how the combination of Internet mass distribution of file data and the blossoming gray market for file-share applications really super-charged the technology of file compression.
I wonder if we’ll see skyrocketing RAM prices put economic pressure on the system bloat rampant through modern OSes.
Isn’t the bloat basically being coded by the same AI that’s eating up the RAM to begin with?
I mean, ymmv. The historical flood of cheap memory changed developer practices. We used to code around keeping the bulk of our data on the hard drive and only using RAM for active calculations. We even used to lean on “virtual memory” on the disk, caching calculations and scrubbing them over and over again, in order to simulate more memory than we had on the stick. SSDs changed that math considerably. We got a bunch of very high-efficiency disk space at a significant markup, built on solid-state technology much like our RAM. So there was a point at which one might have nearly as much RAM as storage (a friend had 1 GB of RAM on a device that only had a 2 GB hard drive). The incentives were totally flipped.
I would argue that the low-cost, high-efficiency RAM induced the system bloat, as applications could run very quickly even on a fraction of available system memory. Meanwhile, applications that were RAM hogs appeared to run very quickly compared to applications that needed to constantly read off the disk.
Internet applications added to the incentive to bloat RAM, as you could cram an entire application onto a website and just let it live in memory until the user closed the browser. Cloud storage played the same trick. Developers were increasingly inclined to ignore the disk entirely. Why bother? Everything was hosted on a remote server, lots of the data was pre-processed on the business side, and then you were just serving the results to an HTML/JavaScript GUI in the browser.
Now it seems like tech companies are trying to get the entire computer interface to be a dumb terminal to the remote data center. Our migration to phones and pads and away from laptops and desktops illustrates as much. I wouldn’t be surprised if someone finally makes consumer-facing dumb terminals a thing again - something we haven’t really experienced since the dawn of personal computers in the 1980s.
But TL;DR: I’d be more inclined to blame “bloat” on internet web browsers and low-cost memory post-’00s than on AI-written code.
It is definitely coming, and fast. This was always Microsoft’s plan for an internet-only Windows/Office platform. OneDrive and 365 are basically that implementation, now that we have widespread high-speed internet.
And with the amount of SaaS apps the only thing you need on a local machine is some configuration files and maybe a downloads folder.
Look at the new Nintendo Switch cartridges as an example. They don’t contain the game, just a license key. The install is all done over the internet.
Thank Google for those cool products.
what’s google got to do with it? this is an article about a product developed at GitHub (now a microsoft subsidiary) causing problems with Windows and the thumbnail is showing products from the following companies:
- discord
- microsoft
- microsoft
- microsoft
- microsoft
like. look. i hate google. they partner with israel to conduct genocide (don’t use waze, btw, or better yet, don’t use any google products). but this seems like not looking at the whole of how evil all of big tech is just to focus on how evil one company in big tech is
CoMaps is a good alternative to Waze. If you think it isn’t, make an OSM account and help make it a good alternative :p
CoMaps is GOATed. i need to make some edits in my neighborhood
The article mentions Chrome/Chromium: 9 times
The article mentions Google: 0 times

Google made Chrome. Chrome had that multi-process architecture at its core, which allowed it to consume as much memory as needed even on a 32-bit OS. Chromium was always inside it, and open source. Then they created CEF, which allowed webdevs to build “real” apps, and that opened the floodgates. Electron was first built on CEF, but they wanted to include Node and couldn’t, because it required too much experience in actual coding. So they switched to Chromium. It didn’t change much structurally; it basically just invited more webdevs to build more “real” apps (at its 1.0 release, Electron advertised hundreds of apps built with it on its website).
Google could have done something about how the web engine works in frameworks (which don’t need that much actual web functionality), but didn’t. They invited webdevs to do anything they wanted. Webdevs didn’t care about security because mighty Google would just publish a new Chromium update eventually. They never realized they didn’t need more security in a local “real” app GUI that just connects to their own website, because there isn’t much room for security danger in that scenario. They just kept updating the underlying engine, because why not. The Chromium DLL is now at 300 MB or something? All of that code is much needed by everyone, is it not?
So, to me, the sequence has always looked like this:
Google (caring about webdevs, not OS) ->
Webdevs (not caring about native code and wanting to sell their startup websites by building apps) ->
Reckless web development becoming a norm for desktop apps ->
Corporations not seeing problems with the above (e.g. Microsoft embedding more stuff with WebView2 aka Chromium)
So yes, Google has everything to do with it because it provided all the bad instruments to all the wrong people.
Personally, I don’t care much about hating Microsoft anymore because its products are dead to me and I can only see my future PCs using Linux.
Electron was originally developed by GitHub for a text editor called Atom.
And it always used Chromium under the hood.
I’m tired of this! How can we start our own RAM foundry? Is that even the right term? Surely there’s a YT tutorial somewhere.
Big RAM hates this one little trick!
The newest semiconductor manufacturer specializing in RAM is ChangXin Memory Technologies.
As of 2019, CXMT had over 3,000 employees and runs a fab with 65,000 square meters of cleanroom space. Over 70% of its employees are engineers working on various research and development projects. CXMT uses its 10G1 process technology (aka 19 nm) to make 4 Gb (gigabit) and 8 Gb DDR4 memory chips. It has licensed intellectual property originally created by Qimonda.
So… whatever that costs. Although, I think this wiki is a bit behind the times, as they’ve got DDR5-8000 memory in flight according to TechInsights.
It must take so much R&D to achieve anything remotely comparable to what Samsung, Micron (/Crucial… RIP) and SK Hynix can produce.
Fingers crossed they can undercut the 3 (now 2) big producers, which is doubtful, but hopefully they can at least cap the maximum price that decent memory can inflate to. Because at some point a medium-sized customer is going to get fed up with the Samsung/Micron/SK Hynix bullshit and custom-order the RAM they need, and a smaller producer like this will provide much better service for a similar price.
The miracle of the Chinese economy (and, really, all the BRICS countries) has been their willingness to educate and industrialize their population.
Yeah, it takes a ton of R&D, but when you’ve got 1.4B people you’re going to sift out a few who can get the job done. India’s Tata is already building their own semiconductor facilities. Brazil’s semiconductor sector has been struggling to break into the global market for… decades. Russia’s so sanctioned that they’ve got no choice but to go in-house. South Africa is finally building industrial facilities to match their role in the raw materials supply chain.
I would suspect this crunch in the global market is going to incentivize a ton of international investment in manufacturing entirely to meet domestic demand. And heaven help us all if there’s an actual flashpoint in the Pacific Rim, because that’ll shut down the transit that companies like TSMC and Broadcom need to produce at current scales.
I just wouldn’t hold my breath, especially under the current protectionist political environment. You’re not going to be buying outside of the US sphere of influence any time soon.
Pretty sure all RAM manufacturers are Korean? I guess China puts chips on PCBs, maybe? But South Korea has the knowledge. And it has met domestic demand; RAM prices have been acceptable for many, many years.
It’s the AI sector that is inflating demand (maybe by circular investment and contracts).
So, I don’t see anyone investing 10 years into the future to make DDR6 RAM when their business plan relies on current trends.

Pretty sure all RAM manufacturers are Korean?
Micron is American, headquartered in Boise, Idaho. Western Digital is based in San Jose, California. Kioxia (formerly a department of Toshiba) is Japanese.
Only Samsung and SK Hynix are Korean.
So, I don’t see anyone investing 10 years into the future to make DDR6 RAM when their business plan relies on current trends.
Even if you’re not up to DDR6, there’s money to be made in lower-tier memory for lower-quality devices. Also, when the market is in a pinch, you’ll have the ability to scale up with investment dollars faster if you’re already in the business.
It wasn’t just their willingness to educate their own people but also Apple’s willingness to offload all of their production there and basically revolutionize their tech industry by developing all of their hardware there.
Apple’s willingness to offload all of their production there and basically revolutionize their tech industry
Taiwan’s Foxconn building assembly plants in Shenzhen in 2005 does not explain why Huawei is releasing cutting-edge phones in 2025.
Besides, if you want to get historical, Apple cribbed all their technology from Microsoft’s trash bin back in the ’90s. And Microsoft plundered IBM and the early tech companies of the 1980s before that.
Chinese firms didn’t cheat by licensing the same technology every American firm was outright stealing through reverse engineering.
Limitation breeds innovation
Just another AI agent bro, that will fix th
Out of Memory or System Resources. Close some windows or programs and try again.
Like a Rust-based alternative to VSCode
No thanks. Any software that has AI integration as one of its main selling points is shitware imo.
Entirely optional; it’s marketing. Hate the game, not the player.
VSCode’s history is so messed up. Microsoft buys GitHub and stops development of the GitHub team’s IDE (Atom), then uses the framework developed for that IDE (Electron) to make VSCode.
Fucking 1600s colonizer behavior.
And Apollo launched with 4 KB of RAM.
4 KB of RAM and an office packed with hundreds of engineers using slide rules, sure.
They used Cloud Computation and AI (“Actually Interns”) way before it was cool.
I miss the times when 4 gigs of RAM was more than enough for browsing and playing a game at the same time.
cries in 8gb RAM
It still is if you’re willing to jump through enough hoops
Traditions… Simple: download more RAM
They only have DDR4 :/