• 0 Posts
  • 329 Comments
Joined 1 year ago
Cake day: June 25th, 2023

  • Contramuffin@lemmy.world to Science Memes@mander.xyz: lab toys
    33 points · 2 days ago

    Just yesterday I had a CO2 valve close on me during an experiment while I was away for a moment. It takes effort to turn the valve, so it couldn’t have just shaken closed or something. The valve was in the corner of the room and was blocked off by boxes, so nobody could have accidentally bumped it. And besides, nobody was in the room anyway. Before the experiment I made damn sure that the CO2 valve was open, and even the computer records (which log the CO2) show that the valve was open until I walked away.

    I still have no idea how the valve could have closed on its own. Now, I’m not saying it’s a ghost, but I am saying that I cannot think of a single non-paranormal explanation. I’ve clearly angered the science gods, and I would do well to sacrifice some more cells to appease them.


  • Without another name change, I don’t think that phrase will ever go away, for the simple fact that X as a name is too short and nondescript. In speech, X could refer to someone you broke up with, or it could just be the beginning of another word, serving as a prefix. In text, it could refer to the actual letter itself, or the close button on a window, or a placeholder, or something NSFW.

    There are simply too many ways that X can be interpreted. Even if people come to associate X with Twitter, they will still specify “formerly Twitter” just to avoid confusion.


  • Science is like going down a Wikipedia rabbit hole. There are always more things to do and more things to check out. At some point you just have to draw the line and say that enough is enough. Other scientists are likely to ask why you stopped where you did, so saying “it’s outside the scope of the paper” is basically the polite way of saying that you stopped because you felt like it.




  • It’s a multifaceted answer for me, I feel.

    Linux is weird, on a technical level. It’s funky and broken and has weird quirks you have to remember. But it’s not malicious. Wendell from Level1Techs said it best in one of his videos: the headaches with Linux are haphazard; the headaches with Windows are adversarial.

    It’s not a perfect replacement for Windows, but at least for some people, the respect it has for its users (i.e., no ads, not fighting you on everything you’re trying to do, giving you the ability and freedom to tinker as you please) offsets its technical problems.

    Additionally, Linux is missing a lot of core applications. Many applications do have a Linux version, many can run through a compatibility layer, and of those that are left, many have really solid replacements. Heck, you might be surprised to find that some of the software you already use was originally intended as a replacement for a Windows-only application.

    But there’s still a handful of core applications that don’t work on Linux and don’t really have a good replacement, and missing even one can easily break someone’s workflow. No, LibreOffice isn’t a full replacement for Microsoft Office, and no, GIMP can’t actually replace Photoshop.

    As for the terminal, there’s no way around it: you will have to open a terminal at some point. To be clear, most, if not all, of the things you might imagine yourself doing likely have some way of being done through a GUI. The issue is that as a new user, you don’t know where that GUI is, or what it’s called, or how to even ask. And when the tutorials you find online tell you to just use the terminal, that ends up being the only practical way of getting things done. So it’s a weird Catch-22: only experienced users who know where all the menus are will know where the GUI options are, but it’s the new users who need them the most.

    My understanding is that Linux developers in the past several years have been explicitly trying to make the OS more accessible to a new user, but it’s not quite there yet.

    Overall, I think Linux is deeply flawed. But seeing how Microsoft seems to be actively trying to make Windows worse, Linux ends up being the only OS I have faith will still be usable in 2 years.

    If anything, the more people who switch to Linux, the more pressure there will be to make the OS more accessible to new users, and for software companies to release Linux-compatible versions of their software. Some brave people just need to take the dive first.







  • For me, everything is a belief unless it satisfies the following criteria:

    1. It is generally accepted as true among experts
    2. There is ample evidence that is both personally convincing and leaves no room for alternate interpretations (not the same as #1, since many fields have “commonly accepted knowledge” that is generally acknowledged as most likely true but has no evidence to back it up)
    3. It is specific enough that it cannot be interpreted in a way that is misleading

    I find that the one that trips up most people is #3, since some people speak in technically true but overly broad statements and the listener ends up filling in the gaps with their own biases. The listener leaves feeling like their biases have been confirmed by data, not realizing that they have been misled.

    In the end, according to my criteria, very little can be categorized as true knowledge. But that’s fine. You can still make judgements from partial or biased data, or from personal beliefs. You just can’t be resolute about them and say that they’re true.




  • I mainly work indoors, so the brightness doesn’t really matter that much to me. But as far as I can tell, the brightness is pretty normal for a laptop - I don’t think it’s any brighter or dimmer than other laptops I’ve used in the past. According to this website that I found, the brightness ranges from 25 to 486 nits. A Google search suggests the average maximum brightness for laptops is somewhere around 300-400 nits.

    My understanding is that the screen is generally what eats up most of the battery on a device, so if you plan to keep the brightness turned up, it might be difficult to find a laptop with long battery life.




  • Yes, but that’s my point, you see. Because Arm has historically been used for mobile and small devices, there’s been a strong incentive for decades to emphasize power efficiency. Because x86 has historically been used for desktops, there’s been a strong incentive to emphasize performance. It’s only very recently that Arm attempted comparable performance, and even more recently that x86 attempted comparable power efficiency.

    Sure, Arm is currently more efficient, but the general consensus is that there’s no inherent reason why Arm must be more efficient than x86. In other words, the only reason it is more efficient is that its designers have been focusing on efficiency for longer.

    Both AMD’s and Intel’s current-gen x86 CPUs are, from what I can tell, within spitting distance of Qualcomm’s Arm CPUs in terms of battery life, and rumor has it that both x86 companies should be able to match Arm chips in efficiency by next generation.

    So if efficiency is a priority for you, I think it’s worthwhile to wait and see what the CPU companies cook up in the next couple of years, especially as both AMD and Intel seem to be heavily focused on maximizing efficiency right now.