• Maxxie@lemmy.blahaj.zone · 1 month ago (edited)

    (let me preach a little, I have to listen to my boss gushing about AI every meeting)

    Compare AI tools: now vs 3 years ago. All those 2022 “Prompt engineer” courses are totally useless in 2025.

    Extrapolate into the future and you’ll realize you’re not losing anything valuable by not learning AI tools today. The whole point of these tools is that they don’t require any proficiency. They “just work”.

    Instead focus on what makes you a good developer: understanding how things work, which solution is good for what problem, centering your divs.
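    And on that last point, the incantation hasn’t really changed in years; a minimal flexbox sketch (class names are illustrative):

```css
/* Center a child both horizontally and vertically inside .parent */
.parent {
  display: flex;
  justify-content: center; /* horizontal */
  align-items: center;     /* vertical */
  min-height: 100vh;       /* give the container height to center within */
}
```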

    • Dr. Moose@lemmy.world · 30 days ago (edited)

      The key skill is being able to communicate your problem and requirements, which turns out to be really hard.

      • Pennomi@lemmy.world · 29 days ago

        It’s also a damn useful skill whether you’re working with AI or humans. Probably worth investing some effort into that regardless of what the future holds.

        • jmp242@sopuli.xyz · 29 days ago

          Though it’s more work with current AI than with another team member, at least for now: the AI can’t be given access to a lot of context due to data security rules.

  • fuzzzerd@programming.dev · 1 month ago

    I don’t remember progressive web apps having anywhere near the level of fanfare as the other things on this list, and as someone who has built several PWAs, I feel their usefulness is undervalued.

    More apps in the app store should be PWAs instead.

    Otherwise this list is great and I love it.

      • andioop@programming.dev · 1 month ago (edited)

        I do wonder about inventions that actually changed the world or the way people do things, and if there is a noticeable pattern that distinguishes them from inventions that came and went and got lost to history, or that did get adopted but do not have mass adoption. Hindsight is 20/20, but we live in the present and have to make our guesses about what will succeed and what will fail, and it would be nice to have better guesses.

        • Lightfire228@pawb.social · 1 month ago

          Quality work will always need human craftsmanship

          I’d wager that most revolutionary technologies are either those that expand human knowledge and understanding, or (to a lesser extent) those that increase replicability (like assembly lines).

          • Transtronaut@lemmy.blahaj.zone · 1 month ago

            It’s tricky, because there’s no hard definition for what it means to “change the world”, either. To me, it brings to mind technologies like the Internet, the telephone, aviation, or the steam engine. In those cases, it seems like the common thread is to enable us to do something that simply wasn’t possible before, and is also reliably useful.

            To me, AI fails on both those points. It doesn’t really enable us to do anything new. We already had chat bots, we already had Photoshop, we already had search algorithms and auto complete. It can do some of those things a lot more quickly than older technologies, but until they solve the hallucination problems it doesn’t seem reliable enough to be consistently useful.

            These things make it come off more as a potential incremental improvement still in its infancy than as something truly revolutionary.

            • zqwzzle@lemmy.ca · 1 month ago

              Well it’ll change the world by consuming a shit ton of electricity and using even more precious water to fill the data centres. So changing the world is correct in that regard.

            • jmp242@sopuli.xyz · 29 days ago

              It needs to be more trustworthy. If I have to double check everything, I still have to figure out how to do whatever it’s doing, then figure out how it’s doing the thing, then verify if it did it right. By then, I could have just done it in step 1.5 probably.

        • pinball_wizard@lemmy.zip · 30 days ago (edited)

          I do wonder about inventions that actually changed the world or the way people do things, and if there is a noticeable pattern that distinguishes them from inventions that came and went and got lost to history,

          Cool thought experiment.

          Comparing the first iPhone with the release of BlockChain is a pretty solid way to consider the differences.

          We all knew that modern phones were going to be huge. We didn’t need tech bros to tell us to trust them about it. The usefulness was obvious.

          After I got my first iPhone, I learned a new thing I could do with it - by word-of-mouth - pretty much every week for the first year.

          Even so, Google supposedly underestimated the demand for the first Android phones by almost 10x.

          BlockChain works fine, but it’s not changing my daily routine every week.

          AI is somewhere in between. I do frequently learn something new and cool that AI can do for me, from a peer. It’s not as impactful as my first pocket computer phone, but it’s still useful.

          Even with the iPhone release, I was told “learn iPhone programming or I won’t have a job.” I actually did not learn iPhone programming, and I do still have a job. But I did need to learn some things about making code run on phones.

    • Kissaki@programming.dev · 1 month ago (edited)

      I’d love to read a list of those instances/claims/tech

      I imagine one of them was low-code/no-code?

      /edit: I see such a list is what the posted link is about.

      I’m surprised there’s not low-code/no-code in that list.

      • jubilationtcornpone@sh.itjust.works · 1 month ago

        “We’re gonna make a fully functioning e-commerce website with only this WYSIWYG site builder. See? No need to hire any devs!”

        Several months later…

        “Well that was a complete waste of time.”

      • pinball_wizard@lemmy.zip · 30 days ago

        You’re right. It belongs on the list.

        I was told several times that my programming career was ending, when the first low-code/no-code platforms released.

        • Kissaki@programming.dev · 30 days ago (edited)

          At my work we explored a low-code platform. It was not low on code at all. Beyond the simplest demos you had to code everything in JavaScript, but in a convoluted, opaque, undocumented environment with a horrendous editing UI. Of course, their marketing told a different story.

          That was not the early days of low-code, mind you. It was fairly recent; maybe three or four years ago.

    • sidelove@lemmy.world · 1 month ago

      Which is honestly its best use case. That and occasionally asking it to generate a one-liner for a library call I don’t feel like looking up. Any significant generation tends to go off the rails fast.
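      (For illustration, the kind of library one-liner I mean, in Python; the file name is made up:)

```python
from pathlib import Path

# The sort of call I'd rather ask for than look up: read a file
# and split it into lines. The file name is purely illustrative.
Path("notes.txt").write_text("first\nsecond\n", encoding="utf-8")
lines = Path("notes.txt").read_text(encoding="utf-8").splitlines()
# lines == ["first", "second"]
```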

      • merc@sh.itjust.works · 29 days ago

        If you use it basically like you’d use an intern or junior dev, it could be useful.

        You wouldn’t allow them to check anything in themselves. You wouldn’t trust anything they did without carefully reading it over. You’d have to expect that they’d occasionally completely misunderstand the request. You’d treat them as someone completely lacking in common sense.

        If, with all those caveats, you can get this assistance for free or nearly free, it might be worth it. But, right now, all the AI companies are basically setting money on fire to try to drive demand. If people had to pay enough that the AI companies were able to break even, it might be so expensive it was no longer worth it.

      • T156@lemmy.world · 1 month ago

        Getting it to format documentation for you seems to work a treat. Nothing too complex, just “move this bit here, split that into points”.

      • Omgpwnies@lemmy.world · 1 month ago

        I’ve been using it to write unit tests, I still need to edit them to mock out some things and change a bit of logic here and there, but it saves me probably 50-75% of the time it used to take, just from not having to hand-write all that code.
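        A minimal sketch of the pattern; the function under test and its client are made-up stand-ins, and the mocking is the part I still edit by hand:

```python
import unittest
from unittest.mock import Mock

# Hypothetical function under test (illustrative, not from my codebase).
def fetch_username(client, user_id):
    resp = client.get(f"/users/{user_id}")
    return resp["name"]

class FetchUsernameTest(unittest.TestCase):
    def test_returns_name(self):
        # Mock out the dependency instead of hitting a real service.
        client = Mock()
        client.get.return_value = {"name": "ada"}
        self.assertEqual(fetch_username(client, 7), "ada")
        client.get.assert_called_once_with("/users/7")
```

        Run with `python -m unittest` as usual.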

  • MortUS@lemmy.world · 29 days ago

    Once both major world militaries and hobbyists are using it, it’s over. You can’t close Pandora’s box. Whatever you want to call the current versions of “AI”, it’s only going to get better. Short of major world catastrophes, I expect it to drive not only technological advances but also energy/efficiency advances. The big internet conglomerates are already integrating it into search, and I fully expect search to be transformed into an assistant-like chatbot (or something of the sort) within the next 5 years.

    I think it’s shortsighted not to see the potential of accumulating society’s knowledge and being able to present that to people in an understandable way.

    I don’t expect it to happen overnight. I’m not expecting I, Robot or Android levels of consciousness any time soon, but the world is progressing toward the automation of many things, driven by Capital(ism), which is powerful in itself.

      • HarkMahlberg@kbin.earth · 6 days ago

        the potential of accumulating society’s knowledge and being able to present that to people in an understandable way.

        We call this Wikipedia, please consider donating to keep it running!

        • MortUS@lemmy.world · 6 days ago (edited)

          I completely agree about supporting Wikipedia. I actually donate to Wikipedia via subscription and recommend others do as well. Being able to just download Wikipedia is also such a boon. That said, Wikipedia is just that: a -pedia, like an encyclopedia. It’s static knowledge. It can’t rephrase things, simplify them, or provide more context than it already has. It’s the phone book to AI’s phone call.

          I would love to see a breakthrough in energy solutions for high-processing, but I doubt I will in my lifetime, and am pessimistic about such advances even being possible.

  • entwine413@lemm.ee · 1 month ago

    I can see this partly being true in that it’ll be part of a dev’s toolkit. The devs at my previous job loved using it to do busy work coding.

    • TheSealStartedIt@feddit.org · 1 month ago

      Oh god, the hate in this sub. It is definitely another tool for a dev to use, like autocomplete or a lot of the other stuff a good IDE does to help you. If you don’t want to use it, fine. Perhaps you’re such a pro that you need nothing but a text editor. If you’re not, and you’re ignoring it for petty reasons, you’ll probably fall behind all the devs who learned how to use it to get more productive (or, in developer terms, lazier).

      • fmstrat@lemmy.nowsci.com · 1 month ago

        Agreed. Like it or not, old-school autocomplete was the same thing, just not as advanced. That being said, comment OP probably didn’t click the link.

    • Valmond@lemmy.world · 1 month ago

      “Busy work coding”: is that what you do when you try to look like you’re working (like a real dev)?

      • 3abas@lemm.ee · 1 month ago

        Real-world development isn’t creating exciting apps all the time; it’s writing the same boring, convention-based code, sticking to an established pattern.

        It can be really boring and unchallenging to create your millionth repository, or you can prompt your IDE to create a new repo, and with one sentence it will stub out ten minutes’ worth of tedious prep work. It makes programming fun again.

        In one prompt, it can look at my finished code and stub out half-decent documentation that otherwise wouldn’t have been written at all. It does hallucinate sometimes, or completely misunderstands the code, so you have to correct a few sentences, but the brain drain of coming up with the sentence structure for useful documentation is completely lifted, and the code ends up well documented.

        AI programming is more than just vibe coding, and it’s way more useful than everyone here insists it’s not.

      • dermanus@lemmy.ca · 1 month ago

        We’re using it for closing security flaws identified by another tool. It’s boring, unchallenging work that is nonetheless still important. It’s also repetitive and uncreative enough that I’m comfortable having a machine do it.

        There’s still human review but when it’s stuff like “your error messages should escape variables” or “write a longer function name” having a tool that can do most of the grunt work is valuable.
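        For instance, the “escape variables” class of fix is usually a one-liner; a Python sketch (the function name is illustrative, not from our tooling):

```python
import html

def render_error(user_input: str) -> str:
    # Interpolating user_input raw would let it inject markup;
    # escaping first is exactly the boring-but-important grunt work.
    return f"<p>Invalid value: {html.escape(user_input)}</p>"
```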

    • adrian@50501.chat · 1 month ago (edited)

      I agree that it will continue to be a useful tool. I’ve gotten a similar productivity boost from AI auto-complete as I did from regular auto-complete. It’s also pretty good at identifying potential issues with code; again, a similar productivity boost to a good linter. The chatbot makes a good sounding board, too, especially when you don’t remember the name of the concept you’re trying to implement, or when you need to weigh the pros and cons of two solutions and can’t find articles about it.

      But all these claims of 10x improvements in development speed are horse shit. Yeah, you might be able to shit out a 5-10,000 LOC tutorial app in an hour or two with prompt engineering, but try implementing a feature in a 100,000 LOC codebase and it promptly shits the bed: hallucinating internal frameworks and microservices, ignoring internal practices, writing straight-up non-functional code, etc. If you spend enough time prompting it, you can eventually massage the solution you need out of it; problem is, it took longer to do that than writing the damn thing yourself.

    • Blackmist@feddit.uk · 30 days ago

      Many of our customers store their backups in our “cloud storage solution”.

      I think they’d be rather less impressed to see the cloud is in fact a jumble of PCs scattered all around our office.

    • Colonel Panic@lemm.ee · 30 days ago

      Naming it “The Cloud” and not “Someone else’s old computer running in their basement” was a smart move though.

      It just sounds better.

    • Rusty@lemmy.ca · 1 month ago

      I don’t think it was supposed to replace everyone in IT, but every company had system administrators or IT administrators who worked with physical servers, and now they’re gone. You can say the new SREs are their replacement, but it’s a different set of skills, more similar to SDE than to system administration.

      • MinFapper@startrek.website · 1 month ago

        And some companies (like mine) just have their SDEs do the SRE job as well. Apparently it incentivizes us to write more stable code or something

      • jmp242@sopuli.xyz · 29 days ago

        I just think this is patently false. Or at least there are/were orgs where cloud costs so much more than running their own servers, which are tended by maybe 1 FTE spread across a bunch of admins who mostly do other tasks.

        Let me just point out one recent comparison: we were considering cloud backup for a couple of petabytes of data, with a few hundred GB changed, added, or restored every week or less. I think the best deal, holding the software costs equal, was $5/TB/month.

        This is catastrophically more expensive over the 10-year lifespan of a server or two plus a small/mid-sized LTO-9 tape library and tapes. For one thing, we’d have paid more than the cost of the server (etc.) in about a year. Beyond that, tape prices have always tended down over time, and our storage cost for tape is basically $0 once it’s in archive storage. We put it in a cabinet in another building, and you can fit A LOT of data on these tapes in a small room. That’ll cost basically $0 additional for 20 years, never mind 10. So let’s add in electricity etc.: I still doubt that will exceed ~$100k over the lifetime of the project. Labor is about a wash, because you still need people to manage backups to the cloud, and actually moving tapes might be ~0.05 FTE in our situation. Literally anyone can be taught to do it once the backup admin puts the tapes in the hopper or tells them which serial numbers to load.
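        Back-of-envelope, the cloud side of that comparison (the $5/TB/month quote and the ~2 PB figure are the numbers above; the arithmetic is all this is):

```python
# Cloud backup cost at the quoted rate, for roughly 2 PB.
data_tb = 2000                        # ~2 petabytes
cloud_monthly = data_tb * 5           # $5/TB/month -> $10,000/month
cloud_10yr = cloud_monthly * 12 * 10  # $1,200,000 over a 10-year lifespan
print(cloud_monthly, cloud_10yr)
```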

        I also think many companies are finding something similar for plain servers; at least it was in the news quite a bit for a while. If you can be entirely cloud-native, maybe it washes out, but for a lot of shops that’s still not possible, either because they control hardware (think factory, scientific, etc.) or because they rely on desktop software for which the cloud isn’t really a replacement and throughput isn’t great (think Adobe products, video, scientific, and financial data).

      • Jankatarch@lemmy.world · 1 month ago (edited)

        There is still a difference.

        Cloud was FOR the IT people. Machine learning is for predicting patterns from data.

        Maybe stock predictors will adapt or be replaced, but the average programmer didn’t have to switch to Replit just because it’s a “cloud IDE”.

      • Ferk@programming.dev · 1 month ago (edited)

        I mean, isn’t that what “get on or get left behind” means?

        It does not necessarily mean you’ll lose your job. Nor does “get on” mean you have to become a specialist on it.

        The post picks specifically on things that didn’t catch on (or that only caught on for a period of time and were eventually superseded), but doesn’t apply the same test to technologies that succeeded.

  • superkret@feddit.org · 1 month ago

    This technology solves every development problem we have had. I can teach you how with my $5000 course.

    Yes, I would like to book the $5000 Silverlight course, please.

  • humanspiral@lemmy.ca · 1 month ago

    I’m skeptical of the author’s credibility and vision of the future if he hasn’t even reached blink tag technology in his progress.