Why do they even care? It’s not like your future bosses are going to give a flying fuck how you get your code. At least, they won’t until you cause the machine uprising or something.
They are going to care if you can maintain your code. Programming isn’t “write it, throw it over the fence and forget about it”; you usually have to work with what you - or your coworkers - have already done. “Reading other people’s code” is, like, 95% of the programmer’s job. Sometimes the output of a week-long, intensive stretch of work is a change in one line of code, the result of a deep understanding of a project that can span many files, sometimes many small applications connected to each other.
ChatGPT et al aren’t good at that at all. Maybe they will be in the future, but at the moment they are not.
Most people who are in dev aren’t maintaining shit.
Most coders don’t write new scripts, they’re cutting and pasting from whatever libraries they have - using ChatGPT is just the newest way to do that.
Your boss doesn’t care about how the job gets done; they care that it gets done. Which is why we have giant code libraries just chock full of snippets to jigsaw into whatever we need.
Most people who are in dev aren’t maintaining shit.
I disagree, but maybe what I do in “dev” is a bubble where things are different.
They absolutely will. Companies hire programmers because they specifically need people who can code. Why would I hire someone to throw prompts into ChatGPT? I can do that myself. In the time it takes me to write to an employee instructing them on the code I want them to create with ChatGPT, I could just throw a prompt into ChatGPT myself.
Why would you sign up to college to willfully learn nothing
To get hired.
A diploma ain’t gonna give you shit on its own
So does breathing.
A lot of kids fresh out of high school are pressured into going to college right away. It’s the societal norm for some fucking reason.
Give these kids a break and let them go when they’re really ready. Personally I sat around for a year and a half before I felt like “fuck, this is boring, let’s go learn something now”. If I had gone to college straight from high school I would’ve flunked out and just wasted all that money for nothing.
Yeah, I remember in high school they were pressuring everybody to go straight to uni and I personally thought it was kinda predatory.
I wish I hadn’t gone straight in, personally. Wasted a lot of money and time before I got my shit together and went back for an associate’s a few years later.
It’s hard to make wise decisions when you’re basically a kid at that age.
If you go through years of education, learn nothing, and all you get is a piece of paper, then you’ve just wasted thousands of hours and tens of thousands of dollars on a worthless document. You can go down to FedEx and print yourself a diploma on nice paper for a couple of bucks.
If you don’t actually learn anything at college, you’re quite literally robbing yourself.
Because college is awesome and many employers use a degree as a simple filter anyway
Not a single person I’ve worked with in software has gotten a job with just a diploma/degree since like the early 2000s
Maybe it’s different in some places.
We are saying the same thing. Degree > diploma for jobs. Go to college, get degree
I meant any form of qualification. Sure it helps, but the way you get the job is by showing you can actually do the work. Like a folio and personal projects or past history.
Art? Most programming? “Hard skills” / technical jobs… GOOD jobs, sure. But there’s plenty of degrees & jobs out there. Sounds like you landed where you were meant to be; a lot of folks go where opportunity and the market take them
It’s probably a regional difference. Here in AU, you can be lucky and land a few post-grad jobs if you really stood out. Otherwise you’re entirely reliant on having a good folio and, most importantly, connections.
Many HR departments will automatically kick out an application if it doesn’t have a degree. It’s an easy filter even if it isn’t the most accurate.
Yeah fair point, but then how are you going to get the job if you’re completely incompetent at programming 🤔
Just use AI bro
I don’t think you can get a CS degree while being completely incompetent. A bunch of interviews I had were white-boarding the logic, not actual coding. Code is easy if you know the logic.
“Necessary, but not sufficient” sums up the role of a degree for a lot of jobs.
My Java classes at uni:
Here’s a piece of code that does nothing. Make it do nothing, but in compliance with this design pattern.
When I say it did nothing, I mean it had literally empty function bodies.
Mine were actually useful, gotta respect my uni for that. The only bits we didn’t manually program ourselves were the driver and the tomcat server, near the end of the semester we were writing our own Reflections to properly guess the object type from a database query.
So what? You also learn math with exercises that ‘do nothing’. If it bothers you so much add some print statements to the function bodies.
I actually did do that. My point was to present a situation where you basically do nothing in higher education, which is not to say you don’t do/learn anything at all.
Yeah that’s object oriented programming and interfaces. It’s shit to teach people without a practical example but it’s a completely passable way to do OOP in industry, you start by writing interfaces to structure your program and fill in the implementation later.
Now, is it a good practice? Probably not, imo software design is impossible to get right without iteration, but people still use this method… good to understand why it sucks
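That interface-first style can be sketched in a few lines (Python rather than Java here, and the `PaymentProcessor` example is entirely made up for illustration, not from any actual course):

```python
from abc import ABC, abstractmethod

class PaymentProcessor(ABC):
    """The interface is written first: it fixes the shape of the program."""
    @abstractmethod
    def charge(self, amount_cents: int) -> bool: ...

class StubProcessor(PaymentProcessor):
    """Early skeleton: satisfies the interface but basically does nothing."""
    def charge(self, amount_cents: int) -> bool:
        return True  # real implementation filled in later

def checkout(processor: PaymentProcessor, amount_cents: int) -> str:
    # Calling code is written against the interface, not the implementation.
    return "paid" if processor.charge(amount_cents) else "declined"
```

The point of the classroom exercise is exactly this: `checkout` can be written, compiled, and tested before any real `charge` logic exists.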
To get a job so you don’t starve
To get the piece of paper that lets you access a living wage
Open the browser in one VM. Open chatgpt in another VM.
Money can be exchanged for housing, food, healthcare, and more necessities.
Homer?
Yeah, Anon paid an AI to take the class he paid for. Setting his money on fire would have been more efficient.
After you finish a random course, a bunch of tech bros contact you immediately, give you a bunch of money, and take you to the land of Silicon, where you play fusball and drink beer, occasionally typing on a keyboard randomly.
At least, that’s how those things are advertised
I’ve been on both sides of that zoom call, and yeah, the amenities are there because they expect you to live there
deserved to fail
Probably promoted to middle management instead
He might be overqualified
run it in a vm
pay for school
do anything to avoid actually learning
Why tho?
Job
Losing the job after a month of demonstrating you don’t know what you claimed to is not a great return on that investment…
It is, because you now have the title on your resume and can just lie about getting fired. You just need one company to not call a previous employer or do a half hearted background check. Someone will eventually fail and hire you by accident, so this strategy can be repeated ad infinitum.
Sorry, you’re not making it past the interview stage in CS with that level of knowledge. Even on the off chance that name on the resume helps, you’re still getting fired again. You’re never building up enough to actually last long enough searching to get to the next grift.
I am sorry that you believe that all corporations have these magical systems in place to infallibly hire skilled candidates. Unfortunately, the idealism of academia does not always transfer to the reality of industry.
…you stopped reading halfway through my comment didn’t you?
Idiot.
No actual professional company or job of value is going to skip checking your curriculum or your work history… So sure, you may get that job at Quality Inn as a night manager making $12 an hour because they didn’t fucking bother to check your resume…
But you’re not getting some CS job making $120,000 a year because they didn’t check your previous employer. Lol
Yeah fake. No way you can get 90%+ using chatGPT without understanding code. LLMs barf out so much nonsense when it comes to code. You have to correct it frequently to make it spit out working code.
- Ask ChatGPT for a solution.
- Try to run the solution. It doesn’t work.
- Post the solution online as something you wrote all on your own, and ask people what’s wrong with it.
- Copy-paste the fixed-by-actual-human solution from the replies.
Two words: partial credit.
deepseek R1 is solid, autoapprove works sometimes lol
If we’re talking about freshman CS 101, where every assignment is the same year-over-year and it’s all machine graded, yes, 90% is definitely possible because an LLM can essentially act as a database of all problems and all solutions. A grad student TA can probably see through his “explanations”, but they’re probably tired from their endless stack of work, so why bother?
If we’re talking about a 400 level CS class, this kid’s screwed and even someone who’s mastered the fundamentals will struggle through advanced algorithms and reconciling math ideas with hands-on-keyboard software.
i guess the new new gpt actually makes code that works on the first time
You mean o3 mini? Wasn’t it on the level of o1, just much faster and cheaper? I noticed no increase in code quality, perhaps even a decrease. For example, it forgets things far more often, like variables that already exist under a different name. It also easily ignores a bunch of my very specific and enumerated requests.
o3 something… i think the bigger version….
but, i saw a video where it wrote a working game of snake, and then wrote an ai training algorithm to make an ai that could play snake… all of the code ran on the first try….
could be a lie though, i dunno….
Asking it to write a program that already exists in its entirety, with source code publicly posted, and having that work is not impressive.
That’s just copy pasting
he asked it by describing the rules of the game, and then asked it to write an AI to learn the game….
it’s still basic but not copy pasta
These things work by modelling how likely words are to appear next to other words. Do you know how many tutorials on how to code those exact rules it must have scanned?
that’s not how these things work
I know the video you are referencing - I think it’s this one.
o3 yes perhaps, we will see then. Would be amazing.
Usually this joke is run with a second point of view saying, do I tell them or let them keep thinking this is cheating?
Are you guys just generating insanely difficult code? I feel like 90% of all my code generation with o1 works first time? And if it doesn’t, I just let GPT know and it fixes it right then and there?
My first attempt at coding with chatGPT was asking about saving information to a file with python. I wanted to know what libraries were available and the syntax to use them.
It gave me a three page write up about how to write a library myself, in python. Only it had an error on damn near every line, so I still had to go Google the actual libraries and their syntax and slosh through documentation
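For what it’s worth, the stdlib answer to that question is tiny. A minimal sketch using `json` (no hand-rolled library needed; the file name and fields here are just invented for the demo):

```python
import json
import os
import tempfile

def save_data(path: str, data: dict) -> None:
    # json ships with Python; no need to write your own serialization library
    with open(path, "w") as f:
        json.dump(data, f, indent=2)

def load_data(path: str) -> dict:
    with open(path) as f:
        return json.load(f)

# round-trip example
path = os.path.join(tempfile.gettempdir(), "demo_settings.json")
save_data(path, {"volume": 7, "theme": "dark"})
restored = load_data(path)
```

`pickle` (arbitrary Python objects) and `csv` (tabular data) are the other usual stdlib answers, each a similar handful of lines.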
the problem is more complex than initially thought, for a few reasons.
One, the user is not very good at prompting, and will often fight with the prompt to get what they want.
Two, often times the user has a very specific vision in mind, which the AI obviously doesn’t know, so the user ends up fighting that.
Three, the AI is not omniscient, and just fucks shit up, makes goofy mistakes sometimes. Version assumptions, code compat errors, just weird implementations of shit, the kind of stuff you would expect AI to do that’s going to make it harder to manage code after the fact.
unless you’re using AI strictly to write isolated scripts in one particular language, ai is going to fight you at least some of the time.
I asked an LLM to generate tests for a 10 line function with two arguments, no if branches, and only one library function call. It’s just a for loop and some math. Somehow it invented arguments, and the ones that actually ran didn’t even pass. It made like 5 test functions, spat out paragraphs explaining nonsense, and it still didn’t work.
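For scale, here’s roughly the shape of function described (two arguments, a for loop, one library call, no branches; this is my own stand-in, not the actual function from the story), plus the kind of one-line test that should have come back:

```python
import math

def rms(values, scale):
    """Root-mean-square of `values`, multiplied by `scale`."""
    total = 0.0
    for v in values:  # plain for loop, no branches
        total += v * v
    return scale * math.sqrt(total / len(values))  # the one library call

def test_rms():
    # real arguments, a passing assertion: all the LLM had to produce
    assert abs(rms([3.0, 4.0], 2.0) - 2.0 * math.sqrt(12.5)) < 1e-9
```

A correct test for something this small is two lines; inventing extra arguments for it is a genuinely impressive failure mode.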
This was one of the smaller deepseek models, so perhaps a fancier model would do better.
I’m still messing with it, so maybe I’ll find some tasks it’s good at.
from what i understand the “preview” models are quite handicapped, usually the benchmark is the full fat model for that reason. the recent openAI one (they have stupid names idk what is what anymore) had a similar problem.
If it’s not a preview model, it’s possible a bigger model would help, but usually prompt engineering is going to be more useful. AI is really quick to get confused sometimes.
It might be, idk, my coworker set it up. It’s definitely a distilled model though. I did hope it would do a better job on such a small input though.
the distilled models are a little goofier, it’s possible that might influence it, since they tend to behave weirdly sometimes, but it depends on the model and the application.
AI is still fairly goofy unfortunately, it’ll take time for it to become omniscient.
Garbage for me too except for basic beginners questions
Can not confirm. LLMs generate garbage for me, i never use it.
A lot of people assume their not knowing how to prompt is a failure of the AI. Or they tried it years ago, and assume it’s still as bad as it was.
I just generated an entire angular component (table with filters, data services, using in-house software patterns and components, based off of existing work) using copilot for work yesterday. It didn’t work at first, but I’m a good enough software engineer that I iterated on the issues, discarding bad edits and referencing specific examples from the extant codebase, and got copilot to fix it. 3-4 days of work (if you were already familiar with the existing way of doing things) done in about 3-4 hours.

But if you didn’t know what was going on and how to fix it, you’d end up with an unmaintainable, non-functional mess, full of bugs we have specific fixes in place to avoid, but copilot doesn’t care about because it doesn’t have an idea of how software actually works, just what it should look like.

So for anything novel or complex you have to feed it an example, then verify it didn’t skip steps or forget to include something it didn’t understand/predict, or make up a library/function call. You have to know enough about the software you’re making to point that stuff out, because just feeding whatever error pops out of your compiler back into the AI may get you to working code, but it won’t ensure quality code, maintainability, or intelligibility.
I remember so little from my studies I do tend to wonder if it would really have been cheating to… er… cheat. Higher education was like this horrendous ordeal where I had to perform insane memorisation tasks between binge drinking, all so I could get my foot in the door as a dev and then start learning real skills on the job (e.g. “agile” didn’t even exist yet then, only XP. Build servers and source control were in their infancy. Unit tests the distant dreams of a madman.)
They’re clever. Cheaters, uh, find a way.
Now imagine how it’ll feel in interviews
Unless they’re being physically watched or had their phone sequestered away, they could just pull it up on a phone browser and type it out into the computer. But if they want to be a programmer they really should learn how to code.
I work in a dept. at a university that does all the proctored exams. None of that technology is allowed in the exam rooms. They have to put their watch, phone, headphones, etc in a locker beforehand. And not only are they being watched individually, the computer is locked down to not allow other applications to open and there are outgoing firewalls in place to block most everything network wise. I’m not saying it’s impossible to cheat, but it’s really really hard.
Some instructors still do in class exams, which would make it easier, but most opted for the proctored type exams especially during Covid.
The bullshit is that anon wouldn’t be fsked at all.
If anon actually used ChatGPT to generate some code, memorize it, understand it well enough to explain it to a professor, and get a 90%, congratulations, that’s called “studying”.
Yeah, if you memorized the code and its functionality well enough to explain it in a way that successfully bullshits someone who can sight-read it… you know how that code works. You might need a linter, but you know how that code works and can probably at least fumble your way through a shitty v0.5 of it
Professors hate this one weird trick called “studying”
I don’t think that’s true. That’s like saying that watching hours of guitar YouTube is enough to learn to play. You need to practice too, and learn from mistakes.
No he’s right. Before ChatGPT there was Stack Overflow. A lot of learning to code is learning to search up solutions on the Internet. The crucial thing is to learn why that solution works though. The idea of memorizing code like a language is impossible. You’ll obviously memorize some common stuff but things change really fast in the programming world.
It’s more like if you played a song on Guitar Hero enough to be able to pick up a guitar and convince a guitarist that you know the song.
Code from ChatGPT (and other LLMs) doesn’t usually work on the first try. You need to go fix and add code just to get it to compile. If you actually want it to do whatever your professor is asking you for, you need to understand the code well enough to edit it.
It’s easy to try for yourself. You can go find some simple programming challenges online and see if you can get ChatGPT to solve a bunch of them for you without having to dive in and learn the code.
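A concrete example of the sort of beginner challenge worth trying this with (FizzBuzz, the classic; nothing here is from the thread itself):

```python
def fizzbuzz(n: int) -> str:
    # classic warm-up: multiples of 3 -> Fizz, of 5 -> Buzz, of both -> FizzBuzz
    if n % 15 == 0:
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)
```

Any current LLM solves this instantly. The real test is whether you can still follow and modify the answer once the problems stop being this small.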
I mean I feel like depending on what kind of problems they started off with ChatGPT probably could just solve simple first year programming problems. But yeah as you get to higher level classes it will definitely not fully solve the stuff for you and you’d have to actually go in and fix it.
I don’t think that’s quite accurate.
The “understand it well enough to explain it to a professor” clause is carrying a lot of weight here - if that part is fulfilled, then yeah, you’re actually learning something.
Unless of course, all of the professors are awful at their jobs too. Most of mine were pretty good at asking very pointed questions to figure out what you actually know, and could easily unmask a bullshit artist with a short conversation.
You don’t need physical skills to program; there is nothing that needs to be honed into muscle memory by repetition. If you know how to type and what to type, you’re ready to type. If you know what strings to pluck, you still need to train your fingers to do it; it’s a different skill.
I didn’t say you’d learn nothing, but the second task was not just to explain (when you’d have the code in front of you to look at), but to actually write new code, for a new problem, from scratch.
Any competent modern IDE or compiler will help you find syntax mistakes. Knowing the concepts is way more important.
Took first semester Java test a month ago. Had to use a built-in WYSIWYG editor within the test webpage.
WYSIWYG for code? Wtf does that mean?
WYSIWYG stands for “what you see is what you get”. Basically, it was a plain rich-text editor, with buttons for bold, italics and so on.
Probably you see black text on white background, and get no syntax highlighting or autocomplete, lol.
I mean at this point just commit to the fraud and pay someone who actually knows how to code to take your exam for you.