I worked under a self-proclaimed Python/JavaScript programmer, and part of the job involved doing rather advanced stuff in other, typed languages like C# and C++. It was hell. The code reviews were hell. For every tiny little thing, we had to go through why coding C++ like it is Python is a very bad idea.
What is crazy about developers who exclusively work with scripting languages is that they have no conception of why general good practices exist, and they will often make up their own rules based on their own quirks. In my previous example, the developer in question was the author of a codebase that was in literal development hell, but he was adamant about not changing his ways. I'd definitely be wary of hiring someone who has exclusively worked with scripting languages, and sometimes it is less work to train someone who is a blank slate than to try to deprogram years of bad habits.
I’d change this slightly - the problem isn’t exclusively working in scripting languages, but in dynamically typed ones. There are people who write great code in Python (with typing) and in TypeScript, and they usually can work well in other languages too. But people who don’t type their programs are, in my experience, simply bad developers, the way you describe.
True that, this was pretty much the intended meaning of my reply but you worded it better.
Ah, good!
I feel like there is a fundamental difference between developers with a data-centric perspective and those with a function-centric one.
The function-centric one is about adding functionality, and it’s what developers start out with. You have functions that do things, and if requirements change or the thing should be re-used - no problem, I can quickly add a new toggle parameter here or bolt it on over there. I’ll be done in 5 minutes, no problem!
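A made-up sketch of the kind of function this mindset tends to produce (the names are hypothetical):

```typescript
// Function-centric growth: every new requirement bolts on another toggle.
// Callers now have to know the right combination of flags to get sane output.
function exportReport(
  values: number[],
  asCsv: boolean,
  includeHeader: boolean,
  skipZeros: boolean, // added later for the one caller that needed it
): string {
  const rows = skipZeros ? values.filter((v) => v !== 0) : values;
  const body = rows.join(asCsv ? "," : "\n");
  return asCsv && includeHeader ? "values\n" + body : body;
}
```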
Then, over time, you learn that functionality isn’t that interesting or difficult. Instead, the hard parts are the ones concerning the flow of data through your application. What do I know about the shape of my data in this part of my application? What can I be sure of regarding invariants over there? This forces you to build modular software without interdependencies, because - in the end - you just build a library that has small adapters to the outside world.
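For example (a minimal TypeScript sketch; the connection type is invented), the data-centric version states the invariant in the shape of the data itself:

```typescript
// Data-centric: the shape of the data carries the invariant.
// A connection either has a socket (connected) or it doesn't -
// the compiler won't let you touch a socket that isn't there.
type Connection =
  | { state: "disconnected" }
  | { state: "connected"; socket: WebSocket; since: Date };

function send(conn: Connection, msg: string): void {
  if (conn.state === "connected") {
    conn.socket.send(msg); // narrowed: socket is guaranteed to exist here
  }
}
```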
I like scripting languages a lot, but it’s way too easy to become “good” at that style of programming, and the better you get at it, the harder it will be to actually move forward to a data-centric perspective. It’s a local maximum that can trap people, sometimes for their whole career. That’s why I try to look at typing experience when evaluating candidates for positions.
I wanted to get back to you, because you are so very right, and I have spent the last 10 years or so trying to evangelize the fact that implementing algorithms and logic isn’t the hard part - it is a trivial concern, really. Everything that goes wrong with development usually involves the flow of data, and figuring out how to get this data from over here to over there without making a big mess. To do that, you absolutely need to write small modules with few dependencies. You gotta think about the life-cycle of your objects, and generally follow all the SOLID principles if you’re doing OOP. Personally, I really love using dependency injection when the project allows for it.
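A minimal sketch of what I mean by constructor injection (hypothetical names, not any particular framework):

```typescript
// The module declares the services it needs; it never constructs its
// own dependencies, so it stays small, decoupled, and testable.
interface Clock {
  now(): Date;
}

class AuditLog {
  constructor(private readonly clock: Clock) {} // injected, not built here

  entry(message: string): string {
    return `${this.clock.now().toISOString()} ${message}`;
  }
}

// A test can inject a fake clock instead of the real one.
const log = new AuditLog({ now: () => new Date(0) });
console.log(log.entry("user logged in")); // 1970-01-01T00:00:00.000Z user logged in
```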
It is as you said, really: you can have thousands of hours of programming experience, but if you have never tried to solve those issues you’re really limiting yourself. Some devs think designing software around your data instead of your algorithms is overthinking it, or “overengineering” as I have been told. Well, I would not hire those people, for sure.
I have seen clean projects made up of small modules, with clear boundaries between data, functions, and lifecycle configuration. It is night and day compared to most code bases. It is really striking just how much of the hidden, and not-so-hidden, complexity and goo and hacks and big-ass functions in most code bases exist simply because application life-cycle management is often non-existent. In a “proper” code base, you shouldn’t have to wonder how to fetch a dependency, whether an object is initialized and valid, where to instantiate your module, or even what constructor to invoke to build a new object. This takes care of so much useless code it is insane.
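Concretely, the kind of thing I mean is a single composition root - a rough sketch with invented names:

```typescript
// All construction and lifetime decisions live in one place; the rest
// of the code base never wonders how to fetch a dependency or whether
// an object is initialized.
interface Mailer {
  send(to: string, body: string): void;
}

class ConsoleMailer implements Mailer {
  send(to: string, body: string): void {
    console.log(`mail to ${to}: ${body}`);
  }
}

class SignupService {
  constructor(private readonly mailer: Mailer) {}
  register(email: string): void {
    this.mailer.send(email, "welcome!");
  }
}

// The composition root: the only code that knows concrete types,
// construction order, and object lifetimes.
function main(): void {
  const mailer = new ConsoleMailer();
  const signup = new SignupService(mailer);
  signup.register("user@example.com");
}

main();
```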
To close on this, I like scripting languages a lot as well, and you can do great things with some of them even if a lot of developers don’t. JS has TypeScript, ReactiveX, dependency injection frameworks, etc. It is a great language with a lot of possibilities, and you’re not forced into OOP, which I think is great (OOP and functional programming are orthogonal solutions imo). But the reality is that the language is really easy to misuse, and you can definitely learn bad habits from it. Same as you, I would be wary of a developer with no experience in strongly-typed languages, or at the very least TS. I am very happy to hear this take randomly on the internet, because in my experience this is not how most developers operate, and imo it is demonstrably wrong not to design applications around your data.
You put it very well!
I freaking love you and I’ll try to write a worthy reply when I am home.
<3
That’s true.
It’s also true in other fields. For example, take Far Eastern martial arts:
Young students will try to hit someone, to beat someone up, to hit a target, to become “stronger”.
Experienced teachers, however, don’t really care about hitting a target. It’s all about the posture. How you stand. How you carry out your movements.
Are you referring to Python and JS as scripting languages? The two most popular languages on the planet? Ones which are capable of building almost any kind of app imaginable? Surely you don’t apply your limited experience with a single dev to a group of millions of developers doing extremely varied things, right?
Python and JS are by definition scripting languages in the classical sense. I am not using the term in a derogatory way; I myself learnt programming this way as a 90s kid. No offense, but I think you took my comment way too personally.
What is the “classical” sense? What are you implying when you say they are “scripting” languages? What you are imparting to me is that they are less-than other, real languages. I don’t take personal offense, but I do take issue with the mischaracterization and implication that those languages are somehow less serious or less broadly useful.
No hard feelings! (:
https://en.m.wikipedia.org/wiki/Scripting_language
A scripting language, or interpreted language, is interpreted at runtime, rather than compiled.
It is not derogatory, and is simply a fact about languages like Python and JS.
If someone on the internet calls something a “scripting language,” it’s hard to take that in a vacuum. I’ll accept that there is overlap between “interpreted” and “scripting” languages, but they aren’t synonymous, particularly in my experience interacting with developers online. The typical discourse does indeed trivialize the so-called scripting languages, and my only intent is to say that they are a lot more than what they began as.
There are definitely people out there shitting on all sorts of languages, and JS is a huge target, but those have been referred to as scripting languages for as long as they have existed. It stems from the fact that those languages are embedded into existing applications, as opposed to being built into binaries. Nowadays you have hybrids like C#, which can be used either as a scripting language or to build native apps (or in-between), so it is really just a matter of the context you’re using the language in. There is inherently no hidden meaning or elitism in the term. It is a very old term, and I think you simply got the wrong impression from your internet experiences. It is how those languages are defined basically everywhere. Even some of those languages’ official websites self-define them as scripting languages. There is no ambiguity here at all.
I’m merely saying that to me, and probably to a large group of devs, it sounds like a dig. I totally accept that it is an appropriate designation and that there was no ill intent, though. I think the fact that we’re having this conversation is enough to prove that there is at least a little ambiguity, given the right context and experience with the term. Cheers
Most scripting languages are interpreted, not compiled. It’s not a criticism of them, but it is a tradeoff that is good to understand.
It seems like you are the one who is conflating terms like “script kiddie” with “scripting language” and adding some negative connotation that isn’t necessarily implied.
Scripting languages are usually easier to learn, have simpler syntax, and abstractions that hide complexity. These make them easier to get started in, but the downside is they are generally slower (performance-wise) than their compiled counterparts.
I’m certainly replying from my own perspective! Again, I don’t think the original reply was intended to be negative. I am just discussing the language used and what it implies to me, and perhaps to others from a similar background and time. I think a clearer and more modern way to describe these languages is as “interpreted”, or with other words describing the nature of the language, rather than saying it is a language for scripting, which carries a connotation (at least to me, in my corner of the internet).
There is a nuance though, because a language simply being interpreted does not mean it is being used as a scripting language. Take Java and C# for example: by default, those languages are compiled to bytecode that runs in a virtual machine, which allows you to ship platform-agnostic binaries and a bunch of other neat features. C# can be used as a scripting language whenever it is interpreted, but it does not have to be. It is an important nuance, and this is why you can’t just replace the term “scripting language” entirely. You can also compile C# directly into machine code, skipping the runtime entirely. Technically, there is nothing stopping you from writing an application that uses C# as a scripting language even without the interpreter, since you can compile C# to machine code and simply load the library dynamically at runtime (kind of like Unity does).

I guess you could call those “embedded languages”, and it would mean almost exactly the same thing, but then aren’t we back to the same problem of some developers taking offence at that? I mean, it implies just as well that the language does not stand on its own without machine code, which is true. This is one weird hill to have a bruised ego over for those developers you’ve met. Words have meaning, and this one just happens to be the right fit given the description.

I have a feeling from this whole exchange that you didn’t know what scripting languages were, considering how you replied to my first post. I worked in development for over a decade and I have never seen the term used with negative implications. I really just think you projected your own feelings onto a term you didn’t understand. No offence intended, it happens.
Kinda sounds like they’re adamant about not changing their ways in response to things not working as they expect.