> AI assistance means that programmers can concern themselves less and less with the particulars of any language.
Sure. Until we need to. Then we hit some apparently tiny concern that turns out to be deeply intertwined with the rest of the whole mess, and we're in for a ride down the rabbit hole.
> most developers today don’t pay much attention to the instruction sets and other hardware idiosyncrasies of the CPUs that their code runs on, which language a program is vibe coded in ultimately becomes a minor detail.
This may be misguided on my part, but I have the feeling these are two very different cases. OK, not everyone is an ffmpeg-level champion who will thrive in code-golfing ASM for the last drop of cycle gain.
But there are probably also reasons why third-generation programming languages have lasted without any subsequent proposal completely displacing them. It's all a tradeoff between expressiveness and precision: what we want to keep in the focus zone, and what we want to delegate as mostly uncontrolled detail.
If, to go faster, we need to give up transparent glasses, we will need very sound and solid alternative probes to report what's going on ahead.
In my opinion, their target audience is scientists rather than programmers, and a scientist most often thinks of code as a tool to express their ideas (hence, perfect AI-generated code is a kind of grail). The faster they can express them, even if the code is ugly, the better. Most of the time they don't care about reusing the code later.
My hunch that scientists, not programmers, are the target audience comes from details that would only register with one group and not the other. For example, they consider Arduino a language. This makes total sense for scientists, as most of those using Arduino don't necessarily know C++ but are proud to be able to code in Arduino.
Sure, but their tools are complexity management tools: hypotheses, experiments, empirical evidence, probabilities. To my knowledge, they deal far less with the determinism programmers rely on. It's reproducible if you get similar results with the same probability.
I like programming, I like clean code, so it's something I struggled with when I began research.
But actually, when you don't have specifications, because you don't know yet whether the idea will work and you're discovering problems with it as you go, the code doesn't naturally come out readable.
You refactor all the time, but then something you had misunderstood becomes a concern, and you need to refactor everything again, and again, and again. You lose a lot of time, and research is fast-paced.
Scientists who spend too much time cleaning code often miss the deadlines and deliverables that are actually what they need to produce. Nobody cares about their code: once the idea is fully developed, other scientists will just rewrite better software with a full view of the problem. (Some scientists rewrite their entire software once everything is discovered.)
I think a sensible goal for scientists would be easy-to-write code rather than easy-to-read code.
But if you are iterating on code and using an LLM without even looking at the code, there's a reasonable chance that when you prompt "okay, now handle factor y also", you end up with code that handles factor y but also handles pre-existing factor x differently for no good reason. And scientific work is probably more likely than average programming to be numerics stuff where seemingly innocuous changes to how things are computed can have significant impacts due to floats being generally unfriendly.
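The float point deserves a concrete illustration. A minimal Python sketch of the kind of numerical fragility meant here, using nothing beyond standard IEEE-754 doubles:

```python
import math

# Float addition is not associative: regrouping the "same" sum
# changes the last bits of the result.
a = (0.1 + 0.2) + 0.3
b = 0.1 + (0.2 + 0.3)
print(a == b)         # False: 0.6000000000000001 vs 0.6

# Accumulated rounding error vs. a correctly rounded sum:
xs = [0.1] * 10
print(sum(xs))        # 0.9999999999999999
print(math.fsum(xs))  # 1.0
```

So an LLM that "harmlessly" reorders or regroups an accumulation loop while adding factor y can change the numbers a scientist gets out, even though every individual line still looks correct.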
> This may be misguided on my part, but I have the feeling these are two very different cases here
They are indeed very different. If your compiler doesn't emit the right output for your architecture, or the highly optimized library you imported breaks on your hardware, you file a bug and, depending on the third party, get help fixing the issue. Besides, those types of issues are rare in popular libraries and languages unless you're pushing boundaries, which likely means you are knowledgeable enough to handle those edge cases anyway.
If your AI gives you the wrong answer to a question, or outputs incorrect code, it's entirely on you to figure it out. You can't reach out to OpenAI or Anthropic to help you fix the issue.
The former allows you to pretty safely remain ignorant. The latter does not.
My take is that you should be using AI for exactly the things you would ask a random contractor to do for you, knowing they won't be there to maintain it later.
On the other hand, one can see it as another layer of abstraction. Most programmers are not aware of how the assembly code generated from their programming language actually plays out, so they rely on the high-level language as an abstraction of machine code.
Now we have an additional layer of abstraction, where we can instruct an LLM in natural language to write the high-level code for us.
natural language -> high level programming language -> assembly
I'm not arguing whether this is good or bad, but I can see the bigger picture here.
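Python makes one of these layers directly inspectable: the `dis` module prints the bytecode underneath a function, i.e. the lower layer most Python programmers never read. It's only an analogy for the high-level-to-assembly step, but it shows the same relationship (a sketch; the exact opcodes vary by interpreter version):

```python
import dis

def add_one(x):
    return x + 1

# The bytecode below is the "assembly-like" layer most Python
# programmers never look at; the exact opcodes vary by version.
dis.dis(add_one)
ops = [ins.opname for ins in dis.get_instructions(add_one)]
print(ops)
```

The point of the layering argument is that each level down is generated, mostly deterministically, from the one above; the open question is whether "natural language -> high-level code" can ever be as dependable as these lower transitions.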
Different compiler versions, target architectures, or optimization levels can generate substantially different assembly from the same high-level program. Determinism is thus very scoped, not absolute.
Also, almost every piece of software has known unknowns in its dependencies, which are constantly being updated; no one can read all of that code. Hence, in real life, if you compile on different systems ("works on my machine"), or again after some time has passed (updates to the compiler, OS libs, packages), you will get a different checksum for your build even though the high-level code you wrote is unchanged. So in theory, given perfect conditions, you are right, but in practice it is not the case.
There are established benchmarks for code generation (such as HumanEval, MBPP, and CodeXGLUE). On these, LLMs demonstrate that given the same prompt, the vast majority of completions are consistent and pass unit tests. For many tasks, the same prompt will produce a passing solution over 99% of the time.
I would say yes there is a gap in determinism, but it's not as huge as one might think and it's getting closer as time progresses.
Your comment lacks so much context and nuance that it's ultimately nonsense.
You absolutely can, and probably _should_, leverage AI to learn many things you don't understand at all.
Simple example: try learning a programming language like C with or without LLMs. With is going to be much more efficient. C is one of the languages LLMs have seen the most; they are very, very good at it for learning purposes (and at bug hunting).
I have never learned as much about computing as in the last 7/8 months of using LLMs to assist me at summarizing, getting information, finding bugs, explaining concepts iteratively (99% of Software books are crap: poorly written and quickly outdated, often wrong), scanning git repositories for implementation details, etc.
You people keep making the same mistake over and over: there are a million uses for LLMs, and instead of defining the context of what you're discussing, you conflate everything with vibe coding, ultimately making your comments nonsense.
I've posted this before, but I think it will be a perennial comment and concern:
Excerpted from Tony Hoare's 1980 Turing Award speech, 'The Emperor's Old Clothes'...
"At last, there breezed into my office the most senior manager of all, a general manager of our parent company, Andrew St. Johnston. I was surprised that he had even heard of me. "You know what went wrong?" he shouted--he always shouted-- "You let your programmers do things which you yourself do not understand." I stared in astonishment. He was obviously out of touch with present day realities. How could one person ever understand the whole of a modern software product like the Elliott 503 Mark II software system? I realized later that he was absolutely right; he had diagnosed the true cause of the problem and he had planted the seed of its later solution."
My interpretation is that whether shifting from delegation to programmers, or to compilers, or to LLMs, the invariant is that we will always have to understand the consequences of our choices, or suffer the consequences.
Applied to your specific example: yes, LLMs can be good assistants for learning. I would add that triangulation against other sources and against empirical evidence is always necessary before one can trust that learning.
My context is that I have seen some colleagues try to make up for not having expertise with a particular technology by using LLMs and ultimately they have managed to waste their time and other people's time.
If you want to use LLMs for learning, that's altogether a different proposition.
seems like a significant skill/intelligence issue. someone i know made a web security/pentesting company without ANY prior knowledge in programming or security in general.
and his shit actually works by the way, topping leaderboards on hackerone and having a decent amount of clients.
your colleagues might be retarded or just don’t know how to use llms
Would you recognize a memory corruption bug when the LLM cheerfully reports that everything is perfect?
Would you understand why some code is less performant than it could be if you've never written and learned any C yourself? How would you know if the LLM output is gibberish/wrong?
They're not wrong; it's just not black-and-white. LLMs sometimes happen to generate what you want. Often, for experienced programmers who can recognize good C code, the LLMs generate too much garbage for the tokens they cost.
I think some people are also arguing that some programmers ought to still be trained in and experienced with the fundamentals of computing. We shouldn't abandon that skill set completely. Someone will still need to know how the technology works.
The parent I answered said you shouldn't use LLMs for things you don't understand, while I advocate using them to help you learn.
You seem to describe very different use cases.
In any case, just to answer your (unrelated to mine) comment: here[1] you can see a video of one of the most skilled C developers on the planet finding very hard-to-spot bugs in the Redis codebase.
If all your arguments boil down to "lazy people are lazy and misuse LLMs" that's not a criticism of LLMs but of their lack of professionalism.
Humans are responsible for AI slop, not AI. Skilled developers are enhanced by such a great tool that they know how and when to use.
>> most developers today don’t pay much attention to the instruction sets and other hardware idiosyncrasies of the CPUs that their code runs on, which language a program is vibe coded in ultimately becomes a minor detail.
If it was even slightly true then we wouldn’t be generating language syntax at all, we’d be generating raw machine code for the chip architectures we want to support. Or even just distributing the prompts and letting an AI VM generate the target machine code later.
That may well happen one day, but we’re not even close right now
Also, there's so much patching in the kernel (for Unix) to solve hardware bugs. And a lot of languages depend on C (with all its footguns) to provide that stable foundation. It's all unseen work that is very important.
It's hard to find good data sources for this, especially that StackOverflow is in decline[1].
IEEE's methodology[2] is sensible given what's possible, but the data sources are all flawed in some ways (that don't necessarily cancel each other out). The number of search results reported by Google is the most volatile indirect proxy signal. Search results include everything mentioning the query, without promising it being a fair representation of 2025. People using a language rarely refer to it literally as the "X programming language", and it's a stretch to count all publicity as a "top language" publicity.
TIOBE uses this method too, and has the audacity to display the result as popularity with two decimal places, yet their own historical data shows the "popularity" of C dropping by half over two years and then doubling the next year. Meanwhile, actual C usage didn't budge at all. This method has a +/- 50% error margin.
By far the most useful and helpful is job ads: it literally defines the demand side of the programming language market.
Yes, that doesn't show how much code is running out there, and some companies might have huge armies with very low churn, so the COBOL stacks in banks don't show up. But I can't think of a more useful and directly measurable way of understanding a language's real utility.
I would assume so. I expect there are a lot of job postings looking for "sexy" technologies to create the image that those companies are growing and planning for the future. Conversely, I wouldn't expect any job postings for old, "streets behind" technologies like COBOL to be fake, as they wouldn't help with such signalling.
Yes, to your point: COBOL, which ranks very low here, is still fundamental to the infrastructure of several major industries, with some sources [1] reporting that it is used in:
43% of all banking systems.
95% of all US ATM transactions.
80% of all in-person credit card transactions.
96% of travel bookings.
This may very well dramatically change in the next few years with such an emphasis on enterprise AI tools to rewrite large COBOL repositories. [2]
In retail banking I'm sure that this could be true. Working in investment banking, I never saw a single COBOL application, or had to have my C++/Java/$MODERNLANGUAGE code interact with one.
Corp bank here. Everyone has rumours about COBOL systems, but no one I've ever spoken to has seen or interacted with one, or has any other evidence that these really exist anymore either.
But a few years back I asked for a bank statement from my old savings account, and it took two weeks to print out, in monospace dot matrix.
Or the betting company I was a customer of, which suspended betting every day at 6:30am for an hour of daily maintenance. Ironically, they would accept bets on football matches being played at that time, but the system was shut down.
You haven't seen or heard of them because they are abstracted away behind APIs, circuit breakers, and proxies. Almost ALL banks, credit card companies, travel systems, and other high-throughput transaction systems run on mainframes written in COBOL.
I think the issue here is that people working in fintech don't seem to come across these systems much, if at all - if you know one specifically, please tell us.
I worked briefly at a company that wrote applications that interacted with bank mainframes. Think endpoint bank-teller systems and in-branch customer/account management. They definitely do exist: every major bank has a mainframe written in (usually) COBOL.
But it's very abstracted; part of our main product offering WAS abstracting it. On top of our ready-to-use applications, we offered APIs for higher-level data retrieval and manipulation. Under the hood, that orchestrated mainframe calls.
But even then there could be more levels of abstraction. Not every bank used screen-level mainframe access; some used off-the-shelf mainframe abstractors like JxChange (yes, there's a market for this).
Fintech would be even more abstracted, I imagine. At that point you can only interact with the mainframe a few levels up, but it's still there. Out of sight.
It's still there at the accounting/backend level. Automated Financial Systems' Level 3 and its replacement, Vision, are commercial loan systems.
Level 3 is pure COBOL. It has recently been deprecated, but many banks own the code and are still self-hosting it, along with its IBM green-screen support.
Vision is a Java front end in front of an updated COBOL backend. When your reputation is based on reliability and long-term code stability, at what point do you risk making the conversion, versus training new developers to work on your existing system?
No, we are not afraid of our own systems. The idea that there is some fabled computer system which everyone is too scared to touch doesn’t exist (I work in payment processing). There are levels of controls way outside these systems which provide these safety nets (e.g settlement / reconciliation controls).
If the cobol is still there, it’s not due to risk. If anything, the cobol is a much higher operational risk than replacing it.
Analogously, GDSes like SABRE still ran on mainframes until very recently (c. 2023) [0]. SABRE was written in some combination of assembly and some kind of in-house dialect of PL/I, if I recall.
> Working in investment banking, I never saw a single COBOL application
What was the back office settlement or wire transfer system written in? There is a good chance that some part of them was written in COBOL. And while Bloomberg terminals are a vendor product, for a bloody long time, many of their screens had some COBOL.
Also, lots of quantitative software at i-banks use LINPACK or BLAS, which use FORTRAN.
Well, I had a very badly specified project to write a library for our back-office systems to do SWIFT payments from our C++ applications, via COM. There was no obvious COBOL involved on either side, but it has to be said that the whole use case for the library was very murky. And it never worked, due to the lack of spec, not the languages.
COBOL is used in pretty much all enterprise legacy systems.
But "used in" doesn't mean it's actively being developed by more than a tiny maintenance team.
And since the graph we're commenting on mostly measures popularity/usage, COBOL is never going to rank higher: for every COBOL dev there are more than 100 Java devs employed by the same company.
That's a pretty wild claim. What's legacy for you? I'd consider legacy, e.g., J2EE crap running on Web[Sphere|Logic], to hold most of the points in that league table vs COBOL.
A legacy software to me is whatever the company that employs me says is said legacy software.
Pretty much every business I've worked at to date has had such legacy software, which was inevitably still used in some contexts.
It's not always obvious, because, following the previous example numbers, only 1-2 Java devs will ever have to interact with the legacy software again; from the perspective of the remaining 98, COBOL doesn't exist anymore.
I can only speak to the two bigger German banks (i.e., Sparkasse and VR banks), but if you look at their outsourced development providers (Atruvia and Sparkasse Informatik), they're still offering incentives for their apprentices to learn COBOL, especially in the German dual apprenticeship programs, which they can steer more easily than university courses. My wife has been doing COBOL for one of them since 2012, and the demand has never diminished. If anything, it's increased because experienced developers are retiring. They even pull some of these retired developers back for particularly challenging projects.
Sparkasse and VR aren't the two largest German banks. DB is at least double the size of Commerzbank which is again 100mn in assets ahead of DZ. I don't find it all that surprising that these small banks are still trying to keep their legacy systems alive, but it's not the case for the bigger boys. (Source: work for several of them)
1. Not all roles are advertised. I've actually only been interviewed for two of the jobs I've ever had, both at the same place - my current employer because it's a public institution and so it always advertises and interviews for jobs even if it has an internal candidate who is likely to be a good fit. In fact the first of those jobs was basically my shape on purpose, another candidate was an equally good fit and they hired both of us.
Everywhere else people hired me because they knew who I was and what I could do and so in place of an "interview" maybe I grab lunch with some people I know and they explain what they want and I say yeah that sounds like a job I'd take and maybe suggest tweaks or focus changes. No shortlist of candidates, no tech interview, no tailoring a CV to match an advert. Nothing -> Lunch or Drinks -> Job offer.
So that can cause some distortion, especially for the niche languages where there are like six experts and you know them - an advert is futile there.
> measurable way of understanding a languages real utility
It feels like that metric misses "utility" and instead comes from a very American (or capitalistic maybe is better) mindset.
What about Max/MSP/Jitter? Huge impact on the music scene, but probably a very small number of jobs available, so it'd rank fairly low, while it's probably the top media/music language out there today. There are tons of languages that provide the most utility for their domain yet barely have any public job ads at all.
I think such a metric would be useful for gauging the "employability of someone who knows that language", if anything, but it's probably more pain than gain to link "# of job ads" with "utility".
Yeah, except job adverts lag enormously behind what's actually popular. For example, we used Rust quite a lot at my previous company, but we didn't advertise for Rust developers at all.
Also then you're looking at which languages were popular in the past whereas the interesting stat is which languages are being used to start new projects.
Well, we have to define what a language's popularity means. Rust is surely more "hyped" than Java, but Java has at least an order of magnitude more developers, software written, etc.
If you look at the programming language list, apart from Python and Java, most are targeted at specific platforms (databases, browsers, embedded systems) or tech (SQL for databases).
The general-purpose programming languages today are still Python, Java, and Perl. Make of that what you will.
Larry Wall at one point said that if you make something very specific to a use case (like awk, sed, PHP, etc.), it sort of naturally starts to fall out of general-purpose use.
It's just that Kotlin, Rust, Go, SQL, Julia, JavaScript, etc. are not general-purpose programming languages.
This data is kinda worthless for popularity contests, since languages may get picked up by AUR packages, but it gives a solid insight into which languages are foundational.
Those of you surprised to see Java so prominent: where have you been all your careers? Ten-person startups with Node.js backends? You must have been entirely shielded from enterprise software companies.
The financial sector, insurance sector, healthcare sector all jumped on Java a couple of decades ago, and they have massive repositories of Java code, and are actively migrating their COBOL code to Java.
What do you mean by this? To me it sounds like people are saying they are both "old" languages, but I don't know what you mean.
I work in a shop that has lots of both Java and COBOL. We are not "actively migrating" COBOL code to Java. It looks like mainframes will continue to exist for decades to come (i.e. >20 more years). Obviously, brand new applications here are not written in COBOL.
I've said this myself, as Java's reification of mid-90s OO and poor interop via JNI are not to my taste. But I've spent 25 years in banking with JPMC, BoA, Barclays et al., and done lots of interop with COBOL on S/390, AS/400, and VME. I've never heard of any of those systems being rewritten in Java. I have encountered key mainframe prod systems for which the source is lost.
I know of one investment bank (a former employer) that rewrote its mainframe-based settlement system in Java (on Linux). Front office systems were often Java (replacing Obj-C in some cases).
That was two decades ago – almost a generation! Interesting to think that some of those systems would now be considered “legacy”.
I might be mistaken, but as I understand it, COBOL never had the reach that Java does. Java is everywhere, from embedded systems to massive clusters, as a bulky VM, a slimmed-down VM, or native code. Business, science, recreation: the sector almost doesn't matter, it's going to have Java somewhere in there.
Wasn't this whole thread joking that Java is the new COBOL? That a lot of enterprises use Java, and it's becoming the new entrenched/old/stodgy language that the hot new kids don't want to use?
In its day, a lot of "cool" companies used COBOL, because back then it was an OK solution. So saying that Netflix is cool today and uses Java, therefore Java is different and still cool, is not valid; it doesn't invalidate the point. It's the same situation, just decades later.
Maybe I shouldn't have conflated SAP, but they all seem to be part of the same giant ecosystem of "current/entrenched" solutions that "we use because we have to, not because it is better". Not unlike COBOL.
It's a weird one - I've been at Google for more than 5 years. I know from the stats that we have a zillion lines of actively developed Java, there must be huge swathes of the company that you could even call a "Java shop". I dig into random bits of code all the time. And yet I've looked at Java maybe three times in my tenure. And if I needed to submit Java code, I would not have a single contact to ping for readability review.
Java was the first language I learned in my CS degree, and I still think that was a sensible choice by the CS department, but I don't think I've written a single piece of Java since I left 10 years ago!
It seems like a lot of Java use cases may be big and important but kind of isolated. Something about where they sit in the economic value chain, perhaps?
I work on stuff that's adjacent to system software (I'm actually mostly a kernel engineer). But I've touched code in all the major languages at Google: loads of C++ and Go, much less of Python and Typescript. But Java/Kotlin is the only one I've never touched at all.
Anyone who is surprised is not from the finance sector. I wouldn't just say "enterprise", though, because there are non-finance enterprises where Microsoft and .NET/C# rule.
I totally expected JavaScript to get the 2nd spot but looks like TypeScript pulled the votes away. I personally consider JavaScript and TypeScript to be close enough for their numbers to be added up.
Then you should probably add Kotlin and Java together as well. They share the same purpose, use the same VM, usually live in the same project, have native interoperability, and are used with the same frameworks.
Especially considering Kotlin is used as a drop-in replacement for Java in a lot of projects, particularly with the frameworks often associated with Java (Spring, Quarkus, etc.).
Personally, I think statistics like this are biased towards the median of the past few decades and do not necessarily tell much about the future; other than that things apparently move very slowly and people are mostly conservative and stuck in their ways.
COBOL is still on that list, right above Elixir, which is apparently a bit of a niche language. Kotlin has only been around for about 15 years, and the 1.0 release was actually only nine years ago. Java was released 30 years ago and has been dominant in enterprise development for 25. So it's no surprise that Java is nearer the top.
Python is surprising, but it's been around for quite a long time and gained a lot of popularity outside the traditional computer science crowd. I know biochemists, physicists, etc. who all use Python, and it's a great language for beginners, obviously. It's not so much that people switched to Python as that Python is driving the growth of the overall programmer community. Most new programmers use Python these days, and that explains why it is #1.
JavaScript has had a virtual monopoly on basically anything that runs in a browser, which is of course the most popular way to distribute code these days, especially since plugins were deprecated and things like applets, Flash, etc. disappeared around fifteen years ago. Anything that ran on the web was either written in JavaScript or transpiled/compiled to it. WASM is starting to change that, but it's early days.
What the past 25 years tell us is that things definitely change, but very slowly. C++ still outranks JavaScript; that's because JavaScript is mostly used in browsers and is a lot less popular for other things.
I like Kotlin, so I'm biased. But it's obviously not the most popular thing by a long shot, and popular doesn't mean good anyway. I actually like Python for small, unimportant things, but I reach for Kotlin if I need to do it properly. I used to reach for Java, but Kotlin simply became the better tool for the job, at least for me. I even prefer it over TypeScript, and I do occasionally use it for web frontend development; the transpiler is pretty good. There's a WASM compiler too, and Compose for WASM just entered beta. Kotlin seems future-proof and seems to be growing into wider adoption. There are a few million Kotlin programmers around, by JetBrains' count. It's not nothing.
C++ is still very popular where you need raw performance, though not quite as raw as C, especially given that Python is often used as the more user-friendly interface on top.
But TS is not valid JS, and nobody uses TS just so they can write JS in a file with a different extension. You also get zero benefit from running `tsc` on a plain JS file. You could argue that C is valid C++, so there's no reason to discern those either.
You can also easily have Objective-C, C, and C++ in the same Swift project and have them interop. That’s a feature of Swift. But adding their numbers together wouldn’t make sense.
Doesn't really bring benefit. With Java you are more quickly useful in C++ and can write server apps without fuss. Very little benefit in using a different language when Java literally does the same and is used everywhere else.
As a backend dev (mostly working in fintech) I feel weirdly unable to find a target language to move to.
After working with Node and Ruby for a while, I really miss a static type system. TypeScript was limited by its option to allow non-strictness.
Nothing catches my eye: it's either Java/.NET and their enterprisey companies, or Go, which might not be old but feels like it by design. Rust sounds fun, but its use cases don't align much with my background.
If you want to stay in Fintech, I really don't see anything beyond Java, C#, C++, TypeScript (for the Web stuff).
Some fintech companies might go a bit outside the norm and allow Haskell, F#, or Scala, which they tend to use as DSLs for some workflows.
Then, if you're into array languages, banking and fintech is one of the few domains where they have managed to stay around, but those positions seem hard to get.
It's been years since typing was added to Python, and many people still underestimate it. You're right; Python with strict type hints is more type-safe than Java and Go. It even has features like pattern matching and proper enums, both of which Go lacks.
The only issue is that some libraries (not many!) are still untyped, so you may need to write wrappers or stubs occasionally. But that applies to a small and decreasing minority of libraries.
I only wish Python were faster and had ahead-of-time binary compilation.
You should give Julia a go. It can be written completely statically if desired (not as static as Rust, of course, but compared to Python it's miles ahead), and it can be made fast, very fast. AOT compilation with trimmed executables is coming in 1.12.
Doesn't Julia suffer from very long startup times? One of the things I use Python for is CLI programs. The startup time isn't great either, but Julia was even worse last I tested.
Julia v1.12, the unreleased version currently in release-candidate stage (the RC stage has run longer than expected but should be done by the end of the year), can fully ahead-of-time compile and generate binaries, as you would expect from a language like C. It also trims those binaries to a reasonable size given the parts of the runtime actually used. So for good type-stable code you get callable binaries without JIT overhead, which makes for a much better CLI experience. It will take a bit of time for package tooling and support to build up around this feature, but this is the path much of the ecosystem's CLI tooling is now taking, and it will be a pretty dramatic shift in usability for the kinds of use cases Julia had traditionally ignored.
Python sure is slow. It's not that big a problem most of the time, but once it becomes relevant, it becomes super relevant. The lack of AOT binary compilation also is very annoying.
Python with type checking is statically type checked. So yes, Python with type checking is better in that regard. And it's safer than Java because there are more classes of errors that Python with type checking will pick up during static type checking than what the Java compiler will pick up.
How did you do typing? Was it 'hints', or some other outside option? It always seemed to me that since it isn't baked in, it comes down to organizational discipline.
We run mypy with `--strict` mode in CI, which means it only passes if we have type hints and if they are correct, or if we have added `# type: ignore[code]` for places where the errors are reported.
And the type hints and behaviour are specified as part of Python, so it's kind of baked in. It's just that the actual type checker is not part of CPython, though Mypy is considered the reference implementation.
We have quite a large code base and very few `# type: ignore[code]` directives.
Some third party code still has poor type hints, but for the most part it's fine, and we get null safety which you don't get for Java.
Type hints and their behaviour are part of the Python language. Kotlin and Scala are not Java. I did not know that Checkstyle + SonarQube adds static null safety to Java, but I'd appreciate a citation showing that it does. I think everyone coding Java should really learn to use this so they don't create null pointer exceptions.
Scala is the best language I've ever used - all the good parts of Typescript and all the good parts of Java or Rust. And fintech is one of the few niches where you might still be able to find a job using it.
We use Scala as well and are very happy with it. We also have a lot of TS, and at this point we are tempted to just switch to using ScalaJS on the frontend as well because of how much better the language is to use. It feels like Scala 3 fixed a lot of issues people had with the language. The IDE tooling isn’t the best, but even then it feels like the IDE tooling for TS breaks all the time once the project is large enough.
> We also have a lot of TS, and at this point we are tempted to just switch to using ScalaJS on the frontend as well because of how much better the language is to use.
I did that and highly recommend it, at least for new projects - not sure that it's worth the effort of porting an existing project, but ScalaJS just works and I found far fewer rough edges than I expected.
I played with it a long time ago and the IDE (eclipse plugin?) was a bit of a mess, sbt was prone to weird issues and lock ups and the compiler was pretty slow.
I would say yes. I have experience teaching Rust to ~20 yo students of Java, and they are able to be productive in Rust within a semester. Your median Java and C# dev should be able to use Rust. Dunno about Python devs.
But I want a fast on-ramp, quick iterations and clean looking code. (and went with Kotlin because of that -- I like Rust more myself, but I have a business to run, so tradeoffs)
I really like Gleam. But if you want a job, that's probably the worst choice ever. The language is just a few years old and lacks libraries for almost everything. Why are people recommending it here? Do you guys actually work in Gleam?
Let's be honest: your realistic options for work are Java, C#, C++ and, depending on the industry, Swift, Go, Kotlin, Dart, and Rust.
Being able to talk passionately and with experience about "exotic" languages during an interview will help you land the job.
I will hire the one who talks about the joy of FP and/or static typing ANY DAY over the programmer with only JS experience and no visible interest in looking beyond it.
Have you worked with TypeScript? I'm working with both every day and I'm always frustrated by the limits of the 'type' system in Python. Sure, it's better than nothing, but it's so basic compared to what you can do in TypeScript. It's very easy to use advanced generics in TypeScript but a hell to do (sometimes outright impossible) in Python.
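For what it's worth, Python generics do cover the basics via `TypeVar`/`Generic` (a minimal sketch with a hypothetical `Box` class); it's TypeScript features like mapped and conditional types that have no real Python counterpart:

```python
from typing import TypeVar, Generic

T = TypeVar("T")

class Box(Generic[T]):
    """A trivial generic container: a checker tracks T per instance."""

    def __init__(self, item: T) -> None:
        self.item = item

    def get(self) -> T:
        return self.item

b = Box(42)           # a checker infers Box[int] here
print(b.get() + 1)    # 43; `Box("x").get() + 1` would be flagged statically
```

Something like TypeScript's `{ [K in keyof T]: T[K] | null }`, which derives a new type from an existing one, simply can't be expressed in Python's type system today.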
Yep, although never in a project of a similar size. One advantage of the Python setup is that the types are ignored at runtime, so there's no overhead at startup/compilation time. Although it's also a disadvantage in terms of what you can do in the system, of course.
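A minimal illustration of annotations being ignored at runtime (the `double` function is hypothetical):

```python
def double(x: int) -> int:
    return x * 2

# CPython stores annotations as metadata but never enforces them:
print(double("ab"))  # "abab" -- runs fine, though a type checker flags it
print(double.__annotations__)  # introspectable, but purely informational
```

This is exactly the trade-off mentioned above: zero startup or runtime cost, but no runtime guarantees unless you add a validation library on top.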
I agree it is pretty nice (with uv and as long as you REALLY don't care about performance). But even if you are one of the enlightened few to use that setup, you still have to deal with dependencies that don't have type annotations, or only basic ones like `dict`.
Typescript (via Deno) is still a better option IMO.
Java is maturing into a syntactically nice language, albeit slowly, and it's the backbone of many medium and large companies.
You might have trouble finding small companies using anything but JS/Ruby/Python. These companies align more with velocity and cost of engineering, and not so much with performance. That's probably why the volume of interpreted languages is greater than that of "enterprisey" or "performance" languages.
I've always felt it was verbose, and the need for classes for everything was a bit of overkill in 90% of circumstances (we're even seeing a pushback against OOP these days).
Here are some actual improvements:

- Record classes

```
public record Point(int x, int y) { }
```

- Record patterns

```
record Person(String name, int age) { }

if (obj instanceof Person(String name, int age)) {
    System.out.println(name + " is " + age);
}
```

- No longer needing to import base Java types

- Automatic casting

```
if (obj instanceof String s) {
    // use s directly
}
```
Don't get me wrong, I still find some aspects of the language frustrating:
- all pointers are nullable, with annotation support to lessen the pain
- the use of builder class functions (instead of named parameters like in other languages)
- having to define a type for everything (probably the best part of TS is inlining type declarations!)
It has virtual threads, which under most circumstances let you get away from the async model. It has records: data-first immutable classes that can be defined in a single line, with sane equals, toString, and hashCode. It has sealed classes as well; the latter two give you product and sum types, with proper pattern matching.
Also, a very wide-reaching standard library, good enough type system, and possibly the most advanced runtime with very good tooling.
I tried ocaml last year (I knew some Haskell and Scala already) and it was hell.
The language itself was pretty, that's true. But the schism between OCaml's stdlib and Jane Street's Core library was incredibly frustrating.
I guess when you're working at Jane Street, use only their core lib and you get some proper onboarding, things could work great. However you're programming very much in a niche community then.
You should try ReScript. The language has improved a lot recently. If TypeScript is JavaScript with types bolted on, then ReScript is JavaScript redesigned the proper way. The LSP is also surprisingly good. All that while still being in the Node ecosystem.
>That's 100% a project setting you can turn on or off though, it's like -Wall in C++ projects.
I know, it’s just not realistically up to me. I depend on the team/company culture which I’d rather not have in the picture - I’ve already gone through trying to fight the sea of unknown/any.
Do you think those languages are scratching the itch? If not maybe we need to pick a suitable underdog and champion its use. Clojure comes to mind. Or Unison. Something where immutability shines.
What's wrong with Java? It's used everywhere from FinTech startups to banks/insurers, with features like decimal arithmetic introduced specially for finance a long time ago, multiple runtimes (Oracle/JVM, IBM T9, GraalVM), a very healthy ecosystem of libs/packages, etc.
>What's wrong with Java? It's used everywhere from FinTech startups to banks/insurers
As the OP, it’s not even the language for me but the implications of companies that use it.
It’s a non starter for startup/scaleups and strongly related with oldish companies and consulting firms, which in turn translates to far worse working conditions (not remote friendly, overtime, dress code, etc).
Mind that it might just be a local culture thing, your mileage may vary.
My startup switched from Python to Java and saw our productivity explode. Using modern Java versions in a non-enterprise way (no frameworks, minimal OOP, minimal DI, functional features like immutable objects, Optional, etc.) is quite nice. Our ability to deliver performant, working features was orders of magnitude faster than with Python. The ecosystem of libraries is crazy deep, which also helps build quickly.
I won’t deny there’s a lot of bad Java written, but IMO it’s actually one of the best languages for a startup if any of your code needs good performance.
100%. Java has an amazing standard library, amazing IDE support, AOT compilation, JIT optimizations, static typing, runs much faster generally, and supports multi-threading... seems like a no-brainer to me.
If you have very good developers, go with whatever they feel most efficient with, (as long as there are enough libraries). But if you’re planning on getting big, do add a transition to stabler tech like Java, C#. They’re boring tech and full of guardrails. Which is what you want when performance is not your main concern.
> It’s a non starter for startup/scaleups and strongly related with oldish companies and consulting firms, which in turn translates to far worse working conditions (not remote friendly, overtime, dress code, etc).
Have you really been using Node or even Typescript as backend dev in fintech?
After 15 years in fintech I have never heard anything like that. For the backend it's mostly Java with some sprinkles of C#, Cobol or Python.
Yup, the previous company I used to work for just had a successful exit last year too. I guess it was an iteration speed thing, though as you say python is becoming popular as well and IMO it is not anymore suited to finance than node is, other than the connection with data science.
Swift! It actually has a pretty big server community, Vapor and Hummingbird are both great frameworks, Apple has been replacing some of their Java services with it, it's open source and cross platform and Apple seems serious about making it viable on Linux, which it is. No need for Xcode, it has an open source LSP and first class VSCode plugin.
Plus it's a fun language to write. Some people say it's a nicer, higher level Rust.
I also like the look of Kotlin but I've never used it. I think Kotlin and Swift are the two premier modern multi-paradigm languages.
I do like Swift but it also suffers a bit from identity crisis. The compiler experience is quite disappointing too—I found myself helping the compiler more than the compiler helped me.
I've since moved to Rust and have not looked back. Importantly, rust-analyzer runs circles around the Swift VSCode plugin (or Xcode for that matter)
The identity is clear: it is a language first and foremost for Apple ecosystems.
It also needs to target GNU/Linux, because Apple got rid of their server offerings, thus anyone doing server code for applications on the Apple ecosystem, that wants to stay in a single language needs to be able to write such software on GNU/Linux with Swift.
As for Windows, since they already have the open source story, it kind of falls out from there as a complement.
On the revamped website they are quite clear about the identity.
Cloud Services, CLI and Embedded as the main targets for the open source version.
I'm the same but opposite, I like Rust but find myself using Swift most of the time. They sort of do the same thing but coming from opposite directions. Can't go wrong with either imo.
I really do wanna try Kotlin at some point as well. Rust, Kotlin, and Swift feel like the future of languages to me.
I have the same dilemma: a strongly typed / modern language with good tooling / library support. I'm also considering Kotlin / Gleam. With Kotlin, practically speaking, we're talking JVM again, along with its resource requirements.
More Java than C++ in my opinion. And, get this, JS! We shall see how far OpenFin will actually go in that space because they will potentially drive a ton of JS across orgs in finance
You can use that though. Using the file system will cause runtime errors. The Arduino IDE is just a glorified build system around gcc with a package manager. I recently ported the behaviour of the IDE to Makefiles and it wasn't actually that hard.
Correct, but it is the same root cause that make people say HTML and CSS are programming languages, C and Fortran written libraries are "Python" libraries, and so forth.
Labview seems like a pain (I haven't used it), but I guess it's super useful for some uses. I recall SpaceX uses it for controlling launches. It comes with models for all manner of hardware.
I’m skeptical. There are more people writing PHP and Ruby than HTML? And HTML is a programming language? Those two very surprising results make me doubt the others.
Elixir behind OCaml? Possible, I guess, but I know of several large Elixir shops and I haven’t heard much of OCaml in a while.
> There are more people writing PHP and Ruby than HTML?
Intuitively I'd say yes.
In most jobs I've been in, the ratio of backend/system devs to frontend devs has been from 3:1 to 20:1 depending on the size. Granted, I'm on the backend side and would choose companies accordingly, but still.
Even for web first services, there will be a flurry of people who won't touch a line of html/rendering.
Imagine a PHP backend providing an API for an app. The only HTML ever produced will be the admin backend, and perhaps auth forms for special cases. The surface of the API will produce objects serialized to JSON, and the vast majority of the PHP will be business logic to talk to external services and do what the service is paid for.
Some might not like the language, but whole businesses will run on PHP, with a dedicated react or next.js frontend that only talks to the PHP via API.
As mentioned, HTML is indeed a programming language. But it’s one that is rarely used on its own. So you could argue that having it as a thing of itself in these lists, makes little sense.
There are non-Turing complete programming languages, and there are many things that are Turing complete but have nothing to do with programming (even PowerPoint), so this is neither a required nor sufficient property.
I believe a reasonable way to categorize languages as programming or not is simply: what is its primary use case? HTML's last two letters tell us exactly that it is not a programming language.
I think "A language often involved in the process of making computer programs" is way too weak to define a "Programming language". A programming language at least needs to have state/expressions/logic. I'm sure there is a good definition, but if we allow html then any markup is programming and that's obviously false so the line has to be drawn somewhere.
The reason this debate is so strange is because some people think it's gatekeeping to say someone who writes html for a living isn't "programming". It's nonsense.
If HTML is a programming language, why not SVG? If SVG is a programming language, why not PNG? Is your image viewer just an interpreter executing PNG code? Maybe being a programming language is a spectrum...
Where does this idea that a programming language has to be Turing complete come from? As far as I can tell from cursory searches, the most broadly accepted understanding of a programming language is a formal language for directing computations on a computer. HTML does this, CSS does this, and SQL does this. Frankly, even configuration languages like YAML or the spare INI file do this in the proper context.
Can these languages do everything or even most computations you would be interested in doing in a computer? Of course not. But why should the definition be restricted to languages that can do everything?
Yes. So, a criterion that is below Turing completeness but which wouldn't make Markdown a programming language. It shouldn't be _that_ hard to find such a criterion.
E.g. "Has some form of logic/flow control". Can perform computation/"execute". etc.
I mean, CS is famously bad at exact definitions [1]. Why should we have one for what a PL is? Just do what humans have been doing for millennia: go by how it is commonly used. A tomato is a vegetable from a culinary perspective, and HTML is not a PL based on its use case and its literal name.
[1] what is a low or high level language? Strongly typed language?
As others are pointing out here, use type annotations + Mypy integrated with your IDE or CI.
Imo a bigger problem with Python is that it's very slow and if that becomes a problem it's hard to solve.
I was pondering similar thoughts. Will LLM assistants ossify our current programming languages? My limited testing seems to show LLM assistants do well the more popular the language is (more data in its training), so is the hurdle for adoption of something new going to get even higher?
In an alternate universe, if LLM only had object oriented code to train on, would anyone push programming forward in other styles?
I recently picked up Hare, which is quite obscure, and Claude was helpful as a better— albeit hallucinogenic— Google. I think LLMs may not lead to as much ossification as I’d originally feared.
I had looked at it recently while checking out C-like languages. (Others included Odin and C3.) I read some of the Hare docs and examples, and had watched a video about it on Kris Jenkins' Developer Voices channel, which was where I got to know about it.
I like it much more than Zig, and while I like Odin’s syntax more, Hare is more focused on the types of tooling I want to build, so I find Hare’s stdlib preferable. Give it a spin. It’s a simple language.
I started reading the Hare tutorial again because of your comment.
Looks good so far.
Just one note for anyone else wanting to check it out:
There are a few sections in the tutorial which are empty and marked as TODO. E.g. "Using default values" and "Handling allocation failure" (the empty sections seen so far, there may be others below).
>My limited testing seems to show LLM assistants do well the more popular the language is (more data in its training), so is the hurdle for adoption of something new going to get even higher
Not only that, they also tend to answer using the more popular languages or tools even when it is NOT necessary. And when you call it out, it will respond with something like:
"you are absolutely right, this is not necessary and potentially confusing. Let me provide you with a cleaner, more appropriate setup...."
Why doesn't it just respond that way the first time? And the code it provided works, but it's very convoluted. If it isn't checked carefully by an experienced dev who asks the right questions, one would never get the second answer, and that vibe code will just end up in a git repo and get deployed all over the place.
I get the feeling some big corp may have just paid money to have their plugin/code show up in the first answer even when it is NOT necessary.
This could be very problematic. I'm sure people in advertising are licking their chops over how they can capitalize on this. If you think the ad industry is bad now, wait until it is infused into all the models.
We really need ways to
1. Train our own models in the open, with the weights and the data they are trained on. Kind of like the reproducible build process that Nix provides for building repos.
2. Ways to debug the model on inference time. The <think> tag is great, and I suspect not everything is transparent in that process.
Is there something equivalent of formal verification for model inference?
> is the hurdle for adoption of something new going to get even higher?
Yes.
But today the only two reasons to use niche languages are[0] 1) you have existing codebases or libraries in that language 2) you're facing quite domain-specific problems where the domain experts all use that language.
In either cases you won't just use Java because LLMs are good at Java.
Of course it’s always been easier to find talent when working in more popular languages. That’s the big risk you take when you choose the road less traveled.
All the functional programming languages suffer the same curse. Apart from a small set of libraries (usually standards and pure algorithms), it's much faster to write your own integration than to bring in someone else's version. And it's often easy to vendor the logic you need. You won't see the minipackages from the JS/Rust world or the big libraries from C++/Java.
You write something and it stays written, mostly because everyone moves the logic far away from accidental complexities, so maintenance is very low.
I've seen certain fintechs using Clojure, and I'm not even in a tech hub; I'm in South America. Not to say there are many, but I've definitely seen 0 jobs for Raku.
Clojure seems more popular than other FP languages such as Haskell or even F#
Java still so popular after all this time.
I've been re-learning java using the excellent University of Helsinki's MOOC course.
Was thinking of learning some Spring Boot and creating a small project or two to reinforce what I've learned. However, it feels like the tutorials for Spring Boot are of much lower quality compared to those for newer languages/frameworks like JS/React/Python. Often it's just a talking head over a PowerPoint presentation, talking for 30 minutes.
Could people recommend me a good tutorial for spring boot (or anything java that is being used in enterprises)?
I copy-pasted it into Google. The first result might be it, but it says "Please note, that this is a legacy course. ... The course content is also no longer updated or maintained."
No hint on how old and moldy it is. Does it teach a relatively recent Java or 1995 Java?
So asking if that is the right one doesn't seem out of line.
And I mean, I do feel like as long as they aren't actively harming the "ethos" of Hacker News, we can cut each other a little slack
I feel like I have sometimes done a disservice like this to HN too, where I ask for links. Maybe they just wanted to confirm this was the right course, or they might be coming home from a job tired and didn't think about it too much, y'know?
But i mean I also understand your standpoint that you want less clutter on HN which is why I feel a bit :/
Ahh, just checked it and I think you might be correct but here's my nuanced take
Yeah I can also understand it, but I just saw their comments and scrolled down to find it
```
Nah, DotNET is amazing these days. At the risk of starting a holy war, it is neck-and-neck with Java, and I say that as a Java fanboi. I think it is good to have healthy competition between languages (and ecosystems), like C++ and Rust (and a little bit Zig) or GCC and Clang or Java and DotNet or Python and Ruby or NodeJS and Deno. Plenty of people are compiling and deploying to Linux after DotNetCore went open source. Plus, you can use JetBrains Rider, which is a cross-platform IDE for C#, from the makers of IntelliJ
```
It just seems that their use of words like boi etc. makes them (genz?-ish)
I am genz too (maybe younger than them, still in HS) but yeah maybe they just write one liners which can be quite common but I see that more on the reddit kind of things and not hackernews lol. I can definitely see our generation's attention span being cooked that when I write remotely as long as this, they say nah I ain't reading it. So well, how are they gonna write it for the most part :/
It might be a bot but then again, what is the point? The point I see of HN points is that I might create a new account and be the same guy and kind of accrue them back because I once had it y'know while being myself not writing some one liners lol.
The fact that I like about HN is that I have talked to soooo many people that I admire whether its a one liner from jose valim or etc. and I am happy that hackernews exists to a somewhat degree :>
Like just out of curiosity, has someone ever got any job or smth through HN in the sense that they had their karma in part of their resume and the company was impressed lol, I can see it to a somewhat degree in maybeee startups
Oh, please run away from Spring Boot. It is just the new JBoss. That is OK if you want to get inside enterprise software for a living, but that isn't the best usage for Java.
My opinion: learn to create Android apps in Java. Tutorials are good and you get a new set of skills (if you don't have them already). After that, focus on learning POJOs, which are fundamental knowledge in Java.
Everyone writes stuff in Java/C++ where I work, but Spring Boot is encouraged less and less because of the bloat to debug and the performance troubles.
This just doesn't feel true to me. I've been looking for a Rust job for years, and the only thing I can ever find is crypto scam listings. Meanwhile the Ruby on Rails jobs are plentiful and from real companies.
I speculate it goes this way:
1. Some people at a Java company get excited about Rust
2. They write some microservices in Rust. They're now having "Rust jobs"
3. The company hires more Java devs to replace the people that now maintain the Rust side of things
4. When in need there are internal shuffles to replace those Rust devs
It can't go on forever, but as far as I can tell corporate usage started not long ago. You'll have Rust jobs, but they'll be the same shit as Java jobs. There was a study about a year ago that showed an across-the-board decline in developer satisfaction with all the "new and shiny" JS frameworks. I 100% think that when companies inevitably start hiring people to maintain those legacy Rust "internal side projects", the same will happen. It is when a workforce not driven by passion, people who complete tasks and features instead of doing the "provably correct thing", has a go that technologies get vibe-checked. We will see which way it goes.
It is interesting that your speculation chose this path: Java to Rust. That surprises me! I would have much more likely to say, C, C++, or Go to Rust. Was your choice of Java arbitrary, or is there a deeper reason?
Ruby was super hyped 10 years ago. Now all the hype-based programmers are on to TypeScript and Rust... Ruby is IMO at a nice level now; able to avoid some of the worst ideas that hype based programmers like to inflict on people, but still popular enough...
From what I've seen it's popular for startups in general. I'm using it for mine... More YC companies use Rails than you'd expect relative to its "popularity". And yeah, in Japan Ruby is homegrown tech and the community seems pretty big there.
Ruby became popular mainly because of Rails, which has gone down somewhat in popularity in recent years.
That may be why Ruby is less popular now than it was 10 years ago. Also, they (Ruby and Rails) got popular much before 10 years ago, like around 2006 / 2007, when the Web 2.0 wave was starting. I had worked on a couple of dot-com projects in Ruby on Rails at that time, that is how I know.
And Python got popular because of the LLM/AI thing. It is a shame, because it is quite slow. I had some good times with Jython back in the day, but I really wish something more elegant (Nim/Rust/OCaml) had taken over this AI thing instead of Python.
One thing is for sure: don't get tied down to one language because it is popular today. Go with whatever makes sense for you and your project.
There have been several other events like that (Django, Pandas and data science...). I don't think Python's popularity can be ascribed to any single event; it just happens to be a language that is reasonably close to pseudocode with an excellently thought-out (I mean best in the industry) standard library. Python is practical, first and foremost. That's why it won: unlike other languages, it doesn't really have an ideological agenda.
Zope predates all of those, and slowly as you say people got interested and started using it for other stuff, like being a better Perl.
Python has an agenda as well, Guido has said multiple times it was a language designed for teaching programming, and one of the reasons Zen of Python came early on.
Before AI, Python was still extremely popular for any sort of data science (possibly because of NumPy first, then Pandas, but I don't claim any historical knowledge here). And independent of that it was, before the rise of server-side JS, one of the most popular server-side web languages (probably still is).
Also around the mid '00 it started replacing perl as the unix scripting language of choice.
I'd like to see some clarity in these stats, it can't just be me that finds it hard to believe that there are significantly more Python jobs than Java. I wonder if job listings are saying "Python, C++" or something, so that's a point for Python, even though, the job is < 1% Python just for test rigs or something.
Something feels off about these rankings, and comparing them to last year's Stack Overflow survey (albeit also potentially not a super accurate accounting), I'm left wondering whether their sampling methods are at all accurate.
What's really interesting is the place of Elixir (below COBOL and ABAP, and more or less the same as Ada). This is very controversial when comparing it with other indexes where, for example, Elixir is the most beloved programming language or the language most people want to use. Also compare it with the number of Elixir posts on the front page of HN.
Anything short of a big AI winter is not going to move Python from its top spot. As Python has become the first choice of output code for LLMs its dominance is only going to grow.
Is Python really the first choice of output code? I'm not saying you're wrong, I just don't know the answer and I don't know where to look to find out.
I would have assumed it might be JS and more specifically React -- isn't that what you often get if you ask an LLM to "make an app that does such-and-such"?
(Experimental anecdata: I just asked Gemini to write an app and it gave me a single HTML file with embedded JS, no TS or React, Tailwind for CSS.)
(Edit to add: then I asked it to "write some code" instead of "write an app", and this time it went with Python.)
Ime, unless you steer it otherwise, they will default toward TS/JS almost exclusively. Probably Python though if for some reason they decided not to use TS/JS.
Again you're making unsupported assertions ("trained most on Python code") and coming to unsupported conclusions ("they seem to prefer it") -- my own very quick experiment shows that it depends on how you ask ("app" versus "code") at the very least.
The choices of JS and Python were pretty solid for the prompts I gave. Maybe the LLM is, well, making a reasonable decision in context, rather than just defaulting to Python!
"LLMs prefer Python" is an over-simplification. Python was already very popular, so I agree that LLMs are likely to entrench that popularity, but that doesn't mean Python will grow into other areas where it wasn't being used much before.
Why are you verbosely repeating my own replies? You asked why, I told you a theory that MAYBE because they were trained on Python code. You do know that's how a LLM or even a simple neural network works right?
Of course it is an over-simplification. Should I do an empirical scientific study before I can reply that MAYBE LLMs seem to prefer Python? I was talking from my own personal experience.
Are you using a LLM to write your replies? Because they seem very odd to me.
I agree that we are only going to see solidification of languages due to tools like Claude Code. Why would I take a risk on something new if I can't use a much faster tool, it can already be such a battle getting adoption in mid/large companies. I wonder how a release like React would fare if it was released in another 5-10 years once LLMs are deeply embedded.
Does PHP include WordPress and the other ones like Drupal (not sure anyone uses that garbage anymore)? If yes, then shouldn't PHP and Python themselves also be counted as C projects?
It is an interesting study.
In terms of new languages and AI, I would like to see if anyone comes up with a more AI friendly language, as opposed to human friendly. I.e. programming languages were often designed to be easier to read/write for people, but maybe it makes sense to think of languages where it would be harder for AI to make a mistake.
Python at number 1 makes me cringe... In my experience, Python is not a language that I would use for anything other than a script or some solo PoC. I would absolutely never use it for code expected to exceed 1000 lines, code that's maintained by more than one person, or code that takes more than a few seconds to run. Python has a lot of great libraries as a result of being the language of choice for teaching non-software-engineers at university. A lot of smart people contribute to the ecosystem, but I wish they would focus their efforts elsewhere: preferably any compiled, strongly and statically typed language that supports multi-threading.
It’s unreasonably distracting to me that the links on this webpage repeatedly (but not always) include the preceding or following space. It looks really sloppy. How do you even do that so much on a webpage? Was this typed in a WYSIWYG editor?
Comparing 2024 and 2025, Ruby is growing again but still on the lower end of spectrum.
Both Java the language and the JVM are great. A lot of the important work for the JVM just landed. I am not even sure there is anything really missing anymore. But the whole ecosystem is so vast, I wonder if anyone would want to just carve out a subset of Java.
No Zig, Crystal, or Odin, but Julia and Elixir are there, just without numbers.
I think it is because Android's application framework runs on the JVM, and JNI is annoying enough that binding Kotlin to system-level languages other than C++, such as Rust or Zig, is hard in an Android app.
For game developers targeting Android, especially with mature commercial games, you have no choice but the Unity engine. And take a look at Google announcing
[a donation of $250,000 to the Rust Foundation](https://www.linkedin.com/feed/update/urn:li:activity:7376353...)
but there are still constraints on using Rust libraries in an Android app.
LLMs could also ease adoption of new languages by making hiring less of a barrier to using something more niche. It becomes easier for someone to hit the ground running and be productive even if they still need to put the time in to become an expert.
Instead I find myself more concerned with which virtual machine or compiler tool chain the language operates against. Does it need to ship with a VM or does it compile to a binary? Do I want garbage collection for this project?
Maybe in that way the decision moves up an abstraction layer, the same way we largely moved away from assembly languages and caring about specific processor features.
Cool, but I don't know how credible this information is. From what I've read about how they got the data and came to these numbers, it does not exactly inspire a high degree of confidence.
There are a lot of Java devs out there. At least 2 of the FAANGs are big on Java. Any big consultancy (Accenture, Cap Gemini, Fujitsu, Deloitte), is going to ship mostly Java too.
I’m not a fan personally, but it's easy to find devs in it, so it's popular in firms where language choice is a commodity/utility.
Never met Android developers? I'm not sure how much Kotlin has already taken over there, but if you develop for Android you'll have to deal with Java (for better or worse).
Also for backend services Java is a pretty solid option. Just compile a monolithic JAR and 'run' that anywhere, which is much more robust than some node.js app cobbled together from tens of thousands of leftpad-equivalent npm packages ;)
They’re definitely underrepresented in most contexts where you’d meet other coders - but make no mistake, half the business world runs on Java, and it’s still the main language taught in a lot of CS university programs.
I've been using Java since it came out in 1996, alongside many other programming languages like C#, TypeScript, C++, and SQL (PL/SQL and Transact-SQL mostly).
Also Android is all about Java, even if Kotlin is the new darling and it uses its own runtime (ART), everything around it is based on the Java ecosystem, the IDE, build tools, most libraries coming out of Maven Central.
It's an iceberg. Loads of things run on Java. C# as well, which is similar. Large ecosystems that you barely have to leave when you're inside. Also, a tendency to be corporate systems, which reduces the visibility, since people generally are not allowed to show their code from work.
In the late 90s and most of the 00s, I'd say that you had a 50/50 chance of the code being either Java or C++, in any traditional (non-webdev) enterprise setting.
So, so much 00s enterprise legacy code is written in Java. In the early/mid 10s I saw a huge push to rewrite that code, though.
Hi I am a Java developer! Every company I ever worked at since 2008 used Java. At one point I learned many languages but still the only one that could get me a job was Java. To each their own bubble.
The methodology involves search hits, Stackoverflow, conference and journal articles.
In all of these, Python is artificially over-represented. Search hits and Stack Overflow questions represent beginners who are force-fed Python in university or in expensive Python consultancy sessions. Journal articles are full of "AI" topics, which use Python.
Python is not used in any application on my machine apart from OS package managers. Python is used in web back ends, but is being replaced by Go there. "AI" is the only real stronghold, due to inertia and marketing.
Like the TIOBE index, the results of this so called survey are meaningless and of no relevance to the jobs market.
Huh. If you hate AI as sorely as you do, you will find any excuse to be dismissive.
Besides AI development, Python is used heavily in data processing and data science, also in writing bots of any kind, and as a glue language to do numerous tasks. It is true that it is being replaced by Go in web backends, but it still sees heavy use in that too. Moreover, Python is the only language that many AIs can interactively use in their chat sessions.
Python is used heavily in data science (and a lot of other places) because people who go to university for non software engineering disciplines get taught Python because it's "the easy language that already has libraries for this research we're doing." Those people then go on to write more of these libraries. Their code does amazing things, but very slowly.
There's a good video series called "Programming Paradigms" by Jerry Cain, taken from his class at Stanford. I'm not sure how long ago it was, but it was before whiteboards, when they were still using chalk. He just started including Python that year when it was the up-and-coming thing, as an example of a higher-level language that does a lot of stuff for you. It probably seemed like a breeze for the students after the previous weeks spent on C, assembly, and Lisp, but at least they got some of the fundamentals of how things worked first.
Totally, it's always about trade-offs. It takes a decent amount of time programming to become comfortable choosing the language based on the task rather than tailoring the task to the language.
Ahh, yes. In college, the CS department used Turbo Pascal (hint, I'm old). My first programming gigs out of college were FORTRAN and COBOL. It's nice to see that they are still registering.
Now, time for a Metamucil and a nice nap before my standup.
You shouldn't expect to see any pre-1.0 language in this list, especially Zig, which not only makes no stability guarantees but actively discourages expecting any stability. Heck, Zig 0.15 just came out and completely overhauled the IO framework from top to bottom. Once Zig reaches 1.0, expect them to make some effort to gain adherents, which currently is a complete non-goal.
Other than comptime, it doesn't bring anything new to the 21st century.
Its safety story is basically what Modula-2 (1978) and Object Pascal (1986) already had, but now it gets done with curly brackets instead of begin/end.
UAF is an issue, and the tooling to tackle it is exactly the same that C and C++ have had available for the last 30 years, give or take.
It will be another language to talk about; however, I doubt it will ever make it into the mainstream, like having platform vendors and console devkit makers care that Zig exists.
Don’t think Zig will ever make it into popularity, tbh. It is good for low-level work, but it isn't like Rust or Go. Rust and Go are really good alternatives for C++ or Java code; they just work, have good tooling, etc. But I would never use Zig for that kind of use case instead of Rust, since you don't need to go full on engineering every dot mode.
I switched to Zig from Rust for implementing a database 6 months ago and have no regrets, but I just don't think anyone would use it for writing backend code or other similar smaller things.
I used to think similarly about rust before though so don’t really know anything
> I switched to Zig from Rust for implementing a database 6 months ago and have no regrets, but I just don't think anyone would use it for writing backend code or other similar smaller things.
I don't have a horse in this race, but have you shared more about this decision? It would make for a good blog post and an even better HN discussion.
I’m not that accomplished in this field so don’t think my pov will be that useful. It is mostly same as the rationale that tigerbeetle developers shared.
Might consider writing something if my project ends up being useful
> I switched to Zig from Rust for implementing a database 6 months ago and have no regrets, but I just don't think anyone would use it for writing backend code or other similar smaller things.
I've been thinking about starting a project in Zig rather than Go lately, even though I am skilled at Go. I really like working with, let's say, cracked, or simply crazy people willing to learn an esoteric language, and Zig fits the particular needs I have (mostly very nice C interop).
Would you recommend? How are the average zig contributors vs something like go?
Rust has a unique niche it can occupy - namely, it's a "zero-overhead" memory safe low-level language.
I never understood why Go is brought up next to Rust so often, when it has barely any unique qualities: it's a high-level GC'd language with a dumb type system that outputs a single binary... of which there are 1000 other examples. At least it has good tooling, I guess.
How many people are waiting for it to hit 1.0? I am.
I am interested in Zig, but until they can guarantee a stable point for an extended period of time I have limited interest. Same way with Rust gamedev I'm keeping an eye on Bevy but want it to hit 1.0. Some things pre-1.0 is fine, but more core pieces of dev like the language often warrant requiring greater stability guarantees.
That's partially attributed to the fact that .NET is a truly batteries included solution and quality of it is generally good enough that there is no need for second or third alternatives for every basic thing.
I don't doubt your comment, but I immediately thought to compare to Java. Why does Java have exactly what you mention -- "second or third alternatives for every basic thing"? Is it easily explained by age as a language? Also, can you provide a concrete example of something that .NET includes in the core that Java does not?
If I were to ask you: how many APIs do you think Java's "core" (whatever you consider core) has? Think of a number before opening the link below, and then let me know what you had in mind.
Generally, .NET's "out-of-the-box" experience is comparable to using Java with a framework like Spring, as it includes a built-in DI container, a modern ORM (Entity Framework), a complete web stack (ASP.NET) with a high-performance web server (Kestrel), and so on. These first-party tools receive strong support from both Microsoft and the community and set a very high standard, which likely reduces the incentive for third-party alternatives to emerge. Of course, there are also many quality third-party solutions, but they mostly cover areas that .NET itself does not cover well. You could happily build a lot of things using only .NET, without needing any third-party dependencies.
F# not listed at all? If that’s accurate (color me doubtful), people are missing out on a great language that runs on a popular platform. Personally, after learning to love F#, I’m never going back to C#.
If you’re working in a large team, and you’re not locked into the MS stack, and you’re not doing anything that needs to be super performant… Java is, by far, your best option.
The tooling and ecosystem aren’t great compared to some of these languages, but Java itself can be pretty damn good.
I used to be a hater many years ago but I’ve since grown to love it.
Agree, the build systems around Java have become absurdly complicated, but as an overall offering Java and JVM remains pretty compelling for lots of work, particular on a large team, as you say, where Java can be the "least bad option" in terms of getting a diverse group of people to learn it.
What is wrong with Java's tooling and ecosystem? Asking because it used to be the default a decade or so ago, with a vibrant ecosystem, so I find your remark surprising.
I used to use clojure, but one of the reasons I stopped was I personally found tools like Maven cumbersome to work with. I admit that was 8 years ago or so, so there's a chance it got better, but once people are driven away by tooling issues it takes a lot to convince them to give an ecosystem another chance.
An example of this ossification of understanding is how people still think dotnet is Windows only because they stopped caring before Core/Modern dotnet became a thing
Publishing a Maven package is also excruciatingly complicated. By contrast, NPM is actually too easy. I suspect that we see fewer supply chain attacks in the Java ecosystem because attackers are like “you know what.. never mind.”
I wouldn't call "3643 packages installed, 1200 vulnerabilities" for a hello world "simple".
People have their problems with Maven, but unless it's some overly complicated legacy project (where npm just explodes, I guess? I have had Windows machines freeze from deleting the node_modules folder), it just works and you just give it a list of dependencies.
They are just different. I mean, setting up monorepo is far easier with maven over npm. Besides, maven offers basically cookie cutter project organization where every maven project looks like every other maven project.
As for other tooling, the JVM ecosystem is just better than the JS one: definitely more complex, but also more powerful.
Gradle keeps on improving. I use it for Android, and even though it is complex, and then add the Android Gradle Plugin complexity on top of that, I would not trade it for the iOS build system.
One of my complaints with Gradle is that if you write a plugin (Java) it shares the classpath with other plugins. You might find some other plugin depending on some old version of a transitive dependency.
And you can code Groovy on top of it, still my favorite programming lang.
I too would like some illustration of why the tooling (Intellij, etc) is insufficient. Maybe gradle as a build system? Although I have to say with LLMs, gradle build scripting is a LOT easier to "build" out.
Java at #2? Is it really still being used much for new code in this day and age? Or is its popularity mostly due to so much legacy code out there to be maintained?
Yes. Companies with existing workforces are doing new software all the time. If you have a workforce of, let's say, 100 devs, then even if you hire 10 new ones, temporarily or permanently, you for sure tell them: use what the other 100 use. It is not maintenance that drives this; it is the inertia of your workforce.
And that is very okay. (Modern) Java and .NET are excellent choices. There is nothing wrong with them.
It is a good choice if you want a non-Microsoft-stack language (yes, C# can be developed on Linux, but the quality of the development experience on Linux isn't the same as on Windows) and want a vast ecosystem. Golang is too verbose due to the lack of exceptions, and its ecosystem of third-party packages isn't anywhere close to Java's in size. Swift is very nice, but ecosystem issues exist there too. Rust has too steep a learning curve, so it's more difficult to find developers. Modern Java has improved a lot compared to, say, Java 8, with record types, pattern matching, multiline strings, etc. The pace of new features coming to Java has been quite high in recent times. Plus there is always the network effect.
I have developed .NET solutions on Linux for over 8 years now (and about 10 years on Windows before that) and would say the quality of development on Linux is even better than on Windows today. Sure, you can't use Visual Studio on Linux, but you can use VS Code or Rider, which I would prefer anyway.
If you aren't doing GUIs, that is, which means going into the FOSS ecosystem with Avalonia or Uno; and if you aren't doing anything with profiling or debugging visualization of threaded code and a few other goodies that VS has for .NET and that they will never make available in the VS Code extension.
Also the VSCode extension for .NET has the same license as VS.
Java is still the default choice for many for enterprise software. Job-focused courses and curricula all over the world are still leaning big on Java ensuring a steady and large pool of okay Java devs.
Job focused Bachelor courses and curricula highly outnumber rigorous CS courses like the ones you are likely to find in MIT, UCLA-B, IISc, IITs, Oxford, UCL, Tsinghua, Peking, etc.
Yes, outside HN praises of Elixir, Gleam, and co, corporations run on boring technology, maybe with exception of what the FE folks pick up on each project.
They work, have great tooling, and do whatever is required for customers.
Given how mediocre LLMs are, I don't see this happening anytime soon... but I think a "better" LLM (that puts the "language" into large language model) can seamlessly translate between programming languages.
Right now, it's apparent to me that LLMs are mostly tuned in the programming space for what n-gate would call "webshit", but I think it is a clear (to me) evolutionary step towards getting much better "porting" ability in LLMs.
I don't think that is in the priority list of the LLM companies. But I think it would be a real economic boon: certainly there is a backlog of code/systems that needs to be "modernized" in enterprises, so there is a market.
Ultimately I wonder if an LLM can be engineered to represent code in an intermediate form that is language-independent to a large extent, and "render" it to the desired language/platform when requested.
I wish there were fewer programming languages, because every library needs to be rewritten as many times as there are languages, a combinatorial waste of time.
Now that CoffeeScript is gone I would like to see all Ruby become Python.
> AI assistance means that programmers can concern themselves less and less with the particulars of any language.
Sure. Until we need to. Then we face some apparently tiny concern, which is actually deeply intricated with the rest of this whole mess, and we are ready for a ride in the rabbit hole.
> most developers today don’t pay much attention to the instruction sets and other hardware idiosyncrasies of the CPUs that their code runs on, which language a program is vibe coded in ultimately becomes a minor detail.
This may be very misguided on my part, but I have the feeling these are two very different cases here. OK, not everyone is an ffmpeg-level champion who will thrive in code-golfing ASM till the last drop of cycle gain.
But there are also probably reasons why third-generation programming languages have lasted without any subsequent proposal completely displacing them. It’s all about a tradeoff between expressiveness and precision: what we want to keep in the focus zone, and what we want to delegate to mostly uncontrolled details.
If, to go faster, we need to get rid of our transparent glasses, we will need very sound and solid alternative probes to report what's going on ahead.
Take into account that this is posted on IEEE.
In my opinion, their target audience is scientists rather than programmers, and a scientist most often thinks of code as a tool to express his ideas (hence, perfect AI-generated code is kind of a grail). The faster he can express them, even if the code is ugly, the better. Most of the time he does not care about reusing the code later.
I suspect that scientists, and not programmers, are the target audience because some things would trigger only one category and not the other; for example, they consider Arduino a language. This makes total sense for scientists, as most of the ones using Arduino don't necessarily know C++, but are proud to be able to code in Arduino.
That’s a good point.
For a professional programmer, code and what it does is the object of study. Saying the programmer shouldn’t look at the code is very odd.
But reproducibility is famously a matter of some concern to scientists.
Sure, but their tools are complexity-management tools: hypotheses, experiments, empirical evidence, probabilities. To my knowledge, they deal far less with the determinism programmers rely on. It's reproducible if you get similar results with the same probability.
If code is actually viewed as a tool to express ideas, making it easy to read and figure out should be a goal.
I like programming, I like clean code, so it's something I struggled with when I began research.
But actually, when you don't have specifications, because you don't know yet if the idea will work and you are discovering problems with that idea as you go, readable code doesn't emerge naturally.
You refactor all the time, but then something that you misunderstood becomes a concern, and you need to refactor everything again, and again and again... You lose a lot of time, and research is fast-paced.
Scientists who spend too much time cleaning code often miss deadlines and deliverables, which are actually what they need to produce. Nobody cares about their code: once the idea is fully developed, other scientists will just rewrite better software with a full view of the problem. (Some scientists rewrite their whole software once everything is discovered.)
I think a sensible goal for scientists would be easy-to-write code rather than easy-to-read code.
But if you are iterating on code and using an LLM without even looking at the code, there's a reasonable chance that when you prompt "okay, now handle factor y also", you end up with code that handles factor y but also handles pre-existing factor x differently for no good reason. And scientific work is probably more likely than average programming to be numerics stuff where seemingly innocuous changes to how things are computed can have significant impacts due to floats being generally unfriendly.
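To make the numerics point concrete, here is a minimal Python sketch (the values are illustrative) of how a seemingly innocuous reordering of the same sum changes the result:

```python
# Floating-point addition is not associative: the order of operations matters.
xs = [1e16, 1.0, -1e16]

left_to_right = (xs[0] + xs[1]) + xs[2]  # 1.0 is absorbed into 1e16, then cancelled
reordered = (xs[0] + xs[2]) + xs[1]      # the big terms cancel first, so 1.0 survives

print(left_to_right)  # 0.0
print(reordered)      # 1.0
```

So if a model "harmlessly" rewrites how a quantity is accumulated, results can shift in ways that are hard to spot downstream.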
> This may be very misguided on my part, but I have the feeling these are two very different cases here
They are indeed very different. If your compiler doesn't emit the right output for your architecture, or the highly optimized library you imported breaks on your hardware, you file a bug and, depending on the third party, have help in fixing the issue. Additionally, those types of issues are rare in popular libraries and languages unless you're pushing boundaries, which likely means you are knowledgeable enough to handle those type of edge cases anyway.
If your AI gives you the wrong answer to a question, or outputs incorrect code, it's entirely on you to figure it out. You can't reach out to OpenAI or Anthropic to help you fix the issue.
The former allows you to pretty safely remain ignorant. The latter does not.
Oh dear. Using AI for something you don't understand well is surely a recipe for disaster and should not be encouraged.
My take is that you should use AI for exactly the same things that you would ask a random contractor to do for you, knowing that they won't be there to maintain it later.
On the other hand, one can see it as another layer of abstraction. Most programmers are not aware of how the assembly code generated from their programming language actually plays out, so they rely on the high-level language as an abstraction of machine code.
Now we have an additional layer of abstraction, where we can instruct an LLM in natural language to write the high-level code for us.
natural language -> high level programming language -> assembly
I'm not arguing whether this is good or bad, but I can see the bigger picture here.
Assembly is generally generated deterministically. LLM code is not.
Different compiler versions, target architectures, or optimization levels can generate substantially different assembly from the same high-level program. Determinism is thus very scoped, not absolute.
Also, almost all software has known unknowns in the form of dependencies that get updated constantly. No one can read all of that code. Hence, in real life, if you compile on different systems ("works on my machine"), or again after some time has passed (updates to the compiler, OS libs, packages), you will get a different checksum for your build even though the high-level code you wrote is unchanged. So in theory, given perfect conditions, you are right, but in practice it is not the case.
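The checksum point can be sketched in a few lines: a build artifact is a function of the source and the entire toolchain, so updating any input changes the hash even when your code doesn't. A toy model (hashlib stands in for a real compiler; the version strings are made up):

```python
import hashlib

def build_checksum(source: str, compiler: str, libc: str) -> str:
    # A deterministic "build": the same inputs always give the same artifact hash,
    # but the inputs include far more than the code you wrote.
    blob = f"{source}|{compiler}|{libc}".encode()
    return hashlib.sha256(blob).hexdigest()

src = "int main(void) { return 0; }"
a = build_checksum(src, "gcc 13.2", "glibc 2.38")
b = build_checksum(src, "gcc 13.3", "glibc 2.38")  # only the toolchain updated

print(a == b)  # False: unchanged source, different artifact
```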
There are established benchmarks for code generation (such as HumanEval, MBPP, and CodeXGLUE). On these, LLMs demonstrate that given the same prompt, the vast majority of completions are consistent and pass unit tests. For many tasks, the same prompt will produce a passing solution over 99% of the time.
I would say yes, there is a gap in determinism, but it's not as huge as one might think, and it's narrowing as time progresses.
Your comment lacks so much context and nuance to ultimately be nonsense.
You absolutely can, and probably _should_, leverage AI to learn many things you don't understand at all.
Simple example: try picking up or learning a programming language like C with or without LLMs. With is going to be much more efficient. C is one of the languages that LLMs have seen the most, they are very, very good at it for learning purposes (also at bug hunting).
I have never learned as much about computing as in the last 7/8 months of using LLMs to assist me at summarizing, getting information, finding bugs, explaining concepts iteratively (99% of Software books are crap: poorly written and quickly outdated, often wrong), scanning git repositories for implementation details, etc.
You people keep committing the same mistake over and over: there are a million uses for LLMs, and instead of defining the context of what you're discussing, you conflate everything with vibe coding, ultimately making your comments nonsense.
I've posted this before, but I think it will be a perennial comment and concern:
Excerpted from Tony Hoare's 1980 Turing Award speech, 'The Emperor's Old Clothes'... "At last, there breezed into my office the most senior manager of all, a general manager of our parent company, Andrew St. Johnston. I was surprised that he had even heard of me. "You know what went wrong?" he shouted--he always shouted-- "You let your programmers do things which you yourself do not understand." I stared in astonishment. He was obviously out of touch with present day realities. How could one person ever understand the whole of a modern software product like the Elliott 503 Mark II software system? I realized later that he was absolutely right; he had diagnosed the true cause of the problem and he had planted the seed of its later solution."
My interpretation is that whether shifting from delegation to programmers, or to compilers, or to LLMs, the invariant is that we will always have to understand the consequences of our choices, or suffer the consequences.
Applied to your specific example: yes, LLMs can be good assistants for learning. I would add that triangulation against other sources and against empirical evidence is always necessary before one can trust that learning.
My context is that I have seen some colleagues try to make up for not having expertise with a particular technology by using LLMs and ultimately they have managed to waste their time and other people's time.
If you want to use LLMs for learning, that's altogether a different proposition.
I kinda knew what you meant, but I also feel it is important to provide the nuance and context.
Seems like a significant skill/intelligence issue. Someone I know made a web security/pentesting company without ANY prior knowledge of programming or security in general.
And his shit actually works, by the way: topping leaderboards on HackerOne and having a decent number of clients.
your colleagues might be retarded or just don’t know how to use llms
Would you recognize a memory corruption bug when the LLM cheerfully reports that everything is perfect?
Would you understand why some code is less performant than it could be if you've never written and learned any C yourself? How would you know if the LLM output is gibberish/wrong?
They're not wrong; it's just not black-and-white. LLMs happen to sometimes generate what you want. Often times, for experienced programmers who can recognize good C code, the LLMs generate too much garbage for the tokens it costs.
I think some people are also arguing that some programmers ought to still be trained in and experienced with the fundamentals of computing. We shouldn't be abandoning that skill set completely. Some one will still need to know how the technology works.
Not sure how your comments relates to mine.
The parent I answered said you shouldn't use LLMs for things you don't understand while I advocate you should use them to help you learn.
You seem to describe very different use cases.
In any case, just to answer your (unrelated to mine) comment: here[1] you can see a video of one of the most skilled C developers on the planet finding very hard-to-spot bugs in the Redis codebase.
If all your arguments boil down to "lazy people are lazy and misuse LLMs", that's not a criticism of LLMs but of those people's lack of professionalism.
Humans are responsible for AI slop, not AI. Skilled developers are enhanced by such a great tool that they know how and when to use.
[1] https://www.youtube.com/watch?v=rCIZflYEpEk
I was commenting on relying completely on the LLM when learning a language like C when you don’t have any prior understanding of C.
How do people using LLMs this way know that the generated code/text doesn’t contain errors or misrepresentations? How do they find out?
>> most developers today don’t pay much attention to the instruction sets and other hardware idiosyncrasies of the CPUs that their code runs on, which language a program is vibe coded in ultimately becomes a minor detail.
If it were even slightly true, then we wouldn't be generating language syntax at all; we'd be generating raw machine code for the chip architectures we want to support, or even just distributing the prompts and letting an AI VM generate the target machine code later.
That may well happen one day, but we’re not even close right now
Also, there’s so much patching in the kernel (for Unix) to work around hardware bugs. And a lot of languages depend on C (with all its footguns) to provide that stable foundation. It’s all unseen work that is very important.
>> ...deeply intricated with...
I think you invented a new phrase. And it's a good one!
It's hard to find good data sources for this, especially now that Stack Overflow is in decline[1].
IEEE's methodology[2] is sensible given what's possible, but the data sources are all flawed in some ways (that don't necessarily cancel each other out). The number of search results reported by Google is the most volatile indirect proxy signal. Search results include everything mentioning the query, without promising it being a fair representation of 2025. People using a language rarely refer to it literally as the "X programming language", and it's a stretch to count all publicity as a "top language" publicity.
TIOBE uses this method too, and has the audacity to display it as popularity with two decimal places, but their historical data shows the "popularity" of C dropping by half over two years and then doubling the next year. Meanwhile, actual C usage didn't budge at all. This method has a +/- 50% error margin.
[1]: https://redmonk.com/rstephens/2023/12/14/language-rankings-u... [2]: https://spectrum.ieee.org/top-programming-languages-methodol...
By far the most useful and helpful is job ads: it literally defines the demand side of the programming language market.
Yes, that does not show us how much code is running out there, and some companies might have huge armies with very low churn, so the COBOL stacks in banks don’t show up, but I can’t think of a more useful and directly measurable way of understanding a language’s real utility.
> the most useful and helpful is job ads
That would certainly be the case, if it were not for the fact that [fake job postings][1] are a thing.
[1]: https://globalnews.ca/news/10636759/fake-job-postings-warnin...
Is there a reason to believe this would skew results?
i.e. are you assuming (insinuating) that jobs for some programming languages are more likely to be fake?
I would assume so. I expect there to be a lot of job postings looking for more "sexy" technologies to create the impression that those companies are growing and planning for the future. Conversely, I wouldn't expect any job postings for old "streets behind" technologies like COBOL to be fake, as they wouldn't help with such signalling.
Yes to your point, COBOL which ranks very low here is still fundamental to the infrastructure of several major industries, with some sources [1] reporting that it is used in:
43% of all banking systems.
95% of all US ATM transactions.
80% of all in-person credit card transactions.
96% of travel bookings.
This may very well dramatically change in the next few years with such an emphasis on enterprise AI tools to rewrite large COBOL repositories. [2]
[1] https://www.pcmag.com/articles/ibms-plan-to-update-cobol-wit...
[2] e.g. Blitzy https://paper.blitzy.com/blitzy_system_2_ai_platform_topping...
In retail banking I'm sure that this could be true. Working in investment banking, I never saw a single COBOL application, or had to have my C++/Java/$MODERNLANGUAGE code interact with one.
Corp bank here. Everyone has rumours about COBOL systems, but no one I've ever spoken to has seen them, interacted with them, or has any other evidence that these really exist anymore either.
Me neither.
But I asked for a bank statement from my old savings account a few years ago, and it took two weeks to print out, printed in monospace dot matrix.
Or the betting company I was a customer of that suspended betting every day at 6:30 am for an hour of daily maintenance. Ironically, they would accept bets on football matches being played at the time, but the system was shut down.
I suspect both are run on COBOL.
You haven’t seen or heard of them because they are abstracted away by APIs, circuit breakers and proxies. Almost ALL banks, credit card companies, travel systems and other high-throughput transaction systems run on mainframes whose code is written in COBOL.
I think the issue here is that people working in fintech don't seem to come across these systems much, if at all - if you know one specifically, please tell us.
I worked briefly at a company that wrote applications that interacted with bank mainframes. Think end point bank teller systems and in branch customer/account management. They definitely do exist - every major bank has a mainframe written in (usually) cobol.
But it's very abstracted, part of our main product offering WAS abstracting it. On top of our ready to use applications, we offered APIs for higher-level data retrieval and manipulation. Under the hood, that orchestrates mainframe calls.
But even then there could be more levels of abstraction. Not every bank used screen-level mainframe access. Some used off-the-shelf mainframe abstractors like JxChange (yes, there's a market for this).
Fintech would be even more abstracted, I imagine. At that point you can only interact with the mainframe a few levels up, but it's still there. Out of sight.
It's still there at the accounting/backend level. Automated Financial Systems' Level 3 and its replacement, Vision, are commercial loan systems.
LVL3 is pure COBOL. It has recently been deprecated, but many banks own the code and are still self-hosting it, along with its IBM green-screen support.
Vision is a Java front end in front of an updated COBOL backend. When your reputation is based on your reliability and long-term code stability, at what point do you risk making the conversion, versus training new developers to work on your system?
https://www.linkedin.com/jobs/view/business-analyst-afs-visi...
No, we are not afraid of our own systems. The idea that there is some fabled computer system which everyone is too scared to touch doesn’t exist (I work in payment processing). There are levels of controls way outside these systems which provide these safety nets (e.g settlement / reconciliation controls).
If the cobol is still there, it’s not due to risk. If anything, the cobol is a much higher operational risk than replacing it.
Analogously, GDSes like SABRE still ran on mainframes until very recently (c. 2023) [0]. SABRE was written in some combination of assembly and some kind of in-house dialect of PL/I, if I recall.
[0] https://www.theregister.com/2022/04/11/gds_gets_over_histori...
Yeah when I worked in investment banking it was VBA and Java everywhere, never saw or heard of COBOL.
Also, lots of quantitative software at i-banks uses LINPACK or BLAS, which use FORTRAN.
Well, I had a very badly specified project to write a library for our back office systems to do Swift payments from our C++ applications, via COM. There was no obvious COBOL involved, on either side, but it has to be said that the whole use case for the library was very murky. And it never worked, due to the lack of spec, not the languages.
Cobol is used in pretty much all enterprise legacy systems.
But "used in" doesn't mean that it's actively being developed by more than a tiny maintenance team.
As the graph we're commenting on is mostly talking about popularity/most used, COBOL is never going to rank higher, because for every one COBOL dev there are more than 100 Java devs employed by the same company.
That's a pretty wild claim. What's legacy for you? I'd consider legacy e.g J2EE crap running on web[sphere|logic] as holding most of the points in that league table vs COBOL.
A legacy software to me is whatever the company that employs me says is said legacy software.
Pretty much every business I've worked at to date has had such legacy software, which was inevitably still used in some contexts.
It's not always obvious, because, following the previous example's numbers, only 1-2 Java devs will ever have to interact with the legacy software, hence from the perspective of the remaining 98, Cobol doesn't exist anymore.
If they're talking about Cobol, it's usually systems originating before the early 90s that haven't been completely rewritten.
J2EE would be late 90s and 2000s.
I can only speak to the two bigger German banks (i.e., Sparkasse and VR banks), but if you look at their outsourced development providers (Atruvia and Sparkasse Informatik), they're still offering incentives for their apprentices to learn COBOL, especially in the German dual apprenticeship programs, which they can steer more easily than university courses. My wife has been doing COBOL for one of them since 2012, and the demand has never diminished. If anything, it's increased because experienced developers are retiring. They even pull some of these retired developers back for particularly challenging projects.
Sparkasse and VR aren't the two largest German banks. DB is at least double the size of Commerzbank which is again 100mn in assets ahead of DZ. I don't find it all that surprising that these small banks are still trying to keep their legacy systems alive, but it's not the case for the bigger boys. (Source: work for several of them)
You are right if we only talk about assets. Should've clarified I meant more in regards of retail customers and branches.
Oh, right, consumer banks. Yes I can imagine they're all extremely legacy bound. They're a very small percentage of banking, though.
1. Not all roles are advertised. I've actually only been interviewed for two of the jobs I've ever had, both at the same place - my current employer because it's a public institution and so it always advertises and interviews for jobs even if it has an internal candidate who is likely to be a good fit. In fact the first of those jobs was basically my shape on purpose, another candidate was an equally good fit and they hired both of us.
Everywhere else people hired me because they knew who I was and what I could do and so in place of an "interview" maybe I grab lunch with some people I know and they explain what they want and I say yeah that sounds like a job I'd take and maybe suggest tweaks or focus changes. No shortlist of candidates, no tech interview, no tailoring a CV to match an advert. Nothing -> Lunch or Drinks -> Job offer.
So that can cause some distortion, especially for the niche languages where there are like six experts and you know them - an advert is futile there.
> measurable way of understanding a languages real utility
It feels like that metric misses "utility" and instead comes from a very American (or capitalistic maybe is better) mindset.
What about Max/MSP/Jitter? Huge impact in the music scene, but probably a very small number of jobs available, so it'd rank fairly low even though it's probably the top media/music language out there today. There are tons of languages that provide "the most utility for their domain" yet barely have any public job ads at all.
I think such metric would be useful to see the "employability of someone who knows that language" if anything, but probably more pain than gain to link "# of job ads" with "utility".
Thinking about how to measure this properly, why not just the moving average of daily downloads over 30 days from each repository?
… yes CI would be a lot of these downloads, but it’s at least a useful proxy
Yeah except job adverts have enormous lag behind what's actually popular. For example we used Rust quite a lot at my previous company but we didn't advertise for Rust developers at all.
Also then you're looking at which languages were popular in the past whereas the interesting stat is which languages are being used to start new projects.
Interesting might not be the same as useful.
If I'm trying to figure out which language to learn next, knowing what I can get paid for might be more useful, even if it's not that "interesting".
If lots of projects are starting up in Rust, but I can't get interviews because nobody is advertising, how useful is learning Rust?
Well, we have to define what a language's popularity means. Rust is surely more 'hyped' than Java, but Java has at least an order of magnitude more developers, software written, etc.
So in which meaning do you use 'popular'?
Ideally we'd like to know both, as they tell us different things.
Existing C++ developers learned Rust.
Find a CS program that teaches Rust and hire their graduates.
> It's hard to find good data sources for this
I like this:
https://madnight.github.io/githut/#/pull_requests/2024/1
It gives you a count of public repos on GitHub by language used, going back to 2012.
Plus TIOBE had Perl enter the top 10 suddenly this year but I do not see any new developers. And Ada too! Where are all those Ada programmers?
Keeping 7 Ada vendors in business, one of the few areas where developers actually pay for tooling.
https://www.adacore.com/
https://www.ghs.com/products/ada_optimizing_compilers.html
https://www.ptc.com/en/products/developer-tools/apexada
https://www.ddci.com/solutions/products/ddci-developer-suite...
http://www.irvine.com/tech.html
http://www.ocsystems.com/w/index.php/OCS:PowerAda
http://www.rrsoftware.com/html/prodinf/janus95/j-ada95.htm
It's more like people in golf suits agree on corruption schemes rather than actual devs making decisions.
Which nonetheless reveals it is more relevant than others.
>>Perl enter the top 10 suddenly this year but I do not see any new developers.
Perl is almost as active as Javascript. And more useful than Python.
https://metacpan.org/recent
I write Perl to do all sorts of things every week. It's strange it's not in the top 5 list.
You're joking right?
If you look at the programming languages list, apart from Python and Java, most are targeted at specific platforms (databases, browsers, embedded systems) or tech (SQL for databases).
The general-purpose programming languages today are still Python, Java, and Perl. Make of this what you will.
Larry Wall at one point said that if you make something very specific to a use case (like awk, sed, PHP, etc.), it sort of naturally starts to drop out of general-purpose use.
It's just that Kotlin, Rust, Go, SQL, Julia, JavaScript, etc. are not general-purpose programming languages.
That was an active debate ... 15 years ago
https://pkgstats.archlinux.de/packages?compare=ada,gcc,go,ja...
Ada seems pretty popular on Arch
This data is kinda worthless for popularity contests, since the downloads may get picked up by AUR packages, but it gives a solid insight into which languages are foundational.
I wish the same was available for other distros
You can do the same with docker images
"Top Languages" doesn't mean "better", nor does it mean "best".
That’s a C++ URL parser library; it has nothing to do with the programming language.
You want gcc-ada for the programming language.
Perhaps the best source would now be the statistics of LLM queries, if they were available.
Edit: I see they raise this point at length themselves in TFA.
Those of you surprised to see Java so prominent, where have you been all your careers? 10-person startups with nodejs backends? You must have been entirely shielded from enterprise software companies.
Java is the new COBOL.
The financial sector, insurance sector, healthcare sector all jumped on Java a couple of decades ago, and they have massive repositories of Java code, and are actively migrating their COBOL code to Java.
> Java is the new COBOL.
What do you mean by this? To me it sounds like people are saying they are both "old" languages, but I don't know what you mean.
I work in a shop that has lots of both Java and COBOL. We are not "actively migrating" COBOL code to Java. It looks like mainframes will continue to exist for decades to come (i.e. >20 more years). Obviously, brand new applications here are not written in COBOL.
I've said this myself as Java's reification of mid 90s OO and poor interop via JNI are not to my taste. But I've spent 25yrs in banking with JPMC, BoA, Barclays et al. Done lots of interop with Cobol on S/390, AS/400 and VME. Never heard of any of those systems being rewritten in Java. Have encountered key mainframe prod systems for which source is lost.
I know of one investment bank (a former employer) that rewrote its mainframe-based settlement system in Java (on Linux). Front office systems were often Java (replacing Obj-C in some cases).
That was two decades ago – almost a generation! Interesting to think that some of those systems would now be considered “legacy”.
There are even whole companies specialized in semi-automatic migration of cobol to java.
I might be mistaken, but as I understand it, COBOL never had the reach that Java does. Java is everywhere, from embedded systems to massive clusters, as a bulky VM, a slimmed-down VM, or native. Business, science, recreation: the sector almost doesn't matter, it's going to have Java somewhere in there.
Yeah, isn't SAP built with Java? Stodgy old big bloated. The new COBOL. Oracle/Java/SAP. Some kind of trinity of evil.
So is Netflix. Silly take
So is most every corporation? What is the point? Have you had to implement SAP? Nobody is happy.
You seem to be saying that SAP sucks because of how enterprise java is.
That would make OPs counter re netflix relevant. I don't understand your point
Wasn't this whole thread about joking that Java is the new COBOL? And a lot of enterprises use Java and that is becoming the new entrenched/old/stodgy language that the hot new kids don't want to use?
In its day, a lot of 'cool' companies used COBOL, because it was an OK solution back then. So to say that today Netflix is cool and uses Java, thus Java is different and still cool, is not valid. It does not invalidate the point. It is the same situation, just decades later.
Maybe I shouldn't have conflated SAP, but they all seem to be part of the same giant ecosystem of 'current/entrenched' solutions that 'we use because we have to, not because it is better'. Not unlike COBOL.
It's a weird one - I've been at Google for more than 5 years. I know from the stats that we have a zillion lines of actively developed Java, there must be huge swathes of the company that you could even call a "Java shop". I dig into random bits of code all the time. And yet I've looked at Java maybe three times in my tenure. And if I needed to submit Java code, I would not have a single contact to ping for readability review.
Java was the first language I learned in my CS degree, I still think this was a sensible choice by the CS department, but I don't think I've written a single piece of Java since I left 10 years ago!
It seems like a lot of Java use cases may be big and important but kinda isolated! Something about where they sit in the economic value chain, perhaps?
So what are the main languages there? Go? Python?
I've read that C++, Java, Python, JavaScript, and recently Golang are the main languages used at Google.
Edit: And maybe some Dart and Kotlin too.
C++ is my guess
What products do you work on and what language?
I work on stuff that's adjacent to system software (I'm actually mostly a kernel engineer). But I've touched code in all the major languages at Google: loads of C++ and Go, much less of Python and Typescript. But Java/Kotlin is the only one I've never touched at all.
Anyone who is surprised is not from Finance sector. I wouldn't just say enterprise though because there could be non-finance enterprises where Microsoft and .NET/C# rule.
Also in telecom and large ISPs, Java is everywhere.
I totally expected JavaScript to get the 2nd spot but looks like TypeScript pulled the votes away. I personally consider JavaScript and TypeScript to be close enough for their numbers to be added up.
I agree, I think it makes most sense to add them up to be the true #2.
Then you should probably add kotlin and java together as well. They share the same purpose, use the same VM, usually live in the same project, have native compatibility, are used with the same frameworks, etc.
Especially considering Kotlin is used as a drop in replacement for Java in a lot of projects. Especially when using the type of frameworks often associated with Java (Spring, Quarkus, etc.).
Personally, I think statistics like this are biased towards the median of the past few decades and do not necessarily tell much about the future; other than that things apparently move very slowly and people are mostly conservative and stuck in their ways.
Cobol is still in that list. Right above Elixir, which apparently is a bit of a niche language. Kotlin has only been around for about 15 years, and the 1.0 release was actually only nine years ago. Java was released 30 years ago and it's been dominant in enterprise development for 25 years now. So, no surprise that Java is nearer to the top.
Python is surprising but it's been around for quite long and gained a lot of popularity outside the traditional computer science crowd. I know biochemists, physicists, etc. that all use python. And it's a great language for beginners obviously. It's not so much that people switched to python but that it is driving the growth of the overall programmer community. Most new programmers use python these days and that explains why it is the #1.
Javascript has had a virtual monopoly on basically anything that runs in a browser, which is of course the most popular way to distribute code these days. Especially since plugins were deprecated and things like applets, flash, etc. disappeared around fifteen years ago. Anything that ran on the web was either written in Javascript; or transpiled/compiled to it. WASM is starting to change that but it's early days.
What the past 25 years tell us is that things definitely change. But very slowly. C++ still outranks Javascript. That's because it's mostly browsers where it is used. It's a lot less popular for other things.
I like Kotlin, so I'm biased. But it's obviously not the most popular thing by a long shot. But popular doesn't mean good. I actually like python for small unimportant things. But I reach for Kotlin if I need to do it properly. I used to reach for Java. But Kotlin simply became the better tool for the job; at least for me. I even prefer it over typescript and I do occasionally use it for web frontend development. The transpiler is pretty good. And there's a WASM compiler too and Compose for WASM just entered beta. Kotlin seems future proof and it seems to be growing into wider adoption. There are a few million programmers around by Jetbrains counts. It's not nothing.
C++ is still very popular where you need raw performance, but not quite as raw as C. Especially given that Python is often used as a more user-friendly interface on top of it.
True. Pretty much every Kotlin recruitment message I got was because of past Java experience, so the job market seems to agree with you.
JS is a valid TS, Kotlin is not a valid Java (only at a bytecode level, but then you might as well combine all JVM languages).
But TS is not valid JS and nobody uses TS because they can write JS in a file with a different extension. You also get 0 benefit from running `tsc` on a JS file. You could argue that C is valid C++ so there's no reason to discern them either.
> Kotlin is not a valid Java
But you can easily have both of them in the same project (e.g. when slowly moving to kotlin) and have them interop.
It doesn’t make it valid Java. You can paste JS verbatim to TS file and it will work.
You can also easily have Objective-C, C, and C++ in the same Swift project and have them interop. That’s a feature of Swift. But adding their numbers together wouldn’t make sense.
Doesn't really bring much benefit. Coming from Java you are more quickly useful in C++ too, and you can write server apps without fuss. There's very little benefit in using a different language when Java literally does the same and is used everywhere else.
That applies to all JVM languages no?
Java devs at large are generally not excited about writing Scala.
There are dozens of us! Dozens!
And Clojure and Scala. So really Clojure is number 2. :-)
Add Scala in there, while you're at it!
Don't forget Jruby and Groovy.
Touche!
You could make a separate graph with "platform" or "language family" so you do js/ts, jvm, .net/clr, C/C++ etc.
That one is perhaps more interesting from an industry/jobs trend perspective whereas the TS vs JS trend is also interesting on its own.
JS is a valid TS, so there’s no reason to discern them.
And #1 on the jobs chart
Then also add up Java & Kotlin and C & C++.
Ooh, then JS&TS are not number two!
Agreed. There’s a few consolidations I’d prefer, including BEAM-based languages as one.
As a backend dev (mostly working in fintech) I feel weirdly unable to find a target language to move to.
After working with Node and Ruby for a while, I really miss a static type system. TypeScript was limited by its option to allow non-strictness.
Nothing catches my eye, as it’s either Java/.NET and its enterprisey companies, or Go, which might not be old but feels like it by design. Rust sounds fun, but its use cases don’t align much with my background.
Any advice?
If you want to stay in Fintech, I really don't see anything beyond Java, C#, C++, TypeScript (for the Web stuff).
Some fintech companies might go a bit outside the norm and allow for Haskell, F#, or Scala, which they tend to use as DSLs for some workflows.
Then, if you're into array languages, banking and fintech is one of the few domains where they have managed to stay around, but those positions seem hard to get.
Dyalog (APL), J, BQN, Kdb+ (Q)
https://www.arraycast.com/resources
I work in Fintech, and we use Python with type checking. It works like a charm. Safer than Java.
It's been years since typing was added to Python, and many people still underestimate it. You're right; Python with strict type hints is more type-safe than Java and Go. It even has features like pattern matching and proper enums, both of which Go lacks.
The only issue is that some libraries (not many!) are still untyped, so you may need to write wrappers or stubs occasionally. But that applies to a small and decreasing minority of libraries.
I only wish Python were faster and had ahead-of-time binary compilation.
You should give Julia a go; it can be written in a completely static style if desired (not as static as Rust, of course, but compared to Python it’s miles ahead). It can be made fast, very fast. AOT compilation with trimmed executables is coming in 1.12.
Doesn't Julia suffer from very long startup times? One of the things I use Python for is CLI programs. The startup time isn't great either, but Julia was even worse last I tested.
Julia v1.12, the unreleased version which is currently in release candidate stage (and has had a longer RC stage than expected but should be done at least by the end of the year) has the ability to fully ahead of time compile and generate binaries, like you would expect from a language like C. It also trims these binaries so that they are a reasonable size given the elements of the runtime that are used. Thus for good type-stable code you get callable binaries without JIT overhead, and this leads to a much better CLI experience. It will take a bit of time so that there's more package tooling and support built around this feature, but this is the path a lot of the ecosystem CLI tooling is now going and that will be a pretty dramatic shift in the usability for these kinds of use cases which Julia had traditionally ignored.
Python sure is slow. It's not that big a problem most of the time, but once it becomes relevant, it becomes super relevant. The lack of AOT binary compilation also is very annoying.
That was a fun one, being safer than Java.
Since we are at it, faster as well.
Not as fun as still having no null safety in Java.
As if Python was any better in that regard, as a dynamic language.
Python with type checking is statically type checked. So yes, Python with type checking is better in that regard. And it's safer than Java because there are more classes of errors that Python with type checking will pick up during static type checking than what the Java compiler will pick up.
If Python with added sugar counts, the same goes for Java with added sugar.
Can you name any tool that adds static null safety to Java?
How did you do typing? Was it 'hints', or some other outside option? It always seemed to me that since it isn't baked in, it comes down to organizational discipline.
We run mypy with `--strict` mode in CI, which means it only passes if we have type hints and if they are correct, or if we have added `# type: ignore[code]` for places where the errors are reported.
And the type hints and behaviour are specified as part of Python, so it's kind of baked in. It's just that the actual type checker is not part of CPython, though Mypy is considered the reference implementation.
We have quite a large code base and very few `# type: ignore[code]` directives.
Some third party code still has poor type hints, but for the most part it's fine, and we get null safety which you don't get for Java.
SonarQube/Checkstyle/Kotlin/Scala sugar > Mypy sugar
Type hints and their behaviour are part of the Python language. Kotlin and Scala are not Java. I did not know that Checkstyle + SonarQube add static null safety to Java, but I look forward to you sharing a citation showing that they do. I think everyone coding Java should really learn to use this so they don't create null pointer exceptions.
Scala is the best language I've ever used - all the good parts of Typescript and all the good parts of Java or Rust. And fintech is one of the few niches where you might still be able to find a job using it.
We use Scala as well and are very happy with it. We also have a lot of TS, and at this point we are tempted to just switch to using ScalaJS on the frontend as well because of how much better the language is to use. It feels like Scala 3 fixed a lot of issues people had with the language. The IDE tooling isn’t the best, but even then it feels like the IDE tooling for TS breaks all the time once the project is large enough.
> We also have a lot of TS, and at this point we are tempted to just switch to using ScalaJS on the frontend as well because of how much better the language is to use.
I did that and highly recommend it, at least for new projects - not sure that it's worth the effort of porting an existing project, but ScalaJS just works and I found far fewer rough edges than I expected.
How's the tooling story with Scala these days?
I played with it a long time ago and the IDE (eclipse plugin?) was a bit of a mess, sbt was prone to weird issues and lock ups and the compiler was pretty slow.
Very fun language tho!
Some of the folks now work for JetBrains, thus IntelliJ is the one with the best support.
Then they decided to focus the remaining effort on Scala Metals, which is based on the LSP protocol.
In both cases it is alright for Scala 2, there are some rough edges for Scala 3.
I had this feeling when Kotlin (which took a lot of ideas from Scala) came along.
IntelliJ's tools are great.
Rust is general purpose. You can use it for anything.
But use the best tool for the job. Ecosystem matters. What are you planning to build?
Okay, Rust is general purpose enough...
But is it general audience? (can every Py/PHP/JS/TS/Java/C# dev become productive in it quickly?)
Also: if you want quick (re)compiles on a larger codebase, Rust is not for you.
I would say yes. I have experience teaching Rust to ~20 yo students of Java, and they are able to be productive in Rust within a semester. Your median Java and C# dev should be able to use Rust. Dunno about Python devs.
Sure they can use it, and learn it in a semester.
Also: good you use Rust in teaching!
But I want a fast on-ramp, quick iterations and clean looking code. (and went with Kotlin because of that -- I like Rust more myself, but I have a business to run, so tradeoffs)
Nim is a statically typed language with a syntax resembling Python's. https://nim-lang.org/
Sometimes I wonder whether it has the potential to become more popular in the future if AI becomes adept at translating Python projects to Nim.
I feel the same way and I think that Gleam is best language that fits this criteria. It has the simplicity of Go but the ergonomics of Kotlin.
I really like Gleam. But if you want a job, that’s probably the worst choice ever. The language is just a few years old and lacks libraries for almost everything. Why are people recommending it here? Do you guys work in Gleam??
Let’s be honest , your realistic options for work are Java, C#, C++, and depending on the industry, Swift, Go, Kotlin, Dart and Rust.
I work in Gleam on my software and it's a treat. I use the FFI for JS libraries all the time.
Being able to talk passionately and with experience about "exotic" languages during an interview will help you land the job.
I will hire the one who talks about the joy of FP and/or static typing ANY DAY over the programmer with only JS experience and no visible interest in looking beyond it.
Python with Pyright in strict mode. I work on a ~200kLOC fully typed Python project [0] and am having fun.
[0]: https://github.com/xdslproject/xdsl
Have you worked with TypeScript? I’m working with both every day and I’m always frustrated by the limits of the 'type' system in Python- sure it’s better than nothing but it’s so basic compared to what you can do in TypeScript. It’s very easy to use advanced generics in TypeScript but a hell to do (sometimes outright impossible) in Python.
Yep, although never in a project of a similar size. One advantage of the Python setup is that the types are ignored at runtime, so there's no overhead at startup/compilation time. Although it's also a disadvantage in terms of what you can do in the system, of course.
Deno and latest versions of Nodejs run TS code without transpilation
I agree it is pretty nice (with uv and as long as you REALLY don't care about performance). But even if you are one of the enlightened few to use that setup, you still have to deal with dependencies that don't have type annotations, or only basic ones like `dict`.
Typescript (via Deno) is still a better option IMO.
Java is maturing into a syntactically nice language, albeit slowly, and it's the backbone of many medium and large companies.
You might have trouble finding small companies using anything but JS/Ruby/Python. These companies align more with velocity and cost of engineering, and not so much with performance. That's probably why the volume of interpreted languages is greater than that of "enterprisey" or "performance" languages.
> Java is maturing into a syntactically nice language, albeit slowly, and it's the backbone of many medium and large companies.
I've heard about Java initiatives to improve it, but can you point to examples of how Java "is maturing into a syntactically nice language"?
I'm tempted to learn it, but wonder whether it would really become nice enough to become a 'go-to' language (over TS in my case)
I've always felt it was verbose, and the need for classes for everything was a bit of overkill in 90% of circumstances (we're even seeing a pushback against OOP these days).
Here are some actual improvements:
- Record classes
public record Point(int x, int y) { }
- Record patterns
record Person(String name, int age) { }
if (obj instanceof Person(String name, int age)) { System.out.println(name + " is " + age); }
- No longer needing to import base Java types
- Automatic casting with pattern matching for instanceof
if (obj instanceof String s) { // use s directly }
Don't get me wrong, I still find some aspects of the language frustrating:
- all pointers are nullable, with only annotation support to lessen the pain
- the use of builder classes (instead of named parameters, as in other languages)
- having to define a type for everything (probably the best part of TS is inlining type declarations!)
But these are minor gripes
It has virtual threads, which under most circumstances let you avoid the async model. It has records: data-first immutable classes that can be defined in a single line, with sane equals, hashCode and toString. It has sealed classes as well; the latter two give you product and sum types, with proper pattern matching.
Also, a very wide-reaching standard library, a good-enough type system, and possibly the most advanced runtime, with very good tooling.
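A minimal sketch of the records-plus-sealed-classes claim, assuming Java 21 or newer (the class and method names are made up for illustration): a sealed interface gives you a sum type, records give you product types, and the compiler checks switch pattern matching for exhaustiveness.

```java
public class ShapeDemo {
    // Sum type: a sealed interface with a fixed set of record implementations.
    sealed interface Shape permits Circle, Rect {}
    record Circle(double radius) implements Shape {}
    record Rect(double w, double h) implements Shape {}

    static double area(Shape s) {
        // Record patterns deconstruct in place; the compiler rejects a
        // non-exhaustive switch over a sealed hierarchy, so no default needed.
        return switch (s) {
            case Circle(double r) -> Math.PI * r * r;
            case Rect(double w, double h) -> w * h;
        };
    }

    public static void main(String[] args) {
        System.out.println(area(new Rect(3, 4)));  // 12.0
        // Records come with equals/hashCode/toString for free:
        System.out.println(new Circle(1).equals(new Circle(1)));  // true
    }
}
```

Adding a new record to the `permits` list makes every switch over `Shape` fail to compile until it is handled, which is the ML-style sum-type workflow the comment refers to.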
There are some interesting talks and slide decks that I have to search around to find. Here is one: https://speakerdeck.com/bazlur_rahman/breaking-java-stereoty...
Check jbang.dev, and then talks by its author Max Rydahl Andersen. That could be a starting point.
https://inside.java/2024/05/23/dop-v1-1-introduction/
Aren't a lot of those Java employers stuck on an old version of the language, one that's lacking most of those nice features?
Less and less, with the new release cycles.
What you get is either really old (Java 8 stuck on something nasty like WebLogic), or companies running either cutting edge or LTS.
Take a page from janestreet's book: use ocaml.
Rust is also a general purpose language, there's no reason you can't use it for just about any problem space.
I tried OCaml last year (I knew some Haskell and Scala already) and it was hell. The language itself was pretty, that's true. But the schism between OCaml's stdlib and Jane Street's Core library was incredibly frustrating.
I guess when you're working at Jane Street, use only their core lib and you get some proper onboarding, things could work great. However you're programming very much in a niche community then.
You should try ReScript. The language has improved a lot recently. If TypeScript is JavaScript with types bolted on, then ReScript is JavaScript redesigned the proper way. The LSP is also surprisingly good. All that while still being in the Node ecosystem.
> - Typescript was limited by its option to allow non strictness
That's 100% a project setting you can turn on or off though, it's like -Wall in C++ projects.
I'm also a back-end dev who's worked in fintech and IME, TypeScript is a great choice.
From my experience with python, none of its type checking libraries are complete enough.
>That's 100% a project setting you can turn on or off though, it's like -Wall in C++ projects.
I know, it's just not realistically up to me. It depends on the team/company culture, which I'd rather not have in the picture; I've already gone through trying to fight the sea of unknown/any.
You may want to look into Crystal.
+1 for Crystal, it's a fantastic language and the community is great (on the official forum).
Do you think those languages are scratching the itch? If not maybe we need to pick a suitable underdog and champion its use. Clojure comes to mind. Or Unison. Something where immutability shines.
You could try Crystal if you like the other parts of Ruby. But you probably ain't going to find a job writing it anytime soon.
What's wrong with Java? It's used everywhere from fintech startups to banks/insurers, with features like decimal arithmetic introduced specifically for finance a long time ago, multiple runtimes (Oracle HotSpot, IBM OpenJ9, GraalVM), a very healthy ecosystem of libs/packages, etc.
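The decimal arithmetic mentioned here is presumably `java.math.BigDecimal` from the standard library; a small sketch (class name made up) of why finance code prefers it over binary floats:

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class MoneyDemo {
    public static void main(String[] args) {
        // Binary doubles cannot represent 0.1 exactly:
        System.out.println(0.1 + 0.2);  // 0.30000000000000004

        // BigDecimal keeps exact decimal digits and their scale:
        BigDecimal a = new BigDecimal("0.10");
        BigDecimal b = new BigDecimal("0.20");
        System.out.println(a.add(b));   // 0.30

        // Rounding is explicit, e.g. a third of $10.00 truncated to cents:
        BigDecimal share = new BigDecimal("10.00")
                .divide(new BigDecimal("3"), 2, RoundingMode.DOWN);
        System.out.println(share);      // 3.33
    }
}
```

Constructing from strings (not doubles) is the important habit: `new BigDecimal(0.1)` would capture the inexact binary value.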
>What's wrong with Java? It's used everywhere from FinTech startups to banks/insurers
As the OP, it’s not even the language for me but the implications of companies that use it.
It's a non-starter for startups/scaleups and strongly associated with oldish companies and consulting firms, which in turn translates to far worse working conditions (not remote friendly, overtime, dress codes, etc.).
Mind that it might just be a local culture thing, your mileage may vary.
My startup switched from Python to Java and saw our productivity explode. Using modern Java versions in a non-enterprise way (no frameworks, minimal OOP, minimal DI, functional features like immutable objects, Optional, etc.) is quite nice. Our ability to deliver performant, working features was orders of magnitude faster than with Python. The ecosystem of libraries is crazy deep, which also helps build quickly.
I won’t deny there’s a lot of bad Java written, but IMO it’s actually one of the best languages for a startup if any of your code needs good performance.
100%. Java has an amazing standard library, amazing IDE support, AOT compilation, JIT optimizations, static typing, runs much faster generally, and supports multi-threading... seems like a no-brainer to me.
I usually ask at interviews what version of Java they are using. If they are on >8 or >11, that's a pretty good green flag.
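The multi-threading point above is where recent versions shine; a minimal sketch of virtual threads (Java 21+, class and method names made up), where plain blocking code scales without an async model:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.atomic.AtomicInteger;

public class VirtualThreadsDemo {
    // Runs n blocking tasks, one cheap virtual thread per task. Blocking only
    // parks the virtual thread, not an OS thread, so tens of thousands are fine.
    static int runTasks(int n) {
        AtomicInteger done = new AtomicInteger();
        try (ExecutorService pool = Executors.newVirtualThreadPerTaskExecutor()) {
            for (int i = 0; i < n; i++) {
                pool.submit(() -> {
                    try {
                        Thread.sleep(10);  // ordinary blocking call, no async/await
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                    done.incrementAndGet();
                });
            }
        }  // try-with-resources close() waits for all submitted tasks
        return done.get();
    }

    public static void main(String[] args) {
        System.out.println(runTasks(10_000));  // 10000
    }
}
```

The same thread-per-task style with platform threads would exhaust memory long before 10,000 concurrent sleepers; with virtual threads it is the default idiom.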
And it's a non-starter for startups, because it's not hyped enough, not for a technical reason.
Many of them would be way better off with a standard traditional Spring app.
If you have very good developers, go with whatever they feel most efficient with (as long as there are enough libraries). But if you're planning on getting big, do plan a transition to stabler tech like Java or C#. They're boring tech and full of guardrails, which is what you want when performance is not your main concern.
> It’s a non starter for startup/scaleups and strongly related with oldish companies and consulting firms, which in turn translates to far worse working conditions (not remote friendly, overtime, dress code, etc).
mkay
A question usually asked by people who haven't used anything else professionally, or at all.
Answer the question.
Maven and Gradle suck.
Julia https://www.reddit.com/r/Julia/comments/1efxp0j/julias_advan...
Have you really been using Node or even Typescript as backend dev in fintech? After 15 years in fintech I have never heard anything like that. For the backend it's mostly Java with some sprinkles of C#, Cobol or Python.
Yup, the previous company I used to work for just had a successful exit last year too. I guess it was an iteration-speed thing, though, as you say, Python is becoming popular as well, and IMO it is no more suited to finance than Node is, other than the connection with data science.
Moonbit lang is currently the best language I've seen (not a joke).
Thank you for this, I just checked it out and it looks really great
C++ of course. Backend and Fintech.
Swift! It actually has a pretty big server community, Vapor and Hummingbird are both great frameworks, Apple has been replacing some of their Java services with it, it's open source and cross platform and Apple seems serious about making it viable on Linux, which it is. No need for Xcode, it has an open source LSP and first class VSCode plugin.
Plus it's a fun language to write. Some people say it's a nicer, higher level Rust.
I also like the look of Kotlin but I've never used it. I think Kotlin and Swift are the two premier modern multi-paradigm languages.
I do like Swift, but it also suffers a bit from an identity crisis. The compiler experience is quite disappointing too; I found myself helping the compiler more than the compiler helped me.
I've since moved to Rust and have not looked back. Importantly, rust-analyzer runs circles around the Swift VSCode plugin (or Xcode for that matter)
The identity is clear: it is a language first and foremost for Apple ecosystems.
It also needs to target GNU/Linux, because Apple got rid of their server offerings, so anyone writing server code for applications in the Apple ecosystem who wants to stay in a single language needs to be able to write that software on GNU/Linux with Swift.
Windows, well, since they have the open source story, it kind of follows from there as a complement.
On the revamped website they are quite clear about the identity.
Cloud Services, CLI and Embedded as the main targets for the open source version.
Have you tried it since 6.2 came out with approachable concurrency enabled? The compiler is quite useful.
No, I think I stopped using it in 5-something. Will try to play with it again to see how it feels these days, thank you.
Though I must admit it's hard for me to imagine using anything other than Rust for 90% of projects.
I'm the same but opposite, I like Rust but find myself using Swift most of the time. They sort of do the same thing but coming from opposite directions. Can't go wrong with either imo.
I really do wanna try Kotlin at some point as well. Rust, Kotlin, and Swift feel like the future of languages to me.
If OP wants to run on Windows, Swift is a non-starter. It barely runs there.
Kotlin!
D?
I have the same dilemma: a strongly typed / modern language with good tooling / library support. I'm also considering Kotlin / Gleam. With Kotlin, practically speaking, we're talking JVM again, along with its resource requirements.
C# is the right answer here.
Fintech should have lots of C++ jobs no? They also seem to use ML languages like OCaml and Haskell more than the average industry...
More Java than C++ in my opinion. And, get this, JS! We shall see how far OpenFin will actually go in that space because they will potentially drive a ton of JS across orgs in finance
Gleam?
What is "Arduino"? If it's the "Arduino" that hobbyists use for DIY devices, then "Arduino" is not a language; it's C++.
The Arduino docs call it the "Arduino programming language" for some reason, even though it's mostly just C++ with a few typedefs. I am not sure why.
Maybe they don't want to confuse beginners. Saying it's C++ means some beginners will assume you can use the standard library, like cout or printf.
You can use those, though. Using the file system will cause runtime errors. The Arduino IDE is just a glorified build system around gcc with a package manager. I recently ported the IDE's behaviour to Makefiles and it wasn't actually that hard.
Correct, but it is the same root cause that makes people say HTML and CSS are programming languages, that C- and Fortran-written libraries are "Python" libraries, and so forth.
Yes, that is weird and makes the chart lose credibility. So this likely should bump C++ a bit.
Well I'm happy that Haskell registers at all! At a level similar to ... LabView (oof). The article proper is rather uninteresting, I'm afraid.
Haskell at least is fun.
My favorite Julia also made the list this year... nonzero users means there is hope for fun languages yet.
With the new Intel+NVIDIA RTX SoC deal, we can expect Python and C++ to dominate that list in the next few years. =3
Dozens of us use Julia. Dozens!!
A few more actually, even though it gets ignored in most HN discussions,
https://juliahub.com/case-studies
It’s grating to see Haskell compared to labview, regardless of context lol
Haskell is a fun language...
LabView is a kick in the pants...
I'd wager it is the installed base keeping LabView on life support. =3
Labview seems like a pain (I haven't used it), but I guess it's super useful for some uses. I recall SpaceX uses it for controlling launches. It comes with models for all manner of hardware.
> but I guess it's super useful for some uses
It comes with device interfaces (not exactly drivers, but sometimes it has drivers too).
All right, I had never even heard of the LabView programming language, haha..
I really loved LabView
NI rarely made established software better after a company acquisition.
Generally, IT and Engineering agreed to deprecate their product lines off critical systems about 2 minutes after the deal went through. =3
https://www.youtube.com/watch?v=WpE_xMRiCLE
NI has niche instrumentation product lines that have been around for some time.
Yet, the modern licensing model jettisoned a lot of their long time proponents. Like many things in life, most problems were non-technical in nature.
SpaceX is an interesting company, and it made some unique design choices. =3
I’m skeptical. There are more people writing PHP and Ruby than HTML? And HTML is a programming language? Those two very surprising results make me doubt the others.
Elixir behind OCaml? Possible, I guess, but I know of several large Elixir shops and I haven’t heard much of OCaml in a while.
Perhaps the people who pick HTML as "what programming language do you use" are a small number :)
A couple of Java programmers from my first job were stopped by police while drinking in a park.
When asked what they do for a living, they said they were programmers.
Then the police officer went:
– Oh, I see. HTML.
> There are more people writing PHP and Ruby than HTML?
Intuitively I'd say yes.
In most jobs I've been in, the ratio of backend/system devs to frontend devs has been from 3:1 to 20:1 depending on the size. Granted, I'm on the backend side and would choose companies accordingly, but still.
Even for web-first services, there will be a flurry of people who won't touch a line of HTML/rendering.
But how much PHP code exists that does not also produce HTML? I think that's the whole point of most PHP code.
An awful lot.
Imagine a PHP backend providing an API for an app. The only HTML ever produced will be the admin backend, and perhaps auth forms for special cases. The surface of the API will produce objects serialized to JSON, and the vast majority of the PHP will be business logic to talk to external services and do what the service is paid for.
Some might not like the language, but whole businesses will run on PHP, with a dedicated react or next.js frontend that only talks to the PHP via API.
Probably very little. It's either JSON or HTML. Most PHP out there is actually WordPress.
As mentioned, HTML is indeed a programming language. But it’s one that is rarely used on its own. So you could argue that having it as a thing of itself in these lists, makes little sense.
I contest this. It has no features of programming languages: no way to set or read variables, or to evaluate expressions, or any kind of flow control.
If it’s a programming language, so is Markdown. (It’s not, either.)
Take a look at this conditional here:
    <picture>
      <source media="(prefers-color-scheme: dark)" srcset="logo_dark.svg">
      <img src="logo.svg" alt="logo" width="48">
    </picture>
"(prefers-color-scheme: dark)" is CSS
HTML on its own is not Turing complete, so it’s not a programming language in any practical sense. It’s a markup language.
There are non-Turing complete programming languages, and there are many things that are Turing complete but have nothing to do with programming (even PowerPoint), so this is neither a required nor sufficient property.
I believe a reasonable way to categorize languages as programming or not is simply: what is its primary use case? HTML's last two letters tell us exactly that it is not a programming language.
Not sure that's a good criterion. There are also non-Turing-complete languages, like core SQL or Rocq, that definitely feel like programming.
I think "a language often involved in the process of making computer programs" is way too weak a definition of "programming language". A programming language at least needs to have state/expressions/logic. I'm sure there is a good definition, but if we allow HTML then any markup is programming, and that's obviously false, so the line has to be drawn somewhere.
The reason this debate is so strange is that some people think it's gatekeeping to say someone who writes HTML for a living isn't "programming". It's nonsense.
HTML isn't turing complete
We all live in our own bubbles. I still think Scala is a really popular language!
HTML is a declarative programming language.
https://en.wikipedia.org/wiki/Declarative_programming#Domain...
If HTML is a programming language, why not SVG? If SVG is a programming language, why not PNG? Is your image viewer just an interpreter executing PNG code? Maybe being a programming language is a spectrum...
Yes, if you go down this road then the equivalence of code and data means that everything is a programming language.
Nah. It’s a declarative language, but not a programming language.
To prove me wrong, show an HTML program that does any kind of computation whatsoever.
There’s not one. It’s not Turing complete. I doubt it’s even Turing partial. It’s a markup language, not a PL.
Is CSS included in the HTML? If so, then it is Turing complete, maybe? See: https://stackoverflow.com/questions/2497146/is-css-turing-co...
Where does this idea that a programming language has to be Turing complete come from? As far as I can tell from cursory searches, the most broadly shared understanding of a programming language is a formal language for directing computations on a computer. HTML does this, CSS does this, and SQL does this. Frankly, even configuration languages like YAML or the spare INI file do this in the proper context.
Can these languages do everything or even most computations you would be interested in doing in a computer? Of course not. But why should the definition be restricted to languages that can do everything?
Your definition is overly broad and makes everything a programming language, at which point the term isn't useful anymore.
But Turing completeness is also too broad. Otherwise PowerPoint animations and Conway's Game of Life are programming languages.
Yes. So, a criterion that is below Turing completeness but which wouldn't make Markdown a programming language. Shouldn't be _that_ hard to find such a criterion. E.g. "has some form of logic/flow control", "can perform computation/execute", etc.
I mean, CS is famously bad at exact definitions [1]. Why should we have one for what a PL is? Just do what humans have been doing for millennia: go by how the term is commonly used. A tomato is a vegetable from a culinary perspective, and HTML is not a PL based on its use case and its literal name.
[1] what is a low or high level language? Strongly typed language?
I will take this over HTML
I wish Python weren't so popular. Really miss compile-time type checking.
As others are pointing out here, use type annotations + Mypy integrated with your IDE or CI. IMO a bigger problem with Python is that it's very slow, and if that becomes a problem it's hard to solve.
I was pondering similar thoughts. Will LLM assistants ossify our current programming languages? My limited testing seems to show LLM assistants do well the more popular the language is (more data in its training), so is the hurdle for adoption of something new going to get even higher?
In an alternate universe, if LLM only had object oriented code to train on, would anyone push programming forward in other styles?
I recently picked up Hare, which is quite obscure, and Claude was helpful as a better— albeit hallucinogenic— Google. I think LLMs may not lead to as much ossification as I’d originally feared.
What is your impression of Hare?
I had looked at it recently while checking out C-like languages. (Others included Odin and C3.) I read some of the Hare docs and examples, and had watched a video about it on Kris Jenkins' Developer Voices channel, which was where I got to know about it.
I like it much more than Zig, and while I like Odin’s syntax more, Hare is more focused on the types of tooling I want to build, so I find Hare’s stdlib preferable. Give it a spin. It’s a simple language.
I started reading the Hare tutorial again because of your comment. Looks good so far.
Just one note for anyone else wanting to check it out:
There are a few sections in the tutorial which are empty and marked as TODO. E.g. "Using default values" and "Handling allocation failure" (the empty sections seen so far, there may be others below).
Still going to check the language out.
Interesting, thanks!
>My limited testing seems to show LLM assistants do well the more popular the language is (more data in its training), so is the hurdle for adoption of something new going to get even higher
Not only that, they also tend to answer using the more popular languages or tools even when it is NOT necessary. And when you call them out on it, they respond with something like:
"you are absolutely right, this is not necessary and potentially confusing. Let me provide you with a cleaner, more appropriate setup...."
Why doesn't it just respond that way the first time? And the code it provided works, but it's very convoluted. If it wasn't checked carefully by an experienced dev asking the right questions, one would never get the second answer, and that vibe code would just end up in a git repo and get deployed all over the place.
I get the feeling some big corp may have just paid money to have their plugin/code appear in the first answer even when it is NOT necessary.
This could be very problematic. I'm sure people in advertising are licking their chops over how to capitalize on it. If you think the current ad industry is bad, wait until that is infused into all the models.
We really need ways to
1. Train our own models in the open, with the weights and the data they are trained on. Kinda like the reproducible build process that Nix provides for building repos.
2. Ways to debug the model on inference time. The <think> tag is great, and I suspect not everything is transparent in that process.
Is there something equivalent of formal verification for model inference?
> is the hurdle for adoption of something new going to get even higher?
Yes.
But today the only two reasons to use niche languages are[0] 1) you have existing codebases or libraries in that language 2) you're facing quite domain-specific problems where the domain experts all use that language.
In either case you won't just use Java because LLMs are good at Java.
[0]: besides for fun or 'for resume'
Of course it’s always been easier to find talent when working in more popular languages. That’s the big risk you take when you choose the road less traveled.
easier to find people, not talent. Some tiny niche languages have a very high proportion of talented developers
I see Raku has surpassed Erlang and Clojure. https://raku.org
I had to look up Solidity, which I've never heard of but see is at least 10x as popular. Its sole reason to exist is Ethereum.
Clojure is not a thing outside the HN bubble.
All of functional programming suffers the same curse. Apart from a small set of libraries (usually standards and pure algorithms), it's much faster to write your own integration than to bring in someone else's version. And it's often easy to vendor the logic you need. You won't see the mini-packages of the JS/Rust world or the big libraries of C++/Java.
You write something and it stays written, mostly because everyone moves the logic far away from accidental complexities, so maintenance is very low.
I've seen certain fintechs using Clojure, and I'm not even in a tech hub; I'm in South America. Not to say there are many, but I've definitely seen 0 jobs for Raku.
Clojure seems more popular than other FP languages such as Haskell or even F#
Java still so popular after all this time. I've been re-learning java using the excellent University of Helsinki's MOOC course.
I was thinking of learning some Spring Boot and creating a small project or two to reinforce what I've learned. However, it feels like tutorials for Spring Boot are of much lower quality compared to those for newer languages/frameworks like JS/React/Python. Often it's just a talking head over a PowerPoint presentation for 30 minutes.
Could people recommend me a good tutorial for spring boot (or anything java that is being used in enterprises)?
This site has great tutorials for Java and Spring:
https://www.marcobehler.com/courses/spring-professional
I truly don't understand this on a tech forum. You have 6000 karma, is this a bot interaction to promote votes?
How does searching "Helsinki MOOC java" not immediately give you the result you are after?
I copy-pasted it into Google. The first result might be it, but it says "Please note, that this is a legacy course. ... The course content is also no longer updated or maintained."
No hint on how old and moldy it is. Does it teach a relatively recent Java or 1995 Java?
So asking if that is the right one doesn't seem out of line.
I mean :/ Let me play devil's advocate: you have added yet another comment without actually giving a link to it...
https://www.youtube.com/watch?v=eFrggyDXdUk&list=PL2s7AeEJ2f...
And I mean, I do feel like as long as they aren't actively harming the "ethos" of Hacker News, we can cut each other a little slack
I feel like I have sometimes done a disservice like this too to HN where I ask for links sometimes and maybe they just wanted to confirm if this was the right course or they might be coming from a job tired and didn't think this tooo much y'know?
But i mean I also understand your standpoint that you want less clutter on HN which is why I feel a bit :/
I was questioning if OP was a karma farming bot, have a look at their profile. It's a lot of these odd one liner questions.
From the perspective of a useful thread, I agree with you
Ahh, just checked it and I think you might be correct but here's my nuanced take
Yeah I can also understand it, but I just saw their comments and scrolled down to find it ``` Nah, DotNET is amazing these days. At the risk of starting a holy war, it is neck-and-neck with Java, and I say that as a Java fanboi. I think it is good to have healthy competition between languages (and ecosystems), like C++ and Rust (and a little bit Zig) or GCC and Clang or Java and DotNet or Python and Ruby or NodeJS and Deno. Plenty of people are compiling and deploying to Linux after DotNetCore went open source. Plus, you can use JetBrains Rider, which is a cross-platform IDE for C#, from the makers of IntelliJ ```
It just seems that their use of words like boi etc. makes them (genz?-ish)
I am genz too (maybe younger than them, still in HS) but yeah maybe they just write one liners which can be quite common but I see that more on the reddit kind of things and not hackernews lol. I can definitely see our generation's attention span being cooked that when I write remotely as long as this, they say nah I ain't reading it. So well, how are they gonna write it for the most part :/
It might be a bot but then again, what is the point? The point I see of HN points is that I might create a new account and be the same guy and kind of accrue them back because I once had it y'know while being myself not writing some one liners lol.
The fact that I like about HN is that I have talked to soooo many people that I admire whether its a one liner from jose valim or etc. and I am happy that hackernews exists to a somewhat degree :>
Like just out of curiosity, has someone ever got any job or smth through HN in the sense that they had their karma in part of their resume and the company was impressed lol, I can see it to a somewhat degree in maybeee startups
Oh, please run away from Spring Boot. It is just the new JBoss. That is OK if you want to get inside enterprise software for a living, but that isn't the best usage for Java.
My opinion: learn to create Android apps in Java. Tutorials are good and you get a new set of skills (if you don't have them already). After that, focus on learning POJOs, which are fundamental knowledge in Java.
Everyone writes stuff in Java/C++ where I work, but Spring Boot is encouraged less and less because of the bloat to debug and performance troubles.
I think Java has more of a written culture than a video culture.
huh. This seems to suggest there are more Rust jobs than Ruby. Wild considering how insanely popular Ruby was 10 years ago.
This just doesn't feel true to me. I've been looking for a Rust job for years, and the only thing I can ever find is crypto scam listings. Meanwhile the Ruby on Rails jobs are plentiful and from real companies.
I speculate it goes this way:
1. Some people at a Java company get excited about Rust.
2. They write some microservices in Rust. They now have "Rust jobs".
3. The company hires more Java devs to replace the people who now maintain the Rust side of things.
4. When needed, there are internal shuffles to replace those Rust devs.
It can't go on forever, but as far as I can tell corporate usage started not long ago. You'll have Rust jobs, but they'll be the same shit as Java jobs. There was a study done about a year ago that showed an across-the-board decline in developer satisfaction with all the "new and shiny" JS frameworks. I 100% think that when companies inevitably start hiring people to maintain those legacy Rust "internal side projects", the same will happen. It is when a workforce not driven by passion, people who complete tasks and features instead of doing the "provably correct thing", has a go that technologies get vibe-checked. We will see which way it goes.
It is interesting that your speculation chose this path: Java to Rust. That surprises me! I would have much more likely to say, C, C++, or Go to Rust. Was your choice of Java arbitrary, or is there a deeper reason?
imho web shops looking for the new shiny have tried to use Rust in search of more type-safety or performance (or to look cool)
Companies using C, C++ or even Go probably are less keen on switching stacks for the sake of it.
Ruby was super hyped 10 years ago. Now all the hype-based programmers are on to TypeScript and Rust... Ruby is IMO at a nice level now; able to avoid some of the worst ideas that hype based programmers like to inflict on people, but still popular enough...
I have heard from multiple sources that Ruby is still very popular in the start-up ecosystem in Japan.
From what I've seen it's popular for startups in general. I'm using it for mine... More YC companies use Rails than you'd expect relative to its "popularity". And yeah, in Japan Ruby is homegrown tech and the community seems pretty big there.
Ruby became popular mainly because of Rails, which has gone down somewhat in popularity in recent years. That may be why Ruby is less popular now than it was 10 years ago. Also, they (Ruby and Rails) got popular much before 10 years ago, like around 2006 / 2007, when the Web 2.0 wave was starting. I had worked on a couple of dot-com projects in Ruby on Rails at that time, that is how I know.
And Python got popular because of the LLM AI thing. It is a shame, because it is quite slow. I had some good times with Jython back in the day, but really wished something more elegant (Nim/Rust/OCaml) had taken over this AI thing instead of Python.
One thing is for sure: don't get tied down to one language because it is popular today. Go with whatever makes sense for you and your project.
> And Python got popular cause of LLM AI thing.
Not really; Python got popular in the 2000s because it was the only sane one of the three choices: Perl, Python, Tcl. You must be young.
The magic word for Python's fame was Zope, via the respective article in Dr. Dobb's; until then I had never noticed it.
There have been several other events like that (Django, Pandas and data science...). I don't think Python's popularity can be ascribed to any single event; it just happens to be a language that is reasonably close to pseudocode, with an excellently thought-out (I mean best in the industry) standard library. Python is practical, first and foremost. That's why it won: unlike other languages it doesn't really have an ideological agenda.
Zope predates all of those, and slowly as you say people got interested and started using it for other stuff, like being a better Perl.
Python has an agenda as well; Guido has said multiple times it was a language designed for teaching programming, which is one of the reasons the Zen of Python came early on.
Zope is sooo 2000s! This is the first time ever I see somebody mention the framework I've spent a few years of my life with!
Before AI, Python was still extremely popular for any sort of data science (possibly because of numpy first, then pandas, but I don't claim any historical knowledge here). And independent of that it was, before the rise of server-side JS, one of the most popular server-side web languages (probably still is).
Also, around the mid-'00s it started replacing Perl as the Unix scripting language of choice.
I'd like to see some clarity in these stats; it can't just be me that finds it hard to believe there are significantly more Python jobs than Java. I wonder if job listings are saying "Python, C++" or something, so that counts as a point for Python even though the job is < 1% Python, just for test rigs or something.
You'd be hard pressed to find a job posting in ML/ data analytics where python is not in the requirements
Something feels off about these rankings, and comparing to last year's Stack Overflow survey (albeit also potentially not a super accurate accounting), I'm left wondering if their sampling methods are at all accurate.
https://survey.stackoverflow.co/2024/technology#most-popular...
Even with a big uptick in Python and Java due to AI, I don't see Javascript+Typescript losing that much ground year-over-year.
What's really interesting is the place of Elixir (below COBOL and ABAP, and more or less the same as Ada). This is very controversial when comparing it with other indexes where, for example, Elixir is the most beloved programming language or the language most people want to use. Also compare it with the number of Elixir posts on the front page of HN.
Any idea how could it be explained?
Anything short of a big AI winter is not going to move Python from its top spot. As Python has become the first choice of output code for LLMs its dominance is only going to grow.
Is Python really the first choice of output code? I'm not saying you're wrong, I just don't know the answer and I don't know where to look to find out.
I would have assumed it might be JS and more specifically React -- isn't that what you often get if you ask an LLM to "make an app that does such-and-such"?
(Experimental anecdata: I just asked Gemini to write an app and it gave me a single HTML file with embedded JS, no TS or React, Tailwind for CSS.)
(Edit to add: then I asked it to "write some code" instead of "write an app", and this time it went with Python.)
I tried it with some basic data structures and the order was: Python, Java, C++, Javascript.
Ime, unless you steer it otherwise, they will default toward TS/JS almost exclusively. Probably Python though if for some reason they decided not to use TS/JS.
Maybe it was because LLMs were trained most on Python code, but yes they seem to prefer it.
Again you're making unsupported assertions ("trained most on Python code") and coming to unsupported conclusions ("they seem to prefer it") -- my own very quick experiment shows that it depends on how you ask ("app" versus "code") at the very least.
The choices of JS and Python were pretty solid for the prompts I gave. Maybe the LLM is, well, making a reasonable decision in context, rather than just defaulting to Python!
"LLMs prefer Python" is an over-simplification. Python was already very popular, so I agree that LLMs are likely to entrench that popularity, but that doesn't mean Python will grow into other areas where it wasn't being used much before.
Why are you verbosely repeating my own replies? You asked why, and I offered a theory: MAYBE because they were trained on Python code. You do know that's how an LLM or even a simple neural network works, right?
Of course it is an over-simplification. Should I do an empirical scientific study before I can reply that MAYBE LLMs seem to prefer Python? I was talking from my own personal experience.
Are you using a LLM to write your replies? Because they seem very odd to me.
Sorry, I guess I'm being pointlessly argumentative.
I didn't use any LLMs for those comments, but that comment style feeds into the training data, so it's not surprising if LLMs like to copy it!
I agree that we are only going to see solidification of languages due to tools like Claude Code. Why would I take a risk on something new if I can't use a much faster tool, it can already be such a battle getting adoption in mid/large companies. I wonder how a release like React would fare if it was released in another 5-10 years once LLMs are deeply embedded.
Is PHP including WordPress and the other ones like Drupal (not sure anyone uses that garbage anymore)? If yes, then shouldn't PHP and Python projects also be counted as C projects?
https://flo.uri.sh/visualisation/24825595/embed
It is an interesting study. In terms of new languages and AI, I would like to see if anyone comes up with a more AI friendly language, as opposed to human friendly. I.e. programming languages were often designed to be easier to read/write for people, but maybe it makes sense to think of languages where it would be harder for AI to make a mistake.
Python at number 1 makes me cringe... In my experience, Python is not a language that I would use for anything other than a script or some solo PoC. I would absolutely never use it on code expected to exceed 1000 lines, code that's maintained by more than one person, or code that takes more than a few seconds to run. Python has a lot of great libraries as a result of it being the language of choice to teach non software engineers at university. A lot of smart people contribute to the ecosystem, but I wish they would focus their efforts elsewhere. Preferably pick any compiled, strongly typed, static language that supports multi-threading.
PHP just seems to drop lower each year. Looks like there is little to gain from using PHP in 2025
It’s unreasonably distracting to me that the links on this webpage repeatedly (but not always) include the preceding or following space. It looks really sloppy. How do you even do that so much on a webpage? Was this typed in a WYSIWYG editor?
Comparing 2024 and 2025, Ruby is growing again but still on the lower end of spectrum.
Both Java the language and the JVM are great. A lot of the important work for the JVM just landed. I am not even sure there is anything really missing anymore. But the whole ecosystem is so vast I wonder if anyone would want to carve out a subset of Java.
No Zig, Crystal, or Odin, but Julia and Elixir are there, just without numbers.
I think it is because Google's Android application framework is JVM-based. And JNI is so annoying that binding Kotlin to system-level programming languages other than C++, like Rust or Zig, for an Android app is so hard.
For game engine developers working on games targeting Android, especially mature commercial games, you have no choice but the Unity engine. And take a look at Google announcing [a donation of $250,000 to the Rust Foundation](https://www.linkedin.com/feed/update/urn:li:activity:7376353...); there are still constraints on using Rust libraries in an Android app.
LLMs could also ease adoption of new languages by making hiring less of a barrier to using something more niche. It becomes easier for someone to hit the ground running and be productive even if they still need to put the time in to become an expert.
Instead I find myself more concerned with which virtual machine or compiler tool chain the language operates against. Does it need to ship with a VM or does it compile to a binary? Do I want garbage collection for this project?
Maybe in that way the decision moves up an abstraction layer, the same way we largely moved away from assembly languages and caring about specific processor features.
If you ask Claude or ChatGPT, they both say Python is their preferred building tool. It will push Python further.
English should be #1:)
Underrated comment! ;)
Cool but I don't know how credible this information is. From what I've read on how they got that data and came to these numbers it does not exactly inspire a high degree of confidence
If there is HTML in the list then why not JSON?
I’m shocked, Java is so high up in all categories. I have never met a single Java programmer, though to be fair I started my career fairly recently.
There are a lot of Java devs out there. At least 2 of the FAANGs are big on Java. Any big consultancy (Accenture, Cap Gemini, Fujitsu, Deloitte), is going to ship mostly Java too.
I’m not a fan personally, but its easy to find devs in it, so its popular in firms where language choice is a commodity/utility
Never met Android developers? I'm not sure how much Kotlin has already taken over there, but if you develop for Android you'll have to deal with Java (for better or worse).
Also for backend services Java is a pretty solid option. Just compile a monolithic JAR and 'run' that anywhere, which is much more robust than some node.js app cobbled together from tens of thousands of leftpad-equivalent npm packages ;)
They’re definitely underrepresented in most contexts where you’d meet other coders - but make no mistake, half the business world runs on Java, and it’s still the main language taught in a lot of CS university programs.
Hello there,
I've been using Java since it came out in 1996, alongside many other programming languages like C#, TypeScript, C++, and SQL (PL/SQL and Transact-SQL mostly).
Also, Android is all about Java: even if Kotlin is the new darling and it uses its own runtime (ART), everything around it is based on the Java ecosystem, from the IDE and build tools to most libraries coming out of Maven Central.
It's an iceberg. Loads of things run on Java. C# as well, which is similar. Large ecosystems that you barely have to leave when you're inside. Also, a tendency to be corporate systems, which reduces the visibility, since people generally are not allowed to show their code from work.
In the late 90s and most of the 00s, I'd say that you had a 50/50 chance of the code being either Java or C++, in any traditional (non-webdev) enterprise setting.
So, so much 00s enterprise legacy code is written in Java. In the early/mid 10s I saw a huge push to rewrite that code, though.
I've never been to China but I can't imagine drawing an inference about the existence of China or its inhabitants from my personal experience.
Hi I am a Java developer! Every company I ever worked at since 2008 used Java. At one point I learned many languages but still the only one that could get me a job was Java. To each their own bubble.
I'm just happy that both VHDL and ADA are in the list.
Compare it to 2024: why the fall of JavaScript?
https://spectrum.ieee.org/top-programming-languages-2024
Is anybody developing programming languages designed specifically as LLM "source" code targets?
Boy those Erlang people are sure loud, I was expecting it in top 15.
Loud and well paid, according to SO.
And people try to convince me that there are Elixir jobs so it's worth learning Elixir, hah.
I’m surprised to see js under c#
You should combine JS and TS for a more accurate ranking.
The methodology involves search hits, Stackoverflow, conference and journal articles.
In all of these Python is artificially over-represented. Search hits and Stackoverflow questions represent beginners who are force fed Python in university or in expensive Python consultancy sessions. Journal articles are full of "AI" topics, which use Python.
Python is not used in any application on my machine apart from OS package managers. Python is used in web back ends, but is replaced by Go. "AI" is the only real stronghold due to inertia and marketing.
Like the TIOBE index, the results of this so called survey are meaningless and of no relevance to the jobs market.
Huh. If you hate AI as you clearly do, you will find any excuse to be dismissive.
Besides AI development, Python is used heavily in data processing and data science, also in writing bots of any kind, and as a glue language to do numerous tasks. It is true that it is being replaced by Go in web backends, but it still sees heavy use in that too. Moreover, Python is the only language that many AIs can interactively use in their chat sessions.
Python is used heavily in data science (and a lot of other places) because people who go to university for non software engineering disciplines get taught Python because it's "the easy language that already has libraries for this research we're doing." Those people then go on to write more of these libraries. Their code does amazing things, but very slowly.
There's a good video series called "Programming Paradigms" by Jerry Cain, taken from his class at Stanford. I'm not sure how long ago it was, but it was before whiteboards, when they were still using chalk. He just started including Python that year when it was the up-and-coming thing, as an example of a higher-level language that does a lot of stuff for you. It probably seemed like a breeze for the students after the previous weeks spent on C, assembly, and Lisp, but at least they got some of the fundamentals of how things worked first.
Totally, it's always about trade-offs. It takes a decent amount of time programming to become comfortable choosing the language based on the task rather than tailoring the task to the language.
Nice to see my beloved R still has some mindshare... Also Ruby.
I enjoy seeing FORTRAN and COBOL ranking above Clojure.
Clojure being so low, including below Prolog of all things, is so obviously wrong.
As a professional user of Prolog I'm feeling pretty smug over this morning's coffee.
Ahh, yes. In college, the CS department used Turbo Pascal (hint, I'm old). My first programming gigs out of college were FORTRAN and COBOL. It's nice to see that they are still registering.
Now, time for a Metamucil and a nice nap before my standup.
Kinda surprising that I see nothing from Zig on all these lists.
You shouldn't expect to see any pre-1.0 language in this list, especially Zig, which not only makes no stability guarantees but actively discourages expecting any stability. Heck, Zig 0.15 just came out and completely overhauled the IO framework from top to bottom. Once Zig reaches 1.0, expect them to make some effort to gain adherents, which currently is a complete non-goal.
Other than comptime, it doesn't bring anything new to the 21st century.
Its safety story is basically what Modula-2 (1978) and Object Pascal (1986) already had, but now it gets done with curly brackets instead of a begin/end language.
UAF is an issue, and the tooling to tackle it is exactly the same that C and C++ have had available for the last 30 years, give or take.
It will be another language to talk about; however, I doubt it will ever make it into the mainstream, like having platform vendors and console devkits care that Zig exists.
Eh? As far as I can tell Zig already has more traction than e.g. D, which is listed.
I don't think it's aiming for mainstream adoption anyway, it's a very specific niche.
First lets see if it actually manages 1.0, then lets see what major company adopts it for real.
D once upon a time was also hyped due to its Facebook usage, Remedy game engine tooling.
Also, has Zig already gone to space?
https://forum.dlang.org/thread/10614fc$273$1@digitalmars.com
Or used by car companies?
https://forum.dlang.org/thread/evridmtwtnhhwvorohyv@forum.dl...
Anyway, I don't expect any of them to grow beyond their niche userbase.
D has lost its momentum, and Zig isn't really interesting as 21st century language in the AI tooling age.
IMO a language is made mainstream when Azure, AWS, and GCP have SDKs for it. It's not foolproof, but it's perhaps a good indicator.
I would be surprised to see it high in the rankings before it establishes certain backward compatibility guarantees.
Don’t think Zig will ever make it into popularity, tbh. It is good for low-level work, but it isn’t like Rust or Go. Rust and Go are really good alternatives for C++ or Java code; they just work, have good tooling, etc. But I would never use Zig for that kind of use case instead of Rust, since you don’t need to go full-on engineering mode for every detail.
I switched from Rust to Zig for implementing a database six months ago and have no regrets, but I just don’t think anyone would use it for writing backend code or other similar smaller things.
I used to think similarly about rust before though so don’t really know anything
I’m not that accomplished in this field so don’t think my pov will be that useful. It is mostly same as the rationale that tigerbeetle developers shared.
Might consider writing something if my project ends up being useful
> I switched from Rust to Zig for implementing a database six months ago and have no regrets, but I just don’t think anyone would use it for writing backend code or other similar smaller things.
I've been thinking about starting a project in Zig rather than Go lately, even though I am skilled at Go. I really like working with more, cracked? or simply crazy people willing to learn an esoteric language, and Zig fits the particular needs I have (mostly very nice C interop).
Would you recommend? How are the average zig contributors vs something like go?
I don’t know about go personally but I have found people pleasant so far, it was also pleasant in rust.
It is definitely an excellent language for doing personal projects imo
Rust has a unique niche it can occupy - namely, it's a "zero-overhead" memory safe low-level language.
I never understood why Go is brought up next to Rust so often, when it has barely any unique qualities, and is a high-level GC'd language with a dumb type system that outputs a single binary... of which there are 1000 other examples. At least it has good tooling, I guess.
How many people are waiting for it to hit 1.0? I am.
I am interested in Zig, but until they can guarantee a stable point for an extended period of time I have limited interest. Same with Rust gamedev: I'm keeping an eye on Bevy but want it to hit 1.0. Some things pre-1.0 are fine, but more core pieces of dev like the language often warrant greater stability guarantees.
Arduino as a programming language?
VB6 never dies
interesting how high c# is despite nothing going on, on github
That's partially attributed to the fact that .NET is a truly batteries included solution and quality of it is generally good enough that there is no need for second or third alternatives for every basic thing.
I don't doubt your comment, but I immediately thought to compare to Java. Why does Java have exactly what you mention -- "second or third alternatives for every basic thing"? Is it easily explained by age as a language? Also, can you provide a concrete example of something that .NET includes in the core that Java does not?
If I were to ask you, how many APIs do you think “core” (w/e you consider core) does Java have. Think of a number before opening the link below and then let me know what you had in mind.
https://apisof.net/
Generally .NET's "out-of-the-box" experience is comparable to using Java with a framework like Spring, as it includes a built-in DI container, a modern ORM (Entity Framework), a complete web stack (ASP.NET) with a high-performance web server (Kestrel), and so on. These first-party tools receive strong support from both Microsoft and the community and set a very high standard, which likely reduces the incentive for third-party alternatives to emerge. Of course, there are also many quality third-party solutions, but these mostly cover areas that are not covered well by .NET itself. You could happily build a lot of things using only .NET, without needing any third-party dependencies.
F# not listed at all? If that’s accurate (color me doubtful), people are missing out on a great language that runs on a popular platform. Personally, after learning to love F#, I’m never going back to C#.
Unfortunately Microsoft has mostly behaved as if it was a mistake to add F# to Visual Studio 2010.
Nowadays CLR seems to have changed meaning to C# Language Runtime.
Yessss! Shell scripts at number nine!
Pretty sweet, I use it almost every day, which brings me a lot of fun.
What’s going on with Swift? Seems surprisingly less popular considering it’s the official way to develop for Apple.
Is it something to do with frameworks like React?
Swift is nice for one use case, and its popularity tracks the size of that niche.
It’s amazing where java is.
If you’re working in a large team, and you’re not locked into the MS stack, and you’re not doing anything that needs to be super performant… Java is, by far, your best option.
The tooling and ecosystem aren’t great compared to some of these languages, but Java itself can be pretty damn good.
I used to be a hater many years ago but I’ve since grown to love it.
Agree, the build systems around Java have become absurdly complicated, but as an overall offering Java and JVM remains pretty compelling for lots of work, particular on a large team, as you say, where Java can be the "least bad option" in terms of getting a diverse group of people to learn it.
What is wrong with Java's tooling and ecosystem? Asking because it used to be the default a decade or so ago, with a vibrant ecosystem, so I find your remark surprising.
I used to use clojure, but one of the reasons I stopped was I personally found tools like Maven cumbersome to work with. I admit that was 8 years ago or so, so there's a chance it got better, but once people are driven away by tooling issues it takes a lot to convince them to give an ecosystem another chance.
An example of this ossification of understanding is how people still think dotnet is Windows only because they stopped caring before Core/Modern dotnet became a thing
Maven/Gradle don’t come close to the simplicity of npm, for example
Publishing a Maven package is also excruciatingly complicated. By contrast, NPM is actually too easy. I suspect that we see fewer supply chain attacks in the Java ecosystem because attackers are like “you know what.. never mind.”
I wouldn't call "3643 packages installed, 1200 vulnerabilities" for a hello world "simple".
People have their problems with Maven, but unless it's some overly complicated legacy project (where npm just explodes, I guess? Like I have had Windows machines freeze from deleting the node_modules folder), it just works and you just give it a list of dependencies.
They are just different. I mean, setting up a monorepo is far easier with Maven than with npm. Besides, Maven offers a basically cookie-cutter project organization where every Maven project looks like every other Maven project. As for other tooling, the JVM ecosystem is just better than the JS one. Definitely more complex, but also more powerful.
What simplicity?!?
Gradle keeps on improving. I use it for Android, and even though it is complex, and then add the Android Gradle Plugin complexity on top of that, I would not trade it for the iOS build system.
One of my complaints with Gradle is that if you write a plugin (Java) it shares the classpath with other plugins. You might find some other plugin depending on some old version of a transitive dependency.
And you can code Groovy on top of it, still my favorite programming lang.
I too would like some illustration of why the tooling (Intellij, etc) is insufficient. Maybe gradle as a build system? Although I have to say with LLMs, gradle build scripting is a LOT easier to "build" out.
Why is Yaml not on the list?
Shell is the most popular programming language I don't care what your list says, it has some 6-12 Billion users which is very impressive
It's not really a general purpose programming language though. (I know some people treat it as if it is, but they are idiots.)
Java at #2? Is it really still being used much for new code in this day and age? Or is its popularity mostly due to so much legacy code out there to be maintained?
Yes. Companies with existing workforces are doing new software all the time. If you have a workforce of, let's say, 100 devs, even if you hire 10 new ones temporarily or permanently, you for sure tell them: use what the other 100 do. It is not maintenance that drives this, it is the lack of momentum in your workforce.
And that is very okay. (Modern) Java and .NET are excellent choices. There is nothing wrong with them.
It is a good choice if you want a non-Microsoft-stack language (yes, C# can be developed on Linux, but the quality of the development experience on Linux isn't the same as on Windows) and want a vast ecosystem. Golang is too verbose due to its lack of exceptions, and the size of its ecosystem of third-party packages isn't anywhere close to Java's. Swift is very nice, but issues with the ecosystem exist there too. Rust has too steep a learning curve, so it's more difficult to find developers. Modern Java has improved a lot compared to, say, Java 8, with record types, pattern matching, multiline strings, etc. The pace of new features coming to Java has been quite high in recent times. Plus there is always the network effect.
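To make the "modern Java" point concrete, here is a minimal sketch of the features mentioned (records, pattern matching for switch, text blocks), all standard as of Java 21; the class and type names are purely illustrative:

```java
// Illustrative sketch of modern Java: sealed types + records + pattern
// matching for switch + text blocks. Names are made up for this example.
public class ModernJavaDemo {
    sealed interface Shape permits Circle, Rect {}
    record Circle(double r) implements Shape {}
    record Rect(double w, double h) implements Shape {}

    static double area(Shape s) {
        // Exhaustive switch over a sealed hierarchy: no default branch needed,
        // and the compiler flags any unhandled case.
        return switch (s) {
            case Circle c -> Math.PI * c.r() * c.r();
            case Rect r -> r.w() * r.h();
        };
    }

    public static void main(String[] args) {
        // Text block (multiline string) with formatted interpolation.
        String report = """
                area(circle r=1) = %.2f
                area(rect 2x3)   = %.2f
                """.formatted(area(new Circle(1)), area(new Rect(2, 3)));
        System.out.print(report);
    }
}
```

Compared with the Java 8 equivalent (a visitor interface or instanceof chains plus StringBuilder concatenation), this is closer to what people mean when they say the language has caught up.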
I have developed .NET solutions on Linux for over 8 years now (and for about 10 years on Windows before that) and would say the quality of development on Linux is even better than on Windows today. Sure, you can't use Visual Studio on Linux, but you can use VS Code or Rider, which I would prefer anyway.
If you aren't doing GUIs, that is, which means going into the FOSS ecosystem with Avalonia or Uno, and if you aren't doing anything with profiling or debugging visualization of threaded code and a few other goodies that VS has for .NET and that they will never make available in the VS Code extension.
Also the VSCode extension for .NET has the same license as VS.
Java is still the default choice for many for enterprise software. Job-focused courses and curricula all over the world still lean big on Java, ensuring a steady and large pool of okay Java devs.
Job-focused bachelor courses and curricula highly outnumber rigorous CS courses like the ones you are likely to find at MIT, UCLA-B, IISc, the IITs, Oxford, UCL, Tsinghua, Peking, etc.
Yes, outside HN praises of Elixir, Gleam, and co, corporations run on boring technology, maybe with exception of what the FE folks pick up on each project.
They work, have great tooling, and do whatever is required for customers.
Yes. Many startups in Europe use Java.
And for simple reasons: no experiments in tech stack. No friction in hiring. Static typing (which sorts out JS and Python) and then you are there.
Java/Spring Boot + Angular, for example, is a very common stack.
The PHP ranking makes 0 sense. Are they...not counting websites or something?
Given how mediocre LLMs are, I don't see this happening anytime soon... but I think a "better" LLM (that puts the "language" into large language model) can seamlessly translate between programming languages.
Right now, it's apparent to me that LLMs are mostly tuned in the programming space for what n-gate would call "webshit", but I think it is a clear (to me) evolutionary step towards getting much better "porting" ability in LLMs.
I don't think that is in the priority list of the LLM companies. But I think it would be a real economic boon: certainly there is a backlog of code/systems that needs to be "modernized" in enterprises, so there is a market.
Ultimately I wonder if an LLM can be engineered to represent code in an intermediate form that is language-independent to a large extent, and "render" it to the desired language/platform when requested.
> Given how mediocre LLMs are
That's not a given... I think LLMs are amazing!
lolrust
This looks very underwhelming considering it is on the ieee website. It is more like a low effort clickbait blogpost for ads
I wish there were fewer programming languages, because every library needs to be rewritten as many times as there are languages, a combinatorial waste of time.
Now that CoffeeScript is gone I would like to see all Ruby become Python.