Open source can't coordinate

(matklad.github.io)

105 points | by LorenDB 9 hours ago ago

101 comments

  • ninjin 5 hours ago ago

    "The reason why we have Linux, and BSDs, and XNU is that they all provide the same baseline API, which was defined from the outside [by POSIX]. The coordination problem was pre-solved, and what remained is just filling-in the implementation."

    But that is not at all how Posix operates or has operated. Posix standardises common denominators between existing implementations. The fact that we now have strlcpy(3) and strlcat(3) in Posix 2024 is not because Posix designed and stipulated them. Rather, they appeared in OpenBSD in 1998, were found useful by other *nix-es over time, spread, and were finally taken aboard by Posix, which standardised what was already out there and being used! This to me is the very opposite of the point the author is trying to make!
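
    For anyone who hasn't used them: the appeal of strlcpy over strncpy is that it always NUL-terminates and returns the length it needed, so truncation is a single comparison. A minimal sketch with a hypothetical buffer (compiles where strlcpy exists, e.g. the BSDs or a sufficiently recent glibc):

        #include <stdio.h>
        #include <string.h>

        int main(void) {
            char dst[8];
            const char *src = "a string much longer than eight bytes";

            /* strncpy does not NUL-terminate when src is too long,
               so the terminator has to be added by hand. */
            strncpy(dst, src, sizeof dst);
            dst[sizeof dst - 1] = '\0';

            /* strlcpy always NUL-terminates and returns the length it
               tried to copy, so detecting truncation is one comparison. */
            if (strlcpy(dst, src, sizeof dst) >= sizeof dst)
                puts("truncated");
            printf("%s\n", dst);
            return 0;
        }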

    • dale_huevo 2 hours ago ago

      Linux would have had strlcpy/strlcat 25 years ago but the glibc maintainer was famously being a giant chode and refused to allow "horribly inefficient BSD crap" into his precious code, and this fight went on for years:

      https://sourceware.org/legacy-ml/libc-alpha/2000-08/msg00053...

      So it wasn't for lack of trying. Yes, Open Source can't coordinate and this is why we can't have nice things.

      • oersted 4 minutes ago ago

        It's surprising how we ended up with such a robust open-source OS ecosystem (pretty much every server running Linux) with such emotional people at the helm.

        He is clearly not being rational there, but I could see how his aesthetic tastes might correlate pretty well with robust software. I suppose that saying no to new features is a good default heuristic: these additions could easily have added more problems than they solved, and then you have more surface area to maintain.

        That being said, this old-school ideology of maintainers dumping the full responsibility on the user for applying the API "properly" is rather unreasonable. It often sounds like they enjoy having all these footguns they refuse to fix, so they can feel superior and set their club of greybeards, who have memorised all the esoteric pitfalls simply because they were along for the journey, apart from the masses.

    • maccard 2 hours ago ago

      Actually I think this is exactly the point.

      BSD got them in 1998; it took 17 years for them to go to posix and another 8 years before they made their way to glibc. 25 years to add a clear improvement.

      • geysersam 32 minutes ago ago

        Guess not everybody thought it was a clear improvement then? It's not like it made everyone adopt BSD instead of Linux. If it were easy to make all the right decisions, someone would have made them and sold it as a product instead.

  • taeric 7 hours ago ago

    I'm not entirely clear why/how this is an open source issue?

    My assertion: Inertia of user base is by far the largest predictor of what will stick in a market. If you can create a critical mass of users, then you will get a more uniform set of solutions.

    For fun, look into bicycles and how standardized (or, increasingly not) they are. Is there a solid technical reason for multiple ways to make shoes that connect to pedals? Why are there several ways to shift from your handlebars? With the popularity of disk brakes, why don't we have a standard size for pads?

    I think there are a lot of things that we tell ourselves won out because of some sort of technical reasoning. The more you learn about these things, the less true this seems, though.

    Not, btw, that there aren't some things that die due to technical progress.

    • bloppe 5 hours ago ago

      It's really not an open source issue. It's a more general issue than that. The author provides MacOS and Windows as counterexamples, but that's a silly comparison. Neither Apple nor Microsoft ever coordinated with anybody else on their APIs. Closed source developers are never as good at coordinating with others as OSS developers are. Sure, they can invest more in their own products, but that's a different issue.

      Also, I'm not sure what kind of standard this author is pining for. We have Wayland and freedesktop.org. Pretty much any Linux app can already run on pretty much any DE.

      • mbreese 5 hours ago ago

        I don’t think of it as a coordination issue. OSS can coordinate well. What it can’t do is make decisions. Most times there are at least two possible implementations for a given task (with different legitimate trade-offs). The OSS approach isn’t to pick the implementation that works for the most people, but rather to support both implementations. Many times the only decision is not to decide.

        The best projects have someone who will make unilateral decisions. They might be right or wrong, but it’s done and decided. Companies with an organizational hierarchy do that much better than a decentralized group of OSS developers.

        • pjc50 an hour ago ago

          The underappreciated benefit of OSS is that you have an escape hatch from coordination if you disagree enough. You can in theory just fork it and run your own version, even if nobody else agrees. But you have to bear the associated development cost yourself.

          Choice looks like fragmentation. But the existence of alternatives is very important.

    • skydhash 6 hours ago ago

      > I'm not entirely clear why/how this is an open source issue?

      I think it's not even an issue. Most open source projects are implementations (maybe flawed), and few are new propositions. If something was not fully defined in the standard/protocol/interface, the implementation may come up with its own extensions that are incompatible with others. But that's how you choose implementations: you're supposed to commit to one and not use a mismatch.

      So if you're using GNOME, try not to depend on something that depends on having the whole KDE installed. If you're using OpenRC, anything systemd is prohibited. All projects are consistent within themselves; the only thing you need to do is to avoid conflicts from having two things doing the same job and to follow the installed system's guidelines.

      I don't mind fragmentation. What I mind is dependency due to laziness rather than real need: like using a couple of glibc-specific libraries for no real reason just because you're working on Debian, or taking the time to support Windows and macOS but then tying yourself to glibc on Linux because reasons.
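
      A minimal sketch of keeping a glibc extension optional instead of a hard dependency; program_invocation_short_name is a glibc-only convenience picked purely as an illustration, and the #else branch is the portable fallback:

          /* Use the glibc extension when it's there, fall back otherwise. */
          #define _GNU_SOURCE
          #include <errno.h>
          #include <stdio.h>
          #include <string.h>

          static const char *prog_name(const char *argv0) {
          #ifdef __GLIBC__
              (void)argv0;
              return program_invocation_short_name;    /* glibc extension */
          #else
              const char *slash = strrchr(argv0, '/'); /* portable fallback */
              return slash ? slash + 1 : argv0;
          #endif
          }

          int main(int argc, char **argv) {
              (void)argc;
              printf("running as: %s\n", prog_name(argv[0]));
              return 0;
          }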

    • em3rgent0rdr 5 hours ago ago

      > not entirely clear why/how this is an open source issue?

      Was going to say this too, because competing proprietary software companies generally don't coordinate. Macs don't easily run Windows programs and vice versa. Unless an alliance is formed or an agreement is made to adhere to some standards body, the collaboration issue is part of both worlds.

    • FridgeSeal 6 hours ago ago

      > For fun, look into bicycles and how standardized (or, increasingly not) they are.

      Because triangles are a fantastic, high-strength shape that suits the loads a bicycle is subject to. For the vast majority of cases, it’s a very solid choice. We deviate when specific UX requirements come into play (city bikes having a battery and a low stepover to suit a variety of clothing, with the motor making up for the additional weight required).

      > Is there a solid technical reason for multiple ways to make shoes that connect to pedals?

      All of them attempt to address the same requirement, and make different tradeoffs depending on use-case and environment.

      > Why are there several ways to shift from your handlebars?

      Price points, performance reqs, handlebar setup, environmental (is it expected to be subject to mud rocks and trees constantly?) and weight.

      > With the popularity of disk brakes, why don't we have a standard size for pads?

      Same as for shifters: the braking compound and design for a race road bike will be really different to what a DH race bike requires.

    • Bouncingsoul1 3 hours ago ago

      I'm not sure which point you are trying to make with the bikes. For road racing, the UCI quite famously sets strict standards. For "normal" use, if you live within the US or EU there will also be some standards (mostly concerning road safety). Of course you may cherry-pick some exceptions, but IMO this doesn't prove the point.

    • pixl97 6 hours ago ago

      >why don't we have a standard size for pads?

      Because each manufacturer can't put a premium on their pads that way.

    • dkkergoog 7 hours ago ago

      All those things have a price point.

  • antonok 5 hours ago ago

    Open source has the best kind of coordination. If there's a real use-case for two things to work together, you or someone else can implement it and share it without anyone's permission. Meanwhile in proprietary land, people sometimes build things that nobody wanted, and also leave out features with high demand. Proprietary optimizes for the profit of some individuals; open source optimizes for maximum utility.

    Thus far, open source has optimized for maximum utility for individuals who can write code... but AI may be changing that soon enough.

  • fergie 28 minutes ago ago

    > There was a decade of opportunity for OSS to coordinate around an IDE protocol, but that didn’t happen, because OSS is bad at coordination.

    It's also because a lot of the key people in Open Source, and senior hackers generally, don't actually use IDEs.

    We should encourage more of the younger generation over to powerful configurable editors such as Emacs, rather than locking everybody into VSCode/JetBrains/etc.

  • tbrownaw 6 hours ago ago

    > The underlying force there is the absence of one unified baseline set of APIs for writing desktop programs.

    It's called the Common Desktop Environment.

    • skydhash 6 hours ago ago

      Most desktop programs don't need to rely on a DE (apart from some utilities). If Emacs can run anywhere, your programs can too. GTK or Qt is more than enough. For anything else, you go with components on an as-needed basis, and they should preferably be desktop-independent.

      • hackyhacky 5 hours ago ago

        > If Emacs can run anywhere

        Any desktop program needs to be programmed against some API. In the case of Emacs, it's probably raw Xlib or a wrapper library on top of it.

        The problems with that are (a) the dependency on X11, which is obsolete and has many documented inadequacies, (b) the lack of a modern widget library and toolkit, which makes extra, unnecessary work for the programmer, and (c) the lack of a cohesive visual language between programs, which makes the experience worse for the user.

        Toolkits like GTK and Qt solve all these problems. By avoiding them, you're just reinventing the wheel, poorly, every time.
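
        For a sense of what programming against raw Xlib means in practice, here is roughly the canonical open-a-window-and-wait sketch (standard Xlib calls only, not Emacs's actual code); everything above this level - widgets, text rendering, theming, clipboard - is what a toolkit has to supply:

            #include <X11/Xlib.h>
            #include <stdio.h>

            int main(void) {
                Display *dpy = XOpenDisplay(NULL);  /* connect to the X server */
                if (!dpy) {
                    fprintf(stderr, "cannot open display\n");
                    return 1;
                }
                int screen = DefaultScreen(dpy);
                Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, screen),
                                                 0, 0, 400, 300, 1,
                                                 BlackPixel(dpy, screen),
                                                 WhitePixel(dpy, screen));
                XSelectInput(dpy, win, ExposureMask | KeyPressMask);
                XMapWindow(dpy, win);

                XEvent ev;
                for (;;) {
                    XNextEvent(dpy, &ev);        /* blocks until the next event */
                    if (ev.type == KeyPress)     /* quit on any key press */
                        break;
                }
                XCloseDisplay(dpy);
                return 0;
            }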

        • dragandj an hour ago ago

          Emacs was there way before GTK and Qt appeared, though.

        • skydhash 5 hours ago ago

          Emacs has a GTK3 layer (among others) for its UI.

  • dcreater 6 hours ago ago

    Solving the coordination problem in FOSS is one of the grand challenges of humanity. If we solve it, I think it will effect a tectonic shift with far-reaching implications and fix major socioeconomic problems like wealth concentration. E.g.: a FOSS alternative to Visa, and of course Windows/MS Office.

  • simonebrunozzi 34 minutes ago ago

    Key sentence here:

    > But it is also clear why JetBrains didn’t do LSP — why would they? While the right solution on the technical grounds, you aren’t going to get paid for being technically right.

  • cadamsdotcom 3 hours ago ago

    To me it's about how low-bandwidth communication channels limit collaboration.

    Linux and FOSS grew up (congrats!) and the important work got super big and complex.

    Unfortunately, online communication - typically the channel preferred by FOSS projects - has much lower bandwidth than teams working full time in an office. Thus limiting the "depth" of collaboration.

    It's not all bad. FOSS does have some profound successes to be proud of. For small and well-defined projects "benevolent dictator for life" works! For anything one person can lead - a desktop tool or a library - FOSS produces really good outcomes. But above, say, a package manager or a distro... things get wonky.

    Again it's not all bad. FOSS is rolling up its sleeves. Folks are organically doing the "go down every avenue people are motivated to go down" thing. You could call it Darwinism. But many motivated communities lack resources to go far enough to reach value. Motivation stalls during projects (we're human!), and FOSS rarely comes with a motivation-boosting paycheck. Plenty of valiant efforts don't reach value and it's never malicious. It's OK!

    So is there a way to better concentrate efforts?

    If the paths are long, it follows that the community should tackle fewer paths. The path(s) taken should be well defined, charted in advance as much as possible, and not uncovered bit by bit - or the work will take decades.

    Growing an entire ecosystem around one path forward (or a few) requires alignment. Can enough trust be fostered in leaders to get people to work on a shared vision?

    A vision of what Linux on the desktop should/could converge to is the kind of problem that, if Linux were a company, would be bet-the-company strategic. A company can't afford to go down two paths. So it might lock its smartest people in a room to hash out one true strategy. Or have one smart person dictate one vision and align everyone on it.

    Can that be done for FOSS?

    In the bounds of a single project it has been proven that it can. But what about an entire ecosystem?

    • nottorp an hour ago ago

      > Unfortunately, online communication - typically the channel preferred by FOSS projects - has much lower bandwidth than teams working full time in an office.

      I wonder if that's why open source projects get so much done and at such a high quality with so few people.

      Instead of 75% "communication" and 25% work, 90% of the time donated to FOSS is actual work :)

  • scrapheap 2 hours ago ago

    Alternative view - Open Source projects have the freedom to do what they want to do. Which in turn gives me the freedom to choose how I set up my desktop environment. How many changes have been pushed on Windows users over the last 15 years with the only option being to change and get security updates or stay on an old insecure version?

    And while there are lots of desktop environments for Linux you can usually run applications targeting one in any of them (I use Gnome's file manager in Enlightenment as it supports accessing CIFS shares directly).

  • a-dub 5 hours ago ago

    idk. i don't really follow the argument. large projects in open source coordinate internally and engage externally when they need to- i suspect that isn't all that different from what you'd see in a large megacompany like apple or microsoft.

    open source people create reusable interfaces. i'd argue they go one step further and create open and public internet communities with standards, practices and distribution/release channels.

  • bobajeff 5 hours ago ago

    Maybe open source doesn't need to coordinate. Perhaps users and developers should demand standards and interoperability from their platforms. Perhaps that's why we have things like Electron, Unreal Engine and Unity. One way or another we'll coordinate on something.

  • bronlund 3 hours ago ago

    This is kind of the same reason I gave up on the Linux desktop and went for macOS. When I first learned about Linux I was thinking "Sweet! This is going to kick Microsoft's ass!", but this was 30 years ago and instead of a kickass desktop OS, we got 1000 mediocre ones.

    • happymellon 3 hours ago ago

      > looks at Windows Vista, 8, 10, 11

      I don't think it's the mediocre interface that's holding Linux back...

      Whether it's the abomination that's Windows 11, having to fight against an ad-ridden interface in 10, or otherwise. Teams hasn't dominated because of a coherent interface, or even because anyone actually wants to use it.

      Besides, you say 1000 desktops, but there are really only 2 (well, 1, since Gnome is the primary interface for the big 3 distros), along with a couple of hobby ones that you have to seek out to even learn they exist and a lot of toys that no one outside HN has even heard of.

  • mongol 8 hours ago ago

    > But there was no one to coordinate Linux on desktop.

    Freedesktop.org?

    • wmf 8 hours ago ago

      I get the impression that some people don't want to participate in FreeDesktop. Maybe they see it as bloated or too aligned with GNOME.

    • dwheeler 7 hours ago ago

      Yes. GNOME and KDE at least coordinate via freedesktop.org. At least they did!

    • j16sdiz 6 hours ago ago

      Those are for DEs.

      For end-user applications, it's a mess. You can't compile once and run everywhere.

      LSB (Linux Standard Base) tried to do that and failed.

      Flatpak/Snap/AppImage all tried to do that; each has its own set of problems.

      • LtWorf 4 hours ago ago

        Can you compile on Windows 11 and run it on Windows XP? Yeah, didn't think so.

        • pjc50 an hour ago ago

          Unless you choose to use APIs that aren't in Windows XP, this isn't a problem. Win32 backwards compatibility is very impressive.

          Building a Win16 application would be more difficult.

          But this isn't directly comparable. The issue with Linux is getting a precompiled binary to work properly across all currently up to date distributions.

        • int_19h 3 hours ago ago

          You can do this even with the official Microsoft SDKs (just need to have the XP targeting pack). And then there are numerous third party development tools that allow this.

          • LtWorf 33 minutes ago ago

            You can do this on Linux too: make a chroot, done.

    • freeone3000 7 hours ago ago

      I’m still mad they stole the scriptable, composable DCOP and put it all behind a binary format.

      • LtWorf 4 hours ago ago

        whatever happened with kdbus btw?

    • hackernoops 7 hours ago ago

      And now https://github.com/X11Libre/xserver is off to a good start.

      • bilkow 6 hours ago ago

        I got curious and found a ton of red flags in about half an hour...

        Summary of the drama that resulted in Xlibre:

        - https://discuss.pixls.us/t/weekly-recap-8-june-2025/50638

        From the README at https://github.com/X11Libre/xserver:

        > This is an independent project, not at all affiliated with BigTech or any of their subsidiaries or tax evasion tools, nor any political activists groups, state actors, etc. It's explicitly free of any "DEI" or similar discriminatory policies. Anybody who's treating others nicely is welcomed.

        This exchange between him and Torvalds in 2021:

        - https://lkml.org/lkml/2021/6/10/903

        - https://lkml.org/lkml/2021/6/10/957

        • shadowgovt 6 hours ago ago

          Looking at the history of that doc, I was most annoyed not about what was added but what was removed: the random anti-DEI stuff was in place of the part that described what an X server is.

          Like, my guy. If your goal is to provide a better solution than the existing one, telling people what you're making is step 1. No, the user can't be assumed to already know. You've already made your first mistake if you've made that assumption.

          • mariusor 4 hours ago ago

            I'm pretty sure that's not addressed to any potential "users", unless you're thinking that the future developers of the project are them.

  • TechPlasma 8 hours ago ago

    This feels very right. The problem is there are few entities invested enough in Linux as a consumer platform that have the motivation to push things forward. To make the decisions on what their "Reference" system is.

    Valve is maybe the closest?

    • hackyhacky 8 hours ago ago

      Ubuntu, for all their faults, were the first to make Linux really easy to install and made it "just work." That counts for a lot. Since then, their output has been disappointing.

      Part of the problem is that "Linux/Unix culture" is very averse to coordination. When someone does try to establish a common baseline, there is inevitable pushback. The classic example is systemd, which fills a desperately needed hole in the Linux ecosystem, but is to this day criticized for being antithetical to the ethos of, I guess, gluing together an operating system with chewing gum and bits of string. The fact is that many users would rather have a pile of software that can be hand-assembled into an OS, instead of an actual cohesive, consistent platform.

      So I can't blame people too much for not trying to establish standards. If OSS had created LSP, there would be 20 different incompatible variations, and they would insist "We like it this way."

      EDIT: averse, not adverse

      • linguae 6 hours ago ago

        There is another factor at play: different users value different things. For example, there are some people who don't like systemd, not because they are enamored with classic startup scripts, but because they take issue with systemd's design. It's not that they dislike coherent, consistent platforms: they just disagree with the design decisions of that particular platform. For example, I like the classic Mac OS and Jobs-era Mac OS X, but I don't like GNOME. All of these are coherent platforms, but they have different philosophies.

        The difference between open source software and proprietary software is that if users don't like the changes made to proprietary software, their choices are limited to the following:

        1. Dealing with the changes even though they don't like it.

        2. Sticking to an older version of the software before the changes took place (which can be difficult due to needing to deal with a changing environment and thus is only delaying the inevitable).

        3. Switching to an alternative product, if available.

        4. Writing an alternative product (which can be a massive undertaking).

        Open source software provides additional options:

        5. Fork the older version of the software. If enough people maintain this fork, then this becomes a viable alternative to the changed software.

        6. Use the new version of the software, but modify it to one's liking.

        This is the blessing and the curse of open source software; we have the power to make our own environments, but some software is quite labor-intensive to write, and we need to rely on other people's libraries, systems, and tools to avoid reinventing wheels, but sometimes those dependencies change in ways that we disagree with.

        I think the best way to mitigate this is making software easier to develop and more modular, though inevitably there are always going to be disagreements when using dependencies that we don't directly control.

      • Joel_Mckay 7 hours ago ago

        Indeed, "good" doesn't matter if the OS is a pain to use.

        The Driver support issues are essentially a theological war between FOSS ideals, and mystery OEM binaries.

        Most of the linux kernel code is still the driver modules, and board support packages.

        The desktop options have always been a mess of forks and bodged applets to make it useful.

        Ubuntu balances the purity of Debian with practical user experience (we could all write a book about UEFI shenanigans.) RedHat focuses more on hardened server use-cases.

        Is it worse than Win11? Depends on what you are doing, and what people consider the low bar for messing with users. =3

      • clipsy 7 hours ago ago

        > The classic example is systemd, which fills a desperately needed hole in the Linux ecosystem

        If the hole is desperately needed, why would you want to fill it?

        • hackyhacky 7 hours ago ago

          > If the hole is desperately needed, why would you want to fill it?

          Good point. Let me rephrase: "Systemd fills a hole in the Linux ecosystem, which desperately needs to be filled." This version of the sentence is more correct and conveniently functions as a double entendre.

          • j16sdiz 6 hours ago ago

            systemd killed many projects and use cases along its way.

            Better integration for the mainstream, sure, but at the end we have less choice.

            • hackyhacky 6 hours ago ago

              > but at the end we have less choice.

              This is exactly my point: you want "diverse choices", which is fundamentally at odds with "cohesive functionality."

              The article is about LSP, an imperfect standard, but nevertheless a standard. The prioritization of "choice" above all else is why the OSS world is incapable of creating standards.

              > systemd killed many projects

              The purpose of software is to fulfill a need. Creation of software projects is simply a side-effect of that process. It's good that systemd killed many projects, because those people who had worked on those projects can now work on a problem that hasn't already been solved.

            • shadowgovt 5 hours ago ago

              Sometimes the end user actually suffers from too much choice.

              Choice implies complexity, and there are some places less complexity is quite desirable. I still periodically, when setting up a new Linux machine, have to figure out why the audio frameworks are fighting, for example. The fact that "frameworks" is plural there makes everything harder for me, the end user.

              (I compare Python and Node environment management frequently here. Python standardized the protocol for setting up an environment. Wise, better than nothing, but now I have to care whether something is using conda or poetry or one of several other options I don't know. Node has npm. If there's a package, it's in npm. To set up a Node service, use npm. One thing to know, one thing to get good at, one thing to use. Environment management with Node is much easier than in Python).

    • charcircuit 7 hours ago ago

      Google is the closest with Android. They were even able to get Adobe to port Photoshop, something other Linux operating systems have failed to make happen.

      Despite Android's success the rest of the consumer Linux distributions chose to ignore it and continue on with what they were already doing. Trying to have them coordinate around what is succeeding is seemingly impossible.

      • hackyhacky 7 hours ago ago

        > Despite Android's success the rest of the consumer Linux distributions chose to ignore it and continue on with what they were already doing. Trying to have them coordinate around what is succeeding is seemingly impossible.

        I'm not sure I understand you here. What do you think other Linux distros should have done?

        • dontlaugh an hour ago ago

          Long before Android existed, they could’ve all agreed to have the same single desktop environment, UI toolkit, app packaging and distribution method, etc. And also agreed to ship drivers, even if proprietary.

        • charcircuit 7 hours ago ago

          >What do you think other Linux distros should have done?

          Collectively contributing to getting AOSP running on desktops, and then also working on backwards compatibility to be able to package their preexisting apps into Android apps. This would allow for there to be a common app platform for developers to target Linux with.

          • hackyhacky 7 hours ago ago

            > Collectively contributing to getting AOSP running on desktops, and then also working on backwards compatibility to be able to package their preexisting apps into Android apps. This would allow for there to be a common app platform for developers to target Linux with.

            As a common target, AOSP isn't a very good one.

            AOSP ran on desktops. (Maybe it still does, haven't tried it in a while.) It was still a mobile OS, though, so it wasn't good on the desktop, but it ran.

            It also uses very old kernels.

            Other than the kernel, the Android UI is completely different from conventional Linux. Any Gnome or Qt app would have to be completely rewritten to support it, and would probably have to run in the JVM.

            Basically, if the Linux community followed your plan, they would have to commit a huge effort to port everything to what is essentially a completely different, incompatible OS in every respect except the kernel, and their reward would be to live in subservience to the whims of Google in supporting their product which Google themselves never had enough faith in to make it a proper desktop OS. It seems that the benefit does not justify the investment.

            • charcircuit 6 hours ago ago

              >so it wasn't good on the desktop, but it ran.

              Which is why it would benefit from people who are trying to optimize it, and extend it to offer a good desktop experience.

              >It also uses very old kernels.

              It's based off the latest LTS release of the kernel.

              >Any Gnome or Qt app would have to be completely rewritten to support it

              Which is why my comment said that distros would work on backwards compatibility, to avoid the expensive work of a complete rewrite and try to make it as seamless as possible.

              >and would probably have to run in the JVM

              Android does not use the JVM. It has ART, the Android Runtime, but you can still use native code.

              >and their reward would be to live in subservience to the whims of Google in supporting their product which Google themselves never had enough faith in to make it a proper desktop OS

              The benefit is being able to reap the fruits of the billions of dollars Google is investing into the OS, along with compatibility with a large number of apps. As a bonus, staple Linux applications may be able to be installed on some of the billion existing Android devices today. Google may not have seen the benefit of supporting the desktop, but that's where smaller players can come in to play a role in trying to focus on more niche markets where there is less possible return.

              • skydhash 6 hours ago ago

                I don't think Android is a good platform for desktop usage. First the windowing system, and the IPC mechanism. They are very limited. And one of the nice aspects of desktop computing is the ability to alter it for your own purposes (something that MacOS is running away from). Meaning you extend it for some other domain (think music production, video production, ...) where you want to hook it up to some hardware and have the software talk directly to it. Which means having access to all the ports and coding bespoke protocols. I don't think current android API allows for that.

                • charcircuit 6 hours ago ago

                  >They are very limited.

                  Sure the windowing is limited, but it could be extended. I disagree that the IPC is limited though.

                  >Which means having access to all the ports and coding bespoke protocols. I don't think current android API allows for that.

                  It's still all open source. The distros could add new APIs to expose new capabilities.

                  • skydhash 5 hours ago ago

                    > The distros could add new APIs to expose new capabilities.

                    Those exist already. With Debian, Alpine, Fedora,... you can put anything on top of the kernel in the userland. Android goes with a restricted version.

                    It's the same with MacOS. It's Unix, but with proprietary add-ons and systems. And lately, with more restrictions.

                    • charcircuit 4 hours ago ago

                      How does the existence of those make Android not a good platform? I don't fully understand the point you are trying to make.

                      By restrictions do you mean having proper capability based security instead of letting all apps have access to everything? These restrictions are a good thing.

          • o11c 6 hours ago ago

            90% of the problems Linux has had in the last 10 years are due to trying to unify desktop UI with mobile. This is fundamentally a mistake and it is critical to avoid it.

            • hackyhacky 6 hours ago ago

              > 90% of the problems Linux has had in the last 10 years are due to trying to unify desktop UI with mobile.

              To be fair, Apple and Microsoft have also tried and failed to unify desktop UI with mobile.

              • o11c 6 hours ago ago

                Yeah, but they have enough other failures that this one alone can't reach 90%.

                Linux has had other major dramas but not failures.

          • happymellon 3 hours ago ago

            A lot of effort went into Android x86.

            As far as I know Google has never accepted patches into Android, so everything has to be maintained outside the project in parallel, which helped kill it.

            Google is not your friend, and they will not work with you. Android has diverged several times, and they break everyone else without caring.

      • TechPlasma 7 hours ago ago

        Android is the most pervasive, yes, but I would consider it too focused on a specific type of platform, and one that is becoming more and more closed off. ChromeOS might be a better example, but much like Android, it is also very closed off from the rest of the ecosystem.

  • RossBencina 6 hours ago ago

    Who is being incentivised to reduce the friction of interoperation?

    Coordination is hard. People who are good at coordinating are not necessarily the same people who are happy to contribute their time to FOSS. And FOSS may need to coordinate in ways that vertically integrated companies do not.

    Coordinating between loosely aggregated volunteer projects is not the same as coordinating between vested stakeholders either. I would guess that most FOSS projects are more invested in their own survival than in some larger objective. Teams within a company are (presumably) by definition invested in seeing the company mission succeed.

    The GNOME / KDE example mentioned elsewhere in this thread is interesting because these are two somewhat equivalent co-existing projects. Any coordination between them is surely not their highest priority. Same with all of the different distros. They each exist to solve a problem, as the fine article says.

    I wonder how much the problem is actually "open source can't standardise on a single solution." Let one thousand flowers bloom, sure. But don't expect a homogeneous user experience. The great thing about standards is there are so many to choose from. xkcd 927. etc.

  • ashoeafoot 3 hours ago ago

    The selection and standardisation committee for open source is the usage data. Make public what is used where and under what circumstances, and standards emerge.

  • throwaway2037 4 hours ago ago

    > But then, how can Linux exist? How does that square with “never break the user space?”

    Hot take: This catch phrase is out of date. For Linux desktop normies like me who don't really care about the stability of the Linux user space API, user space does break when GUI libraries (and the myriad of library dependencies) change their APIs. For example, I mostly use KDE, which depends upon Qt libraries for its GUI. Qt regularly introduces breaking changes to its API with each major version increment: 4->5->6, etc. (I don't hate them for it; it is normally carefully done and well-documented.)
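
    A minimal sketch of that kind of source-level break, using the well-known removal of QRegExp from QtCore in Qt 6 in favour of QRegularExpression (the surrounding program is made up for illustration):

        #include <QRegularExpression>
        #include <QString>
        #include <iostream>

        int main() {
            QString line = "ticket 1234 reopened";

            // Qt 5 era code would typically have used QRegExp:
            //   QRegExp rx("\\d+");
            //   if (rx.indexIn(line) != -1) { ... }
            // which no longer builds against a plain Qt 6 install.

            QRegularExpression rx("\\d+");               // Qt 6 replacement
            QRegularExpressionMatch m = rx.match(line);
            if (m.hasMatch())
                std::cout << m.captured(0).toStdString() << "\n";
            return 0;
        }
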
    • vhantz 4 hours ago ago

      "don't break user space" is about the Linux kernel not breaking user space. Qt is user space as well as any desktop environment or GUI framework.

      Introducing breaking changes with major version releases is standard software development practice. Very few projects go out of their way to always keep backwards compatibility.

    • LtWorf 4 hours ago ago

      The kernel breaks API all the time too. The rule only really applies if something that Linus personally uses stops working.

      • LtWorf an hour ago ago

        Funny that I'm getting downvoted after having spent hours investigating a failure due to the kernel breaking API a couple of weeks ago :)

        I guess y'all know better :)

  • amelius 2 hours ago ago

    Yet most closed source stuff depends heavily on open source.

  • muglug 6 hours ago ago

    Ehh I don't buy that the market was ready 10 years earlier (in 2006) for open-source LSP implementations.

    You gotta have someone write those language servers for free, and the language servers have to be performant. In 2006 that meant writing in a compiled language, which meant that anyone creating a language server for an interpreted language would need to be an expert in two languages. That was already a small pool of people.

    And big multiplayer OSS platforms like GitHub didn't exist until 2008.

    • autarch 5 hours ago ago

      > And big multiplayer OSS platforms like GitHub didn't exist until 2008.

      SourceForge launched in 1999. I think GitHub is better in many ways, but the basic building blocks of hosted repos, issue tracking, and discussions (via email lists) were already there on SourceForge. I collaborated with folks on a number of projects on SourceForge way back when.

    • skydhash 6 hours ago ago

      And the fact is that anyone working professionally was using an IDE, and anyone else was fine with grepping and using ctags.

      I think LSP is only truly useful in two contexts: global symbols (even with namespacing) and confusing imports. In a language like C, you mostly have a few structs and functions, and for the ones in libraries, you only need to include a single header. With Python, the imports are concise and a good reference gives you the symbol identifier. But with languages that need an IDE or LSP, you find yourself dealing with many imports and many identifiers to write a few lines of code, and it quickly becomes unmanageable if you don't have completion or auto-imports.

  • amelius 2 hours ago ago

    We need more people writing RFC style documents.

  • initramfs 6 hours ago ago

    I think the definition of Linux is much broader than what is considered today: a platform for the IDE. It's kind of like the IDE is the cart and the kernel is the horse, but 30 years later, Linux is an engine with a cabin virtual machine, rather than a desktop per se. The parts interact at a different level now.

  • fr4nkr 5 hours ago ago

    The OP defeats his own argument. LSP was a collaborative effort that benefited from a degree of coordination that only hierarchical organizations can provide, yet it still sucks ass.

    OP blames FOSS for not providing an IDE protocol a decade earlier, but doesn't ask the rather obvious question of why language-specific tooling is not only still around, but as market-viable as ever. I'd argue it's because what LSP tries to do is just stupid to begin with, or at least exceptionally hard to get right. All of the best language tooling I've used is ad-hoc and tailored to the specific strengths of a single language. LSP makes the same mistake Microsoft made with UWP: trying to cram the same peg into every hole.
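
    (For concreteness, the "one peg" is a JSON-RPC protocol framed with Content-Length headers over the language server's stdin/stdout; a hand-rolled sketch of a single go-to-definition request follows, with a made-up file path:)

        #include <iostream>
        #include <string>

        int main() {
            // One LSP request as it appears on the wire: a Content-Length
            // header, a blank line, then a JSON-RPC body.
            std::string body =
                R"({"jsonrpc":"2.0","id":1,"method":"textDocument/definition",)"
                R"("params":{"textDocument":{"uri":"file:///tmp/example.c"},)"
                R"("position":{"line":12,"character":8}}})";

            std::cout << "Content-Length: " << body.size() << "\r\n\r\n" << body;
            return 0;
        }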

    Meanwhile, Microsoft still develops their proprietary Intellisense stuff because it actually works. They competed with themselves and won.

    (Minor edit: I forgot that MS alone didn't standardize LSP.)

    • marcosdumay 5 hours ago ago

      > OP blames FOSS for not providing an IDE protocol a decade earlier

      Everybody standardized on Eclipse plugins almost 2 decades earlier anyway. It got replaced because the standard sucked. The new one is better, but by how much is still a question.

    • oaiey 5 hours ago ago

      He also overlooks that the central, stable projects, like the Linux kernel/systems/..., have a very strict hierarchy / dictatorship in place.

    • diegoperini 5 hours ago ago

      > yet the end result was complete shit

      Could you elaborate why? It looks like a useful protocol.

      • fr4nkr 4 hours ago ago

        I elaborated a bit when I edited my post, but to be more specific, I think LSP is a protocol that fails at its stated goals. Every server is buggy as hell and has its own quirks and behaviors, so editors that implement LSP have to add workarounds for every server, which renders the point of LSP moot. It's the worst of both worlds: editors are still duplicating effort, but with fewer, if any of the benefits of tools tailor-made for a specific editor-language combination. And that's not even touching on the protocol's severe performance issues.

        Unsurprisingly, the vast majority of servers work much better with VSCode than other editors. Whether this was a deliberate attempt by Microsoft to EEE their own product, or simply a convenient result of their own incompetence, is ambiguous.

  • pacoxu2025 4 hours ago ago

    But open source foundations do provide some guides/events/programs to coordinate.

  • alganet 22 minutes ago ago

    Must... make... cathedral... at... all... costs... why... no... cathedral... halp...

    http://www.catb.org/~esr/writings/cathedral-bazaar/cathedral...

  • udev4096 5 hours ago ago

    What? How is this even at top? Some no-name program is not getting an update or is not perfectly installable and suddenly it's an open source problem? Stop being an entitled prick

  • colesantiago 6 hours ago ago

    I would love to tell people about linux for their desktops, but the main issue I have with it is the fact that people who are interested in it ask me one question regarding Linux distributions:

    “Which one?”

    This is pretty much the cause of a 90% drop off of interest in Linux on the desktop.

    I could say use Ubuntu (and I do) to some of the people I’m close with who are interested in Linux, but then they discover Lubuntu, Linux Mint, and Debian, get easily confused, and give up.

    And that is not even getting into the updates and the packaging and heaven forbid anything breaks.

    • udev4096 5 hours ago ago

      There aren't that many to ponder over when recommending one for daily use. For beginners, Fedora is the perfect choice. For people with a programming background, Arch. Ubuntu was sane some time ago; not anymore, because of the bloat it ships by default.

    • Milpotel 4 hours ago ago

      > And that is not even getting into the updates and the packaging and heaven forbid anything breaks.

      How to spot the Ubuntu user...

      • imp0cat 3 hours ago ago

        And lead him to Nix? :)

        And then watch his eyes glaze over as he realizes that he's bitten off a lot more than he can chew. :D

  • UltraSane 6 hours ago ago

    I was truly shocked at how bad the experience is when you are using an RPM-based distribution and a program is only available as a DEB.

    • mroche 5 hours ago ago

      Tools like Flatpak, AppImage, Snap, Toolbox, and Distrobox can go a long way on relieving the end-user of the burden of trying to get things playing nice in those situations. Not always a silver bullet, but a useful tool to keep in the back pocket.

      If it's FOSS, at least you have the option of trying to repackage it for your distribution. You're SOL if it's a proprietary application distributed in binary format, though.

      • UltraSane 5 hours ago ago

        It was the Termius SSH Client

    • udev4096 5 hours ago ago

      There is an rpm-to-deb converter which works sometimes. Most modern projects include an AppImage these days, which is distro-agnostic and only requires FUSE to be installed.

  • mike_hearn an hour ago ago

    Many moons ago Scott Alexander wrote a critique of Marx. It starts by arguing that if capitalists can be said to produce anything, it's coordination. Coordination, Scott argues, is a thing every bit as real as coal or food or legal services. People need to manufacture it, and we call them executives/investors/marketing staff, and others want to buy it. When we buy coordination we call it brand value or similar. Open source has a notable absence of coordinators, because producing coordination is hard and non-fun, so without a capitalist market there's not much incentive to do it. Same reason desktop Linux historically struggled with anything that wasn't hobby programming (art, UI design, etc... eventually Red Hat and others hired such people using server profits).

    The Linux kernel and GNU in general are projects that hacked around that problem by just copying the decisions of other people who were coordinated by capitalists (UNIX vendors), which worked long enough to bootstrap the ecosystem until some of the key people could be coordinated by Red Hat and others who monetized indirectly. But at every stage, the coordination was being produced by capitalists even though it was hard to see.

    In other places where the mimic-and-support model didn't work, open source really struggled. This is most obvious on the desktop. Even there, ultimately this approach has been adopted for large chunks of it. If you play games on Linux today it's because people copied the Win32 API i.e. the coordination was produced by capitalists like Bill Gates.

    Now Alex mentions LSP and JetBrains. The reason JetBrains didn't do the LSP isn't because of value capture. After all, IntelliJ has been open source for a long time. Other IDEs could easily have embedded it and used its plugins. The reason JetBrains use a Java API is because it's a lot more productive and effective to design in-process APIs than network protocols. As long as you aren't crossing runtime boundaries they're easier to write, easier to test, easier to reason about statically (especially w.r.t. concurrency), and much more performant. You can exchange complex object graphs in a shared address space and coordinate them using locks. All this is a highly effective way to extend an IDE.

    Microsoft did the LSP because they took a bunch of energetic developers who only wanted to do web development, so they used Electron. Also for reasons of sticking with the crowd, .NET being pretty useless for cross-platform desktop stuff... it's not just that experience with desktop programming is fading away. But browsers were never designed for the challenges of large scale desktop programming, in fact they weren't designed for building apps at all. So they don't let you use threads, static typing via TypeScript is an aftermarket hack, V8 has very low maximum heap sizes, and there are many other challenges with doing a clean JetBrains style architecture. To their credit, the VS Code team leaned into the architectural limits of the browser and did their best to turn it into advantages. They introduced this notion of a backend that could run independently of the frontend using a 'standard' protocol. This is technically not really different to the IntelliJ API being open source, but people like the idea of protocols more than embedding a JVM and using stuff in a company-specific namespace, so that created a lot of community good will and excitement for them at the cost of many technical challenges.

    Those challenges are why JetBrains only use the LSP style approach for one of their IDEs, which due to historical reasons doesn't share the same architectural approach as all the others. And it's also why, if you look at the Rider protocol, it's some fairly advanced state sync protocol thing, it's not a plain old HTTP RPC style protocol.

    Given that both are open source and both are produced by teams of paid developers working in an office coordinated by capitalists, it's probably not right to identify this as an open source vs proprietary difference. It's purely a technical one to do with JVM vs web as foundational platforms.

  • shadowgovt 6 hours ago ago

    I am reminded of someone I read recently decrying GNOME's adoption of systemd components as a critical dependency as a loss, because they want alternatives to systemd.

    ... and this is a layer of open source flexibility I never wanted. I don't want alternatives to core system management; I want one correct answer that is rugged, robust, well-tested, and standardized so that I don't have to play the "How is this service configured atop this service manager" game.

  • xpe 7 hours ago ago

    Apple coordinates internally, since macOS works with Apple hardware. Windows can drive coordination among hardware vendors. In the Linux world, many organizations and projects share power; there is no equivalent focal point of power pushing for a consistent end-user OS (dependencies, configuration). Declarative and deterministic build systems at the OS level allow different groups to package their subcomponents reliably. As various configurations get socialized, users get to trade off between customization and popularity/vetting.