Good news for future yt-dlp releases (cf. https://news.ycombinator.com/item?id=45898407), which could now mark Deno as an optional dependency available through Python's native packaging.
Update: I lobbed the idea over on the yt-dlp issue tracker. The initial response has been skeptical, citing some issues (which I agree about) with the current state of the Deno packaging. But I'm optimistic that this can be sorted out eventually.
https://github.com/yt-dlp/yt-dlp/issues/15530 for those interested to check/chime in.
yt-dlp was also the first application that came to my mind. I've got my fingers crossed for this integration. Learning how to hijack my own cookies was interesting but, nonetheless, rather uncomfortable to say the least.
I'm not crazy about the way this installs Deno as `/usr/local/bin/deno` (on Linux systems at least). I was hoping it would leave that executable tucked away in Python site-packages somewhere out of the way.
I also ran into some weird issues where sometimes the binary isn't executable and you have to chmod +x it - including in GitHub Actions workflows. I had to work around it like this: https://github.com/simonw/denobox/blob/8076ddfd78ee8faa6f1cd...
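For reference, the workaround amounts to restoring the execute bits before invoking the binary; a minimal Python sketch (the hard-coded path is a placeholder, not the package's actual layout):

    import os
    import stat

    # Placeholder path: wherever the wheel unpacked the deno binary.
    deno_bin = "/path/to/site-packages/deno/deno"

    # Re-add execute permission for user, group, and other.
    mode = os.stat(deno_bin).st_mode
    os.chmod(deno_bin, mode | stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH)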
OK this is cool:
One-liner to run Deno without a separate step to install it first. The wheel comes in five flavors: https://pypi.org/project/deno/#files - Windows x86, manylinux x86 and ARM64, macOS x86 and ARM64.
That's a lot of machines that can now get a working Deno directly from PyPI.
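One plausible shape for such a one-liner is uv's tool runner, assuming the wheel exposes a `deno` entry point (which the `/usr/local/bin/deno` behavior mentioned above suggests):

    uvx deno --version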
There's also a "source" distribution that installs by downloading from the main project's GitHub release page. But that's presumably limited to the same platform support.
The yt-dlp project also raised concerns that the manylinux wheel incorrectly advertises older glibc support.
... but I always bristle a bit at the "one-liner to run without installing" description. Sure, the ergonomics are great, but you do still have to download the whole thing, and it does create a temporary installation that is hard-linked from a cache folder that is basically itself an installation.
Sure, but as an end-user you don't have to think about installation at all. That's a huge win - mainly because it eliminates the "Did I install this already? Where did I put it? What's the command for doing that again?" mental overhead.
Can someone please ELI5 what this means for Deno and Python? TFA: "deno is being distributed on pypi for use in python projects" makes it sound like you can now `import deno` and have a JS engine/subsystem in Python, like we finally came full circle from [PyScript](https://pyscript.net/).
However, other comments make it sound like a bunch of other projects have discovered that PyPI is a good distribution channel. Which, to me, sounds like using the Internet Archive as your CDN. Is PyPI the next apt/yum/brew or what?
You can, in fact, `import deno` after installing it. But all this gets you is a function that locates the Deno executable, which you can then invoke e.g. with `subprocess.call`.
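For illustration, a minimal sketch of that pattern (the helper name `find_deno_bin` is my guess at the shim's API, not a confirmed name):

    import subprocess

    import deno

    # The shim only tells you where the bundled executable lives;
    # actually running it is up to you.
    subprocess.call([deno.find_deno_bin(), "--version"])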
(I hope this doesn't become a pattern that puts excessive pressure on PyPI. IMO it should only be used for things that are specifically known to be useful in the Python ecosystem, as a last resort when proper Python API bindings would be infeasible or the developer resources aren't there for it. And everyone should keep in mind that PyPI is just one index, operating a standard protocol that others can implement. Large companies should especially be interested in hosting their own Python package index for supply-chain security reasons. Incidentally, there's even an officially blessed mirroring tool, https://pypi.org/project/bandersnatch/ .)
I think it's just so the deno binary is available using pip install. Zig does the same thing.
For those less in the know: is it for convenience? Because most systems have a package manager that can install Python, correct? But `pip` is more familiar to some?
I think it’s more for Python libraries that depend on JavaScript.
Lots of packages rely on other languages and runtimes. For example, tabula-py[1] depends on Java.
So if my-package requires a JS runtime, it can add this deno package as its own dependency.
The benefit is that consumers only need to specify my-package as a dependency, and the Deno runtime will be fetched for free as a transitive dependency. This avoids every consumer needing to manage their own JavaScript runtime/environment.
https://pypi.org/project/tabula-py/
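Concretely, a hypothetical my-package would just declare it in its packaging metadata; a minimal pyproject.toml sketch:

    [project]
    name = "my-package"
    version = "0.1.0"
    dependencies = [
        "deno",  # pulls the Deno runtime in transitively
    ]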
The zig one allows you to build native modules for your python project from setup.py without having to have a C/C++ toolchain preinstalled. Here's a talk about this:
https://www.youtube.com/watch?v=HPmefnqirHk
It's because when you use the native Python packaging tools, you can install a Python "wheel" into an arbitrary Python environment.
If you get Deno from the system package manager, or from deno.com directly, you're more constrained. It seems you can set an environment variable to control where the deno.com installer puts the binary, but then you still need to make your Python program aware of that path.
Whereas a native Python package can (and does, in this case, and also e.g. in the case of `uv`) provide a shim that can be imported from Python and which tells your program what the path is. So even though the runtime doesn't itself have a Python API, it can be used more readily from a Python program that depends on it.
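For example, as I recall the uv package on PyPI exposes a `find_uv_bin` helper for exactly this purpose; a short sketch of the pattern:

    import subprocess

    from uv import find_uv_bin

    # Resolve the bundled binary's path via the Python shim, then run it.
    subprocess.run([find_uv_bin(), "--version"], check=True)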
PyPI is the only OS-agnostic package manager already installed (via pip) on virtually every OS.
Also, it's VERY convenient for companies already using Python as the primary language, because they can manage the dependency with uv rather than introduce a second package manager for devs. (For example, if you run Deno code but don't maintain any JS yourself.)
I'm no expert when it comes to software packaging and distribution issues but this does give off Internet-Archive-as-CDN levels of Hyrum's Law for me. What could possibly go wrong hmmmmmm....
Quite interesting to observe PyPI being used as a distro-agnostic binary package manager. Someone is going to create a NixOS competitor that uses PyPI for hosting and uv for installation.
For those who like the idea but don't want to use someone else's bandwidth for it: the PyPI API is described across several PEPs and documented on-site (https://docs.pypi.org/api/); and a mirroring tool is implemented under PyPA stewardship (https://pypi.org/project/bandersnatch/).
But at the individual project level this definitely isn't new. Aside from the examples cited in https://news.ycombinator.com/item?id=46561197, another fairly obvious example of a compiled binary hosted on PyPI is... uv.
I realize you are tongue in cheek, but I hope people respect the logical limits of this sort of thing.
Years ago, there were some development tools coming out of the Ruby world – Sass for sure, and Vagrant if I remember correctly – whose standard method of installation was via a Ruby gem. Ruby on Rails was popular, and I am sure that for the initial users this had almost zero friction. But the tools began to be adopted by non-Ruby-devs, and it was frustrating. Many Ruby libraries had hardcoded file paths that didn't jibe with your distro's conventions, and they assumed newer versions of Ruby than existed in your package repos. Since then I have seen the same issue crop up with PHP and server-side JavaScript software.
It’s less of a pain today because you can spin up a container or VM and install a whole language ecosystem there, letting it clobber whatever it wants to clobber. But it’s still nicer when everything respects the OS’s local conventions.
I think golang is better in this context.
Golang has really fast compile times, unlike Rust, and it usually cross-compiles cleanly (yes, I know CGo can be considered a menace).
Golang binary applications can also be installed rather simply.
I really enjoy the golang ecosystem.
Putting this here for visibility:
PyPI: https://pypi.org/project/deno/
GitHub: https://github.com/denoland/deno_pypi
(Note that the GitHub link in the first post of the issue linked by this HN post now redirects to the official location, as of the time I write this.)
It would be pretty magical if this simplifies bundling static assets in Python applications, letting us avoid independently installing and running the Node toolchain.
It does.
Why is it 2026 and I still can't apt install deno?
IMHO, software that moves fast shouldn't be apt installed.
You end up with old versions as default installs that are hard to upgrade.
Indeed. I would say the bigger question is why Debian does package yt-dlp even though it's basically guaranteed to be unusably far out of date.
For better or worse, PyPI is the executable distribution mechanism of the future.
Other cool tools you can install from pypi:
1. https://pypi.org/project/cmake/
2. https://pypi.org/project/ninja/
3. an entire c/c++/zig toolchain: https://pypi.org/project/ziglang/
4. the nvcc cuda compiler: https://pypi.org/project/nvidia-cuda-nvcc/
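All of them a plain pip install away:

    pip install cmake ninja ziglang nvidia-cuda-nvcc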
I... really don't know if I'd go that far. Better not to abuse Fastly's goodwill in providing the bandwidth. These things have PyPI distributions specifically because they support legitimate Python projects. For example, CMake and Ninja are part of a stack intended to support building things for the SciPy ecosystem, using scikit-build. The CUDA stuff is obviously relevant to PyTorch, TensorFlow et al. And (per the README) "The ziglang Python package redistributes the Zig toolchain so that it can be used as a dependency of Python projects."
Obviously a question of goodwill arises, but for what it's worth, I wouldn't consider it abuse so much as innovation; let's see where the curiosity leads us.
Some time ago, someone made npm packages that install fonts through npm, using the registry's CDN to serve them.
I think it's more private than many competitors out there. One suggested Google Fonts alternative is coollabs, which uses Bunny CDN under the hood; but using npm's infrastructure, which is usually fronted by Cloudflare, is another great idea as well.
Also, you're forgetting that these registries benefit from economies of scale.
And PyPI isn't the official, or the only, channel they distribute Deno through. If it were, you could say it would burn through a lot of bandwidth goodwill; but the current use case is probably only tens, or at worst hundreds, of gigs per day. This is just a method for systems where Python is usually already installed, it simplifies the installation of Deno a lot, and there are some genuinely beneficial use cases driving it, recently including yt-dlp.
It's a good idea, for what it's worth.
For context, jsDelivr delivered 20,572 TB of data per month, for free.
I genuinely think Deno's Python package might not even reach 100 GB of data per month, and even that feels like a stretch. Python CUDA modules are usually the largest bandwidth eaters, IMHO.
All in all, this is a valid implementation/idea. The abuse-of-goodwill complaint doesn't hold much weight.
I'm not specifically objecting to the Deno distribution, but to the idea of PyPI becoming "the executable distribution mechanism of the future". The latter makes it sound like people would use it for things that have nothing to do with Python.
Totally off topic, but the ziglang package on PyPI has been very helpful for a few of my projects. It's nice to write low-level components and have a simple install process.
Why, though? What makes PyPI or compatible registries so great for this? On a similar note, Pixi uses the conda packaging mechanism for managing platform-agnostic, multi-language software distribution. How do these two compare?