The C-Shaped Hole in Package Management (nesbitt.io)
60 points by tanganik | 27 January 2026 | 73 comments

Comments

Has anyone here even read the article?! All the comments here assume they're building a package manager for C!
They're writing a tool to discover and index all indirect dependencies across languages, including C libraries that were smuggled inside other packages and weren't properly declared as a dependency anywhere.
"Please don't" what? Please don't discover the duplicate and potentially vulnerable C libraries that are out of sight of the system package manager?
Please don't. C packaging in distros is working fine and doesn't need to turn into crap like the other language-specific package managers. If you don't know how to use pkgconf then that's your problem.
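For anyone who hasn't touched it, a minimal sketch of the workflow being defended here, assuming zlib is installed through the distro package manager (file paths vary by distro):

    /* zv.c - print the version of the system zlib.
     * Build against whatever the distro ships, no vendoring:
     *
     *     cc zv.c $(pkgconf --cflags --libs zlib) -o zv
     *
     * pkgconf reads the distro's zlib.pc file and emits the right
     * -I/-L/-l flags; the one shared libz then gets security updates
     * for every consumer at once. */
    #include <stdio.h>
    #include <zlib.h>

    int main(void) {
        printf("zlib %s\n", zlibVersion());
        return 0;
    }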
Missing in this discussion is that package management is tightly coupled to module resolution in nearly every language. It is not enough to merely install dependencies at given versions; you have to do so in a way that the language toolchain and/or runtime can find and resolve them.
And so when it comes to dynamic dependencies (including shared libraries) that are not resolved until runtime, you hit language-level constraints. With C libraries the problem is not merely that distribution packagers chose to support a single version of each dependency because it is easy, but that the loader (provided by your C toolchain) isn't designed to support anything else.
And if you've ever dug into the guts of glibc's loader, it's 40 years of unreadable cruft. If you want to take a shot at the C-shaped hole, start there: decouple the loader from the toolchain and add support for multiple-version resolution and the other basic features module resolution is expected to have in 2026.
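For context on what the loader can and can't do: the only portable way to get two versions of one library into a single process today is to sidestep normal link-time resolution entirely and do it by hand with dlopen/dlsym. A minimal sketch, with hypothetical builds libfoo-1.so and libfoo-2.so (version baked into the filename, roughly what wheel bundlers do when they vendor), each exporting foo_version():

    /* Loading two versions of the "same" library side by side.
     * RTLD_LOCAL keeps each object's symbols out of the global
     * namespace, so the two definitions of foo_version don't collide;
     * libraries linked the normal way (via DT_NEEDED) get no such
     * isolation. */
    #include <dlfcn.h>
    #include <stdio.h>

    int main(void) {
        void *v1 = dlopen("./libfoo-1.so", RTLD_NOW | RTLD_LOCAL);
        void *v2 = dlopen("./libfoo-2.so", RTLD_NOW | RTLD_LOCAL);
        if (!v1 || !v2) { fprintf(stderr, "%s\n", dlerror()); return 1; }

        int (*f1)(void) = (int (*)(void))dlsym(v1, "foo_version");
        int (*f2)(void) = (int (*)(void))dlsym(v2, "foo_version");
        printf("%d %d\n", f1(), f2());   /* one process, two versions */

        dlclose(v2);
        dlclose(v1);
        return 0;
    }

That this takes hand-rolled string lookups and function-pointer casts is exactly the kind of thing a modern module system would give you for free.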
I don't trust any language that becomes fundamentally reliant on package managers. Once package managers are normalized and pervasively used, people become less thoughtful and less investigative about the libraries they use. Instead of learning who created a library, who maintains it, and what its philosophy is, people increasingly just let 'er rip: install it and try a few snippets. If it works, great. Maybe it's a little bloated and that earns it a side-eye, but they can always replace it later... which never comes.
That would be fine if it only affected that first layer, a basic library and a basic app, but this habit stacks up layer upon layer, and ends up in multiple layers of software used by many people.
Not sure that I would go so far as to suggest languages with runaway dependency cultures shouldn't exist, but I will go so far as to say that any language which doesn't already have that culture needs to be preserved with respect, like uncontacted tribes in the Amazon. You aren't just managing a language; you are also managing process and mind. Some seemingly inefficient, seemingly less powerful processes and ways of thinking have value that isn't immediately obvious to people.
I use a lot of obscure libraries for scientific computing and engineering. If I can install one from pacman, or manage to get an AUR build working, my life is pretty good. If I have to use a Python library, the faff becomes unbearable: make a venv, delete the venv, change the Python version, use conda, use uv, try to install it globally, change the Python path, source .venv/bin/activate. This is less true for other languages with local package management, but none of them are as frictionless as C (or Zig, which I mostly use). The other issue is that .venvs, node_modules and their equivalents take up huge amounts of disk and make it a pain to move folders around, and no, I will not be using a git repo for every throwaway test.
I get that the scope of the article is a bit larger than this, but it's a pet peeve of mine when authors acknowledge the advantages of conda and then dismiss it for...silly? reasons. It kind of sounds like they just don't know many people using it, so they assume something must be wrong with it.
> If you don’t need compiled extensions, Conda is more than you need.
Am I missing something or isn't that exactly the problem we're talking about here?
> And even when you do need it, conda environments are heavier than virtual environments and the resolver used to be infamously slow. Mamba exists largely because conda’s dependency resolution took forever on nontrivial environments.
Like it says here, speed isn't a problem anymore - mamba is fast. And it's true that the environments get large; maybe there's bloat, but it definitely does share package versions across environments when possible, while keeping updates and such isolated to the current environment. Maybe there's a space for a language package manager that tries to be more like a system package manager by updating multiple envs at once while staying within version constraints to minimize duplication, but idk if many developers would think that is worth the risk.
This comes up every ten years or so, and is a solved problem. Any decent distro has tools to scan the dependencies of each binary via ldd, to check if its deps are correct.
His example, numpy shipping its own libblas.so, has the peculiarity that the library is loaded at runtime, so ldd will not find it; but the runtime dependency is listed in the MANIFEST.
And seeing that it is not in a standard path, you conclude it is a private copy that needs to be updated separately if broken. No other hole than in his thinking and worrying.
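If you do want to catch the dlopen'd case too, you have to look at what is actually mapped into a live process rather than at the file's DT_NEEDED entries (which is all ldd reads). A rough sketch using glibc's dl_iterate_phdr (Linux/glibc specific; the file name lsso.c and the idea of passing the target library as argv[1] are mine, not from the article):

    /* lsso.c - load a shared object, then list every .so mapped into
     * the process, including anything it dragged in at load time.
     * A path outside the standard directories is a bundled private copy.
     * Build: cc lsso.c -o lsso   (add -ldl on older glibc) */
    #define _GNU_SOURCE
    #include <dlfcn.h>
    #include <link.h>
    #include <stdio.h>

    static int print_so(struct dl_phdr_info *info, size_t size, void *data) {
        if (info->dlpi_name[0] != '\0')      /* "" is the main executable */
            printf("%s\n", info->dlpi_name);
        return 0;                            /* keep iterating */
    }

    int main(int argc, char **argv) {
        if (argc < 2) return 1;
        if (!dlopen(argv[1], RTLD_NOW)) {    /* actually load the target */
            fprintf(stderr, "%s\n", dlerror());
            return 1;
        }
        dl_iterate_phdr(print_so, NULL);     /* walk the link map */
        return 0;
    }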
The biggest difficulty is not that; it is the many assumptions you need when writing a makefile, and how to use different versions of the same library. LD_LIBRARY_PATH is treated as potentially risky. Not that it is... but assumptions from the past, like old monsters, are a barrier to simpler C tooling.
> Conan and vcpkg exist now and are actively maintained
I am not sure if it is just me, but I seem to constantly run into broken vcpkg packages: bad security patches that keep them from compiling, cmake scripts that can't find the binaries, missing headers, and other fun issues.
I think system package managers do just fine at wrangling static library dependencies for compiled languages, and if you're building something that somehow falls through the cracks, then you should probably just be using git or some kind of VCS for whatever you're doing, not a package manager.
But on the other hand, I am used to Arch, which both does package management à la carte as a rolling-release distro and has a pretty extensively used secondary community ecosystem (the AUR) for non-distro-maintained packages, so maybe this isn't as true in the "stop the world" model the author talks about.