Just a small note on the code in the linked script:
    API_KEY = os.environ.get("YOUTUBE_API_KEY")
    CHANNEL_ID = os.environ.get("YOUTUBE_CHANNEL_ID")
    if not API_KEY or not CHANNEL_ID:
        print("Missing YOUTUBE_API_KEY or YOUTUBE_CHANNEL_ID.")
        exit(1)
Presenting the user with "Missing X or Y" when that ambiguity is entirely avoidable massively frustrates them, for the near-zero benefit of writing one fewer if statement.
    if not API_KEY:
        print("Missing YOUTUBE_API_KEY.")
        exit(1)
    if not CHANNEL_ID:
        print("Missing YOUTUBE_CHANNEL_ID.")
        exit(1)
Way better user experience, 0.00001% slower dev time.
> Not only because the syntax is more human-friendly, but also because the Python interpreter is natively integrated in all Unix distros
That's a rather optimistic evaluation - literally anything beyond "import json" will likely lead you into the abyss of virtual envs. Try running something created with, say, Python 3.13.x on Ubuntu 22.04 or even 24.04 (both LTS) or Rocky 9, and the whole can of worms opens up.
Things like virtual envs plus containers (Docker and the like) or version managers quickly become a must.
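One way around that, at least for single-file scripts (this assumes uv, which the article itself leans on, and a made-up script name): declare the Python version and dependencies inline per PEP 723 and let uv fetch a matching interpreter:

    # fetch_stats.py (hypothetical example script)
    # /// script
    # requires-python = ">=3.13"
    # dependencies = ["requests"]
    # ///
    import requests

    print(requests.get("https://example.com").status_code)

Then `uv run fetch_stats.py` on a stock Ubuntu 22.04 box pulls a managed 3.13 and builds a throwaway environment for it, no system Python or hand-rolled venv required.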
Maybe I'm the only one that finds Python simultaneously verbose and lacking? Either you need 500 dependencies to do something in a simple way, or you need dozens (if not hundreds) of lines to do trivial things. I avoid writing Python because there's so much bullshit to add. Much prefer Perl, I can actually get things done quickly. Python feels like programming for the sake of programming.
I'm glad someone else discovered they can like python.
I got forced to learn it for a project where I was proposing Ruby and the customer insisted on Python. This was years ago when Ruby was much slower. I was annoyed but I got used to it and here I am enjoying it many years later.
I take issue with the description and use of make though! :-D What is the point of it if you're not going to use dependencies? One might as well just write a script with a case statement..... I'm adding smileys because I don't mean to be all that serious but I really do think that the failure of the youth of today to get to grips with Make is a sad reflection on our culture....... and get off my lawn, ok? :-)
For me, python is the closest thing to writing pseudocode that functions. Every time I have the instinct to gloss over a thing when writing it down (because it feels obvious in my head), it turns out that python has an intuitive abstraction for it.
Coming from a mathy background I found it incredibly satisfying, although I’ve come around to other languages since.
From what I was told, Python was originally seen as a Swiss Army knife for sysadmins. It started gaining more traction when Canonical adopted it as the main language for Ubuntu 4.10 in 2004.
Then, in 2005, Guido van Rossum was hired by Google (where he later worked on App Engine). That opened the door for wider adoption in academia, since Python had strong math libraries and integrated well with tools researchers were already using, like Hadoop, right around the time big data and ML were starting to take off.
Also, between 2005 and 2006, two important things happened: Ruby on Rails came out and inspired Django, which was starting to gain popularity, and web developers were getting tired of Perl's messy syntax. That's how Python quickly became a solid choice not just for server-side scripts, but for building proper web apps. In the meantime, another language that could be embedded directly into HTML was storming the web: PHP. Its syntax was similar to JavaScript, it was easy to pick up, lowered the barrier to entry for software development, worked straight out of the box, and didn't require thousands of print statements to get things done.
The 3 Ps made history. According to programmers from 20 years ago, they were like religions. Each had its own philosophy and a loyal group of followers crusading online, getting into heated debates, all trying to win over more adopters. The new generation of devs is more pragmatic. These days it's less about language wars and more about picking the right tool for the job.
Python has done an impressive job over the years of making steady, robust improvements. The typing and tooling have just gotten better and better. There are still plenty of problems though; imho async is still a much bigger pain than it should be (compared to other runtimes with a very nice experience like Go or Elixir - even dotnet has been less pain in my experience). Overall I like Python, but it mainly boils down to the robust libraries for the things I do (ML, data munging/analysis).
I make projects following almost identical patterns. It's a little uncanny. Maybe the people in the Python developer ecosystem are converging on a pretty uniform way to do most things? I thought some of my choices were maybe "my own", but seeing such consistency makes me question my own free will.
It's like when people pick a "unique" name for their baby along with almost everyone else. What you thought was a unique name is the #2 most popular name.
> Python has done a good job of hiding its legacy ugliness (such as __init__, __new__, and similar aberrations), sweetening its syntax to accommodate developers with good taste.
What exactly is the problem with __init__ or __new__? @dataclass is very nice syntactic sugar, but are we arguing here that having access to the initializer/allocator/constructor dunder methods is "legacy ugliness"? This is the core of Pythonic, built-in-aware Python. Bizarre.
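For what it's worth, @dataclass doesn't hide the dunders so much as write them for you; these two toy classes (a sketch of mine, not from the article) behave roughly the same:

    from dataclasses import dataclass

    class PointManual:
        def __init__(self, x: float, y: float) -> None:  # the "ugly" dunder, written by hand
            self.x = x
            self.y = y

    @dataclass
    class PointSugar:  # generates an equivalent __init__ (plus __repr__ and __eq__)
        x: float
        y: float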
Out of curiosity, why would I use a dataclass vs a Pydantic BaseModel? If we did not have a Pydantic dependency, I could imagine wanting to use dataclasses. But if I have it, why not use it everywhere?
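Not the author, but the usual trade-off: a dataclass only generates boilerplate and does no runtime checking, while a Pydantic BaseModel validates and coerces input at a (usually small) runtime cost. A rough sketch, with made-up field names:

    from dataclasses import dataclass
    from pydantic import BaseModel, ValidationError

    @dataclass
    class VideoDC:
        video_id: str
        views: int

    class VideoPD(BaseModel):
        video_id: str
        views: int

    VideoDC(video_id="abc", views="1000")               # accepted as-is; views is silently a str
    print(VideoPD(video_id="abc", views="1000").views)  # coerced to the int 1000
    try:
        VideoPD(video_id="abc", views="lots")
    except ValidationError as err:
        print(err)                                      # rejected: can't coerce "lots" to int

So the common split is dataclasses for internal data you already trust and BaseModel at the boundaries where data arrives from outside; once Pydantic is a dependency anyway, using it everywhere is mostly a question of how much validation overhead and coupling you want.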
> the Python interpreter is natively integrated in all Unix distros
It's included in the default install of most desktop/server Linux distros (with plenty of exceptions), but I don't believe any of the BSDs ship it in their base system.
IIRC macOS used to have python 2 in its default install, but I vaguely recall that being deprecated and removed at some point. My only Mac is on the other side of the country at the moment, so I can't check myself.
I've avoided Python for a long time, but I'm getting roped in myself, mainly because certain tasks seem to require a lot less code than Java or Perl.
That said, call me old-fashioned, but I really take issue with "curl $URL | bash" as an installation method. If you're going to use an install script, inspect it first.
I like Python for the language, and for a lot of jobs the threading-model limitations do not matter. It's a great language to get stuff done. I find the package management story challenging, but I will try uv next time!
Good write up, and solid choices. As someone primarily working in python in the last few years, I have a very similar stack.
Two additional suggestions:
* mise to manage system dependencies, including uv version and python itself
* just instead of make; makefile syntax is just too annoying.
Mise actually has a command runner as well which I haven't tried yet, and might work better for running commands in the context of the current environment. It's pretty nice when your GitHub Actions workflow is just:
* Install mise
* mise install everything else
Otherwise, the existence of a file or folder with the same name as your task ("test", for example) will stop that task from being run, which might be very annoying if you're using the Makefile as part of a script or CI or something where you won't notice the "Nothing to be done for..." message.
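For completeness, the usual fix for that make quirk is declaring such targets .PHONY, so make never checks the filesystem for them; a minimal sketch (the commands are placeholders, and recipe lines must be indented with a tab):

    .PHONY: test lint

    test:            # runs even if a ./test directory exists
    	pytest

    lint:
    	ruff check .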
From what? It would be useful to mention that towards the front of the post, so we know the context in which you approach Python.
I switched to Python primarily (from Perl) in the early 2010s (I think the first version I used "seriously" was 2.6). This is mostly for system management, monitoring, and data transformation/visualisation. Nothing fancy like AI back then in a work setting.
I found the biggest impact was not so much on writing code but on it remaining readable for a very long time, even if it was created hastily "just get this working now" style. Especially in a team.
Python is still one of my favourites and the first tool I reach for if bash is not enough for what I'm trying to do.
It's funny, I'm the opposite: LLMs have made it easy to draft something in Python, then translate to a more appropriate language for the target problem-domain, for example Go.
"And guess what's the de facto programming language for AI? Yep, that sneaky one."
Is this referring at all to PyTorch? If not, any guesses as to what the author has in mind?
"Not only because the syntax is more human-friendly, but also because the Python interpreter is natively integrated in all Unix distros."
Is this referring to GNU/Linux?
UNIX (and UNIX-like) covers more than Linux; some UNIX distributions do not include Python in the base system.
There it is left as a choice to the user whether to install it.
I know this because I use such distributions and, unless some software needs it, I do not install Python.
When I am done using that software, I uninstall it.^1
For example, he mentions retrieving YouTube channel metadata.
I do not use Python for this; I use a 19-line shell script (ash, not bash), and its startup time is faster.
Unlike Python, it is included in the base system of the UNIX distributions (both Linux and BSD) that I use.
But if I need to test something using yt-dlp, then I might temporarily install Python.
1. I compile Python from source, and one annoying aspect of the project, in addition to the slow startup time, is its failure to include an uninstall target in its Makefile.
Worked at a company where that approach led to a huge, unwieldy structure that no one dared to touch, for fear of breaking whatever other teams were working on. The problem was not so much the repo as the dependency structure (like a single requirements.txt for the whole repo) and the build scripts.
In theory it should've worked great -- you only need to update a dependency once and can be sure all the code has the most recent security patches. In reality everyone was afraid to touch it, because it would break someone else's code. Monorepos work great only if you have serious NIH syndrome (Google).
I actually started appreciating damned micro-services there, as long as each service just reflects the organization's team structure (https://en.wikipedia.org/wiki/Conway's_law).
When did Python go out of fashion? This is the second article I've seen talking about it as if it's some kind of abomination.
I get that it's not the shiny new thing, but I don't understand people hating on it. Is this just junior devs who never learned it, or is there some new language out that I missed? (And please don't tell me Javascript....)
I love writing Python. However, there are caveats:
1: I don't like crossing language boundaries with it - it's painful, especially to/from a compiled language - there's just too much friction. It's easy to write but hard to maintain.
2: Debugging python can be...painful. If you have a pure perfect environment dedicated to pure python and all the tooling set up, it can be breezy. But if you have something messier like C++ that calls Python or vice-versa and are using an editor not made for Python like QTCreator then suddenly it becomes hell.
Who owns this data!? Where was this initialized!? What calls this function!? What type is this data!?!?!?!? These questions are so effortless in C++ for me and so very painful in Python. It slows efforts to a crawl.
It feels exhausting to have to go back to print statements, grep, and hand-drawing a graph that shows what functions call what just to figure out WTF went wrong. It's painful to a degree that I avoid it as much as possible.
...and don't even get me started on dealing with dependency problems in deployments...
I really don't find the Python language elegant at all. I prefer the Ruby syntax.
But Python's tooling, particularly with what Astral is doing (uv, ruff, ty), is so good, I'm always using Python and virtually never using Ruby. And yeah, the rich libraries, too.
> And guess what’s the de facto programming language for AI?
I thought nodejs/typescript seemed to be the default that most LLMs choose? Or is that just v0/lovable/replit? (although replit seems better about going non-js sometimes)
> I started to code more in Python around 6 months ago. Why? Because of AI, obviously. It’s clear (to me) that big money opportunities are all over AI these days.
I find this depressing. Not only are LLMs covertly reducing our ability to think and make decisions, they’re now also making people voluntarily conform to some lower common denominator.
It’s like humanity decided to stagnate at this one point in time (and what a bad choice of point it was) and stop exploring other directions. Only what the LLM vomits is valid.
I have been enjoying Lisp languages since the late 1970s, and today it makes me happy using Common Lisp and Racket in the same way as when I stood in a forest very early this morning drinking coffee makes me happy.
But, Python is also a fun language and phrases like “good enough” and “actually liking it” are valid.
There is nothing more annoying than tons of little repos, each containing a tiny project with a few hundred lines of code, when (of course) you need most or all of them to do anything. Use a monorepo until there is some obvious reason to split it up, imo.
I keep meaning to write something like this but exploring the "how simple can you make it" angle - a lot of my world is kube (which is great in the right scenario), but could you shave complexity out of a stack designed for solo-dev rapid iteration?
e.g. rather than:
> It’s important not to do any heavy data processing steps in the project-ui … we keep the browser application light while delegating the heavy lifting and business logic to the server
Chomp the complexity, serve HTML from the backend directly
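e.g. something along these lines (FastAPI here is my assumption, not necessarily what the post uses):

    from fastapi import FastAPI
    from fastapi.responses import HTMLResponse

    app = FastAPI()

    @app.get("/", response_class=HTMLResponse)
    def home() -> str:
        # heavy lifting stays server-side; the browser just receives markup
        return "<h1>Dashboard</h1><p>Rendered by the backend, no SPA needed.</p>"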
> ty
I'm curious where ty goes, but for a min-complexity stack I couldn't spend complexity tokens on pre-release tools.
> pydantic … dataclasses
One or t'other; plus I'll forever confuse myself: is it __post_init__ (dataclasses) or model_post_init (pydantic)? I had to check!
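For the record (pydantic v2 spelling; a throwaway sketch to keep the two straight):

    from dataclasses import dataclass
    from pydantic import BaseModel

    @dataclass
    class A:
        x: int
        def __post_init__(self) -> None:             # the dataclasses hook
            self.x = abs(self.x)

    class B(BaseModel):
        x: int
        def model_post_init(self, context) -> None:  # the pydantic v2 hook
            self.x = abs(self.x)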
> docker
If we already have uv, could we get away without docker? uv sync can give an experience almost akin to statically compiled binaries with the right setup. It's not going to handle volumes etc., so if you're using docker features, this concept isn't going to fly. If you're not wedded to docker though, can you get away with just uv in dev and prod? In an enterprise, probably not - I wouldn't expect to be able to download deps in prod. For flying solo though…
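For that flying-solo case, the whole no-Docker deploy can be roughly this (the entry point is a made-up placeholder):

    uv python install 3.13        # managed interpreter, no system Python involved
    uv sync --frozen --no-dev     # install exactly what uv.lock pins
    uv run uvicorn app.main:app   # hypothetical entry point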
> compose
You’ve a frontend, a backend and presumably a database. Could you get away with just uv to manage your backend and a local sqlite db?
So a broadly feature-comparable stack for rapid iteration, with less complexity but still all the bells and whistles so you don't need to cook everything yourself, might look like:
You could probably sketch a path to hyper scale if you ever needed it:
- v1 = the above stack
- v2 = swap sqlite for postgres, now you unlocked multiple writers and horizontal scaling, maybe py-pglite for test envs so you can defer test-containers adoption for one more scaling iteration. WAL streaming would add some complexity to this step but worth it
- v3 = introduce containers, a container registry and test-containers. I don't think you really unlock much in this step for all that additional complexity though…
- v4 = rke2 single node cluster, no load balancer needed yet
- v5 = scale to triple node, we need a load balancer too
- v6 = add agent nodes to the rke cluster
- v7 = can you optimise costs, maybe rewrite unchanging parts in a more resource efficient stack
…
I came to this comment thread expecting developer bikeshedding and I was not disappointed.
More seriously: it's fascinating to see how closely this article mirrors my own Python project layout. The tools and practices have come a long way over the last decade, and good developer standards have a way of becoming the de facto standard in organic fashion.
IMO uv has won the race to be the new standard for Python environment management (Poetry gave it a solid try, but lost on speed).
“I started to code more in Python around 6 months ago. Why? Because of AI, obviously. It’s clear (to me) that big money opportunities are all over AI these days. And guess what’s the de facto programming language for AI? Yep, that sneaky one.”
Unpopular opinion: I think I’m going to wait for version 4 /jk. But honestly, I’ve been spoiled by modern languages like Rust, Go, and even TypeScript with modern tooling, strong typing, stability, and performance out of the box. Right now, I’m just interacting with LLMs, not building them.
That said, I remember writing myself a note a few years ago to avoid Python projects. I had to clean up code from all over the company and make it ready for production. Everyone had their own Python version, dependencies missing from requirements.txt, three-way conflicts between two dependencies and the Python version, wildly different styles, and a habit of pulling in as many libraries as possible [1]. Even recalling those memories makes my stomach turn.
I believe constraints make a project shine and stay maintainable. I'd prefer you throw a real python at me instead of a Python project.
[1] Yes, I'm aware of containers, I was the unlucky guy writing them.
I recommend cookiecutter for this. I have a few templates I've built with that which I use frequently:
python-lib: https://github.com/simonw/python-lib
click-app: https://github.com/simonw/click-app
datasette-plugin: https://github.com/simonw/datasette-plugin
llm-plugin: https://github.com/simonw/llm-plugin
You can run them like this:
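Presumably something along these lines (my reconstruction, not the original snippet), pointing cookiecutter straight at one of the GitHub repos:

    pipx install cookiecutter            # or: uv tool install cookiecutter
    cookiecutter gh:simonw/python-lib    # answer the prompts, get a ready-to-publish project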
My thoughts about python here: https://calvinlc.com/p/2025/06/10/thank-you-and-goodbye-pyth...
Next time I get into Python I’ll try uv, ruff, ty.
I always found VS Code lacking for Python and C compared to PyCharm and CLion. The latter just work without fiddling with config files.
But writing a processing pipeline with Python is frustrating if you have worked with C# concurrency.
I figured the best option is Celery, and you cannot do it without an external broker. Celery is a mess. I really hate it.
Why is that?
Why Python for AI?
Made me think this is probably a Ruby developer indoctrinated against Python. The article doesn't seem to say what they came from.