Makefile Tricks for Python Projects (ricardoanderegg.com)
84 points by celadevra_ on May 27, 2023 | 34 comments


Don’t do this… unless you have a very specific and non-standard use case that requires it, like the author:

> I work using git worktrees to have multiple branches checked out at the same time. By using this trick, I can run different commands and make sure the imported package is from that branch, instead of having to python3 -m pip install -e . around all the packages

For most use cases, you can wire up your imports and script entry points in poetry, and manage venvs there too. Or just use pyenv-virtualenv if you’re on MacOS to auto-use your per-project venv.

Again, the author's requirements here are quite esoteric in my experience, and I’d really encourage folks to avoid this particular can of worms if at all possible.


Author here.

I agree my setup is not simple (or even recommended as the default), that's why I call them "tricks".

> you can wire up your imports and script entry points in poetry, and manage venvs there too. Or just use pyenv-virtualenv if you’re on MacOS to auto-use your per-project venv.

As someone who has tried most of the current Python tools (including poetry and pyenv), I wouldn't say that "wiring up" imports and entry points is much simpler or footgun-free. It also requires using specific tools, which may not be possible depending on the project.

Also, using multiple venvs is not equivalent, since you still need to manage those venvs and install the packages (+ dependencies). The `PYTHONPATH` trick can save a lot of time when the dependency tree is complex, and you just want a single venv.

For example, a workflow like:

  git worktree add -b feature_one ../feature_one
  #
  # edit some files
  #
  BRANCH=feature_one make run-service
  
  # in a different terminal, assuming `master` is also checked-out as a worktree
  BRANCH=master make run-service
With the `PYTHONPATH` trick, you can do this in a few seconds. Now you have the same service running two different versions of your package, without having to reinstall anything or worry about using the correct venv.
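
For reference, a minimal sketch of how the `PYTHONPATH` side of this can look in the Makefile (the directory layout, target name, and module name below are assumptions, not the exact recipe from the article):

  # assumes each worktree lives in ../<branch> and the package sources are in src/
  BRANCH ?= master

  run-service:
  	PYTHONPATH=../$(BRANCH)/src python3 -m my_service
Running `BRANCH=feature_one make run-service` then imports the package from that worktree without touching the venv.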

Again, I know this is a hack and it can be very project-specific. But it has worked great for me, and maybe it can be helpful to others.


> if you’re on MacOS to auto-use your per-project venv.

Just chuck the venv name into a .python-version file and pyenv will do that for you, on any OS.
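
For example (assuming pyenv + pyenv-virtualenv are installed and 3.11.4 is already available via `pyenv install`; "myproject" is a made-up name):

  pyenv virtualenv 3.11.4 myproject
  cd path/to/project
  pyenv local myproject    # writes "myproject" into .python-version
Any shell that enters the project directory then activates that venv automatically.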


Similar in spirit, I made a reusable Makefile for my Python projects: https://github.com/sio/Makefile.venv

It's a shame that the Python venv workflow is so opaque and unfriendly to new users that it requires extra automation tools.


__pypackages__ is gonna be so nice when it lands.


Hate to be the bearer of bad news but the PEP has been rejected https://peps.python.org/pep-0582/


Nooooooooooo it was the chosen one.

It's quite sad to see the rationale being that it would complicate "upgrading to venvs", when it should have killed them outright. Why using Python depends at all on weird shell semantics is beyond me.


PEP 582 has been rejected; is there any new approach?


There are quite a few things here that I've used for a few years on projects at work - overall I like it.

A big part of that is due to an endless stream of people who couldn't be bothered to learn what a venv was or how to use it, but... yeah, that's what work is like sometimes. The makefile pretty much ended those issues, absolutely worth the effort put into it.


I recently discovered just (https://just.systems), a make alternative that’s much more ergonomic IMO, and have been using it in my personal projects. It’s task-oriented, not goal-oriented, so not a perfect replacement, but works well for my needs.


Author here. I like `just` too, but it's missing a few features:

* It can't run tasks in parallel

* It doesn't handle file dependencies, i.e. it's just a task runner, not a build tool (see the sketch after this list).

* It's another tool you have to install; make comes pre-installed on most UNIX-like operating systems.
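
To illustrate the file-dependency point, a rough sketch (the file names are made up; pip-compile is from pip-tools): make only re-runs the recipe when the prerequisite is newer than the target.

  # re-generate requirements.txt only when requirements.in has changed
  requirements.txt: requirements.in
  	pip-compile requirements.in -o requirements.txt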


There’s a hidden problem.

macOS make is outdated (or not the GNU flavor, I'm not sure) and, for example, doesn’t support parallel tasks by default or some more advanced recipes.

Asking people to use gmake for a standard Makefile is much more difficult than just asking them to install the just runner.


100% agree, and this bothers me a lot. I like `just`, but for some reason I keep coming back to `make`.

> or not the GNU flavor, I'm not sure

On my macOS laptop

  which make
  # /usr/bin/make

  /usr/bin/make --version
  # GNU Make 3.81
> doesn’t support parallel tasks by default

I haven't verified this. I've checked `make --help` using the built-in make on my Mac (GNU Make 3.81), and it mentions the `--jobs` option to run multiple jobs in parallel.
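
For example (the target names here are hypothetical):

  # run independent targets on up to 4 parallel jobs
  make --jobs=4 lint test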


Checking the GNU Make manual, it might be the .NOTPARALLEL special target that wasn’t recognized correctly.

I can't provide the precise issues (as it was more than a year ago), but I encountered 3-4 problems after having a complete Makefile, which was kind of a downer because I had to rewrite it to support “vanilla” make.

In the end I dropped parallelism and went with a Justfile, and it took a fraction of the time (but truth be told, I had dynamic PHONY targets, which aren’t super easy to set up in Make).


I remember when Python was simple.


The existence of all these different systems is not very Pythonic.

"There should be one-- and preferably only one --obvious way to do it."


I prefer to stay in Python; Makefiles become very difficult to maintain. I use https://www.pyinvoke.org/


  > The reason to use $$(some-command) instead of the built-in function $(shell some-command) is that the expression will be evaluated every time it is called. [...] When using $(shell ...) the expression gets evaluated only once and reused across all the recipes.
I don't think the last sentence is true. As long as you define `py` using `=` and not `:=`, it will be evaluated every time it's used.
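
A quick way to see the difference (the variable and target names below are made up):

  STAMP = $(shell date +%s)    # recursive (=): re-evaluated on every expansion
  # STAMP := $(shell date +%s) # simple (:=): evaluated once, when the Makefile is parsed

  first:
  	@echo "first:  $(STAMP)"
  	@sleep 2

  second: first
  	@echo "second: $(STAMP)"
With `=`, `make second` prints two different timestamps; after switching to `:=`, both lines print the same value.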


Or use https://pydoit.org and a virtualenv and be happy.


Hi! Author here.

I've looked into pydoit multiple times and I would love to try it. There are a few reasons that make me go back to make (no pun intended!).

* It comes pre-installed on most UNIX operating systems. Yes, macOS ships an older version, but it works in most cases. This makes bootstrapping projects easier.

* I like how simple it is to get started with a Makefile when you just need a few tasks. Although I admit, big Makefiles can become hard to maintain.

I've recently experimented with using Makefiles as an ETL orchestrator, and it's becoming quite messy. I believe pydoit is the perfect replacement candidate. I just haven't had time to try it.

Edit:

Another thing I like about Makefiles is that you can keep the "hackiness" of shell scripts, manipulate the environment [0], etc. I guess you can do the same with other tools, but I find it easier to reason about those useful "hacks" in Makefiles.
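
For example, one small piece of that "hackiness" (the variable and module names below are made up, and this is not necessarily the trick from [0]):

  # export a variable into the environment of every recipe
  export APP_ENV = development

  serve:
  	APP_ENV=production python3 -m my_service    # or override it per command, plain shell style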

[0]: https://ricardoanderegg.com/posts/makefile-python-project-tr...


Looks like it’s pretty much abandoned:

> doit is under active development. Version 0.36.0 released on 2022-04.


Below what you quoted:

> doit core features are quite stable. If there is no recent development, it does NOT mean the project is not being maintained... The project has 100% unit-test code coverage.


Okay, but support stops at Python 3.10. I have nothing against the tool - it looks interesting. But isn’t support of current versions of Python table stakes for a tool like this? How am I, as a potential user of this tool, supposed to interpret the lack of 3.11 support and a last release dated end of last year?


After the chaos of Python 2-to-3, Python's backwards compatibility has been amazing. The tool is created for developers and claims 100% code coverage. Determining if it works with any specific version of Python would be a trivial exercise of cloning the repo, making a venv, and running the tests.
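
Something like (the exact test invocation may differ; check the repo's docs):

  git clone https://github.com/pydoit/doit
  cd doit
  python3 -m venv .venv
  .venv/bin/pip install -e . pytest
  .venv/bin/python -m pytest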

This is something that a dev should do when evaluating a new dependency even when it claims to support the version relevant to you. Sadly, many do not.


It works great, it's stable, I haven't found a bug in years, and it's compatible with the next Python release candidates.

For the immense convenience it brings, the risk is very low.

I'm guessing it could stay that way for 10 years and still be perfectly usable.


The GitHub repo shows commits made as recently as 4 months ago: https://github.com/pydoit/doit


I’ve used make like this in the past; it’s handy. But zsh autosuggestions obviate the extra maintenance IMO. I type the command once and it’s in my history, so my common dev actions are more flexibly maintained by recency than by editing a file.

You could say the Makefile also serves as a form of (not great) documentation.


I use it for documentation and simplifying my build flow.

I code in more than a few languages and frameworks, but I don't always memorize the magic build commands for each one.

In my Makefile, I simply create `make build` and it installs npm and pip dependencies.

I run `make clean` and node_modules is deleted or DerivedData are gone.

Otherwise, I'd dump this documentation in an unstructured README.md.
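
A minimal sketch of that kind of Makefile (the file names and the Xcode DerivedData path are assumptions for a mixed npm/pip/Xcode project):

  .PHONY: build clean

  build:
  	npm install
  	python3 -m pip install -r requirements.txt

  clean:
  	rm -rf node_modules
  	rm -rf ~/Library/Developer/Xcode/DerivedData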


I use Makefiles for managing multiple venvs for various jobs. For example running test and running in production have different requirements (no testing libraries). Or an environment for static analysis tools.
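
A rough sketch of that pattern, assuming one requirements file per environment (the file and directory names are made up); using each venv's interpreter as a file target means make only creates a venv when it's missing:

  .venv-test/bin/python:
  	python3 -m venv .venv-test
  	.venv-test/bin/pip install -r requirements.txt -r requirements-test.txt

  .venv-prod/bin/python:
  	python3 -m venv .venv-prod
  	.venv-prod/bin/pip install -r requirements.txt

  test: .venv-test/bin/python
  	.venv-test/bin/python -m pytest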


See also: "Automate your Python project with Makefile" https://antonz.org/makefile-automation/


I love this. Thanks for introducing me to so many things in make I didn’t know existed, and doing so in a manner that is understandable. Kudos


well, if Python ends up subsumed into Mojo then Pythonistas might as well learn how to build their code :-)


Is there a Makefile trick to gpg sign a python package? I think the Python community would find that useful.


Didn't PyPI just deprecate signatures?



