Coding 2020-11-22
I ended up deciding that devpi was too heavy-weight for me to set up currently. I kind of wonder what it would take to spin up a temporary index that just does what I want. (I mean, it would basically just have to take a mapping from package names to file URLs, and forward all other requests to another index.) The main imposition of doing such a thing is that there'd need to be an explicit build step, which... might make things faster?
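For the curious, the per-package half of that hypothetical index is just a PEP 503 "simple" page: a list of anchors pointing at the files. Here's a minimal sketch of rendering one (the package name and URLs are made up, and the "forward everything else upstream" part, presumably an HTTP redirect to another index, isn't shown):

```python
import re


def normalize(name):
    # PEP 503 name normalization: lowercase, with runs of "-", "_", "."
    # collapsed to a single dash.
    return re.sub(r"[-_.]+", "-", name).lower()


def project_page(name, file_urls):
    # Render a minimal PEP 503 "simple" project page: one anchor per
    # artifact, where file_urls maps artifact filenames to their URLs.
    links = "\n".join(
        f'<a href="{url}">{filename}</a>' for filename, url in file_urls.items()
    )
    return (
        "<!DOCTYPE html>\n<html><body>\n"
        f"<h1>Links for {normalize(name)}</h1>\n"
        f"{links}\n"
        "</body></html>"
    )
```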
Anyway, I got shiv working, after some confusing stuff that was not its fault. Here's how my noxfile looks currently:
```python
import contextlib
import os.path
import pathlib
import tempfile

import toml

import nox

PROJECTS = [
    directory
    for directory in os.listdir()
    if os.path.isdir(directory)
    and os.path.isfile(os.path.join(directory, "pyproject.toml"))
]

CONSTRAINTS = (
    "\n".join(
        [
            f"{project} @ {pathlib.Path(os.path.abspath(project)).as_uri()}"
            for project in PROJECTS
        ]
    )
    + """
attrs ~=19.3
camel ~=0.1.2
click ~=7.1.2
# punq ~=0.4.1
# trio"""
)


@contextlib.contextmanager
def constraints_env():
    with tempfile.TemporaryDirectory() as tmpdir:
        constraints = os.path.join(tmpdir, "constraints.txt")
        with open(constraints, mode="w") as fh:
            fh.write(CONSTRAINTS)
        yield {"PIP_CONSTRAINT": constraints, "PIP_NO_CACHE_DIR": "YES"}


def install_from_repo(session, *projects):
    with constraints_env() as env:
        session.install(*projects, env=env)


def install_from_requirements(session, filename):
    session.install("-r", f"requirements/{filename}.txt")


@nox.session
def clean(session):
    install_from_requirements(session, "coverage")
    session.run("coverage", "erase")


@nox.session
@nox.parametrize("project", PROJECTS)
def check(session, project):
    install_from_requirements(session, "check")
    session.run(
        "isort",
        "--check-only",
        "--diff",
        f"{project}/src",
        f"{project}/tests",
    )
    session.run("flake8", f"{project}/src", f"{project}/tests")


@nox.session
@nox.parametrize("project", PROJECTS)
def mypy(session, project):
    install_from_repo(session, project)
    install_from_requirements(session, "mypy")
    session.run("mypy", f"{project}/src", env={"MYPYPATH": os.path.abspath("stubs")})


@nox.session
@nox.parametrize("project", PROJECTS)
def nocov(session, project):
    install_from_repo(session, project)
    install_from_requirements(session, "pytest")
    session.run(
        "python", "-m", "pytest", project, env={"PYTHONPATH": os.path.abspath(project)}
    )


@nox.session
@nox.parametrize("project", PROJECTS)
def cover(session, project):
    install_from_repo(session, project)
    install_from_requirements(session, "cover")
    session.run(
        "coverage",
        "run",
        "-m",
        "pytest",
        project,
        env={"PYTHONPATH": os.path.abspath(project)},
    )


@nox.session
def report(session):
    install_from_requirements(session, "report")
    session.run("coverage", "combine")
    # Disable while the punq fork is the main focus of development.
    # session.run("limit-coverage")
    session.run("coverage", "html", "--show-contexts")
    session.run(
        "coverage",
        "report",
        "--skip-covered",
        "-m",
        "--fail-under=100",
    )


@nox.session
def shiv(session):
    install_from_requirements(session, "shiv")
    shiv_build = "build/shiv"
    os.makedirs(shiv_build, exist_ok=True)
    artifacts = []
    for project in PROJECTS:
        with open(os.path.join(project, "pyproject.toml")) as toml_data:
            for script, endpoint in (
                toml.load(toml_data)
                .get("tool", {})
                .get("flit", {})
                .get("scripts", {})
                .items()
            ):
                artifacts.append((project, script, endpoint))
    with constraints_env() as env:
        for project, script, endpoint in artifacts:
            session.run(
                "shiv",
                "-e",
                endpoint,
                "-o",
                os.path.join(shiv_build, script),
                project,
                env=env,
            )
```
First, it detects all PEP 517 projects one level down, and uses them to generate a constraints file. I also added the only lines that I consider project-specific, to make sure that the various third-party packages are constrained consistently. punq is commented out because my fork is one of the subprojects, and trio is commented out because I haven't tried to constrain it yet.
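To spell out the line format being generated: each project gets pinned to its local checkout using PEP 508 direct-reference syntax, name, `@`, then a `file://` URL. Stripped down to just that trick (the project names here are invented):

```python
import pathlib

# Hypothetical subproject directories; the real noxfile discovers these
# by looking for a pyproject.toml one level down.
projects = ["punq", "mycli"]

# Each line pins a distribution name to its local checkout, e.g.
# "punq @ file:///home/me/repo/punq".
constraint_lines = [
    f"{project} @ {pathlib.Path(project).resolve().as_uri()}"
    for project in projects
]
```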
Next up, helpers for actually using the constraints file. The context manager puts the constraint data somewhere pip can see it, and builds environment variables that make pip, and any nested pips, behave properly. Everything breaks if I let pip use a cache directory; I think that's because I'm not bothering to bump version numbers. Anyway, the context manager was spun out from the following function, which handles installing one of the subprojects. It was spun out so it could also be used with shiv, because shiv uses pip under the covers.
The next helper installs the requirements for a particular session, which are stored in the requirements directory, and are built up by including each other as needed.
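I'll sketch what that layering looks like, with hypothetical file names and pins; pip resolves a `-r` inside a requirements file relative to the file containing it, which is what makes the includes compose:

```
# requirements/pytest.txt (hypothetical contents)
pytest ~=6.1

# requirements/cover.txt builds on it:
-r pytest.txt
coverage ~=5.3
```

So a session that asks for "cover" gets pytest's requirements too, without either file repeating the other.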
Getting to the sessions, we've got clean, which makes sure that coverage data doesn't leak between runs, and would do other cleanup if any were needed. The check session does basic linting against the codebases, which doesn't require installing them; I could maybe turn that into a for loop and remove the parametrization. The mypy session typechecks each project in isolation, with the help of some stub files I had to write for some of the dependencies. The nocov and cover sessions run pytest against each project, with the invocations kept as similar as possible, except that cover runs under coverage. The environment manipulation is required to get punq's doctests to pass when pytest is invoked from another directory; I don't know whether the way they're being invoked and supported is idiomatic, because I don't use doctests much. The report session is not quite how I usually like it, because the limit-coverage command has extremely specific expectations about test layout, expectations that are intuitive to me but maybe to no one else. Point is, unless I rearrange all of punq's tests (which I might as well do, at some point), running limit-coverage would make it impossible to get sensible data, so I'm not going to run it. Apart from that, report generates a nice HTML report and yells at me if I have uncovered statements, which is reasonable, in my experience.
Lastly, the new session, shiv, scrapes all of my pyproject files for script declarations, and converts them into shiv commands. Thinking over what it does, I'm actually going to rewrite it a little, but it's nothing too significant. The code is a little rough and kind of suggests some refactorings, but I don't think they'd be well-motivated yet, so I'm going to hold off. The directory structure for the build artifacts is a little nested, because I want to leave my options open for creating other kinds of artifacts.
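As an illustration of what the shiv session consumes and produces (the script and module names here are invented), a flit scripts table like:

```toml
# pyproject.toml (hypothetical)
[tool.flit.scripts]
mycli = "mycli.main:cli"
```

would turn into roughly `shiv -e mycli.main:cli -o build/shiv/mycli mycli`, run with `PIP_CONSTRAINT` pointing at the generated constraints file so the bundled dependencies match the rest of the repo.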
(By the way, apropos of nothing, I wonder what the etiquette is for something like... a PR that I haven't looked at in over a year, because I stopped using the project it's against...)
Anyway, that file is more-or-less how I like to handle automated tasks related to Python development these days.
It's late now, so I should wind down.
Good night.