Pip 2020-12-05

By Max Woerner Chase

I just did some cursory research on usage of URL constraints, and didn't find anything new.

Here's what I'm thinking so far.

Per pfmoore's thoughts on the behavior, each URL constraint should apply to all candidates, which means that, in practice, two distinct URL constraints together should always fail, though the failure shouldn't be frontloaded into the merge logic. So, the Constraint object can build up a set of Link objects; the empty constraint has an empty set, and the __bool__ and __and__ methods behave accordingly. To check whether a Candidate satisfies the constraint, every Link in the set must be compared against the Candidate's source_link, and they all have to match.
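To make that concrete, here's a rough sketch of the shape I have in mind. The Link, Candidate, and links_match names here are stand-ins of my own, not pip's actual classes:

```python
from dataclasses import dataclass
from typing import FrozenSet


@dataclass(frozen=True)
class Link:
    url: str


@dataclass(frozen=True)
class Candidate:
    source_link: Link


def links_match(link: Link, other: Link) -> bool:
    # Stand-in comparison; the real thing would normalize URLs,
    # strip fragments, and so on.
    return link.url == other.url


@dataclass(frozen=True)
class Constraint:
    links: FrozenSet[Link] = frozenset()

    def __bool__(self) -> bool:
        # The empty constraint is falsy.
        return bool(self.links)

    def __and__(self, other: "Constraint") -> "Constraint":
        # Merging constraints just unions their Link sets; no failure here.
        return Constraint(self.links | other.links)

    def is_satisfied_by(self, candidate: Candidate) -> bool:
        # Every Link must match the candidate's source_link, so a merged
        # constraint holding two distinct URLs can never be satisfied --
        # the conflict surfaces here instead of being frontloaded.
        return all(
            links_match(link, candidate.source_link) for link in self.links
        )
```

The point of deferring the failure to is_satisfied_by is that __and__ stays trivial, and the "two URLs can't both hold" rule falls out of the all() check for free.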

As I was typing this up, some questions occurred to me: first, how should a Link's requires_python field interact with this matching? And second, how does the current implementation handle non-URL constraints?

Okay, looking into the first, this looks a little gnarly. requires_python is evidently a field on Link, so it's definitely going to show up alongside URL constraints, but the only sensible interpretation I can see is to treat a Link as matching all candidates when the Python version doesn't line up, so it can never cause a spurious conflict. The other option would be to evaluate Links as they come in, and I'm really not sure how to do that.
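In code, the interpretation I'm leaning toward would look something like this. python_applies is a toy stand-in for the specifier check pip actually does through the packaging library, and only understands ">=X.Y" to keep the sketch runnable:

```python
import sys
from dataclasses import dataclass
from typing import Optional


@dataclass(frozen=True)
class Link:
    url: str
    requires_python: Optional[str] = None


def python_applies(requires_python: Optional[str],
                   version=sys.version_info[:2]) -> bool:
    # Toy stand-in for a real specifier check (pip goes through
    # packaging's SpecifierSet); this only handles ">=X.Y".
    if requires_python is None:
        return True
    major, minor = requires_python.removeprefix(">=").split(".")
    return version >= (int(major), int(minor))


def link_matches(link: Link, candidate_link: Link) -> bool:
    # If this interpreter can't use the link at all, treat it as
    # matching every candidate, so it never causes a spurious conflict.
    if not python_applies(link.requires_python):
        return True
    return link.url == candidate_link.url
```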

As for the second, I'll need to figure out what the current implementation does for non-URL constraints/requirements, since it looks like it already treats them the way I expected, in which case I may not need to do anything... Okay, got it. There's logic that runs for constraints; I'm not sure how it filters out constraints whose Python version doesn't apply, but experimentally, it does. That means only relevant constraints will reach the resolver, and I don't have to do any extra legwork on them once they show up.
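As I understand it from experimenting, the filtering amounts to something like this toy version. The markers here are plain predicates standing in for PEP 508 environment markers, which is not how pip actually represents them:

```python
import sys
from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple


@dataclass(frozen=True)
class ConstraintLine:
    name: str
    # Standing in for a PEP 508 environment marker string; real code
    # would parse and evaluate the marker itself.
    marker: Optional[Callable[[Tuple[int, int]], bool]] = None


def relevant_constraints(lines: List[ConstraintLine],
                         version=sys.version_info[:2]) -> List[ConstraintLine]:
    # Drop any constraint whose marker doesn't apply to this
    # interpreter; only the survivors ever reach the resolver.
    return [c for c in lines if c.marker is None or c.marker(version)]
```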

Random bit of advice for anyone taking the same approach as me to figuring out how pip works: the attrs package has a bunch of versions, and no dependencies. I haven't touched dependencies-of-dependencies, or anything like that, yet. That's going to be a whole thing.

Okay, so setting up the initial prototype for this should be relatively straightforward. We shall see. Later.

Good night.