There are a couple different factors to consider when profiling code. For this, let's first go over the architecture.
There is an abstract representation of the network, which can be manipulated in various ways, including differentiation. This network has to be built up initially. The network then has to be paired with a predefined evaluation strategy, and this will produce functions that operate on weights and data to produce either an output or a training gradient.
We can conceptually separate this into two or three phases.
- First, we build the network using a series of loops.
- Then, we combine an evaluator with the output nodes, to get functions.
- Then, we evaluate those functions, many times.
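To make the phases concrete, here's a minimal sketch in plain Python. All of the names (`build_network`, `compile_forward`) and the representation are made up for illustration; the real architecture is more involved, but the shape of the three phases is the same.

```python
def build_network(layer_sizes):
    # Phase 1: build an abstract description of the network.
    # Here that's just a list of weight-matrix shapes, e.g. [(2, 3), (3, 1)].
    return list(zip(layer_sizes, layer_sizes[1:]))

def compile_forward(shapes):
    # Phase 2: pair the description with an evaluation strategy,
    # producing a function over weights and data.
    def forward(weights, x):
        for w in weights:
            x = [sum(wi * xi for wi, xi in zip(row, x)) for row in w]
        return x
    return forward

shapes = build_network([2, 3, 1])
forward = compile_forward(shapes)

# Phase 3: call the compiled function many times (this is the hot path).
weights = [
    [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]],  # 3x2
    [[1.0, 1.0, 1.0]],                      # 1x3
]
out = forward(weights, [2.0, 3.0])  # → [10.0]
```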
In terms of high-level usage, we can probably view the first two steps as a single step.
For the initial, say, "construction" phase, we want to make sure we're not outrageously slow. And if we use techniques like caching, we want to make sure we don't put ongoing memory pressure on the system, or spike "too high" during construction.
For the "main" phase, we want to go as fast as possible, and avoid increasing memory pressure with time.
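Both of those concerns can be measured with the standard library alone. This is a sketch of the kind of harness I have in mind, with `evaluate` standing in for one of the compiled network functions: `timeit` for the speed of the main phase, `tracemalloc` for peak memory during a burst of evaluations.

```python
import timeit
import tracemalloc

def evaluate(x):
    # Stand-in for a compiled network function; the real one
    # would take weights and data.
    return sum(i * x for i in range(1000))

# Speed: time many evaluations, taking the best of several runs
# to reduce scheduling noise.
elapsed = min(timeit.repeat(lambda: evaluate(3.0), number=10_000, repeat=3))

# Memory: trace allocations during a burst of evaluations and
# check that peak usage stays bounded (i.e. doesn't grow with time).
tracemalloc.start()
for _ in range(10_000):
    evaluate(3.0)
current, peak = tracemalloc.get_traced_memory()
tracemalloc.stop()

print(f"10k calls: {elapsed:.3f}s, peak traced memory: {peak} bytes")
```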
I can't figure out, just by thinking about it, what kind of numbers I want to achieve here. And, as I said, my laptop is probably not so great for this anyway. So, in the spirit of learning by doing, I'll just try to get statistics for various versions and focus on quantifying the differences between them.
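Quantifying differences between versions mostly means reporting a mean and a spread rather than a single number, so two runs can actually be compared. A rough sketch, with two hypothetical versions of the same computation standing in for two versions of the code:

```python
import statistics
import timeit

def version_a(n):
    # One candidate implementation.
    return sum(i * i for i in range(n))

def version_b(n):
    # A second candidate; same result, different strategy.
    total = 0
    for i in range(n):
        total += i * i
    return total

def sample(fn, n=1000, number=1000, repeat=5):
    # Per-call times in microseconds, one entry per repeat.
    return [t / number * 1e6 for t in
            timeit.repeat(lambda: fn(n), number=number, repeat=repeat)]

a, b = sample(version_a), sample(version_b)
print(f"A: {statistics.mean(a):.2f} ± {statistics.stdev(a):.2f} µs")
print(f"B: {statistics.mean(b):.2f} ± {statistics.stdev(b):.2f} µs")
```

If the intervals overlap heavily, the difference probably isn't worth acting on.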
I'd like to get more done with all this, but I had a really rough day and I probably shouldn't be looking at my screen right now.
One thing I'm thinking about trying out with all of this is Nox. I've been using tox for my other Python projects, and while my hand-rolled tox.ini files are reasonable, the stuff that cookiecutter-pylibrary does with Jinja... is not. So, since I've got some new testing I want to do, I figured I'd try out Nox and see whether it's a better fit, and whether I should migrate some of my projects over to it.
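Part of Nox's appeal is that the configuration is just Python instead of ini-plus-templating. Something like this is the minimal shape of a `noxfile.py`; the Python versions and dependencies here are placeholders, not what this project will actually need:

```python
# noxfile.py -- minimal sketch; versions and deps are illustrative.
import nox

@nox.session(python=["3.10", "3.11"])
def tests(session):
    # Each session gets its own virtualenv, like a tox env.
    session.install("pytest")
    session.install("-e", ".")
    session.run("pytest")
```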
Tomorrow, though, because I am done. Ugh. Sorry to you and to me that today ended up like this.