Project Success is a Sum of Leaky ReLUs

In neural networks there is a function called the leaky rectified linear unit (leaky ReLU). It's just a complicated name for a piecewise linear function with a shallow part and a steep part.
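In code, that's all there is to it (0.01 is a common choice for the shallow slope, but it's a tunable parameter, not a fixed constant):

```python
def leaky_relu(x: float, alpha: float = 0.01) -> float:
    """Slope 1 for positive inputs (steep), slope alpha for negative inputs (shallow)."""
    return x if x >= 0 else alpha * x
```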

A project (e.g. a recording of a song, a research publication, a talk) consists of good and bad elements. A good element is one that enhances the project, like a strong idea, good writing, or a good instrumental performance. A bad element is one that reduces the enjoyment or persuasiveness of a project, like a bad vocal tone, confusing slides, or limited experiments.

Good and bad elements often make a leaky ReLU contribution to the overall quality of the project. The bad elements, when they are obvious, really detract from overall quality, and addressing them even a little (e.g., by EQ-ing a recording, redesigning slides, adding experiments) is a great use of time and effort. The good elements act the opposite way: enhancing them initially gives great returns on the time and effort, but beyond a point further polish has little impact on overall quality. Conceptually, the positive and negative contributions that good and bad elements make to a project look like this:

[Figures: the positive impact of good elements and the negative impact of bad elements, each steep on one side of the bend and shallow on the other.]
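To make the "sum" part concrete, here's a toy sketch where each element's score is centred so that 0 marks the bend in its curve (the element names, scores, and slopes are made-up numbers for illustration, not measurements):

```python
# Hypothetical illustration: overall quality as a sum of leaky-ReLU-shaped
# contributions. Scores are centred so that 0 is the bend: below it, changes
# pay off at the steep rate; above it, further polish pays off at the shallow rate.
def contribution(score: float, steep: float = 1.0, shallow: float = 0.1) -> float:
    return steep * score if score < 0 else shallow * score

elements = {"vocal tone": -2.0, "slides": -0.5, "writing": 1.5}  # made-up scores
quality = sum(contribution(s) for s in elements.values())
print(quality)  # ≈ -2.35: the obviously bad elements dominate the total
```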

Following this model, time management in a project comes down to identifying where on its ReLU curve each element lives. Get it wrong and the bad aspects of the project will undermine its success, while the good aspects will not be enough to make up for them. Get it right and productivity could improve by a factor of up to the ratio of the steep and shallow slopes of the ReLU component lines.
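One hypothetical way to act on this: spend each unit of effort on whichever element currently sits on the steepest part of its curve. In this toy version (same made-up numbers as above), work automatically flows to the obvious flaws first and away from already-polished strengths:

```python
# Greedy effort allocation under the sum-of-leaky-ReLUs model: each unit of
# effort goes to the element whose contribution curve is currently steepest.
def slope(score: float, steep: float = 1.0, shallow: float = 0.1) -> float:
    return steep if score < 0 else shallow

scores = {"vocal tone": -2.0, "slides": -0.5, "writing": 1.5}
for _ in range(30):  # 30 units of effort, each raising a score by 0.1
    target = max(scores, key=lambda e: slope(scores[e]))
    scores[target] += 0.1

print(scores)  # the flaws get fixed up to the bend before any strength gets more polish
```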