Chapter 11. Lesson 11: Scaling up the Personal Software Process

Table of Contents
Program 10a: Multiple Regression

Comprehend the difficulties of scale in software development. Understand that as product sizes grow, process complexity grows as well. Comprehend how to relate the PSP to larger processes and products.

Read chapter 11 of the textbook

Write program 10a, using PSP3


Chapter 11 of the text focuses on a topic near to any developer's heart: the problem of scale. The PSP as we have explored it so far is built around a single-pass development lifecycle with clearly delineated code/compile/test phases. While this works well for the PSP exercises (which have typically been under 250 LOC), it obviously will not work for even a medium-sized (around 20 KLOC) project, much less a truly gargantuan system.

Humphrey devotes some of this chapter to the "elements of intellectual work" [Humphrey95], which he enumerates as memory, skill, and method. The section on the workings of expert memory-- which references a famous study in which expert chess players could easily remember chessboards showing games-in-progress but did little better than average on randomly-placed pieces-- is an interesting one, drawing attention to the brain's ability to "chunk" information. This capacity for chunking, in turn, speaks well for usable abstractions such as those provided by a well-designed class library or a simple-but-consistent language. The skill and method sections I found less interesting, but still notable.

Humphrey next enumerates the "stages of product size," from very small program elements to "massive multisystems that involve many autonomous or loosely federated projects" [Humphrey95], and this is an important concept-- because not only do the sizes of products change, the entire approach to producing them, and to organizing the work, must change as well. The problems of these projects do not grow linearly with size, but almost exponentially.

This brings up an interesting point about the PSP: we are using linear regression procedures to analyze our historical data and extrapolate outcomes from them. How well, then, would this scale to very large projects, where development time, defect rates, and so on do not follow a linear progression? Humphrey's answer seems to be the adoption of an incremental, accretive lifecycle such as the spiral model; while I very much approve of iterative lifecycles, I'm not sure I'm satisfied with their predictive ability. In any case, the PSP3 process adds the concept of iterative cycles to the PSP, breaking the build into subprojects of between 100 and 300 lines of new/changed source code, favoring the low end of that spectrum (which means that some of our earlier PSP programs could well have qualified, and I certainly agree!). Oddly enough, the PSP3 process does no end-of-cycle postmortems and makes only a single up-front estimate; I would have thought that end-of-cycle postmortems should be used to adjust the overall estimate, but that's something that could certainly be added in the future.
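The linear extrapolation in question can be sketched in a few lines. This is a minimal illustration of the least-squares regression step as the PSP applies it to historical size/time data, not a full PROBE implementation; the function name and the historical data points are invented for illustration:

```python
# Sketch: fit actual hours against estimated new/changed LOC from
# historical projects, then extrapolate the time for a new program.
# Data values below are hypothetical, not real PSP exercise results.

def fit_linear(xs, ys):
    """Least-squares fit y = b0 + b1*x over paired historical data."""
    n = len(xs)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    b1 = (sum(x * y for x, y in zip(xs, ys)) - n * x_bar * y_bar) / \
         (sum(x * x for x in xs) - n * x_bar * x_bar)
    b0 = y_bar - b1 * x_bar
    return b0, b1

# Hypothetical history: (estimated new/changed LOC, actual hours)
est_loc = [120, 180, 150, 250, 200]
hours   = [6.0, 9.5, 7.5, 13.0, 10.5]

b0, b1 = fit_linear(est_loc, hours)
predicted_hours = b0 + b1 * 300  # extrapolate to a 300-LOC program
```

The fragility the chapter hints at is visible here: the prediction is a straight-line extrapolation, so if effort actually grows superlinearly with size, estimates for projects far outside the historical range will be systematically optimistic.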

Since my current project is nearing the 20 KLOC point, and since future projects will likely be significantly larger, the problem of scaling process (both in the size of the product and the size of the team involved) fascinates me. I'm very curious to see how this turns out.