Yannis' Law: Programmer Productivity Doubles Every 6 Years
I keep hearing aphorisms about the "software crisis" and the lack of progress in software development. I have been programming for over 15 years, and I find such claims to be completely false: I am convinced that I could reproduce with today's tools the work of a competent programmer of 15 years ago in a small fraction of the time.
By analogy to Moore's law and (more appropriately, because of its intention to provoke, rather than predict) Proebsting's law, I propose that programmer productivity doubles every 6 years.
The year is 2003, and I would not consider a programmer to be good (this includes familiarity with tools) if they cannot produce the KWIC system (the keyword-in-context indexing program made famous by Parnas's 1972 modularization paper) within an hour or two, instead of a week or two in 1972. This constitutes an increase in productivity by a factor of 40 over the course of 31 years, or over 12.5% per year, which results in a doubling of productivity roughly every 6 years.
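The arithmetic behind the claim can be checked with a few lines; this is just a back-of-the-envelope sketch using the figures stated above (a 40x gain over the 31 years from 1972 to 2003):

```python
import math

gain = 40    # an hour or two today vs. a week or two in 1972
years = 31   # 1972 -> 2003

# Compound annual improvement implied by a 40x gain over 31 years
annual_rate = gain ** (1 / years) - 1

# Years needed for productivity to double at that rate
doubling_time = math.log(2) / math.log(1 + annual_rate)

print(f"{annual_rate:.1%} per year")                 # ~12.6% per year
print(f"doubles every {doubling_time:.1f} years")    # ~5.8, i.e. about 6 years
```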
This impressive progress is arguably the cumulative result of reusable software entities, better system tools, better programming languages, better CS education, but also good use of faster machines that allow us to ignore low-level overheads and favor slightly less efficient but convenient solutions.
This reminds me a little of what Chuck Darrah discovered about Silicon Valley dual-career families: it wasn't that technology eased their lives so much as it made it possible to take on more responsibilities. Perhaps the software crisis isn't a crisis per se, but more a matter of changing expectations as we gain the capability to produce more sophisticated features, as we compete more (thanks to the web and the Internet), and as systems are used in more situations (especially where they were not expected to be used; just see comp.risks for examples).
Proebsting's Law: Compiler Advances Double Computing Power Every 18 Years
I claim the following simple experiment supports this depressing claim. Run your favorite set of benchmarks with your favorite state-of-the-art optimizing compiler. Run the benchmarks both with and without optimizations enabled. The ratio of those numbers represents the entirety of the contribution of compiler optimizations to speeding up those benchmarks. Let's assume that this ratio is about 4X for typical real-world applications, and let's further assume that compiler optimization work has been going on for about 36 years. These assumptions lead to the conclusion that compiler optimization advances double computing power every 18 years. QED.
This means that while hardware computing horsepower increases at roughly 60% per year, compiler optimizations contribute only about 4% per year. Basically, compiler optimization work makes only marginal contributions.
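The same back-of-the-envelope arithmetic, applied to the assumptions stated above (a 4x speedup credited to 36 years of optimization work, versus roughly 60%/year hardware growth), gives the 18-year figure:

```python
import math

speedup = 4   # assumed ratio of optimized to unoptimized benchmark times
years = 36    # assumed span of compiler optimization research

# Compound annual contribution of compiler optimizations
compiler_rate = speedup ** (1 / years) - 1
compiler_doubling = math.log(2) / math.log(1 + compiler_rate)

# Hardware growth at ~60%/year, for comparison
hw_rate = 0.60
hw_doubling = math.log(2) / math.log(1 + hw_rate)

print(f"compilers: {compiler_rate:.1%}/year, "
      f"doubling every {compiler_doubling:.0f} years")   # ~3.9%, ~18 years
print(f"hardware: doubling every {hw_doubling:.1f} years")  # ~1.5 years
```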
Perhaps this means Programming Language Research should be concentrating on something other than optimizations. Perhaps programmer productivity is a more fruitful arena.