Here's a summary of the themes expressed in the Hacker News discussion:
The Enduring Relevance and Misinterpretation of Knuth's "Premature Optimization"
A central theme is the enduring relevance of Donald Knuth's ideas, particularly the oft-quoted "premature optimization is the root of all evil," but also the widespread misinterpretation and misuse of this advice. Many commenters agree that the original context and full quote are crucial.
- mjd calls it "my all-time favorite paper" and notes how much "still applies to everyday programming and language design."
- subharmonicon believes the famous quote is widely misunderstood and suggests people "take the time to go back to the source and read it since it’s a wonderful read."
- godelski argues that the full quote, including the part about not passing up opportunities in the "critical 3%," clarifies that Knuth defines "Premature Optimization" as "optimizing before you profile your code."
- hinkley rails against how the quote is used to "shut down dialog on improving code (not just performance)."
- globular-toast expresses surprise, stating, "I've always understood it to mean exactly what it's supposed to mean, namely don't optimise something until you've shown that it's actually needed. I honestly don't know how it could be interpreted any other way."
- osigurdson points out that "reasoning by unexamined phrases" is "the real root of all evil."
The Nuances of Optimization: Profiling, Bottlenecks, and "Doing Nothing"
A significant discussion revolves around the practicalities of optimization, emphasizing the importance of profiling, identifying actual bottlenecks, and even questioning whether a task needs to be done at all.
- godelski reiterates Knuth's emphasis on profiling: read in context, "premature optimization" means optimizing before you profile your code (see the profiling sketch after this list).
- The concept of Amdahl's Law is frequently invoked as a fundamental principle for understanding performance limitations in parallel systems (a worked sketch also follows this list). Swizec calls it "the single best thing I learned in 4 years of university" and explains, "No amount of parallelization will make your program faster than the slowest non-parallelizable path."
- ilc stresses the importance of considering the "cost of doing the optimization vs. its impact" and understanding "what the business value of that impact is."
- hinkley suggests revisiting assumptions: "Re-ask all of those questions. Why is this part single threaded? Does the whole thing need to be single threaded? What about in the middle here? Can we rearrange this work?"
- foobiekr posits that "most real optimization is in the semantics of the system and - frankly - not optimizing things but finding ways not to do them at all," giving the example of eliminating an RPC call outright rather than tuning its parser.
- sfn42 echoes this, stating, "The problem for these kinds of applications arises when it's not doing 5 times as much work as it needs to but 5000 times or 5 million times."
- bluGill, however, cautions against "premature pessimization," noting that "any competent language has sort in the standard library that is better than bubble sort."
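To make the "profile first" point concrete, here is a minimal sketch — not from the thread, with invented function names — using Python's built-in cProfile and pstats modules:

```python
import cProfile
import pstats

def parse_records(lines):
    # Suspected hot spot: per-line string splitting.
    return [line.split(",") for line in lines]

def summarize(records):
    # The actual hot spot may be elsewhere; only the profile knows.
    return sum(len(r) for r in records)

def main():
    lines = ["a,b,c"] * 100_000
    return summarize(parse_records(lines))

# Profile first; only then decide what (if anything) to optimize.
cProfile.run("main()", "out.prof")
stats = pstats.Stats("out.prof")
stats.sort_stats("cumulative").print_stats(5)
```

The point is the workflow, not the code: sort the profile and spend effort only on the few entries that dominate it — Knuth's "critical 3%."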
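And a hedged sketch of the arithmetic behind Swizec's Amdahl's Law point. The numbers (a 90%-parallelizable workload) are illustrative assumptions, not figures from the discussion:

```python
def amdahl_speedup(parallel_fraction: float, workers: int) -> float:
    """Theoretical speedup when `parallel_fraction` of the work
    scales across `workers` and the remainder stays serial."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / workers)

# A 10% serial path caps speedup at 10x, no matter how many workers.
for n in (2, 8, 64, 1_000_000):
    print(n, round(amdahl_speedup(0.9, n), 2))
# 2 -> 1.82, 8 -> 4.71, 64 -> 8.77, 1000000 -> 10.0
```

Even with a million workers, the serial 10% bounds the speedup at 10x — exactly the "slowest non-parallelizable path" Swizec describes.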
The Evolution of Computing and its Impact on Optimization Strategies
Several commenters discuss how the vastly increased capabilities of modern hardware and software development practices have changed the landscape of optimization compared to Knuth's era.
- wewewedxfgdf dismisses the paper's relevance due to the changes in computing, stating, "It was written when people were writing IBM operating systems in assembly language."
- hinkley elaborates on this significant shift: "Here we are sitting at four to seven orders of magnitude separated from Knuth, depending on whether you mean number of devs or number of machines or size of problems tackled."
- kragen provides concrete numbers: "Today you can put 6 tebibytes in a 384-core two-socket AMD server that can do in excess of 10 trillion 32-bit operations per second: 6 orders of magnitude more RAM, 7 orders of magnitude more arithmetic."
- bluGill notes that "compilers/optimizers were not nearly as good as today, and CPUs were much more deterministic," implying that many small optimizations Knuth discussed might now be handled by the compiler.
- ethan_smith clarifies that the quote wasn't from a dedicated optimization paper but from a broader discussion on methodology.
- kragen points out that the "efficiency" and "a searching example" sections are "minor details in the paper, despite being so eloquently phrased that it's the part everyone quotes." He emphasizes that the paper is "mostly... about control structures, and in particular how we can structure our (imperative!) programs to permit formal proofs of correctness."
The Interplay Between "Laziness," Efficiency, and Good Engineering Practices
A recurring theme is the ideal of "lazy" engineering, where upfront effort to create efficient and well-structured code prevents much larger problems later. This is contrasted with a more passive, "fix-it-later" mentality.
- godelski laments the loss of the "efficient lazy" mindset, where "recognizing that doing the dishes now is easier than doing them tomorrow" is key, contrasting it with "typical lazy" which leads to accumulating problems.
- dan-robertson criticizes the "fix it later" approach to systems design, arguing that it's "very hard to ‘optimise’ later."
- godelski agrees, stating that "tech debt" is precisely this approach, where compounding issues make fixing things "far more effort than it would have taken to fix it early."
- sfn42 advocates for a proactive approach, saying, "I just think about what I'm doing and design things to be fast. I consider the time complexity of the code I'm writing." (A small sketch of this habit follows the list.)
- sgarland agrees that this is reasonable "IFF you understand computing fundamentals, and IFF you have a solid understanding of DS&A" (data structures and algorithms).
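As a small illustration of the "design it to be fast as you write it" habit sfn42 and sgarland describe — a sketch with invented data shapes, not code from the thread — here is the kind of trivial-effort complexity fix in question:

```python
def active_events_slow(events, active_ids):
    # O(n * m): each `in` test scans the whole list of ids.
    return [e for e in events if e["user_id"] in active_ids]

def active_events_fast(events, active_ids):
    # O(n + m): build the set once; each lookup is O(1) on average.
    active = set(active_ids)
    return [e for e in events if e["user_id"] in active]

events = [{"user_id": i % 1000} for i in range(10_000)]
ids = list(range(500))
assert active_events_slow(events, ids) == active_events_fast(events, ids)
```

Same result for one line of forethought: the set version does O(n + m) work instead of O(n·m), which is precisely the "5000 times as much work" class of problem sfn42 mentions.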
The Decline of Foundational CS Knowledge and its Impact
Several participants lament that core computer science concepts, such as data structures and algorithms, seem to be less understood or applied in modern development.
- naniwaduni suggests that "the entire body of discourse around structured programming is totally lost on modern programmers failing to even imagine the contrasts."
- runevault finds it "interesting that there's so much discourse about the effort people have had to put into data structure and algorithm stuff for interviews, but then people refuse to take advantage of the knowledge studying that gives you towards trivial effort optimizations."
- sfn42 reiterates the point, saying, "You see people saying things like 'It's pointless to memorize these algorithms I'll never use' - you're not supposed to memorize the specific algorithms. You're supposed to study them and learn from them, understand what makes them faster than the others and then be able to apply that understanding to your own custom algorithms."
- IshKebab expresses concern about developers choosing "an architecture or programming language (cough Python) that is inherently slow" with the expectation of optimizing later, leading to significant problems.
The Misuse of Analogies and Idioms in Technical Discussions
A side theme involves other overused or misinterpreted phrases beyond Knuth's optimization quote, highlighting a tendency to rely on shallow understanding.
- chinchilla2020 points out the misinterpretation of the "shouting fire in a crowded theatre" quote.
- godelski offers a definition for cliches: "A cliche is a phrase that's so obvious everyone innately knows or understands it; yet, it is so obvious no one internalizes it, forcing the phrase to be used ad nauseam." This is applied to concepts like Amdahl's Law and "premature optimization."
- hinkley compares the misuse of the Knuth quote to the aphorism "Curiosity killed the cat, but satisfaction brought it back."
- motorest criticizes hinkley for using "strawman arguments" and framing his points as "obvious improvements that can only be conceivably criticized by anyone who is against good code and in favor of bad code."
- osigurdson invokes Voltaire: "A clever saying proves nothing."