Essential insights from Hacker News discussions

Many hard LeetCode problems are easy constraint problems

Here's a summary of the themes from the Hacker News discussion:

The Utility and Appropriateness of Constraint Solvers in Interviews

A significant portion of the discussion revolves around whether using a constraint solver is a valid approach for interview problems, particularly those found on platforms like LeetCode.

  • Support for Constraint Solvers: Some users see using a constraint solver as a positive indicator of a candidate's knowledge and ability to find efficient solutions. Analemma_ states, "I'd count it as a plus if the candidate reached for a constraint solver. They're criminally underused in real-world software engineering and this would show the candidate probably knows how to get the right answer faster instead of wasting a bunch of time." PartiallyTyped agrees: "It’d be a positive in my book if they used a constraint solver." qnleigh argues that reaching for a constraint solver to "get something working quickly and then profile later" is a good approach, especially if the problem is tricky. Der_Einzige argues strongly for their utility, noting, "If someone solves a leetcode hard with a constraint solver and you don't hire them, you are an idiot," and emphasizes how rarely engineers even know about constraint solvers. yogorenapan also shares positive experiences using them to win hackathons and land jobs, highlighting their development speed.
  • Criticism of Constraint Solvers for Interview Problems: Many users argue that constraint solvers are not the intended solution for typical interview coding challenges. YetAnotherNick mentions, "General constraint solver would be terribly inefficient for problems like these. It's a linear problem and constraint solver just can't handle O(10^6) variables without some beefy machine." alternator echoes this, stating, "in an interview, it’s almost by definition not the right solution," and that they can be "dreadfully slow... compared to just a simple dynamic program."
  • The "Right Tool for the Right Job" vs. Interviewer Intent: There's a tension between the principle of using the best tool available (constraint solver) and what interviewers are actually trying to assess. taylodl's initial sentiment "Use the right tool for the right job!" sets the stage for this debate. Some feel interviewers want candidates to demonstrate a specific type of problem-solving or algorithmic knowledge rather than leverage external tools.
  • Demonstrating "Cleverness" vs. Breadth of Knowledge: Several participants believe interview problems are designed to test specific algorithmic insights or "cleverness," which a constraint solver bypasses. kccqzy states, "The point of these problems is to test your cleverness. That's it. Presenting a not-clever solution of using constraint solvers shows that you have experience and your breadth of knowledge is great. It doesn't show any cleverness."
  • Interviewers' Expectations and Goals: The underlying purpose of such problems in interviews is debated. For some, it's about assessing algorithmic knowledge and pattern recognition (theflyinghorse, tomas789, another_twist). For others, it's a flawed proxy for problem-solving ability or even a way to identify candidates who fit a certain mold (cratermoon).
  • Hybrid Approaches: Some suggest a good candidate might mention a constraint solver as a possibility but still proceed with a more traditional algorithm. StefanBatory proposes, "Would a good answer be 'I can do it as a constraint problem, but since I guess you are not asking for this, the solution is...' and then proceed as usual?"
  • Real-World vs. Interview Context: The disconnect between interview problem-solving and real-world engineering is a recurring theme. OutOfHere remarks, "At this point, job interviews are so far removed from actual relevance." gnfargbl adds, "In any real engineering situation I can solve 100% of these problems. That's because I can get a cup of coffee, read some papers, look in a textbook... and yes, use tooling like a constraint solver."
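The declarative style the pro-solver commenters describe can be sketched with a toy example: LeetCode 416 ("Partition Equal Subset Sum") framed as a constraint problem. This is a minimal brute-force sketch, not a real solver — the function name `can_partition` is illustrative, and an actual solver (Z3, MiniZinc, OR-Tools) would prune the search space far more intelligently than this exhaustive enumeration does:

```python
# Illustrative sketch: partition a list into two equal-sum subsets, framed
# declaratively. One binary decision variable per element says which subset
# it joins; the single constraint is that the chosen subset sums to half
# the total. We enumerate every assignment to keep the sketch self-contained.
from itertools import product

def can_partition(nums):
    """Return True if nums splits into two subsets with equal sums."""
    total = sum(nums)
    if total % 2:                      # odd total: no equal split exists
        return False
    target = total // 2
    for assignment in product((0, 1), repeat=len(nums)):
        if sum(n for n, pick in zip(nums, assignment) if pick) == target:
            return True
    return False
```

The appeal of the declarative framing is that the constraints (one binary choice per element, equal sums) are stated directly rather than derived cleverly — but as YetAnotherNick warns above, naive search blows up exponentially, which is exactly where real solvers earn their keep.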

The Nature and Purpose of Coding Interviews

Beyond the specific debate about constraint solvers, the discussion delves into broader criticisms and defenses of technical interviews, particularly those involving algorithmic puzzle-solving.

  • LeetCode as a "Game" and Memorization: Many users view LeetCode challenges as a gamified system that rewards memorization of patterns rather than genuine problem-solving ability. corimaith asserts, "No it's just memorization of 12 or so specific patterns. The stakes are too high that virtually everyone going in will not be staking passing on their own inherent problem solving ability. LeetCode has been so thoroughly gamified that it has lost all utility of differentiability beyond willingness to prepare." roadside_picnic notes, "Last round I did at Meta it was clearly to test that you grinded their specific set of problems, over and over again, until you could reproduce them without thinking."
  • "Willingness to Prepare" as a Skill: A counterargument is that "willingness to prepare" is itself a valuable trait for an employee, and LeetCode effectively tests this. jkubicek supports this: "In defense of questions like this, 'willingness to prepare' is a significant differentiator." another_twist agrees, seeing it as a skill that can be learned and applied.
  • Bias and Accessibility: Concerns are raised about the fairness and accessibility of these interview formats. bradlys mentions the potential for biases: "I interview in silicon valley and I'm a mixed race American. ... I think a lot of people just don't want me to pass the interview and will put up the highest bar they can." Herring states, "research says the interview process should match the day to day expectations as closely as possible... All these brain teasers are low on signal, not to mention bad for women and minorities."
  • Time Constraints and Life Responsibilities: The pressure of time-boxed interviews is seen as a barrier, especially for individuals balancing work with personal responsibilities. tjpnz points out, "That willingness to prepare doesn't reconcile with the realities of parenthood and all of the other responsibilities someone in their thirties may have." LordDragonfang and cratermoon suggest that demanding excessive preparation time is a way to screen out candidates with other life commitments.
  • Focus on Understanding Thought Process: Some advocate for interviews that prioritize understanding a candidate's thought process, communication, and how they decompose problems, rather than just the final answer. chaboud emphasizes, "the point is to understand how the candidate thinks, communicates, and decomposes problems. Critically, problem solving questions should have ways to progressively increase and decrease difficulty/complexity, so every candidate 'gets a win' and no candidate 'dunks the ball'." the_af agrees, stating, "The point is whether the candidate knows how to code (without AI), can explain themselves and walk through the problem, explain their thought processes, etc."
  • Real-World Problem Relevance: A common complaint is that LeetCode-style problems do not accurately reflect day-to-day software engineering tasks. viccis notes that many problems are "give me a O(n) runtime O(1) memory algorithm over this array" type challenges that "really doesn't resemble my day to day work at all." x187463 echoes this sentiment, saying, "The leetcode interview doesn't seem to correspond to anything a developer actually does day to day."
  • Alternatives to LeetCode: Suggestions for better interview practices include debugging tasks, discussing past projects and technical decisions, take-home assignments, and problem-solving questions that can be scaled in difficulty. avgDev describes their ideal interview: "what projects have you done - what tech you worked with and some questions about decisions - debugging an issue they encountered before - talking about interests and cultural fit."
  • The "Trick" vs. Fundamental Understanding: There's a sentiment that many interview problems rely on recognizing a specific pattern or "trick" rather than fundamental programming skills. cratermoon articulates this: "Solving LeetCode is more about finding the hidden 'trick' that makes the solution... Look at the problem long enough and realize 'oh that's a sliding window problem' or somesuch known solution."
  • Company Culture and Interview Process: The interview process is seen as a reflection of a company's culture and priorities. _se questions the wisdom of hiring processes that rely on unqualified individuals. cobbzilla contrasts mature companies that give challenging, creative problems with less mature companies that resort to "stupid LeetCode problems," suggesting the latter reinforces a cycle of poor code quality.

Specific Algorithmic Discussions and Notations

A smaller thread of the conversation touches on the specifics of certain algorithms and mathematical notations.

  • Big O Notation Ambiguity: A humorous exchange occurs regarding the meaning of "O" in complexity notation. NoahZuniga's "O(10^6) = O(1)" is clarified by dekhn as referring to "on the order of," not Big O notation. harperlee suggests NoahZuniga might be intentionally denouncing notation abuse, and NoahZuniga further explains that the "O" derives from the German "Ordnung" ("order").
  • Integer Programming: nextos correctly identifies the OP's problem as an "integer programming problem," not a linear one, and suggests DPLL-based solvers as an alternative implementation.
  • Greedy Algorithms: CamperBob2 and smegma2 debate the definition and correctness of greedy algorithms, particularly for coin-change problems. They discuss how a "well-behaved" set of denominations is necessary for a greedy approach to be optimal; on a non-well-behaved set, greedy can return a suboptimal answer.
  • Data Structures and APIs vs. Algorithms: 3vidence argues that "data types & associated APIs are so so much more important than algorithms" in day-to-day programming, preferring flexible data types over those that prioritize raw performance at the cost of brittleness.
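The greedy-vs-optimal coin-change point above can be made concrete with a short sketch (function names are illustrative): for the "well-behaved" US-style denominations, the greedy answer matches the dynamic-programming optimum, but for the set {1, 3, 4} it does not:

```python
def greedy_coins(denoms, amount):
    """Greedy: repeatedly take the largest coin that still fits."""
    count = 0
    for coin in sorted(denoms, reverse=True):
        count += amount // coin
        amount %= coin
    return count if amount == 0 else None

def optimal_coins(denoms, amount):
    """Dynamic programming: minimum coin count for every sub-amount."""
    INF = float("inf")
    best = [0] + [INF] * amount
    for a in range(1, amount + 1):
        for coin in denoms:
            if coin <= a and best[a - coin] + 1 < best[a]:
                best[a] = best[a - coin] + 1
    return best[amount] if best[amount] < INF else None

# Well-behaved set: greedy_coins([1, 5, 10, 25], 63) == 6, same as optimal.
# Non-well-behaved set {1, 3, 4}, amount 6: greedy takes 4+1+1 (3 coins),
# while the optimum is 3+3 (2 coins).
```

This is the distinction the commenters draw: greedy is only correct when the denomination set guarantees that the locally largest choice never forecloses a better global solution.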

Broader Societal and Career Implications

Some users connect the interview practices back to larger societal trends and career progression.

  • "Selecting for people like themselves": cratermoon suggests that interviewers may select for candidates who are similar to themselves, leading to a lack of diversity. erikerson prompts for clarification on how this manifests (race, wealth, etc.).
  • Desperation and Willingness to Grind: bee_rider posits that candidates who grind LeetCode while hating it signal desperation, while those who enjoy it might be "premium nerds."
  • Erosion of Skill Assessment: mjr00 laments that the focus has shifted from understanding how candidates think to a rote memorization hurdle, with interviewers and candidates engaging in a superficial exchange of regurgitated code.
  • Career Hindrance: erikerson expresses frustration that despite extensive education and experience, their career has been "significantly derailed by this social phenomenon" (referring to the interview process).
  • Hakan Kjellerstrand and Constraint Programming Resources: atilimcetin provides a valuable link to Hakan Kjellerstrand's website, a recognized authority on constraint programming, for those interested in learning more.
  • Hubris and Estimating: binarymax recounts an experience where not knowing about constraint solvers led to underestimating a scheduling problem, serving as a lesson in hubris and the importance of accurate estimation.
  • The "Theft" of Opportunity: mepiethree likens coding interviews to a test of one's ability to coherently repeat learned information, rather than synthesize it in a crisis.
  • Contracting as an Alternative: faangguyindia suggests becoming a contractor as a way to bypass LeetCode-style interviews.
  • The Value of Clarifying Questions: tracker1 criticizes automated interview screens precisely because they prevent clarifying questions, which are essential for their problem-solving approach. garrettgarcia and another_twist counter that asking clarifying questions is a crucial part of interviews and a skill in itself.