Essential insights from Hacker News discussions

ML needs a new programming language – Interview with Chris Lattner

Here are the key themes emerging from the Hacker News discussion about Mojo:

Performance and Lower-Level Control

A significant portion of the discussion revolves around Mojo's stated goal of providing C++-like performance with the ease of use of a higher-level language. This includes the ability to write "state of the art kernels" directly, an advantage over languages like Python, which typically rely on underlying C++ kernels (whether Julia belongs in that group is disputed below; a short kernel sketch follows this list).

  • One user notes, "I think Mojo's cool and there's definitely a place for a modern applications programming language with C++ class(ish) performance, aka what Swift wanted to be but got trapped in the Apple ecosystem".
  • Regarding kernel programming, a user states, "You don't write kernels in Julia."
  • Another user contrasting Mojo with Julia says, "Mojo to me looks significantly lower level, with a much higher degree of control."
  • Conversely, someone points out that Julia does have capabilities in this area: "Im pretty sure Julia does JIT compilation of pure Julia to the GPU: https://github.com/JuliaGPU/GPUCompiler.jl".
  • And another user emphasizes Julia's ability to compile down to kernels: "The package https://github.com/JuliaGPU/KernelAbstractions.jl was specifically designed so that julia can be compiled down to kernels."
  • However, a counterpoint suggests a more nuanced view: "I think Julia aspires to be performant enough that you can write the kernels in Julia, so Julia is more like Mojo + Python together. Although I have my doubts that Julia is actually willing to make the compromises which would allow Julia to go that low level. I.e. semantic guarantees about allocations and inference, guarantees about certain optimizations, and more."
  • dsharlet expresses a sentiment about the difficulty of achieving both high-level abstraction and low-level control: "Most people that know this kind of thing don't get much value out of using a high level language to do it, and it's a huge risk because if the language fails to generate something that you want, you're stuck until a compiler team fixes and ships a patch which could take weeks or months."
  • chrislattner himself acknowledges the challenge of bridging these worlds, pointing to the Mojo FAQ entry "Why not make Julia better?".
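
To ground what "writing state of the art kernels" from a high-level language looks like in practice, here is a minimal sketch using Triton, a Python-embedded GPU kernel DSL (the same technology behind the FlagGems and Liger-Kernel projects cited later in this discussion). It illustrates the general technique only, not Mojo code; Mojo's pitch is to offer this degree of control in a single standalone language rather than a DSL embedded in Python.

    # Vector addition as a GPU kernel, written and launched entirely from Python.
    import torch
    import triton
    import triton.language as tl

    @triton.jit
    def add_kernel(x_ptr, y_ptr, out_ptr, n_elements, BLOCK_SIZE: tl.constexpr):
        pid = tl.program_id(axis=0)                  # one program instance per block
        offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
        mask = offsets < n_elements                  # guard the ragged final block
        x = tl.load(x_ptr + offsets, mask=mask)
        y = tl.load(y_ptr + offsets, mask=mask)
        tl.store(out_ptr + offsets, x + y, mask=mask)

    x = torch.rand(4096, device="cuda")
    y = torch.rand(4096, device="cuda")
    out = torch.empty_like(x)
    grid = (triton.cdiv(x.numel(), 1024),)           # number of kernel instances
    add_kernel[grid](x, y, out, x.numel(), BLOCK_SIZE=1024)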

Mojo's AI Focus and Timeliness

The discussion touches on whether Mojo's strong focus on AI is a genuine technical necessity or a response to the current hype cycle. Ultimately, many users agree that the AI market is a significant driving force behind its development and funding.

  • torginus initially mused, "The strong AI focus seems to be a sign of the times, and not actually something that makes sense imo."
  • diggan counters, "Are you sure about that? I think Mojo was always talked about as 'The language for ML/AI', but I'm unsure if Mojo was announced before the current hype-cycle, must be 2-3 years at this point right?"
  • fnands clarifies, "It has been Mojo's explicit goal from the start. It has it's roots in the time that Chris Lattner spent at Google working on the compiler stack for TPUs."
  • ModernMech highlights the financial backing: "You don’t raise $130M at a $600M valuation to make boring old dev infrastructure that is sorely needed but won’t generate any revenue because no one is willing to pay for general purpose programming languages in 2025. You raise $130M to be the programming foundation of next Gen AI. VCs wrote some big friggen checks for that pitch."
  • pama argues for the specific need for ML-focused languages: "The key to machine learning is the massively parallel ability to efficiently train large neural network models, and the key to using the benefits of these trained models is the ability to rapidly evaluate them. ... So if someone finds a way to make the core ML part better with a programming language solution that is certainly very welcome and the title is appropriate."

Licensing and Open Source Concerns

A major point of contention and concern for many users is Mojo's license and the future of its open-source status. The perceived restrictions and the company's control over the language's development are significant deterrents.

  • aavaa states, "Yeah, except Mojo’s license is a non-starter."
  • auggierose elaborates on the license details: "Wow, just checked it out, and they distinguish (for commercial purposes) between CPU & Nvidia on one hand, and other 'accelerators' (like TPU or AMD) on the other hand. For other accelerators you need to contact them for a license."
  • aavaa further expresses skepticism about future open-sourcing: "They say they'll open source in 2026 [1]. But until that has happened I'm operating under the assumption that it won't happen."
  • mdaniel articulates a common fear: "Or, arguably worse: my expectation is that they'll open source it, wait for it to get a lot of adoption, possibly some contribution, certainly a lot of mindshare, and then change the license to some text no one has ever heard of that forbids use on nvidia hardware without paying the piper or whatever."
  • rs186 voices a general principle: "To my naive mind, any language that is controlled by a single company instead of a non profit is a non-starter."
  • ModernMech agrees: "They’re not going to see serious adoption before they open source. It’s just a rule of programming languages at this point if you don’t have the clout to force it, and Modular does not. People have been burned too many times by closed source languages."
  • poly2it adds, "I definitely think the license is a major holdback for the language. Very few individuals or organisation for that matter would like to invest in a new closed stack."
  • blizdiddy strongly criticizes the licensing model: "Mojo is the enshitification of programming. Learning a language is too much cognitive investment for VC rugpulls. You make the entire compiler and runtime GPL or you pound sand, that has been the bar for decades."
  • pjmlp points out the historical context: "For decades, paying for compiler tools was a thing."
  • blizdiddy pushes back on this historical context: "I’d prefer to not touch a hot stove twice. Telling me what processors I can use is Oracle- level rent seeking, and it should be mocked just like Oracle."
  • raggi echoes the sentiment of waiting for open-sourcing: "I can't invest in it or even start discussions around using it until it's actually open."

Comparison with Julia and Python Ecosystem

A significant theme is the comparison of Mojo with existing languages, particularly Julia and Python, highlighting their strengths, weaknesses, and perceived overlaps.

  • tomovo expresses caution due to past negative experiences with Swift: "While I appreciate all his work on LLVM, Chris Lattner's Swift didn't work out so well for me, so I'm cautious about this." Specifically, "super slow compilation times and cryptic error messages really erase any gains in productivity for me."
  • elpakal speculates on the reasons for Lattner's departure from Swift: "I think he left (maybe partly) because of the SwiftUI influence on Swift."
  • drivebycomm criticizes Swift's type system: "Actually, even very basic code can cause it. The type system of Swift has issues."
  • melodyogonna defends Swift's success: "I think Swift is really successful in that there are so many new Apple developers who would use Swift now but wouldn't have used ObjC."
  • Cynddl asks a direct comparison question: "Anyone knows what Mojo is doing that Julia cannot do?"
  • jakobnissen responds, "Mojo to me looks significantly lower level, with a much higher degree of control. Also, it appears to be more robust."
  • Archit3ch counters this perceived robustness: "Sure, Mojo the language is more robust. Until its investors decide to 10x the licensing Danegeld."
  • ssfrr makes a distinction between Python and Julia: "It doesn’t make sense to lump python and Julia together in this high-level/low-level split. Julia is like python if numba were built-in - your code gets jit compiled to native code so you can (for example) write for loops to process an array without the interpreter overhead you get with python." (A numba sketch follows this list.)
  • arbitrandomuser reinforces Julia's kernel capabilities: "The package https://github.com/JuliaGPU/KernelAbstractions.jl was specifically designed so that julia can be compiled down to kernels. ... Julia can be used to write gpu kernels."
  • adgjlsfhk1 asserts Julia's direct compilation to GPU assembly: "Julia's GPU stack doesn't compile to C++. it compiles Julia straight to GPU assembly."
  • Alexander-Barth discusses Julia's ML frameworks: "In Julia, you have quite good ML frameworks (Lux.jl and Flux.jl). I am not sure that you have mojo-native ML frameworks which are similarly usable."
  • ubj points out Julia's shortcomings in AoT compilation: "First-class support for AoT compilation. ... Yes, Julia has a few options for making executables but they feel like an afterthought."
  • bobajeff notes limited Python interop in Julia: "I've looked into making Python modules with Julia and it doesn't look like that is very well supported right now. Where as it's a core feature of Mojo."
  • nromiun questions Mojo's adoption: "Weird that there has been no significant adoption of Mojo. It has been quite some time since it got released and everyone is still using PyTorch. Maybe the license issue is a much bigger deal than people realize."
  • pjmlp argues that existing Python DSLs for GPGPU are competitive: "at least NVidia and Intel are quite serious on Python DSLs for GPGPU programming on CUDA and One API, so one gets close enough to C++ performance while staying in Python. So Mojo isn't that appealing in the end."
  • CyberDildonics argues against new languages: "I don't think ML does need a new programming language. You give up an extreme amount of progress in tools and libraries when you move to a new language."
  • CyberDildonics also states, "I haven't seen new languages that market themselves for specific features that couldn't be done just as easily through straight classes with operator overloading." (An operator-overloading sketch follows this list.)
  • threeducks advocates for libraries over new languages: "I realized that there is really very little to be gained through new languages that can not be obtained through a new library, without the massive downside of throwing away most of the ecosystem due to incompatibility."
  • On the other hand, jakobnissen argues for the fundamental differences: "Usually people create languages to address issues that cannot be addressed by a library because they have different semantics on a deeper level. Like, Rust could not be a C++ library, that does not make sense. Zig could not be a C library. Julia could not be a Python library."
  • ModernMech expresses frustration with Julia's slow progress: "I've been involved in a few programming language projects, so I'm sympathetic as to how much work goes into one and how long they can take. At the same time, it makes me wary of Julia, because it highlights that progress is very slow. I think Julia is trying to be too much at once."
  • numbers_guy inquires about Julia's strengths for ML: "What makes Julia "great" for ML?"
  • postflopclarity suggests Julia's niche: "Glad to hear. I've found that it's a very welcoming community. I'll warn you that Julia's ML ecosystem has the most competitive advantage on 'weird' types of ML, involving lots of custom gradients and kernels, integration with other pieces of a simulation or diffeq, etc."
  • macawfish elaborates on Julia's ML advantages: "Built-in autodifferentiation and amazing libraries built around it, plus tons of cutting edge applied math libraries that interoperate automatically, thanks to Julia's well conceived approach to the expression problem (multiple dispatch)."
  • bobbylarrybobby provides a balanced view of Julia: "In practice, the Julia package ecosystem is weak and generally correctness is not a high priority. But the language is great, if you're willing to do a lot of the work yourself."
  • mdaniel questions the move to another dynamically typed language and the focus on subjective "niceness": "I don't understand why in the world someone would go from one dynamically typed language to another."
  • MontyCarloHall argues that Python's ecosystem dominance is due to its ability to integrate with other languages for performance: "Rather, they are complex applications with functionality extending far beyond the number crunching, which requires a robust preexisting software ecosystem... Python's numerical computing libraries (NumPy/PyTorch/JAX etc.) all call out to C/C++/FORTRAN under the hood and are thus extremely high-performance..."
  • Hizonner agrees about Python's role: "This guy is worried about GPU kernels, which are never, ever written in Python. As you point out, Python is a glue language for ML."
  • Almostgotcaught provides counterexamples to the "never written in Python" claim for kernels: "nah never ever ever ever ever ... except https://github.com/FlagOpen/FlagGems https://github.com/linkedin/Liger-Kernel ..."
  • ModernMech explains Mojo's problem-solving approach: "That's kind of the point of Mojo, they're trying to solve the so-called 'two language problem' in this space. Why should you need two languages to write your glue code and kernel code? Why can't there be a language which is both as easy to write as Python, but can still express GPU kernels for ML applications?"
  • nostrademons offers a historical perspective on the "two language problem": "It's interesting, people have been trying to solve the 'two language problem' since before I started professionally programming 25 years ago, and in that time period two-language solutions have just gotten even more common."
  • soVeryTired mentions Julia's "1.5 language problem": "There's a very interesting video about the '1.5 language problem' in Julia [0]. The point being that when you write high-performance Julia it ends up looking nothing like 'standard' Julia."
  • benzible suggests Elixir/Nx as an alternative with a different focus: "Python's ecosystem is hard to beat, but Elixir/Nx already does a lot of what Mojo promises. EXLA gives you GPU/TPU compilation through XLA with similar performance to Mojo's demos... The real difference is that Elixir was built for distributed systems from day one."
  • goatlover questions the exclusivity of "rich ecosystems": "Have a hard time believing C++ and Java don't have rich enough ecosystems."
  • j2kun clarifies that the "richness" is about ease of use for non-low-level developers: "It's not about 'richness,' it's about giving a language ecosystem for people who don't really want to do the messy, low-level parts of software, and which can encapsulate the performance-critical parts with easy glue."
  • FuckButtons and anakaine agree that Python's value is in its ability to leverage C/C++ without writing it.
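
As a concrete version of ssfrr's numba comparison, the sketch below JIT-compiles a plain Python loop to native code; this is ordinary numba usage, shown here only to illustrate the pattern that Julia has built in.

    # A Python for loop over an array, compiled to machine code by numba's JIT
    # instead of paying per-iteration interpreter overhead.
    import numpy as np
    from numba import njit

    @njit
    def running_max(x):
        out = np.empty_like(x)
        cur = x[0]
        for i in range(x.shape[0]):      # explicit loop, no vectorization tricks
            cur = max(cur, x[i])
            out[i] = cur
        return out

    x = np.random.rand(1_000_000)
    running_max(x)    # first call compiles; subsequent calls run at native speed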
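
CyberDildonics' operator-overloading point, together with macawfish's mention of autodifferentiation, can be made concrete in a few lines: forward-mode autodiff falls out of a plain class with overloaded operators, no new language required. The Dual class below is a toy written for this summary; real autodiff libraries are far more complete.

    # Forward-mode automatic differentiation via operator overloading.
    class Dual:
        def __init__(self, value, deriv=0.0):
            self.value = value    # f(x)
            self.deriv = deriv    # f'(x)

        def _lift(self, other):
            return other if isinstance(other, Dual) else Dual(other)

        def __add__(self, other):
            other = self._lift(other)
            return Dual(self.value + other.value, self.deriv + other.deriv)

        __radd__ = __add__

        def __mul__(self, other):
            other = self._lift(other)    # product rule below
            return Dual(self.value * other.value,
                        self.value * other.deriv + self.deriv * other.value)

        __rmul__ = __mul__

    def f(x):
        return 3 * x * x + 2 * x + 1     # ordinary-looking numeric code

    x = Dual(2.0, 1.0)                   # seed dx/dx = 1
    print(f(x).value, f(x).deriv)        # 17.0 and f'(2) = 14.0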

Development Status and Maturity

Several comments address the current state of Mojo's development, with many feeling it is still too early for widespread adoption.

  • fnands notes, "It's still very much in a beta stage, so a little bit hard to use yet. Mojo is effectively an internal tool that Modular have released publicly. I'd be surprised to see any serious adoption until a 1.0 state is reached."
  • melodyogonna also states, "It is not ready for general-purpose programming. Modular itself tried offering a Mojo api for their MAX engine, but had to give up because the language still evolved too rapidly for such an investment."
  • ModernMech voices concern about the long-standing "rough edges" in Julia's development: "Yeah, that's what I've been hearing about Julia for about 10 years now: 'situation will improve greatly in the next version, but still many rough edges remain.'"
  • adgjlsfhk1 criticizes the "superset of Python" claim: "superset of Python was never a goal. It was a talking point to try and build momentum that was quietly dropped once it served it's purpose of getting Mojo some early attention."
  • fwip shares this skepticism: "I tend to agree, which is why I can't recommend Mojo, despite thinking their tech is pretty innovative. If they're willing to lie about something that basic, I can't trust any of their other claims."
  • ModernMech suggests a more charitable interpretation of the "superset of Python" claim: "For me, it’s because Chris is part of the team I’m willing to give them the benefit of the doubt. I will assume ego-driven naïveté over malice."

Language Design and Features

Specific language features and design choices of Mojo and its comparisons are discussed, including metaprogramming, error handling, and fundamental language paradigms.

  • nickpsecurity lists potential differentiators: "Easy packaging into one executable... Predictability vs Python runtime... Metaprogramming... Extensibility... Write once, run anywhere... Heterogenous, hot-swappable, vendor-neutral acceleration."
  • lordofgibbons expresses a dislike for exceptions in Mojo: "I just wish they didn't keep Exceptions. This backwards compatibility with Python syntax is extremely overrated and not worth the cost of bringing language warts from the 90s."
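
Because this exceptions-versus-explicit-errors trade-off recurs throughout the thread, here is a rough Python sketch of the two styles lordofgibbons contrasts; the explicit-result version merely emulates the Go/Rust approach and is not any particular library's API.

    # Exception style (what Mojo keeps for Python compatibility):
    # failure does not appear in the signature.
    def parse_port(text: str) -> int:
        return int(text)                 # raises ValueError on bad input

    # Explicit style: failure is a value the caller must confront.
    from typing import Union

    def parse_port_explicit(text: str) -> Union[int, ValueError]:
        try:
            return int(text)
        except ValueError as err:
            return err

    result = parse_port_explicit("80a8")
    if isinstance(result, ValueError):
        print("bad port:", result)       # handled where it occurs
    else:
        print("port:", result)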

The Role of Python in ML Ecosystem

A recurring theme is Python's dominance in the ML ecosystem and the reasons behind it, primarily its vast libraries and ease of integration.

  • MontyCarloHall argues that Python's ecosystem is crucial for complex ML applications: "Modern ML applications don't exist in a vacuum. They aren't the standalone C/FORTRAN/MATLAB scripts of yore... Rather, they are complex applications with functionality extending far beyond the number crunching, which requires a robust preexisting software ecosystem." He highlights Python's ability to glue together C/C++/Fortran libraries. (A one-line illustration follows this list.)
  • Hizonner expresses that this reliance on Python is a point of bitterness for some: "That may be true, but some of us are still bitter that all that grew up around an at-least-averagely-annoying language rather than something nicer."
  • anakaine champions Python's ease of use and ecosystem: "ill take that reason every single day. I could spend days or more working out particular issues in C++, or I could use a much nicer to use glue language with a great ecosystem and a huge community driving it and get the same task done in minutes to hours."
  • dboreham humorously points out the dual meaning of "ML": "ML is a programming language. Machine Learning is shortened to ML, too."
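
MontyCarloHall's "glue" argument fits in one line of ordinary numpy: the matrix multiply below runs in a compiled BLAS library (C/Fortran), with Python only orchestrating.

    import numpy as np

    a = np.random.rand(2000, 2000)
    b = np.random.rand(2000, 2000)
    c = a @ b     # ~16 GFLOPs of work, none of it executed as Python bytecode
    print(c.shape)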

Developer Experience and Tooling

Users discuss the importance of developer experience, including compilation times, error messages, packaging, and the overall tooling around a language.

  • tomovo's critique of Swift's compiler issues is a prime example: "'The compiler is unable to type-check this expression in reasonable time?' On an M3 Pro? What the hell!?"
  • ModernMech's experience with Julia binaries highlights tooling challenges: "They say it's supported, but it's not exactly as easy as 'cargo build' to get a Julia binary out."
  • macawfish suggests improvements for Julia's tooling: "What Julia needs though: wayyyy more thorough tooling to support auto generated docs, well integrated with package management tooling and into the web package management ecosystem."
  • mdaniel emphasizes the importance of type checking for understandability: "I literally cannot comprehend why someone would write such a thing [dynamic typing]. I am biased in that I believe that having a computer check that every reference to every relationship does what it promises, all the time."
  • postflopclarity argues that documentation is key and types are not the only way to achieve clarity: "first and foremost great documentation & design docs cannot be surpassed as a tool to explain and understand code. and that is entirely language agnostic."
  • lgas counters, "documentation can be wrong whereas types can't, so it seems like it's strictly a worse tool if your goal is to understand what's actually going on and not what someone said was going on at some point in the past." (A sketch of machine-checked annotations follows this list.)
  • adgjlsfhk1 suggests that types can be "overly restrictive" or "under powered."
  • const_cast argues that expressive type systems can encompass traits, preconditions, and postconditions.
  • pjmlp points out that even a full C# experience requires proprietary tools like Visual Studio.
  • Chris Lattner encourages engagement with the Mojo community and mentions its open-source code.
  • bsaul requests more technical details about Mojo's compiler technology, specifically "concrete example[s] of this advanced technology."
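
To illustrate lgas's claim that annotations, unlike prose, are machine-checked, here is a minimal Python sketch; UserId is a hypothetical type invented for this example.

    # A type checker such as mypy verifies these annotations on every run,
    # so they cannot silently drift out of date the way a docstring can.
    from typing import NewType

    UserId = NewType("UserId", int)

    def load_user(user_id: UserId) -> str:
        """Fetch a user's display name."""
        return f"user-{user_id}"

    load_user(UserId(42))   # OK
    load_user(42)           # flagged by the type checker (runs fine at runtime)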

Historical Context and Alternatives

Some users draw parallels to past language trends and consider existing alternatives.

  • kuschkufan and pjmlp discuss the origins of GNU and LLVM in response to proprietary tooling.
  • mvieira38 asks for sources on Julia's commercial growth.
  • jimbokun and Hizonner reflect on the industry's cyclical nature and resistance to change.
  • dsharlet advocates for C++ templates as a sweet spot for performance and flexibility.
  • adgjlsfhk1 praises Julia's balance between exploratory and optimized coding.
  • wolvesechoes disputes the ease of optimizing Julia compared to C or Fortran.
  • pansa2 notes Mojo's incomplete feature set (e.g., classes being a medium-term goal).
  • singularity2001 expresses concern about proprietary "vaporware stacks."
  • analog31 discusses the historical shift towards free tools driven by programmer preference and the potential return to proprietary tooling with AI agents.
  • dboreham and a3w discuss the potential for ML (Machine Learning) to be a "killer app" for a language, drawing a parallel to Ruby on Rails.

Fundamental Language Design Philosophies

The discussion touches upon deeper philosophical differences in language design, such as the merits of exceptions versus explicit error handling and the fundamental nature of computation.

  • lordofgibbons strongly dislikes exceptions and favors explicit error handling, citing Go and Rust.
  • almostgotcaught provides examples of Python being used for kernel writing.
  • tree_enjoyer speculates about Lisp being a natural language for AI-generated code.
  • atbpaca describes Mojo as a balance between readability and efficiency.
  • Hizonner expresses persistent bitterness over past technology choices.
  • anakaine advocates for moving on and practical acceptance of existing tools.