Mojo 1.0 Beta

(mojolang.org)

Comments

totalperspectiv 16 hours ago
Having written a lot of Mojo over the last two years, just for fun, I can say it's a really cool language: an ownership model adjacent to Rust's, comptime that is more powerful than Zig's, a rich type system, first-class SIMD support, etc.

Performance-wise, it's the first language in a long time that isn't just an LLVM wrapper. LLVM is still involved, but they are using it differently than, say, Rust or Zig.

Very excited for Mojo once it's open sourced later this year.

ainch 8 May 2026
As someone in ML who's interested in performance, I'm keen for Mojo to succeed - especially the prospect of mixing GPU and CPU code in the same language. But I do wonder if the changes they're making will dissuade Python devs. The last time I booted it up, I tried to do some basic string manipulation just to test stuff out, but spent an hour puzzling out why `var x = 'hello'; print(x[3])` didn't work, and neither did `len(x)` (turns out they'd opted for more specific byte-vs-codepoint representations, but the docs contradicted the actual implementation).
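For what it's worth, plain Python exposes the same byte-vs-codepoint split once you reach for `bytes` - a rough sketch (standard CPython, not Mojo):

```python
s = "héllo"
print(len(s))    # 5: a str counts codepoints
b = s.encode("utf-8")
print(len(b))    # 6: 'é' takes two bytes in UTF-8
print(s[1])      # 'é': indexing a str yields a full character
print(b[1])      # 195 (0xC3): the first byte of 'é', not a character
```

Mojo just picks a different default for which of these views a plain string index gives you.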

Hopefully they get Mojo to a good place for more general ML, but at the moment it still feels quite limited - they've actually deprecated some of the nice builtins they had for Tensors, etc. For now I'll stick with JAX and check in periodically, fingers crossed.

armchairhacker 8 May 2026
> We have committed to open-sourcing Mojo in Fall 2026.

https://docs.modular.com/mojo/faq/#will-mojo-be-open-sourced

maxloh 2 hours ago
Modular is going to open source the entire Mojo SDK later this year, including the compiler.

> Mojo 1.0 will be finalized later this year, along with opening the compiler and providing language stability.

https://www.modular.com/blog/modular-26-3-mojo-1-0-beta-max-...

fibonacci112358 8 May 2026
Sadly for them, Nvidia didn't stay still in the meantime and created the next generation of CUDA, CuTile for Python and soon for C++, through CUDA Tile IR (using a similar compiler stack based on MLIR).

Even though it's not portable, it will likely see far greater usage than Mojo just by being heavily promoted by Nvidia, integrated into dev tools, and working alongside existing CUDA code.

Tile IR was more likely a response to the threat of Triton than to Mojo, at least from the standpoint of how easy it is to write a decently performing LLM kernel.

modeless 8 May 2026
When I first heard about Mojo I somehow got the impression that they intended to make it compatible with existing Python code. But it seems like they are very far away from that for the foreseeable future. I guess you can call back and forth between Python and Mojo but Mojo itself can't run existing Python code.

csvance 12 hours ago
Mojo looks neat, but I'm pretty satisfied with Julia at this point for high-performance numerical computing across CPU, GPU, etc. I can't help but feel this niche is already mostly solved, beyond having Python-like syntax. Even Python has things like Numba and Triton that are effective for less complicated, more self-contained problems.

smartmic 8 May 2026
Advertising prominently with "AI native" seems necessary today, at least for some folks. To me, that's kind of off-putting, since it doesn't really say anything.

Can any of the AI enthusiasts here explain why, or what is meant by

> As a compiled, statically-typed language, it's also ideal for agentic programming.

pjmlp 8 May 2026
Julia is more mature for the same purposes, and since last year Nvidia has had feature parity between its Python and C++ tooling on CUDA.

Python cuTile JIT compiler allows writing CUDA kernels in straight Python.

AMD and Intel are following up with similar approaches.

Whether Mojo will still arrive in time to gain wider adoption remains to be seen.

bobajeff 12 hours ago
I've been keeping my eye on Mojo. Honestly, though, the thing I like least about Python is its syntax.

Someone else here brought up Julia, which I think is a fine language, but the compiler error messages and the library documentation are not what I would want in a language as far along as it is. I'm also worried about the correctness issues I read about in a blog a while back. And I don't feel like I can make the kind of Python module I want with it (because of binary size and time-to-first-x).

That being said, I'm hoping Mojo can become an option. But I really like using a REPL, and I like the dynamism of Python. So I might not ever get around to doing anything outside of maybe NumPy for performance.

taylorallred 19 hours ago
I know Mojo is aimed at ML, but I'm actually really interested in trying it for game development :)

andriamanitra 5 hours ago
The changelog [1] looks reasonable; it seems like the language is moving in the right direction. I'm excited to try it out once it's open sourced. The dependency on glibc is an unfortunate limitation, as it means you can't install Mojo on musl-based systems.

[1] https://mojolang.org/releases/v1.0.0b1/

Timot05 8 May 2026
I’m relatively new to programming, but I wish they had used a functional language’s syntax rather than an object-oriented one as the basis for Mojo.

From my experience, AI revolves a lot around building up function pipelines, computing their derivatives, and passing tons of data through them - composability and higher-order functions from functional programming make that a breeze to describe.
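To make the composability point concrete, here's a rough Python sketch (toy names, nothing AI-specific) of the kind of pipeline building I mean:

```python
from functools import reduce

def compose(*fns):
    """Right-to-left composition: compose(f, g)(x) == f(g(x))."""
    return reduce(lambda f, g: lambda x: f(g(x)), fns)

# Toy "layers": each is just a function from a list of numbers to a list.
scale = lambda xs: [x * 2.0 for x in xs]
shift = lambda xs: [x + 1.0 for x in xs]

pipeline = compose(shift, scale)   # scale first, then shift
print(pipeline([1.0, 2.0, 3.0]))   # [3.0, 5.0, 7.0]
```

Swapping, inserting, or differentiating stages is then just ordinary function manipulation.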

I also feel that fields other than AI are moving towards building up large functional pipelines to produce outputs, which would make Mojo suitable for those fields as well. I’m building in the CAD space, for example, and I’d love to use a “functional Mojo” language.

dllu 8 May 2026
I remember reading about this 4 years ago as the new Chris Lattner project and was super excited, though a little skeptical.

I think that nowadays with vibe/agentic coding, high performance Python-like languages become ever more important. Directly using AI agents to code, say, C++, is painful as the verbose nature of the language often causes the context window to explode.

coppsilgold 17 hours ago
Python is basically the master glue language at this point.

If more than a few percent of execution time is spent in Python you are probably doing it wrong.

Personally, I don't even understand why Cython is a thing; just write performance-critical functions in other languages:

<https://pypi.org/project/rustimport/>

<https://pypi.org/project/import-zig/>

Note that you can even start threads in those languages and use function calls as pseudo-RPC. All without an overly complex build system.
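As a minimal sketch of that pattern with zero build system, Python's stdlib ctypes can call into an already-compiled C library directly (this assumes a POSIX system, where `CDLL(None)` exposes the symbols of the running process, libc included):

```python
import ctypes

# Load symbols already linked into the process (libc among them) - POSIX only.
libc = ctypes.CDLL(None)

# Declare the C signature so ctypes marshals arguments correctly.
libc.abs.argtypes = [ctypes.c_int]
libc.abs.restype = ctypes.c_int

print(libc.abs(-42))  # 42, computed by C's abs(), not Python's
```

The rustimport/import-zig packages above do essentially this, plus compiling the foreign source for you on import.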

pjmlp 8 May 2026
Still phase one, doesn't do native Windows.

So will Mojo still arrive in time to gain wider adoption?

Time will tell.

chrismsimpson 8 May 2026
I do wonder if Mojo was a great idea just a little too late to the party. Porting ‘prototypes’ from Python to lower level languages is fairly trivial now with LLMs.

insumanth 8 May 2026
I was excited when Mojo launched and thought it might grow big quickly. I don't see much traction, though the pitch is compelling. What could be the issue?

rienbdj 6 hours ago
Does anyone know if Mojo is more suitable for functional programming than Python?

Things like optimizing away object allocations, pure function inlining, tail call optimization?
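For context: CPython does none of these as far as I know - no tail-call elimination in particular - which is the baseline I'm comparing against. A quick check:

```python
def sum_to_rec(n, acc=0):
    # Tail-recursive in form, but CPython keeps a frame per call.
    if n == 0:
        return acc
    return sum_to_rec(n - 1, acc + n)

def sum_to_iter(n):
    acc = 0
    while n > 0:
        acc += n
        n -= 1
    return acc

print(sum_to_iter(100_000))      # 5000050000
try:
    sum_to_rec(100_000)          # well past the default ~1000-frame limit
except RecursionError:
    print("RecursionError: no tail-call optimization in CPython")
```

Whether Mojo's compiler does better here is exactly what I'd like to know.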

jadar 9 hours ago
Congrats to Chris Lattner and crew! Seems like a neat project. I listened to him talk about Mojo on some podcast a while back and was really impressed. Swift was a runaway success, and I hope he can pull it off again with Mojo! :)

0xpgm 8 May 2026
Right now the majority of beginners start programming with a high-level language, say Python or JavaScript - then for more advanced system-level tasks pick up C/C++/Rust/Zig, etc.

If Mojo succeeds, it could be the one language spanning across those levels, while simplifying heterogeneous hardware programming.

tveita 8 May 2026
Is there any project that showcases Mojo for running neural network models on the GPU - like ideally something like llama.cpp that could run one or more existing models to showcase the readability and performance?

sriram_malhar 8 May 2026
Doesn't anyone here have _one_ kind word to say about its features? Everyone seems to be starting with "on the other hand".

noduerme 8 May 2026
Am I old or remembering this wrong... didn't Zuck write the first iteration of Facebook in PHP, and then spend millions to hire people to write something that converted the code to C++?

thefounder 8 May 2026
Does it have the indentation thing? That would be a no-go for a lot of people.

AbuAssar 16 hours ago
> AI native

What’s that supposed to mean?

DeathArrow 21 hours ago
>As a compiled, statically-typed language, it's also ideal for agentic programming.

Since there isn't much Mojo code in the wild for LLMs to have been trained on, I wonder how that will work in practice.

Probably the agents will make lots of mistakes, and you'll spend 10x the tokens compared to using a language the models are well versed in.

runarberg 8 May 2026
I am actually on the lookout for a low-level language that compiles to WebAssembly, to write a (relatively small) supervised learning model that I plan to make fast enough for five-year-old phone CPUs. I have a working prototype in Julia and was planning to (eventually) rewrite it in Rust, mostly for the WebAssembly target. I come from a high-level language background, so the thought of rewriting in Rust is a little daunting. So I was excited to learn about Mojo and to find out whether they had a WebAssembly target in their compiler.

But then I read this:

> AI native

> Mojo is built from the ground up to deliver the best performance on the diverse hardware that powers modern AI systems. As a compiled, statically-typed language, it's also ideal for agentic programming.

Well, no thank you. I know the irony here but I want nothing to do with a language made for robots.

DeathArrow 8 May 2026
>No more choosing between productivity and performance - Mojo gives you both.

That's a very big claim.

logicchains 8 May 2026
Very bold of them expecting people to use a language with a closed source compiler in the 2020s.