Nadella also revealed that AI has written up to 30% of the code currently in Microsoft's repositories.
Microsoft CEO Satya Nadella says he is "very optimistic" that technology has sufficiently advanced to support more complex, next-gen capabilities such as multi-agent AI orchestration, and open source is a key component.
Chips are getting better, cycle times are faster, and system software, model architecture, and kernels are constantly being optimized, resulting in what he described as a 10x performance boost every six to 12 months.
"We are in some crazy sort of hyperdrive Moore's Law," Nadella said during a fireside chat with Meta CEO Mark Zuckerberg at Meta's inaugural LlamaCon developer event. "Any one of these tech platform shifts has not been about one S curve, it's been multiple S curves that compound."
Distillation is "like magic"
Microsoft's ideal, Nadella said, is an orchestration layer that will offer the ability to mix and match AI models, with users pulling different aspects of intelligence from different models in areas where they excel. Open source "absolutely has a massive, massive role to play" in building out such platforms.
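To make the "mix and match" idea concrete, here is a minimal, purely illustrative Python sketch of a routing layer that sends each request to whichever model is assumed to excel at that task. The model registry, the classify_task() heuristic, and route_request() are hypothetical placeholders rather than any vendor's orchestration API; a real layer would route on capability, cost, and policy rather than keywords.

```python
from typing import Callable, Dict

# Hypothetical model endpoints; in practice these would be API calls to
# different hosted or open weight models.
MODEL_REGISTRY: Dict[str, Callable[[str], str]] = {
    "code":      lambda prompt: f"[code model] {prompt}",
    "reasoning": lambda prompt: f"[reasoning model] {prompt}",
    "general":   lambda prompt: f"[general model] {prompt}",
}

def classify_task(prompt: str) -> str:
    """Crude keyword-based classifier; a real router might use a small model."""
    if "def " in prompt or "import " in prompt:
        return "code"
    if "prove" in prompt or "step by step" in prompt:
        return "reasoning"
    return "general"

def route_request(prompt: str) -> str:
    """Dispatch the prompt to the model registered for its task type."""
    return MODEL_REGISTRY[classify_task(prompt)](prompt)

print(route_request("Write a def fibonacci(n): function"))  # routed to the code model
```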
"I'm not dogmatic about closed source or open source, both of them are needed in the world," he said. "And customers will demand them, right?"
Whether it be interplay between SQL, MySQL, Postgres, Linux, Windows, or other products, having a posture that allows interoperability is incredibly important, Nadella noted. Enterprise customers want to be able to distill custom models built with their intellectual property (IP), and open weight models will have a "huge structural advantage" in supporting that, compared to closed models.
"Taking a large model and being able to distill it into a smaller model that has even that same model shape is a big use case," he said.
The challenge is making that available to those unable to build their own infrastructure, or those less technically sophisticated, he noted. The goal for a company like Microsoft is to build the tooling for these models as a service. Hyperscalers can deploy this infrastructure as a cloud service and build tools around it.
So, Nadella pointed out, every tenant of Microsoft 365 could have a distilled, task-specific model that they could create as an agent or a workflow and invoke from within Copilot. "That, to me, is a breakthrough scenario," he said.
Zuckerberg agreed: "That just seems like one of the coolest things that I think is going to get built." He also called model distillation "like magic."
"You basically can make it so that you can get 90% or 95% of the intelligence of something that is 20 times larger in a form factor that is so much cheaper and more efficient to use," he said.
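As a rough illustration of what distillation involves, here is a minimal sketch assuming PyTorch and Hugging Face-style models whose forward pass returns an object with a .logits field; teacher_model, student_model, and distill_step are illustrative names, not any particular framework's API. The student is trained to match the teacher's temperature-softened output distribution.

```python
import torch
import torch.nn.functional as F

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence between softened teacher and student token distributions."""
    teacher_probs = F.softmax(teacher_logits / temperature, dim=-1)
    student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    # Scale by T^2 so gradient magnitudes stay comparable across temperatures.
    return F.kl_div(student_log_probs, teacher_probs,
                    reduction="batchmean") * temperature ** 2

def distill_step(teacher_model, student_model, optimizer, input_ids):
    """One training step: the frozen teacher supervises the smaller student."""
    with torch.no_grad():  # the teacher is never updated
        teacher_logits = teacher_model(input_ids).logits
    student_logits = student_model(input_ids).logits
    loss = distillation_loss(teacher_logits, student_logits)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In practice, an objective like this is typically blended with a standard language-modeling loss on the customer's own data, which is where the proprietary IP Nadella mentions comes into play.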
As powerful as it is, though, there are concerns about doing distillation in a safe and secure way, he conceded, particularly when open source models are coming from different countries (such as DeepSeek, from China). He noted: "How do you make sure that you're not inheriting security vulnerabilities or different values that are kind of problematic?"
AI takes a greater role in coding
Nadella and Zuckerberg also agreed on the dramatic possibilities for AI in coding. Nadella said that AI generates "fantastic" Python code, and estimated that up to 30% of the code currently in Microsoft's repositories has been written by software.
Zuckerberg said Meta's bet is that, in the next year, "maybe half" of software development will be completed by AI. Eventually, he said, "every engineer is effectively going to end up being more like a tech lead with their own little army of engineering agents."
But successful integration is the sweet spot; AI must fit seamlessly into current repos and developer workflows, Nadella emphasized. "It's one thing to build a new greenfield app, but none of us get to work on complete greenfield all the time," he said.
Instead, devs are working in large code bases with many workflows, so effective integration within the tool chain is paramount. This is the systems work that any engineering team needs to be doing, said Nadella, and it's where teams will see the greatest productivity gains.
AI is an "existential priority"
Ultimately, Nadella described AI as "a pretty existential priority" and said its success "will come down to developers being able to go at it fearlessly."
The world requires productivity gains in every function, from healthcare to retail to broad knowledge work, he said, and AI has the promise to deliver that. "Software in this new form of AI is the most malleable resource we have to solve hard problems."
Still, it's not enough that the technology is just there, he noted; it also requires a management and cultural change. Case in point: Electricity was around for 50 years before people figured out that it could revolutionize factories.
"Just thinking of this as the horseless carriage is not going to be the way we are going to get to the other side," Nadella said. "So it's not just tech. Tech has got to progress. You've got to put that into systems that actually deliver the new work, work artifact, and workflow."
Databricks: Everything's going towards open source
Ali Ghodsi, Databricks co-founder and CEO, also sat down with Zuckerberg at LlamaCon; one of his big takeaways was that, in the long run, everything is going to move towards open source.
"You get this sort of open research and open sharing," he said. "You get a whole world working on these models, and you just have much, much more rapid progress."
Like Nadella, Ghodsi described a "cross-pollination" between different models. People are slicing, mixing and matching, and cobbling different elements together, tasks that would be "completely impossible" without open source.
Zuckerberg added that it's "really gratifying" to see the move towards open source, pointing out that just a year ago, Llama was the main open source model, which, of course, is no longer the case.
"You have the ability to take the best parts of the intelligence from the different models and produce exactly what you need," he said. "It feels like sort of an unstoppable force."
At the same time, there's a big question in open source: while the models themselves are open, the data they're trained on is sometimes withheld for any number of reasons, Zuckerberg noted. By contrast, reasoning models are trained using problems with verifiable answers, so if devs are giving them hard math or coding problems, the data is much less proprietary.
This is an interesting area to explore, he said: new types of open source releases that include not just the weights of the model, but also some of its reasoning traces.
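As a toy illustration of the "verifiable answers" point, the sketch below shows why such training data is far less proprietary: the reward is simply a programmatic check of the model's final answer, and the problems themselves can be generated freely. The check_answer() helper and the problem format are hypothetical.

```python
def check_answer(problem: dict, model_output: str) -> bool:
    """Reward signal: does the model's final answer match the verifiable one?"""
    return model_output.strip() == str(problem["answer"])

# A math problem anyone could generate; nothing proprietary about it.
problem = {"question": "What is 17 * 24?", "answer": 408}
print(check_answer(problem, "408"))  # True, so the model earns a positive reward
```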
This kind of ongoing exploration led Ghodsi to conclude, "It's Day Zero of the AI era. The most amazing applications are yet to be invented. This is a complete white space."
The "data advantage" will be the underpinning of success, Ghodsi said. Builders need to collect the right data, refine it, pass it through models, and repeat, to create a "flywheel" or network effect, continually iterating on and improving products with use.
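A bare-bones Python sketch of that loop, with hypothetical helpers (collect_usage_data, refine, and a caller-supplied fine_tune) standing in for real pipeline stages rather than any Databricks tooling:

```python
def collect_usage_data(product_logs):
    """Keep only interactions worth learning from (illustrative stub)."""
    return [entry for entry in product_logs if entry.get("feedback") is not None]

def refine(raw_examples):
    """Filter and deduplicate before training (illustrative stub)."""
    return list({str(example): example for example in raw_examples}.values())

def flywheel_iteration(model, product_logs, fine_tune):
    """One turn of the flywheel: usage data -> curated data -> improved model."""
    curated = refine(collect_usage_data(product_logs))
    return fine_tune(model, curated)  # the better model drives the next cycle of usage
```

Each turn of the loop is meant to leave the product a little better than the last, which is the flywheel effect Ghodsi describes.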
He urged devs: "Try it out, use it, explore those crazy new ideas. It's going to be exciting times ahead."


