Open source has a 'massive role to play' in AI orchestration platforms, says Microsoft CEO

Apr 30, 2025 | 7 mins

Nadella also revealed that AI has written up to 30% of the code currently in Microsoft's repositories.


Microsoft CEO Satya Nadella says he is "very optimistic" that technology has sufficiently advanced to support more complex, next-gen capabilities such as multi-agent AI orchestration, and open source is a key component.

Chips are getting better, cycle times are faster, and system software, model architecture, and kernels are constantly being optimized, resulting in what he described as a 10x performance boost every six to 12 months.

"We are in some crazy sort of hyperdrive Moore's Law," Nadella said during a fireside chat with Meta CEO Mark Zuckerberg at Meta's inaugural LlamaCon developer event. "Any one of these tech platform shifts has not been about one S curve, it's been multiple S curves that compound."

Distillation is 'like magic'

Microsoft's ideal, Nadella said, is an orchestration layer that will offer the ability to mix and match AI models, with users pulling different aspects of intelligence from different models in areas where they excel. Open source "absolutely has a massive, massive role to play" in the building out of such platforms.

"I'm not dogmatic about closed source or open source, both of them are needed in the world," he said. "And customers will demand them, right?"

Whether it be interplay between SQL, MySQL, Postgres, Linux, Windows, or other products, having a posture that allows interoperability is incredibly important, Nadella noted. Enterprise customers want to be able to distill custom models built with their intellectual property (IP), and open weight models will have a "huge structural advantage" in supporting that, compared to closed models.

"Taking a large model and being able to distill it into a smaller model that has even that same model shape is a big use case," he said.

The challenge is making that available to those unable to build their own infrastructure, or those less technically sophisticated, he noted. The goal for a company like Microsoft is to build the tooling for these models as a service. Hyperscalers can deploy this infrastructure as a cloud service and build tools around it.

So, Nadella pointed out, every tenant of Microsoft 365 could have a distilled, task-specific model that they could create as an agent or a workflow and invoke from within Copilot. "That, to me, is a breakthrough scenario," he said.

Zuckerberg agreed: "That just seems like one of the coolest things that I think is going to get built." He also called model distillation "like magic."

"You basically can make it so that you can get 90% or 95% of the intelligence of something that is 20 times larger in a form factor that is so much cheaper and more efficient to use," he said.
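The distillation the two CEOs describe is commonly implemented as a soft-label objective: a small student model is trained to match the temperature-softened output distribution of a large teacher. A minimal sketch in plain Python (the function names and the temperature value are illustrative, not from the article):

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of raw logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence from the teacher's softened distribution to the
    student's. A higher temperature exposes more of the teacher's
    'dark knowledge' about relative class similarities; the T^2
    factor keeps gradient magnitudes comparable to a hard-label loss."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl
```

In a real pipeline this term is minimized over the student's weights, usually blended with an ordinary loss on ground-truth labels; the loss is zero exactly when the student reproduces the teacher's distribution.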

As powerful as it is, though, there are concerns about doing distillation in a safe and secure way, he conceded, particularly when open source models are coming from different countries (such as DeepSeek, from China). He noted: "How do you make sure that you're not inheriting security vulnerabilities or different values that are kind of problematic?"

AI takes a greater role in coding

Nadella and Zuckerberg also agreed on the dramatic possibilities for AI in coding. Nadella said that AI generates "fantastic" Python code, and estimated that up to 30% of the code currently in Microsoft's repositories has been written by software.

Zuckerberg said Meta's bet is that, in the next year, "maybe half" of software development will be completed by AI. Eventually, he said, "every engineer is effectively going to end up being more like a tech lead with their own little army of triggering agents."

But successful integration is the sweet spot: AI must fit seamlessly into current repos and developer workflows, Nadella emphasized. "It's one thing to build a new greenfield app, but none of us get to work on complete greenfield all the time," he said.

Instead, devs are working in large code bases with many workflows, so effective integration within the tool chain is paramount. This is the systems work that any engineering team needs to be doing, said Nadella, and itโ€™s where teams will see the greatest productivity gains.

AI is an 'existential priority'

Ultimately, Nadella described AI as "a pretty existential priority" and said its success "will come down to developers being able to go at it fearlessly."

The world requires productivity gains in every function, from healthcare, to retail, to broad knowledge work, he said, and AI has the promise to deliver that. "Software in this new form of AI is the most malleable resource we have to solve hard problems."

Still, itโ€™s not enough that the technology is just there, he noted; it also requires a management and cultural change. Case in point: Electricity was around for 50 years before people figured out that it could revolutionize factories.

"Just thinking of this as the horseless carriage is not going to be the way we are going to get to the other side," Nadella said. "So it's not just tech. Tech has got to progress. You've got to put that into systems that actually deliver the new work, work artifact and workflow."

Databricks: Everything's going towards open source

Ali Ghodsi, Databricks co-founder and CEO, also sat down with Zuckerberg at LlamaCon; one of his big takeaways was that, in the long run, everything is going to move towards open source.

"You get this sort of open research and open sharing," he said. "You get a whole world working on these models, and you just have much, much more rapid progress."

Like Nadella, Ghodsi described a "cross pollination" between different models. People are slicing, mixing and matching, and cobbling different elements together, tasks that would be "completely impossible" without open source.

Zuckerberg added that it's "really gratifying" to move more towards open source, pointing out that just a year ago, Llama was the main major open source model, which, of course, is now no longer the case.

"You have the ability to take the best parts of the intelligence from the different models and produce exactly what you need," he said. "It feels like sort of an unstoppable force."

At the same time, there's a big question in open source around the fact that, while models are open, the data they're trained on is sometimes withheld for any number of reasons, Zuckerberg noted. By contrast, reasoning models are trained using problems with verifiable answers, so if devs are giving them hard math or coding problems, the data is much less proprietary.
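Zuckerberg's point about verifiable answers is concrete in training pipelines: the output of a math or coding problem can be checked mechanically, so the reward signal needs no proprietary labels. A hypothetical sketch of such a checker (the convention that the model's final number is its answer is an assumption for illustration):

```python
import re

def final_answer(completion):
    """Pull the last number out of a model's reasoning trace,
    assuming (hypothetically) the trace ends with its numeric answer."""
    nums = re.findall(r"-?\d+(?:\.\d+)?", completion)
    return float(nums[-1]) if nums else None

def verifiable_reward(completion, expected):
    """Return 1.0 if the final answer matches the known-correct
    result, else 0.0: a checkable signal that requires only the
    problem and its answer, not any proprietary training data."""
    return 1.0 if final_answer(completion) == expected else 0.0
```

Because the check is deterministic, anyone with the same problem set can reproduce the signal, which is what makes this kind of training data "much less proprietary" than curated human labels.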

This is an interesting area to explore, he said: new types of open source releases that aren't just the weights of the model, but also some of its reasoning traces.

This kind of ongoing exploration led Ghodsi to conclude, "It's Day Zero of the AI era. The most amazing applications are yet to be invented. This is a complete white space."

The โ€œdata advantageโ€ will be the underpinning of success, Ghodsi said. Builders need to collect the right data, refine it, pass it through models โ€” and repeat โ€” to create a โ€˜flywheelโ€™ or network effect, continually iterating on and improving products with use.

He urged devs: "Try it out, use it, explore those crazy new ideas. It's going to be exciting times ahead."