Perhaps the biggest thing since open source or Google, LLMs may have companies fighting for supremacy, but it's the developers who come out ahead.
Two months ago, Amazon didn't make a single mention of AI on its earnings call (Google and Microsoft mentioned AI dozens of times each). This past week, by contrast, the company's cloud division, Amazon Web Services (AWS), could talk about little else. As announced by Swami Sivasubramanian, vice president of database, analytics, and machine learning at AWS, the company is all over AI with the launch of new large language models (LLMs) and APIs to access them, as well as CodeWhisperer, a GitHub Copilot competitor, and more.
It's not that AWS wasn't working on AI before; Amazon has been working with AI for decades. Rather, it's now impossible to ignore AI. For developers, I've recently argued, the time is now to start learning how to put LLMs to work for you in your code development. AWS, never one to chase competitors, has decided it can't remain silent when everyone else is talking about the power of LLMs and AI to transform software development.
Just in time, too, as RedMonk's James Governor argues that OpenAI is the new AWS. To the folks at Jina, OpenAI is the new Google. Either way, it's big, with the potential to dramatically change how the clouds compete.
OpenAI as the new AWS
When Governor calls OpenAI "the new AWS," he's not suggesting that OpenAI, the company behind ChatGPT, will be rolling out its own version of Amazon EC2 or Amazon S3 anytime soon. Rather, he's talking about the impact LLMs can have on software development. Inspired by a discussion Governor and I had recently over lunch, I wrote about this, suggesting that, "The race is on for developers to learn how to query LLMs to build and test code but also to learn how to train LLMs with context (like code samples) to get the best possible outputs."
For Governor, past revolutions in developer productivity were launched by "AWS, open source, and GitHub" because "all of that stuff came together to help people learn and build." With LLMs, he continues, "We're at that point again." LLMs lower barriers to developer productivity much like open source (no need to get purchasing's approval for a software license) and cloud (swipe a credit card rather than ask to requisition a server). In this case, Governor says it's not about reducing time to gain access to software or hardware, or about collaboration (GitHub), but rather about dramatically reducing the time to learn. As he stresses, "AI makes it easier than ever to learn new skill sets."
Back to AWS. One reason for the bevy of announcements this past week is that Microsoft, not AWS, has been at the forefront of enabling developer productivity with AI. Years ago, Microsoft bought GitHub, but before that it developed Visual Studio and Visual Studio Code, among the most widely used IDEs and code editors. Together, that's a powerful one-two punch. Add OpenAI's ChatGPT, which Microsoft has built into Bing, Copilot, and other Microsoft services, and Microsoft is now in pole position to earn developer loyalty.
As Governor puts it, given that any developer using ChatGPT is running on Azure, "What about a toolchain that eliminated Azure as a gating factor for developers building apps in GitHub and [Visual Studio] Code?" In other words, could Microsoft refocus developers away from the underlying cloud infrastructure onto the applications being built with ChatGPT? It could. Microsoft has done well with Azure, but it's still catching up to AWS. By elevating the application experience and removing the "undifferentiated heavy lifting" of even thinking about the underlying cloud infrastructure, "Microsoft has the opportunity to create a once and future developer experience which finally and properly brings the pain to AWS," to borrow Governor's phrase.
My InfoWorld colleague David Linthicum correctly contends that "'cost savings' are a terrible way to define the value of cloud-based platforms." Instead, he posits, cloud is "about delivering the more critical business values of agility and speed to innovation." Nowhere is that more true than in this greenfield area of AI. The way the clouds surface that developer agility, rather than forcing developers to continue to muddle through mountains of different infrastructure services, will determine who wins the next $100 billion in cloud spend.
As the cloud companies contend for that spend, the biggest winners of all will be developers. Game on.