Press "Enter" to skip to content

Why Acorn Labs Turned on a Dime and Shifted From Kubernetes to AI

A few weeks ago the Kubernetes-helper SaaS platform Acorn Runtime was happily in beta, and set to go GA in the near future. Now, the platform is in mothballs and Acorn Labs is betting its future on GPTScript, an open-source scripting language for generative AI.

Acorn Labs booth at KubeCon Europe 2024 in Paris | Source: Acorn Labs

Last October Acorn Labs, then a year-old Silicon Valley-based startup, launched the beta version of what was planned to be its flagship software-as-a-service platform, Acorn Runtime. Even before going live, the project had garnered a community around its development, mainly because the platform was designed to remove much of the complexity of deploying containers.

“Acorn was actually a developer-friendly layer, sitting on top of Kubernetes,” co-founder and Acorn CEO Sheng Liang told me in a recent interview. Liang was previously a co-founder and CEO of another Kubernetes-focused company, Rancher, and after that, president of engineering and innovation at SUSE.

Acorn Labs co-founder and CEO Sheng Liang with co-founder and president Shannon Williams at KubeCon Europe 2024. | Source: Acorn Labs

“Kubernetes wasn’t that developer friendly… still isn’t that developer friendly,” he added. “As a result of that, developers would write their code, maybe package it using Docker or Docker Compose, and then throw it over the wall. Then the ops team had to write the Helm charts or YAML files and then put it in production.”

In other words, it wasn’t efficient or agile, and it added complications to the process of developing and deploying software. What Acorn did was add a layer to take care of time-consuming drudge work so that DevOps teams could spend more time DevOping and less time dotting i’s and crossing t’s.

“It’s a great product,” Liang said of Runtime. “It was getting a fair amount of traction and we were on the path to release it. It was already in beta and the response was good. We were just about to pull the trigger and turn it into a GA product.”

That launch didn’t happen, and now evidently never will. On March 15, Liang posted on Acorn’s website that the young company was mothballing its Kubernetes-based SaaS offering to focus on a new platform called GPTScript, a scripting language that has nothing to do with Kubernetes or containers, and everything to do with generative AI.

Life After Containers

That Acorn focused on Kubernetes — and making Kubernetes easier to use — was pretty much to be expected, since Kubernetes is in the company’s genes. All four of its co-founders were involved with Rancher Labs from the time it began in 2014 until it was acquired by SUSE in 2020, with three of them also being Rancher co-founders. All four also took positions at SUSE after Rancher’s acquisition. Even before that, the four had worked together on cloud-related projects at Citrix before leaving to start Rancher. That this gang of four would decide to abandon Kubernetes and containers was nothing if not counterintuitive.

What happened, according to Liang, was that within months of the doors opening at Acorn, ChatGPT came along and not only eclipsed Kubernetes-based cloud native as IT’s “coming thing,” it also threatened the notion that enterprise infrastructures would automatically be centered on containers in the future.

While it might be expected that a group of container experts would have resisted such a notion and seen this new technology as an invasive threat, the Acorn team saw it as an inevitability, and as an opportunity.

“GPT came out and I think it was kind of dead obvious to everyone that it represented a tremendous amount of growth,” he said. “I think a lot of companies at the time were reflecting upon what they were doing to see if they could be relevant, and we were kind of doing the same thing. The first thing that came across our minds was that it seems like what we were doing in Acorn for cloud native applied to this AI workload as well.”

Those similarities didn’t go to the bone, however. For one thing, AI definitely isn’t the same thing as cloud native.

“AI is a form of computing, but we realized it is actually somewhat different,” he said. “We had AI engineers working at Acorn and we were also talking to other AI engineers. One thing I learned was that AI developers and engineers really didn’t want to have anything to do with containers.”

Liang said that even though the current standard-bearer in the AI space, OpenAI, runs on Kubernetes, the connection with containers is largely hidden from the developers and engineers building AI apps (which these days can run the gamut from help desk chatbots, to “smart” shopping assistants, to facial recognition and navigation technologies).

He said that one reason AI developers aren’t much interested in container technology is that AI engineers have already built solutions to many of the issues that containers and other traditional cloud-native technologies solve. As an example, he pointed to virtual environment support in language runtimes, which he said was “sort of like creating a container in Python.”
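To make that analogy concrete, here is a minimal sketch of the language-level isolation Liang is alluding to, using Python’s standard-library venv module: it creates a self-contained environment whose installed packages don’t touch the system interpreter, which is roughly what a container does for an application’s dependencies. The directory name below is purely illustrative.

```python
# Minimal sketch: container-like isolation using only the Python standard library.
# The environment directory name ("demo-env") is hypothetical and illustrative.
import venv

# Create an isolated environment with its own interpreter and its own pip.
venv.create("demo-env", with_pip=True)

# Packages installed with demo-env's pip stay inside demo-env, leaving the
# system-wide Python untouched -- loosely analogous to a container's isolation.
```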

“So the language people have solved those problems as well, so we’re like, ‘Oh, we really need to take another look,'” he said. “I think that kind of planted the seed in our mind that maybe the stuff we built with Acorn may not be as relevant in the new world as we thought it was going to be.”

It was evidently at about that point that the Acorn folks decided to change direction and bet the farm on relatively new AI technology instead of containers, which for a decade had been the founders’ bread and butter.

As I thought about that after our Zoom meeting, I began to wonder why it had to be one or the other. Why couldn’t they do both: continue to develop the Kubernetes-based SaaS product that was evidently pretty close to being ready for prime time (and which already had customers waiting in line for the box office to open), and at the same time begin work on their entry into generative AI?

Since that was a question I neglected to ask during our talk, I sent Liang an email that basically asked, “Why couldn’t you do both?”

Surprisingly, he answered rather quickly, even though by this time he was in Paris, ahead of KubeCon Europe 2024.

“Despite having spent 18 months developing what we consider to be a great product in Acorn, and receiving extremely encouraging feedback from early users, we discovered that LLM apps are built differently,” he said. “They are designed with natural language prompts, and AI developers prefer to work with source code rather than binary containers. We realized that the Acorn runtime was not the perfect fit for the needs of AI developers. As a result, we developed GPTScript and decided to pivot our business to focus on it.”

That made sense, especially to this Linux user steeped in the mantra, “Do one thing and do it well.”

Actually, after I thought about it, I realized that Liang had already answered my email question in our Zoom conversation, just not as succinctly as in his email reply; instead, the answer was spread across the entirety of our conversation. I just hadn’t done enough reading between the lines.

Welcome to Generative AI

It’s one thing for a group of seasoned software engineers to decide to move into new territory. It’s completely another thing for them to figure out exactly what it is they’re going to do once they’re there. In the end, the Acorn team decided to do for AI the same thing they did for containers: make it easier for developers to work in the area.

“We still had to find an angle to get into AI,” Liang explained, pointing out that although building LLMs and AI-focused chips is pretty straightforward, “when it comes to building platforms and building apps, things are a lot more fragmented.”

“This is where it took us maybe a few more months to really understand that there’s a huge opportunity there, and it’s an opportunity that’s not yet addressed,” he said. “That’s what led us to GPTScript. We realized that there’s a huge opportunity to just make it easier for developers to work with these models, not necessarily by having to write every line of code from scratch, but by writing more prompts, or having a mix of prompts and traditional code. Then we can kind of build a new scripting language that’s based on natural language as opposed to shell scripts or Python scripts. That’s what got us started.”

According to Liang, the key enabling technology harnessed by GPTScript is a feature introduced by OpenAI in GPT-4 Turbo, which lets developers describe functions to connect GPT’s capabilities with external tools and APIs.

“Function Calls is pretty cool,” he said. “We were able to use that to build a new language called GPTScript.”
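For readers unfamiliar with the mechanism Liang is describing, here is a minimal sketch of function calling with the openai Python SDK (v1.x). The get_weather tool and its schema are hypothetical, invented purely for illustration; the point is that the developer describes a function in JSON, and the model responds with a structured request to call it rather than with free-form text.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Describe an external tool the model is allowed to call. "get_weather" is a
# hypothetical function used purely for illustration.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Look up the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "The city name."}
                },
                "required": ["city"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="gpt-4-turbo",
    messages=[{"role": "user", "content": "What's the weather like in Paris?"}],
    tools=tools,
)

message = response.choices[0].message
if message.tool_calls:
    # The model returns a structured function call (name plus JSON arguments)
    # instead of plain text; the application runs the real function and feeds
    # the result back to the model in a follow-up request.
    call = message.tool_calls[0]
    print(call.function.name, call.function.arguments)
else:
    print(message.content)
```

As Liang describes it, GPTScript layers a scripting syntax on top of this primitive, so a script’s plain-English prompts can reach external tools through function descriptions like the one above.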

He said that currently GPTScript works well only with OpenAI models.

“We expect other proprietary and open source models will support it soon as well,” he said, “but right now it still really only works well with OpenAI.”

GPTScript’s Way Forward

On March 15, when Liang announced on Acorn’s website that the company would be dropping its Kubernetes service in favor of GPTScript, just a few weeks after something of a stealth unveiling, he said that the company was already seeing interest in the platform.

“We are particularly excited about the enthusiastic response we received from developers, both in and outside of the AI community. Many developers have told us this is the first time they have been able to leverage LLMs in their applications or projects.”

“The natural next step is to build an open source LLM application stack based on the GPTScript technology,” he added.

At this stage of the game, it would probably be silly to even speculate what that LLM application stack might look like. That’s liable to change quickly, however. After all, who would’ve thought even a month ago that a startup loaded with Kubernetes talent would suddenly shift gears to enter the relatively unexplored territory of generative AI? In other words, I suspect we’ll have a handle on where essential AI tools such as GPTScript will lead the technology sooner than might normally be expected.
