
From Copilot to Canonical: How Ubuntu Plans to Add AI Without Taking Over Your PC

Jon Seager’s roadmap brings agentic AI to Ubuntu through inference snaps and background enhancements, while vowing not to hard‑wire AI into the OS or shove it at unwilling users.

It was bound to happen sooner or later. Built-in AI is coming to mainstream desktop Linux distros. Jon Seager, engineering VP at Canonical, has just announced something of a Five-Year Plan for AI in Ubuntu — which is really a first Six-Month Plan. They intend to include AI all over the place, but in a measured and deliberate way.

“We are not setting shallow metrics on token usage, or percentages of code written with AI, but rather incentivizing engineers to experiment and understand where AI tools add value,” Seager said. “Rather than force a single early-choice AI stack, we’re incentivizing teams to each pick ‘something different’ and go deep, so we learn more as an org in the next six months.”

He also said that no Ubuntu workers will ever be replaced by AI. No matter which way you look at how that cookie crumbles, if you’re replaced at Ubuntu, it’ll be by another human being.

“I will not be measuring people at Canonical by how much they use AI, but rather continue to measure them on how well they deliver,” he said. “AI is not going to take software engineering jobs at Canonical, but other software engineers who are highly competent with AI tools certainly could. Using AI for its own sake is not a constructive goal for anything but increasing exposure, and it rarely yields good results in production code.”

That’s certainly a mouthful worth pondering.

Again, this all seems like destiny at work. Microsoft’s already turning virtually every piece of software it has — including Windows — over to Copilot, and even Red Hat has already built agentic capabilities into RHEL. On the desktop, Fedora’s talking about integrating AI — based on IBM’s open‑source Granite models — to serve a variety of purposes, mostly related to developers.

“The bottom line is that Canonical is ramping up its use of AI tools in a focused and principled manner that favors open weight models with license terms that feel most compatible with our values, combined with open source harnesses,” Seager explained. “AI features will be landing in Ubuntu throughout the next year as we feel that they’re of sufficient maturity and quality, with a bias toward local inference by default.”

I’ll leave it up to you to decide how “principled” they’re being. So far, its plans look OK to me, and they’re even making room for those who’d rather see Ubuntu remain AI-free. I’m waiting to see what it actually looks like when implemented. As they might say on the Isle of Man, the proof of the pudding…

Two Faces Have AI

According to Seager, the way Ubuntu sees it right now, AI features will show up in two forms. First we're going to see AI being used in the background — mostly invisible, evidently — as a way of enhancing OS functionality. After that will come what Seager called "'AI-native' features and workflows … for those who want them."

That last part is important, because this is the realm where AI could get in the way (or worse), so choice becomes important.


Later in the article, Seager seems to put names to these two kinds of features: implicit and explicit. The former he said “will improve what Ubuntu already does,” while the latter “will be introduced as new features.”


The way Seager sees it, implicit features use AI to enhance existing operating systems, working mostly under the hood and unseen by users. He cites as an example the use of AI to bring quality speech-to-text and text-to-speech functions to the distro.

“I don’t see these as ‘AI features,'” he said. “I see them as critical accessibility features that can be dramatically improved through the adoption of LLMs with minimal (if any) drawbacks. Much of this can be achieved with local inference using open source harnesses and open weight models, which are both accurate and efficient for this use case.”

The “explicit” features, which will come later if I’m understanding Seager’s timeline, largely include stuff pulled from the agentic “gee-whiz, this is obviously AI” grab bag.

“This could be for authoring new documents or applications, automating troubleshooting workflows or even personal automation tasks such as targeted daily news briefings,” he said. “With this comes a big responsibility for us to ensure that the relevant security and confinement controls are in place to prevent unwanted side-effects.”

AI Implementation

While Microsoft is taking what many users describe as a heavy-handed approach that forces Copilot on users, Ubuntu will be treading more gently, making AI a yes-or-no choice for users — which is only fitting for an open source platform. At least to start, the use of AI will be opt-in, meaning you'll have to explicitly enable it before it runs.

Even if down the road it becomes opt-out, users won't have to worry about AI secretly running in the background, doing whatever ungodly stuff you might imagine AI can do when no one's watching, because it will be easy to completely remove all AI from the system.

That's because instead of being integrated into Ubuntu's code, the AI software will run as inference snaps, which package local models and runtimes and run in a sandbox. If you want your Ubuntu installation to be completely AI-free, all you'll have to do is remove the snaps.

"It's easier to snap install nemotron-3-nano than juggle Ollama, Hugging Face and a sea of model quantisations, and the snap will give you the optimized bits for your particular silicon if that silicon company has contributed them," Seager said.
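As a rough sketch of what that workflow might look like, here are the standard `snap` commands involved. The snap name comes from Seager's own example; whether and when it will actually be published, and under what name, is not confirmed:

```shell
# Install a local inference model and runtime as a sandboxed snap
# (name from Seager's example; availability is hypothetical)
snap install nemotron-3-nano

# See which snaps — AI-related or otherwise — are on the system
snap list

# Return the system to a fully AI-free state by removing the snap,
# including its saved data
snap remove --purge nemotron-3-nano
```

Because snaps are confined and self-contained, removal takes the models and runtimes with it — no stray AI components left behind in the base system.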

This means the next step for inference snaps will be scaling — making certain that AI can take advantage of the latest and greatest AI-optimized silicon when it's available, without overtaxing GPU-constrained hardware.


“We’ll be ramping up our teams to make sure we keep up with the latest model releases, and increasing the number of optimised variants for as many silicon platforms as possible,” he said.
