Red Hat’s new deal will see RHEL AI 1.2 shipping preinstalled on a $29,000 AI-focused server from Lenovo.
It’s pretty much impossible to do anything these days without being prompted to accept a little help from AI. It appears that’s true for enterprises as well, since Lenovo’s now offering GenAI preinstalled on one of its server lines.
We learned about this from Red Hat, which on Wednesday announced a collaboration with Lenovo that will result in Red Hat Enterprise Linux AI being preinstalled on Lenovo ThinkSystem SR675 V3 servers. We also learned Wednesday that Red Hat has released RHEL AI 1.2 as the latest and greatest version of its GenAI platform.
We’ll talk about Lenovo first.
To be sure, the line of servers involved in this joint project isn’t just any line of servers. This is a line specifically designed for companies looking for a lot of GPU availability to power a lot of AI. The hardware also isn’t cheap; it resides in the “if you have to ask how much it is, you can’t afford it” category. According to Lenovo’s website, just one of these babies will set you back $29,154.06, and that’s a “starting at” price. As they used to say: Nice work if you can get it.
“Our collaboration with Red Hat is a powerful force driving innovation in AI,” Brian Connors, VP and GM of Lenovo’s ESMB and AI Business Segment said in a statement. “By combining Lenovo’s industry-leading hardware with Red Hat’s cutting-edge software, we are delivering comprehensive solutions that empower businesses to harness the full potential of these transformative technologies.”
What’s in this for Lenovo, of course, is the value-add of world-class AI software to help sweeten $29K-a-pop deals. What’s in it for Red Hat? I sent an email and asked, and here’s the answer:
“RHEL AI is presented as the default choice to customers. Once a customer confirms RHEL AI as the supported AI platform (they have the option to ‘deselect’ if they choose), the price of the RHEL AI subscription is added onto the price of the Lenovo ThinkSystem server. The RHEL AI subscription comes with premium support as well. From there, the Lenovo factory loads RHEL AI into the server as part of the manufacturing process/order fulfillment.”
What Red Hat Brings to the Table
While it’s kind of fun to poke at Red Hat and Lenovo over a 30 grand price tag, there’s actually a reason for the cost. As Noel Coward once had a character say in a play, small economies never pay. If you want to invest in AI, and you want that investment to pay off, you can’t do it on the cheap.
For one thing, you’re going to need a lot of world-class hardware, with an abundance of GPUs to do the heavy lifting. You’re also going to need some kind of ready-made AI framework that’s relatively easy to use and ships with a lot of versatility. The versatility’s necessary because there are a lot of different ways to use AI, and the last thing you want to hear after you’ve invested a fortune making your company AI-capable is “our system can’t do that.”
For an open source company, Red Hat isn’t cheap, but it has decades of experience dealing with enterprises, which means it speaks the lingo and can anticipate what enterprises want and need. Just as important, it has spent those decades building stable, dependable platforms for meeting those needs at scale.
On the AI side, its deep ties to IBM give it moxie that most of its competitors lack. IBM helped pioneer AI with its Watson program (remember, back in 2011 Watson played Jeopardy and won), and RHEL AI harnesses IBM’s Granite family of open source large language models, as well as InstructLab model alignment tools, which were also developed by IBM.
“For enterprises to take full advantage of AI’s rapid evolutionary pace, they need greater levels of agility and efficiency to both identify and implement AI strategies successfully,” Joe Fernandes, VP and GM of Red Hat’s AI Business Unit said in a statement. “Through our collaboration with Lenovo, Red Hat is making it easier for customers to seize the AI opportunity by combining the power of RHEL AI with Lenovo’s industry leading servers.”
Lenovo’s Part
Lenovo has a long history with IBM. Those with a long memory will remember that ThinkPad, Lenovo’s popular line of business-focused laptops, was originally an IBM product; Lenovo acquired it in 2005 when it purchased Big Blue’s entire personal computer business.
That purchase also added to Lenovo’s manufacturing prowess, according to Steve Hamm, who quoted Lenovo founder Liu Chuanzhi in 2008:
“We benefited in three ways from the IBM acquisition. We got the ThinkPad brand, IBM’s more advanced PC manufacturing technology, and the company’s international resources, such as its global sales channels and operation teams.”
In this case, that hardware expertise goes beyond mere manufacturing and includes the support Lenovo can offer to help companies make the right hardware choices and get the most out of their infrastructure.
“Lenovo Consulting Service offers comprehensive support to optimize and implement Red Hat solutions,” Red Hat said in its announcement of the Lenovo deal. “These services provide expertise in deployment, integration, and management, helping businesses enhance their IT infrastructure’s performance and scalability while reducing complexity and costs.”
RHEL AI 1.2
Again, the Lenovo announcement came on the same day that Red Hat released RHEL AI 1.2, less than a month and a half after the release of RHEL AI 1.1.
According to Tushar Katarki, Red Hat’s senior director of product and GenAI foundation model platforms, the new version’s enhancements allow “organizations to more efficiently fine-tune and deploy LLMs using private, confidential and sovereign data to better align to enterprise use cases.”
Katarki added that the improvements in version 1.2 “now support a wider range of hardware accelerators, including the newly introduced AMD Instinct accelerators. We intend to continue expanding our hardware accelerator support with partners like Intel in upcoming releases.”
In addition, he offered this list of what he considers to be the “key highlights of RHEL AI 1.2”:
Support for Lenovo ThinkSystem SR675 V3 servers: RHEL AI 1.2 is now supported on Lenovo ThinkSystem SR675 V3 servers with hardware accelerators. Users can also take advantage of factory preload options for RHEL AI on these servers, making deployment faster and easier.
Support for AMD Instinct Accelerators (technology preview): Language models require powerful computing resources, and RHEL AI now supports AMD Instinct accelerators with the full ROCm software stack, including drivers, libraries and runtimes. With RHEL AI 1.2, organizations can leverage AMD Instinct MI300X GPUs for both training and inference, and AMD Instinct MI210 GPUs for inference tasks.
Availability on Azure and GCP (technology preview): RHEL AI is now available on Azure and Google Cloud Platform (GCP). With this, users can download RHEL AI from Red Hat, bring it to Azure or GCP, and create RHEL AI-based GPU instances.
Training checkpoint and resume: Long training runs during model fine-tuning can now be saved at regular intervals, thanks to periodic checkpointing. This feature allows InstructLab users to resume training from the last saved checkpoint instead of starting over, saving valuable time and computational resources. (A sketch of the general checkpoint-and-resume pattern appears after this list.)
Auto-detection of hardware accelerators: The ilab CLI can now automatically detect the type of hardware accelerator in use and configure the InstructLab pipeline accordingly for optimal performance, reducing the manual setup required.
Enhanced training with PyTorch FSDP (technology preview): For multi-phase training of models with synthetic data, ilab train now uses PyTorch Fully Sharded Data Parallel (FSDP) by default. This dramatically reduces training times by sharding a model’s parameters, gradients and optimizer states across data parallel workers (e.g., GPUs). Training times are cut almost linearly: if training takes X hours with one accelerator, using N accelerators can reduce it to X/N hours. (See the FSDP sketch below.)
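To make the checkpoint-and-resume idea concrete, here’s a minimal sketch of the generic pattern in PyTorch. It’s illustrative only, not InstructLab’s actual implementation; the toy model, checkpoint file name and save interval are hypothetical.

```python
# Generic checkpoint-and-resume pattern (not InstructLab's actual code).
# The model, checkpoint path and interval below are hypothetical stand-ins.
import os
import torch
from torch import nn

CKPT_PATH = "checkpoint.pt"   # hypothetical checkpoint file
SAVE_EVERY = 100              # steps between checkpoints

model = nn.Linear(512, 512)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
start_step = 0

# Resume from the last saved checkpoint if one exists.
if os.path.exists(CKPT_PATH):
    ckpt = torch.load(CKPT_PATH)
    model.load_state_dict(ckpt["model"])
    optimizer.load_state_dict(ckpt["optimizer"])
    start_step = ckpt["step"] + 1

for step in range(start_step, 1000):
    loss = model(torch.randn(16, 512)).pow(2).mean()
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

    # Periodically persist model, optimizer and progress so a crash
    # doesn't force the whole run to start over.
    if step % SAVE_EVERY == 0:
        torch.save({"model": model.state_dict(),
                    "optimizer": optimizer.state_dict(),
                    "step": step}, CKPT_PATH)
```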
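Likewise, here’s a minimal sketch of what wrapping a model in PyTorch FSDP looks like. It’s a toy example under my own assumptions, not RHEL AI’s or ilab’s training code; the model and hyperparameters are made up.

```python
# Minimal PyTorch FSDP sketch: shard parameters, gradients and optimizer
# state across data parallel workers (one process per GPU).
# Illustrative only; the model and hyperparameters are hypothetical.
import torch
import torch.distributed as dist
from torch import nn
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP

def main():
    # torchrun sets RANK, WORLD_SIZE and LOCAL_RANK for each process.
    dist.init_process_group("nccl")
    local_rank = dist.get_rank() % torch.cuda.device_count()
    torch.cuda.set_device(local_rank)

    model = nn.Sequential(nn.Linear(1024, 4096), nn.ReLU(), nn.Linear(4096, 1024))

    # FSDP shards the model state, so each GPU holds only a fraction of it.
    model = FSDP(model.cuda())
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

    for _ in range(10):
        batch = torch.randn(8, 1024, device="cuda")
        loss = model(batch).pow(2).mean()
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```

Launched with something like torchrun --nproc_per_node=<num_gpus> fsdp_sketch.py (a hypothetical file name), each process holds only its shard of the parameters, gradients and optimizer state, which is what lets training time scale down roughly linearly with the number of accelerators.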
Red Hat says that anybody who’s using RHEL AI 1.1 needs to upgrade ASAP:
“With the introduction of RHEL AI 1.2, we will be deprecating support for RHEL AI 1.1 in 30 days. Please ensure your systems are upgraded to RHEL AI 1.2 to continue receiving support.”
Christine Hall has been a journalist since 1971. In 2001, she began writing a weekly consumer computer column and started covering Linux and FOSS in 2002 after making the switch to GNU/Linux. Follow her on Twitter: @BrideOfLinux