
Red Hat Introduces Red Hat AI 3

Red Hat, a provider of open source solutions, has announced Red Hat AI 3, an evolution of its enterprise AI platform.
E3 Magazine
November 27, 2025

By integrating the latest innovations from Red Hat AI Inference Server, Red Hat Enterprise Linux AI (RHEL AI), and Red Hat OpenShift AI, the platform can help simplify the complexity of high-performance AI inference at scale.

As companies move beyond experimentation with AI, they can face significant challenges, such as data privacy, cost control, and managing diverse models. The study “The GenAI Divide: State of AI in Business” by the NANDA project at the Massachusetts Institute of Technology highlights the reality of AI in production: approximately 95 percent of organizations fail to achieve quantifiable financial returns on an estimated 40 billion USD in enterprise investment.

Red Hat AI 3 focuses on directly addressing these challenges by providing a more consistent and unified experience for CIOs and IT leaders to maximize their investments in computing acceleration technologies. It enables rapid scaling and distribution of AI workloads across hybrid and multi-vendor environments, while improving collaboration between teams on next-generation AI workloads, such as agents, all on the same common platform.

What is LLM-D?

Red Hat AI 3 introduces the general availability of LLM-D, an open source framework that helps scale LLMs on Kubernetes. LLM-D enables intelligent distributed inference and combines with key open source technologies such as the Kubernetes Gateway API Inference Extension, the Nvidia Inference Transfer Library (NIXL), and the DeepEP Mixture of Experts (MoE) communication library. LLM-D builds on vLLM, transforming it from a high-performance single-node inference engine into a distributed, consistent, and scalable service system.
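For orientation, the single-node engine that LLM-D distributes can be driven from a few lines of Python. The sketch below is illustrative only, using the public vLLM API; the model name is an assumption chosen for the example, not something specified by Red Hat.

```python
# Minimal single-node vLLM sketch (illustrative only); LLM-D layers
# Kubernetes-native routing and distributed serving on top of this engine.
from vllm import LLM, SamplingParams

# Model name is an assumption for illustration; any compatible model works.
llm = LLM(model="meta-llama/Llama-3.1-8B-Instruct")

params = SamplingParams(temperature=0.7, max_tokens=128)
outputs = llm.generate(["Summarize what distributed inference means."], params)

for output in outputs:
    print(output.outputs[0].text)
```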

"As companies scale AI from experimentation to production, they face a new wave of challenges related to complexity, cost, and control."

Joe Fernandes,
Vice President and General Manager,
AI Business Unit,
Red Hat

"As companies scale AI from experimentation to production, they face a new wave of challenges in terms of complexity, cost, and control. With Red Hat AI 3, we offer an enterprise-grade, open source platform that minimizes these obstacles. By incorporating new capabilities such as distributed inference with LLM-D and a foundation for agentic AI, we are empowering IT teams to operationalize next-generation AI with greater confidence, on their own terms and on any infrastructure."

Next-generation AI agents

AI agents are poised to transform how applications are built, and their complex, autonomous workflows will place heavy demands on inference capabilities. To assist in accelerating agent creation and deployment, Red Hat has introduced a unified API layer based on Llama Stack, which helps align development with industry standards such as OpenAI-compatible LLM interface protocols. Furthermore, to promote a more open and interoperable ecosystem, Red Hat is an early adopter of the Model Context Protocol (MCP), a powerful and emerging standard that streamlines how AI models interact with external tools, a critical feature for modern AI agents.
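Because the API layer follows OpenAI-compatible interface protocols, existing OpenAI-style client code can usually be pointed at such an endpoint by changing only the base URL. The sketch below is a hedged illustration; the endpoint address, API key handling, and model name are assumptions for the example, not documented Red Hat AI 3 values.

```python
# Hedged sketch: calling an OpenAI-compatible inference endpoint.
# The base_url, api_key, and model name below are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # hypothetical local endpoint
    api_key="not-needed-for-local",       # placeholder credential
)

response = client.chat.completions.create(
    model="granite-3-8b-instruct",  # illustrative model name
    messages=[{"role": "user", "content": "List three uses of AI agents."}],
)
print(response.choices[0].message.content)
```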

Model customization

Red Hat AI 3 introduces a new modular and extensible toolkit for model customization, built on existing InstructLab functionality. It provides specialized Python libraries that give developers greater flexibility and control. The toolkit is powered by open source projects such as Docling for data processing, which streamlines the ingestion of unstructured documents into an AI-readable format. It also includes a flexible framework for synthetic data generation and a training hub for LLM fine-tuning. The integrated evaluation hub helps AI engineers monitor and validate results, enabling them to confidently leverage their proprietary data for more accurate and relevant AI outcomes.
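As a rough illustration of the ingestion step, the open source Docling library converts unstructured documents into a format a model pipeline can consume. The sketch below uses the public Docling Python API; the input path and the downstream use of the Markdown output are placeholders, not part of the Red Hat toolkit documentation.

```python
# Hedged sketch of the document-ingestion step referenced above.
# The input path is a placeholder; output handling is illustrative.
from docling.document_converter import DocumentConverter

converter = DocumentConverter()
result = converter.convert("reports/quarterly_review.pdf")  # placeholder path

# Export the parsed document to Markdown so it could feed a fine-tuning or
# synthetic data generation pipeline downstream.
markdown_text = result.document.export_to_markdown()
print(markdown_text[:500])
```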

Source: Red Hat

