Red Hat Enterprise Linux AI

Red Hat® Enterprise Linux® AI is a platform for running large language models (LLMs) in individual server environments. The solution includes Red Hat AI Inference Server, delivering fast, cost-effective inference across the hybrid cloud by maximizing throughput and minimizing latency.

What is Red Hat Enterprise Linux AI?

Red Hat Enterprise Linux AI brings together:

  • Red Hat AI Inference Server, which provides fast and consistent model deployment across the hybrid cloud with broad hardware support.
  • A bootable image of Red Hat Enterprise Linux, including popular AI libraries such as PyTorch and hardware-optimized inference for NVIDIA, Intel, and AMD.
  • Enterprise-grade technical support and Open Source Assurance legal protections.

Fast, flexible inference

When it comes to implementing generative AI, you have 2 choices: adapt your strategy to fit a pre-built product, or engineer a custom solution that aligns directly with your business goals.

Rooted in open source, Red Hat Enterprise Linux AI provides safeguarded, reliable infrastructure to run any model, on any hardware, with any accelerator, across the hybrid cloud. The solution includes Red Hat AI Inference Server, which provides an immutable, purpose-built appliance for inference. Because the operating system and Red Hat AI Inference Server are packaged together, you can begin serving models immediately. With the integrated vLLM runtime and LLM Compressor, you can maximize throughput and minimize latency for fast, cost-effective model deployments.
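Because vLLM exposes an OpenAI-compatible HTTP API, clients interact with a served model through ordinary JSON requests. The sketch below builds such a request body; the endpoint URL and model name are placeholders for your own deployment, and the actual POST is left as a comment so the snippet runs without a live server.

```python
import json

# Hypothetical deployment values -- substitute your server's address and the
# model name you registered with your inference server.
ENDPOINT = "http://localhost:8000/v1/chat/completions"  # OpenAI-compatible route
MODEL = "my-model"

def build_chat_request(prompt: str, max_tokens: int = 128) -> str:
    """Build the JSON body for an OpenAI-compatible chat completion call."""
    body = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }
    return json.dumps(body)

payload = build_chat_request("Summarize this quarter's support tickets.")
# POST `payload` to ENDPOINT with Content-Type: application/json (for example
# via urllib.request or curl) once the inference server is running.
print(payload)
```

Because the wire format matches the OpenAI API, existing client libraries and tooling can usually be pointed at the server with only a base-URL change.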

Features and benefits

LLMs for the enterprise

Tune smaller, purpose-built models with your own data using methods like retrieval-augmented generation (RAG).
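The idea behind RAG is to retrieve relevant passages from your own data and place them in the prompt, so the model answers from your content rather than only its training data. The sketch below is a minimal illustration: the retriever scores documents by naive word overlap, whereas a production setup would use an embedding model and a vector database.

```python
import re

def tokenize(text: str) -> set[str]:
    """Lowercase and split text into a set of alphanumeric tokens."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def retrieve(query: str, documents: list[str], k: int = 1) -> list[str]:
    """Return the k documents sharing the most words with the query."""
    q = tokenize(query)
    ranked = sorted(documents, key=lambda d: len(q & tokenize(d)), reverse=True)
    return ranked[:k]

def build_prompt(query: str, documents: list[str]) -> str:
    """Place the retrieved context ahead of the question in the model prompt."""
    context = "\n".join(retrieve(query, documents))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

docs = [
    "Refund requests must be filed within 30 days of purchase.",
    "Our headquarters relocated to Raleigh in 2020.",
]
prompt = build_prompt("How many days do customers have to request a refund?", docs)
print(prompt)
```

The assembled prompt carries the refund policy into the context block, which is what lets a smaller model answer accurately about data it was never trained on.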

Increased efficiency

Powered by vLLM, Red Hat AI Inference Server increases efficiency without sacrificing performance.

Cloud-native scalability

Red Hat Enterprise Linux image mode lets you manage your AI platform as a container image, streamlining your approach to scaling.
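Under image mode, the operating system itself is built, versioned, and rolled out like any other container image. A hypothetical Containerfile might look like the following; the base image tag, package name, and file paths are illustrative, not taken from this page.

```dockerfile
# Hypothetical image mode (bootc) Containerfile sketch.
# Check your entitled registry for the exact base image reference.
FROM registry.redhat.io/rhel9/rhel-bootc:latest

# Layer site-specific packages and configuration on top of the base OS.
RUN dnf -y install nvidia-driver && dnf clean all
COPY serving-config.yaml /etc/my-inference/config.yaml
```

The resulting image can be built and pushed with standard container tooling (for example, podman), which is what lets the same registry, CI, and rollout workflows used for applications also manage the AI platform.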

Red Hat AI use cases

Generative AI

Produce new content, like text and software code.

Red Hat AI lets you run the generative AI models of your choice faster, with fewer resources and at lower inference cost.

Predictive AI

Connect patterns and forecast future outcomes.

With Red Hat AI, organizations can build, train, serve and monitor predictive models, all while maintaining consistency across the hybrid cloud.

Operationalized AI

Create systems that support the maintenance and deployment of AI at scale.

With Red Hat AI, manage and monitor the lifecycle of AI-enabled applications while saving on resources and ensuring compliance with privacy regulations.

Agentic AI

Build workflows that perform complex tasks with limited supervision.

Red Hat AI provides a flexible approach and stable foundation for building, managing and deploying agentic AI workflows within existing applications.

Deploy with partners

Artificial intelligence (AI) model training requires optimized hardware and powerful computation capabilities. Get more from Red Hat Enterprise Linux AI by extending it with other integrated services and products.

AI customer stories from Red Hat Summit and AnsibleFest 2025

Turkish Airlines doubled the speed of deployment times with organization-wide data access.

JCCM improved the region's environmental impact assessment (EIA) processes using AI.

DenizBank sped up time to market from days to minutes.

Hitachi operationalized AI across its entire business with Red Hat OpenShift AI.

Frequently asked questions

What is the difference between Red Hat Enterprise Linux AI and Red Hat Enterprise Linux?

Red Hat Enterprise Linux AI is a foundation model platform for running LLMs in individual server environments. The solution includes the Red Hat AI Inference Server, delivering fast, cost-effective hybrid cloud inference by maximizing throughput, minimizing latency, and reducing compute costs.

Red Hat Enterprise Linux is an open source Linux distribution developed by Red Hat for the commercial market, providing a flexible and stable foundation to support hybrid cloud innovation.

Red Hat Enterprise Linux AI is delivered as a Red Hat Enterprise Linux bootable image that includes AI libraries and Granite models.

Do I need to buy Red Hat Enterprise Linux to use Red Hat Enterprise Linux AI?

No, a Red Hat Enterprise Linux AI license is sufficient and includes all of the components needed.

What’s included in Red Hat Enterprise Linux AI?

Red Hat Enterprise Linux AI is delivered as a bootable Red Hat Enterprise Linux container image that includes Red Hat AI Inference Server, popular AI libraries such as PyTorch, and hardware-optimized inference support for NVIDIA, Intel, and AMD accelerators.

What’s the difference between Red Hat Enterprise Linux AI and Red Hat OpenShift AI?

Red Hat Enterprise Linux AI provides out-of-the-box large language models in a single server. The solution includes Red Hat AI Inference Server, which provides an immutable, purpose-built appliance for inference. By packaging the operating system (OS) and application together, Red Hat Enterprise Linux AI facilitates day-one operations for AI inference across the hybrid cloud by maximizing throughput, minimizing latency, and reducing compute costs.

Red Hat OpenShift® AI provides all of the tools needed to help customers build AI-enabled applications at scale. Red Hat OpenShift AI offers a comprehensive, integrated MLOps platform to help manage the lifecycle of models, ensuring support for distributed compute, collaboration workflows, monitoring, and hybrid-cloud applications.

Red Hat OpenShift AI includes access to Red Hat Enterprise Linux AI, so teams can use the same models and alignment tools in their OpenShift AI architecture as well as benefit from additional enterprise MLOps capabilities.

How is Red Hat Enterprise Linux AI priced?

The Red Hat Enterprise Linux AI license is priced per accelerator.
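In practical terms, per-accelerator pricing means the subscription count tracks the total number of accelerators across the fleet, not the number of servers. A small sketch with made-up fleet sizes:

```python
# Per-accelerator licensing: subscriptions scale with total accelerators,
# not with server count. All figures here are hypothetical placeholders.

def subscriptions_needed(gpus_per_server: list[int]) -> int:
    """Total accelerator subscriptions for a fleet, given GPUs per server."""
    return sum(gpus_per_server)

# e.g. two 4-GPU servers and one 8-GPU server
fleet = [4, 4, 8]
print(subscriptions_needed(fleet))  # 16 accelerator subscriptions
```

Consult your Red Hat account team for actual pricing tiers; the point of the sketch is only that three servers with 16 accelerators total need 16 subscriptions, not 3.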

Explore more AI resources

4 considerations for choosing the right AI model

Maximize AI innovation with open source models

4 reasons to use open source small language models

Unlock generative AI innovation with Red Hat AI

Contact Sales

Talk to a Red Hatter

Reach out to our sales team below for Red Hat Enterprise Linux AI pricing information.
To learn more about our partnerships, visit our catalog page.