Why “AI factories” are changing the security conversation
As enterprises pour capital into generative AI, the modern data center is being reshaped into what many vendors now call an AI factory: a high-throughput environment built to ingest data, train and fine-tune models, and run inference at scale. These deployments are typically GPU-dense, heavy in east-west traffic, and extremely latency-sensitive. They also attract heightened cyber risk because the workloads often handle valuable intellectual property, proprietary training data, and regulated information.
In that context, Fortinet’s decision to integrate FortiGate VM with NVIDIA BlueField-3 data processing units (DPUs) is aimed at a practical problem AI infrastructure teams are facing: how to enforce robust network security controls without starving AI workloads of CPU cycles or adding bottlenecks to fast-moving data paths.
What Fortinet and NVIDIA are integrating
The announcement centers on running Fortinet’s virtualized next-generation firewall, FortiGate VM, in a way that takes advantage of NVIDIA’s BlueField-3 DPU capabilities. DPUs are purpose-built for offloading infrastructure tasks—networking, security, and storage processing—from host CPUs. In AI clusters where CPUs already coordinate massive GPU pipelines, offloading can help preserve compute for AI jobs while keeping security enforcement close to the traffic.
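To make the offload idea concrete, here is a toy Python sketch that compares host CPU headroom when inline security runs on the host versus on the DPU. All core counts are hypothetical placeholders, not figures published by Fortinet or NVIDIA.

```python
# Illustrative only: hypothetical core counts, not vendor-published figures.
HOST_CORES = 96          # assumed CPU cores on an AI server
SECURITY_CORES = 12      # assumed cores a host-based firewall/inspection stack might consume
PIPELINE_CORES = 16      # assumed cores needed for data loading and GPU orchestration

def host_headroom(offload_to_dpu: bool) -> int:
    """Return CPU cores left for AI pipeline work under each placement."""
    used = PIPELINE_CORES + (0 if offload_to_dpu else SECURITY_CORES)
    return HOST_CORES - used

print("Headroom, security on host:", host_headroom(offload_to_dpu=False))  # 68 cores
print("Headroom, security on DPU: ", host_headroom(offload_to_dpu=True))   # 80 cores
```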
Fortinet positions the integration as a way to accelerate AI factory security by combining:
- FortiGate VM for advanced firewalling and threat prevention in virtualized environments.
- NVIDIA BlueField-3 for high-performance data-path acceleration and isolation via DPU-based processing.
- A model that aligns with modern data center design, where security is increasingly embedded into the fabric rather than bolted on at the perimeter.
Why DPU-accelerated security matters for AI workloads
Traditional security stacks often rely heavily on general-purpose CPU resources. That approach can become expensive and inefficient in AI clusters, where organizations want maximum utilization of every server for model training and inference. By pushing security processing closer to the network layer and offloading work to a DPU, the integration is designed to reduce friction between performance and protection.
This is part of a broader industry trend: as workloads move toward distributed, containerized, and GPU-accelerated architectures, security teams are adopting approaches that emphasize segmentation, zero trust principles, and high-speed inspection—without creating latency or throughput penalties that undermine business outcomes.
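As a rough illustration of why east-west coverage matters, the sketch below classifies a flow by whether both endpoints sit inside the cluster; in an AI factory, most traffic lands in the east-west bucket that perimeter-only inspection never sees. The subnet and addresses are hypothetical examples, not a recommended design.

```python
# Minimal, generic sketch of east-west vs. north-south classification.
from ipaddress import ip_address, ip_network

CLUSTER_NET = ip_network("10.20.0.0/16")  # assumed internal AI-cluster range

def flow_direction(src: str, dst: str) -> str:
    """Label a flow east-west when both endpoints are inside the cluster."""
    inside = [ip_address(src) in CLUSTER_NET, ip_address(dst) in CLUSTER_NET]
    return "east-west" if all(inside) else "north-south"

print(flow_direction("10.20.1.5", "10.20.7.9"))     # east-west (GPU node to storage node)
print(flow_direction("10.20.1.5", "203.0.113.10"))  # north-south (cluster to external service)
```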
Key benefits highlighted by the integration
While exact performance numbers vary by configuration, the strategic intent of running FortiGate VM on BlueField-3 is clear: make security scale in AI-centric data centers, where traffic volumes and lateral-movement risks grow quickly.
- Improved resource efficiency: Offloading infrastructure and security functions can preserve CPU capacity for orchestration and AI pipeline tasks.
- High-throughput security enforcement: AI clusters generate significant east-west traffic; enforcing policy at speed helps maintain predictable performance.
- Stronger isolation: DPU-centric designs can help separate infrastructure functions from tenant workloads, supporting multi-tenant or segmented environments.
- Operational consistency: Virtualized firewalling can be deployed in repeatable patterns across AI pods, racks, or clusters.
How this fits into the economics of AI infrastructure
AI compute is capital intensive. Organizations are investing in GPUs, high-speed interconnects, and specialized networking to reduce training time and increase inference throughput. In that environment, security spending is increasingly evaluated through the lens of total cost of ownership and time-to-value. If security controls consume scarce CPU resources or introduce performance bottlenecks, they can indirectly increase the cost per training run or reduce service quality for production inference.
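A back-of-the-envelope example, using entirely hypothetical figures, shows how that indirect cost materializes: even a modest slowdown from CPU contention gets multiplied by the full cluster's hourly rate.

```python
# Hypothetical numbers for illustration; real costs and slowdowns vary widely by workload.
CLUSTER_COST_PER_HOUR = 400.0        # assumed fully loaded cost of a GPU pod ($/hour)
BASELINE_RUN_HOURS = 72.0            # assumed duration of one training run
SLOWDOWN_FROM_HOST_SECURITY = 0.06   # assumed 6% slowdown when inspection contends for host CPUs

baseline_cost = CLUSTER_COST_PER_HOUR * BASELINE_RUN_HOURS
contended_cost = baseline_cost * (1 + SLOWDOWN_FROM_HOST_SECURITY)

print(f"Baseline cost per run:  ${baseline_cost:,.0f}")   # $28,800
print(f"With host-CPU security: ${contended_cost:,.0f} "
      f"(+${contended_cost - baseline_cost:,.0f})")       # $30,528 (+$1,728)
```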
That’s why infrastructure offload—using DPUs and similar accelerators—has become a prominent theme across the data center market. The Fortinet–NVIDIA integration reflects this shift by positioning security as a function that can be accelerated and embedded into the AI data path rather than treated as an external checkpoint.
What it signals for enterprise security strategy
AI factories are pushing security teams to modernize architecture. Instead of relying solely on north-south inspection, organizations are prioritizing:
- East-west visibility to monitor lateral movement within clusters.
- Micro-segmentation to restrict how training, inference, storage, and management planes communicate (a minimal policy sketch follows this list).
- Automation and policy consistency across hybrid environments, including on-prem AI clusters and cloud GPU instances.
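To make plane-level segmentation concrete, the sketch below encodes a default-deny allow-list between hypothetical training, inference, storage, and management planes. It is a generic illustration of the policy model, not FortiGate configuration syntax.

```python
# Generic micro-segmentation sketch: the plane names and allow-list below are
# hypothetical examples of a default-deny model, not vendor policy syntax.
ALLOWED_FLOWS = {
    ("training", "storage"),      # training jobs read datasets and write checkpoints
    ("inference", "storage"),     # inference services load model artifacts
    ("management", "training"),   # orchestration reaches training nodes
    ("management", "inference"),  # orchestration reaches inference nodes
}

def is_allowed(src_plane: str, dst_plane: str) -> bool:
    """Default-deny: only explicitly listed plane-to-plane flows pass."""
    return (src_plane, dst_plane) in ALLOWED_FLOWS

print(is_allowed("training", "storage"))      # True
print(is_allowed("inference", "management"))  # False: blocked lateral path
```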
Integrations like FortiGate VM with BlueField-3 reinforce the idea that the next phase of data center security will be built around hardware-accelerated, software-defined controls that can keep pace with AI-scale traffic.
Conclusion
Fortinet’s integration of FortiGate VM with NVIDIA BlueField-3 is a timely response to the realities of AI infrastructure: massive throughput demands, tight latency budgets, and elevated risk associated with valuable data and models. By leveraging DPU-based offload, the approach aims to deliver stronger, faster security enforcement while preserving compute for AI workloads. As AI factories become a defining feature of enterprise IT, architectures that unify acceleration and protection will likely play a central role in how organizations scale AI safely and economically.