Saturday, February 22, 2025

DDN Gooses AI Storage Pipelines with Infinia 2.0



(spainter_vfx/Shutterstock)

AI’s insatiable demand for data has exposed a growing problem: storage infrastructure isn’t keeping up. From training foundation models to running real-time inference, AI workloads require high-throughput, low-latency access to vast amounts of data spread across cloud, edge, and on-prem environments. Traditional storage systems have often struggled under the weight of these demands, creating bottlenecks that can drastically delay innovation in the AI space.

Today, DDN unveiled Infinia 2.0, a major update to its AI-focused, software-defined data storage platform designed to eliminate the inefficiencies in AI storage and data management. The company says Infinia 2.0 acts as a unified, intelligent data layer that dynamically optimizes AI workflows.

“Infinia 2.0 is not just an upgrade, it’s a paradigm shift in AI data management,” DDN CEO Alex Bouzari says, emphasizing how Infinia builds on the company’s deep-rooted expertise in HPC storage to power the next generation of AI-driven data services.

A rendering of a large-scale Infinia 2.0 configuration from DDN’s Beyond Artificial virtual event.

As AI adoption grows, the challenges of scale, speed, and efficiency become more apparent. LLMs, generative AI applications, and inference systems require not only massive datasets but the ability to access and process them faster than ever. Traditional storage solutions struggle with performance bottlenecks, making it difficult for GPUs to receive the data they need quickly enough, limiting overall training efficiency. At the same time, organizations must navigate the fragmentation of data across multiple locations, from structured databases to unstructured video and sensor data. Moving data between these environments creates inefficiencies, driving up operational costs and creating latency issues that slow AI applications.

DDN claims Infinia 2.0 solves these challenges by integrating real-time AI data pipelines, dynamic metadata-driven automation, and multi-cloud unification, all optimized specifically for AI workloads. Rather than forcing enterprises to work with disconnected data lakes, Infinia 2.0 introduces a Data Ocean, a unified global view that eliminates redundant copies and enables organizations to process and analyze their data wherever it resides. This is meant to reduce storage sprawl and to allow AI models to search and retrieve relevant data more efficiently using an advanced metadata tagging system. With virtually unlimited metadata capabilities, AI applications can associate vast amounts of metadata with each object, making search and retrieval operations dramatically faster.
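To make the metadata-tagging idea concrete, here is a minimal conceptual sketch of how a metadata index lets an AI application filter objects without scanning the raw data. This is an illustration only; the class, method names, and fields below are invented for the example and do not reflect DDN’s actual API.

```python
# Conceptual sketch (not DDN's API): a metadata index that maps
# object IDs to arbitrary key/value tags, so queries filter on
# tags instead of scanning the underlying raw objects.

class MetadataIndex:
    """Associates object IDs with arbitrary key/value metadata."""

    def __init__(self):
        self._tags = {}  # object_id -> dict of metadata tags

    def put(self, object_id, **metadata):
        self._tags.setdefault(object_id, {}).update(metadata)

    def search(self, **criteria):
        # Return IDs whose metadata matches every criterion.
        return [
            oid for oid, tags in self._tags.items()
            if all(tags.get(k) == v for k, v in criteria.items())
        ]

index = MetadataIndex()
index.put("clip-001", modality="video", label="pedestrian", split="train")
index.put("clip-002", modality="video", label="cyclist", split="train")
index.put("doc-001", modality="text", label="manual", split="eval")

print(index.search(modality="video", split="train"))
```

A production system would back such an index with a distributed key-value store rather than an in-memory dictionary, but the access pattern, filtering on rich per-object tags instead of reading the objects themselves, is the point.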

Infinia 2.0 integrates with frameworks like TensorFlow and PyTorch, which the company says eliminates the need for complex format conversions, allowing AI execution engines to interact with data directly to significantly speed up processing times. The platform is also designed for extreme scalability, supporting deployments that range from a few terabytes to exabytes of storage, making it flexible enough to meet the needs of both startups and enterprise-scale AI operations.

Performance is another area where Infinia 2.0 could be a breakthrough. The platform boasts 100x faster metadata processing, reducing lookup times from over ten milliseconds to less than one. AI pipelines execute 25x faster, while the system can handle up to 600,000 object lists per second, surpassing the limits of even AWS S3. By leveraging these capabilities, DDN asserts that AI-driven organizations can ensure their models are trained, refined, and deployed with minimal lag and maximum efficiency.
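A quick back-of-the-envelope calculation shows why the claimed latency drop matters at scale. The sketch below assumes one million metadata lookups per training epoch and fully serialized lookups, both simplifying assumptions chosen for illustration rather than figures from DDN.

```python
# Rough arithmetic on DDN's cited figures: a metadata lookup
# falling from ~10 ms to ~0.1 ms (the claimed 100x speedup).
# The serialized-lookup pipeline model is a simplification.

lookups_per_epoch = 1_000_000  # illustrative workload size

for latency_ms, label in [(10.0, "traditional (~10 ms)"),
                          (0.1, "claimed Infinia (~0.1 ms)")]:
    total_s = lookups_per_epoch * latency_ms / 1000
    print(f"{label}: {total_s:,.0f} s of metadata wait per epoch")
```

Under these assumptions, metadata wait time per epoch drops from hours to under two minutes, which is the kind of difference that keeps GPUs fed rather than idle.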

(Source: DDN)

During a virtual launch event today called Beyond Artificial, DDN’s claims were reinforced by strong endorsements from industry leaders like Nvidia CEO Jensen Huang, who highlighted Infinia’s potential to redefine AI data management, emphasizing how metadata-driven architectures like Infinia transform raw data into actionable intelligence. Enterprise computing leader Lenovo also praised the platform, underscoring its potential to merge on-prem and cloud data for more efficient AI deployment.

Supermicro, another DDN partner, also endorses Infinia: “At Supermicro, we’re proud to partner with DDN to transform how organizations leverage data to drive business success,” said Charles Liang, founder, president, and CEO at Supermicro. “By combining Supermicro’s high-performance, energy-efficient hardware with DDN’s innovative Infinia platform, we empower customers to accelerate AI workloads, maximize operational efficiency, and reduce costs. Infinia’s seamless data unification across cloud, edge, and on-prem environments enables businesses to make faster, data-driven decisions and achieve measurable outcomes, aligning perfectly with our commitment to delivering optimized, sustainable infrastructure solutions.”

At the Beyond Artificial event, Bouzari and Huang sat down for a fireside chat to reflect on how an earlier idea, born from a 2017 meeting with Nvidia, evolved into the Infinia platform.

DDN had been asked to help build a reference architecture for AI computing, but Bouzari saw a much bigger opportunity. If Huang’s vision for AI was going to materialize, the world would need a fundamentally new data architecture, one that could scale AI workloads, eliminate latency, and transform raw data into actionable intelligence.

At the Beyond Artificial event, Huang and Bouzari sit down for a fireside chat about the bigger picture of storage and AI.

Infinia is more than just storage, Bouzari says, and fuels AI systems the way energy fuels a brain. And according to Huang, that distinction is critical.

“One of the most important things people overlook is the importance of data that’s necessary during application, not just during training,” Huang notes. “You want to train on a huge amount of data for pretraining, but during use, the AI has to access knowledge, and AI would like to access knowledge, not in raw data form, but in informational flow.”

This shift from traditional storage to AI-native data intelligence has profound implications, the CEOs say. Instead of treating storage as a passive repository, DDN and Nvidia are turning it into an active layer of intelligence, enabling AI to retrieve insights instantly.

“This is the reason why the reframing of storage of objects and raw data into data intelligence is this new opportunity for DDN, providing data intelligence for all the world’s enterprises as AIs run on top of this fabric of information,” Huang says, calling it “an extraordinary reframing of computing and storage.”

Reframing certainly seems necessary, because as AI continues to evolve, the infrastructure supporting it must evolve as well. DDN’s Infinia 2.0 may represent a major shift in how enterprises approach AI storage, not as a passive archive, but as an active intelligence layer that fuels AI systems in real time. By eliminating traditional bottlenecks, unifying distributed data, and integrating seamlessly with AI frameworks, Infinia 2.0 aims to reshape how AI applications access, process, and act on information.

With endorsements from industry leaders like Nvidia, Supermicro, and Lenovo, and with its latest funding round of $300 million at a $5 billion valuation, DDN is positioning itself as a key player in the AI landscape. Whether Infinia 2.0 delivers on its ambitious promises remains to be seen, but one thing is clear: AI’s next frontier isn’t just about models and compute but is about rethinking data itself. And with this launch, DDN is making the case that the future of AI hinges on new paradigms for data management.

Learn more about the technical aspects of Infinia 2.0 at this link, or watch a replay of Beyond Artificial here.

Related Items:

Feeding the Virtuous Cycle of Discovery: HPC, Big Data, and AI Acceleration

The AI Data Cycle: Understanding the Optimal Storage Mix for AI Workloads at Scale

DDN Cranks the Data Throughput with AI400X2 Turbo

Editor’s note: This article first appeared on AIWire.

 
