(Image source: Alina Bukhtiy // Getty Images)
Data is being accumulated today on a near-incomprehensible scale: it’s estimated that global daily creation is roughly 2.5 quintillion bytes.
Enterprise data collection, meanwhile, has quickly scaled from mere gigabytes toward what could eventually be yottabytes. The yottabyte, built on the largest prefix approved as a standard by the International System of Units, takes its name from the Greek letter “iota” and equals one septillion bytes.
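To put those estimates in perspective, here is a rough back-of-the-envelope sketch in Python; the constants simply restate the figures cited above, and the result is illustrative rather than a forecast.

```python
# Back-of-the-envelope scale check using the figures cited above (illustrative only).
DAILY_BYTES = 2.5e18   # ~2.5 quintillion bytes created globally per day
YOTTABYTE = 1e24       # one septillion bytes

days_per_yottabyte = YOTTABYTE / DAILY_BYTES
print(f"Days of global output per yottabyte: {days_per_yottabyte:,.0f}")
print(f"Years of global output per yottabyte: {days_per_yottabyte / 365.25:,.0f}")
# -> roughly 400,000 days, or on the order of 1,100 years at today's rate
```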
As this collection continues to ramp, traditional data center storage is no longer going to cut it, according to Jeff Denworth, cofounder and chief marketing officer for Vast Data. Organizations are interacting with massive amounts of data in increasingly sophisticated ways – or, at least, they’re attempting to – and they need advanced capabilities when it comes to scale, speed, efficiency and performance.
Vast promises hyperscale, infinite storage capabilities with Ceres, its new next-generation storage platform concept. The provider of data center storage arrays announced the platform at GTC 2022 this week.
“We view it as a new world that we’re entering into,” said Denworth. “We’re putting a blueprint together to shift the industry toward systems architecture that really can bring an end to the hard drive era.”
The exploding big data market is forecast to reach $234.6 billion by 2026, according to Global Industry Analysts, and Vast is staking its claim alongside competitors such as Pure Storage, NetApp, Dell EMC and HPE Nimble Storage.
According to Denworth, the Ceres platform will further differentiate the rapidly growing Vast. Ceres goes beyond classic solid-state drive (SSD) form factors (drive size, connection interface type and physical space) to ruler-based, high-density SSDs. These can handle significantly more flash capacity than traditional drives due to their larger surface area and airflow-friendly design. Partnering with Solidigm, Vast has certified 15 terabyte and 30 terabyte ruler SSDs, allowing 675 terabytes of raw flash data to fit in one rack unit.
“There’s a ton of space savings, power savings, cost savings, fewer moving parts,” Denworth said.
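The density arithmetic behind those figures can be sketched roughly as follows; the slot counts and the conventional-array baseline are assumptions for illustration, since the announcement only quotes the aggregate 675-terabytes-per-rack-unit figure.

```python
# Rough flash-density sketch. Only the 675 TB-per-rack-unit aggregate comes
# from Vast's announcement; the slot counts below are hypothetical.
def raw_tb_per_rack_unit(drive_tb: float, drives: int, rack_units: int = 1) -> float:
    """Raw flash capacity per rack unit for a given drive size and slot count."""
    return drive_tb * drives / rack_units

# Hypothetical 1U ruler-SSD enclosure with ~22 long-ruler slots.
print(raw_tb_per_rack_unit(30, 22))                # ~660 TB/RU with 30 TB rulers
print(raw_tb_per_rack_unit(15, 22))                # ~330 TB/RU with 15 TB rulers

# Assumed baseline: a conventional 2U array holding 24 x 15 TB U.2 SSDs.
print(raw_tb_per_rack_unit(15, 24, rack_units=2))  # 180 TB/RU
```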
Ceres platform offers smaller size but bigger performance
The new platform is named for a dwarf planet in the asteroid belt between Mars and Jupiter: like its namesake, the appliance is small in stature but, as Denworth described it, big in terms of performance. Vast’s Ceres may not look sophisticated – in fact, it “looks and feels more like a toaster,” he said – but “it is a simple, turnkey, super scalable appliance.”
Ceres is enabled by the Vast Universal Storage data platform and is built to leverage new hardware technologies including Nvidia BlueField data processing units (DPUs) and ruler-based hyperscale flash drives. Users in turn see improved speed, efficiency, performance, density and modularity, Denworth said. Other advantages include reduced data center costs and simplified serviceability: the system is fully front and rear serviceable, eliminating the need to slide systems in and out of racks, as well as the need for cable management. A minimum capacity entry point of 338 terabytes reduces upfront hardware costs while supporting scaling. Requiring less hardware also improves rack-scale resilience.
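As a small illustration of that entry point, the sketch below models how raw capacity grows if each additional building block contributes the same 338 terabytes; the fixed increment is an assumption, since the article only cites the minimum configuration.

```python
# Capacity-scaling sketch from the 338 TB entry point. Assuming every added
# building block contributes another 338 TB is hypothetical; only the minimum
# configuration is cited in the article.
ENTRY_TB = 338

def cluster_capacity_tb(building_blocks: int, block_tb: float = ENTRY_TB) -> float:
    """Total raw capacity for a given number of identically sized building blocks."""
    return building_blocks * block_tb

for n in (1, 4, 16, 64):
    print(f"{n:>3} blocks -> {cluster_capacity_tb(n) / 1000:,.1f} PB raw")
# 1 block is ~0.3 PB; 64 blocks would already exceed 21 PB of raw capacity.
```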
To date, Vast has received software orders to support more than 170 petabytes of data capacity to be deployed on Ceres platforms, Denworth said.
As data continues to grow, organizations increasingly struggle to find value in large reserves of data. Ceres can help democratize data center capabilities that have otherwise been the exclusive domain of the world’s largest hyperscale cloud providers, Denworth said. Organizations can adopt cutting-edge technologies without undertaking sophisticated deployments.
Vast is in the process of certifying Ceres for Nvidia DGX SuperPOD, a new system reengineered for large-scale AI workloads. This allows Nvidia customers to scale AI training infrastructure to support exabytes of data. The certification is slated for availability by mid-2022.
Denworth explained that, in 2021, Vast put out a call to new partners to help develop hyperscale data infrastructure. “We’ve been amazed by the collaboration and support for this vision that has come back from industry partners,” he said.
Regarding cloud vendors and large enterprise organizations building their own clouds, “simplicity collides with scale and cost,” he said. Yesterday’s architecture, designed for transaction processing, won’t translate to a world increasingly driven by AI.
Vast is working to transform opinions and lead the industry to a point where hyperscale infrastructure is the norm, he said. This won’t happen without significant investment and commitment.
The company reports that it has made significant inroads since emerging from stealth in February 2019. It exited its third year with a bookings run rate of nearly $300 million at a 90% gross margin; remains cash flow positive; is powered by a $1.2 million average selling price; and has a net revenue retention (NRR) of more than 300%.
Reposted from: VentureBeat