Has Fujitsu Cured The Industry’s Petabyte-scale Storage Headaches?

Duncan MacRae is former editor and now a contributor to TechWeekEurope. He previously edited Computer Business Review's print/digital magazines and CBR Online, as well as Arabian Computer News in the UAE.


Fujitsu hopes to disrupt the storage market with the launch of its hyperscale ETERNUS CD10000 storage system

Fujitsu has introduced what it describes as the world’s first storage system designed to grow as big and last as long as the online data it hosts.

By creating a storage eco-system with unlimited capacity that is capable of living forever, the FUJITSU Storage ETERNUS CD10000 is said to help organisations eliminate the major headaches associated with the exponential growth of data.

Key problems

With the global volume of data generated and kept online continuing to multiply, organisations face three key problems: increased demands on scalability; greater complexity and cost; and the practical difficulty of migrating data between storage systems without major disruption. Collectively, these factors mean businesses need a new approach to storage as they move into the era of keeping tens of petabytes (PB) of data online, all the time. To put the sheer data volume into context, one PB is equivalent to approximately 100,000 hours of full-HD 1080p video.
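The article's rule of thumb can be sanity-checked with a little arithmetic. A minimal sketch, assuming decimal units (1 PB = 10^15 bytes) and backing out the video bitrate the comparison implies:

```python
# Back out the encoding rate implied by "1 PB ~ 100,000 hours of 1080p video".
# Assumes decimal units (1 PB = 10**15 bytes); the figures are illustrative,
# not from Fujitsu's spec sheet.
PETABYTE_BYTES = 10**15
HOURS_OF_VIDEO = 100_000

bytes_per_hour = PETABYTE_BYTES / HOURS_OF_VIDEO      # storage per hour of video
bits_per_second = bytes_per_hour * 8 / 3600           # implied average bitrate

print(f"{bytes_per_hour / 1e9:.0f} GB per hour "
      f"~ {bits_per_second / 1e6:.1f} Mbit/s")
# → 10 GB per hour ~ 22.2 Mbit/s
```

An average bitrate in the low tens of megabits per second is indeed typical for full-HD 1080p video, so the comparison holds up.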

The architecture of this new hyperscale, distributed scale-out eco-system allows individual storage nodes to be added, exchanged and upgraded organically, without downtime, helping the entire system – and its data – to live forever. Backwards compatibility means newer nodes can work alongside older ones, protecting the investment in the new ETERNUS system.

Fujitsu claims its ETERNUS CD10000 system heralds a new era of extremely high-capacity solutions for everyday data retention and management problems. At launch, the system supports up to 56 PB (56,000 TB) of online data through the aggregation of up to 224 storage nodes. Next year, Fujitsu will introduce updates allowing far higher scalability.
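The launch figures line up neatly with the 250 TB threshold quoted below. A quick check of the arithmetic, assuming decimal units as storage vendors typically quote them:

```python
# Sanity-check the ETERNUS CD10000 launch figures quoted in the article.
# Assumes decimal units (1 PB = 1,000 TB).
MAX_CAPACITY_PB = 56
MAX_NODES = 224

capacity_tb = MAX_CAPACITY_PB * 1_000        # 56,000 TB, as stated
tb_per_node = capacity_tb / MAX_NODES        # average capacity per node

print(f"{capacity_tb:,} TB across {MAX_NODES} nodes "
      f"= {tb_per_node:.0f} TB per node")
# → 56,000 TB across 224 nodes = 250 TB per node
```

Each node, on average, contributes 250 TB – the same figure Fujitsu cites as the point at which the system's economics become compelling.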

The solution, Fujitsu says, delivers compelling new economics for organisations managing online data sets of 250TB or more, such as cloud and telecommunication service providers, financial, media and business analytics organisations, plus any other environment where online data volumes are exploding.

Fujitsu has based the new enterprise-ready system on Red Hat's open-source Inktank Ceph Enterprise storage software, adding functional enhancements to deliver comprehensive management through a single pane of glass. Globally, Fujitsu's maintenance and support services enable customers, for the first time, to rely on true enterprise-class service levels for a storage system based on open source software. The ETERNUS CD10000 can present a unified view of block, object and file storage in a single distributed storage cluster – supposedly reducing complexity, lowering storage management costs and optimising the use of existing physical disk space.

Ranga Rangachari, vice president and general manager, Storage and Big Data, Red Hat, said: “Our alliance with industry-leading IT providers, such as Fujitsu, enables Red Hat to deliver proven storage solutions for enterprises managing petabyte-scale data. By combining Red Hat’s Inktank Ceph Enterprise storage software with Fujitsu’s new hyperscale storage ETERNUS CD10000 system, we can offer our customers speed and agility to realise the possibilities of the next-generation hyperscale environment.”

Hiroaki Kondo, senior vice president, head of Storage Systems Business Unit at Fujitsu, said: “The ETERNUS CD10000 revolutionises the way that organisations deal with ever-increasing online data. Fujitsu is the first mainstream, global storage technology provider to deliver a hyperscale, open source-based storage optimisation platform for online storage, removing future bottlenecks and allowing organisations to regain control over cost.”
