SGI is making use of its supercomputer expertise to help companies tackle their big data needs
SGI is leveraging its high-performance computing heritage to build up its capabilities in the fast-growing big data arena.
The supercomputer maker is offering new solutions that address the escalating compute and storage demands inherent in big data, giving organisations the tools to not only process and store the massive amounts of data being generated and collected, but also to quickly access and analyse it, according to SGI President and CEO Jorge Titinger.
“As businesses tackle the rising volume, velocity, and variety of Big Data, they face a growing challenge – how to unlock value at greater speed, scale and efficiency,” Titinger said in an SGI press release. “SGI’s expertise in designing and building some of the world’s fastest supercomputers enables customers to fully optimise High Performance Computing for Big Data analytics to achieve business breakthroughs.”
The new offerings, which include a cluster compute and storage solution, come as more organisations are looking to leverage the vast amounts of data being generated to identify and address potential business opportunities. Big data has become a key trend in business, and a wide range of technology vendors – from hardware makers to software vendors – are looking to meet the growing demand for solutions.
Gartner analysts said in a March report that adoption of big data technologies is hitting the mainstream this year, with 42 percent of IT professionals in a survey saying they had invested in such technology or were planning to within a year.
“Organisations have increased their understanding of what big data is and how it could transform the business in novel ways. The new key questions have shifted to ‘What are the strategies and skills required?’ and ‘How can we measure and ensure our return on investment?'” Gartner Research Vice President Doug Laney said in a Gartner press release in March.
SGI officials said their company’s legacy in high-performance computing (HPC) gives it the background to help businesses handle their growing big data needs.
“Today’s announcement reflects our rich heritage in HPC and high volume storage, and our ability to significantly help enterprises accelerate time to value, achieve petabyte scale, and lower the rising cost of Big Data,” Titinger said.
Among the new offerings is SGI’s InfiniteData Cluster, which is powered by Intel’s new Xeon E5-2600 v2 processors. It offers up to twelve 4TB drives per tray, creating a 1:1 ratio of processing cores to storage spindles that is optimised for the open-source Apache Hadoop big data software, according to SGI officials. The cluster is coupled with high-speed interconnects, offers up to 40 nodes and 1.9 petabytes of capacity in a single rack, and comes integrated with Cloudera’s Hadoop distribution running on Red Hat’s Linux software and SGI’s Management Center tools.
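The per-rack figures quoted above can be sanity-checked with a little arithmetic. The sketch below simply multiplies out the numbers given in the article (drives per tray, drive capacity, nodes per rack); the decimal terabyte-to-petabyte conversion is an assumption, as that is how storage vendors typically quote capacity.

```python
# Sanity check of the InfiniteData Cluster per-rack capacity figures,
# using the numbers quoted in the article.
DRIVES_PER_TRAY = 12   # up to twelve 4TB drives per tray
DRIVE_TB = 4           # capacity per drive, in terabytes
NODES_PER_RACK = 40    # up to 40 nodes in a single rack

raw_tb = DRIVES_PER_TRAY * DRIVE_TB * NODES_PER_RACK
raw_pb = raw_tb / 1000  # decimal conversion, as vendors usually quote

print(f"Raw capacity per rack: {raw_tb} TB (~{raw_pb:.2f} PB)")
# 12 * 4 * 40 = 1,920 TB, consistent with the ~1.9 PB per rack claimed
```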
The cluster comes pre-configured and pre-tested, making it faster for organisations to deploy than similar solutions that need to be pieced together on arrival. It is also flexible and scalable, with a choice of drives, network connectivity and number of processing cores, company officials said.