The tech vendor wants developers to start writing code for the Machine, a system officials say will fundamentally change data center computing
Hewlett Packard Enterprise officials want to get open-source developers involved early in the development of the company’s reworking of data center computer architectures, which it is calling the Machine.
At the tech vendor’s Discover 2016 conference in Las Vegas June 7, Hewlett Packard Enterprise (HPE) launched an open-source community page that gives developers access to a new set of tools for contributing code to the Machine, a system officials said completely rethinks data center IT by shifting the architecture away from processors and toward memory.
The system is being designed to handle the massive amounts of data that will be generated in the future by such trends as cloud computing, big data analytics, the proliferation of mobile devices and the Internet of Things (IoT). A new architecture is needed to address the computing demands of the future, according to HPE officials.
The Machine will include a broad array of new and emerging technologies, including silicon photonics, custom processors and its own operating system. However, the focus of the system will be on memory; company officials call the new architecture “Memory-Driven Computing.” For the Machine, that will mean HPE’s advanced memristor technology: chips that can operate as both storage and memory, providing fast memory that also retains data when power is turned off. Built on memristors, the Machine will offer massive pools of nonvolatile memory (NVM) linked by silicon photonics, which will not only increase the amount of data that can be stored in a single machine, but also speed up processing of that data and reduce power consumption.
The work will fundamentally change the architecture that computing—from smartphones to data center infrastructure to supercomputers—has been based on for six decades, according to officials. HPE reportedly will spend hundreds of millions of dollars to develop the Machine, a system that will be about the size of a refrigerator, but will be able to do the work of an entire data center.
It will take several more years before the Machine hits the market, but HPE executives said that, given how different the architecture will be, they wanted open-source developers to begin writing software for the system now, even though much of HPE’s own software for the system is still in its early phases.
HPE has created the Machine community page on GitHub, where developers can start writing code immediately. The company has released four developer tools to start, and officials promise to grow that number in the coming months.
The tools include a new database engine that can leverage large numbers of CPU cores and NVM to speed up applications, and a fault-tolerant programming model for NVM that protects data in case of power failures or program crashes. The other tools include a development environment called “fabric attached memory emulation” that enables developers to explore the Machine architecture, and a DRAM-based performance emulation platform that uses features available in commodity hardware to emulate the latency and bandwidth characteristics of future byte-addressable NVM technologies.
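None of HPE’s actual tools are shown here, but the core idea behind emulating NVM on commodity hardware, treating an ordinary memory-mapped file as if it were a pool of byte-addressable nonvolatile memory, can be sketched in a few lines. The file name, pool size and record layout below are illustrative assumptions, not part of HPE’s toolset.

```python
import mmap
import os
import struct
import tempfile

# A backing file stands in for a pool of nonvolatile memory. On real NVM
# the mapping would be byte-addressable persistent memory; here an
# ordinary file on disk emulates the persistence.
path = os.path.join(tempfile.mkdtemp(), "nvm_pool.bin")
POOL_SIZE = 4096

with open(path, "wb") as f:
    f.write(b"\x00" * POOL_SIZE)

# "Store" a record by writing directly into the mapped region, then flush
# so it survives the process ending (the power-failure analogue).
with open(path, "r+b") as f:
    pool = mmap.mmap(f.fileno(), POOL_SIZE)
    pool[0:8] = struct.pack("<q", 42)  # a 64-bit counter at offset 0
    pool.flush()
    pool.close()

# A later "reboot": remap the same file and read the value back, with no
# serialization or file-format parsing in between.
with open(path, "r+b") as f:
    pool = mmap.mmap(f.fileno(), POOL_SIZE)
    counter, = struct.unpack("<q", pool[0:8])
    pool.close()

print(counter)  # 42
```

The appeal of this programming style is that data structures live at stable offsets in the memory pool itself, rather than being converted to and from an on-disk format, which is the property the Machine’s NVM pools are meant to provide natively.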
HPE officials said they will continue to broaden the tools available to the open-source community in the coming months, including enhancing the ones already released, adapting Linux to run on the Machine and creating example applications that demonstrate how the Machine can improve application scale and performance.
Originally published on eWeek.