Greenplum Targets Data Loading Speeds

Greenplum is banking on new technology to speed the data loading process for companies dealing with large data warehouses.

Greenplum’s massively parallel processing (MPP) Scatter/Gather Streaming (SG Streaming) technology is designed to eliminate the bottlenecks associated with other approaches to data loading. At its core, it takes a parallel-everywhere approach to loading, in which data flows from one or more source systems to every node of the database.

The technology is part of the company’s bid to challenge players such as Teradata, Oracle and Netezza. Customers are running into cost and performance constraints with competing products and are looking for scalable software to meet their needs, said Paul Salazar, Greenplum’s vice president of marketing.

According to Greenplum, this differs from the traditional bulk-loading technologies used by most mainstream database and MPP appliance vendors, which push data from a single source, often over one channel or a small number of parallel channels. That approach, the company argues, can create bottlenecks and lengthen load times.

“With our approach we hit fully linear parallelism because we take all the source systems and we essentially do what we call scatter the data,” explained Ben Werther, director of product management at Greenplum. “We break it up into chunks that are sprayed across hundreds or thousands of parallel streams into the database and received… by all the nodes of the database in parallel. The essence of it is we eliminate all the bottlenecks.”
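The scatter step Werther describes can be pictured with a short, hedged sketch. The Python below is illustrative only: SG Streaming is not exposed as a Python API, and the node count, chunk size and round-robin distribution are assumptions made for the example rather than details of Greenplum’s implementation.

```python
import itertools

# Conceptual sketch of the "scatter" step: a source feed is broken into
# chunks that are spread across many per-node streams, so every database
# node receives data at once instead of one loader pushing a single feed.

NUM_NODES = 8        # assumed node count for the example
CHUNK_SIZE = 1000    # assumed rows per chunk

def chunk_source(rows, chunk_size=CHUNK_SIZE):
    """Yield fixed-size chunks from an incoming stream of rows."""
    it = iter(rows)
    while chunk := list(itertools.islice(it, chunk_size)):
        yield chunk

def scatter(rows, num_nodes=NUM_NODES):
    """Distribute chunks round-robin onto per-node streams."""
    node_streams = [[] for _ in range(num_nodes)]
    for i, chunk in enumerate(chunk_source(rows)):
        node_streams[i % num_nodes].append(chunk)
    return node_streams
```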

Performance scales with the number of Greenplum Database nodes, and the technology supports both large batch and continuous near-real-time loading patterns, company officials said. Data can be transformed and processed in-flight, leveraging all nodes of the database in parallel.
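In-flight transformation can be sketched in the same spirit. The transform below (uppercasing a field) and the process pool are stand-ins for whatever cleansing or derivation a real load pipeline would apply on each node, not Greenplum’s actual mechanism.

```python
from concurrent.futures import ProcessPoolExecutor

def transform_chunk(chunk):
    """Stand-in transform: uppercase a 'name' field in each row."""
    return [{**row, "name": row["name"].upper()} for row in chunk]

def transform_in_flight(node_streams):
    """Apply the transform to every node's chunks in parallel, so no
    single staging server sits between the sources and the database."""
    with ProcessPoolExecutor() as pool:
        return [list(pool.map(transform_chunk, chunks))
                for chunks in node_streams]
```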

Final gathering and storage of data to disk takes place on all nodes simultaneously, with data automatically partitioned across nodes and optionally compressed, Greenplum officials explained.
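The final gather-and-store step might look like the sketch below, in which each node writes only its own share of the data and optionally compresses it locally before the write. The file naming, JSON encoding and zlib compression are assumptions for the example, not Greenplum settings.

```python
import json
import zlib

def store_on_node(node_id, chunks, compress=True):
    """Persist one node's chunks; compression, if enabled, happens
    locally on that node before the data hits disk."""
    rows = [row for chunk in chunks for row in chunk]
    payload = json.dumps(rows).encode("utf-8")
    if compress:
        payload = zlib.compress(payload)
    with open(f"node_{node_id}.seg", "wb") as f:
        f.write(payload)

def gather_and_store(node_streams):
    """Every node stores its partition of the data at the same time."""
    for node_id, chunks in enumerate(node_streams):
        store_on_node(node_id, chunks)
```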

“Our objective as we go through the product evolution…is to build out a range of capabilities that are just again appealing to the customers who we have today who want in many cases ever-increasing rates of speed and loading, speed of query response, flexibility of doing embedded analytics and really to most easily access very vast volumes of data without having to do a lot of manipulation or a lot of moving of data,” Salazar said.

Brian Prince, eWEEK USA 2014. Ziff Davis Enterprise Inc. All Rights Reserved.
