Apache Promotes Big Data Tool

Sqoop, a big data tool for transferring data between Hadoop and structured data stores, has graduated to top-level project status at the Apache Software Foundation.

The Apache Software Foundation (ASF) announced that its Apache Sqoop big data tool has graduated from the Apache Incubator to become a top-level project (TLP).

Sqoop is designed to efficiently transfer bulk data between Apache Hadoop and structured data stores such as relational databases. Apache Sqoop allows the import of data from external data stores and enterprise data warehouses into the Hadoop Distributed File System (HDFS) or related systems such as Apache Hive and HBase.
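For readers unfamiliar with the tool, a minimal import looks like the sketch below; the connection string, credentials, table and target directory are placeholders rather than a real deployment:

    # Import the "orders" table from a MySQL database into HDFS
    sqoop import \
        --connect jdbc:mysql://dbserver.example.com/sales \
        --username reporting -P \
        --table orders \
        --target-dir /data/sales/orders

By default Sqoop writes the imported rows as delimited text files under the target directory; adding --hive-import would load them into an Apache Hive table instead.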

Maturity

“The Sqoop Project has demonstrated its maturity by graduating from the Apache Incubator,” said Arvind Prabhakar, vice president of Apache Sqoop, in a statement. “With jobs transferring data on the order of billions of rows, Sqoop is proving its value as a critical component of production environments.”

ASF officials said Sqoop builds on the Hadoop infrastructure and parallelises data transfer for fast performance and efficient use of system and network resources. In addition, Sqoop allows fast copying of data from external systems into Hadoop, making data analysis more efficient and mitigating the risk of placing excessive load on external systems.
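To sketch how that parallelism is exposed: the --num-mappers (or -m) option sets how many map tasks perform the transfer in parallel, and --split-by names the column Sqoop uses to partition the source table between them. The values here are illustrative:

    # Split the import across four parallel map tasks,
    # partitioning rows on the primary key "id"
    sqoop import \
        --connect jdbc:mysql://dbserver.example.com/sales \
        --table orders \
        --split-by id \
        --num-mappers 4 \
        --target-dir /data/sales/orders

The same option is what keeps load on the source system in check: lowering --num-mappers reduces the number of concurrent database connections Sqoop opens.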

“Connectivity to other databases and warehouses is a critical component for the evolution of Hadoop as an enterprise solution, and that’s where Sqoop plays a very important role,” said Deepak Reddy, Hadoop Manager at Coupons.com, in a statement. “We use Sqoop extensively to store and exchange data between Hadoop and other warehouses like Netezza. The power of Sqoop also comes in the ability to write free-form queries against structured databases and pull that data into Hadoop.”
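Free-form query imports of the kind Reddy describes use Sqoop's --query option. The query must contain the literal $CONDITIONS token, which each parallel task replaces with its own split predicate; the query, hosts and paths below are hypothetical:

    # Import the result of an arbitrary SQL query rather than a whole table
    sqoop import \
        --connect jdbc:postgresql://warehouse.example.com/dw \
        --query 'SELECT o.id, o.total, c.region FROM orders o JOIN customers c ON o.cust_id = c.id WHERE $CONDITIONS' \
        --split-by o.id \
        --target-dir /data/dw/orders_by_region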

Moreover, “Sqoop has been an integral part of our production data pipeline,” said Bohan Chen, director of the Hadoop Development and Operations team at Apollo Group, also in a statement. “It provides a reliable and scalable way to import data from relational databases and export the aggregation results to relational databases.”
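The export path Chen mentions is the mirror image of an import: sqoop export reads files from an HDFS directory and inserts the rows into an existing relational table. The table and directory names here are placeholders:

    # Push aggregated results from HDFS back into a relational table
    sqoop export \
        --connect jdbc:mysql://dbserver.example.com/reports \
        --table daily_aggregates \
        --export-dir /data/output/daily_aggregates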

Specialised systems

Since entering the Apache Incubator in June 2011, Sqoop has been quickly embraced as a key SQL-to-Hadoop data transfer solution.

The project provides connectors for popular systems such as MySQL, PostgreSQL, Oracle, SQL Server and DB2, and also allows for the development of drop-in connectors that provide high-speed connectivity with specialised systems like enterprise data warehouses.
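In practice the connector is chosen from the JDBC connection string, so moving between databases is largely a matter of changing the URL; the hosts and databases below are placeholders:

    # The scheme in the JDBC URL determines which connector Sqoop uses
    sqoop import --connect jdbc:mysql://host.example.com/db --table t1 --target-dir /data/t1
    sqoop import --connect jdbc:postgresql://host.example.com/db --table t1 --target-dir /data/t1
    sqoop import --connect 'jdbc:oracle:thin:@host.example.com:1521:orcl' --table T1 --target-dir /data/t1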

Craig Ling, director of business systems at Tsavo Media, stated: “We adopted the use of Sqoop to transfer data into and out of Hadoop with our other systems over a year ago. It is straightforward and easy to use, which has opened the door to allow team members to start consuming data autonomously, maximising the analytical value of our data repositories.”
