Review: Apache Hive brings real-time queries to Hadoop

Hive's SQL-like query language and vastly improved speed on huge data sets make it the perfect partner for an enterprise data warehouse

Apache Hive is a tool built on top of Hadoop for analyzing large, loosely structured data sets using a SQL-like syntax, making Hadoop accessible to legions of existing BI and corporate analytics users. Developed by Facebook engineers and contributed to the Apache Software Foundation as an open source project, Hive is now at the forefront of big data analysis in commercial environments.
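To illustrate how familiar that SQL-like syntax feels, here is a sketch of a HiveQL query; the table and column names are invented for this example, not taken from the review. Behind the scenes, Hive compiles the statement into distributed jobs that run across the cluster:

```sql
-- Hypothetical table; names are illustrative only.
-- Top ten pages by hits for one day, computed as a distributed job.
SELECT page_url, COUNT(*) AS hits
FROM web_logs
WHERE log_date = '2014-04-01'
GROUP BY page_url
ORDER BY hits DESC
LIMIT 10;
```

An analyst who knows SQL can write this without learning MapReduce, which is precisely the accessibility argument the review makes.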

Hive, like the rest of the Hadoop ecosystem, is a fast-moving target. This review covers version 0.13, which addresses several shortcomings of previous versions and brings a significant speed boost to SQL-like queries across large-scale Hadoop clusters, building on the interactive query capabilities introduced in prior releases.


Hive is fundamentally a batch-oriented tool, not an operational data store; it is suited to analyzing large, relatively static data sets where query time is not critical. Hive makes an excellent addition to an existing data warehouse, but it is not a replacement. Instead, using Hive to augment a data warehouse is a great way to leverage existing investments while keeping up with the data deluge.
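One common augmentation pattern, assumed here for illustration rather than prescribed by the review, is to expose raw files already sitting in HDFS to SQL users through an external table, leaving the warehouse's curated schemas untouched. The path and column names below are hypothetical:

```sql
-- Hypothetical example: path and column names are invented.
-- An external table maps a schema onto files already in HDFS;
-- dropping the table removes only the metadata, not the underlying data.
CREATE EXTERNAL TABLE raw_clickstream (
  event_time STRING,
  user_id    STRING,
  page_url   STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION '/data/logs/clickstream';
```

Because the data stays in place and is interpreted at read time, this approach sidesteps the expensive ETL step the warehouse would otherwise require for exploratory analysis.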

A typical data warehouse includes many expensive hardware and software components, such as RAID or SAN storage, optimized ETL (extract, transform, load) procedures for cleaning and inserting data, specialized connectors to ERP and other back-end systems, and schemas designed around the questions an enterprise wants to ask, such as sales by geography, product, or channel. The warehouse ecosystem is optimized around bringing enriched data to the CPU to answer the classes of questions the schema was designed for.

