DW Augmentation Solution Brief

Issue link: https://resources.zaloni.com/i/790671

DATA WAREHOUSE AUGMENTATION WITH A BIG DATA LAKE

The traditional data warehouse is constrained in both storage capacity and processing power. That's why the overall footprint of the data warehouse is shrinking as companies look for more efficient ways to store and process big data. Although many companies still use the data warehouse effectively for complex data analytics, creating a hybrid architecture by migrating storage and large-scale or batch processing to Hadoop or other scale-out architectures enables companies to cut storage and processing costs and get more value from the data warehouse for business intelligence (BI) activities.

"One of the greatest challenges with managing massive amounts of data is managing cost. With Zaloni's Bedrock Platform we will realize $33M in savings, and those savings accompanied a scalable, highly available and reliable platform." - Pradeep Varadan, Associate Director, Verizon Enterprise Solutions

Save Millions, Enable Faster Time to Insight

Offloading storage as well as extract, load, and transform (ELT) functions to a scale-out architecture, such as Hadoop, enables enterprises to focus the data warehouse on what it does best: BI. For savvy enterprises with an eye to the future, migrating to scale-out architectures also sets them up to successfully exploit original raw data of all types for data exploration and new use cases.

Save millions in storage costs. Scale-out architectures (e.g., Hadoop, AWS S3) can store raw data in any format at a fraction of the cost of the data warehouse. Providing higher capacity in a smaller footprint can save an organization millions. In fact, Zaloni helped one client achieve 20 times the storage capacity of their data warehouse at 50% of the cost of a previously planned data warehouse upgrade. Another client achieved a 100x cost reduction per terabyte of stored data.

Significantly speed up processing. Hadoop's flexible architecture enables faster data loading and parallel processing, resulting in faster time to insight. For example, one Zaloni client quadrupled its system's throughput after migrating processing to Hadoop. Hadoop is also much more effective than the data warehouse at processing the growing volume of unstructured and semi-structured data that is important for analytics today.

Maximize the DW for BI. Costly data warehouse resources shouldn't be wasted on low-value activities such as data transformation. One Zaloni client realized that 90% of its data warehouse platform was being used for ETL processes, leaving little processing power available for high-value analytics and business intelligence activities. A data warehouse augmentation with Hadoop made it possible for the enterprise to put its assets to more strategic use.

Extract more value, more quickly, from more data. Lower cost means enterprises can store more data in an accessible format, in an "active archive" rather than on tape. Extending data retention periods for historical data and eliminating time-consuming backup processes supports more in-depth trend analyses that can lead to further business insights and more effective business strategies. In addition, data lakes enable organizations to immediately spin up sandboxes for data scientists.
