Zaloni, NetApp Partner for Data Lifecycle Management Solution

September 8, 2016 Scott Gidley

Right-size the enterprise data lake with policy-driven, highly scalable and cost-effective data lifecycle management and cloud tiering.

Today we announced a new solution developed in partnership with NetApp: Zaloni Bedrock DLM and NetApp StorageGRID. The solution addresses the growing need for big data lifecycle management as more enterprises deploy data lakes to manage an ever-increasing variety and volume of data, including data from mobile devices, cloud-based apps, and the Internet of Things (IoT).

The ability to manage data throughout its lifecycle, based on age and relevancy, makes the difference between a data lake and a data swamp. With our DLM solution, enterprises can keep their Hadoop cluster right-sized by automatically tiering data within the data lake, deleting stale data, and archiving data to S3-compatible object and cloud storage.
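To make the idea concrete, here is a minimal sketch of what an age-based tiering pass looks like if scripted by hand against the standard Hadoop CLI and S3 SDK. The directory, bucket name, and retention window are hypothetical, and this is not Bedrock DLM's implementation; Bedrock expresses the same behavior as a declarative policy rather than a script.

```python
# Hypothetical sketch: move HDFS files older than a retention window to S3,
# then remove them from the hot (HDFS) tier. All names below are illustrative.
import subprocess
import time

import boto3

RETENTION_DAYS = 90                 # age threshold for the hot tier (assumed)
ARCHIVE_BUCKET = "example-archive"  # illustrative S3 bucket
HDFS_DIR = "/data/landing"          # illustrative HDFS directory

s3 = boto3.client("s3")
cutoff = time.time() - RETENTION_DAYS * 86400

# List files in the HDFS directory along with their modification times.
listing = subprocess.run(
    ["hdfs", "dfs", "-ls", HDFS_DIR],
    capture_output=True, text=True, check=True,
).stdout.splitlines()

for line in listing:
    parts = line.split()
    if len(parts) < 8:
        continue  # skip the "Found N items" summary line and malformed rows
    mod_date, mod_time, path = parts[5], parts[6], parts[7]
    mtime = time.mktime(time.strptime(f"{mod_date} {mod_time}", "%Y-%m-%d %H:%M"))
    if mtime >= cutoff:
        continue  # still within the hot-tier retention window

    # Copy the file locally, push it to S3 object storage, then delete it from HDFS.
    local_copy = "/tmp/" + path.rsplit("/", 1)[-1]
    subprocess.run(["hdfs", "dfs", "-get", "-f", path, local_copy], check=True)
    s3.upload_file(local_copy, ARCHIVE_BUCKET, path.lstrip("/"))
    subprocess.run(["hdfs", "dfs", "-rm", "-skipTrash", path], check=True)
```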

With this new solution, NetApp E-Series and StorageGRID Webscale storage options allow enterprises to define logical data lakes on premises, in the cloud, or in a hybrid of the two, while Zaloni’s Bedrock Data Lake Management Platform provides end-to-end management and governance of the data in the lake.

Key Zaloni Bedrock DLM and NetApp StorageGRID solution features include:

  • Data lifecycle management based on business policies, not files and directories: Bedrock manages the mapping of business metadata to physical files and directories, and storage types and locations can be extended to non-HDFS targets
  • Tier-specific definitions: Specify the data lifecycle for each storage tier based on age or events, something HDFS alone cannot do because it does not map data age to a storage type or policy
  • Data catalog: See which data resides in which tier and get better visibility into storage throughput

Transitioning to a managed data lake architecture now prepares enterprises for the inevitable, as big data is only getting bigger. The Zaloni Bedrock DLM and NetApp StorageGRID solution not only creates efficiencies in data storage and processing but also positions enterprises well for the future, enabling strategies like data science; multi-site, cloud-enabled disaster recovery; and enhanced data security for customer records and transaction data.

Learn more about our solution here.

About the Author

Scott Gidley

Scott Gidley is Vice President of Product Management for Zaloni, where he is responsible for the strategy and roadmap of existing and future products within the Zaloni portfolio. Scott is a nearly 20-year veteran of the data management software and services market. Prior to joining Zaloni, Scott served as senior director of product management at SAS and was previously CTO and cofounder of DataFlux Corporation.
