Zaloni DLM: Big Data Lifecycle Management for the Data Lake

September 14, 2016 Scott Gidley

The Apache Hadoop community recognizes the urgent need for data lifecycle management for big data: HDFS now offers heterogeneous storage support for different storage types (such as disk, SSD, and archive), as well as archival storage policies with hot, warm, cold and other categories.
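For context, these built-in HDFS capabilities are driven by storage policies that can be set per path. The sketch below (in Python purely for illustration) uses the standard hdfs storagepolicies and hdfs mover commands; the directory paths and policy assignments are hypothetical.

    # A minimal sketch of applying HDFS archival storage policies by path.
    # Assumes the standard `hdfs storagepolicies` CLI (Hadoop 2.6+) is on PATH;
    # the paths and tier assignments below are illustrative only.
    import subprocess

    TIERS = {
        "/data/landing/clickstream": "HOT",   # frequently accessed, all replicas on DISK
        "/data/processed/2015":      "WARM",  # mixed DISK/ARCHIVE replicas
        "/data/archive/2012":        "COLD",  # all replicas on ARCHIVE storage
    }

    for path, policy in TIERS.items():
        # Set the policy; the NameNode places new blocks accordingly,
        # and `hdfs mover` migrates existing blocks to match.
        subprocess.run(
            ["hdfs", "storagepolicies", "-setStoragePolicy",
             "-path", path, "-policy", policy],
            check=True,
        )

    # Trigger block migration for the paths whose policy changed.
    subprocess.run(["hdfs", "mover", "-p", *TIERS.keys()], check=True)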

Building on this, Zaloni has launched Zaloni DLM, which provides fine-grained control of data lifecycle management at big data scale, with the ability to create data retention policies based on whatever makes sense for your business, including age and relevancy. We provide that level of control through metadata applied by our Data Lake Management Platform.

Using Zaloni DLM’s user-friendly interface, enterprises can right-size their Hadoop clusters by specifying storage tiers, deleting old data, and exporting data from HDFS to more cost-effective cloud storage such as Amazon S3. The key to all of this is automation: our DLM enables enterprises to automate these processes with global or specific policies, which is critical for successful data lifecycle management in the data lake.
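As a rough illustration of what that automation replaces, here is a minimal hand-rolled sketch (not Zaloni DLM code) of an age-based export-and-delete pass against HDFS; the directory layout, bucket name, and 365-day threshold are assumptions.

    # Export HDFS partitions older than a cutoff to S3 via distcp, then delete
    # them from HDFS. Paths, the bucket name, and the age threshold are
    # hypothetical; Zaloni DLM drives this through policies and metadata.
    import subprocess
    from datetime import datetime, timedelta

    SOURCE_ROOT = "/data/processed"                        # illustrative HDFS directory
    S3_TARGET = "s3a://example-archive-bucket/processed"   # illustrative bucket
    CUTOFF = datetime.now() - timedelta(days=365)

    # List date-named partition directories, e.g. /data/processed/2015-06-01
    listing = subprocess.run(
        ["hdfs", "dfs", "-ls", "-C", SOURCE_ROOT],
        capture_output=True, text=True, check=True,
    ).stdout.splitlines()

    for path in listing:
        partition = path.rsplit("/", 1)[-1]
        try:
            partition_date = datetime.strptime(partition, "%Y-%m-%d")
        except ValueError:
            continue  # skip directories that are not date partitions
        if partition_date < CUTOFF:
            # Copy the partition to cheaper cloud storage...
            subprocess.run(
                ["hadoop", "distcp", path, f"{S3_TARGET}/{partition}"],
                check=True,
            )
            # ...then reclaim HDFS capacity.
            subprocess.run(
                ["hdfs", "dfs", "-rm", "-r", "-skipTrash", path],
                check=True,
            )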

Zaloni DLM provides the following key features:

  • DLM based on business policies, not files and directories: Our platform manages the mapping of business metadata to physical files/directories, and storage types and locations can be extended to non-HDFS targets.
  • Tier-specific definitions: Specify the data lifecycle in each storage tier based on age, events, or other custom policies using metadata applied in the platform (HDFS itself does not map data age to a storage type or policy); see the sketch after this list.
  • Data catalog: See which data is in what tier and get a better view of storage throughput.
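To make the business-policy idea concrete, here is a purely hypothetical sketch of a retention policy expressed against a business entity rather than physical files and directories; the field names are illustrative and are not Zaloni DLM’s actual policy schema.

    # A hypothetical business-level retention policy expressed as metadata.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class RetentionPolicy:
        entity: str                      # business entity the policy applies to
        hot_days: int                    # keep in the hot HDFS tier this long
        warm_days: int                   # then in the warm/archive tier
        export_target: str               # where data goes after the warm window
        delete_after_days: Optional[int] # None = retain indefinitely off-cluster

    clickstream_policy = RetentionPolicy(
        entity="web_clickstream",
        hot_days=30,
        warm_days=180,
        export_target="s3a://example-archive-bucket/clickstream",
        delete_after_days=None,
    )

    # The platform's metadata layer would resolve `entity` to the physical
    # files/directories it covers and apply the tier transitions automatically.
    print(clickstream_policy)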

Partnership with NetApp

In partnership with NetApp, Zaloni tested and validated its data lifecycle management capability specifically for NetApp’s E-Series and StorageGRID Webscale hardware configurations.

Solutions like Zaloni DLM give you the control over your data that you’re accustomed to, while also providing better visibility and increased data governance capabilities. Want to know more? Contact us, and we can discuss your needs.

About the Author

Scott Gidley

Scott Gidley is Vice President of Product Management for Zaloni, where he is responsible for the strategy and roadmap of existing and future products within the Zaloni portfolio. Scott is a nearly 20-year veteran of the data management software and services market. Prior to joining Zaloni, he served as senior director of product management at SAS and was previously CTO and co-founder of DataFlux Corporation.
