Bedrock DLM: Big Data Lifecycle Management for the Data Lake

September 14, 2016 Scott Gidley

The Apache Hadoop community recognizes the urgent need for data lifecycle management for big data – HDFS now offers Heterogeneous Storage for different storage types, as well as Archival Storage with hot, warm, cold and other storage policies.

Building on this, Zaloni launched Bedrock DLM, which allows fine-grained control of data lifecycle management – at big data scale – with the ability to create data retention policies based on whatever makes sense for your business, including age and relevancy. We provide that level of control through metadata applied by our Bedrock Data Lake Management Platform.

Using Bedrock DLM’s user-friendly interface, enterprises can right-size their Hadoop cluster by specifying storage tiers in Hadoop, deleting old data, and exporting data from HDFS to more cost-effective storage in the cloud, such as Amazon S3. The key to all of this is automation. Bedrock DLM enables enterprises to automate these processes with global or specific policies, which is critical for successful data lifecycle management in the data lake.
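Without automation, these steps map to manual Hadoop operations along the following lines. This is an illustrative sketch of the stock Hadoop commands involved, not Bedrock DLM itself; the paths, bucket name, and policy choices are hypothetical examples:

```shell
# Tier aging data down to archival storage within HDFS:
hdfs storagepolicies -setStoragePolicy -path /data/clickstream/2015 -policy COLD
hdfs mover -p /data/clickstream/2015    # migrate blocks to match the new policy

# Export colder data to cheaper cloud storage (S3 via the s3a connector):
hadoop distcp /data/clickstream/2014 s3a://example-archive/clickstream/2014

# Delete data that has aged out of its retention window:
hdfs dfs -rm -r -skipTrash /data/clickstream/2013
```

Running commands like these by hand for every dataset does not scale, which is exactly the gap that policy-driven automation closes.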

Zaloni Bedrock DLM provides the following key features:

  • DLM based on business policies, not files and directories: Bedrock manages the mapping of business metadata to physical files/directories, and storage types and locations can be extended to non-HDFS targets.
  • Tier-specific definitions: Specify data lifecycle in each storage tier based on age or events or other custom policies using metadata applied in Bedrock (HDFS does not map the age of the data to storage type/policy).
  • Data catalog: See which data is in which tier and get a better view of storage throughput.
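As a rough sketch of what an age-based, metadata-driven tier policy can look like, consider the following. The tier names, thresholds, and `target_tier` function are hypothetical illustrations for this post, not Bedrock's actual API or defaults:

```python
from datetime import date, timedelta

# Hypothetical age thresholds for each tier; in practice these would be
# driven by business metadata, not hard-coded values.
TIER_POLICY = [
    (timedelta(days=1095), "DELETE"),   # past retention: remove the data
    (timedelta(days=365),  "CLOUD"),    # very cold: export to S3
    (timedelta(days=90),   "COLD"),     # cold: HDFS archival storage
    (timedelta(days=0),    "HOT"),      # recent: default storage
]

def target_tier(created: date, today: date) -> str:
    """Pick the storage tier for a dataset based on its age."""
    age = today - created
    for threshold, tier in TIER_POLICY:
        if age >= threshold:
            return tier
    return "HOT"

today = date(2016, 9, 14)
print(target_tier(date(2016, 9, 1), today))   # -> HOT
print(target_tier(date(2016, 1, 1), today))   # -> COLD
print(target_tier(date(2013, 1, 1), today))   # -> DELETE
```

A real policy engine would also take events and custom business rules into account, as described above, but the core idea is the same: the tier decision is computed from metadata rather than hand-assigned per file or directory.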

Partnership with NetApp

In partnership with NetApp, Zaloni tested and validated its data lifecycle management capability specifically for NetApp’s E-Series and StorageGRID Webscale hardware configurations.

Solutions like Bedrock DLM give you the control over your data that you’re accustomed to – while also delivering better visibility and increased data governance capabilities. Want to know more? Contact us and we can discuss your needs.

About the Author

Scott Gidley

Scott Gidley is Vice President of Product Management for Zaloni, where he is responsible for the strategy and roadmap of existing and future products within the Zaloni portfolio. Scott is a nearly 20-year veteran of the data management software and services market. Prior to joining Zaloni, Scott served as senior director of product management at SAS and was previously CTO and co-founder of DataFlux Corporation.
