Zaloni, NetApp Partner for Data Lifecycle Management Solution

September 8, 2016 Scott Gidley

Right-size the enterprise data lake with policy-driven, highly scalable, and cost-effective data lifecycle management and cloud tiering.

Today we announced a new solution developed in partnership with NetApp: Zaloni DLM and NetApp StorageGRID. The solution addresses the growing need for big data lifecycle management as more enterprises deploy data lakes to manage a growing variety and volume of data, including data from mobile and cloud-based apps and the Internet of Things (IoT).

The ability to address data lifecycle management in the data lake, based on the age and relevancy of data, makes the difference between a data lake and a data swamp. With our DLM solution, enterprises can keep their Hadoop cluster right-sized by automatically tiering data within the data lake, deleting stale data, and archiving data to S3-compatible object and cloud storage.

With this new solution from Zaloni and NetApp, NetApp E-Series and StorageGRID Webscale storage options let enterprises define logical data lakes on premises, in the cloud, or in a hybrid configuration, while Zaloni's data lake management platform provides end-to-end management and governance of data in the data lake.

Key Zaloni DLM and NetApp StorageGRID solution features include:

  • Data lifecycle management based on business policies, not files and directories: The Zaloni Platform manages the mapping of business metadata to physical files/directories, and storage types and locations can be extended to non-HDFS targets
  • Tier-specific definitions: Specify the data lifecycle in each storage tier based on age or events; HDFS alone does not map the age of data to a storage type or policy
  • Data catalog: See which data is in which tier and get a better view of storage throughput
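Policy-driven, age-based tiering like the solution describes can be sketched as a simple rule table that maps a dataset's age to a target storage tier. The tier names, thresholds, and `tier_for` function below are illustrative assumptions for the sketch, not Zaloni DLM's actual API or configuration format:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class TierRule:
    """Data no older than max_age_days belongs in `target`."""
    max_age_days: int
    target: str

# Hypothetical policy: thresholds checked newest to oldest.
POLICY = [
    TierRule(max_age_days=30, target="hot_hdfs"),      # recent, frequently queried
    TierRule(max_age_days=180, target="warm_hdfs"),    # aging, occasionally queried
    TierRule(max_age_days=730, target="s3_archive"),   # cold, archived to object storage
]

def tier_for(last_modified: datetime, now: Optional[datetime] = None) -> str:
    """Return the storage tier a dataset belongs in based on its age;
    data past the oldest threshold is flagged for deletion."""
    now = now or datetime.utcnow()
    age_days = (now - last_modified).days
    for rule in POLICY:
        if age_days <= rule.max_age_days:
            return rule.target
    return "delete"
```

A lifecycle engine would periodically evaluate each dataset against such rules and move (or delete) data whose current tier no longer matches, which is what keeps the cluster right-sized without per-file manual intervention.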

Transitioning to a managed data lake architecture now prepares enterprises for the inevitable, as big data is only getting bigger. The Zaloni DLM and NetApp StorageGRID solution not only creates efficiencies for data storage and processing, it positions enterprises well for the future, enabling strategies like data science; multi-site, cloud-enabled disaster recovery; and enhanced data security for customer records and transaction data.

Learn more about our solutions here.

About the Author

Scott Gidley

Scott Gidley is Vice President of Product Management for Zaloni, where he is responsible for the strategy and roadmap of existing and future products within the Zaloni portfolio. Scott is a nearly 20-year veteran of the data management software and services market. Prior to joining Zaloni, Scott served as senior director of product management at SAS and was previously CTO and co-founder of DataFlux Corporation.

