Webinar: Understanding Metadata – Why It's Essential to Your Big Data Solution and How to Manage It Well

June 20, 2016 Kelly Hopkins Schupp

If it’s not broken, don’t fix it. These are words most of us tend to live by. For manufacturers, however, this approach can carry significant cost – not only in equipment maintenance, but in production lost to unplanned equipment downtime. Although it seems obvious that a more proactive approach to maintenance pays off, many manufacturers still use basic consoles to monitor assets and strive to fix problems as quickly as possible after they are detected. Among other drawbacks, basic monitoring consoles provide only isolated views of equipment condition, and managing them is time-consuming, resource-intensive and prone to generating false alerts.

What is predictive maintenance?

The objective of predictive maintenance is to detect trending problems before something actually breaks down. Specifically, the goals are to:

  • Minimize unplanned (reactive) corrective maintenance
  • Minimize unnecessary planned (scheduled) maintenance
  • Achieve higher asset uptime

As you can imagine, big data plays a key role in predictive maintenance. The timing of predictive maintenance activities is determined by automated monitoring of condition data from in-service assets, collected before failure occurs. Based on this data, maintenance is performed only when necessary. This differs from planned (preventive) maintenance, in which maintenance is scheduled at intervals based on historical data – an approach that can result in unnecessary maintenance activities.
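To make the contrast concrete, here is a minimal sketch of condition-based triggering, using a hypothetical asset with a vibration sensor and an illustrative alert threshold (the threshold value and window size are assumptions for the example, not real engineering limits):

```python
# Illustrative condition-based maintenance trigger (toy example).

VIBRATION_THRESHOLD_MM_S = 7.1  # assumed limit for illustration only

def needs_maintenance(readings, threshold=VIBRATION_THRESHOLD_MM_S, window=3):
    """Flag an asset when the last `window` readings all exceed the threshold.

    Requiring several consecutive high readings filters out one-off spikes
    that would otherwise raise the false alerts mentioned above.
    """
    recent = readings[-window:]
    return len(recent) == window and all(r > threshold for r in recent)

healthy = [2.1, 2.4, 2.2, 2.5, 2.3]
degrading = [2.1, 3.8, 7.4, 7.9, 8.3]

print(needs_maintenance(healthy))    # False
print(needs_maintenance(degrading))  # True
```

Unlike a fixed preventive schedule, nothing here fires until the monitored condition itself deteriorates.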

Predictive maintenance is becoming more widespread in the industry, as manufacturers use it to monitor for trending problems, perform quality assurance on individual parts and components, and optimize assets and operations. Specific benefits of this approach include:

  • Improved customer satisfaction
  • Reduced construction costs
  • Increased productivity
  • Increased production revenue
  • Increased ROI
  • Meeting SLAs and regulatory compliance

The role of the managed data lake in predictive maintenance

Modern predictive maintenance involves collecting, ingesting, storing and processing multi-structured data, and it is being driven by the proliferation of the Internet of Things (IoT). IDC research estimates that there will be 50 billion Internet-connected things by 2020. Manufacturing and other assets are increasingly equipped to generate machine and sensor data in real time, providing the condition information needed to enable predictive maintenance.

A data lake is a centralized repository for efficiently storing a variety of data types, including content data (documents, images and videos), machine data, sensor data, geo-location data, historical data and more. A data lake collects and ingests enormous volumes of relevant data from many different sources. A managed data lake contains big data that has had metadata applied upon ingestion. Metadata is what enables stakeholders such as data scientists, field operations and business analysts to find and use the data they need. Without metadata, manufacturers can’t know what data they have, and they can’t trust the data’s quality – making it difficult to derive value from the data.
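The idea of applying metadata at ingestion can be sketched in a few lines. This is a toy illustration, not Zaloni’s actual API; the dataset name, source and schema are invented for the example:

```python
# Toy sketch: capture descriptive metadata at the moment of ingestion,
# so the data can later be found, understood and assessed for quality.

import datetime

catalog = []  # a toy metadata catalog; a real platform persists this centrally

def ingest(dataset_name, source, schema, records):
    """Register a dataset in the catalog with metadata captured at ingestion."""
    entry = {
        "name": dataset_name,
        "source": source,
        "schema": schema,
        "row_count": len(records),
        "ingested_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    }
    catalog.append(entry)
    return entry

entry = ingest(
    "turbine_vibration",                        # hypothetical dataset
    source="plant-3 sensor gateway",            # hypothetical source
    schema=["asset_id", "timestamp", "vibration_mm_s"],
    records=[("T-17", "2016-06-20T00:00:00Z", 2.4)],
)
print(entry["name"], entry["row_count"])  # turbine_vibration 1
```

Because every dataset enters the lake with a catalog entry like this, a data scientist can search by name, source or schema instead of guessing what raw files contain.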

A data scientist uses the data lake as a platform to build analytical models from all pertinent data, then applies those models to predict future equipment issues – something that was unfeasible when the volume and variety of available data were smaller. Once the models are created, they must be “operationalized” in the production environment so they can be incorporated into a customer’s scheduled workflows, dashboards and reports. Only then can the customer begin to realize the benefits of predictive maintenance: optimized operations and asset use, higher efficiency and lower costs.
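As a deliberately simple sketch of the modeling step, the example below fits a linear trend to a hypothetical degradation signal and extrapolates when it will cross a failure threshold. Real models would use far richer features and algorithms; the data and threshold here are invented:

```python
# Toy predictive model: least-squares trend fit plus threshold extrapolation.

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def hours_until(threshold, xs, ys):
    """Extrapolate the fitted trend to when the threshold will be reached."""
    a, b = fit_line(xs, ys)
    return (threshold - b) / a

hours = [0, 24, 48, 72, 96]
wear = [1.0, 1.5, 2.0, 2.5, 3.0]  # toy data: wear grows 0.5 units per 24 h

print(hours_until(6.0, hours, wear))  # roughly 240 hours of remaining life
```

Operationalizing such a model would mean scheduling it to run against fresh sensor data and feeding its output into maintenance workflows, dashboards and reports, as described above.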

Governing the data lake

Security and reliability concerns accompany the use of a growing volume and variety of big data. Risks include new security breach and compliance violation scenarios. As a result, collecting data in a data lake in the absence of governance and repeatable processes results in a disorderly, unmanageable, and potentially unsecure environment. Insights harvested from such an environment may be unreliable and untrustworthy, leading to erroneous business decisions.

Enabling predictive maintenance with Zaloni’s platform 

Zaloni’s Bedrock Data Lake Management Platform delivers the capabilities necessary to build and deploy predictive maintenance solutions. Zaloni stands out in the market by combining essential data lake capabilities within an integrated platform. Zaloni promotes governance by providing the following:

  • Metadata catalog enabling understanding and visibility of the data
  • Creation and application of data quality rules
  • Security through Hadoop vendor partnerships

Join Zaloni in embracing the future of predictive maintenance!

About the Author

Kelly Hopkins Schupp

Kelly Schupp is Vice President of Marketing for Zaloni. Kelly has 20 years of experience in the enterprise software and technology industry. She has held a variety of global marketing leadership roles, and previously worked at IBM, Micromuse and Porter Novelli. Kelly serves as Zaloni’s brand steward and is deeply passionate about the impact of data-driven marketing.


