Why PLM-related Big Data Opportunities Greatly Exceed the Potential Headaches

Big data presents those who implement PLM strategies and solutions with opportunities—and challenges—exceeding what they will encounter almost anywhere else. These challenges can be troublesome, but the opportunities far outweigh the potential headaches. A resource that should not be overlooked as part of digital transformation is big data, which can add value, innovation and increased competitiveness to every enterprise.

Even for me, this is a rather sweeping statement, but I will illustrate why big data's opportunities greatly exceed its potential headaches. I will start with a positive view of big data, then highlight tools and approaches such as topological data analysis, and conclude by describing ways in which big data and powerful analytical tools can support the digital transformation of the product lifecycle and enhance virtually every enterprise's outlook.

One thought to carry us through: Every change presents opportunities. Continuously changing business environments regularly present stark choices. We can address every change as an opportunity to drive innovation, boost competitiveness and add to market share, or we can remain passive. If we remain passive, marketplace forces that are outside our control (and even our understanding) will determine our future. And it won’t be pretty.

The Vastness of Big Data

Big data encompasses the countless computations and connections that never stop uploading, searching, downloading, modifying and processing data. This is increasingly being done in the cloud, rather than on-premises or in-house. A big part of this is the day-to-day management of the data that supports designing and producing new offerings and assets—their engineering, quality and reliability—and the ever-increasing amounts of data gathered from in-the-field products.

In the cloud, much of big data exists in so-called “data warehouses,” which accommodate structured, formatted data such as transactional and operational records. The remainder—images, designs, documents, e-mail, voice, video, streamed data, social media content and so on—lacks that structure and format and goes into what are often termed “data lakes.”
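
To make the distinction concrete, here is a minimal sketch in Python of how an ingestion step might route incoming records: anything matching a known schema goes to the warehouse, and everything else lands in the lake. The schemas and function names are invented for illustration, not any particular platform's API.

```python
# Minimal sketch: routing incoming records to a "warehouse" (structured)
# or a "lake" (everything else). All names are illustrative, not a real API.

WAREHOUSE_SCHEMAS = {
    "order": {"order_id", "customer_id", "amount", "timestamp"},
    "sensor": {"device_id", "reading", "timestamp"},
}

def route_record(record: dict) -> str:
    """Return a warehouse destination if the record matches a known schema, else the lake."""
    for name, fields in WAREHOUSE_SCHEMAS.items():
        if fields <= record.keys():   # record contains every required field
            return f"warehouse/{name}"
    return "lake/raw"                 # unstructured or unknown format

print(route_record({"order_id": 1, "customer_id": 7, "amount": 9.5, "timestamp": "t0"}))
print(route_record({"video_bytes": b"...", "source": "social"}))
```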

This image speaks for itself when it comes to the amount of data being produced online. Note, however, that it was compiled for a Visual Capitalist article titled “What Happens in an Internet Minute in 2017?” In the intervening years, these numbers have gotten much bigger and the categories more numerous. (Image courtesy of Visual Capitalist.)

The contents of both data warehouses and data lakes are constantly being searched by digital tools, including:

  • Data mining, which has always been with us.
  • In-memory analytics for immediate insights.
  • Machine learning to accelerate the generation of precise large-scale data models that apply sophisticated statistical algorithms to assess what may happen in the future. Part of this is predictive analytics, highly regarded for risk assessment and fraud detection (a minimal sketch follows this list).
  • New approaches such as topological data analysis, major ongoing enhancements to existing tools and a forecasted surge of startups in the next few years.
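
To make the predictive-analytics item concrete, here is a minimal sketch using scikit-learn: a logistic-regression model, trained on synthetic field data, scores the probability that a unit will fail—the same pattern used in risk assessment. The feature names and data are invented for illustration.

```python
# Minimal predictive-analytics sketch: score failure risk from field data.
# Synthetic data and feature names are invented for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Features per unit: [operating_hours, avg_temperature_c, error_count]
X = rng.normal(loc=[5000.0, 60.0, 3.0], scale=[1500.0, 10.0, 2.0], size=(200, 3))
# Synthetic label: 1 if the unit failed in the field, 0 otherwise
y = ((X[:, 0] > 5500) & (X[:, 2] > 3)).astype(int)

model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X, y)

new_unit = np.array([[6200.0, 72.0, 6.0]])  # a unit showing heavy use and errors
print(f"predicted failure risk: {model.predict_proba(new_unit)[0, 1]:.2f}")
```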

There is little point in trying to quantify big data: estimates of its size border on the meaningless and unfathomable. The same goes for estimates of its growth; speculation that roughly 15 billion more devices are added every year is common, with each potentially accounting for millions more data records.

Bringing Data Analytics to Big Data

Let’s now turn to the analytics that enable us to extract what we need from big data. Given big data's nearly incomprehensible size (volume), its hard-to-quantify growth rates (velocity) and its continually evolving makeup (variety), only the most powerful tools can deal with it. These include a powerful newcomer—topological data analysis—and predictive analytics, currently the most widely used.

In the coming years, big data will continue to grow in volume, variety, velocity and veracity—that is, in how readily its trustworthiness can be established as data is stored, shared and used again and again. Enabled by analytics, I believe digital transformation will add a fifth “V” for “value.”

The purpose of using big data, as every engineer, product developer and maintenance specialist knows, is to locate and extract readily grasped insights—information that can be used to better design, manufacture and support the enterprise's products, systems and assets. The value of that information comes from enhancing innovation, boosting competitiveness and adding to marketplace profitability.

Finding that value is the troublesome aspect of big data; without it, many digital transformations will fail, as so many already have. Information technology (IT) experts regularly remind us that only a tiny fraction of one percent of what goes into an organization's big data repositories is ever looked at again, let alone retrieved for anything productive.

The goal is to search out and recognize genuinely valuable insights—nuggets of digital gold—in the databanks of IT processors. Books have been written, and advanced degrees earned, on finding new ways to extract that digital gold. This is the “upstream” part of the value story of big data and analytics.

The “downstream” part of this value story is what is subsequently done with the minute fraction of big data that can be put to some productive use as part of digital transformation.

The Tools and Concepts of Big Data

The key tools and concepts bringing the value of big data and analytics to digital transformation include:

  • Digital threads.
  • Digital twins.
  • Configuration management.

Digital threads, with their end-to-end lifecycle connectivity, are what move the gold of big data—in any format or none—from analytics-linked databases into the hands of product developers and engineers. Digital threads link physical products and assets to analytical output files and repositories, whether the data is structured or not.

Just as they manage information and data into and out of any digital twin, digital threads have the flexibility to reach deep into augmented reality and virtual reality (AR/VR) outputs, predictive analytics, generative artificial intelligence (generative AI) and topological data analysis.
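
To ground the idea, here is a minimal sketch of a digital thread as a traceability structure: each lifecycle artifact—requirement, design, as-built unit, field data—is a node, and the links let any item be traced end to end. The classes and identifiers are hypothetical, not any vendor's data model.

```python
# Minimal digital-thread sketch: a traceability graph of lifecycle artifacts.
# All class and field names are hypothetical, not a vendor API.
from dataclasses import dataclass, field

@dataclass
class Artifact:
    artifact_id: str
    kind: str                                   # "requirement", "design", "unit", "field_data"
    links: list = field(default_factory=list)   # downstream artifacts

def trace(artifact: Artifact, depth: int = 0) -> None:
    """Walk the thread end to end, printing the chain of linked artifacts."""
    print("  " * depth + f"{artifact.kind}: {artifact.artifact_id}")
    for child in artifact.links:
        trace(child, depth + 1)

req = Artifact("REQ-101", "requirement")
design = Artifact("CAD-456", "design")
unit = Artifact("SN-0001", "unit")
telemetry = Artifact("LOG-2024-03", "field_data")

req.links.append(design)      # requirement drives the design
design.links.append(unit)     # design is built as a serialized unit
unit.links.append(telemetry)  # fielded unit streams data back

trace(req)                    # end-to-end traceability, requirement to field
```

The same walk works in reverse: starting from a field-data node and following links upstream is how a service finding gets traced back to the requirement it affects.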

Topological data analysis (TDA) enables probing big data for predictive insights drawn from patterns and relationships in large, complex, high-dimensional data sets—including unknown-unknowns. TDA applies topology, the branch of pure mathematics that studies shape, to find and classify the shapes in data by their measurements and representations. (Image courtesy of SIAI, a SymphonyAI Group company.)
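
For readers new to TDA, here is a minimal, self-contained sketch of its simplest idea, zero-dimensional persistence: as a connection radius grows, nearby points merge into clusters, and the cluster counts that persist across many radii reveal the data's underlying shape. This illustrates the concept only; production TDA relies on specialized libraries and higher-dimensional features.

```python
# Minimal TDA sketch: 0-dimensional persistence via a growing radius.
# Counts connected components as nearby points merge; component counts that
# persist across many radii hint at the data's shape. Illustrative only.
import numpy as np

def component_count(points: np.ndarray, radius: float) -> int:
    """Number of clusters when points within `radius` are connected (union-find)."""
    n = len(points)
    parent = list(range(n))

    def find(i: int) -> int:
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path compression
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(points[i] - points[j]) <= radius:
                parent[find(i)] = find(j)   # merge the two clusters
    return len({find(i) for i in range(n)})

rng = np.random.default_rng(1)
# Two well-separated blobs: the underlying "shape" is two components.
data = np.vstack([rng.normal(0, 0.3, (30, 2)), rng.normal(5, 0.3, (30, 2))])

for r in (0.5, 1.0, 2.0, 6.0):
    print(f"radius {r}: {component_count(data, r)} component(s)")
```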

Additional big data connectivity managed by digital threads includes data mining, metadata, transactional data (“business intelligence”), machine learning and in-memory analytics, in any combination. That same end-to-end, full-lifecycle connectivity carries the results onward to product developers, engineers, manufacturing operations and service.

Lest we dive in too deeply and too quickly, a step back is helpful. This is a high-level grouping of the analytical tools used on big data, categorizing them as descriptive, diagnostic, predictive, prescriptive and outcome-based. (Image courtesy of CIMdata with data from multiple sources.)

Digital twins are optimized when there is one for each version of every product, asset and system in an enterprise's digital environment. They are fed by digital threads, usually incorporating big data input and analytical tools. An enterprise's digital twins—often thousands of them—support and enable innovations by design engineers, production managers, developers and service personnel. Digital twins are sometimes viewed merely as replicas of things in the physical world that are precisely detailed and automatically updated. This greatly understates the capabilities of digital twins for testing, modification, maintenance and continuous optimization, to name a few of the most frequently used applications.
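
Here is a minimal sketch, with hypothetical names and thresholds, of the "automatically updated" aspect: a twin object ingests field telemetry arriving over its digital thread and flags drift from design intent—the starting point for the testing, maintenance and optimization uses noted above.

```python
# Minimal digital-twin sketch: a twin updated from field telemetry and
# checked against design intent. Names and thresholds are hypothetical.
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    serial_number: str
    design_temp_c: float                        # design-intent operating temperature
    readings: list = field(default_factory=list)

    def ingest(self, temp_c: float) -> None:
        """Receive one telemetry reading from the fielded asset."""
        self.readings.append(temp_c)

    def drift(self) -> float:
        """Average deviation of observed behavior from design intent."""
        if not self.readings:
            return 0.0
        return sum(self.readings) / len(self.readings) - self.design_temp_c

twin = DigitalTwin("SN-0001", design_temp_c=60.0)
for reading in (61.2, 63.5, 66.1):              # streamed in via the digital thread
    twin.ingest(reading)

if twin.drift() > 2.0:                          # hypothetical maintenance threshold
    print(f"{twin.serial_number}: drift {twin.drift():.1f} degC - schedule inspection")
```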

Configuration Management (CM) is indispensable when digital threads and digital twins are regularly exposed to big data. CM ensures that digital threads, digital twins and their links to their associated physical counterparts are properly joined, running smoothly and not disrupted. Implemented as part of digital transformation (and usually enabled by a PLM solution), CM also ensures that the actual product, its requirements and its associated configuration data are consistently maintained and always valid and ready for use.

Based on information from the Configuration Management Process Improvement Center (CMPIC), CM does this by continually verifying that all of an enterprise's digital assets are what they are intended to be. CM ensures that every change is properly evaluated, authorized and implemented and that all information defining and managing configurations and data is current, accurate and in readily accessible formats.
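
As one concrete illustration of that continual verification (the mechanism below is my own sketch, not CMPIC's), each approved configuration can be baselined with a content hash, and a routine check confirms that what is actually deployed still matches an authorized baseline:

```python
# Minimal CM sketch: verify deployed configurations against approved baselines.
# Hashing stands in for "is what it is intended to be"; names are hypothetical.
import hashlib
import json

def config_hash(config: dict) -> str:
    """Stable fingerprint of a configuration's content."""
    return hashlib.sha256(json.dumps(config, sort_keys=True).encode()).hexdigest()

# Baselines recorded when each change was evaluated, authorized and implemented
approved_baselines = {
    "SN-0001": config_hash({"firmware": "2.4.1", "sensor_gain": 1.25}),
}

def verify(serial: str, deployed_config: dict) -> bool:
    """True if the deployed configuration matches its authorized baseline."""
    return approved_baselines.get(serial) == config_hash(deployed_config)

print(verify("SN-0001", {"firmware": "2.4.1", "sensor_gain": 1.25}))  # True
print(verify("SN-0001", {"firmware": "2.4.2", "sensor_gain": 1.25}))  # False: unauthorized change
```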

According to the ANSI/EIA-649B standard, CM is routinely used with assets of any kind, including production facilities and IT systems; embedded and connected electronics (hardware, software and "firmware"); materials, processes and sources; tools, techniques and services; and documents such as instruction sets, specifications and regulations.

Data Governance (DG) is a higher level of oversight—verifying that the enterprise's data (in whatever form) is validated and ready for use, as are the digital systems managing that data. DG dovetails with CM to establish constructive, effective management of information capabilities, ensuring that all information assets, upgrades and new systems are working as promised.

DG also guides the definition and implementation of policies, procedures, structures, roles and responsibilities that determine and enforce rules of engagement, decision rights and accountabilities. Data governance isn't optional if you wish to have a cost-effective and truly successful digital transformation.
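
What determining and enforcing those rules can look like in practice is sketched below, with invented policy fields: every data asset must declare an owner, a classification and a retention period before it is released for use.

```python
# Minimal data-governance sketch: validate data-asset metadata against policy.
# Policy fields and allowed values are invented for illustration.
REQUIRED_FIELDS = {"owner", "classification", "retention_days"}
ALLOWED_CLASSIFICATIONS = {"public", "internal", "confidential"}

def governance_check(asset: dict) -> list:
    """Return a list of policy violations; an empty list means ready for use."""
    violations = [f"missing field: {f}" for f in REQUIRED_FIELDS - asset.keys()]
    if asset.get("classification") not in ALLOWED_CLASSIFICATIONS:
        violations.append("unknown classification")
    if asset.get("retention_days", 0) <= 0:
        violations.append("retention period not set")
    return violations

asset = {"owner": "quality-eng", "classification": "internal", "retention_days": 365}
print(governance_check(asset) or "validated and ready for use")
```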

Bringing it All Together: How Digital Transformation’s Big Data Meets PLM

Recent PLM innovations have added automated linkages and traceability to support the big data aspects of digital transformation, as reported in the CIMdata educational webinar “The Promise and Reality of the Digital Thread.” With big data, analytics and digital transformation in mind, solution providers are hard at work to:

  • Accommodate the huge amounts of information readily available in the digitally transformed world, then help make new data and sources comprehensible to product developers.
  • Link new sources of data, starting with social media, to conventional product information with new data management tools and capabilities.
  • Improve the scope, timeliness and accessibility of the Internet of Things (IoT), its subset the Industrial Internet of Things (IIoT) and the reporting and analytics generated in both.
  • Implement predictive analytics and topological data analysis for downstream use in product maintenance, repair and operations (MRO).
  • Incorporate innovative and profitable user experiences to help extend digital transformation throughout customer organizations.

Finally, digital skills transformation is clearly necessary, given the preceding points and what we know about digital transformation. We at CIMdata see three realities:

  1. Data has gained ascendancy in decision-making over the experience and insight of key people; as part of digital transformation, big data, analytics and the move of computing to the cloud are all playing significant roles.
  2. Software solution providers across the spectrum of engineering and manufacturing are building on digital transformation to automate every workplace process, sidelining the expertise and skills of the workforce.
  3. Workers with the skills demanded for tomorrow's innovative new products and services are increasingly scarce, a shortage aggravated by rising rates of retirement.

As noted, big data is commonly characterized in terms of four “V”s: Volume, Variety, Velocity, and Veracity. With some help from PLM-enabling solutions, digital transformation and advanced analytics are giving big data a fifth “V”, “Value,” by getting information to where it can be used to drive product and process innovation.

Every change is an opportunity, even the change wrought by big data and analytics. If that opportunity is not translated into innovation, the enterprise eventually stumbles; digital transformation is a prime example of such an opportunity.