Advancing Data & Process Management With PLM

All digital processes and tasks, whatever they are intended to achieve, are a series of activities in which data is received, created, updated and then sent on its way. In today’s digital environment, these processes are often enabled by software solutions and applications daisy-chained together, usually a handful or a dozen, sometimes many more. All businesses, and therefore all apps, run on data, so getting anything done requires a continuous flow of accurate, readily accessible data.

In turn, these digital flows require effective management of the processes they are intended to enable and of the data they create and/or use. This places data and process management squarely among CIMdata’s 12 critical elements for digital transformation, alongside end-to-end connectivity, digital threads, digital twins and other capabilities that enable end-to-end Product Lifecycle Management (PLM).

In previous articles on engineering.com, I have covered the “whys” and “hows” of using PLM to enable digital transformation. This article addresses how data and process management is a fundamental and foundational element of an organization’s digital environment, why it is necessary and how to get off to a good start.

Because a chain of processes is usually needed to get even the simplest tasks done, effective data and process management offers fundamental assurances about data and its continuous creation, use and reuse. This is nothing less than the core enablement of an organization’s digital backbone.

Improvements in data and process management (mapping, cleaning up, standardizing, optimizing and so on) impact every part of the extended enterprise. In product development specifically, these impacts touch requirements, services, assets and newer, more effective processes. Design partners in the supply chain and logistics providers, for example, are also impacted for the better. Every other part of the organization faces corresponding challenges and stands to gain comparable benefits.

An earlier article covered the rationale for the “what” of digital transformation. It is that needed information must be set free—released from stashes of departmental and business-unit data (“silos”), from outdated (“legacy”) information repositories, from digital formats no longer in everyday use, from burial dozens of layers deep in files and from paper.

There is widespread agreement that data access, use, reuse, sharing, collaboration and analysis are hamstrung by silos, legacies, buried data, paper and other similarly hard-to-reach data. These persist because they meet specific business needs, which is why cleanups are slow, costly and often uneven. Until these difficulties are resolved as part of digital transformation, basic data used by the people responsible for new processes, products, services and other assets will remain unreliable or inaccessible.

CIMdata’s Enterprise Application Architecture. (Image courtesy of CIMdata.)

In developing new products and services, data and process management supports the stack of tools and solutions in CIMdata’s Enterprise Application Architecture. Data and decisions based on these tools and solutions cycle continuously, making the information capabilities and models intensely dynamic. This enterprise information technology infrastructure is the core of a robust set of foundational enterprise capabilities. These capabilities in turn enable a set of value chain capabilities including product development and lifecycle support, strategic sourcing, manufacturing planning and execution, and service and recovery. Reporting, analytics and all other business transformation capabilities sit at the top of the architecture.

WHY: Entrenched Processes and Tools Undermine Digital Transformation

Processes and applications don’t just create and use data: they live, and sometimes die, with it. I am writing this with a nod to the development of new products, services, assets and more effective processes, including better integration of design partners in the supply chain and of logistics.

Every part of the enterprise, however, finds challenges in the processes and data it relies on. So does every other organization, business unit, standalone company, institution and agency. Overcoming these challenges is at the heart of digital transformation.

Process-driven tasks are repetitive and circular, which means they feed on, and are fed by, data loops. These loops keep users of related processes on track, wherever they sit in the organization and wherever their tasks fall: upstream, downstream or anywhere in a given process flow.

Feedback loops and similar bi-directional information flows are good indicators of the difficulties in data and processes; feedback loops need ongoing tweaks and often harbor the deepest problems. Basic remedies include updating processes, synchronizing tasks, reworking access to data and decluttering that data. But data and process management often entails much more; knowing how to proceed is vital.

Conversely, smooth data flows in or out of feedback loops indicate that data is moving through each process task without data loss or the need for data re-entry.
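One rough way to picture a "smooth flow" is that each task receives everything it needs from the task before it, so nothing has to be re-entered by hand. The sketch below is a generic illustration in Python; the stage names and field names are invented for this example and are not drawn from any PLM product:

```python
# Minimal sketch of data flowing through chained process tasks.
# Each stage declares the fields it needs; a missing field marks
# exactly the point where manual re-entry would otherwise occur.

def check_flow(record, stages):
    """Return (stage, missing_fields) pairs for stages whose inputs are incomplete."""
    gaps = []
    for name, required in stages:
        missing = [f for f in required if f not in record]
        if missing:
            gaps.append((name, missing))
    return gaps

# Hypothetical record and stage chain (design -> change -> manufacturing).
record = {"part_id": "P-100", "cad_model": "p100.step", "revision": "B"}
stages = [
    ("design_release", ["part_id", "cad_model"]),
    ("change_order",   ["part_id", "revision"]),
    ("mfg_planning",   ["part_id", "revision", "routing"]),  # "routing" absent
]

print(check_flow(record, stages))  # [('mfg_planning', ['routing'])]
```

A flow with an empty result from such a check is "smooth" in the sense used above: no task downstream is starved of data.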

Effective data and process management with PLM can end decades of struggling to get things done right and make sound decisions. However, data is inseparable from its processes, so mismanaging one can scramble the other.

HOW: Some Approaches

Any worthwhile initiative in data and process management requires ensuring that every process and task can access the information its users need to collaborate, make decisions and innovate anywhere in the organization. Many companies have zeroed in on these problems but, as noted, progress has been uneven—especially in product development and production, where dozens of processes rely on data from countless sources.

As noted above regarding CIMdata’s Enterprise Application Architecture for data and process framework, all these processes, their data and data sources are dynamic; they never stop changing.

Because data and process management initiatives are so dynamic, they can benefit greatly from the workflow capabilities found in newer PLM solutions; these are invaluable for:

  • Transparency, so the initiative’s inner workings are easily understood.
  • Sustainability, so the initiative holds up over time and accommodates innovations.
  • Scalability, so the initiative can be expanded as needed.
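To make these three properties concrete, here is a minimal, hypothetical workflow sketch in Python. An audit trail provides transparency, and new steps can be appended without disturbing existing ones, which is one simplified reading of sustainability and scalability. The class and step names are invented; this does not reflect any specific PLM vendor's API:

```python
from datetime import datetime, timezone

class Workflow:
    """Toy workflow: an ordered list of steps plus an audit trail."""

    def __init__(self, name):
        self.name = name
        self.steps = []        # list of (step_name, handler) pairs
        self.audit_log = []    # transparency: every executed step is recorded

    def add_step(self, step_name, handler):
        # Scalability/sustainability: the workflow grows by appending
        # steps, without modifying the steps already in place.
        self.steps.append((step_name, handler))

    def run(self, payload):
        for step_name, handler in self.steps:
            payload = handler(payload)
            self.audit_log.append(
                (datetime.now(timezone.utc).isoformat(), step_name)
            )
        return payload

# Hypothetical two-step engineering-change flow.
wf = Workflow("engineering_change")
wf.add_step("validate", lambda p: {**p, "validated": True})
wf.add_step("approve",  lambda p: {**p, "status": "approved"})

result = wf.run({"change_id": "EC-42"})
print(result["status"])              # approved
print([s for _, s in wf.audit_log])  # ['validate', 'approve']
```

The point of the sketch is the shape, not the code: anyone reading the audit log can see what ran and when, and extending the initiative means adding steps rather than rewriting them.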

Additionally, if an older PLM solution is used, its workflow capabilities should be reviewed, and alternatives devised if needed.

Relevant data for any process or task may reside anywhere: in the Internet of Things (IoT), in the digital twins of other organizations or in other repositories. This data also varies widely with the tools and tasks that depend on it. There are many, but the obvious ones are mechanical and electrical/electronic design systems; engineering change management; document-management repositories and files; and modeling, analysis, simulation and libraries of software algorithms.

Expert users keep some of these sources, systems and tools up to date, but many are approaching obsolescence, which adds to headaches.

Getting data into the proper workflows and formats also means providing regular effectiveness checks such as verification and validation. Help from IT may be needed to reformat the data inputs that feed process feedback loops or that are coded into automated provisions for updates, retention and deletion.
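Automated retention and deletion provisions of the kind mentioned above often reduce to rule tables applied on a schedule. A minimal sketch, assuming a simple age-based policy keyed by record type (all names and retention periods here are invented for illustration):

```python
from datetime import date, timedelta

# Hypothetical retention rules: record type -> days to keep.
RETENTION_DAYS = {
    "change_order": 3650,   # keep roughly 10 years
    "temp_export":  30,     # delete after a month
}

def expired(record, today):
    """True if the record is past its retention period."""
    limit = RETENTION_DAYS.get(record["type"])
    if limit is None:
        return False  # no rule defined: retain by default (the safe choice)
    return (today - record["created"]) > timedelta(days=limit)

records = [
    {"id": 1, "type": "temp_export",  "created": date(2024, 1, 1)},
    {"id": 2, "type": "change_order", "created": date(2024, 1, 1)},
]
today = date(2024, 6, 1)
to_delete = [r["id"] for r in records if expired(r, today)]
print(to_delete)  # only the stale temporary export: [1]
```

The verification-and-validation point above applies here too: a rule table like this is exactly the kind of artifact that needs a periodic effectiveness check, since a wrong entry silently deletes or hoards data.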

Experts at CIMdata insist that upgrades to data and process management be approached as a single undertaking, because data and the processes that use it are so completely integrated. Splitting the work into two or more separate projects is never recommended.

Successful data and process management provides:

  • Protection of intellectual property and data security.
  • The appropriate level of content management.
  • Knowledge management and reuse.
  • Search, find, reporting and extensible data analytics.
  • Data visualization services, both high-performance and lightweight.
  • Social media and collaboration.
  • Requirements definition and traceability.
  • Process and workflow flexibility.
  • Multi-disciplinary performance enabled by modeling and simulation.

These points summarize the key defining elements of core platform capabilities in the digital backbone of data and process management.

As a final caution: building accessibility into data and process management is a significant effort in coding, but one that must accompany digital transformation. Without it, some vital data will be unreachable by digital threads and unavailable to help flesh out digital twins.

Conclusion

An enterprise’s competitiveness, sustainability and long-term prosperity depend on effective data and process management and the appropriate end-to-end enablement with PLM.

Data and process management upgrades directly address many issues that delay or disrupt efforts to bring more competitive products and assets to market. Successful upgrades also make services more cost-effective and enable (or push) the organization to be more innovative and productive.

No matter how the eventual payoffs are calculated, they will be enormous. CIMdata has verified benefits as diverse as faster design, development and production; new avenues for collaboration; improved profits; quicker responses to customers’ wants; and reduced risk of products and services that flop.

My next Deep Dive will cover using configuration management and PLM to stay abreast of the status and modifications in the organization’s tools, systems, and applications—the mechanisms by which data is kept relevant.