PLM Stays Relevant in Manufacturing

Experts weigh in on how PLM still has a seat at the table in today’s tech-heavy marketplace.

In 1985, American Motors Corporation (AMC) was struggling as the No. 4 carmaker in America. It made an audacious gamble and moved all of its design and engineering information into a new computer technology: the relational database management system.

The automated recordkeeping included engineering drawings created with another new technology, CAD. The AMC Jeep Cherokee and its successor, the Grand Cherokee, were the first vehicles brought to market using the database technology we now know as product lifecycle management (PLM). The program was noted at the time as the fastest development cycle for a new product introduction in the modern era of automobile manufacturing.

When Chrysler bought AMC in 1987, it was so impressed with AMC’s technology that it expanded its use throughout the organization. By the mid-1990s Chrysler was reporting development costs that were half the industry average. 

PLM continues to evolve as emerging technologies change the landscape. Smaller companies are embracing flexible, cloud-based PLM solutions that lower the barriers to adoption. Meanwhile, digital innovations like the digital thread and the digital twin are integrating with PLM to extend its capabilities across the entire product lifecycle. Additive manufacturing and advancements in materials science are also raising new questions around design specifications and configuration management. Is PLM keeping up? Is it still essential?

DE talked with two PLM experts to discuss the challenges and opportunities facing the PLM marketplace today. Stan Przybylinski is vice president of CIMdata, Inc., a market analysis firm retained by software companies and manufacturers. Jonathan Scott is chief architect at Razorleaf, a PLM product and services implementation specialist. 

The following interview is a lightly edited version of their remarks. 

Digital Engineering (DE): Are new innovations in engineering and manufacturing such as digital twin, digital thread and extended reality (XR) considered PLM investments? How are these technologies being folded into existing systems and workflows? 

Stan Przybylinski (SP): The PLM vision from the beginning included the notion of the digital thread. People are seeing the value of having that digital thread of associated information from idea through life. Digital twins arose from mechanical CAD and collaboration around evolving product designs. XR sort of comes from the same place. It is just that the bounty of Moore’s Law is making many of these things more practical. So yes, I see them as PLM investments.

Digital twins and the digital thread are associated concepts in that the digital thread has (or will have) data that can animate these digital twins to support collaboration across much more than the engineering disciplines. I have joked for a while that while everyone is talking about digital twins, those twins are fraternal: same words, with very different meanings depending on the solution or service provider. In CIMdata’s definition of PLM, multidisciplinary lifecycle optimization is at the core. Twins of different aspects of your business and your product and service lifecycle are being used today to improve decision-making.
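To make that association concrete, here is a minimal, hypothetical sketch in Python of how a digital thread might link lifecycle artifacts to the twins that consume them. The class and field names are invented for illustration and do not come from any vendor’s platform.

```python
# Illustrative only: a digital thread as the index that associates every
# lifecycle artifact of one product unit, which different twins then consume.
from dataclasses import dataclass, field

@dataclass
class Artifact:
    kind: str        # e.g., "mcad", "simulation", "sensor-feed"
    uri: str         # where the authoritative data lives
    revision: str

@dataclass
class DigitalThread:
    serial_number: str
    artifacts: list[Artifact] = field(default_factory=list)

    def link(self, artifact: Artifact) -> None:
        self.artifacts.append(artifact)

    def feed_twin(self, kind: str) -> list[Artifact]:
        """Return the artifacts a given twin (design, manufacturing,
        service) would pull to stay in sync with the physical asset."""
        return [a for a in self.artifacts if a.kind == kind]

thread = DigitalThread(serial_number="UNIT-0001")
thread.link(Artifact(kind="mcad", uri="plm://models/bracket", revision="C"))
thread.link(Artifact(kind="sensor-feed", uri="iot://fleet/UNIT-0001/strain", revision="live"))
print([a.uri for a in thread.feed_twin("sensor-feed")])
```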

Jonathan Scott (JS): I believe the use case for these new technologies determines how organizations view them. When a company tries to establish a digital twin so that it can remotely control and manage its product, enabling it to offer that product as a service, I believe it will see this as a new “offering,” connect it to product development, and think of it as a PLM investment.

When a company tries to deliver better training for service technicians via XR methods, they may recognize PLM as an enabler, but not consider the investment PLM-related. For now, a lot of the use cases we are seeing around digital twin and digital thread are still very much engineering driven and considered PLM investments. We are not seeing a lot of XR at the moment, but the discussions around digital thread and digital twin seem very much to be extensions of PLM capabilities, just using new buzzwords, and those dovetail nicely with existing systems and workflows even when those systems/workflows are being augmented by a new tool to add that [digital thread] flavor.

DE: Are PLM spending habits changing? Is it different for those companies traditionally considered on the cutting edge of technology than for other companies? 

SP: It is not clear to me what they might be changing from. CIMdata uses a broad definition of PLM, much broader than most of the solution providers that say they do PLM. Buying behaviors are very different across the many things we would include: MCAD; CAM; simulation and analysis; digital manufacturing (think DELMIA and Tecnomatix); electronic design automation (EDA); architecture, engineering, and construction (AEC); and application lifecycle management (ALM). 

Many companies seem to prefer buying on subscription, judging by statements from leading providers that had not yet moved to a subscription model at the time. That is a change. Of course, the cloud is becoming more important. There was a time when you had to justify pitching a cloud-based solution. Now you have to explain why you cannot (or do not want to) use the cloud.

Truly cutting-edge companies are often catered to by the market leaders because it is those difficult problems that will push the solutions forward. One of the benefits of cloud delivery is that it reduces the barriers to adoption for innovators of any size to get access to leading-edge solutions. In fact, we see companies like Siemens with Xcelerator and Dassault Systèmes with their 3DEXPERIENCE platform using their platforms to bring their offerings to broader and broader audiences. 

JS: With the advent of more options for cloud services, which deliver some component(s) of PLM, I see more small businesses finding PLM accessible. In other words, there are more people shopping for PLM tools/capabilities and the budget range has grown even wider. 

Small groups expect they can subscribe to a cloud service, spend a few thousand dollars, and have some capability up and running. Those folks are not expecting full enterprise PLM, but they are often startups or aggressive groups who are happy to build their own systems using combinations of toolsets. This mindset translates to forward-leaning groups within larger organizations, too, when IT policy does not prevent them from exploring solutions like these.

Again, the advent of cloud-based services means that even groups constrained by corporate policies can find some wiggle room, because the customer isn’t “installing” anything and there is no “project” involved. In some cases, large corporate IT is not discouraging this exploration; it is encouraging it as a means of innovating the technology stack. IT can challenge the traditional PLM vendors without threatening to displace them, by adding cloud-based solutions that complement enterprise PLM and solve “point” problems.

There is a subtle knock-on effect of this affinity for cloud-based PLM solutions, which relates to PLM services. Once buyers see that they can try out PLM capabilities via services like OpenBOM, Duro, Propel, Upchain and others without a large financial justification and without engagement of their IT team, they are eager to move ahead and don’t think about (or choose to ignore) the work needed to implement and adopt the capabilities. 

Specific to this market, the buyers of PLM are often engineers and tinkerers, so they are predisposed to “I can teach myself how to implement this.” I am comfortable saying all of this from personal experience—I got into PLM consulting because I was an engineer who taught himself how to implement a [product data management (PDM)] tool. So, there are a number of failures and messes brewing related to PLM right now where cloud PLM is being underimplemented and people are unintentionally stranding their data in a cloud repository they were “trying out.” 

Because it is not trivial to migrate data into a cloud PLM service, the size of those messes is limited by how much data cloud early adopters can generate during their “trials.” Nonetheless, the cheapness of the tools is leading to sloppiness of behavior and a willingness to make mistakes. I don’t mean to deride this entirely because experimentation has its virtues, but I thought it was relevant to share on the point of how spending habits are changing.
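One inexpensive hedge against that stranding is to treat every trial as if it will end, and snapshot the data on the way out. The sketch below shows the general shape of such an export against a hypothetical REST interface; the endpoint, field names and token handling are invented, since each cloud PLM service exposes its own API.

```python
# Sketch of a trial-exit export: page through items in a (hypothetical)
# cloud PLM REST API and snapshot them to a local CSV. Nothing here is a
# real vendor endpoint; adapt it to the service's actual API.
import csv
import requests

API_BASE = "https://plm.example.com/api/v1"   # hypothetical service
TOKEN = "replace-with-your-token"

def export_items(path: str) -> int:
    """Write part number, revision and description for every item."""
    headers = {"Authorization": f"Bearer {TOKEN}"}
    rows, page = [], 1
    while True:
        resp = requests.get(f"{API_BASE}/items",
                            params={"page": page}, headers=headers, timeout=30)
        resp.raise_for_status()
        batch = resp.json().get("items", [])
        if not batch:
            break
        rows.extend(batch)
        page += 1
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["part_number", "revision", "description"])
        writer.writeheader()
        for item in rows:
            writer.writerow({k: item.get(k, "") for k in writer.fieldnames})
    return len(rows)

if __name__ == "__main__":
    print(f"Exported {export_items('trial_items.csv')} items")
```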

DE: Are corporate digital transformation initiatives at manufacturing companies considered PLM innovations? 

SP: The name of the change includes the word “digital,” and most of these transformations involve better processes and better uses of digital information to radically change your business. CIMdata maintains that if you are a product company, then a major source of that data is your PLM implementation. This is particularly true if you are moving toward product-as-a-service business strategies. How are you going to know whether your products are sturdy enough and reliable enough to meet their service level agreements (SLAs) without simulation to guide their design, manufacture and use?

In surveys we have done recently on digital transformation, we asked companies about the connections between their corporate digital transformation initiatives and their PLM strategies and implementations. Unfortunately, in many companies, these topics are not well connected.

JS: Digital transformation seems to be a broad term right now, with a specific meaning in the PLM market, but I hear about lots of digital transformation initiatives outside of the product world. 

Within the PLM space, digital transformation is being applied as a new buzzword for an old concept, going “paperless,” as well as to describe truly transformational changes.

One example of that transformation I’ve participated in is an organization’s drive toward encapsulating and publishing (internally) expert-generated engineering models for use by non-experts. For example, a Ph.D. team of analysts creates a model for the behavior of the product and publishes it so that all of the company’s applications engineers can use it to validate a client’s request for a new usage of the product. 

This shift-left in validation exposes the company to new markets because the sales teams now have the tools to explore those markets, leveraging the company’s existing [intellectual property]. In fact, collections of these models are published via an e-commerce-like internal shopping platform where end users can browse, select and run models with limited inputs/parameters via on-demand private cloud resources, and never need to know which engineering tools are being used to execute their analysis.
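A minimal sketch of that publishing pattern, with an invented model and invented parameter limits: the expert’s analysis is wrapped behind a narrow interface that validates the non-expert’s inputs before anything runs, so the applications engineer never touches the underlying tool.

```python
# Minimal sketch of publishing an expert model for non-experts. The model,
# parameter names and limits are invented for illustration.
from dataclasses import dataclass

@dataclass(frozen=True)
class PublishedModel:
    name: str
    limits: dict[str, tuple[float, float]]   # allowed range per input

    def run(self, **inputs: float) -> float:
        # Reject any input outside the envelope the expert team validated.
        for key, value in inputs.items():
            lo, hi = self.limits[key]
            if not lo <= value <= hi:
                raise ValueError(f"{key}={value} outside validated range [{lo}, {hi}]")
        return self._solve(**inputs)

    def _solve(self, load_kn: float, temp_c: float) -> float:
        # Stand-in for the Ph.D. team's analysis; in practice this would
        # dispatch to the real engineering tool on private cloud resources.
        return load_kn * (1 - 0.002 * (temp_c - 20))

bracket_margin = PublishedModel(
    name="bracket-margin",
    limits={"load_kn": (0.0, 50.0), "temp_c": (-40.0, 120.0)},
)
print(bracket_margin.run(load_kn=12.5, temp_c=85.0))   # within the published envelope
```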

Capabilities like that are truly transformative in how organizations work. The more mundane “paperless” example can still be transformative though. I am working with another customer who uses paper-based shop floor work instructions and is making plans for how they will implement a [manufacturing execution system]. Switching from paper to screens on the shop floor is a huge change, and it is in fact digital, so the moniker “digital transformation” fits, although I would say it means something different from my first example.

DE: How is PLM integrating additive manufacturing (AM) technologies and related advancements in materials science? 

SP: This really depends on the industry. It does not seem like the notion of design for additive manufacturing (DfAM) has really taken hold yet in industry. There are a lot of success stories in part redesign, some spectacular ones in aerospace. Part of DfAM is leveraging generative design, which often results in organic shapes that can only be made using AM. This is being widely promoted and, again, Moore’s Law has caught up to this approach as well. Altair has been doing generative design since the 1990s, but the computing horsepower today makes it more feasible.

The other key thing about AM is the use of new materials and new manufacturing methods. Industry has spent long decades using simulation and physical testing to understand the properties of specific materials subjected to different stresses in manufacturing and use. That knowledge is the foundation under much of product development.

What about our understanding of brand-new alloys designed for use in a new deposition process? How will they hold up to other manufacturing steps and decades of use? The introduction of new materials and processes requires more simulation and testing to build up that same level of knowledge. It is not surprising that most of the simulation leaders have invested in materials science in recent years.

JS: We are not seeing a lot of impact from this yet in PLM technology implementations, but in the Department of Defense space, we see it popping up as a need to keep extending the definition of “technical data” that must be published and shared. Prior to AM, materials were largely assumed to be homogeneous, so geometry coupled with material specifications was sufficient as a product specification.

The advancements in materials and additive processes mean that the specification of the additive process is inextricably linked to the specification of the resulting product. In one way, this just means that PLM systems have more data to manage to specify a component. However, it raises interesting questions about the interchangeability of components and the standard configuration management rules by which organizations have operated for decades.

What if two components match in form, fit and function, but differ in failure behavior because of their internal crystalline structure or thermal pre-stressing? Are they still interchangeable? This is getting outside of your question, but I believe we may need to revisit traditional configuration management rules in the coming years as we explore the model-based world and the new techniques it is introducing (like AM). I am not making the connection explicit here, but I believe model-based technology was one of the key enablers of AM.
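A hypothetical sketch of that question in code: once the additive process recipe becomes part of the component specification, an interchangeability check has to look past form, fit and function. All names and fields below are illustrative, not drawn from any PLM system.

```python
# Hypothetical illustration: two components can match on form, fit and
# function yet differ in the process that produced them. All fields invented.
from dataclasses import dataclass

@dataclass(frozen=True)
class ComponentSpec:
    geometry_id: str          # form and fit: the released geometry
    function_class: str       # what the part is qualified to do
    material: str
    build_process: str        # e.g., "LPBF", "machined", "cast"
    build_recipe_id: str      # validated process parameters, now part of the spec

def interchangeable(a: ComponentSpec, b: ComponentSpec) -> tuple[bool, str]:
    if (a.geometry_id, a.function_class) != (b.geometry_id, b.function_class):
        return False, "form/fit/function mismatch"
    # Classic CM rules would stop at form, fit and function; AM pushes the
    # material and process definition into the comparison as well.
    if (a.material, a.build_process, a.build_recipe_id) != \
       (b.material, b.build_process, b.build_recipe_id):
        return False, "process/material mismatch: failure behavior may differ"
    return True, "interchangeable"

machined = ComponentSpec("BRKT-100-C", "load-bearing", "Ti-6Al-4V", "machined", "N/A")
printed  = ComponentSpec("BRKT-100-C", "load-bearing", "Ti-6Al-4V", "LPBF", "RCP-17")
print(interchangeable(machined, printed))  # flags the process mismatch
```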

Randall S. Newton is principal analyst at Consilia Vektor, covering engineering technology. He has been part of the computer graphics industry in a variety of roles since 1985. Contact him at DE-Editors@digitaleng.news.