Your PLM Isn't a Workflow Tool Anymore.
It's Your Product Data Master.
Something interesting is happening to the way mid-market brands are using their PLMs.
On paper, a PLM is a workflow tool. It’s where tech packs get built, where sampling gets tracked, where approvals get chased, where the product moves from concept through to production with everyone involved able to see where it is and what it needs. That’s what it was designed to do, and that’s how most vendors still sell it.
Talk to the ops leaders actually running PLM implementations right now, though, and a different picture starts to emerge. The workflows are still happening. But they’re not necessarily happening inside the PLM.
What is happening inside the PLM is something more foundational. It’s becoming the product data master. The single source of truth for every attribute, every measurement, every material, every supplier reference, every sustainability data point, every piece of structured information that needs to exist on a product. And from there, it’s feeding everything else.
Shopify pulls from it. NetSuite pulls from it. Production trackers, sampling systems, allocation tools, reporting dashboards. All downstream. All dependent on the PLM having clean, structured, accurate data at the centre.
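That hub-and-spoke shape can be sketched in a few lines. This is a hypothetical illustration, not any vendor's schema or API: one canonical product record, with each downstream system deriving its own read-only view from it.

```python
from dataclasses import dataclass

# Hypothetical canonical product record held in the PLM.
# Field names are illustrative, not a real vendor schema.
@dataclass(frozen=True)  # frozen: downstream code cannot mutate the master
class ProductMaster:
    style_code: str
    name: str
    composition: dict  # material -> share, e.g. {"organic cotton": 0.95}
    supplier_ref: str
    weight_grams: int

def shopify_view(p: ProductMaster) -> dict:
    """Shape the master record for an e-commerce listing (illustrative)."""
    return {
        "title": p.name,
        "sku": p.style_code,
        "metafields": {"composition": p.composition},
    }

def erp_view(p: ProductMaster) -> dict:
    """Shape the same record for an ERP item master (illustrative)."""
    return {
        "item_code": p.style_code,
        "supplier": p.supplier_ref,
        "net_weight_g": p.weight_grams,
    }

tee = ProductMaster(
    "TS-001", "Organic Tee",
    {"organic cotton": 0.95, "elastane": 0.05},
    "SUP-014", 180,
)
# Both views agree because both are derived from the same source of truth.
assert shopify_view(tee)["sku"] == erp_view(tee)["item_code"]
```

The point of the sketch is the direction of the arrows: the views are computed from the master, never edited in place, so "if the PLM is wrong, everything else is wrong" — and if it's right, everything downstream is right for free.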
“We don’t really do the workflows in it,” one ops lead told us recently. “But if the PLM is wrong, everything else is wrong.”
That’s the shift worth paying attention to.
For a long time, the case for investing in a PLM was built around efficiency. Faster sampling cycles, fewer email chains, better visibility across the product calendar. All real, all valuable, but all essentially productivity arguments.
The newer case is structural. A brand that can’t describe its own products in structured data can’t do any of the things that now matter most. It can’t populate a digital product passport. It can’t feed an AI model that’s trying to help with demand planning. It can’t integrate cleanly with a new channel partner. It can’t answer sustainability questions from a regulator or a retailer. It can’t scale without the data layer underneath it becoming the bottleneck.
The brands that are getting this right are the ones who’ve stopped thinking about their PLM as a project management tool and started thinking about it as infrastructure.
The distinction matters because it changes what you’re optimising for. If the PLM is a workflow tool, you care most about user experience, approval flows, notification logic. If it’s a data master, you care most about the structure of the data model, the quality of the integrations, the discipline around tagging, the governance of who can change what. Different priorities. Different implementation. Different definition of success.
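Under the data-master framing, "discipline around tagging" and "governance of who can change what" stop being vague aspirations and become concrete, testable rules. A minimal sketch of what such a gate might look like, with made-up field names, roles, and vocabularies:

```python
# Hypothetical governance rules for master-data edits.
# Roles, fields, and vocabularies are illustrative assumptions.
ALLOWED_SEASONS = {"SS25", "AW25", "SS26"}  # controlled vocabulary
REQUIRED_FIELDS = {"style_code", "season", "composition", "supplier_ref"}
EDIT_RIGHTS = {
    "composition": {"product_developer"},
    "supplier_ref": {"sourcing"},
    "season": {"merchandising"},
}

def validate_record(record: dict) -> list:
    """Return a list of problems; an empty list means the record can be mastered."""
    problems = [f"missing field: {f}" for f in REQUIRED_FIELDS - record.keys()]
    if record.get("season") not in ALLOWED_SEASONS:
        problems.append(f"season {record.get('season')!r} not in controlled vocabulary")
    return problems

def can_edit(role: str, field: str) -> bool:
    """Only the owning role may change a governed field."""
    return role in EDIT_RIGHTS.get(field, set())

draft = {"style_code": "TS-001", "season": "SS25", "composition": {"cotton": 1.0}}
print(validate_record(draft))        # flags the missing supplier_ref
print(can_edit("marketing", "composition"))  # False: not the owning role
```

Note what this optimises for: not approval flows or notifications, but whether a record is complete, consistent, and changed only by the people accountable for it.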
One head of IT at a contemporary fashion brand put it simply when we spoke: “The PLM is where truth lives. Everything else is a view.”
That framing is useful. It means the PLM doesn’t have to win the UX battle with a designer’s preferred sampling spreadsheet, or a production manager’s WhatsApp group, or a sustainability lead’s bespoke tracker. Those tools can all keep existing. What the PLM has to do is be the definitive, structured, trustworthy version of what the product actually is. Everything else orbits around that.
It’s a quieter role than the one the category was originally sold on. But it’s arguably a more important one.
For brands thinking about implementing a PLM, or re-implementing one that stalled, the question to start with isn’t “how do we move our workflows into this system?” It’s “what data needs to live here, and what needs to happen to make sure it stays accurate?” Those are very different projects. The first is about change management. The second is about architecture.
And the second is what the operators actually getting value from their PLMs are really doing.