This article was originally published in our PLM Report 2023 – the definitive instalment in fashion’s longest-running dedicated PLM market analysis. To read other opinion pieces, exclusive editorials, and detailed profiles and interviews with key vendors, download the full PLM Report 2023 completely free of charge and ungated.

Key Takeaways:

  • Integrations within PLM solutions have evolved over the years, with PLM vendors now working with other solution vendors to create integration frameworks for specific applications.
  • The fashion industry’s supply chain is a complex web of interconnected solutions, processes, and stakeholders, making data sharing and collaboration essential for optimising factors such as cost, quality, sustainability, and speed to market.
  • To achieve a fully connected value chain, businesses need to embrace a data-driven approach that allows all roles in the supply chain to make informed decisions based on real-time and accurate primary data.
  • Successful integration requires understanding of the workflow and technology requirements, building strategic partnerships, and creating a seamlessly integrated collection of specialised applications to provide visibility and efficiency across the supply chain.

Integrations within PLM solutions are not new. There have been many integrations with point solutions over the last 35 years within PDM and PLM. Most have been driven by individual use cases, often specified by the requirements of respective companies. However, only relatively recently have PLM vendors worked with other solution vendors to create integration frameworks for specific applications.

Why do we need to integrate?

The fashion industry’s supply chain is a complex web of interconnected solutions, processes, and stakeholders, contributing to the final product’s cost, quality, sustainability, and speed to market. Currently, process and product data are generated in a range of diverse applications spread across supply chain partners, tiers, and geographies. Unfortunately, this data is often siloed, manually shared, or based on templates, making acquiring accurate product-specific supply data difficult and time-consuming. To optimise the complex factors within the supply chain for every product, businesses must embrace a data-driven approach that allows all roles in the supply chain to make informed decisions based on near real-time and accurate primary data for all process options across the supply chain.

The retail galaxy

What does ‘integration’ look like? First, we must understand how data is created and when it’s required for decisions. We can use an astronomical analogy to describe the ‘Retail Galaxy’ as containing all the processes and data to support Planning, Merchandising, Design, Development, Sourcing, Costing, Manufacturing, Shipping, Warehousing, Distribution and Replenishment to Sales Channels to support our businesses. Of course, various applications and databases support all those processes and data, and often many vendor choices for each type of application.

Source: WhichPLM

We can separate this Retail Galaxy into ‘solar systems’. For simplicity, we can divide it into transactional and non-transactional processes and data, with PLM at the centre of non-transactional and ERP at the centre of transactional.

The PLM and ERP ‘solar systems’ are integrated with each other to varying degrees. For example, PLM provides critical non-transactional data, such as product and material data, to ERP as a foundation for transactional processes and data.

Orbiting the PLM data model at the centre of the PLM ‘solar system’, we have ‘planets’. Each planet represents a process area comprising multiple unique processes, each process with unique associated data. These process area ‘planets’ could be described as Management, Merchandise Planning, Creative Design, Marketing, Consumer, Materials Development, Colour Development, Technical Development, Sourcing & RFQ, and Environmental & Social Governance.

Within each ‘planet’, there are multiple processes and associated data. For example, for Creative Design, these processes could include Trend Analysis, Storyboard, Concept Development Manual, 2D vector design, 3D avatar & engineering design, 3D printing, and CAD knits, weaves, prints, plaids, stripes.

The data from all these processes is collected within the central PLM data model and shared with roles across all processes, using the mechanisms of collaboration, workflow, and automation.

The reality of the PLM solar system

As defined above, there are ten ‘planets’ and more than sixty types of processes. A business will already perform the processes relevant to its product types and business model and collect the associated data. The question is whether each process is supported by an application or performed manually, and how the data is pushed to the central PLM data model. For most businesses, the data from many of these sixty-plus processes is transferred to the central PLM database manually, if at all.

If we wish to optimise the complex factors within the supply chain for every product via a data-driven approach that allows informed decisions across the supply chain, then we need to enable real-time data from every process to be available at the centre of our business’s PLM ‘solar system’.

The anatomy of an integration

The critical questions for any integration are which application collects, manages and ‘owns’ the data (the Parent), where the data is to be shared (the Child), how the two will be connected, and how often the data will be transferred.

This data will drive on-demand dashboards to highlight business performance and enable rapid data-based decisions through interrogation by AI/ML models. Therefore, for accurate and (near) real-time data sharing, Application Programming Interfaces (APIs) should be used to map data and set the frequency and trigger(s) for the integration. APIs can be open and available to all licensees, or closed when developed for a specific purpose/client and subject to an exclusivity agreement. We’d recommend open APIs: you trade short-term exclusivity for long-term evolution and support of the API.
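To make the Parent/Child relationship concrete, the sketch below shows a minimal field-mapping integration in Python. Every name here — the payload fields, the PLM-side schema, and the update trigger — is hypothetical; a real integration would use the vendors’ actual API schemas, authentication, and transport.

```python
from typing import Callable

# Hypothetical payload from the Parent application (the system that
# collects, manages, and 'owns' the data).
parent_payload = {
    "style_ref": "SS24-0193",
    "smv_total": 22.4,
    "labour_cost_usd": 1.87,
}

# Field mapping: Parent field -> Child (PLM) field. Names are illustrative.
FIELD_MAP = {
    "style_ref": "style_number",
    "smv_total": "standard_minute_value",
    "labour_cost_usd": "bill_of_labour_cost",
}

def map_to_child(payload: dict, field_map: dict) -> dict:
    """Translate a Parent record into the Child application's schema."""
    return {child: payload[parent] for parent, child in field_map.items()}

def on_parent_update(payload: dict, push: Callable[[dict], None]) -> None:
    """Trigger: called whenever the Parent publishes an update
    (e.g. a webhook firing, or a scheduled poll finding new data)."""
    push(map_to_child(payload, FIELD_MAP))

# Simulate the integration firing once; in practice `push` would call
# the Child application's API rather than append to a list.
received = []
on_parent_update(parent_payload, received.append)
print(received[0])
```

The trigger and frequency questions from the paragraph above map directly onto how `on_parent_update` is invoked: event-driven (webhook) for near real-time sharing, or scheduled (polling) where the Parent offers no push mechanism.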

Integration purpose and application capability

Business processes are supported by applications. However, not all applications are equal, so we need to understand the requirements of our business processes in tandem with the capabilities of the applications, to define what data to share and how. To illustrate this point, we can consider three examples of process data integrations and the issues to consider.

Defining the origin of the data

Many PLM vendors state that their PLM applications have a Bill of Labour, but is that correct? For example, there may be a table where operations can be manually populated or even a library where those operations can be populated for reuse. However, do they have pre-determined global standard libraries of time-motion operations supported by method codes generated by Predetermined Motion Time Study (PMTS) and calculation of a Method Standard defined by a methodology, including recognition by international bodies, such as the International Labour Organization?

There are several labour costing applications for fashion products that support work study engineering methodologies, operations libraries, and the training and certification for individuals and factories, which have received recognition from the ILO and other international bodies. Yet, it’s still not that simple, and the products supported also differ by application. For example, for three globally recognised labour costing applications and associated methodologies, GSDCost and timeSSD support apparel labour operations, whilst SATRA TimeLine supports footwear labour operations. In a business that produces apparel and footwear, two separate applications would be required to support the labour costing process and two different integrations to a standard data structure.

Therefore, the question to ask the PLM vendor may be: with which labour costing applications does the PLM application integrate to display, a) at minimum, summarised labour costing for product variations, and b) the Bill of Labour (all operations, SAMs/SMVs, and cost per factory) for each product’s supply variation? The answer will tell you whether the PLM application can display an accurate labour costing and whether the integration can be automated rather than manually updated. An informed decision can then be made based on the data and efficiency requirements of the business vs the capabilities, cost, budget, and timeline for elements of the connected solution.
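To illustrate the difference between the summarised labour cost in a) and the full Bill of Labour in b), the sketch below models a handful of operations and rolls them up to the summary a PLM screen might display. The operation names, SMVs, and per-minute factory rate are invented for illustration and do not come from GSDCost, timeSSD, or SATRA TimeLine.

```python
from dataclasses import dataclass

@dataclass
class Operation:
    """One line of a Bill of Labour. All names and rates are illustrative."""
    description: str
    smv: float          # Standard Minute Value for the operation
    minute_rate: float  # hypothetical factory cost per standard minute (USD)

    @property
    def cost(self) -> float:
        return self.smv * self.minute_rate

def summarise_bol(operations: list) -> dict:
    """Roll the full Bill of Labour (b) up to the summarised
    labour costing (a) that a PLM screen might display."""
    return {
        "total_smv": round(sum(op.smv for op in operations), 2),
        "total_labour_cost": round(sum(op.cost for op in operations), 2),
    }

bol = [
    Operation("Join shoulder seams", 1.2, 0.10),
    Operation("Attach collar", 2.8, 0.10),
    Operation("Set sleeves", 3.5, 0.10),
]
print(summarise_bol(bol))
```

The point of the PLM question above is that the `bol` list — the per-operation breakdown per factory — is the data the specialist labour costing application owns; the PLM application needs, at minimum, the summary, and ideally the full breakdown, delivered by integration rather than retyped.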

Availability of the latest data

When colour standards were initially offered in electronic form, any PLM vendor could enter an agreement and receive a file to import to their application, including updates with 100+ new colours for cotton and paper substrates every year or two. However, the distribution of colour standards has evolved. Now, new colour standards are available immediately in ‘live’ updates. For example, the Pantone colour standard is supported by Pantone Live. Does the PLM or Creative Design application have the capability to ‘connect and forget’ to provide those new colours immediately for your creative design and colour teams?

Do applications and data models support the required data?

My final consideration relates to new use cases and data types. In this example, the driver is the focus on improving sustainability, and the regulations upon us, that require measurement of supply chain process variations using science-based primary data. The challenge is to capture accurate data for a defined process, which must be achieved for all processes across all tiers and all supply chain partners. This has led to the creation of new businesses that have focussed on this challenge. If we use the example of measuring Greenhouse Gases (GHG), supported by companies such as Made2Flow, millions of data points have been captured, with applications enabling the definition and comparison of product supply variations in terms of CO2e.

This is something that a PLM vendor could only replicate with enormous investment. Given the imminent sustainability legislation, however, integration is a straightforward and beneficial use case across the fashion industry. The capture of GHG data is fundamentally based on the breakdown of processes for every operation that generates CO2 emissions. At the time of writing, the capability to capture processes in a Bill of Process (BoP) did not exist in any PLM application.

This must be addressed to enable the details of supply chain processes and breakdown of CO2 emissions to be shared seamlessly with PLM. Without a BoP, each supply variation for a product must be either an attachment or summarised. The summary of CO2e per supply variation may sound like an acceptable outcome until you consider the workflow. In this scenario, we rely on the correct process combinations populated in the GHG calculation application by a supply chain specialist, then calculated and pushed back to PLM for assessment by other teams alongside cost, margin, timeline, and demand estimates.
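To show what a Bill of Process adds over a single summarised figure, the sketch below breaks two hypothetical supply variations into per-operation CO2e values and rolls each up to the per-variation summary that would otherwise be the only data pushed back to PLM. All operations and figures are invented for illustration; none come from Made2Flow or any measured dataset.

```python
# Hypothetical Bill of Process: each supply variation is broken down into
# the operations that generate emissions, with CO2e captured per operation.
supply_variations = {
    "Variation A (local knit, air freight)": [
        ("Yarn spinning", 1.8),   # kg CO2e per unit -- illustrative figures
        ("Knitting", 0.9),
        ("Dyeing", 2.4),
        ("Air freight", 3.1),
    ],
    "Variation B (offshore knit, sea freight)": [
        ("Yarn spinning", 1.8),
        ("Knitting", 0.7),
        ("Dyeing", 2.9),
        ("Sea freight", 0.4),
    ],
}

def total_co2e(bop: list) -> float:
    """Roll per-operation CO2e up to the per-variation summary --
    the only figure PLM can hold today without a BoP data model."""
    return round(sum(kg for _, kg in bop), 2)

for name, bop in supply_variations.items():
    print(name, total_co2e(bop))
```

Without the per-operation breakdown, a design team sees only the two totals and cannot tell that freight mode, not the knit, drives most of the difference — which is exactly the informed decision the streamlined workflow below is meant to enable.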

This becomes a slow and clumsy workflow. A streamlined workflow requires all supply options to be available to the design team at the earliest opportunity, whether picked from templates, or suggested by AI/ML, then calculated seamlessly by the integrated GHG application, and available immediately to the design team to enable informed decisions to provide improved sustainability, cost, and timeline, for both product and workflow.

PLM applications must evolve to include new functions and extended data models for deep and practical integration to enable this efficiency.

What are the actions required?

After scratching the surface with basic examples for three of the 60+ unique process types and data sets from the PLM ‘solar system’, we can see many integration options that cannot be addressed simultaneously. What prioritised actions could be considered?

Understand the workflow to understand the technology

A vital element of any successful implementation is that technology supports processes. If you have a good workflow process, you will have good efficiency and adoption… and vice versa. An efficient workflow process provides the user with accurate data, with minimum effort, at the earliest opportunity to make an informed decision.

Brands, retailers, manufacturers, and suppliers already know the data required to make critical decisions within their workflow. They must understand their current and ‘best practice’ workflow, where relevant data is available to support informed decisions at the earliest opportunity. This forms a blueprint for the prospect/customer to understand and prioritise the requirements to implement their best-practice business workflow, including the prioritised timeline for the data and technology to support it. Instead of a long wish list of functions and a box-ticking exercise, requirements will be clearly defined for software vendors and will enable: a) an open discussion between prospect/customer and vendor of the technology capabilities and roadmap to support best-practice workflow, and b) the vendor to prioritise their development and integration roadmap.

Build strategic partnerships

The argument for integrated data across processes, supply chain partners, and disparate applications to drive efficiencies, cost-savings and speed to market is familiar to the fashion industry. There are many examples of application-to-application integrations, but these have been made on a case-by-case basis to drive software sales, with use cases that are ‘low-hanging fruit’ or customer-specific problem statements.

Strong collaboration between brands, retailers, manufacturers, suppliers, and software vendors is essential for understanding primary data generation and integration required to share with decision-makers across the supply chain.

  • Brands, retailers, manufacturers, and suppliers must partner to map the supply chain processes, with specialist assistance to expedite progress.
  • Individual businesses should bring a deeper understanding of their current and ‘best practice’ workflow for technology evaluations. Ideally, industry best practices would be defined.
  • Software vendors must partner to enable the collection, integration, and visibility of primary data across the supply chain. A greater understanding of workflow from prospects & customers will allow prioritisation of the roadmap for integrations and new application functions and features to support them.

The industry must create genuine strategic partnerships sharing data and insights, where it’s accepted that an effective connected solution must be a seamlessly integrated collection of specialised applications.


Statements on investing in technology and integrating data to provide visibility across the supply chain are 30 years old, so what needs to change in the fashion industry? We shouldn’t expect altruism from every business in the fashion industry, yet we need to change some self-focussed behaviours. There are many areas to address to facilitate a complete digital value chain; a single company could not achieve this alone. Genuine partnerships must be created where each partner delivers their specialised solution element to the highest standard, and seamless data integration proves the sum is greater than the parts.