Digital is a chance for IT projects, not for PLM. But it is not an enemy either.

I look at John Stark's post, an answer to Roger Tempest, about digital transformation compared to PLM. I agree that a digital thread is basically nothing more or less than PLM. PLM has been a digital thread for many years, as have other management domains (ERP, CRM, PPM, etc.).
In my view, Digital transformation can be seen in two different ways:

  • A transformation of the way a company behaves, replacing existing non-digital processes with digital ones. PLM is one of these processes.
  • A transformation of the company itself, creating new digital products where only physical products existed before. That's typically what happened in the press or music industries, already several years ago.

As usual, different meanings for the same word. But the two should not be mixed.
If I focus on the first meaning, then there is no added value of digital over PLM; in fact, it is the same thing.
So is a digital thread something new? Probably not. Is it something trendy and hype? Certainly yes.
The point is that a digital thread often brings the usage of recently appeared IT tools, like enterprise social networks or, in the manufacturing domain, Google Glass, which promises a lot for augmented reality in the plants.
And that's where Digital has a value: providing an opportunity for IT investments where it might have been difficult to sell a new IT project internally without a Digital initiative.

So PLMers, do not bash Digital transformation, simply use it.
At the end of the road, only the result counts.

BOM: how does the business influence the content of the BOM?

There are lots of discussions on the net about the BOM: what it has to include, single BOM or multiple BOMs, and so on. On the vendor side, it is always the same story: assemblies with components. Is there only one way to structure a BOM?
First, I feel that the word BOM, which originated in manufacturing, should be changed to something else: first, because the same word is used by several departments while people usually have different concepts and needs in mind; second, because the reason to build a BOM is often much more than listing the components of a (sub)assembly.
Then, the BOM is usually meant to be the core of the product definition, meaning it fully describes the product structure: how the product is built.

Classical BOM – In order to differentiate different BOM concepts, I will define a classical BOM as a structure of physical parts, components being added to each other to build an assembly. It is often thought that if you sum the weights of the components, you get the weight of the product.

Where the classical BOM concept is failing – In my view, the classical BOM is well adapted to companies assembling components. Where it becomes difficult to use is when the manufacturing process includes separation or removal of material. How can a classical BOM represent a milling operation that makes a hole in a part, a part cut into pieces to build several products, or a preparation that you heat to remove the water? It cannot. Which industries have such processes? A lot, to varying degrees! It means that the classical BOM is not appropriate to represent the product alone, far from it.
In the good old days, such information was put in the ERP routings, leading to the list of operations to manufacture a part, the operator time for each step, and thus the famous added value, or direct labor. It means keeping the BOM and routings synchronized upfront, which is difficult to master in case of a large number of design changes. In my view, the classical BOM cannot support manufacturing processes where a large number of operations occurs.
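To make the limitation concrete, here is a minimal sketch (all class names, item names, and weights are hypothetical) contrasting a parts-only classical BOM with a structure that also carries transformation operations. The point it illustrates is the one above: a milling step removes material, so summing component weights overstates the finished product.

```python
# Hypothetical sketch: a parts-only "classical" BOM vs. a structure that
# also records operations which add or remove material.

from dataclasses import dataclass, field

@dataclass
class ClassicalBOM:
    """Assembly weight = sum of its component weights (kg)."""
    components: dict  # item name -> (quantity, unit_weight_kg)

    def weight(self):
        return sum(q * w for q, w in self.components.values())

@dataclass
class ProcessBOM:
    """Same components, plus operations that transform the material."""
    components: dict
    operations: list = field(default_factory=list)  # (operation, delta_kg)

    def weight(self):
        parts = sum(q * w for q, w in self.components.values())
        return parts + sum(delta for _, delta in self.operations)

# A machined bracket: one steel blank, then a milling operation removes 0.3 kg.
classical = ClassicalBOM({"steel-blank": (1, 2.0)})
process = ProcessBOM({"steel-blank": (1, 2.0)},
                     operations=[("mill-hole", -0.3)])

print(classical.weight())  # 2.0 -> overstates the finished product
print(process.weight())    # 1.7 -> matches what leaves the plant
```

This is only a sketch of the idea, not a real PLM data model; a production structure would also need routings, scrap rates, and units of measure.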

Conclusion – The classical BOM cannot support the manufacturing process in industries where material transformation is dominant. A new product structure concept needs to be defined. The additional benefit would be a natural transfer to ERP, as such a BOM would by nature respect the manufacturing process.
What do you think?

PLM success depends on a bottom-up approach with top-down sponsorship!

Today I would like to come back to one of my favorite topics: PLM adoption. As we start another PLM initiative, the path to follow and the method to apply come back into the light.
One "well known" paradigm for enterprise systems is that top-level management needs to be involved and push the initiative. Is it true? Yes and no, in my view. PLM is a special topic in that sense: it does not target financials directly, but efficiency. And efficiency comes from the ground, meaning the end users need to have the tool at hand to support their work: store files, get files, assign basic tasks, list my tasks. Easily. And that's one major cause of PLM failure: implementing a PLM application sometimes creates new tasks for users, instead of facilitating their current ones. It means that users strongly need to participate in the PLM design, and tell where their pain points are.
Does it mean that top management does not need to be there? Of course not: money will be invested, at least in the change process, so a value has to be defined. The company will benefit from efficiency enhancements. But additional value can come from process control, for example. And usually users are not interested in process control. That's where the PLM initiative has to carefully balance the benefits between individual efficiency and corporate process control.
Does it define a methodology? At least it's a start!

About multiple Bills of Materials

“It seems so funny to create a BOM, as so many people want to create their own BOM!”, Audrey Agros.

Jobs about Design

Today I read a paper from Arena about the MBOM. The post is pretty good and tries to explain what an MBOM is, compared to the engineering BOM. I have nothing to say against it, so I strongly advise reading it.

My favorite sentences are the following:
The more accurate and complete the contents of the manufacturing bill of materials are, the better the decisions you can make about how to get the product efficiently and cost-effectively into the customer’s hand.
The accuracy and completeness of a manufacturing bill of materials allow a company to make better trade-offs and improve its ability to successfully ramp, build and introduce a new product.
The manufacturing bill of materials drives manufacturing, operations, purchasing and logistics for a product.

Where things get worse is when the author wants to compare EBOM and MBOM. Talking about the engineering BOM for a mechanical product:
The structure of the mechanical BOM is generally derived from the mechanical CAD model. That BOM is organized according to the engineers’ design process and often contains groups of unassociated parts collected together for the engineers’ convenience in working with the model.

That's where I do not fully agree. I feel that if the EBOM is driven by the design process, it might mean that designers are missing another product structure, which is the digital mockup. The DMU is not a BOM; it is a product structure where the focus is on form, fit and function (FFF). There are no quantities in a DMU, while there are in a BOM. But there is positioning information between components, which is not in the BOM. See some thoughts about that topic from my friend Hervé Menga.
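The distinction above can be sketched in a few lines (item names and coordinates are made up for illustration): the same screw appears once in the BOM with a quantity, but four times in the DMU, each instance with its own position and no quantity at all.

```python
# Illustrative sketch (hypothetical data): BOM line vs. DMU instances
# for the same component.

bom_line = {"item": "screw-M6", "quantity": 4}        # a BOM carries quantities

dmu_instances = [                                      # a DMU carries positions
    {"item": "screw-M6", "position_mm": (x, y, 0.0)}
    for x in (0.0, 50.0) for y in (0.0, 30.0)
]

# The two structures are consistent but not identical: the BOM quantity
# equals the number of placed instances in the mockup.
assert bom_line["quantity"] == len(dmu_instances)
```

A real DMU instance would of course carry a full transformation matrix rather than a simple translation; the sketch only shows why the two structures answer different questions.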

The next step is then to decide on a single BOM or on synchronizing different BOMs, or to integrate manufacturing people into the design process, or similar project organization trends that we have seen over the last two decades, targeting design time reduction and design efficiency from a manufacturing point of view. There is no ideal solution to that question in my view; I feel the question is more about how the decision is supported by the organization (collaboration of the teams around the same product structure), and how people adopt the concept in their daily work.
But instead of talking a lot about what design is and what it is not, I prefer to cite Steve Jobs:

Design is not just what it looks like. Design is how it works.

By the way, it should be deeply adopted by software designers!

What do you think?

Think of your raincoat inside your PLM system!

I faced a business issue managing files inside a PLM system. Files! File management is usually a big topic, because it seems simple while it is not at all. Why? Because usually the PLM system makes a difference between files and documents, documents being metadata with files attached. And usually these metadata are not included in the file: we sell the PLM system to the users by telling them that it allows them not to think about naming, revisions or other metadata, because the system will do that for them.

The point is that files are well managed while the users use the system, but as soon as a file has to go outside (and it may happen frequently), it loses all the metadata which were included in the document. No document name, no revision level, no status, nothing. The file is naked!

We often remind this rule for IT systems: while you look at one system, usually everything works fine, but as soon as your data has to go in or out of your application, you may have issues. This is basically the case when you design IT interfaces between applications, or migrate data from one application to another. But it is also the case for users needing to use several systems, or to send and receive files from and to the outside (partners, customers, suppliers). In my view, it is rarely handled properly, or even well understood, how these external exchange constraints can ruin the clean user process you feel you have perfectly covered inside the PLM system.

I found a way today to picture that situation in terms anyone can understand, related to our daily lives.

As long as you stay at home, you set your heating equipment so that it is dry and warm inside your house. The point is that at some point in time you need to go outside, and there it may be cold and rainy. So you arrange your home to keep a raincoat inside your house, to be used outside.

It's the same behaviour we should have when designing our PLM systems, especially for files. We should ask ourselves: what will be the raincoat that files will need as soon as they have to go outside? Because they will have to!
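One possible raincoat, as a minimal sketch: when a file leaves the PLM vault, dress it with the document metadata it would otherwise lose. The function name, the naming scheme, and the example document numbers below are hypothetical; a real implementation might instead embed metadata inside the file itself (e.g. PDF properties).

```python
# Hypothetical "raincoat" for exported files: encode document metadata
# in the exported file name so the file is not naked outside the vault.

from pathlib import Path

def export_with_raincoat(vault_file: Path, doc_name: str, revision: str,
                         status: str, out_dir: Path) -> Path:
    """Copy a vaulted file outside, carrying its document metadata in its name."""
    dressed = out_dir / f"{doc_name}_{revision}_{status}{vault_file.suffix}"
    dressed.write_bytes(vault_file.read_bytes())
    return dressed

# Inside the vault the file may be anonymous ("4711.pdf"); outside it carries
# its identity, e.g.:
#   export_with_raincoat(Path("4711.pdf"), "BRKT-001", "C", "Released",
#                        Path("outbox"))
#   -> outbox/BRKT-001_C_Released.pdf
```

The design choice is the one the post argues for: decide at export time, not after the fact, which metadata the outside world will need.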

Hope it helps!

2012: the year of the mashup, based on search, multi-facet representation and semantics (Part 1)

At the beginning of this year, I planned to write a classical post with my predictions for 2012. For the first time. But time was missing and I gave up.

Nevertheless, the idea was to talk about mobile UIs, which are more and more successfully aggregating information from different sources, and not only linking data to dedicated apps:

  • Flipboard, Google Currents: aggregating several information sources, RSS feeds, Facebook and Twitter accounts, and many more.
  • The Mango interface of Windows Phone: not focused on applications, but on data.

With those applications, the point is not the application used, but the data itself. Simply.

But what about PLM apps? Are vendors conscious of the need to aggregate data from and inside their applications? To build dedicated visual dashboards that the user can define and share?

Some time ago, I posted about specialized searches available to quickly access data. The idea was to provide simplified access to data by designing specific queries and defining associated specific layouts. But this approach is still designed by implementers, not by the users themselves.

So my prediction was that 2012 would be the start of the development of visual dashboards. But what are the basic building blocks needed to offer such a capability?

Then, earlier this year, DASSAULT SYSTEMES acquired Netvibes. It was surprising, as this application is dedicated to monitoring data on the web, like monitoring e-reputation. Far from the PLM world. But Netvibes should provide the technology to build dashboards, maybe in combination with Exalead.

Then came an internal request at FAURECIA to provide a new layout in Enovia, leveraging visual data management, in order to reproduce the physical dashboards in project rooms, where project teams have regular meetings and can visually display project progress. All the data exists in the PLM application, simply not organized the way it is needed at a given time in the project life cycle. Again, implementers are needed.

Then came Active Workspace by Siemens: a kind of SBA, allowing the user to navigate, through search, inside the 3D data.

So, only three months after the start of 2012, my prediction seems to be becoming reality!

But how can such dashboards be built? Which technology can enable highly configurable visual dashboards?

Let’s see that in the second part!

And you, do you need a personal dashboard for your product data?

Co-authoring and versioning: we need both!

Today I read here that Google Docs on Android was enhanced with co-authoring, allowing several people to collaborate in real time on the same document. It reminds me of an attempt of the same nature in the dead Google Wave application. Wave was seen as a very promising platform. Most people were enthusiastic about a new social media app, but as a PLM specialist I remember having seen deeper interest in the platform for two aspects:

  • Co-authoring in general, and especially for CAD
  • The timeline

Co-authoring for Engineering disciplines
CAD used to be a discipline like coding, where your work needed time to be done before it could be shown to someone else for review. But the improvements brought in recent years have decreased the time to design using CAD tools, now allowing design sessions. The point is that in parallel, design teams have been split geographically, so there is a need for online CAD sessions. Wave was in my view a new platform for such real-time collaboration.
Product definition history
The timeline in Wave was apparently simple. As people were using the application, every single action was recorded, and the full history could simply be played again, like watching a movie. No need to manually capture specific instants in order to track them for future use; everything was available. I think you see where I am going. In our PLM applications, we implement versioning and revisioning mechanisms in order to keep track of the product's state.
Some years ago, I attended a presentation of SharePoint 2010 at Microsoft, where we experienced the real-time co-authoring capability on the same Excel file. And the presenter asked at the end: "do we need versioning if we have co-authoring?". Whoa, what a question, I thought!
The Google Wave answer was: no. But what about the PLM answer? Do we still need version numbers for PLM objects if we have their full history, their full lifecycle, available?
For sure there is an answer from the IT infrastructure guy, requesting huge storage capabilities. But science will solve this issue with atom based storage in the future.
But the product manager's answer should be that he doesn't care about the full history. What he would like is instant access to the product configuration at the time of a specific milestone or date in the product lifecycle. And Wave would not help with that.
It's a pity, as it is not so easy to anticipate which version or status will be useful in the future. That's the reason why there often exist corporate rules to define a version in the context of a specific milestone, taken from the company's standard program framework.
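The difference between the two approaches can be sketched in a few lines (the event structure and item names are invented for illustration): a Wave-like log records everything and can be replayed to any instant, but it only answers the product manager's question if a corporate rule has tagged the milestone instant up front.

```python
# Hypothetical sketch: a full event log (Wave-style) replayed to reconstruct
# the product configuration at a tagged milestone.

history = []                       # every single change, in order

def record(item, field_name, value):
    history.append({"item": item, "field": field_name, "value": value})

def snapshot(upto):
    """Replay the log up to an event index: the configuration at that instant."""
    state = {}
    for event in history[:upto]:
        state.setdefault(event["item"], {})[event["field"]] = event["value"]
    return state

record("housing", "material", "ABS")
record("housing", "material", "PC-ABS")
milestone_b = len(history)          # a corporate rule tags this instant as "B"
record("housing", "material", "PA6")

# The full log answers any question, but only if someone kept the pointer:
print(snapshot(milestone_b))  # {'housing': {'material': 'PC-ABS'}}
```

Which is exactly the post's conclusion: the full history does not replace the milestone tag; it only makes the tag cheap to resolve.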

So my answer today is that we should benefit from co-authoring in engineering disciplines to face the now-global distribution of design teams, but we still need classical versioning mechanisms to track product maturity at given dates or milestones.

And you, what do you think?

PLM, EBOM, MBOM, single version of the truth?

Hello. It has been a while since I last wrote. Sorry about that!

Recently I had the chance to discuss the BOM topic in real life, with people in a plant, as we were deploying our PLM/ERP interface there. This interface supports the concept of a unique product structure for all functions needing it, including manufacturing. PLM is the master system for it, and the interface transfers the product structure to the ERP in order to launch or update the MRP calculation. An ambitious challenge.

So we started discussing with the people in the plant, until we reached an interesting point. People asked how to manage reground raw material. For information, in the plastics industry, products are sometimes manufactured not only from virgin raw material: manufacturers reuse scrapped parts from runners and sprues, regrinding the plastic raw material and adding a given percentage of this reused material. This practice, initially dedicated to decreasing the cost of procurement, is used as well to support green product initiatives. This percentage is taken into account in the MRP calculation. People interested in this practice can look at this link.

Plant people were telling me that they were using a specific item, adding this item in the BOM next to the virgin raw material, with the right quantities for virgin and regrind material.

I was starting to think that our beautiful concept of a unique BOM was not so appropriate! Because I was thinking that our BOM, built by engineering people, would never go into this kind of "detail".

I then started discussing this topic with a quality person. And she told me that at the plant serial kick-off, a certification was given to the production, and this certification was based on characteristics to maintain during serial production. Not only did this certification include the fact that regrind material was used, but the manufacturer also had to tell which percentage of regrind material was used. The certificate was given based on this information. In case of a change in the percentage, a derogation process was launched to temporarily authorize production with a different percentage.

It means that the people in charge of the product definition needed to identify upfront which percentage of regrind material should be used in production, and not simply the quantity of raw material needed to manufacture the part.
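The MRP consequence can be sketched as follows (the function name and all numbers are made up for illustration): once the regrind percentage lives in the product definition, the virgin and regrind material requirements are derived from it, instead of being hand-maintained as two separate quantities in the plant BOM.

```python
# Hypothetical sketch: deriving virgin vs. regrind material needs from a
# regrind percentage carried in the product definition.

def material_requirements(part_weight_kg: float, batch_size: int,
                          regrind_pct: float) -> dict:
    """Split the total material need between virgin and regrind streams."""
    total = part_weight_kg * batch_size
    return {
        "virgin_kg": total * (1 - regrind_pct),
        "regrind_kg": total * regrind_pct,
    }

# 1,000 parts of 0.5 kg, certified with 25% regrind:
needs = material_requirements(0.5, 1_000, 0.25)
print(needs)  # {'virgin_kg': 375.0, 'regrind_kg': 125.0}
```

If the certified percentage changes through a derogation, only one attribute changes and both requirements follow, which is the benefit of the unique BOM the story describes.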

I was really surprised to discover that our unique BOM concept was still valid. PLM is traditionally advertised as the way to maintain the product definition. ERP is advertised as the master system for production data, and the separation of both usually means a misalignment between them. Through that little story, I see a clear benefit in merging both concepts while using two different IT applications.

What do you think?

Complexity of data model – Which solution?

The PLM data model is usually a huge topic. To get an idea of the situation, take an average user and explain to him why we should think about where to put a property: on the object, or on the connection to another object. Or ask him the question of cardinality. Usually you get a big question mark from the end user, on which you should not spend too much time to interpret…

The point is there: while the PLM data model usually represents the business process very well, it is quite impossible to expose the data model itself to the users. The bad news is that our PLM solutions, built on standard applications, present this data model by default. An important exception here is SAP, in my view.

The consequence of that situation is that when you want to report data to the end users, or build KPIs, to provide for example the same rendering as in Excel sheets, you have to deal with the data model, which usually leads to complex queries browsing a network of objects. A big network. At that point, you realize the complexity of your data model, having, as a simple example, to use the last released version, or the last version, or the current one, or…

Have you experienced this situation, and if yes, how do you deal with it? Is there another view to bring, which may be disruptive, but would solve the situation?

I feel that another data layer is required to provide a user view of the situation, implementing another data model more suited to a user-oriented data representation. It is, for example, the kind of view built in search engines and BI applications.
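The extra layer can be sketched in miniature (all object identifiers, attributes, and relationship names below are invented): the raw PLM model is a network of objects and connection objects, while the reporting layer flattens it into the Excel-like rows the end user actually understands.

```python
# Hypothetical sketch: flattening a network of PLM objects and connections
# into a user-oriented reporting view.

objects = {
    "PRT-1": {"type": "part", "name": "Bracket"},
    "DOC-1": {"type": "document", "revision": "B", "status": "Released"},
    "DOC-2": {"type": "document", "revision": "A", "status": "In Work"},
}
connections = [                    # relationship objects, with their own data
    {"from": "PRT-1", "to": "DOC-1", "rel": "specified-by"},
    {"from": "PRT-1", "to": "DOC-2", "rel": "specified-by"},
]

def reporting_rows():
    """Flatten the object network into one flat row per part/document pair."""
    rows = []
    for c in connections:
        part, doc = objects[c["from"]], objects[c["to"]]
        rows.append({"part": part["name"],
                     "doc_revision": doc["revision"],
                     "doc_status": doc["status"]})
    return rows

for row in reporting_rows():
    print(row)
```

The user never sees `objects` or `connections`, only the rows; that separation between the network model and the exposed view is the layer the post argues for.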

I am also wondering about the solutions implemented for big networks of objects, for example in social networking applications. Facebook never exposes the network itself to the user, while for sure it is there behind the scenes.

What do you think?
