Is ALCOA, as commonly understood, sufficient?
In the life science sector, data integrity is a familiar concept, but it is often interpreted narrowly and, again, related only to the system. Everybody knows ALCOA, yet what we see is that the concept is applied to the system alone.
Attributable: this is generally reduced to an audit trail in the system, to be tested during validation, instead of seriously analyzing how information is received and by whom. Sometimes information is collected informally, then copied onto a batch record (signed and verified) and copied again into the information system; the chain is not always demonstrable. BUT there is an audit trail in the system, with significant tests to demonstrate it!
Legible: this is generally reduced to the accessibility of stored data, without considering the ‘meaning’ of the data. For example, I have often found item masters with ‘speaking’ codes, say 1xxx for purchased raw materials and 2xxx for produced materials. Clearly, if my production chain changes (for instance, I start buying an intermediate or a finished product), the information becomes obsolete and may be misleading. Another example concerns CMOs: bound to a customer dossier, they are obliged to use specific vendors (source list); the common workaround is to create one item code per item/vendor/manufacturer combination. This information is clear for purchasing but misleading for the warehouse, planning, and production.
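The fragility of ‘speaking’ codes can be sketched in a few lines of Python. The item number, prefixes, and item-master fields below are invented for illustration; the point is that meaning encoded in an identifier cannot follow a change in the supply chain, while meaning stored as an explicit attribute can.

```python
# Hypothetical sketch of the 'speaking code' problem: item codes encode
# meaning in their prefix (1xxx = purchased raw material, 2xxx = produced).

def type_from_prefix(item_code: str) -> str:
    """Derive material type from the code prefix (fragile)."""
    return "purchased" if item_code.startswith("1") else "produced"

# Item 2450 was originally produced in-house...
assert type_from_prefix("2450") == "produced"
# ...but the supply chain changes and the same intermediate is now bought
# from a vendor. The code cannot change (it is referenced in batch records,
# BOMs, and open orders), so the prefix now lies about the material.

# A more robust model keeps the classification as an explicit, updatable
# attribute instead of encoding it in the identifier:
item_master = {
    "2450": {"description": "Intermediate X", "procurement": "purchased"},
}
assert item_master["2450"]["procurement"] == "purchased"
```

The same reasoning applies to the item/vendor/manufacturer codes mentioned above: vendor is an attribute of a sourcing relationship, not of the item identity, so encoding it in the code serves one department and misleads the others.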
C+O+A: Contemporaneous, Original and Accurate is by far the least considered part of ALCOA (because it is less technical) but also the most important. Information systems are often the last step of the information process, with paper-based activities in the middle and, sometimes, a less-than-perfect correspondence between the information on paper and that in the system (and I am not speaking about data verification or double checks, but about genuine differences between paper-based and system-based information). Let me use an example I have often found: at incoming, there is a checklist, generally paper-based, to control the status of the material. If something is wrong, the warehouse will often send the material back and open a deviation, BUT do nothing in the ERP system. The problem then arises when the APR/PQR (Annual Product Review or Product Quality Review) extracted from the system shows no evidence that an incoming batch was sent back.
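The incoming-goods gap described above is essentially a reconciliation problem, and can be sketched as follows. The batch identifiers, field names, and record layout are invented for illustration only; the idea is that any batch rejected on paper but never posted in the ERP is invisible to a system-generated APR/PQR.

```python
# Hypothetical reconciliation sketch: paper deviations for rejected
# incoming batches vs. return movements actually posted in the ERP.

deviations = [  # opened on paper when material was sent back
    {"batch": "RM-2023-014", "reason": "damaged container"},
    {"batch": "RM-2023-021", "reason": "wrong label"},
]

erp_returns = {"RM-2023-014"}  # batches with a return posting in the ERP

# Batches rejected on paper but never posted in the ERP are exactly the
# ones an APR/PQR report extracted from the system will never show.
missing = [d["batch"] for d in deviations if d["batch"] not in erp_returns]
print(missing)  # -> ['RM-2023-021']
```

A periodic cross-check of this kind (deviation log against ERP movements) is one way to surface the paper/system divergence before the annual review does.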
The absence of contemporaneity is the greatest threat to information systems, because the system ends up working with obsolete data. A typical example is the consumption of raw material in production, posted from paper-based batch records only after batch record review; at that point you can have discrepancies in your stock. Another example involves tanks where incoming batches are blended on arrival: by the time you need to consume an incoming batch in the system, it may physically no longer exist in the tank.
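The late-posting scenario above can be made concrete with a toy timeline. The quantities and day numbers are invented; the point is that between physical consumption and the posting after batch record review, every decision based on system stock uses obsolete data.

```python
# Hypothetical sketch of the contemporaneity gap: consumption happens
# physically on day 1 but is posted in the ERP only after batch record
# review on day 5.

physical_stock = 100.0  # kg actually in the warehouse
system_stock = 100.0    # kg the ERP believes are in the warehouse

# Day 1: 40 kg are physically consumed in production.
physical_stock -= 40.0

# Days 1-4: MRP, planning, and purchasing all run against system_stock,
# which still shows 100 kg -> decisions based on obsolete data.
discrepancy = system_stock - physical_stock
print(discrepancy)  # -> 40.0

# Day 5: batch record review completed, consumption finally posted.
system_stock -= 40.0
assert system_stock == physical_stock
```

Shrinking the window between the physical event and its posting (for example, by recording consumption at the point of use) is what restores contemporaneity.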
Example of differences between physical flow and information flow
What experience has digital governance gained in facing the problems of defining an effective analysis for a TO BE scenario?
Digital governance works at two levels:
- It helps companies organize the IT department and create strategic implementation plans
- It supports companies in keeping the information flow synchronized with the physical flow
In order to make an effective analysis, it is important to face the resistance to change that we have heard expressed many times. For example:
- “We are different from the others”
- “We need flexibility”
- “We already completed this and it is just practice for me”
- “I only need a few changes to the current system”
- “To make the change I need more personnel”
The three key points are:
- Drive the analysis with the PQE knowledge base (built over 25 years and thousands of projects): users are offered TO BE scenarios and asked to ‘forget’ the AS IS situation. The knowledge base is also important in helping users understand what the system can potentially do. The methodology is therefore far from a standard AS-IS/TO-BE analysis, and more effective.
- Create cross-functional working groups: this is very helpful in letting people understand the consequences of certain (missing) activities. A greater effort in the first stages of a process can often mean huge savings downstream (for example, it is easier not to complete a purchase order when you only need material/services, but afterwards finance must work harder at invoice verification, etc.). Only with an analysis team covering the entire flow is it possible to understand which activities are effective, which are ineffective (and we have found many), and which duplicate effort.
- Managerial involvement is important because workloads and resources must be rebalanced: an overall saving may correspond to more activities required in some steps (generally at the start) and fewer in others (generally in the final and controlling steps).
It is never too late, nor too soon, to start an activity of this kind if it has not already been carried out.
Of course, it is best when the analysis starts from a specific request to improve the efficiency of the information chain; typically, these cases stem from the need to use the system better, pushed by:
- A request for a more efficient flow in the information chain and a verification of whether the system in use can be exploited better (optimization of batch record management, of production planning and management, and of stock management are the most frequent pain points);
- The need to push the plant towards an Industry 4.0 vision, with integration between the ERP and the equipment.
Disruption periods (a necessary major change release, forecasted organizational changes) are the right moments in which to execute the analysis and align the supply chain with the information chain.
- Sometimes, when we arrive, we find that the system already in use has so many critical issues that the company wants to understand whether it must change the system, or whether the problems lie in how the system is used. Typical examples are companies where some plants/subsidiaries struggle to apply company standards, or small and medium enterprises where organizational impacts are underestimated. PQE’s independence from other vendors is a key point for the correctness of the analysis.
- Sometimes the request is related to an ongoing ERP project. It happens (mainly in product-driven projects) that, during the final phases, users do not see the expected functionalities; in cases like this, change management has typically been underestimated. We have seen many such projects, and companies generally prefer to close the project and remediate afterwards. In some cases the project is not correctly managed and it is impossible to know when it will finish (go-live dates pushed back, for example); in this case it is better to perform the analysis to understand the real situation and then re-found or restart the project.
- Other cases are related to validation activities: in some cases, validation is the moment at which the company wants to understand whether the system can be improved in order to comply with GMP processes.
Therefore, no matter what stage you are at, an analysis that maps and synchronizes the physical flow with the information flow is always beneficial.
Read the first part of the article here