Analog vs. Digital Transmission: Data Integrity for GxP Applications

viewLinc with AP10 and RFL100 in a laboratory setting
Paul Daniel, Senior Regulatory Compliance Expert

In the realm of data collection and integrity, questions frequently arise about the reliability and accuracy of temperature data loggers. One such inquiry was recently addressed by Paul Daniel, Vaisala's Senior GxP Regulatory Expert. In this exchange, Paul explains some of the complexities and assurances surrounding the data loggers used in GxP-regulated environmental monitoring applications.

Dear Paul,

Could you please tell me why a temperature data logger does not need to be qualified? Specifically, how can we ensure that the recorded data is not impacted in terms of data integrity when we transfer the data to a computer either in .PDF or .xls format? 

When a temperature data logger is calibrated, it ensures that the data obtained from the logger is accurate and within specification. But how can we ensure that the series of data logged are not altered while migrating that data to a computer?

From my reading of GAMP5, when data is migrated, there is always a possibility that the data is altered during the transfer process, so the raw data in the logger should be compared with the data transferred to a computer to check its integrity. It is only my opinion and maybe completely wrong… but I’d like your opinion.

Some people say that data loggers are calibrated and also checked at the manufacturer site in all aspects, and there is no need to qualify them since they are deemed as instruments. However, in the Automation field, instruments are loop checked and tested regarding the sent data to the control systems, so it is completely ensured that data is transferred correctly. But in the case of the handheld data loggers, I am indeed in doubt....

Paul answered: 

Dear friend,

You have characterized the problem of ensuring data integrity during transfer very well. However, there is a big difference between typical automation devices and data loggers. Before a measurement can be stored in a data logger, it must go through analog-to-digital conversion. So the data stored in a data logger, and what is usually sent from it, is digital. Automation systems, especially when you are dealing with loops, usually handle analog signals.

The Difference Between Analog and Digital Transmission:

This distinction is crucial. An analog signal loses fidelity over distance, so the received value may differ from what was sent. Digital transmission works differently: data is sent in packets that are checksummed before sending and verified upon arrival, so any corruption introduced in transit is detected and the packet is retransmitted rather than accepted. You either receive the complete message, or you don't receive it at all. This reliability is built into protocols like TCP/IP, which use checksums, sequence numbers, and acknowledgements to ensure message integrity.

Analog transmission is like someone yelling to you across the house; you’ll get signal degradation and potentially miss part of the message depending on wall thickness, other sounds in the house, etc. 
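The "complete message or nothing" behavior comes from this error detection. As a minimal illustration of the principle (a simplified sketch, not the actual TCP/IP implementation), a sender can append a CRC32 checksum to each packet; the receiver recomputes it, and any bit flipped in transit produces a mismatch, so the packet is rejected and retransmitted instead of being accepted with degraded data:

```python
import zlib

def make_packet(payload: bytes) -> bytes:
    # Sender: append a 4-byte CRC32 checksum to the payload.
    checksum = zlib.crc32(payload)
    return payload + checksum.to_bytes(4, "big")

def verify_packet(packet: bytes) -> bytes:
    # Receiver: recompute the checksum and reject the packet on mismatch.
    payload, received = packet[:-4], int.from_bytes(packet[-4:], "big")
    if zlib.crc32(payload) != received:
        raise ValueError("checksum mismatch: request retransmission")
    return payload

packet = make_packet(b"21.4 C @ 2024-01-01T00:00:00Z")
assert verify_packet(packet) == b"21.4 C @ 2024-01-01T00:00:00Z"

corrupted = bytes([packet[0] ^ 0x01]) + packet[1:]  # flip one bit in transit
try:
    verify_packet(corrupted)
except ValueError:
    print("corruption detected")
```

Real transport protocols add sequence numbers and retransmission on top of this check, but the consequence is the same: a corrupted reading is never silently delivered as good data.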

Ensuring Data Integrity in Digital Transfers:

If a manufacturer follows good coding and development practices, the same code that decodes readings inside the data logger is used by the monitoring system interface, so the values are interpreted identically at both ends. Calibrating data loggers ensures that the measurement data is accurate, and digital transmission ensures that what was received is what was sent. The only way to independently verify that the received data matches the sent data would be to conduct a giant loop calibration between the field sensor and the monitoring system database, which is impractical.

GAMP5 and Manual Data Systems:

What you read in GAMP5 pertains to manual systems where data is downloaded from a data logger, potentially in an editable format like a .CSV Excel file, before being uploaded to a permanent and secure system or database. In such cases, verification is necessary due to the potential for transcription errors or unauthorized data changes. However, in automated systems, this risk is significantly lower as there is no manual step where data could be altered accidentally or intentionally.
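When such a manual step does exist, for example a .CSV export that is later uploaded to a database, one common way to verify that the file was not altered in between is to compare cryptographic hashes taken before and after the transfer. The sketch below illustrates this general practice; it is not a feature of any particular logger or monitoring system:

```python
import hashlib
import tempfile

def sha256_of_file(path: str) -> str:
    # Stream the file in chunks so large exports need not fit in memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Simulate an exported .CSV file (hypothetical contents for illustration).
with tempfile.NamedTemporaryFile(delete=False, suffix=".csv") as f:
    f.write(b"timestamp,temp_C\n2024-01-01T00:00,21.4\n")
    path = f.name

# Record the digest at export time, recompute it after the upload:
# identical digests prove the file is bit-for-bit unchanged.
hash_at_export = sha256_of_file(path)
hash_after_upload = sha256_of_file(path)
assert hash_at_export == hash_after_upload
```

Any edit to the file, even a single character, changes the digest, which is why hash comparison is a practical control for manual transfers that lack the built-in verification of automated digital transmission.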

Qualification Perspective:

From a qualification perspective, it would be during the specification stage that you would specify a data logger with automated digital transfer following a protocol like TCP/IP. Verification would ensure that the logger has this capability. During the IQ (Installation Qualification) or OQ (Operational Qualification) of your system, you would verify that data collected by the logger was received by the system and properly protected from changes in the database by security rules or an audit trail.

Addressing Security Concerns:

There is a minimal chance that a malicious actor could access the data logger before data transmission and edit the stored data. However, this is highly unlikely due to the technical expertise and equipment required, along with procedural controls in place. The easiest way to falsify data would be to move the data logger to a location where it records the desired data, such as moving it from a failing refrigerator to a functioning one. The control against this is to fix data loggers in place to prevent easy movement. Verifying the fixed location of data loggers can be part of the IQ process to ensure data integrity.

Conclusion:

In summary, ensuring data integrity in temperature data loggers hinges on understanding the differences between analog and digital data transmission, following good coding practices, and implementing robust procedural controls. Automated systems significantly reduce the risk of data alteration during transfer, making qualification of these devices unnecessary beyond the initial specification and verification stages.

I hope this addresses your concerns!

Paul

Contact us to learn more about Vaisala measurement, mapping and monitoring solutions. 

Webinar: GxP data integrity: What you don't know may make you non-compliant

In this recorded webinar you will learn how to maintain data integrity in GxP-regulated environmental monitoring applications. Along with best practices, we provide an up-to-date overview of current regulatory expectations, and Vaisala's regulatory expert answers audience questions on data management best practices for GxP-compliant monitoring.

Watch now
