Validation of calibration software, as required by ISO 17025 for instance, is a topic that people prefer not to talk about. Often there is uncertainty about the following: Which software actually must be validated? Who is responsible for it? Which requirements must the validation satisfy? How can it be carried out efficiently, and how is it documented? The following blog post explains the background and gives a recommendation for implementation in five steps.
In a calibration laboratory, software can be used for anything from supporting the evaluation of results to fully automated calibration. Whatever the degree of automation of the program, validation always refers to the complete process into which the software is integrated. Behind validation, therefore, lies the fundamental question of whether the calibration process fulfils its purpose and achieves all its intended goals; in other words, does it provide the required functionality with sufficient accuracy?
Before carrying out validation tests, you should be aware of two basic principles of software testing:
Full testing is not possible.
Testing is always influenced by the environment.
The former states that testing all possible inputs and configurations of a program is not feasible because of the sheer number of combinations. Depending on the application, the user must therefore decide which functions, configurations and quality characteristics are to be prioritised and which are not relevant to them.
Which decision is made often depends on the second point: the operating environment of the program. In practice, each application brings different requirements and priorities for the use of the software. There are also customer-specific adjustments to the program, for example concerning the contents of the calibration certificate. The individual conditions in the laboratory environment, with a wide range of instruments, create further variance. This variety of requirement perspectives and the almost endless number of possible program configurations in customer-specific applications make it impossible for a manufacturer to test for every need of a particular customer.
Considering the points above, validation therefore falls to the user themselves. To make this process as efficient as possible, a procedure along the following five points is recommended:
The data for typical calibration configurations should be defined as "test sets".
At regular intervals, typically once per year, and in any case after every software update, these test sets should be entered into the software.
The resulting certificates can be compared with those from the previous version (a minimal comparison sketch follows this list).
For an initial validation, a cross-check, e.g. using MS Excel, can be carried out.
The validation evidence should be documented and archived.
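The following is a minimal sketch of how such a comparison of test-set results between two software versions could be automated. It assumes, purely for illustration, that the calculated values for a test set can be exported as a CSV file with the hypothetical columns "point_id", "reference_value" and "measured_error"; the file names, column names and tolerance are assumptions, not features of any particular calibration software.

```python
# Sketch of a regression check for validation test sets (assumed CSV export).
import csv

TOLERANCE = 1e-6  # acceptable numerical difference between versions (assumed)

def load_results(path):
    """Read a hypothetical CSV export into {point_id: (reference, error)}."""
    results = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            results[row["point_id"]] = (
                float(row["reference_value"]),
                float(row["measured_error"]),
            )
    return results

def compare(previous_path, current_path):
    """Report any calibration point whose results differ beyond TOLERANCE."""
    previous = load_results(previous_path)
    current = load_results(current_path)
    deviations = []
    for point_id, (ref_new, err_new) in current.items():
        if point_id not in previous:
            deviations.append(f"{point_id}: missing in previous version")
            continue
        ref_old, err_old = previous[point_id]
        if abs(ref_new - ref_old) > TOLERANCE or abs(err_new - err_old) > TOLERANCE:
            deviations.append(
                f"{point_id}: {ref_old}/{err_old} -> {ref_new}/{err_new}"
            )
    return deviations

if __name__ == "__main__":
    issues = compare("testset_previous.csv", "testset_current.csv")
    if issues:
        print("Validation check failed for the following points:")
        print("\n".join(issues))
    else:
        print("All test-set results match the previous version.")
```

The printed comparison result, together with the exported test-set files, could then be archived as part of the validation evidence mentioned in the last point.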
WIKA offers PDF documentation of the calculations carried out in the software.
Note
For further information on our calibration software and calibration laboratories, visit the WIKA website.
