Wednesday, August 31, 2011

OSI PI Tag Compression Setting Recommendations

OSIsoft PI tag data compression has been the subject of considerable debate in the regulated markets over the years. So much so that I wrote a paper about it almost five years ago.

Get the paper here:

You see, GMP managers are reluctant to discard data the FDA calls "original data." Yet other managers rationalize that disk space is cheap... as are IT resources... so why not collect as much data as you can?

There are many reasons, but from the perspective of the user who has to go through that data, we want enough to study the process, but not so much that we're buried in the haystack.

This is where rational settings for data compression come into play. Data compression on OSI PI servers lets you conserve administration/IT costs and filter out useless data, while still providing your scientific staff with "the right amount" of data for continuous process improvement.

By the same token, if you have thousands of PI Points, you may not have the resources to rationally examine each process measurement and come up with customized excdev and compdev settings for each one.

So what’s the answer?

We think the answer is to understand that every instrument has error... the range of that error is called "instrument accuracy." Repeat values that fall within that range are not different from the first value and ought to be filtered out.
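This is essentially what PI's exception reporting does with excdev. The sketch below is a simplified model I wrote for illustration (the function name is mine, and real PI exception reporting also applies time limits such as ExcMin/ExcMax, which are omitted here): a new value is kept only when it moves more than one instrument accuracy away from the last value kept.

```python
# Simplified sketch of exception reporting: discard repeat values that
# sit within +/- excdev (set to instrument accuracy) of the last kept value.

def exception_filter(values, excdev):
    """Keep only values that differ from the last kept value by more than excdev."""
    if not values:
        return []
    kept = [values[0]]  # always keep the first value
    for v in values[1:]:
        if abs(v - kept[-1]) > excdev:
            kept.append(v)
    return kept

# Example: a temperature sensor with +/-0.5 degree accuracy.
readings = [100.0, 100.2, 100.4, 100.6, 100.5, 101.3]
print(exception_filter(readings, excdev=0.5))  # → [100.0, 100.6, 101.3]
```

The readings that stayed within 0.5 of the last kept value (100.2, 100.4, 100.5) carry no information the instrument can actually distinguish, so they are dropped.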

We also think that if points along a straight line are removed, those points ought to lie within one instrument accuracy of that line.

It turns out that setting excdev to the instrument accuracy and compdev to 0.5 × excdev is where this happens.
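That straight-line behavior comes from the swinging-door style compression PI applies with compdev. The sketch below is my own simplified illustration of the idea, not OSIsoft's implementation (real PI compression also honors CompMin/CompMax time limits, which are omitted): a point is archived only when no single straight line from the last archived point can pass within ± compdev of every value seen since.

```python
# Simplified swinging-door compression sketch (timestamps must be increasing).
# From the last archived point, track the steepest "lower door" slope and the
# shallowest "upper door" slope that still graze every value within +/- compdev.
# When the doors cross, no one line fits, so the previous point is archived.

def swinging_door(points, compdev):
    """points: list of (time, value) tuples; returns the archived subset."""
    if len(points) < 3:
        return list(points)
    archived = [points[0]]
    anchor_t, anchor_v = points[0]
    max_lo = float("-inf")  # steepest lower-door slope seen so far
    min_hi = float("inf")   # shallowest upper-door slope seen so far
    prev = points[0]
    for t, v in points[1:]:
        lo = (v - compdev - anchor_v) / (t - anchor_t)
        hi = (v + compdev - anchor_v) / (t - anchor_t)
        max_lo = max(max_lo, lo)
        min_hi = min(min_hi, hi)
        if max_lo > min_hi:
            # Doors have crossed: archive the previous point and re-anchor.
            archived.append(prev)
            anchor_t, anchor_v = prev
            max_lo = (v - compdev - anchor_v) / (t - anchor_t)
            min_hi = (v + compdev - anchor_v) / (t - anchor_t)
        prev = (t, v)
    archived.append(points[-1])  # always archive the most recent point
    return archived

# A perfectly straight ramp collapses to its two endpoints:
ramp = [(0.0, 0.0), (1.0, 1.0), (2.0, 2.0), (3.0, 3.0)]
print(swinging_door(ramp, compdev=0.25))  # → [(0.0, 0.0), (3.0, 3.0)]
```

Every discarded point on the ramp sits within compdev of the line between the archived endpoints, which is exactly the property we want when compdev is tied to instrument accuracy.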

Get our FREE 3-page paper

It's an easy read.