
Resistance to the Effective Use of Instrumentation VI: Can I really trust what my instrument tells me?

In this penultimate instalment of this blog series on the resistance to the effective use of instrumentation, Oliver Grievson probes into why a general lack of trust in instrumentation prevents instrumentation, control & automation from taking a step forward and treatment works from becoming more instrument-based.

The modern instrument is a marvel in what it does and, as I have said in the past in this blog, it is the eyes and ears of the modern treatment works; if it is installed, maintained and calibrated properly it is a very reliable part of any treatment process. However, in potable treatment you tend to see double or triple validation (two or three instruments running in parallel), and throughout the water industry you see operators wasting thousands of man-hours a day on manual sampling. Add to this the external verification of water samples in laboratories, and the daily operating cost of checking the product, whether it be the quality of the water that we drink or the purity of the water we discharge, runs into the hundreds of thousands of pounds (dollars, euros, take your pick). Why?

Well, there is the argument that it depends upon the criticality of what you are doing: for drinking water it is important that the product is always safe to drink, and with wastewater it is important not to pollute the waters that we discharge to. That is why in drinking water we have double (or triple) validation. When an operator takes a sample, he (or she) is taking a sample of something that is correct only in the exact second that the sample is taken; this is the principle of grab sampling. So the question that has to be asked is: why is there a need to continuously validate the data that an instrument gives with a chemical test that actually gives poorer-quality information, globally wasting hundreds of thousands of pounds per day in the process? The answer is the lack of trust in what an instrument tells us.

Some comments that I have typically heard throughout my career in the water industry:

  • We (the utility) are regulated in that way. All our samples get sent to a certified laboratory and we sample manually to ensure that we get the correct results
  • Laboratory analysis is more accurate
  • The results that the instruments give us are only an indication
  • Our instruments are constantly breaking down, we can’t rely on them

 

Let us analyse these comments and see where the flaws in the arguments are.

The first comment, that the industry is regulated on the basis of laboratory analysis of water or wastewater samples, is certainly true, but again it shows a lack of trust in instrumentation. In the UK the wastewater side of the water industry is sampled a certain number of times per year using grab samples to measure compliance with the conditions of the environmental permit, and additionally a certain number of times per year using composite samplers located on the inlet and outlet of the wastewater treatment works under the Urban Wastewater Treatment Directive (UWWTD). If we look at this realistically, the treatment works is monitored (at a 12-sample-per-year frequency) for about 3.5% of the time. The humble instrument, allowing for breakdowns, would cover in the region of 95-100% of the time. Then there is the argument that laboratory analysis is more accurate and instrumentation is only an indication; let us analyse this point.
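As a rough back-of-envelope illustration of those coverage figures (a sketch only, assuming each regulatory composite sample represents a single 24-hour period and that the instrument is offline only during breakdowns and maintenance, a downtime figure I have picked purely for illustration):

```
# Rough coverage comparison: regulatory composite sampling vs an online instrument.
# Assumptions (not from the original post): each composite sample covers one
# 24-hour period; the illustrative instrument downtime is 10 days per year.

SAMPLES_PER_YEAR = 12      # UWWTD-style composite samples
DAYS_PER_YEAR = 365

sampling_coverage = SAMPLES_PER_YEAR / DAYS_PER_YEAR                              # ~3-3.5%
instrument_downtime_days = 10                                                      # illustrative
instrument_coverage = (DAYS_PER_YEAR - instrument_downtime_days) / DAYS_PER_YEAR  # ~97%

print(f"Composite sampling coverage: {sampling_coverage:.1%}")
print(f"Online instrument coverage:  {instrument_coverage:.1%}")
```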

In general laboratory analysis is more accurate; however, there are several points to consider. The first of these is whether or not the modern water industry really needs the accuracy of analysis that the modern-day laboratory gives. In general, the answer to this is no.

For operational decisions to be made, the data and information required need to be available within a few minutes to enable an operator to decide how to operate the treatment works. This is mostly done using field test kits or, in the case of some parameters, on-site laboratories. Water companies operate with a trigger-based system: if an operator sees that a sample he has taken is close to the trigger point, then action is taken. A sample that reads 2.99 mg/L against a trigger point of 3 mg/L will be satisfactory from a regulatory point of view but not operationally, as the sketch below illustrates. Basically, the importance lies in the speed of the analysis, not particularly in its accuracy.
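A minimal sketch of that kind of trigger-based operational check (the trigger value comes from the example above, but the warning margin and the classification labels are purely illustrative, not taken from any utility's actual scheme):

```
# Illustrative trigger-based operational check. The 90% warning margin is a
# hypothetical choice; a real scheme would come from the site's permit and
# operating procedures.

TRIGGER_MG_L = 3.0        # regulatory/operational trigger point
WARNING_MARGIN = 0.90     # act once a reading exceeds 90% of the trigger

def assess_sample(reading_mg_l: float) -> str:
    """Classify a field reading relative to the operational trigger."""
    if reading_mg_l >= TRIGGER_MG_L:
        return "BREACH - escalate immediately"
    if reading_mg_l >= WARNING_MARGIN * TRIGGER_MG_L:
        return "WARNING - take operational action"
    return "OK"

# A 2.99 mg/L sample is compliant on paper, but operationally it already
# demands action because it sits just under the trigger.
print(assess_sample(2.99))   # WARNING - take operational action
print(assess_sample(2.50))   # OK
```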

However, this is not to say that the modern instrument is not accurate. It may not have the same accuracy as a test in the laboratory, but its accuracy is good enough for day-to-day operation and, arguably, for regulatory purposes as well. This is usually proven when an instrument is commissioned and brought into service. In my past, when I commissioned wastewater treatment works, part of the reliability testing of an instrument was to compare it with on-site laboratory tests. The accuracy of the instruments had to be within an acceptable percentage; in practice the instrumentation that I commissioned was all within 5% of the laboratory analysis, which is more than accurate enough.
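A simple way to express that commissioning check (the 5% tolerance is from the post; the paired readings below are invented purely to show the arithmetic):

```
# Commissioning-style comparison of instrument readings against on-site lab
# results. The 5% acceptance figure comes from the post; the sample data are
# hypothetical.

ACCEPTANCE = 0.05  # instrument must agree with the lab result to within 5%

paired_readings = [
    # (instrument reading, laboratory result) in the same units
    (24.1, 23.5),
    (18.7, 18.9),
    (31.0, 30.2),
]

for instrument, lab in paired_readings:
    relative_error = abs(instrument - lab) / lab
    status = "PASS" if relative_error <= ACCEPTANCE else "FAIL"
    print(f"instrument={instrument}, lab={lab}, error={relative_error:.1%} -> {status}")
```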

 

Laboratory analysis is not infallible either. Another example from my past was a laboratory analysing a sample of wastewater for mercury. The result came back at 72 µg as opposed to a consent of 1 µg; the result was of course called into question and a spiking trial undertaken. The results of the spiking trial showed that the three laboratories the split samples were sent to yielded recoveries of between 50% and 200%. The point is that even laboratories can make mistakes.
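For readers unfamiliar with spiking trials, the recovery figure is simply how much of a known added amount the laboratory reports back. A quick sketch of the arithmetic, using invented concentrations (only the 50-200% recovery range comes from the post):

```
# Spike recovery calculation as used in a spiking trial. All concentrations
# below are hypothetical; a perfect laboratory would report close to 100%.

def recovery_percent(unspiked_result: float, spiked_result: float, spike_added: float) -> float:
    """Percentage of the known spike that the laboratory actually reported."""
    return (spiked_result - unspiked_result) / spike_added * 100.0

spike_added = 10.0  # known amount added, in the same units as the results
results = {"Lab A": (0.5, 5.5), "Lab B": (0.5, 10.4), "Lab C": (0.5, 20.5)}

for lab, (unspiked, spiked) in results.items():
    print(f"{lab}: recovery = {recovery_percent(unspiked, spiked, spike_added):.0f}%")
# Prints recoveries of 50%, 99% and 200%.
```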

The last point, about the reliability of instruments, has already been dealt with in this blog. Instruments do require maintenance, instruments do drift and they do require calibration. However, if an instrument is maintained, serviced and calibrated, which should take less time than the regular site testing that is undertaken every day of the week (at least at larger wastewater treatment works), then the burden of sampling is removed from the operator, allowing the operator to operate rather than analyse samples.

In summary, there is a lack of trust in instrumentation mainly because of problems that have been encountered in the past, either due to bad installation or to a lack of maintenance and calibration. This has led to the current state of the industry (at least in the UK, where the number of operators is relatively low) in which the operator spends a large proportion of the day sampling rather than operating the treatment works that he (or she) runs. The quality of this analysis is poor, simply because it provides only a snapshot of the situation.

With the correct instrumentation properly installed, maintained and calibrated, there is no reason why it cannot be used for both operational and regulatory purposes, which will allow the industry to increase the efficiency of the way it operates.

In the last of this series of blogs I will summarise the whole series and look to the future of instrumentation and how we can make the best use of it to assist in day-to-day operations.

 

 



Discussion

One thought on “Resistance to the Effective Use of Instrumentation VI: Can I really trust what my instrument tells me?”

  1. First I should say that I am out of the analytical lab side of the industry, but the issue of trusting technology and resistance to change is one I have dealt with and been subject to.

    A question I would pose is: could there be resistance due to relocating the responsibility for the actual measurement from a perceived expert (the external analytical lab) to in-house staff? I spent some time as a lab manager for a hazardous waste incinerator, and in that area, in the US, there were automated recording instruments on the stack. We were routinely audited for their maintenance and performance (as well as, obviously, what the data indicated). However, this was what was expected, so there was no trepidation. If we had been taking samples and then sending them offsite to an “expert”, I could see where moving it onsite would have led to significant concern.

    The other very real issue is simply change overall. We are creatures of habit, and a proven system, even if somewhat flawed, is something we are used to. Here at Suburban Labs, we have been working on decreasing the use of log books, etc., in moving to a paperless lab. In a number of areas, the analyst was writing a value down in a logbook and then later transcribing that value to a computer (into the LIMS). With the dropping cost of tablet PCs, the question was posed as to whether we could simply have the analyst enter the data directly. The initial response was: what would we use to audit the result in the LIMS? Now, this sounds fairly reasonable, but if you recognize that people can easily transpose numbers when writing them down, then by having a system where the analyst first wrote them down and then transcribed the value into the LIMS, we actually had two spots for human error. Going to direct data entry would decrease the possibility for error. If the analyst transposed a value on paper we had no chance of catching it, so all we were ever doing was ensuring that the transcription was correct. The point of this is that we can develop a high level of comfort with how things are done that is often beyond logic when properly analyzed. We had fairly high resistance to moving to direct data entry from those reviewing and auditing the data, but when the system was considered in total, the change had to decrease the possibility of error. In the end this was recognized, but it was an issue that was met with significant resistance, even when the whole picture was presented.

    In both cases, it took a fairly committed organization with a willingness to work through these issues. It needs to be recognized that it is strongly human nature to resist change, and a lot of support needs to be put in place. At the same time there was a necessity for management to continue moving forward and to deliver the necessary resources (often in terms of nothing more than facilitating buy-in, training, etc.).

    Sincerely,

    Greg Pronger

    Posted by Greg Pronger | April 4, 2012, 10:49 pm
