Craig Marshall, project engineer at NEL, a provider of technical consultancy, research, testing and flow measurement services to the energy industries, discusses a more innovative approach to calibration that could save time and money for the oil & gas industry.
Metering technology has advanced significantly over the past few decades, and flow meters are now able to record and store vast amounts of data. However, the cost of calibrating an offshore fiscal measurement device, including shut-down, packaging, transport, calibration and witnessing, can be in the region of $50,000. This is typically an annual expense, and it does not include the considerable planning and preparation involved.
As recalibrations are both costly and labour intensive, particularly when multiple meters are involved, a more innovative approach to calibration is needed, one that could save the oil & gas industry valuable time and money.
In the oil & gas industry, an emphasis on environmental standards and fiscal accountability means that accurate measurement is a key driver, not only for regulators, which want assurance of accurate fiscal accountability, but also for operators, which want to maximise business efficiency.
In the UK, the Department of Energy & Climate Change (DECC) requires that flow meters are regularly calibrated (Guidance Notes for Petroleum Measurement Issue 8) and these UK specific requirements are reflected in other regulatory requirements across the globe. DECC requirements include that the operator must be able to demonstrate that, prior to its installation and on-site commissioning, the meter that takes fiscal measurements is fully operational. The operator must therefore designate within their organisation a responsible authority that will co-ordinate the testing procedure and advise DECC of the identity of the representative(s) that will be present during the testing procedure.
While physical witness testing is accepted as the norm in the oil & gas industry, it is costly as calibrations may take days to complete. This approach consumes valuable staff time that could be more productively invested elsewhere in the business. Other escalating costs such as staff accommodation and subsistence are also incurred.
A new approach to calibrations that could be adopted would be remote witnessing, where Internet-based technology takes the laboratory to the world and no longer requires the world to come to the laboratory. Remote witnessing would mean that those delegated to witness calibrations would not have to be physically present, allowing them to dedicate more time to their primary role within the business and negating any costs associated with travel. While this new approach has obvious benefits for business efficiency, it would have no negative impact on the accuracy or traceability of calibrations. Such an approach could also reduce the cost associated with the physical time of the calibration process. For example, if an anomaly is identified during the calibration process, experts can remotely log in to resolve it. As their physical presence at a test is no longer required, answers can be found more quickly so that calibration downtime is minimised.
Technology could also be used to enhance the productivity of calibration laboratory operations and remove the need for post-processing of data. Currently, a calibration is completed and then analysed to identify any problems, and repeated once those problems have been addressed. Monitoring the calibration against pre-set criteria in real time would instead allow issues to be flagged and addressed during the original calibration. Because such a system would store historical calibration data, current performance could be compared with previous performance at similar conditions to check for shifts or drift between calibrations. Again, by setting performance criteria, unacceptable changes over time could be identified and addressed during the calibration.
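The real-time drift check described above can be sketched in a few lines. This is a minimal illustration only: the function name, the use of a simple mean as the baseline, and the 0.15 % tolerance are all hypothetical, not criteria used by NEL or any regulator.

```python
# Hypothetical sketch: flag drift between a current calibration result and
# historical results taken at similar flow conditions. The 0.15 % tolerance
# is an invented example, not a regulatory or NEL criterion.

def check_drift(current_k, history_k, tolerance_pct=0.15):
    """Compare a meter factor against the mean of historical factors.

    current_k     -- meter factor from the calibration in progress
    history_k     -- meter factors from past calibrations at similar conditions
    tolerance_pct -- maximum acceptable shift, in percent
    """
    baseline = sum(history_k) / len(history_k)
    shift_pct = abs(current_k - baseline) / baseline * 100.0
    return shift_pct <= tolerance_pct, shift_pct

# A shift of roughly 0.03 % from the historical mean passes the check.
ok, shift = check_drift(1.0021, [1.0018, 1.0019, 1.0017])
```

In practice the baseline would be interpolated to the current flow rate and fluid conditions rather than taken as a flat mean, but the pass/fail logic is the same.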
A development of this capability is the use of historical calibration data to improve the quality of the individual instrument uncertainty values, used to develop an overall system uncertainty. Such a development was discussed at the 31st North Sea Flow Measurement Workshop, held in Norway in October 2013. This was attended by delegates from around the world, including operators, service companies and equipment vendors, who discussed the latest challenges and technologies.
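For uncorrelated inputs, individual instrument uncertainties are conventionally combined into an overall system uncertainty by root-sum-square, following the GUM (Guide to the Expression of Uncertainty in Measurement). The sketch below illustrates that combination; the component values are invented examples, and a real fiscal metering uncertainty budget would also apply sensitivity coefficients to each input.

```python
import math

# Root-sum-square combination of independent relative uncertainties (%),
# the standard GUM approach for uncorrelated inputs. Component values
# below are invented for illustration.

def combined_uncertainty(components):
    """Combine independent relative uncertainties by root-sum-square."""
    return math.sqrt(sum(u ** 2 for u in components))

# e.g. meter repeatability, reference standard, temperature, pressure
u_system = combined_uncertainty([0.10, 0.05, 0.03, 0.02])
```

Improving the quality of any one input uncertainty, for example by using historical calibration data as the article suggests, directly reduces the combined value.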
Currently, calibration is also completed in isolation, with no correlation between in-service operation and calibration periods. Using data sourced from local SCADA (supervisory control and data acquisition) systems to monitor a meter's output against data gathered during the original laboratory calibration offers the prospect of improving overall system management. Ultimately this could move the calibration process from one based on pre-defined re-calibration intervals to one that is condition-based. This would reduce well operations downtime and save significant amounts of money by not calibrating meters that are operating within required limits. In addition, remote access to calibration data from anywhere in the world, at any time, would also benefit business performance by enabling quicker, more accurate decision making.
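One way a condition-based trigger of this kind might work is sketched below: recent in-service readings at a reference condition are compared against the laboratory baseline, and recalibration is requested only when the average deviation exceeds a limit. The function, field names, window size and 0.25 % limit are all assumptions made for illustration.

```python
# Hypothetical condition-based recalibration trigger: compare recent
# SCADA readings against the meter's laboratory baseline and flag the
# meter only when the average deviation exceeds a limit. The 0.25 %
# limit and 5-reading window are invented examples.

def needs_recalibration(scada_readings, baseline, limit_pct=0.25, window=5):
    """Flag a meter whose recent readings drift from the lab baseline.

    scada_readings -- chronological meter outputs at a reference condition
    baseline       -- expected output from the laboratory calibration
    """
    recent = scada_readings[-window:]
    mean_dev = sum(abs(r - baseline) for r in recent) / len(recent)
    return (mean_dev / baseline) * 100.0 > limit_pct

# A steadily rising output against a baseline of 100.0 trips the trigger.
flag = needs_recalibration([100.1, 100.2, 100.4, 100.5, 100.6], 100.0)
```

A production system would of course need to normalise readings for flow rate, temperature and pressure before any such comparison, but the principle of replacing a fixed calendar interval with a data-driven trigger is the same.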
The oil & gas industry’s utopian goal is to significantly reduce operation downtime and save money associated with calibrating meters, while meeting the requirements of government and regulatory bodies. However, calibration frequencies remain typically calendar-based, with intervals often based upon a pre-programmed time period or the volume of flow that has passed through the meter.
Technology is now advancing to a point where much more computer processing can be completed in real time, making a new approach to the calibration of oil & gas meters a real possibility. Calibration utopia is one step nearer: a world where costs and labour time are reduced, and where trending of both calibration data and data from the field allows accurate condition-based, rather than time-based, meter calibration.