Hioki Process / Calibration

 
Hioki SS7012 DC Signal Source
  • Style (Voltage Calibrators): Hand Held
  • Display (Voltage Calibrators): Dual
  • Measure Input Voltage: Yes
  • Source Output Voltage: Yes
  • Maximum Measured Voltage: 28 V DC
  • Voltage Measure Resolution: 100 uV

Your Price: $1,470.00


Hioki Process / Calibration

Process Calibration Equipment refers to test equipment used to calibrate process instrumentation, particularly temperature, pressure, and process signals (e.g., pulse, 4-20 mA, 0-10 VDC).

Where is process instrumentation used?
  • Chemical
  • Food
  • Gas/Coal/Oil/Nuclear Power Generation
  • Marine
  • Paper
  • Petroleum
  • Pharmaceutical and Hospital
The above are just a few examples of industries using process instrumentation in manufacturing, R&D, and the laboratory. Many types of process instruments measure variables such as flow, level, temperature, pressure, RPM, pH, conductivity, and humidity, and return a measurement signal to a control system or data logger. All of these sensors, as well as the instruments receiving their signals, need to be calibrated periodically because they can drift, causing the process to deviate and leading to quality issues or even safety hazards. No one wants their medicine, food, power, etc. out of specification.

DIY (Do It Yourself): Justification of in-house calibration/troubleshooting instruments

Looking to justify the purchase of calibration equipment? Here is a list of considerations. 
  • Costs due to down time
  • Costs due to out of specification product and their disposal
  • Costs of outside calibration laboratories, shipping expense, and time
  • Start by taking only the most important or critical calibrations in-house; leave less critical ones for an outside laboratory
  • More control over calibration quality
    • Better consistency
    • Better accuracy
    • Better documentation
    • Calibrate more than once per year
    • Calibrate instruments that normally would not get sent out
Most Common Process Calibration Variables
  • Frequency (pulse signals from turbine flowmeters, for example)
  • Milliamp loop current (4-20 mA) / DC Voltage (typically 0-30 VDC)
  • Pressure
  • Temperature (Thermocouple and/or RTD)
  • Multifunction models combine two or more of the above capabilities
  • HART Multifunction Calibrators add HART Communications to the multifunction capability.

Accuracy Considerations

The following hierarchy lists the traceability of standards, from highest to lowest:
 
  • National Measurement Standard (e.g., NIST in the USA, NPL in the UK)
  • Primary Standard
  • Secondary Standard
  • Working Standard (i.e., a "shop" standard; typical NIST-traceable calibrators)
  • Process Measuring Instrument (e.g., temperature probe, pressure transmitter, etc.)
 
When selecting a calibrator, make sure to look at the accuracy of all instruments it will be used with: not only the measurement instruments, like a temperature transmitter, but also the receiving instrument, such as a recorder or controller. Modern instruments have good accuracy, so a calibrator with better accuracy is required; otherwise, the calibrator can only be used for a calibration check.

The vast majority of calibrators will be in the Working Standards category. The rule of thumb in the instrumentation industry is to use a calibrator with 4x better accuracy than the device being calibrated (a 4:1 ratio). Certain industries require a better ratio, or you can purchase Secondary or even Primary Standards for in-house calibration of your calibrators and avoid the repeated cost of sending them out for calibration.
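As a rough illustration of the 4:1 rule of thumb, the short Python sketch below checks whether a calibrator's accuracy is sufficient for a given device. The function name and the example accuracy figures are illustrative assumptions, not taken from any manufacturer's specification.

```python
def meets_accuracy_ratio(device_accuracy_pct, calibrator_accuracy_pct, required_ratio=4.0):
    """Return the test accuracy ratio and whether it meets the rule of thumb.

    Both accuracies are assumed to be expressed the same way, e.g. percent
    of full scale over the same range.
    """
    ratio = device_accuracy_pct / calibrator_accuracy_pct
    return ratio, ratio >= required_ratio

# Example: a 0.25% FS pressure transmitter checked with a 0.05% FS calibrator
ratio, ok = meets_accuracy_ratio(0.25, 0.05)
print(f"{ratio:.1f}:1 ratio -> {'acceptable' if ok else 'below the 4:1 rule of thumb'}")
```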

What is the difference between Percent Full Scale (%FS) and Percent Reading?

As an example of the difference between 1% of full scale and 1% of reading: if the full scale is 100 psig, then 1% of full scale is +/-1 psig. So if you are measuring 50 psig, the accuracy is 50 +/-1 psig, or about 2% of the reading, and the percentage gets worse the lower the pressure. With a 1% of reading specification, measuring 50 psig gives an accuracy of 50 +/-0.5 psig, maintaining 1% accuracy throughout the measuring range. This example uses pressure but applies to any unit of measure.
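The arithmetic above can be captured in a few lines of Python. This is only an illustrative sketch; the 100 psig full scale and the 1% specification simply mirror the example.

```python
FULL_SCALE_PSIG = 100.0
ACCURACY_PCT = 1.0  # the 1% specification used in the example

def error_band_full_scale(full_scale=FULL_SCALE_PSIG, accuracy_pct=ACCURACY_PCT):
    """Error band (+/- psig) for a percent-of-full-scale specification."""
    return full_scale * accuracy_pct / 100.0

def error_band_reading(reading, accuracy_pct=ACCURACY_PCT):
    """Error band (+/- psig) for a percent-of-reading specification."""
    return reading * accuracy_pct / 100.0

for psig in (100.0, 50.0, 10.0):
    fs = error_band_full_scale()
    rd = error_band_reading(psig)
    print(f"{psig:5.1f} psig: +/-{fs:.2f} psig ({fs / psig * 100:.1f}% of reading) with %FS, "
          f"+/-{rd:.2f} psig with %reading")
```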

Signal Calibrators / Loop Calibrators / Voltage Calibrators

Signal Calibrators, also called Loop and Voltage Calibrators, are used to calibrate and troubleshoot signal transmitters and loops. Signal calibrators usually come with a built-in DC voltage calibration capability.

What is a signal transmitter?

Sensors such as pH meters, temperature probes, and positive displacement flowmeters output signals that cannot travel long distances and/or are susceptible to noise from other wiring in the same conduit. A loop-powered transmitter is typically supplied power from a 24 V DC power supply. It converts the analog measured signal to a 4-20 mA current signal and uses the supplied voltage from the signal loop to power itself. A non-loop-powered transmitter will be 3-wire or 4-wire, using the additional wires for power.

How does a mA signal translate to an engineering unit?

Zero is represented by 4 mA, and 20 mA represents full span (100%), so 50%, regardless of whether the unit is psig, °C, liters per minute, or anything else, would be 12 mA.

[Graph: converting 4-20 mA signals to percent of span]
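For reference, here is a minimal Python sketch of the 4-20 mA scaling described above. The function names and the 0-200 °C span are illustrative assumptions, not tied to any particular instrument.

```python
def ma_to_engineering(ma, span_lo, span_hi):
    """Convert a 4-20 mA signal to an engineering-unit value.

    4 mA maps to span_lo (0% of span) and 20 mA maps to span_hi (100%).
    """
    percent = (ma - 4.0) / 16.0        # 16 mA of live span between 4 and 20 mA
    return span_lo + percent * (span_hi - span_lo)

def engineering_to_ma(value, span_lo, span_hi):
    """Inverse conversion: engineering-unit value back to a 4-20 mA signal."""
    percent = (value - span_lo) / (span_hi - span_lo)
    return 4.0 + percent * 16.0

# Example: a hypothetical 0-200 degC temperature transmitter
print(ma_to_engineering(12.0, 0.0, 200.0))   # 100.0 degC, i.e. 50% of span
print(engineering_to_ma(150.0, 0.0, 200.0))  # 16.0 mA, i.e. 75% of span
```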

An elevated zero of 4 mA is used because a 0 mA signal could mean either a true zero or a broken connection. For short distances or laboratory applications, a voltage output such as 0-5 VDC or 0-10 VDC is common and less expensive than a 4-20 mA transmitter. Current signals handle long runs with much less interference than voltage signals.

Milliamp Calibrators (Loop Calibrators) are important for compensating for wiring runs

At the other end of the wiring run from the transmitter is an instrument such as a PLC (programmable logic controller), DCS (distributed control system), data logger, or controller. These devices actually measure voltage, so an external precision 250 ohm shunt resistor is installed across the terminals or built into the instrument. The instrument then reads a voltage signal from 1 to 5 VDC. How? Look at Ohm's law, V = I x R: at 20 mA, 0.02 A x 250 ohms = 5.0 Volts.

Great, but what happens when you are using 200 feet of 20 gauge wire? A quick internet search will show a resistance of roughly 1 ohm per 100 feet, meaning the instrument will see 0.02 x 252 = 5.04 V, an increase of 0.8%. So now your brand new, NIST-calibrated device is +0.8% in error before it is even installed. Depending on your application, this could be significant, and it gets worse with longer runs and higher gauge numbers (thinner wire). For 22 gauge, the resistance is approximately 1.6 ohms/100 ft, and for 24 gauge it is 2.6 ohms/100 ft. 20-24 gauge are typical wire gauges for process instrumentation.
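Here is a small Python sketch of the calculation above, treating the wiring resistance as in series with the 250 ohm shunt exactly as in the 200 ft / 20 gauge example; the resistance-per-foot figure is the approximate value quoted above.

```python
SHUNT_OHMS = 250.0  # precision shunt at the receiving instrument

def instrument_voltage(ma, wire_feet=0.0, ohms_per_100ft=1.0, shunt_ohms=SHUNT_OHMS):
    """Voltage seen at the instrument input, with the wiring resistance
    treated as in series with the shunt (Ohm's law, V = I * R)."""
    total_ohms = shunt_ohms + (wire_feet / 100.0) * ohms_per_100ft
    return (ma / 1000.0) * total_ohms

ideal = instrument_voltage(20.0)                     # 5.00 V with no wiring resistance
actual = instrument_voltage(20.0, wire_feet=200.0)   # 5.04 V with 200 ft of 20 gauge
error_pct = (actual - ideal) / ideal * 100.0
print(f"{actual:.2f} V at 20 mA -> {error_pct:+.1f}% error from the wiring run alone")
```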

Rather than estimating the length of wire and its resistance, use a calibrator to quickly correct the error. These adjustments are often called 4-20 mA trim or current loop trim. Connect the milliamp calibrator in source mode in place of the transmitter, source 20 mA and 4 mA to the instrument, and set up the instrument to read 100% and 0% correctly. Make sure to wait sufficient time to let the reading stabilize and/or set any instrument damping (sometimes called filter) to zero.

Source Only vs. Read/Source Calibrators and the Milliamp Process Clamp Meter

Source mA
The minimum function of a calibrator is to source a known thermocouple, RTD, mA, or voltage signal. That permits performing zero/span adjustments, and loop calibrators can compensate for wiring runs, as discussed above. In source mode, testing conditions that otherwise would be difficult or unsafe becomes possible. Imagine testing a tank level alarm: with a mA calibrator, it is safe to simulate a 90% full condition to test a high alarm warning and 95% to test the high high (HH) alarm. Another safety example is simulating a high pH mA signal to trigger shutdown of pumps sending water into the sewer.

“I think we have a bad valve…” Milliamp calibrators in source mode can also be used to test valve positioners (i.e. “Stroke” the valve). Send signals to open, close, or any other position while watching the valve stem position on a bench or in the field for troubleshooting.

Variable frequency drives (VFD) are used to power motors, blowers, and fans in process applications as well as in conveyor systems and machine tools. Control inputs are generally voltage (1 V to 5 V or 0 V to 10 V) or current (4 mA to 20 mA). A milliamp/voltage calibrator can source a signal for commissioning and troubleshooting.

Make sure the source feature has selectable zero/span, slow ramp, fast ramp, and step ramp. These are invaluable for simulating a process: one cannot be in two places at the same time, so the calibrator can run an up or down ramp while the results are viewed at the controller.
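As a rough sketch of what such a source feature steps through, the Python below generates the mA setpoint sequences for a linear ramp and a step ramp; the step counts and percentages are arbitrary examples, not any particular calibrator's actual behavior.

```python
def linear_ramp(start_ma=4.0, stop_ma=20.0, steps=16):
    """Evenly spaced mA setpoints for a slow or fast ramp (more steps = slower)."""
    step = (stop_ma - start_ma) / steps
    return [round(start_ma + i * step, 3) for i in range(steps + 1)]

def step_ramp(levels_pct=(0, 25, 50, 75, 100)):
    """Step ramp at fixed percentages of the 4-20 mA span."""
    return [round(4.0 + pct / 100.0 * 16.0, 2) for pct in levels_pct]

print(step_ramp())              # [4.0, 8.0, 12.0, 16.0, 20.0]
print(linear_ramp(steps=4))     # [4.0, 8.0, 12.0, 16.0, 20.0]
```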

Read mA
For troubleshooting problems, reading the signal coming from the sensor or a re-transmitted mA signal from a controller or recorder is necessary.

What is the difference between “Source” and “Simulate” in a milliamp calibrator?

Source will actually output a 4 mA to 20 mA signal based on the value selected. Simulate does not output power itself; instead, it regulates the current flow from an external source to the selected value within 4 mA to 20 mA.
 
What are Linearity, Repeatability, and Hysteresis?

Linearity for process instrumentation is expressed as a percentage of full scale describing how far the instrument deviates from a best fit straight line (BFSL). For example, a typical pressure transducer has 0.5% or 0.25% full scale accuracy. Sometimes the specification will state this as Linearity.

Repeatability is not accuracy; rather, it is how much the measurement varies when the same instrument measures the same value under the same conditions over multiple tests, usually expressed as a percentage of full scale.

One definition of Hysteresis has to do with control systems and preventing "chatter" and is not applicable to our accuracy discussion. Think of your furnace thermostat as an example: it turns on when the ambient temperature falls below the setpoint, stays on until a preset temperature above the setpoint is reached, and will not come back on until the temperature falls a preset amount below the setpoint. This prevents unnecessary cycling (i.e., "chatter") of the unit; if the temperature fluctuated between 69.9 and 70.1 °F, for example, constant on/off action could wear out the furnace prematurely. This is different from the hysteresis discussed for measurement accuracy.

From a measurement standpoint, Hysteresis is the difference in readings when loading versus unloading across the range of the instrument. For example, a pressure transducer has pressure applied from zero to 100 psig, and then the pressure is removed from 100 psig back to zero; the readings taken on the rising cycle will differ slightly from those taken on the falling cycle. Temperature can also affect hysteresis.

How accurate is your pressure transducer, mass flowmeter, and other sensors subject to Hysteresis?

[Graph: displayed pressure versus applied pressure, illustrating linearity and hysteresis]

The graph above is for illustration purposes, but it indicates an inherent hysteresis problem with pressure transducers and, to a lesser extent, thermal mass flowmeters. The Y-axis (displayed pressure) represents the pressure on the transducer display, the mA output signal, or the HART signal value. The X-axis represents the applied input. For pressure transducer calibration, the ISA-37.1 standard calls for collecting data at a minimum of 5-6 points of rising and falling pressure (0, 20, 40, 60, 80, and 100%). Even with the collection of several points, the result is still a best fit straight line, and across the whole range in the graph there are places with better and worse accuracy. Manufacturers can play with the data, too, and still be correct in their specification: they can take multiple tests and average the results, or be selective in which points they measure and use regression analysis in determining the best fit straight line. The end result is 0.5% linearity generally speaking, but at your specific measurement point, in your specific batch today at 2 PM, the accuracy can be considerably worse.
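To make the linearity and hysteresis definitions concrete, here is a small Python sketch that computes both from a set of rising and falling calibration points. The sample data are invented for illustration; a real analysis would use the points from your calibration report.

```python
# Invented 6-point rising/falling calibration data, in percent of span
applied = [0, 20, 40, 60, 80, 100]                  # applied input
rising  = [0.0, 20.3, 40.4, 60.3, 80.2, 100.0]      # indicated, input increasing
falling = [0.0, 19.8, 39.7, 59.8, 79.9, 100.0]      # indicated, input decreasing

def best_fit_line(x, y):
    """Least-squares slope and intercept for the best fit straight line (BFSL)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
    return slope, my - slope * mx

all_x, all_y = applied + applied, rising + falling
m, b = best_fit_line(all_x, all_y)

# Linearity: worst deviation of any point from the BFSL
linearity = max(abs(yi - (m * xi + b)) for xi, yi in zip(all_x, all_y))
# Hysteresis: worst gap between the rising and falling readings at the same input
hysteresis = max(abs(r - f) for r, f in zip(rising, falling))

print(f"Linearity: +/-{linearity:.2f}% FS, hysteresis: {hysteresis:.2f}% FS")
```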

There are a few approaches to reducing your hysteresis error, and error in general. Most processes operate at a fixed temperature, pressure, flow, etc., or at least within a narrow range. Being able to see a larger range might be needed to move the process to the desired value, but most of the time is spent in a narrow portion of a larger full scale. Keeping that in mind, consider the following:
  • When getting instruments NIST calibrated, get the data! Usually the factory or independent calibration laboratory will charge a small amount extra for the calibration data, but it is worth it. Remember to ask how many calibration points are measured and which points; if necessary, ask for more, usually a small adder in price. Now you can see where the errors are and compensate. Some controllers, PLCs, etc., will have a menu for a linearization table, typically 16 or more points. At a minimum, they might have a menu called bias or offset where you can enter a single-point positive or negative offset (see the sketch after this list). If your process runs at 100.0 psig for the main critical portion of your batch, you know you are reading 0.2 psig low, and you cannot make the adjustment at the pressure transducer, look for an input bias/offset menu in the controller.
  • When requesting calibration of your sensors, supply desired points and get the data as mentioned above.
  • Get higher accuracy sensors. Also, keep in mind the difference between percent full scale and percent reading when selecting sensors, discussed in greater detail above. If you get a percent full scale instrument, make sure your operating range is in the top third of the instrument range.
  • Compensate for wiring runs with a loop calibrator, as discussed earlier.
  • DIY (Do It Yourself). Calibrate instruments in-house.
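Here is the minimal sketch referenced in the first bullet above: a piecewise-linear correction table and a single-point bias, both applied in Python. The calibration values and the 0.2 psig offset are hypothetical, simply mirroring the example in the list.

```python
from bisect import bisect_right

# Hypothetical calibration data: what the transmitter indicated versus what the
# calibrator actually applied. A real table would come from your calibration
# certificate and typically holds 16 or more points.
indicated = [0.0, 25.2, 50.3, 75.1, 100.2]   # transmitter reading, psig
true_val  = [0.0, 25.0, 50.0, 75.0, 100.0]   # applied value, psig

def linearize(reading, table_x=indicated, table_y=true_val):
    """Correct a reading with piecewise-linear interpolation, the same idea as a
    controller's linearization table. Readings outside the table are clamped."""
    if reading <= table_x[0]:
        return table_y[0]
    if reading >= table_x[-1]:
        return table_y[-1]
    i = bisect_right(table_x, reading)
    x0, x1 = table_x[i - 1], table_x[i]
    y0, y1 = table_y[i - 1], table_y[i]
    return y0 + (reading - x0) * (y1 - y0) / (x1 - x0)

# Single-point bias/offset alternative: if the reading is known to be 0.2 psig low
# at the critical operating point, simply add a fixed offset in the controller.
BIAS_PSIG = 0.2
print(linearize(50.3))        # ~50.0 after table correction
print(99.8 + BIAS_PSIG)       # 100.0, the offset-corrected reading at the critical point
```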