Chlorine can bleach the indicator and interfere with the test. Add one drop 0.1 N sodium thiosulfate per 100 mL sample before adding any reagents to eliminate this interference.
If the sample does not turn pink after adding the phenolphthalein indicator, there is no phenolphthalein alkalinity in the sample. This will happen whenever the starting pH of the sample is less than 8.3.
Continue with the test by adding the bromcresol-green methyl red indicator and titrate to the endpoint for total alkalinity.
Titrate to an endpoint of pH 4.5 initially and use this as your estimated result. If the estimated result is significantly lower than the 500 mg/L range, repeat the test and titrate to the endpoint pH that most closely matches the estimated result.
The term free ammonia is used when chloramines are used to disinfect water. During chloramination, chlorine and ammonia are added to water to form monochloramine. The portion of ammonia that has not combined with chlorine is called free ammonia, and exists as either NH4+ or NH3 depending on the pH and temperature of the water. At a neutral pH and ambient temperature, almost all of the free ammonia exists as NH4+. As the pH and temperature increase, the amount of NH3 increases and the amount of NH4+ decreases.
Traditional testing methods for ammonia such as the Nessler or Salicylate Methods show interference from chloramines and give ammonia results that are higher than actual. To measure free ammonia in water disinfected with monochloramine, use Hach® Method 10200 for laboratory or field testing, or the APA 6000 Ammonia and Monochloramine Analyzer for continuous monitoring. An ammonia ion-selective electrode can also be used, but may not provide enough sensitivity at low ammonia concentrations.
Do not add sodium thiosulfate to samples to remove monochloramine. The reaction of sodium thiosulfate and chloramines forms ammonia, and your results will be higher than actual.
Yes. This method was omitted from the 19th edition of "Standard Methods for the Examination of Water and Wastewater," but it is still approved by the United States Environmental Protection Agency.
A small amount of red or brownish precipitate is normal for this reagent and does not indicate a problem. Do not mix the solution. Let the precipitate settle and use the solution above it.
Ammonia exists in water as either the ammonium ion (NH4+) or un-ionized ammonia (NH3). Un-ionized ammonia is toxic to fish, while the ammonium ion is nontoxic except at extremely high concentrations. At a neutral pH (pH 7) and ambient temperature, almost all of the ammonia exists as NH4+. As the pH and temperature increase, the amount of NH3 increases and the amount of NH4+ decreases.
To find the amount of un-ionized ammonia in your water, first measure the ammonia concentration, pH, and temperature. All Hach methods for ammonia measure both NH4+ and NH3. Then find the percent of un-ionized (toxic) ammonia from the Percentage Un-ionized Ammonia in Aqueous Solution Table (from FF2 Freshwater Aquaculture Test Kit Manual). Multiply your ammonia concentration by the percent from the table, and then divide by 100.
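As a quick check of the arithmetic, here is a minimal Python sketch of this calculation; the total ammonia concentration and table percentage below are hypothetical values:

```python
def unionized_ammonia(total_ammonia_mg_l, percent_nh3_from_table):
    """Un-ionized NH3 (mg/L) = total ammonia x (percent NH3 from the table) / 100."""
    return total_ammonia_mg_l * percent_nh3_from_table / 100.0

# Hypothetical example: 2.0 mg/L total ammonia, and the table gives 1.8%
# un-ionized ammonia at the measured pH and temperature.
print(unionized_ammonia(2.0, 1.8))  # 0.036 mg/L un-ionized (toxic) ammonia
```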
The replacement reagent set for the EZ arsenic test kit is Catalog No. 2823200.
The two kits differ in range, complexity, price, and performance. Kit 2800000 has a range up to 500 ppb and is best for samples containing sulfide or arsenic-iron particles. It has excellent recovery of As+5 (100%) and can measure organic arsenic with optional steps.
The EZ arsenic kit 2822800 has a range up to 4000 ppb, fewer steps, and is more economical; however, it recovers only 90% of As+5 and has no option for measuring organic arsenic.
Be sure that your arsenic or QC standard has not been prepared in nitric acid. Nitric acid is commonly used to prepare AAS standards, but it will cause low results with the arsenic test kits.
If arsenic samples cannot be analyzed at the sampling site, add 2 mL hydrochloric acid per liter of sample. The pH should be less than 2. Do not use nitric acid because it prevents the full reduction of the arsenic to arsine gas. Acidified samples can be stored up to 6 months.
Before testing, add sodium hydroxide to neutralize the sample to a pH between 6 and 7. Correct the results for the increase in volume from the acid and base.
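A minimal sketch of that volume correction, assuming you record the volumes of acid and base added (the numbers below are hypothetical):

```python
def correct_for_added_volume(result, sample_ml, acid_ml, base_ml):
    """Scale the measured result up by the dilution from the preservation
    acid and the neutralizing base."""
    return result * (sample_ml + acid_ml + base_ml) / sample_ml

# Hypothetical example: 1 L of sample preserved with 2 mL of HCl,
# then neutralized with 3 mL of NaOH before testing.
print(correct_for_added_volume(10.0, 1000.0, 2.0, 3.0))  # 10.05, in the units measured
```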
The Ascorbic Acid Test Kit was designed for measuring ascorbic acid in beverages. The kit measures reducing agents in general, and the chemical reactions are similar to those for sulfite titrations.
In the Ascorbic Acid kit, sulfuric acid is added to acidify the sample, and a starch solution is added as an indicator. The titrant contains a potassium iodide-iodate standard solution. When the titrant is added to the acidified sample, iodine is released.
The iodine reacts with ascorbic acid in the sample. When all the ascorbic acid has reacted with the iodine, excess iodine reacts with the starch indicator and forms a blue color. The amount of titrant needed to reach the blue endpoint color is directly proportional to the amount of ascorbic acid in the sample.
The Atrazine immunoassay test detects the presence of various atrazines and metabolites to different degrees. The test is approximately 40 times less sensitive to Atrazine Desethyl than it is to Atrazine; however, it will still detect it. Approximately 120 ppb of Atrazine Desethyl is required to give the same test response as 3 ppb of Atrazine.
See the Sensitivity section of the Atrazine procedure for the relative sensitivity of other similar compounds.
This standard solution contains 300 mg/L each of glucose and glutamic acid, and is used to validate your technique. We recommend that you add 1, 2, 3, and 4 milliliters of this standard to four BOD bottles (one volume per bottle), add your seed and dilution water, and perform the BOD test. Your results should be 400 mg/L +/- 30.5 mg/L for each of these bottles.
You may also add 3 mL to each of three BOD bottles, add your seed and dilution water, and perform the BOD test. Your results should be 400 mg/L +/- 30.5 mg/L. This standard is twice the concentration of the standard used in "Standard Methods for the Examination of Water and Wastewater." To correlate your results with Standard Methods, divide your result by 2.
The Standard Methods result of 198 (+/- 30) mg/L BOD is based on standards containing 150 mg/L each of glucose and glutamic acid, which is half the concentration of the Hach BOD standard.
The Hach BOD standard solution (Catalog No. 1486510) contains 300 mg/L each of glucose and glutamic acid. Therefore when using the Hach BOD standard, divide results by 2 for comparison with Standard Methods.
The most common problem with low results using a BOD standard is insufficient seed. Increasing the amount of seed in each bottle should improve results.
BOD standards consisting of glucose and glutamic acid are very stable and easy to prepare, so it is unlikely that the standard has degraded. You can verify the concentration of the BOD standard by using 2 mL of this standard in the HR COD test. The COD result for this standard should be approximately 613 (+/- 25) mg/L COD.
A lower result for CBOD using standards is relatively common but not well understood. It appears that the nitrification inhibitor has an adverse effect on the seed, possibly due to nitrifying bacteria in the seed. Increasing the amount of seed in each bottle normally improves results.
Actually the amounts are the same. The difference is that Standard Methods refers to a small amount of full-strength TCMP, the active ingredient, while the Hach formulation uses a greater amount of a powder that contains only 2 percent TCMP.
When used as specified, 0.16 g of the inhibitor or 3 mg TCMP (0.16 x 0.02) is added to each 300 mL bottle. This is the same amount of TCMP specified in Standard Methods.
BOD can be estimated from COD measurements, however the ratio of BOD to COD can only be found by measuring BOD and COD over time and using this data to find a correlation.
Once you have gathered COD and BOD data for your sample, divide your average BOD result by the average COD result to find the ratio or conversion factor. Multiply your COD results by this factor to estimate your BOD concentration. COD values are almost always higher than BOD results for the same sample; the conversion factor should be less than one.
This correlation will only apply to your sample. If your sample composition changes significantly, for example seasonally, you will need to determine a new ratio.
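A minimal Python sketch of this correlation, using hypothetical paired BOD and COD data:

```python
def bod_cod_factor(bod_results, cod_results):
    """Conversion factor = average BOD / average COD (normally less than 1)."""
    return (sum(bod_results) / len(bod_results)) / (sum(cod_results) / len(cod_results))

# Hypothetical paired results (mg/L) collected over time on the same sample type.
bod = [180, 210, 195]
cod = [400, 460, 430]
factor = bod_cod_factor(bod, cod)
print(round(factor, 2))     # ~0.45
print(round(420 * factor))  # estimated BOD (~190 mg/L) for a new COD reading of 420 mg/L
```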
The term seed refers to microorganisms that consume the biodegradable organic matter in samples for BOD measurement. If not enough seed is present, all of the biodegradable matter may not be consumed and results will not be accurate. Seed requires proper pH and temperature control, and nutrients such as phosphorus, calcium, and magnesium for proper growth. Hach nutrient buffer pillows provide the necessary nutrients and pH.
The best source of seed that gives the most reproducible results is domestic wastewater (influent) or effluent from biological water treatment plants before disinfection. Other sources such as industrial wastewater may not have enough microorganisms, or may contain toxins that prevent the organisms from growing. If wastewater is not available, prepare a seed solution from a freeze-dried capsule such as PolySeed.
Whatever source of seed is used, it will exert some demand. Therefore a seed control must be measured to correct for this demand.
The BOD test requires very clean glassware to avoid any contamination that can skew results and lead to irreproducible data. To clean BOD bottles, use a 20 to 50% sulfuric acid solution (approximately 5 to 14 N), or for maximum cleaning use a chromic acid cleaning solution.
Pour several mL of the acid into each bottle, cap, and invert to coat the inner surfaces of the bottles. Rinse thoroughly with distilled water. Use this acid rinse in the dilution bottle, fittings, tubing, glassware, and in any other labware that comes into contact with the sample.
The BOD nutrient buffer pillows are sterilized and verified for 0 bacteria count and 0.0 (+/- 0.2) mg/L DO depletion after 5 days incubation. If your blanks show high depletion after 5 days, the most common problem is bacterial contamination in the water or lab equipment. These bacteria can grow once nutrients are added from the slurry pillows.
The most common sources of contamination from labware are the jugs used to store the distilled or deionized water, including spigots, fittings, and tubing. Aeration stones used with small pumps to aerate the water can also be a source of contamination. All equipment should be thoroughly cleaned using a 5 N to 14 N sulfuric acid solution, followed by thorough rinsing with distilled water.
It may be helpful to try a different water source, as even the best deionizing systems typically release trace amounts of organic matter or allow bacteria to grow on the resins, which can cause an oxygen demand. Ideally the source water should be distilled from alkaline potassium permanganate as described in the test procedure.
Good quality dilution water is extremely important in the BOD test. Any contamination in the water will cause problems with the test. Expensive deionized water systems often leach organic compounds from the ion-exchange resins or allow bacteria to grow on the resins, even though standard conductivity measurements show the quality is very good.
The best dilution water for BOD is distilled water. Distilled water can sometimes contain ammonia or volatile organic compounds. To distill water with the lowest organic content on a consistent basis, distill the water using alkaline permanganate. Dilution water prepared using alkaline permanganate will give the most accurate and consistent results. Do not store dilution water for more than 24 hours.
Preparation: Prepare the seed by adding one PolySeed capsule to 500 mL of dilution water and stir gently or aerate for one hour. The dilution water must contain a nutrient buffer pillow as described in the test procedure.
Seed Control: The seed itself will consume some dissolved oxygen, and this demand must be measured and used to calculate a seed correction factor. Run the seed as a normal sample, so that depletion is at least 2 mg/L DO. Use this depletion, together with the ratio of the volume of seed in the sample to the volume of seed in the control, to correct for the oxygen demand from the seed (see the sketch following these steps).
Use: For the dilution method, add 1 to 2 mL (amount varies) of seed directly to each BOD bottle when measuring samples or standards. Use 15 mL if using the BODTrak. This is described in more detail in the test procedure for each method.
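One conventional way to apply the seed correction (the Standard Methods formula) is sketched below in Python; the volumes and DO values are hypothetical:

```python
def seeded_bod(d1, d2, b1, b2, seed_ml_sample, seed_ml_control, sample_fraction):
    """BOD5 = [(D1 - D2) - (B1 - B2) * f] / P
    D1, D2: initial and final DO of the diluted sample (mg/L)
    B1, B2: initial and final DO of the seed control (mg/L)
    f: ratio of seed volume in the sample bottle to seed volume in the control
    P: decimal fraction of sample in the bottle
    """
    f = seed_ml_sample / seed_ml_control
    return ((d1 - d2) - (b1 - b2) * f) / sample_fraction

# Hypothetical example: 6 mL of sample in a 300 mL bottle (P = 0.02),
# 2 mL of seed per sample bottle, 10 mL of seed in the seed control.
print(seeded_bod(8.5, 4.2, 8.3, 5.9, 2.0, 10.0, 6.0 / 300.0))  # ~191 mg/L BOD5
```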
Dissolve 2 grams ATU in 1 liter water, mix, then add 0.3 mL of this solution to each 300 mL BOD bottle. The BOD bottles should be at least two-thirds full with diluted sample before adding the ATU.
ATU is a nitrification inhibitor mentioned in the 21st edition of Standard Methods for measuring CBOD. Standard Methods mentions that ATU may not work as well as TCMP (contained in Hach nitrification inhibitor), and concentrations above 2 mg/L will cause high results.
Monochloramine is commonly used as an alternative to free chlorine for disinfecting drinking water. This is because monochloramine forms fewer disinfection by-products (DBPs) than free chlorine.
Hach developed the Indophenol method for Chloramine (Mono), which is very specific for monochloramine. The test involves adding one reagent to the sample, waiting a 5 minute reaction time, and measuring the concentration in a colorimeter or spectrophotometer.
Hach does not recommend using the difference between a total and free DPD chlorine test for measuring monochloramine. This is because high levels of monochloramine interfere with the free DPD chlorine test, causing the reading to continually increase. Therefore an accurate reading for free chlorine in chloramination systems cannot be measured.
In addition to monochloramine, measure free ammonia to be sure that you are feeding a ratio of chlorine to ammonia that is optimal.
Chloramination involves mixing chlorine and ammonia together at a controlled ratio to form monochloramine. The ideal mix is five parts chlorine to one part ammonia, but this can vary depending on the pH, chlorine demand, and ammonia in the source water. If too much chlorine is added, dichloramine will form, which creates taste and odor problems. If too much ammonia is added, bacteria may grow, which can lead to biofilms or nitrification.
To make sure you are feeding chlorine and ammonia at the optimum ratio, monitor the concentration of monochloramine, total chlorine, and free ammonia. The monochloramine and total chlorine concentrations (as Cl2) should be equal. If total chlorine is higher than monochloramine, your system has formed dichloramine in addition to monochloramine.
Free ammonia in an optimized process should be very low but have some measurable concentration. (Free ammonia is ammonia in the form of NH3 or NH4+ that has not combined with chlorine.) A small but measurable concentration of free ammonia ensures that you are feeding a ratio of chlorine to ammonia that is optimal.
If the free ammonia concentration is too high, nitrifying bacteria may grow, forming biofilms or converting excess ammonia to nitrite. Biofilms and nitrite will consume your monochloramine disinfectant. Measure nitrite to determine if your system is undergoing nitrification.
Free chlorine refers to both hypochlorous acid (HOCl) and the hypochlorite (OCl-) ion or bleach, and is commonly added to water systems for disinfection. When ammonia or organic nitrogen is also present, chloramines known as monochloramine, dichloramine, and trichloramine will quickly form. Chloramines are also known as combined chlorine.
Total chlorine is the sum of free chlorine and combined chlorine. The level of total chlorine will always be higher than or equal to the level of free chlorine.
Free chlorine is typically measured in drinking water disinfection systems using chlorine gas or sodium hypochlorite to find whether the water system contains enough disinfectant. Typical levels of free chlorine in drinking water are 0.2 - 2.0 mg/L Cl2, though levels can be as high as 5.0 mg/L.
Total chlorine is typically measured to determine the total chlorine content of treated wastewater, often for discharge purposes. If you are required to measure and report chlorine levels to a regulatory agency, we advise that you check with your regulator to find whether you are required to measure free chlorine or total chlorine.
If you are using a Hach instrument such as a colorimeter or spectrophotometer, follow the accuracy check section in the step-by-step procedure. This will verify that your instrument and reagents are working properly and that you are performing the test correctly.
Hach also carries SpecCheck Secondary Chlorine Standards, Catalog No. 2635300, to verify instrument performance. This is a very simple and quick way to determine that your instrument is functioning properly.
If your procedure does not have an accuracy check section, you can use a sample of known chlorine concentration, called a chlorine standard solution, and run the test using that solution in place of your sample. If you measure the correct concentration of that solution, then you can be confident that your reagents and instrument or kit are working properly, and that you are performing the test correctly.
Hach chlorine standard solutions, Catalog No. 1426810, must be diluted using high-quality deionized water before use. Chlorine can be lost if the dilution water is not of high quality.
Two laboratory methods that can measure ultra-low levels of chlorine are amperometric titration and ultra-low range DPD. The AutoCAT 9000 is an amperometric titrator that is easy to use and can measure down to 1.2 µg/L. Once set up with your sample and reagents, it runs on its own and calculates the results.
If you have a spectrophotometer such as the Hach DR/2500 or DR/4000, you can follow the DPD ultra-low range Method 10014 (wastewater) or 8370 (drinking water). These methods use a pour-thru cell, filter, and liquid reagents, and can measure down to 2 µg/L.
No simple test kit is available that can measure ultra-low levels of chlorine.
Other oxidants such as bromine, iodine, ozone, chlorine dioxide, or hydrogen peroxide can react with DPD and cause false positives. The most common interferent is oxidized manganese, which can be corrected for by treating the sample with potassium iodide and sodium arsenite.
Sunlight can react with the DPD indicator during the 3-minute reaction time for total chlorine; keep the sample covered during the reaction time if testing outdoors.
If you are trying to measure a very low concentration on a colorimeter or a spectrophotometer, be sure to measure a reagent blank using deionized water. Subtract the reagent blank value from your sample concentration.
It is also a good idea to use the same sample cell for zeroing the instrument and reading the sample concentration. This avoids any apparent concentration caused by optical differences between the cell used for zeroing and the cell used for reading.
This happens when the concentration of chlorine in the sample is much higher than the upper limit for the test. Try a ten-fold or higher dilution of the sample using high-quality deionized water, and repeat the test. If the sample still turns pink initially but then turns clear, try an even higher dilution.
If the color appears stable, multiply the result by the dilution factor to get the chlorine concentration of the undiluted sample. Some chlorine may be lost during the dilution step.
If you are required to calibrate your colorimeter or spectrophotometer, you will need to prepare several (typically 5 to 7) primary calibration standards at different concentrations. The calibration standards should span the full concentration range of the test you use. For example, if you use a Hach method that measures from 0 - 2.00 mg/L, prepare calibration standard concentrations that span this range, for example 0.3, 0.6, 0.9, 1.2, 1.6, and 1.9 mg/L Cl2.
The following procedure is one way to accurately prepare 6 chlorine calibration standards from a Hach Chlorine Standard Solution.
Required Items:
Chlorine Standard Solution, 50-75 mg/L (actual concentration on label): Catalog No. 1426810
Flasks, volumetric, class A, glass, 200-mL: Catalog No. 1457445
Pipets, volumetric, class A, glass, 1, 2, 3, 4, 5, and 6 mL: Catalog No. 1451535, 1451536, 1451503, 1451504, 1451537, 1451506
Pipet Filler: Catalog No. 1218900
Deionized water, 4 liters: Catalog No. 27256
Diluting the Chlorine Standard Solution:
Fill a 200-mL volumetric flask approximately half full with deionized water.
Open a chlorine standard solution ampule.
Pipet 1 mL of the chlorine standard solution into the flask.
Fill the flask to the mark with deionized water, stopper, and invert repeatedly to mix. This makes an approximately 0.3 mg/L Cl2 standard solution. (Use the equation below to find the exact concentration.)
Prepare additional standards by repeating the above steps, substituting 2, 3, 4, 5, and 6 mL of the chlorine standard solution for the 1 mL shown in step 3.
Calculating the Concentration:
Because Chlorine Standard Solutions are difficult to prepare accurately, Hach targets a concentration range, in this case 50 to 75 mg/L, and then determines the actual concentration by amperometric titration. You will find the actual concentration on the label. Use this concentration to calculate each of your calibration standard concentrations.
Multiply the concentration from the Chlorine Standard Solution label by the multipliers in the following table.
Standard # | Pipet volume (mL) | Multiplier |
1 | 1 | 0.005 |
2 | 2 | 0.01 |
3 | 3 | 0.015 |
4 | 4 | 0.02 |
5 | 5 | 0.025 |
6 | 6 | 0.03 |
Diluted concentration = chlorine ampule concentration x multiplier
Example: Your chlorine standard solution (Catalog 1426810) is 63.1 mg/L. You pipetted one mL of this standard into a 200-mL flask and diluted to the mark. The concentration of your diluted standard is 63.1 x 0.005 = 0.32 mg/L chlorine. Enter 0.32 in the calibration mode of your instrument for standard 1. Measure the absorbance of this standard after adding the DPD reagent.
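The dilution arithmetic can be expressed compactly; a minimal Python sketch that reproduces the multipliers in the table above, using the 63.1 mg/L label concentration from the example:

```python
def diluted_concentration(ampule_mg_l, pipet_ml, flask_ml=200.0):
    """Diluted concentration = ampule concentration x (pipet volume / flask volume).
    The table multipliers are simply pipet_ml / 200."""
    return ampule_mg_l * pipet_ml / flask_ml

# Using the example ampule concentration of 63.1 mg/L:
for ml in (1, 2, 3, 4, 5, 6):
    print(ml, "mL ->", round(diluted_concentration(63.1, ml), 2), "mg/L Cl2")
# 1 -> 0.32, 2 -> 0.63, 3 -> 0.95, 4 -> 1.26, 5 -> 1.58, 6 -> 1.89
```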
Note: Use the diluted standard solutions as soon as possible after mixing. Discard any solution remaining in opened ampules.
Pretreat your sample with potassium iodide and sodium arsenite as described below. This pretreatment destroys any chlorine in the sample, and allows you to measure and subtract the concentration due only to the manganese or chromium interference.
Adjust at least 25 mL of sample to pH 6-7 using 1 N sulfuric acid or 1 N sodium hydroxide.
Add 3 drops potassium iodide, 30 g/L (Cat. No. 343-32) to 25-mL of the pH-adjusted sample.
Mix and wait one minute.
Add 3 drops sodium arsenite, 5 g/L (Cat. No. 1047-32) and mix. The arsenite destroys any chlorine in the sample but not manganese or chromium.
Pour 10 mL (or the required amount for your instrument or test kit) of the treated sample into a sample cell and add the DPD reagent. Wait 3 minutes if using total DPD reagent. Read the concentration in your instrument or kit.
Subtract this concentration from your original result (without the pretreatment) to find the correct chlorine concentration.
This instability indicates that your sample has a high concentration of monochloramine, which interferes with the free chlorine test at high concentrations. There is no way to get an accurate reading for free chlorine using DPD when there is a high amount of monochloramine in the sample.
The best thing to do is measure total chlorine using the total DPD reagent, or measure monochloramine using the Monochlor F reagent. Monochloramine is also a disinfectant.
In general, oxidants other than chlorine can be measured using the DPD method for chlorine, and the result can be converted from chlorine to the other oxidant using the molecular weight ratio.
However, a kit or program for free chlorine should not be adapted for chlorine dioxide. This is because the calibration curve for chlorine dioxide is not linear, while the free chlorine curve is linear. Therefore a single conversion factor cannot be used.
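For oxidants with a linear response, the conversion is a straight molecular-weight ratio, as in the following sketch; the molecular weights are standard values, but confirm the resulting factor against your method documentation before relying on it:

```python
# Approximate molecular weights (g/mol) used for the Cl2-equivalent conversion.
MW = {"Cl2": 70.9, "Br2": 159.8, "O3": 48.0}

def convert_from_cl2(reading_mg_l_as_cl2, oxidant):
    """Convert a DPD reading reported as Cl2 to another oxidant by the
    molecular weight ratio. Not valid for chlorine dioxide (nonlinear curve)."""
    return reading_mg_l_as_cl2 * MW[oxidant] / MW["Cl2"]

print(round(convert_from_cl2(1.0, "Br2"), 2))  # ~2.25 mg/L as Br2
print(round(convert_from_cl2(1.0, "O3"), 2))   # ~0.68 mg/L as O3
```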
Yes. Dilute the sample, and then add the appropriate volume (either 2 or 0.2 mL) of this dilution into the vial. Multiply the final answer by the dilution factor.
(see also: 'How do I dilute my sample to bring it within the test range?' in the General Analytical Testing and Measurement FAQ)
The same blank can be used repeatedly for measurements using the same lot of vials. This blank should be stored in the dark.
Monitor the absorbance of the blank over time to see that it is stable. Zero the instrument in absorbance mode using a vial containing deionized water, and then measure the absorbance of the blank. Record the value. Prepare a new blank when the absorbance has changed by about 0.01 absorbance units.
Chloride is the primary interference when testing for COD. Each COD vial used with the dichromate method contains mercuric sulfate that will eliminate chloride interference up to the level specified in the procedure. The MnIII method uses the vacuum pretreatment device to remove chloride up to 1000 mg/L.
Prepare a blank that consists of deionized water and chloride at the same concentration as the chloride in the sample. The blank will turn very dark after digestion, but you can then zero out the chloride interference and obtain a usable result for your sample.
To make sure you are getting total copper, first digest the sample. You may use the USEPA-approved mild or vigorous digestion or Hach's Digesdahl Digestion Apparatus.
Then use a test such as the bicinchoninate method to measure the concentration of copper in the sample.
Dissolved Oxygen samples for Winkler titration can be preserved for up to 8 hours by following these steps:
Collect samples in a clean BOD bottle as described in the procedure.
Add the first 2 reagents (manganous sulfate and alkaline iodide-azide, or Dissolved Oxygen 1 and 2).
Immediately insert the stopper and mix well.
Store in the dark at the temperature of the water source, or seal with water and store in the dark at 10-20 °C.
To seal with water:
Pour a small amount of water around the stopper. Place a BOD bottle cap (Cat. No. 2419-06 for a 300 mL BOD bottle) over the top of the bottle and push until it snaps into place.
After storage, start the procedure at the step where you add the sulfamic acid (or Dissolved Oxygen 3) reagent, and titrate the sample.
The floc that forms after adding the first two reagents usually settles to the bottom of the bottle after sitting undisturbed for a few minutes. In samples of high density, however, for example seawater or very cold samples, the floc may not settle.
If the floc does not settle after 5 minutes, proceed to the next step, adding sulfamic acid. Be careful that none of the floc sticks to the stopper when it is removed from the bottle.
The best way to check results for any test is to use a standard solution in place of your sample. A standard solution contains the parameter you are measuring at a known concentration. If you test the standard solution and your result matches the concentration of the standard, you can be confident that your kit is working properly and that you are performing the test correctly.
Unfortunately, a dissolved oxygen standard solution of known concentration is very difficult to make. For the Dissolved Oxygen Winkler titration, however, a potassium iodide-iodate standard solution can be substituted. This standard, equivalent to 10 mg/L DO, checks the strength of the sodium thiosulfate solution used for the titration.
Follow the accuracy check section of the procedure using this standard solution. If your results are correct, you can be confident that the sodium thiosulfate solution for titrating your samples will give the correct result.
The most common problem when using this method is preventing oxygen from the atmosphere from entering the ampule. Be sure to hold the ampule upside down so the tip is under the water as the ampule fills. Then cap the ampule with the rubber stopper while the ampule is still upside down and under water. You should not see any air bubbles in the ampule. Air bubbles will cause results to be higher than actual.
If you keep air bubbles out of the ampule, your results should be reproducible. This method has few interferences, and is the fastest and easiest way to measure dissolved oxygen in water.
The most likely problem is fluoride contamination in the water used for the blank. If there is any fluoride contamination in the blank, the color difference between the blank and sample will be less than it should be, and results will be low.
Use a fluoride standard solution in place of the sample and check results using the standard. If results are still low, try a different source of deionized or distilled water. Hach Company carries deionized water that is fluoride-free (Cat. No. 27249 or 27256).
If you measure the correct concentration with the standard solution but results are still low on a sample, there is likely an interference in the sample. Known interferences that cause low results are aluminum, high concentrations of alkalinity (>5000 mg/L as CaCO3), and high levels of iron (>10 mg/L Fe). You may need to distill the sample to remove interferences.
Other important things to check are that the temperature of the blank and sample is the same (within 1 °C), and that all glassware is very clean.
When sodium sulfite from the Formaldehyde 1 reagent is added to deionized or distilled water, the pH changes to approximately 9.8. Because the thymolphthalein indicator changes from clear to blue at a range of 9.3 to 10.5, the deionized or distilled water sample will turn blue even though no formaldehyde is present.
Just one drop of sulfuric acid will turn this solution clear again. Although it may appear that the test indicates the presence of formaldehyde, the kit was not designed for levels below 0.05%, and cannot be used as a presence/absence test for formaldehyde.
Hardness in water refers to specific minerals that consume soap and cause scaling in water heaters and boilers. The more minerals, the harder the water. Soft water refers to the absence of these minerals.
The term hardness comes from an expression of how difficult or "hard" it is to wash clothes with the water. When soap is mixed with hard water, these minerals combine with the soap and form a precipitate, or a solid. This decreases the cleaning efficiency of the soap and forms soap scum. As more soap is added, solids continue to form until the minerals are depleted. When the minerals are no longer available, the soap forms a lather and works as a cleaning agent.
The minerals that precipitate with soap are polyvalent cations such as calcium, magnesium, iron, manganese, and zinc. The concentration of calcium and magnesium in natural waters generally far exceeds that of any other polyvalent cation. Therefore, hardness is generally considered to be the concentration of calcium and magnesium in water.
Carbonate and Non-Carbonate Hardness:
Hardness can be classified as carbonate and non-carbonate hardness. Carbonate hardness refers to calcium and magnesium bicarbonate. When calcium bicarbonate is heated, solid calcium carbonate forms. This is the primary cause of scale formation in water heaters and boilers. Non-carbonate hardness is caused primarily by calcium and magnesium nitrates, chlorides, and sulfates.
Hardness is removed from water systems by precipitation or ion exchange. The treatment method varies depending on the relative amounts of carbonate vs. non-carbonate hardness.
The amount of carbonate vs. non-carbonate hardness can be found by measuring alkalinity. If the alkalinity is equal to or greater than the hardness, all of the hardness is carbonate. Any excess hardness is non-carbonate hardness.
Hardness is typically reported in terms of mg/L as CaCO3 or gpg as CaCO3. Because alkalinity is also reported as CaCO3, the results of the two tests can be compared directly.
Hardness is most commonly measured by titration with an EDTA solution. A titration involves adding small amounts of a solution to a water sample until the sample changes color. You can titrate a sample for total hardness using a buret or test kit. You can also measure calcium hardness separately from magnesium hardness by adjusting the pH and using different indicators.
Hach Drop Count Test Kits for total hardness use a dropper to add the EDTA solution to the sample. Test kit model HA-71A, which uses ManVer indicator, works best for natural water samples, especially when iron or manganese is present, or when alkalinity is high. Test kit models 5-B, 5-EP, and 5-EP/MG-L, which use UniVer reagent, work best for industrial samples that may have high concentrations of metals such as copper. Other kits are available for measuring calcium and magnesium hardness separately.
Kits using the Digital Titrator can measure hardness concentrations more accurately than drop count titration kits. This is because the Digital Titrator dispenses the EDTA solution in very small increments. Kits using the Digital Titrator use the ManVer indicator.
Test strips are also available for measuring hardness. A color develops on the strip and the strip is matched to a chart. The chart shows colors for concentrations of 0, 25, 50, 120, 250, and 425 ppm, or 0, 1.5, 3, 7, 15, and 25 gpg. Use test strips when a general range for hardness is sufficient. Test strips should not be used when an exact hardness concentration is required.
When you need to measure hardness in extremely soft water, where the concentration is expected to be less than 4 mg/L as CaCO3, use a colorimeter such as the DR/820, DR/850, or DR/890 Colorimeter, or a spectrophotometer such as the DR/2400, DR/2500, or DR/4000 Spectrophotometer.
Calcium can also be measured using an ion-selective electrode, such as the model ISE25Ca Calcium Electrode made by Radiometer Analytical. An electrode is the best method to use when color or turbidity in the sample interferes with titration or colorimetric methods.
Hach Company also makes several online Hardness Analyzers for continuous hardness monitoring such as the model SP 510 and APA6000 Hardness Analyzers. These instruments can activate alarms or pumps when the hardness concentration reaches selected concentrations.
There is no universal agreement on what exact concentrations are considered hard or soft. The following table shows the classifications used by the U.S. Department of Interior and Water Quality Association. Other organizations may use slightly different classifications.
Classification | mg/L | gpg |
Soft | 0 - 17 | 0 - 1 |
Slightly hard | 17 - 60 | 1 - 3.5 |
Moderately hard | 60 - 120 | 3.5 - 7.0 |
Hard | 120 - 180 | 7.0 - 10.5 |
Very Hard | >180 | >10.5 |
The hardness concentrations shown above are in terms of mg/L or gpg as CaCO3.
This is an indication that your sample contains an interfering metal. Dilute your sample and repeat the test. This may dilute the interfering metal to a level at which it does not interfere.
If this helps, multiply your result by the dilution factor. If this does not help, follow the suggestions in the interference section of your procedure or in the Water Analysis Handbook.
If the sample contains high levels of calcium (>200 mg/L as CaCO3), calcium carbonate will precipitate due to carbonate in the UniVer3 reagent. The sample will turn blue but then slowly fade back to pink as the calcium carbonate slowly redissolves.
Keep adding drops of the Hardness 3 solution until the sample remains blue. You can also start over and add several drops of the Hardness 3 titrant before adding the UniVer3. Include the initial drops when counting the total number of drops.
If there is hardness in your sample, the color after adding UniVer3 should be pink or red. Iron and manganese can interfere with the UniVer3 and cause the pink-brown color, and also interfere with the endpoint color.
Repeat the test with a fresh sample, but add 1-2 drops of the Hardness 3 titrant first, then add the UniVer3, and then continue adding the drops of the Hardness 3 solution. The titrant will complex the iron and manganese so that they will not interfere with the reagent. Include the initial drops when determining the hardness concentration.
Using Normality (equivalents/L):
At endpoint, equivalents of EDTA = equivalents of CaCO3
Equivalents/L of CaCO3 = conc (g/L)/equiv. weight
The equivalent weight of CaCO3 = ½ MW CaCO3 = 50 g/equivalent (In a complex-formation reaction, the equiv weight is that weight which reacts with or provides one mole of the reacting cation if it is univalent, or one-half mole if divalent, one-third mole if trivalent, etc.)
Each mL EDTA used = 0.00002 equivalents:
0.001 L x 0.02 equiv. EDTA/L = 0.00002 equiv. EDTA = equiv. CaCO3
Convert to mg CaCO3 and divide by sample volume to get sample concentration:
0.00002 equiv. CaCO3 x 50 g/equiv. = 0.001 g or 1 mg CaCO3
Example: Sample volume is 50 mL: 1 mg/0.05 L = 20 mg/L
Therefore 1 mL of 0.02 N EDTA = 20 mg/L CaCO3 if a 50 mL sample volume is taken; the multiplier is 20.
Using Molarity and a 0.02 N EDTA solution:
0.020 N EDTA = 0.01 M EDTA
At endpoint, moles EDTA = moles Ca or CaCO3
Each mL EDTA used = 0.00001 moles CaCO3:
0.001 L x 0.01 mol EDTA/L = 0.00001 moles EDTA = moles Ca or CaCO3
Convert to mg CaCO3 and divide by sample volume to get sample concentration:
0.00001 mol CaCO3 x 100 g CaCO3/mol = 0.001 g or 1 mg CaCO3
Example: Sample volume is 50 mL: 1 mg/0.05 L = 20 mg/L
Therefore 1 mL of 0.02 N EDTA = 20 mg/L CaCO3 if a 50 mL sample volume is taken; the multiplier is 20.
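The whole derivation reduces to one line of arithmetic; a minimal Python sketch:

```python
def hardness_mg_l_caco3(titrant_ml, edta_normality, sample_ml):
    """mg/L as CaCO3 = mL EDTA x N x 50 g/equiv x 1000 mg/g / mL sample."""
    return titrant_ml * edta_normality * 50.0 * 1000.0 / sample_ml

# Example from the derivation above: 1 mL of 0.02 N EDTA on a 50 mL sample.
print(hardness_mg_l_caco3(1.0, 0.02, 50.0))  # 20.0 mg/L as CaCO3 (multiplier = 20)
```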
The direct replacement for UniVer2 is UniVer3 Hardness Reagent. The UniVer3 reagent is almost the same as UniVer2 but has more of the indicator. Therefore the pink and blue colors may be more intense, but this will not change the endpoint of the titration.
To replace Catalog No. 27801, order Catalog No. 2883702. This consists of a bottle of UniVer3 and a scoop. There is no direct replacement for UniVer2 packaged in powder pillows (Catalog No. 85068), but you can use the UniVer3 in a bottle with the scoop (Catalog No. 2883702).
If you prefer powder pillows, use ManVer2 powder pillows (Catalog No. 85199) with Hardness 1 Buffer Solution (Catalog No. 42432). Follow the total hardness procedure in the Hach Water Analysis Handbook, which is available in the Download Documentation area.
The RegeneVer Hardness Reagent (Catalog No. 21053 and 21032H) included a buffer, indicator, and titrant all in one solution. When added to a sample, a red color would form if the sample contained more than 1 gpg hardness (as CaCO3). If red, more RegeneVer was added until the sample turned blue. Each drop was equivalent to 1 gpg hardness. This reagent was discontinued in January 2004.
Other hardness reagents and test kits are available that can be used in place of RegeneVer. The 5-B and 5-EP test kits (Catalog No. 145300 and 145400) use UniVer3 reagent and a titrant solution. If the sample turns blue after adding UniVer3, the water is soft. If the sample turns red, add drops of the titrant until the sample turns blue to find the hardness concentration.
If you are only interested in finding whether the sample turns red or blue, add the UniVer3 Reagent to the sample and observe the color. Use the following items when a titration is not desired:
Sample Measuring Tube (5.8 mL): (Cat. No. 43800)
Sample Bottle: (Cat. No. 43906)
UniVer3 Reagent: Powder pillows: (Cat. No. 96299) OR bulk reagent with scoop: (Cat. No. 21320H and Cat. No. 51100).
If you prefer liquid reagents, add 3 drops of Hardness 1 and 1 drop of Hardness 2 solution (Catalog No. 42432 and 42532) to one measuring tube full of sample. If the sample turns red, add drops of Hardness 3 solution (Catalog No. 42632) to find the hardness concentration.
The colorimetric hardness test using calmagite indicator, Hach Method 8030, was designed for ultra pure water or water that has been softened to remove hardness. This method was designed to measure low levels of hardness using a colorimeter or spectrophotometer at concentrations less than 4 mg/L (as CaCO3).
When the hardness concentration is well above 4 mg/L, this method may indicate an incorrect concentration. To see if this is the case, measure the hardness in your sample by titration using a test kit or buret. If the titration indicates that the hardness is much higher than 4 mg/L, your result using the colorimetric test may be incorrect. In this case you can dilute your sample using good quality deionized water to obtain a correct result with the colorimetric method. Then multiply the displayed result by the dilution factor.
Most natural water samples have hardness levels that are many times higher than 4 mg/L as CaCO3. These samples should be measured using a titration method.
The sample is acidified and the reaction is catalyzed by the addition of the Ammonium Molybdate solution. The Sulfite 1 reagent is added to provide iodide and starch. The hydrogen peroxide is reduced by the iodide to produce water and free iodine.
Sodium thiosulfate is then used to titrate the iodine to a colorless end point. Starch enhances the determination of the end point by producing a color change from dark blue to colorless. The amount of hydrogen peroxide in the sample is calculated from the quantity of sodium thiosulfate added.
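Assuming the standard iodometric stoichiometry (one mole of H2O2 liberates one mole of I2, which consumes two moles of thiosulfate), the calculation looks like the following sketch; the titrant normality and volumes shown are hypothetical:

```python
def h2o2_mg_l(titrant_ml, thiosulfate_normality, sample_ml):
    """mg/L H2O2 = mL titrant x N x 17.01 x 1000 / mL sample.
    17.01 g/equiv is the equivalent weight of H2O2 (34.01 / 2), since
    1 mol H2O2 -> 1 mol I2 -> 2 mol thiosulfate."""
    return titrant_ml * thiosulfate_normality * 17.01 * 1000.0 / sample_ml

# Hypothetical example: 5.0 mL of 0.1 N sodium thiosulfate on a 50 mL sample.
print(h2o2_mg_l(5.0, 0.1, 50.0))  # 170.1 mg/L H2O2
```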
Ferrous iron is not stable when exposed to the atmosphere and needs to be measured as soon as possible after sampling. Use Hach Method 8146, which uses the 1,10 phenanthroline chemistry with one powdered reagent.
Add the reagent to a portion of sample and measure the concentration using a test kit, colorimeter, or spectrophotometer. Do not digest the sample.
Hach does not have a single method for testing ferric iron, however you can determine ferric iron by measuring total iron and ferrous iron (Fe2+). Then subtract the ferrous iron result from the total iron result:
ferric iron = total iron - ferrous iron
Use the BART (Biological Activity Reaction Test) tester for iron-related bacteria. Remove the cap from the BART tester, add your sample, incubate, and look for a reaction.
A chart included with the tester describes what reactions to look for. The approximate bacterial population in cfu/mL (colony forming units per milliliter of sample) can be estimated from the number of days it takes before a reaction appears.
The FerroVer, FerroZine, and TPTZ methods for iron differ primarily in sensitivity. FerroZine is the most sensitive, then TPTZ, and then FerroVer. FerroVer is the only USEPA-accepted method for reporting total iron (note: the sample must be digested, and a colorimeter or spectrophotometer must be used when reporting).
Both the FerroVer and TPTZ methods react with iron that is sequestered with complexing agents such as EDTA, but FerroZine will not. The FerroVer method is the simplest to use and reacts with all but the most resistant forms of iron in the sample. The FerroZine method uses a liquid reagent that has a strong sulfur smell.
Iron can exist in water in several different forms: ferrous iron (Fe2+), ferric iron (Fe3+), complexed iron, in which iron is bound with various chemicals such as EDTA, and iron oxides, such as rust.
Total iron includes all of these forms. Ferrous iron is soluble in water and is commonly found in ground water. Once exposed to air, ferrous iron quickly oxidizes to ferric iron.
Digestion is the only way to know with certainty that you are measuring all of the iron in the sample. If a sample is not digested, however, the FerroVer reagent will still measure all of the soluble forms and most of the insoluble forms of iron in the sample.
To be sure you are measuring all of the iron in a sample, digest a sample and compare the results with an undigested portion of the same sample. If there is no difference between the digested and undigested results, you can be reasonably sure that a digestion will not change your results for total iron. However, if you are following USEPA requirements and reporting your results to a regulatory agency, you must digest every sample.
Ferrous iron (Fe2+) standard solutions are not stable because they easily react with oxygen and are converted to ferric iron (Fe3+). A total iron test will confirm that some of the ferrous iron has oxidized to the ferric state.
Adding a reducing agent such as RoVer Rust Remover to the ferrous iron standard will keep the iron in the ferrous state. Add 0.1 g of RoVer to 25 mL ferrous iron standard to be sure the standard remains in the ferrous iron state.
In general, testing for bacteria involves incubating a sample in a broth or agar to promote the growth of the organisms of interest and looking for a reaction that is characteristic of the organisms. There are several methods that can be used to do this: P/A (presence/absence), MPN (Most Probable Number), MF (membrane filtration), and HPC (heterotrophic plate count).
HPC measures overall bacteria populations in the water, while the other methods measure indicator organisms such as coliforms. P/A is the simplest method to use and involves adding sample to a bottle containing broth and incubating for 24-48 hours. The color of the broth will change if the indicator organism is present. P/A is USEPA accepted for the reporting of coliforms in drinking water samples.
MPN involves adding sample or diluted sample to 10 or 15 tubes and incubating for 24-48 hours. Gas production and/or cloudiness will be visible if the indicator organism is present. MPN is typically used for wastewater or turbid samples and uses a table based on the number of positive tubes to report results as MPN per 100 mL of sample.
MF involves filtering the sample or the diluted sample, transferring the filter to a petri dish and incubating for 24-48 hours. If the indicator organism is present, colonies will appear on the membrane and can be counted. Results are reported as the number of colonies per 100 mL of sample.
Positive results for all methods typically require confirmation in a different broth and sometimes a different incubation temperature. The m-coliBlue24 method is a widely used MF method for total coliforms and E. coli because it does not require confirmation and has a maximum 24-hour incubation period.
For more information on bacteria testing see Literature #7015 and 7047, "The Use of Indicator Organisms to Assess Public Water Safety" and the "Microbiological Laboratory Start-Up Guide" in the Learning Library.
The m-ColiBlue24 test shows positive results for total coliforms and E. coli. While E. coli results may be the best coliform indicator of fecal contamination, m-ColiBlue24 media cannot be used for testing of all fecal coliforms. Use m-FC broth when you are required to test fecal coliforms.
A constant temperature must be maintained to provide proper conditions for growth of target organisms and to suppress growth of non-target organisms. If the temperature is too high, the organisms may die. If the temperature is too low, the organisms may not grow well and non-target organisms may proliferate. An incubator that maintains a constant temperature is the only way to ensure accurate results.
The PathoScreen media for hydrogen sulfide producing bacteria and BART testers for various bacteria are the only tests that do not have strict temperature requirements. The BART testers can be incubated at room temperature and the PathoScreen can be incubated at 25 to 35 degrees Celsius.
Microbiological tests are often conducted in two stages. The first stage, called the presumptive test, uses media that allows stressed or injured organisms to grow. The media will indicate by color change, turbidity, or gas formation if the target organism is present. However, a non-target organism can sometimes interfere and cause a false positive by also changing the color of the media or forming turbidity or gas.
To be sure that a positive result from the presumptive test is indeed a target organism (such as a coliform bacteria), confirm the result using confirmation media. The media used for confirmation is very restrictive to non-target organisms and contains nutrients only for the target organisms. If the target organisms are stressed or injured, they may not grow in the confirmation media and will lead to false negatives. That is why the presumptive media is always used first.
Some media, such as the m-coliBlue24 broth for total coliforms and E. coli, and broth containing MUG do not require confirmation because the non-target organisms do not interfere.
Use water that is buffered to a neutral pH and sterilized for microbiological testing. Hach dilution water containing magnesium chloride and potassium phosphate is recommended for dilution of most nonpotable and wastewater samples. This solution is packaged in 99 mL bottles for convenience (Catalog No. 1430598). Add 11 mL of sample to a bottle and mix. This will result in a 10-fold dilution. See the Hach procedure for the test that you are using for more specific instructions.
You can also prepare this solution from magnesium chloride and potassium phosphate powder pillows (Catalog No. 2143166). This item contains 25 magnesium chloride pillows (large) and 25 potassium phosphate pillows (small). Add one pillow of magnesium chloride and one pillow of potassium phosphate to one liter of distilled water and autoclave for 15 minutes.
Use Butterfield's Buffered Phosphate Diluent ready-made (Catalog No. 2319109, 2319125, 2319145, or 2319110) or in powder form (Catalog No. 2323668) when testing food samples, or whenever your procedure specifies this water for dilution.
Use Peptone Dilution Water (Catalog No. 2142964) if you are following a Standard Methods procedure that specifies it. Peptone contains nitrogen, which helps enrich organisms that are stressed.
The nitrate cadmium reduction method is very dependent on how the sample is shaken. When the sample is shaken, cadmium particles in the reagent reduce some of the nitrate to nitrite. The nitrite then reacts with the indicator to form a color.
If the sample is not shaken hard enough, not enough nitrate will be reduced, and results will be low. If the sample is shaken too hard, results will be high. You must be consistent when shaking each sample after adding the NitraVer5 or NitraVer6 reagent.
The best way to find whether you are getting correct results is to use a nitrate standard solution in place of your sample, and see if you get the correct result. You may need to adjust your shaking technique if your results are higher or lower than the standard concentration.
Be sure to measure a reagent blank and subtract this reading from your test results. The medium and high range nitrate methods especially have high reagent blank values.
Make sure you are adding both NitraVer6 and NitriVer3 to your sample. The NitraVer6 reduces the nitrate to nitrite, and the NitriVer3 reacts with the nitrite to form a pink color.
Use a nitrate standard solution in place of your sample to make sure you can get accurate results on a solution of known concentration.
The high range nitrate test using the NitraVer5 reagent should not be used for measuring low concentrations; it is not sensitive enough and will show poor reproducibility at low concentrations.
You will get the best results with the low range nitrate test using the NitraVer6 and NitriVer3 reagents.
This is likely due to the form of nitrate that is displayed on your Hach colorimeter or spectrophotometer. The default form is NO3-N, which is nitrate as nitrogen, and is a common way to report results to regulatory agencies. However, Hach instruments also display results as NO3. The same amount of color is measured in the test regardless of which form is used, but when this color is converted to concentration, different conversions are used for the different forms.
The conversion to mg/L NO3-N uses the weight of nitrogen, whereas the conversion to mg/L NO3 uses the weight of nitrate, which is 4.4 times the weight of nitrogen. To change the displayed form on your instrument, press the CONC key on DR/800 colorimeters or select options on your spectrophotometer. Be sure to record the displayed form with your concentration value for consistency.
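The conversion between the two display forms is a fixed weight ratio (NO3 = 62 g/mol, N = 14 g/mol); a minimal Python sketch:

```python
def no3_n_to_no3(no3_n_mg_l):
    """mg/L NO3 = mg/L NO3-N x (62 / 14), i.e. about 4.4x."""
    return no3_n_mg_l * 62.0 / 14.0

def no3_to_no3_n(no3_mg_l):
    """mg/L NO3-N = mg/L NO3 x (14 / 62)."""
    return no3_mg_l * 14.0 / 62.0

print(round(no3_n_to_no3(10.0), 1))  # ~44.3 mg/L as NO3
print(round(no3_to_no3_n(44.3), 1))  # ~10.0 mg/L as NO3-N
```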
Use information on the Material Safety Data Sheet and follow hazardous waste regulations for disposal of cadmium waste from this test. Federal regulations may be superseded by state and local regulations, so it is important to check with a regulatory agency in your area for proper disposal instructions.
ORP probes cannot be calibrated, but they can be checked to see if they give the correct reading in a solution with a known ORP value.
Open a Light's solution ampule (use ampule breaker 25640-00) and place it in a 50 mL beaker. Use a water bath or incubator to make sure the solution is at 25 °C. Place the ORP probe in the solution and wait 5-10 minutes for the reading to stabilize.
The ORP reading should be 475 (+/- 10) mV. If the reading is outside of this range, try conditioning the probe in a 1:1 nitric acid solution for 15 minutes, rinsing with deionized water, and repeating the test.
If regulatory reporting is not required, pH measurements can be made using pocket pH testers, test kits, colorimeters, or test strips. Choose the method that best matches your accuracy requirements and ease of use.
A pocket tester should be used when the sample has color or is visibly turbid. Pocket testers can measure across the full pH range and give results to the nearest 0.1 pH unit; however, they must be calibrated regularly using pH buffers.
Test kits and colorimeters give an accuracy of 0.1 to 0.5 pH units in a limited pH range, and test strips give semi-quantitative results. Oxidants such as chlorine can interfere with these tests. Add one drop of 0.1 N sodium thiosulfate before adding the pH indicator to remove chlorine interference.
In the Accuracy Check section of the Phosphonate procedure (Hach Method 8007), a 1 mg/L phosphate standard solution is used to check the instrument calibration. Digestion is not necessary for this verification, and the standard does not need to be diluted.
Because you are not digesting or diluting the standard, you can simply begin with 10 mL of the standard, add the PhosVer3, and read the result. The displayed result will be 10 times higher than the standard concentration because a factor of 10 is programmed into the instrument calibration. This factor allows the range of the test to go to 125 mg/L.
The factor of 10 assumes that a 10-fold dilution was made (5 mL diluted to 50 mL); if you use the 5 mL volume suggested in Table 1, no multiplication factor is needed from Table 2 (factor = 1). You could alternatively follow the procedure using 50 mL of standard (the volume suggested in Table 1 for a 1 mg/L concentration). From Table 2, for 50 mL you will multiply the displayed reading by 0.1, which adjusts for the factor of 10 programmed into the instrument calibration.
The phosphonate test converts all phosphonate compounds into phosphate, and then measures the increase in phosphate due to the phosphonates. The conversion factor for a particular phosphonate is determined from the molecular weight of the phosphonate relative to phosphate, and the number of PO4 (or PO3) groups in the phosphonate molecule:
Example:
What is the conversion factor for MDTP (2-methylene diamine tetrakis-(Methylene Phosphonic Acid))?
The molecular weight (MW) of MDTP is 490 g/mole
The molecular weight of phosphate is 95 g/mole
There are 4 PO3 groups per molecule of MDTP
The conversion factor is: 490/(95 x 4) = 1.29
In this example, you would multiply the result displayed on your instrument by 1.29 to find the concentration of MDTP in your sample.
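A minimal Python sketch of this conversion-factor calculation, using the MDTP numbers from the example above (the 2.5 mg/L reading in the usage line is hypothetical):

```python
def phosphonate_factor(phosphonate_mw, phosphonate_groups, po4_mw=95.0):
    """Factor = MW of phosphonate / (MW of PO4 x number of PO4 (or PO3) groups)."""
    return phosphonate_mw / (po4_mw * phosphonate_groups)

factor = phosphonate_factor(490.0, 4)  # MDTP: MW 490 g/mol, 4 phosphonate groups
print(round(factor, 2))                # 1.29
print(round(2.5 * factor, 2))          # hypothetical 2.5 mg/L reading -> 3.22 mg/L MDTP
```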
Phosphorus exists in water almost solely as phosphates, which can be dissolved, attached to particles, or found in aquatic organisms. Phosphates can exist in simplest form as orthophosphate (PO43-), or in larger molecules as condensed phosphates or organic phosphates.
Orthophosphate is often referred to as reactive phosphorus because it is the only type of phosphorus that will react directly with colorimetric phosphate reagents. This type of phosphorus is used by plants, bacteria, and algae and is considered a limiting nutrient for surface waters such as lakes. Fertilizers contain orthophosphate and can contribute significant levels to waters via agricultural runoff.
Condensed phosphates (also called meta-, pyro-, or polyphosphates) are two or more orthophosphate groups linked together. They are strong complexing agents, widely used in boiler water treatment systems and in many detergents. To measure condensed phosphates, follow a procedure for acid hydrolyzable phosphorus. This involves heating the sample with strong acid to break the condensed phosphates down into simple orthophosphate molecules. An acid hydrolyzable test will measure both condensed and reactive phosphorus:
condensed phosphates = acid hydrolyzable - orthophosphate
Organic phosphates contain one or more orthophosphate groups that are attached to an organic molecule such as sugar. They are formed primarily by biological processes and can be found in organic matter such as plant or animal tissue, in sewage from animal or human waste and food residues, as well as in pesticides. To measure organic phosphates, follow a procedure for total phosphorus. This involves heating the sample with strong acid and a strong oxidizer such as persulfate to convert the organic phosphates into orthophosphate molecules. This test will measure organic, condensed, and reactive phosphorus:
organic phosphates = total - acid hydrolyzable
Note: neither condensed phosphates nor organic phosphates are as stable as orthophosphate; both break down naturally into orthophosphate over time. An orthophosphate test will therefore likely measure a small amount of condensed phosphates, and an acid-hydrolyzable test will measure a small amount of organic phosphates.
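Expressed as a minimal sketch with hypothetical example readings, the two difference formulas give:

    # Hypothetical measured values, all in mg/L
    ortho = 0.8               # reactive (orthophosphate) test
    acid_hydrolyzable = 1.5   # acid hydrolyzable test
    total = 2.1               # total phosphorus test

    condensed = acid_hydrolyzable - ortho   # 0.7 mg/L condensed phosphates
    organic = total - acid_hydrolyzable     # 0.6 mg/L organic phosphates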
Phosphorus levels in natural waters such as lakes and streams are typically very low, less than 0.05 mg/L as P. Higher phosphorus levels reflect contributions from raw or treated wastewater, agricultural drainage, or industrial waste. Some drinking water plants also add small amounts of orthophosphate or condensed phosphates during treatment.
Orthophosphate is the simplest phosphorus form to measure, but total phosphorus is considered the best indicator of phosphorus levels in water because it measures all three forms. If you are required to measure and report phosphorus levels to a regulatory agency, we advise that you check with your regulator to determine whether you are required to report ortho, condensed, or total phosphorus levels.
It is important to be sure that the chemical form and units are the same when comparing results. Hach instruments and kits report phosphorus levels as mg/L PO43-. Outside testing labs often report phosphorus levels as mg/L P, which is approximately one-third of the corresponding PO43- concentration.
The same amount of color is measured in the test regardless of which form is used, but when that color is converted to a concentration, a different conversion applies to each form. The conversion to mg/L PO43- uses the weight of PO4, whereas the conversion to mg/L P uses the weight of P, which is approximately one-third the weight of PO4. Therefore a value of 3 mg/L PO43- is equivalent to approximately 1 mg/L P. (Actual conversion: divide mg/L PO43- by 3.07 to get the result as mg/L P.)
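The conversion both ways, as a small Python sketch using the 3.07 factor from above:

    PO4_TO_P = 3.07   # weight ratio of PO4 to P

    def po4_to_p(mg_l_po4):
        return mg_l_po4 / PO4_TO_P

    def p_to_po4(mg_l_p):
        return mg_l_p * PO4_TO_P

    print(round(po4_to_p(3.0), 2))   # ~0.98 mg/L P for 3 mg/L PO43-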
The choice of chemical form is somewhat arbitrary and varies by convention, but all forms refer to the same underlying result. Most Hach colorimeters and spectrophotometers display PO43- as the default form for phosphorus, but other forms can be selected from the keypad or touch screen.
When comparing results, confirm that you are using the same concentration units and chemical form, and that you are following the correct procedure for the program used on your instrument. As always, it is best to start with a standard solution to verify that you can get the correct result on a solution of known value, and to make certain that the units and form of the standard match the units and form of your result.
Looking for information about Hach solutions for this parameter?
The HR silica test is sensitive to the wavelength setting. If the wavelength setting of your instrument is slightly different from what it should be, your results may be slightly lower or higher than actual.
First be sure that your standard is prepared accurately, and that you are following the procedure exactly. If your results are still low or high, use the standard adjust feature in your instrument to adjust the instrument to read the correct standard concentration.
If using a DR/2000 or DR/2010 Spectrophotometer, adjust the wavelength calibration as described in the lamp replacement section of the instrument manual, and repeat the test. If the results are still low (or high), try replacing the lamp, and repeat the wavelength calibration adjustment.
Looking for information about Hach solutions for this parameter?
The filter paper in the rapid silver kit is divided by blue-colored pieces of plasticine to keep the filters from sticking together. The white pieces are the filters; the blue pieces can be discarded.
This test responds to synthetic detergents such as laundry detergents, dish detergents, and car wash detergents. Although the term surfactants refers to compounds that lower the surface tension of water and includes both soaps and detergents, the Hach surfactants test does not respond to soaps. Soaps are salts of fatty acids, whereas detergents are salts of sulfonic acids. The Hach test responds primarily to LAS detergents (linear alkylate sulfonate) and ABS (alkyl benzene sulfonate) detergents.
The molecular weight of the LAS standard solution, Cat. No. 14271-10, is 342 g/mol.
Be certain that the standards are fresh and you are following the calibration instructions correctly. If this is the case, and the unit will still not calibrate, it may be time to replace the unit.
Looking for information about Hach solutions for this parameter?
The TKN indicator changes color at a certain pH. You can measure the pH of the sample, and then add the 8.0 N KOH until the sample reaches a pH of about 3 rather than looking for the blue color. Then add the 1.0 N KOH until the pH is between 4 and 6, and continue with the rest of the test.
Using an ammonia standard is a good way to check whether your instrument, reagents, and technique are working correctly for the TKN test. Two dilutions are factored into the calibration curve. In addition, the dilution of the standard needs to be taken into account.
The calibration curve for TKN includes a dilution factor of 33.3. This incorporates an assumed 25 in 100 mL dilution (4-fold) from the Digesdahl digestion as well as an assumed 3 in 25 mL dilution (8.33-fold) in the analysis procedure (4 x 8.33 = 33.3).
The calculation in the last step adjusts for dilutions other than these.
If 25 mL of the 1 mg/L NH3-N standard solution were used without digestion, the expected displayed concentration would be 33.3. However, only 20 mL of the standard is put into the mixing cylinder, which is then diluted to 25 mL. This reduces the expected displayed result from 33.3 to 26.7: 1 mg/L NH3-N x 33.3 x 20 mL/25 mL = 26.7 mg/L.
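The dilution bookkeeping, collected into a short Python sketch (all numbers come from the paragraphs above):

    CURVE_FACTOR = 4 * (25 / 3)        # 33.3, built into the TKN calibration curve

    standard_mg_l = 1.0                # NH3-N standard concentration
    taken_ml, diluted_to_ml = 20, 25   # portion placed in the mixing cylinder
    expected = standard_mg_l * CURVE_FACTOR * taken_ml / diluted_to_ml
    print(round(expected, 1))          # 26.7 mg/L expected on the display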
The expected result using a TKN standard depends on whether you add the standard (solid) directly to the digestion flask or whether you prepare a solution from the standard and digest a portion of the solution.
From a solid digestion (solid standard added directly to digestion flask), the result after the calculation in the last step of the procedure should be:
Ammonium PTSA: 74,020 ppm
Glycine PTSA: 56,640 ppm
Nicotinic Acid PTSA: 47,430 ppm
From a liquid digestion (solution prepared from 10 g standard diluted to 1 liter), the result after the calculation in the last step of the procedure should be:
Ammonium PTSA standard: 740 ppm
Glycine PTSA standard: 566 ppm
Nicotinic Acid PTSA standard: 474 ppm
The results of the liquid digestion will differ if an amount other than 10 g was used to make the solution.
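A minimal sketch of this scaling, assuming the solid-digestion values listed above (the dictionary keys and function name are illustrative):

    # Theoretical solid-digestion results from the list above, in ppm
    SOLID_PPM = {"ammonium": 74020, "glycine": 56640, "nicotinic acid": 47430}

    def liquid_ppm(standard, grams_per_liter):
        # 10 g diluted to 1 L gives 1/100 of the solid-digestion value
        return SOLID_PPM[standard] * grams_per_liter / 1000.0

    print(round(liquid_ppm("glycine", 10.0)))   # 566 ppm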
If the actual result differs from the theoretical result by more than 10%, follow the standard solution method in the TKN test procedure using a 1 mg/L ammonia standard solution. If the results using the ammonia standard are correct, there may be a problem with the digestion procedure.
Organic-free water must be used for the blank vial in the TOC test; deionized water can have low concentrations of organic matter that will cause the blank to read high and cause low test results.
Not using organic-free water is one of the most common problems people have with this test. Organic-free water should be packaged in glass bottles; water packaged in plastic containers picks up organic contamination and causes low results.
The blank in the TOC method must be digested. The heat during digestion causes a slight degradation of the indicator, and the persulfate causes a TOC background level that needs to be compensated for.
The blank does not need to be sparged if using the Hach organic free water as instructed, but it can be sparged if CO2 contamination is suspected. CO2 contamination will occur if the blank water is exposed to the atmosphere.
Seawater or brine samples cannot be measured with the Hach TOC method due to the high levels of chloride in these samples.
Chloride interferes with LR TOC at 500 mg/L, MR at 1500 mg/L, and HR at 5000 mg/L by generating chlorine gas during the digestion and bleaching the indicator, which gives lower than actual results. Seawater contains about 19,000 mg/L chloride.
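For convenience, the chloride limits above as a quick lookup sketch (the LR/MR/HR labels are shorthand for the three test ranges, not instrument program names):

    CHLORIDE_LIMIT_MG_L = {"LR": 500, "MR": 1500, "HR": 5000}

    def chloride_interferes(test_range, chloride_mg_l):
        return chloride_mg_l >= CHLORIDE_LIMIT_MG_L[test_range]

    print(chloride_interferes("HR", 19000))   # True: seawater is far above the limit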
A good bottle for collecting TOC samples is Cat. No. 2794005. This is a glass 40-mL vial that is precleaned to USEPA standards for volatile organics analyses.
If immediate analysis is not possible, cool the sample to 4 °C and fill the container so there is no headspace. Cooled samples can be stored for several days. Acid preservation is not recommended because it results in poor digestion efficiency and low recovery.
For the best accuracy, however, samples should be analyzed as soon as possible to prevent consumption of organic carbon by bacteria. Bacteria consume some organic molecules such as glucose much faster than others.
TOC results from an outside lab that uses an analysis method such as UV persulfate can differ from the Hach method for a couple of reasons. The UV persulfate method is often less efficient at digesting compounds than the colorimetric method and therefore can give lower results.
Particles can also cause discrepancies, because the particles often settle out by the time the sample is measured at outside labs. When samples are first filtered before being split and sent to an outside lab, results tend to agree much more closely.
There is no procedure for measuring MR TOC on the DR/2000 spectrophotometer. You will need to enter a user calibration into the instrument using TOC standard solutions.
Looking for information about Hach solutions for this parameter?
Turbidity is a measure of the clarity of water. Water that has a very high turbidity will appear cloudy or opaque while water with very low turbidity will appear clear or translucent. Turbidity is caused by particles such as silt, clay, microorganisms, and organic matter.
Turbidity is not a direct measure of these particles but rather a measure of how these particles scatter light. The American Public Health Association defines turbidity as the "expression of the optical property that causes light to be scattered and absorbed rather than transmitted in straight lines through the sample."
For more information, see the Turbidity Science 'bluebook' technical bulletin, Literature No. 7061.
In drinking water, turbidity can indicate the presence of high bacteria levels, pathogens, or particles that can shelter harmful organisms from disinfection processes. Therefore water treatment plants constantly monitor turbidity levels to ensure that the water does not exceed safe levels.
Turbidity is also important in industrial processes or products where particulates can be detrimental to the end use, or are vital ingredients of the product. In either case, turbidity can be used as a quality control measure to monitor the efficiency of the treatment or manufacturing process.
Turbidity is measured with instruments called turbidimeters. Turbidimeters shine light through a section of water and detect how much light is scattered by particulates at a 90-degree angle from the incoming light. This type of scattered-light measurement is called nephelometric, and any true turbidity measurement must be made this way. Turbidimeters are designed for field or laboratory measurements as well as for round-the-clock monitoring, where an alarm can be set to trigger when turbidity reaches unsafe levels.
Turbidity can also be approximated in an instrument such as a colorimeter or spectrophotometer by measuring the decrease in transmitted light due to blockage by particles. This type of measurement, however, is not considered valid by regulatory agencies and does not fit the definition of turbidity by the American Public Health Association.
Transmittance measurements are also susceptible to interferences such as light absorption by color or particles. What's more, no correlation can be made between transmittance and nephelometric measurements. Nevertheless, colorimeters and spectrophotometers are sometimes used for detecting large changes in the turbidity of a water system or for process control.
Hach Company has been designing and manufacturing turbidimeters for more than 50 years, and has a great deal of expertise in turbidity measurements.
The light source used in turbidimeters is required by USEPA regulations to operate at a specified temperature. Because these lamps operate at a relatively high temperature, their output can change over time, which can affect the amount of light that reaches the detector and thus the turbidity reading.
Hach lamps are specially designed for increased stability and remain stable after factory calibration. However, USEPA regulations mandate calibration upon receipt and setup of the instrument at the test site, as well as calibration on a quarterly basis.
Hach turbidimeters are calibrated by placing one or more prepared primary standards into the instrument and then completing a series of keystrokes on the instrument keypad. Step-by-step instructions are included in each instrument manual.
The primary standard for turbidity is formazin. Formazin is a polymer solution that can be synthesized in the laboratory or purchased from Hach as a 4000-NTU standard. The 4000 NTU standard must be diluted to make several lower NTU standards immediately before calibration.
Hach StablCal® Standards are stabilized formazin standards that are recognized by the USEPA as primary formazin standards. They are widely used for turbidimeter calibration due to their convenience and ease of use. StablCal standards come in varying NTU concentrations and do not require dilution or any preparation other than mixing. All standards, except for the dilution water (<0.1 NTU), must be gently inverted immediately before calibration to ensure that the formazin polymer is suspended. Wait a minute or two to allow entrained gases to escape the solution. Never agitate or invert the dilution water standard.
Although it may seem more accurate to calibrate with low-level standards, there are several practical reasons why this does not work as intended and leads to inaccurate calibration.
Contamination is a significant concern at low turbidity levels. The calibration standard can easily become contaminated from particles in the air, sample cells, or dilution water. If the dilution water used to prepare a 0.1 NTU standard had a turbidity of 0.04 NTU, for instance, the actual concentration would be higher than expected by 40 percent.
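The arithmetic behind that example, for reference:

    nominal = 0.10        # NTU, intended value of the prepared standard
    water_blank = 0.04    # NTU, turbidity of the dilution water
    error_pct = 100 * water_blank / nominal
    print(round(error_pct))   # 40 percent high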
Stray light and sample cell variation are also significant at low NTU values but are negligible at 20 NTU. If a calibration is made with low-level standards, these factors of contamination, stray light, and sample cell variation can easily skew the calibration curve and make all subsequent readings inaccurate.
On the other hand, a 20 NTU standard can be prepared accurately and reproducibly, and any turbidity contribution from contamination, stray light, or sample cell variation is negligible relative to the 20 NTU value. The zero point of the calibration curve is taken as the dark current in the instrument before the lamp is turned on, and the calibration curve is linear from 0 to 40 NTU, so calibration accuracy is maintained at lower concentrations.
After calibration with a 20 NTU standard, Hach recommends verifying the accuracy of low-level measurements using StablCal low-level standards, available down to 0.100 NTU. When meticulous care is taken to minimize contamination, your instrument should be able to read within the tolerance specified on the standard.
NTU stands for Nephelometric Turbidity Units and signifies that the instrument measures light scattered from the sample at a 90-degree angle from the incident light. FNU stands for Formazin Nephelometric Units and signifies the same 90-degree scattered-light measurement. NTU is most often used when referencing USEPA Method 180.1 or Standard Methods for the Examination of Water and Wastewater; FNU is most often used when referencing the ISO 7027 (European) turbidity method.
When formazin was initially adopted as the primary reference standard for turbidity, units of FTU or Formazin Turbidity Units were used. These units, however, do not specify how the instrument measures the sample.
FAU or Formazin Attenuation Units signify that the instrument is measuring the decrease in transmitted light through the sample at an angle of 180 degrees to the incident light. This type of measurement is often made in a spectrophotometer or colorimeter and is not considered a valid turbidity measurement by most regulatory agencies.
The turbidity units NTU, FNU, FTU, and FAU are all based on calibrations using the same formazin primary standards. Therefore, when a formazin standard is measured, the value for each of these units will be the same; the values measured on samples, however, may differ significantly.
A JTU or Jackson Turbidity Unit is a historical unit used when measurements were made visually using a Jackson Candle Turbidimeter. Water was poured into a tube until a flame underneath the tube could no longer be distinguished.
Gelex standards are secondary standards in solid or gel form that require no dilution or mixing. They provide a quick and convenient way to check if the calibration of laboratory and portable turbidimeters has changed on a daily basis.
To use, place the Gelex standard in the sample cell compartment in a defined orientation right after the turbidimeter has been calibrated with a primary standard, and record the displayed value. Then periodically (daily or weekly, for instance) place the standard back in the instrument in the same orientation to verify that the calibration has not changed significantly.
When the value of the Gelex standard differs by more than 5% from the value right after the turbidimeter was last calibrated, the turbidimeter should be calibrated again with a primary formazin or StablCal standard. Both formazin and StablCal are accepted primary standards for turbidimeter calibration.
Gelex standards should never be used to calibrate a turbidimeter, only to check the calibration after the turbidimeter has been calibrated with a primary standard.
Gelex standards, developed well before StablCal standards became available, were designed to provide a convenient way to check the calibration of turbidimeters on a daily or weekly basis. At that time, the only other way to check turbidimeter calibration was to prepare formazin standards from dilutions of a 4000 NTU standard.
Now that stabilized formazin standards are available as StablCal in sealed ampules, these standards also eliminate the need for dilution and can be used for checking the calibration of a turbidimeter on a daily or weekly basis. StablCal standards must first be inverted according to directions, however, to ensure that the formazin polymer is suspended in solution.
When measuring very clean water with very low turbidity (below about 0.2 NTU), it becomes increasingly difficult to obtain an accurate reading with a laboratory turbidimeter. This is primarily due to contamination of the sample cells and stray light in the instrument, which can be caused by imperfections on the surface of the sample cells as well as dust on the optical lenses and other internal components.
Sample cells must be cleaned meticulously with 1:1 hydrochloric acid and rinsed thoroughly with distilled or deionized water and then sample water. Sonicating the cells in a sonicating bath is also helpful for dislodging particles attached to the cell walls. Once clean, sample cells should be stored with the caps on and rinsed with deionized water immediately after sample measurement.
When contamination is removed from the sample cells and silicone oil is properly applied to minimize the effects of scratches on the outer cell walls, the turbidity readings will drop and approach those obtained from a process turbidimeter. Because process turbidimeters do not use sample cells and the detector is placed directly in the sample stream, turbidity due to contamination and stray light is minimized.
Different turbidimeters may have different light sources, photodetectors, and optical systems that can result in differences in sensitivity and linearity between instruments. In addition, the way that the zero point is set and how calibrations are done has changed over the years to improve accuracy, especially at low concentrations.
Many older turbidimeters did not incorporate a ratio mode to compensate for color interference and therefore read lower than actual on samples having color. Some older turbidimeters also had zero knobs to zero out the reading from deionized water. Modern instruments do not zero on deionized water because even the cleanest water has some turbidity; zero knobs create a false negative condition in which the instrument displays a lower than actual result.
Newer instruments use the dark current (90 degree detector current with the lamp off) reading, made when the instrument is first turned on, to set the zero point. A dilution water (or <0.1 NTU StablCal) reading is taken and used to adjust the concentration of calibration standards for greatest accuracy. Modern turbidimeters incorporate the latest advances in electronics, optical components, and design, and will have the lowest stray light interference and give the most accurate results.
Although suspended solids cause turbidity, a turbidity measurement is not the same as a measurement of suspended solids. A suspended solids measurement, as defined by the USEPA, determines the amount of solids in a sample by weight, whereas a turbidity measurement shows how the suspended solids scatter light. When the particulate makeup of the sample changes, the light-scattering characteristics of the sample may change in an unpredictable way.
If the particulate makeup of the sample is known to be consistent over time, it may be possible to use turbidity measurements to estimate the level of suspended solids. This would require making a calibration curve of suspended solids (determined gravimetrically) vs. measured turbidity values (NTU) for a series of samples with varying levels of suspended solids.
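As a sketch, such a correlation can be fit with ordinary least squares; the paired values below are hypothetical, and this is only valid if the particulate makeup stays consistent:

    # Paired measurements: turbidity (NTU) and gravimetric suspended solids (mg/L)
    ntu = [2.0, 5.0, 10.0, 20.0, 40.0]
    tss = [3.1, 7.8, 15.2, 31.0, 60.5]

    n = len(ntu)
    mean_x = sum(ntu) / n
    mean_y = sum(tss) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(ntu, tss)) / \
            sum((x - mean_x) ** 2 for x in ntu)
    intercept = mean_y - slope * mean_x

    # Estimate suspended solids from a new turbidity reading
    print(slope * 15.0 + intercept)   # mg/L estimate for a 15 NTU sample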
Even in these cases, many samples do not exhibit a linear relationship between ppm suspended solids and turbidity value. This can be caused by interferences such as color, particle shape, particle size distribution, and absorption. For example, a natural sample reading 500 NTU, when diluted fivefold with distilled water, often reads substantially more than the expected 100 NTU.
It is best to use an instrument that can measure your sample without dilution, because dilution can alter the characteristics of the suspended particles and produce erroneous results. The upper limit for the 2100AN turbidimeter is 10,000 NTU. The upper limit for the 2100N turbidimeter is 4000 NTU.
When dilution is necessary, dilute the sample with a portion of filtered sample. Do not use deionized or distilled water as dilution water, because some of the turbidity-causing particles may dissolve in it. Use the Sample Filtration and Degassing Kit (Catalog No. 4397510) to filter the sample. After dilution and measurement, multiply the displayed result by the dilution factor.
Gelex standards should never be used to calibrate a turbidimeter. They are designed to check the calibration after the turbidimeter has been calibrated with a primary standard such as StablCal.
After calibrating the turbidimeter with StablCal or formazin standards, place each Gelex vial in the sample compartment and write the displayed NTU value on each vial. Then place the Gelex vials in the turbidimeter on a regular basis and compare the reading with the value written on the vial. When the measured turbidity differs from the recorded value by more than 5 percent, calibrate the turbidimeter again using a primary standard.
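A minimal sketch of the 5 percent check (the recorded and measured values are hypothetical, and the function name is illustrative):

    def needs_recalibration(recorded_ntu, measured_ntu, tolerance_pct=5.0):
        drift_pct = abs(measured_ntu - recorded_ntu) / recorded_ntu * 100
        return drift_pct > tolerance_pct

    print(needs_recalibration(18.2, 19.4))   # True: about 6.6 percent drift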
Note: you may need to calibrate your turbidimeter more often than indicated by the reading from the Gelex vials in order to meet regulatory compliance.
Looking for information about Hach solutions for this parameter?
SUVA is a calculated parameter found by dividing a sample's UV-254 value (in cm-1) by the DOC (dissolved organic carbon, in mg/L), and then multiplying by 100.
UV-254 can be measured by following the organic constituents procedure using the DR/4000 U Spectrophotometer. DOC can be measured by filtering a sample through a 0.45 micron filter and analyzing the filtrate for TOC.
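A short sketch of the SUVA calculation (the example values are hypothetical):

    def suva(uv254_per_cm, doc_mg_l):
        # SUVA = UV-254 / DOC x 100, reported in L/(mg·m)
        return uv254_per_cm / doc_mg_l * 100

    print(round(suva(0.08, 2.5), 1))   # 3.2 for UV-254 = 0.08 cm-1, DOC = 2.5 mg/L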
The Water in Oil Reagent Vials should contain a small amount of white powder (calcium hydride) and also a few milliliters of liquid (isooctane). If the powder appears grey, or if the vials do not contain both powder and liquid, they should not be used.
The results of the Water in Oil Test Kit are expressed as the percent of water (by weight) per volume of oil sample.
There are several techniques that are helpful when using the Water in Oil Test Kit:
Before placing the reagent vial in the assembly stand, shake the vial well to coat the inside walls.
Make sure the lift tube (0-1% scale) is secure in the column cap; it should not be loose.
Make sure your sample is well mixed so that the water does not separate from the oil. The water in the sample will react with the calcium hydride and form hydrogen gas, which displaces the water in the column.
Allow the reaction to continue until no more bubbling can be observed in the reaction vial. This normally takes about 5 minutes.