Quality Assurance and Quality Control (QA/QC) protocols are essential
to WOW. We have gone to great lengths to assure the accuracy of
our data - the following sections describe these measures in detail.
What are Quality Assurance and Quality Control?
QA/QC refers to all those things good investigators do to make sure their
measurements are accurate (close to the absolute true value),
reproducible (precise; consistent), and accompanied by a good estimate of
their uncertainty. In the regulatory arena, this aspect of data collection
is as crucial to the final outcome of a confrontation as the numbers
themselves. It specifically involves following established rules in
the field and lab to assure everyone that the sample is representative
of the site, free from outside contamination by the sample collector
(no dirty hands touching the water) and that it has been analyzed following
standard QA/QC methods. This typically involves comparing the sample
to a set of known samples for estimating accuracy and
by replicating the measurement to estimate its precision.
The U.S. Environmental Protection Agency offers extensive guidance should you
wish to learn more about the technical aspects of a Quality Assurance
Program (QAP). See also the WOW Lessons about
data quality and interpretation for further information.
Conventional data quality assurance procedures follow guidelines
set by the U.S. EPA (1987; 1989a,b) and APHA (1998). Water chemistry
and manual field profiles are collected by trained staff limnologists
and technicians at both Hennepin Parks (HP under Principal Investigator/Limnologist
John Barten's supervision) and the Natural Resources Research Institute
(NRRI under Co-Principal Investigator / Limnologist Rich Axler).
Both the Hennepin Parks Water Quality Laboratory and the NRRI Central
Analytical Laboratory are certified annually by the Minnesota Department
of Health for Federal Safe Drinking Water Act and Clean Water Act
parameters (Ameel et al. 1993, 1998; Axler and Owen 1994; Archer
and Barten 1995, 1996; Barten 1997; MCWD 1997). The certification
procedure involves blind analyses of certified performance standards
and an in-depth site inspection and interview approximately every
other year. The NRRI lab has also been certified over the past decade
by the Minnesota Pollution Control Agency and the Minnesota Department
of Natural Resources for low-level water quality analyses in pristine,
acid-sensitive lake monitoring programs and for sediment contaminant
analyses in the St. Louis River and Upper Mississippi Rivers.
LAKE DATA QA/QC
There are basically two sets of environmental data that are collected
for WOW Lake sites:
(1) conventional water quality parameters such as nutrients (N- and
P-series of nutrients), chlorophyll, clarity, fecal coliform bacteria, manual
field profiles for temp, DO, EC, etc. These are based upon traditional methods
where a trained staff person records measurements at different depths from
a sensor lowered over the side of a boat and collects water from discrete depths
that are returned to the lab for analysis.
(2) remotely sensed and controlled R.U.S.S. (Remote Underwater Sampling
System) units that control the depth and sampling interval of water quality
sondes housing depth, temperature, DO, pH, EC and turbidity probes. Data may
be transmitted via cellular phone/modem to our base computer/website immediately
upon completion of a depth profile, or may be stored on board the RUSS and
downloaded less frequently (each morning, currently) to save connection costs.
RUSS QA/QC is performed at a number of levels. The sensors are either Hydrolab
H20 or YSI 6820 probe/sonde instruments; both HP and NRRI staff follow the
Instrument Manuals for calibration and maintenance procedures. Our staff also
have extensive experience with these calibration procedures and with their
importance in interpreting field data and distinguishing systematic errors
associated with deteriorating or bio-fouled probes. In 1998 and 1999 we gained
considerable experience in dealing with problems associated with continuous
sensor deployment; the resultant protocols are included in our WOW efforts.
Other aspects of the data management process are discussed in Host et al. (2000a).
NRRI contributed to the initial development of the RUSS technology.
During the preliminary and early stages of Water on the Web, numerous
tests were conducted in regard to the accuracy and precision of in-situ
data. Since both the YSI and Hydrolab systems are well established
and used for numerous state and federal monitoring programs, the
principal concerns related to the time allowed for sensor equilibration
at each depth. Of all the sensors that we use, dissolved oxygen
is most susceptible to erroneous values from inadequate stabilization,
the error being greatest in regions with steep depth-gradients in
DO. Following our collaborative work on this topic with Apprise
Technologies, Inc., the company subsequently ran a nearly yearlong
experiment in Lake Waco, Texas with Hydrolab, Inc. comparing RUSS-transmitted
data to conventional datalogger data. The data sets agreed within
sensor specifications. Both sensor companies have internal quality
control systems (YSI is ISO14001 registered) that guarantee the consistent
quality of their sensors. Apprise has worked independently with both
companies to integrate these sensor packages with their RUSS units.
As a part of these programs, the RUSS technology was independently
field-tested by both companies and both YSI and Hydrolab have audited
the Apprise facilities for QA/QC compliance. Apprise has also implemented
an internal quality system based on the ISO9000 system and has been
extremely helpful in dealing with problems that occasionally arise
with the Lake Access and WOW units. A more complete description of
our current protocols follows:
RUSS SENSOR RESOLUTION & REPORTING LIMITS
On the RUSS unit, the on-board computer processes a user-submitted
instruction sequence, the sensor package is sent to a specified depth,
and a series of feedback corrections are made until the sensors are
stabilized within 0.2 m of the specified depth. Output from the sensors
is monitored to assess when the readings on all parameters have stabilized
to a specified criterion, usually a coefficient of variation <20%
for a running set of 10 consecutive measurements over an interval
of ~1 minute. Dissolved oxygen typically requires the longest time to
stabilize, in part because this parameter often exhibits the steepest
depth gradients. Depending on the site
characteristics and the specific O2-sensor, as much as 3-5 minutes
may be required for complete equilibration. Once stabilized, readings
on all parameters are stored in buffer memory on the on-board computer.
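The stabilization check described above can be sketched in a few lines of code. This is an illustrative sketch only, not the actual on-board RUSS firmware; the function name and defaults are ours.

```python
def is_stabilized(readings, n=10, cv_limit=0.20):
    # PASS when the coefficient of variation (std/mean) of the last n
    # readings drops below cv_limit -- the ~20% criterion described above.
    if len(readings) < n:
        return False
    window = readings[-n:]
    mean = sum(window) / n
    if mean == 0:
        return False  # CV is undefined for a zero mean
    std = (sum((x - mean) ** 2 for x in window) / n) ** 0.5
    return std / abs(mean) < cv_limit
```

In practice the on-board computer would apply such a test to each parameter's running buffer and store the profile reading once every parameter passes.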
The raw data stream is a simple string of comma-delimited ASCII text
containing a time signature, depth, and parameter values (Table 1).
Table 1. Output from the Lake Access RUSS unit on Halsteds Bay, Lake
Minnetonka (EMPT2 site); conductivity is reported @ 25 C.
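A minimal parser for such a line might look like the following sketch. The column order and field names here are assumptions for illustration; the actual layout is the one shown in Table 1.

```python
# Hypothetical field order: time signature, depth, then parameter values.
FIELDS = ["timestamp", "depth_m", "temp_c", "do_mgl", "ph", "ec_us", "turb_ntu"]

def parse_reading(line):
    # Split one comma-delimited ASCII line into a {field: value} dict.
    parts = [p.strip() for p in line.split(",")]
    reading = {"timestamp": parts[0]}
    for name, raw in zip(FIELDS[1:], parts[1:]):
        reading[name] = float(raw)
    return reading
```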
To date we have set
the reporting limits for RUSS data based on instrument specifications
and prior knowledge of the magnitude of typical field variations. This
information is presented within the RUSS data section of the WOW web
site. The resolution, i.e., the smallest reading shown for a particular
parameter, is likely to be considerably lower than the error associated
with differences in time, with depth fluctuations, and with sensor
drift and calibration accuracy. Periodic examination of the RUSS data
stream with Apprise Technologies, Inc. has generally confirmed the
estimated accuracy reported below (Table 2). An important, and greatly
underestimated, element of the WOW projects has been to assess the accuracy
of these data by comparison with approximately biweekly manual profiles.
However, it is likely that the relative precision of the data between
depths within a water column profile and within a few hours to a day
will be better than from week-to-week.
Ideally, if all of the RUSS sensors behaved according to the sensor manufacturer's
specifications (Table 2) we could simply post the data on the Lake
Access web site and assume it is accurate to these levels. However,
except for temperature, all of the sensors require routine maintenance
and calibration. When using these sensors for manual profiling, that
is, visiting lake sites by boat, we always re-calibrate the pH, EC
and turbidity sensors using individual standard solutions with known
values, and the DO by air calibration. Experience has taught us that
the sensors remain stable during the course of a sampling day.
Table 2. Reporting
limits for RUSS sensor data (Hydrolab or YSI sensors)
When deployed for continuous operation, as for the RUSS unit, the sensors
are colonized gradually by a biofilm of algae and less noticeably by
bacteria and fungi as well. As this material builds up, its metabolic
activity interferes with the sensor's ability to accurately sample
the surrounding water. One can easily picture the effect of fine filaments
of algae wafting intermittently between the electrodes of the EC sensor
or in the light path of the turbidimeter giving seemingly erratic values
with wide swings as the sensors move up and down. An anomalous spike
in the Ice Lake EC data during July 1998 (see the shaded region in the Surface
Trends for Ice Lake on the WOW site) is a good example of this
effect and is the basis of a lab lesson (Increased Conductivity:
Are Culverts The Culprits?, in draft). DO and turbidity
probes are most susceptible to these changes, followed by pH and EC.
(Table 2 columns: Resolution, what is reported by the RUSS sensors, and
Accuracy, what we really trust.)
Lake Access and WOW staff set up the following protocols to minimize these
biofouling and instrument drift effects and to quality-assure the RUSS data:
* Clean and re-calibrate sensors frequently (about every 2
weeks) and perform manual profiles with an independent instrument
at the same time
* Compare independent manual profiles with near-simultaneous
RUSS data prior to cleaning (re-calibration). This provides assurance
that data from the previous period are accurate. We calculate test
statistics for each parameter as:
RPD (relative % difference) = |Manual - RUSS| / ((Manual + RUSS)/2) x 100, and
DELTA = |Manual - RUSS|.
The data PASS according to the rules in Table 3.
Table 3. Quality Assurance Criteria for RUSS Sensors
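In code, the two paired-comparison statistics look roughly like this. We assume the standard relative-percent-difference denominator (the mean of the paired readings), since the manual and RUSS values are the two quantities being compared:

```python
def rpd(manual, russ):
    # Relative percent difference between a manual reading and the
    # near-simultaneous RUSS reading (standard RPD definition assumed).
    mean = (manual + russ) / 2.0
    return abs(manual - russ) / mean * 100.0

def delta(manual, russ):
    # Absolute difference between the paired readings.
    return abs(manual - russ)
```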
If the data "passes",
it is considered acceptable for the previous period. If not, we examine
it in the context of our understanding of the limnology of the individual
lake and other data (nutrients, chlorophyll, trends, etc.) and then either
delete it from the database or allow it to be posted. We have to be careful
not to delete anomalous data that may simply reveal real dynamic changes.
The sheer volume of data has been taxing and we lack the resources
to always be as current as we would like. In the interim, data are posted
as provisional.
Dates of calibrations and these manual data are posted in the DATA
section of WOW and are available within easily accessible Excel files
- these will soon be posted on the Lake Access site as well. The three
Data Visualization Tools (DVTs) developed for Lake Access and Water on
the Web are also helpful in rapidly displaying the data in a variety of
formats to help identify anomalous data. We are currently in the process
of adding 'calibration date flags' to the control panels of the Profile
Plotter and Color Mapper DVTs and to the DxT Profiler to allow the user
to more easily keep track of calibration dates as the data stream is being
viewed. The first year of
WOW, 1998, taught us that we were understaffed for the frequency of
maintenance required for continuous RUSS operation at Ice Lake and
Lake Independence. With an additional three units being deployed for
1999 and 2000, we set up collaborations with Itasca Soil and Water
Conservation District (for Ice Lake), Hennepin Parks (for Lake Independence
and Lake Minnetonka), the Minnesota Department of Natural Resources
Regional Fisheries for Grindstone Lake, and the Minnesota Pollution
Control Agency staff for the St Louis River site (still in development
as of July 2000). Lake Access and WOW staff work with these folks to
clean and re-calibrate all sensors approximately every 1-3 weeks depending
on the site. The less productive sites (Grindstone and Ice lakes) generally
require less maintenance.
DATA TRANSMISSION AND INITIAL QA SCREENING
The program that imports the RUSS data is currently scheduled to run daily
at 7:30 AM. The RUSS base station software is used to call each RUSS and
download data that has been collected since the last call. A file containing
real-time data (RTD) collected during the duration of the call is also created.
These new profile data and RTD files are stored on the base station computer
as plain ASCII text files, one file for each day's data. The data files from
each site are stored in a separate directory on the computer. Table 1 (above)
is a sample of an original profile data file created by the RUSS base station.
The Conversion Process
A program (the importer) is now launched. It reads data files that have been
created or changed since the last time it was run, and converts the data to
the format used by the report generating and data visualization programs. Additionally,
the original data files are copied to the web server so they are accessible
for immediate QA/QC. Profile data files are copied to http://wow.nrri.umn.edu/data/
and RTD files are copied to http://wow.nrri.umn.edu/rtd/ .
The importer parses the first line of a new or modified RUSS data
file and tests to make sure that the Unit and Site correspond to
what is expected. If
not, an error message is generated and no further action is taken with this
file. This will catch errors that could occur if, for example, a data file
from Halsteds Bay was somehow stored in the Lake Independence directory. Next,
it reads the line containing the column descriptions, and compares it with
what is expected. If it differs, an error message is generated and no further
action is taken with this file. This will catch errors that could occur if,
for example, a new parameter is being read by the RUSS, but the importer hasn't
been updated to handle the change. Now, each data line is read and converted
to a "Reading". A set of readings is combined to form a "Profile" in the
database. Specific values are rejected by the importing program if they fall
outside these limits:
* temperature: > 35 °C
* pH: < 5 or > 10
* conductivity: < 1 or > 600 uS/cm
* dissolved oxygen: < -1 or > 20 mg O2/L, or > 200% saturation
* turbidity: < -5 or > 100 NTU (note: turbidity values between -5 and 0 are set to 0)
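A sketch of this screening step follows; the parameter keys are hypothetical, and clipping slightly negative turbidity to zero is our reading of the truncated note above.

```python
# Acceptance ranges for the importer's screening step (see list above).
LIMITS = {
    "ph":       (5.0, 10.0),
    "ec_us":    (1.0, 600.0),
    "do_mgl":   (-1.0, 20.0),
    "turb_ntu": (-5.0, 100.0),
}

def screen(param, value):
    # Return the value if in range, 0.0 for slightly negative turbidity
    # (assumed clipping behavior), or None to reject the reading.
    lo, hi = LIMITS[param]
    if value < lo or value > hi:
        return None
    if param == "turb_ntu" and value < 0:
        return 0.0
    return value
```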
There is no direct
indication in the raw data files of where one profile ends and the
next begins, so the importer applies some heuristics to decide how
to assign readings to profiles. The values listed below are those in
current use, but they can be changed. Since only the actual time is
reported on each data line, the importer assigns a "scheduled" time
to the new profile, using the nearest :00 or :30 minute time value
before the time reported for the first reading in the profile. Subsequent
readings are added to the same profile provided that:
1) the reading is from a lower depth, and
2) the reading was taken within 30 minutes of the scheduled time.
When the importer either comes to a line where the reading no longer qualifies
for the current profile or it reaches the end of the data file, it will add
the new profile to the database provided that:
1) the first reading starts within 3 meters of the surface,
2) there are at least 4 readings in the profile, and
3) the date is not in the future.
If any of these conditions fails, the importer will instead generate an
appropriate error message in the log file, and
disregard the profile. This helps eliminate partial or invalid profiles that
could be caused by RUSS hardware problems.
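These heuristics can be sketched as follows; the dictionary keys and function names are illustrative, not the importer's actual code.

```python
from datetime import datetime

def scheduled_time(first_reading_time):
    # Nearest :00 or :30 at or before the first reading in the profile.
    t = first_reading_time
    return t.replace(minute=0 if t.minute < 30 else 30, second=0, microsecond=0)

def valid_profile(readings, max_start_depth=3.0, min_readings=4, now=None):
    # Accept the profile only if it starts near the surface, contains
    # enough readings, and is not dated in the future.
    if len(readings) < min_readings:
        return False
    if readings[0]["depth_m"] > max_start_depth:
        return False
    now = now or datetime.now()
    return readings[0]["time"] <= now
```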
If it is winter and the RUSS is installed on ice, we set the minimum upper
depth to 1 or 2 meters to minimize the risk of the unit becoming trapped in
the hole through the ice. The data importer then creates a default reading
at 0 meters, listing a temperature of 0 °C, with all other parameters blank
(since we don't know what their true values are). The time for this reading
is set equal to the scheduled time. The timestamp of each reading is expected
to be unique, and can be used as the key value in a database. There is the
possibility that the first actual reading in the profile could have the same
timestamp as the bogus reading, so the readings in the profile are checked
for duplicate times. If found, 5 seconds are added to the time of the deeper
reading, and the change is noted in the log file.
Sometimes a sensor for a particular parameter at a particular site will go
bad. In this case, the importer program can be customized to reject that parameter
when importing the data from the site. Data stored in buffer memory is transmitted
to the base station via cellular phone at specified intervals, or at the request
of the user. Standard parity-based error correction techniques are used to
ensure that data were not altered during transmission. At the base station,
a JAVA based application adds the raw data to a standardized relational database
(DBMS) file. For archival purposes the original ASCII data are stored in a
compressed data format (ZIP) file. The ASCII and DBMS files are periodically
downloaded to an off-site location via File Transfer Protocol (FTP).
FINAL DATA REVIEW & POSTING
At present (July 2000), funding limitations have precluded adherence to a rigorous
schedule for removing the provisional label from RUSS data. In part this is
due to the need to review ancillary water chemistry data before making final
decisions when the RUSS data are questionable. All water chemistry data posted
on the Lake Access and WOW sites, however, have passed QA/QC prior to being
posted, although this typically takes 30-60 days after collection.
Despite regular maintenance and calibration schedules, occasional RUSS data
anomalies still occur. To date, they have virtually always been associated
with DO and/or turbidity data although there have been recurring problems with
the pH probe at the WOW Grindstone Lake site.
The most troublesome anomalies are those that occur within the calibration
window of time, are not flagged by our automated screening tools and are not
unreasonable values in terms of the range of values previously measured for
that depth stratum and time of year. These errors have not been trivial to
identify and require careful examination in a complete limnological (lake/watershed/climate)
context by a professional limnologist. The process is adequately described
as Best Professional Judgement (BPJ). In some cases we have decided to adjust
data by calculating correction factors when there is accurate calibration data
spanning the period in question and when the results estimated by interpolation
are consistent with the rest of the data set. In other cases we have simply
rejected the data - omitting it from the website. Data deletions are summarized
and circulated to all limnological staff and archived in a hidden section of
the Lake Access and WOW websites. The WOW project sends a periodic e-mail newsletter
providing data updates to all teachers and researchers using the site for educational
or research purposes; you can subscribe to this newsletter at http://wow.nrri.umn.edu/wow/contactus.html.
STREAM AND RIVER QA/QC
DATA TYPES FOR RIVERS AND STREAMS
(1) water quality parameters analyzed in the lab from "grab" samples
of water such as nutrients (N- and P-series of nutrients),
pH and alkalinity, turbidity, total suspended solids (TSS), total
dissolved solids (TDS), color, chloride, fecal coliform bacteria,
total volatile solids (TVS) and biochemical oxygen demand (TVS
and BOD5 are both measures of organic matter);
(2) water quality measurements made in the field at the
time of water sample collection such as sensor measurements
of temperature, water velocity (to estimate flow), dissolved
oxygen (DO), specific electrical conductivity (EC25) and transparency
tube measurements of water transparency or clarity, etc.; and
(3) remotely sensed water quality measurements collected,
logged and transmitted by electronic sensors deployed in the streams
that we call SMUs, for Stream Monitoring Units. These instruments are
programmed to record temperature, stream elevation (also called stage
height), EC25 (a measure of total salt concentration), and turbidity
(used as a measure of suspended sediment). In addition, the SMU at
the Kingsbury Creek site is connected to an automated sampling device
(ISCO 6712) that, upon preset increases in stage height signifying
a storm event, initiates collection of up to 24 individual water
samples in one-liter plastic bottles. We have initially programmed
the unit to sample at 2-hour intervals during storm events.
The stream sensor data is typically collected every 15 minutes,
stored throughout the day and then transmitted via cellular phone/modem
to our base computer/website each morning. We can collect the data
more frequently, and call up the unit to see what the latest data
looks like, but this adds to our phone bill. A final intensive
data set is collected at the Duluth-Superior Harbor/ St. Louis
River outlet to Lake Superior where a sensor array measures and
logs temperature, EC25 and turbidity at 15 minute intervals. This
unit was installed on the channel wall above the USGS Superior
Bay Duluth Ship Channel (Station 464646092052900) where an acoustical
velocity meter (AVM) system with a two-path transducer installation
is used to measure river discharge into Lake Superior. We have
previously established a data-sharing partnership with USGS (Madison,
WI, P. Hughes) to link our on-line water quality data with their
discharge records.
(4) volunteer monitoring data. During this grant
period, DuluthStreams staff will work with the MPCA to coordinate
existing volunteer monitoring with the goal of achieving more consistent
methodology and developing a repository and database for stream-related
information in the City of Duluth. Programs include the state-wide
Citizens Stream Monitoring Program (MPCA sponsored), St. Louis
River Watch and a variety of less formal efforts by local schools
and environmental learning centers. DuluthStreams initiated school-based
stream monitoring programs at sites on Chester, Tischer and Kingsbury
Creeks in October 2002 as part of the National Monitoring Day,
nation-wide celebration (http://www.yearofcleanwater.org)
of the 30th anniversary of the Clean Water Act on October 15. It
was subsequently decided to include these schools in the St. Louis
River Watch program which already includes over 30 area schools
(see http://www.duluthstreams.org/citizen/involvement.html for details).
Parameters will vary depending on the level of effort available,
but at a minimum will include temperature, transparency tube depth,
turbidity and EC25 during seasonal baseflow conditions, snowmelt
runoff and rainstorm runoff. Training will be provided by River
Watch and DuluthStreams staff, and turbidity and EC25 measurements
will be performed by NRRI on discrete samples collected by the
schools. Biological monitoring of macroinvertebrate communities
is fundamental to River Watch, and DuluthStreams will work to encourage
standardized sampling protocols that conform to peer-reviewed research
standards and existing state and federal guidelines. The goals
of this effort are to initiate a comprehensive database, standardize
monitoring methods, and educate students and citizens.
STREAM SENSOR RESOLUTION & REPORTING LIMITS
Ideally, if all of the sensors behaved according to the sensor manufacturer's
specifications (Table 1) we could simply post the data on the DuluthStreams
web site and assume it is accurate to these levels. However, except for temperature,
all of the sensors require routine maintenance and calibration. When visiting
sites we re-calibrate the EC and turbidity sensors using individual standard
solutions with known values for EC25 and turbidity. Experience has taught
us that the sensors remain stable during the course of a sampling day. The
depth sensor is also calibrated at this time to read zero when the sensor
is set at the water surface.
When deployed for continuous operation the sensors are colonized
gradually by a biofilm of algae and less noticeably, by bacteria
and fungi as well. As this material builds up, the biofilm interferes
with the sensor's ability to accurately sample the surrounding
water. One can easily picture fine filaments of algae, like you
see on the rocks in summer, wafting intermittently across the electrodes
of the EC sensor or in the light path of the turbidimeter. This
would produce erratic values with wide swings even during periods
of low and relatively constant flow when the stream is running
clear. Turbidity probes are most susceptible to these changes,
followed by dissolved oxygen (not measured automatically at this
time) and EC. Fine particulate sediment will also be trapped and
gradually contribute to erroneous readings if not cleaned on a
regular basis. Turbidity sensors are "notorious" for
their maintenance problems - the YSI 6136 sensor we are using is
called a "self-wiping" sensor that mechanically wipes
the optical window used for the measurement just before a reading
is taken. Although this feature certainly reduces errors due to
fouling, the wiper occasionally "sticks", creating apparent "spikes" in
the data set (see below)
Other sources of variability include the natural variations that
occur in a stream due to eddies and larger debris suddenly breaking
free, events less likely to be seen in the calmer water in deeper
stream pools or in lakes. Also, the sensor is actually "sampling" only
a tiny (millimeter scale) patch of water for less than a second
every 15 minutes. Table 2 compares the theoretical sensor resolution
with our best estimate of the true accuracy of the sensor readings,
incorporating all sources of known error.
Table 1. Reported automated sensor specifications.
* YSI 6920 sonde (with YSI 6136 turbidity sensor): temperature ± 0.15 °C;
conductivity ± 0.5% of reading + 1 µS/cm; turbidity a fixed percentage of
reading or 2 NTU (the greater of the two); depth ± 0.003 m (± 0.01 ft).
* YSI 6820 sonde: turbidity sensor as above; a 6820 sonde instead of a 6920
sonde is used for cost and data logging considerations associated with the
automated stage height sensor that triggers the ISCO 6712 automatic water
sampler.
* St. Louis River: YSI 6820 sensors as above (w/o stage height), plus an
Apprise Technologies RepDAR unit (see http://lakeaccess.org/QAQC.html for
details).
Table 2. DuluthStreams reporting limits for SMU sensor data (YSI 6920/6820).
The resolution, i.e., the smallest reading displayed for a particular
parameter, is likely to be considerably lower than the error associated
with differences in time, with sensor drift, and with calibration accuracy.
(Table 2 columns: Resolution, what is reported by the SMU sensors, and
Accuracy, what we really trust.)
SMU PLACEMENT, FLOW CALIBRATION, AND SAMPLE COLLECTION
We attempted to place the water quality probes at representative
measurement points in the stream cross-section with allowances
made for protecting the units from debris and sediment where necessary.
Probes are inspected, calibrated, and maintained following the
manufacturer's recommendations in addition to following USGS (2000a,
2000b) procedures. The exact location of each unit within a watershed
was chosen based upon considerations of security, stability regarding
anticipated road and bridge construction activities, and on upstream
land use characteristics. Because of this, summary interpretations
as to water quality differences between streams must be done with
caution since such a direct comparison was not intended.
Stream discharge is determined as in Anderson et al. (2000) and
USGS (2000b) with rating curves developed and subsequently used
to calibrate flow. Stream depth is measured remotely using a pressure
transducer and discharge is determined using rating curves based
upon a set of cross-sectional and discrete depth in-stream velocity
measurements made with a Marsh-McBirney stream velocity meter over
a range of discharge conditions.
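As a worked illustration, a power-law rating curve of the form Q = a(h - h0)^b is the conventional fit for such stage-discharge data; the coefficients below are invented for illustration and would in practice come from the velocity surveys.

```python
def rating_curve_q(stage_m, a=4.2, b=1.6, h0=0.05):
    # Estimate discharge (m3/s) from stage height using a power-law
    # rating curve Q = a * (h - h0)**b. Coefficients are hypothetical;
    # real ones come from fitting Marsh-McBirney velocity measurements.
    h = stage_m - h0
    if h <= 0:
        return 0.0
    return a * h ** b
```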
Discrete water samples are collected at Kingsbury Creek at various
points along the hydrograph using an ISCO 6712 slaved to the YSI
6820 stream elevation sensor. Periodic intensive manual collections
throughout high water periods as well as during baseflow periods
are conducted for Chester and Tischer Creeks.
NRRI staff follow the Instrument Manuals for calibration and maintenance
procedures. Our staff also have extensive (i.e. decades collectively)
experience with these calibration procedures and with their importance
in interpreting field data and distinguishing systematic errors
associated with deteriorating or bio-fouled probes. Our Lake Access/EMPACT
project is a companion to an earlier NSF-funded Advanced
Technology Education project entitled Water on the Web (WOW: http://wow.nrri.umn.edu),
now in its fifth year, that involved deploying Apprise Technologies,
Inc.'s robotic water quality sensing RUSS units on five Minnesota
lakes. Since 1998 we have gained extensive experience in dealing
with the problems associated with continuous sensor deployment
and the resultant protocols are included also in the Lake Access
and WOW websites.
The following protocols have been set up to minimize these biofouling
and instrument drift effects and to quality-assure the DuluthStreams data:
(1) Clean and re-calibrate sensors frequently (about every 2 weeks)
and perform manual measurements with an independent instrument
at the same time (temperature and EC25) or soon after in the
lab (turbidity). The depth sensor is also calibrated at this
time by re-setting it to equal zero depth when placed just above
the water surface.
(2) Compare independent manual measurements with
near-simultaneous SMU data prior to cleaning
(re-calibration). This provides assurance that the previous period
of data is accurate. We calculate test statistics
for each parameter as:
RPD (relative % difference) = |Manual - SMU| / ((Manual + SMU)/2) x 100, and
DELTA = |Manual - SMU| for each parameter.
They PASS according to rules in Table 3.
Table 3. Quality Assurance Criteria for SMU Sensors
* temperature: DELTA < 0.2 °C
* conductivity: DELTA < 5 µS/cm
* turbidity: DELTA < 5 NTU
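Applying the Table 3 criteria is then a one-line comparison per parameter; the parameter keys here are illustrative.

```python
# DELTA pass thresholds from Table 3 (SMU sensors).
CRITERIA = {"temp_c": 0.2, "ec_us": 5.0, "turb_ntu": 5.0}

def passes_qa(param, manual, smu):
    # PASS if |manual - SMU| is within the Table 3 threshold.
    return abs(manual - smu) < CRITERIA[param]
```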
If the data "passes," it is considered acceptable for the
previous period. If not, we examine it in the context of our understanding
of the instrumentation, the stream's prior automated and "manual" water
quality data, its watershed and the recent weather pattern and then
either delete it from the database or allow it to be posted. We have
to be careful not to delete anomalous data that may simply reveal
real dynamic changes. The sheer volume of data has been taxing and
we lack the resources to always be as current as we would like. In
the interim, data are posted as provisional. Dates of calibrations
and manual data are both posted in the DATA section of the website
and are available within easily accessible Excel files. The stream
data visualization tool (DVT) is also helpful in rapidly displaying
the data in a variety of formats to help identify anomalous data.
Calibration "date flags" will be added to the control panel
of the DVT to allow the user to more easily keep track of calibration
dates as the data stream is being viewed.
Although not yet implemented (as of December 2002), we are
also exploring a data quality classification rating system
such as used by the US Geological Survey for certain continuous
water quality records. The table below is taken from USGS
(2000a; Guidelines and Standard Procedures for Continuous
Water-Quality Monitors: Site Selection, Field Operation,
Calibration, Record Computation, and Reporting, U.S. Geological
Survey, WRIR 00-4252, by R.J. Wagner, H.C. Mattraw, G.F.
Ritz, and B. A. Smith; http://water.usgs.gov/pubs/wri/wri004252/htdocs/record_comp.html#table9).
This USGS document also discusses the use of such data when there are extensive
periods without data as well as other considerations that are beyond the scope
of our present objectives.
Table 9. Rating continuous water-quality records
[≤, less than or equal to; ±, plus or minus value shown; °C, degree Celsius;
>, greater than; %, percent; mg/L, milligram per liter; pH unit, standard pH unit]

    Parameter             Excellent     Good               Fair               Poor
    Temperature           ≤ 0.2 °C      > 0.2 to 0.5 °C    > 0.5 to 0.8 °C    > 0.8 °C
    Specific conductance  ≤ 3%          > 3 to 10%         > 10 to 15%        > 15%
    Dissolved oxygen      ≤ 0.3 mg/L    > 0.3 to 0.5 mg/L  > 0.5 to 0.8 mg/L  > 0.8 mg/L
    pH                    ≤ 0.2 unit    > 0.2 to 0.5 unit  > 0.5 to 0.8 unit  > 0.8 unit
    Turbidity             ≤ 5%          > 5 to 10%         > 10 to 15%        > 15%
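The USGS rating scheme maps the magnitude of a record's calibration error onto an Excellent/Good/Fair/Poor rating. A sketch of that mapping, with thresholds taken from the table above (dictionary keys and function names are ours, for illustration only):

```python
# Upper bound of each rating band, in the parameter's own units or percent;
# errors beyond the last band rate "Poor". Thresholds follow USGS (2000a).
RATING_BANDS = {
    "temperature_C":   [(0.2, "Excellent"), (0.5, "Good"), (0.8, "Fair")],
    "conductance_pct": [(3.0, "Excellent"), (10.0, "Good"), (15.0, "Fair")],
    "DO_mgL":          [(0.3, "Excellent"), (0.5, "Good"), (0.8, "Fair")],
    "pH_units":        [(0.2, "Excellent"), (0.5, "Good"), (0.8, "Fair")],
    "turbidity_pct":   [(5.0, "Excellent"), (10.0, "Good"), (15.0, "Fair")],
}

def rate_record(parameter: str, error: float) -> str:
    """Rate a continuous record from the magnitude of its calibration error."""
    for upper, rating in RATING_BANDS[parameter]:
        if error <= upper:
            return rating
    return "Poor"

print(rate_record("temperature_C", 0.4))  # Good
print(rate_record("pH_units", 0.9))       # Poor
```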
OTHER CONTINUOUS WATER MONITORING DATA
Water quality data and discharge measurements are also being collected at Amity
Creek by the MPCA (2001 and 2002), in collaboration with the USGS and under separate
funding; when available, these data will be posted on the DuluthStreams website.
The MPCA maintains a continuous stage height monitor in Amity and collects
20-30 grab samples across different flow regimes (~75% should represent
the higher-flow storm and snowmelt periods) for analysis by the Minnesota Department
of Health. The MPCA, in collaboration with the South St. Louis County Soil and
Water Conservation District (SSLCSWCD), has also operated several continuous
stage height monitors on Miller Creek, and these data will also be posted.
A limited number of water quality samples have also been collected at the St.
Louis River/Lake Superior entry site.
DATA TRANSMISSION AND INITIAL QA SCREENING
Each SMU is called daily, and any data collected since the last call
are downloaded. These new data files are stored on the base station computer
as comma-delimited ASCII text files. The Kingsbury station is polled twice
daily, at 6 AM and 2 PM. The Chester and Tischer sites are scheduled to be
polled once daily, at 6 AM, because of cell phone costs.
The Conversion Process
A program (the data importer) is then launched. It reads any data files that
have been created or changed since the last time it was run and converts
the data to the format used by the report-generating and data visualization
programs. The data importer examines the beginning of these files and checks
to make sure that the contents (e.g. site name, parameter names and units)
correspond to what is expected. If not, an error message is generated and
no further action is taken with this file. This will catch errors that could
occur if, for example, a new parameter is being read by the SMU, but the
property files used by the data importer for that site haven't been updated
to handle the change.
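The header check described above can be sketched as follows. The actual property-file layout is not documented here, so the two-line header format, the site and column names, and the function name are all assumptions for illustration:

```python
# Hypothetical sketch of the importer's header validation: the file header
# must match what the site's property file expects, otherwise the file is
# logged and skipped. We assume line 1 names the site and line 2 lists the
# comma-separated parameter/unit columns.

def validate_header(header_lines: list[str], expected_site: str,
                    expected_columns: list[str], log: list[str]) -> bool:
    """Return True if the file header matches the expected site and columns."""
    site = header_lines[0].strip()
    columns = [c.strip() for c in header_lines[1].split(",")]
    if site != expected_site or columns != expected_columns:
        # mirror the importer's behavior: log an error, take no further action
        log.append(f"header mismatch for site {site!r}; file skipped")
        return False
    return True

log: list[str] = []
ok = validate_header(
    ["Kingsbury", "timestamp, temp (C), cond (uS/cm), turb (NTU)"],
    "Kingsbury",
    ["timestamp", "temp (C)", "cond (uS/cm)", "turb (NTU)"],
    log,
)
print(ok)  # True
```

This catches, for example, a new parameter added to the SMU before the importer's property files have been updated to match.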
Next, each data line is read and converted to the "DataTable" format
used for storage. The importing program can reject specific
data if they fall outside a pre-defined range for the given
parameter. Data are automatically rejected if outside the following ranges:

    Temperature:   < -1 °C or > 35 °C
    Conductivity:  < 50 µS/cm or > 3000 µS/cm
                   (subject to review after we've experienced spring snowmelt)
    Turbidity:     < -5 or > 2000 NTU, with values from -5 to 0 set equal to 0

If the importer encounters a value outside these ranges, it generates
an appropriate error message in the log file and disregards
the value. This helps eliminate invalid readings that could
be caused by sensor drift or SMU hardware problems. After
all of the new data files have been imported, the data importer
triggers the appropriate programs that create and send
updated reports and data-visualization files to the website.
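The automatic range screen, including the small-negative-turbidity clamp, can be sketched as below. The ranges mirror the list above; the function and dictionary names are illustrative, not the importer's actual code:

```python
# Sketch of the importer's range screen. Values outside the accepted range
# are logged and disregarded; turbidity readings between -5 and 0 NTU are
# treated as sensor noise and set to zero.

REJECT_RANGES = {                    # (minimum, maximum) accepted values
    "temperature": (-1.0, 35.0),     # °C
    "conductivity": (50.0, 3000.0),  # µS/cm
    "turbidity": (-5.0, 2000.0),     # NTU
}

def screen_value(parameter: str, value: float, log: list[str]):
    """Return the accepted (possibly clamped) value, or None if rejected."""
    lo, hi = REJECT_RANGES[parameter]
    if value < lo or value > hi:
        log.append(f"{parameter} value {value} outside [{lo}, {hi}]; disregarded")
        return None
    if parameter == "turbidity" and value < 0:
        return 0.0                   # small negative turbidities set to zero
    return value

log: list[str] = []
print(screen_value("temperature", 40.0, log))  # None (rejected and logged)
print(screen_value("turbidity", -2.0, log))    # 0.0
```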
FINAL DATA REVIEW & POSTING
At present (December 2002), funding limitations have precluded adherence to
a rigorous schedule for removing the provisional label from DuluthStreams automated
data. In part this is due to the need to review ancillary water chemistry data
before making final decisions when the data are questionable. All manually sampled
water quality data posted on DuluthStreams, however, have passed QA/QC prior
to being posted, although this typically takes 30-60 days after collection.
Despite regular maintenance and calibration schedules, occasional
SMU data anomalies still occur. To date, they have generally
been associated with the turbidity sensor although there
was great difficulty during summer 2002 in getting the stage
height sensors to function properly. Additional difficulties
were caused by the Minnesota DNR modifying the Chester Creek
stream channel in the vicinity of our SMU, causing excessive
turbidity values and requiring a new empirical calibration
of the stage height-flow relationship. There have also been
start-up problems associated with precipitation monitoring
and the modem link to Kingsbury Creek. All of these problems
were resolved by about September 1, 2002.
The most troublesome anomalies are those that occur within
the calibration window of time, are not flagged by our automated
screening tools and are not unreasonable values in terms
of the range of values previously measured or expected for
the hydrologic conditions at the time. These errors will
not be trivial to identify and will require careful examination
in a complete fluvial/limnological (stream/watershed/climate)
context by professionals. The process is adequately described
as Best Professional Judgement (BPJ). In some cases data
will need to be adjusted by calculating correction factors
when there is accurate calibration data spanning the period
in question and when the results estimated by interpolation
are consistent with the rest of the data set. In other cases,
it will be necessary to simply reject the data - omitting
it from the website. A log of data deletions will be maintained
on the website, and an e-mail announcement sent to all teachers
and researchers known to be using the site for educational
or research purposes related to curricula associated with
DuluthStreams and Water-on-the-Web (http://wow.nrri.umn.edu).
Data collected under ice: When streams are ice covered, the
depth readings and flow calculations are subject to several
sources of error:
- The stream can be running under the ice without an
air gap, essentially pressurizing the stream -- increasing
the apparent depth and velocity.
- The stream can be running on top of the ice (probably
unaccounted-for flow).
- Anchor ice may change the channel cross section, making
our rating curves unreliable.
- Ice dams below the site can change flows.
- Bank ice can constrict the channel.
All of these things can happen at the same place over the
course of time as well.
We have chosen to flag this data as 'collected under ice
cover'. It will still show relative changes, especially
over the short-term, but it should be noted that the reported
depths and flows are not necessarily accurate.
WATER SAMPLE ANALYSES
Data from the monitoring programs will follow established
procedures of the NRRI Central Analytical Laboratory
(Ameel et al. 1998) and WLSSD (2000a,b).
Both laboratories are annually certified for Clean Water Act and Safe Drinking
Water Act water quality parameters by the Minnesota Department of Health.
Sample Collection: Samples for micronutrients
([nitrate+nitrite]-nitrogen, ammonium-nitrogen, total nitrogen
[TN], orthophosphate-phosphorus [OP], total phosphorus [TP],
alkalinity, chloride, BOD5, total suspended solids [TSS],
total volatile solids [TVS], and total dissolved solids [TDS])
and fecal coliform bacteria are collected exclusively by
trained personnel (City of Duluth, WLSSD, MPCA, and NRRI Central
Analytical Laboratory personnel), according to individual
analysis requirements, using an ISCO 6712 automated sampler
at the Kingsbury site. Manual samples are collected at various
times of the year to characterize Chester and Tischer Creeks
at their SMU sites. Several intensive sampling surveys will
be conducted to characterize high-flow storm events and the
snowmelt runoff period in the spring. Grab samples for fecal
coliform bacteria will be collected using sterile technique
(APHA 1998). All samples are transported to the laboratory
on ice in dark coolers on the day of collection, within a
few hours after collection. A field data sheet accompanies
each sample. Sample sites for the collection of duplicate
samples and preparation of field blanks are chosen at random.
Sample Preservation: All samples are initially
preserved by refrigeration and when filtration is necessary
it is performed as soon as possible after return to the lab.
If delay in analysis is anticipated, additional preservation
will be incorporated, as provided in individual methods and
according to approved guidelines (e.g. acidification or freezing).
The container label is marked when preservative is added.
Sample Identification and Tracking: Sample
containers are labeled in the field with date and time of
collection, location of sample, and person collecting the
sample. Field data sheets include all of the above information
as well as conditions specific to the project, and list the
individual parameters for which analysis is requested. For
stream monitoring, and other sampling events for which field
analyses are performed (such as conductivity, dissolved oxygen,
pH, and temperature), an additional sheet is used that lists
site, stream conditions, and other pertinent information.
Water samples delivered to the laboratory are assigned a
laboratory identification code and entered into a computerized
database (Microsoft Access) along with location, collection
date, and required analytes. The code is used to keep a running
tally of all samples received and to track analyses
as they are completed.
Instrumentation: Laboratory instruments are
calibrated, operated, and maintained according to the
manufacturers' recommendations; this is a requirement
for State Laboratory Certification. Instruments are calibrated
each time they are used, where applicable. A log is kept for
each instrument in which maintenance and general comments on
instrument performance are documented. For spectroscopy procedures,
the calibration is performed with reagent blanks and standards
of a matrix closely matching the samples. A typical standard
curve consists of at least five concentrations. The linearity
of each standard curve is monitored and must have a regression
coefficient (r²) greater than 0.99. Additional quality assurance
procedures that are a part of every micro- and macronutrient
run include blanks, QCCS (quality control check standards),
and internal spikes (recoveries of 80-120% or 85-115% are
typically required depending on analyte).
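The linearity criterion on the standard curve can be checked with an ordinary least-squares fit of instrument response against standard concentration. A sketch using plain arithmetic (the function name and the example data are ours, for illustration only):

```python
# Compute the coefficient of determination (r^2) for a linear fit of
# response vs. concentration, and require r^2 > 0.99 as described above.

def r_squared(concentrations: list[float], responses: list[float]) -> float:
    """r^2 of the least-squares line through (concentration, response) pairs."""
    n = len(concentrations)
    mx = sum(concentrations) / n
    my = sum(responses) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(concentrations, responses))
    sxx = sum((x - mx) ** 2 for x in concentrations)
    syy = sum((y - my) ** 2 for y in responses)
    return (sxy * sxy) / (sxx * syy)

# A hypothetical five-point standard curve with near-linear response
# passes the r^2 > 0.99 criterion.
standards = [0.0, 5.0, 10.0, 20.0, 40.0]
absorbance = [0.001, 0.052, 0.101, 0.198, 0.402]
print(r_squared(standards, absorbance) > 0.99)  # True
```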
Data Storage and Archiving: After analysis,
raw data from labsheets are entered into spreadsheets to
calculate concentrations and compute the relevant QC statistics.
The analyst checks the quality control data against laboratory
limits, and if the data meets the requirements, the data
is saved and printed in hard copy. Both the original data
sheet as well as the spreadsheet printout are saved. On the
hard copy, the analyst checks the data, shows the quality
control calculations, and initials the completed sheet. This
data is reviewed by the Laboratory QA officer who also initials
the sheet. At this point the data is considered valid and
reportable. Data from spreadsheets is imported electronically
into the water quality database on the DuluthStreams-NRRI
file server. The server is backed up daily to prevent data
loss.
The QA/QC of near-real-time, remotely collected
sensor data has provided challenges that were not present
under traditional sampling regimes. We have attempted to
develop rigorous protocols for each step of the data acquisition
effort, and believe these protocols suit the needs of projects
such as Lake Access and Water on the Web. Nonetheless, as
these technologies become more common in resource management,
future efforts must be directed toward the unique problems
posed by real-time data collection.
Richard Axler, Elaine Ruzycki, and Norm Will
contributed to the development, testing, and documentation of these
protocols.
REFERENCES
Ameel, J.J., Axler, R.P. and Owen, C.J. 1993. Persulfate digestion for determination
of total nitrogen and phosphorus in low-nutrient waters. Amer. Environ. Labor.
October 1993, p.1-11.
Ameel, J., E. Ruzycki and R.P. Axler. 1998 (reviewed annually). Analytical
chemistry and quality assurance procedures for natural water samples. 6th edition.
Central Analytical Laboratory, NRRI Tech. Rep. NRRI/TR98/03.
Anderson, J., T. Estabrooks, and J. McDonnel. 2000. Duluth Metropolitan Area
streams snowmelt runoff study. Minnesota Pollution Control Agency, St. Paul,
MN. March 2000.
APHA. 1998. Standard methods for the examination of water and wastewater. American
Public Health Association, Washington, D.C.
Archer, A. and J. Barten. 1995. Quality assurance manual. Hennepin Parks Water
Quality Laboratory. September 1995. Hennepin Parks, 3800 County Road 4, Maple
Plain, MN 55359.
Archer, A. and J. Barten. 1996. Laboratory Procedures Manual. Hennepin Parks
Water Quality Laboratory. October 1996. Hennepin Parks, 3800 County Road 4,
Maple Plain, MN 55359.
Axler, R.P. and C.J. Owen. 1994. Fluorometric measurement of chlorophyll and
phaeophytin: Whom should you believe? Lake and Reservoir Management 8:143-151.
Barten, J. 1997. Water quality monitoring plan. Hennepin Parks, 3800 County
Road 4, Maple Plain, MN 55359.
EPA. 1987. Handbook of methods for acid deposition studies - Laboratory
analysis for water chemistry. EPA/600/4-87-026.
EPA. 1989a. Preparing perfect project plans. US EPA Risk Reduction Engineering
Laboratory, Cincinnati, OH, EPA/600/9-89/087.
EPA. 1989b. Handbook of methods for acid deposition studies - Field operations
for surface water chemistry. EPA/600/4-89-020.
EPA. 1996. The Volunteer Monitor's Guide to: Quality Assurance Project Plans.
EPA 841-B-96-003, Sep 1996, U.S. EPA, Office of Wetlands, Washington, D.C.
20460, USA (http://www.epa.gov/owowwtr1/monitoring/volunteer/qappexec.htm).
EPA. 1998. Guidance for Quality Assurance Project Plans EPA QA/G-5. EPA/600/R-98/018,
Feb 1998, U.S. EPA, Washington, D.C. 20460 (http://www.epa.gov/quality1/qs-docs/g5-final.pdf).
EPA. 2000. Delivering timely water quality information to your community: The
Lake Access-Minneapolis project. EPA/625/R-00/012, September 2000, U.S. Environmental
Protection Agency, Office of Research and Development, Cincinnati, OH, 45268.
Host, G., N. Will, R.Axler, C. Owen and B. Munson. 2000a. Interactive
technologies for collecting and visualizing water quality data. URISA Journal
(In Press; refereed: http://wow.nrri.umn.edu/urisa).
Host, G.E., B.H. Munson, R.P. Axler, C.A. Hagley, G. Merrick and C.J.
Owen. 2000b. Water on the Web: Students monitoring Minnesota rivers and
lakes over the Internet. AWRA Spec. Ed. (Dec. 1999). (refereed:
www.awra.org/proceedings/www99/w74/index.htm).
MCWD. 1997. Quality assurance - quality control assessment report. Lake Minnetonka
Monitoring Program 1997. Minnehaha Creek Watershed District, 2500 Shadywood
Road, Excelsior, MN 55331-9578.
USGS. 2000a. Guidelines and standard procedures for continuous water-quality
monitors: Site selection, field operation, calibration, record computation, and
reporting. R.J. Wagner, H.C. Mattraw, G.F. Ritz and B.A. Smith. U.S. Geological
Survey Techniques of Water-Resources Investigations Report 00-4252 (http://water.usgs.gov/pubs/wri/wri004252/).
U.S. Geological Survey, Reston, Virginia, USA.
USGS. 2000b. National field manual for the collection of water-quality data.
U.S. Geological Survey Techniques of Water-Resources Investigations, book 9,
chaps. A1-A9, 2 v., variously paged.
WLSSD. 2000a. Laboratory Procedures Manual. Revision 2, Oct 1998. Western Lake
Superior Sanitary District, Duluth, MN. 55807.
WLSSD. 2000b. Laboratory Quality Assurance Manual. (Revision 2, May 1998 with
annual minor revisions). Western Lake Superior Sanitary District, Duluth, MN.