Variations and fluctuations in the quantity and quality of streamflow of the Smoky Hill River and tributaries in the reach between Cedar Bluff Dam and station 334.4, which is about 2 miles (3.2 km) downstream from the district, were monitored at the gaging stations shown in plate 1. Because the approximately 6,000 acres (2,400 ha) of irrigated land normally contributed an insignificant part of storm runoff from the 220 square miles (570 km2) of drainage area to the reach, effort was directed mainly to isolating the progressive changes in quantity and quality of low flow caused by irrigation from the wider variations and fluctuations caused by other factors in the hydrologic environment.
Seepage-Salinity Surveys of Low Flow
Sixteen surveys of a 22-mile (35-km) reach of the Smoky Hill River from the dam to station 334.4 (pl. 1) were made between April 1964 and November 1971 to reveal progressive changes in the quantity and quality of low flow in the main stem, of tributary inflow, and of seepage directly into the main stem as a result of irrigation. The Division of Water Resources of the Kansas State Board of Agriculture made most of the field measurements during the surveys in 1964 to obtain necessary background data before the comprehensive study began. Regulation of releases from the fish hatchery (355.8, pl. 1) by personnel of the U.S. Bureau of Sports Fisheries and Wildlife during most of the surveys facilitated the field work and analysis of the data.
Basic data consisted of current-meter measurements of streamflow at the gaging stations shown in plate 1 and chemical analyses of water samples taken at the time of measurement. Each survey was completed during a single day when streamflow had stabilized after several weeks without significant rainfall, direct surface runoff, or releases to the river from the dam. All but two of the surveys were made before or after the growing season, when inflow to the stream consisted mainly of effluent ground water from the irrigated area and measured releases from the fish hatchery. Results of the survey in March 1966 were complicated by releases from the fish hatchery and snowmelt. Results of the two surveys conducted during the growing season (September 1, 1965; August 31, 1971) were affected by waste, tail-water runoff, unmeasured withdrawals and runoff associated with irrigation outside of the district, and consumptive use by phreatophytes, which have proliferated along the channel since irrigation began. Significant turbidity, observed only during the survey in August 1971, originated upstream from the district.
Flow in each of the main stem reaches consisted principally of flow entering the upstream end of the reach, tributary inflow, and seepage into the channel. Variations in the quantity and chemical quality of each component, or combination of components, were evaluated by use of computer programs (Leonard and Morgan, 1970) developed specifically to facilitate compilation, analysis, and interpretation of the voluminous data. The programs are applicable wherever salinity surveys of low flow can be used for qualitative and quantitative definition of localized sources of chemical pollution.
A progressive increase in the concentration of chloride in a downstream direction in the river adjacent to the irrigation district during successive surveys (fig. 26) reflects accelerated drainage of ground water from the irrigated acreage. Although the concentration of chloride was greater than that in the irrigation water, it remained well below the maximum of 250 mg/l recommended for drinking water by the Kansas State Board of Health (1973). The rate of increase per river mile was greatest in the reach 350.0 to 338.0 adjacent to the main part of the irrigation district, herein defined as the irrigation reach.
Figure 26--Concentrations of sulfate and chloride in the Smoky Hill River and in irrigation water during seepage-salinity surveys, 1964-71.
The concentrations of the other ions did not increase proportionately. For example, the concentration of sulfate, the most abundant ion in the applied water, generally decreased in a downstream direction from values higher than to values lower than the irrigation water. As was shown in the previous discussions of ground water and soil, some of the sulfate in the applied water evidently was precipitated in the soil or on the land surface, or was removed by plants, thereby depleting the concentration of that constituent in the return flow.
To evaluate the quantity and quality of inflow between successive stations along the main stem, the water and chemical discharge at each of the stations was subtracted from corresponding quantities at the adjacent downstream station. The algebraic difference is the net gain (or loss) of water or chemical discharge in the reach. During the surveys net gains consisted mainly of effluent ground water from the irrigated area north of the river. After the 1965 irrigation season, inflow from right-bank tributaries (south) was a negligible part of total inflow.
The water discharge at stations along the main stem and near the mouths of all known flowing tributaries was reported in cubic feet per second. The chemical discharge, or ionic load (L), of an ionic constituent is defined by the equation
L = KQC
where K is a unit constant, Q is the measured discharge in cubic feet per second (assumed equal to the daily mean discharge), and C is the concentration of the constituent. Load is defined in tons per day when C is in milligrams per liter and K = 0.0027. Load also can be defined in terms of kiloequivalents, or thousands of equivalent weights per day, when C is in milliequivalents per liter and K = 2.446. The latter method of defining L was used for data analysis in this report. Because the sums of the cationic and anionic constituents should be nearly equal, anomalous relations between the ionic loads are readily apparent.
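The load equation above can be sketched as a short function. This is a minimal illustration using the unit constants given in the text; the function name and arguments are ours, not from the report:

```python
# Ionic load L = K * Q * C.
# K = 0.0027 gives tons/day when C is in milligrams per liter;
# K = 2.446 gives kiloequivalents/day when C is in milliequivalents
# per liter (1 cfs is about 2,446,600 liters per day).

def ionic_load(q_cfs, conc, units="keq"):
    """Chemical discharge for a measured flow and concentration.

    q_cfs -- water discharge, in cubic feet per second
    conc  -- concentration (me/l for 'keq', mg/l for 'tons')
    units -- 'keq' (kiloequivalents/day) or 'tons' (tons/day)
    """
    k = {"keq": 2.446, "tons": 0.0027}[units]
    return k * q_cfs * conc
```

For example, a flow of 10 cfs at 5 me/l carries 2.446 x 10 x 5, or about 122 keq/day.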
If the water discharge and ionic load are Q2 and L2 at the downstream station and Q1 and L1 at the upstream station, then the net gain in the reach is Q2-Q1 for water discharge and L2-L1 for chemical discharge. Negative values represent losses from the channel, changing stage, or erroneous measurements of water discharge or concentration. Differences greater than about 8 percent between the anionic load and cationic load in net gains or losses, in ke/day (kiloequivalents per day), normally indicated erroneous calculations or measurements.
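The net-gain bookkeeping and the 8-percent ionic-balance screen described above can be sketched as follows. These are hypothetical helpers, not the published programs of Leonard and Morgan (1970):

```python
def reach_net_gain(q_up, q_down, load_up, load_down):
    """Net gain (positive) or net loss (negative) of water discharge
    (cfs) and ionic load (keq/day) between adjacent stations."""
    return q_down - q_up, load_down - load_up

def ionic_balance_ok(cation_keq, anion_keq, tolerance=0.08):
    """Cationic and anionic loads in a net gain or loss should be
    nearly equal; a difference beyond about 8 percent suggests
    erroneous calculations or measurements."""
    mean = 0.5 * (abs(cation_keq) + abs(anion_keq))
    if mean == 0.0:
        return True
    return abs(cation_keq - anion_keq) / mean <= tolerance
```

Applying the screen to each reach makes anomalous relations between the ionic loads readily apparent, as the text notes.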
Despite wide and relatively random variation in antecedent rainfall, the magnitude of net gains between successive stations adjacent to the district generally increased as low flow was augmented by effluent ground water and salts from the irrigated land. However, as shown in figure 27, the inflow per mile varied from reach to reach in response to variations in the distribution of irrigation water and localized differences in the geohydrology. For example, high inflow in the reach 340.3 to 338.0 probably represents discharge of effluent ground water from a buried channel that drains a large part of the irrigated area (pl. 2). During the irrigation season, canal waste also augments flow in that reach.
Figure 27--Net gains of water, sodium, sulfate, and chloride during seepage-salinity surveys 1964, 1966, 1968, and 1970, and of concentrations in 1970.
Progressive changes in the magnitude and ionic composition of net gains to the irrigation reach 350.0 to 338.0 are caused mainly by changes in the quantity and chemical quality of drainage from most of the irrigated land. Net gains upstream from the irrigation reach consisted mainly of variable inflow from the fish hatchery and from a seepage area east of the dam. Part of that inflow is now diverted for irrigation of land outside of the district. Net gains downstream from the irrigation reach appear to be relatively unaffected by irrigation of land in the district, although withdrawals from wells adjacent to the channel between stations 338.0 and 335.5 caused minor net losses during several of the surveys. The wells may intercept return flow from the district.
To illustrate the effect of irrigation on flow downstream from the district, exclusive of variable releases from the fish hatchery, net gains are expressed in figure 28 as percentages of total inflow to the river between the fish hatchery 355.8 and station 334.4. Net gains to the irrigation reach ranged from 30 percent in 1964 to about 80 percent in 1967 when rainfall during part of the growing season far exceeded consumptive use (table 2). From 1968 to 1971, when irrigation water was used more efficiently, the percentage remained relatively constant between 62 and 69 percent.
Figure 28--Net gains in the irrigation reach, expressed as percentages of the total inflow to the Smoky Hill River between Cedar Bluff Fish Hatchery (355.8) and station 334.4.
Progressive increases in net gains in the spring of each year, when excess water from the preceding season had ample time to drain from the aquifer, attest to the increase in storage shown by higher water levels adjacent to the irrigated area during successive years. The lowest net gain in the irrigation reach was 1.4 cfs (0.04 m3/s) in October 1964. After 1967, net gains fluctuated between 8 and 10 cfs (0.23 and 0.28 m3/s) in the spring and fall, respectively, as the water table fell and rose in response to drainage from and infiltration of excess water to ground-water storage.
Net gains consist of measured tributary inflow plus net seepage. Tributary inflow, measured near the mouths of the streams, consisted mainly of effluent ground water from the aquifer underlying the high terrace. Net seepage gains represent unmeasured inflow mainly from small tributaries and from marshy areas along the river. Net seepage losses represent unmeasured withdrawals, including evapotranspiration, from the main stem or from tributaries downstream from data-collection sites. Measured withdrawals during the surveys in the spring and fall were negligible.
In April 1964, seepage was the major component of net gains of less than 2 cfs (0.06 m3/s). As more water was applied to more acres, the ground-water reservoir under the high terrace filled and tributary inflow comprised a generally increasing percentage of net gains in the irrigation reach (fig. 28). Flow of natural tributaries increased and was augmented by drains constructed in marshy areas that formerly contributed to seepage.
Net seepage gain or loss between adjacent stations probably is significant only when the magnitude exceeds about 6 percent of the measured discharge at the terminus of the reach. The effect of small errors in measurement and analysis at each station is minimized when seepage gains and losses are accumulated in a downstream direction. Cumulative net seepage losses of nitrate in the irrigation reach were recorded during all surveys, but gains were recorded for water and all the other major ions. Net seepage losses of nitrate that ranged from about 45 to 53 kilograms of nitrate (10 to 12 kg of nitrogen) per day during the last four surveys would represent improbably high unmeasured withdrawals of about 21 acre-feet (10.6 cfs-days) per day of river water at the observed concentration of about 2 mg/l. The losses probably represent interception and consumption of much smaller quantities of highly concentrated tributary inflow by abundant phreatophytes on the flood plain as well as withdrawal of nitrate by aquatic plants.
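The report's consistency check, an improbably large withdrawal implied by the nitrate loss, can be reproduced with a simple unit conversion. The loss and concentration values follow the text; the function name and the rounded conversion factor are our assumptions:

```python
ACRE_FOOT_LITERS = 1.2335e6  # 1 acre-foot is about 1,233,500 liters

def implied_withdrawal_acre_ft(loss_kg_per_day, conc_mg_per_l):
    """Volume of river water (acre-feet per day) that would have to be
    withdrawn to account for an ionic loss at a given concentration."""
    liters_per_day = loss_kg_per_day * 1.0e6 / conc_mg_per_l  # kg -> mg
    return liters_per_day / ACRE_FOOT_LITERS

# Nitrate losses of 45-53 kg/day at about 2 mg/l imply withdrawals on
# the order of 18-21 acre-feet of river water per day.
```

The magnitude of the implied withdrawal is what makes direct removal of river water improbable and points instead to interception of more concentrated tributary inflow.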
Figure 29--Specific conductance, sodium-adsorption ratio, and magnitude and composition of net gains in the irrigation reach during seepage-salinity surveys, 1964-71.
Although the maximum net gain in water discharge to the irrigation reach after a growing season was measured during the survey in October 1969, the net gain in chemical discharge increased progressively from 1964 to 1971 (fig. 29). However, the rate of annual increase in chemical discharge diminished from year to year.
As shown in figure 29, the percent composition of the ionic load in net gains remained relatively constant after 1965. Calcium and sulfate were the predominant ions, but the percent sulfate was much lower than in the irrigation water. After 1965, the percent sulfate in net gains and in the irrigation water generally increased, with a corresponding decrease in the percentages of the other ions. However, the percentages of sodium and chloride in net gains remained higher than the percentages in the irrigation water. Significant relations between the ions are readily described in terms of concentrations and concentration ratios.
The specific conductance of water discharge in the irrigation reach increased from 880 to 900 micromhos per centimeter at 25°C in the fall of 1964. In the fall of 1971, the specific conductance in the same reach increased from 1,070 to 1,230 micromhos per centimeter at 25°C. The concentration ratio for specific conductance increased from 1.01 in 1964 to 1.15 in 1971.
The concentration in milliequivalents per liter of the ions in net gains was calculated by dividing the net gains in chemical discharge shown in figure 29 by the product of the net gains in water discharge and the unit constant 2.446. The concentrations of the ions in net gains, net seepage, and tributary inflow during each of the surveys are shown in table 8; progressive changes are illustrated graphically in figure 30. Because concentrations of nitrate were very low and the range of concentration ratios was very wide, it was not practical to include nitrate in figure 30.
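That conversion simply inverts the load equation; a one-line sketch (the function name is illustrative):

```python
def conc_in_net_gain(load_keq_per_day, q_cfs, k=2.446):
    """Concentration (me/l) implied by a net gain in ionic load
    (keq/day) and the corresponding net gain in water discharge
    (cfs): C = L / (K * Q), inverting L = K * Q * C."""
    return load_keq_per_day / (k * q_cfs)
```

A net gain of 122.3 keq/day carried by a 10-cfs net gain in water discharge corresponds to a concentration of 5 me/l.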
Table 8--Quantity and chemical quality of net gains, tributary inflow, and net seepage in the irrigation reach 350.0-338.0 during seepage-salinity surveys, April 1964-November 1971.
[Table 8 column headings: discharge, in cfs; dissolved solids (residue at 180°C); boron (B); phosphate (PO4); pH; calcium (Ca); magnesium (Mg); sodium (Na); potassium (K); bicarbonate (HCO3); sulfate (SO4); chloride (Cl); nitrate (NO3). Table values not recovered.]
1 CR = Concentration ratio--Ratio of concentration or property in net gain to that in irrigation water.
3 Reach 348.9-340.3.
4 Surveys made during irrigation season.
The specific conductance and the concentrations of most of the major ions in net gains generally increased progressively after 1964, but were not directly related to the rate of water discharge. Anomalously high concentrations were recorded for the survey in April 1964, when small net gains consisted chiefly of seepage. In October 1964, sharply lower concentrations in net gains that consisted mainly of tributary inflow (fig. 28) probably reflect the first arrival of excess water from irrigation of land on the high terrace. The concentrations of most of the ions in seepage (excluding sodium) exceeded the concentrations in tributary inflow, possibly as a result of transpiration of water by phreatophytes and evaporation of water from marshy areas near the river. Despite the paucity of sodium, the possibility cannot be discounted entirely that seepage is augmented by saline water from abandoned wells that tap artesian aquifers at depth.
The relatively constant concentration ratios for calcium and sulfate suggest that the concentrations of those ions in net gains and in the irrigation water are related. The ratios for calcium were similar to those for the specific conductance, which increased from 1.07 in November 1965 to 1.39 in November 1970, but the ratios for sulfate were less than one. Low ratios for sulfate, magnesium, and potassium indicate dilution, organic removal, or precipitation of the salts of those ions. Some of the magnesium and potassium may have replaced sodium in the soil by ion exchange.
Disproportionately high concentration ratios for sodium and chloride indicate a source for those ions that is supplementary to the irrigation water. Although the concentrations continued to increase through 1971, the trends in the ionic loads shown in figure 29 and in the ratios shown in figure 30 suggest depletion of that source. The ratios of the concentration of sodium to the concentration of chloride (in me/l) generally exceeded one; the sodium to chloride ratio in oil-field brine normally is less than one (Leonard, 1972). Recently produced brine does not seem to be the principal source of the sodium and chloride unless it has become enriched with sodium displaced from the soil by calcium and magnesium from the irrigation water. The ratios seem to substantiate the evidence given in preceding sections that naturally accumulated salts in the soil are the common source.
Figure 30--Concentrations and concentration ratios of ions in net gains in the irrigation reach during seepage-salinity surveys, 1964-71.
Concentrations of the other ions appear to have reached a temporary equilibrium with the environment at concentrations about equal to or less than those predicted from consumptive use (table 2). If meteorologic conditions, quality of the irrigation water, and management of the district do not change drastically, the specific conductance and concentrations of sodium and chloride in low flow should not significantly exceed the concentrations of ions in transient storage in the ground water or the maximums shown in figure 30 for extended periods of time as a result of irrigation.
Pesticides and Non-Silicate Trace Elements in Low Flow
Seven sets of samples were collected during periods of low flow at stations 355.8, 352.8, 347.4, 340.3, and 334.4 (pl. 1) between August 1964 and November 1967 for determination of pesticides and carbon extractables. Qualitative and quantitative analyses for halogenated, sulfur-containing phosphatic organic, and phenoxy-type pesticides, plus carbon-chloroform extractables (CCE) and carbon-alcohol extractables (CAE), were performed on 1-gallon (3.8-l) samples of river water and on packed granular carbon columns representing 100-gallon (380-l) volumes of river water. Analysis was by electron-capture gas chromatography and microcoulometric-titration gas chromatography at the Southeast Water Laboratory of the Environmental Protection Agency at Athens, Georgia. No identifiable pesticides were detected at low flow (Leonard and Stoltenberg, 1972, table 10); however, some pesticides may be carried from the irrigated area to the river by surface runoff resulting from heavy rainfall.
Only trace levels (less than 0.1 part per billion) of chlorinated hydrocarbons were reported by Knutson and others (1971, p. 25) in samples of water from the irrigation canal, Cedar Bluff Reservoir, the Smoky Hill River, and in runoff from plot 1 collected between 1966 and 1969. Higher concentrations (up to 13 parts per billion) of dieldrin were found in the suspended sediment in surface runoff from plot 1. During 1969, no residues were detected in silt or water from either the reservoir or the river, perhaps as a result of the decreased use of chlorinated hydrocarbons.
Table 9--Non-silicate trace element content of composited samples of bed material, suspended sediment, and water at sites 338.0, 338.5, and 339.1 on the Smoky Hill River, April 4, 1971. [Analyses by E. A. Jenne, U.S. Geological Survey.]
[Table 9 column headings: type of sample; particle size; concentration, in micrograms per gram1 (bed material and suspended sediment) and in micrograms per liter (water); bed-material cores sampled at 0-0.5 inch and 0-4 inches. Table values not recovered.]
1 Oven dried at 110°C.
2 Filtered through Gelman 0.1-micron membrane. Soluble organic material combusted by ultraviolet radiation prior to analysis.
3 Suspended-sediment concentration 25 mg/l.
Table 10--Average specific conductance and concentrations of selected ions in discrete samples from stations on the Smoky Hill River during water years 1966-71. [Samples collected when discharge <15 cfs at station 352.8, <20 cfs at station 345.5, and <30 cfs at station 334.4]
[Table 10 column headings: constituents; number of analyses at stations; mean at stations, in milligrams per liter; standard deviation at stations, in milligrams per liter. Rows include specific conductance (micromhos/cm at 25°C) and sodium-adsorption ratio (SAR). Table values not recovered.]
1 Concentrations of boron given in micrograms per liter.
On March 26, 1971, when the water discharge of the Smoky Hill River adjacent to the district was representative of the low flow that is exceeded about 60 percent of the time, samples of water, suspended sediment, and bed material were collected at three sites (338.0, 338.5, and 339.1). The non-silicate trace element content of the composited samples (table 9) was determined by E. A. Jenne, U.S. Geological Survey (1971). The analyses indicate that appreciable quantities of trace elements are associated with sediment that would be incorporated in streamflow during periods of storm runoff. Anomalously high concentrations of copper in the suspended sediment probably were derived from copper-bearing algicides used in the canal. The chemistry of direct runoff from the district warrants additional study.
Changes in Chemical Quality of Streamflow Adjacent to the District
Data collected for the seepage-salinity surveys under idealized conditions during only 6 days are presumed to represent changes in the quantity and chemical quality of low flow during a period of more than 7 years. Efforts were made to standardize the data at the time of collection and analysis, but the effects of short-term fluctuations during the surveys and unmeasured variations could obscure significant relations. To describe the quality of flow in the river over a wider range of conditions, samples were collected for chemical analysis at monthly or more frequent intervals at five of the stations during water years 1966-71. Discharge measurements accompanied most of the samples.
In addition to the river mile designation used for stations in this report, some stations have been assigned a number (shown in parentheses) in downstream order according to the identification system used by the U.S. Geological Survey. Records for the Smoky Hill River near Schoenchen 334.4 (0686270) and for the Smoky Hill River at Cedar Bluff Dam 355.9 (0686200) are published in the series of annual reports "Water Resources Data for Kansas." Both stations are equipped with continuous stage recorders. A graphic conductivity recorder was maintained at the downstream station during water years 1966-70. Records for the two intermediate project stations 352.8 (0686210) and 345.5 (0686220) and for the fish-hatchery outfall 355.8 are included in the basic-data report (Leonard and Stoltenberg, 1972).
During infrequent periods of high runoff from the 220 square miles (570 km2) of drainage area to the reach of the river between the dam and station 334.4, contributions from the approximately 6,000 acres (2,400 ha) of irrigated land normally comprised an insignificant part of streamflow in the reach. The effects of storm runoff, accumulated salts flushed from the land surface outside of the irrigated area, and releases from the dam to the river far outweighed the effects of irrigation on the chemical quality of high flow.
Statistical analysis of the data for discrete samples from stations along the main stem showed that omission of a small number of samples collected at high rates of discharge drastically reduced the standard deviations from the means and the standard errors of estimate for linear and polynomial regressions relating discharge, specific conductance, and concentrations of the ions. The statistics shown in table 10 and related illustrations represent samples collected at rates of discharge less than 15, 20, and 30 cfs (0.42, 0.57, and 0.85 m3/s) at stations 352.8, 345.5, and 334.4, respectively. Statistics for station 355.9, at Cedar Bluff Dam, are included in table 10, but are generally omitted from the illustrations because the quality of water is similar to the irrigation water. The mean values shown for specific conductance and discharge probably are higher and lower, respectively, than the actual values, but they represent conditions that prevailed during more than 80 percent of the time.
In general, the mean specific conductance increased from year to year at each of the stations and in a downstream direction. The specific conductance of natural streamflow normally varies inversely with the rate of water discharge as a result of dilutant rainfall. However, the relations of specific conductance to water discharge shown in figure 31 indicate that the specific conductance (and concentration of dissolved solids) corresponding to a given rate of discharge increased from year to year. To minimize the effect of the progressive increase in the specific conductance of the irrigation water, the values shown in the figure are expressed as concentration ratios to the specific conductance of the irrigation water during the preceding irrigation season. At the downstream station 334.4, the ratios increased from 1.08 in 1966 to 1.25 in 1971, slightly less than the ratios for net gains in the irrigation reach during the seepage-salinity surveys.
Figure 31--Relations of the average specific conductance (expressed as concentration ratios) to the corresponding water discharge for discrete samples at three stations on the Smoky Hill River for water years 1966-71.
The concentration ratios for calcium remained relatively constant and similar to the ratios for specific conductance at all three stations (fig. 32). The ratios for sulfate remained relatively constant from year to year at each station and decreased in a downstream direction to values less than one. The concentration ratios for sodium and chloride increased in a downstream direction to disproportionately high values with respect to specific conductance. The concentration ratios representing the mean concentration of chloride increased progressively through 1971, but the concentration ratios for sodium were exceeded during several preceding years. Changes in the other ions are shown in table 10.
Figure 32--Average concentration ratios for selected ions in discrete samples at three stations on the Smoky Hill River during water years 1966-71.
Despite minor discrepancies attributable to sampling over a wider range of discharge, the recorded changes substantiate the results of the seepage-salinity surveys. Inflow between the three stations varied widely in quantity and quality during each year. During periods of low to moderate flow, drainage from the irrigation district apparently was the major factor that determined the rate and quality of flow at station 334.4 near Schoenchen.
Changes in Quantity and Quality of Streamflow Downstream from the District
Results of analyses of samples collected at semimonthly or more frequent intervals were combined with continuous records of stage and specific conductance to relate variations in chemical quality to water discharge and to time at station 334.4. Continuous specific-conductance records were available for water years 1966-70. A specific-conductance record for 1971 was synthesized by combining the specific conductance of samples collected at weekly or more frequent intervals with the continuous discharge records. Simulated records calculated from similar data collected during preceding years compared favorably with the actual records except during periods of extremely high discharge.
Relations of daily mean specific conductance to daily mean discharge and time (fig. 33) show that successively higher values of specific conductance generally prevailed for similar periods of time during water years 1966-71. Except during the dry 1968 water year, the specific conductance of streamflow exceeded the specific conductance of the irrigation water for progressively longer periods ranging from 82 percent of the 1966 water year to 97 percent of the 1971 water year. During the same periods, the water discharge was less than about 30 cfs (0.85 m3/s), the rate at which streamflow was evidently diluted by overland runoff or releases from the reservoir. Only during the 1966 and 1967 water years, when substantial releases were made to the river from the reservoir, did the daily mean discharge exceed 30 cfs (0.85 m3/s) more than 10 percent of the time. Largely because drainage from the irrigated land sustained base flow, the daily mean discharges that were equalled or exceeded 90 and 50 percent of the time remained relatively constant.
Figure 33--Daily mean specific conductance and daily mean discharge of the Smoky Hill River at station 334.4 near Schoenchen equalled or exceeded 90, 50, and 10 percent of the time during water years 1966-71.
Relations between specific conductance, concentrations of the ions, and discharge were poorly defined for the relatively few samples collected when the discharge exceeded 30 cfs (0.85 m3/s), because the high rates of discharge were normally associated with storm runoff. During each rise, the specific conductance corresponding to a given rate of discharge generally was higher than during the subsequent recession. Linear relations of the specific conductance of discrete samples to the corresponding rates of discharge equal to or less than 30 cfs (0.85 m3/s) (fig. 34) illustrate that the specific conductance (and concentration of dissolved solids) corresponding to a given rate of discharge increased progressively for water years 1966-71 despite random variations in precipitation. Although the coefficients of correlation are low, the maximum standard error of estimate is only about 6 percent of the corresponding mean specific conductance for each water year.
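The low-flow regression described above might be set up as follows. This is a sketch only: the 30-cfs cutoff and the standard error of estimate follow the text, while the data handling and names are our assumptions:

```python
import numpy as np

def fit_sc_vs_discharge(discharge_cfs, spec_cond, q_max=30.0):
    """Linear least-squares fit of specific conductance to discharge,
    restricted to low-flow samples (discharge <= q_max).
    Returns slope, intercept, and the standard error of estimate."""
    q = np.asarray(discharge_cfs, dtype=float)
    sc = np.asarray(spec_cond, dtype=float)
    low = q <= q_max
    slope, intercept = np.polyfit(q[low], sc[low], 1)
    residuals = sc[low] - (slope * q[low] + intercept)
    see = np.sqrt(np.sum(residuals ** 2) / (low.sum() - 2))
    return slope, intercept, see
```

Fitting one such line per water year, as in figure 34, shows the year-to-year upward shift of conductance at a given discharge.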
Figure 34--Relations of the specific conductance of discrete samples to the corresponding discharge of the Smoky Hill River at station 334.4 near Schoenchen for water years 1966-71.
During periods of stable flow, the specific conductance of discrete samples was similar to the daily mean specific conductance. The concentrations of the ions in discrete samples taken during those periods of each water year were related to the specific conductance of the samples by the least squares method using polynomial regressions of the third degree. The concentrations of the ions corresponding to the daily mean specific conductance E10, E50, and E90, shown in figure 33, were then determined from the regression equations.
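That two-step procedure (fit a third-degree polynomial of ion concentration on specific conductance, then evaluate it at the daily mean conductances E10, E50, and E90) can be sketched as follows; centering the predictor is our addition for numerical stability, and the names are illustrative:

```python
import numpy as np

def ion_conc_at_conductance(spec_cond, conc, sc_targets, degree=3):
    """Fit conc = f(specific conductance) with a polynomial of the
    given degree, then evaluate f at target conductances such as the
    daily mean values equalled or exceeded 10, 50, and 90 percent
    of the time."""
    sc = np.asarray(spec_cond, dtype=float)
    c = np.asarray(conc, dtype=float)
    mu = sc.mean()  # center the predictor to keep the fit well conditioned
    coeffs = np.polyfit(sc - mu, c, degree)
    return [float(np.polyval(coeffs, t - mu)) for t in sc_targets]
```

One such regression per ion and water year yields the synthetic analyses compiled in table 11.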
Near equivalence of the sums of the cations to the sums of the anions (in me/l) for each of the synthetic analyses (table 11) determined from independent equations for each of the ions attests to the validity of the method and results. The values for the concentrations of the ions at E50 are nearly equal to the mean concentrations of the corresponding ions calculated directly from the analyses of the samples. The specific conductance, the sodium-adsorption ratio, and the concentration ratios for the ions corresponding to E50 are shown graphically in figure 35.
Table 11--Concentrations of the ions and percent composition of the ionic load corresponding to the daily mean specific conductance equalled or exceeded 10, 50, and 90 percent of the time in the Smoky Hill River at station 334.4 near Schoenchen during water years 1966-71. [Load computed using third-degree polynomial equations that define the relation of specific conductance (independent variable) to the concentration of the ions for discharges less than 30 cfs (0.85 m3/s).]
[Table 11 column headings: specific conductance, in micromhos per centimeter; dissolved solids (DS); calcium (Ca); magnesium (Mg); sodium (Na); potassium (K); bicarbonate (HCO3); sulfate (SO4); chloride (Cl); nitrate (NO3); sodium-adsorption ratio; percent composition. Concentrations in milligrams per liter. Table values not recovered.]
Figure 35--Specific conductance, sodium-adsorption ratios, and concentration ratios of the ions that were equalled or exceeded 50 percent of the time in the Smoky Hill River at station 334.4 near Schoenchen during water years 1966-71.
Part of the general increase in the specific conductance and concentrations of the ions downstream from the district reflects similar changes in the quality of water in Cedar Bluff Reservoir. Although the concentrations of sodium and chloride ions increased appreciably as a result of irrigation, they remained well below recommended maximums for drinking water (Kansas State Board of Health, 1973). Concentrations of dissolved solids and sulfate generally exceed the recommended maximum limits in the irrigation water and in the river water downstream from the district.
Estimates of Quantity and Quality of Drainage from the District
Annual discharge at the gage near Schoenchen is a variable mixture of drainage from the irrigation district (including irrigation return flow) and a larger quantity of water of differing composition from other sources. The similarity of the curves shown in figure 35 to those shown in figure 30 for net gains indicates that drainage from the district measured during the seepage-salinity surveys continued during the rest of the year and was a major factor governing the chemical quality of water downstream. By use of simplifying assumptions, independent estimates of the quantity and quality of drainage attributable to irrigation during each year were calculated from records of streamflow in and releases to the river (net drainage), and from irrigation requirements and distribution of the irrigation water (consumptive use). Much of the data were collected as part of the normal operation and management of the reservoir and irrigation district. The results are compiled in table 12 for comparison with results from the seepage-salinity surveys.
Table 12--Calculated quantity and chemical quality of net drainage to the Smoky Hill River from Cedar Bluff Irrigation District during calendar years 1964-71. [Reach between fish-hatchery outfall (355.8) and the gage near Schoenchen (334.4).]
|Specific conductance (micromhos per centimeter at 25°C)||Concentrations, in milligrams per liter|
1 Net drainage. Streamflow at gage near Schoenchen (334.4) minus storm runoff, releases from dam to river, and waste from fish hatchery, irrigation canal, and laterals. No record before 1966.
2 Seepage-salinity survey. Net inflow to reach. Average of spring and fall surveys.
3 Consumptive use. Excess water (Ax) = inflow = excess rainfall (Pa - Pe) plus excess irrigation water (I - R) = Pa + I - C. (See table 2 and figure 9.)
4 Specific conductance of net drainage, Ex = ΣEiN/ΣXn, where Ei is the specific conductance of the irrigation water (see table 2).
Estimates based on records of streamflow and releases to the river
Streamflow at the gage near Schoenchen (334.4) consists of a variable high-flow component and a relatively stable low-flow component that prevailed most of the time. The variable component consists mainly of storm runoff and direct releases to the river from Cedar Bluff Reservoir. Flow is mostly regulated by the dam, but the maximum discharge recorded near Schoenchen since the recording stage gage was installed in July 1964 was 20,400 cfs (580 m3/s) on June 14, 1970, when discharge recorded at the dam (355.9) was less than 5 cfs (0.14 m3/s). Because the scope of the project did not include collection of data adequate to measure the quantity and quality of storm runoff from the irrigated area, which represents less than 5 percent of the drainage area, efforts were confined to evaluation of the stable component of streamflow.
The stable component consists mainly of subsurface drainage of excess water from the irrigated cropland, augmented by seepage from the dam, fish-hatchery ponds, canal and laterals, bank storage, and a small amount of natural ground-water runoff not directly associated with irrigation. During the growing season, natural streamflow was depleted by evaporation, by transpiration from phreatophytes and other vegetation that proliferated along draws, tributaries, and the river channel in response to the increased availability of water, and by withdrawals for irrigation. Depending on magnitude and timing, measured releases from the fish hatchery and waste from the canal and laterals augmented both stable and variable components.
Releases from the dam and fish hatchery were subtracted from the discharge measured at the gage near Schoenchen (334.4) for corresponding 5-day intervals during the period of record, 1965-71. The cumulation of differences against time (ΣQ) shown in figure 36 is a series of straight-line segments representing periods of relatively stable low flow separated by intervals during which the discharge fluctuated rapidly as a result of storm runoff. The slope of each of the linear segments presumably represents the rate of inflow to the reach between the fish hatchery and the gage. If the information collected during the seepage-salinity surveys applies to conditions that prevail during the remainder of each year, the inflow consisted chiefly of effluent ground water and waste from the irrigated area (the combination that conforms to the generally accepted definition of irrigation return flow). The hypothesis would be substantiated if the chemical quality of the inflow minus waste (hereafter termed net drainage, Xd) determined indirectly from the records at the gage near Schoenchen were similar to the net gains in the same reach during the seepage-salinity surveys.
Figure 36--Components of cumulative discharge of the Smoky Hill River at station 334.4 near Schoenchen, 1965-71.
By assuming that the average rate of ground-water inflow during each period of major fluctuation in discharge was equal to the mean of the rates prevailing before and after the period, cumulative net drainage (ΣXd) was calculated for the period of record. To illustrate the calculation graphically, successive linear segments were extended and translated vertically to form the relatively smooth curve that represents cumulative net drainage to the river with storm runoff, releases from the dam and fish hatchery, and canal and lateral waste (overflow) removed (fig. 36). The average daily net drainage ranged from 10.2 cfs (0.29 m3/s) in 1966 to 15.1 cfs (0.43 m3/s) in 1969. High values in 1965 and 1969 may reflect above-average rainfall during those years. For all years, the values were lower than Q50 (fig. 34), but some were nearly equal to net gains measured for the reach between the fish hatchery outfall (355.8) and the gage near Schoenchen (334.4) during the seepage-salinity surveys (table 12).
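The bookkeeping described above can be sketched as follows. The 5-day mean discharges are hypothetical, not data from station 334.4; the essential step is bridging each storm-affected interval with the mean of the stable inflow rates before and after it.

```python
# Sketch of the net-drainage calculation, using hypothetical 5-day mean
# discharges in cfs (not data from station 334.4).
gaged = [22.0, 21.5, 180.0, 95.0, 23.0, 24.0]    # gage near Schoenchen
released = [10.0, 10.0, 10.0, 10.0, 10.0, 10.0]  # dam plus fish hatchery
stable = [True, True, False, False, True, True]  # storm-free interval?

# Net gain during each stable interval; None where storm runoff interferes
gains = [g - r if ok else None
         for g, r, ok in zip(gaged, released, stable)]

# Bridge each storm-affected interval with the mean of the stable rates
# prevailing before and after the period of fluctuation
filled = list(gains)
for i, g in enumerate(gains):
    if g is None:
        before = next(gains[j] for j in range(i - 1, -1, -1)
                      if gains[j] is not None)
        after = next(gains[j] for j in range(i + 1, len(gains))
                     if gains[j] is not None)
        filled[i] = 0.5 * (before + after)

# Cumulative net drainage (ΣXd), in cfs-days (5 days per interval)
cum_drainage = 5 * sum(filled)
```

Graphically, this is equivalent to extending the linear segments of the ΣQ curve and translating them vertically, as described for figure 36.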
The chemical discharge (or load) of the major ions that passed the gage during the periods of relatively stable flow, shown in figure 36, was calculated for each calendar year from the daily mean water discharge, the daily mean specific conductance, and regression equations relating the specific conductance to the concentrations of the ions in discrete samples. Digital-computer programs developed specifically for this phase of the project (written commun., D. Maddy, 1971) facilitated the lengthy calculations. The chemical discharge in measured releases from the dam, fish hatchery, and wasteways during the same periods was evaluated readily because the chemical quality was nearly the same as that of the irrigation water for each year (table 3). The differences between the chemical and water discharge at the station and in the releases and waste during the same periods are assumed to represent the chemical and water discharge of net drainage for those periods. The specific conductance and concentrations of the major ions in net drainage (table 12) were calculated from the differences.
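A minimal sketch of the daily-load computation follows. The cubic coefficients and daily values are hypothetical, not the equations fitted for station 334.4; the conversion tons per day = Q (cfs) x C (mg/L) x 0.0027 is the standard one for this combination of units.

```python
# Hypothetical third-degree polynomial relating specific conductance K
# (micromhos/cm) to an ion concentration (mg/L): a0 + a1*K + a2*K^2 + a3*K^3
COEF = (12.0, 0.30, 1.5e-5, -2.0e-9)

def concentration(cond):
    """Ion concentration (mg/L) from daily mean specific conductance."""
    a0, a1, a2, a3 = COEF
    return a0 + a1 * cond + a2 * cond ** 2 + a3 * cond ** 3

def daily_load_tons(q_cfs, cond):
    """Chemical discharge, in tons per day (0.0027 converts cfs x mg/L)."""
    return q_cfs * concentration(cond) * 0.0027

# Annual load: sum of daily loads over the periods of relatively stable flow
daily = [(12.0, 2000.0), (11.5, 2100.0), (13.0, 1950.0)]  # (Q, conductance)
annual_load = sum(daily_load_tons(q, k) for q, k in daily)
```

The same accounting, applied to the measured releases and waste and subtracted from the totals at the gage, yields the chemical discharge of net drainage.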
Although the calculated values for specific conductance and concentrations of the ions in net drainage were derived by indirect methods using relatively imprecise data collected during the entire year, they are remarkably similar to the values for net gains between the fish hatchery (355.8) and the gage near Schoenchen (334.4) that were based on direct measurements during seepage-salinity surveys representing periods of one or two days under idealized conditions.
Evidently the results of the seepage-salinity surveys approximately represent conditions of low flow throughout the year. On the other hand, unless the more detailed information provided by the surveys is required, changes in the quantity and chemical quality of drainage from the irrigated areas can be estimated from continuous records at the downstream site supplemented with data collected as part of the normal operation of the reservoir and irrigation district.
Estimates based on irrigation requirements and distribution of the irrigation water
Theoretically, the quantity and quality of net drainage could be estimated from the consumptive use and distribution of precipitation and irrigation water (table 2) and the quality of the irrigation water (table 3) if there were no other sources of the ions or chemical reactions between the ions in the soils and aquifers. Results of the early seepage-salinity surveys justify the assumption that without irrigation there would be virtually no net drainage.
As shown by the cumulative curves in figure 36, net drainage for most years was nearly equal to the sum of the excess rainfall and irrigation water (Ax, table 2) on the irrigated acreage. Annual values for Ax are used for "inflow" in table 13. The empirical relation may be useful for prediction, but the implication that net drainage, including dissolved salts, originated only in the irrigated acreage is an oversimplification. Cumulative deliveries of irrigation water to the farms (ΣI) (table 2) were nearly equal to cumulative net drainage (ΣXd) (fig. 36) for the period 1965-71, but the similarity is probably coincidental because the annual values differed widely.
Excess input (Xn, table 2) to the system probably represents a more realistic estimate than Ax of actual recharge to the aquifers from which effluent ground water sustains net drainage. As the ground-water reservoir filled from 1965 to 1968, excess input (Xn) far exceeded net drainage. By the end of 1971, the cumulative excess input (ΣXn) exceeded cumulative net drainage (ΣXd) by less than 3 percent (fig. 36). If drainage from about 660 acres (270 ha) of land outside of the district that were irrigated by estimated diversions of about 870 acre-feet (1.1 hm3) per year (records of Kansas State Department of Agriculture) were added to cumulative net drainage, the difference would be even smaller. Therefore, ΣXn appears to be a reasonable estimate of ΣXd.
The specific conductance (Ex) and the concentrations of the ions in the excess input (Xn) were calculated using the expression
Ex = (ΣEiN) / (ΣXn)
as shown in table 2, and analogous expressions using the concentrations of the ions in the irrigation water. The specific conductance and the concentrations of dissolved solids and calcium were remarkably similar to corresponding values determined by the other two methods (table 12). However, the relatively low predicted concentrations of sodium, chloride, bicarbonate, and nitrate substantiate the hypothesis that the higher concentrations of those ions in measured inflow to the river were derived from supplementary sources.
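The flow-weighted calculation above can be sketched with hypothetical annual values (not those in table 2): each year's conductance of the irrigation water is weighted by the net input, and the total is divided by the cumulative excess input.

```python
# Flow-weighted specific conductance of the excess input, Ex = ΣEiN / ΣXn,
# using hypothetical annual values, not the actual table 2 data.
Ei = [1150.0, 1200.0, 1180.0]    # conductance of irrigation water, by year
N = [8000.0, 9500.0, 9000.0]     # net input of irrigation water, acre-feet
Xn = [9000.0, 11000.0, 10000.0]  # excess input, acre-feet

Ex = sum(e * n for e, n in zip(Ei, N)) / sum(Xn)
```

Analogous expressions, with ion concentrations substituted for Ei, give the predicted concentrations of the individual ions in the excess input.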
Comparison of net input and net drainage
Lack of adequate data describing overland runoff precluded evaluation of a comprehensive salt balance for the district; however, the similarities described above indicate that the estimates of the quantity of water and salts in net drainage (Xd) (fig. 36) can be related with some degree of confidence to corresponding quantities in net input of irrigation water (N). Because most of the infiltrated water probably remained in the aquifers longer than one year, relations of cumulative input to cumulative drainage (fig. 37) probably are more significant than relations between annual values.
Figure 37--Cumulative loads of selected ions in cumulative net drainage (ΣXd) expressed as percentage of the corresponding ions in cumulative net input (ΣN) of irrigation water, 1966-71.
Cumulative net drainage of water was equal to about 60 percent of cumulative net input of irrigation water, but the percentages representing the various ions differed widely. As shown by the parallel configuration of the curves, the fluctuation in chemical discharge generally depended mainly on the water discharge. If all of the ions in net drainage were derived from the irrigation water, and there were no significant losses of ions to (or contributions from) the soil, aquifers, or storm runoff, the curves for all of the ions would coincide and approach 100 percent.
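The percentage curves of figure 37 amount to the following calculation, shown here with hypothetical annual loads: cumulative drainage load divided by cumulative input load, year by year.

```python
# Cumulative drainage load expressed as a percentage of cumulative input
# load, as plotted in figure 37; the annual loads (tons) are hypothetical.
from itertools import accumulate

input_load = [500.0, 650.0, 600.0, 700.0]  # ion load in net input, by year
drain_load = [450.0, 700.0, 640.0, 710.0]  # ion load in net drainage, by year

cum_in = list(accumulate(input_load))
cum_out = list(accumulate(drain_load))
percent = [100.0 * out / inp for out, inp in zip(cum_out, cum_in)]
```

A curve that climbs above 100 percent, as in this example, corresponds to an ion such as chloride for which net drainage exceeded deliveries in the irrigation water.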
During the entire 1966-71 period, the chloride in net drainage exceeded the chloride in deliveries of the irrigation water, whereas the net input of magnesium, potassium, and sulfate in the irrigation water exceeded net drainage. More sodium, bicarbonate, and nitrate were carried out of the system in net drainage than were delivered in the irrigation water. The configuration of the curves shown in figure 37 substantiates the evidence from changes in the chemistry of the soil, ground water, and net gains during seepage-salinity surveys that net drainage consists mainly of effluent ground water containing accumulated sodium, chloride, and nitrate leached from the soil and underlying aquifers by excess irrigation water. Part of the calcium, magnesium, and sulfate carried by the irrigation water was transported to the river in overland runoff, and part remains in the soil or ground-water reservoir. A relatively small part of the ionic load, including potassium, probably was consumed by vegetation.
With continued irrigation under existing conditions, exchangeable cations in the soil and irrigation water will reach unstable equilibrium, the soluble salts that accumulated in the soil and aquifers before irrigation will become depleted, and most of the altered ground water containing the leachate will have been displaced to the river. The quality of drainage from the irrigated land would then depend almost entirely on the quality and rate of application and consumptive use of the applied irrigation water. Unless conditions change drastically, the composition of ground water under the irrigated area and drainage from the irrigated land eventually should become nearly uniform and similar in composition to the applied water; however, the concentrations of the major ions should be from 1.5 to 2.0 times higher. The analyses of the soils and well waters show that significant quantities of soluble salts remain in transient storage, but that, with the exception of nitrate from fertilizer, they pose no significant threat to the downstream user.
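The 1.5 to 2.0 concentration factor cited above is consistent with simple mass-balance arithmetic: if a fraction f of the applied water ultimately leaves as drainage and the dissolved salts are conserved, concentrations in the drainage are multiplied by roughly 1/f. The value of f below is taken from the roughly 60-percent cumulative ratio noted earlier; the calculation is illustrative only.

```python
# Concentration factor under steady-state conditions, assuming salts are
# conserved and a fraction f of the applied water returns as drainage.
f = 0.60                        # cumulative drainage / net input, from above
concentration_factor = 1.0 / f  # expected ratio of drainage to applied water
```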
Kansas Geological Survey, Geohydrology
Placed on web Nov. 2012; originally published 1975.