Elevation & Flow Data: Technology is great … when it works
Technology is great … when it works the way it’s supposed to.
Case in point: visitors to Central’s “Reservoir/River Data” web page may have noticed “anomalies” in some of the graphs. In particular, the graph depicting recent inflows to Lake McConaughy shows a sudden spike; flows coming into the reservoir – according to the graph – jumped from around 1,200 cubic feet per second (cfs) to more than 7,500 cfs.
No, there wasn’t a cloudburst above Lake McConaughy and, no, there wasn’t a sudden release of large volumes of water from upstream reservoirs. The spike was caused by cold temperatures and ice in the North Platte River that interfered with gauging station equipment and the ability to accurately measure flows in the river.
Similarly, the elevation graph for Lake McConaughy shows a sudden and dramatic drop in the reservoir’s water level – almost 30 feet – that, we can assure you, did not actually happen. No, the dam didn’t break and there’s not a huge volume of water surging down the river.
Again, there was a problem with the data collection equipment that resulted in the generation of inaccurate graphs.
Now, you might think it would be a fairly simple matter to correct the data displayed on the graphs, but it’s more complicated than that because of how the graphs are populated with data. In the past, all data was manually keyed into a spreadsheet and table. That data was used to create the graphs, which were then manually uploaded to the server for display on the web page. It was easy to recognize when “bad” data was reported, to confirm that the data was indeed “bad,” and to input correct data.
When the “Reservoir/River Data” page was automated earlier this year, the task the programmer faced was how to pull together data from multiple sources – gauges maintained by the U.S. Geological Survey, the Nebraska Department of Natural Resources, and Central’s own supervisory control and data acquisition (SCADA) system – into a cohesive form, and then to code the information to automatically generate the graphs that appear on the web page.
However, the “bad” data is already recorded and stored in the database that the automated system queries to populate the table and graphs on the web page. In some cases, because the data is not compiled by Central, it is difficult to change the source data (which is archived in the source database). Correcting it requires a manual override by Central personnel on a daily basis – a somewhat time-consuming task that was supposed to become unnecessary once the page was automated.
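One common way to handle this kind of problem – not necessarily the approach Central will adopt – is to flag implausible jumps automatically before the graphs are drawn, leaving the archived source data untouched. A minimal sketch in Python (all names and thresholds hypothetical):

```python
# Hypothetical sketch: flag gauge readings whose jump from the last
# trusted sample exceeds a plausibility threshold, rather than
# altering the archived source database.

def flag_spikes(readings, max_step):
    """Return a list of (value, suspect) pairs in chronological order.

    readings: gauge values in time order (e.g., inflow in cfs)
    max_step: largest change between samples considered plausible
    """
    flagged = []
    last_good = None
    for value in readings:
        # A reading is suspect if it departs too far from the last
        # value we accepted as good.
        suspect = last_good is not None and abs(value - last_good) > max_step
        if not suspect:
            last_good = value
        flagged.append((value, suspect))
    return flagged

# Example: inflows near 1,200 cfs with one icy-gauge spike to 7,500 cfs.
inflows = [1180, 1210, 1195, 7500, 1205]
result = flag_spikes(inflows, max_step=2000)
# The 7,500 cfs reading is marked suspect; the others pass.
```

A graphing routine could then plot only the non-suspect points (or interpolate across the flagged ones), so an ice-affected gauge no longer produces a 30-foot cliff on the chart while the raw record stays intact for later review.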
So, the upshot is that we’re working on a way to resolve the issue that typically arises when winter weather conditions interfere with gauge function. Until a solution can be found, don’t get too excited by sudden sharp spikes – up or down – in the data reported in the table or that appears on the graphs. If something unusual does occur, rest assured that we’ll let you know.