
BOLTON  Last year came a diagnosis of Lake George water quality, with 30 years of monitoring the lake’s chemistry told through a 72-page report.

This year at the Fund for Lake George annual meeting at the Sagamore resort, a crowd of roughly 170 caught glimpses of the computer modeling, built on a deep body of research, that is helping shape a science-based treatment plan for the lake.

The report released a year ago detailed data from 1980 to 2009 and found a 6 percent decline in water clarity and a tripling of salt concentrations since 1980.

The report identified invasive species, salt increases and declines in water clarity as the top three problems in a lake that is healthy overall.

The Jefferson Project’s high-definition computer models offer an unprecedented look at specifics, like a new model that shows which roadways contribute the most deicing salt runoff to the lake.

Researchers are generating high-definition computer models of salt runoff and other watershed data, such as water circulation and weather. The models will be displayed in unprecedented detail for scientists at the Darrin Fresh Water Institute Visualization Lab.

“We are now pressing the boundaries in this area,” said engineer Harry Kolar from IBM Research, associate director of the Jefferson Project.

The Jefferson Project is a multimillion-dollar undertaking through a collaboration between The Fund for Lake George, Rensselaer Polytechnic Institute and IBM Research. It embodies the three “pillars” outlined at the meeting as a blueprint for protection: partnership, innovation and investment.

The difference between the models available last year and this year is stark. The Jefferson Project has 14 sensor platforms deployed, collecting “10,000 times” the data gathered over 30 years for last year’s water quality study, said John E. Kelly, an IBM vice president.

“The amount of data we’re collecting on this lake cannot be analyzed in spreadsheets or any other way,” Kelly said.

As another example of the detail, the new circulation model is now calculated for 20 layers within the lake, not just the top, bottom and somewhere in the middle shown in last year’s model.

“This gives you an idea of how much data is in the new calculations. It’s very, very complicated. This doesn’t even have chemical or biological data in it yet,” Kolar said.

The weather model shows precise measurements and now runs on a regular basis.

“It is more data than has ever been collected on any lake in the world. The insights we’re getting from this are just incredible,” Kelly said.

Fund Chairman Jeff Killeen said the research will help guide solutions by showing how things like invasive species and nutrients circulate and interact with each other and the food web. It can show “sensitive” areas where road salt is more likely to make its way to the lake, helping craft a solution.

“We would waste a ton of money on the solutions if we weren’t united by this type of fundamental research,” Killeen said.

Jefferson Project Director Rick Relyea said they are using three approaches: monitoring, modeling and experimentation.

The 35th year of monitoring the lake’s chemistry just wrapped up. Automated sensors are providing data for the modeling. The crew is conducting controlled experiments within 900 outdoor tanks filled with Lake George water, covering several acres at the Rensselaer Technology Park.

“I was stunned when I came here that this lake’s food web has never had a comprehensive survey of the zooplankton, invertebrates, the fish, etc. We are now doing that,” Relyea said.

Follow Amanda May Metzger on Twitter @AmandaWhistle.

