OOS 5-9
Estimating uncertainty for continental scale measurements

Monday, August 5, 2013: 4:20 PM
101E, Minneapolis Convention Center
Jeffrey Taylor, National Ecological Observatory Network (NEON, Inc.), Boulder, CO
Joshua Roberti, Fundamental Instrument Unit (FIU), National Ecological Observatory Network (NEON)
Derek Smith, National Ecological Observatory Network (NEON, Inc.), Boulder, CO
Steve Berukoff, National Ecological Observatory Network, Boulder, CO
Henry W. Loescher, National Ecological Observatory Network (NEON), Boulder, CO

The National Ecological Observatory Network’s Fundamental Instrument Unit (NEON-FIU) is responsible for making automated terrestrial observations at 60 different sites across the continent. FIU will provide data on key local physical, chemical, and climate forcings, as well as associated biotic responses (CO2, H2O, and energy exchanges). The sheer volume of data that will be generated far exceeds that of any other observatory network or agency (i.e., > 45 Tb/year from tens of thousands of remotely deployed sensors). We address the question of how to develop and implement an ecological observatory that can accommodate such a large volume of data while maintaining high quality. Here, we describe our approach to uncertainty for large-scale measurements with specific examples that focus on quality control while leveraging cyberinfrastructure tools and optimizing technician time.
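To illustrate the kind of automated quality control described above, a minimal sketch of two common streaming-sensor tests (range/plausibility and step/spike) is shown below. The function names, thresholds, and example data are hypothetical illustrations, not NEON's actual algorithms:

```python
# Sketch of automated range and step (spike) tests, as commonly applied
# to streaming sensor data. All thresholds here are illustrative only.

def range_test(values, lo, hi):
    """Flag values outside a physically plausible range (1 = fail)."""
    return [0 if lo <= v <= hi else 1 for v in values]

def step_test(values, max_step):
    """Flag implausibly large jumps between consecutive samples."""
    flags = [0]  # first sample has no predecessor to compare against
    for prev, cur in zip(values, values[1:]):
        flags.append(1 if abs(cur - prev) > max_step else 0)
    return flags

# Example: a short air-temperature stream (deg C) with one obvious spike.
temps = [21.1, 21.2, 21.3, 85.0, 21.4]
print(range_test(temps, -40.0, 60.0))  # [0, 0, 0, 1, 0]
print(step_test(temps, 5.0))           # [0, 0, 0, 1, 1]
```

Flags like these can feed directly into the automated problem tracking described below, so field staff are dispatched only when a sensor persistently fails.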

Results focus on novel approaches to uncertainty that advance the techniques historically employed in other networks (DOE-ARM, AmeriFlux, USDA ARS, OK Mesonet) to new state-of-the-art functionality. These automated and semi-automated approaches also inform automated problem tracking to efficiently deploy field staff. Ultimately, NEON will build upon existing frameworks of standardized uncertainty characterization to define its own operational standards for continental-scale data products. The overarching philosophy relies on attaining the highest levels of accuracy, precision, and operational uptime, while efficiently optimizing the effort needed to produce quality data products.
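The standardized uncertainty frameworks referenced above typically combine independent component uncertainties in quadrature (root-sum-of-squares, as in GUM-style propagation for uncorrelated inputs). A minimal sketch, with illustrative component values that are not NEON's actual budgets:

```python
import math

def combined_standard_uncertainty(components):
    """Combine independent standard uncertainty components in quadrature
    (root-sum-of-squares), as done for uncorrelated inputs in GUM-style
    uncertainty propagation."""
    return math.sqrt(sum(u ** 2 for u in components))

def expanded_uncertainty(u_c, k=2.0):
    """Expanded uncertainty U = k * u_c; k = 2 gives ~95% coverage for
    an approximately normal measurand."""
    return k * u_c

# Illustrative components for a temperature measurement (deg C):
# sensor calibration, data-acquisition resolution, and field drift.
u_c = combined_standard_uncertainty([0.05, 0.01, 0.03])
print(round(u_c, 4))                      # combined standard uncertainty
print(round(expanded_uncertainty(u_c), 4))  # expanded (k = 2)
```

Reporting both the combined standard uncertainty and an expanded uncertainty with a stated coverage factor is one way a network can standardize uncertainty statements across thousands of heterogeneous sensors.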