2016 International Land Model Benchmarking (ILAMB) Workshop

The International Land Model Benchmarking (ILAMB) Project:
Comprehensive evaluation of the carbon cycle, hydrology, and terrestrial ecosystem processes in Earth system models

Forrest M. Hoffman, William J. Riley, James T. Randerson, Gretchen Keppel-Aleks, and David M. Lawrence

Original Prospectus: December 3, 2015
Updated: April 19, 2016

Workshop Title:

2016 International Land Model Benchmarking (ILAMB) Workshop

Workshop Location:

DoubleTree by Hilton Hotel Washington DC
1515 Rhode Island Avenue NW
Washington, DC 20005-5595, USA
+1-202-232-7000

Workshop Dates:

May 16–18, 2016

Workshop Rationale:

As Earth system models (ESMs) become increasingly complex, there is a growing need for comprehensive and multi-faceted evaluation of model predictions. To advance understanding of biogeochemical processes and their interactions with hydrology and climate under conditions of increasing atmospheric carbon dioxide (CO2), new methods are needed that use observations to constrain model predictions, inform model development, and identify needed measurements and field experiments. Better representations of biogeochemistry–climate feedbacks and ecosystem processes in ESMs are essential for reducing uncertainties associated with projections of climate change during the remainder of the 21st century. In an effort sponsored by the U.S. Department of Energy (DOE), a diverse team of National Laboratory and university researchers is developing new diagnostic approaches for evaluating ESM hydrological and biogeochemical process representations. This research effort supports the International Land Model Benchmarking (ILAMB) project (http://www.ilamb.org/) by creating an open source benchmarking system that leverages the growing collection of laboratory, field, and remote sensing data. This benchmarking system performs comparisons of model results with best-available observational data products, focusing on atmospheric CO2, surface fluxes, hydrology, soil carbon and nutrient biogeochemistry, ecosystem processes and states, and vegetation dynamics. Next generation benchmarking priorities will focus on design of new perturbation experiments (e.g., atmospheric CO2 enrichment, water exclusion, nutrient addition, soil/plant warming) and resulting model evaluation metrics, new metrics from extreme events (e.g., drought, floods), and process-specific experiments (e.g., litterbags, 14C tracers). This benchmarking system is expected to become an integral part of model verification for future rapid model development cycles. Moreover, it will contribute model analysis and evaluation capabilities to phase 6 of the Coupled Model Intercomparison Project (CMIP) and future model and model–data intercomparison experiments.
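
As a simple illustration of the style of comparison such a benchmarking system performs, the sketch below scores a model field against an observational field using a relative-bias score and a normalized RMSE score. This is a minimal illustration only, not the ILAMB implementation; the example fields, the gridded setup, and the scoring formulas are assumptions chosen for brevity.

    # Illustrative sketch only: scoring a model field against an observational
    # product. This is NOT the ILAMB implementation; field names, grids, and
    # scoring formulas are assumptions chosen for illustration.
    import numpy as np

    def bias_score(model, obs):
        """Map the relative bias between model and observations onto a [0, 1] score."""
        relative_error = np.abs(model - obs) / (np.abs(obs) + 1e-12)
        return float(np.mean(np.exp(-relative_error)))

    def rmse_score(model, obs):
        """Map the RMSE, normalized by observed variability, onto a [0, 1] score."""
        rmse = np.sqrt(np.mean((model - obs) ** 2))
        return float(np.exp(-rmse / (np.std(obs) + 1e-12)))

    # Hypothetical annual-mean gross primary productivity fields (kg C m-2 yr-1)
    # on a common grid; in practice these would be read from model output and an
    # observational data product and regridded to a shared resolution.
    obs_gpp = np.random.default_rng(0).uniform(0.1, 3.0, size=(90, 180))
    mod_gpp = obs_gpp * 1.1  # a model with a ~10% positive bias

    print("bias score:", bias_score(mod_gpp, obs_gpp))
    print("rmse score:", rmse_score(mod_gpp, obs_gpp))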

The complexity of today’s process-rich ESMs poses a verification challenge to developers implementing new parameterizations or tuning process representations. Model developers and software engineers require a systematic means for evaluating changes in model results to ensure that their developments improve the fidelity of the target process representations while not adversely affecting results in other parts of the model. In addition to objectively assessing the performance of ESMs and identifying model weaknesses, the ILAMB benchmarking system can also provide a framework for verifying and tuning model developments, supporting a more rapid development cycle and providing continuous and documented evaluation of model skill. DOE’s Accelerated Climate Modeling for Energy (ACME) project has adopted ILAMB for this purpose and is implementing new metrics for more advanced features currently under development and testing.
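
The sketch below illustrates, in minimal form, the kind of verification check described above: per-variable benchmark scores for a baseline model version are compared against a development version, and any score that degrades beyond a tolerance is flagged. The variable names, scores, and tolerance are hypothetical and are not taken from ILAMB or ACME.

    # Illustrative sketch only: compare per-variable benchmark scores from a
    # baseline model version against a development version and flag any
    # variable whose score degrades by more than a chosen tolerance.
    # All names and values below are hypothetical.
    TOLERANCE = 0.02  # allowable score decrease before a change is flagged

    baseline = {"gpp": 0.78, "latent_heat": 0.81, "soil_carbon": 0.64, "lai": 0.72}
    candidate = {"gpp": 0.83, "latent_heat": 0.80, "soil_carbon": 0.59, "lai": 0.73}

    regressions = {
        var: (baseline[var], candidate[var])
        for var in baseline
        if candidate[var] < baseline[var] - TOLERANCE
    }

    for var, (old, new) in regressions.items():
        print(f"WARNING: {var} score dropped from {old:.2f} to {new:.2f}")
    if not regressions:
        print("No benchmark regressions detected.")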

CMIP provides essential information about future climate scenarios and ESM behavior that is used by the Intergovernmental Panel on Climate Change (IPCC) for periodic assessment and more broadly by policy makers and resource managers for the design of effective climate mitigation and adaptation strategies. Planning for CMIP6 is well underway, and the sophistication of model representations of biosphere and carbon cycle processes will far exceed levels achieved in past efforts. The complexity and volume of archived simulation output will be unparalleled, creating new opportunities for increasing our knowledge of biosphere–climate interactions, but also important challenges for effectively harvesting this information to reduce uncertainties and improve our understanding of fundamental Earth system processes. Coordination among several MIPs is needed, as well as enhanced information flow among the modeling, evaluation, and observational communities.

We propose to continue and expand DOE’s role in coordination of CMIP activities and to improve upon DOE’s capabilities to assess the fidelity of model development in the ACME project by supporting comprehensive terrestrial model evaluation and benchmarking through the ILAMB project. Specifically, we propose to convene a three-day workshop to be held in downtown Washington, DC, USA, during May 16–18, 2016. Held more than five years after the first ILAMB Workshop in the U.S.—which was co-sponsored by DOE, NASA, and the International Geosphere–Biosphere Programme’s (IGBP’s) Analysis, Integration and Modeling of the Earth System (AIMES) project in January 2010—this second U.S. workshop is designed to accomplish the following objectives:

  1. To highlight new techniques for model evaluation that can reduce uncertainties with respect to biosphere processes and biogeochemical feedbacks with the climate system;
  2. To enable coordination among the Coupled Climate–Carbon Cycle Model Intercomparison Project (C4MIP); the Land Surface, Snow, and Soil Moisture Model Intercomparison Project (LS3MIP); and the Land-Use Model Intercomparison Project (LUMIP) activities, particularly with respect to synergies that may exist for model evaluation and analysis;
  3. To increase awareness of new data streams that will be available for model verification and benchmarking from remote sensing, in situ measurements, and synthesis activities;
  4. To increase the use and sharing of information and community tools for model evaluation and benchmarking, including the ILAMB software package;
  5. To design new metrics and evaluation approaches for integration into the next generation ILAMB system; and
  6. To create new metrics that integrate across carbon, surface energy, hydrology, and land use disciplines.

Workshop Outcomes:

To meet these objectives for DOE’s climate research mission and to provide needed international climate science leadership, the workshop will invite participation from approximately 40 leading Earth system modelers, remote sensing experts, ecosystem ecologists, and data producers from around the world, in addition to team members of the Biogeochemistry–Climate Feedbacks Scientific Focus Area (SFA) and Program Managers from DOE and other federal agencies. A list of candidate participants from 11 countries is provided below. The outcomes of the workshop will include a formal written report summarizing invited presentations, breakout group findings, and discussions regarding model evaluation strategies, gaps, and synergies. We also plan to have demonstrations of how to install and run version 1.0 of the ILAMB software system. In particular, the workshop report is expected to address the topics in the draft outline shown here.

  1. Model evaluation and benchmarking concepts and principles
  2. Benchmarking tools
    1. PALS / PLUMBER
    2. ESMValTool
    3. NASA LIS Evaluation
    4. ILAMB
    5. Existing model evaluation capabilities in use at modeling centers
    6. Synergies between different benchmarking activities
  3. Existing and new metrics for carbon, water, energy, and ecosystem processes
    1. Ecosystem processes and states
    2. Hydrology
    3. Atmospheric CO2
    4. Soil carbon and nutrient biogeochemistry
    5. Surface fluxes (energy and carbon)
    6. Vegetation dynamics
  4. Model Intercomparison Project (MIP) benchmarking needs and evaluation priorities
    1. CMIP6 historical and DECK
    2. C4MIP
    3. LS3MIP
    4. LUMIP
    5. TRENDY
    6. MsTMIP
    7. PLUME-MIP
  5. Next generation benchmarking challenges and priorities
  6. Process-specific experiments
    1. Metrics from extreme events
    2. Design of new perturbation experiments
    3. High latitude processes
    4. Tropical processes
    5. Global remote sensing
    6. Fluxnet and other surface hydrology and ecosystem networks
  7. Model benchmarking gaps and synergies
    1. Integration with uncertainty quantification frameworks
    2. Computational requirements for post-processing and workflow
    3. Frameworks, Open Model Benchmarking Architecture (OpenMBA)
    4. Integration with archival and distribution cyberinfrastructure
    5. Evaluating new process representations and rapid model development
  5. Conclusions and next steps (5- and 10-year long-term goals)
  9. Appendix A. ILAMB Tutorial Materials and Data

The workshop will also serve as a venue for communicating recent scientific results in applying the “emergent constraints” approach for limiting the range of model predictions and reducing uncertainty, as well as new methodologies for generating and applying metrics in the assessment of model fidelity. Opportunities for information exchange will be provided through invited plenary presentations held over the course of the workshop, focused breakout group discussions, informal breakaway group meetings on the evening of the second day, and a poster session conducted on the first evening of the workshop in conjunction with a hosted dinner consisting of food items suitable for eating while standing. Moreover, the workshop will provide the opportunity for key international participants to discuss the organization and proposal of special issues of scientific journals addressing new model analyses, evaluation and benchmarking, and verification and validation for new process representations and parameter optimization.
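
For readers unfamiliar with the approach, the following sketch illustrates the basic form of an emergent constraint: a cross-model regression between an observable quantity and a projected quantity is combined with a real-world observation to narrow the projected range. All values below are fabricated for illustration, and the uncertainty propagation shown is first-order only (it omits the regression residual scatter).

    # Illustrative sketch only of the "emergent constraint" idea: regress a
    # projected, unobservable quantity against a related observable quantity
    # across an ensemble of models, then use the real-world observation to
    # narrow the projected range. Ensemble values and the observational
    # estimate are fabricated for illustration.
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical ensemble: observable x (e.g., a present-day sensitivity) and
    # projected y (e.g., a future carbon-cycle feedback) for 15 models.
    x = rng.normal(5.0, 1.5, size=15)
    y = 10.0 + 12.0 * x + rng.normal(0.0, 5.0, size=15)

    # Cross-model linear regression (the "emergent relationship").
    slope, intercept = np.polyfit(x, y, 1)

    # Hypothetical observational estimate of x with its 1-sigma uncertainty.
    x_obs, x_obs_sigma = 4.2, 0.4

    # Unconstrained vs. observation-constrained central estimate of y.
    y_unconstrained = y.mean()
    y_constrained = intercept + slope * x_obs
    # First-order propagation of the observational uncertainty through the fit.
    y_constrained_sigma = abs(slope) * x_obs_sigma

    print(f"unconstrained ensemble mean: {y_unconstrained:.1f}")
    print(f"constrained estimate: {y_constrained:.1f} +/- {y_constrained_sigma:.1f}")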

We expect the ILAMB benchmarking software development, supported primarily by DOE, to be of keen interest to workshop participants. Access to these software tools and related data sets will be provided at the workshop, and time will be set aside for a tutorial or training session focused on installing, using, and extending these tools for model development and verification, model validation exercises, and model–data intercomparison studies. This training session will be led by the software package developers from the BGC Feedbacks SFA to ensure that participants receive hands-on assistance and to enable direct feedback from users on bugs, analytical issues, and desired features.

A sample workshop agenda is provided below.


Monday, May 16, 2016
7:00 Breakfast Lobby
8:00 Welcome, Introductions, and Safety Plenary Room
Welcome and Safety
U.S. Dept. of Energy (DOE) Research
DOE Climate Research Priorities
DOE RGCM Program
DOE ESM Program
Biogeochemistry-Climate Feedbacks SFA
Accelerated Climate Modeling for Energy (ACME)
Workshop Charge and Reporting
9:10 Plenary Presentations on Benchmarking Tools Plenary Room
PALS / PLUMBER
ESMValTool
NASA LIS Evaluation System
ILAMB
10:30 Morning Break Lobby
11:00 Plenary Discussion on Model Evaluation Plenary Room
Summary of Evaluation Methods at Modeling Centers
Discussion on Model Evaluation
11:50 Plenary Presentations on Emergent Constraints and Evaluation Metrics I Plenary Room
12:30 Working Lunch Lobby
13:30 Metrics Breakout Group Meetings I
Ecosystem Processes and States Plenary Room
Hydrology Breakout Room 1
Atmospheric CO2 Breakout Room 2
15:00 Afternoon Break Lobby
15:20 Metrics Breakout Group Meetings II
Soil Carbon and Nutrient Biogeochemistry Plenary Room
Surface Fluxes (Energy and Carbon) Breakout Room 1
Vegetation Dynamics Breakout Room 2
16:50 Breakout Group Reports (1–3 datasets, 1–3 new metrics, and bibliographies) Plenary Room
Ecosystem Processes and States
Hydrology
Atmospheric CO2
Soil Carbon and Nutrient Biogeochemistry
Surface Fluxes (Energy and Carbon)
Vegetation Dynamics
17:20 Poster Lightning Presentations Plenary Room
18:00 Poster Session and Reception Plenary Room,
Breakout Room 1, and
Breakout Room 2
20:00 Adjourn for the Day

Tuesday, May 17, 2016
7:00 Breakfast Lobby
8:00 Invited Presentation on New Data Products Plenary Room
8:30 Plenary Presentations on MIP Descriptions and Benchmarking Needs Plenary Room
CMIP6 Historical and DECK
C4MIP
LS3MIP
LUMIP
TRENDY
MsTMIP
PLUME-MIP
Discussion
10:30 Morning Break Lobby
11:00 Plenary Presentations on Emergent Constraints and Evaluation Metrics II Plenary Room
11:30 Breakout Groups on CMIP6 Evaluation Priorities (pre-lunch)
C4MIP Plenary Room
LS3MIP Breakout Room 1
LUMIP Breakout Room 2
12:30 Working Lunch Lobby
13:30 Breakout Groups on CMIP6 Evaluation Priorities (post-lunch)
C4MIP Plenary Room
LS3MIP Breakout Room 1
LUMIP Breakout Room 2
14:00 Breakout Group Reports (1–3 datasets, 1–3 new metrics, and bibliographies) Plenary Room
C4MIP
LS3MIP
LUMIP
14:30 Invited Presentation on Synthesis Plenary Room
15:00 Synthesis Discussion Plenary Room
15:15 Afternoon Break Lobby
15:45 ILAMB Package Tutorial / Training Session Plenary Room
17:00 Dinner on your own Downtown DC

Wednesday, May 18, 2016
7:00 Breakfast Lobby
8:00 Plenary Presentations on Emergent Constraints and Evaluation Metrics III Plenary Room
9:30 Breakout Groups on Next Generation Benchmarking Challenges and Priorities I
Process-specific experiments (litterbags, 14C) Plenary Room
Metrics from extreme events Breakout Room 1
Design of new perturbation experiments Breakout Room 2
10:00 Breakout Groups on Next Generation Benchmarking Challenges and Priorities II
High latitude processes Plenary Room
Tropical processes Breakout Room 1
Global remote sensing Breakout Room 2
10:30 Morning Break Lobby
11:00 Breakout Group Reports (1–3 datasets, 1–3 new metrics, and bibliographies) Plenary Room
Process-specific experiments
Metrics from extreme events
Design of new perturbation experiments
High latitude processes
Tropical processes
Global remote sensing
12:00 Plenary Presentations on Uncertainty Quantification (UQ) Methods Plenary Room
Land model uncertainty analysis
Reduced order techniques
ACME UQ
PEcAn
12:40 Working Lunch Lobby
13:40 Prioritizing Next Steps Plenary Room
15:00 Afternoon Break Lobby
15:30 Parallel Sessions on the ILAMB Package and Synthesis
ILAMB Package Tutorial / Training Session Plenary Room
Synthesis Meeting (Continued from Tuesday) Breakout Room 1
16:30 Workshop Report Organization and Writing Assignments Plenary Room
17:00 Adjourn the Meeting
