Location: Karlsruhe Institute of Technology, Karlsruhe, Germany
Date: 23-25 March 2020
Keynote talks: Nikolina Ban, Nigel Roberts, Julien Le Sommer
Organizers: Aiko Voigt, Elizabeth Kendon, Daniel Klocke, Florian Pantillon
High-resolution atmosphere models are playing an increasing role in understanding and predicting weather and climate. They promise to overcome some of the stubborn difficulties of coarse models, which arise from the latter's need to parameterize moist convection. High-resolution models also provide a natural link to observations, as well as to the impact of individual weather events and of regional climate change. We here use the term high-resolution models for models with resolutions of a few kilometers or finer, which allow for an explicit treatment of (deep) convection without convection parameterization schemes. This includes both convection-permitting models (e.g., DYAMOND) and large-eddy models (e.g., HD(CP)2). Our focus is on model runs with realistic boundary conditions, so as to distinguish the workshop from the large community of idealized studies on tropical convection and radiative-convective equilibrium.
Using high-resolution models in an efficient and productive manner can be challenging, however. While these models simulate the atmosphere in fine spatial detail, they can currently be integrated only for short periods, making it difficult to accumulate sufficient statistics. Similarly, the massive increase in the amount of data these models produce requires us to reconsider, and in fact revise, traditional workflows and analysis strategies. The workshop therefore has two goals:
- provide a forum to articulate and present the use and benefit of high-resolution models for weather and climate science,
- and provide an opportunity to share approaches and exchange experiences on how to handle and make sense of the enormous increase in data.
The first goal targets the science that is enabled by high-resolution models and how this science can be realized. This is particularly relevant for researchers from the realm of large-scale dynamics, who are often used to long simulations with coarse models. Given the limited length of high-resolution simulations, a more object-based, and hence process-based, analysis seems necessary. The second goal is directed at technical aspects, including new software tools such as xarray and dask for out-of-memory and parallel data analysis, as well as machine-learning-based analyses. It also includes how to work on models' native unstructured grids without remapping.
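To illustrate the kind of workflow meant here, the following minimal sketch (assuming xarray and dask are installed) shows lazy, chunked analysis with xarray; the dataset is a hypothetical stand-in for model output, and the variable and dimension names are illustrative only. In practice one would open files lazily, e.g. with xr.open_mfdataset and a chunks argument, rather than build the data in memory.

```python
import numpy as np
import xarray as xr

# Hypothetical stand-in for high-resolution model output on an
# unstructured grid: one value per time step and grid cell.
ds = xr.Dataset(
    {"ta": (("time", "cell"), np.random.rand(24, 1000))},
    coords={"time": np.arange(24)},
)

# Chunking backs every variable with a dask array; subsequent operations
# build a task graph instead of loading all data into memory at once.
lazy_mean = ds.chunk({"time": 6})["ta"].mean(dim="time")

# Nothing has been computed yet; .compute() triggers the (parallel)
# evaluation of the task graph, chunk by chunk.
result = lazy_mean.compute()
print(result.shape)  # one time-mean value per grid cell
```

Because the computation is expressed on chunks, the same code scales from a laptop to a cluster without changes to the analysis itself.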
The agenda and some information on logistics are summarized here (last change Feb 8, 2020).
The workshop receives financial support from BMBF and FONA.