Workgroup I summary

Scientific basis for next-generation models

The following text is the section 'Summary of discussions in Work Group I' from the proceedings of the workshop "Objectives for Next Generation of Practical Short-Range Atmospheric Dispersion Models".

The workshop was organized by DCAR (Danish Centre for Atmospheric Research).

The proceedings are now out of print, but certain sections are available via the World Wide Web.


Contents of this document

Introduction

Presentation of papers

Discussion

1. What do decision makers need?

2. Do we need better models to meet the needs of decision makers? What is wrong with the "old-generation" models?

3. What is the scientific basis for the model description of various processes? What do we know, what do we not know?

4. Which type of data are needed? Are local data sufficient?

5. Model uncertainties.


Summary of discussions in Work Group I:

Scientific basis for next-generation models

Rapporteur: R. Berkowicz, NERI, Denmark.

Chairman: A.P. van Ulden, KNMI, the Netherlands.


Introduction

The topics that were discussed in Work Group I included:

- the need for a new generation of practical dispersion models

- the need for models dealing with complex effects

- state-of-the-art in practical dispersion modelling

- input required by the models and provision of the input

- quantification of model uncertainties.

The purpose of the discussions in the work group was to identify objectives for "next-generation" models and to determine important building blocks for these models. However, as the work group discussions did not cover all aspects of this subject, neither does this summary.

It was not a priori well defined what is meant by the term "next-generation models". One point of view was the following: "Old generation models" are based on the traditional Pasquill-Turner stability classification and associated schemes for dispersion parameterisation. They do not take into account the results of more recent research, which points to the importance of a number of basic scaling parameters for an adequate description of the atmospheric boundary layer. Thus, the term "next-generation models" may be used for models which go beyond the Pasquill-Turner concept and make use of boundary-layer parameterisations based on the scaling concept.
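The scaling parameters referred to above (friction velocity u*, Obukhov length L, mixing height zi, convective velocity scale w*) can be illustrated with a minimal sketch; the numerical values below are invented for illustration and are not taken from the proceedings:

```python
import math

KAPPA = 0.4   # von Karman constant
G = 9.81      # gravitational acceleration, m/s^2

def obukhov_length(u_star, theta_v, heat_flux):
    """Obukhov length L = -u*^3 theta_v / (kappa g <w'theta_v'>)."""
    return -u_star**3 * theta_v / (KAPPA * G * heat_flux)

def convective_velocity(zi, theta_v, heat_flux):
    """Deardorff convective velocity scale w* = (g zi <w'theta_v'> / theta_v)^(1/3)."""
    return (G * zi * heat_flux / theta_v) ** (1.0 / 3.0)

# Illustrative daytime convective case
u_star = 0.3        # friction velocity, m/s
theta_v = 293.0     # virtual potential temperature, K
heat_flux = 0.15    # kinematic surface heat flux, K m/s
zi = 1000.0         # mixing height, m

L = obukhov_length(u_star, theta_v, heat_flux)       # negative: unstable
w_star = convective_velocity(zi, theta_v, heat_flux)
```

Continuous parameters such as zi/L then replace the discrete Pasquill-Turner classes in the dispersion parameterisation.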

Another point of view expressed in the work group was that next-generation models must also address the non-deterministic nature of turbulence and atmospheric dispersion. Models should predict not only mean values, but also concentration fluctuations and their statistics.

Presentation of papers

During the plenary sessions, a number of papers of relevance to Work Group I were presented. Eight more papers were contributed to the work group sessions. The models presented in these eight papers could be divided into three groups:

- models based on analytical solutions to the dispersion equation

- research models

- statistical models.

The following four papers deal with the first group - models based on analytical solutions:

T. Tirabassi: Analytical solutions of advection-diffusion equation as a tool for next generation of atmospheric models (Paper 17).

M.C. Cirillo and A.A. Poli: An easy to use semi-empirical model for treating diffusion under weak wind conditions (Paper 18).

G.M.F. Boermans and W.A.J. van Pul: SLAM, a short term and local scale ammonia transport model (Paper 19).

A.P. van Ulden: A surface-layer similarity model for the dispersion of a skewed passive puff near the ground.

A general characteristic of the models presented in these papers is that they do not use the Gaussian plume approximation, but instead apply a K-theory approximation. Inhomogeneous turbulence and wind shear can be treated explicitly. The models are easy to use and require modest computer time. Because these models are based on K-theory, they are most suitable for surface releases; for elevated releases, the diffusivity coefficient should depend on the length scale of the concentration distribution.
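As a generic illustration of the K-theory approach (not any of the models in Papers 17-19), the one-dimensional vertical diffusion equation dc/dt = d/dz (K dc/dz) can be integrated with a simple explicit finite-difference scheme:

```python
import numpy as np

def diffuse_k_theory(c0, K, dz, dt, steps):
    """Explicit integration of dc/dt = d/dz (K dc/dz).

    c0 : initial concentration profile (1-D array)
    K  : eddy diffusivity at the grid levels (same shape as c0)
    Zero-flux boundaries at bottom and top conserve total mass.
    """
    c = c0.astype(float).copy()
    K_half = 0.5 * (K[:-1] + K[1:])               # K at cell interfaces
    for _ in range(steps):
        flux = -K_half * np.diff(c) / dz          # diffusive flux at interfaces
        dcdt = np.zeros_like(c)
        dcdt[1:-1] = -(flux[1:] - flux[:-1]) / dz
        dcdt[0] = -flux[0] / dz                   # zero-flux lower boundary
        dcdt[-1] = flux[-1] / dz                  # zero-flux upper boundary
        c += dt * dcdt
    return c

z = np.arange(0.0, 200.0, 5.0)                    # vertical grid, m
K = 0.4 * 0.3 * np.maximum(z, 1.0)                # K ~ kappa u* z (neutral surface layer)
c0 = np.zeros_like(z)
c0[0] = 1.0                                       # surface release
c = diffuse_k_theory(c0, K, dz=5.0, dt=0.05, steps=2000)
```

Height-dependent K, and hence inhomogeneous turbulence, enters naturally; the analytic models in the papers solve essentially the same equation in closed form.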

There were two papers dealing with the second group - research models:

F. Nieuwstadt: Large-eddy modelling as a data-base for practical air pollution models (Paper 20).

H. van Dop: On the potential use of Lagrangian stochastic models for short-range dispersion (Paper 21).

At present, the models described in these two papers should primarily be considered very powerful research tools.

Large-eddy models can provide data for the development and testing of simpler models; they can tell us which physical processes are important and how they should be parameterised. Large-eddy simulation can also be used for some more complex conditions for which simpler models will usually fail.

Lagrangian stochastic models are capable of simulating transport and diffusion for a broad range of atmospheric dispersion conditions; however, a prerequisite for their application is a detailed knowledge of turbulence statistics. This can be obtained from e.g. large-eddy models. Stochastic models require a substantial amount of computer time, but with the present development in computer power, practical application of these models is no longer ruled out.
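In its simplest form, a Lagrangian stochastic model advances each particle velocity with a Langevin equation. The sketch below assumes homogeneous turbulence with a prescribed velocity variance sigma_w and Lagrangian time scale tau_L (illustrative values, not taken from Paper 21):

```python
import math
import random

def langevin_dispersion(n_particles, sigma_w, tau_L, dt, t_end, seed=0):
    """1-D Lagrangian stochastic model for homogeneous turbulence.

    Velocity update: w(t+dt) = a w(t) + b N(0,1), with
    a = exp(-dt/tau_L) and b = sigma_w sqrt(1 - a^2),
    which preserves the velocity variance sigma_w^2 exactly.
    Returns the final vertical displacements of all particles.
    """
    rng = random.Random(seed)
    a = math.exp(-dt / tau_L)                 # velocity autocorrelation per step
    b = sigma_w * math.sqrt(1.0 - a * a)      # amplitude of the random increment
    z = [0.0] * n_particles
    w = [rng.gauss(0.0, sigma_w) for _ in range(n_particles)]
    for _ in range(int(t_end / dt)):
        for i in range(n_particles):
            w[i] = a * w[i] + b * rng.gauss(0.0, 1.0)
            z[i] += w[i] * dt
    return z

# Illustrative values; real schemes use inhomogeneous turbulence statistics
z = langevin_dispersion(n_particles=2000, sigma_w=0.6, tau_L=100.0, dt=1.0, t_end=600.0)
spread = (sum(v * v for v in z) / len(z)) ** 0.5   # plume spread sigma_z
```

The simulated spread follows Taylor's classical result for homogeneous turbulence; inhomogeneous cases require the turbulence statistics (e.g. from large-eddy simulation) as input.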

The third group - statistical models - had two papers:

G. Finzi: Stochastic models for real time forecast and control of pollution episodes (Paper 22).

P.C. Chatwin: The role of statistical models (Paper 23).

The stochastic real-time models considered in the first paper are used for forecast and control when an air quality network is available. The models make use of current concentration and meteorological data. The model parameters are determined on the basis of a historical series of the variables, while all possible uncertainties of the model are taken into account by introducing a "noise" input with assigned statistical properties.

The second paper, by P.C. Chatwin, addresses the fact that atmospheric dispersion is a stochastic phenomenon. Therefore, it should be described quantitatively by statistical, not deterministic, models. Statistical models predict, for example, the probability that the actual concentration exceeds a given (e.g. toxic) level, or, less ambitiously, the standard deviation of the concentration. It should be noted that this standard deviation is known to be comparable with or greater than the mean.
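The kind of question such a statistical model answers can be sketched as follows; the lognormal form assumed here is our illustrative choice, not taken from the paper:

```python
import math

def exceedance_probability(mean_c, std_c, threshold):
    """P(C > threshold), assuming C is lognormal with the given mean and std."""
    # Convert the mean and std of C to the parameters of ln C
    var_ln = math.log(1.0 + (std_c / mean_c) ** 2)
    mu_ln = math.log(mean_c) - 0.5 * var_ln
    zscore = (math.log(threshold) - mu_ln) / math.sqrt(var_ln)
    return 0.5 * math.erfc(zscore / math.sqrt(2.0))

# Standard deviation comparable with the mean, as noted above
p = exceedance_probability(mean_c=1.0, std_c=1.0, threshold=3.0)
```

Even with a mean at only one third of the threshold, the exceedance probability is a few per cent here, illustrating why a mean-only prediction can be misleading for toxic limits.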

Discussion

1. What do decision makers need?

Requirements concerning model capabilities depend on the environmental problem in question. Decision makers may need answers on such issues as:

- yearly average concentrations

- frequency distributions

- extreme events

- real-time assessments.

Some statements concerning the needs of decision makers were made in the group; however, this issue was not treated systematically.

One point raised was that decision makers tend to ask the wrong questions, i.e. questions that do not relate to present scientific knowledge or that cannot be answered properly. An important example is that decision makers ought to ask questions whose answers are probabilities; otherwise their questions, and the answers that they receive, are unscientific.

Another point relating to the needs of decision makers is that many important air pollution problems cannot be treated as isolated local phenomena. In addition to local-scale models, regional models are needed for regulatory purposes. When new sources are evaluated, it is necessary to take the background pollution into account. The impact of major sources can also be important on a regional scale. Further, the modelling of chemical processes (e.g. NO-NO2 conversion) requires knowledge of large-scale contributions (ozone).

Local-scale models are useful for decision makers when the impact of the sources considered is large compared to the background and when regulation of these sources has a major effect on local air quality. They are appropriate, for example, for isolated industrial complexes, energy production sources, or locations close to small local sources.

Risk assessment studies require concentration predictions for a variety of emission scenarios (including probabilities) and meteorological situations (also including probabilities). In this case, class division of meteorology may be required. Moreover, for these studies, finite-time releases and transient releases must be considered. Puff models are most appropriate for this purpose.

Emergency response requires both fast and reasonably accurate predictions. Models should predict short time averages (a few minutes).

2. Do we need better models to meet the needs of decision makers? What is wrong with the "old-generation" models?

"Old-generation models" have several major shortcomings, and they often yield unreliable results.

One important problem is that these models normally make use of discrete classes for stability classification. Each of the classes covers a broad range of atmospheric conditions, and therefore such models give crude, and possibly incorrect, answers.

Also, a problem with the models is that it is difficult to quantify the uncertainties.

Because of the economic implications, it is important that a model used for estimating consequences of various emission restrictions and source modifications is both reasonably precise and flexible. Only models based on an appropriate description of physical processes can be used for meaningful investigations of present pollution levels and for predictions of future trends.

Due to the present large uncertainties in model predictions, decision makers are forced to choose unnecessarily restrictive (conservative) approaches if they want to make sure that certain environmental goals are met. The use of better models, with fewer uncertainties involved, will open the way for less restrictive approaches and thus reduce expenses for polluters.

In short, we need new models because

- In many instances the old models give incorrect results.

- Modern models are expected to give less uncertain results than the old ones. Further, it should be possible to quantify the uncertainties.

3. What is the scientific basis for the model description of various processes? What do we know, what do we not know?

It is now well established that boundary-layer similarity theory provides a good description of diffusion processes in the planetary boundary layer.

Even simple Gaussian plume models, combined with models based on probability density functions (pdf models), may be able to do a good job (concerning ground-level concentrations) if a proper description is given for model elements such as:

- dispersion parameters

- plume rise

- building downwash

- fumigation.
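For reference, the Gaussian plume baseline discussed here reduces to a few lines of code; the dispersion parameters sigma_y and sigma_z below are placeholders that a next-generation scheme would supply from boundary-layer scaling rather than from a discrete stability class:

```python
import math

def glc_gaussian_plume(Q, u, y, H, sigma_y, sigma_z):
    """Ground-level concentration from a continuous point source at height H.

    C(x, y, 0) = Q / (pi u sy sz) * exp(-y^2 / 2 sy^2) * exp(-H^2 / 2 sz^2),
    with reflection at the ground included.
    Units: Q in g/s, u in m/s, lengths in m, C in g/m^3.
    """
    return (Q / (math.pi * u * sigma_y * sigma_z)
            * math.exp(-y**2 / (2.0 * sigma_y**2))
            * math.exp(-H**2 / (2.0 * sigma_z**2)))

# Illustrative values; sigma_y and sigma_z depend on downwind distance
# and would come from the dispersion-parameter scheme
c = glc_gaussian_plume(Q=10.0, u=5.0, y=0.0, H=50.0, sigma_y=80.0, sigma_z=40.0)
```

The listed model elements (plume rise, downwash, fumigation) modify the effective source height H and the sigmas in this expression.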

Integral models are now available for plume rise. Special approaches are required for "wet" plumes.

Single prismatic buildings can be handled by models using concepts of wake areas, recirculation and downwash zones. Dispersion from point sources between small groups of buildings can only be properly handled by 3-D codes or wind tunnel measurements.

Mixing height (important for e.g. fumigation processes) is an important model parameter, but is still difficult to determine.

There is no widespread agreement on techniques to deal with calms.

An important problem, especially for tall stacks, is cloud venting. Some work on this subject is in progress.

Modelling of dispersion in complex terrain is first of all a matter of flow modelling. Usually, advanced mesoscale models are required for this purpose.

Other questions that have to be answered by next-generation models are:

- How to deal with chemistry (especially NO-NO2 conversion)?

- How to deal with dry/wet deposition?

- How to deal with plume descent (particles)?

Effects of non-stationarity are best handled by puff models. It might be desirable to develop generalized puff models able to handle both local and regional dispersion.
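A puff model in its simplest Gaussian form can be sketched as follows (an illustrative fragment only; a generalized puff model would add advection of the puff centre, growing sigmas along the trajectory, and reflection at the mixing height):

```python
import math

def gaussian_puff(Q, x, y, z, xc, yc, sx, sy, sz):
    """Concentration from an instantaneous ground-level release of Q.

    The puff centre is at (xc, yc, 0); reflection at the surface
    doubles the vertical term. Units: Q in g, lengths in m, C in g/m^3.
    """
    norm = 2.0 * Q / ((2.0 * math.pi) ** 1.5 * sx * sy * sz)
    return norm * math.exp(-(x - xc)**2 / (2 * sx**2)
                           - (y - yc)**2 / (2 * sy**2)
                           - z**2 / (2 * sz**2))

# Puff released at the origin and advected 600 m downwind;
# the sigmas would grow with travel time in a full model
c = gaussian_puff(Q=1000.0, x=600.0, y=0.0, z=0.0, xc=600.0, yc=0.0,
                  sx=60.0, sy=60.0, sz=30.0)
```

Summing such puffs released at successive times reproduces a plume under stationary conditions and handles finite-time and transient releases naturally.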


4. Which type of data are needed? Are local data sufficient?

The old-generation models make use of easily available surface observations. However, it is now known that surface data alone do not provide an adequate description of dispersion in the planetary boundary layer.

Meteorological preprocessors are now available which can provide adequate data for next-generation models, but they do not provide all the desired parameters, and there are still some unresolved problems with the methods. As an example, the determination of mixing height is still very problematic.

Preprocessing of data should preferably be based on similarity theory.

The required measurements include such parameters as:

- surface wind and roughness

- cloud cover and/or net radiation

- moisture availability or surface resistance.
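A minimal fragment of such a preprocessor is sketched below: deriving the friction velocity from a routine surface wind observation via the logarithmic wind profile. The neutral form shown here is only the starting point; a real preprocessor iterates it together with the Obukhov length to apply stability corrections:

```python
import math

KAPPA = 0.4  # von Karman constant

def friction_velocity_neutral(wind_speed, z_ref, z0):
    """u* from the neutral logarithmic wind profile U(z) = (u*/kappa) ln(z/z0).

    wind_speed : measured wind speed at height z_ref (m/s)
    z_ref      : measurement height (m)
    z0         : surface roughness length (m)
    """
    return KAPPA * wind_speed / math.log(z_ref / z0)

# Illustrative routine observation: 5 m/s at 10 m over z0 = 0.1 m terrain
u_star = friction_velocity_neutral(wind_speed=5.0, z_ref=10.0, z0=0.1)
```

Together with a surface energy budget estimated from cloud cover or net radiation, this yields the similarity-theory parameters that next-generation models require.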

With new measuring techniques (RASS, wind profiler) it will be possible to obtain continuous vertical profiles of:

- temperature

- wind and turbulence.

Numerical weather forecast models are now becoming capable of providing the necessary data for air pollution dispersion models. For complex terrain, including coastal areas, atmospheric flow models combined with data assimilation (for interpolation in time and space), might be needed in order to provide an adequate description of transport and dispersion of air pollution.

5. Model uncertainties.

All model predictions are inevitably associated with some uncertainty. Different kinds of uncertainties can be identified. In the discussions of the work group, a distinction was made between two categories of model uncertainties:

1) internal uncertainties

2) external uncertainties

Internal uncertainties are related to the model itself and the way in which it is used. Thus, they can be influenced by the modeller and the model user.

The internal uncertainties of model predictions can be due to:

- the physics of the model (model assumptions and parameterizations)

- errors and uncertainties in input data (meteorological input, emission data).

Modellers and model users can seek to reduce the internal model uncertainties by:

- improving the physics of the model,

- improving the quality of input data and reducing errors in measurements of input variables.

What are here called external uncertainties are not uncertainties in the true sense of the word, as they are inherent in nature. These "uncertainties" have their origin in the fact that:

- dispersion in the atmosphere is a stochastic process

- all variables of interest are random

- every random variable has a probability structure

- in atmospheric dispersion, the probability structure is determined by the physics and by the choice of ensemble.

The external uncertainties can hardly be reduced, due to the stochastic nature of the atmospheric processes. It is, however, important that these uncertainties are recognized and also taken into account by decision makers.

Ultimately, models should predict not only means, but also estimates of uncertainties. These estimates should include both the internal model uncertainties (including those due to input data) and the external ones, due to the probabilistic nature of atmospheric processes.

Thus, an important recommendation of Work Group I is to encourage work to quantify model uncertainties. A similar recommendation was submitted by Work Group II.

