The History of the Unified Model

These notes are based on the UM user guide (http://ncas-cms.nerc.ac.uk/content/view/60/86/)


Contents

Background to the Unified Model
Version upgrades and hardware platforms
Early versions 2.0-2.6
Version 2.7
Version 2.8
Version 3.0
Version 3.1
Version 3.2
Version 3.3
Version 3.4
Version 3.5/4.0
Version 4.1
Version 4.2/4.3
Version 4.4
Version 4.5
Versions 5.0, 5.1 and 5.2
Version 5.3
Version 5.4
Version 5.5


Background to the Unified Model

By the end of the 1980s the Met Office had developed separate numerical prediction models to sustain climate research and operational forecast capabilities. The climate model consisted of 11 vertical layers with a global domain at 250 km horizontal resolution and incorporated a relatively sophisticated representation of physical processes. The operational model had 15 vertical layers and ran in two versions: a `coarse mesh' global model at 150 km resolution and a `fine mesh' limited area model covering Europe and the North Atlantic at 75 km resolution. These models had simpler calculations of physical processes and employed a different dynamical integration scheme from that of the climate model. In addition a mesoscale model with 15 km resolution provided some operational products for the UK area, and comprised a completely distinct formulation for both physics and dynamics. There were also new requirements arising from the need to include more complex interactions in long climate integrations, in particular the introduction of an ocean model to be coupled explicitly with the atmosphere model.

Each model had its own control, file and output structure, as well as separate scientific formulations, with separate teams responsible for development. The models ran on a CDC CYBER 205, which was due to be replaced by an ETA Systems platform during 1989. Significant effort was expended in adapting the code of each model to run on the new platform, but this work was suspended when the manufacturing company was wound up and a new vendor had to be found. It was decided to promote a longer-term project to unify the model systems, both to fill the gap and to reduce the duplication of effort required to implement separate models on new hardware. The lifetime of the CYBER was extended and development of the CYBER models frozen; a Cray Y-MP with 8 vector processors and a Unix-based operating system was procured, and the machine arrived in January 1990. A formal Unified Model project was started in July 1989 and continued until October 1991, when it was judged that the main components were ready and the Unified Model was born.

A number of key strategic decisions have shaped the structure and development of the UM:

• The system would share a common control and file structure for all types of models, so that individual models running in separate environments would be replaced by different model configurations of the UM.

• Model set-up would be achieved via a graphical user interface, divorcing the detail of control file information from decisions made by the user. Thus presentation of model options to the user is simplified, the process of model submission is streamlined and the occurrence of inconsistencies or errors during set-up is reduced.

• The UM system would be upgraded periodically through change control management techniques, which is an essential safeguard for code changes made by developers that may affect other model configurations. Each upgrade constitutes a UM version.

• Model software would be coded in Fortran, with a minimum of machine specific or low-level language additions. This is required to loosen the dependency on a particular hardware platform, and allow more rapid development of new scientific routines by researchers. This decision was later extended to include the objective of providing explicit portability. Hence model configurations which were developed on supercomputer platforms should be readily ported to other systems such as workstations, with a minimum of changes.

• Models would depend on a common output diagnostics system, which was capable of intercepting transient fields, or calculations based on such fields, as well as primary fields available through periodic model dumps.

• Separate choices of scientific schemes would be readily available from the user interface, and different physics schemes would be `plug compatible', i.e. it should be easy to import or export schemes to or from other organisations' models with a minimum of effort.

Version upgrades and hardware platforms

Table 1.2 lists release dates for each UM version and the date at which the operational forecast model was upgraded to the new release. Some indication of the major changes to the hardware platform is given.

Table 1.2: Unified Model Versions

UM Version  Release Date  Operational Release  Hardware Platform: Operating System  Compiler Release
2.0         16.05.91      05.06.91             Cray YMP-8: Unicos5.1                 4.0.2
2.1         27.06.91      16.07.91
2.2         15.08.91      17.09.91
2.3         23.10.91      12.11.91
2.4         07.01.92      11.02.92             Unicos6.0
2.5         26.03.92      12.05.92
2.6         04.06.92      11.08.92                                                   5.0.2
2.7         01.09.92      27.10.92                                                   5.0.3
2.8         22.12.92
3.0         15.01.93      23.02.93
3.1         30.03.93      25.05.93             Unicos7.0
3.2         06.09.93      19.10.93
3.3         03.06.94      12.07.94             Cray C90                              6.0.3.2
3.4         24.01.95      07.03.95
3.5         03.08.95
4.0         12.01.96      19.03.96
4.1         20.08.96                           Unicos8.0
4.2         15.01.97                           Cray T3E: mk1.4.1
4.3         09.06.97      29.01.98             mk1.6.1                               2.0.3.4
4.4         22.12.97      15.04.98             mk2.0.0
4.5         11.12.98      23.06.99                                                   3.0.2.1
5.0         30.11.99                           mk2.0.4                               3.0.2.1
5.1         02.08.00                           mk2.0.5
5.2         20.04.01      07.08.02             mk2.0.5.40                            3.0.4.2+
5.3         12.04.02      07.08.02             mk2.0.5.52                            3.0.4.2++
5.4         06.12.02                           mk2.0.5.59
5.5         09.06.03

The following sections illustrate the more significant changes that have contributed to version upgrades. Brief descriptions of system enhancements are given, with some indication of new scientific features, concentrating on the atmosphere model. In reality each UM version consists of a hundred or more individual changes, ranging from minor corrections to major improvements of infrastructure facilities, plus work required to take advantage of successive enhancements to the available computing resources.

Early versions 2.0-2.6
Rapid development continued to consolidate the performance of the UM. A significant number of error corrections arose from the relative immaturity of the code. Changes included:

• Introduction of C I/O routines to replace Cray-specific unblocked data files, and replacement of file references made through Cray assign statements by file names passed via environment variables (see the sketch after this list).

• The application of user source code modifications for compilation was changed from the ``quick'' to the ``normal'' mode of the Cray proprietary update package, ensuring that a change to a COMDECK is propagated to all affected routines (see section 3.1).

• Vertical interpolation of boundary files was enabled, allowing generation of boundary conditions for a model on a different vertical grid.

• A facility to enable timeseries to be output from the diagnostic system was introduced.

• The conversion of potential temperature to temperature (θ to T) was modified to account for the dynamical formulation of the hydrostatic relation; definitions of the standard vertical hybrid co-ordinates were changed.

• Dynamics calculations at the top model level could now be switched to run with a halved timestep, preventing model instabilities causing model failures without the cost of running the entire model with a reduced timestep.
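
As an illustration of the environment-variable approach to file naming described above, the sketch below shows a small modern-Fortran program that picks up its input path from an environment variable at run time instead of relying on an assign statement. It is a minimal sketch only: the variable name UM_INPUT and the program are invented for this example, and the UM's actual implementation used dedicated C I/O routines rather than the standard Fortran intrinsics shown here.

    ! Minimal sketch (not UM code): obtain a data file name from an
    ! environment variable rather than a hard-wired assign statement.
    program env_filename_demo
      implicit none
      character(len=256) :: fname
      integer            :: stat, unit

      ! "UM_INPUT" is an invented variable name for this example.
      call get_environment_variable("UM_INPUT", fname, status=stat)
      if (stat /= 0) stop "Environment variable UM_INPUT is not set"

      open(newunit=unit, file=trim(fname), form="unformatted", &
           access="stream", status="old", action="read")
      ! ... read header and data records here ...
      close(unit)
    end program env_filename_demo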

Version 2.7
Changes already available through the UM system were co-ordinated to produce a UM mesoscale configuration with 30 vertical levels. This replaced the earlier mesoscale model and went operational in December 1992.

• Several enhancements of physics were consolidated in the system, including the components contributing to the first climate version (1CV, now labelled HADAM1):
1. Convection with an explicit downdraught;
2. Rapidly mixed boundary layer code.

• Introduction of a separate version of the longwave radiation scheme based on transmissivity values derived at ECMWF.

Version 2.8
• Upgrade of source code management tool from Cray update to Cray nupdate.

• A new packing code was introduced into look-up headers of data fields to cater for new combinations of packing and data compression. This required an explicit utility to convert model dumps at 2.7 and earlier to the new format at 2.8.

• (Outstanding 1CV science change): grouping of model levels into 3 explicit cloud layers within the shortwave radiation scheme.

• Time-smoothing during the adjustment timestep within the dynamics was introduced to alleviate the problem of noisy fields at upper levels in the operational global model.


Version 3.0
The active code of Version 3.0 was identical in content to that at 2.8, but line numbers of code were re-sequenced and some old comments were removed. DECK names were changed systematically to reflect the concept of multiple version-releases of a code section (see section 4.7).

Version 3.1
• A new option was provided to allow more than one boundary file to be produced by the model, so that multiple limited area models could be driven from the same generating run.

• Functionality of the reconfiguration program was extended to include initialisation of model analyses with ECMWF perturbations, simplifying running of this category of ensemble forecasts. An option to transplant fields from a different model dump into a given domain within the model analysis was also developed.


Version 3.2
The first version of the portable model was made to work at 3.2 on HP workstations, but with a large number of local modifications, and relied on fixed experiment libraries generated by the user interface on the IBM mainframe computer.

• Dynamic allocation of primary fields was introduced and required large scale changes to the top level control structure of the model. Sizes of arrays depending on the model configuration were now passed at run-time, removing the necessity to re-compile when changes in diagnostic output were required (a sketch of the idea follows this list).

• The value of the real missing data indicator was changed from -32768.0 to -2**30 to reduce the possibility of valid data being assigned as missing.
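
The flavour of run-time sizing can be pictured with a short Fortran sketch. This is illustrative only, assuming grid sizes arrive through a namelist; the array name d1 echoes the UM's primary data super-array, but everything else here is invented. The missing-data value matches the -2**30 indicator adopted at this version.

    ! Illustrative only: allocate the primary data array at run time from
    ! grid sizes read via a namelist, instead of compile-time dimensions.
    program dynamic_fields_demo
      implicit none
      real, parameter   :: rmdi = -2.0**30   ! real missing-data indicator (value adopted at 3.2)
      real, allocatable :: d1(:)             ! primary data array, in the spirit of the UM's D1
      integer :: row_length, rows, levels
      namelist /gridsize/ row_length, rows, levels

      read (*, nml=gridsize)                 ! sizes supplied at run time
      allocate(d1(row_length * rows * levels))
      d1 = rmdi                              ! mark everything missing until real fields are read in
      print *, "Allocated", size(d1), "words for primary fields"
    end program dynamic_fields_demo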


Version 3.3
• A facility for users to introduce their own prognostic variables without requiring a change to the user interface was implemented, giving code developers freedom to make and test a greater range of modifications locally.

• GRIB format for output fields was provided as an option. The file structure was of hybrid form, with lookup headers retained, but data packed as GRIB.

• A halving of the dynamics timestep could be initiated automatically when thresholds of wind speed or divergence in the top level were exceeded.

• Calling convection routines less frequently than every timestep was enabled, providing a method of reducing cpu costs of the operational mesoscale model without significant degradation in performance.

• The provision of source terms for aerosols, as part of a tracer advection technique for dealing with passive variables, provided essential functionality for both climate studies and later mesoscale visibility calculations.

Version 3.4
• Unix scripts in the UM system were brought within code management by nupdate as part of change control for new versions. This replaced single frozen copies and simplified the modification of scripts by different people.

• Macrotasking techniques were introduced into dynamics, convection and radiation schemes to facilitate more efficient use of the multi-tasked C90 environment.

• New options for atmosphere science routines, constituting the main components of the second climate version (2CV, now labelled HADAM2a):
1. A more complex treatment of orographic roughness within the boundary layer code, including the representation of silhouette orographic roughness;
2. A new gravity wave drag scheme incorporating anisotropic directional orographic variances and `hydraulic jump' terms;
3. A new version of the convection routine with excess parcel buoyancy generated from the surface level fluctuations of temperature and humidity;
4. A new large scale precipitation scheme with all snow falling out in a layer in the same manner as for rain;
5. Inclusion of primitive cloud microphysics within the shortwave radiation scheme, allowing a variation in cloud effective radius;
6. An option to re-set negative humidity values locally rather than globally.

• A new multi-layer soil hydrology scheme.

• Inclusion of trace gases in a new version of the longwave radiation.

• Assimilation of humidity from MOPS (Moisture Observation Pre-processing System) was introduced, providing a method of feeding moisture information back into the hydrology cycle of the operational mesoscale model.

• A dynamical sea-ice model was included in the Slab model (see section 4.12).

• An ice section was added to the ocean model and assimilation within the ocean model enabled.

• This version was released to a number of external users, now with the capability of running on a Dec Alpha workstation.

Version 3.5/4.0
3.5 was an interim version required to stage the large changes planned for 4.0, and was only of significance to the core development and testing team. This version spanned the transfer of central UM system facilities, including the user interface, from the IBM mainframe to a more portable Unix environment, which at the Met Office consisted of HP workstation local area networks combined with the Cray C90. Modifications to code for 3.5 relative to the previous version were held on the mainframe; the repository for modifications against 3.5 and later versions is the Cray C90. Development of a new graphical user interface, the UMUI, was partially completed at 3.5 and fully implemented for 4.0. Submission of model jobs to the Cray C90, and inspection of subsequent output listings, was made independent of the IBM mainframe.

• An X-windows user interface (UMUI) based on public domain Tcl/Tk building tools was produced: a completely new program and structure, distinct from the mainframe interface designed at the beginning of the UM. The UMUI now resides on HP workstation networks and is capable of being ported to other machines, such as the Cray C90 itself. This provided a more flexible platform for initiating experiment set-up within the Met Office and created the capability to package up the entire UM system for implementation by external users.

• The first stage of work to facilitate the introduction and coupling of new sub-models into the system was completed. This included the transfer of primary addressing calculations, which had been performed pre-3.5 by the user interface on the mainframe, into the model initialisation. In addition a new paradigm of sub-models and internal models was introduced for dealing with different combinations of sub-model types.

• The capability of running with a long physics timestep was provided, enabling significant reductions in CPU cost by reducing the number of physics calculations while maintaining model numerical stability.

• Adoption of the UMUI running on HP workstations led to a loss of bit reproducibility for models set up at 3.4 and 4.0, because floating-point calculations previously performed by the interface on the 32-bit IBM were now performed on the 64-bit Cray, and because the HP number representation differs from that of the IBM. Other minor changes were made that also affected bit reproducibility:
1. Correction in the value of ;
2. Corrections to filtering within the dynamics.

• Addition of science routines to constitute third climate physics (3CV, now labelled HADAM2b):
1. Precipitation changes: Improved mixed phase and evaporation calculations; explicit flux divergence and flux equilibrium schemes for ice fallout dependent on timestep;
2. Re-appraisal of saturated specific humidity formulation.

• A new radiation scheme (Slingo-Edwards) was introduced, covering both longwave and shortwave regions, providing a greater flexibility of choice of radiative spectral intervals.

• A new hydrology routine incorporating the Penman-Monteith scheme was added, with consistent changes made to a new version of the boundary layer routine.

• Optional removal of horizontal diffusion for sloping model levels was included.

• Latent heat nudging in the assimilation was provided for investigations with the operational mesoscale model.

• Copyright notices were added to each item of source code.

Version 4.1
Version 4.1 rationalised a number of changes introduced earlier, including some structural changes associated with the forthcoming move to a distributed memory (MPP) platform, and was the last version to be released on the Cray C90. HADCM3, the emergent climate model for the next set of major production runs, was mostly developed with modifications against version 4.0, being upgraded to 4.1 in early 1997. The operational forecast model continued at 4.0 during the lifetime of the Cray C90.

The repository for nupdate modifications to source code was changed, providing an RCS capability for lodging successive corrections. This preserves a history of local changes during the period of testing a new version, after the initial build but before the final release.

• Extensive addition of routines and code for running on a distributed memory platform. Existing source code was protected by placing most new code within a *DEF MPP compile time switch. Changes included the rationalisation of DO loop variable limits passed down to lower level physics and dynamics routines.

• The WAVE model was introduced into the UM system, with WAVE source code converted to nupdate form. Control level changes allowed a WAVE dump to be read and interfaced into lower level routines. This required changes to STASH to deal with an extra dimension for the prognostic variables, since WAVE model variables are a function of (x,y,frequency,direction).

• Further development of the submodel paradigm, partly associated with changes for the WAVE model. ppxref file functions were replaced by a newly formatted set of STASHmaster files, with a separate file for each internal model providing a description of each prognostic and diagnostic variable known by STASH for that model.

• Addition of science routines as a preliminary to (but not fully defining) HADCM3: MOSES (Met Office Surface Exchange Scheme), incorporating:
1. a Penman-Monteith boundary layer and hydrology scheme;
2. an improved multi-layer soil moisture and temperature model;
3. an interactive vegetation canopy.
• Introduction of code for sulphur cycle calculations, with:
1. A new chemistry section;
2. Wet scavenging;
3. Boundary layer and convective transport;
4. Tracer advection of aerosol.

Version 4.2/4.3
Version 4.2 was a development version, the first to be released on the Cray T3E, giving an MPP capability, with no science changes. Version 4.3 consolidated corrections for both MPP and HADCM3 science, providing full functionality for operational and HADCM3 climate models.

• Most changes were to provide an MPP capability:
1. Introduction of message passing libraries, realised through a generalised software interface (GC/GCOM);
2. Domain decomposition of model grids: 2-dimensional sub-areas for the atmosphere model and 1-dimensional (rows) for the ocean model (a schematic of the 2-D split follows this list);
3. Model dump read/write and STASH output routines re-arranged to cater for gathering and scattering data for distributed processors.
• Single processor optimisation of code tailored for a cache based rather than vector based architecture.
• The change in number representation from C90 to T3E (Cray specific to IEEE) gave greater precision but a smaller range; conversion utilities were provided for using Cray C90 files.
• A new compilation system was introduced, based on Unix `make' operating on nupdate decks, providing consistency between pre-built and locally generated compiled code, with flexibility for quick compilations.
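
To give a feel for the 2-dimensional atmosphere decomposition listed above, the routine below is a deliberately simplified sketch: a plain block split of the global grid over a logical processor grid, with remainder columns and rows given to the last processor in each direction. It ignores halos and the GC/GCOM communication layer entirely, and all names are invented rather than taken from UM code.

    ! Schematic 2-D block decomposition (not UM code): the portion of the
    ! global grid owned by a given processing element (PE).
    subroutine decompose(glob_cols, glob_rows, nproc_x, nproc_y, me, &
                         first_col, last_col, first_row, last_row)
      implicit none
      integer, intent(in)  :: glob_cols, glob_rows   ! global grid size
      integer, intent(in)  :: nproc_x, nproc_y       ! logical processor grid
      integer, intent(in)  :: me                     ! this PE's rank, 0-based
      integer, intent(out) :: first_col, last_col, first_row, last_row
      integer :: px, py, base_cols, base_rows

      px = mod(me, nproc_x)                          ! PE position within the processor grid
      py = me / nproc_x
      base_cols = glob_cols / nproc_x
      base_rows = glob_rows / nproc_y

      first_col = px * base_cols + 1
      last_col  = merge(glob_cols, first_col + base_cols - 1, px == nproc_x - 1)
      first_row = py * base_rows + 1
      last_row  = merge(glob_rows, first_row + base_rows - 1, py == nproc_y - 1)
    end subroutine decompose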

Version 4.4
The second UM version (after 4.0) to be released as a complete portable package, tested on a variety of platforms.

• OASIS coupler code, allowing external models to be coupled with UM models with a minimum of changes to the external model.
• Well-formed I/O: data files re-structured so that fields start at disk sector boundaries, improving I/O performance to disk (the padding arithmetic is sketched after this list).
• New science sections (precipitation, convection, vertical diffusion, dynamics adjustment and filtering), optimised for T3E processors but not bit reproducible with equivalent sections.
• Introduction of mixed phase cloud and precipitation scheme.
• New options for science routines, constituting the main components of the planned new climate experiment, HADCM4:
1. MOSES II: enhancement of MOSES, with surface hydrology and boundary layer scheme modifications;
2. Radiationally active anvils in convection;
3. A new ice scheme in radiation.
• Introduction of a scheme for interactive vegetation.
• Climate meaning for a Gregorian calendar (previously enabled only for a 360-day calendar).
• Faster energy correction: accumulate fluxes over a period rather than summing every timestep.
• New utilities for generating boundary condition files: bcreconf (reconfigure from another resolution) and makebc (generate from dumps).
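
The sector-alignment idea behind well-formed I/O amounts to rounding each field's length up to a whole number of sectors so that the next field starts on a sector boundary. The sketch below assumes a 512-word sector purely for illustration; the actual sector size is platform dependent.

    ! Illustrative only: pad a field length up to the next sector boundary.
    program sector_padding_demo
      implicit none
      integer, parameter :: sector = 512      ! assumed sector size in words
      integer :: field_len, padded_len

      field_len  = 7008                       ! example unpadded field length
      padded_len = ((field_len + sector - 1) / sector) * sector
      print *, "field length", field_len, "padded to", padded_len
    end program sector_padding_demo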

Version 4.5
Mostly a science update.

• Introduction of the Single Column Model as a UM configuration.

• Addition of NH3 and soot tracer species (and related modelling).

• Many improvements to carbon cycle modelling, including introduction of CO2 coupling capability in Atmosphere/Ocean runs.

• Many performance optimisations including convection load-balancing.


Versions 5.0, 5.1 and 5.2
The 5.x series of versions saw the introduction of the ``New Dynamics'': the new semi-Lagrangian dynamics based on a non-hydrostatic formulation. At the same time, the physical parameterisations were rationalised so that most schemes had only one version remaining. Versions 5.0 and 5.1 were intermediate versions with limited functionality that were only of interest to developers. Version 5.2 was the first of the 5.x versions to be used operationally.

The 5.x series introduced a number of major technical changes as well as the new dynamical core:

• The reconfiguration was rewritten in Fortran 90 (using data-types, dynamical memory allocation etc).

• Model addressing was changed to start from the southernmost row rather than the northernmost row.

• Lateral boundary conditions were extended to use a larger range of variables and to include halos external to the domain as this was required by the dynamics.

• The C pre-processor (cpp) was introduced as a means of conditional compilation, taking over this functionality from nupdate.
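
A toy example of the cpp mechanism is sketched below; it is not UM code. The MPP macro echoes the *DEF MPP switch introduced at 4.1, the halo values are invented, and such a file would be compiled with the preprocessor enabled (for example gfortran -cpp -DMPP).

    ! Toy example of cpp conditional compilation replacing *DEF switches.
    program cpp_switch_demo
      implicit none
    #if defined(MPP)
      integer, parameter :: halo_x = 2, halo_y = 2   ! halos needed on a decomposed grid
    #else
      integer, parameter :: halo_x = 0, halo_y = 0   ! no halos on a single processor
    #endif
      print *, "halo sizes:", halo_x, halo_y
    end program cpp_switch_demo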

Version 5.3
• nupdate was replaced by pumscm - a Perl program that was written in-house and implemented just those features of nupdate that were commonly used.

• A coastal tiling scheme was introduced to give better representation of surface land-sea processes at the sub-gridbox scale.

• A new parametrization scheme for calculation of local mixing coefficients within the boundary layer under stable conditions was introduced.

• Addition of many STASH diagnostics required for operational running.

• The Ultra-Simple Spectral gravity wave parameterisation was introduced (A06_4A)

• There was a major restructuring of the Ocean model code to improve performance and tidy the code.


Version 5.4
This version was primarily a science update for the planned HadGEM1 climate model.

• A new convection scheme (4A) was introduced.

• A preliminary version of a new Prognostic Cloud and Prognostic Condensate scheme (PC2) was introduced.

• A new option to use mixing ratios rather than humidities in the dynamics was included.


Version 5.5
Another release primarily for consolidation of new science.

• The STOCHEM atmospheric chemistry model was introduced.

• The Wave model (which had been largely dormant since its introduction into the UM) was updated with the WAM model.

• A new river-routing scheme was included to improve the water cycle particularly for coupled ocean-atmosphere configurations.

• The ability to reconfigure from ECMWF GRIB data (which had not been possible since the introduction of the New Dynamics) was re-enabled.

• In order to accommodate new schemes, the STASHmaster option codes were extended to 30 digits.

• Schemes for modelling the effects of mineral dust and biomass smoke atmospheric tracers were included.

• A new FOAM assimilation scheme was introduced.

• Multi-category sea-ice was introduced as an option.

• The ability to use WGDOS packing for fieldsfiles and 32 bit packing for dumps on non-Cray platforms was introduced.