J. Space Weather Space Clim., Volume 10, 2020
Article Number: 14
Number of page(s): 23
Section: Agora
DOI: https://doi.org/10.1051/swsc/2020012
Published online: 17 April 2020
Agora – Project report
The Virtual Space Weather Modelling Centre
1 KU Leuven, 3000 Leuven, Belgium
2 Royal Observatory of Belgium, 1180 Ukkel, Belgium
3 Von Karman Institute, 1640 Sint-Genesius-Rode, Belgium
4 Space Applications Services, 1932 Zaventem, Belgium
5 DH Consultancy, 3000 Leuven, Belgium
6 Royal Belgian Institute for Space Aeronomy, 1180 Ukkel, Belgium
7 British Antarctic Survey, CB3 0ET Cambridge, United Kingdom
8 European Space Research and Technology Centre (ESTEC), 2201 AZ Noordwijk, The Netherlands
9 Institute of Physics, University of Maria Curie-Skłodowska, 20-400 Lublin, Poland
10 European Space Operations Centre, European Space Agency, 64293 Darmstadt, Germany
* Corresponding author: Stefaan.Poedts@kuleuven.be
Received: 29 October 2019
Accepted: 4 March 2020
Aims. Our goal is to develop and provide an open end-to-end (Sun to Earth) space weather modelling system, which makes it possible to combine (“couple”) various space weather models in an integrated tool, with the models installed either locally or geographically distributed, so as to better understand the challenges in creating such an integrated environment. Methods. The physics-based models are installed on different compute clusters, can be run interactively and remotely, and can be coupled over the internet using open-source “high-level architecture” software to form complex modelling chains involving models from the Sun to the Earth. Visualization tools have been integrated as “models” that can be coupled to any other integrated model with compatible output. Results. The first operational version of the VSWMC is accessible via the SWE Portal and demonstrates its end-to-end simulation capability. Users interact via the front-end GUI, where they can run complex coupled simulation models and view and retrieve the output, including standard visualizations. Hence, the VSWMC provides the capability to validate and compare model outputs.
Key words: space weather models / heliospheric-ESC / operational modelling centre
© S. Poedts et al., Published by EDP Sciences 2020
This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
1 Introduction
Given the enormous socio-economic impact of space weather on Earth (Eastwood et al., 2017), it is increasingly important to provide reliable predictions of space weather and its effects on our ground-based and space-borne technological systems, human life and health. This requires deeper insight into the physical mechanisms that cause space weather and its multiple effects. Clearly, observations and continuous monitoring are also extremely important, but sometimes observations are limited or difficult to interpret (due to, e.g., projection effects) and some important parameters simply cannot be observed directly (e.g., the coronal magnetic field). In these cases we have to rely on mathematical models. Such models can take into account the physical and/or chemical processes behind the phenomena of interest, and the resulting equations can be solved on powerful computer clusters. These numerical simulation models become ever more realistic and can provide additional information where direct observations are not possible, such as the solar coronal magnetic field topology, the density structure in a CME or magnetic cloud, and the local velocity of an approaching CME. After appropriate validation, some of these models even have predictive value, so that they can be used to forecast, for instance, the arrival of a CME shock at Earth or the radiation to be expected at the location of a satellite, enabling, in some cases, mitigation of its destructive effects.
Empirical and semi-empirical models are much simpler and are to be preferred for forecasting and predictions as long as they are reliable. When they do not perform satisfactorily, physics-based models can offer a solution. However, such physics-based models are often rather complicated and difficult to install and operate. Moreover, they often require a substantial amount of CPU time and computer memory to run efficiently, and they produce enormous amounts of output that need to be interpreted, analysed and visualised. Therefore, integrated space weather model frameworks are being developed that provide a simple (graphical) user interface to simplify the use of such simulation models. The Community Coordinated Modeling Center (CCMC, https://ccmc.gsfc.nasa.gov/), for instance, is a multi-agency (NASA, NSF) initiative that “enables, supports and performs the research and development for next-generation space science and space weather models”. The Space Weather Modeling Framework (SWMF, Tóth et al., 2005) at the Center for Space Environment Modeling (CSEM) at the University of Michigan (USA) is another example of such a framework. CSEM too develops high-performance simulation models and uses them to forecast space weather and its effects. These frameworks thus provide a standard environment and serve as model and data repositories, enable model simulation runs and validation of the obtained results, and even facilitate the coupling of different (sub)models integrated in the framework to support space weather forecasters and even space science education.
The ESA Virtual Space Weather Modelling Centre (VSWMC) is an ambitious project that aims to develop an alternative European framework with extra features and facilities. The VSWMC-Part 2 project was part of the ESA space situational awareness (SSA) Programme which is being implemented as an optional ESA programme supported by 19 Member States (https://www.esa.int/Our_Activities/Space_Safety/SSA_Programme_overview).
More precisely, it was part of the space weather segment (SWE) of the SSA Period 2 programme as a “Targeted Development”, viz. P2-SWE-XIV: Virtual Space Weather Modelling Centre. This ambitious project further developed the VSWMC building on the Part 1 prototype system that was developed as a General Support Technology Programme (GSTP) project (Contract No. 4000106155, ESTEC ITT AO/1-6738/11/NL/AT, 2012–2014), and focusing on the interaction with ESA’s SSA SWE system. This included the efficient integration of new models and new model couplings (compared to the earlier prototype), including a first demonstration of an end-to-end simulation capability, but also the further development and wider use of the coupling toolkit and the front-end GUI which was designed to be accessible via the SWE Portal (http://swe.ssa.esa.int/), and the addition of more accessible input and output data on the system and development of integrated visualization tool modules.
The consortium that took up this challenge consisted of KU Leuven (prime contractor, Belgium), Royal Belgian Institute for Space Aeronomy (BISA, Belgium), Royal Observatory of Belgium (ROB, Belgium), Von Karman Institute (VKI, Belgium), DH Consultancy (DHC, Belgium), Space Applications Services (SAS, Belgium), and British Antarctic Survey (BAS, United Kingdom).
The VSWMC-P2 system is an updated design and fresh implementation of the earlier prototype VSWMC-P1 system and contains full-scale SWE models and model couplings that are ready for operational use, as well as some demo models enabling tests with models installed in remote locations (Brussels, Paris, and Cambridge). The models and model couplings that are ready for operational use have been separated into a limited operational system that passed the acceptance tests on 04/03/2019. The system went operational on 28 May 2019 as part of the ESA SSA SWE Portal.
In Section 2 we provide a brief general description of the scope of the project and its objectives. In Section 3 we give an overview of the design of the fully deployed VSWMC system. In Section 4 we focus on the release of the system, with its two components, viz. a test/development system and a limited operational system containing the tested, stable models and model couplings integrated in the SSA Space Weather Portal, and on the functionality of the newly released system. We conclude with a summary of the major achievements and ideas/recommendations for future enhancements.
2 General description and objective(s)
2.1 Background
The Virtual Space Weather Modelling System (GEN/mod) is a service in the general data services (GEN) domain of the SSA SWE services (ESA SSA Team, 2011). See also the ESA website on SSA Space Weather services: http://swe.ssa.esa.int/web/guest/user-domains.
The continued development of the VSWMC was intended to be fully in line with the federated approach (see Fig. 1) of the SSA programme space weather element (SWE) in its current Period 3. The VSWMC system is installed on a virtual server within the firewall of the KU Leuven and makes use of the data storage and High Performance Computing facilities of the Vlaams Supercomputer Centrum (VSC, https://www.vscentrum.be/) at the KU Leuven. The users log in via the Single Sign On system, which requires a “hand-shaking” procedure with a server at the ESA centre in Redu. Some of the integrated models are installed on the Tier-2 cluster in Leuven. Other models, however, are integrated remotely, i.e. they are installed on the local server or cluster of the modeller involved and use local CPU time. Nevertheless, these models are controlled via the VSWMC system in Leuven. Examples are XTRAPOL, running in Paris, and BAS-RBM, running in Cambridge. The same applies to external databases that serve as input for running some of the models. For instance, solar surface magnetograms are downloaded from the GONG (Harvey et al., 1996) database (https://gong.nso.edu/data/magmap) and CME input parameters from the space weather database of notifications, knowledge, information (DONKI, https://kauai.ccmc.gsfc.nasa.gov/DONKI) server at CCMC.
Fig. 1 Basic set-up of the federated VSWMC-P2 service with geographically distributed system elements.
2.2 Long term objective: future more advanced VSWMC
The VSWMC is being developed in different phases. The present paper reports on the status after the second phase. The future final VSWMC shall perform, as a minimum, the following functions:
Provide a repository for accessing space weather models.
Allow the user to interactively couple SWE models.
Allow for the execution of coupled model simulations, providing a robust framework supporting end-to-end space weather simulations.
Ingest additional or new SWE data sets/data products from remote data providers in order to run the included models.
Provide an infrastructure for installing geographically distributed system elements as federated elements within the SSA SWE network.
Perform verification of installed models.
Provide model output visualisation capabilities.
Provide capability to validate and compare model outputs.
Provide an interface for forecasters and other users to perform complex simulations.
2.3 Potential user groups/users
Most of the “users” or “customers”, i.e. the people that will interact with the VSWMC, can be associated with a specific space weather science and/or service domain. The Expert Service Centre on Heliospheric Weather and its Expert Groups are an obvious example of potential users of the VSWMC. It is equally evident that the Expert Groups of the other four Expert Service Centres will also benefit from the use of the VSWMC as it contains, or will contain in the near future, solar models e.g. for solar flares and CME onset (Solar-ESC), ionospheric models (Ionospheric Weather-ESC), Solar Energetic Particles models (Space Radiation-ESC), and Earth magnetosphere and geomagnetic effects models (Geomagnetic-ESC).
In the design of the VSWMC, the “customers” of foremost importance were the model developers, with ESA using model predictions, and with scientists, industrial users, and the general public as additional users. These users and their potential requirements of the system have first been described in detail. Upon translating these user requirements to system requirements, a further distinction between different users was made, since this proved necessary for the implementation of the prototype system. The approach followed was to consider four categories of users: content providers (modellers), simulators (running models), end users (using the output of model runs), and VSWMC personnel (admin and support).
2.4 Short term objective: VSWMC Part 2
The new developments for Part 2 have focused on the VSWMC prototype and its interaction with the SSA SWE system, the modification of model wrappers (so-called MCIs, model coupling interfaces) to exchange only relevant/required information, the interfacing of new models on the system, and the development of new model couplings, including a first demonstration of an end-to-end (Sun – Earth) simulation capability. A lot of attention was also paid to the development of the run-time interface (RTI, allowing the user to easily change simulation/model parameters before the simulation starts) in order to be able to cope with high communication loads, and to the development and wider use of the coupling toolkit and the front-end, in particular the graphical user interface (GUI), which has been designed to be accessible via the SWE Portal. Last but not least, this Part 2 project also assured the availability of more data on the system (for model input and for valorisation of the results) and the development of visualisation tools.
2.5 Outcome
The project work flow contained three distinct parts focussing on the design, the development and the demonstration of its functionality, respectively. The main outcomes of the project are: (1) an updated architectural design of the full VSWMC system and the detailed design of the system based on an updated requirements analysis; (2) the new release of the VSWMC, i.e. an updated core system, new added models, additional data provision nodes and a novel graphical user interface; and (3) a demonstration of the model federate outputs in visualisation federates and validations in order to showcase the functionality of the system to perform verification and validation of models.
These outcomes are described in more detail below. The Scientific Advisory Team (SAT) of this activity consisted of A. Aylward, S. Bruinsma, P. Janhunen, T. Amari, D. Jackson, S. Bourdarie, B. Sanahuja, P.-L. Blelly, and R. Vainio. The SAT members were consulted via emails and during two so-called round-table meetings. During the first round table meeting with the Scientific Advisory Team, the planning of the VSWMC project and the Customer Requirements Document and System Requirements Document have been discussed with the SAT members as well as the Asset Review focussing on the missing assets. The 2nd round table meeting focused on the selection of the models to be included, the required data provision, the desired model couplings and visualizations, related challenges, as well as a first reflection on the verification and validation problems and how to handle them properly.
3 Design of the full VSWMC system
3.1 Assets review and customer requirements
The VSWMC-P2 team’s first task consisted of reviewing the existing space weather models and data-related assets across Europe including assets produced within projects of the 7th Framework Programme funded by the European Commission from 2007 until 2013, assets from the SSA SWE Assets Database, the assets already identified during the VSWMC-P1 project and additional assets suggested by the Science Advisory Team during the first Round Table meeting. The review process led to an Assets Review Report and assessed the suitability of each asset for its exploitation in the VSWMC, especially with regards to real-time space weather forecasting. Moreover, based on the review a gap analysis was presented to indicate areas of lacking maturity in present European modelling capabilities.
The VSWMC-P2 team and the SAT also reflected upon the Customer Requirements and the relations of the VSWMC customer requirements to the customer requirements of the whole SSA SWE service system. The set of domains can be divided in two major groups, viz. the domains corresponding to the physical components of the space environment and the space weather service domains, representing the ESA Space Situational Awareness classification. The origin and the relevance of the requirements has been illustrated by a number of “user stories”, presenting the actual needs of e.g. a general end user (amateur astronomer, public), a space instrument designer, or an experienced space weather scientist.
3.2 System requirements
The VSWMC system requirements address the following audiences: system developers and operators, model implementers, and end users. One of the most challenging system requirements concerned the desire to develop an infrastructure for installing geographically distributed system elements as federated elements within the SSA SWE network. To meet this challenge, the VSWMC is built on the “high-level architecture (HLA)”. This terminology refers to a general-purpose software architecture that has been developed to enable distributed computer simulation systems, i.e. with different components of the simulation running on different (remote) computers with different operating systems. Within the HLA framework, different computer simulations, or different components of a large simulation, can interact (i.e., interchange data, synchronize actions) regardless of the computing platforms on which they run and regardless of the programming language they are developed in. The interaction between the different components of the simulation is managed by a run-time infrastructure (RTI). In other words, HLA provides a general framework and standard (IEEE Standard 1516-2000) facilitating interoperability (and reusability) of distributed computer simulation components. It is currently used in applications in a number of different domains such as defence, air traffic management, off-shore, railway and car industry, and manufacturing.
The other system requirements are summarized in Figure 2. The users log in via the web portal (in the SSA SWE system). The Core system contains four service components, viz. the model and simulation repositories, the model coupling interfaces (MCI) and a collection of reusable algorithms and tools. The model developers have to write a model-specific MCI implementation. The computational models/solvers and data streams are treated uniformly through the same abstract MCI. The Core system also contains a data archive and a user management component. Only the runtime system interacts with the HLA bus to coordinate simulations. Note that model visualizations are implemented as “federates” (HLA terminology), i.e. like any other model, taking synthetic data from simulations and producing plots and/or movies.
Fig. 2 VSWMC-P2 system requirements overview.
3.3 Architectural and detailed design
A system interface control document (ICD) has been developed that provides third party model integrators and infrastructure providers all necessary information to be able to integrate and run new models in the VSWMC system. This document covers the interaction between system, Model Coupling Interfaces and model and describes the MCI functions, the MCI communication to the core system and the Coupling Toolkit functionality available to the MCI, as well as access methods and environmental requirements.
The run-time system (RTS) prepares the models for execution and manages the data exchange between the models, over the internet in case the models are installed in different locations (see Fig. 3). The run-time system is capable of executing parameterized simulation runs. As a simulation is interpreted, different models are retrieved from the Model Repository (cf. Sect. 3.2 and Fig. 2).
Fig. 3 Illustration of the run-time system taking models from the repository and linking them to each other via Model Coupling Interfaces.
The architectural design of the complete VSWMC system has been updated. During the VSWMC-Part 1 project, a prototype of the VSWMC system had been developed. This consisted of a simulation framework using at its core CERTI, an open source implementation of the high level architecture (HLA) middleware. The latter allows for connecting various models (through data exchange) which can run remotely distributed and concurrently, potentially achieving good performance in complex simulations. The coupling is not intrusive, in the sense that each model is treated as a black box, therefore unmodified, and encapsulated in a separate component through a model coupling interface (MCI). At present, custom scripts must be provided by the modellers to integrate and run their models within the VSWMC. Before the data are exchanged, any data conversions or transformations necessary for achieving the coupling are taken care of by the coupling toolkit (CTK). The latter has been implemented as a standalone flexible infrastructure where each new feature is treated as a configurable and dynamic plug-in. A GUI is also available to simplify the usage of the VSWMC system by end users.
When a user wants to run a simulation, he/she has to provide the inputs required by the simulation. An input file is created by the front-end based on a template, by replacing template variables with values provided by the user via an input widget. The user can see the resulting file (together with the simulation outputs) but does not have the possibility to change it, as we want to prevent invalid input files from being used. Only the operator can generate/modify the templates used for the configuration files.
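As an illustration of this template mechanism, the short Python sketch below fills a hypothetical configuration template with user-supplied values; the template text, variable names and file layout are invented for the example and are not the actual VSWMC templates.

```python
from string import Template

# Hypothetical template text for a model configuration file; in the real
# system the operator maintains such templates and the front-end fills them in.
CONFIG_TEMPLATE = Template(
    "start_time = $start_time\n"
    "end_time   = $end_time\n"
    "cme_speed  = $cme_speed  # km/s\n"
)

def render_config(user_values):
    """Substitute the user-provided widget values into the template.

    Template.substitute raises KeyError for any missing variable, which is
    one simple way to reject incomplete input before a run is started.
    """
    return CONFIG_TEMPLATE.substitute(user_values)

if __name__ == "__main__":
    text = render_config({
        "start_time": "2012-06-12T12:23:00",
        "end_time": "2012-06-17T12:23:00",
        "cme_speed": 800,
    })
    print(text)  # the user can inspect this file but not edit it
```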
4 Release of the VSWMC
4.1 Development of the VSWMC core system
Priority has been given to the interfaces between the core system and the model federates, the data provision federate, and the front-end component. The underlying processing has been implemented to make all the defined interfaces operational.
The run-time system was enhanced with, amongst others, a parallel real-time infrastructure gateway (RTIG, illustrated in Fig. 4) to share the communication loads, extensible simulation-specific configuration through the use of Python scripts, real-time data connectivity to connected clients (e.g., live streaming of log files), and connectivity to VSWMC nodes installed both on-premise and off-premise (the VSWMC system is spread out on different compute cluster nodes or platforms).
Fig. 4 RTI gateways (RTIGs) manage the simulations and transfer messages between federates. The VSWMC-P2 system supports multiple RTIGs to tackle high communication loads.
The software architecture is made up of the following parts:
A front end for regular users to interact with the VSWMC to run simulations.
Coupler software which deals with the timing of model calls and data exchange.
A library of coupling tools for data transformation from one model to another.
A model coupling interface for the models themselves and datasets for model input.
The VSWMC system treats each model as a black-box. Hence, to integrate and run it within the VSWMC system (which is done by encapsulating the model in a separate component through a model coupling interface [MCI]), the model developer has to provide all the information (metadata) that describes properties of the model necessary for enabling its integration and operation through the VSWMC system.
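To make the black-box approach concrete, the sketch below shows what a model description and a minimal coupling wrapper could look like in Python; the class, field and method names are hypothetical illustrations and do not reproduce the actual VSWMC MCI API, which is specified in the system ICD (Sect. 3.3).

```python
from dataclasses import dataclass, field

@dataclass
class ModelMetadata:
    """Hypothetical metadata record a modeller could supply for integration."""
    name: str                                      # e.g. "EUHFORIA Corona"
    version: str
    inputs: dict = field(default_factory=dict)     # e.g. {"magnetogram": "FITS"}
    outputs: dict = field(default_factory=dict)    # e.g. {"boundary_data": "ASCII"}
    run_command: str = ""                          # how the black-box code is launched

class ModelCouplingInterface:
    """Sketch of a wrapper around an unmodified (black-box) model."""

    def __init__(self, metadata: ModelMetadata):
        self.metadata = metadata

    def prepare(self, run_dir, inputs):
        """Stage the input files the black-box model expects in run_dir."""
        raise NotImplementedError

    def run(self):
        """Launch the unmodified model executable (metadata.run_command)."""
        raise NotImplementedError

    def publish(self):
        """Collect the model output for the run-time system, which forwards it
        to the coupled federates."""
        raise NotImplementedError
```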
4.2 Interfacing of additional space weather models
Extra physics-based as well as empirical and data-driven codes have been added to the VSWMC as the prototype system only contained a few demonstration models to show the capabilities of the system. The VSWMC system now contains the following models (some are still not fully operational as some modellers only provided a limited demo model):
AMRVAC Solar Wind (demo): Steady 2.5D (axisymmetric) solar wind (quiet Sun) to 1 AU (Hosteaux et al., 2018, 2019; Xia et al., 2018).
AMRVAC CME (demo): 2.5D (axisymmetric) flux rope CME superposed on the AMRVAC Solar Wind as an initial condition and propagating to 1 AU (Xia et al., 2018; Hosteaux et al., 2019).
COOLFluiD Steady (demo): calculates the position and shape of the bow shock at Earth for specified steady solar wind conditions (3D) (Lani et al., 2013; Yalim & Poedts, 2014).
CTIP (demo): Coupled Thermosphere Ionosphere Plasmasphere model, a global, three-dimensional, time-dependent, non-linear code that is a union of three physical components (a thermosphere code, a mid- and high-latitude ionosphere convection model, and a plasmasphere and low-latitude ionosphere model) (Fuller-Rowell & Rees, 1980; Millward et al., 1996).
EUHFORIA: 3D steady heliospheric wind with superposed (Cone) CME propagation (Pomoell & Poedts, 2018; Scolini et al., 2018).
Dst index: empirical model to determine the Dst index from solar wind data at L1 (provided by C. Scolini, based on O’Brien & McPherron, 2000).
Kp index: empirical model to determine the Kp index from solar wind data at L1 (provided by C. Scolini, based on Newell et al., 2008).
Plasmapause stand-off distance: empirical model using solar wind data at L1 (provided by C. Scolini, based on Taktakishvili et al., 2009).
BAS-RBM: 3-D, time-dependent diffusion model for phase-space density based on solution of the Fokker–Planck equation that produces a time-series of the flux or phase-space density on the 3-D grid (Glauert et al., 2014); (Note: this model has been taken out of operation as BAS left the project team).
GUMICS-4: a global magnetosphere–ionosphere coupling simulation based on global MHD magnetosphere and an electrostatic ionosphere (Janhunen et al., 2012; Lakka et al., 2019).
COOLFluiD unsteady: 3D time-accurate Earth magnetosphere model (Lani et al., 2013; Yalim & Poedts, 2014).
ODI: takes data from the open data interface (ODI), a database system for retrieving, processing and storing space environment (and other) data and metadata in a MySQL (MariaDB, one of the most popular open source relational databases) database (https://spitfire.estec.esa.int/trac/ODI/wiki/OdiManual).
XTRAPOL (demo): extrapolation of coronal magnetic field in an active region (Amari et al., 2006).
An interface control document (ICD) has been provided for each VSWMC model showing e.g. its overall functions, outputs, inputs, hardware requirements, the amount of CPU hours required, and including the model structure where appropriate, its technology readiness level (TRL) and a reference. A set of new model coupling interfaces has been developed to interface the VSWMC models. More information on each of the operational models is given below in Section 4.3.
4.3 Operational models
The VSWMC went operational in May 2019 and is available to everybody via the SSA Space Weather Portal. Of course, users first need to provide credentials, but these can be obtained after a simple request. For the time being, the VSWMC is reachable via the Heliospheric Weather Expert Service Centre (H-ESC) webpage, where it is shown as “Product demonstration” under “Centre for mathematical Plasma-Astrophysics (KUL/CmPA)” (see Fig. 5).
Fig. 5 Screenshot of the H-ESC webpage with the link “VSWMC” on the “Product demonstration” tab that gives access to the login page of the VSWMC.
It will be given a more prominent place in the future. Clicking on the “VSWMC” button brings the user to the login page: http://swe.ssa.esa.int/web/guest/kul-cmpa-federated.
As mentioned above, this operational system only contains the models that are full-fledged, verified and validated via the standard procedures. These are the following models.
4.3.1 EUHFORIA-Corona
Model description
The European Heliospheric FORecasting Information Asset (EUHFORIA) model aims to provide a full Sun-to-Earth modelling chain combining efficient data-driven, semi-empirical, forward approaches with physics-based models wherever appropriate. It consists of two parts, viz. a coronal module and a heliospheric solar wind module that enables superimposed CME evolution. The modules can be run together or each module can be run separately if wanted. This section describes the EUHFORIA Corona module interface.
The aim of the coronal module is to provide the required MHD input quantities at 21.5 Rs for the heliospheric solar wind module. The coronal module in EUHFORIA is data-driven and combines a PFSS magnetic field extrapolation from GONG or ADAPT magnetograms (1 – 2.5 Rs) with the semi-empirical Wang–Sheeley–Arge (WSA) model and the Schatten current sheet (SCS) model to extend the velocity and magnetic field from 2.5 Rs to 21.5 Rs. This is done in combination with other semi-empirical formulas, so that the density and the temperature are also provided at 21.5 Rs.
Model access and run information
In the VSWMC framework, EUHFORIA Corona is supposed to be installed and run on one of the KU Leuven HPC servers, access to which is provided via ssh.
Model input
As input EUHFORIA Corona takes either standard GONG (as stored at http://gong.nso.edu/data/magmap/QR/) or GONG ADAPT magnetogram synoptic maps (as stored at ftp://gong2.nso.edu/adapt/maps/gong/) in the FITS file format. Files compressed with gzip are also supported. The magnetogram provider (GONG or GONG_ADAPT) is defined in the configuration file with the keyword provider.
The magnetogram source is defined in the configuration file with the keyword source. Four source types are supported (a sketch of the corresponding selection logic follows the list):
If a URL is provided, the file is downloaded using that URL.
If it is a locally stored magnetogram file, the file is used.
If a date & time string (e.g., 2012-06-12T12:23:00) is provided, the file corresponding to the date is downloaded using the URL of magnetogram provider defined in the provider field.
If keyword “latest” is provided, the most recently available magnetogram is downloaded using the URL of magnetogram provider defined in the provider field.
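The Python sketch below illustrates one way this source-selection logic could be implemented; the function name, the provider file-name patterns and the date-to-URL mapping are assumptions made for the example and do not reproduce the actual EUHFORIA Corona configuration code.

```python
import os
import re
import urllib.request

def resolve_magnetogram(source, provider_url, download_dir="."):
    """Return a local path to the magnetogram selected by `source`.

    Implements the four cases described above: a URL, a local file,
    a date/time string, or the keyword "latest".
    """
    if source.startswith(("http://", "https://", "ftp://")):
        # Case 1: explicit URL -> download it.
        local = os.path.join(download_dir, os.path.basename(source))
        urllib.request.urlretrieve(source, local)
        return local
    if os.path.isfile(source):
        # Case 2: locally stored magnetogram file -> use it directly.
        return source
    if re.match(r"\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}", source):
        # Case 3: date & time string -> build a download URL from the provider.
        # The file-name pattern used here is purely illustrative.
        url = f"{provider_url}/mrzqs_{source}.fits.gz"
        local = os.path.join(download_dir, os.path.basename(url))
        urllib.request.urlretrieve(url, local)
        return local
    if source == "latest":
        # Case 4: fetch the most recently available magnetogram
        # (assumed here to be exposed by the provider under a fixed name).
        url = f"{provider_url}/latest.fits.gz"
        local = os.path.join(download_dir, "latest.fits.gz")
        urllib.request.urlretrieve(url, local)
        return local
    raise ValueError(f"Unsupported magnetogram source: {source!r}")
```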
Model output
The output of the EUHFORIA Corona model is the solar wind boundary data file that provides MHD input quantities at 21.5 Rs for the EUHFORIA Heliosphere solar wind module.
Related paper
J. Pomoell and S. Poedts: “EUHFORIA: EUropean Heliospheric FORecasting Information Asset”, J. of Space Weather and Space Climate, 8, A35 (2018). DOI: https://doi.org/10.1051/swsc/2018020.
4.3.2 EUHFORIA-heliosphere
Model description
The heliosphere module of EUHFORIA provides the solar wind from 21.5 Rs to 2 AU (or further if necessary). Input at 21.5 Rs is provided by a coronal module, for example, EUHFORIA-Corona. It initially extends the (purely radial) velocity and magnetic field to 2 AU and subsequently relaxes this initial MHD solution by applying a rotating inner boundary to create the solar background wind in a relaxed state. This yields a steady solar wind from 21.5 Rs to 2 AU in the co-rotating frame, as the inner boundary condition is not updated, but merely rotated. The coordinate system of the model is HEEQ.
Apart from providing the background solar wind, the EUHFORIA-heliosphere model is also able to launch CME models superimposed on the background solar wind. Therefore, it can simulate CME evolution up to 2 AU (and beyond, if required). The classic cone CME model is currently fully supported, and a novel Gibson–Low flux-rope CME model is under development. In contrast with the classic cone model, the Gibson–Low flux-rope model enables modelling not only the CME shock evolution but also the internal magnetic structure of the interplanetary magnetic cloud following the shock; the magnetic field of the CME is not modelled with the cone model. The Gibson–Low model was added to EUHFORIA recently and is not yet capable of predicting the flux-rope parameters for efficient forecasting. However, in the future efforts will be made towards this goal and the VSWMC will be a great way to test the prediction capabilities of this flux-rope model.
Model access and run information
In the VSWMC framework, EUHFORIA Heliosphere is supposed to be installed and run in the KU Leuven HPC, access to which is provided via ssh.
Model input
As input, EUHFORIA Heliosphere accepts two files: a file containing the solar wind boundary data, which is mandatory, and a file containing a list of CMEs relevant for the particular simulation run.
Solar wind boundary data file. This file is mandatory and normally it is provided as the EUHFORIA Corona v.1.0 model output.
CME list file. This file is optional, is also in ASCII text format, and has to be created by the user manually (or with the help of the VSWMC framework) in accordance with the CME model template. Several CME models are supported by EUHFORIA Heliosphere v.1.0, and hence there are several templates that describe somewhat different CME parameters, but in the current VSWMC phase only the cone CME model is supported.
Model output
As output, EUHFORIA Heliosphere generates the physical parameters of the solar wind from 21.5 Rs to 2 AU (or further if necessary); see the Model Description section. The parameters are the following:
Output solar wind parameter | Data unit | Data type |
---|---|---|
Date | YYYY-MM-DDThh:mm:ss (ISO8601 date and time format) | String |
Grid point radial coordinate r | AU (Astronomical unit) | Float |
Grid point colatitude clt | rad (radian) | Float |
Grid point longitude lon | rad (radian) | Float |
Number density n | 1/cm3 (particles per cubic centimeter) | Float |
Pressure P | Pa (Pascal) | Float |
Radial component of velocity vr | km/s (kilometers per second) | Float |
Colatitude component of velocity vclt | km/s (kilometers per second) | Float |
Longitude component of velocity vlon | km/s (kilometers per second) | Float |
Radial component of magnetic field Br | nT (nano Tesla) | Float |
Colatitude component of magnetic field Bclt | nT (nano Tesla) | Float |
Longitude component of magnetic field Blon | nT (nano Tesla) | Float |
Related paper
J. Pomoell and S. Poedts: “EUHFORIA: EUropean Heliospheric FORecasting Information Asset”, J. of Space Weather and Space Climate, 8, A35 (2018). DOI: https://doi.org/10.1051/swsc/2018020.
4.3.3 GUMICS-4
Model description
The global magnetosphere–ionosphere coupling model GUMICS-4 solves the ideal MHD equations in the magnetosphere and couples them to an electrostatic ionosphere. The inner boundary of the MHD domain is at 3.7 Earth radii. Between the ionosphere and 3.7 Earth radii, the quantities involved in the ionosphere–magnetosphere coupling loop (potential, precipitation, field-aligned currents) are mapped along unperturbed magnetic field lines. The MHD part uses an unstructured finite-volume octogrid which is automatically refined and coarsened during the run, also using refinement priorities hand-coded for the magnetosphere. The MHD solver is the Roe solver. In cases where one or more of the intermediate states returned by the Roe solver are not physical (negative density or pressure), the robust HLL solver is used instead. Analytic splitting of the magnetic field into a dipole field and a perturbation field is used. The MHD code is time accurate and uses temporal subcycling to speed up the computation.
The ionospheric model consists of a 2-D elliptic solver for the electric potential. The source term is proportional to the field-aligned current obtained from the MHD variables and mapped down to the ionospheric plane. The coefficients of the elliptic equation contain the height-integrated Pedersen and Hall conductivities. The ionospheric electron density is initially computed on a 3-D grid using production and loss terms. The conductivities needed by the elliptic solver are obtained by explicit height integration of the 3-D conductivities. The electron density takes contributions from modelled solar UV radiation, from a constant background profile, and from electron precipitation, which is modelled from the MHD variables that map to the point.
Model access and run information
In the VSWMC framework, GUMICS-4 is installed and run on the same virtual server where the framework itself runs.
Model input
The main input for GUMICS-4 is the solar wind time series at the Lagrange L1 point or other upstream point. The input can be artificial or obtained from a satellite such as ACE or SOHO, for example via ODI. The solar wind data are read from a separate text file.
The parameters required for GUMICS-4 are the following: particle density, temperature, and velocity and magnetic field vectors in the GSE coordinate system. The data values are in SI units. The file contains the following parameters:
Input solar wind parameter | Data unit | Data type |
---|---|---|
Time | Seconds | Integer |
Number density | m−3 (particles per cubic meter) | Float |
Temperature | K (Kelvin) | Float |
Velocity component V_GSE_x | m s−1 (meters per second) | Float |
Velocity component V_GSE_y | m s−1 (meters per second) | Float |
Velocity component V_GSE_z | m s−1 (meters per second) | Float |
Magnetic field component B_GSE_x | T (Tesla) | Float |
Magnetic field component B_GSE_y | T (Tesla) | Float |
Magnetic field component B_GSE_z | T (Tesla) | Float |
Model output
GUMICS-4 saves the 3-D MHD variables in its unstructured grid in a custom binary file “HC” format (“Hierarchical Cartesian” format) which is efficient to store and fast to read. The ionospheric quantities are similarly stored in a “TRI” file (for TRIangular finite element grid data).
The full list of output parameters contains more than 50 entries. The list of main physical parameters for each output time stamp is the following:
Main output solar wind parameter | Data unit | Data type |
---|---|---|
Time stamp of the output file | Second | Integer |
x: X-coordinate | m (meter) | Float |
y: Y-coordinate | m (meter) | Float |
z: Z-coordinate | m (meter) | Float |
ρ: Mass density | kg/m3 (kilogram per cubic meter) | Float |
ρvx: X-component of the momentum flux | kg/(m2 s) | Float |
ρvy: Y-component of the momentum flux | kg/(m2 s) | Float |
ρvz: Z-component of the momentum flux | kg/(m2 s) | Float |
U: Total energy density, thermal + kinetic + magnetic | J/m3 | Float |
Bx: X-component of the total magnetic field | T (Tesla) | Float |
By: Y-component of the total magnetic field | T (Tesla) | Float |
Bz: Z-component of the total magnetic field | T (Tesla) | Float |
Related paper
Janhunen, P.; Palmroth, M.; Laitinen, T.; Honkonen, I.; Juusola, L.; Facskó, G.; Pulkkinen, T. I., “The GUMICS-4 global MHD magnetosphere-ionosphere coupling simulation”, Journal of Atmospheric and Solar-Terrestrial Physics, Volume 80, p. 48–59 (2012).
4.3.4 BAS-RBM
Model description
The BAS Radiation Belt Model (BAS-RBM) solves a Fokker–Planck equation for the time-dependent, drift-averaged, high energy (E > ~100 keV) electron flux in the Earth’s radiation belts. A detailed description of the model is given in Glauert et al. (2014a, b). The model includes the effects of radial transport, wave-particle interactions and losses to the atmosphere and magnetopause. Radial transport is modelled as radial diffusion using the coefficients of Ozeke et al. (2014). Wave-particle interactions due to upper and lower band chorus (Horne et al., 2013), plasmaspheric hiss and lightning-generated whistlers (Glauert et al., 2014a) and electro-magnetic ion cyclotron waves (Kersten et al., 2014) are included in the model. Losses to the atmosphere follow Abel & Thorne (1998) and losses to the magnetopause are modelled as described in Glauert et al. (2014b).
The outer radial (L*) boundary condition is determined from GOES 15 data. The inner radial boundary and the low energy boundary are set using statistical models derived from CRRES data, see Glauert et al. (2014b). The drift-averaged differential flux is calculated as a function of pitch-angle (α), energy (E), L* and time. Pitch-angles lie in the range 0° ≤ α ≤ 90°. L* is calculated using the Olson–Pfitzer quiet time model and lies in the range 2 ≤ L* ≤ 6.5. At L* = 6.5, 103.2 keV ≤ E ≤ 30 MeV, and the energy boundaries increase with decreasing L*.
The model requires the Kp index, electron and proton fluxes and position data from GOES 15 (to provide the outer boundary condition) and the magnetopause location for the period of the simulation. The start time can be any time between 00:00 on 1-1-2011 and the present. Simulations are limited to 1 week.
References
- Glauert SA, Horne RB, Meredith NP. 2014a. Three-dimensional electron radiation belt simulations using the BAS Radiation Belt Model with new diffusion models for chorus, plasmaspheric hiss, and lightning-generated whistlers. J Geophys Res Space Phys 119: 268–289. https://doi.org/10.1002/2013JA019281.
- Glauert SA, Horne RB, Meredith NP. 2014b. Simulating the Earth’s radiation belts: Internal acceleration and continuous losses to the magnetopause. J Geophys Res Space Phys 119: 7444–7463. https://doi.org/10.1002/2014JA020092.
- Horne RB, Kersten T, Glauert SA, Meredith NP, Boscher D, Sicard-Piet A, Thorne RM, Li W. 2013. A new diffusion matrix for whistler mode chorus waves. J Geophys Res Space Phys 118: 6302–6318. https://doi.org/10.1002/jgra.50594.
- Kersten T, Horne RB, Glauert SA, Meredith NP, Fraser BJ, Grew RS. 2014. Electron losses from the radiation belts caused by EMIC waves. J Geophys Res Space Phys 119: 8820–8837. https://doi.org/10.1002/2014JA020366.
- Ozeke LG, Mann IR, Murphy KR, Jonathan Rae I, Milling DK. 2014. Analytic expressions for ULF wave radiation belt radial diffusion coefficients. J Geophys Res Space Phys 119: 1587–1605. https://doi.org/10.1002/2013JA019204.
Model access and run information
For the VSWMC the BAS-RBM is accessed by placing a run request on an ftp server (ftp://vswmcftp@ftp.nerc-bas.ac.uk/vswmc_data/) at the British Antarctic Survey. Each request has a unique id generated by the VSWMC and referred to as run_id for the rest of this document. The VSWMC creates a directory called run_id in ftp://vswmcftp@ftp.nerc-bas.ac.uk/vswmc_data/ and places the following files in that directory:
A run_name.VSWMC file which defines the run to be performed.
A run_name.KP file defining the KP sequence for the simulation.
A run_name.GOES_COORD file with the required GOES 15 data for the run.
A run_name.MP file with the magnetopause standoff distance for the simulation.
Here run_name is an identifier supplied by the VSWMC to name this particular run. It can be the same as the run_id, but does not need to be. The run_name identifies all the files (both input and output) associated with a particular model run.
The BAS server will automatically detect when the files are placed in the directory and initiate a run, creating a log file run_name.log, containing information about the progress of the run. This file can be read by the VSWMC as required to allow progress to be monitored. Only one run can be calculated at any time – if requests are received while a run is in progress then they will be queued and run in turn.
The output files will be placed in the same directory as the input files (i.e. ftp://vswmcftp@ftp.nerc-bas.ac.uk/vswmc_data/run_id). A model run will produce three output files:
A run_name.3d file. This is the main output from the model and contains the differential flux as a function of pitch-angle, energy, L* and time for the simulation.
A run_name.PRE file. This gives the precipitating electron flux at the edge of the loss cone as a function of energy, L* and time.
A run_name.EFLUX file. This contains the differential energy flux as a function of pitch-angle, energy, L* and time for the simulation.
When the run is finished these output files, together with the log file, are combined into run_name.tar.gz and placed in the run_id directory on the FTP site.
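As an illustration of this workflow, the sketch below uses Python’s standard ftplib to create the run directory, upload the input files and poll for the result archive; the login credentials, polling interval and file handling are assumptions for the example and are not part of the interface described above.

```python
import ftplib
import os
import time

FTP_HOST = "ftp.nerc-bas.ac.uk"
FTP_BASE = "vswmc_data"

def submit_run(run_id, local_files, user, password):
    """Upload the .VSWMC, .KP, .GOES_COORD and .MP files into a new run_id directory."""
    with ftplib.FTP(FTP_HOST, user, password) as ftp:
        ftp.cwd(FTP_BASE)
        ftp.mkd(run_id)                       # directory named after the run_id
        ftp.cwd(run_id)
        for path in local_files:              # e.g. ["run1.VSWMC", "run1.KP", ...]
            with open(path, "rb") as fh:
                ftp.storbinary(f"STOR {os.path.basename(path)}", fh)

def wait_for_results(run_id, run_name, user, password, poll_seconds=300):
    """Poll the run directory until run_name.tar.gz appears, then return its name."""
    archive = f"{run_name}.tar.gz"
    while True:
        with ftplib.FTP(FTP_HOST, user, password) as ftp:
            ftp.cwd(f"{FTP_BASE}/{run_id}")
            if archive in ftp.nlst():         # runs are queued and processed in turn
                return archive
        time.sleep(poll_seconds)
```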
Model input
The BAS-RBM requires five input parameters and three input data files. The input parameters, specified in the run_name.VSWMC file, are:
The starting time of the run.
The length of the run.
The timestep for the run.
The output frequency – i.e. output the flux every n timesteps.
The initial condition.
For the initial state of the radiation belt at the start of the calculation the user can choose one of 12 options. Each option is the steady state solution for the given Kp value, with the flux at the outer boundary (L* = Lmax) set at a given percentile of the >800 keV flux distribution at GEO, derived from GOES 15 data. The Kp value for the steady state can be 1, 2, 3, or 4 and the flux at the outer boundary can be the 10th, 50th, or 90th percentile value. The options available for the initial condition, along with the 24 h average, >800 keV flux at GEO for the given percentile are shown in the table below.
Option | Kp | Flux percentile | >800 keV flux at GEO (cm−2 sr−1 s−1) |
---|---|---|---|
1 | 1 | 10 | 1952.7 |
2 | 1 | 50 | 17,833.9 |
3 | 1 | 90 | 75,068.6 |
4 | 2 | 10 | 2904.6 |
5 | 2 | 50 | 20,496.2 |
6 | 2 | 90 | 78,197.9 |
7 | 3 | 10 | 3676.5 |
8 | 3 | 50 | 24,131.7 |
9 | 3 | 90 | 108,545.2 |
10 | 4 | 10 | 2247.8 |
11 | 4 | 50 | 19,293.3 |
12 | 4 | 90 | 98,334.8 |
Model output
At the end of a successful run the model will produce three output files; run_name.3d, run_name.PRE, and run_name.EFLUX. These are all ASCII files that start with a header containing information about the run and then give a drift-averaged flux at the specified output times. The .3d files contain the differential flux in SI units (m−2 sr−1 s−1 J−1) at all the points of the pitch-angle, energy, L* grid used in the calculation. Similarly, the .EFLUX files contain the differential energy flux in (m−2 sr−1 s−1). The .PRE files show the differential flux (m−2 sr−1 s−1 J−1) at the edge of the bounce loss cone.
The computational grid
The modelling domain is specified in terms of pitch-angle (α), energy (E), and L* coordinates. The model assumes symmetry at α = 90°, so 0 ≤ α ≤ 90°. The L* range is 2 ≤ L* ≤ 6.5. The maximum energy (Emax) is set at Emax = 30 MeV at L* = Lmax = 6.5. The minimum energy is Emin = 103.2 keV at L* = 6.5, corresponding to a first adiabatic invariant, μ = 100 MeV/G. The minimum and maximum energies at other L* values are determined by following lines of constant first adiabatic invariant for points that lie in the computational grid at Lmax. Hence, the energy at the boundaries depends on L*, increasing as L* decreases, so that at Lmin, Emin = 708.6 keV and Emax = 178.2 MeV.
The pitch-angle grid used in the calculation is regular, independent of energy and L*, and covers the range 0 ≤ α ≤ 90°. All grid indices run from 0 to the maximum index shown in the output file, so the pitch-angle grid is given by αi = i 90/Nα, i = 0, Nα. Similarly the L* grid is given by Lk = Lmin + k (Lmax−Lmin)/NL, k = 0, NL.
The energy grid is a little more complicated as it is L* dependent and uniform in ln(energy). In the output files, starting at line 29, there is a table of the maximum and minimum energy values for each L* in the grid. The energy grid at each L* is a uniform grid in ln(energy) between the maximum energy, Emax(L), and the minimum energy, Emin(L), with NE + 1 grid points. So, if Ej is the jth grid point, ln Ej = ln Emin(L) + j [ln Emax(L) − ln Emin(L)]/NE, i.e. Ej = Emin(L) [Emax(L)/Emin(L)]^(j/NE), for j = 0, NE.
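A minimal Python sketch that reconstructs the three grids from the header quantities (Nα, NE, NL, Lmin, Lmax and the per-L* energy bounds) is given below; it simply evaluates the formulas above, and the values in the demonstration call are the boundary values quoted in the text.

```python
import numpy as np

def pitch_angle_grid(n_alpha):
    """Regular pitch-angle grid: alpha_i = i * 90 / N_alpha, i = 0..N_alpha (degrees)."""
    return np.arange(n_alpha + 1) * 90.0 / n_alpha

def lstar_grid(l_min, l_max, n_l):
    """Regular L* grid: L_k = L_min + k * (L_max - L_min) / N_L, k = 0..N_L."""
    return l_min + np.arange(n_l + 1) * (l_max - l_min) / n_l

def energy_grid(e_min, e_max, n_e):
    """Energy grid at one L*: N_E + 1 points, uniform in ln(E) between E_min(L) and E_max(L)."""
    return np.exp(np.linspace(np.log(e_min), np.log(e_max), n_e + 1))

if __name__ == "__main__":
    print(pitch_angle_grid(4))              # [ 0.  22.5 45.  67.5 90. ]
    print(lstar_grid(2.0, 6.5, 9))          # 10 points between L* = 2 and L* = 6.5
    print(energy_grid(103.2, 30000.0, 5))   # energies at L* = 6.5, in keV
```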
The .3d file
The .3d files contain the differential flux in SI units (m−2 sr−1 s−1 J−1) at the points of the computational grid. The file starts with a header that describes the setup for the run. All lines up to and including line 25 have a 40 character description, followed by the appropriate value. These are detailed in the table below.
Following the table of minimum and maximum energies vs. L* that starts at line 29, there are three title lines, then the first set of output from the model. A header line gives the time in seconds since the start of the run, the Kp index at that time and the plasmapause location in L*. This line is followed by the flux at each grid point in m−2 sr−1 s−1 J−1, in the order (((flux (i, j, k), i = 0, Nα) j = 0, NE) k = 0, NL), i.e. with the pitch-angle index on the inner loop, the energy index on the middle loop and the L* index on the outer loop. This array is written out 10 values to a line.
The flux is followed by a blank line, then the time, Kp and plasmapause position for the next set of results is given, followed by the flux again. This is repeated until the end of the run is reached.
Line | Contents | Format |
---|---|---|
1 | Model description | String |
2 | Blank | |
3 | Maximum pitch-angle index (Nα) | 40 characters followed by an integer |
4 | Maximum energy index (NE) | 40 characters followed by an integer |
5 | Maximum L* index (NL) | 40 characters followed by an integer |
6 | Minimum energy at maximum L* (keV) | 40 characters followed by a real |
7 | Maximum energy at maximum L* (keV) | 40 characters followed by a real |
8 | Minimum L* value | 40 characters followed by a real |
9 | Maximum L* value | 40 characters followed by a real |
10 | Start time for simulation | 40 characters then yyyy-mm-dd hr:mm |
11 | Time step (seconds) | 40 characters followed by a real |
12 | Number of time steps | 40 characters followed by an integer |
13 | Number of sets of results | 40 characters followed by an integer |
14 | Generalised Crank–Nicolson parameter | 40 characters followed by a real |
15 | Plasmapause model | 40 characters followed by a string |
16 | Blank | |
17 | Name of chorus diffusion matrix | 40 characters followed by a string |
18 | Name of hiss diffusion matrix | 40 characters followed by a string |
19 | Name of EMIC diffusion matrix | 40 characters followed by a string |
20 | Name of magnetosonic diffusion matrix | Not used |
21 | Name of lightning generated whistler diffusion matrix | Not used |
22 | Name of transmitter diffusion matrix | Not used |
23 | Type of radial diffusion diffusion coefficient | 40 characters followed by a string |
24 | Name of collision diffusion matrix | 40 characters followed by a string |
25 | Initial condition | 40 characters followed by a string |
26 | Blank | |
27 | Title – “Pitch-angle/Energy (keV) Grid” | |
28 | Title – “Lshell min (energy) max (energy)” | |
29 to 29 + NL | L*, minimum energy, maximum energy table | Real (10 characters) real (14 characters) real (14 characters) |
30 + NL | Blank | |
31 + NL | Title – “Flux at each grid point in /(m2 s sr J)” | |
32 + NL | Blank | |
33 + NL | Time (seconds since the start of the run), Kp index, Plasmapause location (in L) | Title (6 characters) real (16 characters) title (8 characters) real (6 characters) title (17 characters) real (6 characters) |
34 + NL and following | (((Flux (i,j,k), i = 0,Nα) j = 0,NE) k = 0,NL) | Ten, 12 character, real values to a line |
Blank | ||
Time, Kp, Plasmapause location | As above | |
(((Flux (i,j,k), i = 0,Nα) j = 0,NE) k = 0,NL) | As above | |
… |
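The sketch below shows one way the repeated flux blocks of a .3d file could be read in Python, following the layout above (a “Flux at each grid point…” title line, then alternating time/Kp/plasmapause header lines and flux arrays written ten values per line); the number extraction from the header line and the treatment of the fixed-width fields are simplifications and should be checked against real output files.

```python
import re
import numpy as np

_NUM = re.compile(r"[-+]?\d*\.?\d+(?:[eEdD][-+]?\d+)?")

def read_3d_flux_blocks(path, n_alpha, n_e, n_l):
    """Yield (time_s, kp, plasmapause_L, flux) tuples from a BAS-RBM .3d file.

    flux is returned with shape (N_L+1, N_E+1, N_alpha+1), i.e. the pitch-angle
    index varies fastest, matching the write order described in the text.
    n_alpha, n_e and n_l are the maximum grid indices from the file header.
    """
    n_values = (n_alpha + 1) * (n_e + 1) * (n_l + 1)
    with open(path) as fh:
        lines = fh.read().splitlines()

    # Locate the title line that precedes the first flux block and skip the
    # blank line that follows it (this avoids hard-coding the header length).
    i = next(k for k, ln in enumerate(lines) if "Flux at each grid point" in ln) + 2

    while i < len(lines):
        header_numbers = [float(x) for x in _NUM.findall(lines[i])]
        if len(header_numbers) < 3:          # trailing blank line(s): stop
            break
        time_s, kp, lpp = header_numbers[:3]
        values, i = [], i + 1
        while len(values) < n_values:
            values.extend(float(v) for v in lines[i].split())
            i += 1
        flux = np.array(values).reshape(n_l + 1, n_e + 1, n_alpha + 1)
        yield time_s, kp, lpp, flux
        i += 1                               # skip the blank line between blocks
```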
The .EFLUX file
The format of the .EFLUX files is identical to the .3d files, except that the differential flux is replaced by differential energy flux in m−2 sr−1 s−1. The pitch-angle, energy and L* grids and the timestamps will be identical to those in the corresponding .3d file and are provided for completeness.
The .PRE file
The .PRE files contain the drift-averaged, differential flux (m−2 sr−1 s−1 J−1) at the edge of the bounce loss cone. They begin with a header with a similar format to the .3d files, but without the pitch-angle information. The header repeats some of the corresponding output in the .3d file and is there for completeness, as the output times and energy and L* grids in the .PRE file are the same as those in the .3d file.
Following the table of minimum and maximum energies vs. L*, there are three title lines, then the first set of output from the model. The first line gives the time in seconds since the start of the run. This line is followed by the precipitating flux at each (energy, L*) grid point in m−2 sr−1 s−1 J−1, in the order ((flux (j, k), j = 0, NE) k = 0, NL), i.e. with the energy index on the inner loop and the L* index on the outer loop.
The flux at the given time is followed by a blank line, then the next time is given followed by the flux again. This is repeated until the end of the run is reached.
Line | Contains | Format |
---|---|---|
1 | Header | |
2 | Blank | |
4 | Maximum energy index (NE) | 40 characters followed by an integer (10 characters) |
5 | Maximum L* index (NL) | 40 characters followed by an integer (10 characters) |
6 | Minimum energy at maximum L* (keV) | 40 characters followed by a real (10 characters) |
7 | Maximum energy at maximum L* (keV) | 40 characters followed by a real (10 characters) |
8 | Minimum L* | 40 characters followed by a real (10 characters) |
9 | Maximum L* | 40 characters followed by a real (10 characters) |
10 | Start time for simulation | yyyy-mm-dd hr:mm |
11 | Number of sets of results | 40 characters followed by an integer (10 characters) |
12 | Blank | |
13 | Title – Energy (keV) Grid | |
14 | Title – Lshell min Energy max Energy | |
15 to 15 + NL +1 | L*, minimum energy, maximum energy table | Real (10 characters) real (14 characters) real (14 characters) |
15 + NL + 2 | Blank | |
… | Title – Precipitating flux at each grid point in /(m2 s sr J) | |
Blank | ||
Time | Title (6 characters) real (16 characters) | |
((Flux (j, k), j = 0, NE) k = 0, NL) | 10, 12 character, real values to a line | |
Blank | ||
Time | As above | |
((Flux (j, k), j = 0, NE) k = 0, NL) | As above |
… |
4.3.5 Kp prediction model
Model description
The geomagnetic Kp index was introduced by J. Bartels in 1949 and is derived from the standardized K index (Ks) of 13 magnetic observatories. It is designed to measure solar particle radiation by its magnetic effects and is used to characterize the magnitude of geomagnetic storms. The K-index quantifies disturbances in the horizontal component of Earth’s magnetic field with an integer in the range 0–9, with 1 being calm and 5 or more indicating a geomagnetic storm. Kp is thus an excellent indicator of disturbances in the Earth’s magnetic field.
The VSWMC contains a simple model based on the empirical equation for the least-variance linear prediction of Kp proposed by Newell et al. (2008). That paper shows that Kp is highly predictable from solar wind data (even without a time history) if both a merging term and a viscous term are used. The solar wind data for the Kp prediction can be taken, for example, from EUHFORIA forecast outputs at Earth. The model calculates the Kp index and outputs it as a time series file, and also draws it as a plot image.
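A minimal sketch of such a merging-plus-viscous-term prediction is given below; the functional form and coefficients are the ones commonly quoted from Newell et al. (2008), but they are reproduced here from memory and should be checked against the paper (and against the actual VSWMC script) before any serious use.

```python
import numpy as np

def predict_kp(n_cm3, v_kms, by_nT, bz_nT):
    """Least-variance linear Kp prediction (merging + viscous term).

    n_cm3  : proton number density [cm^-3]
    v_kms  : solar wind speed [km/s]
    by_nT, bz_nT : IMF components [nT]
    Coefficients follow Newell et al. (2008); treat them as indicative.
    """
    bt = np.hypot(by_nT, bz_nT)                      # transverse IMF magnitude [nT]
    clock = np.arctan2(by_nT, bz_nT)                 # IMF clock angle
    # Merging term: dPhi_MP/dt = v^(4/3) * Bt^(2/3) * sin^(8/3)(clock/2)
    dphi_dt = v_kms ** (4.0 / 3.0) * bt ** (2.0 / 3.0) \
        * np.abs(np.sin(clock / 2.0)) ** (8.0 / 3.0)
    # Viscous term: sqrt(n) * v^2
    viscous = np.sqrt(n_cm3) * v_kms ** 2
    kp = 0.05 + 2.244e-4 * dphi_dt + 2.844e-6 * viscous
    return np.clip(kp, 0.0, 9.0)                     # Kp is bounded between 0 and 9
```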
Model access and run information
The empirical Kp prediction model is a simple Python 2.7 script that can run in the same environment as, e.g. EUHFORIA Corona. In the VSWMC framework, it is installed and run on one of the KU Leuven HPC servers, access to which is provided via ssh.
There are two usage modes supported by the model script.
Standalone run mode. This runs the Kp prediction model using already available time series of solar wind parameters stored in the input file.
Coupled mode. In this mode the Kp prediction model dynamically receives the solar wind data from another model and generates Kp index time series nearly synchronously with the other model output generation.
The Kp prediction model computation takes around a minute at most.
Model input
As input, the Kp prediction model accepts a time series of solar wind physical parameters at Earth. The parameters required for the Kp calculation are the following: particle density, velocity and magnetic field. Currently the model is implemented to take as input the ASCII text file containing the time series output of the EUHFORIA Heliosphere model at Earth (euhforia_Earth.dsv). The file contains the following parameters:
Output solar wind parameter | Data unit | Data type |
---|---|---|
Date | YYYY-MM-DDThh:mm:ss (ISO8601 date and time format) | String |
Grid point radial coordinate r | AU (Astronomical unit) | Float |
Grid point colatitude clt | rad (radian) | Float |
Grid point longitude lon | rad (radian) | Float |
Number density n | 1/cm3 (particles per cubic centimeter) | Float |
Pressure P | Pa (Pascal) | Float |
Radial component of velocity vr | km/s (kilometers per second) | Float |
Colatitude component of velocity vclt | km/s (kilometers per second) | Float |
Longitude component of velocity vlon | km/s (kilometers per second) | Float |
Radial component of magnetic field Br | nT (nano Tesla) | Float |
Colatitude component of magnetic field Bclt | nT (nano Tesla) | Float |
Longitude component of magnetic field Blon | nT (nano Tesla) | Float |
Model output
As output, the Kp prediction model generates a time series file of Kp indices, named Kp.dat and written in ASCII format, as well as a plot image of the Kp index named Earth_plot_kp.png. The time series file contains the following parameters:
Output parameter | Data unit | Data type | Accuracy |
---|---|---|---|
Date | YYYY-MM-DDThh:mm:ss (ISO8601 date and time format) | String | N/A |
Kp index | dimensionless | Float | 1 decimal place |
4.3.6 Dst prediction model
Model description
The Dst or disturbance storm time index is a measure of geomagnetic activity used to assess the severity of magnetic storms. It is expressed in nanoteslas (nT) and is based on the average value of the horizontal component of the Earth’s magnetic field measured hourly at four near-equatorial geomagnetic observatories. Use of the Dst as an index of storm strength is possible because the strength of the surface magnetic field at low latitudes is inversely proportional to the energy content of the ring current, which increases during geomagnetic storms. In the case of a classic magnetic storm, the Dst shows a sudden rise, corresponding to the storm sudden commencement, and then decreases sharply as the ring current intensifies. Once the IMF turns northward again and the ring current begins to recover, the Dst begins a slow rise back to its quiet time level. The relationship of inverse proportionality between the horizontal component of the magnetic field and the energy content of the ring current is known as the Dessler–Parker–Sckopke relation. Other currents contribute to the Dst as well, most importantly the magnetopause current. The Dst index is corrected to remove the contribution of this current as well as that of the quiet-time ring current.
The VSWMC contains a simple model based on the empirical equation for Dst prediction proposed in the paper by O'Brien & McPherron (2000). The paper uses a large database of ring current and solar wind parameters covering hundreds of storms (any study of individual storms is highly susceptible to the uncertainty inherent in the Dst index). It shows that the empirical Burton equation, with only a slight modification, accurately describes the dynamics of the ring current index Dst: allowing the decay time to vary with the interplanetary electric field VBs is the only modification necessary.
The solar wind data for the Dst prediction can be taken, for example, from EUHFORIA forecast outputs at Earth. The initial Dst value required for a correct index calculation is taken from the ODI index_dst.dst dataset. The model calculates the Dst index and outputs it as a time series file, and also draws it as a plot image.
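The sketch below integrates this modified Burton equation over an hourly solar wind time series. It is a minimal illustration only: the injection, decay and pressure-correction coefficients are quoted from O'Brien & McPherron (2000) and should be treated as indicative, and hourly GSM Bz, density and speed values are assumed to have been extracted from the solar wind file beforehand.

```python
import numpy as np

# Coefficients quoted from O'Brien & McPherron (2000); treat them as indicative.
A_INJ  = -4.4    # nT/h per mV/m, ring-current injection efficiency
E_CUT  = 0.49    # mV/m, cutoff of the rectified electric field VBs
B_PRES = 7.26    # nT per sqrt(nPa), magnetopause-current (pressure) correction
C_OFF  = 11.0    # nT, quiet-time offset

def dst_obrien_mcpherron(t_h, n, v, bz, dst0):
    """Integrate the modified Burton equation. t_h: time stamps [h];
    n [cm^-3]; v [km/s]; bz [nT, GSM]; dst0: initial observed Dst [nT]
    (obtained from the ODI database in the VSWMC)."""
    pdyn = 1.6726e-6 * n * v**2              # solar wind dynamic pressure [nPa]
    vbs = v * np.maximum(-bz, 0.0) * 1e-3    # rectified electric field [mV/m]
    dst = np.empty_like(vbs)
    dst[0] = dst0
    dst_star = dst0 - B_PRES * np.sqrt(pdyn[0]) + C_OFF      # pressure-corrected Dst
    for i in range(1, len(vbs)):
        tau = 2.40 * np.exp(9.74 / (4.69 + vbs[i]))          # decay time [h]
        q = A_INJ * (vbs[i] - E_CUT) if vbs[i] > E_CUT else 0.0  # injection [nT/h]
        dst_star += (q - dst_star / tau) * (t_h[i] - t_h[i - 1])
        dst[i] = dst_star + B_PRES * np.sqrt(pdyn[i]) - C_OFF
    return dst

# Example: six hours of hypothetical, steadily southward-IMF conditions
t = np.arange(0, 6, 1.0)
print(dst_obrien_mcpherron(t, n=np.full(6, 10.0), v=np.full(6, 500.0),
                           bz=np.full(6, -10.0), dst0=-20.0))
```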
Model access and run information
The empirical Dst prediction model is a simple Python 2.7 script that can run in the same environment as, for example, EUHFORIA Corona. In the VSWMC framework, it is installed and run on one of the KU Leuven HPC servers, access to which is provided via ssh.
There are two usage modes supported by the model script.
Standalone run mode. This runs the Dst prediction model using already available time series of solar wind parameters stored in the input file.
Coupled mode. In this mode the Dst prediction model dynamically receives the solar wind data from another model and generates Dst index time series nearly synchronously with the other model output generation.
The Dst prediction model computation takes around a minute at most.
Model input
As input, the Dst prediction model accepts a time series of solar wind physical parameters at Earth and an initial Dst value required for a correct index calculation. The parameters required for the Dst calculation are the particle density, velocity and magnetic field. The initial Dst value is retrieved at model start-up by querying the ODI database with a php script. Currently the model takes as input the ASCII text file containing the time series output of the EUHFORIA Heliosphere model at Earth (euhforia_Earth.dsv). The file contains the following parameters:
Input solar wind parameter | Data unit | Data type |
---|---|---|
Date | YYYY-MM-DDThh:mm:ss (ISO8601 date and time format) | String |
Grid point radial coordinate r | AU (Astronomical unit) | Float |
Grid point colatitude clt | rad (radian) | Float |
Grid point longitude lon | rad (radian) | Float |
Number density n | 1/cm3 (particles per cubic centimeter) | Float |
Pressure P | Pa (Pascal) | Float |
Radial component of velocity vr | km/s (kilometers per second) | Float |
Colatitude component of velocity vclt | km/s (kilometers per second) | Float |
Longitude component of velocity vlon | km/s (kilometers per second) | Float |
Radial component of magnetic field Br | nT (nanotesla) | Float |
Colatitude component of magnetic field Bclt | nT (nanotesla) | Float |
Longitude component of magnetic field Blon | nT (nanotesla) | Float |
Model output
As output, the Dst prediction model generates a time series file of Dst indices. The output file, named Dst.dat, is in ASCII format and contains the parameters listed below. The model also generates a Dst index plot image named euhforia_Earth_plot_dst.png.
Output parameter | Data unit | Data type | Accuracy |
---|---|---|---|
Date | YYYY-MM-DDThh:mm:ss (ISO8601 date and time format) | String | N/A |
Dst index | nT (nanotesla) | Float | 5 decimal places |
4.3.7 Magnetopause standoff distance model
Model description
The magnetopause is the boundary between the magnetic field of the planet and the solar wind. The location of the magnetopause is determined by the balance between the pressure of the planetary magnetic field and the dynamic pressure of the solar wind.
The VSWMC contains a simple model for the prediction of the standoff distance from the Earth to the magnetopause along the Earth–Sun line, based on the equation of the Shue model (Taktakishvili et al., 2009). To improve predictions under extreme solar wind conditions, the Shue model takes into account a nonlinear dependence of its parameters on the solar wind conditions, representing the saturation effect of the solar wind dynamic pressure on the flaring of the magnetopause and the saturation effect of the interplanetary magnetic field Bz on the subsolar standoff distance.
The solar wind data for the magnetopause standoff distance prediction can be taken, for example, from EUHFORIA forecast outputs at Earth. The model calculates the magnetopause standoff distance and outputs it as a time series file, and also draws it as a plot image.
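As an illustration, the sketch below evaluates the subsolar standoff distance with the widely used functional form of the Shue model (Shue et al., 1998), which is the relation referred to above. It is a minimal sketch: the coefficients are quoted from that paper and are indicative, and the GSM Bz component is assumed to be available.

```python
import numpy as np

def standoff_shue(n, v, bz):
    """Subsolar magnetopause standoff distance [Re] from solar wind density
    n [cm^-3], speed v [km/s] and IMF Bz [nT, GSM], using the nonlinear
    (tanh) dependence of the Shue model; coefficients are indicative."""
    pdyn = 1.6726e-6 * n * v**2                               # dynamic pressure [nPa]
    return (10.22 + 1.29 * np.tanh(0.184 * (bz + 8.14))) * pdyn**(-1.0 / 6.6)

# Quiet solar wind (hypothetical values): roughly 10 Earth radii is expected
print(standoff_shue(n=5.0, v=400.0, bz=0.0))
```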
Model access and run information
The magnetopause standoff distance prediction model is a simple Python 2.7 script that can run in the same environment as, for example, EUHFORIA Corona. In the VSWMC framework, it is installed and run on one of the KU Leuven HPC servers, access to which is provided via ssh.
There are two usage modes supported by the model script.
Standalone run mode. This runs the magnetopause standoff distance prediction model using already available time series of solar wind parameters stored in the input file.
Coupled mode. In this mode the magnetopause standoff distance prediction model dynamically receives the solar wind data from another model and generates magnetopause standoff distance time series nearly synchronously with the other model output generation.
The magnetopause standoff distance prediction model computation takes around a minute at most.
Model input
As input, the magnetopause standoff distance prediction model accepts a time series of solar wind physical parameters at Earth. The parameters required for the calculation are the particle density, velocity and magnetic field. Currently the model takes as input the ASCII text file containing the time series output of the EUHFORIA Heliosphere model at Earth (euhforia_Earth.dsv). The file contains the following parameters:
Input solar wind parameter | Data unit | Data type |
---|---|---|
Date | YYYY-MM-DDThh:mm:ss (ISO8601 date and time format) | String |
Grid point radial coordinate r | AU (Astronomical unit) | Float |
Grid point colatitude clt | rad (radian) | Float |
Grid point longitude lon | rad (radian) | Float |
Number density n | 1/cm3 (particles per cubic centimeter) | Float |
Pressure P | Pa (Pascal) | Float |
Radial component of velocity vr | km/s (kilometers per second) | Float |
Colatitude component of velocity vclt | km/s (kilometers per second) | Float |
Longitude component of velocity vlon | km/s (kilometers per second) | Float |
Radial component of magnetic field Br | nT (nanotesla) | Float |
Colatitude component of magnetic field Bclt | nT (nanotesla) | Float |
Longitude component of magnetic field Blon | nT (nanotesla) | Float |
Model output
As output, the magnetopause standoff distance prediction model generates a time series file of magnetopause standoff distances. The output file, named DSO.dat, is in ASCII format and contains the parameters listed below. The model also generates a magnetopause standoff distance plot image named Earth_plot_dso.png.
Output parameter | Data unit | Data type | Accuracy |
---|---|---|---|
Date | YYYY-MM-DDThh:mm:ss (ISO8601 date and time format) | String | N/A |
Magnetopause standoff distance | Re (Earth radii) | Float | 1 decimal place |
4.3.8 ODI
Model description
The main data source for commonly used datasets (such as solar wind parameters, IMF parameters, magnetic indices) is an Open Data Interface instance maintained at ESA/ESTEC. ODI is a database framework which allows for streamlined data downloads and ingestion (https://spitfire.estec.esa.int/trac/ODI/wiki/ODIv5). The ESTEC instance is accessible through a REST interface for data queries.
In order to integrate access to the ODI database into the VSWMC framework, a php script was written that formulates REST requests, executes them and writes the downloaded data to a flat ASCII file. For all intents and purposes, the php script can be considered a model federate which provides inputs for model runs and as such can be coupled to other federates.
A secondary php script was added to retrieve the start and end epochs of a (set of) dataset(s).
Model access and run information
The ODI federate is instantiated as a single php script (runODI.php), which is run on the command line with a number of named parameters specifying the time range, the data cadence, and the list of quantities to be retrieved. No other modules need to be loaded in order to execute the php script. Access to the ESA/ESTEC ODI server at https://spitfire.estec.esa.int/ should be allowed in the firewall rules (standard https port).
Model input
All model inputs are supplied as named parameters in a command line call.
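A call might look like the sketch below. It is illustrative only: every parameter name is hypothetical, since the actual arguments of runODI.php are defined in its interface control document and are not reproduced here.

```python
# Illustrative invocation of the ODI federate script from Python; every
# named parameter below is hypothetical, not the real runODI.php interface.
import subprocess

cmd = [
    "php", "runODI.php",
    "dataset=omni_hro_1min",        # hypothetical dataset identifier
    "start=2017-09-06T00:00:00",    # time range ...
    "stop=2017-09-08T00:00:00",
    "cadence=1h",                   # data cadence
    "quantities=n,v,bz_gsm",        # quantities to retrieve
    "output=odi_output.dat",        # flat ASCII file read by downstream federates
]
subprocess.check_call(cmd)

with open("odi_output.dat") as fh:  # a coupled federate would parse this file
    print(fh.readline())
```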
Model output
At present, ODI is used as a data provision tool for a small (but growing) number of federates, e.g. GUMICS or the BAS RBM model. The quantities required are specific to each model, and as such it is not possible to provide a definitive list of output quantities for the ODI federate. However, the SPASE SimulationModel file for ODI contains all information (e.g., physical units, coordinate systems) on the quantities that can currently be retrieved from the ODI database. When additional quantities are added in future, the SPASE file will be updated as well.
4.3.9 Visualization “federate”
Model description
The purpose of the Visualization federate is to automatically generate 2D/3D images or movies showcasing the evolution of a simulation, using tools like Python Matplotlib, Paraview, VisIt (all open source), or Tecplot and IDL (commercial). The users will be able to view these images/movies via the web interface. These 2D/3D images and movies offer a preview of the simulation results, while the end-user will also be able to download the full simulation output data sets to his/her own computer for offline visualization with tools like Paraview, VisIt, or Tecplot, as long as the file format is compatible with those software packages. It is also possible to use the Visualization federate to convert the original model output files to formats compatible with the offline visualization tools and make them available for download, using dedicated tools, utilities, or scripts provided with the models for this purpose.
The VSWMC contains a Visualization federate for the EUHFORIA Heliosphere simulation output based on a Python script provided together with the model. The script processes the EUHFORIA Heliosphere output files and generates two 2D images (for the solar wind particle density and radial velocity) for each model output cycle. Using these images, the VSWMC GUI component creates an animated slide show available via the web interface.
Model access and run information
The Visualization federate uses the EUHFORIA Heliosphere visualization script as a model, which is a simple Python 2.7 script that can run in the same environment as, for example, EUHFORIA Corona. In the VSWMC framework, it is installed and run on one of the KU Leuven HPC servers, access to which is provided via ssh.
There are two usage modes supported by the model script.
Standalone run mode. This runs the EUHFORIA Heliosphere visualization script using already available (pre-generated) model output files with solar wind data.
Coupled mode. In this mode the Visualization federate dynamically receives solar wind data files during the EUHFORIA Heliosphere run and generates images nearly synchronously with the model output generation. The federate runs the visualization script each time the model generates new output files.
The visualization script computation takes around a minute.
Model input
As input, the visualization script accepts the EUHFORIA Heliosphere output data saved in binary files in the NumPy format (NPY, NPZ) for the whole simulation grid. An NPZ (NumPy zipped data) file contains one NPY file per physical parameter, and each NPY file is named after the parameter it contains, e.g., Br.npy or vclt.npy. The files contain the physical parameters of the solar wind from 21.5 Rs to 2 AU (or further if necessary). The parameters are the following:
Input solar wind parameter | Data unit | Data type |
---|---|---|
Date | YYYY-MM-DDThh:mm:ss (ISO8601 date and time format) | String |
Grid point radial coordinate r | AU (Astronomical unit) | Float |
Grid point colatitude clt | rad (radian) | Float |
Grid point longitude lon | rad (radian) | Float |
Number density n | 1/cm3 (particles per cubic centimeter) | Float |
Pressure P | Pa (Pascal) | Float |
Radial component of velocity vr | km/s (kilometers per second) | Float |
Colatitude component of velocity vclt | km/s (kilometers per second) | Float |
Longitude component of velocity vlon | km/s (kilometers per second) | Float |
Radial component of magnetic field Br | nT (nanotesla) | Float |
Colatitude component of magnetic field Bclt | nT (nanotesla) | Float |
Longitude component of magnetic field Blon | nT (nanotesla) | Float |
Model output
The visualization script outputs two 2D images, one for the solar wind particle density and one for the radial velocity component. Each image consists of three panels showing the corresponding parameter distribution in the plane of constant Earth latitude, in the meridional plane of the Earth, and as a time series at Earth.
The solar wind particle density plot is named as follows: nscaled_YYYY-MM-DDTHH-MM-SS.png, where YYYY-MM-DDTHH-MM-SS corresponds to the time stamp of the input file, and the radial velocity plot is named as follows: vr_YYYY-MM-DDTHH-MM-SS.png.
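The sketch below reproduces the flavour of such a plot for a single snapshot. It is a minimal sketch under explicit assumptions: the NPZ file name, the array names (r, clt, lon, n) and the (r, clt, lon) ordering of the density array are assumptions for illustration, not the actual layout of the EUHFORIA output.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical snapshot file name; array names and ordering are assumptions.
snap = np.load("2017-09-06T12-00-00.npz")
r, clt, lon, n = snap["r"], snap["clt"], snap["lon"], snap["n"]

i_eq = np.argmin(np.abs(clt - np.pi / 2))   # slice closest to the equatorial plane
lon2d, r2d = np.meshgrid(lon, r)            # polar grid for the equatorial cut

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
pcm = ax.pcolormesh(lon2d, r2d, n[:, i_eq, :], shading="auto")
fig.colorbar(pcm, ax=ax, label="n [cm$^{-3}$]")
ax.set_title("Solar wind number density, equatorial plane")
fig.savefig("nscaled_2017-09-06T12-00-00.png", dpi=150)   # naming as described above
```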
4.4 Development of data provision nodes
The ODI script (see Sect. 4.3.8) has been extended and supplemented with additional supporting scripts. For instance, it now enables OMNI data (spacecraft-interspersed, near-Earth solar wind data) input, which is used for COOLFluiD and GUMICS4 and as input for the magnetopause stand-off distance model (used for BAS-RBM). Other additional data sources include GOES and ACE near-real-time data of the solar wind, the interplanetary magnetic field (IMF), and the electron and proton radiation environment in GEO.
Other external datasets have also been exploited in the new VSWMC system. EUHFORIA Corona, for instance, uses magnetograms from the Global Oscillation Network Group (GONG) database, and the cone CME parameters for the EUHFORIA Heliosphere model are retrieved from the space weather Database Of Notifications, Knowledge, Information (DONKI) server, which has been developed at the Community Coordinated Modeling Center (CCMC).
4.5 Running models and coupled models via the User Front-End (GUI)
As not all models integrated in the VSWMC are sufficiently mature yet, the mature models and model couplings have been duplicated in a limited (but operational) system (with another URL: https://spaceweather.hpc.kuleuven.be/, accessible via the H-ESC webpage as described above). Figure 6 shows how the VSWMC welcome page looks when integrated in the SSA SWE Portal. This limited system passed the acceptance tests for integration in the H-ESC and has been made available to a user community (cf. Sect. 4.3). The full development system will of course be maintained at the same time. Clicking on the button on the right of the upper banner, one gets a list of all currently available models (including the demo models) and model couplings, as shown in Figure 7. The same action in the operational system gives a more limited list of models and model couplings, as the immature “demo” models and couplings are not listed there. Apart from that, both systems are identical, including the info pages.
Fig. 6 Mock-up of a screen shot of the VSWMC welcome page once integrated in the SSA SWE Portal, providing an impression of how the integrated VSWMC will look.
Fig. 7 Choice offered when requesting a new run (in the test environment system).
When clicking on one of the offered federates, the actual set-up is displayed, showing all the models involved. This is demonstrated in the screen shot below (Fig. 8) for the EUHFORIA + indices option. Hence, it is immediately clear that this model chain involves six federates, viz. EUHFORIA Corona and EUHFORIA Heliosphere, the Visualizer of EUHFORIA results, and three empirical geo-effective index models that use the synthetic solar wind data at L1 from EUHFORIA to determine the Kp and Dst indices and the magnetopause standoff distance.
Fig. 8 The EUHFORIA + indices model chain.
Upon selecting the model or model chain, the GUI displays the input page offering different options for all the input required to successfully run the coupled models. As soon as the minimum required input is available, the “Start the run” button lights up, indicating that the model can run with the provided input. At this point the input can still be modified or complemented, e.g. by adding CMEs or changing the CME parameters for the EUHFORIA Heliospheric evolution model. When the “Start the run” button is pushed, the simulation starts and the user sees the log screen, enabling them to follow the status of the different models in the chain. The model or model chain is shown in the left column in orange, with an indication of how long it has been running.
When the simulation is finished, the model or chain in the left column turns green. Clicking on it shows all the runs made by the user with this specific model, indicating which runs were successful, stopped (by the user) or unsuccessful (see Fig. 9 below). The most recent run is on top of the list. Clicking on it triggers the GUI to show the page for this specific simulation run, which contains tabs with the input parameters, the log screen and the results. These consist of output files (e.g., the solar wind parameters at Earth, STEREO-A, Mars and other planets or even virtual satellites) and, if applicable, visualizations (plots and movies). All these files can be previewed and/or downloaded by simply clicking on the appropriate buttons behind the file names. As an example, we show in Figure 10 one of the 145 snapshots of the radial velocity in the equatorial and meridional planes from the movie produced for the EUHFORIA chain run.
Fig. 9 Example of list of runs made with a certain model or model chain (in this case EUHFORIA + indices).
Fig. 10 Example screenshot of the radial velocity in the equatorial plane (left) and meridional plane of the Earth (right).
5 Functionality of the system
A set of HLA federations (coupled simulations) has been deployed in the VSWMC, including several end-to-end chains of existing physics-based models. For instance, the coupling EUHFORIA Corona – EUHFORIA Heliosphere – EUHFORIA Visualizer combines three federates: it first calculates the Wang–Sheeley–Arge corona using a GONG magnetogram uploaded from the external GONG database and yields the MHD parameters at 0.1 AU, which are used as boundary conditions for the heliospheric wind on which cone CMEs can be superposed. The parameters of the latter can be typed in or uploaded from another external (DONKI) database. The output files of the time-dependent 3D MHD simulation are turned into standard plots and movies by the visualization federate. The model chain EUHFORIA Corona – EUHFORIA Heliosphere – EUHFORIA Visualizer plus the three geo-effect indices models (Dst index, Kp index and magnetopause stand-off distance; see Fig. 11) does the same as the previous one but additionally computes the Kp and Dst indices and the magnetopause stand-off distance from the (synthetic) solar wind parameters at L1.
Fig. 11 EUHFORIA coupled to the geo-effect indices models.
The next model chain, depicted in Figure 12, also involves seven federates but is slightly more complicated and also involves the BAS-RBM model. In fact, the output of EUHFORIA is used to calculate the Kp index, which is subsequently used as one of the inputs for the BAS Radiation Belt Model. The other required input for BAS-RBM is also synthetic in this example and is taken from the magnetopause stand-off distance model, which is itself supplied with synthetic solar wind data at L1 from EUHFORIA.
Fig. 12 The BAS-RBM model in a model chain such that it can be fed with synthetic Kp and magnetopause stand-off distance data obtained from solar wind parameters calculated by EUHFORIA.
6 Lessons learned and future development
6.1 Challenges and lessons learned
The development and especially the verification and validation tests have revealed several challenges for the VSWMC, including:
It turned out to be difficult to convince the modellers to provide their models, even though the compiled object files suffice, and to fill out the necessary ICD forms, even though this is not a big workload. Without the information in the ICD form, however, it turned out to be extremely difficult to install the model and to adjust the GUI to run the model.
CPU capacity: some of the models are 3D and time dependent and require a lot of CPU time to run. These are installed on a computer cluster which is not dedicated, i.e. also used by others. But even if the cluster were dedicated, the fact that multiple users exploit the system simultaneously means that the cluster must be operated as a batch system. This means that the simulations are queued and run one after the other, or simultaneously if there are sufficient nodes available.
Data storage: some of the models generate a huge amount of data. These data cannot be stored forever, and it will have to be decided which data are kept and which are erased after 2–3 weeks.
Wall clock time: some models are not parallel, e.g. GUMICS4, which means they run on a single processor and this can take a lot of time. For GUMICS4 with five levels of adaptation of the grid, the wall clock time is an order of magnitude longer than real time. This can only be solved by installing a parallel version of the model.
Network: running the models requires a lot of communication between the VSWMC system server and the (remote) clusters on which the models are installed. When these communications are interrupted, it is a challenge to re-establish the links. The tests revealed that sometimes the model ran successfully but, due to the loss of the link, the user could not see this.
Two-way couplings remain a challenge, especially if the two models are not installed in the same location.
We did check different levels of model integration. For instance, the XTRAPOL model runs in Paris and the cluster there has been integrated as a full VSWMC node. The BAS-RBM model, however, is integrated at the lowest possible level: the input is generated by the VSWMC and placed in a folder at BAS, where the system checks once per minute whether an input file is provided. If so, it runs the BAS-RBM with this input file and puts the output in another folder. The VSWMC automatically checks whether the output files are available and downloads them to the VSWMC if necessary. This is a delicate setup that slows down coupled simulations and enables only weak couplings to other models in the system. But it works, at least for simple couplings in which the models run in series and not simultaneously, and as long as the weakly coupled model is at the end of the model chain. It remains to be tested whether this setup also works for a model in the middle of a chain.
6.2 Future updates and development
Part 3 of the VSWMC should address the above-mentioned challenges and also make the system “idiot proof”, i.e. automatically verify that the given parameter values are within a physically meaningful range, or prevent users from providing “unphysical” values.
The system could also estimate the CPU time and data storage a specific run of a model or a model chain will require. Although the system makes it very simple to set up simulations, users do not always realize how much they demand from the back-end system when requesting a run or a set of runs.
Another improvement would be automatic checks of all components of the system that can be run when a new model has been added, a new coupling has been established, or a model has been changed, as this might influence the system in unexpected ways. An update of a model might, for instance, break all model chains in which that particular model features. Even an upgrade of a compiler on the cluster on which a model is installed may break the simulation runs in which that model is involved.
The Part 2 system does not deliver any hardware platform but rents computing and storage resources from KU Leuven via a virtual server and runs the simulations on one of the available clusters at KU Leuven and on the clusters in geographically distributed nodes, such as at VKI (Sint-Genesius-Rode), in Paris (École Polytechnique) and in Cambridge (BAS). Given the uncertainties in the growth rate of the model, data and simulation repositories of the VSWMC-P2 system, the number of the different types of users, and the actual demand for CPU time, computer memory and data storage, we recommend that ESA follows this strategy for at least a few years, in order to keep the flexibility of the system and to perform a scalability analysis of the entire system setup.
Future run-time system (RTS) functionality might include more flexibility. For instance, models may run on different nodes and the RTS could find out which node will provide the fastest execution for a particular simulation, depending on the capacity and use of the different nodes in the system. Alternatively, some models might be executed in the Cloud. It may even become possible to execute large parallel simulations simultaneously on different clusters. The latter would be feasible even with today’s technology for ensemble simulations as these are “embarrassingly parallel” and can simply run the same simulation model with different sets of input parameters in parallel on different nodes of the same cluster or on different clusters, depending on the availability and level of usage of the different clusters in the system.
For the prototype, the Developer Environment consists of simple scripts that can be used to create/validate/upload models. It is expected that this procedure will be used for a while, until it becomes clearer whether a more integrated development environment is actually needed.
In the future, many more models and model chains, like those in the current VSWMC-P2 system, need to be established. The list of models integrated into the VSWMC thus needs to be expanded, including more data streams, alternative CME evolution models (e.g., the drag-based model), CME onset models (e.g., XTRAPOL), ionosphere and thermosphere models, SEP acceleration and transport models, alternative global coronal models, etc. In particular, the SEP transport models, which need a combined global MHD coronal model and an MHD heliospheric wind and CME evolution model as input, would profit from the VSWMC setup.
To increase the awareness of modellers of the usability of the VSWMC, we suggest setting up demo sessions, e.g. during the fair at the European Space Weather Week, to demonstrate the ease of use and the possibilities and potential of the VSWMC. On such occasions, it could also be clarified with examples how easy it is to include a new model and to provide the necessary information via the ICD form, providing a template ICD and some examples.
7 Summary and conclusion
The VSWMC-P2 consortium succeeded in extending the VSWMC Phase 1 prototype by implementing a graphical user interface (GUI), facilitating the operation; providing a generalized framework implementation; including full-scale models that can easily be run via the GUI; including more, and more complicated, model couplings; including several visualization models; and creating a modest operational system that has been integrated into the SSA SWE service network and is thus accessible via the SWE Portal.
In conclusion, the VSWMC system has been successfully integrated into the SWE network as a federated SWE service at level 2 integration, which requires that a remote website integrates the SWE Portal's identity and single sign-on (SSO) subsystem while remaining available via its home institute, and is federated as part of the product portfolio of the Heliospheric Weather ESC. We succeeded in substantially extending the VSWMC prototype by a complete redesign of the core system, implementing a graphical user interface (GUI), including full-scale models that can easily be run via the GUI, including more, and more complicated, model couplings involving up to seven different models, some of which are ready for operational use, and also including several visualization models that can be coupled to the models so that standard output pictures and movies are automatically generated, which can be viewed directly via the GUI and downloaded.
The new VSWMC has great potential to add value to ESA's SSA SWE service network, even though the number of models integrated is limited and some of these models are only demo versions. Full support of the modellers is often needed for a proper integration in the system, especially when the model needs to be coupled to other models. However, it is expected that modellers will want to participate in the VSWMC and provide their models once the system becomes operational and its many advantages become apparent.
Acknowledgments
These results were obtained in the framework of the ESA project “SSA-P2-SWE-XIV - Virtual Space Weather Modelling Centre - Phase 2” (ESA Contract No. 4000116641/15/DE-MRP, 2016–2019). Additional support from the projects C14/19/089 (C1 project Internal Funds KU Leuven), G.0A23.16N (FWO-Vlaanderen), C 90347 (ESA Prodex), Belspo BRAIN project BR/165/A2/CCSOM and EU H2020-SPACE-2019 (SU-SPACE-22-SEC-2019 – Space Weather) project “EUHFORIA 2.0” is gratefully acknowledged. For the computations we used the infrastructure of the VSC – Flemish Supercomputer Center, funded by the Hercules foundation and the Flemish Government – department EWI.
References
- Abel B, Thorne RM. 1998. Electron scattering loss in Earth's inner magnetosphere: 1. Dominant physical processes. JGR Space Phys. 103(A2): 2385–2396. https://doi.org/10.1029/97JA02919.
- Amari T, Boulmezaoud TZ, Aly JJ. 2006. Well posed reconstruction of the solar coronal magnetic field. A&A 446: 691–705. https://doi.org/10.1051/0004-6361:20054076.
- Eastwood JP, Biffis E, Hapgood MA, Green L, Bisi MM, Bentley RD, Wicks R, McKinnell L-A, Gibbs M, Burnett C. 2017. The economic impact of space weather: Where do we stand? Risk Anal. 37(2): 206–218. https://doi.org/10.1111/risa.12765.
- ESA SSA Team. 2011. Space situational awareness – space weather customer requirements document (CRD). SSA-SWE-RS-CRD-1001, issue 4.5a, 28/07/2011. Available from: http://swe.ssa.esa.int/documents.
- Fuller-Rowell TJ, Rees D. 1980. A three-dimensional time-dependent global model of the thermosphere. J Atmos Sci 37: 2545–2567. https://doi.org/10.1175/1520-0469(1980)037<2545:ATDTDG>2.0.CO;2.
- Glauert S, Horne R, Meredith N. 2014. Three dimensional electron radiation belt simulations using the BAS Radiation Belt Model with new diffusion models for chorus, plasmaspheric hiss and lightning-generated whistlers. J Geophys Res: Space Phys 119: 268–289. https://doi.org/10.1002/2013JA019281.
- Harvey JW, Hill F, Hubbard RP, Kennedy JR, Leibacher JW, et al. 1996. The Global Oscillation Network Group (GONG) project. Science 272(5266): 1284–1286. https://doi.org/10.1126/science.272.5266.1284.
- Hosteaux S, Chané E, Decraemer B, Tălpeanu D, Poedts S. 2018. Ultra-high resolution model of a breakout CME embedded in the solar wind. A&A 620: A57. https://doi.org/10.1051/0004-6361/201832976.
- Hosteaux S, Chané E, Poedts S. 2019. On the effect of the solar wind density on the evolution of normal and inverse Coronal Mass Ejections. A&A 632: A89. https://doi.org/10.1051/0004-6361/201935894.
- Janhunen P, Palmroth M, Laitinen T, Honkonen I, Juusola L, Facskó G, Pulkkinen T. 2012. The GUMICS-4 global MHD magnetosphere–ionosphere coupling simulation. J Atmos Sol-Terr Phys 80: 48–59. https://doi.org/10.1016/j.jastp.2012.03.006.
- Lakka A, Pulkkinen TI, Dimmock AP, Kilpua E, Ala-Lahti M, Honkonen I, Palmroth M, Raukunen O. 2019. GUMICS-4 analysis of interplanetary coronal mass ejection impact on Earth during low and typical Mach number solar winds. Ann Geophys 37: 561–579. https://doi.org/10.5194/angeo-37-561-2019.
- Lani A, Villedieu N, Bensassi K, Kapa L, Vymazal M, Yalim MS, Panesi M. 2013. COOLFluiD: An open computational platform for multi-physics simulation. In: Proc. 21st AIAA CFD Conference, AIAA 2013-2589, San Diego, June 2013. https://doi.org/10.2514/6.2013-2589.
- Millward GH, Moffett RJ, Quegan S, Fuller-Rowell TJ. 1996. A coupled thermosphere ionosphere plasmasphere Model (CTIP). In: STEP handbook on ionospheric models, Schunk RW (Ed.), Utah State Univ, Logan, pp. 239–279.
- Newell PT, Sotirelis T, Liou K, Rich FJ. 2008. Pairs of solar wind-magnetosphere coupling functions: Combining a merging term with a viscous term works best. J Geophys Res: Space Phys 113: A04218. https://doi.org/10.1029/2007JA012825.
- O'Brien TP, McPherron RL. 2000. An empirical phase space analysis of ring current dynamics: Solar wind control of injection and decay. J Geophys Res 105: 7707–7720. https://doi.org/10.1029/1998JA000437.
- Pomoell J, Poedts S. 2018. EUHFORIA: EUropean Heliospheric FORecasting Information Asset. J Space Weather Space Clim 8: A35. https://doi.org/10.1051/swsc/2018020.
- Scolini C, Verbeke C, Poedts S, Chané E, Pomoell J, Zuccarello FP. 2018. Effect of the initial shape of Coronal Mass Ejections on 3D MHD simulations and geoeffectiveness predictions. Space Weather 16: 754–771. https://doi.org/10.1029/2018SW001806.
- Taktakishvili A, Kuznetsova M, MacNeice P, Hesse M, Rastätter L, Pulkkinen A, Chulaki A, Odstrcil D. 2009. Validation of the coronal mass ejection predictions at the Earth orbit estimated by ENLIL heliosphere cone model. Space Weather 7: S03004. https://doi.org/10.1029/2008SW000448.
- Tóth G, Sokolov IV, Gombosi TI, Chesney DR, Clauer CR, De Zeeuw DL, Hansen KC, Kane KJ, Manchester WB, Oehmke RC, Powell KG, Ridley AJ, Roussev II, Stout QF, Volberg O, Wolf RA, Sazykin S, Chan A, Yu B, Kóta J. 2005. Space weather modeling framework: A new tool for the space science community. J Geophys Res 110: A12226. https://doi.org/10.1029/2005JA011126.
- Xia C, Teunissen J, El Mellah I, Chané E, Keppens R. 2018. MPI-AMRVAC 2.0 for solar and astrophysical applications. ApJ Suppl 234: 30 (26 p). https://doi.org/10.3847/1538-4365/aaa6c8.
- Yalim MS, Poedts S. 2014. 3D Global Magnetohydrodynamic Simulations of the Solar Wind/Earth's Magnetosphere Interaction. In: “Numerical modeling of space plasma flows, ASTRONUM-2013”, Proc. 8th International Conference on Numerical Modeling of Space Plasma Flows (Astronum 2013), July 1–5, 2013, Biarritz, France, Pogorelov NV, Audit E, Zank GP (Eds.), ASP Conference Series, 488, pp. 192–197.
Cite this article as: Poedts S, Kochanov A, Lani A, Scolini C, Verbeke C, et al. 2020. The Virtual Space Weather Modelling Centre. J. Space Weather Space Clim. 10, 14.