Last Updated: 09/09/2022

International Forum for Space Weather Capabilities Assessment Inactive Teams

SUPERTOPIC: QUANTIFYING SCIENTIFIC PROGRESS

Assessment of Understanding and Quantifying Progress Toward Science Understanding and Operational Readiness

We will work toward developing cross-domain metrics and frameworks for tracking progress in our research and understanding.

Leads: A. Halford, A. Kellerman, K. Garcia-Sage, B. Thompson, S. Morley
Communications: ccmc-trackprogress@googlegroups.com (mailing list)

Website for ongoing forum activities

Interactive team website for the 2017 CCMC Forum Workshop: agenda, feedback, meeting minutes, and more (constantly updated)
schedule
planning and input page
meeting minutes
ARLs and metrics

Working team activities during the April 2017 meeting:
This working team evaluated existing metrics and developed new metrics for measuring advancements and current capabilities within the field of space weather, focusing on three broad topics. Discussions, panels, and working groups focused on metrics for:

1- Progress in our research understanding.
2- The functionality of current models, both physics-based and data-driven (including, e.g., new physics).
3- The ability to incorporate current research/physical understanding into models.

With each of these areas come distinct challenges for developing metrics, which we will explore in this working team. Topic 1 poses unique challenges for quantifying progress. Metrics such as the number of papers or citation counts fail to accurately capture improvements in our understanding of the field. Metrics that accurately capture our research progress will allow us as a community to direct our research in an unbiased way toward addressing unanswered questions. This topic will consider questions such as “How critical are we?”, “Are we addressing end-user concerns?”, and perhaps most importantly “How are we assessing our progress?”. Other fields have encountered similar problems, and we will consider how they have tracked progress, as well as how our own field has attempted to show a continued advancement of understanding.

Focus topics 2 and 3 will address questions such as “How should we compare our models against theory and observations?”, “Are our current metrics unbiased, and how can we develop ways to improve them?”, and “Are we consciously or unconsciously ignoring specific issues or topics?”. For topic 2 we will discuss and make progress toward understanding which current metrics are appropriate for the modeler and the end user, and identify new metrics that are required to understand model performance and appropriateness for a given problem. For topic 3 we will quantify our ability to model parts of the heliosphere with physics-based models, including the time it takes to test and implement new ideas/models/observations into our larger models. We hope to find metrics that will quantify the advancements achieved by including new physics in models, as well as to determine which physics is most important for different regions of the heliosphere and under what conditions.

This will be a working meeting, and we hope that all participants come prepared to engage in discussion and in subsequent write-ups of the results. We may also hold online telecons prior to the workshop in order to make the most of the in-person meeting.


SOLAR

Solar Flare Prediction

Team Leads: S. Murray, M. Georgoulis, S. Bloomfield, KD Leka
Scoreboard Leads: S. Murray, M. L. Mays

Communications: ccmc-flare-team@googlegroups.com (mailing list); slack: ccmc-collab.slack.com (email leads to join our slack discussion channels)
Participants: Anastasios Anastasiadis · Shaun Bloomfield · Monica Bobra · David Falconer · Manolis K. Georgoulis* · Jordan Guerra · Rachel Hock-Mysliwiec · Irina Kitiashvili* · Alexander Kosovichev* · Kangjin Lee · Laure Lefevre* · KD Leka · Periasamy K Manoharan* · Daniel Matthiä* · Leila Mays* · Mike McAleenan* · Karin Muglach* · Sophie Murray* · Naoto Nishizuka · Pete Riley · Aleksandre Taktakishvili* · Matthew West* · Katherine Winters* · Jie Zhang ·
Tarek Al-Ubaidi* · Michael Balikhin* · Francois-Xavier Bocquet · Mark Cheung · Yaireska Collado-Vega* · Francis Eparvier* · Natalia Ganushkina* · Laura Godoy* · Christina Kay · Yuki Kubo · Masha Kuznetsova* · Peter MacNeice* · Anthony Mannucci* · Marilena Mierla · Chigomezyo Ngwira* · Ian Richardson* · Alexis Rouillard · Howard Singer* · Robert Steenburgh* · Manuela Temmer · Barbara Thompson* · Alexandra Wold* · Tom Woods · KiChang Yoon · Yihua Zheng* ·
*attending 2017 CCMC-LWS working meeting

Latest News
Agenda and materials from the ESWW15 (2018) Topical Discussion Meeting: Flare Forecasting: where are we and where should we be going?
April 2017 working meeting: team agenda | solar/heliosphere agenda | full agenda

User Needs
🚧 under construction

Working Team Goals
✶ Evaluate where we stand with solar flare prediction; define specific questions.
✶ Agree on different metrics that address the specific questions chosen.
✶ Provide a benchmark against which future models can be assessed.
✶ Complementary to the Flare Scoreboard activity, whose goal is to collect and display real-time flare predictions and ultimately facilitate validation.
✶ Complementary to the ISEE/PSTEP workshop. This workshop will focus on the comparison of currently operational flare prediction models.

Working Team Deliverables
✶ Catalog of metrics including how they address specific questions, and relate to user needs.
✶ Online database of model outputs with complete input/output metadata, and observations.
✶ Selection of time intervals based on chosen questions.
✶ Ensure standardized flare predictions for making comparisons.
✶ Model assessments for selected time intervals.
✶ Publication describing model assessment results summarizing where we stand with solar flare prediction.

By April 2017: Deliver the first item, and make a start on the second item through collaboration with the Information Architecture for Interactive Archives (IAIA) working team. Collaborate with the working team: Assessment of Understanding and Quantifying Progress Toward Science Understanding and Operational Readiness.

Potential Questions to Address
Where do we stand with solar flare prediction?
✶ Beginning vs end of solar cycle
✶ Peak vs other parts of solar cycle
✶ Influence of prior flare information
✶ Over long time periods

Please contribute to this draft list of potential items for this team to address.

Physical Quantities and Metrics for Model Validation
This team will deliver a catalog of metrics, including how they address specific questions and relate to user needs. This will be discussed during the April 2017 working meeting. Some items to help the discussion:
Evolving document of some common metrics used for flare prediction
Barnes et al. (2016) paper

Observation Data
✶ Flare forecasts are generally probabilistic in nature, giving the probability that a flare of a given GOES soft X-ray class will occur within a defined time period. Specifics on how the GOES observations should be compared to the probabilistic predictions will be part of the discussion at the April 2017 working meeting.
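One widely used way to score probabilistic forecasts against binary outcomes is the Brier score. The sketch below is purely illustrative (the team has not agreed on this metric, and all probabilities and observed outcomes shown are hypothetical):

```python
def brier_score(probs, events):
    """Mean squared difference between forecast probabilities (0-1) and
    observed binary outcomes (1 = flare occurred, 0 = no flare).
    0 is a perfect score; 1 is the worst possible."""
    return sum((p - e) ** 2 for p, e in zip(probs, events)) / len(probs)

# Hypothetical daily probabilities of an M-class (or greater) flare,
# scored against whether GOES actually observed one in each window
probs  = [0.10, 0.60, 0.80, 0.30, 0.05]
events = [0,    1,    1,    0,    0]
score = brier_score(probs, events)
```

A lower score rewards forecasts that assign high probability to windows where flares actually occur; climatological and reference forecasts can be scored the same way to form skill scores.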

Participating Models
✶ Please contact us about your model to participate
✶ Some participants overlap with those from the Flare Scoreboard: AMOS, ASAP, ASSA, BoM, MAG4, Met Office, SIDC, UFCORIN.

List of Time Intervals in this Study
🚧 under construction ✶ Consider some period of overlap with SEP working team intervals

Resources and Past Progress
🚧 under construction
✶ Barnes et al. (2016), A Comparison of Flare Forecasting Methods. I. Results from the “All-Clear” Workshop, ApJ, 829, 2. doi:10.3847/0004-637X/829/2/89. Website for data from the workshop.
Scoreboard discussion at ESWW13 in November 2016
Flare Scoreboard

Coronal and Solar Wind Structure

Coronal & SW Structure; Ambient SW; Coronal Hole Boundaries

Team Leads: Peter MacNeice, Lan Jian
Participants: Eric Adamson* · Anastasios Anastasiadis · Nick Arge · Hazel Bain · Francois-Xavier Bocquet · Mark Cheung · Consuelo Cid · Craig DeForest* · Sarah Gibson · Rachel Hock-Mysliwiec · Bernard Jackson* · Lan Jian · Irina Kitiashvili* · Alexander Kosovichev* · Masha Kuznetsova* · Jon Linker · Peter MacNeice* · Periasamy K Manoharan* · Leila Mays* · Slava Merkin* · Marilena Mierla · Karin Muglach* · Nariaki Nitta · Dusan Odstrcil* · Mathew Owens · Spiros Patsourakos · Rui Pinto · Nikolai Pogorelov* · Martin Reiss* · Pete Riley · D Aaron Roberts* · Alexis Rouillard · Robert Steenburgh* · Aleksandre Taktakishvili* · Manuela Temmer · Christine Verbeke* · Matthew West* ·
Shaun Bloomfield · Steven Brown* · Baptiste Cecconi · Yaireska Collado-Vega* · Jackie Davies · David Falconer · Tadhg Garton* · Manolis K. Georgoulis* · Jordan Guerra · Bernd Heber · Carl Henney · Christina Kay · Burcu Kosar* · Noé Lugaz · Anthony Mannucci* · Daniel Matthiä* · Mike McAleenan* · Joseph Minow* · Christian Moestl · Sophie Murray* · Chigomezyo Ngwira* · Teresa Nieves · Naoto Nishizuka · PAUL OBRIEN* · Athanasios Papaioannou · Steve Petrinec* · vic pizzo · Ian Richardson* · Neel Savani* · Barbara Thompson* · Karlheinz Trattner* · Brian Walsh · Chunming Wang* · Daniel Welling* · Katherine Winters* · Alexandra Wold* · KiChang Yoon · Jie Zhang · Yongliang Zhang* · Yihua Zheng* ·
Communications: ccmc-solar-helio@googlegroups.com (mailing list)

3D CME Kinematics and topology

Team Leads: David Barnes (RAL Space, UK), Christian Möstl (Uni Graz, Austria), Barbara J. Thompson (NASA GSFC, USA)
Participants: D. Biesecker (NOAA/SWPC), R. Colaninno (NRL), R. Y. Kwon (GMU), Heather Mei (Tufts U./GSFC), Marilena Mierla (ROB), A. Reinard (CU/CIRES/SWPC), N. Savani (UMBC/GSFC), M. Temmer (UNIGraz), Matthew West (ROB), A. Vourlidas (APL), M. L. Mays (CCMC), A. Taktakishvili (CCMC), M. Kuznetsova (CCMC), K. Muglach (CCMC)
Note: we are still accepting team members; contact the organizers to join the group
Communications: ccmc-cmes@googlegroups.com (mailing list)

User Needs
Our end users are primarily modelers who use 3D CME measurement inputs to drive models that determine the propagation of CMEs throughout the heliosphere and forecast their impacts (such as CME arrival, Bz, and SEP generation) at different locations. We also include in our user community those who are interested in comparative studies of CME structure and propagation in the inner heliosphere.

The team has determined that a key user need is the ability to assess how accurate a CME measurement is, so that users can constrain the error in their comparative studies or their model boundary conditions. Improvement in CME measurement accuracy is important, but what is also needed is a clearer understanding of the limitations of that accuracy.

Working Team Goals
The team will focus on analysis methods, models, multi-viewpoint reconstruction and EUV proxies for determining the 3D CME input to heliospheric simulations. This will include a comparative study of the different methods and different users, to understand how much variation there can be in a measurement of a CME.

Status
The team is in the process of identifying a series of CMEs that have been or will be measured by multiple users, in order to undertake a study of the differences between the results and to begin identifying the sources of measurement uncertainty.

Resources
See the agenda and materials from the ESWW14 topical discussion meeting (2017): Advance Predictions of Solar Wind Conditions at L1: Quantifying Performance

Working Team summary from the International CCMC-LWS Working Meeting: Assessing Space Weather Understanding and Applications held April 3-7 2017: 3D CME Kinematics Working Team Summary [PDF]

Solar Indices and Irradiance

Solar indices (e.g. F10.7, MgII core-to-wing ratio, total magnetic flux)

Team Leads: C. Henney, J. Klenzing, K. Muglach
List of participants: C. Henney (AFRL), J. Klenzing (NASA/GSFC), K. Muglach* (NASA/GSFC), J-S. Shim* (CCMC), S. Bruinsma* (CNES), T. Fuller-Rowell* (NOAA), N. Arge (NASA/GSFC), D. Bilitza (NASA/GSFC), J. Fontenla (NWRA), A. Mannucci* (JPL), K. Tobiska* (SET), H. Warren (NRL), R. Hock-Mysliwiec (AFRL/RVBXS), A. Vourlidas (JHU APL), S. Brown* (GMU), L. Lefevre* (ROB), K. Winters* (45th Weather Squadron), L. Mays* (CCMC), P. Riley (PSI), R. Steenburgh* (NOAA), M. Georgoulis* (RCAAM), D. Matthiae* (DLR), C. Wang* (University of Southern California), M. Snow* (LASP), F. Eparvier* (LASP), E. Thiemann* (LASP), T. Woods (LASP), R. Viereck (NOAA), A. Kosovichev* (NJIT), I. Kitiashvili* (NASA Ames Research Center), J. Lee (UBC/JCET)
*attending CCMC-LWS working meeting
Communications: ccmc-solar-indices@googlegroups.com (mailing list)

User Needs

Solar XUV (0.1-10 nm), EUV (10-120 nm), and FUV (120-200 nm) radiation is absorbed in the Earth's upper atmosphere, driving ionization and heating of the neutral atmosphere. Current Ionosphere-Thermosphere (IT) models are capable of being driven by measured VUV (0 to 200 nm) spectral information, as well as by modeled EUV spectra based on solar indices (e.g. F10.7, the solar radio flux at 10.7 cm, along with the Mg II core-to-wing ratio and total magnetic flux) used as proxies for EUV activity.

Day-to-day variability: Many of these solar indices are averaged over long periods of time (81 days to 1 year). When compared with EUV spectral lines (such as the He II 30.4 nm line) over a solar rotation cycle, indices such as F10.7 may peak several days before the EUV peak or several days after, depending on the propagation path through the solar atmosphere. The questions for the group are: 1) How well do solar indices reproduce the observed variability of radiation bands within the VUV that drive the IT models on a cadence of 24 hours? 2) Are there other proxies or direct measurements we should use? 3) How do we develop solar spectral irradiance models for input to IT models?
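The lead/lag between an index such as F10.7 and an EUV line over a solar rotation can be quantified with a lagged correlation analysis. The sketch below is illustrative only (the function names and daily series are assumptions, not the team's agreed method):

```python
def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

def best_lag(proxy, euv, max_lag=5):
    """Lag (in days) that maximizes the correlation between a daily proxy
    series and a daily EUV series; a positive lag means the proxy peaks
    before the EUV signal."""
    best = (0, -2.0)
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = proxy[:len(proxy) - lag or None], euv[lag:]
        else:
            a, b = proxy[-lag:], euv[:lag]
        r = pearson(a, b)
        if r > best[1]:
            best = (lag, r)
    return best

# Illustrative series: the "proxy" is the "EUV" signal shifted 2 days earlier
s = [1, 3, 7, 12, 18, 25, 20, 14, 9, 5, 2, 4, 8, 13, 6, 3, 1, 2, 5, 9]
proxy, euv = s[2:], s[:-2]
lag, r = best_lag(proxy, euv, max_lag=4)
```

Applying this to real F10.7 and He II 30.4 nm time series over several rotations would show whether, and by how many days, the proxy leads or trails the EUV variability.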

Long-term variability: During the minimum between solar cycles 23 and 24 the ionosphere and thermosphere were unusually contracted. This has frequently been attributed to EUV radiation decreasing more than usual during the minimum while traditional solar proxies remained consistent. Additionally, F10.7 is typically capped at values of 200 sfu in IT models. So another question for the group is: Are other proxies such as MgII c-w better suited to represent solar EUV long-term variability?

The questions for the group are: 1) How accurate are these indices that drive the models relative to the ionizing radiation on these shorter time scales, and 2) Are there other proxies or direct measurements we should use?

Working Team Goals

  • Identify the current and future input needs of IT models.
  • Identify the effectiveness of available solar proxies (F10.7, MgII c-w, total magnetic field strength) in reproducing the variability of bands within the observed VUV spectrum (0-200 nm) on timescales appropriate to measure the 27-day solar rotation cycle, as well as in terms of overall long-term (11-year) variability.
  • Agree on long-term datasets of EUV that could be used to calibrate and validate these indices on these daily timescales, as well as IT data to validate the response.

Working Team Deliverables

Physical Quantities and Metrics for Model Validation

  • F10.7
  • MgII core-to-wing ratio
  • total magnetic flux
  • others?

Observed Solar Data Sources:

  • 10.7cm solar radio flux (F10.7): National Research Council of Canada, K. Tapping, Publication with description of data, data since 2004: ftp://ftp.geolab.nrcan.gc.ca/data/solar_flux/daily_flux_values/fluxtable.txt
  • MgII core-to-wing ratio (Mg II index)
  • total magnetic flux

HELIOSPHERE

IMF Bz at L1

Leads: N. Savani, P. Riley

Communications: ccmc-imf-bz@googlegroups.com (mailing list)
Participants: Eric Adamson · Nick Arge · Michael Balikhin* · Francois-Xavier Bocquet · Sean Bruinsma* · Yaireska Collado-Vega* · Pedro Corona-Romero* · Curt de Koning* · Manolis K. Georgoulis* · Edmund Henley · Bernard Jackson* · Lan Jian · Christina Kay · Noé Lugaz · Anthony Mannucci* · Periasamy K Manoharan* · Slava Merkin* · Marilena Mierla · Joseph Minow* · Christian Moestl · Karin Muglach* · Chigomezyo Ngwira* · Teresa Nieves · Nariaki Nitta · Dusan Odstrcil* · Mathew Owens · Spiros Patsourakos · Pete Riley · Alexis Rouillard · Neel Savani* · Camilla Scolini · Daikou Shiota · Howard Singer* · Robert Steenburgh* · Manuela Temmer · Christine Verbeke* · Angelos Vourlidas · Bob Weigel · Daniel Welling* · Alexandra Wold* · Yongliang Zhang* · Jie Zhang
Anastasios Anastasiadis · Steven Brown* · Craig DeForest* · David Falconer · Natalia Ganushkina* · Adam Kellerman* · Burcu Kosar* · Alexander Kosovichev* · Masha Kuznetsova* · Ramon Lopez · Peter MacNeice* · Daniel Matthiä* · Naoto Nishizuka · PAUL OBRIEN* · Evangelos Paouris · Athanasios Papaioannou · Steve Petrinec* · Nikolai Pogorelov* · Ian Richardson* · David Sibeck · Karlheinz Trattner* · Rodney Viereck · Brian Walsh · Chunming Wang* · KiChang Yoon · Yihua Zheng*
*attending CCMC-LWS working meeting

News
See the agenda and materials from the ESWW14 topical discussion meeting (2017): Advance Predictions of Solar Wind Conditions at L1: Quantifying Performance

ICCMC-LWS Workshop updates
➡ April 2017 working meeting: team agenda | solar/heliosphere agenda | full agenda

Following on from the original draft document that went out to the community, we will discuss each of the six topics at the workshop. In the first session on Wednesday (04/05), we will look to find areas of agreement and complexities to resolve. We will test a novel approach to the discussion by attempting live and interactive updates of the conversation by the community. The document is open to everyone, and contributions are solicited from the entire community. Wednesday’s live updates can be found here:
https://goo.gl/m2kICP

In addition to the live updating of the Wednesday session, the follow-up session on Thursday will predominantly focus on the future strategy and the pathways to impact and operations. Thursday’s live updates can be found here:
https://goo.gl/ZatBiR

If anyone has ideas they wish to convey, please feel free to upload them here, and send a summary via email to the team leads so that these points can be brought to the floor for discussion:
https://goo.gl/B2AGQO

Working Team Goals
To create a community-agreed selection of events and metrics against which all current and future models should test their magnetic field forecasting capabilities.
In this topic the community will focus on forecasting the magnetic structure of interplanetary CMEs and the ambient solar wind upstream of Earth. This group intends to open communication with the community in order to agree upon a standardized process by which all current and future models can be compared under an unbiased test. Current models will provide the initial set of forecasting skills, with the longer-term goal of providing a standardized test procedure which future model improvements can follow. This procedure is intended to provide concrete requirements to progress a scientific model along the Application Readiness Levels (ARLs) and into an operational setting. The conversation and scientific rationale behind all decisions will be recorded in order to facilitate future ARL procedures.

Solicitation for Community Opinions
We invite the wider community to participate and provide further insight that would benefit the final determination of evaluation criteria especially in those areas that remain outstanding. All new ideas are welcome, as well as additional suggestions on current evaluation themes.

A small team of model developers and end users (SWPC and UKMO) was selected to ‘seed’ an initial direction for further discussion by the wider community. Please find our initial findings here [PDF].

Current list of models incorporated in our discussion:
Data driven
1. Bz4Cast model (N. Savani)
2. Helicity-CME (H-CME) model (Patsourakos, Georgoulis)
3. A. Rouillard model

Numerical simulation
4. SUSANOO (D. Shiota)
5. EUHFORIA (S. Poedts – under development)

Recommendation algorithms
6. ProjectZed model (P. Riley)


GEOSPACE: Geomagnetic Environment

Ground Magnetic Perturbations: dBdt, delta-B, GICs, FACs

Team Leads:
Primary: D. Welling (U. of Michigan), H. Opgenoorth (Swedish Institute of Space Physics), C.Ngwira (CCMC) (email team leads/forum organizers to be added to the team)

Participants: M. Cash (NOAA/SWPC) TBC, C. Ngwira (CCMC), L. Rastaetter (CCMC), M. Kuznetsova (CCMC), I. Honkonen (CCMC), H. Singer (NOAA/SWPC), G. Millward (NOAA/SWPC), C. Balch (NOAA/SWPC), G. Toth (U. of Michigan), Lee-Anne McKinnell (SANSA), L. Rosenqvist (Swedish Institute of Space Physics), representatives of LWS GIC Institute (A. Pulkkinen, TBC)
Communications: ccmc-mag-perturbations@googlegroups.com (mailing list)

Description
This working group will focus on ground magnetic field perturbations and the ability of global geospace models to reproduce and predict them.

Working Team Goals

  • To improve the SWPC-CCMC dB/dt validation challenge by expanding on the work of Pulkkinen et al., (Space Weather, 2013).

Working Team Deliverables

  • Define procedure for providing error bars
  • Define metric for auroral boundary locations
  • Provide guidelines on thresholds determination in models.

Parameters and Metrics for Model Validation

  • dB/dt, delta-B, geoelectric fields/GICs, auroral boundary locations, and FACs.

Preliminary Event List
  • August 31 – September 1, 2001
  • October 29-31, 2003
  • August 31- September 1, 2005
  • December 14-16, 2006
  • April 5, 2010
  • August 5- 6, 2011

Geomagnetic Indices

Kp, Regional-K, Dst, Ap, ...

Team Lead: M. Liemohn (U. of Michigan) (email team lead to be added to the team)
Participants: L. Rastaetter (CCMC), A. Glocer (NASA/GSFC), D. Welling (U. of Michigan), Consuelo Cid (U. of Alcala), R. Boynton (U. of Sheffield), L. Rosenqvist (Swedish Institute of Space Physics), P. Wintoft (Swedish Institute of Space Physics), H. Singer (NOAA/SWPC), C. Balch (NOAA/SWPC), K. Tobiska (SET), S. Vennerstrom (DTU)
Communications: ccmc-geomag-indices@googlegroups.com (mailing list)

Introduction

We are coming into this topic with a long history of model predictions of geomagnetic indices. One of the latest is the Rastaetter et al. (2014) paper on the Dst index challenge (http://onlinelibrary.wiley.com/doi/10.1002/swe.20036/full), but there are many others. That is, we are not starting from zero on this topic. In fact, here are some preliminary answers:

User Needs

People use indices to interpret the global intensity of space weather activity. In addition, many geospace models use these indices as input. Prediction of these indices is useful.

Predictions of geomagnetic indices must come with errors that can be fed into a modeling ensemble to arrive at a level of uncertainty from the specification of these indices.
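One way such index errors can feed an ensemble is by Monte Carlo sampling: draw the index from its error distribution, run the downstream model on each draw, and report the resulting output spread. The model response and all numbers below are purely illustrative assumptions:

```python
import random
import statistics

def model_output(kp):
    """Hypothetical (purely illustrative) geospace model response to Kp."""
    return 100.0 + 25.0 * kp

def ensemble_spread(kp_pred, kp_sigma, n=1000, seed=42):
    """Propagate an uncertain index prediction into an output uncertainty:
    sample the index n times, run the model on each sample, and report the
    mean and standard deviation of the outputs."""
    rng = random.Random(seed)
    outputs = [model_output(rng.gauss(kp_pred, kp_sigma)) for _ in range(n)]
    return statistics.mean(outputs), statistics.stdev(outputs)

# A predicted Kp of 5.0 with an assumed 1-sigma error of 0.7
mean_out, spread_out = ensemble_spread(kp_pred=5.0, kp_sigma=0.7)
```

For a nonlinear model the same sampling approach still works, but the output distribution may be skewed, so quantiles can be more informative than a single standard deviation.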

Working Team Goals

The unresolved issues are in the details and our goal for the week is to agree on them. Here are some thoughts:

  1. Which indices will we focus on (Dst/SYM-H, Kp, AE, AL, PC, etc.)?
  2. Which error analysis will we focus on (RMSE, correlation coefficient, or some other error value)?
  3. If a contingency table (a “yes/no” 2x2 chart of storm event prediction), then what time interval/cadence and what threshold define an “event”?
  4. What intervals/events should we consider the “baseline” for benchmarking performance?
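The error measures and contingency-table approach above can be sketched in a few lines. The threshold, index values, and choice of the Heidke skill score below are illustrative assumptions, not the team's agreed answers to these questions:

```python
import math

def rmse(model, obs):
    """Root-mean-square error between modeled and observed index values."""
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs))

def contingency(model, obs, threshold):
    """2x2 contingency counts (hits, false alarms, misses, correct negatives)
    where an 'event' is the value reaching the threshold."""
    h = fa = miss = cn = 0
    for mv, ov in zip(model, obs):
        pred, event = mv >= threshold, ov >= threshold
        if pred and event:
            h += 1
        elif pred:
            fa += 1
        elif event:
            miss += 1
        else:
            cn += 1
    return h, fa, miss, cn

def heidke(h, fa, miss, cn):
    """Heidke skill score: fraction correct relative to random chance."""
    n = h + fa + miss + cn
    expected = ((h + fa) * (h + miss) + (cn + fa) * (cn + miss)) / n
    return (h + cn - expected) / (n - expected)

# Hypothetical 3-hourly Kp values; "storm event" = Kp >= 5
model_kp = [2.3, 4.7, 6.0, 5.3, 3.0, 1.7]
obs_kp   = [2.0, 5.3, 6.3, 4.7, 3.3, 2.0]
counts = contingency(model_kp, obs_kp, threshold=5.0)
```

For an index like Dst, where storms are large negative excursions, the event test would flip to "value below threshold"; the rest of the machinery is unchanged.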

Working Team Deliverables

A framework for assessing a new geomagnetic index prediction model.

Predicted indices with error bars for different prediction lead times; errors or uncertainties can be very large for long lead times.
Ionosphere teams will propagate uncertainties in indices (used as input) into uncertainties in outputs (neutral density, TEC, etc.).

Parameters and Metrics for Model Validation

Dst and Kp are the most likely targets for our time, but AE, AL, and others could/should be discussed. I am hoping that the framework is generic enough to be applied to any index.

Preliminary Event List

Should this working team focus less on individual storm events and more on long intervals of 6 months or more?

Magnetopause location and geosync. orbit crossing

Team Leads: Y. Collado-Vega (CCMC), S. Merkin (JHU APL) (co-lead) (email team leads to be added to the team)
Participants: M. Collier (NASA GSFC), H. Connor (NASA GSFC), D. Cramer (UNH), K. Garcia-Sage (NASA/GSFC), M. Georgoulis (AOA), H. Hietala (UCLA), I. Honkonen (NASA GSFC), A. Kellerman (UCLA), K. Kuntz (JHU), M. Kuznetsova (CCMC), N. Lugaz (UNH), R. Lopez (UTA), S. Merkin (JHU APL), S. Petrinec (LMCO), J. Raeder (UNH), L. Rastaetter (CCMC), K. Raymer (Leicester), S. Sazykin (RICE), D. Sibeck (NASA GSFC), H. Singer (NOAA), K. Trattner (LASP), B. Walsh (BU), Chih-Ping Wang (UCLA), D. Welling (UMICH), M. Wiltberger (NCAR), K. Winters (USAF)
Communications: ccmc-magnetopause@googlegroups.com (mailing list)

Working Team Description
This working team will be devoted to studies of the magnetopause (MP) location, the physical processes governing it, and the ability of global geospace models to reproduce and predict it.

Working Team Goals

  • Compile a list of previous publications summarizing the observational, theoretical, and modeling knowledge regarding the MP geometry and dynamics, the ability of models to predict it, etc. In particular, to what extent can we claim to know where the magnetopause is at a given time and at a given location in the absence of in situ data (see Samsonov et al., 2016)?
  • Define/discuss the physical processes governing the MP location and to what extent different models include these processes (e.g. ring current effects).
  • Summarize/discuss previous events and statistical studies. Consider candidate events for simulations, including the recent MMS MP season.
  • Define metrics for model-data comparison and the techniques used. Consider dynamics vs. morphology; metrics can be different.
  • Discuss modeling challenges during the meeting.
  • Ongoing: compare model outputs with data and amongst each other, and characterize the differences.
  • Define which model settings can affect their output significantly and relate these to missing physics, if possible.
  • If possible, define how models can achieve better agreement.
  • Define improvements/capabilities needed for better predictions.

User Needs

  • Determine end users and their needs, metrics, etc.
  • Science applications thresholds to understand the parameters needed for improvement
  • Real time simulations capabilities/now casting

Working Team Deliverables

  • Support GEM and LWS programs and challenges. For example, the Focus Group on Dayside Kinetic Processes in Global Solar Wind Magnetosphere Interaction: the metrics it will obtain from a CCMC-hosted modeling challenge comparing different (kinetic) global models against near-magnetopause observations during southward IMF can be compared to the metrics established in this workshop. Over the longer term, these results can be compared again when better models are developed.
  • Define (if possible) some settings that could make the models have better agreement on the magnetopause location. Make those settings available to everyone.
  • Ideas on how to create an ensemble of the model predictions and make that available to the scientific community.
  • TBD after discussion at the workshop

Physical Quantities and Metrics for Model Validation

  • Model settings
  • Inside/Outside of Magnetopause decision: variable, threshold, sampling time interval

Available Data Sources

  • Observations, impact information
  • Solar wind conditions (density, IMF all directions, solar wind speed)
  • MP crossings by THEMIS, MMS, other platforms

Participating Models

  • SWMF
  • LFM
  • OpenGGCM
  • GUMICS
  • VERB
  • Empirical models
  • Kinetic models (long term)
  • Others

Workshop Presentations and meeting notes

Presentations given at the workshop and meeting notes available at:
https://drive.google.com/drive/folders/0B94meBUH9vymZHdoMXBKOElwc0E

Tentative Agenda and Presentations

  1. Wednesday at 3PM / Wednesday at 4:45PM:
     - Nowcasting and forecasting of the magnetopause and bow shock—A status update (Steven Petrinec, Lockheed Martin)
     - GEM Model Challenges; Metrics and Skill Scores already done (Lutz Rastaetter, NASA GSFC)
     - Events Description
     - Magnetospheric balance of solar wind dynamic pressure [Jan 13, 2013 and Jun 23, 2015 events] (Ramon Lopez, UTA)
     - Comparison of MHD models and Geosync. Orbit Crossings (Yari Collado-Vega, NASA GSFC)
     - Other Events
  2. The last-closed drift shell (LCDS) computations (Adam Kellerman, UCLA)
  3. Scientific "end-user" requirements in regards to our VERB code simulations and real-time forecasting (Adam Kellerman, UCLA)
  4. User needs identified by SWPC (Howard Singer, NOAA SWPC)
  5. User needs identified by SWRC (Antti Pulkkinen, NASA GSFC)
  6. New developments and why validation is important for Science: Soft X-ray Imaging of the Magnetosphere (Kip Kuntz, JHU)
  7. Discussion of the 5 min team update on Thursday

Thursday at 10:45AM

  • Overall Discussion and preparation for Domain presentation on Friday

GEOSPACE: Auroral Region

Auroral precipitation and high latitude ionosphere electrodynamics

(FACs, Joule heating, electric fields)

Team Leads: R. Robinson* (CUA), Y. Zhang* (APL), B. Kosar* (NASA/GSFC)

Participants and Followers: List of participants and followers (as of November 2, 2017) [PDF]
Communications: ccmc-auroral-precip@googlegroups.com (mailing list)

Google Drive
Documents for AuroraPHILE teams

Latest News
Summary of AuroraPHILE working team discussions at April 2017 working meeting [PDF]

User Needs
The requirement for information on the auroral ionosphere is broadly driven by two types of users: (1) those for whom the properties of the auroral ionosphere represent the final product, such as energetic particle fluxes, location of visible aurora, ionospheric currents, and (2) those who use auroral properties as input to models that specify electron densities, TEC, scintillation, neutral atmosphere densities and temperature, Joule heating, and other quantities.

Working Team Goals
To establish quantitative means to measure the accuracy and reliability of modeled properties of the auroral ionosphere, including particle precipitation, conductivities, electric fields, neutral winds, currents, and Joule heating.

Working Team Deliverables
The working team will establish a set of properties that describe the state of auroral particle precipitation and electrodynamics, and then quantify the accuracy and reliability currently achievable using a combination of data and models. Parameters that specify the auroral ionosphere will include both local and global quantities.

AuroraPHILE Worksheet [xlsx] lists
Models [PDF],
Data sources [PDF] (both ground-based and space-based),
Events [PDF], and
Metrics [PDF] that will be considered in the Working Team discussions.


IONOSPHERE

Neutral Density and Orbit Determination at LEO

Team Leads: S. Bruinsma (Primary lead, CNES), S. Solomon* (UCAR HAO), T. Fuller-Rowell* (NOAA/SWPC), E. Sutton* (AFRL) (email team leads to be added to the team)

Participants and Followers: List of participants and followers (as of November 2, 2017) [PDF]
Communications: ccmc-neutral-density@googlegroups.com (mailing list)

Latest News
Summary of ionosphere working team discussions at April 2017 working meeting [PDF]
Presentations and Discussions from the 2017 working meeting

User Needs

  • Prediction of atmospheric density to minimize drag errors in orbit determination (OD) and propagation for purposes of collision avoidance or re-entry predictions
  • Model uncertainty information that can be formulated into correction matrices to the OD-produced covariances in order to represent the atmospheric density model errors properly

Working Team Goals

  • Establish metrics agreed upon by the community
  • Evaluate where we stand with atmospheric density prediction
  • Provide a benchmark against which future models can be assessed

Physical Quantities and Metrics for Model-Data Comparison

Neutral density in the altitude range 200–800 km:

  • Daily mean
  • Orbit averaged
  • Model sampled at satellite locations, binned 5° along track
  • Global mean (J. Emmert: global daily mean of TLE-inferred densities)
  • Mean, standard deviation, and RMS of observed-to-modeled density ratios, as well as correlation coefficient
  • Orbits of LEO satellites
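The ratio statistics listed above can be computed in a few lines; the following is a minimal NumPy sketch (the function name and interface are illustrative, not from any team code):

```python
import numpy as np

def density_ratio_stats(observed, modeled):
    """Summarize observed-to-modeled neutral density ratios.

    `observed` and `modeled` are density samples on the same
    times/locations (e.g., model output binned 5 deg along track).
    """
    r = np.asarray(observed, dtype=float) / np.asarray(modeled, dtype=float)
    return {
        "mean_ratio": np.mean(r),                 # mean of obs/model ratios
        "std_ratio": np.std(r, ddof=1),           # sample standard deviation
        "rms_ratio": np.sqrt(np.mean(r**2)),      # RMS of the ratios
        # Correlation between the observed and modeled density series
        "corr": np.corrcoef(observed, modeled)[0, 1],
    }
```

A model that is biased high by a constant factor shows up as a mean ratio below 1 with small spread, while timing errors degrade the correlation coefficient instead.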

Available Data Sources

  • Accelerometer/GPS measurements by CHAMP/GRACE/GOCE/SWARM
  • Global mean neutral density survey by J. Emmert
  • Daily mean density at 800 km from Starlette and Stella
  • Standard versions to be provided by E. Doornbos, S. Bruinsma, and J. Emmert
  • POE (Precise Orbit Ephemeris):
    - Aqua/Aura/Terra provided by NASA FDF
    - Cryosat-2/Sentinels at 800 km
    - TanDEM-X/TerraSAR-X at 500 km, to be provided by S. Bruinsma

Participating Models [PDF]
Please contact the team leads for additional model suggestions

  • Empirical Models: JB2008, DTM, NRLMSISE-00
  • Numerical Models: CTIPe, TIE-GCM, GITM

List of Time Intervals

  • Proposed Complete Years: 2002, 2007, 2012 (starting with 2012)
  • Storms from list adopted by ionosphere group:
    • Time Intervals: Selection was mainly based on data availability
      • Proposed to study entire year 2012
        • Understand quiet times
        • Understand importance of background conditions
    • Proposed Storm Events:
      | Date                    | Min Dst (nT)    |
      | 29 March – 3 April 2001 | -387            |
      | 18 – 31 July 2014       | -99, -126, -170 |
      | 14 – 16 May 2005        | -247            |
      | 8 – 11 March 2012       | -74, -131       |
      | 16 – 20 March 2013      | -132            |
      | 31 May – 4 June 2013    | -119            |
      | 21 – 24 June 2015       | -204            |
  • USAF “problem storms” from 2005:
    - Days 18, 21, 128, 135, 150, 191, 236, 254

Resources and Past Progress
Community Challenges supported by CCMC

Global & Regional TEC

Team Lead: L. Scherliess* (Primary lead, USU) (email team lead to be added to the team)

Participants and Followers: List of participants and followers (as of November 2, 2017) [PDF]
Communications: ccmc-tec@googlegroups.com (mailing list)

Latest News
Summary of ionosphere working team discussions at April 2017 working meeting [PDF]
Presentations and Discussions from the 2017 working meeting

User Needs

  • Effects of the ionosphere on GPS positioning application:
    Position Error for GNSS Users (Errors in TEC affect single frequency GNSS positioning)

Working Team Goals

  • Establish metrics agreed upon by the community
  • Evaluate where we stand with TEC prediction
  • Provide a benchmark against which future models can be assessed

Challenges to address

  • Uncertainty in observations and models
  • Metrics for specific applications
  • Ground truth for evaluations of data assimilation models

Lessons learned from previous assessments of TEC prediction

  • Difference in vTEC between different data sources
  • Difference in range for TEC calculations between models and data sets
  • One day of quiet-time values is not enough as a quiet-time reference to quantify storm impact
  • Lack of user input in defining appropriate metrics for specific applications

Physical Quantities and Metrics for Model-Data Comparison

  • vertical TEC, slant TEC
  • RMSE, NRMSE, and Ratio of the modeled to observed maximum increase
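The listed metrics can be sketched directly from their definitions, assuming all series share a common time grid; normalizing RMSE by the mean observed TEC is one common convention and an assumption here, not a community-agreed choice:

```python
import numpy as np

def tec_metrics(observed, modeled, quiet_reference):
    """TEC skill metrics: RMSE, NRMSE, and the ratio of the modeled to
    observed maximum increase above a quiet-time reference.

    All inputs are TEC series (TECU) on the same time grid;
    `quiet_reference` is the quiet-time baseline used to define the
    storm-time "increase".
    """
    obs, mod, quiet = (np.asarray(x, dtype=float)
                       for x in (observed, modeled, quiet_reference))
    err = mod - obs
    rmse = np.sqrt(np.mean(err**2))
    nrmse = rmse / np.mean(obs)  # normalization choice: mean observed TEC
    # Ratio of modeled to observed maximum increase over the quiet baseline
    max_increase_ratio = np.max(mod - quiet) / np.max(obs - quiet)
    return rmse, nrmse, max_increase_ratio
```

A max-increase ratio near 1 indicates the model captures the storm-time enhancement amplitude even if point-by-point errors (RMSE) remain.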

Available Data Sources

  • TEC measurements from the global network of GNSS ground receivers

Participating Models [PDF]
Please contact the team leads for additional model suggestions

List of Time Intervals

  • Storms from list adopted by ionosphere group:
    • Time Intervals: Selection was mainly based on data availability
      • Proposed to study entire year 2012
        • Understand quiet times
        • Understand importance of background conditions
    • Proposed Storm Events:
      | Date                    | Min Dst (nT)    |
      | 29 March – 3 April 2001 | -387            |
      | 18 – 31 July 2014       | -99, -126, -170 |
      | 14 – 16 May 2005        | -247            |
      | 8 – 11 March 2012       | -74, -131       |
      | 16 – 20 March 2013      | -132            |
      | 31 May – 4 June 2013    | -119            |
      | 21 – 24 June 2015       | -204            |

Resources and Past Progress
Community Challenges supported by CCMC

Ionosphere Plasma Density: NmF2/foF2, hmF2, TEC

Team Leads: I. Tsagouri* (Primary lead, NOA), M. Angling (U. of Birmingham), K. Garcia-Sage* (CCMC) (email team leads to be added to the team)

Participants and Followers: List of participants and followers (as of November 2, 2017) [PDF]
Communications: ccmc-plasma-density@googlegroups.com (mailing list)

Latest News
Summary of foF2 working team discussions on Sep. 14th, 2017 [PDF]
Summary of ionosphere working team discussions at April 2017 working meeting [PDF]
Presentations and Discussions from the 2017 working meeting

User Needs
The Earth’s ionosphere poses a challenge for the reliable performance of a variety of technological systems, either as an essential part of the system (e.g., HF communications) or as a source of error (e.g., satellite communication and navigation systems). This makes accurate modeling of the 3-D electron density at the relevant temporal and spatial scales a critical requirement for the design and operation of these systems.

  ♦ Types of products: long-term predictions, nowcasts, and short-term predictions (forecasts)
  ♦ Spatial scales: local, regional, global
  ♦ Challenges:
    • Location, onset, magnitude, and duration of ionospheric disturbances in the 3-D electron density distribution
    • Timeliness and accuracy of the predictions/forecasts

Working Team Goals

Strategic goals

  ♦ Discussion and better understanding of the users' needs at different spatial and temporal scales
  ♦ Establish a set of community-agreed metrics that address specific user needs
  ♦ Evaluate existing ionospheric modeling capabilities based on the selected metrics and provide a benchmark against which future models can be assessed
  ♦ Bring together existing and new community initiatives, organize working meetings and tag-ups, and coordinate international efforts to facilitate progress

Secondary goals

  ♦ Provide online tools to facilitate model assessment within the international community (in collaboration with the Information Architecture for Interactive Archives (IAIA) working team)

Physical Quantities and Metrics for Model-Data Comparison

  ♦ foF2/NmF2, MUF
  ♦ hmF2
  ♦ Vertical TEC (local)

  Challenges:
    • Quiet-time ionospheric variation at different temporal and spatial scales
    • Ionospheric storm effects (large scale)

Available Data Sources

  ♦ Ionosondes, incoherent scatter radars (ISRs), GNSS receivers, and radio occultation data

Participating Models [PDF]
Please contact the team leads for additional model suggestions

Preliminary considerations on the selection of the time intervals [PDF]

Resources and Past Progress
Community Challenges supported by CCMC

Ionosphere Scintillation

Team Lead: E. Yizengaw* (BC) (contact team lead to be added to the team)

Participants and Followers: List of participants and followers (as of November 2, 2017) [PDF]
Communications: ccmc-scintillation@googlegroups.com (mailing list)

Latest News
Summary of ionosphere working team discussions at April 2017 working meeting [PDF]
Presentations and Discussions from the 2017 working meeting

User Needs

  • Indices to characterize the state of the ionosphere for radio communication and satellite navigation purposes

Working Team Goals

  • Determine widely used quantities that can be utilized for model-data comparison
  • Obtain scintillation/irregularity data collected at different geographic locations using different instruments
  • Evaluate the capabilities of current models by performing model-data comparison assessments during different geomagnetic conditions and at different geographic locations

Challenges:

  • The physics behind the significant longitudinal variability of scintillation/bubbles
  • Role of external and internal drivers, especially to understand the longitudinal scintillation variabilities
  • Potential drivers for quiet time scintillation/bubbles
  • Uncertainties between observations and models

Physical Quantities and Metrics for Model Validation

  • S4-index, ROTI, sigma-phi
  • Onset time and duration of scintillations above a certain S4-index level
  • Maximum/peak value of the scintillation index
  • Spatial variability of scintillation activity
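The two standard indices above follow directly from their definitions; this sketch assumes detrended received-intensity samples over one window (typically ~60 s) for S4 and uniformly sampled slant TEC for ROTI:

```python
import numpy as np

def s4_index(intensity):
    """Amplitude scintillation index S4 over one window of detrended
    received signal intensity samples:
    S4 = sqrt( (<I^2> - <I>^2) / <I>^2 )."""
    i = np.asarray(intensity, dtype=float)
    return np.sqrt((np.mean(i**2) - np.mean(i)**2) / np.mean(i)**2)

def roti(tec, dt_minutes=0.5):
    """ROTI: standard deviation of the rate of TEC (ROT) over a window
    (commonly 5 min).  `tec` is slant TEC in TECU sampled every
    `dt_minutes`; ROT is in TECU/min."""
    rot = np.diff(np.asarray(tec, dtype=float)) / dt_minutes
    return np.std(rot)
```

Both return 0 for a perfectly steady signal; onset time and duration above a threshold can then be read off a series of windowed S4 values.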

Available Data Sources

  • GNSS receivers / S4 index and ROTI
  • UHF/VHF receivers
  • LEO satellite in-situ density

Participating Models
Please contact the team leads for additional model suggestions

  • PBMOD

List of Time Intervals in this Study

  • Storms from list adopted by ionosphere group:
    • Time Intervals: Selection was mainly based on data availability
      • Proposed to study entire year 2012
        • Understand quiet times
        • Understand importance of background conditions
    • Proposed Storm Events:
      | Date                    | Min Dst (nT)    |
      | 29 March – 3 April 2001 | -387            |
      | 18 – 31 July 2014       | -99, -126, -170 |
      | 14 – 16 May 2005        | -247            |
      | 8 – 11 March 2012       | -74, -131       |
      | 16 – 20 March 2013      | -132            |
      | 31 May – 4 June 2013    | -119            |
      | 21 – 24 June 2015       | -204            |

Resources and Past Progress
Comparing Radio Scintillation Models with Observations (John Retterer, BC) [PDF]


RADIATION and PLASMA EFFECTS

Surface Charging

Few-eV to keV electrons, plasma density

Team Leads: J. Minow, D. Pitchford, N. Ganushkina
Participants: Laila Andersson · Michael Balikhin* · Janessa Buhler* · Yaireska Collado-Vega* · Natalia Ganushkina* · Tim Guild · Insoo Jun* · Adam Kellerman* · Masha Kuznetsova* · Collin Meierbachtol* · Joseph Minow* · Paul O'Brien* · Lutz Rastaetter* · Howard Singer* · Brian Walsh · Yihua Zheng* ·
Jesse Woodroffe* · Consuelo Cid · Kangjin Lee · Larry Paxton* · Michael Liemohn* · Katherine Winters* · Mike McAleenan* · Laura Godoy* · Yongliang Zhang* · Steve Petrinec* · Manolis K. Georgoulis* · Daniel Matthiä* · Dave Pitchford · Chunming Wang* · Alexis Rouillard · KiChang Yoon · Michael Henderson · Chigomezyo Ngwira* · W. Kent Tobiska* · Irina Kitiashvili* · Chris Jeffery* ·
Communications: ccmc-surf-charging@googlegroups.com (mailing list)

Google drive: documents for radiation and plasma effects teams

User Needs
Satellite design:
Need worst-case electron spectra, but the worst case is hard to define.

Satellite anomaly resolution:
Need the electron spectrum (and, at lower priority, the plasma density) at the time and location of an anomaly. Also need this information over the history of the mission to get a sense of whether the time of the anomaly was especially severe.

Internal Charging

keV–MeV electrons

Team Leads: P. O'Brien, Yuri Shprits (contact team leads/forum organizers to be added to the team)
Participants: Michael Balikhin* · Yaireska Collado-Vega* · Natalia Ganushkina* · Insoo Jun* · Adam Kellerman* · James McCollough · Joseph Minow* · Steven Morley* · Paul O'Brien* · Yuri Shprits · Howard Singer* · Yihua Zheng* ·
Laila Andersson · Janessa Buhler* · Consuelo Cid · Manolis K. Georgoulis* · Laura Godoy* · Michael Henderson · Chris Jeffery* · Piers Jiggens* · Masha Kuznetsova* · Kangjin Lee · Daniel Matthiä* · Mike McAleenan* · Collin Meierbachtol* · Chigomezyo Ngwira* · Larry Paxton* · Steve Petrinec* · Dave Pitchford · Lutz Rastaetter* · Alexis Rouillard · W. Kent Tobiska* · Katherine Winters* · KiChang Yoon ·
Communications: ccmc-int-charging@googlegroups.com (mailing list)

Google drive: documents for radiation and plasma effects teams

User Needs
User groups:
⊕ Satellite design (SD)
⊕ Satellite operators and anomaly analysts (SOAA)
⊕ Scientists (not top priority for our focus team, but want to include if we can) (SCI)
⊕ Insurance companies (IC)
⊕ Government agencies included in SWORM task force (GA)

Satellite design:
Need worst-case spectra, but the worst case is hard to define: it depends on the time history over hours to days, or even months, and on shielding.

Satellite anomaly resolution:
Need the electron spectrum along the vehicle trajectory for the hours leading up to an anomaly. Also need this information over the history of the mission to get a sense of whether the time of the anomaly was especially severe.

Physical Quantities and Metrics for Model Validation
Metrics for each user group
⊕ SD+IC: Peak 12-hour average current emerging from 100 mils of slab Al shielding at GEO and GTO over a 10-year mission. Binary: is it above/below the NASA-4002 safe level (100 fA/cm2)?
⊕ SOAA+GA: 12-hour average current emerging from 100 mils of slab Al shielding at GEO and GTO from launch to present, plus forecast
⊕ SCI: Total radiation belt content above some energy? Belt indices like POES? (>0.1 MeV, >1 MeV, >2 MeV, >4 MeV) averaged over 1 hour, 12 hours, 3 days, 1 week.
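The SD+IC metric above (peak 12-hour average current and its binary comparison to the safe level) can be sketched as a running boxcar average; the boxcar scheme and fixed-cadence input are assumptions, not a prescribed procedure:

```python
import numpy as np

def max_12h_avg_current(current_fA_cm2, cadence_hours=1.0, safe_level=100.0):
    """Peak 12-hour running average of the current (fA/cm^2) emerging
    from shielding, plus the binary above/below-safe-level metric
    (100 fA/cm^2, the NASA-4002 level quoted in the text).

    `current_fA_cm2` is a time series at fixed cadence; the running
    average is a simple boxcar over a 12-hour window."""
    j = np.asarray(current_fA_cm2, dtype=float)
    window = max(1, int(round(12.0 / cadence_hours)))
    kernel = np.ones(window) / window
    avg = np.convolve(j, kernel, mode="valid")  # all full 12-h averages
    peak = avg.max()
    return peak, peak > safe_level
```

For a forecast product, the same reduction would be applied to the predicted current series from launch to present plus the forecast horizon.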

Error metrics to consider:
⊕ Normalized difference, average normalized difference, Table 4 of Subbotin et al., 2009
⊕ Median log accuracy ratio and median symmetric accuracy. (Morley, LA-UR-16-24592, Alternatives to accuracy and bias metrics based on percentage errors for radiation belt modeling applications)
⊕ Spence et al. 2004 CISM metrics. The metric was MSE (mean squared error) or SS (skill score), but the paper gave a list of physical quantities whose error should be tracked, and models for each
⊕ Median L* at fixed M or E (K=0)
⊕ Principal component amplitudes
⊕ Correlation coefficient between the observed and modeled peak of the outer belt, inner and outer edges of the outer belt
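The Morley metrics cited above have closed-form definitions; the sketch below uses base-10 logs for the median log accuracy ratio (a choice made here for readability, not necessarily the report's convention) and the standard form of median symmetric accuracy:

```python
import numpy as np

def median_log_accuracy(observed, modeled):
    """Median log accuracy ratio and median symmetric accuracy.

    With Q = modeled/observed:
      MdLQ = median(log10 Q)                      (bias, sign-preserving)
      MSA  = 100 * (exp(median|ln Q|) - 1)  [%]   (symmetric accuracy)
    Both are robust to the heavy-tailed errors typical of radiation
    belt flux predictions, unlike percentage errors."""
    q = np.asarray(modeled, dtype=float) / np.asarray(observed, dtype=float)
    mdlq = np.median(np.log10(q))
    msa = 100.0 * (np.exp(np.median(np.abs(np.log(q)))) - 1.0)
    return mdlq, msa
```

A model that is uniformly a factor of 2 high yields MdLQ = log10(2) and MSA = 100%, which reads naturally as "typically wrong by a factor of 2".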

Available Data Sources
Observations, impact information

Participating Models
Empirical model for each metric
⊕ SD+IC: AE9, extreme value analyses, (long term reanalysis)
⊕ SOAA+GA:
⊕ GOES data for GEO vehicles
⊕ GREEP (Geosynchronous Radiation-belt Electron Empirical Prediction) for GEO
⊕ NARMAX (Nonlinear Autoregressive Moving Average modelling) for GEO
⊕ CRRESELE driven by sumKp?
⊕ SCI: (could maybe pull this out of CRRESELE for >1 day timescales)
Physical model for each metric
⊕ SD+IC: VERB, SALAMMBO, DREAM (long term reanalysis)
⊕ SOAA+GA: VERB, SALAMMBO, DREAM, RBE, BAS, etc. (long term reanalysis or sim)
⊕ SCI: same as SOAA
(Note: a problem with the distinction between empirical and physical models is data-assimilative models: where do they fall? Even physical simulations with heavily influential boundary conditions behave more like assimilative models. One way to draw the line: does the model advance an equation of state through physical simulation over time?)

Single Event Effects

MeV–GeV protons, ions

Team Leads: M. Xapsos, J. Mazur, P. Jiggens (contact team leads/forum organizers to be added to the team)
Participants: Anastasios Anastasiadis · Janessa Buhler* · Yaireska Collado-Vega* · Natalia Ganushkina* · Tim Guild · Piers Jiggens* · Insoo Jun* · Periasamy K Manoharan* · Christopher Mertens · Joseph Minow* · Paul O'Brien* · Brian Walsh · Michael Xapsos* · KiChang Yoon · Yihua Zheng* ·
Laila Andersson · Consuelo Cid · Erwin De Donder · Manolis K. Georgoulis* · Laura Godoy* · Daniel Heynderickx* · Irina Kitiashvili* · Masha Kuznetsova* · Kangjin Lee · Daniel Matthiä* · Leila Mays* · James McCollough · Collin Meierbachtol* · Athanasios Papaioannou · Larry Paxton* · Steve Petrinec* · Dave Pitchford · Lutz Rastaetter* · Alexis Rouillard · Howard Singer* · W. Kent Tobiska* · Katherine Winters* ·
Communications: ccmc-see@googlegroups.com (mailing list)

Google drive: documents for radiation and plasma effects teams

User Needs
User groups:
⊕ Satellite designers (SD)
⊕ Satellite/launcher/aircraft* operators [SLAO]
⊕ Standards organizations (ISO/ECSS/NASA internal) [SO]
*Effects on avionics are dealt with outside the space industry

Satellite design:
Need average and worst-case spectra, but the worst case is hard to define because it depends on shielding and on the part of interest.

Satellite anomaly resolution:
Need the proton and ion spectrum at the vehicle at the time of the anomaly. Also need this information over the history of the mission to get a sense of whether the time of the anomaly was especially severe.

Physical Quantities and Metrics for Model Validation
⊕ SD+SLAO (SEU rate): proton fluxes (>30 MeV & >50 MeV) [radiation belt peak values (5-minute); worst-case SEP values; worst-case solar particle event (SPE) fluence]
⊕ SD (SEL/SEB probability): proton fluences (>30 MeV & > 50 MeV) [Orbit-averaged radiation belt flux (fluence); cumulative SEP fluence]
⊕ SD+SLAO+SO: Abundance ratios and charge states of SEP heavy ions (Z>2) [extension to event-to-event variability/distributions if possible]
⊕ SD+SLAO: LET behind nominal shielding** (1 g.cm-2)
**Application of particle transport codes as a black box only, to derive useful quantities
⊕ Fluxes at selected energies / for a set of species [for SEP and GCR magnetically shielded flux calculations]
⊕ SLAO: SEP onset and peak timings***
***Forecast not applied in specifications but could potentially be used by operators

Available Data Sources
Observations, impact information

Identified empirical/statistical models
⊕ Trapped protons: AP9 [RD 2] (also AP8 [RD 3] still used in some standards); PSB97 [RD 4] + update (local model based on SAMPEX/PET)
⊕ SEPs: ESP-PSYCHIC [RD 5] [RD 6] [RD 7] [RD 8] [RD 9]; JPL [RD 10] [RD 11] [RD 12] [RD 13]; MSU [RD 14]; SAPPHIRE [RD 15] [RD 16]
⊕ GCRs: ISO-15390 GCR model [RD 17]; Badhwar-O'Neill (BON) [RD 18] [RD 19] [RD 20]; DLR GCR model [RD 21]
⊕ ESHIEM-MSM (magnetospheric shielding code); Shea and Smart [RD 22]
Physics-based Models
⊕ PLANETOCOSMICS

List of Time Intervals in this Study

TBD
References

⊕ [RD 1] http://dev.sepem.oma.be/help/sep_effects.html
⊕ [RD 2] G.P. Ginet et al., AE9, AP9 and SPM: New Models for Specifying the Trapped Energetic Particle and Space Plasma Environment, 179:579-615, Space Sci. Rev., 2013.
⊕ [RD 3] Sawyer, D. M. & J. I. Vette, AP-8 Trapped Proton Environment for Solar Maximum and Solar Minimum, NSSDC/WDC-A-R&S 76-06, 1976.
⊕ [RD 4] Heynderickx et al., A Low Altitude Trapped Proton Model for Solar Minimum Conditions Based on SAMPEX/PET Data, IEEE Trans. Nuc. Sci., 46(6) 1999.
⊕ [RD 5] M.A. Xapsos et al., Probability Model for Peak Fluxes of Solar Proton Events, IEEE Trans. Nuc. Sci., 45(6), 1998.
⊕ [RD 6] M.A. Xapsos et al., Probability Model for Worst Case Solar Proton Event Fluences, IEEE Trans. Nuc. Sci., 46(6), 1999.
⊕ [RD 7] M.A. Xapsos et al., Probability Model for Cumulative Solar Proton Event Fluences, IEEE Trans. Nuc. Sci., 47(3), 2000.
⊕ [RD 8] M.A. Xapsos et al., Model for Solar Proton Risk Assessment, IEEE Trans. Nuc. Sci., 51(6), 2004.
⊕ [RD 9] M.A. Xapsos et al., Model for Cumulative Solar Heavy Ion Energy and Linear Energy Transfer Spectra, IEEE Trans. Nuc. Sci., 54(6), 2007.
⊕ [RD 10] J. Feynman et al., New Interplanetary Proton Fluence Model, J. Spacecraft & Rockets, 27(4), 1990.
⊕ [RD 11] J. Feynman et al., Interplanetary Proton Fluence Model: JPL 1991, J. Geophys. Res., 98(A8), 1993.
⊕ [RD 12] J. Feynman et al., The JPL proton fluence model: an update, J. Atm. STP, 64, 2002.
⊕ [RD 13] I. Jun et al., Statistics of solar energetic particle events: Fluences, durations, and time intervals, Adv. Space Res., 40, 2007.
⊕ [RD 14] R.A. Nymmik, Improved Environment Radiation Models, Adv. Space Res., 40, 2007.
⊕ [RD 15] P. Jiggens et al., ESA SEPEM Project: Peak Flux and Fluence Model, IEEE Trans. Nuc. Sci., 59(4), 2012.
[RD 16] P. Jiggens et al., Long-Term Destructive SEE Risk and Calculations Using Multiple “Worst-case” Events Versus Modelling, IEEE Trans. Nuc. Sci., 61(4), 2014.
⊕ [RD 17] International Standard ISO-15390, Space environment (natural and artificial) — Galactic cosmic ray model, 2004.
⊕ [RD 18] G.D. Badhwar & P.M. O’Neill, Galactic Cosmic Radiation Model and its Applications, Adv. Space Res., 17(2), 1996.
⊕ [RD 19] P.M. O’Neill, Badhwar–O’Neill 2010 Galactic Cosmic Ray Flux Model – Revised, IEEE Trans. Nuc. Sci., 57(6), 2010.
⊕ [RD 20] P.M. O’Neill et al., Badhwar - O'Neill 2014 Galactic Cosmic Ray Flux Model Description, NASA/TP-2015-218569, 2016.
⊕ [RD 21] Matthiä et al., A ready-to-use galactic cosmic ray model, Adv. Space Res., 51, 2013.
⊕ [RD 22] D.F. Smart and M.F. Shea, A review of geomagnetic cutoff rigidities for earth-orbiting spacecraft, Adv. Space Res., 36, pg. 2012-2020, 2005.

Total Ionizing Dose

keV–MeV electrons, keV–GeV protons, GCR ions

Team Leads: I. Jun, T. Guild, M. Xapsos
Participants: Anastasios Anastasiadis · Michael Balikhin* · Yaireska Collado-Vega* · Natalia Ganushkina* · Tim Guild · Insoo Jun* · Daniel Matthiä* · Joseph Minow* · Alexis Rouillard · W. Kent Tobiska* · Brian Walsh · Michael Xapsos* · Yihua Zheng* ·
Laila Andersson · Steven Brown* · Janessa Buhler* · David Falconer · Manolis K. Georgoulis* · Daniel Heynderickx* · Masha Kuznetsova* · Kangjin Lee · Leila Mays* · James McCollough · Collin Meierbachtol* · Steven Morley* · Naoto Nishizuka · Athanasios Papaioannou · Larry Paxton* · Steve Petrinec* · Dave Pitchford · Lutz Rastaetter* · Howard Singer* · KiChang Yoon ·
Communications: ccmc-total-dose@googlegroups.com (mailing list)

Google drive: documents for radiation and plasma effects teams

User Needs
User groups
⊕ Satellite designer (SD) for both commercial and government
⊕ Satellite operators and anomaly analysts (SOAA) for both commercial and government
⊕ Scientists (SCI) for both academia and government

Satellite design:
Need average spectrum along mission orbit/trajectory.

Satellite anomaly resolution:
Need the electron and (trapped+solar) proton spectrum along the mission orbit for the few days leading up to an anomaly. Also need this information over the history of the mission to get a sense of whether the time of the anomaly was especially severe.

Working Team Goals

⊕ Define a metric or metrics that can be used to validate model outputs (in terms of total dose) with observations.
⊕ Then, the CCMC-LWS program would develop short-term or long-term plans to provide desired flight data to the community.

Working Team Deliverables

⊕ Define a metric or metrics that can be used to validate model outputs (in terms of total dose) with observations.
⊕ Then, the CCMC-LWS program would develop short-term or long-term plans to provide desired flight data to the community.

Physical Quantities and Metrics for Model Validation

Typical Model Outputs:
⊕ Particle spectra (flux and fluence)
⊕ Dose-depth curves
Identify metrics for each user group
SD: Dose-depth for the mission
SOAA: Dose-depth from launch to given time
SCI: proton and electron energy spectra
i. Electrons for >100 keV
ii. Protons for > 1 MeV

Error metrics to consider:
⊕ Long-term variability of trapped particles (data-based statistical analysis?) at different locations within planetary magnetosphere(s)
⊕ Most of the error metrics used for Internal Charging Working Group seem to apply here, too.
⊕ Statistics of solar proton environment: fluence spectra
⊕ Model predicted dose vs. measured dose (e.g., CRaTER instrument)
⊕ Measured solar proton event spectra for individual events vs. physical model prediction
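The model-predicted vs. measured dose comparison above reduces to interpolating a modeled dose-depth curve at the dosimeter shielding depths; interpolating dose in log space is a common choice and an assumption here, and the function name is illustrative:

```python
import numpy as np

def dose_ratio_at_depths(model_depths, model_dose, dosimeter_depths, measured_dose):
    """Compare a modeled dose-depth curve with dosimeter readings.

    Interpolates the model curve (log-dose vs. shielding depth) at
    each dosimeter depth and returns model/measured dose ratios.
    Depths must share one unit (e.g., g/cm^2 or mils Al); dose one
    unit (e.g., rad or Gy)."""
    log_model = np.interp(dosimeter_depths, model_depths, np.log(model_dose))
    return np.exp(log_model) / np.asarray(measured_dose, dtype=float)
```

Ratios systematically above 1 at all depths would point to an over-predicting environment model rather than a transport-code issue, which tends to distort the shape of the curve instead.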

Possibly Available Data Sources

⊕ Particle spectra (integral or differential) data from particle detector(s) at selected energies
⊕ Total dose data from dosimeter(s) at certain shielding depth(s)

Participating Models

Identify empirical models for each metric
⊕ Trapped: AE8/AP8; AE9/AP9/SPM; IGE2006/POLE (other older models are also available, e.g., CRRESELE, CRRESPRO)
⊕ Solar: King; JPL; ESP/PSYCHIC; SAPPHIRE
Identify physical model for each metric
⊕ Trapped: SALAMMBO; DREAM
⊕ Solar: SOLPENCO

List of Time Intervals in this Study

TBD: coordinate with other subtopic areas.

Radiation effects for aviation

Team Leads: K. Tobiska, M. Meier (email team leads/forum organizers to be added to the team)
Participants: Yaireska Collado-Vega* · Erwin De Donder · Natalia Ganushkina* · Manolis K. Georgoulis* · Tim Guild · Alex Hands · Piers Jiggens* · Adam Kellerman* · Daniel Matthiä* · Matthias Meier* · Paul O'Brien* · Athanasios Papaioannou · W. Kent Tobiska* · Brian Walsh · KiChang Yoon · Yihua Zheng* ·
Steven Brown* · Janessa Buhler* · Consuelo Cid · Yuki Kubo · Masha Kuznetsova* · Kangjin Lee · Chigomezyo Ngwira* · Naoto Nishizuka · Alexis Rouillard · Howard Singer* · Chunming Wang* · Katherine Winters* ·
Communications: ccmc-aviation@googlegroups.com (mailing list)

Google drive: documents for radiation and plasma effects teams

User Needs
How can we specify risk in a way that is easy for the public to understand? Are there new indices, such as the D-index, that could be implemented? What are the most useful features to show in dose rate graphics? Are global and/or baseline effective dose maps useful for air traffic management?

Working Team Goals
How can we compare science models and data with common metrics?

Working Team Deliverables
We would find utility in having GCR solar-minimum, moderate, and solar-maximum spectral baselines, at least through Z=26 (iron). SEP spectral baselines for at least protons and alphas during moderate, large, and worst-case events are desired. Cutoff rigidity (Rc) baseline maps for multiple levels of geomagnetic activity can be considered. Best practices to quantify and report error sources in data sets are important.

Physical Quantities and Metrics for Model Validation
The reporting of radiation data and the details of the factors used in their calculation should be considered. These include the quality factor Q, absorbed dose in Si, ambient dose equivalent, and effective dose.
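The relationships among these quantities can be sketched minimally; the helpers below encode only the generic Q-weighting of absorbed dose and the tissue-weighted sum for effective dose, with all weights supplied by the caller (no specific ICRP coefficient values are assumed, and operational quantities such as ambient dose equivalent additionally fix a measurement geometry that transport codes, not these helpers, provide):

```python
def dose_equivalent_uSv(absorbed_dose_uGy, mean_quality_factor):
    """Dose equivalent H = Q * D: absorbed dose (uGy) scaled by the
    mean radiation quality factor Q, giving uSv."""
    return absorbed_dose_uGy * mean_quality_factor

def effective_dose(tissue_equivalent_doses, tissue_weights):
    """Effective dose E = sum over tissues T of w_T * H_T, where the
    tissue weighting factors w_T must sum to 1 (values are
    caller-supplied, e.g., from an ICRP publication)."""
    if abs(sum(tissue_weights) - 1.0) > 1e-9:
        raise ValueError("tissue weights must sum to 1")
    return sum(w * h for w, h in zip(tissue_weights, tissue_equivalent_doses))
```

This makes explicit why reported numbers differ between instruments: two dosimeters can agree on absorbed dose in Si yet report different dose equivalents if they assume different mean quality factors for the mixed field.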

Available Data Sources
Observations, impact information
TEPC, Liulin, Raysure, ARMAS, bubble detectors

Participating Models
CARI-7, NAIRAS, and PANDOCA at the minimum.

List of Time Intervals in this Study
TBD


INFORMATION ARCHITECTURE

Information Architecture for Interactive Archives (IAIA)

Leads: C. Wiegand, D. Heynderickx, D. De Zeeuw, T. King (email team leads to be added to the team)
Participants: A. Pembroke (CCMC), L. Rastaetter (CCMC), R. Mullinix (CCMC), J. Boblitt (CCMC), A. Roberts (NASA/GSFC), S. Fung (NASA/GSFC), A. Kochanov (KU Leuven), Stijn Calders (BIRA-IASB), R. Calfas (U. Texas), T. Al-Ubaidi (IWF), M. Khodachenko (IWF) (full list)
Communication: ccmc-iaia@googlegroups.com (mailing list), SLACK: ccmc-collab.slack.com (email to join our channel)

Mission Statement

Facilitate the development of a global network of distributed web-based resources for the purpose of model-data comparison.

Focused Area:

  • Metadata standards/data model to describe both observation and model data (implementation of SPASE with IMPEx extension)
  • Data discovery and access via implemented API
  • Next generation interpolation libraries/approach

Meetings/Workshops

  • ICCMC-LWS Workshop: IAIA sessions
    • CCMC-LWS Workshop 2017
      Goals going into the meeting
      Deliverables prior to the meeting
      Agenda for the meeting: IAIA Sessions Agenda
      List of available presentations: IAIA Sessions Presentations
      IAIA Summary Slides/Progress Report [PDF]
      • Describe a complex model chain using the current SPASE meta-data model
      • Explore how observation data, model data, and their associated meta-data can be stored in a centralized database for the purpose of model-data comparison (prototype of the Metrics&Validation, MAV, framework developed at CCMC)
      • Explore how the SPASE meta-data model can be used in combination with other meta-data schemas for additional post-processing tasks, such as interpolation
      • Describe the Space Weather Modeling Framework (SWMF SWPC test version) using the SPASE meta-data model
      • Obtain 2 or 3 datasets (observation and model) from other working teams to be stored in the new MAV database
      • Extend SPASE model description to reference external meta-data schema for post-processing, which is required for interpolation. As an example, show how to reference a schema for interpolating a CCMC model using the Advanced Scientific Data Format (ASDF). ASDF is a hybrid text and binary format designed for extensibility and human-readable metadata
      • Tutorials/tools for working with SPASE metadata. These should be made publicly available, to be used in splinter groups
  • 2018 CCMC Workshop: IAIA Focus Session
    • Information on the workshop: CCMC Workshop 2018 Main Page
      Agenda for the IAIA Session on Friday, April 27, 2018
      Organizers for the session: Chiu Wiegand, Aaron Roberts
      List of available presentations
      • Tools, Libraries, and Framework for Model Validation
      • Comprehensive Assessment of Models and Events Based on Library tool (CAMEL) framework
      • Metadata standards: Their use cases as well as interoperability and compatibility
      • NASA Heliophysics Data Environment (HPDE) progress
      • Space Physics Archive Search and Extract (SPASE)
      • Using SPASE for models at CCMC
      • ISO standards
      • Discussion
      • Connecting Different Data Centers and Modeling Centers, Data Access, Data Distribution
      • Virtual Modeling Repository (VMR) migrated to CCMC
      • Heliophysics Application Programming Interface (HAPI)
      • iSWA HAPI implementation
      • ESA's SPace ENVironment Information System (SPENVIS)
      • CCMC collaboration with SPENVIS
      • EASPAS: the European e-science platform to access near-Earth space data
      • Discussion
      • Global Network of Interconnected Databases
      • VOEvent
      • An Interactive Multi-Instrument Database of Solar Flares
      • Space Weather Database Of Notifications, Knowledge, Information (DONKI)
      • Discussion

Resources
Space Physics Archive Search and Extract (SPASE)
Integrated Medium for Planetary Exploration (IMPEx)
Advanced Scientific Data Format (ASDF)