International Forum for Space Weather Capabilities Assessment


SEP Working Team

Team Leads: I.G. Richardson, P. Quinn, M. Marsh, M.L. Mays
Scoreboard Leads: M. Dierckxsens, M. Marsh

Communications: ccmc-sep-team@googlegroups.com (mailing list); Slack: ccmc-collab.slack.com (email the leads to join our Slack discussion channels)
Participants: Anastasios Anastasiadis · Yaireska Collado-Vega* · Maher Dayeh · Mark Dierckxsens* · David Falconer · Bernd Heber · Daniel Heynderickx* · Lan Jian · Piers Jiggens* · Noé Lugaz · Periasamy K Manoharan* · Mike Marsh · Daniel Matthiä* · Leila Mays* · Mike McAleenan* · Joseph Minow* · Karin Muglach* · Marlon Nunez · Paul O'Brien* · Dusan Odstrcil* · Mathew Owens · Athanasios Papaioannou · Philip Quinn* · Ian Richardson* · Pete Riley · Alexis Rouillard · Howard Singer* · Robert Steenburgh* · Angelos Vourlidas · Katherine Winters* · KiChang Yoon · Yihua Zheng*
Michael Balikhin* · Francois-Xavier Bocquet · Steven Brown* · Baptiste Cecconi · Consuelo Cid · Craig DeForest* · Natalia Ganushkina* · Manolis K. Georgoulis* · Carl Henney · Alexander Kosovichev* · Yuki Kubo · Masha Kuznetsova* · Kangjin Lee · Janet Luhmann · Slava Merkin* · Marilena Mierla · Teresa Nieves · Steve Petrinec* · Manuela Temmer · Barbara Thompson* · W. Kent Tobiska* · Karlheinz Trattner* · Rodney Viereck · Jie Zhang

*attending CCMC-LWS working meeting

Latest News
➡ April 2017 working meeting: team agenda | solar/heliosphere agenda | full agenda


User Needs
🚧 under construction
What are the requirements of different users, e.g., satellite operators, aviation, and ground-based services?
  ⊕  What can or should be forecast? Current models mostly focus on predicting the GOES >10 MeV proton flux, but certain users may be interested in higher energies or heavier ions (see the threshold sketch after this list).
  ⊕  Probabilistic forecasts for 24-hour intervals 1-3 days ahead.
  ⊕  Retrospective estimates of SEP effects are needed for aviation.
  ⊕  Models that are mature enough to support actionable forecasts.
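
As a concrete illustration of the event/no-event framing above, here is a minimal Python sketch that flags an SEP event onset in a GOES >10 MeV integral proton flux series using the NOAA proton-event definition (flux at or above 10 pfu for three consecutive 5-minute readings); the function name and flux samples are hypothetical, not a real data feed.

    # Minimal sketch: detect an SEP event onset in a >10 MeV integral
    # proton flux series. NOAA declares a proton event when the flux
    # stays at or above 10 pfu for three consecutive 5-minute readings.
    PFU_THRESHOLD = 10.0  # pfu = protons cm^-2 s^-1 sr^-1

    def sep_event_onset(flux, n_consecutive=3):
        """Index of the first reading in a run of n_consecutive values
        at or above PFU_THRESHOLD, or None if no event occurs."""
        run = 0
        for i, value in enumerate(flux):
            run = run + 1 if value >= PFU_THRESHOLD else 0
            if run == n_consecutive:
                return i - n_consecutive + 1
        return None

    # Hypothetical 5-minute samples (pfu); the onset is at index 4.
    flux = [0.3, 0.4, 2.1, 9.8, 12.0, 15.5, 30.2, 25.0]
    print(sep_event_onset(flux))  # -> 4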


Working Team Goals
  ⊕  Evaluate how well different models/techniques can predict historical SEP events throughout the heliosphere.
  ⊕  Establish metrics agreed upon by the community.
  ⊕  Provide a benchmark against which future models can be assessed.
  ⊕  Complement the SEP Scoreboard activity, whose goal is to collect and display real-time SEP predictions and ultimately facilitate validation.


Working Team Deliverables
  ⊕  Catalog of metrics, including how they address specific questions, and relate to user needs.
  ⊕  Selection of time intervals based on the chosen questions.
  ⊕  Model assessments for selected time intervals (case studies).
  ⊕  Online database of model outputs with complete input/output metadata, and observations (in collaboration with the Information Architecture for Interactive Archives (IAIA) working team).
  ⊕  Publication describing model assessment results and summarizing where we stand with SEP prediction.

By April 2017: deliver the second and third items. We will discuss the first item during the April 2017 workshop, but not yet produce a catalog. Collaborate with the 'Assessment of Understanding and Quantifying Progress Toward Science Understanding and Operational Readiness' working team.


Potential Questions to Address
Where do we stand with SEP prediction?
  ⊕  Using case studies of selected events, can we assess where we stand with SEP prediction?
  ⊕  How can the success of different SEP models be compared? Is it possible to identify a uniform metric?
  ⊕  How do we move beyond using case studies for model/data comparisons?


Possible Physical Quantities and Metrics for Model Validation
  ⊕  Categorical (yes/no) predictions:
      ⊖  Skill scores based on contingency tables;
      ⊖  SEP event vs. no event (including all-clear) predictions;
      ⊖  Probabilistic and continuous predictions that can be converted to categorical predictions using specified thresholds.
  ⊕  Continuous predictions (e.g., SEP intensity-time profiles):
      ⊖  SEP event parameters (e.g., onset time, peak intensity, peak time, evolution of energy spectrum and anisotropy).
      ⊖  Metrics, e.g., correlation plots, correlation coefficient, mean error, …
  ⊕  Probabilistic predictions:
      ⊖  Reliability diagram, Brier Skill Score, ...
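
To make the three classes concrete, the sketch below computes one common metric of each kind in Python. The function names are ours, and the Heidke formulation and the climatological Brier reference are standard textbook choices rather than the team's agreed definitions; note that thresholding (e.g., probability >= 0.5, or flux >= 10 pfu) is how a probabilistic or continuous prediction becomes categorical.

    # Sketches of the three metric classes above (illustrative only).
    import numpy as np

    def categorical_scores(hits, misses, false_alarms, correct_negatives):
        """Common skill scores from a 2x2 contingency table."""
        n = hits + misses + false_alarms + correct_negatives
        pod = hits / (hits + misses)                # probability of detection
        far = false_alarms / (hits + false_alarms)  # false alarm ratio
        # Heidke Skill Score: accuracy relative to random chance.
        expected = ((hits + misses) * (hits + false_alarms) +
                    (misses + correct_negatives) *
                    (false_alarms + correct_negatives)) / n
        hss = (hits + correct_negatives - expected) / (n - expected)
        return pod, far, hss

    def continuous_scores(predicted, observed):
        """Correlation coefficient and mean error, computed in log space
        because SEP peak intensities span several decades."""
        p, o = np.log10(predicted), np.log10(observed)
        return np.corrcoef(p, o)[0, 1], np.mean(p - o)

    def brier_skill_score(probabilities, outcomes):
        """Brier Skill Score relative to a climatological base rate."""
        prob = np.asarray(probabilities, float)
        obs = np.asarray(outcomes, float)      # 1 = event occurred, 0 = not
        bs = np.mean((prob - obs) ** 2)
        bs_ref = np.mean((obs.mean() - obs) ** 2)  # climatology forecast
        return 1.0 - bs / bs_ref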


Observations
🚧 under construction


List of Time Intervals in this Study — under discussion
🚧 under construction
  ⊕  Proposed: October 2011 – June 2012
Considerations:
  ⊕  Consider a training set, validation set, and test set; the test set would not be revealed until a later stage, and model-parameter tweaking against it would not be allowed (see the split sketch after this list).
  ⊕  Time period should have some overlap with the Radiation & Plasma Effects working teams: SEEs, Total Dose, Radiation effects for aviation.
  ⊕  Time period should include STEREO (to have good CME and SEP observations).
  ⊕  Initially focus on SEP events at Earth, later expand to multi-spacecraft event periods.
  ⊕  Have a mix of events: ones that cross certain thresholds, and ones that do not.
  ⊕  What energy range should be used to select events?
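
Below is a minimal Python sketch of the held-back test-set idea, assuming candidate event intervals have already been selected; the split fractions and event windows are illustrative, not an agreed event list.

    # Sketch: split time-ordered event intervals into training,
    # validation, and held-back test subsets (fractions are illustrative).
    def chronological_split(intervals, train_frac=0.6, val_frac=0.2):
        intervals = sorted(intervals)           # oldest first
        n_train = int(len(intervals) * train_frac)
        n_val = int(len(intervals) * val_frac)
        train = intervals[:n_train]
        val = intervals[n_train:n_train + n_val]
        test = intervals[n_train + n_val:]      # revealed only at the end
        return train, val, test

    # Hypothetical event windows within the proposed Oct 2011 - Jun 2012 span:
    events = [("2011-10-04", "2011-10-06"), ("2011-11-26", "2011-11-28"),
              ("2012-01-23", "2012-01-27"), ("2012-03-07", "2012-03-11"),
              ("2012-05-17", "2012-05-19")]
    train, val, test = chronological_split(events)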



Currently Participating Models
  ⊕  COMESEP SEPForecast (COronal Mass Ejections and Solar Energetic Particles SEPForecast)
  ⊕  FORSPEF (Forecasting Solar Particle Events and Flares)
  ⊕  RELeASE (Relativistic Electron Alert System for Exploration)
  ⊕  PREDICCS (Predictions of radiation from REleASE, EMMREM, and Data Incorporating CRaTER, COSTEP, and other SEP measurements)
  ⊕  SPARX
  ⊕  UMASEP (University of Malaga Solar energetic proton Event Predictor)
  ⊕  AER SEP Model

  ⇒ Model/technique registration form


Resources and Past Progress
🚧 under construction
  ⊕  SEP Models in the Community and Literature (compiled by Mike Marsh)
  ⊕  Scoreboard discussion at ESWW13 in November 2016
  ⊕  SEP Scoreboard

<< Return to the forum homepage
