International Forum for Space Weather Capabilities Assessment


Model evaluation guide for working teams
Below is a suggested guide for initial model evaluations of the first handful of events. This list is just a starting point to help working teams begin defining goals, tasks, and discussion items.
  • Determine the physical parameters for model-data comparison.
  • Determine the observational data sources.
      ○  Prepare SPASE metadata for the observations.
      ○  The CCMC and the Data Access Working Team will provide support with metadata generation.
  • Determine the time intervals or event lists.
      ○  E.g., storms plus the preceding quiet-time periods. Include a number of events, including some of long duration.
  • Identify sources of uncertainty.
      ○  E.g., external drivers, internal model assumptions, and spatial and temporal resolution.
      ○  Suggest an approach to quantify those uncertainties (see the ensemble-spread sketch following this list).
  • Determine a set of metrics relevant for specific applications, user needs, and science needs.
      ○  Different forecasting skill scores capture different aspects of a model's or forecasting technique's performance (e.g., uncertainty, accuracy, reliability, and temporal and spatial behavior).
      ○  A "message to the user" for each forecasting skill score is desirable, making clear what useful information can be derived from the current state of space environment modeling (see the skill-score sketch following this list).
  • Invite modelers to submit their results for the selected events.
      ○  At least one model is needed for the initial assessment.
  • Perform the model assessment for a handful of events with at least one model.
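
One possible way to address the uncertainty item above is an ensemble approach: rerun a model many times with perturbed drivers or internal settings and use the member-to-member spread as the uncertainty estimate. The Python sketch below is illustrative only; the array shapes, variable names, Dst-like example quantity, and synthetic data are assumptions for illustration, not CCMC conventions.

import numpy as np

def ensemble_uncertainty(ensemble_runs):
    """ensemble_runs: array of shape (n_members, n_times) holding one
    modeled parameter (here imagined as Dst in nT) from each perturbed run."""
    runs = np.asarray(ensemble_runs, dtype=float)
    mean = runs.mean(axis=0)                       # ensemble-mean time series
    spread = runs.std(axis=0, ddof=1)              # member-to-member standard deviation
    lo, hi = np.percentile(runs, [5, 95], axis=0)  # 90% spread envelope
    return mean, spread, lo, hi

# Synthetic example: 20 ensemble members, 48 hourly values
rng = np.random.default_rng(0)
members = -50 + 2 * rng.standard_normal((20, 48)).cumsum(axis=1)
mean, spread, lo, hi = ensemble_uncertainty(members)
print(f"largest ensemble spread: {spread.max():.1f} nT")

The spread (or the 5th-95th percentile envelope) can then be reported alongside the forecast itself as a first-order uncertainty estimate.
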
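For the metrics item, the sketch below illustrates a few widely used scores for model-observation comparison: root-mean-square error, linear correlation, prediction efficiency, and an event-based Heidke skill score. The -50 nT event threshold, the Dst-like synthetic data, and all variable names are arbitrary assumptions for illustration, not a metric set prescribed by the forum.

import numpy as np

def continuous_scores(model, obs):
    """Point-by-point scores for a continuous parameter."""
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    rmse = np.sqrt(np.mean((model - obs) ** 2))
    corr = np.corrcoef(model, obs)[0, 1]
    # Prediction efficiency: 1 is perfect, 0 is no better than the observed
    # mean, negative is worse than the mean.
    pe = 1.0 - np.sum((model - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)
    return rmse, corr, pe

def heidke_skill_score(model, obs, threshold):
    """Event-based skill: an 'event' is any value below the threshold,
    as would be appropriate for a storm index such as Dst."""
    m_event = np.asarray(model) < threshold
    o_event = np.asarray(obs) < threshold
    hits = np.sum(m_event & o_event)
    misses = np.sum(~m_event & o_event)
    false_alarms = np.sum(m_event & ~o_event)
    correct_neg = np.sum(~m_event & ~o_event)
    n = hits + misses + false_alarms + correct_neg
    expected = ((hits + misses) * (hits + false_alarms)
                + (correct_neg + misses) * (correct_neg + false_alarms)) / n
    return (hits + correct_neg - expected) / (n - expected)

# Synthetic example: an "observation" and an imperfect "model"
rng = np.random.default_rng(1)
obs = -40 + 15 * rng.standard_normal(200)
model = obs + 10 * rng.standard_normal(200)
rmse, corr, pe = continuous_scores(model, obs)
hss = heidke_skill_score(model, obs, threshold=-50.0)
print(f"RMSE={rmse:.1f} nT  r={corr:.2f}  PE={pe:.2f}  HSS={hss:.2f}")

Each score could then be paired with a short "message to the user", e.g. a statement of how reliably the model detects storm-level events (HSS) versus how well it reproduces their amplitude (RMSE, prediction efficiency).
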
After the initial validation has been completed, the working team may continue to refine the metrics and techniques and to invite additional modelers and data providers for a comprehensive validation that may be written up in a paper by the team. See the Working Team Expected Deliverables page for a description of the expected deliverables from the working teams.
