Author's Note: I have had the good fortune to organize the triennial Ionospheric Effects Symposia since 1975, and the event in 2002 included a special forum on "Ionospheric Models - Current and Future". The Forum was ably chaired by Dr. Ken Davies of NOAA (retired) and Dr. Anthea Coster (MIT-Lincoln Lab, now MIT-Haystack Observatory). The panel co-chairs developed a full summary that is contained in the Proceedings of IES2002. It turned out to be a rather interesting forum, and I felt it would be useful to provide a short synopsis of the summary report written by Davies and Coster. The recorders were Greg Bishop and Patricia Doherty. The panelists included: John Seago (user needs), Tim Fuller-Rowell (storm models and metrics), Jan Sojka (new data and data quality), Dieter Bilitza (empirical models), Terence Bullett (data sources), and Brian Wilson (TEC models). The discussions were freewheeling, and I would like to express my appreciation to Jack Klobuchar for keeping the forum lively.
I have occasionally entered some comments within the text, identified as "Editorial Note", to distinguish my personal views or additions from those of Davies, Coster, and the various contributors at the forum. Finally, it was regrettable that the user community was not as well represented at the forum as would have been preferred. This is always a problem, and needs to be rectified.
Summary Report on Ionospheric Models - Current and Future
By: Ken Davies and Anthea Coster (as edited by John Goodman)
John Seago (Honeywell Technology Solutions, Inc.) represented himself as a "novice user", or an individual who is aware of the need to account for ionospheric effects, but lacks the formal training or experience in modeling approaches and techniques to apply them. Mr. Seago recommended that models and documentation be accessible on-line, and be completely self-contained, with software installation instructions, necessary data files, version numbers, and points of contact provided. He also suggested on-line information should include test cases and estimates of model uncertainties.
Tim Fuller-Rowell (NOAA-SEC) discussed the difficulties in modeling the ionosphere under storm conditions, which he regarded as comprising both climatological features and weather features. The climatological features are repeatable from storm to storm, while the differences between storms are properly described as weather. He lamented that the physics of storm phenomena is not yet completely understood. Fuller-Rowell reported that the IRI2000 model, which captures solar cycle, seasonal, geographical, and local time variations, now includes magnetic activity variations (through the STORM model). He remarked that the incorporation of magnetic activity, while an improvement, still has problems at low and high latitudes, and at middle latitudes in wintertime. Metrics are still needed, according to Fuller-Rowell, to quantify the amount of variability captured by any specified model and to determine its utility in any space weather application.
Observations and Data Issues
Jan Sojka (Space Environment Corporation) discussed data quality and specified the following sources of error:
• Statistical (Gaussian noise)
• Instrumental effects
• Representation of data vis-a-vis the model targeted
• Absolute or offset values
Sojka expressed the concern that data assimilation models, which require real-time data streaming, are subject to the problem that the required data sets might not be properly reviewed. He suggested improved handling of data in order to produce results that assimilation models can effectively use. He also noted that there are some constraints on error distributions; for example, the Kalman filters used in data assimilation require a Gaussian distribution for errors. In the area of new observations, Sojka reflected on two experiments flying on the ARGOS satellite: (i) the Low Resolution Airglow and Aurora Spectrograph (LORAAS) and (ii) the High Resolution Airglow and Aurora Spectroscopy (HIRASS). LORAAS should provide data useful in validating the electron density (Ne) in models; HIRASS should provide data useful for validating thermospheric models. This is important since some ionospheric models have an embedded thermospheric model.
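Sojka's point about error distributions can be made concrete with a scalar Kalman measurement update. The sketch below is illustrative only (real assimilation schemes such as GAIM carry large state vectors and covariance matrices); the numbers are invented, and the optimality of the update rests on both the model and observation errors being Gaussian.

```python
def kalman_update(x_prior, var_prior, z_obs, var_obs):
    """Combine a model prior and an observation, both assumed to carry
    Gaussian errors; this assumption is what makes the update optimal."""
    gain = var_prior / (var_prior + var_obs)      # Kalman gain
    x_post = x_prior + gain * (z_obs - x_prior)   # updated estimate
    var_post = (1.0 - gain) * var_prior           # reduced uncertainty
    return x_post, var_post

# Hypothetical example: a model predicts a peak density of 1.0e12 el/m^3
# with variance 4e22; an observation gives 1.2e12 with variance 1e22.
x, v = kalman_update(1.0e12, 4.0e22, 1.2e12, 1.0e22)
print(x, v)  # posterior lies between prior and observation, nearer the
             # more certain observation, with smaller variance than either
```

If the real error distributions are heavy-tailed or biased rather than Gaussian, this weighted average is no longer optimal, which is why Sojka stressed data review before assimilation.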
Dieter Bilitza discussed the fact that empirical models are based upon long records of measured data. He noted the obvious bias of the empirical models toward those areas where more data have been accumulated (i.e., Northern Hemisphere middle latitudes). The ocean areas are especially under-represented, as are the equatorial and high latitude regions. Still, these empirical models have many applications, and now have the capability to be updated in real time. Bilitza discussed the attributes of the current version of the International Reference Ionosphere, IRI2001. This version incorporates information on the D region, the bottomside of the F1 region, the F region peak, electron temperature, and equatorial vertical ion drift. Improvements are considered continually. Specifically, the following improvements are planned: the F2 peak height, spread F, sporadic E, and a quantitative examination of variability in terms of a monthly standard deviation. Bilitza pointed out that a number of organizations support empirical modeling, including COSPAR, URSI, and the ITU. Moreover, the International Organization for Standardization (ISO) is in the process of registering standard models for the Earth's environment, including the ionosphere. Bilitza referred the audience to the IRI home page.
Terence Bullett (AFRL) identified a number of on-line sources of data. The categories he considered were: (a) radio remote sensing (e.g., incoherent scatter), (b) ionosondes, (c) topside sounders, (d) coherent scatter radar (i.e., SuperDARN), (e) GPS networks, (f) tomography, (g) occultation, (h) satellite UV, and (i) in-situ probes. Bullett stressed ionosonde data in his presentation. He noted that recent investment in the international network of sounders has largely focused on data availability and timeliness rather than on quality. While there is an enormous amount of sounder data available, the issues of data accuracy and latency need to be recognized. Bullett addressed the need for model validation, and suggested that data providers make raw data available.
Future of Ionospheric Modeling
Brian Wilson (Jet Propulsion Laboratory) discussed new data types including:
• Additional GPS stations (within IGS and CORS)
• New occultation data sets (current satellites plus COSMIC constellation)
Wilson illustrated strides made in the area of TEC mapping. Daily maps of TEC are available, and a number of groups are moving to the production of hourly maps. JPL's global mapping scheme, GIM, has been validated using TOPEX data as ground truth. Some limitations in the low latitude region have been documented. Differences (in TEC units) between GIM and TOPEX were ~ 3-5 at middle latitudes and 5-10 at low latitudes. The JPL GENESIS web site is a location to obtain occultation data, including data from CHAMP and SAC-C, and also from IOX, GRACE, and COSMIC (when available). The GIM TEC maps can be found on the Web. Wilson took the opportunity to comment on the GAIM work being carried out by JPL and the University of Southern California. The following data types can be assimilated: (i) GPS-TEC data, (ii) relative TEC from occultation, (iii) ionospheric data from ionosondes, (iv) in-situ data from DMSP, and (v) nighttime UV limb scans. Other data types are planned.
Ludger Scherliess (Utah State University) discussed the purpose of data assimilation modeling. He mentioned that the goal is to combine models and data optimally. Models illustrate our knowledge of the physics and data provide information about the true state of the ionosphere. The problem is that both models and data are imperfect, since models require parameterization and data contain observation errors. Scherliess itemized the data subject to assimilation, and expressed the view that millions of data measurements will be available in the next decade, and that physics-based data assimilation models will provide real-time snapshots of the ionospheric behavior.
Kent Tobiska (Space Environment Technologies) discussed current and future data and measurements, and various models. He identified the most recent EUV model, SOLAR2000, which is now operational. Tobiska noted that nowcast irradiances are produced for use in solar monitoring; and forecasts are being made for operational systems.
Jim Secan (NorthWest Research Associates) discussed the wideband model WBMOD. This model is a semi-empirical model of the effects of F-layer plasma density irregularities on trans-ionospheric radio propagation. Included in the model is a propagation component that predicts the S4 index (i.e., the normalized RMS variation in signal power), the phase scintillation parameter, and the probability distribution function. Secan mentioned some of the shortcomings of the model, mostly the result of data source limitations, and he explained that additional data sets would be useful. He indicated that attempts would be made to make the model available to the wider user community, and suggested that more information could be found on the NWRA web site.
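The S4 index mentioned above has a compact definition: the standard deviation of received signal intensity normalized by its mean. A minimal sketch follows; the synthetic fading samples are invented purely for illustration and do not represent WBMOD output.

```python
import math
import random

def s4_index(intensity):
    """S4 scintillation index from signal intensity samples:
    S4^2 = (<I^2> - <I>^2) / <I>^2  (normalized variance of intensity)."""
    n = len(intensity)
    mean_i = sum(intensity) / n
    mean_i2 = sum(p * p for p in intensity) / n
    return math.sqrt(max(mean_i2 - mean_i ** 2, 0.0)) / mean_i

# Synthetic weak-fading example: small Gaussian fluctuations about a
# unit-power carrier (illustrative only).
random.seed(1)
samples = [1.0 + 0.2 * random.gauss(0.0, 1.0) for _ in range(10000)]
print(round(s4_index(samples), 2))  # close to 0.2 for this weak-fading case
```

Strong scintillation drives S4 toward (and, for Rayleigh fading, to) 1, which is why the index is a convenient single-number severity measure for trans-ionospheric links.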
Santimay Basu (AFRL) described plans for a weather model for scintillation based upon data from the C/NOFS system. This is the Communication/Navigation Outage Forecasting System having the ability to convert observations of ionospheric turbulence to scintillation parameters. Additional information on C/NOFS can be obtained at several web sites.
The discussion began with some views from Jack Klobuchar (Innovative Solutions International), who served as a devil's advocate at the forum. The more relevant comments were as follows:
1) Certain applications, such as the NAVSPASUR system, require electron density profiles and not TEC.
2) There are limitations to some of the data sets used in data assimilative modeling.
3) There are limitations in data from the CORS network in that they are not well-calibrated or uniform, multiple receiver types are included in the data sets, and it is difficult for the user to determine the biases for the receivers.
4) The WAAS system is currently available for the CONUS region, and does not suffer from many of the issues that plague the CORS network. Why does the community of data assimilation modelers not use this resource?
5) It was recommended that the STORM model development should be suppressed at low latitudes in favor of ascertaining the effect of vertical drifts on the ionosphere in quiet times, and specifically longitudinal differences. At the least, this information should be evaluated prior to modeling the magnetic storm effects.
6) Daily EUV flux has little correlation with daily TEC, and it is of questionable value in modeling the TEC.
7) The WBMOD scintillation model suffers from limited source data from a few stations. (Editorial Note: Jim Secan of NWRA generally agrees with the view that WBMOD can be improved with additional data sets; see comments from Secan above.)
There were responses to these views. With respect to #3, Anthea Coster (MIT Lincoln Laboratory, currently MIT Haystack) defended the usefulness and quality of the CORS data set since she had exploited the data successfully in several applications. She also noted its availability on the Internet. Brian Wilson (JPL) also defended the CORS data, and also noted that TOPEX data is used to validate the GIM data, and to determine structures in the equatorial region.
With respect to comment #4, Bob Schunk (Utah State University) indicated that the WAAS data is actually equivalent vertical TEC, and thus not useful for data assimilation. His concern was the error introduced when converting the slant TEC to the vertical. Concerned about data smearing, he would prefer to use the original slant measurements. The CORS data, by contrast, preserves the real slant measurements.
Also with respect to #4, Wilson noted that while the WAAS supertruth data contains high quality slant TEC information, it is the result of a lot of post-processing and is not available in real time. Hence it is not a good candidate for data assimilation. Wilson indicated that there is a GPS TEC data uncertainty due to receiver bias, and that there is an additional 2-3 TEC unit uncertainty (where 1 TEC unit = 10^16 electrons/m^2) in the TOPEX TEC.
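The slant-to-vertical conversion that concerned Schunk is commonly performed with a thin-shell mapping function, and the smearing he warned about stems from this single-shell approximation. A minimal sketch follows; the 350 km shell height is an assumed typical choice, not a value given at the forum.

```python
import math

R_E = 6371.0      # mean Earth radius, km
H_SHELL = 350.0   # assumed thin-shell ionosphere height, km (typical choice)

def vertical_tec(slant_tec, elevation_deg):
    """Convert slant TEC to equivalent vertical TEC using a thin-shell
    mapping function (1 TEC unit = 1e16 electrons/m^2). Collapsing the
    real, vertically distributed ionosphere to a single shell is exactly
    the approximation that introduces the conversion error noted above."""
    elev = math.radians(elevation_deg)
    # sine of the zenith angle at the ionospheric pierce point
    sin_chi = R_E * math.cos(elev) / (R_E + H_SHELL)
    mapping = 1.0 / math.sqrt(1.0 - sin_chi ** 2)  # obliquity factor
    return slant_tec / mapping

# Hypothetical example: a 60 TECU slant measurement at 30 deg elevation
print(round(vertical_tec(60.0, 30.0), 1))  # roughly 34 TECU vertical
```

At zenith the mapping factor is 1 (slant and vertical coincide); at low elevations it grows toward ~3, so small modeling errors in the shell geometry translate into sizable TEC errors, which is why assimilation schemes prefer the raw slant measurements.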
With respect to #5, Tim Fuller-Rowell (NOAA-SEC) argued for the usefulness of ionospheric storm modeling, since it was apparent that magnetic storm effects can drastically change the diurnal variation. (Editorial Note: Klobuchar was essentially saying that priority should be given to the determination of the low-latitude quiet-time mean density, so that storm departures would have a meaningful reference.)
With respect to #6, Kent Tobiska (Space Environment Technologies) commented that the solar irradiance is useful for long-term modeling, and it is now becoming better quantified.
A general discussion ensued. Sandro Radicella (Abdus Salam ICTP) indicated that TOPEX data has been shown to be quite useful in validations of the GIM and IRI models. He expressed the concern that supertruth slant TEC data could not be obtained in real time. In response, Coster indicated that it should be possible to obtain high quality GPS data (~ 2-3 TEC unit accuracy) in real time if calibration is handled carefully. Wilson agreed that JPL could compute streaming TEC (instead of batch-processed) every few seconds and could provide the data in real time.
On the issue of data availability for assimilative models, Brian Wilson (JPL) cited the use of GPS occultation data. These data sets are useful since they can lead to profiles of the ionosphere to the height of the low orbit satellite. Schunk (Utah State University) commented that topside data was very useful for modeling purposes, including topside data on electron and ion temperatures. Jack Klobuchar noted that there was limited reception in the American zone, vis-à-vis the Alouette satellite.
Coster asked Bilitza (NASA-Goddard) about the validity of the IRI model in the low latitude sector. She had noticed that the IRI underestimates the TEC by ~ 50% near the equator at solar maximum. Bilitza responded that there was a task force structure within IRI to address various problems such as this. He continued by saying topside ISIS data, obtained at high solar activity, will no doubt lead to improvements. The IRI task-force work in 2002 stresses TEC on the topside.
Bodo Reinisch (University of Massachusetts-Lowell) cited the usefulness of the worldwide ionosonde database. He mentioned its availability at a number of Internet web sites, including NOAA-NGDC. Sojka (Space Environment Corporation) commented that real data was extremely important, and that ionosondes are still a leading source of good data. Klobuchar commented that the integrity of ionosonde data is an issue. Much data is missing because it did not pass the quality control algorithm.
Speaking with reference to the HF community, which constitutes ~50% of the customers for the ionosonde network, Terence Bullett (AFRL) wondered whether GAIM will benefit them. Bullett also inquired about the treatment of TIDs with the GAIM approach. Would TIDs be visible? Schunk responded by saying that TIDs are not available at present, but future advances may lead to the possibility of detecting TIDs. For example, the addition of 200 spaced receivers could detect TIDs. Bullett said that the addition of D and E region models would be helpful, and customers would find a 1-hour forecast of TIDs useful.
Both Jim Secan (NWRA) and Jan Sojka, referring to assimilative models, indicated that data providers need to be mindful of the fact that assimilative models have a grid size element (i.e., a volume element) within which ionospheric parameter estimates are made. Data needs to be characterized in terms of variance within a given volume element.
Earlier in the presentations, John Seago (Honeywell) indicated that new model developments might benefit from increased product exposure and "marketing". He was asked to elaborate. Seago pointed out that he was simply noting that users need to be aware of the existence of the models, their capabilities, and how to access them. Tim Fuller-Rowell indicated that third-party vendors would take the responsibility for the transfer of tailored products to the ultimate customer.
On the issue of modelers, data providers, and users, a number of comments were made. Mannucci (JPL) indicated that modelers and data providers should work together. Specifically, data providers should be sensitive to modelers' need for error bounds on the data sets they use in model development. Also, the data should be well documented and easy to use. Jack Klobuchar responded that in addition to data providers and modelers working together, there was also a need for a more institutionalized way to bring users and scientists together. He pointed out the IES symposia as a good example, given the fact that full papers are provided to the attendees, and that published proceedings are made available. Paul Bellaire (AFOSR) replied that this was also the intent of the annual Space Weather Week, and that presentations and discussions are posted on a designated web site for Space Weather Week. (Editorial Note: In this connection, the organizers of Space Weather Week have published the SWW Proceedings following the 2004 event, and have provided CDs of all presentations upon request.) Paul Kintner (Cornell University) followed with the thought that there was a need for a system to bridge the gap between science and application. There has never been adequate funding to do transition to application (i.e., step out of the science realm). Anthea Coster suggested that we should document user needs to help transition science to application.
Mannucci (JPL) inquired about sources of funding. Specifically, what agencies fund data system research? Klobuchar indicated that funding is based upon need, and that the researcher should seek funding from sources that specify needs. Klobuchar agreed with Paul Kintner about forcing the transition from science to applications, but reminded the audience that the ionosphere (and space weather) will never be as relevant as tropospheric weather.
(Editorial Note: The following is a fitting conclusion to the forum, and is "lifted" almost verbatim from the summary by Davies and Coster. My apologies go to the authors for some re-ordering of text and occasional additions.)
It is evident that near-real-time data assimilative models, which utilize many aspects of empirical and physics-based models, seem to offer the most promise for capturing the true state of the ionosphere. The future of these models depends on intelligently incorporating the wealth of information from new satellite systems, from the addition of multiple ground-based systems, and, perhaps more importantly, the communication links that will allow this data to flow in near-real-time to the various data processing centers, and from there to the users.
Modelers are concerned with the testing of models with valid data, and since data assimilative models require real-time data, it is essential that data be inspected for quality. This was a recurrent theme. Users suggested that information about state-of-the-art models be made readily available.
A concern, which continues to surround the development of the newer models, is that of funding. Which agencies need and fund data system research? There are real-world applications for this research, but funding is not always available to make the transition from science to applications.
Ionospheric predictions influence several disciplines including the prediction of radio system performance, a matter of some interest in planning as well as ultimate operations. Long-term predictions are generally based upon predictions of driving parameters such as sunspot number, the 10.7 cm solar flux, magnetic activity indices, etc. Unfortunately these parameters are not easy to predict. We are thus faced with the job of predicting outcomes from models driven by parameters that themselves need to be predicted. This is truly double jeopardy. Moreover, the functions relating these parameters to the ionosphere are imprecise. Consequently, long-term predictions needed for system design are subject to a considerable amount of uncertainty. To first order, the uncertainty in the median value of foF2 for a particular time and location is proportional to the uncertainty in the mean sunspot number.
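The proportionality just noted can be illustrated with first-order error propagation: if the predicted median foF2 is taken as approximately linear in the smoothed sunspot number R12 over the range of interest, the uncertainty in the median scales directly with the uncertainty in R12. The coefficients below are invented purely for illustration.

```python
# First-order error propagation for a long-term foF2 prediction.
# Assume (hypothetically) that over the range of interest the median is
# roughly linear in R12:  foF2 ≈ a + b * R12,  so  delta_foF2 ≈ b * delta_R12.
a = 5.0    # MHz, illustrative intercept (not a measured value)
b = 0.05   # MHz per sunspot-number unit, illustrative slope

def fof2_median(r12):
    """Hypothetical linearized median foF2 prediction, MHz."""
    return a + b * r12

delta_r12 = 20.0              # assumed uncertainty in the predicted R12
delta_fof2 = b * delta_r12    # propagated uncertainty in the median foF2

print(fof2_median(100.0), delta_fof2)  # about 10.0 MHz, +/- 1.0 MHz
```

The point is structural rather than numerical: any error in the predicted driving parameter passes straight through to the ionospheric prediction, which is the "double jeopardy" described above.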
In addition to the uncertainty in the mean parameters, we must account for the fact that ionospheric parameters have real distributions, and with few exceptions the spread of these distributions is such that errors about the mean may be a dominant contribution. Short-term ionospheric predictions (or forecasts) generally refer to departures from the median behavior, the latter being well characterized by running averages of solar flux and related parameters (viz., sunspot number). The short-term fluctuations may be specified in terms of hour-to-hour, day-to-day, and week-to-week variabilities. There are also second-to-second and minute-to-minute variations but this class of variations generally falls within the realm of unpredictable behavior. Compensation for such fluctuations is quite difficult, but may be accommodated through use of system protocols which enable real-time channel evaluation (or RTCE) measures to be initiated, such as channel sounding or probing. These very short-term forecasts are generally referred to as Nowcasts.
There are four ITU-R documents that are pertinent to the investigation of the ionospheric forecasting problem. The first deals with the exchange of data forecasts [ITU-R, 1995]; the second outlines various measures for forecasting of ionospheric parameters [ITU-R, 1994a]; the third deals specifically with solar-induced ionospheric effects [ITU-R, 1994b]; and the fourth outlines various real-time channel evaluation schemes [CCIR, 1990]. These reports should be consulted.
Distributions of parameters such as foF2, foEs, and hF2 are important since these parameters depart significantly from fundamental intuition and from the rules set forth by Sydney Chapman in his classic theory. Distributions of foF2 and foEs are available [Lucas and Haydon, 1966; Leftin et al., 1968], but F2 layer height distributions are not directly available. Ionospheric predictions in the short and intermediate terms provide the most exciting challenge for ionospheric researchers.
Observational data have shown that Traveling Ionospheric Disturbances (or TIDs) are the ionospheric tracers of a class of atmospheric gravity waves, and these disturbances are a major contribution to ionospheric variability, especially at F region heights. TIDs have a major impact on layer height as well as peak electron density, and possess a variety of scales, from kilometers to thousands of kilometers. The small to intermediate scale TIDs, having wavelengths of less than a few hundred kilometers and periods of the order of 10-20 minutes, arise from local sources and have relatively small amplitudes away from the source region. The large-scale TIDs have sources that are located at great distances, and there is a strong correlation between this class of disturbances and geomagnetic storms. Evidence suggests that large scale TIDs have an impact over global distances and originate within the auroral zone as a result of atmospheric modifications associated with precipitation and auroral arc formation. A survey of the effects of TIDs on radiowave systems may be found in a review paper by Hunsucker.
The field of ionospheric predictions is undergoing continuous evolution with the introduction of new scientific methods and instruments, which are providing fresh insight. The requirement for quasi-real-time products based upon current ionospheric specification has led to an increased importance of so-called real-time ionospheric models. This class of models, in turn, is driven by a hierarchy of solar-terrestrial observations, which enables the analyst to examine the space-weather environment as an integrated complex of phenomena. This general approach is leading to an improvement in our understanding of ionospheric structure and its variations, if not better short-term forecasts. In the immediate future, it is anticipated that the primary ionospheric specification tools will be comprised of terrestrial sounding systems, including real-time networks of ionospheric sounders [Galkin et al., 1999]. Real-time data services based on these approaches are becoming available [Goodman and Ballard, 1999].
Perhaps the most exciting new development in recent years has been science and technology for ingesting large amounts of real-time data and the assimilation of these data within various models. COST programs in Europe have led the way in the incorporation of data within empirical models, while the American GAIM technology shows great promise in the assimilation of data within physical models with the aid of Kalman filtering and related schemes.
Meanwhile, other more direct methods are being used in a number of practical situations where computational assets are limited. For example, direct ingestion of real-time data can be used for updating climatological models when data sets are sparsely distributed and when less precision is required. When data sets are dense, particularly over areas where GPS-TEC and sounding data are available, careful mapping techniques have been applied to capture the most likely continuous distribution of data over selected regions.