[Figure 9-2 plot omitted; abscissa: time in minutes from start time.]

Fig. 9-2. Earth-Out Horizon Scanner Data From AE-3. Large quantity of spurious data makes identification of valid data difficult.

9.1.1 Checking Data Flags and Sensor Identification

The first method of validation is concerned with the type of data being analyzed, rather than whether the values of these data are acceptable. For example, there may be telltales or flags indicating whether the data were received in real time from the spacecraft (real-time data) or were recorded on a tape recorder aboard the spacecraft and transmitted later while over a tracking station (playback data). There may be one or more flags used to determine in which of several formats the data were transmitted. Flags may also describe the operating mode of the spacecraft at the time of transmission and which attitude determination sensors were operating at that time.

These flags are normally evaluated before attempting to read other data because they determine what types of data are present and where and how often these data occur in the telemetry. An example of the need for this form of validation is GEOS-3, which has two telemetry formats, one containing a single data sample from the two-axis digital Sun sensors in each major frame of data and the other containing four Sun sensor data samples per major frame. A flag byte, included in the raw telemetry data, is examined to determine the number of Sun data items present before extracting them from the telemetry frame.
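As an illustration, the format decision might be coded as a simple flag-byte test. The following minimal Python sketch is hypothetical; the bit assignment shown is not the actual GEOS-3 layout:

```python
# Sketch of format determination from a flag byte (GEOS-3-like case).
# The bit position used here is hypothetical, for illustration only.

def sun_samples_per_major_frame(flag_byte: int) -> int:
    """Return how many Sun sensor samples to extract from the major frame."""
    return 4 if flag_byte & 0x01 else 1
```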

9.1.2 Validation of Discrete Data Points

The most common method of validation for discrete data points is upper- and lower-limit checking; that is, the value of the data must fall within specified limits to be acceptable. These limits can be constant (e.g., a maximum sensor voltage) or time-varying (e.g., the proper day for an attached time). If the value of a data point lies outside the prescribed limits, it is invalid and may be corrected, flagged, or deleted. This method of validation is often performed on data types such as the attached times and the spacecraft clock. Sometimes limit checking is useful even when the data will not be used in further attitude calculations. For example, if the data are to be plotted automatically, outlying data points may adversely affect the limits of the plot axes, causing valid data to lose significance. Limit checking is not useful when all the values a data item may assume are acceptable. In these cases, a discrete data item cannot be classified as erroneous without examining its value relative to other data, as discussed in Sections 9.2 through 9.4.
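A minimal sketch of upper- and lower-limit checking follows; the Python function, limits, and flag value are illustrative only (the 99999 internal flagging convention is discussed in Section 9.1.3):

```python
# Minimal sketch of upper-/lower-limit validation (hypothetical limits).
# Points outside [lower, upper] are treated as invalid and flagged.

FLAG_VALUE = 99999  # internal flag value, as discussed in Section 9.1.3

def limit_check(samples, lower, upper):
    """Return (validated, n_rejected); invalid points are set to FLAG_VALUE."""
    validated = []
    n_rejected = 0
    for y in samples:
        if lower <= y <= upper:
            validated.append(y)
        else:
            validated.append(FLAG_VALUE)
            n_rejected += 1
    return validated, n_rejected

# Example: sensor voltages that must lie between 0 and 5 volts
voltages = [1.2, 4.8, 7.3, 0.4, -2.0]
clean, bad = limit_check(voltages, lower=0.0, upper=5.0)
print(clean, bad)  # [1.2, 4.8, 99999, 0.4, 99999] 2
```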

Another method of discrete data point validation is examination of the quality flag attached to the data by previous ground software processing. In data processed at Goddard Space Flight Center, this flag is set by an Operations Control Center or the Information Processing Division. The quality flag denotes whether a minimum number of bits in the telemetry synchronization (sync) pattern for each major or minor frame are incorrect, and hence indicates the likelihood of remaining bits in the data segment being bad. (The number of incorrect bits in the sync pattern which causes the quality flag to indicate bad data can vary from satellite to satellite; it is generally 9 bits out of 24.) A quality flag indicating bad data does not necessarily imply that bad data are present, but rather that there is a greater probability of bad data, since the sync pattern itself is in error. This flag can be validated as a discrete data point and the remaining data in the major or minor frame flagged or deleted accordingly.
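The sync-pattern test amounts to counting mismatched bits against a threshold. The sketch below assumes a hypothetical 24-bit pattern; the 9-of-24 threshold is the typical value quoted above:

```python
# Sketch of the sync-pattern quality test described above.
# The 24-bit pattern value is hypothetical; the 9-of-24 threshold
# is the typical value quoted in the text.

EXPECTED_SYNC = 0xFAF320   # hypothetical 24-bit sync pattern
BAD_BIT_THRESHOLD = 9      # flag the frame if 9 or more of 24 bits disagree

def sync_quality_bad(received_sync: int) -> bool:
    """Return True if the frame should be flagged as probably bad."""
    mismatched = bin((received_sync ^ EXPECTED_SYNC) & 0xFFFFFF).count("1")
    return mismatched >= BAD_BIT_THRESHOLD
```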

Data may also be validated on a discrete point-by-point basis by comparing one type of data with another. For example, one can compare the value of the spacecraft clock for a given data sample with the time attached to that sample, or either of these might be compared with the minor or major frame number. Another example is comparison of the selected two-axis Sun sensor identification number with the analog output of the ATA photocell for each sensor (see Section 6.1 for a description of ATA) to determine if the Sun sensor ID corresponds to the Sun sensor most intensely illuminated. A third example is determination that star tracker data are valid by checking the values of associated flags, which indicate whether the tracker is in the track mode and whether the intensity of the object being tracked is within acceptable limits.
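A cross-comparison of this kind might be sketched as follows; the clock rate, reference pair, and tolerance are assumptions for illustration:

```python
# Sketch of point-by-point cross-comparison: does the attached time agree
# with the time implied by the spacecraft clock?  The clock rate, reference
# pair, and tolerance below are assumptions for illustration.

def clock_time_consistent(clock_count, attached_time, clock_ref, time_ref,
                          seconds_per_count=1.0, tolerance=0.5):
    """Accept the sample if the attached time matches the clock-implied time."""
    implied_time = time_ref + (clock_count - clock_ref) * seconds_per_count
    return abs(attached_time - implied_time) <= tolerance
```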

Validation may also be performed on information contained in the header provided by the receiving station. Information such as the location of the tracking station that received the data, the date the data were received, the start time of the data, and the spacecraft ID may be validated if desired.

9.1.3 Handling Invalid Data

Data which have been determined to be invalid can sometimes be corrected. For example, if the attached time is invalid but the spacecraft clock reading is valid and a known attached time corresponds to a known spacecraft clock reading, a current attached time may be computed on the basis of the current spacecraft clock time. The minor and major frame numbers might be used in a similar manner. Another example is correcting the two-axis Sun sensor identification based on the largest of the ATA readings.
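Both corrections can be sketched briefly; the clock model is the same hypothetical one used in the consistency check of Section 9.1.2, and the argmax rule simply selects the most intensely illuminated sensor:

```python
# Sketches of the two corrections described above (same hypothetical clock
# model as in the consistency check of Section 9.1.2).

def correct_attached_time(clock_now, clock_ref, time_ref, seconds_per_count=1.0):
    """Recompute an invalid attached time from the valid spacecraft clock,
    given one known (clock, time) correspondence and an assumed clock rate."""
    return time_ref + (clock_now - clock_ref) * seconds_per_count

def correct_sun_sensor_id(ata_readings):
    """Reassign the two-axis Sun sensor ID to the index of the largest
    ATA photocell output, i.e., the most intensely illuminated sensor."""
    return max(range(len(ata_readings)), key=lambda i: ata_readings[i])
```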

When data have been examined and found to be invalid and no method exists to correct them on a discrete point-by-point basis, we must decide what to do with the bad data. In some cases, an invalid data point is useless and renders other data gathered at the same time useless as well. In these cases, all the data in question can be deleted and not processed by attitude determination software. In other cases, although a particular data value is useless, related data may be useful and should be retained. Sometimes the invalid data themselves may be worth examining in further analysis. In these cases, the data are retained and used in further attitude determination calculations or corrected as discussed in the following sections. Data so treated are often flagged so that subsequent software can readily identify questionable data and correct or ignore them. The two most common methods of flagging data are internal flagging (changing the value of the data to a flag value, such as 99999) and external flagging (setting the value of a corresponding flag variable to a flag value). The latter method has the advantage of retaining incorrect data values for further analysis and the disadvantage of requiring extra computer storage for flag variables; extra core is generally required even when no data are flagged.
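The two flagging conventions might look like this in practice; the sentinel value 99999 comes from the text, while the parallel-array layout is an illustrative choice:

```python
# Internal flagging: overwrite the datum with a sentinel value.
data = [3.1, 4.2, -7.7, 5.0]
data[2] = 99999            # original value is lost

# External flagging: keep the datum, mark it in a parallel flag array.
data = [3.1, 4.2, -7.7, 5.0]
flags = [0, 0, 0, 0]       # extra storage is needed even when nothing is flagged
flags[2] = 1               # original value -7.7 is retained for later analysis
```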

Similar manipulation can be done manually when data are viewed in interactive mode on a graphic display device. This enables the operator to evaluate the data and selectively process those considered acceptable. As seen in Fig. 9-2, it is often impossible to foresee all the ways in which the data may be bad and to provide fully automatic validation checks in the software; consequently, an interactive processing capability is included in most software systems to permit manual data validation and manipulation.

After data validation and processing, it may become apparent from attitude solutions that telemetry data should have been selected, validated, or processed in a different manner. In this case, the entire procedure may be repeated using different discrete data validation criteria. Iterative procedures of this type are discussed in Section 9.4.

9.2 Data Validation and Smoothing

Gerald M. Lerner

Data validation is a procedure by which we either accept or reject measurements but do not otherwise alter them. Rigorously, validating data by rejecting measurements which are "obviously" incorrect alters the statistical characteristics of the data. For example, data with Gaussian noise will have, on the average, one measurement in 1.7 million with an error of 5σ or more. In a practical sense, however, rejecting such data is justified because all spacecraft data are subject to random bit errors (see Section 9.1), which typically occur much more frequently than 5σ Gaussian noise errors.
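The quoted rate follows directly from the Gaussian tail probability, as this short check confirms:

```python
import math

# Two-sided probability that a Gaussian measurement errs by 5 sigma or more:
# P(|x - mu| >= 5*sigma) = erfc(5 / sqrt(2))
p = math.erfc(5.0 / math.sqrt(2.0))
print(p)        # ~5.7e-07
print(1.0 / p)  # ~1.7 million, matching the rate quoted above
```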

Data smoothing is a technique which is widely used both to preprocess and validate data before attitude determination and to postprocess computed attitude solutions, primarily to reduce random noise or to derive attitude rates. Data smoothing is the only processing required for some data types which are used or displayed directly, such as boom length, accelerometer, or spin rate data.

Data smoothing is one method used to obtain an expected value for a measurement which is then used for validation. In using smoothing as a validation technique, one assumes that the telemetered data frequency is high compared with the frequencies characteristic of the data type and that similar measurements made at nearby times are reliable. In this section, we describe techniques used to "smooth" or to obtain an expected value for either measured or processed data. The expected value may be used either for subsequent processing or just for validation.
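One simple realization of this idea is to form the expected value as a running mean over neighboring samples and reject points that deviate too far from it. The window half-width and threshold in this sketch are illustrative, not mission values:

```python
# Sketch of validation by smoothing: compare each measurement with a
# running mean of its neighbors and reject large residuals.

def validate_by_smoothing(y, half_width=3, threshold=5.0):
    """Return a list of booleans: True where the measurement is accepted."""
    accepted = []
    n = len(y)
    for i in range(n):
        lo, hi = max(0, i - half_width), min(n, i + half_width + 1)
        neighbors = [y[j] for j in range(lo, hi) if j != i]
        if not neighbors:          # single-point record: nothing to compare with
            accepted.append(True)
            continue
        expected = sum(neighbors) / len(neighbors)  # smoothed (expected) value
        accepted.append(abs(y[i] - expected) <= threshold)
    return accepted
```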

In addition to validating, data smoothing may be used to:

1. Remove high-frequency noise. The effects of sensor data digitization and noise may be reduced by the use of an algorithm which attenuates high-frequency components in the data.

2. Reduce data volume. If telemetry data rates are sized for a particular data type or operating mode, a large fraction of the telemetered data may be redundant and can be discarded to reduce the data processing load significantly without degrading attitude solutions.

3. Interpolate. For postprocessing, short periods of data dropout may be bridged. For preprocessing, interpolation is useful for data display or for providing estimated data at times other than those measured.

4. Improve accuracy. For some data types, the intrinsic accuracy of the sensor exceeds the telemetered least significant bit (LSB). For example, digital Sun sensor errors are typically less than half the LSB at transitions. Processing techniques which emphasize data when the LSB changes can, in principle, improve the accuracy of computed attitudes.*

5. Compute Rates. Attitude rates are required for some applications, such as the initialization of data predictors and verification of control system performance. Magnetometer rate data are required for some attitude control systems and are usually obtained by analog differentiation; however, backup ground support may require numerical differentiation (a minimal sketch follows this list).

6. Filter Data. Some data types, such as AE-3 accelerometer data [Dennis, 1974], are used directly and may be enhanced by filtering, which can remove high-frequency noise.

7. Display Data. A smooth function through noisy data can improve the intelligibility of graphic displays.

*The practical worth of this scheme is doubtful because the reduced data volume may nullify the increased data accuracy. This procedure was implemented by Pettus [1973] with two-axis digital Sun sensor data; Pettus concluded that it was not useful because of the reduced data volume.
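As noted in item 5, numerical differentiation can serve as ground backup for rate computation. A minimal sketch using central differences, assuming uniform sample spacing:

```python
# Central-difference rate estimate for uniformly sampled data
# (e.g., magnetometer samples), as referenced in item 5 above.

def central_difference_rates(y, dt):
    """Return d(y)/dt estimates at the interior sample times."""
    return [(y[i + 1] - y[i - 1]) / (2.0 * dt) for i in range(1, len(y) - 1)]

# Example: samples taken every 0.5 second
rates = central_difference_rates([0.0, 0.9, 2.1, 2.9, 4.1], dt=0.5)
print(rates)  # [2.1, 2.0, 2.0]
```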

Four basic techniques for data smoothing are filtering, curve fitting, sifting, and preaveraging. Filtering is a data weighting scheme which is applied symmetrically to each measurement, y_i, to produce a filtered measurement
