Using CitectSCADA > Logging and Trending Data > Trend Graphs > Trend interpolation

Trend interpolation

Trend interpolation is used to define the appearance of a trend graph when the incoming samples fall out of synchronization with the display period or when samples are missed.

For example, a particular trend might be sampled five times between each update of the trend graph. As only one value can be displayed per update, a single value that appropriately represents the five samples must be chosen: the highest value, the lowest value, or an average.
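The idea behind the condense methods can be sketched conceptually (this is illustrative Python, not CitectSCADA code; the sample values are invented for the example):

```python
# Five samples captured between two updates of the trend graph.
samples = [20.1, 22.4, 19.8, 21.0, 23.5]

# Each condense method reduces the five samples to one display value.
average = sum(samples) / len(samples)  # "Average" condense method
minimum = min(samples)                 # "Minimum" condense method
maximum = max(samples)                 # "Maximum" condense method

print(average, minimum, maximum)  # 21.36 19.8 23.5
```

Whichever method is configured, only the single resulting value appears on the graph for that display period.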

To define how CitectSCADA calculates the value to use, you set a particular trend interpolator display method.

The available interpolator display methods fall into two groups: condense methods (where the display period is longer than the sample period) and stretch methods (where the display period is less than or equal to the sample period).

Condense methods

Average (default) - Displays the average of the samples within the previous display period.

Minimum - Displays the lowest value that occurred during the previous display period.

Maximum - Displays the highest value that occurred during the previous display period.

Stretch methods

Step (default) - Displays the value of the most recent sample.

Ratio - Uses the ratio of the sample times and values immediately before and after the requested time to interpolate a "straight line" value.

Raw Data - Displays the actual raw values.
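The Ratio stretch method is effectively linear interpolation between the samples that bracket the requested display time. A minimal sketch (illustrative Python, not CitectSCADA code; the times and values are invented for the example):

```python
def ratio_interpolate(t0, v0, t1, v1, t):
    """Interpolate a "straight line" value at time t from the sample
    (t0, v0) immediately before it and (t1, v1) immediately after it."""
    return v0 + (v1 - v0) * (t - t0) / (t1 - t0)

# Sample of 100.0 at t=10s, sample of 200.0 at t=20s;
# a display point requested at t=15s falls halfway between them.
value = ratio_interpolate(10.0, 100.0, 20.0, 200.0, 15.0)
print(value)  # 150.0
```

By contrast, the Step method would simply hold the value 100.0 until the 20-second sample arrives.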

The interpolation display method is set using the TrnSetDisplayMode() function. You can also use the [Trend]GapFillMode parameter, but note that it interpolates values within the actual trend file as well as on the trend graph.