C.11.7. - Normalization and error bars
Data can be normalized to either monitor counts or time. When normalizing to monitor counts, the error bars include the uncertainty from the counting statistics of the monitor counts. Otherwise, there is no difference between normalizing to time and normalizing to monitor counts.
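As an illustration of the difference between the two modes, the following minimal Python sketch computes a normalized point and its error bar, assuming Poisson counting statistics (the function and variable names are hypothetical and not part of scans.4):

    import math

    def normalize_to_monitor(counts, monitor):
        """Normalize detector counts to monitor counts.

        Both quantities are Poisson counts, so their relative
        uncertainties add in quadrature:
            sigma/value = sqrt(1/counts + 1/monitor)
        """
        value = counts / monitor
        sigma = value * math.sqrt(1.0 / counts + 1.0 / monitor)
        return value, sigma

    def normalize_to_time(counts, seconds):
        """Normalize detector counts to counting time.

        The time is treated as exact, so only the detector counting
        statistics contribute to the error bar.
        """
        value = counts / seconds
        sigma = math.sqrt(counts) / seconds
        return value, sigma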
By default, scans.4 normalizes data to monitor counts, using the second-to-last data column for the monitor count values. Use the -n flag to turn normalization off. If a column number is selected with the m=col or t=col arguments, normalization is set to monitor or time mode, respectively, using the specified column. If the column number in either case is given as zero, the normalization mode and value are taken from the #M or #T directive for that scan in the data file. It is an error if normalization is on, the normalization column is set to zero, and no normalization directive is present for a scan. The selected normalization mode remains in effect for subsequent scans.
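The selection rules described above can be summarized by the following sketch (hypothetical function and argument names; the actual program reads these values directly from the data file):

    def select_normalization(mode, column, scan_directives):
        """Pick the normalization source for one scan.

        mode            -- "monitor", "time", or None (normalization off, -n)
        column          -- column number from m=col or t=col; 0 means use the
                           scan's #M or #T directive
        scan_directives -- control-line values from the scan header,
                           e.g. {"#M": 100000} or {"#T": 1.0}
        """
        if mode is None:
            return None                    # -n given: no normalization
        if column != 0:
            return ("column", column)      # normalize to the values in this column
        directive = "#M" if mode == "monitor" else "#T"
        if directive not in scan_directives:
            # Normalization on, column set to zero, and no directive
            # in the scan header: this combination is an error.
            raise ValueError("no %s directive in scan header" % directive)
        return ("directive", scan_directives[directive])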
The values returned as error bars are those due to counting statistics (the square root of the number of counts). When the counts are derived from the algebraic combination of detector, background and monitor counts, the error bars are calculated using the appropriate "propagation of errors" formalism. See the source code for details.
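As an example of that formalism, for a value of the form (detector - background)/monitor, with all three counters assumed to follow Poisson statistics, the propagated error bar would be computed roughly as in this sketch (hypothetical names; see the source code for the exact expression scans.4 uses):

    import math

    def net_normalized(detector, background, monitor):
        """Background-subtracted counts normalized to monitor counts.

            S       = (D - B) / M
            sigma_S = S * sqrt((D + B)/(D - B)**2 + 1/M)
        """
        net = detector - background
        value = net / monitor
        sigma = abs(value) * math.sqrt((detector + background) / net**2 + 1.0 / monitor)
        return value, sigma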
If the +I option is selected, the counts for each point are multiplied by the value given by the #I control line in the scan header. If the +I option is selected but the scan header doesn't contain a #I control line, the counts are not changed.
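The effect of the +I option can be summarized by this small sketch (hypothetical names; counts is a list of the data points for one scan):

    def apply_intensity_factor(counts, scan_directives, plus_I_selected):
        """Multiply counts by the #I value when +I is selected.

        If +I is selected but the scan header has no #I control line,
        the counts are returned unchanged.
        """
        if plus_I_selected and "#I" in scan_directives:
            return [c * scan_directives["#I"] for c in counts]
        return counts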