author     Joseph Hunkeler <jhunkeler@gmail.com>  2015-07-08 20:46:52 -0400
committer  Joseph Hunkeler <jhunkeler@gmail.com>  2015-07-08 20:46:52 -0400
commit     fa080de7afc95aa1c19a6e6fc0e0708ced2eadc4 (patch)
tree       bdda434976bc09c864f2e4fa6f16ba1952b1e555 /noao/imred/dtoi/doc
Initial commit
Diffstat (limited to 'noao/imred/dtoi/doc')
-rw-r--r--   noao/imred/dtoi/doc/dematch.hlp      51
-rw-r--r--   noao/imred/dtoi/doc/dtoi.ms         576
-rw-r--r--   noao/imred/dtoi/doc/dtoi.toc         34
-rw-r--r--   noao/imred/dtoi/doc/hdfit.hlp        79
-rw-r--r--   noao/imred/dtoi/doc/hdshift.hlp      50
-rw-r--r--   noao/imred/dtoi/doc/hdtoi.hlp        88
-rw-r--r--   noao/imred/dtoi/doc/selftest.hlp     81
-rw-r--r--   noao/imred/dtoi/doc/splotlist.hlp    81
8 files changed, 1040 insertions, 0 deletions
diff --git a/noao/imred/dtoi/doc/dematch.hlp b/noao/imred/dtoi/doc/dematch.hlp
new file mode 100644
index 00000000..7dffad26
--- /dev/null
+++ b/noao/imred/dtoi/doc/dematch.hlp
@@ -0,0 +1,51 @@
+.help dematch Feb87 imred.dtoi
+.ih
+NAME
+dematch -- match density to log exposure values
+.ih
+USAGE
+dematch database
+.ih
+PARAMETERS
+.ls database
+Database containing density list, probably from \fIspotlist\fR.
+.le
+.ls wedge = "", filter = "", emulsion = ""
+Information used to retrieve log exposure values from \fBwedgefile\fR.
+.le
+.ls wedgefile = "noao$lib/hdwedge.dat"
+Name of file containing wedge intensity information.
+.le
+.ls nskip = 0
+Number of faint spots skipped, used as an offset into the list of
+log exposure values.
+.le
+.ls verbose = yes
+Print the log exposure information to STDOUT as well as to \fBdatabase\fR.
+.le
+.ih
+DESCRIPTION
+Task \fIdematch\fR matches density values to log exposure values. A database
+of density values is input, as well as information needed to
+retrieve log exposure values from a reference file. The two sources of
+information are matched, and the matching log exposure values are added
+as a record in the database.
+
+Parameter \fBnskip\fR tells how many faint spots were not
+included in the density \fBdatabase\fR. This information is
+used to align the density and log exposure values. It doesn't matter if the
+densities are listed in a monotonically increasing or decreasing
+order, as long as no spots were omitted between the first and last
+measured.
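+
+For illustration only, the matching amounts to an offset into the list of
+log exposure values. This is a Python-style sketch, not part of the task,
+and it assumes the exposure list is ordered starting with the faintest spot:
+.nf
+
+    # exposures: log exposure values for the matched wedge entry
+    # densities: measured spot densities from the database
+    matched = exposures[nskip : nskip + len(densities)]
+.fi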
+.ih
+EXAMPLES
+Match densities in db1 to log exposure values for wedge #117
+with a IIIAJ emulsion and a GG385 filter.
+.nf
+
+ cl> dematch db1 wedge=117 filt=gg385 emulsion=IIIAJ
+.fi
+.ih
+SEE ALSO
+spotlist, hdfit, hdtoi
+.endhelp
diff --git a/noao/imred/dtoi/doc/dtoi.ms b/noao/imred/dtoi/doc/dtoi.ms
new file mode 100644
index 00000000..4f999259
--- /dev/null
+++ b/noao/imred/dtoi/doc/dtoi.ms
@@ -0,0 +1,576 @@
+.RP
+.TL
+An Overview of the IRAF DTOI Package
+.AU
+Suzanne Hammond Jacoby
+.AI
+IRAF Group - Central Computer Services
+.K2 "" "" "*"
+February 1987
+.br
+Revised July 1988
+
+.AB
+This document describes the DTOI package, which contains tasks
+for determining and applying a density to intensity transformation to
+photographic data. The transformation is determined from a set
+of calibration spots with known relative intensities. A curve is
+interactively fit to the densities and intensities of the calibration
+spots. The transformation is then applied and a new output image written.
+.AE
+
+.NH
+Introduction
+.PP
+The DTOI package contains tasks for computing and applying a density
+to intensity transformation to photographic data. These tasks perform the
+standard steps in linearizing data: calculating HD data points from
+calibration spots, fitting a curve to these points and applying the HD
+curve to the data. It is also possible to combine related HD curves.
+Communication between the tasks is via text files which the user can
+inspect or modify. It is intended
+to be easy for users to introduce data from outside the DTOI package
+into the processing.
+.PP
+There are currently six tasks in the package. They are:
+
+.ce
+The \fBDTOI\fR Package
+.TS
+center;
+n.
+spotlist \&- Calculate densities and weights of calibration spots.
+dematch \&- Match densities to log exposure values.
+hdfit \&- Fit characteristic curve to density, exposure data.
+hdtoi \&- Apply HD transformation to image data.
+hdshift \&- Align related characteristic curves.
+selftest \&- Test transformation algorithm.
+.TE
+.PP
+The DTOI package does not currently support the self calibration of images,
+but the addition of this capability is planned. This would involve
+determining the HD curve from the data itself, by assuming the point spread
+function scales linearly with intensity.
+.PP
+Before entering the package, make sure your calibration spots and the images
+to be transformed are on disk in IRAF image format.
+
+.NH
+Determining the HD Curve Data
+.PP
+To determine the HD curve, you need two sets of data: the
+measured photographic densities of a set of calibration spots and
+the log exposure values corresponding to these measurements. The
+log exposure values must be known a priori. Tasks \fIspotlist\fR and
+\fIdematch\fR are used to assemble these two data sets.
+.NH 2
+SPOTLIST
+.PP
+The first step is to calculate the density of
+the calibration spots, each of which is a separate IRAF image or image
+section. The spot density is either the median of the spot pixels or
+the mean of the pixels when pixels more then a user specified number of
+standard deviations away from the mean have been rejected. The numbers
+in the spot image must be scaled to density; parameter \fBspotlist.scale\fR
+is used such that density = input_value * scale. Task \fIspotlist\fR also
+calculates the standard deviation of each spot and reports
+the number of good pixels, i.e., the number of pixels not rejected
+when determining the mean density.
+The final product of this task is a record in the data base containing a
+density for each spot. The scale factor used is also written to the data
+base; it will be read later in task \fIhdtoi\fR.
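+.PP
+The density estimate can be sketched outside IRAF as follows. This is an
+illustrative Python fragment, not the \fIspotlist\fR code; the default
+parameter values follow the task parameters and the rejection loop is one
+plausible reading of the iterative mean:
+.nf
+
+    import numpy as np
+
+    def spot_density(pixels, scale=0.00151, sigma=3.0, option="mean"):
+        dens = np.asarray(pixels, float) * scale   # raw values -> density
+        if option == "median":
+            return np.median(dens), dens.size
+        good = np.ones(dens.size, bool)
+        for _ in range(10):                        # iterate until stable
+            mean, sdev = dens[good].mean(), dens[good].std()
+            keep = good & (np.abs(dens - mean) <= sigma * sdev)
+            if keep.sum() == good.sum():
+                break
+            good = keep
+        return dens[good].mean(), int(good.sum())  # density, good pixels
+.fi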
+.NH 2
+DEMATCH
+.PP
+Log exposure values must be matched to the measured density values. These
+log exposure values must be known a priori and will be read from a file.
+Task \fIdematch\fR retrieves the proper exposure information by
+matching the wedge number, emulsion type and filter used. Once a match
+has been made, the proper log exposure values are written to a record
+in the database.
+.PP
+A database of log exposure values for the NOAO standard wedges is maintained
+in a system file; the wedge/emulsion/filter combinations available are listed
+in the last section of this document. This file can be replaced with one specific
+to any institution; the file name is supplied to task \fIdematch\fR as a
+parameter. In this way the wedge file can be personalized to any application
+and not be lost when the system is updated.
+
+.NH
+Fitting the Curve
+.PP
+The HD curve, or characteristic curve, is a plot of density versus log
+exposure. This curve is determined from the data points generated by
+tasks \fIspotlist\fR and \fIdematch\fR. The objective is to fit
+a curve to these points, such that Log exposure = F(Density). The
+technique available in this package allows the independent variable of the
+fit to be a transformation of the density (log opacitance, for example).
+The log exposure and density values are
+read from the database. If multiple entries for a particular record are
+present in the database, the last one is used.
+.NH 2
+HDFIT
+.PP
+Task \fIhdfit\fR fits a characteristic curve to density and log exposure
+values in preparation for transforming an image from density to intensity.
+Five functional forms of the curve are available:
+.nf
+
+ Power Series
+ Linear Spline
+ Cubic Spline
+ Legendre Polynomials
+ Chebyshev Polynomials
+
+.fi
+.LP
+It is possible to apply a transformation to the
+independent variable (density above fog) prior to the fit. The traditional
+choice is to fit log exposure
+as a function of the log opacitance, rather than density directly. This is
+sometimes referred to as the Baker, or Seidel, function. Transforming
+the density has the effect of stretching the low density data points, which
+tend to be relatively oversampled.
+In the DTOI package, four independent variables are currently available:
+.nf
+
+ Density
+ Log Opacitance
+ K50 - (Kaiser* Transform with alpha = 0.50)
+ K75 - (Kaiser* Transform with alpha = 0.75)
+
+.fi
+.FS
+* Margoshes and Rasberry, Spectrochimica Acta, Vol 24B, p497, (1969)
+.FE
+Any combination of transformation type and fitting function can be used and
+changed interactively. Two combinations of interest are discussed here.
+
+The default fit is a power series fit where the independent variable is
+Log Opacitance. That is:
+.LP
+.EQ
+
+ "Log Exposure = " sum from k=0 to {ncoeff - 1} {A sub k Y sup k}
+
+.EN
+.sp 1
+.EQ
+ "where Y = Log Opacitance = "Log sub 10 (10 sup Density - 1)
+.EN
+.LP
+A fit that is expected to best model a IIIA-J emulsion is a power series
+fit to a K75 transform of the density. That is,
+.LP
+.EQ
+
+ "Log Exposure = "sum from k=0 to {ncoeff - 1} {A sub k Y sup k}
+
+.EN
+.sp 1
+.EQ
+"where Y = K75 transform = Density + 0.75 " Log sub 10 (1 - 10 sup -Density )
+.EN
+.LP
+Over the expected small dynamic range in the variables of the fit, Legendre
+and Chebyshev functions offer no advantages over a simple power series
+functional form. The cubic and linear spline fits may follow the data very
+closely, but with typically sparse data sets this is not desirable. It
+is expected that a power series fit will serve satisfactorily in all cases.
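+.PP
+The transforms above are simple closed forms, and the fitted curve is a
+polynomial in the transformed density. The following Python fragment is an
+illustrative sketch, not the package code, of how a density above fog maps
+to log exposure for a chosen transform and a set of power series
+coefficients:
+.nf
+
+    import numpy as np
+
+    def transform(density, kind="logo"):
+        if kind == "none":
+            return density
+        if kind == "logo":                        # log opacitance
+            return np.log10(10.0 ** density - 1.0)
+        alpha = 0.50 if kind == "k50" else 0.75   # Kaiser transforms
+        return density + alpha * np.log10(1.0 - 10.0 ** (-density))
+
+    def log_exposure(density, coeffs, kind="logo"):
+        y = transform(density, kind)
+        return sum(a * y ** k for k, a in enumerate(coeffs))  # power series
+.fi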
+
+.NH 3
+Interactive Curve Fitting
+.PP
+Task \fIhdfit\fR can be run interactively or not. In interactive mode,
+points in the sample can be edited, added or deleted. Weighting values
+can be changed as well as the fog value, the type of transformation
+and the fitting function chosen. To obtain the best fit possible, interactive
+fitting is recommended. A complete list of the available commands
+is printed here; this list is also available interactively with the
+keystroke '\fL?\fR'.
+.TS
+center;
+c s s w(3.0i)
+c l s.
+
+ DTOI INTERACTIVE CURVE FITTING OPTIONS
+
+\fL?\fR Print options
+\fLa\fR Add the point at the cursor position to the sample
+\fLc\fR Print the coordinates and fit of point nearest the cursor
+\fLd\fR Delete data point nearest the cursor
+\fLf\fR Fit the data and redraw or overplot
+\fLg\fR T{
+Redefine graph keys. Any of the following data types may be along
+either axis:
+T}
+.T&
+l l l.
+ \fLx\fR Independent variable \fLy\fR Dependent variable
+ \fLf\fR Fitted value \fLr\fR Residual (y - f)
+ \fLd\fR Ratio (y / f) \fLn\fR Nonlinear part of y
+ \fLu\fR Density above fog
+
+Graph keys:
+.T&
+c l s.
+
+\fLh\fR h = (x,y) transformed density vs. log exposure
+\fLi\fR i = (y,x) log exposure vs. transformed density
+\fLj\fR j = (x,r) transformed density vs. residuals
+\fLk\fR k = (x,d) transformed density vs. the y(data)/y(fit) ratio
+\fLl\fR l = (y,u) log exposure vs. density above fog (HD Curve)
+
+\fLo\fR Overplot the next graph
+\fLq\fR T{
+Terminate the interactive curve fitting, updating the database file.
+T}
+\fLr\fR Redraw graph
+\fLu\fR Undelete the deleted point nearest the cursor
+\fLw\fR Set the graph window. For help type 'w' followed by '?'.
+\fLx\fR Change the x value of the point nearest the cursor
+\fLy\fR Change the y value of the point nearest the cursor
+\fLz\fR Change the weight of the point nearest the cursor
+
+.T&
+l s s w(3.0i).
+T{
+The parameters are listed or set with the following commands which may be
+abbreviated. To list the value of a parameter type the command alone.
+T}
+
+.T&
+l l s.
+
+\fL:show \fR[\fIfile\fR] Show the values of all the parameters
+\fL:vshow \fR[\fIfile\fR] Show the values of all the parameters verbosely
+\fL:errors \fR[\fIfile\fR] Print the errors of the fit (default STDOUT)
+\fL:reset \fR T{
+Return to original conditions of x, y, wts and npts.
+T}
+\fL:ebars \fR[\fIerrors/weights\fR] T{
+The size of marker type '[hv]ebars' can show either standard deviations or
+relative weights.
+T}
+\fL:function \fR[\fIvalue\fR] T{
+Fitting function (power, chebyshev, legendre, spline3, or spline1)
+T}
+\fL:transform \fR[\fIvalue\fR] Set the transform type (none, logo, k50, k75)
+\fL:fog \fR[\fIvalue\fR] Change the fog level (or ":fog reset")
+\fL:order \fR[\fIvalue\fR] Fitting function order
+\fL:quit \fR Terminate HDFIT without updating database
+\fL:/mark \fRstring T{
+Mark type (point, box, plus, cross, diamond, hline, vline, hebar, vebar, circle)
+T}
+
+.T&
+l s s.
+T{
+Additional commands are available for setting graph formats and manipulating
+the graphics. Use the following commands for help.
+T}
+
+.T&
+l l s.
+\fL:/help\fR Print help for graph formatting option
+\fL:.help\fR Print cursor mode help
+
+.TE
+.PP
+The value of fog can be changed interactively if you have
+reason to override the value written in the database by \fIspotlist\fR.
+You can reset the fog to its original value with the command ":fog reset".
+A common problem in defining the HD curve is that some of
+the calibration spot densities fall below fog. This is caused either by
+the low signal-to-noise ratio at low densities or by a poor choice of
+where the fog level was scanned. These points are rejected from the fit
+when a transformation of the density is being made, as the transform cannot
+be evaluated for negative density. If the fog value or transformation
+type is interactively changed so this problem no longer exists,
+the spot densities are restored in the sample.
+
+The parameters of the final fit are written to a database which then
+contains the information
+necessary to reinitialize the curfit package for applying the transformation
+in \fIhdtoi\fR.
+
+.NH
+Applying the Transform
+.PP
+.NH 2
+HDTOI
+.PP
+Once the HD curve has been defined, it is applied to a density image
+in task \fIhdtoi\fR.
+Here the transformation is applied, as described by the fit parameters
+stored in the database. If more than one record of fit parameters is
+present, the last one is used. This means task \fIhdfit\fR can be
+repeated until an acceptable solution is found; the last solution will
+be used by \fIhdtoi\fR. On output, a new output image is written; the
+input image is left intact.
+.PP
+The transformation is accomplished by using a look-up table. All possible
+input values, from the minimum to maximum values found in the image, are
+converted to density using the scale value read from the database, and then
+to intensity using the fit parameters determined by \fIhdfit\fR. The input
+value is then the index into the intensity table:
+intensity = look_up_table (input_value).
+.PP
+A scaling factor can be applied to the final intensities, as typically
+they will be < 1.0. (The maximum log exposure in the NOAO wedge database
+is 0.0.) By default, a saturated density pixel will be assigned the "ceiling"
+intensity of 30000 and the other pixels are scaled accordingly.
+The user is responsible for choosing a ceiling value
+that will avoid having significant digits truncated.
+The precision
+of the transform is unaffected by scaling the
+final intensities, although caution must be used if the output image
+pixel type is an integer.
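+.LP
+In outline, building the table looks like the following Python sketch. It is
+illustrative only, not the \fIhdtoi\fR code: the fitted curve is passed in as
+a callable returning log exposure for a density above fog, densities at or
+below fog map to zero intensity (the default floor of 0.0), and intensity is
+taken as ten to the power of the fitted log exposure:
+.nf
+
+    import numpy as np
+
+    def build_lut(max_raw, scale, fog, curve, ceiling=30000.0):
+        raw = np.arange(max_raw + 1)
+        dens = raw * scale - fog             # raw value -> density above fog
+        logexp = np.array([curve(d) if d > 0 else -np.inf for d in dens])
+        inten = 10.0 ** logexp               # relative intensity, <= 1.0
+        return inten * (ceiling / inten[-1]) # saturated raw value -> ceiling
+
+    # lut = build_lut(maxad, scale, fog, curve)
+    # out_pixels = lut[in_pixels]            # index by the raw input value
+.fi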
+.PP
+The value of fog to be used is entered by the user, and can be either
+a number or a list of file names from which to calculate the fog value.
+The fog value is subtracted from the input image before the transformation
+takes place.
+Again, consider density values below fog. Two choices are available for
+these densities: the calculated intensity can be set to the constant
+value 0.0, or to -1.0 times the intensity determined for the absolute
+value of the density.
+
+.NH
+Aligning Related HD curves
+.PP
+Calibration data sets from several plates can be combined once a shift
+particular to each set has been removed. "Different spot exposures
+define a series of HD curves which are parallel but mutually separated
+by arbitrary shifts in log exposure, produced by differing lamp intensities
+or exposure times. Generally, Kodak spectroscopic plates can be
+averaged if [1] they come from the same emulsion batch and box, [2]
+they receive identical hypersensitizing, [3] they are exposed similarly and
+[4] they receive the same development." *
+.FS
+* "Averaging Photographic Characteristic Curves", John Kormendy, from
+"ESO Workshop on Two Dimensional Photometry", Edited by P. Crane and
+K.Kjar, p 69, (1980), an ESO Publication.
+.FE
+.NH 2
+HDSHIFT
+.PP
+Procedure \fIhdshift\fR calculates and subtracts a zero point shift to
+bring several related HD curves into alignment. The individual shifts
+are calculated by elimination of the first coefficient (Bevington, eqn 9-3):
+.EQ
+
+a0 = y bar - a sub 1 X bar - a sub 2 X bar sup 2 - ~ ...~ - a sub n X bar sup n
+
+.EN
+Here, the averages over y and X refer to individual calibration set averages;
+the coefficients a1, ... an were previously calculated using data from all
+calibration sets with task \fIhdfit\fR, and stored in the database. The
+a0 term is calculated individually for each database; this term represents
+the zero point shift in log exposure and will be different for each database.
+
+On output, the log exposure values in each database have been
+shifted to match the zero point of the first database in the list. The
+log exposure records are now aligned and it would be appropriate
+to run \fIhdfit\fR on the modified database list.
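+.LP
+A minimal Python sketch of the shift calculation follows. It is illustrative
+only and takes the formula above literally, with the shared coefficients
+a1 ... an from the combined \fIhdfit\fR solution:
+.nf
+
+    import numpy as np
+
+    def zero_point(logexp, transdens, coeffs):
+        # coeffs = [a1, ..., an] fit to all calibration sets together
+        ybar, xbar = np.mean(logexp), np.mean(transdens)
+        return ybar - sum(a * xbar ** (k + 1) for k, a in enumerate(coeffs))
+
+    # shift each calibration set onto the zero point of the first one
+    # a0_ref = zero_point(y[0], x[0], coeffs)
+    # y[i]  += a0_ref - zero_point(y[i], x[i], coeffs)
+.fi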
+.NH
+Testing the Transformation Algorithm
+.PP
+A test task is included to see if any numerical errors were introduced
+during the density to intensity transformation. It also evaluates
+truncation errors produced when an output image with integer pixels,
+rather than reals, is written.
+.NH 2
+SELFTEST
+.PP
+An intensity vector is generated from a density vector in two different
+ways. The first method uses the density vector and known coefficients
+to compute the intensity. The second method uses the curfit package
+to generate a look up table of intensities as done in task \fIhdtoi\fR. The
+residual of the two vectors is plotted; ideally the difference between
+the 'known' and 'calculated' intensity is zero.
+.PP
+Task \fIselftest\fR also plots intensity as a function of density for
+both integer and real output pixels. The user should investigate the
+plot with the cursor zoom and expand capabilities to determine if
+truncation errors are significant.
+.NH
+The Wedgefile Database
+.PP
+Task \fIdematch\fR reads a database and retrieves log exposure information
+for certain combinations of wedge number, photographic emulsion and filter.
+Those combinations included in the NOAO database are listed in the next
+section, although any calibration data can be included if the values are
+known. To modify the database, it is recommended that
+you generate a new file rather than add records to the existing file. This
+way, the modifications will not be lost when a new version of the IRAF
+system is released.
+
+In the database, the information for each wedge makes up a separate record;
+each record starts with the word \fBbegin\fR. Each record has a title field
+and can have multiple emulsion/filter fields. The number of log exposure
+values must be given, followed by the values written 8 per line. The order
+of the exposure data can be either monotonically increasing or decreasing.
+Here is an example:
+.DS
+begin 115
+ title MAYALL 4-M PF BEFORE 15APR74 (CHROME) [MP1-MP968]
+ IIIAJ/UG2 16
+ 0.000 -0.160 -0.419 -0.671 -0.872 -1.153 -1.471 -1.765
+ -2.106 -2.342 -2.614 -2.876 -3.183 -3.555 -3.911 -4.058
+ IIAO/UG2 16
+ 0.000 -0.160 -0.418 -0.670 -0.871 -1.152 -1.468 -1.761
+ -2.102 -2.338 -2.609 -2.870 -3.176 -3.547 -3.901 -4.047
+
+.DE
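+.LP
+A record in this format can be read with a few lines of code. The following
+Python sketch is illustrative only; it ignores record boundaries and assumes
+the emulsion/filter key never appears inside a title:
+.nf
+
+    def wedge_exposures(path, wedge, key):
+        tokens = open(path).read().split()
+        i = tokens.index("begin")              # find the requested record
+        while tokens[i + 1] != str(wedge):
+            i = tokens.index("begin", i + 1)
+        j = tokens.index(key, i)               # e.g. key = "IIIAJ/UG2"
+        n = int(tokens[j + 1])                 # number of exposure values
+        return [float(v) for v in tokens[j + 2 : j + 2 + n]]
+.fi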
+.NH 2
+Contents of the NOAO Wedgefile
+.LP
+The following table lists the wedge/emulsion/filter combinations available in
+the NOAO wedgefile database.
+.TS
+center;
+l l s s s
+l l l l l.
+
+\fBWedge 24 CTIO SCHMIDT WESTON TUBE SENSITOMETER. \fR
+ MONO/MONO
+
+.T&
+l l s s s
+l l l l l.
+\fBWedge 48 PALOMAR 48-INCH SCHMIDT STEP WEDGE. \fR
+ MONO/MONO
+
+.T&
+l l s s s
+l l l l l.
+\fBWedge 84 OLD 84-INCH SPOT SENSITOMETER (1967) \fR
+ MONO/MONO
+
+.TE
+.TS
+l l s s s
+l l l l l.
+\fBWedge 101 SPOT BOX 4, KEPT IN SCHOENING-S LAB. \fR
+ IIIAJ/UG2 IIAO/UG2 IIIAJ/*5113 IIAO/*5113
+ IIAO/GG385 IIIAJ/CLEAR IIIAJ/GG385 IIAD/GG495
+ 127/GG495 098/RG610 127/RG610 IVN/RG695
+ MONO/4363 MONO/4760 MONO/5200 MONO/5876
+ MONO/6470
+
+.T&
+l l s s s
+l l l l l.
+\fBWedge 115 MAYALL 4-M PF BEFORE 15APR74 (CHROME) [MP1-MP968] \fR
+ IIIAJ/UG2 IIAO/UG2 IIIAJ/*5113 IIAO/*5113
+ IIAO/GG385 IIIAJ/CLEAR IIIAJ/GG385 IIAD/GG495
+ 127/GG495 098/RG610 127/RG610 IVN/RG695
+ MONO/4363 MONO/4770 MONO/5200 MONO/5876
+ MONO/6470
+
+.T&
+l l s s s
+l l l l l.
+\fBWedge 117 CTIO 4-METER P.F. \fR
+ IIIAJ/UG2 IIAO/UG2 IIIAJ/*5113 IIAO/*5113
+ IIAO/GG385 IIIAJ/CLEAR IIIAJ/GG385 IIAD/GG495
+ 127/GG495 098/RG610 127/RG610 IVN/RG695
+ MONO/4363 MONO/4770 MONO/5200 MONO/5876
+ MONO/6470
+
+.T&
+l l s s s
+l l l l l.
+\fBWedge 118 CTIO 4-METER CASSEGRAIN \fR
+ IIIAJ/UG2 IIAO/UG2 IIIAJ/*5113 IIAO/*5113
+ IIAO/GG385 IIIAJ/CLEAR IIIAJ/GG385 IIAD/GG495
+ 127/GG495 098/RG610 127/RG610 IVN/RG695
+ MONO/4363 MONO/4760 MONO/5200 MONO/5876
+ MONO/6470 MONO/6900
+
+.T&
+l l s s s
+l l l l l.
+\fBWedge 119 SPOT BOX 5, KEPT AT MAYALL 4-METER. \fR
+ IIIAJ/UG2 IIAO/UG2 IIIAJ/*5113 IIAO/*5113
+ IIAO/GG385 IIIAJ/CLEAR IIIAJ/GG385 IIAD/GG495
+ 127/GG495 098/RG610 127/RG610 IVN/RG695
+ MONO/4363 MONO/4760 MONO/5200 MONO/5876
+ MONO/6470
+
+.T&
+l l s s s
+l l l l l.
+\fBWedge 120 SPOT BOX 6, KEPT AT 2.1-METER. \fR
+ IIIAJ/UG2 IIAO/UG2 IIIAJ/*5113 IIAO/*5113
+ IIAO/GG385 IIIAJ/CLEAR IIIAJ/GG385 IIAD/GG495
+ 127/GG495 098/RG610 127/RG610 IVN/RG695
+ MONO/4363 MONO/4760 MONO/5200 MONO/5876
+ MONO/6470
+
+.T&
+l l s s s
+l l l l l.
+\fBWedge 121 SPOT BOX 8, KEPT IN SCHOENING'S LAB. \fR
+ IIIAJ/UG2 IIAO/UG2 IIIAJ/*5113 IIAO/*5113
+ IIAO/GG385 IIIAJ/CLEAR IIIAJ/GG385 IIAD/GG495
+ 127/GG495 098/RG610 127/RG610 IVN/RG695
+ MONO/4363 MONO/4760 MONO/5200 MONO/5876
+ MONO/6470
+
+.T&
+l l s s s
+l l l l l.
+\fBWedge 122 SPOT BOX 7, AVAILABLE AT KPNO NIGHT ASST'S OFFICE \fR
+ IIIAJ/UG2 IIAO/UG2 IIIAJ/*5113 IIAO/*5113
+ IIAO/GG385 IIIAJ/CLEAR IIIAJ/GG385 IIAD/GG495
+ 127/GG495 098/RG610 127/RG610 IVN/RG695
+ MONO/4363 MONO/4760 MONO/5200 MONO/5876
+ MONO/6470C
+.TE
+.TS
+l l s s s
+l l l l l.
+\fBWedge 123 MAYALL 4-M P.F. 15APR74 TO 21MAY74 [MP969-MP1051] \fR
+ IIIAJ/UG2 IIAO/UG2 IIIAJ/*5113 IIAO/*5113
+ IIAO/GG385 IIIAJ/CLEAR IIIAJ/GG385 IIAD/GG495
+ 127/GG495 098/RG610 127/RG610 IVN/RG695
+ MONO/4363 MONO/4770 MONO/5200 MONO/5876
+ MONO/6470
+
+.T&
+l l s s s
+l l l l l.
+\fBWedge 129 MAYALL 4-METER P.F. AFTER 21MAY74 [MP1052--> ] \fR
+ IIIAJ/UG2 IIAO/UG2 IIIAJ/*5113 IIAO/*5113
+ IIAO/GG385 IIIAJ/CLEAR IIIAJ/GG385 IIAD/GG495
+ 127/GG495 098/RG610 127/RG610 IVN/RG695
+ MONO/4363 MONO/4760 MONO/5200 MONO/5876
+ MONO/6470
+
+.T&
+l l s s s
+l l l l l.
+\fBWedge 130 MAYALL 4-METER CASS CAMERA. \fR
+ IIIAJ/UG2 IIAO/UG2 IIIAJ/*5113 IIAO/*5113
+ IIAO/GG385 IIIAJ/CLEAR IIIAJ/GG385 IIAD/GG495
+ 127/GG495 098/RG610 127/RG610 IVN/RG695
+ MONO/4363 MONO/4760 MONO/5200 MONO/5876
+ MONO/6470
+
+.T&
+l l s s s
+l l l l l.
+\fBWedge 138 TRAVELLING BOX AFTER 06JAN78. \fR
+ IIIAJ/UG2 IIAO/UG2 IIIAJ/*5113 IIAO/*5113
+ IIAO/GG385 IIIAJ/CLEAR IIIAJ/GG385 IIAD/GG495
+ 127/GG495 098/RG610 127/RG610 IVN/RG695
+ MONO/4363 MONO/4770 MONO/5200 MONO/5876
+ MONO/6470
+
+.T&
+l l s s s
+l l l l l.
+\fBWedge 201 TEN UCLA SPOTS (H. FORD, 10JAN78) \fR
+ MONO/MONO
+.TE
diff --git a/noao/imred/dtoi/doc/dtoi.toc b/noao/imred/dtoi/doc/dtoi.toc
new file mode 100644
index 00000000..cd794321
--- /dev/null
+++ b/noao/imred/dtoi/doc/dtoi.toc
@@ -0,0 +1,34 @@
+.LP
+.DS C
+\fBTable of Contents\fR
+.DE
+.sp 3
+1.\h'|0.4i'\fBIntroduction\fP\l'|5.6i.'\0\01
+.sp
+2.\h'|0.4i'\fBDetermining the HD Curve Data \fP\l'|5.6i.'\0\01
+.br
+\h'|0.4i'2.1.\h'|0.9i'SPOTLIST\l'|5.6i.'\0\02
+.br
+\h'|0.4i'2.2.\h'|0.9i'DEMATCH\l'|5.6i.'\0\02
+.sp
+3.\h'|0.4i'\fBFitting the Curve\fP\l'|5.6i.'\0\02
+.br
+\h'|0.4i'3.1.\h'|0.9i'HDFIT\l'|5.6i.'\0\02
+.br
+\h'|0.9i'3.1.1.\h'|1.5i'Interactive Curve Fitting\l'|5.6i.'\0\03
+.sp
+4.\h'|0.4i'\fBApplying the Transform\fP\l'|5.6i.'\0\05
+.br
+\h'|0.4i'4.1.\h'|0.9i'HDTOI\l'|5.6i.'\0\05
+.sp
+5.\h'|0.4i'\fBAligning Related HD curves\fP\l'|5.6i.'\0\06
+.br
+\h'|0.4i'5.1.\h'|0.9i'HDSHIFT\l'|5.6i.'\0\06
+.sp
+6.\h'|0.4i'\fBTesting the Transformation Algorithm\fP\l'|5.6i.'\0\06
+.br
+\h'|0.4i'6.1.\h'|0.9i'SELFTEST\l'|5.6i.'\0\06
+.sp
+7.\h'|0.4i'\fBThe Wedgefile Database\fP\l'|5.6i.'\0\06
+.br
+\h'|0.4i'7.1.\h'|0.9i'Contents of the NOAO Wedgefile\l'|5.6i.'\0\07
diff --git a/noao/imred/dtoi/doc/hdfit.hlp b/noao/imred/dtoi/doc/hdfit.hlp
new file mode 100644
index 00000000..0b55137e
--- /dev/null
+++ b/noao/imred/dtoi/doc/hdfit.hlp
@@ -0,0 +1,79 @@
+.help hdfit Mar88 imred.dtoi
+.ih
+NAME
+hdfit -- fit characteristic curve to density, exposure data
+.ih
+USAGE
+hdfit database
+.ih
+PARAMETERS
+.ls database
+Database[s] containing the density, log exposure information.
+.le
+.ls function = "power"
+Type of curve to fit; chosen from "power", "legendre", "chebyshev",
+"spline1" or "spline3". Abbreviations are permitted.
+.le
+.ls transform = "logopacitance"
+Transformation performed on the density prior to fitting. Chosen from
+"none", "logopacitance", "k50" or "k75".
+.le
+.ls weighting = "none"
+Weights can be assigned to the independent variable for fitting a curve.
+Choices are "none", "user" and "calculated".
+.le
+.ls order = 4
+Order of the fit.
+.le
+.ls interactive = yes
+Fit the data interactively?
+.le
+.ls device = "stdgraph"
+Interactive graphics device.
+.le
+.ls cursor = "stdgcur"
+Source of cursor input.
+.le
+.ih
+DESCRIPTION
+Task \fIhdfit\fR is used to fit a curve to density and log exposure
+values in preparation for transforming an image from density to intensity.
+The log exposure and density are read from \fBdatabase\fR.
+More than one database can be input,
+in which case one curve is fit to the combined data and the results
+written to each database in the list.
+
+Weights can be applied to the independent variable of the fit.
+Weights can be changed interactively, and are initially chosen from
+"none", "user" and "calculated". A weight value can
+be calculated from the standard deviations, read from \fBdatabase\fR,
+as weight = (normalized density) / sdev. If user weights are to be
+used, they are read from \fBdatabase\fR record "weights" as "wts_vals"
+entries.
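+
+For illustration only, one plausible form of the calculated weight, assuming
+the normalization is by the saturated density ("maxden") recorded by
+\fIspotlist\fR:
+.nf
+
+    weight[i] = (density[i] / maxden) / sdev[i]
+.fi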
+
+When \fBinteractive\fR = yes, the HD curve is plotted and the cursor
+made available for interactively examining and altering the fit.
+The fitting function, transformation and order can be modified; data
+points can be added, deleted or edited. Four choices of independent
+variable are available in \fBhdfit\fR by means of the parameter
+\fBtransform\fR. If no transformation is applied, the independent
+variable is the density itself. Other choices are the log
+opacitance or a Kaiser transform with alpha = 0.50 or 0.75. The
+default choice is to fit log exposure as a function of the log opacitance;
+this is traditionally known as the Baker-Seidel function.
+.ih
+EXAMPLES
+.nf
+Using the defaults as starting parameters, interactively fit a curve to
+the data points in db1.
+
+ cl> hdfit db1
+
+A sixth order power series function is fit in batch mode to the db1 data.
+
+ cl> hdfit db1 order=6 interactive-
+.fi
+.ih
+SEE ALSO
+spotlist, dematch, hdtoi
+.endhelp
diff --git a/noao/imred/dtoi/doc/hdshift.hlp b/noao/imred/dtoi/doc/hdshift.hlp
new file mode 100644
index 00000000..aaa59063
--- /dev/null
+++ b/noao/imred/dtoi/doc/hdshift.hlp
@@ -0,0 +1,50 @@
+.help hdshift Feb87 imred.dtoi
+.ih
+NAME
+hdshift -- calculate and subtract zero point to align HD curves
+.ih
+USAGE
+hdshift database
+.ih
+PARAMETERS
+.ls database
+Input list of databases containing density, exposure and fit information.
+.le
+.ih
+DESCRIPTION
+For each file in \fBdatabase\fR, procedure \fBhdshift\fR calculates and
+subtracts a zero point shift to bring several related HD curves into
+alignment. The individual shifts are calculated by elimination of the
+first coefficient (Bevington, eqn 9-3):
+.nf
+             _      _      _               _
+        a0 = y - a1*X - a2*X**2 - ... - an*X**n
+
+.fi
+Here, the averages over y and X refer to individual \fBdatabase\fR averages;
+the coefficients a1, ... an were previously calculated using data from all
+\fBdatabase\fRs, in task \fIhdfit\fR, and stored in the database. The
+a0 term is calculated individually for each database; this term represents
+the zero point shift in log exposure and will be different for each database.
+
+On output, the log exposure values in each \fBdatabase\fR have been
+shifted to match the zero point of the first database in the list. The
+log exposure records are now aligned and it would be appropriate
+to run task \fIhdfit\fR on the modified \fBdatabase\fR list and
+determine the common solution.
+.ih
+EXAMPLES
+.nf
+
+Shift the curves in four databases to a common zero point.
+
+ cl> hdshift db1,db2,db3,db4
+.fi
+.ih
+SEE ALSO
+hdfit, hdtoi
+.br
+"Averaging Photographic Characteristic Curves", John Kormendy, from
+"ESO Workshop on Two Dimensional Photometry", Edited by P. Crane and
+K.Kjar, p 69, (1980), an ESO Publication.
+.endhelp
diff --git a/noao/imred/dtoi/doc/hdtoi.hlp b/noao/imred/dtoi/doc/hdtoi.hlp
new file mode 100644
index 00000000..bf4355f0
--- /dev/null
+++ b/noao/imred/dtoi/doc/hdtoi.hlp
@@ -0,0 +1,88 @@
+.help hdtoi May88 imred.dtoi
+.ih
+NAME
+hdtoi -- transform images according to hd curve
+.ih
+USAGE
+hdtoi input output database
+.ih
+PARAMETERS
+.ls input
+List of images to be transformed.
+.le
+.ls output
+List of output image names.
+.le
+.ls database
+Name of text database describing HD curve.
+.le
+.ls fog = ""
+Value of fog level, read from database if unspecified.
+.le
+.ls option = "mean"
+Option for calculating fog density when \fBfog\fR is a file list; can be
+either "mean" or "median".
+.le
+.ls sigma = 3.0
+If \fBfog\fR is a file name, and \fBoption\fR = "mean", the mean fog density
+is iteratively calculated using this rejection criterion.
+.le
+.ls floor = 0.0
+Value assigned to levels below fog, can be either 0.0 or -1.0.
+.le
+.ls ceiling = 30000.
+The final intensities are scaled to this value, such that a saturated
+input density equals \fBceiling\fR on output.
+.le
+.ls datatype = "r"
+Datatype of output image pixels.
+.le
+.ls verbose = yes
+Print log of processing to STDOUT.
+.le
+.ih
+DESCRIPTION
+Task \fIhdtoi\fR transforms one image to another as described by the
+\fBdatabase\fR. There is only one HD curve per run; the same
+transformation is applied to all input images.
+
+The fog value can be obtained in three ways: read from the database, read
+as a floating point number, or calculated from a list of fog images. If
+parameter \fBfog\fR is not specified, the fog value is read from
+\fBdatabase\fR. If \fBfog\fR is specified, it can be entered
+as either a floating point number or as a list of file names. If the
+value cannot be read as a number, it is assumed to be a file name. In that
+case, the density of each file in the fog list is calculated and the
+average of these values is subtracted from \fBinput\fR before processing.
+The algorithm used to calculate the fog density is selected by the
+\fBoption\fR parameter, and is either a "mean" or "median" calculation.
+The fog density can be the mean value after pixels more than the specified
+number of sigma have been rejected, or the median value of all the fog spot
+pixels.
+
+The fog value is subtracted from the input image before the transformation
+takes place. It is possible that some density values will fall below
+the fog level; these values are handled in one of two ways. Values
+below the fog value are set equal to 0.0 when \fBfloor\fR = 0.0. If
+\fBfloor\fR = -1.0, the resulting intensity is -1.0 times the intensity
+computed for the absolute value of the density.
+
+A scaling factor is applied to the final intensities, as typically
+they will be < 1.0. The \fBceiling\fR parameter is used to specify what
+value a saturated density is transformed to; all intensities are scaled
+to this upper limit. The precision of the transformation is unaffected by
+this parameter, although caution must be used if the output image pixel
+type is an integer. The user is responsible for choosing
+a \fBceiling\fR that avoids the truncation of significant digits.
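+
+Schematically, the per-pixel handling described above is as follows. This is
+a Python-style sketch, not the task's code; "intensity_of" stands for the
+fitted density-to-intensity curve and is hypothetical:
+.nf
+
+    dens = raw * scale - fog
+    if dens <= 0.0:                        # at or below the fog level
+        inten = 0.0 if floor == 0.0 else -intensity_of(abs(dens))
+    else:
+        inten = intensity_of(dens)
+    out = inten * (ceiling / intensity_of(saturated_density))
+.fi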
+.ih
+EXAMPLES
+Convert three density images to intensity images as described in database db1.
+.nf
+
+    cl> hdtoi denin* intim1,intim2,intim3 db1
+.fi
+.ih
+TIME REQUIREMENTS
+Task \fIhdtoi\fR requires 20 cpu seconds to transform a 512 square image, with
+a 12 bit data range, on a VAX 750.
+.ih
+SEE ALSO
+spotlist, dematch, hdfit
+.endhelp
diff --git a/noao/imred/dtoi/doc/selftest.hlp b/noao/imred/dtoi/doc/selftest.hlp
new file mode 100644
index 00000000..329c9099
--- /dev/null
+++ b/noao/imred/dtoi/doc/selftest.hlp
@@ -0,0 +1,81 @@
+.help selftest Feb87 imred.dtoi
+.ih
+NAME
+selftest -- test routine to verify \fIdtoi\fR transformation
+.ih
+USAGE
+selftest nbits
+.ih
+PARAMETERS
+.ls nbits = 12
+Dynamic range of data to test.
+.le
+.ls device = "stdgraph"
+Plotting device for graphical output.
+.le
+.ls verbose = no
+A table of density, intensity values is printed if \fBverbose\fR = yes.
+.le
+.ls ceiling = 30000.
+Maximum intensity to output.
+.le
+.ls max_raw = 0
+The maximum raw data value. Needed only if \fInbits\fR equals something
+other than 12, 15 or 0.
+.le
+.ls scale = 0.0
+The raw data value to density scale value. Needed only if \fInbits\fR
+equals something other than 12, 15, or 0.
+.le
+
+.ih
+DESCRIPTION
+Task \fIselftest\fR is a test program for the \fIdtoi\fR package. Its
+output can be examined to see if numerical errors are introduced during
+the density to intensity transformation. It also evaluates truncation
+errors produced when an output image with integer pixels is written.
+
+Many different PDS setups can be investigated with task \fBselftest\fR.
+Setting parameter \fInbits\fR = 12
+indicates PDS format data, with data range 0 to 3071. Setting \fInbits\fR = 15
+indicates FITS format data, with data range 0 to 24575. The special value of
+\fInbits\fR = 0 means a small test data range from 1 to 144 is investigated.
+If any other value of \fInbits\fR is entered, the user is queried for the
+max raw data values and the raw data to density scaling factor.
+
+An intensity vector is generated from a density vector in two different ways.
+The first method uses the density vector and known coefficients to compute
+the intensity. The second method uses the curfit package to generate a
+look up table of intensities as done in task \fBHDTOI\fR. The residual
+of the two intensity vectors is plotted. Ideally, the difference between
+the 'known' intensities and 'calculated' intensities is zero.
+
+The second plot output by \fBselftest\fR shows intensity as a function
+of density. Two lines are overplotted; integer intensity versus density
+and real intensity versus density. Because truncation errors are most
+pronounced at low density values, the plot covers only the lowest 5%
+of the density range. The user should investigate the plot with the
+cursor zoom and expand capabilities to determine if truncation errors
+are significant.
+
+In verbose mode, \fBselftest\fR produces a three-column table of raw
+data value, density and calculated intensity.
+
+.ih
+EXAMPLES
+
+.nf
+Run task selftest for 12 bit data with plots appearing on the terminal.
+
+ cl> selftest
+
+.fi
+Run selftest in verbose mode, spooling the output to file 'ditable'. This
+file is then run through the 'fields' task to extract the density and intensity
+columns, which are piped to \fIgraph\fR. The result is a plot of the look up table.
+.nf
+
+ cl> selftest ver+ > ditable
+ cl> fields ditable 2,3 | graph xlab=Density ylab=Intensity
+.fi
+.endhelp
diff --git a/noao/imred/dtoi/doc/splotlist.hlp b/noao/imred/dtoi/doc/splotlist.hlp
new file mode 100644
index 00000000..43b3f223
--- /dev/null
+++ b/noao/imred/dtoi/doc/splotlist.hlp
@@ -0,0 +1,81 @@
+.help spotlist May88 imred.dtoi
+.ih
+NAME
+spotlist -- calculate densities of calibration spots
+.ih
+USAGE
+spotlist spots fogs database
+.ih
+PARAMETERS
+.ls spots
+List of image files containing the calibration data.
+.le
+.ls fogs
+List of image files containing fog spots.
+.le
+.ls database
+Name for output database.
+.le
+.ls scale = 0.00151 # (4.65 / 3071.)
+The scale factor to convert values in the image files to densities, such
+that scale = density / input_value.
+.le
+.ls maxad = 3071
+The maximum A/D value, that is, the input value corresponding to a
+saturated pixel.
+.le
+.ls option = "mean"
+Option for calculating densities; can be either "mean" or "median".
+.le
+.ls sigma = 3.0
+Rejection criteria for iteratively calculating mean density.
+.le
+.ih
+DESCRIPTION
+Task \fIspotlist\fR reads calibration spot images and calculates their
+density and standard deviation. Three records are entered in the
+database: density, standard deviation and number of unrejected pixels.
+Each record contains as many entries as calibration spots.
+
+All input values are multiplied by the \fBscale\fR parameter to convert
+them to densities. The value of \fBscale\fR is not critical to the
+reductions; it is provided so that \fIspotlist\fR output can be in the
+familiar units of density. The default value of \fBscale\fR is correct
+for NOAO PDS data written to a PDS format tape. If a FITS format tape was
+written, \fBscale\fR = 0.0001893. These values are appropriate for the PDS
+with its new 15-bit logarithmic A to D converter. The value of \fBscale\fR
+used is also entered in the database.
+
+Parameter \fBmaxad\fR is the integer input value that represents a
+saturated pixel. This value is used by \fIspotlist\fR to accurately
+calculate the density of a saturated pixel, which is then entered in the
+database. This value of "maxden" will later be used by task \fIhdfit\fR
+to normalize the independent variable vector, and by task \fIhdtoi\fR to
+scale the intensity range precisely to a user specified value.
+
+A fog level is calculated from image \fBfogs\fR, and entered into
+the database file. If more than one image is given for \fBfogs\fR,
+a single fog value is calculated from all fog pixels. The fog level
+is calculated but not subtracted in this procedure. The fog images to be
+averaged should be the same size. An entry for the fog level is made
+in the database.
+
+The \fBspots\fR files are assumed to be ordered such that they are either
+monotonically increasing or decreasing in density, with no omitted spots
+between the first and last measured. The spot density can be
+calculated in two ways; the algorithm used is selected by the \fBoption\fR
+parameter. The density is either the mean spot value after pixels more
+than the specified number of sigma from the mean value have been rejected,
+or the median value of all the spot pixels.
+.ih
+EXAMPLES
+Calculate mean densities of calibration spots which had previously been
+read in from a FITS format tape. The database "db1" will be created.
+
+.nf
+ cl> spotlist spots* fogspot db1 scale=1.893e-4
+.fi
+.ih
+SEE ALSO
+dematch, hdfit, hdtoi
+.endhelp