Chapter 4
Applications Packages
When you run ICL and type:
you will be shown two lists of packages. The first contains Standard packages which are available at all Starlink
sites. The other contains Options which are made available at a site only on request.
This Chapter gives a brief overview of each package, while Chapter 20 describes the specific commands available within each package. Further information about each package can be obtained from its associated Starlink User Note (SUN), which is referenced at the top of each section. Starlink software as a whole is described in SUN/1.
An important part of the rationalization of Starlink software made possible by the coming of ADAM concerns data structures. The Hierarchical Data System (HDS) (see Chapter 10) is very flexible, and is capable of creating an infinite variety of data structures. Without a recommended standard structure, there would be a danger of programmers writing applications which could not read each other's data. When the standard is also implemented in a small number of routines, the restrictions it imposes make programming easier. The main unifying theme of Starlink applications is the standard data format defined by Starlink; this is the Extensible n-Dimensional Data Format (NDF), described in Section 10.2. It is centred on an n-dimensional data array that can store most astronomical data, such as spectra, images, and spectral-line data cubes. The NDF may also contain such information as a title, axis labels and units, and error and quality arrays. There is also a place to store ancillary data associated with the data array. This could include information about the original observing set-up, such as the airmass during the observation or the temperature of the detector, as well as calibration data or results produced during processing, for example spectral-line fits. Groups of related parameters not defined by the NDF format itself are held in extensions.
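As a rough sketch of this layout, the contents of an NDF can be pictured as a structure listing of the kind produced by the HDSTRACE utility. The component names below are the standard NDF ones described in Section 10.2; the file name, array sizes and values are invented purely for illustration, and the layout is schematic rather than exact HDSTRACE output:

     IMAGE  <NDF>
        TITLE                 'Example galaxy field'   (title)
        LABEL                 'Intensity'              (data label)
        UNITS                 'Counts'                 (data units)
        DATA_ARRAY(256,256)   <_REAL>                  (the main n-dimensional array)
        VARIANCE(256,256)     <_REAL>                  (errors, stored as variances)
        QUALITY               <QUALITY>                (quality flags)
        AXIS(2)               <AXIS>                   (axis centres, labels and units)
        MORE                  <EXT>                    (extensions, e.g. instrument data)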
A key component of Starlink software is KAPPA (Kernel Application Package). This does not process
non-standard extensions, but neither does it lose them — it copies them to any NDFs which it creates. Other
application packages may be able to process some, but not all, extensions. It is hoped that such packages will
use KAPPA applications as templates, in procedures, or directly as appropriate.
Each section below contains a sketch of how to use the application being described. In these sketches, a line beginning with the $ prompt means “issue the command from DCL”, while a line beginning with the ICL> prompt means “issue it from ICL”. Remember that before using any of these packages, the appropriate startup commands should have been executed. Likewise, to run ICL you should type ICL at the DCL prompt to get the ICL> prompt.
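For example, taking the FIGARO startup command from Section 4.5 purely as an illustration, the same action appears in the two forms:

     $ FIGARO            (issue the command from DCL)
     ICL> FIGARO         (issue the same command from within ICL)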
The packages described in this chapter are listed below ordered by function. This should help you find a
package which is appropriate for a particular purpose. The packages are then described in alphabetical order on
separate pages.
- Image Analysis & Photometry
     - KAPPA — Kernel applications
     - DAOPHOT — Stellar photometry
     - PHOTOM — Aperture photometry
     - PISA — Object finding and analysis
- Spectroscopy
     - FIGARO — General spectral reduction
     - SPECDRE — Spectroscopy data reduction
- Specific Wavelengths
     - ASTERIX — X-ray data analysis
- Specific Instruments
     - CCDPACK — CCD data reduction
     - IRCAM — Infrared camera data reduction
- Polarimetry
     - TSP — Time series and polarimetry analysis
- Database Management
     - SCAR — Catalogue data base system
- Utilities
     - CONVERT — Data format conversion
     - SST — Simple software tools
4.1 ASTERIX — X-ray data analysis
[SUN/98]
This is a collection of programs for analysing astronomical data in the X-ray waveband. Many of the programs are general purpose and can analyse any data in the correct format. The core of the package is instrument independent; interfaces are currently provided for the EXOSAT and ROSAT instruments.
ASTERIX data are stored in HDS files and are therefore compatible with all ADAM packages. There are
basically two different types: binned and event datasets. Binned data (e.g. time series, spectra, images) are stored
in files whose structure is based on the Starlink standard NDF format (SGP/38). Data errors (stored in
the form of variances) and quality are catered for. Event data sets store information about a set of
photon ‘events’. Each event will have a set of properties, e.g. X position, Y position, time, raw pulse
height.
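Schematically, and purely as an illustration of the two types (the actual HDS component names are defined in SUN/98, not here):

     Binned dataset (e.g. an image)         Event dataset
        data array of counts                  X position        (one value per event)
        variances and quality                 Y position        (one value per event)
        axis information                      arrival time      (one value per event)
                                              raw pulse height  (one value per event)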
The input data are first passed through an instrument interface. Event data are then processed and binned, after which the binned data are analysed. Finally, graphical output is generated.
The commands may be classified as follows:
- Interface to a particular instrument (EXOSAT, ROSAT, etc).
- Event dataset and binned dataset processing.
- Data conversion and display.
- Mathematical manipulations.
- Time series analysis.
- Image processing.
- Spectral analysis.
- Statistical analysis.
- Data quality analysis.
- HDS editor.
- Source searching.
- Graphical and textual display.
To run ASTERIX:
- from ICL:
     $ ASTSTART
     $ ICL
     ICL> HELP ASTERIX
     ICL> ASTERIX
     ICL> ASTHELP
     ICL> (any ASTERIX command)
- from DCL:
     $ ASTSTART
     $ ASTHELP
     $ (most, but not all, ASTERIX commands)
Demonstration:
- A demonstration session is described in SUN/98.
4.2 CCDPACK — CCD data reduction
[SUN/139]
A package to perform the initial processing of CCD data. Its main advantage over previous methods is its
enhanced functionality. It processes large amounts of data easily and efficiently, with a minimum of effort on the
user’s part.
It includes routines for performing:
- Bias calibration data preparation.
- Bias subtraction.
- Flash and dark calibration data preparation.
- Flash and dark-count correction.
- Flatfield data preparation.
- Flatfield correction.
The following features are of particular note:
- Accesses lists of NDFs, using wildcards and by using names stored in text files (a sketch follows this list).
- Logs the progress of a reduction sequence.
- Processes all non-complex numeric HDS data types.
- Supports many data-combination techniques.
- Takes full account of statistical uncertainty using variance production and propagation.
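As a hedged sketch of these list-handling features: MAKEBIAS and DEBIAS are CCDPACK commands described in SUN/139, but the file names, the particular use of a wildcard and of indirection through a text file, and the exact parameter names shown here are assumptions for illustration only.

     ICL> MAKEBIAS IN=^bias_list.dat OUT=master_bias
          (combine the bias frames named in the text file bias_list.dat)
     ICL> DEBIAS IN=run* OUT=*_db BIAS=master_bias
          (bias-subtract every NDF matching run*, writing run1_db, run2_db, ...)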
To run CCDPACK:
- from ICL:
     ICL> HELP CCDPACK
     ICL> CCDPACK
     ICL> <individual commands>
     ICL> HELP <command>
- from DCL:
     $ CCDPACK
     $ CCDHELP
     $ (any CCDPACK commands)
4.3 CONVERT — Data format conversion
[SUN/55]
This package converts data between the Starlink standard n-dimensional data format (NDF) and other formats. Currently, it can handle three data formats:
- DIPSO.
- FIGARO (version 2).
- INTERIM (BDF).
To run CONVERT:
- from ICL:
     ICL> CONVERT
     ICL> (any of the CONVERT programs)
- from DCL:
     Simply type the name of the conversion program you want.
Demonstration:
- This will convert an NDF file into BDF format, as used by the earlier Interim environment.
     ICL> CONVERT
     ICL> NDF2BDF
     NDF - Name of NDF to be converted > adam_examples:image
     BDF - BDF filename > image
     ICL>
  A file IMAGE.BDF will have been created in your current directory.
4.4 DAOPHOT — Stellar photometry
[SUN/42]
A stellar photometry package written by Peter Stetson at the Dominion Astrophysical Observatory, Victoria, B.C.,
Canada and adapted for use under ADAM. It performs the following tasks:
- Finding objects.
- Aperture photometry.
- Obtaining the point spread function.
- Profile-fitting photometry.
Profile fitting in crowded regions is performed iteratively, which improves the accuracy of the photometry. It
does not directly use an image display (which aids portability), although three additional routines allow results
to be displayed on an image device. It uses image data in NDF format.
To run DAOPHOT:
- from ICL:
- from DCL:
     $ DAOPHOT
     Command: HELP
     Command: ...
     Command: EXIT
N.B. If you enter ‘DAOPHOT’ just to see what happens, you will be disappointed to receive the message ‘Value unacceptable -- please re-enter’. You will need to read the documentation and user manual before you can make any progress with this program.
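A rough sketch of a minimal session is given below. ATTACH, FIND, PHOT and EXIT are standard DAOPHOT commands (see SUN/42), but the frame name is invented and each command will prompt for further input and output files that are not shown here.

     $ DAOPHOT
     Command: ATTACH myframe        (select the image to be measured)
     Command: FIND                  (locate star-like objects)
     Command: PHOT                  (aperture photometry of the objects found)
     Command: EXIT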
4.5 FIGARO — General spectral reduction
[SUN/86]
This is a general data reduction system written by Keith Shortridge at Caltech and the AAO. Most people find it of
greatest use in the reduction of spectroscopic data, though it also has powerful image and data cube
manipulation facilities. Starlink recommends FIGARO as the most complete spectroscopic data reduction
system in the Collection. Examples of its facilities are:
- Analyse absorption lines interactively.
- Aperture photometry.
- Calibrate B stars.
- Calibrate flat fields.
- Calibrate using flux calibration standards.
- Calibrate wavelengths of spectra.
- Correct S-distortion.
- Extract spectra from images and images from data cubes, and insert spectra into images
and images into data cubes.
- Extract spectra from images taken using optical fibres.
- Fit Gaussians to lines in a spectrum interactively.
- Generate and apply a spectrum of extinction coefficients.
- Input, output, and display data.
- Look at the contents of data arrays, other than graphically.
- Manipulate complex data structures (mainly connected with Fourier transforms).
- Manipulate data arrays ‘by hand’.
- Manipulate images and spectra (arithmetic and more complicated).
- Process data taken using FIGS (the AAO’s Fabry-Perot Infra-Red Grating Spectrometer).
- Process echelle data, in particular the UCL echelle in use at the AAO.
At present, a number of related packages are bundled with FIGARO. In future, these may be released as
separate items. They include:
- TWODSPEC [SUN/16]
This reduces and analyses long-slit and optical-fibre array spectra. A number of its functions are
useful outside the area of spectroscopy. The main application areas are:
- Line profile analysis; LONGSLIT analyses calibrated long-slit spectra. For example,
it can fit Gaussians, either manually or automatically, in batch. It can handle data
with two spatial dimensions, such as TAURUS data. FIBDISP provides further
options useful for such data, although it is primarily designed for fibre array data.
An extensive range of options is available, especially for output.
- Two-dimensional arc calibration.
- Geometrical distortion correction; S-distortion and line curvature.
- Conversion between FIGARO and IRAF data formats.
- Display programs.
- Removing continua.
To run FIGARO:
- from ICL:
     ICL> HELP FIGARO
     ICL> FIGARO
     ICL> HELP FIGARO CLASSIFIED
     ICL> (any FIGARO commands)
- from DCL:
     $ FIGARO
     $ HELP FIGARO
     $ HELP FIGARO CLASSIFIED
     $ (any FIGARO commands)
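As a hedged illustration of a typical fragment of spectral reduction: EXTRACT (add rows of an image together to form a spectrum) and SPLOT (plot a spectrum) are FIGARO commands, but the file names and the exact parameter names used below are assumptions rather than a worked example from SUN/86.

     ICL> FIGARO
     ICL> EXTRACT IMAGE=myframe YSTART=100 YEND=110 SPECTRUM=myspect
     ICL> SPLOT SPECTRUM=myspect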
4.6 IRCAM — Infrared camera data reduction
[SUN/41]
This package reduces, displays, and analyses 2-dimensional images from the UKIRT infrared camera (IRCAM).
The image data reduction facilities available are:
- Mathematical and statistical operations.
- Size changing and mosaicking.
- Inspection.
- Interpolation.
- Smoothing.
- Feature enhancement.
- Bad pixel removal.
- Polarimetry.
- Median filtering of flat-fields.
The graphics and image display facilities available are:
- Image display of various types (PLOT, CONTOUR, NSIGMA, RANPLOT).
- Display cursor position and value.
- Colour control.
- Line graphics such as 1-dimensional cuts/slices through images and contour maps.
- Annotation.
To run IRCAM:
- from ICL:
     Cannot be run from ICL (it uses the older ADAMCL).
- from DCL:
     $ IRCAM_SETUP
     $ IRCAM_CLRED
     (you are then asked to say where your data are
      and which plotting device you want to use)
     Ircam-CLRED : > ?
     (produces a list of commands)
     Ircam-CLRED : > (select commands)
     .
     Ircam-CLRED : > EXIT
4.7 KAPPA — Kernel applications
[SUN/95]
The Kernel Application Package runs under ADAM, using the NDF data format, and provides general-purpose
applications. It is the backbone of the software reorganization around the ADAM environment, and its
applications integrate with other packages such as PHOTOM, PISA, and FIGARO. It is usable as a
single large program from the ADAM command language ICL, or as individual applications from
DCL.
It handles bad pixels, and processes quality and variance information within NDF data files. Although oriented
towards image processing, many applications will work on NDFs of arbitrary dimension. Its graphics are device
independent. Currently, KAPPA has about 140 commands and provides the following facilities for data
processing:
- Generation of NDFs and ASCII tables by up-to-date FITS readers.
- Generation of test data, and NDF creation from ASCII files.
- Setting NDF components.
- Arithmetic, including a powerful application that handles expressions.
- Editing pixels and regions, including polygons and circles, and re-flagging bad pixels
by value or by median filtering.
- Configuration changing: flip, rotate, shift, subset, dimensionality.
- Image mosaicking; normalization of NDF pairs.
- Compression and expansion of images.
- Filtering: box, Gaussian, and median smoothing; very efficient Fourier transform,
maximum-entropy deconvolution.
- Surface fitting.
- Statistics, including ordered statistics, histogram, pixel-by-pixel statistics over a
sequence of images.
- Inspection of image values.
- Centroiding of features, particularly stars; stellar PSF fitting.
- Detail enhancement via histogram equalization, Laplacian convolution, edge
enhancement via a shadow effect, thresholding.
There are also many applications for data visualization:
- Use of the graphics database, AGI, to pass information about pictures between
applications. Facilities for the creation, labelling and selection of pictures, and obtaining
world and data co-ordinate information from them.
- Image and greyscale plots with a selection of scaling modes and many options such as
axes.
- Creation, selection, saving and manipulation of colour tables and palettes (for axes,
annotation, coloured markers and borders).
- Snapshot of an image display to hardcopy.
- Blinking and visibility of image-display planes.
- Line graphics: contouring, including overlay; columnar and hidden-line plots of
images; histogram; line plots of 1-d arrays, and multiple-line plots of images; slices
through an image. There is some control of the appearance of plots.
To run KAPPA:
- See Chapter 3 for a demonstration of KAPPA.
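In addition, a very small hedged sketch of direct use from ICL is given below. STATS and ADD are among the KAPPA commands listed in SUN/95, but the way the package is started, the exact parameter names, and the output file name are assumptions for illustration only.

     ICL> KAPPA
     ICL> STATS NDF=adam_examples:image
          (report simple statistics of the image)
     ICL> ADD IN1=adam_examples:image IN2=adam_examples:image OUT=doubled
          (add two NDFs pixel by pixel, writing a new NDF)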
4.8 PHOTOM — Aperture photometry
[SUN/45]
This performs aperture photometry. It has two basic modes of operation:
- Using an interactive display to specify the positions for the measurements.
- Obtaining those positions from a file.
The aperture is circular or elliptical, and the size and shape can be varied interactively on the display, or by
entering values from the keyboard or parameter system. The background sky level can be sampled
interactively by manually positioning the aperture, or automatically from an annulus surrounding the
object.
To run PHOTOM:
- from ICL:
     ICL> HELP PHOTOM
     ICL> PHOTOM
     IN - NDF containing input image /@ramp1/ > adam_examples:image
     COMMAND - PHOTOM /'Values'/ > H (for help)
     COMMAND - PHOTOM /'Values'/ > (an option from the menu)
     COMMAND - PHOTOM /'Values'/ > E
- from DCL:
     $ RUN PHOTOM_DIR:PHOTOM
     IN - NDF containing input image /@ramp1/ > adam_examples:image
     .
     . (as above)
4.9 PISA — Object finding and analysis
[SUN/109]
The Position, Intensity and Shape Analysis package, PISA, locates and parameterizes objects in an image frame.
The core of the package is a routine which performs image analysis on a 2-dimensional data frame. It searches
for objects having a minimum number of connected pixels above a given threshold, and extracts the image
parameters (position, intensity, shape) for each object. The parameters can be determined using thresholding
techniques, or an analytical stellar profile can be used to fit the objects. In crowded regions, deblending of
overlapping sources can be performed.
The package derives from the APM IMAGES routine originally written by Mike Irwin at the University of
Cambridge to analyse output from the Automatic Photographic Measuring system.
To run PISA:
- from ICL:
     ICL> HELP PISA
     ICL> PISA
     ICL> <individual commands>
     ICL> HELP <command>
- from DCL:
     $ PISA
     $ <individual commands>
     $ HELP <command>
Demonstration:
- This example performs isophotal analysis with deblending of overlapped images on a frame containing a mixture of stars and galaxies. The results are then plotted on a suitable device.
ICL> PISA
Welcome to PISA ...
ICL> PISAFIND
IN - NDF containing input image /.../ > PISA_DIR:FRAME
Analysing whole image
MINPIX - Minimum pixel size for images (typically 4-16) > 6
METHOD - Intensity analysis ( 0=Isophotal, 1=Total, 2=Profile ) /0/ > 0
Estimated background level = 492.2
Background standard deviation = 7.4
BACKGROUND - Background (global sky) value /492.17/ >
THRESH - Threshold for analysis (data units) /18.61135/ >
Total number of positive images = 118
The results have been written to PISAFIND.DAT
and PISASIZE.DAT
ICL> PISAPLOT
RESULTS - File of PISAFIND parameterised data /@PISAFIND.DAT/ >
DEVICE - Name of graphics device /@IKON/ >
ICL>
4.10 SCAR — Star catalogue database system
[SUN/70, 106]
The Starlink Catalogue Access and Reporting system is a relational database management system. It was
designed principally for extracting information from astronomical catalogues, but it can be used to process any
data stored in relational form. A large number of catalogues are available, including the IRAS catalogues.
For general database requirements, REXEC may be preferable. SCAR can perform the following
functions:
- Extract data from a catalogue using selection criteria.
- Manipulate data using various statistical and plotting routines.
- Output data from a catalogue.
- Put a new catalogue into the database.
- Search catalogues and generate reports on what has been found.
- Sort, merge, join, and difference catalogues.
- Plot sources in a gnomonic (tangent plane) or Aitoff (equal area) projection.
- Analyse the fields of a catalogue by scatterplot and histogram.
- Calculate new fields.
A distinctive feature of SCAR is the use of index files which you can create and which contain pointers to rows
in one or more catalogues. This is a compact and flexible method of accessing catalogues; for example, a very
large catalogue may be physically ordered by declination, but you can create an index giving access to it
ordered by flux.
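Schematically, with row numbers and values invented purely for illustration:

     Catalogue (physically ordered by DEC)     Flux index (pointers into the catalogue)
        row 1   DEC -89.1   flux  3.2            1st brightest  ->  row 7
        row 2   DEC -85.4   flux 61.0            2nd brightest  ->  row 2
        ...                                      3rd brightest  ->  row 1
        row 7   DEC -62.3   flux 94.5            ...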
To run SCAR:
- from ICL:
     $ SCARSTART
     .
     ICL> HELP SCAR
     ICL> SCAR
     ICL> SCAR_HELP (for help on SCAR)
     ICL> CAR_HELP (for help on CAR commands)
     ICL> CAT_HELP (for help on catalogues)
     ICL> (any SCAR commands)
- from DCL:
     $ SCARSTART
     $ (any SCAR commands)
Demonstration:
- A script is available which demonstrates some of the features of SCAR. It can be invoked by typing:
     ICL> LOAD SCAR_DOC_DIR:SCAR_SCRIPT
  It is suggested that you run this in an empty directory so that you can identify the files which have been created.
4.11 SPECDRE — Spectroscopy data reduction
[SUN/140]
A package for spectroscopy data reduction and analysis. It fills the gap between FIGARO and KAPPA: on the one hand, all its routines conform to Starlink's concept of bad values and variances; on the other hand, they offer spectroscopy applications hitherto available only in FIGARO. In general, it can work on data sets with seven or fewer axes. Often an application will take just a one-dimensional subset as a spectrum. In most cases the spectroscopy axis can be any axis, but for some applications it must be the first axis.
To run SPECDRE:
- from ICL:
     ICL> HELP SPECDRE
     ICL> SPECDRE
     ICL> <individual commands>
     ICL> HELP <command>
- from DCL:
4.12 SST — Simple software tools
[SUN/110]
The Simple Software Tools package helps produce software and documentation, with particular
emphasis on ADAM programming using Fortran 77. It performs fairly simple manipulations of
software, and also tackles some of the commonly encountered problems which are not catered for in
the more sophisticated commercial software tools (such as FORCHECK and VAXset) available on
Starlink.
The main purpose of the first version is to extract information from subroutine ‘prologues’, and to format it to
produce various forms of user documentation. A simple source-code and comment statistics tool is also
included.
There are five applications:
- Convert ‘old-style’ ADAM/SSE prologues to ‘new-style’ ones.
- Produce LaTeX documentation.
- Produce Help libraries.
- Produce STARLSE package definitions.
- Produce source-code statistics.
To run SST:
- from ICL:
     ICL> HELP SST
     ICL> SST
     ICL> HELP <any SST command>
     ICL> (any of the SST commands)
- from DCL:
     $ SST
     $ (any of the SST commands)
Demonstration:
- This simple demonstration will create a file containing statistics about the source and comment lines in a Fortran program; the file analysed here happens to be one from the CONVERT package.
ICL> SST
ICL> FORSTATS
IN - Input file(s) /’*.FOR’/ > CONVERT_DIR:CONVERT.FOR
.
. (messages from FORSTATS)
.
ICL> $ TYPE FORSTATS.LIS
.
. (the output which was written by FORSTATS)
.
ICL>
4.13 TSP — Time-series and polarimetry analysis
[SUN/66]
This is a data reduction package for time-series and polarimetric data. These facilities are missing from most
existing data reduction packages which are usually oriented towards either spectroscopy or image processing or
both. Currently TSP can process the following data:
- Spectropolarimetry obtained with the AAO Pockels cell spectropolarimeter in
conjunction with either IPCS or CCD detectors.
- Time series polarimetry obtained with the Hatfield Polarimeter at either UKIRT or AAT.
- Time series polarimetry obtained with the University of Turku UBVRI polarimeter.
- Five channel time series photometry obtained with the Hatfield polarimeter at the AAT
in its high speed photometry mode.
- Time series infrared photometry obtained with the AAO Infrared Photometer
Spectrometer (IRPS).
- Time series optical photometry obtained using the HSP3 high speed photometry
package at the AAT.
To run TSP:
- from ICL:
     ICL> HELP TSP
     ICL> TSP
     ICL> (any TSP command)
- from DCL:
Demonstration:
- The following example shows the use of the PPLOT command to plot a polarization spectrum. The SN1987A data file is included with the software, so you can use this command to check that TSP is working.
ICL> PPLOT
Loading TSP_DIR:TSP into xxxxTSP
INPUT - Stokes Data to Plot > TSP_DIR:SN1987A
BINERR - Error per bin (per cent) /0.1/ >
AUTO - Autoscale Plot /YES/ >
LABEL - Label for plot /’’/ > SN1987A 1987 Sep 2
DEVICE - Plot Device > IKON
ICL>