For details on the various frequency and velocity definitions and rest frames used in JCMT heterodyne observations, please see our Velocity Considerations page.
- To convert processed data from the JCMT Science Archive, which is in barycentric frequency, into radio line-of-sight velocity in the LSRK frame, you can use the following Starlink commands, assuming you start with a file named ‘archive_file.fits’. These should be copied into a terminal, after setting up the Starlink software within that terminal. Lines starting with a # symbol are comments.
# Load the convert package to enable conversion from FITS to NDF
convert
# Convert your archive file into Starlink Data Format (NDF)
fits2ndf archive_file.fits archive_file.sdf
# Load the KAPPA package (contains many useful routines)
kappa
# Set the 3rd axis of the coordinate system to be in units of radio velocity
wcsattrib ndf=archive_file.sdf mode=set name=system\(3\) newval=vrad
# Set the standard of rest to be the LSRK
wcsattrib ndf=archive_file.sdf mode=set name=StdofRest newval=LSRK
# OPTIONAL: if fits output is required, convert the file back to fits
ndf2fits archive_file.sdf archive_file_radiovelocity.fits
- To add together spectral data from moving targets (e.g. comets) onto a ‘source’ (‘cometocentric’) velocity scale, use the following commands:
makecube <rawdatafile00099> system=GAPPT alignsys=TRUE out=out99.sdf
wcsattrib out99.sdf set alignoffset 1
wcsattrib out99.sdf set skyrefis origin
wcsattrib out99.sdf set sourceVRF topocentric
wcsattrib out99.sdf set stdofrest source
wcsattrib out99.sdf set alignstdofrest source
wcsattrib out99.sdf set SourceVel <velocity>
wcsmosaic out*.sdf . . .
You can run this recipe as follows (in a terminal, after setting up the Starlink software):
picard -log sf SCUBA2_MAPSTATS scuba2_reduced_file.sdf
This will produce an output file named ‘log.mapstats’ in the directory specified by the environment variable ‘ORAC_DATA_OUT’ (if set), or in the current directory otherwise.
This file currently contains entries for each of the following columns:
# (YYYYMMDD.frac) (YYYY-MM-DDThh:mm:ss) () () () (um) (deg) () () () () (s) (s) () () () () (") (") () () ()
# UT HST Obs Source Mode Filter El Airmass Trans Tau225 Tau t_elapsed t_exp rms rms_units nefd nefd_units mapsize pixscale project recipe filename
with some of the units indicated in the top line (those that are always the same), and others given in their own column (these can differ depending on what was applied to the dataset).
These columns give information about the observation: the time it was started (UT and HST), the observation number (column labeled Obs, i.e. whether it was the 1st, 2nd, 3rd etc. SCUBA-2 scan of the night), the source name, the Mode (Daisy or Pong), the filter (450um or 850um), and the Elevation, Airmass, Transmission, Tau225 and Tau at the start of the observation (taken from the FITS headers of the file). The total time spent on source is given in t_elapsed, and the average exposure time per pixel is given by t_exp.
The RMS is calculated from the variance array of the observation, and its units are explicitly given. The NEFD is calculated from the data in the NDF extension ‘MORE.smurf.nefd’, and its units are explicitly given.
The RMS, exposure time and nefd are calculated over the central square region of the map, defined by the MAP_WDTH and MAP_HGTH headers.
If multiple files are run through SCUBA2_MAPSTATS, either in a single call of PICARD or by repeatedly running PICARD in the same terminal on different files, the results will be appended to the existing log.mapstats file. The final columns — project, recipe and filename — are given to ensure it is clear to users which line of the logfile corresponds to which input file.
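Since log.mapstats is a plain space-separated text file, individual columns can be pulled out with standard shell tools. A minimal sketch, assuming the column order shown above (source name in column 4, rms and rms_units in columns 14 and 15):

```shell
# Print the source name, rms and rms_units columns from log.mapstats,
# skipping the commented header lines.  Column numbers (4, 14, 15)
# assume the header layout shown above.
awk '!/^#/ { print $4, $14, $15 }' log.mapstats
```

Adjust the column numbers if your version of PICARD writes a different layout.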
SCUBA2_MAPSTATS is only designed to work on reductions of single observations. On coadded observations it could produce misleading results, or even fail completely.
If your data are calibrated, the recipe assumes that the units are correctly set in the file, and that the noise and NEFD extensions have also been correctly updated. This can be ensured by using the PICARD recipe CALIBRATE_SCUBA2_DATA.
You can download the Starlink suite of data reduction, analysis and visualisation tools from http://starlink.eao.hawaii.edu. Binary installations are produced for macOS and for Linux CentOS 6 compatible systems.
Please follow the instructions on the download page for installation and required dependencies.
Before you can use the Starlink software in a terminal, you will have to set it up in the current terminal session. If you are running an sh-derived shell (sh, bash or ksh), please enter the following commands into your current terminal session:
export STARLINK_DIR=/path/to/your/starlink/installation
source $STARLINK_DIR/etc/profile
If you’re using a csh-derived shell (csh or tcsh), please instead run the following commands:
setenv STARLINK_DIR /path/to/your/starlink/installation
source $STARLINK_DIR/etc/login
source $STARLINK_DIR/etc/cshrc
After running these commands, you will be able to run Starlink commands in the current terminal. If you switch to a new terminal, you will need to repeat those commands before you can run Starlink software. We do not recommend performing this setup in your standard login configuration scripts, as there are name conflicts between some Starlink software and some commonly used Linux software. For example, Starlink has a command named ‘convert’ which enables astronomy-specific file format conversions; this would conflict with the common ‘convert’ command from ImageMagick.
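If you want the convenience of a one-word setup without putting it in your login scripts, one common pattern is to wrap the setup commands in a shell function. A sketch for bash, with the installation path as a placeholder:

```shell
# Define this in ~/.bashrc; nothing is set up until you actually type 'starlink'.
# The path below is a placeholder for your own installation.
starlink() {
    export STARLINK_DIR=/path/to/your/starlink/installation
    source "$STARLINK_DIR/etc/profile"
}
```

Typing starlink in a terminal then performs the setup for that session only, leaving other terminals unaffected.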
Under Linux, the Starlink setup will also include Starlink libraries in the LD_LIBRARY_PATH variable, and these can conflict with other packages such as CASA. By only setting up the software in a specific terminal, you can have one terminal running Starlink and another running software such as CASA without any conflicts.
Once Starlink is set up, you can run commands and load the various Starlink packages. For example, to run commands from the package KAPPA you would simply type the command kappa into your terminal, and you would be shown the following output:
$ kappa
KAPPA commands are now available -- (Version 2.3)
Type kaphelp for help on KAPPA commands.
Type 'showme sun95' to browse the hypertext documentation.
See the 'Release Notes' section of SUN/95 for details of the changes made for this release.
Dealing with data from different telescopes is a common activity for astronomers. Here is a rough method for smoothing a map made with one beam (e.g. PACS on Herschel) to match another beam (e.g. SCUBA-2 at 850um).
A very rough and ready method is to assume that both beams are Gaussian. Let’s say PACS has an FWHM of “A” arc-seconds and SCUBA-2 has an FWHM of “B” arc-seconds. If “A” is larger than “B”, then you need to smooth the SCUBA-2 map using a Gaussian kernel of FWHM equal to sqrt(A*A - B*B) arc-seconds. If the SCUBA-2 map has a pixel size of Pa arc-seconds, then first convert the above size into pixels by calculating:
width = sqrt( A*A - B*B )/Pa
and then smooth the SCUBA-2 map using the “gausmooth” command in the Starlink KAPPA package:
gausmooth in=<your scuba-2 map> out=<smoothed map> fwhm=<your "width" value>
The smoothed map is put into the file specified by the “out” parameter. Alternatively, if “B” is larger than “A”, smooth the PACS map in the same way, using a width of:
width = sqrt( B*B - A*A )/Pb
where Pb is the pixel size in the PACS map.
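As a worked example of the first case, with hypothetical numbers (A = 20 arc-seconds for the larger beam, B = 13.5 arc-seconds for SCUBA-2 at 850um, and Pa = 4 arc-seconds per pixel), the width in pixels can be computed in the shell with awk:

```shell
# Hypothetical values: A = 20" (larger beam), B = 13.5" (SCUBA-2 850um), Pa = 4"/pixel
A=20; B=13.5; Pa=4
width=$(awk -v a="$A" -v b="$B" -v p="$Pa" 'BEGIN { printf "%.2f", sqrt(a*a - b*b)/p }')
echo "$width"   # prints 3.69
```

This value (in pixels) is what you would pass to gausmooth’s fwhm parameter.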
You will need to know what the A and B values are (at 850um, “B” is about 13.5 arc-seconds). If your maps have point sources in them, then you could determine A and B by measuring the widths of the point sources in your maps. For instance, the “psf” (“Point Spread Function”) command in the Starlink KAPPA package allows you to determine a mean beam shape from one or more point sources in an image. It does this by fitting a generalized Gaussian function to the mean radial profile of the indicated point sources.
The above assumes that both beam shapes are Gaussian. The SCUBA-2 beam shape is not quite Gaussian and so the above method can be improved, but it involves a lot more time and effort. You need first to get accurate models for the two beams (either as analytical functions or as 2D images), then you smooth the SCUBA-2 map using the PACS beam, and then smooth the PACS map using the SCUBA-2 beam. This approach requires no deconvolution, but results in maps that have lower resolution than either the PACS or SCUBA-2 maps. The kappa package includes the “convolve” command that will smooth a map using a beam shape specified as a 2D image (the kappa “maths” command can be used to generate a 2D image if your beam shape is expressed as an analytical expression). The details of this method depend on the form in which you obtain the beam shape information.
Sometimes you might wish to exclude certain receptors from your HARP reduction. For example, suppose you wish to reduce only the central four HARP receptors from some data. As a reminder, the two places to look for information on heterodyne data reduction are:
Quick Guide – short description on how to reduce heterodyne data.
DR Cookbook – should be consulted for a more detailed description.
Reducing data without certain receptors requires the bad_receptors calibration option. You can either put a list on the command line or use a file. Given that in this example we want to make twelve receptors “bad” (i.e. ignored), a file seems better. Create a file called $ORAC_DATA_OUT/bad_receptors.lis containing a space-separated list of those you want to EXCLUDE from the reductions, like this example:
H00 H02 H03 H05 H06 H07 H08 H09 H11 H12 H13 H15
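The file can be created in one step from the terminal; this sketch assumes ORAC_DATA_OUT is already set in your session:

```shell
# Write the space-separated list of receptors to exclude
# (assumes the ORAC_DATA_OUT environment variable is set)
echo "H00 H02 H03 H05 H06 H07 H08 H09 H11 H12 H13 H15" > "$ORAC_DATA_OUT/bad_receptors.lis"
```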
Then run oracdr with the following:
oracdr -loop file -file mylist.lis -nodisplay -log sf -verbose -calib bad_receptors=FILE
This will instruct ORACDR to read a bad_receptors.lis file in ORAC_DATA_OUT, which it expects to contain space-separated detectors. Alternatively, provide a colon-separated list of receptors on the command line, e.g. to exclude H02 and H08:
oracdr -loop file -file mylist.lis -nodisplay -log sf -verbose -calib bad_receptors=\"H02:H08\"
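If you already keep the list in a bad_receptors.lis file, a small sketch for turning that space-separated file into the colon-separated command-line form:

```shell
# Convert a space-separated receptor list into the colon-separated
# form used with -calib bad_receptors on the command line
tr ' ' ':' < bad_receptors.lis
```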
It is often useful to know which version of the Starlink software you are using. This is especially important if you wish to report a bug to the staff at EAO. To find out which version you are running, simply use the command starversion. The output gives you the version and when the software was last updated, e.g.:
$ starversion
2017A @ bfdc8534a17c406c59302030ed1c1ae1a1223bd1 (2017-07-28T19:12:59)