
New FITS Header indicating makemap convergence

David Berry recently added a new feature to SMURF’s makemap. There is now a FITS header in the output maps which will let you know if makemap successfully converged.

The new header is NCONTNCV – the Number of CONTiguous chunks that did Not ConVerge. This should be zero if everything went well. If you are reducing data yourself, you can also check the makemap output or the ORAC-DR log for more information.

You will have access to this feature if you are using an rsynced Starlink build from after the 19th of June 2019. Observations in the archive reduced from 19th June 2019 onwards should also have this FITS header present.

You can check the FITS headers in the output file with the KAPPA commands fitslist or fitsval, or if you are downloading a reduced file from the archive in FITS format you can use any of your favourite FITS header viewers.
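For example, if you have a reduced map in FITS format, you could check the new header from Python with astropy. This is a minimal sketch: the snippet writes a small demonstration file so it is self-contained, but with real data you would skip that step and open your downloaded map instead (the file name here is a placeholder).

```python
from astropy.io import fits

# For demonstration only: write a minimal FITS file carrying the header.
# With a real reduced map you would skip this and use your downloaded file.
demo = fits.PrimaryHDU()
demo.header["NCONTNCV"] = 0  # number of contiguous chunks that did not converge
demo.writeto("demo_map.fits", overwrite=True)

# Check convergence: zero means everything went well.
ncontncv = fits.getheader("demo_map.fits")["NCONTNCV"]
if ncontncv == 0:
    print("makemap converged for all chunks")
else:
    print(ncontncv, "chunk(s) did not converge")
```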

Changes to the NDF (.sdf) Data Format

Those of you who carefully read our Starlink release notes (everyone, right?) will have noticed that our last release (2018A) included a new HDF5-based data format.

The 2018A release could read data in the new format, but by default still wrote data in the old format. In preparation for our next release we have now changed the default data format in our codebase. This means if you download the Linux* nightly build from EAO (see http://starlink.eao.hawaii.edu/starlink/rsyncStarlink) you will now start producing data files in the new format.

Our data files will still have the same ‘.sdf’ extension. The easiest way to tell whether a file is in the old or new format is with the unix command ‘file’. For an old-style .sdf file you will see ‘data’; for a new-style .sdf file you will see ‘Hierarchical Data Format (version 5) data’.

If you need to convert a new-style .sdf file to the old format, you can set the environment variable ‘HDS_VERSION’ to 4 and then run the ndfcopy command on your file. The output file will then be in the old-style format. You’ll need to unset the variable to go back to writing your output in the new format.

E.g. in bash, you could do:

kappa                                  # set up the KAPPA commands
export HDS_VERSION=4                   # ask for old-style (version 4) output
ndfcopy myfile.sdf myfile-hdsv4.sdf    # the copy is written in the old format
unset HDS_VERSION                      # return to the new (version 5) default

If you need to use the old format all the time you could set HDS_VERSION=4 in your login scripts.
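If you are converting many files from a script rather than at the shell, the same trick works by passing a modified environment to the child process. Here is a minimal Python sketch; the file names are placeholders, and actually running ndfcopy requires a Starlink installation on your PATH, so the call itself is left commented out.

```python
import os
import subprocess

# Copy the current environment and force old-style (version 4) HDS output.
env = dict(os.environ, HDS_VERSION="4")

# Placeholder file names; ndfcopy comes from your Starlink installation.
cmd = ["ndfcopy", "myfile.sdf", "myfile-hdsv4.sdf"]
print("Would run:", " ".join(cmd), "with HDS_VERSION =", env["HDS_VERSION"])

# Uncomment to actually perform the conversion:
# subprocess.run(cmd, env=env, check=True)
```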

If you want to know more detail: the NDF data structures you use with Starlink are written to disk in a file format known as HDS. The new format we are now using is version 5 of the HDS file format, and is based on the widely used Hierarchical Data Format (HDF) standard. It is possible to open and read the new-style files with standard HDF5 libraries (e.g. the Python h5py library), although there are a few unusual features you may find, particularly if looking at the history or provenance structures. The new format is described in Tim Jenness’ paper (https://ui.adsabs.harvard.edu/#abs/2015A&C....12..221J/abstract). Our file format and data model are described in the HDS manual (SUN/92) and the NDF manual (SUN/33). If you’re interested in the history of NDF and how it compares to the FITS file format and data model, you could also read Jenness et al.’s paper ‘Learning from 25 years of the extensible N-Dimensional Data Format’.
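As a quick illustration of reading the new format with a standard HDF5 library, here is a sketch using h5py. The demo file created below is only a stand-in, not a real NDF: an actual new-style .sdf written by Starlink will contain NDF components such as DATA_ARRAY, along with the history and provenance structures mentioned above.

```python
import h5py

# Build a tiny stand-in file (a real new-format .sdf is produced by Starlink).
with h5py.File("demo.sdf", "w") as f:
    f.create_dataset("DATA_ARRAY/DATA", data=[1.0, 2.0, 3.0])

# Walk the file, printing each group and dataset it contains,
# just as you might explore a real HDS version 5 .sdf file.
with h5py.File("demo.sdf", "r") as f:
    f.visititems(lambda name, obj: print(name, "->", type(obj).__name__))
```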

We haven’t changed the format in which the JCMT and UKIRT write raw data, so all raw data from both telescopes will continue to be written in the old format. We also haven’t yet changed the format of new JCMT reduced data available through the CADC archive, but that is likely to happen at some point. We don’t intend to remove Starlink’s ability to read old files. The version of the starlink-pyhds Python library available through PyPI can already read the new-format data files.

If you have any questions or issues with the new file format you can contact the Starlink developers list (https://www.jiscmail.ac.uk/cgi-bin/webadmin?A0=STARDEV), or if you prefer you can email the observatory directly via the helpdesk AT eaobservatory.org address. If you’re subscribed to the JCMT/EAO software-discussion@eaobservatory.org mailing list you can send comments there (see https://www.eaobservatory.org/jcmt/help/softwareblog/ for subscription links/instructions).

*Only the Linux build, because we currently have an issue with our OSX nightly build and it is not updating right now: our hardware has died and I haven’t got the temporary replacement up and running yet.

POL-2 information for 16B proposals

For potential PIs who wish to submit a POL-2 proposal for semester 16B (deadline 2016-03-16 01:00 UT), we have now prepared a brief guide page with details on the estimated sensitivity and observing mode of the instrument, at:

http://www.eaobservatory.org/jcmt/instrumentation/continuum/scuba-2/pol-2/

The Integration Time Calculator in Hedwig has also been updated. When performing a SCUBA-2 calculation, you can now select the POL-2 Daisy scan pattern to get a POL-2 time or noise estimate.

Please do remember, the 16B call for proposals closes in only 15 days!

JCMT Legacy Release 1 (SCUBA-2 850µm)

The JCMT is pleased to announce that its JCMT Legacy Release 1 (SCUBA-2 850µm) data set is now publicly available. This project provides the community with uniform reductions of most public SCUBA-2 850µm observations taken before 2013-08-01, as well as coadds and emission catalogues. Please see our web page on the JCMT-LR1 for more details.

JCMT-LR1 (September 2015: SCUBA-2 850µm)

  • Public 850µm SCUBA-2 observations taken between 2011-02-01 and 2013-08-01.
  • Gridded onto HEALPix tiles using the HPX projection.
  • All reduced using the same SMURF makemap dimmconfig, ‘jsa_generic’ (included in the Starlink 2015A release).
  • Coadds, per tile, of all reduced non-pointing observations that passed Quality Assurance.
  • Catalogues of detected (>5σ) regions of contiguous emission (extents) towards each tile.
  • Catalogues of local peaks within the extents of each tile.
  • Over 2000 hours of observing time included.
  • Search CADC with proposal ID=‘JCMT-LR1’ to view the coadds and catalogues.