Sky Survey software

Affiliation
British Astronomical Association, Variable Star Section (BAA-VSS)
Tue, 11/27/2018 - 00:43

I'm thinking of putting together an amateur sky survey system of my own. The hardware is relatively straightforward; it is the software for doing the photometry and analysis that is my problem. Is anybody aware of any available software to automatically or semi-automatically identify the stars, do the photometry, and create databases of individual stars and interrogate them for variability?

Eric

Affiliation
American Association of Variable Star Observers (AAVSO)
a Python-based approach

Hi Eric,

I encountered a similar problem a couple of years ago. My solution was to use Python to (1) plate-solve each image, (2) download a list of all stars visible in the image, (3) convert their equatorial coordinates into pixel coordinates, (4) perform aperture photometry on those pixel coordinates, and (5) use statistics to identify candidate variable stars. Although there are already a number of well-developed Python packages to perform most of these tasks individually, you'd need to be comfortable with coding in Python to go this route.
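
To give a flavor of steps (3)–(4), here is a minimal pure-NumPy sketch of aperture photometry at known pixel coordinates. In practice a package such as photutils does this far more robustly; the frame, star position, and aperture radii below are made up for illustration.

```python
import numpy as np

def aperture_photometry(image, x, y, r_ap=4.0, r_in=7.0, r_out=10.0):
    """Sum flux in a circular aperture minus the median sky in an annulus."""
    yy, xx = np.indices(image.shape)
    dist = np.hypot(xx - x, yy - y)
    aperture = dist <= r_ap
    annulus = (dist >= r_in) & (dist <= r_out)
    sky = np.median(image[annulus])            # per-pixel sky background
    return image[aperture].sum() - sky * aperture.sum()

# Synthetic frame: flat sky of 100 counts plus one "star" at (25, 25)
frame = np.full((50, 50), 100.0)
frame[23:28, 23:28] += 500.0                   # crude 5x5 star, 12500 counts total
flux = aperture_photometry(frame, 25, 25)
print(round(flux))                             # recovers the injected 12500 counts
```

Looping this function over the pixel coordinates of every catalog star (step 4) and collecting the results per star is the core of the pipeline; the statistics of step (5) then run on those collected light curves.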

The drawback to this approach is that you need a fast method for plate-solving a lot of images. I wound up writing my own code to do this, but there is commercial software that can do the trick, too. Another consideration with my approach is that it only performs photometry on the positions of known stars; it doesn't search for new transients.

Best,

Colin

Affiliation
American Association of Variable Star Observers (AAVSO)
My experience with small-scale survey

We have done something similar in our observatory, using IDL in the past and now Python. In general, our workflow is:

0) Preprocessing the data automatically. This is a real headache: there can be so much data that you can't look at each individual flat frame. Which ones are actually good? Are there any variable problems with scattered light, etc.?

1) A high-quality plate solution is a must: a local Astrometry.net solution with SIP (a survey telescope probably has a relatively wide FoV, so the plate scale varies over the frame almost by definition) and/or SCAMP are good options, unless you can find good aperture photometry software for Windows(?) that understands PinPoint's proprietary (crying) distortion description. PinPoint is very fast, too.

2) Object detection can be tricky, because you may want to detect all objects in a frame automatically, basically down to the noise.

4) Doing aperture or (automatic) PSF photometry on detected objects in the pixel coordinate system. What we learned quickly is that placing apertures by world coordinates is unreliable and gives systematically deviating measurements. Locating objects by WCS and re-centering the apertures, or detecting objects in pixel coordinates and matching them later using the WCS, works well. But in that case you do photometry only on objects that are in your input list; are you interested in transients? Automatic PSF photometry (it is very difficult to measure Milky Way regions without it) can be done e.g. with PSFEx, though automatic quality control of PSF photometry can be quite challenging. In fact, SExtractor also gives good aperture photometry and allows highly customizable output (with world coordinates based on a TAN-SIP solution, too). FWHM-driven aperture selection can also be useful when the field(s) are not crowded.
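
The re-centering step can be sketched in a few lines: starting from the (possibly slightly wrong) pixel position predicted by the WCS, compute a flux-weighted centroid in a small background-subtracted cutout and measure there instead. A minimal NumPy illustration; real code would iterate the centroid and handle frame edges.

```python
import numpy as np

def recenter(image, x0, y0, box=5):
    """Refine an approximate position with a flux-weighted centroid."""
    x0, y0 = int(round(x0)), int(round(y0))
    cut = image[y0 - box:y0 + box + 1, x0 - box:x0 + box + 1].astype(float)
    cut = cut - np.median(cut)                 # rough background removal
    cut[cut < 0] = 0.0
    yy, xx = np.indices(cut.shape)
    total = cut.sum()
    xc = (xx * cut).sum() / total + x0 - box   # back to image coordinates
    yc = (yy * cut).sum() / total + y0 - box
    return xc, yc

# Star actually at pixel (30, 20), but the WCS predicts (28, 22)
img = np.zeros((40, 40))
img[20, 30] = 1000.0
xc, yc = recenter(img, 28, 22)
print(xc, yc)                                  # recovers (30.0, 20.0)
```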

5) Some kind of background process for data quality assessment would be desirable: were there passing clouds or fog when frame #123075 was taken? Was there a fly w(a/o)ndering on the filter for that one exposure? How good was the sky, was it transparent, did transparency vary during the observations? Some assumptions of the ubercalibration procedure are useful here (e.g. that most stars are not variable; see e.g. Sterken & Manfroid, "Astronomical Photometry: A Guide").
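
The ubercalibration assumption ("most stars are not variable") translates directly into a per-frame quality check: after subtracting each star's median magnitude, the median residual of a frame estimates its zero-point offset, and a large offset or scatter flags clouds. A toy NumPy version with made-up numbers (the 0.1 mag threshold is an arbitrary choice):

```python
import numpy as np

# mags[i, j] = instrumental magnitude of star j on frame i
rng = np.random.default_rng(1)
mags = np.tile(rng.uniform(10, 14, size=20), (6, 1))   # 6 frames, 20 constant stars
mags += rng.normal(0, 0.01, mags.shape)                # measurement noise
mags[3] += 0.4                                         # frame 3 hit by clouds

residuals = mags - np.nanmedian(mags, axis=0)          # remove each star's median
zero_point = np.nanmedian(residuals, axis=1)           # per-frame offset
scatter = np.nanstd(residuals, axis=1)                 # per-frame cloud patchiness
bad = np.flatnonzero(np.abs(zero_point) > 0.1)
print(bad)                                             # frame 3 flagged
```

The nan-aware medians matter in practice, since not every star is measured on every frame.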

Plate solving with Astrometry.net is also very fast if you know the approximate center of your FoV and your plate scale, and have disabled all add-ons (pictures with overlays). I found that SExtractor is faster for star detection than Astrometry.net's own detector, so I'm using that, too. I'll paste my Astrometry.net solver script below as a Unix shell script:

#!/bin/bash -x
export PATH=$PATH:/usr/local/astrometry/bin
# Field lower and upper scale limits in arcsec per pixel (if stable, shrink the range for speed)
SCL=1.05
SCH=1.15
# Give up after this many seconds; something is wrong if a solution takes a minute
GIVUP=60
# Get field center coordinates from FITS header in sexagesimal, needs wcstools software installed
RA=`gethead RA $1 | sed 's/\ /\:/g'`
DEC=`gethead DEC $1 | sed 's/\ /\:/g'`
# File name without extension
FILEBASE=`echo $1 | sed 's/\.fits//g'`

echo $RA
echo $DEC

date
solve-field -t 2 --use-sextractor -S none -M none -R none -B none -U none -N none --no-plots \
--overwrite --skip-solved --crpix-center --ra $RA --dec $DEC --radius 1 \
--scale-low $SCL --scale-high $SCH --scale-units arcsecperpix -l $GIVUP $1
date

new-wcs -i $1 -w $FILEBASE.wcs -o $FILEBASE-wcs.fits -d
rm $FILEBASE.wcs
rm $FILEBASE.axy

I execute that shell script from the command line: myscope.sh myfile.fits
The output, a few seconds later, will be: myfile-wcs.fits
To solve a large number of images, I just call the script for each of them from the shell:
for i in my*.fits; do myscope.sh $i; done
and go for a quick coffee... :-)

Normally, solving our Planewave CDK12.5 + Apogee Alta U42 frames (37'x37' FoV, plate scale 1.09 "/pix) with a search radius of 1 degree (the major source of optimization!) with local Astrometry.net, I get a WCS solution in 3-10 seconds on a 3.2 GHz Core 2 Duo computer with 8 GB of RAM and non-SSD disks. In my case, disk I/O is clearly the limiting factor (reading index files and reading/writing data files).

A few years ago, one of my colleagues wrapped all of that, plus automatic PSF photometry, in Python and saved the photometry output of tens of thousands of frames into a MySQL database...
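
That kind of database back end is easy to prototype: each photometry run appends rows of (frame, star, time, magnitude, error), and light curves fall out of a single query. A minimal sketch using Python's built-in sqlite3 instead of MySQL; the table and column names, and all the values, are invented for illustration.

```python
import sqlite3

con = sqlite3.connect(":memory:")   # use a file path for a real archive
con.execute("""CREATE TABLE photometry (
    frame_id INTEGER, star_id INTEGER, jd REAL, mag REAL, mag_err REAL)""")

# Rows as produced by the per-frame photometry step
rows = [
    (1, 42, 2458450.31, 12.31, 0.01),
    (2, 42, 2458451.29, 12.29, 0.01),
    (3, 42, 2458452.33, 12.90, 0.02),  # outlier worth a look
]
con.executemany("INSERT INTO photometry VALUES (?, ?, ?, ?, ?)", rows)

# Light curve of one star, time-ordered
curve = con.execute(
    "SELECT jd, mag FROM photometry WHERE star_id = ? ORDER BY jd", (42,)
).fetchall()
print(curve)
```

An index on (star_id, jd) keeps such queries fast once the table grows to millions of rows.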

If my reply did not help directly, hopefully it was at least somewhat interesting to read. :-)

With best wishes,
Tõnis

Affiliation
American Association of Variable Star Observers (AAVSO)
A step back

Taking perhaps a step back, I wonder: what is the purpose of the survey system?

Do you want to build a massive catalog of stars with measurements of them, perhaps to find new variable stars among known (catalogued) stars, or do you want to detect new optical transients?

I think this will make a big difference. If you are "just" interested in new transients, doing photometry on each and every detectable star in a potentially wide field is perhaps overkill and will slow you down; it might even work poorly for transients that are blended within galaxies (supernovae etc.). For this latter use case, it might also be useful to use techniques that first produce a difference image between normalized, archived reference images and the current observation. That is no easy task either, and you still need to check for known variables among the candidates detected in the difference image. It's ambitious either way, I guess. Here is, for example, a paper on the "hotPants" [sic] image-differencing pipeline:

http://sro.sussex.ac.uk/61736/1/10-1088-0004-6256-150-6-172.pdf.pdf

 

In the end you might want to implement a combination. On your first pass, you do the photometry as you describe above, but you do NOT throw away the images. The next time you visit a sky region, your system also uses image differencing. It would be a shame if you missed a new transient event even though your survey system has all the necessary means to detect it.
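
The combined scheme boils down to: on revisit, scale a registered reference frame to the new frame's flux level, subtract, and search the residual. A toy NumPy sketch on already-aligned synthetic arrays; real pipelines such as hotPants also match the PSF between the two images, which is the hard part this sketch skips.

```python
import numpy as np

def find_transients(new, ref, nsigma=5.0):
    """Scale an aligned reference to the new frame, subtract, threshold."""
    scale = np.median(new) / np.median(ref)    # crude flux normalization
    diff = new - scale * ref
    # Robust noise estimate so a bright transient doesn't inflate sigma
    sigma = 1.4826 * np.median(np.abs(diff - np.median(diff)))
    return np.argwhere(diff > nsigma * sigma)

# Same sky on both nights, but twice the flux level and one new source
rng = np.random.default_rng(2)
ref = 100.0 + rng.normal(0, 1.0, (64, 64))
new = 2.0 * ref + rng.normal(0, 1.0, (64, 64))
new[40, 10] += 300.0                           # the transient, at row 40, column 10
candidates = find_transients(new, ref)
print(candidates)
```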

Also note that once you have a working, automatic survey system up and running, you might want to listen to new Gamma-Ray Burst or even Gravitational Wave event notifications transmitted in real time via GCN, in order to catch an optical counterpart. Again, it would be a shame if an opportunity were missed while your system is scanning the sky.

Just my 2 cents.

CS
HB

Affiliation
British Astronomical Association, Variable Star Section (BAA-VSS)
Sky Survey Software

Many thanks to those of you who have replied; I shall read and inwardly digest. A number of suggestions have been made and hopefully I can respond individually.

I was hoping for a ready-to-go piece or suite of software. My computer skills are far too limited to put together something myself. I do a lot of time-series runs on eclipsing binaries and exoplanets with a relatively small field of view, and it would be nice to run these through some software to see what may turn up. Having said that, I'd like to expand to a regular search programme with a somewhat wider FoV with regard to finding new variables and exoplanets.

 

Regards,

Eric

Affiliation
Vereniging Voor Sterrenkunde, Werkgroep Veranderlijke Sterren (Belgium) (VVS)
software

Hi Eric,

we are working on a suite of software packages (available and homemade) to run through my many images gathered over several years and get all stars analyzed. We use the info in VSX about known variables and get the photometry for those and all other stars in the field. My field is about 45x45 arcmin^2. Unfortunately there is no single piece of software available that does all this, but there are a couple of programs, e.g. MUNIWIN, which each do a particular job, and you have to combine those with homemade software. We are not there yet, and I am not sure if it will run on Windows as well, as I am not the programmer.

Josch

Affiliation
British Astronomical Association, Variable Star Section (BAA-VSS)
sky survey software

Josch,

the fact that a group is working on a suite of software packages is hopeful. I can also use the Linux OS at a basic level if need be. If you finally arrive at something, I hope you'll let me know.

Eric

Affiliation
American Association of Variable Star Observers (AAVSO)
survey comments

Eric's basic concept is a good one IMHO.  I've always advocated that, if your field of view is 30x30 arcmin, and in that field, you only use a single target and one or two comp stars, you are not making the best scientific use of your time and equipment.  Every image that I've taken, from my graduate school days over 3 decades ago until present, has had every possible star extracted from it.  It's what we do with APASS and AAVSOnet survey images, and what has created DR10 and the Epoch Photometry Databases.

My original concept was to take my suite of programs that pipeline-processed images and make them available to the AAVSO community, and make the AID into a far more massive database that contained transformed data of millions of stars.  Unfortunately, APASS got in my way - an even more necessary task!

I invested heavy effort into VPHOT, because I felt it gave the best analysis of single images or simple time series.  One of my other goals was best practices for CCD observing, and VPHOT was a key to understanding photometry and good imaging.  The CCD School, CHOICE courses, and manuals were other key factors along that path.

So, if you want to extract all of your camera's data and do something with it, what tools can you use? Right now, I don't know of a complete package that does the whole thing and that is available as a turnkey product for Windows users. If someone wants to take my Fortran code, running under Linux, and convert it to Python and make it available to the community, I'm all for it.

As Tõnis mentions, there are accepted tools in the professional community that do large pieces of the problem. SExtractor is wonderful in that it finds all of the stars and does aperture (or, with PSFEx, PSF-fitting) photometry. It is a key part of APASS. However, it is not a simple tool to use; a fair amount of configuration is necessary. (Note: some Windows analysis packages will also extract all stars; I think AIP4WIN and Canopus are examples.) The astrometry.net suite does an excellent job of plate-solving images and providing World Coordinate System (WCS) information for all objects, though if you are going to do this for thousands of images, you need the local version. TA can apply transformations to the raw data (though on an image-by-image basis, without nightly zeropoint or extinction corrections). MySQL, Postgres and other relational database engines can store that plate-solved photometry. The process gets complicated by the need for scripts to glue everything together, and query scripts to access the data in a meaningful manner. Also, the tools mentioned mostly run under Linux, not Windows, which is why the Python offer was suggested. However, I tend to use as much "canned" software as possible, and then tie things together with scripts, as the canned software is usually solid and tested.

So unless you are a programmer, I'd approach the problem by processing your images your normal way, and analyzing them your normal way. Then, before archiving the images (some people even throw their images away!), use some tool to extract all of the stars into text files. Those are the steps that tend to be unique between installations, and are the hardest for a general-purpose analysis suite to accomplish. Once text files are created, a universal pipeline can be written to handle the remaining steps.
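
Once every frame has been reduced to a text file of (RA, Dec, mag) rows, the "universal pipeline" part is mostly bookkeeping: match stars across files by position, build per-star light curves, and flag the ones whose scatter exceeds the typical noise. A compact sketch of that matching-and-flagging stage; the three-column file format, the coordinates, and the 2-arcsecond and 0.05-mag thresholds are all invented for illustration (and a real matcher would include the cos(dec) term and use a spatial index).

```python
import io
import math

# Three per-frame text catalogs: "ra dec mag" columns, one star per line
frames = [
    io.StringIO("150.0010 20.0020 12.30\n150.1000 20.2000 13.10\n"),
    io.StringIO("150.0011 20.0021 12.31\n150.1001 20.2001 13.55\n"),
    io.StringIO("150.0012 20.0019 12.29\n150.1000 20.1999 13.12\n"),
]

MATCH_RADIUS = 2.0 / 3600.0      # 2 arcsec, in degrees
stars = []                       # [(ra, dec, [mag, mag, ...]), ...]

for f in frames:
    for line in f:
        ra, dec, mag = map(float, line.split())
        for star in stars:       # match against already-seen stars
            if math.hypot(star[0] - ra, star[1] - dec) < MATCH_RADIUS:
                star[2].append(mag)
                break
        else:
            stars.append((ra, dec, [mag]))

for ra, dec, mags in stars:
    mean = sum(mags) / len(mags)
    rms = math.sqrt(sum((m - mean) ** 2 for m in mags) / len(mags))
    flag = "VARIABLE?" if rms > 0.05 else "constant"
    print(f"{ra:.4f} {dec:.4f} rms={rms:.3f} {flag}")
```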

Arne

Affiliation
None
Survey comments

Arne,

Your comments give me the opportunity to ask a few questions on my own.  I've been doing DSLR photometry for a couple of years now, and have accumulated a modest amount of data - still under a terabyte - consisting of every single valid image collected (including raw frames from darks and flats).  The system has a FOV of approximately 2.5 x 4 degrees and typically I capture defocused images down to about magnitude 11.5 or so with SNRs over 20.  I have my hands full just doing the variable star measurements, but have wondered whether the images might suit someone else's research purposes.

Any thoughts or suggestions?

CS,

Stephen

Affiliation
British Astronomical Association, Variable Star Section (BAA-VSS)
Arne,

Arne,

thanks for your insight and comments. Indeed, most of us take images for a single purpose, often over many years, and with a little more effort could extract more data and make discoveries. With the lack of clear nights here in the UK, there is plenty of downtime to work through any analysis of images.

Brian Warner's Canopus does have a facility to extract stars from each frame and to look at any variability across a number of frames, from which one can then look more closely at any suspects.

I can work under Linux and have tried to look at some Python routines that may assist, but that involves linking various other pieces of software together to run as a routine, which stretches my abilities.

Eric

 

Affiliation
American Association of Variable Star Observers (AAVSO)
Survey Experiment

This year I decided to make such a systematic survey "experiment" of a 100-square-degree field; it is divided into 5 FOVs covered by my EOS M3 APS-C 24 Mpix camera and a 280 mm f/4 lens. I take 20 images of each FOV every time the weather permits.

The software is my own, as usual, written in APL running under Dyalog APL 16 (not Windows). Extracting "all" stars seems to me far too much for such a large FOV; I extract only the 500 brightest in each FOV, and the resulting total with an acceptable SNR is about 2,000 stars. I consider SNR below 30 of little interest as, combined with the sky variability, it results in an error of about 40 mmag. The brightest (non-saturated) stars are about mag 6.5, the faintest about mag 11.8 (overall SNR ~40).
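
The ~40 mmag figure follows from the usual conversion between flux SNR and magnitude error, sigma_m ≈ 2.5 / (ln 10 · SNR) ≈ 1.0857 / SNR, with the sky/transparency scatter added in quadrature. A quick check of those numbers; the 25 mmag sky term is my own assumption, chosen just to show how the photon term and sky term combine.

```python
import math

def mag_err(snr, sky_mmag=0.0):
    """Photon-noise magnitude error in mmag, plus an extra scatter term."""
    photon = 1000.0 * 2.5 / (math.log(10) * snr)    # ~1085.7 / snr
    return math.hypot(photon, sky_mmag)

print(round(mag_err(30)))                # photon noise alone at SNR 30
print(round(mag_err(30, sky_mmag=25)))   # with an assumed 25 mmag sky term
print(round(mag_err(40)))                # at the faint-end SNR of ~40
```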

The flux extraction is a first APL function; a second then calculates the V and direct B-V magnitudes. That second APL function includes the plate solving, the airmass correction, the color correction using my VSF technique, and a specific B-V processing. Everything up to that point, including proper record-keeping, is automated. Next is the analysis of variability: the detection is automated but generates a number of questionable results, so I have to select the best candidates by hand and make the detailed variability analysis.

As usual, the first issue is to get good comparisons! My next step is to review them using HIP, my present UCAC4 references being bad for such an instrument configuration. This is a major issue that is not resolved at the moment. Next I have to improve my analysis software. As I said, this is an experiment; I don't pretend to have the right solution!

Clear Skies !

Roger