These tables summarize pi-programs related to DAQ and frame analysis, available after logging in as the pi user on any pi-machine.

Table of contents :
  1. Programs for operating cameras
  2. Analysing fits images, converting, mathematical operations, viewing fits header
  3. Database loading / fixing programs and scripts
  4. Events presentation and publishing
  5. photometry/astrometry/cataloging/flash recognition algorithms
  6. Running flash recognition algorithms and daq
  7. Useful scripts
  8. Data analysis scripts
  9. Pi system controlling programs and scripts
  10. DAQ simulator
  11. GRB limits determination


1. Programs for operating cameras


Program name
Usage
Description
test2K2K
test2K2K -h
Program for operating pi cameras and changing settings from the command line. A simple menu shows up and there are several actions to be performed. Options :

- to run in safe mode use -safe
- to have all options available use -all

NOTE : can also be run with camera simulator ( depends on ccd.cfg file )
operate2K2K
operate2K2K -h
executes several actions on cameras ( such as cooling on/off , gettemp etc )
cfgcomp
cfgcomp ccd1.cfg ccd2.cfg
cfg files comparator
getstar
getstar -h
prepares list of star positions on subsequent frames , in order to prepare samples
nightdate
nightdate
nightdate -1
nightdate -5
prints the date of the night in format YYYYMMDD ; nights before and after can be obtained with the -N or +N parameter , where N is the number of days
tcoic
tcoic
finds coincidences in allevents_0.log and allevents_1.log files
satsearch
satsearch
combines temp.tle , choosing each satellite uniquely according to the most recent date
ang2ang
ang2ang -h
converts angle units , syntax is :

ang2ang TYPE_FROM TYPE_TO angle

use "ang2ang -h" to see supported angle TYPES , in case something does not work put :

ang2ang YOUR_TYPE 0 your_angle

or

ang2ang YOUR_TYPE 4 your_angle

In fact this program can handle all types, but not every type to every type ; conversion from every type to degrees works for sure
usbtest
usbtest -h
Program for testing multiple USB transfers. Log in to pi2 as pi and :

cd /data2/results/20040928/TEST/TEST_2
usbtest -id=2 -n=100

or

usbtest -id=3 -n=100

first tests k2a, second k2b
testscript
testscript -h
testscript script.pish -r=moon_radius_to_warn -year=2004 -month=10 -day=6 -hete

This program checks goto_ra_dec commands in a given pish script and warns if the moon is closer than the given radius. In case no date is provided
the moon position for the current day is used , and the default radius is 10 degrees
In case -hete is specified, point_hete commands are also checked
In case no script is specified the program only prints the moon position for the current date


2. Analysing fits images, converting, mathematical operations, viewing fits header :


calcfits, calcfits2
print usage :
calcfits2 -h
Several useful operations on fits files, examples :

1/ to make average of frames do :
   calcfits2 -list=k2a_list -dark=dark1.fitc -out=aver.fitc
getstat
print usage : getstat -h
Calculates basic statistics of image stored in FITS file :

Program for simple statistic information about FITS file, usage :
getstat FileName -distrib -dump -x=33 -y=434 -verb -show_non_zero -show_columns -int_file -min_val=-1000000.00 -max_val=1000000.00 -ddfile=d.txt -window=(1500,1500)-(1900,1900)
X,Y - are optional to see value in this position
Example output : getstat k2a_090926_00875.fit
PARAMTERS:Anlysing file : k2a_090926_00875.fit
#######################################

Image k2a_090926_00875.fit statictics :

Image size = 2062 x 2048 = 4222976
Total Sum = 930052225.000000
Max = 63456.000000 at (1841.000000,351.000000)
Min = 0.000000
MinNonZero = 2.000000
Average = 220.236209
RMS = 436.218182
Most popular value = 0.000000
#######################################
Flatmaker by Bogumil Pilecki
flatmaker2
flatmaker2
Flatmaker with several options , type :
    flatmaker2 -h
type=0 - normal Bogumil's flat
type=1 - msok flat
type=2 - gauss fit to each pixel
getpart2
getpart2 -h
retrieves parts of a frame ( for sample preparation ), examples :

getpart2 frames_list_ccd1 - -x=1000 -y=1000 -part_size=50 -outdir=Sample_at_1000_1000
getpart2 xxx1.fitc,xxx2.fitc - -x=1000 -y=1000 -part_size=50
getpart2 frames_list_ccd1 x_y_list_file -part_size=20

Useful options :
   -mark - marks the object of interest with a white circle
getkey
getkey xxx.fit  KEY
getkey xxx.fit
lists FITS header or only value of specified key, or several keys - option -listkey=RA,DEC,ALT,AZIM :
Examples :
    getkey xxx.fit -listkey=RA,DEC
    getkey xxx.fit RA
getstrips
getstrips -h
divides the frame into strips of a specified size and finds the mean/sigma of values in every strip ( fits a gaussian )

3. Database loading / fixing programs and scripts :

loadevt2db
loadevt2db -h
loads events from log files to the pi database ; the following options are available :
    -clt - uses time of LCO
    -ignore_read_error - ignores errors when reading old logs without newly added columns
    -default_format=0  - sets default format , in case it cannot be recognized from the log file header, possible values :
                0 - eVerif
                1 - eFinal
    -camid=2 - sets id of camera
    -runtype=0 - type of analysis , possible values :
            0 - eRunOnlineCoic
            1 - eRunOfflineSum
            2 - eRunOfflineSN
            3 - eRunOfflineSingle
            4 - eRunOnlineSingle
            5 - eOfflineCoic
    -db - if set events are loaded to database

Recommended usage :
     loadevt2db all_finalevents_0.log -runtype=3 -db -camid=2 -default_format=1 -clt -ignore_read_error

parse_fits_header_txt!
1 - name of file to be parsed
parses a fits header in txt file form and creates SQL inserts
gen_fits_header_load!
parses all header-txt files in the current directory and creates script load.sql



4. Events presentation and publishing  :


pifrbrowser
pifrbrowser -h
( by LWP )
Usage :
   pifrbrowser DIR_k2a DIR_k2b list_name

    pifrbrowser k2a/scan_parts/ k2b/scan_parts/ list
pievbrowser
pievbrowser --help
( by LWP )
In order to prepare single frame event run :

  pievbrowser -single NUMERY_KLATEK -odir KATALOG

where NUMERY_KLATEK ( frame numbers ) can be a list : 1200,1201,1220 etc or a range 1200-1240 , and KATALOG is the output directory

Examples :
   pievbrowser -single 123,124,156 -odir 123_124_156
   pievbrowser -timarange 00:23:00,900 -odir konus
prep_public!

usage example, go to directory on server containing events and type :
   cd /lhome/piwww/www/pi0/daq/events/200412/20041210/EventsHtml/Frame00124
   prep_public! event0.php
http://grb.fuw.edu.pl/pi0/lewhoo_test/events_index_public.php

page for checking database consistency


5. photometry/astrometry/cataloging :

a/ Basic programs for photometry/astrometry/cataloging

piphoto
piphoto -h
fast photometry. The program finds stars on a frame ( or list of frames ) and calculates magnitudes. Results are written to a mag file and can be viewed using the dump_mag program
picalib
picalib -h
Program for finding stars on a given frame and printing a list of them with catalog magnitude , coordinates etc
executes calibration - dumps catalogue magnitudes of stars , instrumental mag and stars RA,DEC
piastrometry
piastrometry -h
Use in order to do astrometry and obtain a cfg file to be included in ccd_pipelineX.cfg , typically run :
photometry k2a.fit k2a.mag
piastrometry k2a.mag k2a.ast ccd_astrometry_pipeline0.cfg -pixscale 59.5 -ord 4 -verb

then check ( and add if missing ) the line :
%LOAD% ccd_astrometry_pipeline0.cfg
in ccd_pipeline0.cfg

repeat same for camera 1 ( in case using ccddouble / Nparamstest2C )
piaddast2
piaddast2 -h
Script for cataloging ast files to database.
pi_red_frame
pi_red_frame -h
for detailed description see - link
piad2xy
piad2xy -h
calculates (x,y) of given RA,DEC for a given fits frame :

usage:
piad2xy FITS_FILE RA(h) DEC(deg) -from_fits

Example :
    piad2xy xxxf.fit 20.545 -34.0055

RA must be provided in hours , DEC in degrees
pixy2ad
pixy2ad -h
calculates (RA,DEC) for given frame coordinates (x,y) , usage :

pixy2ad FITS_FILE x y -from_fits
ad2azh
ad2azh -h
calculates (az,h) coordinates for given (RA,DEC) and time ( in format YYYYMMDD_HHMISS ), example :

ad2azh 7.00278200 -0.00539093 20050131_003443

RA - in hours
DEC in degrees

Geographical coordinates are read from ccd.cfg file
red_frame

Version 0.4.22
Program that reduces files from a ( continuously updated ) list using a DARK and a FLAT.
The list of reduced files is written to : 'phot_pipe'
Usage :
prog_name lista(lista) outlist(phot_pipe) dark(DARK) flat(FLAT) flipped(0) liczba(0) multiplier(1.0) output_dir(./) remove_raw(0) lock_file

liczba - number of frames to process - if 0 all frames are processed
remove_raw - whether to remove the raw fits before reduction
flipped - 1 - vertical , 2 - horizontal
findfield
findfield RA DEC OBJECT
Finds best field to be observed from the list of known fields for camera k2a and k2b ( Cannon 85mm objectives by default )
pigencat
pigencat -h
program for generating catalog from txt file of format :
  RA[h-decimal] DEC[deg-decimal] VMAG[magnitude] IMAG[I-magnitude optional] BMAG[B-magnitude optional]
example of usage :
    tycho2txt!
    pigencat tycho.txt -save=tycho -verb -ra_in_deg
checkstarcat
checkstarcat -h
Program for checking a star in the catalog, example of specifying an arbitrary catalog ( in binary asas-like format ) :

      checkstarcat  17.0089 -39.8500 -radius=200 -cat_file=/opt/pi/dev/pisys/daq/ndir/data/cat/tycho -cat=ASAS

catalog is binary file, which can be generated from text file by pigencat program ( see line above )

b/ asas-pipeline ( sum 20 ) astrometry/photometry/cataloging

run_asas_pipeline_online_cat!
PARAMETERS :
1 - night
2 - source dir ( without night )
3 - out dir ( without night )
4 - database name
5 - if 1 no database cataloging ( default 0 - cataloging on )
6 - database host name
7 - do recalculation : 0 / 1 ( default 0 )
8 - if wait for file that copy of night data is done (not required by NFS)
9 - run synchro 0 / 1 / -1 ; -1 means that no reduction is done ( only
     cataloging ) , 0 - reduction and cataloging in parallel, 1 - reduction
    and cataloging after it ( synchro )  ( default 0 )
10 - wait for data ( night fits files ) ( default 1 )
11 - additional cataloging options ( default empty )
12 - send e-mail 0 /1 ( default 0 )
13 - drop indexes before load 0 / 1 ( default 1 )
14 - if prepare on-line DB ( dump seed and load to aver20_online )
       (default 0 )
15 - if check clouds : 0 / 1 ( default 1 )
16 - camera1 ( default from /opt/pi/dev/pisys/daq/config/custom/photometry.cfg file )
17 - camera2 ( default from /opt/pi/dev/pisys/daq/config/custom/photometry.cfg file )
main script for running the on-line pi-pipeline and cataloging
NOTE : currently only k2a is cataloged on-line, k2b is waiting ;
there was some ( locking ? ) problem when running these two
in parallel !

Currently it works in such a way :
   1/ in the main loop astrometry and photometry
       of incoming frames is performed - first k2a and
       then k2b - synchronously
   2/ in parallel the piaddast2 program is running on k2a
        frames, however this will be changed
        soon thanks to the new option :
              -lock_file=../cat.lock

   3/ after all fits files are converted to mag/ast
        and the file daqExitOK.txt is found in the run
        directory, Photometry/Astrometry is finished and
        stamp files are created ; cataloging will also
        finish when all ast files are processed

EXAMPLES :
  • To run cataloging of single night 20071202 from ast files to database 20071202 run  script :
    nohup run_asas_pipeline_online_cat! 20071202 /pi20/msok/cat/ 
                            /pi20/msok/cat/20071202/cat/ 20071202 - - 1 0 -1 0 > out 2>&1 &




do_astrometry!
PARAMETERS :
1 - name of list file with mag files
2 - name of output directory
3 - value of timeout ( default 100 sec )
4 - if skip already processed mag files ( default yes )
5 - if fix astrometry in fits file - pass the directory where the fits files
     are stored in this parameter
makes astrometry on a list of mag files
To verify the reason for the timeout of 100 seconds see plots here
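An illustrative call, consistent with the parameter list above ( the mag list name and output directory are only placeholders ) :
     do_astrometry! k2a_mag_list ../ast1/ 100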
fix_old_astro!

Script for making missing astrometry on data 2004-2005 - asas-pipeline
asas_pipe_process_new!
PARAMETERS :
1 - nightdir - full path to directory with fits files (
      together with YYYYMMDD )
2 - camera number ( 1 - k2a, 2 - k2b )
3 - list of fits files
4 - night name YYYYMMDD
5 - number of frames to be averaged ( default 20 )
6 - if this is last run
7 - path to DARK fits file
8 - path to FLAT field fits file
9 - index of output file
script for reduction of a list of fits files. Processes only new fits
files which were not processed before ( those are stored in a file list )
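An illustrative call for camera k2a ( the night directory, list name and calibration file names are only placeholders ) :
     asas_pipe_process_new! /data2/results/20040928/ 1 aver_list 20040928 20 0 dark1.fit FLAT_k2a.fit 0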
asas_pipeline_online!
PARAMETERS :
1 - NIGHT as 
      YYYYMMDD
2 - catalog where fits files are stored ( subdir with night
      name )
3 - output catalog name ( where subdirs
      YYYYMMDD are created )
4 - if make only one execution ( no waiting for new
     fits files )
5 -
Script running the on-line asas-pipeline - takes new 20 frames when collected
and runs asas_pipe_process_new!
The program waits SLEEP_TIME=900 seconds for a new portion of data to
be analysed
cat_both_synchro!
PARAMETERS :
1 - file containing list of nights
2 - place where ast files are stored default
      ( /scratch/pi10/results/ASAS_PIPELINE/ )
3 - database name
4 - border size (default 100)
5 - options to piaddast2 program ( cataloging options )
Script for cataloging ast files from both cameras , also makes
optimization and recalculation according to parameters.
Typical usage :
#!/bin/bash
                                                                             
nohup cat_both_synchro! night_list 
 /scratch/pi_asas/results/ASAS_PIPELINE/ 20040918_asas >
 out 2>&1 &

cat_nights_remote!
PARAMETERS :
1 - file containing list of nights to be cataloged
2 - camera number (1-k2a, 2-k2b)
3 - path to directory, where ast files are stored (without night )
4 - options passed to piaddast2 program ( default : -no_update  -skip_field_change )
5 - ast subdir base name ( default ast -> ast1, ast2 ) - OBSOLETE
6 - do recalculation ? : 1/0  (default 0)
7 - do optimization ? : 1/0 ( default 1 )
8 - do remove ast files ? : 1/0 ( default 1 )
9 - size of chip border skipped in cataloging ( default 100 pixels )
SCRIPT for cataloging list of nights
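An illustrative call, leaving the trailing parameters at their defaults ( the list name and path are only placeholders ) :
     nohup cat_nights_remote! night_list 1 /scratch/pi10/results/ASAS_PIPELINE/ > out 2>&1 &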
cat_ast_test!
PARAMETERS:
1 - size of ignored border ( default 100 pixels )
2 - n_parts - OBSOLETE
3 - do process ? 1:0 (default 0)
4 - additional options to piaddast2 (default - empty)
5 - name of file with list of ast files (default - k2a_ast_list)
6 - do optimization of db ? 1:0 (default 0)
7 - drop indexes before loading ? 1:0 (default 0)
8 - calculate sigma of ra,dec ? 1:0 (default 0)
                                                                            
most basic script for running pi-cataloging on list of
ast files
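An illustrative call, using the default border size and list name but enabling processing :
     cat_ast_test! 100 - 1 - k2a_ast_list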
cat_new_ast!
PARAMETERS :
1 - size of ignored border ( default 100 pixels )
2- do process ? 1:0 (default 0)
3- additional options to piaddast2 (default - empty)
4- name of file with list of ast files (default - k2a_ast_list)
5 - do optimization of db ? 1:0 (default 0)
6 - calculate sigma of ra,dec ? 1:0 (default 0)
script for cataloging new data - does not drop indexes
re-runs the cluster command optionally, processes according to min_id and max_id written by the piaddast2 program
cat_both_synchro_fast!


run_cat_parallel!
PARAMETERS :
1 - database name
2 - if drop indexes ( if not it is slower , but maybe more realistic when
      adding new measurements )
running cataloging in parallel mode, using
locking file cat.lock. Should be run in directory where subdirectories ast1 and ast2 exist
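An illustrative call ( the database name is only a placeholder ), run from a directory where ast1/ and ast2/ exist :
     nohup run_cat_parallel! pidb_fast 1 > out 2>&1 &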
do_asas_phot_astro.sh
PARAMETERS :
1 - night
2 - name of list file [ default aver_list ]
3 - mag dir [ default ../mag1/ ]
4 - ast dir   [ default ../ast1/ ]
5 - camera [ default 1 ]
6 - camera name [ default k2a ]
script executing asas-photometry and astrometry on a
list of fits frames ; it should be run in the directory
where the fits files are stored
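An illustrative call spelling out the documented defaults for camera k2a ( the night is only a placeholder ) :
     do_asas_phot_astro.sh 20071202 aver_list ../mag1/ ../ast1/ 1 k2a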


c/ scan cataloging

These scripts are mainly written by Kasia Malek and are described in detail on her page. The source is kept under subversion in $SRCDIR/ccd/scripts/cat/scan/

scan_run_pipe.sh
PARAMETERS :
1 - night
2 - source dir (/data2/results/)
3 - destination dir
    ( /data1/results/SCAN_ASAS_PIPELINE/ )
photometry/astrometry of scans - 3 frames on each field are
averaged ; ast files are created in the destination directory in a
subdirectory corresponding to the current night
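An illustrative call using the default source and destination directories ( the night is only a placeholder ) :
     scan_run_pipe.sh 20050301 /data2/results/ /data1/results/SCAN_ASAS_PIPELINE/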
scan_run_pipe_one1.sh
PARAMETERS :
1 - night
2 - source dir (/data2/results/)
3 - destination dir
 (/data1/results/SCAN_ASAS_PIPELINE/SINGLE/ )
Performs photometry/astrometry on scan images, but no
averaging is performed ; just single frames are analysed
run_scan_cat!
PARAMETERS :
1 - night
2 - place for night fits are stored
3 - place where output subdirectory YYYYMMDD
     will be created
4 - database name
catalogs scan frames, makes reduction/photometry/astrometry and cataloging ; averaged frames are made
run_scan_cat_many!
PARAMETERS :
1 - file with list of nights to be cataloged
2 - output directory ( without night ) [ default =
      /data1/results/SCAN_PIPELINE/ ]
runs scan cataloging of many nights but in an optimized way,
so that some actions are performed before the loop over nights,
then cataloging is done, and then recalculation after everything is finished
NOTE : it is faster to run this script instead of running run_scan_cat! many times for different nights !
run_single_scan_cat!
PARAMETERS :
1 - night
2 - place for night fits are stored
3 - place where output subdirectory YYYYMMDD
     will be created
4 - database name
Script for running reduction/fast-photometry/asas-astrometry/pi-cataloging
on scan images
cat_nights_scan_new!
PARAMETERS :
1 - remote dir , where scan frames are stored (
     without night )
2 - options passed to piaddast2 program ,
      DEFAULT : -scan -no_update -min_aver=2
      NOTE : -min_aver=2 is good for cataloging
      3 averaged scan frames, but for singles
      parameters 2 should be overwritten with :
          "-scan -no_update"
3 - border skipped in cataloging ( default 150 pixels )
4 - if run processing at the end : 0/1 ( default 1 )
5 - if do field list ( if catalog by fields ) default 1
6 - if do optimization of database at the end ( default
     1 )
Script for cataloging many scan nights by fields ; it creates a list of frames from the same field and catalogs the list of single-field frames. This is more
efficient when cataloging many nights as re-selects are not frequent , only when the field is changed , so it is about ~120 selects ( # of different fields )
Example of usage :

        #!/bin/bash
        export PI_DBNAME=scan_single
        nohup cat_nights_scan_new! /disk03/results/ast_single/ - - 0 - >>
                    out 2>&1 &

or if processing should be executed at the end :

        #!/bin/bash
        export PI_DBNAME=scan2005
        # continuing with given field_list.txt :
       nohup cat_nights_scan_new! /scratch/pi_scan/results/ast/ - - - - >>
                    out 2>&1 &






New scripts based on pi-programs :
scan_pipe_process_new!
PARAMETERS :
1 - night directory ( example :
     /data2/results/20050301/ ), no default value
2 - camera number ( 1 - k2a , 2 - k2b )
3 - name of list file containing frames to be
      processed
4 - night (YYYYMMDD)
5 -  number of frames to be averaged ( default 3 )
6 - name of dark file ( default night dark )
7 - name of flat field file
      ( default
    ${DATADIR}/FLAT/FLAT_${CAM_NAME}.fit   
      )
8 - directory where processing will be performed

NEW - to replace the old asas (Kasia Malek) pipeline ; runs reduction and cataloging on scan , this is for single camera cataloging , must be called twice to catalog both cameras
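An illustrative call for camera k2a ( the night, list name, calibration files and working directory are only placeholders ) :
     scan_pipe_process_new! /data2/results/20050301/ 1 scan_frames_list 20050301 3 dark1.fit ${DATADIR}/FLAT/FLAT_k2a.fit /data1/results/SCAN_PIPELINE/20050301/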
scan_pipe_process_single!
PARAMETERS :
1 - night directory ( example :
     /data2/results/20050301/ ), no default value
2 - camera number ( 1 - k2a , 2 - k2b )
3 - name of list file containing frames to be
      processed
4 - night (YYYYMMDD)
5 -  IGNORED
6 - name of dark file ( default night dark )
7 - name of flat field file
      ( default     ${DATADIR}/FLAT/FLAT_${CAM_NAME}.fit   
      )
8 - directory where processing will be performed

script for reduction of scan frames from a single camera ; it is possible to force that after astrometry is done it is written back to the scan fitc file
do_asas_phot_astro.sh
PARAMETERS :

runs photometry and astrometry on scan reduced frames
cat_nights_scan_new!
PARAMETERS :
1 - REMOTE DIR - place where scan ast
     files are stored , default 
      /scratch/pi20/kkrupska/katalog_scan_summ/
2 - options to piaddast2, default : -scan -no_update
3 - border size ignored by piaddast2
4 - DO_PROCESS [ default 1 ]
5 - DO_FIELD_LIST [ default 1 ]
6 - DO_OPTIMIZE [ default 1 ] 
script for cataloging a list of scan nights ; it is specially designed for cataloging many
nights as it makes lists of frames by field and catalogs a same-field list
at once, to decrease the number of re-selects
NEW : option -no_new_star was added as
default to the piaddast2 program so new_star='f' for
all measurements and must be re-calculated
after cataloging is finished !


NOTE: this procedure catalogs scan frames grouping by
field, so this is not done in chronological order of frames.
Due to this fact the field new_star in table measurements may be set incorrectly, as some field may be cataloged earlier
than a chronologically earlier frame, causing the new_star
value to be incorrect and requiring recalculation.
Soon this field will be set to 'f' and after cataloging
is finished the database will be processed to determine this value ; example of this problem :

field 1026-30 was catalogued as the 32-nd field , but star 4682080 was visible earlier on field S1048-45 which was
catalogued as the 52-nd field , causing new_star='t' to be set
on chronologically later frames from field 1026-30 which
was catalogued first !
cat_single_night_scan!
1 - piaddast2 options [ default : -scan -no_update ]
2 - size of border ignored by piaddast2
3 - do process [ default 1 ]

Script for cataloging a single scan night ; runs processing ( if the parameter is set )
and optimization is always executed



do_cat_from_scan!
do_cat_from_scan.sh
OBSOLETE
OBSOLETE
old version of scan cataloging script


Recalculation scripts for scan :

recalc_scan_fast.sh
recalc_scan_fast.sh DATABASE
Script for fast re-calculation of database fields in table Stars ; the re-calculated fields are : magnitude, sigma_mag, no_measurements, min_mag and max_mag
update_catalog_fast_basic!
update_catalog_fast_basic! STEP MIN_ID MAX_ID DO_RUN
Typically :
export PI_DBNAME=scan
update_catalog_fast_basic! 1000000











d/ fast photometry cataloging

run_fast_pipeline_online_cat!
PARAMETERS :
1 - night
2 - source dir ( without night )
3 - out dir ( without night ) - directory where ast are produced
4 - database name
5 - if cataloging to DB is NOT required ( default 0 - cataloging required )
6 - database host name
7 - if remove mag files after astrometry done ( default 0 - no removing )
8 - run_once ( or wait for new data on-line ) 0/1 ( default 0 )
9 - directory where fits files are stored ( if update of fits header with astrometry is required )
10 - do reduction 0/1 ( default 1 )
11 - options passed to piaddast2 cataloging program ( default = "-no_update -max_ast_err=0.5 -min_star_count=10000 -dump_freq=500 -apert=0 -no_skip_field_change" )
12 - do recalculation 0/1 ( default 1 )
13 - threshold for photometry - in sigmas ( default 5 )
14 - camera1 ( default taken from file /opt/pi/dev/pisys/daq/config/custom/photometry.cfg or k2a if file is not found )
15 - camera2 ( default taken from file /opt/pi/dev/pisys/daq/config/custom/photometry.cfg or k2b if file is not found )

script for running fast-photometry/reduction, asas-astrometry and pi-cataloging of single frames

NOTE : in order to run without reduction ( on already prepared ast files )
set parameter 10 equal to 0 and parameter 3 to the path where the ast files are stored
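An illustrative call giving only the leading parameters and leaving the rest at their defaults ( directories and database name are only placeholders ) :
     nohup run_fast_pipeline_online_cat! 20050429 /data2/results/ /data1/results/ pidb_fast > out 2>&1 &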
fast_pipeline_online!
PARAMETERS :
1 - NIGHT as 
      YYYYMMDD
2 - catalog where fits files are stored ( subdir with
      night name )
3 - output catalog name ( where subdirs
      YYYYMMDD are created )
4 - if make only one execution ( no waiting for   
      new fits files )
script for pi-photometry and asas-astrometry for new frames
coming from device
fast_pipe_process_new!
PARAMETERS :
1 - nightdir - full path to directory with fits files (
      together with YYYYMMDD )
2 - camera number ( 1 - k2a, 2 - k2b )
3 - list of fits files
4  - night name YYYYMMDD
5 - path to DARK fits file
6 - path to FLAT field fits file
script for fast-photometry and asas-astrometry of a list of new fits files. Processes only new fits
files which were not processed before ( those are stored in a file list )
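An illustrative call for camera k2a ( list name and calibration file names are only placeholders ) :
     fast_pipe_process_new! /data2/results/20050429/ 1 new_frames_list_ccd1 20050429 dark1.fit FLAT_k2a.fit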
red_cat_fast_photo!
PARAMETERS :
NOTE : there are no defaults - all 5 parameters must be provided :
1 - NIGHT
2 - OUT_DIR ( without 20050429 )
3 - START OLD - if start processing of
     old (not-processed) data
4 - DATA DIR ( without 20050429 )
5 - DATABASE NAME

script running fast photometry/astrometry and cataloging ast files to the specified database
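Since all 5 parameters are required, an illustrative call could be ( paths and database name are only placeholders ) :
     red_cat_fast_photo! 20050429 /data1/results/ 0 /data2/results/ pidb_fast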
cat_fast_photo!
PARAMETERS :
1 - FAST PHOTO DIR ( without night ), default :
       /data1/results/
2 - NIGHT (default : 20050529 )
3 - database name ( default : pidb )
script for cataloging results of fast photometry on a given night ;
calls script cat_both_synchro_fast! ( see below )
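For illustration, a call spelling out the documented defaults :
     cat_fast_photo! /data1/results/ 20050529 pidb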
cat_both_synchro_fast!
PARAMETERS :
1 - name of file with list of nights to be cataloged
     ( default night_list )
2 - path to directory where ast files are stored
     ( without night )
3 - name of database ( default pidb_fast )

script for cataloging the fast photometry pipeline to database.
Some special options must be provided so this is a different one than for the
asas-pipeline ; calls cat_nights_remote!
finish_fast_photo!
PARAMETERS :
Finishes astrometry on fast photometry data
process_fast_photo!
PARAMETERS :
Same as for script run_fast_pipeline_online_cat! - except IN_DIR :
1 - night
2 - out dir ( without night )
3 - database name
4 - 1 if cataloging to DB is NOT required ( default = 0 - means cataloging required , name of variable is NO_DB - maybe confusing ... )
5 - database host name
6 - remove mag files ( default = 0 - means no removing )
Finishes astrometry on fast photometry data , assumes the directory structure is new : mag1/ mag2/ ast1/ ast2/ etc ...
First removes ast.gz files smaller than 200 bytes ( empty ast files ), then makes missing astrometry, then removes mag files if required and compresses ast files
run_cat_fast_all.sh

OBSOLETE - script for making ASAS catalog
run_cat_fast.sh

OBSOLETE - script for making ASAS catalog

e/ testing flash recognition algorithms


laptest_new
USAGE :
laptest_new ast_list

ast_list - is list of ast files  to be used
PARAMETERS :
-ast_dir - directory where ast files
                are stored
-laplace - laplace type value to be
                 tested
-outdir - name of output directory
This is a program for testing laplace types ; it reads a list of ast and fits files, matches ast-stars to the star catalog, chooses
stars which have a corresponding star in the star catalog on the first frame and creates a list of them.
Then on the next frames stars from this original list are found and their laplace value is calculated and written to an
output file - named by ra,dec of the star ( example : 082230-0610.6_6.17.txt )
After the program is finished many star files are created ; then the script make_lap_plots! can be used to produce
gauss fits to the laplace distribution, determine sigma_laplace and mean_laplace for each star and
create a plot of <sigma_laplace/mean_laplace> vs magnitude ; this was done for data 20060307







f/ astronomic calculations


calcobject
USAGE :                          
1 - object name
2 - start time in format 20060924_180000
3 - time period to show ( in seconds )

OPTIONS :
-ra=23.3232 [ hours decimal ]
-dec=-12.32 [ degrees decimal ]
-step=5 [ time step in seconds ]
program for calculating (az,h) position of specified object in specified period of time with specified time step







g/ Flare recognition algorithm


find_nova_many.sh
USAGE :                          
1 - number of nights back to start search ( default -10)
2 - output directory basename ( default /data1/results/flares_nova/ )
3 - pi-pipeline output dir ( default /data1/results/PI_PIPELINE/ )
4 - dbhost ( default pi1 )
Example :
nohup find_nova_many.sh -67 > out 2>&1 &

script for finding flares ( brightness increases of stars existing in the database ) for many nights ; it is possible to execute the program from a given number of nights back, up to the current night

find_flares.sh

PARAMETERS :
1 - night ( can be 20060501 or -1 ) so an absolute or relative value may be passed
2 - output directory basename ( default /data1/results/flares_nova/ )
3 - pi-pipeline output dir ( default /data1/results/PI_PIPELINE/ )
4 - dbhost ( default pi1 )
Example ( from pi1 crontab ) :
0 17 * * * /opt/pi/dev/pisys/daq/ndir/bin/find_flares.sh > /opt/pi/dev/pisys/log/find_flares.log 2>&1

Script for running the flare recognition algorithm on a single night




h/ flash recognition and verification


sattest
USAGE :
sattest UNIX_TIME [OPTIONS]

sattest  1168221921 -ra=60.78610995 -dec=-38.00133888 -all -radius=2
or to list all visible :
sattest  1168221921
to list all :
sattest  1168221921 -all
Program lists satellite positions at a given time ; the file satelitesdb.tle must exist in the current directory and
should be fresh




i/ finding limits

find_limit_ast!
USAGE :
find_limit_ast! RA DEC
Finds the limit at a given position according to a given list of ast files and RA,DEC
maglimit_db_frame!
USAGE :
maglimit_db_frame! DBNAME ID_FRM MIN_NO_MEASURE - - - X Y RADIUS
finds the distribution of stars on a given frame so the limit can be found "by eye" ;
it is possible to specify a position on the frame to check
Usage example :
nohup maglimit_db_frame! aver20_2006 15499 10 - - - 1000 1000 200 > limit.txt 2>&1 &


6. Running flash recognition algorithms and daq :


ccddouble
ccddouble -h
main DAQ program, collects frames from 2 CCD cameras and performs on-line ( or off-line ) analysis
ccdsingle
ccdsingle -h
runs analysis on single camera
ccdcollector
ccdcollector -h
For collecting frames from CCD , without any kind of analysis, just takes pictures ; on-line program to be used with real cameras or the simulator
Nparamstest2C
Nparamstest2C -h
for testing parameters and determining background and efficiency. Parameters to test should be listed in the mcinput.txt file
Nparamstest
Nparamstest -h
Same as above but for testing algorithms on a single camera
analres
analres -h
for printing results of Nparamstest, Nparamstest2C
ccdview
ccdview -h
PI fits viewer, but also can be used with real camera ( proper ccd.cfg must be used ), so that real images are collected and shown on-line !
varhisto
varhisto -h
same as ccddouble , but also histograms cut variables
runccdsingle!

script running program ccdsingle, but in case it terminates it is re-started , module pikam.ko is re-loaded
before re-start of ccdsingle
runccddouble!

script running program ccddouble, but in case it terminates it is re-started , module pikam.ko is re-loaded
before re-start of ccddouble


7. Useful scripts  :


a/ Preparing and running DAQ analysis

Script name
Usage
Description
getSatDBNew!
getSatDBNew!
transfers tle files from known sources and combines them to build the most up-to-date DB possible
prepCOIC_2K2K!
prepCOIC_2K2K! NIGHT LOCATION CAMERA

type :
prepCOIC_2K2K! -h
to see usage
prepares directory, cfg files, satdb for today's analysis
Script preparing configuration files for the night analysis ; it creates catalog /data2/results/YYYYMMDD and copies cfg files there ( by default from $SRCDIR/ccd/cfg/WZORY_CCD_CFG/doubleCAM_Real/2K2K_2K2K ). By default the current night is used ; in case a location is specified the file obs_site.cfg is overwritten with the cfg file for the specific location in the
data folder /data2/results/YYYYMMDD. So, for example, to prepare the analysis for the current night in Brwinow run :

prepCOIC_2K2K! - BRW f50mm

Acceptable location codes are :
LCO - Las Campanas Observatory ( DEFAULT )
BRW - Brwinow test station
CAN - Canary Island

The 3rd parameter is the camera ; currently possible values are :
f50mm    - Carl Zeiss 50mm
cannon_f85mm - Cannon 85 mm

prepSINGLE_2K2K!
prepSINGLE_2K2K! NIGHT LOCATION CAMERA

type :
prepSINGLE_2K2K! -h
to see usage
prepares directory, cfg files, satdb for today's analysis
Script preparing configuration files for the night analysis ; it creates catalog /data2/results/YYYYMMDD and copies cfg files there ( by default from $SRCDIR/ccd/cfg/WZORY_CCD_CFG/doubleCAM_Real/2K2K ). By default the current night is used ; in case a location is specified the file obs_site.cfg is overwritten with the cfg file for the specific location in the
data folder /data2/results/YYYYMMDD. So, for example, to prepare the analysis for the current night in Brwinow run :

prepSINGLE_2K2K! - BRW f50mm

Acceptable values same as above
runDAQ!
runDAQ!
runDAQ! 20040506
runs prepCOIC_2K2K! and starts ccddouble program
run_pisys!
run_pisys! check_dome 40 noinit
starts system, 3 parameters are optional , default is :
  - not to check dome status
  - 40
  - run init script
in order to skip a parameter in the middle put a - sign , example :

  run_pisys! - - noinit
ux2ut!
ux2ut! UNIX_TIME
Example :
   bash-2.05b$ ux2ut! 1093282693
   2004-08-23 17:38:13

prepdaq!
prepdaq! -h :

PARAMETERS :
LOCATION [ default =LCO ]
CAMERA [ default = f50mm ]
prepares a catalog for the current night analysis ( $RESDIR/YYYYMMDD ) with usage of 2 cameras
working in coincidence
Takes 2 parameters :

LOCATION acceptable values are :
LCO - Las Campanas Observatory ( DEFAULT )
BRW - Brwinow test station
CAN - Canary Island

CAMERA acceptable values are :
f50mm    - Carl Zeiss 50mm  ( DEFAULT )
cannon_f85mm - Cannon 85 mm

prepdaq_single!
same as above
prepares a catalog for the current night analysis ( $RESDIR/YYYYMMDD ) with usage of a single camera
looking for flashes in confirmation-on-next-frame mode.
Parameters same as above
prepOnlineSum!

preparing analysis on summed frames ( on-line - on copied data )
prepOnlineSumNew!
new version of preparing analysis on summed frames ( on-line - on copied data ), uses pi_red_frame and
runs ccddouble on summed frames
run_online_sum8!

starts the summation program and ccddouble running on summed frames

b/ Converting fits to jpg, viewing images, headers etc ...

fits2jpg!
fitc2jpg!
fits2jpg! xxx.fit
fitc2jpg! xxx.fitc
converts fits file ( or ASAS compressed fitc ) to jpg (  ~300 kB  per frame ) . To show usage run :
 fits2jpg! -h
 fitc2jpg! -h
Generally there are 3 compression options ( see description here ) , default is 1 ( compression :  convert -resize 20%x20% -normalize -colorspace GRAY -gamma 0.5 )

fits2jpg! xxx.fit - 0
fits2jpg! xxx.fit - 1
fits2jpg! xxx.fit - 2
tojpg!
Converts all fitc files in current directory and all subdirectories to jpg ( for search : 2jpg! )
scan_conv!
scan_conv!
converts all .fit files in the current directory to .jpg files , very useful for the sky scan, see here for more details
do_all_scans!

retrieves all scan fits headers to txt files


c/ Parsing and analysing event logs :

getevt_by_ut!
getevt_by_ut! LOG 20040922_023021 900
prints only those events from the log file which are within +-900 sec of date 20040922_023021
final!
final!
final! finalevents_1.log
adds column with LOCAL TIME of events in format YYYYMMDD_HHMISS
events!
events!
events! EVT_NO
For viewing events saved as 100x100 parts
prepevt!
prepevt!
prepevt! 20040718
run just prepevt! to prepare a tar.gz with final events only from the previous night's run
check_final_in_verif!
check_final_in_verif! verifiedevents_0.log
for finding final events according to verifiedevents log file
load_final_from_verif!
load_final_from_verif! pidb
for loading final events according to verifiedevents_0.log to DB

d/ Making parts of fits images , frames analysis :

getparts_scan.pl
getparts_scan.pl -h
script for creating parts of fits files according to the nova list generated by Kasia Malek's script .
PARAMETERS :
 -scans_dir=/scratch/pi_scan/results/DAQ_RESULTS/
 -file=scan_nova_list.txt
getpart2!

calls program getpart2 , but allows some actions to be faster :

getpart2! files 1230 1400

parts are saved to sub-directory 1230_1400/
also packs this directory to file totake.tar.gz
getpart3!
getpart3! -h
same but easier to use ; simply the frame range is passed :

getpart3! 20 30 1230 1400 dark2.fit k2b OUT_DIR SIZE

start frame = 20, end = 30 , x=1230 , y=1400

getpart_by_list!

getpart_by_list! start_frame end_frame x y dark frames_list OUT_DIR SIZE
getparts!

program for getting parts according to a specified events log file, usage :

getparts! allevents_0.log 1200 1220 0 1200_1220 dark.fit 5 50

this generates script tmp! which can be run in order to save parts of events or run :

getparts! allevents_0.log 1200 1220 1 1200_1220 dark.fit 5 50

in order to run tmp! automatically
it will cut event parts of size 50 from allevents_0.log, frames range 1200-1220, to directory 1200_1220 using dark.fit ;
takes +-5 frames for each event

To get darks , example - the date needs to be specified :

getparts! ../RESULTS/20041202_112238/allevents_1.log 100 300 0 100_300 ../dark2.fit 1 20 041202

getparts_by_list!

same but uses a list of fits files ( does not generate it like getparts! does ) :
      getparts_by_list! allevents_0.log 1200 1230 0 events_1200_1230 dark.fit 5 size FRAMES_LIST
getpart_by_radec!

usage:
getpart_by_radec! start_frame end_frame ra( 02h23m23.1s ) dec( 23.2 deg ) dark dir size

saves parts of the specified size around the given position to subdirectory dir from both cameras, example :

getpart_by_radec! 1900 1949 06h43m53s 20.338 dark1.fit GRB 100
gettrigger!

script for preparing parts of FITS corresponding to specified time range of event in log file , usage :

gettrigger! allevents_0.log YYYYMMDD_HHMISS 500  0 events_1200_1230 dark.fit 5 size

this generates script tmp! ; in case it should be run automatically specify 1 instead of 0



e/ Transferring files from LCO :

getfits!
getfits! -h
Use this script to retrieve event fits parts , usage example :

getfits! 20040926 256 - GET

retrieves from pi2 the events from night 20040926 and frame 256





f/ Operating camera and webcamera :

k2a_sky! / k2b_sky!
k2a_sky! 5
k2b_sky!
takes desired number of frames ( 1 if no parameter provided )
k2a_usbtest! , k2b_usbtest!

testing usb
takePhoto! takePhoto!
takes photo with webcam
shutter_opened_mode!
shutter_normal_mode!


scripts for changing shutter mode to permanently opened or to normal mode ( open/close )
load_altera!
loading of firmware to altera ; this is a complete script for loading vhdl to the altera, however it is currently
not working yet, maybe GK will find out why ...

g/ Handling camera driver modules ( modules reloading etc ), scripts requiring root permissions :

script name
Usage
Description
load_cypress!
load_cypress!  CAMERA LOAD.bin -doload
loads firmware to the camera ; add the -doload option to actually perform loading, otherwise it only checks if everything is ok
loadwebcam! / unloadwebcam!

load / unload webcam modules
loadpikam! / unloadpikam!

load / unload pikam module
rescue1!

reloads camera module
rescue2!
rescue2! -h
reloads the camera module and unloads the USB2.0 module ; in case test2K2K works , modprobe ehci_hcd should be executed in order to load USB2.0 again. The script can be run without options, in such case it unloads USB2.0 and reloads the piman module :

rescue2!
test2K2K

or

rescue2! USB20
test2K2K

in this case the USB2.0 module is reloaded back. I would recommend to run first without options , check with test2K2K, and if the cameras work
run with option USB20 and check again



h/ scripts for analysing log files etc :

getnottrack!
getnottrack! verifiedevents_0.log
prints only those lines of the specified file which are not events assigned to a fitted track ; can be called without a parameter :

getnottrack!

in this case default file is averframeevents_0.log

checktrack    

USAGE:

checktrack X Y Frame# -radec -ra=34.43 -dec=-23.43 -radecfile

To check radec track go to directory with tracks ( on heplx40 it is events/200606/20060606/Tracks/Cam0) and run command :


checktrack   238 953 2788 -ra=0.5307 -dec=0.7094 -radecfile

where event (x,y)=(238,953) and (ra,dec)=(0.5307,0.7094)  and option -radecfile
means that track file with radec tracks will be used
in order to show individual track run scripts :

getfulltrack!
getfulltrack_radec!

In order to get information from finalevents_0.log file which can be added to track.lst file
use script : show_final!


i/ on GCN alert :

find_frames_list!
find_frames_list! -h
script for finding frames by RA,DEC coordinates for the night specified in the DATE/TIME of the GRB alert , example :

find_frames_list! RA DEC YYYYMMDD_HHMISS
find_frames_list! 5.00 -20.00 20041226_012320

RA - in hours 5.45 or 05h24m34.34s
DEC - in degrees
DATE - date of midnight of next day after starting date
subdirectory Event_5.00_-20.00_20041226_012320 is created and parts 100x100 from both cameras are stored in proper subdirectories

NOTE : file new_frames_list_ccd1 is required to run this script , this can be found in night data catalog and has format :

FILE                          FRAME RA                 DEC        UNIX_TIME
-----------------------------------------------------------------------------------------
k2a_070704_00023.fitc 23 12.00410776 -29.99998549 1183589205
k2a_070704_00025.fitc 25 12.00410985 -29.99998549 1183589283
k2a_070704_00026.fitc 26 12.00495581 -30.01464363 1183589295

The best way - copy all files "*list*" to frames directory from pi2:/data2/results/20070620/


list_frames!
list_frames!
script for listing frames of specified coordinates and radius, example :

list_frames! RA[in hours] DEC[ in deg ]  FRAMES_LIST[ default = new_frames_list_ccd1 ] RADIUS[ default = 15 deg ]
units as for find_frames_list!  - see above
On output list of frames containing specified position is generated


j/ scripts for analysing FITS headers :

listkey! , listkey_fits! , listkey_ast!

prints specified keys from all files ( or fits, ast only ) from current directory , example :
        listkey! RA DEC
        listkey_fits! RA DEC
        listkey_ast! RA DEC





k/ scripts for viewing traces :

crontrace!

shows traces from crontab
daqtrace!

shows current night DAQ trace
pimantrace!

shows current night piman trace
mounttrace!

shows current night mount trace

l/ ASAS catalog light-curves :

series6f_auto!

finds series of frames for same coordinates  ( by Kasia Kwiecinska )





m/ Perl scripts for postgres database analysis.

average_measurements.pl
-min_points=1600 -n_aver=20
script for averaging measurements of fast photometry in the database ; saves results to an sql file to be loaded to a new DB
list_stars.pl

lists stars and measurements

n/ astrometry / photometry


do_astro_on_scan.sh
runs astrometry on scan files
do_check_scan_ast!

runs astrometry on scan frames if missing and writes results to the fits file header










o/ backups / synchronization


pidb_dump_table!
pidb_dump_table! table_name database_name
This script dumps a given table from the specified database to a file named by the table name , example :

pidb_dump_table! stars fast1

creates file stars.sql where COPY command is present
dump_stars!
dump_stars! database_name
Dumps tables for star catalog seed : stars , superstar, frame and frame_det
NOTE : table measurements is not dumped
pidb_dump_catalog!
dump_stars! database_name
drops all star catalog tables : stars, superstar, measurements, frame and frame_det
pidb_dump_all!
pidb_dump_all! database_name
dumps whole pi-database
pidb_dump!
pidb_dump!
dumps tables event, frame, frame_det from default database pidb



synch_db.pl
PARAMETERS :
-remote_db - remote database name ( data
                     source )
-remote_host - remote host name ( source)
-local_db - local database ( destination )
-local_host - local host
( destination )
-night - night to be synchronized
-do_insert - do insert to local database ?
-verb - verbose mode
script for synchronization of remote database and loading data to local database
Only tables : frame, frame_det, event are synchronized
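An illustrative call ( host and database names are only placeholders ) :
     synch_db.pl -remote_db=pidb -remote_host=pi2 -local_db=pidb -local_host=localhost -night=20070620 -do_insert -verb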
synch_flareevents.pl
PARAMETERS :

Synchronization of flares ; it retrieves all flares from the current night from the remote host and saves them to the local
database. Already existing flares are skipped ( no duplication occurs )
synch_interesting.pl
PARAMETERS : Synchronizes interesting objects from the remote machine ; in case the IO record already exists in the local DB new
measurements are retrieved and the Star record is updated
synch_aver20.sh
PARAMETERS:
1 - database name [default aver20_2006]
2 - max wait time in seconds [ default 12 hours ]
Main synchronization script, waits until cataloging is finished on remote host and gets data from aver20 database
synch_scan.sh
PARAMETERS:
1 - database name [default scan ]
2 - max wait time in seconds [ default 12 hours ]
3 - wait for cataloging yes/no [ default 1 ]
Main synchronization script, waits until cataloging is finished on remote host and gets data from scan database
copy_all_io.sh
PARAMETERS:
    none

synchronizes interestingobjects in the aver20 database to the aver20_iodb database. This gives the possibility to synchronize the
star measurements on-line during data collection


p/ useful root-scripts ( available in daq/src/ccd/scripts/ROOT_SCRIPTS/PI_HISTOGRAMS )


plotfile.C
PARAMETERS :
1 - file name ( format x y )
2 - function to fit ( NULL is no fitting )
3 - min_y
4 - max_y
for plotting a graph of points listed in a text file
histofile.C
PARAMETERS :
1 - text file name
2 - column to be histogrammed ( starting from 0 )
3 - do fit ( default 0 )
4 - lower end ( default auto - minimum )
5 - upper end ( default auto - maximum )
6 - number of bins ( default 100 )
7 - logarithmic scale ( default 0 )
8 - X axis title ( default NONE )
9 - Y axis title ( default NONE )

for histogramming a variable in a text file
flare_lc.C
PARAMETERS :
1 - file name ( format x y )
2 - function to fit ( NULL is no fitting )
3 - min_y
4 - max_y
plots found flare light curve in file format :

# final_min_mag=8.04797
# final_max_mag=8.66076
# min_mag=7.67889
# max_mag=9.02186
165.761152329855 8.40658
165.764427649789 8.97653
165.767668790184 7.99856
165.776025480125 8.04797

with bound lines determined in analysing program


r/ operating daq by CORBA ( without piman )

the corba manager program must be started first - script corba_nsd!

shutter_open_mode!
shutter_normal_mode!

scripts for changing shutter mode to permanently opened or to normal mode ( open/close )
daqExit!

requests the daq program to exit
SetShutterTime!
SetShutterTime! 10
changing shutter time






s/ retrieving information from web ( pointing )


get_swift_info.sh

retrieves SWIFT satellite pointing information from http://www.swift.psu.edu/operations/obsSchedule.php
This information is divided into days in UT, so for a night in LCO both the current and next day files must be
concatenated together, and this is done by this script ; the resulting txt file in a simple format is saved to :
/opt/pi/dev/pisys/log/pointing/swift/swift_current.txt
This script should be executed just before the script generator to have a fresh pointing plan for the current night.

t/ analysing images, getting parts etc ...


 make_object_parts_both_cam!
PARAMETERS :
1 - directory where fits files are stored
2 - final directory to which results of getting parts should be copied
3 - getpart2 options ( example : -mark )
getting parts of a given position from data in a given location with a list of ast files provided ; this list may be found in the DB and used here, calls script
 make_object_parts! for both cameras and then pifrbrowser

 make_object_parts!
PARAMETERS :
1 - list of fits files
2 - fits directory
3 - number of camera k2a-1 , k2b-2
4 - options passed to getpart2
gets parts from full images , according to a given list of ast files ( as found
by a select in the DB ) ; additional options like -mark can be passed to the script in parameter 4 , the result is stored in subdirectory parts/
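An illustrative call ( the ast list name is only a placeholder ) getting marked parts from camera k2a :
     make_object_parts! object_ast_list /data2/results/20070620/ 1 -mark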

u/ retrieving information from remote hosts ( pi2@LCO )


get_night_frames_log_files!
PARAMETERS :
1 - parameter for "nightdate +N" - days back/forward
2 - host name
3 - destination directory on local host
This script :
- retrieves mount log files and calls the script for making mount plots
   ( by Kasia Malek ) :
       mount.script_vs_daq.sh
- retrieves txt header files from pi2
- copies mount plots to their final location
 




w/ getting images from remote


run_request_handler!

pseudo-server waiting for a request to make subrendres on a remote host which can then be copied
by scp. It is possible to pass configuration parameters by filling the file :
   /opt/pi/dev/pisys/daq/ndir//cfg/sys.cfg
example :
 pi_pipeline_dir = /pi3/data3/results/PI_PIPELINE/
 pi_pipeline_host = pi3.lco.cl






8. Data analysis scripts  :

Data analysis scripts are mainly written in perl ; these are :

find_flares.pl
PARAMETERS :
-min_points_above - minimum number of points with magnitude less than mag_limit ( default 3 )
-min_points            - minimum number of measurements for a star ( default 20 )
-n_sigma                - for threshold on max_mag-min_mag > aver(mag) + n_sigma*sigma(mag)  (default 1 )
-min_peak_hight    - limit for the height of the found peak in brightness (default 0.25 m )
-min_mag_star       - minimum average magnitude of a star to be analysed (default 11 )
-dbname                 - database name (default pidb )
-dbhost                   - database host  (default localhost )
-star                        - id of star (internal id - number ) in case a single star check is required
-n_peak_sigma       - threshold for peak limit in sigmas ( sigma calculated in a band of 85% of measurements )
                                 like on the example plot
-night                     - name of night
-do_night               - name of night
script for finding flares on single night
Method described here
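An illustrative call for a single night ( database name, host and night are only placeholders ) :
     find_flares.pl -dbname=pidb -dbhost=localhost -night=20060501 -min_points=20 -n_sigma=1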
find_flares_all_nights.pl
PARAMETERS :
-min_points_above - minimum number of points with magnitude less than mag_limit ( default 3 )
-min_points            - minimum number of measurements for a star ( default 20 )
-n_sigma                - for threshold on max_mag-min_mag > aver(mag) + n_sigma*sigma(mag)  (default 1 )
-min_peak_hight    - limit for the height of the found peak in brightness (default 0.25 m )
-min_mag_star       - minimum average magnitude of a star to be analysed (default 11 )
-dbname                 - database name (default pidb )
-dbhost                   - database host  (default localhost )
-star                        - id of star (internal id - number ) in case a single star check is required
-n_peak_sigma       - threshold for peak limit in sigmas ( sigma calculated in a band of 85% of measurements )
                                 like on the example plot
-night                     - name of night
-do_night               - name of night
-verb                      - turns on verbose mode - a lot of messages
script for finding flares on all nights cataloged in the database
method as above , described here
The program makes a big loop over all stars in the database ; each star is verified for flares during each night

usage example :
find_flares_all_nights.pl -dbname=2004_2005 -dbhost=localhost
all_night.pl
info on Kasia Malek's page and here
SCAN analysis by Kasia Malek
do_flareevents.pl
PARAMETERS :
usage: do_flareevents.pl -night=-1 -no_measur_star=0 -min_obs_field=5  -mag=11 -db_save -dbname=pidb_test -camera=2 -use_max_run -verb
script for finding flare events of the kind where a new star
appears ; it is based on Kasia's nova finding script, but had to be modified. In case option -db_save is
enabled results are saved to table FlareEvents
do_flareevents_all_nights.pl
PARAMETERS :
usage:
runs do_flareevents.pl on all nights in database, results are stored in table FlareEvents


9. Pi system controlling programs and scripts  :


fieldpos
PARAMETERS :
-size=30 - size of FOV
-step=15 - step between FOVs - in order to have half of the fields overlapping put size/2 here, or size in case of no
                 overlapping
-verb - to enable verbose mode
This is a program generating the list of fields to be
used in scan and to be used for following a satellite FOV.
In order to have stars in the same positions on the chip
( to be comparable between nights ) it's better to
have such a constant list of fields for a given camera.
The list of fields was generated for the Carl Zeiss f=50mm camera , in this case FOV=30 deg
so the command was :
   fieldpos -size=30 -step=15
in order to have 15 degrees overlapping fields
genfields!

generates list of fields in correct order and odd_event.txt file
genscript
PARAMETERS :
-az_h_list - name of file with list of AZ,H fields to be used instead of
                  predefined list of RA,DEC fields, in case one wants
                  to use best AZ,H field close to ra,dec position
-az_h_field - passes coordinates of single AZ,H field to be used
-force_field=(az=286.45,h=39.8,time=0630) - it is possible to force
                    observation of a specified desired position at a specified time
                    or =(ra=12.00,dec=-20.123,time=0440)
-sat_order=INTEGRAL,HETE it is possible to set the order of importance
                  of satellites to be followed, default is HETE,INTEGRAL
                  currently only these 2 are supported in the following option
-init_cmd="daq 0 stat" it is possible to put some initialization commands
                 which will be executed at the very beginning of the night script
-min_alt - minimal altitude to follow objects ( default 28 degrees )
-enable_camera - NOT USED
-enable_astro     - NOT USED
-hete_step - time interval [second] in which field of hete is followed, after
                    this time next potential target is calculated ( default 40 min =
                    40*60 seconds )
-show_hete - prints HETE path on sky (az,h) during night, for debugging
-nowget - the hete position file is not retrieved from WWW, a local file is used
                instead ( in case it exists )
-show_moon - shows moon positions
-show_sun - shows sun positions
-sun - specifies the altitude of the sun which is used to calculate the sun-set and sun-rise
          times which are used as the start/end observation time ( default -10 degrees )
-hete - altitude of object which is used as good altitude to observe ( default is
           30 degrees )
-swift - enables following of SWIFT, it is not fully implemented, position
            information is retrieved, but following is not implemented like
            for HETE and INTEGRAL
-show_swift - shows SWIFT positions
-scan - specifies altitude of sun at which scan  of  whole sky may be performed
           ( default -15 degrees )
-night - specifies night for which script must be generated ( default current  
             night)
-verb - verbose mode
-scanev - specifies time of evening scan
-scanmor - specifies time of morning scan
-file - specifies name of pish file
-clt - forces change of time zone into LCO time ( Central Latin Time )
-integral - enables following of INTEGRAL
-nomooncheck - disables checking of the MOON position, to avoid observations
            too close to the MOON
-err - specifies error log file , default genscript.err
-fields - specifies a list of favourite fields ; in case of following close to them
             they are used rather than other, even closer, fields
-nofields - disables following by the pre-defined list of fields ; normally,
               when the RA,DEC of an object is chosen, the closest field
               is found and it is chosen to be followed ; in case the -nofields
               option is passed the telescope will follow just the position of
               the object instead of the position of the closest field
-genscan - enables generation of scan commands ; also the number of field
                  observations during scan can be passed ( -genscan=5 ) , default
                # of observations is 3
-magellan - enables following of LMC and SMC , as lowest priority targets
-force_hete_dec - forces declination of HETE satellite, it was used in order
                             to follow with Petersen's telescope on La Silla
                             which only could observe declination =0
-daq_cmd_after_scan - specify command which should be performed after
                                     evening and morning scan
-turn_off_k2b_astro_after_scan - turns off k2b camera after scan
-autoguide_on_off - disabling autoguide for mount move ( OBSOLETE -
                                I think )



This is the program for generating the night script controlling
the pi system in LCO. Currently the main idea is to
follow the field of view of HETE, INTEGRAL or SWIFT.
In the evening and morning a scan of the whole celestial sphere is
performed. It is possible to require observation of a specified
field at a specified time


Currently default options used in LCO :
genscript -file=auto_gen/autogen.pish -err=auto_gen/autogen.err
-show_moon -show_hete -show_sun -genscan -integral $OPTIONS -show_swift
mount_calib
PARAMETERS :
1 - comma separated list of ast files :
 k2a_050209_000.ast,k2a_050209_00075_rot30.ast,k2a_050209_00075_rot60.ast

-dec_pole - declination of pole
-ra_pole   - right ascension of pole

This is a program for finding the distance of the mount rotation axis to the pole ; the number of mm screw-moves is also calculated , usage example :
mount_calib k2a_050209_000.ast,k2a_050209_00075_rot30.ast,k2a_050209_00075_rot60.ast -dec_pole=-89.999

It works in such a way that it finds common stars in all ast files and then, assuming they were
rotated, finds the mean center of rotation ; it calculates the (x_center,y_center) position of the rotation center and (x_pole,y_pole) of the pole ( the default pole coordinates can be overwritten )
Then it finds the distance (dx,dy) and determines the screw move number
It is possible to call this procedure from piman, the following script should
be used for this purpose mount_calib.pish , description of the axis calibration
procedure can be found here.
For testing reasons on heplx43 script mount_calib_test! was created and
can be used :
ssh heplx43
cd $RESDIR/simul/ccddouble/mount_calib/
./run!
./mount_calib!


sendreq2
PARAMETERS :
Syntax :
sendreq2 COMMAND OPTIONS
The following commands are currently handled ( for meaning of command
see link to piman command description ) :

       
COMMAND | OTHER NAMES | DESCRIPTION | IN PIMAN
DAQ_DO_DARKS | | | do_darks
DAQ_TAKE_N_PICTURES | | | take_npictures
DAQ_TAKE_N_PICTURES_SYNCHRO | | | take_npictures_synchro
CCD_CHANGE_PARAM | | | change_param
CCD_GET_PARAM | | | get_param
DAQ_RESET | | |
DAQ_EXIT | exit / CCDDRV_EXIT | | quit
CCDDRV_SET_TEMP | | | set_temp
CCDDRV_GET_TEMP | | | get_temp
CCDDRV_GET_COOLING | | | get_cooling
CCDDRV_SET_COOLING | | | set_cooling
DAQ_SAVE_CURRENT | | |
DAQ_GET_SHUTTER_TIME | | | get_shutter_time
DAQ_SET_SHUTTER_TIME | | | set_shutter_time
DAQ_GET_CURRENT_FRAME | | |
DAQ_GET_PICTURE_SIZE | | |
DAQ_SEND_ALERT | | |
DAQ_SET_MOUNT_POSITION | | sets coordinates in daq | fast_pos_to_daq
DAQ_ON_TRIGGER_POSITION | | |
DAQ_START_ANALYSIS | start | | start_analysis
DAQ_STOP_ANALYSIS | stop | | stop_analysis
DAQ_STOP_ANALYSIS_NO_WAIT | stop_no_wait | |
TEST | | testing if daq is alive |
DAQ_GET_POSITION | | gets position from daq ( astrometry ) | get_position
DAQ_LOAD_PARAM_FILE | | loads specified cfg file |
DAQ_DO_ASTROMETRY_NOW | | forces execution of astrometry now |
DAQ_DO_ASTROMETRY_MODE | | |
DAQ_START | | | start_daq
DAQ_SET_CUSTOM_KEY | | sets fits keyword value | set_fits_key
DAQ_CLEAN_AST_LIST | | | clean_ast_list
GETSTATUS | | | stat
DAQ_CALC_MOUNT_AXIS_CALIB | | calculates mount axis calibration according to collected ast files | calc_mount_axis_calib

Besides the command, it is in most cases necessary to provide command parameters, which can be the following :

-camno - number of camera
-param - to provide parameter name and value for command DAQ_CHANGE_PARAM
-id - to send alert with given id
-dtm -  date / time of alert to be sent
-move  - alert sending tests - moving to alert
-ra / -ra_in_h - ra coordinate to be used
-dec / -dec_in_deg - dec coordinate to be used
-prior - alert priority
-type - alert type ???
-azim / -alt - azim , alt to be used
-mode  - observing mode to be set
-value - sets value of parameter for commands : DAQ_TAKE_N_PICTURES , DAQ_DO_DARKS,
           DAQ_TAKE_N_PICTURES_SYNCHRO, CCD_CHANGE_PARAM,
           CCDDRV_SET_TEMP, CCDDRV_SET_COOLING, DAQ_SET_SHUTTER_TIME,
           DAQ_GET_CURRENT_FRAME , DAQ_LOAD_PARAM_FILE, DAQ_SET_CUSTOM_KEY,
          


This was originally a program for testing communication with daq using CORBA ;
now the main tools for this communication are piman and pishell, but it
is sometimes useful to send a command directly to daq ( in case some new command
is not yet implemented in the piman/pishell interface )









10. DAQ simulator

In order to run the daq simulator, daq should be checked out from the svn repository and the script run_daq_simul should be executed ; it has the following options :
Without parameters or with parameter 0 it starts daq in waiting mode :
         run_daq_simul
         run_daq_simul 0
When started with a parameter >0 ( for example 1 ) the simulator starts with a set of standard frames and makes astrometry etc. :
         run_daq_simul 1

11. GRB Limits

The first thing to do to check the limit ( or magnitude ) of the optical counterpart of a GRB is to get parts using the interface. After parts are retrieved , check the images ( aver20 and singles, possibly scan ).
Then verify the limits determined automatically ; in a subdirectory of the results dir these limits can be found in the file lc.txt ( or RADEC.txt ).
These limits are quite good ; in order to obtain them manually use ast files ( example ) and run the script :

          find_limit_ast! 3.082466667 -47.370

which finds the limit by finding the average correction of magnitude ( the value to be added to the instrumental magnitude ) and adding it to the magnitude of the 3 Sigma value stored in header key MAGLIMIT .
Another way of finding limits is usage of the script maglimit_db_frame! which takes the name of the DB ( aver20_2006 ) , id_frm , min_measure_count and makes a distribution of star
magnitudes on the given frame, example :
       maglimit_db_frame! aver20_2006 15499 20
Example of the magnitude limit from this script , giving a limit of ~12.5-13 mag ( grb070209 )
It is also possible to check the maglimit at a given position (X,Y) :
      maglimit_db_frame! aver20_2006 15499 20 - - - 1000 1000 200
This takes into the distribution only stars near (x,y) = (1000,1000) within radius=200 pixels , example of a distribution which gives a limit ~13 mag ( grb070209 )
It was tested on aver20_2006@pi3 : /data3/results/PI_PIPELINE/20070208/grb070209