Changeset 19032 for trunk/DataCheck/Tools
Timestamp: 06/28/18 13:05:39
Files: 1 edited
Legend (line prefixes in the diff below):
  ' '  unmodified
  '+'  added
  '-'  removed
trunk/DataCheck/Tools/get_data.sh
--- trunk/DataCheck/Tools/get_data.sh (r19031)
+++ trunk/DataCheck/Tools/get_data.sh (r19032)
@@ -1,5 +1,41 @@
 #!/bin/bash
 
-# todo
+# ----------------------------------------------------------------
+# README README README README README README README README README
+# ----------------------------------------------------------------
+#                                                                #
+# To use this script, you need                                   #
+# - a computer with access to the FACT database in La Palma      #
+# - a file with the password of a valid mysql-user               #
+# - to define the setup below for                                #
+#   a) the DB access                                             #
+#   b) the data you want to have                                 #
+#                                                                #
+# To define the setup, search for SETUP in this script and       #
+# read the details there (starting roughly line 295)             #
+#                                                                #
+# Per data request, you get up to 3 files:                       #
+#   *_internal.dat                                               #
+#   *_collaborators.dat                                          #
+#   *_external.dat (only if binning is 20min or nightly)         #
+#                                                                #
+# Please have in mind that this started as a tool for myself,    #
+# then others started using it. Also the script is not yet       #
+# finalized. In case you find problems and/or have a feature     #
+# request, please send an email to dorner@astro.uni-wuerzburg.de #
+#                                                                #
+# ----------------------------------------------------------------
+# README README README README README README README README README
+# ----------------------------------------------------------------
+
+
+
+
+
+# ToDo (notes DD):
+# ----------------
+# - add file for collaborators
+# - update columns and content for 3 types of files
+# - limit creation of file for externals to
 # - update function for correction
 # - update CU for QLA
@@ -9,4 +45,38 @@
 # - check crab flux
 # - add E2dNdE?
+# - functionality to determine start time for season-binning
+# - can get_data.sh / Send_Data*.sh be combined?
+#   get_data.sh should be able to run stand-alone and be kept simple for any user
+
+#
+# content of files (wish list):
+# -----------------------------
+# REMARK: keep order of columns to allow for reading with TGraph directly from file: X Y EX EY
+#
+# internal
+# --------
+# time: time, delta time, start, stop, ontime
+# flux: excrate, excerr, corrate, corerr, CU CUerr, flux, fluxerr
+# other info on flux: signif, cu-factor, num exc, num sig, num bg
+# other info: zd th R750cor R750ref
+#
+# external (allow only 20min and nightly binning)
+# --------
+# time: time, delta time, start, stop
+# flux: excrate, excerr
+#
+# collaborators
+# -------------
+# time: time, delta time, start, stop, ontime
+# flux: excrate, excerr, corrate, corerr, flux, flux-err, significance
+#
+# additional information to put:
+# ------------------------------
+# timestamp of creation
+# query (for debugging / answering questions)
+# policy (adapted for internal/collaborators/external) [define in files to be used also by Send_Data*.sh]
+#
+
+
 
 function get_results()
@@ -33,5 +103,6 @@
 where=$where" "$dch
 
-cufactor=" Avg(25.2) "
+#
+cufactor=" Avg(CUQLA(fNight)) "
 crabflux="3.37e-11"
 fluxprec=13
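The hunk above replaces the hard-coded Crab-unit factor Avg(25.2) with the DB function CUQLA(fNight), while crabflux remains the reference value used to convert Crab units to a flux. In the script itself this product is assembled into the query string (see fluxerr2="$cuerr2*"$crabflux in the next hunk). As a stand-alone shell illustration of the same scaling, not part of the changeset, with a made-up Crab-unit value cu:

  # sketch only; cu=0.5 is a hypothetical value in Crab units
  crabflux="3.37e-11"
  cu=0.5
  # awk handles the exponential notation that plain shell arithmetic cannot
  flux=$(awk -v cu="$cu" -v cf="$crabflux" 'BEGIN { printf "%.3e", cu * cf }')
  echo "flux = $flux"   # prints: flux = 1.685e-11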
@@ -108,24 +179,4 @@
 fluxerr2="$cuerr2*"$crabflux
 
-# internal
-# --------
-# timeselect:
-#   mjdstar, mjdstop, mjdmean, ontime
-# excselect:
-#   excrate, excerr
-# corrected: excrate, excerr
-# CU CUerr
-# flux, fluxerr
-# addselect:
-# signif
-# num exc, num sig, num bg
-# other info: zd? th?
-#
-#
-# external
-# --------
-# time, delta time, start, stop
-# corr-excrate, corr-excerr
-# flux, flux-err
 
 if [ $bin -le 0 ]
@@ -242,17 +293,26 @@
 }
 
-# setup
-# db
-sqlpw=/home/$USER/.mysql.pw   # file with mysql credentials
-#host=lp-fact
-host=10.0.100.21
-#host=localhost
-dbname=factdata   # name of database
+# SETUP:
+# ------
+# DB SETUP:
+# ---------
+# path to file with mysql password
+sqlpw=/home/$USER/.mysql.pw
+# host of mysql server with FACT DB
+#host=lp-fact      # ISDC
+host=10.0.100.21   # LP or LP via vpn
+#host=localhost    # your local machine in case you have a copy of DB
+# name of database
+dbname=factdata
 # defaults for zd and threshold
 zdmax=90     # all data
 thmax=1500   # all data
+#
+# SETUP for your data:
+# --------------------
 # output path
 path=`dirname $0`
 datapath=$path"/data"
+# create directory for data files
 if ! [ -e $datapath ]
 then
@@ -264,8 +324,8 @@
 timeunit=mjd
 # time binning
-# positive values: minutes
-# negative values: days
-# special case 0: period
-# for season binning choose -365 and according start date
+#   positive values: minutes
+#   negative values: days
+#   special case 0: period
+#   for season binning choose -365 and according start date
 #bin=20   # minutes
 #bin=0    # period
@@ -288,8 +348,11 @@
 # 501 MAGIC
 source=2
-name="Mrk501_2014JulAug"
+name="Mrk501_2014_QLA"
 bin=-1
-nightmin=20140714
-nightmax=20140805
+nightmin=20140501
+nightmax=20140930
+get_results
+table="AnalysisResultsRunISDC"   # ISDC
+name="Mrk501_2014_ISDC"
 get_results
 
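For orientation, a hedged walk-through of the new SETUP section: the script expects a file holding the mysql password (that the file contains only the plain-text password is an assumption here; the diff only calls it a "file with mysql password") plus one block of request variables per data set, closed by a get_results call. All values below are placeholders modeled on the Mrk 501 example in the diff:

  # create the credentials file referenced by sqlpw= (password value is a placeholder)
  echo "mypassword" > /home/$USER/.mysql.pw
  chmod 600 /home/$USER/.mysql.pw   # keep DB credentials readable only by you

  # request block as it would appear at the bottom of get_data.sh
  source=2             # source id in the FACT DB (2 = Mrk 501 per the example above)
  name="Mrk501_test"   # prefix of the output files written to ./data
  bin=-1               # >0: minutes, <0: days, 0: period binning
  nightmin=20140501    # first night, format YYYYMMDD
  nightmax=20140930    # last night, format YYYYMMDD
  get_results          # writes *_internal.dat / *_collaborators.dat / *_external.dat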