
Spectrum Analysis

Theory

Spectrum

The differential flux per area, time and energy interval is defined as

   \phi(E) = \frac{dN}{dA\, dt\, dE}

Often, $t$ is also referred to as the observation time $T$ and the effective collection area $A_{eff}$ is a constant.

For an observation with an effective observation time $T$, this yields, in a given energy interval $\Delta E$:

   N(\Delta E) = A_{eff} \cdot T \cdot \int_{\Delta E} \phi(E)\, dE

For simplicity, in the following, $\phi(\Delta E)$ will be replaced by just $\phi$, but it always refers to a given energy interval. If data is binned in a histogram, the relation between the x-value for the bin and the corresponding interval is not well defined. Reasonable definitions are the bin center (usually in logarithmic bins) or the average energy.
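For example, for logarithmic bins, a common choice (one possible convention, not necessarily the one used here) for the x-value of a bin with edges $E_{lo}$ and $E_{hi}$ is the geometric bin center

   E_c = \sqrt{E_{lo} \cdot E_{hi}}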

Efficiency

The total area and the corresponding efficiency are of course only available for simulated data. For simulated data, $A_0$ is the production area and $\varepsilon(E)$ the corresponding energy dependent efficiency of the analysis chain. For a given energy bin, the efficiency is then defined as

   \varepsilon(E) = \frac{N_{exc}(E)}{N_0(E)}

where $N_0(E)$ is the number of simulated events in this energy bin and $N_{exc}(E)$ the number of *excess* events that are produced by the analysis chain.

Note that the exact calculation of the efficiency depends on prior knowledge of the correct source spectrum $\phi(E)$. Therefore, it is, strictly speaking, only correct if the simulated spectrum and the real spectrum are identical. As the real spectrum is unknown, special care has to be taken of the systematic uncertainty introduced by the assumption on $\phi(E)$.

Effective Collection Area

The effective area is then defined as $A_{eff}(E) = \varepsilon(E) \cdot A_0$. Note that at large distances the efficiency vanishes, so that the effective area is an (energy dependent) constant, while $A_0$ and the efficiency are mutually dependent.

Excess and Error

The number of excess events, for data and simulations, is defined as

   N_{exc} = N_{on} - \bar{N}_{off}

where $N_{on}$ is the number of events identified as potential gammas from the source direction ('on-source') and $\bar{N}_{off}$ the average number of gamma-like events measured 'off-source'. Note that for simulations, $N_{off}$ is not necessarily zero for wobble-mode observations, as an event can survive the analysis as both an on- and an off-event if this is not prevented by the analysis (cuts).

The average number of background events $\bar{N}_{off}$ is the total number of background events from all off-regions times the corresponding weight (often referred to as $\alpha$). For five off-regions, this yields

   \bar{N}_{off} = \alpha \cdot N_{off} = \frac{1}{5} N_{off}

Assuming Gaussian errors, the statistical error is thus

   \sigma^2(N_{exc}) = \sigma^2(N_{on}) + \alpha^2 \sigma^2(N_{off})

For data this immediately resolves to

   \sigma(N_{exc}) = \sqrt{N_{on} + \frac{1}{25} N_{off}}

with the Poisson (counting) error $\sigma(N) = \sqrt{N}$.
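A short worked example with purely illustrative numbers: for $N_{on} = 100$ on-source events and $N_{off} = 250$ events summed over five off-regions,

   N_{exc} = 100 - \frac{250}{5} = 50, \qquad \sigma(N_{exc}) = \sqrt{100 + \frac{250}{25}} = \sqrt{110} \approx 10.5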

Weights and Error

In the following, $N$ refers to a number of simulated events and $M$ to a number of measured (excess) events.

$N_0(\Delta E, \Delta\theta)$ is the number of produced events in the energy interval $\Delta E$ and the zenith angle interval $\Delta\theta$. The weighted number of events in that interval is then

   \tilde{N}_0(\Delta E, \Delta\theta) = \sum_i w_i = \sum_i s(E_i) \cdot \tau(\Delta\theta)

with the spectral weight $s(E)$ to adapt the spectral shape of the simulated spectrum to the real (measured) spectrum of the source and the weight $\tau(\Delta\theta)$ to adapt the number of produced events to the observation time versus zenith angle.

The weighted number of produced events in the total energy interval and the total zenith angle interval is then

   \tilde{N}_0 = \sum_{\Delta E} \sum_{\Delta\theta} \tilde{N}_0(\Delta E, \Delta\theta) = \sum_{i=1}^{N_0} w_i

with $N_0$ being the total number of produced events.

For a sum of weights, e.g. $\tilde{N} = \sum_i w_i$, the corresponding error is

   \sigma(\tilde{N}) = \sqrt{\sum_i w_i^2}

As the energy of each simulated event is well defined, the spectral weight $s(E_i)$ carries no statistical error, and thus only the error of the zenith angle weight $\tau$ and the Poisson fluctuation of the counts contribute.

The weights are defined as follows. The spectral weight is

   s(E) = K \cdot \frac{\phi(E)}{\phi_0(E)}

where $\phi_0(E)$ is the simulated spectrum and $\phi(E)$ the (unknown) real source spectrum; $K$ is a normalization constant. The zenith angle weights in the interval $\Delta\theta$ are defined as

   \tau(\Delta\theta) = k \cdot \frac{T(\Delta\theta)}{N_0(\Delta\theta)}

where $N_0(\Delta\theta)$ is the number of produced events in the interval $\Delta\theta$ and $T(\Delta\theta)$ is the total observation time in the same zenith angle interval; $k$ is the normalization constant. With $\sigma(N_0) = \sqrt{N_0}$ and $\sigma(T)$ the uncertainty of the on-time, the error on the weight in each individual $\Delta\theta$-bin is then

   \frac{\sigma(\tau)}{\tau} = \sqrt{\left(\frac{\sigma(T)}{T}\right)^2 + \frac{1}{N_0}}

While $\sigma(T)$ is given by the data acquisition, about 1 s per 5 min run, $\sigma(N_0)$ is just the statistical error of the number of events.
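As an illustration with assumed spectral indices: if the simulation was produced with $\phi_0(E) \propto E^{-2.7}$ and the assumed source spectrum is $\phi(E) \propto E^{-2.4}$, then

   s(E) \propto \frac{E^{-2.4}}{E^{-2.7}} = E^{0.3}

so events at higher energies receive a larger weight. Up to the normalization, this ratio is what the expression (%9:spectrum)/POW(Energy, SpectralIndex) in the queries below evaluates.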

As the efficiency for an energy interval is calculated as

   \varepsilon(\Delta E) = \frac{\tilde{N}_{exc}(\Delta E)}{\tilde{N}_0(\Delta E)}

and $\tilde{N}_{exc}$ and $\tilde{N}_0$ are both expressed as the sum given above, the normalization constants $K$ and $k$ cancel.

The differential flux in an energy interval $\Delta E$ is then given as

   \frac{dN}{dE} = \frac{1}{\varepsilon(\Delta E)} \cdot \frac{M_{exc}(\Delta E)}{A_0 \cdot T \cdot \Delta E}

where $A_0$ is the total area of production and $T$ the total observation time. $M_{exc}(\Delta E)$ is the number of measured excess events in that energy interval.

Using Gaussian error propagation, the error in a given energy interval is then given by

   \sigma\!\left(\frac{dN}{dE}\right) = \frac{dN}{dE} \cdot \sqrt{\left(\frac{\sigma(M_{exc})}{M_{exc}}\right)^2 + \left(\frac{\sigma(\varepsilon)}{\varepsilon}\right)^2}

with $\sigma(\varepsilon)$ derived from the errors of the weighted sums $\tilde{N}_{exc}$ and $\tilde{N}_0$ as given above.

Conceptual Example

Define Binnings
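The queries below assume a table BinningTheta with one row per zenith angle bin (columns bin, lo, hi) that can be joined via the bin index returned by INTERVAL(). A minimal sketch, consistent with the example binning used below, could look like this; analogous tables would be needed for the binnings in estimated and true energy:

CREATE TEMPORARY TABLE BinningTheta
(
   bin SMALLINT UNSIGNED NOT NULL,
   lo  DOUBLE            NOT NULL COMMENT 'Lower edge of zenith distance bin in degree',
   hi  DOUBLE            NOT NULL COMMENT 'Upper edge of zenith distance bin in degree',
   PRIMARY KEY (bin) USING HASH
) ENGINE=Memory;

INSERT INTO BinningTheta (bin, lo, hi) VALUES
   (1,  5, 10),
   (2, 10, 15),
   (3, 15, 20),
   (4, 20, 25),
   (5, 25, 30);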

Get Data File List

A list with the file IDs of the runs containing the events to be analyzed is required at the moment. The following query retrieves such a list and fills a temporary table (DataFiles) with the IDs.

CREATE TEMPORARY TABLE DataFiles
(
   FileId INT UNSIGNED NOT NULL,
   PRIMARY KEY (FileId) USING HASH
) ENGINE=Memory
AS
(
   SELECT
      FileId
   FROM
      factdata.RunInfo
   WHERE
      %0:where
   ORDER BY
      FileId
)

%0:where is a placeholder, for example for

fZenithDistanceMean<30 AND fThresholdMinSet<350 AND 
fSourceKEY=5 AND 
fRunTypeKEY=1 AND 
fNight>20161201 AND fNight<20170201 AND
fR750Cor>0.9e0*fR750Ref
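With such a condition substituted for %0:where, the selection can be checked quickly, for example by counting the runs and summing their effective on-time (a simple sanity check, not part of the analysis chain):

SELECT
   COUNT(*) AS NumRuns,
   SUM(TIME_TO_SEC(TIMEDIFF(fRunStop,fRunStart))*fEffectiveOn)/3600 AS OnTimeHours
FROM
   DataFiles
LEFT JOIN
   factdata.RunInfo USING (FileId)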

Get Observation Time

The following query bins the effective observation time of the runs listed above in zenith angle bins and stores the result in a temporary table (ObservationTime). Note that the result contains only those bins which have entries.

CREATE TEMPORARY TABLE ObservationTime
(
   `.theta` SMALLINT UNSIGNED NOT NULL,
   OnTime FLOAT NOT NULL,
   PRIMARY KEY (`.theta`) USING HASH
) ENGINE=Memory
AS
(
   SELECT
      INTERVAL(fZenithDistanceMean, %0:bins) AS `.theta`,
      SUM(TIME_TO_SEC(TIMEDIFF(fRunStop,fRunStart))*fEffectiveOn) AS OnTime
   FROM
      DataFiles
   LEFT JOIN
      factdata.RunInfo USING (FileId)
   GROUP BY
      `.theta`
   ORDER BY
      `.theta`
)

%0:bins is a placeholder for the bin boundaries, e.g. 5, 10, 15, 20, 25, 30 (five bins between 5° and 30° plus underflow and overflow).
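For reference, MySQL's INTERVAL() function, which is used for the binning throughout, returns the index of the bin a value falls into: 0 for the underflow and N for the overflow bin. For the binning above:

SELECT
   INTERVAL( 3.2, 5, 10, 15, 20, 25, 30) AS underflow,  -- returns 0
   INTERVAL(12.7, 5, 10, 15, 20, 25, 30) AS bin2,       -- returns 2 (second bin, 10° to 15°)
   INTERVAL(42.0, 5, 10, 15, 20, 25, 30) AS overflow    -- returns 6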

Get Monte Carlo File List

The next query obtains all Monte Carlo runs which have their ThetaMin or ThetaMax within one of the bins obtained in the previous query, i.e. all MC runs that correspond to bins in which data is available. Strictly speaking, this step is not necessary, but it accelerates further processing. In addition (here as an example), only runs with even FileIds are selected as test runs (assuming that the odd runs were used for training). The resulting FileIds are stored in a temporary table (MonteCarloFiles).

CREATE TEMPORARY TABLE MonteCarloFiles
(
   FileId INT UNSIGNED NOT NULL,
   PRIMARY KEY (FileId) USING HASH
) ENGINE=Memory
AS
(
   SELECT
      FileId
   FROM
      ObservationTime
   LEFT JOIN
      BinningTheta ON `.theta`=bin
   LEFT JOIN
      factmc.RunInfoMC
   ON
      (ThetaMin>=lo AND ThetaMin<hi) OR (ThetaMax>lo AND ThetaMax<=hi)
   WHERE
      PartId=1 AND
      FileId%%2=0
   ORDER BY
      FileId
)

Get Zenith Angle Histogram

The following query creates a temporary table (ThetaHist). Internally, it first builds a common table expression (EventCount) which bins the events of the Monte Carlo files listed in MonteCarloFiles in zenith angle bins. This is then joined with the table containing the binned observation time of the data files (ObservationTime), and for each bin the ratio (ZdWeight) and the corresponding error (ErrZdWeight) are calculated (assuming an error on the on-time of 1 s per 5 min). To have the bin edges available in the same table, the binning (BinningTheta) is joined as well.

CREATE TEMPORARY TABLE ThetaHist
(
   `.theta`    SMALLINT UNSIGNED NOT NULL,
   lo          DOUBLE            NOT NULL COMMENT 'Lower edge of zenith distance bin in degree',
   hi          DOUBLE            NOT NULL COMMENT 'Upper edge of zenith distance bin in degree',
   CountN      INT UNSIGNED      NOT NULL,
   OnTime      FLOAT             NOT NULL,
   ZdWeight    DOUBLE            NOT NULL COMMENT 'tau(delta theta)',
   ErrZdWeight DOUBLE            NOT NULL COMMENT 'sigma(tau)',
   PRIMARY KEY (`.theta`) USING HASH
) ENGINE=Memory
AS
(
   WITH EventCount AS
   (
      SELECT
         INTERVAL(DEGREES(Theta), %0:bins) AS `.theta`,
         COUNT(*) AS CountN
      FROM
         MonteCarloFiles
      LEFT JOIN
         factmc.OriginalMC USING(FileId)
      GROUP BY
         `.theta`
   )
   SELECT
      `.theta`, lo, hi,
      CountN,
      OnTime,
      OnTime/CountN AS ZdWeight,
      (OnTime/CountN)*SQRT(POW(1/300, 2) + 1/CountN) AS ErrZdWeight
   FROM
      ObservationTime
   LEFT JOIN
      EventCount USING(`.theta`)
   LEFT JOIN
      BinningTheta ON `.theta`=bin
   ORDER BY
      `.theta`
)

%0:bins is a placeholder for the bin boundaries, e.g. 5, 10, 15, 20, 25, 30 (five bins between 5° and 30° plus underflow and overflow). It must be identical to the binning used for the data files in the ObservationTime query.

Analysis Query

The analysis of data and Monte Carlo files must be done in exactly the same way to produce reasonable results. Therefore, the exact same query (or code) should be used for the analysis of both. In this example, the query has to provide two columns, Weight and LogEnergyEst, for all gamma-like events. The weight must be +1 for an event in the on-region and -0.2 for an event in the off-region (corresponding to the five off-regions of the wobble positions). LogEnergyEst must contain the logarithm (base 10) of the estimated energy for each event. Only events surviving background suppression and spatial (theta) cuts should be considered.

WITH Table0 AS
(
   SELECT
      %0:columns  -- this could be removed if we can join events via the same columns (without CorsikaNumReuse)
      Weight,
      Size,
      NumUsedPixels,
      NumIslands,
      Leakage1,
      MeanX,
      MeanY,
      CosDelta,
      SinDelta,
      M3Long,
      SlopeLong,
      Width/Length      AS WdivL,
      PI()*Width*Length AS Area,
      cosa*X - sina*Y   AS PX,
      cosa*Y + sina*X   AS PY
   FROM
      %1:files
   LEFT JOIN
      %2:runinfo USING (FileId)
   LEFT JOIN 
      %3:events USING (FileId)  -- This could be replaced by a user uploaded temporary table
   LEFT JOIN 
      %4:positions USING (FileId, EvtNumber)
   CROSS JOIN 
      Wobble
   WHERE
      NumUsedPixels>5.5
   AND
      NumIslands<3.5
   AND
      Leakage1<0.1
),

Table1 AS
(
   SELECT
      %0:columns
      Weight,
      Size, CosDelta, SinDelta, M3Long, SlopeLong, Leakage1, WdivL,
      MeanX - PX/1.02e0 AS DX,
      MeanY - PY/1.02e0 AS DY
   FROM
      Table0
   WHERE
      Area < LOG10(Size)*898e0 - 1535e0
),

Table2 AS
(
   SELECT
      %0:columns
      Weight,
      Size, CosDelta, SinDelta, DX, DY, M3Long, SlopeLong, Leakage1, WdivL,
      SQRT(DX*DX + DY*DY) AS Norm
   FROM
      Table1
),

Table3 AS
(
   SELECT
      %0:columns
      Weight,
      Size, M3Long, SlopeLong, Leakage1, WdivL, Norm,
      LEAST(GREATEST((CosDelta*DY - SinDelta*DX)/Norm, -1), 1) AS LX,
      SIGN(CosDelta*DX + SinDelta*DY) AS Sign
   FROM
      Table2
),

Table5 AS
(
   SELECT
      %0:columns
      Weight,
      Size, Leakage1, WdivL, LX,
      Norm          *0.0117193246260285378e0 AS Dist,
      M3Long   *Sign*0.0117193246260285378e0 AS M3L,
      SlopeLong*Sign/0.0117193246260285378e0 AS Slope
   FROM
      Table3
),

Table6 AS
(
   SELECT
      %0:columns
      Weight,
      Size, WdivL, Dist, LX, M3L, Slope,
      1.39252e0 + 0.154247e0*Slope + 1.67972e0*(1-1/(1+4.86232e0*Leakage1)) AS Xi
   FROM
      Table5
),

Table7 AS
(
   SELECT
      %0:columns
      Weight,
      Size, Dist, LX,
      IF (M3L<-0.07 OR (Dist-0.5e0)*7.2e0-Slope<0, -Xi, Xi) * (1-WdivL) AS Disp
   FROM
      Table6
)

SELECT
   %0:columns
   Weight,
   (Disp*Disp + Dist*Dist - 2*Disp*Dist*SQRT(1-LX*LX)) AS ThetaSq,
   %5:estimator AS LogEnergyEst
FROM
   Table7
HAVING
   ThetaSq<0.024

The placeholder %0:columns is currently used to request Monte Carlo true values (such as Energy) in addition to the analysis values; it is only filled for the analysis of simulated data. There must be no comma behind it! In addition, %1:files is a placeholder for the table containing the FileIds to analyze, %2:runinfo for the table with the run info data, %3:events for the table with the image parameters, %4:positions for the table with the source positions in the camera, and %5:estimator for the expression estimating the logarithm of the energy.
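Purely as an illustration, with hypothetical table names and a dummy estimator, the placeholders could expand to something like this for simulated data (note the trailing comma in %0:columns):

%0:columns   -->  CorsikaNumReuse, Energy,
%1:files     -->  MonteCarloFiles
%2:runinfo   -->  factmc.RunInfoMC
%3:events    -->  factmc.EventsMC      -- hypothetical table name
%4:positions -->  factmc.PositionMC    -- hypothetical table name
%5:estimator -->  2.5 + LOG10(Size)    -- dummy formula, not a real energy estimator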

Analyze Data

The previous query is used, as a common table expression (CTE), to provide a table (Excess). This table is referred to in the following query, which produces a summary of the observation that is then stored in another temporary table (AnalysisData).
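Schematically, the pieces are assumed to be glued together as follows; %10:analysis is a placeholder (introduced only here) for the full analysis query of the previous section, and the trivial final SELECT stands in for the summary query shown below:

CREATE TEMPORARY TABLE AnalysisData ENGINE=Memory
AS
(
   WITH Excess AS
   (
      %10:analysis
   )
   SELECT * FROM Excess   -- replaced by the summary query below in the real analysis
)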

WITH Table0 AS
(
   SELECT
      INTERVAL(LogEnergyEst, %6:bins)  AS  `.energy`,
      COUNT(IF(Weight>0, 1, NULL))     AS  `Signal`,
      COUNT(IF(Weight<0, 1, NULL))/5   AS  `Background`
   FROM
      Excess
   GROUP BY
      `.energy`
   ORDER BY
      `.energy`
)
SELECT
   *,
   `Signal` - `Background`        AS `Excess`,
   LiMa(`Signal`, `Background`)   AS `Significance`,
   ExcErr(`Signal`, `Background`) AS `ErrExcess`
FROM
   Table0

%6:bins is a placeholder for a binning in log-energy, e.g. 2.5, 3.0, 3.5, 4.0, 4.5, 5.0 (five bins between 2.5 and 5.0 plus underflow and overflow).
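LiMa and ExcErr are user-defined SQL functions. Assuming LiMa implements the standard Li & Ma (1983) significance, their eq. (17), with the on/off exposure ratio $\alpha = 1/5$, it corresponds to

   S = \sqrt{2} \left[ N_{on} \ln\!\left(\frac{1+\alpha}{\alpha} \frac{N_{on}}{N_{on}+N_{off}}\right) + N_{off} \ln\!\left((1+\alpha) \frac{N_{off}}{N_{on}+N_{off}}\right) \right]^{1/2}

How Signal and Background map onto $N_{on}$ and $N_{off}$ (the Background column above is already scaled by 1/5) depends on the implementation of the function; ExcErr is presumably the corresponding Gaussian error on the excess as derived in the theory section.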

Analyze Monte Carlo Data

Similar to the analysis of the data, another query summarizes the Monte Carlo analysis. The result is stored in a temporary table (AnalysisMC).

WITH Table0 AS
(
   SELECT
      Weight,
      INTERVAL(Zd,            %6:theta)        AS `.theta`,
      INTERVAL(LogEnergyEst,  %7:energyest)    AS `.energyest`,
      INTERVAL(log10(Energy), %8:energysim)    AS `.energysim`,
      (%9:spectrum)/pow(Energy, SpectralIndex) AS SpectralWeight,
      LogEnergyEst - log10(Energy) AS Residual
   FROM
      Excess
),
Table1 AS
(
   SELECT
      `.energyest`,
      `.energysim`,

      -- Signal, Background, Excess
      SUM(  IF(Weight>0,     ZdWeight*SpectralWeight,       0)) OVER EnergyEst  AS  `SignalW`,
      SUM(  IF(Weight<0,     ZdWeight*SpectralWeight,       0)) OVER EnergyEst  AS  `BackgroundW`,
      SUM(  IF(Weight>0, POW(ErrZdWeight*SpectralWeight,2), 0)) OVER EnergyEst  AS  `SignalW2`,
      SUM(  IF(Weight<0, POW(ErrZdWeight*SpectralWeight,2), 0)) OVER EnergyEst  AS  `BackgroundW2`,
      COUNT(IF(Weight>0, 1, NULL))                              OVER EnergyEst  AS  `SignalN`,
      COUNT(IF(Weight<0, 1, NULL))                              OVER EnergyEst  AS  `BackgroundN`,
      -- SUM(  IF(Table9.Weight>0,     ThetaHist.Weight*spectrum/Energy,    0)) OVER EnergyEst  AS  `SignalW`,
      -- SUM(  IF(Table9.Weight<0,     ThetaHist.Weight*spectrum/Energy,    0)) OVER EnergyEst  AS  `BackgroundW`,
      -- SUM(  IF(Table9.Weight>0, POW(ThetaHist.Weight*spectrum/Energy,2), 0)) OVER EnergyEst  AS  `SignalW2`,
      -- SUM(  IF(Table9.Weight<0, POW(ThetaHist.Weight*spectrum/Energy,2), 0)) OVER EnergyEst  AS  `BackgroundW2`,

      -- Threshold
      SUM(    Weight *    ZdWeight*SpectralWeight   ) OVER EnergySim  AS  `ThresholdW`,
      SUM(POW(Weight * ErrZdWeight*SpectralWeight,2)) OVER EnergySim  AS  `ThresholdW2`,
      SUM(    Weight                                ) OVER EnergySim  AS  `ThresholdN`,
      -- SUM(     Table9.Weight/Energy * ThetaHist.Weight*spectrum     ) OVER EnergySim  AS  `ThresholdW`,
      -- SUM( POW(Table9.Weight/Energy * ThetaHist.Weight*spectrum,2)  ) OVER EnergySim  AS  `ThresholdW2`,
      -- SUM(     Table9.Weight/Energy                                 ) OVER EnergySim  AS  `ThresholdN`

      -- Estimators
      SUM(IF(Weight>0,                 ZdWeight*SpectralWeight,  0)) OVER EnergySim  AS  SimW,
      SUM(IF(Weight>0,     Residual*   ZdWeight*SpectralWeight,  0)) OVER EnergyEst  AS  EstSum,
      SUM(IF(Weight>0,     Residual*   ZdWeight*SpectralWeight,  0)) OVER EnergySim  AS  SimSum,
      SUM(IF(Weight>0, POW(Residual,2)*ZdWeight*SpectralWeight,  0)) OVER EnergyEst  AS  EstSum2,
      SUM(IF(Weight>0, POW(Residual,2)*ZdWeight*SpectralWeight,  0)) OVER EnergySim  AS  SimSum2,

      -- Migration
      SUM(Weight * ZdWeight*SpectralWeight) OVER Migration  AS  `MigrationW`,
      SUM(Weight                          ) OVER Migration  AS  `MigrationN`
   FROM
      Table0
   INNER JOIN
      ThetaHist USING(`.theta`)
   WINDOW
      EnergyEst AS (PARTITION BY `.energyest`),
      EnergySim AS (PARTITION BY `.energysim`),
      Migration AS (PARTITION BY `.energysim`,`.energyest`)
)
SELECT DISTINCT
   *,
   `SignalW` - `BackgroundW`/5 AS `ExcessW`,
   `SignalN` - `BackgroundN`/5 AS `ExcessN`,
   ExcErr(`SignalW2`, `BackgroundW2`/5) AS `ErrExcessW`,
   ExcErr(`SignalN`,  `BackgroundN` /5) AS `ErrExcessN`,
   IF(SignalW=0, 0, EstSum / SignalW) AS BiasEst,
   IF(SimW   =0, 0, SimSum / SimW)    AS BiasSim,
   IF(SignalW=0, 0, SQRT(EstSum2/SignalW - POW(EstSum/SignalW, 2))) AS ResolutionEst,
   IF(SimW   =0, 0, SQRT(SimSum2/SimW    - POW(SimSum/SimW,    2))) AS ResolutionSim
FROM
   Table1

The placeholders %6:theta, %7:energyest and %8:energysim are the binnings (as used previously) for the zenith angle and for the logarithm (base 10) of the estimated and the true energy, respectively. %9:spectrum is the assumed (unknown) 'true' source spectrum, for example POW(Energy, -2.4).
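For reference, the bias and resolution columns computed above are the weighted mean and the weighted standard deviation of the residual $\Delta = \mathrm{LogEnergyEst} - \log_{10}(\mathrm{Energy})$, evaluated either in bins of estimated energy (Est) or of true energy (Sim):

   \mathrm{Bias} = \frac{\sum_i w_i \Delta_i}{\sum_i w_i} \qquad \mathrm{Resolution} = \sqrt{\frac{\sum_i w_i \Delta_i^2}{\sum_i w_i} - \left(\frac{\sum_i w_i \Delta_i}{\sum_i w_i}\right)^2}

where the sums run over the on-events (Weight>0) in the bin and $w_i = \mathrm{ZdWeight} \cdot \mathrm{SpectralWeight}$.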

Summarize Corsika Production

The following query produces a summary of the events originally simulated by Corsika. The result is stored in a temporary table (SimulatedSpectrum).

CREATE TEMPORARY TABLE SimulatedSpectrum
(
   `.energy` SMALLINT UNSIGNED NOT NULL COMMENT 'Bin Index [MC Energy]',
   CountN    DOUBLE            NOT NULL,
   CountW    DOUBLE            NOT NULL,
   CountW2   DOUBLE            NOT NULL,
   PRIMARY KEY (`.energy`) USING HASH
) ENGINE=Memory
AS
(
   SELECT
      INTERVAL(LOG10(Energy), %0:energyest) AS `.energy`,
      COUNT(*) AS CountN,
      SUM(    (%1:spectrum)/pow(Energy, SpectralIndex)   ) AS CountW,
      SUM(POW((%1:spectrum)/pow(Energy, SpectralIndex),2)) AS CountW2
   FROM
      MonteCarloFiles
   LEFT JOIN
      factmc.RunInfoMC USING (FileId)
   LEFT JOIN
      factmc.OriginalMC USING (FileId)
   GROUP BY
      `.energy`
   ORDER BY
      `.energy`
)

The placeholder %0:energyest is the binning (as used previously) for the logarithm (base 10) of the energy; note that here it is applied to the true (simulated) energy. %1:spectrum is the assumed (unknown) 'true' source spectrum, for example POW(Energy, -2.4).
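Following the theory section, the efficiency and the differential flux per energy bin would then presumably be assembled from these tables roughly as (a sketch; the exact normalization and binning conventions follow the definitions above):

   \varepsilon_i = \frac{\mathrm{ExcessW}_i}{\mathrm{CountW}_i} \qquad \left(\frac{dN}{dE}\right)_i = \frac{1}{\varepsilon_i} \cdot \frac{\mathrm{Excess}_i}{A_0 \cdot T \cdot \Delta E_i}

with ExcessW from AnalysisMC, CountW from SimulatedSpectrum, Excess from AnalysisData, $A_0$ the Corsika production area, $T$ the total on-time (the sum of OnTime over ObservationTime) and $\Delta E_i$ the width of the energy bin.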

Result (Spectrum)

Result (Threshold)

Result (Migration)
