Processing of HiSPARC events

Process HiSPARC events

This module can be used to analyse data to obtain observables like arrival times and particle counts in each detector for each event.

Example usage:

import datetime

import tables

from sapphire.publicdb import download_data
from sapphire import ProcessEvents

STATIONS = [501, 503, 506]
START = datetime.datetime(2013, 1, 1)
END = datetime.datetime(2013, 1, 2)


if __name__ == '__main__':
    station_groups = ['/s%d' % u for u in STATIONS]

    with tables.open_file('data.h5', 'w') as data:
        for station, group in zip(STATIONS, station_groups):
            download_data(data, group, station, START, END, True)
            proc = ProcessEvents(data, group)
            proc.process_and_store_results()

sapphire.analysis.process_events.ADC_THRESHOLD = 20

Threshold for arrival times, relative to the baseline

sapphire.analysis.process_events.TRIGGER_2 = (2, 0, False, 0)

Default trigger for a 2-detector station: 2 low and no high, no external.

sapphire.analysis.process_events.TRIGGER_4 = (3, 2, True, 0)

Default trigger for a 4-detector station: 3 low or 2 high, no external.
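
These module-level settings can be adjusted before processing. A minimal sketch of raising the arrival-time threshold, assuming the processing routines read this module-level value at run time (the value 30 and the station group are purely illustrative):

import tables

from sapphire.analysis import process_events

# Raise the arrival-time threshold above the baseline (default: 20 ADC counts).
process_events.ADC_THRESHOLD = 30

with tables.open_file('data.h5', 'a') as data:
    proc = process_events.ProcessEvents(data, '/s501')
    proc.process_and_store_results(overwrite=True)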

class sapphire.analysis.process_events.ProcessEvents(data, group, source=None, progress=True)

Process HiSPARC events to obtain several observables.

This class can be used to process a set of HiSPARC events and adds a few observables like particle arrival time and number of particles in the detector to a copy of the event table.

Initialize the class.

Parameters:
  • data – the PyTables datafile

  • group – the group containing the station data. In normal cases, this is simply the group containing the events table.

  • source – the name of the events table. Default: None, meaning the default name ‘events’.

  • progress – if True show a progressbar while copying and processing events.

processed_events_description = {
    'event_id': UInt32Col(shape=(), dflt=0, pos=0),
    'timestamp': Time32Col(shape=(), dflt=0, pos=1),
    'nanoseconds': UInt32Col(shape=(), dflt=0, pos=2),
    'ext_timestamp': UInt64Col(shape=(), dflt=0, pos=3),
    'data_reduction': BoolCol(shape=(), dflt=False, pos=4),
    'trigger_pattern': UInt32Col(shape=(), dflt=0, pos=5),
    'baseline': Int16Col(shape=(4,), dflt=-1, pos=6),
    'std_dev': Int16Col(shape=(4,), dflt=-1, pos=7),
    'n_peaks': Int16Col(shape=(4,), dflt=-1, pos=8),
    'pulseheights': Int16Col(shape=(4,), dflt=-1, pos=9),
    'integrals': Int32Col(shape=(4,), dflt=-1, pos=10),
    'traces': Int32Col(shape=(4,), dflt=-1, pos=11),
    'event_rate': Float32Col(shape=(), dflt=0.0, pos=12),
    't1': Float32Col(shape=(), dflt=-1.0, pos=13),
    't2': Float32Col(shape=(), dflt=-1.0, pos=14),
    't3': Float32Col(shape=(), dflt=-1.0, pos=15),
    't4': Float32Col(shape=(), dflt=-1.0, pos=16),
    'n1': Float32Col(shape=(), dflt=-1.0, pos=17),
    'n2': Float32Col(shape=(), dflt=-1.0, pos=18),
    'n3': Float32Col(shape=(), dflt=-1.0, pos=19),
    'n4': Float32Col(shape=(), dflt=-1.0, pos=20),
    't_trigger': Float32Col(shape=(), dflt=-1.0, pos=21),
}

process_and_store_results(destination=None, overwrite=False, limit=None)

Process events and store the results.

Parameters:
  • destination – name of the table where the results will be written. The default, None, corresponds to ‘events’.

  • overwrite – if True, overwrite previously obtained results.

  • limit – the maximum number of events that will be stored. The default, None, corresponds to no limit.
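
For example, to process only the first 1000 events into a separately named table, the following sketch can be used (the table name, the limit and the station group are illustrative):

import tables

from sapphire import ProcessEvents

with tables.open_file('data.h5', 'a') as data:
    proc = ProcessEvents(data, '/s501')
    proc.process_and_store_results(destination='test_events', overwrite=True, limit=1000)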

get_traces_for_event(event)

Return the traces from an event.

Parameters:
  • event – a row from the events table.

Returns:

the traces: an array of pulseheight values.

get_traces_for_event_index(idx)

Return the traces from event #idx.

Parameters:
  • idx – the index number of the event.

Returns:

the traces: an array of pulseheight values.
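
A short sketch of retrieving the traces of a single event, assuming the raw traces (blobs) were downloaded, as in the example at the top of this page:

import tables

from sapphire import ProcessEvents

with tables.open_file('data.h5', 'a') as data:
    proc = ProcessEvents(data, '/s501', progress=False)
    # Both calls return the same traces: one takes a row from the
    # events table, the other an index into that table.
    event = data.root.s501.events[0]
    traces = proc.get_traces_for_event(event)
    same_traces = proc.get_traces_for_event_index(0)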

process_traces()

Process traces to yield pulse timing information.

static first_above_threshold(trace, threshold)

Find the first element in the trace equal to or above the threshold.

If no element matches the condition, -999 is returned.

Parameters:
  • trace – iterable trace.

  • threshold – value the trace has to be greater than or equal to.

Returns:

index of the first value in the trace that is greater than or equal to the threshold.
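
Because this is a static method it can be applied to any sequence of trace values directly; a minimal sketch with a hand-made trace (the ADC values are purely illustrative):

from sapphire import ProcessEvents

trace = [200, 201, 203, 250, 400, 320, 210]
# Index of the first sample at or above 240 ADC counts: 3.
index = ProcessEvents.first_above_threshold(trace, 240)
# No sample reaches 1000, so -999 is returned.
missing = ProcessEvents.first_above_threshold(trace, 1000)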

class sapphire.analysis.process_events.ProcessIndexedEvents(data, group, indexes, source=None, progress=True)

Process a subset of events using an index.

This is a subclass of ProcessEvents. Using an index, this class will only process a subset of events, thus saving time. For example, this class can be used to process only those events that are part of a coincidence.

Initialize the class.

Parameters:
  • data – the PyTables datafile

  • group – the group containing the station data. In normal cases, this is simply the group containing the events table.

  • indexes – a list of indexes into the events table.

  • source – the name of the events table. Default: None, meaning the default name ‘events’.

  • progress – if True show a progressbar while copying and processing events.
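
A minimal sketch of processing only a handful of events by index; the indexes here are purely illustrative (in practice they typically come from a coincidence search):

import tables

from sapphire.analysis.process_events import ProcessIndexedEvents

with tables.open_file('data.h5', 'a') as data:
    indexes = [0, 5, 12]  # hypothetical indexes into the events table
    proc = ProcessIndexedEvents(data, '/s501', indexes)
    proc.process_and_store_results()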

process_traces()

Process traces to yield pulse timing information.

This method makes use of the indexes to build a list of events.

get_traces_for_indexed_event_index(idx)

class sapphire.analysis.process_events.ProcessEventsWithLINT(data, group, source=None, progress=True)

Process events using LInear INTerpolation for arrival times.

This is a subclass of ProcessEvents. Use a linear interpolation method to determine the arrival times of particles.

Initialize the class.

Parameters:
  • data – the PyTables datafile

  • group – the group containing the station data. In normal cases, this is simply the group containing the events table.

  • source – the name of the events table. Default: None, meaning the default name ‘events’.

  • progress – if True show a progressbar while copying and processing events.

class sapphire.analysis.process_events.ProcessIndexedEventsWithLINT(data, group, indexes, source=None, progress=True)

Process a subset of events using LInear INTerpolation.

This is a subclass of ProcessIndexedEvents and ProcessEventsWithLINT.

Initialize the class.

Parameters:
  • data – the PyTables datafile

  • group – the group containing the station data. In normal cases, this is simply the group containing the events table.

  • indexes – a list of indexes into the events table.

  • source – the name of the events table. Default: None, meaning the default name ‘events’.

  • progress – if True show a progressbar while copying and processing events.

class sapphire.analysis.process_events.ProcessEventsWithoutTraces(data, group, source=None, progress=True)

Process events without traces

This is a subclass of ProcessEvents. Processing events without considering traces will invalidate the arrival time information. However, for some analyses it is not necessary to obtain this information. Ignoring the traces will then greatly decrease processing time and data size.

Initialize the class.

Parameters:
  • data – the PyTables datafile

  • group – the group containing the station data. In normal cases, this is simply the group containing the events table.

  • source – the name of the events table. Default: None, meaning the default name ‘events’.

  • progress – if True show a progressbar while copying and processing events.
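
A sketch of the faster, trace-less processing; note that the arrival time columns (t1-t4) will not contain meaningful values in the result:

import tables

from sapphire.analysis.process_events import ProcessEventsWithoutTraces

with tables.open_file('data.h5', 'a') as data:
    proc = ProcessEventsWithoutTraces(data, '/s501')
    proc.process_and_store_results()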

class sapphire.analysis.process_events.ProcessIndexedEventsWithoutTraces(data, group, indexes, source=None, progress=True)

Process a subset of events without traces

This is a subclass of ProcessIndexedEvents and ProcessEventsWithoutTraces. Processing events without considering traces will invalidate the arrival time information. However, for some analyses it is not necessary to obtain this information. Ignoring the traces will then greatly decrease processing time and data size.

Initialize the class.

Parameters:
  • data – the PyTables datafile

  • group – the group containing the station data. In normal cases, this is simply the group containing the events table.

  • indexes – a list of indexes into the events table.

  • source – the name of the events table. Default: None, meaning the default name ‘events’.

  • progress – if True show a progressbar while copying and processing events.

class sapphire.analysis.process_events.ProcessEventsWithTriggerOffset(data, group, source=None, progress=True, station=None)

Process events and reconstruct trigger time from traces

The trigger times are stored in the column t_trigger; they are relative to the start of the traces, just like the t# columns.

If no trigger can be found, possibly due to the data filter, a value of -999 will be entered.

Initialize the class.

Parameters:
  • data – the PyTables datafile

  • group – the group containing the station data. In normal cases, this is simply the group containing the events table.

  • source – the name of the events table. Default: None, meaning the default name ‘events’.

  • progress – if True show a progressbar while copying and processing events.

  • station – station number of station to which the data belongs.
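
A sketch of reconstructing the trigger time while processing; the station number is passed so the station's trigger settings can be looked up (station 501 and the group name are used here only as examples):

import tables

from sapphire.analysis.process_events import ProcessEventsWithTriggerOffset

with tables.open_file('data.h5', 'a') as data:
    proc = ProcessEventsWithTriggerOffset(data, '/s501', station=501)
    proc.process_and_store_results()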

class sapphire.analysis.process_events.ProcessEventsFromSource(source_file, dest_file, source_group, dest_group, progress=False)

Process HiSPARC events from a different source.

This class is a subclass of ProcessEvents. The difference is that in this class, the source and destination are assumed to be different files. This also means that the source is untouched (no renaming of original event tables) and the destination is assumed to be empty.

Initialize the class.

Parameters:
  • source_file, dest_file – PyTables source and destination files.

  • source_group, dest_group – the pathname of the source and destination group.

  • progress – if True show a progressbar while copying and processing events.
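
A sketch of processing raw events from one file into a second, empty file, leaving the source untouched (the file and group names are illustrative):

import tables

from sapphire.analysis.process_events import ProcessEventsFromSource

with tables.open_file('raw.h5', 'r') as source_file, tables.open_file('processed.h5', 'w') as dest_file:
    proc = ProcessEventsFromSource(source_file, dest_file, '/s501', '/s501')
    proc.process_and_store_results()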

class sapphire.analysis.process_events.ProcessEventsFromSourceWithTriggerOffset(source_file, dest_file, source_group, dest_group, station=None, progress=False)

Process events from a different source and find trigger.

This is a subclass of ProcessEventsFromSource and ProcessEventsWithTriggerOffset. It processes events, finds the trigger time in the traces, and stores the results in a different file than the source.

Initialize the class.

Parameters:
  • source_file, dest_file – PyTables source and destination files.

  • source_group, dest_group – the pathname of the source and destination group.

  • station – station number of station to which the data belongs.

  • progress – if True show a progressbar while copying and processing events.

class sapphire.analysis.process_events.ProcessDataTable(data, group, source=None, progress=True)

Process HiSPARC abstract data table to clean the data.

Abstract data is a PyTables table containing a timestamp for each row. Weather and singles data are examples of such tables. This class can be used to process a set of abstract HiSPARC data, removing duplicates and sorting the data by timestamp before storing it in a copy of the table.

Initialize the class.

Parameters:
  • data – the PyTables datafile

  • group – the group containing the station data. In normal cases, this is simply the group containing the events table.

  • source – the name of the events table. Default: None, meaning the default name ‘events’.

  • progress – if True show a progressbar while copying and processing events.

table_name = 'abstract_data'
process_and_store_results(destination=None, overwrite=False, limit=None)

Process table and store the results.

Parameters:
  • destination – name of the table where the results will be written. The default, None, corresponds to the value stored in self.table_name.

  • overwrite – if True, overwrite previously obtained results.

  • limit – the maximum number of records that will be stored. The default, None, corresponds to no limit.

class sapphire.analysis.process_events.ProcessDataTableFromSource(source_file, dest_file, source_group, dest_group, progress=False)

Process HiSPARC abstract data table from a different source.

This class is a subclass of ProcessDataTable. The difference is that in this class, the source and destination are assumed to be different files. This also means that the source is untouched (no renaming of original event tables) and the destination is assumed to be empty.

Initialize the class.

Parameters:
  • source_file, dest_file – PyTables source and destination files.

  • source_group, dest_group – the pathname of the source and destination group.

  • progress – if True show a progressbar while copying and processing events.

class sapphire.analysis.process_events.ProcessWeather(data, group, source=None, progress=True)

Process HiSPARC weather to clean the data.

This class can be used to process a set of HiSPARC weather data, removing duplicates and sorting the data by timestamp before storing it in a copy of the weather table.

Initialize the class.

Parameters:
  • data – the PyTables datafile

  • group – the group containing the station data. In normal cases, this is simply the group containing the events table.

  • source – the name of the events table. Default: None, meaning the default name ‘events’.

  • progress – if True show a progressbar while copying and processing events.

table_name = 'weather'
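
A sketch of cleaning a station's weather table, assuming the group already contains a weather table:

import tables

from sapphire.analysis.process_events import ProcessWeather

with tables.open_file('data.h5', 'a') as data:
    proc = ProcessWeather(data, '/s501')
    proc.process_and_store_results()
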
class sapphire.analysis.process_events.ProcessWeatherFromSource(source_file, dest_file, source_group, dest_group, progress=False)

Process HiSPARC weather from a different source.

This class behaves like a subclass of ProcessWeather because of a common ancestor (ProcessDataTable). The difference between this class and ProcessWeather is that in this class, the source and destination are assumed to be different files. This also means that the source is untouched (no renaming of original event tables) and the destination is assumed to be empty.

Initialize the class.

Parameters:
  • source_file, dest_file – PyTables source and destination files.

  • source_group, dest_group – the pathname of the source and destination group.

  • progress – if True show a progressbar while copying and processing events.

table_name = 'weather'
class sapphire.analysis.process_events.ProcessSingles(data, group, source=None, progress=True)

Process HiSPARC singles data to clean the data.

This class can be used to process a set of HiSPARC singles data, removing duplicates and sorting the data by timestamp before storing it in a copy of the singles data table.

Initialize the class.

Parameters:
  • data – the PyTables datafile

  • group – the group containing the station data. In normal cases, this is simply the group containing the events table.

  • source – the name of the events table. Default: None, meaning the default name ‘events’.

  • progress – if True show a progressbar while copying and processing events.

table_name = 'singles'
class sapphire.analysis.process_events.ProcessSinglesFromSource(source_file, dest_file, source_group, dest_group, progress=False)

Process HiSPARC singles data from a different source.

This class behaves like a subclass of ProcessSingles because of a common ancestor (ProcessDataTable). The difference between this class and ProcessSingles is that in this class, the source and destination are assumed to be different files. This also means that the source is untouched (no renaming of original event tables) and the destination is assumed to be empty.

Initialize the class.

Parameters:
  • source_file, dest_file – PyTables source and destination files.

  • source_group, dest_group – the pathname of the source and destination group.

  • progress – if True show a progressbar while copying and processing events.

table_name = 'singles'