Events

To organize time-based data in XINA, we employ events, which come in two forms: instants, referring to a single moment in time, and intervals, referring to a range of time. The goal of events is to make it easy to find, compare, and trend data. Instants and intervals are stored in separate databases, and every event includes fields for:

  • type (indicates how the event should be viewed and interpreted)
  • UEID (universally unique event identifier, generated at the creation of the event)
  • numeric event ID
  • plain text label (up to 128 bytes)
  • plain text, HTML, or JSON content
  • optional JSON object metadata

The UEID uniquely identifies an event, and is the only way to reference it permanently and globally. It should be assigned at the time of creation to ensure consistency even if data is reprocessed. The event ID is optional, and can be used as needed (when not provided it defaults to zero). Because it is much faster and more reliable to query numbers than text, the event ID is the best way to indicate events with a common meaning.
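
For illustration only, an instant event carrying these fields might be represented in JSON along the following lines (the field names and serialization shown here are assumptions for the sketch, not the actual XINA wire format):

    {
      "type": "message",
      "ueid": "c1db24e6-6a4b-4f10-9c7e-2d8f3b5a9e01",
      "id": 0,
      "label": "power on",
      "content": "Main bus powered on by operator.",
      "meta": {"operator": "jsmith"}
    }

An instant carries a single time, while an interval carries a range of time (a start and an end).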

Event Types

XINA defines a fixed set of standard event types, each with an associated numeric code. For performance reasons the type is stored in the database as its numeric code, but for practical purposes most actions can use the type name directly, unless interacting directly with the API.

Standard Types

Code  Name      Ins  Int  Description
0     message   ✓    ✓    Basic event, ID optional
1     marker    ✓    ✓    Organized event, ID required
2     alert     ✓    ✓    Organized event, ID required, level (severity) required
2000  test           ✓    Discrete test period, may not overlap other tests, ID optional
2001  activity       ✓    Discrete activity period, may not overlap other activities, ID optional
2002  phase          ✓    Discrete phase period, may not overlap other phases, ID optional
3000  data      ✓    ✓    General purpose data set
3001  spectrum  ✓    ✓    General purpose spectrum data

(Ins indicates the type applies to instants; Int indicates it applies to intervals.)

Additional types will be added in the future as needed, with codes based on this chart:

Standard Type Code Ranges

Code       Ins  Int  Description
0-999      ✓    ✓    General types for instants and intervals
1000-1999  ✓         General types for instants only
2000-2999       ✓    General types for intervals only
3000-3999  ✓    ✓    Data set types for instants and intervals
4000-4999  ✓         Data set types for instants only
5000-5999       ✓    Data set types for intervals only

Data Format

The data event type indicates a basic data set. It is typically used with the single-file-per-event database structure, in which case the file contains the data set. For event databases without files, the data is expected to be stored in the content field; this is only recommended for small data sets (less than 1 MB).

Files must be either ASCII or UTF-8 encoded. Newlines are interpreted from either \n or \r\n. The conf object may define further customization of the format:

Conf Definition

Key          Value                Default                       Description
delimiter    string               auto detect (',', '\t', ';')  value delimiter
quoteChar    character            " (double quote)              value quote character
ignoreLines  number               0                             number of lines to skip before the header
invalid      null, 'NaN', number  null                          preferred interpretation of invalid literals
nan          null, 'NaN', number  null                          preferred interpretation of the 'NaN' literal
pInfinity    null, 'Inf', number  null                          preferred interpretation of the positive 'Infinity' literal
nInfinity    null, 'Inf', number  null                          preferred interpretation of the negative 'Infinity' literal
utc          boolean              false                         if true, interpret all unzoned timestamps as UTC
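
As a concrete example, a conf object for a semicolon-delimited file with two preamble lines before the header, in which unzoned timestamps should be read as UTC, would look like this (the keys come from the table above; the values are invented for illustration):

    {
      "delimiter": ";",
      "ignoreLines": 2,
      "utc": true
    }

Any key left out falls back to its default from the table.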

Starting after the number of lines given by ignoreLines, the content must include a header for each column, with a name and an optional unit in parentheses. Special standard unit names may be used to indicate time types, which apply different processing to the column (see the example following the table below):

Unit     Description
ts       text timestamp, interpreted in the local browser timezone (absent an explicit zone)
ts_utc   text timestamp, interpreted as UTC (absent an explicit zone)
unix_s   Unix time in seconds
unix_ms  Unix time in milliseconds
unix_us  Unix time in microseconds
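
Putting the pieces together, a minimal data file using the default comma delimiter might look like the following. The column names, units, and values are invented for illustration, but the header format (name with an optional unit in parentheses) and the unix_ms time unit are as described above:

    time (unix_ms), voltage (V), temperature (C)
    1700000000000, 3.31, 21.5
    1700000001000, 3.30, 21.6
    1700000002000, 3.29, 21.7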

Event Streams

To fully integrate events into a data pipeline, we recommend using event streams. An event stream stores each change to an event as its own record with an associated time, allowing the processing utilities to reconstruct the correct content of each event, even when events are reprocessed.
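
As a rough sketch, a stream for a single interval event might accumulate records like the following, one per change (the record layout shown is purely illustrative; the actual stream schema is defined by XINA):

    {"time": 1700000000000, "ueid": "c1db24e6-…", "action": "create", "type": "test", "label": "test 42"}
    {"time": 1700000300000, "ueid": "c1db24e6-…", "action": "update", "content": "test 42 in progress"}
    {"time": 1700000600000, "ueid": "c1db24e6-…", "action": "close"}

Because every change is kept with its own time, reprocessing can replay the stream and arrive at the same event state.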