# NT Summary Stats
Fast computation of traditional summary statistics for neutrino telescopes.
## Install
```bash
pip install nt_summary_stats
```
## Usage
```python
import numpy as np
from nt_summary_stats import compute_summary_stats
# Basic usage
times = np.array([10.0, 15.0, 25.0, 100.0]) # shape: (N,), dtype: float
charges = np.array([1.0, 2.0, 1.5, 0.5]) # shape: (N,), dtype: float
stats = compute_summary_stats(times, charges) # returns: np.ndarray, shape (9,)
print(stats[0]) # total_charge: 5.0
print(stats[3]) # first_pulse_time: 10.0
print(stats[7]) # charge_weighted_mean_time: 25.5
```
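The charge-weighted quantities can be sanity-checked with plain NumPy. This sketch reproduces the weighted mean and standard deviation from their textbook definitions; it is independent of the package and only assumes the standard formulas:

```python
import numpy as np

times = np.array([10.0, 15.0, 25.0, 100.0])
charges = np.array([1.0, 2.0, 1.5, 0.5])

# Charge-weighted mean time: sum(q_i * t_i) / sum(q_i)
mean_t = np.average(times, weights=charges)

# Charge-weighted standard deviation around that mean
var_t = np.average((times - mean_t) ** 2, weights=charges)
std_t = np.sqrt(var_t)

print(mean_t)  # 25.5
```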
Process [Prometheus](https://github.com/Harvard-Neutrino/prometheus) events:
```python
from nt_summary_stats import process_prometheus_event
# Input: Prometheus event dictionary
event_data = {
'photons': {
'sensor_pos_x': [0.0, 0.0, 100.0], # list[float], length M
'sensor_pos_y': [0.0, 0.0, 0.0], # list[float], length M
'sensor_pos_z': [0.0, 0.0, 50.0], # list[float], length M
'string_id': [1, 1, 2], # list[int], length M
'sensor_id': [1, 1, 1], # list[int], length M
't': [10.0, 15.0, 20.0] # list[float], length M
}
}
# Default: no grouping (uses all hits as-is)
sensor_positions, sensor_stats = process_prometheus_event(event_data)
# Optional: group hits within time windows
sensor_positions, sensor_stats = process_prometheus_event(event_data, grouping_window_ns=2.0)
# sensor_positions: np.ndarray, shape (N_sensors, 3), dtype: float64
# sensor_stats: np.ndarray, shape (N_sensors, 9), dtype: float64
# Arrays are aligned: sensor_positions[i] corresponds to sensor_stats[i]
```
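To see what "per sensor" means here, the grouping can be emulated in plain Python. This sketch assumes hits belong to the same optical module when they share `(string_id, sensor_id)`; the field names mirror the event dict above, but the package's internal keying may differ:

```python
# Illustrative sketch: group hits by (string_id, sensor_id).
photons = {
    'sensor_pos_x': [0.0, 0.0, 100.0],
    'sensor_pos_y': [0.0, 0.0, 0.0],
    'sensor_pos_z': [0.0, 0.0, 50.0],
    'string_id': [1, 1, 2],
    'sensor_id': [1, 1, 1],
    't': [10.0, 15.0, 20.0],
}

hits_by_sensor = {}   # (string_id, sensor_id) -> list of hit times
positions = {}        # (string_id, sensor_id) -> (x, y, z)
for i, key in enumerate(zip(photons['string_id'], photons['sensor_id'])):
    hits_by_sensor.setdefault(key, []).append(photons['t'][i])
    positions[key] = (photons['sensor_pos_x'][i],
                      photons['sensor_pos_y'][i],
                      photons['sensor_pos_z'][i])

print(sorted(hits_by_sensor))  # [(1, 1), (2, 1)]
```

With this keying, the three hits above collapse to two sensors, which matches the aligned `(N_sensors, 3)` / `(N_sensors, 9)` output shapes.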
Process individual sensor data:
```python
from nt_summary_stats import process_sensor_data
# Input: sensor hit data
sensor_times = [10.0, 10.5, 15.0, 100.0] # list[float] or np.ndarray(N,)
sensor_charges = [1.0, 0.5, 2.0, 1.0] # list[float] or np.ndarray(N,), optional
# Default: no grouping (uses all hits as-is)
stats = process_sensor_data(sensor_times, sensor_charges) # returns: np.ndarray, shape (9,)
# Optional: group hits within time windows
stats = process_sensor_data(sensor_times, sensor_charges, grouping_window_ns=2.0)
```
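As a rough picture of what `grouping_window_ns` does, here is a minimal sketch of time-window grouping: hits within the window of an open group are merged, summing charges and keeping the group's first time. This is an illustrative assumption; the package's actual grouping rule may differ in details:

```python
import numpy as np

def group_hits(times, charges, window_ns):
    """Merge time-sorted hits that fall within window_ns of the current
    group's last kept time; merged hits sum their charges.
    (Illustrative sketch only -- not the package's implementation.)"""
    order = np.argsort(times)
    times = np.asarray(times, dtype=float)[order]
    charges = np.asarray(charges, dtype=float)[order]
    out_t, out_q = [times[0]], [charges[0]]
    for t, q in zip(times[1:], charges[1:]):
        if t - out_t[-1] <= window_ns:
            out_q[-1] += q          # fold into the open group
        else:
            out_t.append(t)         # start a new group
            out_q.append(q)
    return np.array(out_t), np.array(out_q)

t, q = group_hits([10.0, 10.5, 15.0, 100.0], [1.0, 0.5, 2.0, 1.0], 2.0)
print(t.tolist())  # [10.0, 15.0, 100.0]
print(q.tolist())  # [1.5, 2.0, 1.0]
```

Under this rule, the 10.0 ns and 10.5 ns hits from the example above merge into one pulse of charge 1.5.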
## Summary Statistics
Computes the 9 traditional summary statistics for neutrino telescope sensors described in the [IceCube paper](https://arxiv.org/abs/2101.11589). All functions return NumPy arrays with the statistics in the following order:
```python
stats = compute_summary_stats(times, charges) # shape: (9,)
# Array indices:
stats[0] # total_charge: Total charge collected
stats[1] # charge_100ns: Charge within 100ns of first pulse
stats[2] # charge_500ns: Charge within 500ns of first pulse
stats[3] # first_pulse_time: Time of first pulse
stats[4] # last_pulse_time: Time of last pulse
stats[5] # charge_20_percent_time: Time at which 20% of charge is collected
stats[6] # charge_50_percent_time: Time at which 50% of charge is collected
stats[7] # charge_weighted_mean_time: Charge-weighted mean time
stats[8] # charge_weighted_std_time: Charge-weighted standard deviation
```
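Following the definitions listed above, the nine statistics can be reproduced with NumPy. This is a reference sketch built from the stated definitions only; the package may handle ties, empty inputs, or the percentile times differently:

```python
import numpy as np

def summary_stats_reference(times, charges):
    """Sketch of the 9 statistics from their definitions (not the package code)."""
    times = np.asarray(times, dtype=float)
    charges = np.asarray(charges, dtype=float)
    order = np.argsort(times)
    times, charges = times[order], charges[order]

    total = charges.sum()
    t_first = times[0]
    # Cumulative charge fraction, used for the 20% / 50% collection times
    cum = np.cumsum(charges) / total
    t20 = times[np.searchsorted(cum, 0.2)]
    t50 = times[np.searchsorted(cum, 0.5)]
    mean_t = np.average(times, weights=charges)
    std_t = np.sqrt(np.average((times - mean_t) ** 2, weights=charges))
    return np.array([
        total,
        charges[times <= t_first + 100.0].sum(),  # charge within 100 ns of first pulse
        charges[times <= t_first + 500.0].sum(),  # charge within 500 ns of first pulse
        t_first,
        times[-1],
        t20,
        t50,
        mean_t,
        std_t,
    ])

stats = summary_stats_reference([10.0, 15.0, 25.0, 100.0], [1.0, 2.0, 1.5, 0.5])
```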
## API
### `compute_summary_stats(times, charges)`
**Args:**
- `times`: `np.ndarray` or `list`, shape `(N,)` - pulse arrival times in ns
- `charges`: `np.ndarray` or `list`, shape `(N,)` - pulse charges
**Returns:** `np.ndarray`, shape `(9,)` - array with 9 summary statistics in order shown above
### `process_prometheus_event(event_data, grouping_window_ns=None)`
**Args:**
- `event_data`: `dict` or `awkward.Array` - Prometheus event data. Supports:
- Dictionary with `photons` key containing sensor data
- Awkward array with `photons` field (requires `awkward` package)
- Direct photon data structure with required fields
- `grouping_window_ns`: `float` or `None` - time window for grouping hits (default: None, no grouping)
**Returns:** `tuple[np.ndarray, np.ndarray]`
- `sensor_positions`: `np.ndarray`, shape `(N_sensors, 3)` - sensor positions
- `sensor_stats`: `np.ndarray`, shape `(N_sensors, 9)` - statistics for each sensor (aligned with positions)
### `process_sensor_data(sensor_times, sensor_charges=None, grouping_window_ns=None)`
**Args:**
- `sensor_times`: `np.ndarray` or `list`, shape `(N,)` - hit times for sensor
- `sensor_charges`: `np.ndarray` or `list`, shape `(N,)` - hit charges (optional; if omitted, each hit is assigned a charge of 1.0)
- `grouping_window_ns`: `float` or `None` - time window for grouping hits (default: None, no grouping)
**Returns:** `np.ndarray`, shape `(9,)` - array with 9 summary statistics in order shown above
## License
MIT