fpbot

Name: fpbot
Version: 1.1.3
Home page: https://github.com/simeonreusch/fpbot
Summary: Forced photometry pipeline for the Zwicky Transient Facility
Upload time: 2023-08-24 13:30:16
Maintainer: Simeon Reusch
Author: simeonreusch
Requires Python: >=3.10,<4
License: BSD-3-Clause

[![DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.7404997.svg)](https://doi.org/10.5281/zenodo.7404997)
[![CI](https://github.com/simeonreusch/fpbot/actions/workflows/continous_integration.yml/badge.svg)](https://github.com/simeonreusch/fpbot/actions/workflows/continous_integration.yml)
[![Coverage Status](https://coveralls.io/repos/github/simeonreusch/fpbot/badge.svg?branch=main)](https://coveralls.io/github/simeonreusch/fpbot?branch=main)

# fpbot

This package provides a forced photometry pipeline based on [ztfquery](https://github.com/mickaelrigault/ztfquery) and [ztflc](https://github.com/mickaelrigault/ztflc). It needs [IPAC](https://irsa.ipac.caltech.edu/account/signon/login.do?josso_back_to=https://irsa.ipac.caltech.edu/frontpage/&ts=517) access to download the images, as well as access to the [AMPEL Archive](https://ampelproject.github.io/astronomy/ztf/index) to obtain information on the transients.

If you are planning to run forced photometry on many ZTF transients, this is the right tool for you!

Note: Requires Python >= 3.10, as well as a MongoDB instance for storing the metadata, reachable on port 27017. The port can be changed in `database.py`.
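Before running the pipeline, you may want to check that something is actually listening on the MongoDB port. A minimal sketch using only the standard library (host, port, and function name are illustrative, not part of fpbot):

```python
import socket

def mongod_reachable(host="localhost", port=27017, timeout=2.0):
    """Return True if something accepts TCP connections on the given port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

If this returns `False`, start the daemon (see the MongoDB step below) before launching the pipeline.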

## Installation

1. Note that libpq-dev needs to be present. On Debian/Ubuntu, issue `sudo apt install libpq-dev`. On Mac OS, run `brew install postgresql`.

2. Then install via: `pip install fpbot`. Alternatively, clone this repo and install it with `poetry`. To do so, run
```bash
git clone https://github.com/simeonreusch/fpbot.git
cd fpbot
poetry install
```

3. If MongoDB is not present, it can easily be installed.
On Debian/Ubuntu, follow this [instruction set](https://docs.mongodb.com/manual/tutorial/install-mongodb-on-debian/#install-mongodb-community-edition). Afterwards, make sure the daemon is running:
```bash
sudo systemctl start mongod
sudo systemctl enable mongod
```
On macOS, make sure Homebrew is present, then follow [this tutorial](https://docs.mongodb.com/manual/tutorial/install-mongodb-on-os-x/).

4. `fpbot` requires an environment variable to know where to store the data. Include a line in your .bashrc or .zshrc like `export ZTFDATA='/absolute/path/to/ZTF-data-folder/'`. If you don't need AMPEL access, you are done!

5. If you want to use the [AMPEL API](https://ampel.zeuthen.desy.de/api/ztf/archive/v3/docs) for alert data (you don't have to!), you need credentials for the API. You can get these [here](https://ampel.zeuthen.desy.de/live/dashboard/tokens).

6. NOTE: If you are planning to run `fpbot` on a headless system that does not provide the luxury of a systemwide keychain, add `export ZTFHUB_MODE='HEADLESS'` to your `.bashrc` or `.zshrc`. The pipeline will then use `ztfquery`'s base64-obfuscated password storage.

## ALTERNATIVE: Use Docker container
fpbot ships with a Dockerfile and a docker-compose.yml. Use them to build the Docker container (this includes all dependencies as well as a MongoDB instance). Note: You have to provide a `.ztfquery` file in the fpbot directory containing access data for ztfquery (see [ztfquery](https://github.com/mickaelrigault/ztfquery) or [ztflc](https://github.com/mickaelrigault/ztflc) for details).

First, clone the repository and build the container:
```bash
git clone https://github.com/simeonreusch/fpbot.git
cd fpbot
docker-compose build
```

Make sure the `fpbot` directory contains 1) the Dockerfile, 2) the docker-compose.yml and 3) your `.ztfquery` credentials file. Then start the container with

`docker-compose run -p 8000:8000 fpbot`. This exposes the web API on port 8000 of your local machine.

### Troubleshooting
In case far too few images are downloaded, check your IRSA credentials. These are stored in `~/.ztfquery`. If there is a problem with them, `ztfquery` will not complain but will simply download only the publicly accessible images.

## Usage

### By importing class
All functionality of the command-line tool is also available through the `ForcedPhotometryPipeline` class; see `pipeline.py` for the available commands.

For example:

```python
from fpbot.pipeline import ForcedPhotometryPipeline

pl = ForcedPhotometryPipeline(
    file_or_name="ZTF19aatubsj",
    daysago=90,
    nprocess=24
)

pl.download()
pl.psffit()
pl.plot()
```

### By systemwide command (`fp name -operations --options`)

Always:

`name` Provide either a ZTF name, an ASCII file containing one ZTF name per line, or an arbitrary name followed by the `-radec` option.

optionally:

`-radec [RA DEC]`	If this is given, the name can be chosen arbitrarily (but a name MUST be provided). The coordinates must be given in a format that can be parsed by astropy, e.g. `-radec 218.487548 +40.243758`.
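For decimal-degree input like the example above, the parsing amounts to splitting the string and converting both parts to floats. A tiny illustrative helper (hypothetical; fpbot itself relies on astropy, which also handles sexagesimal formats):

```python
def parse_radec(radec: str) -> tuple[float, float]:
    """Split a '-radec'-style decimal-degree string into (ra, dec) floats."""
    ra_str, dec_str = radec.split()
    return float(ra_str), float(dec_str)

parse_radec("218.487548 +40.243758")  # -> (218.487548, 40.243758)
```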

#### Additional commands

`-dl`        Downloads the images used for forced photometry from [IPAC](https://irsa.ipac.caltech.edu/account/signon/login.do?josso_back_to=https://irsa.ipac.caltech.edu/frontpage/&ts=517). Needs a valid IPAC account.

`-fit`       Performs the PSF-photometry fit and generates plots of the lightcurve(s).

`-plot`     Plots the lightcurve(s).

`-plotflux`     Plots the lightcurve(s), but with flux instead of magnitude.

`-sciimg`  Experimental: Also downloads the science images from IPAC (required if thumbnails are to be created).

`-thumbnails` Experimental: Generates thumbnails for all science images. The science images have to be downloaded first (see `-sciimg`).

#### Options

`--nprocess [int]`  Specifies the number of processes spawned for parallel computing. Default is 4. Note: the download always runs with 32 parallel processes, as IPAC's upload speed is the bottleneck there.

`--daysago [int]`  Determines how old the photometric data should be. Default: all.

`--daysuntil [int]`  Determines how new the photometric data should be. Default: all.

`--snt [float]` Specifies the signal-to-noise threshold for plotting and SALT fitting.

`--magrange [float float]` Defines upper and lower magnitude bound for plotting the lightcurves; order is irrelevant.

`--fluxrange [float float]` Defines lower and upper flux bound for plotting the flux lightcurves; order is irrelevant.
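The `--snt` cut separates detections from upper limits. A sketch of how such a threshold is typically applied (the function name is illustrative; fpbot's exact logic lives in the pipeline code):

```python
def is_detection(ampl: float, ampl_err: float, snt: float = 5.0) -> bool:
    """Treat a flux measurement as a detection if its S/N reaches the threshold."""
    return ampl / ampl_err >= snt

is_detection(500.0, 50.0)   # S/N = 10 -> detection
is_detection(120.0, 60.0)   # S/N = 2  -> upper limit
```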

#### Examples
`fp ZTF19aatubsj` downloads this ZTF object, does forced photometry, plots it and saves it to the default "forcephotometry" directory inside $ZTFDATA (set in your .bashrc/.zshrc/...; see the ztfquery docs).

`fp ZTF19abimkwn -dl -fit --nprocess 16` downloads all images for ZTF19abimkwn found on IPAC, performs PSF-fitting and plots the lightcurve with 16 processes in parallel.

`fp supernovae.txt -dl -fit` Downloads all difference images for the ZTF transients listed in supernovae.txt, one ZTF name per line. These are then fitted, but not plotted. To get a nice example of ZTF lightcurves, issue: `fp example_download.txt -dl -fit -plot`.

`fp this_looks_interesting -radec 143.3123 66.42342 -dl -fit -plot --daysago 10 --magrange 18 20` Downloads all images of the last ten days for the location given in RA and Dec, performs PSF fits and plots the lightcurve in the 18--20 magnitude range.

### By systemwide bulk command (`fpbulk file.txt -operations --options`)
`file.txt` must be an ASCII file containing one ZTF-ID per line. The usual options apply (e.g. `-dl`, `-fit`).
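A minimal sketch of a bulk run (the `fpbulk` call itself is commented out here, since it requires IPAC credentials; the filename and ZTF IDs are just examples):

```shell
# Write one ZTF ID per line to a target list
printf 'ZTF19aatubsj\nZTF19abimkwn\n' > supernovae.txt
# fpbulk supernovae.txt -dl -fit
wc -l < supernovae.txt
```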

## Requirements
- [ztfquery](https://github.com/mickaelrigault/ztfquery) is used to download the image files from IPAC.
- [ztflc](https://github.com/mickaelrigault/ztflc) is used for PSF-fitting.
- [AMPEL](https://github.com/ampelproject) credentials are necessary for the pipeline to work.

## Notes
### Slackbot
A Slack bot based on the SlackRTM API is included.
You have to create a classic Slack app for this, because the newer app type depends on the Events API, which itself seems to need a web server to run.
Classic Slack apps can be created [here](https://api.slack.com/apps?new_classic_app=1). Make sure not to convert to the new permission/privilege system in the process (Slack tries to push you towards it, be careful).
After setting up the app/bot and granting it permissions, change the bot username in start_slackbot.py to that of your bot and it should basically work (the first start prompts for the bot and bot-user credentials, also provided by Slack).

### Resulting dataframe
The dataframes produced by the plot step (located at `ZTFDATA/forcephotometry/plot/dataframes`) consist of the following columns:
- **sigma(.err)**: The intrinsic error
- **ampl(.err)**: The flux amplitude (error)
- **fval**: Total minimized value
- **chi2(dof)**: PSF-fit chi square (per degrees of freedom)
- **Columns 9-39**: The science image header
- **target_x/y**: pixel position of target
- **data_hasnan**: Data contains NaN-values (should always be False)
- **F0**: Zero point magnitude from header converted to flux
- **Fratio(.err)**: Flux to flux zero point ratio (error)
- **upper_limit**: For forced photometry result < signal to noise threshold, this is the limiting magnitude from the Marshal (see **maglim** column)
- **mag(_err)**: Flux amplitude (error) converted to magnitude. For detections below signal to noise threshold, this value is set to 99.
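The relation between **Fratio** and **mag**, including the 99 sentinel for non-detections, can be sketched as follows (a simplified illustration assuming mag = -2.5 log10(Fratio); fpbot's exact bookkeeping lives in the pipeline code):

```python
import math

def fratio_to_mag(fratio: float, fratio_err: float, snt: float = 5.0) -> float:
    """Convert a flux-to-zeropoint ratio into a magnitude.

    Measurements below the signal-to-noise threshold get the sentinel
    value 99, as described for the mag column above.
    """
    if fratio / fratio_err < snt:
        return 99.0  # non-detection sentinel
    return -2.5 * math.log10(fratio)
```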

            
