jimGW

Name: jimGW
Version: 0.1.1
Home page: https://github.com/kazewong/jim
Summary: Gravitational-wave data analysis tool in JAX
Author: Kaze Wong
Requires Python: >=3.9
License: MIT
Keywords: sampling, inference, machine learning, normalizing, autodiff, jax
Upload time: 2023-09-18 20:10:19
# Jim <img src="https://user-images.githubusercontent.com/4642979/218163532-1c8a58e5-6f36-42de-96d3-f245eee93cf8.png" alt="jim" width="35"/> - A JAX-based gravitational-wave inference toolkit

Jim comprises a set of tools for estimating parameters of gravitational-wave sources through Bayesian inference.
At its core, Jim relies on the JAX-based sampler [flowMC](https://github.com/kazewong/flowMC),
which leverages normalizing flows to enhance the convergence of a gradient-based MCMC sampler.

Since it is based on JAX, Jim can also leverage hardware acceleration to achieve significant speedups on GPUs. Jim also takes advantage of likelihood heterodyning ([Cornish 2010](https://arxiv.org/abs/1007.4820), [Cornish 2021](https://arxiv.org/abs/2109.02728)) to compute the gravitational-wave likelihood more efficiently.
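
The heterodyning idea can be sketched with a toy NumPy example (illustrative only: the waveform model, noise, and binning below are simplifications for exposition, not Jim's actual implementation or API). The key observation is that the ratio between a trial waveform and a fixed reference waveform varies slowly with frequency, so the inner products entering the likelihood can be evaluated on a coarse grid of bins instead of at every frequency sample:

```python
import numpy as np

rng = np.random.default_rng(0)
f = np.linspace(20.0, 512.0, 4096)  # fine frequency grid (Hz)
TC_REF = 0.010                      # reference coalescence time (s)

def waveform(f, tc):
    """Toy frequency-domain 'waveform': unit amplitude, time-shift phase."""
    return np.exp(-2j * np.pi * f * tc)

h_ref = waveform(f, TC_REF)
# Synthetic data: reference signal plus white complex noise.
d = h_ref + 0.05 * (rng.standard_normal(f.size) + 1j * rng.standard_normal(f.size))

# Precompute per-bin summary data against the reference waveform.
edges = np.linspace(0, f.size, 33, dtype=int)  # 32 coarse bins
pairs = list(zip(edges[:-1], edges[1:]))
A = np.array([np.sum(d[a:b] * np.conj(h_ref[a:b])) for a, b in pairs])
B = np.array([np.sum(np.abs(h_ref[a:b]) ** 2) for a, b in pairs])
f_mid = np.array([f[(a + b) // 2] for a, b in pairs])

def log_like_full(tc):
    """Full log-likelihood (up to constants), summed over every sample."""
    h = waveform(f, tc)
    return np.sum(d * np.conj(h)).real - 0.5 * np.sum(np.abs(h) ** 2)

def log_like_heterodyned(tc):
    """Heterodyned log-likelihood: the smooth waveform ratio is
    evaluated only at the bin centres, against the precomputed sums."""
    r = waveform(f_mid, tc) / waveform(f_mid, TC_REF)
    return np.sum(A * np.conj(r)).real - 0.5 * np.sum(B * np.abs(r) ** 2)

for tc in (0.009, 0.010, 0.011):
    print(f"tc={tc}: full={log_like_full(tc):.1f}  "
          f"heterodyned={log_like_heterodyned(tc):.1f}")
```

Because the heterodyned sum runs over ~30 bins rather than thousands of frequency samples, each likelihood evaluation is far cheaper, at the cost of accuracy only when the trial waveform drifts far from the reference.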

See the accompanying paper, [Wong, Isi, Edwards (2023)](https://github.com/kazewong/TurboPE/), for details.

_[Documentation and examples are a work in progress]_

## Installation

You may install the latest released version of Jim through pip:
```
pip install jimGW
```

You may install the bleeding-edge version by cloning this repo, or through pip directly:
```
pip install git+https://github.com/kazewong/jim
```

If you would like to take advantage of CUDA, you will additionally need to install a specific version of JAX:
```
pip install --upgrade "jax[cuda]"==0.4.1 -f https://storage.googleapis.com/jax-releases/jax_cuda_releases.html
```

_NOTE:_ Jim is currently only compatible with Python 3.10.

## Performance

The performance of Jim varies with the available hardware. Under optimal conditions, the CUDA installation can complete parameter estimation for a binary neutron star in about one minute on an Nvidia A100 GPU (see the [paper](https://github.com/kazewong/TurboPE/) for details). If no GPU is available, JAX will fall back to the CPU, and you will see a message like this on execution:

```
No GPU/TPU found, falling back to CPU.
```
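
If you want to confirm which backend was selected, JAX exposes it directly (a quick sanity check; this assumes JAX itself is installed but does not require Jim):

```python
# Report the platform JAX will execute on. "gpu" means the CUDA
# install was picked up; "cpu" corresponds to the fallback above.
import jax

print(jax.default_backend())
```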

## Directory

Parameter estimation examples are in `example/ParameterEstimation`.

## Attribution

Please cite the accompanying paper, [Wong, Isi, Edwards (2023)](https://github.com/kazewong/TurboPE/).

            
