| Field | Value |
| --- | --- |
| Name | np_codeocean |
| Version | 0.2.1 |
| Summary | Tools for uploading and interacting with Mindscope Neuropixels experiments on Code Ocean |
| upload_time | 2024-09-05 18:09:52 |
| home_page | None |
| maintainer | None |
| docs_url | None |
| author | None |
| requires_python | >=3.10 |
| license | MIT |
| keywords | |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
# np_codeocean
Tools for uploading and interacting with Mindscope Neuropixels experiments on Code Ocean
Requires running as admin on Windows in order to create remote-to-remote symlinks
on the Isilon.
- An `upload` CLI tool is provided, which uses the
  [`np_session`](https://github.com/AllenInstitute/np_session) interface to find
  and upload raw data for one ecephys session:

  ```shell
  pip install np_codeocean
  upload <session-id>
  ```

  where `session-id` is a valid input to `np_session.Session()`:
  - a lims ID (`1333741475`)
  - a workgroups folder name (`DRPilot_366122_20230101`)
  - a path to a session folder (`\\allen\programs\mindscope\workgroups\np-exp\1333741475_719667_20240227`)
- a folder of symlinks pointing to the raw data is created, with a new structure suitable for the KS2.5 sorting pipeline on Code Ocean
- The symlink folder, plus metadata, are entered into a CSV file, which is
  submitted to [`http://aind-data-transfer-service`](http://aind-data-transfer-service). The service in turn runs the
  [`aind-data-transfer`](https://github.com/AllenNeuralDynamics/aind-data-transfer)
  tool on the HPC, which follows the symlinks to the original data,
  median-subtracts/scales/compresses the ephys data, then uploads with the AWS CLI tool.
- All compression/zipping acts on copies in temporary folders: the original raw data is not altered in any way.
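For illustration, the three accepted forms of `session-id` listed above could be distinguished along these lines. This is a hedged sketch: `classify_session_id` is a hypothetical helper, not part of `np_session`, which performs its own, far more thorough resolution against lims and the workgroups shares.

```python
import re

def classify_session_id(session_id: str) -> str:
    """Hypothetical helper: guess which of the three README-documented
    forms of session-id was given. np_session.Session() does the real
    resolution; this only illustrates the accepted input shapes."""
    if re.fullmatch(r"\d+", session_id):
        return "lims ID"
    if "\\" in session_id or "/" in session_id:
        return "session folder path"
    return "workgroups folder name"

print(classify_session_id("1333741475"))               # lims ID
print(classify_session_id("DRPilot_366122_20230101"))  # workgroups folder name
print(classify_session_id(
    r"\\allen\programs\mindscope\workgroups\np-exp\1333741475_719667_20240227"
))                                                     # session folder path
```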
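The symlink step above can be sketched with the standard library alone. The paths and folder names here are placeholders (the real tool derives a KS2.5-compatible layout from the session contents on the Isilon); the key property shown is that only links are created and the source data is never modified.

```python
import tempfile
from pathlib import Path

# Placeholder source data standing in for raw ecephys files on the Isilon.
src_root = Path(tempfile.mkdtemp(prefix="raw_"))
raw_file = src_root / "continuous.dat"
raw_file.write_bytes(b"\x00" * 16)

# Build a parallel folder of symlinks; the original data is never touched.
link_root = Path(tempfile.mkdtemp(prefix="upload_")) / "ecephys_clipped"
link_root.mkdir(parents=True)
link = link_root / raw_file.name
link.symlink_to(raw_file)  # on Windows this is why admin rights are required

assert link.is_symlink()
assert link.resolve() == raw_file.resolve()
assert raw_file.read_bytes() == b"\x00" * 16  # source unchanged
```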
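The CSV hand-off above can be sketched as follows. The column names and values are illustrative only, not the actual job schema expected by `aind-data-transfer-service`; the point is simply that one row pairs the symlink folder with its session metadata.

```python
import csv
import io

# Hypothetical columns; the real transfer service defines its own job schema.
row = {
    "modality": "ecephys",
    "source": "//allen/scratch/upload/ecephys_clipped",  # the symlink folder
    "subject_id": "719667",
    "acq_datetime": "2024-02-27 00:00:00",
}

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(row))
writer.writeheader()
writer.writerow(row)
print(buf.getvalue())
```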
# Raw data

```json
{
    "_id": null,
    "home_page": null,
    "name": "np_codeocean",
    "maintainer": null,
    "docs_url": null,
    "requires_python": ">=3.10",
    "maintainer_email": null,
    "keywords": null,
    "author": null,
    "author_email": "Ben Hardcastle <ben.hardcastle@alleninstitute.org>, Chris Mochizuki <chrism@alleninstitute.org>, Arjun Sridhar <arjun.sridhar@alleninstitute.org>",
    "download_url": "https://files.pythonhosted.org/packages/5e/d3/27afbcbc00250dcef8c06869fa153d3b3024c479a5da25761dd30a8896e7/np_codeocean-0.2.1.tar.gz",
    "platform": null,
    "description": "# np_codeocean\nTools for uploading and interacting with Mindscope Neuropixels experiments on Code Ocean\n\nRequires running as admin on Windows in order to create remote-to-remote symlinks\non the Isilon.\n\n- `upload` CLI tool is provided, which uses the\n [`np_session`](https://github.com/AllenInstitute/np_session) interface to find\n and upload\n raw data for one ecephys session:\n\n ```\n pip install np_codeocean\n upload <session-id>\n ```\n \n where session-id is a valid input to `np_session.Session()`: \n - a lims ID (`1333741475`) \n - a workgroups foldername (`DRPilot_366122_20230101`) \n - a path to a session folder ( `\\\\allen\\programs\\mindscope\\workgroups\\np-exp\\1333741475_719667_20240227`)\n \n- a folder of symlinks pointing to the raw data is created, with a new structure suitable for the KS2.5 sorting pipeline on Code Ocean\n- the symlink folder, plus metadata, are entered into a csv file, which is\n submitted to [`http://aind-data-transfer-service`](http://aind-data-transfer-service), which in turn runs the\n [`aind-data-transfer`](https://github.com/AllenNeuralDynamics/aind-data-transfer)\n tool on the HPC, which follows the symlinks to the original data,\n median-subtracts/scales/compresses ephys data, then uploads with the AWS CLI tool\n- all compression/zipping acts on copies in temporary folders: the original raw data is not altered in anyway \n",
    "bugtrack_url": null,
    "license": "MIT",
    "summary": "Tools for uploading and interacting with Mindscope Neuropixels experiments on Code Ocean",
    "version": "0.2.1",
    "project_urls": {
        "Issues": "https://github.com/AllenInstitute/np_codeocean/issues",
        "Source": "https://github.com/AllenInstitute/np_codeocean"
    },
    "split_keywords": [],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "b76109e9b5834b87b4f571a4d4ae3754bc8f717013d4f362b02a2bdfbc43b0a5",
                "md5": "96cee8af3a5226b72d5e2e933cd244be",
                "sha256": "66f713975a31a23b72058c66eb21a3628d1447df2b705ca23108ae0daf4a5751"
            },
            "downloads": -1,
            "filename": "np_codeocean-0.2.1-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "96cee8af3a5226b72d5e2e933cd244be",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.10",
            "size": 23016,
            "upload_time": "2024-09-05T18:09:51",
            "upload_time_iso_8601": "2024-09-05T18:09:51.131418Z",
            "url": "https://files.pythonhosted.org/packages/b7/61/09e9b5834b87b4f571a4d4ae3754bc8f717013d4f362b02a2bdfbc43b0a5/np_codeocean-0.2.1-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "5ed327afbcbc00250dcef8c06869fa153d3b3024c479a5da25761dd30a8896e7",
                "md5": "22181a0fe6b79c4022728a98a2ddbd7e",
                "sha256": "9f3e8fd07f36b4897b6c5a12338b035927c91ce1ff54bb89676413a7c60f101f"
            },
            "downloads": -1,
            "filename": "np_codeocean-0.2.1.tar.gz",
            "has_sig": false,
            "md5_digest": "22181a0fe6b79c4022728a98a2ddbd7e",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.10",
            "size": 19088,
            "upload_time": "2024-09-05T18:09:52",
            "upload_time_iso_8601": "2024-09-05T18:09:52.685404Z",
            "url": "https://files.pythonhosted.org/packages/5e/d3/27afbcbc00250dcef8c06869fa153d3b3024c479a5da25761dd30a8896e7/np_codeocean-0.2.1.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-09-05 18:09:52",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "AllenInstitute",
    "github_project": "np_codeocean",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": false,
    "lcname": "np_codeocean"
}
```