dooc

Name: dooc
Version: 0.0.2
Summary: Digital Organoid On Chips
Author: Michael Ding
Requires Python: >=3.8
License: none recorded
Keywords: molecule, organoid, AI, deep learning, transformer, drug response, mutations
Upload time: 2024-05-28 12:47:14
Requirements: no requirements were recorded.
# DOoC

## Usage


### Train


```python
# Regression train
import torch
from torch import nn

from moltx import tokenizers
from dooc import models, datasets

tk = tokenizers.MoltxTokenizer.from_pretrain(models.AdaMRTokenizerConfig.Prediction)
ds = datasets.MutSmiXAttention(tokenizer=tk, device=torch.device('cpu'))
smiles = ["c1cccc1c", "CC[N+](C)(C)Cc1ccccc1Br"]
mutations = [[1, 0, 0, ...], [1, 0, 1, ...]]  # binary mutation vectors, e.g.
# import random
# [random.choice([0, 1]) for _ in range(3008)]
values = [0.85, 0.78]
smiles_src, smiles_tgt, mutations_src, out = ds(smiles, mutations, values)

model = models.MutSmiXAttention()
model.load_pretrained_ckpt('/path/to/drugcell.ckpt', '/path/to/moltx.ckpt')

crt = nn.MSELoss()
optim = torch.optim.AdamW(model.parameters(), lr=1e-4)  # optimizer choice is illustrative

optim.zero_grad()
pred = model(smiles_src, smiles_tgt, mutations_src)
loss = crt(pred, out)
loss.backward()
optim.step()

torch.save(model.state_dict(), '/path/to/mutsmixattention.ckpt')
```
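The commented hint above generates each mutation vector as a fixed-length list of 0/1 flags (3008 genes in the example). A minimal stdlib helper for producing placeholder vectors of that shape might look like this; `random_mutation_vector` is an illustrative name, not part of the dooc API:

```python
import random


def random_mutation_vector(n_genes=3008, seed=0):
    """Generate a placeholder binary mutation vector of length n_genes."""
    rng = random.Random(seed)  # seeded for reproducibility
    return [rng.choice([0, 1]) for _ in range(n_genes)]


# two vectors, matching the two SMILES strings in the training example
mutations = [random_mutation_vector(seed=i) for i in range(2)]
```

Real inputs would of course come from actual mutation profiles; this is only for smoke-testing the pipeline shape.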

### Inference

```python
from dooc import pipelines, models

model = models.MutSmiXAttention()
model.load_ckpt('/path/to/mutsmixattention.ckpt')
pipeline = pipelines.MutSmiXAttention()
pipeline([1, 0, 0, ...], "C=CC=CC=C")
# 0.85
```
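The pipeline above scores one (mutation vector, SMILES) pair per call. For scoring many pairs, a thin wrapper is enough; `score_batch` and the stand-in callable below are illustrative sketches, not part of the dooc API:

```python
def score_batch(pipeline, pairs):
    """Apply a dooc-style pipeline to (mutations, smiles) pairs, one at a time."""
    return [pipeline(mutations, smiles) for mutations, smiles in pairs]


# illustration with a stand-in callable in place of a real pipeline
fake_pipeline = lambda mutations, smiles: 0.5
scores = score_batch(fake_pipeline, [([1, 0, 0], "C=CC=CC=C")])
```

With a real `pipelines.MutSmiXAttention` instance, `pairs` would hold full-length mutation vectors alongside each SMILES string.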
