# compose-runner
A Python package that executes meta-analyses created with Neurosynth Compose, using NiMARE
as the meta-analysis execution engine.
## AWS Deployment
This repository includes an AWS CDK application that turns compose-runner into a
serverless batch pipeline using Step Functions, AWS Lambda, and ECS Fargate.
The deployed architecture works like this:
- `ComposeRunnerSubmit` (Lambda Function URL) accepts HTTP requests, validates
the meta-analysis payload, and starts a Step Functions execution. The response
is immediate and returns both a durable `job_id` (the execution ARN) and the
`artifact_prefix` used for S3 and log correlation.
- A Standard state machine runs a single Fargate task (`compose_runner.ecs_task`)
and waits for completion. The container downloads inputs, executes the
meta-analysis on up to 4 vCPU / 30 GiB of memory, uploads artifacts to S3, and
writes `metadata.json` into the same prefix.
- `ComposeRunnerStatus` (Lambda Function URL) wraps `DescribeExecution`, merges
metadata from S3, and exposes a simple status endpoint suitable for polling.
- `ComposeRunnerLogPoller` streams the ECS CloudWatch Logs for a given `artifact_prefix`,
while `ComposeRunnerResultsFetcher` returns presigned URLs for stored artifacts.
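A minimal sketch of the submit-and-poll flow, assuming the endpoint URLs are taken from the CloudFormation outputs; the payload field, query parameter, and status values are illustrative assumptions rather than a documented contract:

```bash
# Hypothetical Function URLs -- copy the real ones from the stack outputs.
SUBMIT_URL="https://<submit-id>.lambda-url.<region>.on.aws/"
STATUS_URL="https://<status-id>.lambda-url.<region>.on.aws/"

# Submit a run. The payload field name is an assumption; send whatever the
# submit Lambda validates (e.g. the Neurosynth Compose meta-analysis ID).
RESPONSE=$(curl -s -X POST "$SUBMIT_URL" \
  -H "Content-Type: application/json" \
  -d '{"meta_analysis_id": "<compose-meta-analysis-id>"}')
JOB_ID=$(echo "$RESPONSE" | jq -r '.job_id')

# Poll the status endpoint until the underlying Step Functions execution
# reaches a terminal state (statuses mirror DescribeExecution).
while true; do
  STATUS=$(curl -s "$STATUS_URL?job_id=$JOB_ID" | jq -r '.status')
  echo "status: $STATUS"
  if [ "$STATUS" = "SUCCEEDED" ] || [ "$STATUS" = "FAILED" ]; then
    break
  fi
  sleep 30
done
```

To deploy the pipeline: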
1. Create a virtual environment and install the CDK dependencies:
```bash
cd infra/cdk
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```
2. (One-time per account/region) bootstrap the CDK environment:
```bash
cdk bootstrap
```
3. Deploy the stack (supplying the compose-runner version you want baked into the images):
```bash
cdk deploy \
-c composeRunnerVersion=$(hatch version) \
-c resultsPrefix=compose-runner/results \
-c taskCpu=4096 \
-c taskMemoryMiB=30720
```
Pass `-c resultsBucketName=<bucket>` to use an existing S3 bucket, or omit it
to let the stack create and retain a dedicated bucket. Additional knobs:
- `-c stateMachineTimeoutSeconds=32400` to control the max wall clock per run
- `-c submitTimeoutSeconds` / `-c statusTimeoutSeconds` / `-c pollTimeoutSeconds`
to tune Lambda timeouts
- `-c taskEphemeralStorageGiB` if the default 21 GiB scratch volume is insufficient
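   For example, several of these overrides can be combined in a single deploy; the values below are placeholders, and the existing-bucket name is hypothetical:

   ```bash
   cdk deploy \
     -c composeRunnerVersion=$(hatch version) \
     -c resultsBucketName=<existing-results-bucket> \
     -c stateMachineTimeoutSeconds=32400 \
     -c submitTimeoutSeconds=60 \
     -c taskEphemeralStorageGiB=50
   ```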
The deployment builds both the Lambda image (`aws_lambda/Dockerfile`) and the
Fargate task image (`Dockerfile`), provisions the Step Functions state machine,
and configures a public VPC so each task has outbound internet access.
The CloudFormation outputs list the HTTPS endpoints for submission, status,
logs, and artifact retrieval, alongside the Step Functions ARN.
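Once the deploy completes, the same outputs can be listed with the AWS CLI; the stack name below is an assumption, so substitute the name `cdk deploy` reports:

```bash
# Stack name is hypothetical -- use the one printed by `cdk deploy`.
aws cloudformation describe-stacks \
  --stack-name ComposeRunnerStack \
  --query "Stacks[0].Outputs" \
  --output table
```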