# Evaluate
Evaluate is a script that can be run to gather information about all projects of a GitLab
- Instance
- Group (including sub-groups)
This information is useful to the GitLab Professional Services (PS) team to accurately scope migration services.
## Use Case
GitLab PS plans to share this script with a customer to run against their GitLab instance or group. The customer can then send the output file back so GitLab engagement managers can scope the engagement accurately. A [single file is generated](reading-the-output.md).
## Install Method
- [pip Install](#pip-install)
- [Docker Container](#docker-container)
  - [Bash](#local-usage)
## Usage
### System level data gathering
Evaluate is meant to be run by an **OWNER** (ideally system **ADMINISTRATOR**) of a GitLab instance to gather data about every project on the instance or group (including sub-groups).
1. A GitLab **OWNER** (ideally system **ADMINISTRATOR**) should provision an access token with `api` scope:
- [Personal access token](https://docs.gitlab.com/ee/user/profile/personal_access_tokens.html#create-a-personal-access-token) for instance
- [Group access token](https://docs.gitlab.com/ee/user/group/settings/group_access_tokens.html#create-a-group-access-token-using-ui) for group
2. Install `gitlab-evaluate` using one of the options in the [Install Method](#install-method) section above.
3. Run :point_down:
For evaluating a GitLab instance
```bash
evaluate-gitlab -t <access-token-with-api-scope> -s https://gitlab.example.com
```
For evaluating a GitLab group (including sub-groups)
```bash
evaluate-gitlab -t <access-token-with-api-scope> -s https://gitlab.example.com -g 42
```
4. This should create a file called `evaluate_report.xlsx`
For more information on this file, see [reading the output](reading-the-output.md)
5. If you're coordinating a GitLab PS engagement, email this file to the GitLab account team.
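The run step above can also be wrapped in a small script so the token comes from an environment variable rather than being typed inline. A minimal sketch using only the documented `-t`, `-s`, and `-g` flags (the `GITLAB_*` variable names and the `build_cmd` helper are illustrative, not part of Evaluate):

```shell
#!/bin/sh
# Hypothetical wrapper: builds the evaluate-gitlab command from environment
# variables so the access token is not left in shell history.
build_cmd() {
    cmd="evaluate-gitlab -t $GITLAB_TOKEN -s $GITLAB_URL"
    # Append the documented -g flag only when a group ID is set
    if [ -n "$GITLAB_GROUP_ID" ]; then
        cmd="$cmd -g $GITLAB_GROUP_ID"
    fi
    echo "$cmd"
}

GITLAB_TOKEN="glpat-example" GITLAB_URL="https://gitlab.example.com" GITLAB_GROUP_ID="42"
build_cmd
```

Printing the command first (or piping it to `sh`) makes it easy to confirm the target URL and group before a long-running scan starts.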
### To gather CI data from a single repo
```bash
# For evaluating a single git repo's CI readiness
evaluate-ci-readiness --repo <git-repo-url> # -r for short
```
### Command help screen
```text
usage: evaluate-gitlab [-h] [-t TOKEN] [-s SOURCE] [-f FILENAME] [-o] [-i] [-p PROCESSES] [-g GROUP_ID]

optional arguments:
  -h, --help            show this help message and exit
  -t TOKEN, --token TOKEN
                        Personal Access Token: REQ'd
  -s SOURCE, --source SOURCE
                        Source URL: REQ'd
  -f FILENAME, --filename FILENAME
                        XLSX Output File Name. If not set, will default to 'evaluate_report.xlsx'
  -o, --output          Output Per Project Stats to screen
  -i, --insecure        Set to ignore SSL warnings.
  -p PROCESSES, --processes PROCESSES
                        Number of processes. Defaults to number of CPU cores
  -g GROUP_ID, --group GROUP_ID
                        Group ID. Evaluate all group projects (including sub-groups)
```
```text
usage: evaluate-ci-readiness [-h] [-r REPO]

optional arguments:
  -h, --help            show this help message and exit
  -r REPO, --repo REPO  Git Repository To Clone (ex: https://username:password@repo.com)
```
### pip Install
```bash
pip install gitlab-evaluate
```
## Docker Container
[Docker containers with Evaluate installed](https://gitlab.com/gitlab-org/professional-services-automation/tools/utilities/evaluate/container_registry) are also available.
### Local Usage
```bash
# Spin up container
docker run --name evaluate -it registry.gitlab.com/gitlab-org/professional-services-automation/tools/utilities/evaluate:latest /bin/bash
# In docker shell
evaluate-ci-readiness <-r|--repo> <git-repo-url>
evaluate-gitlab -t <access-token-with-api-scope> -s https://gitlab.example.com
```
### Example GitLab CI job using the `evaluate-ci-readiness` script
```yaml
evaluate node-js:
stage: test
script:
- evaluate-ci-readiness --repo=https://github.com/nodejs/node.git
artifacts:
paths:
- node.csv
```
To **test**, consider standing up a local Docker container of GitLab. Provision an access token with `api` scope and **OWNER** (ideally system **ADMINISTRATOR**) privileges. Create multiple projects with varying numbers of commits, pipelines, merge requests, and issues. Consider importing an open-source repo or using [GPT](https://gitlab.com/gitlab-org/quality/performance) to add projects to the system.
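For the throwaway local GitLab mentioned above, one option is Docker Compose. A sketch only, assuming the public `gitlab/gitlab-ce` image; the hostname and port mappings are illustrative and should be adjusted to your environment:

```yaml
# Hypothetical docker-compose sketch for a disposable local GitLab test instance.
services:
  gitlab:
    image: gitlab/gitlab-ce:latest
    hostname: gitlab.local
    ports:
      - "8080:80"   # web UI / API, i.e. evaluate-gitlab -s http://localhost:8080
      - "2222:22"   # SSH cloning
    shm_size: "256m"
```

First boot takes several minutes; wait until the web UI responds before provisioning the access token.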
## Design
Design for the script can be found [here](https://gitlab.com/gitlab-com/customer-success/professional-services-group/ps-leadership-team/ps-practice-management/-/issues/83).
## Project Thresholds
_Below are the thresholds used to determine whether a project can follow a normal migration or needs special steps to migrate._
### Project Data
- Project Size - 20GB
- Pipelines - 5,000 max
- Issues - 5,000 total (not just open)
- Merge Requests - 5,000 total (not just merged)
- Container images - 20GB per project
- Packages - Any packages present
### Repository Data
- Repository Size - 5GB
- Commits - 50K
- Branches - 1K
- Tags - 5K
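The project thresholds above can be checked mechanically against a project's stats. A minimal sketch; the `flag_project` helper and its argument order are illustrative and not part of Evaluate (size is in GB, counts are totals):

```shell
#!/bin/sh
# Hypothetical checker for the project thresholds listed above.
# Usage: flag_project <size_gb> <pipelines> <issues> <merge_requests>
# Prints "review" if any threshold is exceeded, "ok" otherwise.
flag_project() {
    # Thresholds: 20GB project size, 5,000 pipelines,
    # 5,000 issues (total), 5,000 merge requests (total).
    if [ "$1" -gt 20 ] || [ "$2" -gt 5000 ] || [ "$3" -gt 5000 ] || [ "$4" -gt 5000 ]; then
        echo "review"
    else
        echo "ok"
    fi
}

flag_project 3 1200 4800 900    # within every threshold, prints "ok"
flag_project 25 1200 4800 900   # size alone trips the check, prints "review"
```

A project flagged `review` is not blocked from migration; it just needs the special handling described above scoped into the engagement.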
## Contributions / Support
For support questions, please create [an issue](https://gitlab.com/gitlab-org/professional-services-automation/tools/utilities/evaluate/-/issues/new?issuable_template=evaluate-support) using our [Evaluate support issue template](./.gitlab/issue_templates/evaluate-support.md).