# Evaluate
Evaluate is a script that can be run to gather information from a number of source code management and CI/CD orchestration systems to help prepare for migration or platform consolidation efforts. Currently Evaluate supports gathering data from:
- GitLab
- Bitbucket Server/Data Center
- GitHub Enterprise
- Jenkins
- Azure DevOps
This information is useful to the GitLab Professional Services (PS) team to accurately scope migration services. It is also helpful for customers and partners who are embarking on migration journeys.
[TOC]
## Contributions / Support
This tool is maintained by the Professional Services team and is not included in your GitLab Support if you have a license. For support questions please create [an issue](https://gitlab.com/gitlab-org/professional-services-automation/tools/utilities/evaluate/-/issues/new?issuable_template=evaluate-support) using our [Evaluate support issue template](./.gitlab/issue_templates/evaluate-support.md).
## Use Case
GitLab Professional Services shares this script with customers to run against their GitLab instance or group. The customer can then send the output files back so that GitLab engagement managers can scope engagements accurately. There is a [single file generated](reading-the-output.md).
## Install Method
### Versioning
- For GitLab versions < 16.0, use Evaluate version <= 0.24.0. Later Evaluate versions switched from REST API requests to GraphQL queries, which can cause issues retrieving data from older GitLab instances
- For GitLab versions >= 16.0, use Evaluate version > 0.24.0, ideally the latest
### Docker Container
[Docker containers with Evaluate installed](https://gitlab.com/gitlab-org/professional-services-automation/tools/utilities/evaluate/container_registry) are available to use.
```bash
# For GitLab versions older than 16.0. Evaluate versions newer than 0.24.0 switched from REST API requests to GraphQL queries, which can cause issues retrieving data from older GitLab instances
docker pull registry.gitlab.com/gitlab-org/professional-services-automation/tools/utilities/evaluate:0.24.0
# For GitLab versions newer than 16.0
docker pull registry.gitlab.com/gitlab-org/professional-services-automation/tools/utilities/evaluate:latest
# Spin up container
docker run --name evaluate -it registry.gitlab.com/gitlab-org/professional-services-automation/tools/utilities/evaluate:latest /bin/bash
# In docker shell
evaluate-gitlab -t <access-token-with-api-scope> -s https://gitlab.example.com
evaluate-jenkins -s https://jenkins.example.com -u <jenkins-admin-user> -t <access-token-or-password> # BETA
evaluate-bitbucket -s https://bitbucket.example.com -t <access-token> # BETA
evaluate-ado -s https://dev.azure.com/<your-org> -t <personal-access-token> # BETA
```
### Pipeline schedule
To schedule Evaluate to run on a regular basis, we recommend using the following pipeline:
```yml
image: registry.gitlab.com/gitlab-org/professional-services-automation/tools/utilities/evaluate:latest

stages:
  - evaluate

run-evaluate:
  stage: evaluate
  # variables:
  #   REQUESTS_CA_BUNDLE: "/custom/certs/my-cert.crt" # If you need a custom root CA certificate
  timeout: 4h
  script:
    - evaluate-gitlab -t $API_TOKEN -s https://<gitlab-hostname> -p <number-of-processes>
  artifacts:
    name: Report
    paths:
      - evaluate_report.xlsx
    expire_in: 1 week
```
**NOTES:**
- Configure `API_TOKEN` as a CI/CD variable containing an Admin personal access token with `read_api` or `api` scope
- Add Runner `tags` to select a **Linux** Runner with a `docker` executor
- Adjust the number of processes based on the [recommendation](#recommended-processes-per-project-count)
- Adjust `timeout` after the first run
- Create the pipeline schedule under _Build -> Pipeline schedules_
### Local (development / troubleshooting)
Requires Python 3.8 through 3.12 (Python 3.13 is not yet supported).
```bash
git clone https://gitlab.com/gitlab-org/professional-services-automation/tools/utilities/evaluate.git # or SSH
cd evaluate
pip install gitlab-evaluate
# In local terminal
evaluate-gitlab -t <access-token-with-api-scope> -s https://gitlab.example.com
evaluate-jenkins -s https://jenkins.example.com -u <jenkins-admin-user> -t <access-token-or-password> # BETA
evaluate-bitbucket -s https://bitbucket.example.com -t <access-token> # BETA
evaluate-ado -s https://dev.azure.com/<your-org> -t <personal-access-token> # BETA
```
## Usage
### GitLab
Evaluate is meant to be run by an **OWNER** (ideally system **ADMINISTRATOR**) of a GitLab instance to gather data about every project on the instance or group (including sub-groups).
1. A GitLab **OWNER** (ideally system **ADMINISTRATOR**) should provision an access token with `api` or `read_api` scope:
- [Personal access token](https://docs.gitlab.com/ee/user/profile/personal_access_tokens.html#create-a-personal-access-token) for instance
- [Group access token](https://docs.gitlab.com/ee/user/group/settings/group_access_tokens.html#create-a-group-access-token-using-ui) for group
2. Install `gitlab-evaluate` from the [Install](#install-method) section above.
3. Run :point_down:
For evaluating a GitLab instance
```bash
evaluate-gitlab -t <access-token-with-api-scope> -s https://gitlab.example.com
```
For evaluating a GitLab group (including sub-groups)
```bash
evaluate-gitlab -t <access-token-with-api-scope> -s https://gitlab.example.com -g 42
```
See [Recommended Processes per Project Count](#recommended-processes-per-project-count) to specify the number of processes to use.
**NOTE:** If you have configured rate limits on your instance to be more strict than the default settings, start with one process (`-p 1`) and adjust accordingly up to the recommended number of processes for your sized instance
**NOTE:** In the event Evaluate freezes or doesn't finish running while scanning a GitLab instance, re-run your evaluate command with an additional `-r` or `--generate-report` flag to generate a report based on the data retrieved so far
4. This should create a file called `evaluate_report.xlsx`
For more information on these files, see [reading the output](reading-the-output.md)
5. If you're coordinating a GitLab PS engagement, email these files to the GitLab account team.
#### Recommended Processes per Project Count
Evaluate uses 4 processes by default, which is sufficient for smaller GitLab instances, but may result in a slower scan time for larger instances. Below is a table covering recommended processes based on the overall number of projects on an instance:
| Number of Projects | Recommended Processes |
| ------------------ | --------------------- |
| < 100 | 4 (default) |
| < 1000 | 8 |
| < 10000 | 16 |
| < 100000 | 32 |
| > 100000 | 64-128 |
The number of processes is limited by a few factors:
- API rate limits on the GitLab instance itself
- Overall stability of the GitLab instance
- Available memory on the machine running Evaluate (less critical than the first two, but still a factor to consider)
You can ramp up the number of processes on a smaller instance to speed up the scans, but the performance gains for a large number of processes on a smaller instance will eventually plateau.
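As a quick sketch, the table above can be encoded in a small helper for picking the `-p` value. This is not part of Evaluate; the function name and the choice of 64 for the top tier (the table says 64-128) are this sketch's own.

```shell
# Illustrative helper (not part of Evaluate): map a project count to the
# recommended -p value from the table above.
recommended_processes() {
  projects=$1
  if [ "$projects" -lt 100 ]; then echo 4      # default
  elif [ "$projects" -lt 1000 ]; then echo 8
  elif [ "$projects" -lt 10000 ]; then echo 16
  elif [ "$projects" -lt 100000 ]; then echo 32
  else echo 64                                 # start here and ramp toward 128
  fi
}

# e.g. for an instance with roughly 5,000 projects:
# evaluate-gitlab -t "$API_TOKEN" -s https://gitlab.example.com -p "$(recommended_processes 5000)"
```

Remember to lower the result toward `-p 1` if your instance has strict rate limits configured.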
#### Command help screen
```text
Usage: evaluate-gitlab [OPTIONS]
Options:
-s, --source TEXT Source URL: REQ'd
-t, --token TEXT Personal Access Token: REQ'd
-o, --output Output Per Project Stats to screen
-i, --insecure Set to ignore SSL warnings.
-g, --group-id TEXT Group ID. Evaluate all group projects (including sub-
groups)
-f, --filename TEXT XLSX Output File Name. If not set, will default to
'evaluate_output.xlsx'
-p, --processes TEXT Number of processes. Defaults to number of CPU cores
-v, --verbose Set logging level to Debug and output everything to
the screen and log file
-r, --generate-report Generate full XLSX report from sqlite database.
Source and Token are still required for the report to
generate
--help Show this message and exit.
```
### [BETA] Jenkins
Evaluate supports scanning a Jenkins instance to retrieve basic metrics about the instance.
Evaluate is meant to be run by an admin of a Jenkins instance to gather data about Jenkins jobs and any plugins installed on the instance.
1. A Jenkins **ADMINISTRATOR** should provision an API token for Evaluate to use during the scan.
2. Install `gitlab-evaluate` from the [Install](#install-method) section above.
3. Run :point_down:
```bash
evaluate-jenkins -s https://jenkins.example.com -u <jenkins-admin-user> -t <access-token-or-password>
```
4. This should create a file called `evaluate_jenkins.xlsx`
5. If you're coordinating a GitLab PS engagement, email these files to the GitLab account team.
#### Command help screen
```sh
Usage: evaluate-jenkins [OPTIONS]
Options:
-s, --source TEXT Source URL: REQ'd
-u, --user TEXT Username associated with the Jenkins API token: REQ'd
-t, --token TEXT Jenkins API Token: REQ'd
-i, --insecure Set to ignore SSL warnings.
--help Show this message and exit.
```
### [BETA] Bitbucket
Evaluate supports scanning a Bitbucket Server/Data Center to retrieve relevant metadata about the server.
You can use either an admin or a non-admin token for the evaluation, but non-admin tokens can't pull user information.
1. A user should provision an access token for Evaluate to use during the scan.
2. Install `gitlab-evaluate` from the [Install](#install-method) section above.
3. Run :point_down:
```bash
evaluate-bitbucket -s https://bitbucket.example.com -t <access-token>
```
4. This should create a file called `evaluate_bitbucket.xlsx`
5. If you're coordinating a GitLab PS engagement, email these files to the GitLab account team.
#### Command help screen
```sh
Usage: evaluate-bitbucket [OPTIONS]
Options:
-s, --source TEXT  Source URL: REQ'd
-t, --token TEXT   Bitbucket access Token: REQ'd
--help Show this message and exit.
```
### [BETA] Azure DevOps
Evaluate supports scanning an Azure DevOps organization to retrieve relevant metadata.
<details>
<summary>
You need to use a [Personal Access Token](https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/use-personal-access-tokens-to-authenticate?view=azure-devops&tabs=Windows) with [Read scope](https://learn.microsoft.com/en-us/azure/devops/integrate/get-started/authentication/oauth?view=azure-devops#scopes) for most of the services.
</summary>
When running Evaluate for Azure DevOps, the tool retrieves information from the endpoints listed below. To ensure the tool functions correctly, create a personal access token with the required scopes shown below.
```
Get Descriptor
Endpoint: /_apis/graph/descriptors/{project_id}
Sub-API: vssps
Scope: Graph (Read)
Get Project Administrators Group
Endpoint: /_apis/graph/groups?scopeDescriptor={scopeDescriptor}
Sub-API: vssps
Scope: Graph (Read)
Get Project Administrators
Endpoint: /_apis/GroupEntitlements/{project_group_id}/members
Sub-API: vsaex
Scope: MemberEntitlementManagement (Read)
Get Work Items
Endpoint: /{project_id}/_apis/wit/wiql
Scope: Work Items (Read)
Get Release Definitions
Endpoint: /{project_id}/_apis/release/definitions
Sub-API: vsrm
Scope: Release (Read)
Get Build Definitions
Endpoint: /{project_id}/_apis/build/definitions
Scope: Build (Read)
Get Commits
Endpoint: /{project_id}/_apis/git/repositories/{repository_id}/commits
Scope: Code (Read)
Get Pull Requests
Endpoint: /{project_id}/_apis/git/repositories/{repository_id}/pullrequests
Scope: Code (Read)
Get Branches
Endpoint: /{project_id}/_apis/git/repositories/{repository_id}/refs
Scope: Code (Read)
Get Repositories
Endpoint: /{project_id}/_apis/git/repositories
Scope: Code (Read)
Get Project
Endpoint: /_apis/project/{project_id}
Scope: Project and Team (Read)
Get Projects
Endpoint: /_apis/projects
Scope: Project and Team (Read)
Get Users
Endpoint: /_apis/graph/users
Sub-API: vssps
Scope: Graph (Read)
Get Agent Pools
Endpoint: /_apis/distributedtask/pools
Scope: Agent Pools (Read)
Variable Groups
Endpoint: /_apis/distributedtask/variablegroups
Scope: Variable Groups (Read)
Test Connection
Endpoint: /_apis/ConnectionData
Scope: Service Connections (Read)
```
</details>
1. A user should provision an access token for Evaluate to use during the scan.
2. Install `gitlab-evaluate` from the [Install](#install-method) section above.
3. Run :point_down:
- For Azure DevOps Service (Cloud):
```bash
evaluate-ado -s https://dev.azure.com/<your-org> -t <personal-access-token>
```
- For Azure DevOps Server:
```bash
evaluate-ado -s {instance_url}/{collection} -t <personal-access-token> --api-version=7.0
```
- For Team Foundation Server (TFS):
```bash
evaluate-ado -s {server_url:port}/tfs/{collection} -t <personal-access-token> --api-version=4.1
```
> **Note:**
> When running Evaluate against **Azure DevOps Server** or **Team Foundation Server (TFS)**, you must specify the correct API version.
>
> To determine the required API version:
> 1. Click your user icon and select **Help > About** to view your server information.
> 2. Refer to the [API and TFS version mapping documentation](https://learn.microsoft.com/en-us/rest/api/azure/devops/?view=azure-devops-rest-7.2#api-and-tfs-version-mapping) to identify the appropriate API version for your server.
4. Unless the user provides a custom `--filename`, the report file is named `evaluate_ado` by default.
5. If you're coordinating a GitLab PS engagement, email these files to the GitLab account team.
#### Command help screen
```sh
Usage: evaluate-ado [OPTIONS]
Options:
-s, --source TEXT Source URL [required]
-t, --token TEXT Personal Access Token [required]
-p, --processes TEXT Number of processes. Defaults to number of CPU cores
--skip-details Skips details
--project TEXT Project ID. Evaluate all data within a given Azure
DevOps project (Project ID should be in UUID format)
--api-version TEXT API version to use (default: 7.2-preview)
-f, --filename TEXT XLSX Output File Name (default: evaluate_ado)
--help Show this message and exit.
```
### [BETA] GitHub Enterprise
Evaluate supports scanning a GitHub Enterprise Server (GHES) instance to retrieve relevant metadata about the server.
You have to use an admin personal access token ([other token types](https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/about-authentication-to-github#githubs-token-formats) are potentially supported) to do the evaluation.
1. A user should provision an admin access token for Evaluate to use during the scan.
2. Install `gitlab-evaluate` from the [Install](#install-method) section above.
3. OPTIONAL: If you are using a custom CA, export the CA bundle: `export REQUESTS_CA_BUNDLE=/etc/ssl/certs/ca-certificates.crt`
4. Run :point_down:
```bash
evaluate-github-enterprise -s https://github.example.com -t <access-token>
```
5. This should create a file called `evaluate_github.xlsx`
6. If you're coordinating a GitLab PS engagement, email these files to the GitLab account team.
#### Command help screen
```sh
Usage: evaluate-github-enterprise [OPTIONS]
Options:
-s, --source TEXT  Source URL: REQ'd
-t, --token TEXT   Github access Token: REQ'd
--help Show this message and exit.
```
## GitLab Project Thresholds
_Below are the thresholds used to determine whether a project can follow a normal migration or needs special steps taken in order to migrate_
### Project Data
- Project Size - 20GB
- Pipelines - 5,000 max
- Issues - 5,000 total (not just open)
- Merge Requests - 5,000 total (not just merged)
- Container images - 20GB per project
- Packages - Any packages present
### Repository Data
- Repository Size - 5GB
- Commits - 50K
- Branches - 1K
- Tags - 5K
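To make the thresholds concrete, here is an illustrative check (the function and its inputs are hypothetical, not Evaluate output) that applies the Repository Data limits above to a single repository's stats:

```shell
# Illustrative check (not part of Evaluate): flag a repository that
# exceeds any Repository Data threshold listed above.
flag_repository() {
  size_gb=$1; commits=$2; branches=$3; tags=$4
  if [ "$size_gb" -gt 5 ] || [ "$commits" -gt 50000 ] ||
     [ "$branches" -gt 1000 ] || [ "$tags" -gt 5000 ]; then
    echo "needs special migration steps"
  else
    echo "normal migration"
  fi
}

flag_repository 2 10000 200 50   # -> normal migration
flag_repository 7 10000 200 50   # -> needs special migration steps (size over 5GB)
```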