# socketsync

- Name: socketsync
- Version: 1.0.0
- Summary: Socket Security SIEM Sync Tool
- Homepage: https://github.com/SocketDev/socket-siem-connector
- Upload time: 2024-06-12 06:33:33
- Requires Python: >=3.9
- Keywords: socketsync, socket.dev, sca, oss, security
- Requirements: none recorded
# socket-issues-export

## Purpose
This script exports the alerts from Socket Health reports into other tools.

This tool supports the following connectors:

- CSV
- Google BigQuery
- Panther SIEM
- Elasticsearch
- WebHook
- Slack

### Other SIEM Integrations

Some SIEM tools have different ways of getting the data into their system.

- Splunk - App found [here](https://splunkbase.splunk.com/app/7158)

## Required Configuration

The connectors supported by this script share some common configuration used to pull the data from Socket.

### Options
| Option     | Required | Format                        | Description                                                                                                                                                             |
|------------|----------|-------------------------------|-------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| api_key    | True     | string                        | This is the Socket API Key created in the Socket dashboard. This should have the scoped permissions to access reports                                                   |
| start_date | False    | string(`YYYY-MM-DD HH:MM:SS`) | If not defined, all reports and their corresponding issues are pulled. If defined, only reports that match or are newer than `start_date` are pulled                     |
| report_id  | False    | Socket Report ID              | If this is provided then only the specified report ID will be processed                                                                                                 |
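Since `start_date` must match the `YYYY-MM-DD HH:MM:SS` format exactly, a quick standard-library check can catch typos before any reports are pulled. This helper is illustrative only and not part of socketsync:

```python
from datetime import datetime

def validate_start_date(value: str) -> str:
    """Raise ValueError if value is not in YYYY-MM-DD HH:MM:SS format."""
    datetime.strptime(value, "%Y-%m-%d %H:%M:%S")
    return value

validate_start_date("2024-06-01 00:00:00")  # accepted
```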


### Example

```python
import os
from socketsync.core import Core

if __name__ == '__main__':
    socket_org = os.getenv("SOCKET_ORG") or exit(1)
    api_key = os.getenv("SOCKET_API_KEY") or exit(1)
    start_date = os.getenv("START_DATE")
    report_id = os.getenv("SOCKET_REPORT_ID")
    core = Core(
        api_key=api_key,
        start_date=start_date,
        report_id=report_id
    )
    issue_data = core.get_issues()
```


## Examples for each supported connector

### CSV

The CSV Export function will output to a specified CSV file. Currently, it will overwrite the file if it already exists.

Initializing Options:

| Option  | Required | Default     | Description                                                                                                                                         |
|---------|----------|-------------|-----------------------------------------------------------------------------------------------------------------------------------------------------|
| file    | True     | None        | The name of the file to write the CSV results out to                                                                                                |
| columns | False    | All Columns | The names of the column headers and the order for the columns. Must match the property names for the issues. If not passed default columns are used |
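As an illustration of what column selection means, `csv.DictWriter` from the standard library can restrict and order output columns in the same spirit. This is only a sketch of the behavior, not the connector's actual implementation, and the property names used are hypothetical:

```python
import csv
import io

def rows_to_csv(rows, columns):
    """Write only the named columns, in order, ignoring any other keys."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=columns, extrasaction="ignore")
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

# "id" and "severity" are hypothetical issue property names.
rows_to_csv([{"id": 1, "severity": "high", "extra": "x"}], ["id", "severity"])
```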

```python
import os
from socketsync.core import Core
from socketsync.connectors.csv import CSV

if __name__ == '__main__':
    socket_org = os.getenv("SOCKET_ORG") or exit(1)
    api_key = os.getenv("SOCKET_API_KEY") or exit(1)
    start_date = os.getenv("START_DATE")
    report_id = os.getenv("SOCKET_REPORT_ID")
    core = Core(
        api_key=api_key,
        start_date=start_date,
        report_id=report_id
    )
    issue_data = core.get_issues()

    csv_file = os.getenv("CSV_FILE") or exit(1)
    csv = CSV(
        file=csv_file
    )
    csv.write_csv(issue_data)
```

### Google BigQuery

The BigQuery connector sends data to the specified table within BigQuery. Currently, to authenticate you will need to do the following before running the code:

1. Install the [GCloud CLI](https://cloud.google.com/sdk/docs/install)
2. In a terminal run `gcloud auth login`
3. In a terminal run `gcloud config set project $MY_PROJECT_ID`

Initializing Options:

| Option | Required | Default | Description                                                                      |
|--------|----------|---------|----------------------------------------------------------------------------------|
| table  | True     | None    | This is the table in the format of `dataset.table` that results will be added to |
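The `dataset.table` format can be sanity-checked before constructing the connector. This helper is only an illustration and not part of socketsync:

```python
def split_table(table):
    """Split a 'dataset.table' identifier, rejecting malformed values."""
    dataset, sep, name = table.partition(".")
    if not sep or not dataset or not name:
        raise ValueError(f"expected 'dataset.table', got {table!r}")
    return dataset, name
```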

```python
import os
from socketsync.core import Core
from socketsync.connectors.bigquery import BigQuery

if __name__ == '__main__':
    socket_org = os.getenv("SOCKET_ORG") or exit(1)
    api_key = os.getenv("SOCKET_API_KEY") or exit(1)
    start_date = os.getenv("START_DATE")
    report_id = os.getenv("SOCKET_REPORT_ID")
    core = Core(
        api_key=api_key,
        start_date=start_date,
        report_id=report_id
    )
    issue_data = core.get_issues()
    bigquery_table = os.getenv('GOOGLE_TABLE') or exit(1)
    bigquery = BigQuery(bigquery_table)
    errors = bigquery.add_dataset(issue_data, streaming=True)
```

### Panther
The Panther connector requires an HTTP connector set up in the Panther UI. This example uses a bearer token, but that can be overridden with custom headers if desired.

Configuration can be found [here](panther/README.md).

Initializing Options:

| Option  | Required | Default | Description                                                                                           |
|---------|----------|---------|-------------------------------------------------------------------------------------------------------|
| token   | False    | None    | Token to use if you are using Bearer token. Default method if custom headers are not passed to `send` |
| url     | True     | None    | Panther Webhook URL to POST data to                                                                   |
| timeout | False    | 10      | Timeout in seconds for requests                                                                       |

```python
import os
from socketsync.core import Core
from socketsync.connectors.panther import Panther

if __name__ == '__main__':
    socket_org = os.getenv("SOCKET_ORG") or exit(1)
    api_key = os.getenv("SOCKET_API_KEY") or exit(1)
    start_date = os.getenv("START_DATE")
    report_id = os.getenv("SOCKET_REPORT_ID")
    core = Core(
        api_key=api_key,
        start_date=start_date,
        report_id=report_id
    )
    issue_data = core.get_issues()
    panther_url = os.getenv('PANTHER_URL') or exit(1)
    panther_token = os.getenv('PANTHER_TOKEN') or exit(1)
    panther = Panther(
        token=panther_token,
        url=panther_url
    )
    for issue in issue_data:
        panther.send(str(issue))
        print(f"Processed issue id: {issue.id}")
```

### Elasticsearch
The Elasticsearch connector should work with on-prem or cloud-hosted Elasticsearch configurations. The configuration when initializing `Elastic` is the same as in the [Elasticsearch documentation](https://elasticsearch-py.readthedocs.io/en/v8.11.1/quickstart.html#connecting).

```python
import os
from socketsync.core import Core
from socketsync.connectors.elastic import Elastic

if __name__ == '__main__':
    socket_org = os.getenv("SOCKET_ORG") or exit(1)
    api_key = os.getenv("SOCKET_API_KEY") or exit(1)
    start_date = os.getenv("START_DATE")
    report_id = os.getenv("SOCKET_REPORT_ID")
    core = Core(
        api_key=api_key,
        start_date=start_date,
        report_id=report_id
    )
    issue_data = core.get_issues()
    elastic_token = os.getenv('ELASTIC_TOKEN') or exit(1)
    elastic_cloud_id = os.getenv('ELASTIC_CLOUD_ID') or exit(1)
    elastic_index = os.getenv('ELASTIC_ID') or exit(1)
    es = Elastic(
        api_key=elastic_token,
        cloud_id=elastic_cloud_id
    )
    for issue in issue_data:
        es.add_document(issue, elastic_index)
```

### WebHook
The WebHook integration is a simple wrapper for sending an HTTP(S) request to the desired URL.

Initializing Options:

| Option       | Required | Default                                                                                                        | Description                                                      |
|--------------|----------|----------------------------------------------------------------------------------------------------------------|------------------------------------------------------------------|
| url          | True     | None                                                                                                           | URL for the WebHook                                              |
| headers      | False    | `{'User-Agent': 'SocketPythonScript/0.0.1', "accept": "application/json", 'Content-Type': "application/json"}` | Default set of headers to use if not specified                   |
| auth_headers | False    | None                                                                                                           | Dictionary of auth headers to use to authenticate to the WebHook |
| params       | False    | None                                                                                                           | Dictionary of query params to use if needed                      |
| timeout      | False    | 10                                                                                                             | Time in seconds before a request times out                       |
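The header behavior described above can be pictured with a small sketch: defaults apply unless `headers` is given, and `auth_headers` are layered on top. These are assumed semantics for illustration, not the connector's actual code:

```python
DEFAULT_HEADERS = {
    "User-Agent": "SocketPythonScript/0.0.1",
    "accept": "application/json",
    "Content-Type": "application/json",
}

def build_headers(headers=None, auth_headers=None):
    """Assumed merge order: explicit headers (or defaults), then auth headers."""
    merged = dict(headers if headers is not None else DEFAULT_HEADERS)
    merged.update(auth_headers or {})
    return merged
```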

```python
import json
import os
from socketsync.core import Core
from socketsync.connectors.webhook import Webhook

if __name__ == '__main__':
    socket_org = os.getenv("SOCKET_ORG") or exit(1)
    api_key = os.getenv("SOCKET_API_KEY") or exit(1)
    start_date = os.getenv("START_DATE")
    report_id = os.getenv("SOCKET_REPORT_ID")
    core = Core(
        api_key=api_key,
        start_date=start_date,
        report_id=report_id
    )
    issue_data = core.get_issues()
    webhook_url = os.getenv("WEBHOOK_URL") or exit(1)
    webhook_auth_headers = json.loads(
        os.getenv("WEBHOOK_AUTH_HEADERS") or '{"Authorization": "Bearer EXAMPLE"}'
    )
    webhook = Webhook(webhook_url, auth_headers=webhook_auth_headers)
    for issue in issue_data:
        issue_json = json.loads(str(issue))
        webhook.send(issue_json)
```

### Slack WebHook
The Slack WebHook integration is a simple wrapper for sending an HTTP(S) request to the desired Slack Webhook URL.

Initializing Options:

| Option       | Required | Default                                                                                                        | Description                                                      |
|--------------|----------|----------------------------------------------------------------------------------------------------------------|------------------------------------------------------------------|
| url          | True     | None                                                                                                           | URL for the WebHook                                              |
| headers      | False    | `{'User-Agent': 'SocketPythonScript/0.0.1', "accept": "application/json", 'Content-Type': "application/json"}` | Default set of headers to use if not specified                   |
| params       | False    | None                                                                                                           | Dictionary of query params to use if needed                      |
| timeout      | False    | 10                                                                                                             | Time in seconds before a request times out                       |

```python
import json
import os
from socketsync.core import Core
from socketsync.connectors.slack import Slack

if __name__ == '__main__':
    socket_org = os.getenv("SOCKET_ORG") or exit(1)
    api_key = os.getenv("SOCKET_API_KEY") or exit(1)
    start_date = os.getenv("START_DATE")
    report_id = os.getenv("SOCKET_REPORT_ID")
    core = Core(
        api_key=api_key,
        start_date=start_date,
        report_id=report_id
    )
    issue_data = core.get_issues()
    slack_url = os.getenv("SLACK_WEBHOOK_URL") or exit(1)
    slack = Slack(slack_url)
    for issue in issue_data:
        issue_json = json.loads(str(issue))
        slack.send(issue_json)
```

            
