sparkconnector

- Name: sparkconnector
- Version: 3.0.2
- Summary: Helper to connect to CERN's Spark Clusters
- Home page: https://github.com/swan-cern/jupyter-extensions
- Author: SWAN Admins
- License: AGPL-3.0
- Requires Python: >=3.6
- Keywords: jupyter, jupyterlab, swan, cern
- Uploaded: 2024-05-03 11:34:57
# SparkConnector

Helper to connect to CERN's Spark Clusters

This extension is built as a Python module named `sparkconnector`, which simplifies the connection to Spark clusters.

It installs:

1. an nbclassic extension
1. a JupyterLab extension
1. an IPython extension


## Requirements

- JupyterLab >= 4.0.0
- pyspark (not installed by default)

## Install

To install the extension, execute:

```bash
pip install sparkconnector
jupyter nbclassic-extension install sparkconnector --py
jupyter nbclassic-extension enable  sparkconnector --py
```

It is also necessary to enable the IPython extension. Append the following line to the IPython kernel config file (usually `~/.ipython/profile_default/ipython_kernel_config.py`; see [the IPython configuration docs](https://ipython.readthedocs.io/en/stable/config/intro.html#python-configuration-files)):

```python
c.InteractiveShellApp.extensions.append('sparkconnector.connector')
```
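As an illustration only (this helper is not part of the package), appending that line to the config file can be scripted idempotently with the standard library, so repeated installs do not add duplicate entries:

```python
from pathlib import Path

# The line the README asks you to append to ipython_kernel_config.py.
LINE = "c.InteractiveShellApp.extensions.append('sparkconnector.connector')"

def enable_in_config(config_path: Path) -> bool:
    """Append the extension line to an IPython config file if it is
    not already present. Returns True if the file was modified."""
    config_path.parent.mkdir(parents=True, exist_ok=True)
    existing = config_path.read_text() if config_path.exists() else ""
    if LINE in existing:
        return False  # already enabled, nothing to do
    with config_path.open("a") as f:
        if existing and not existing.endswith("\n"):
            f.write("\n")
        f.write(LINE + "\n")
    return True
```

Calling `enable_in_config(Path.home() / ".ipython/profile_default/ipython_kernel_config.py")` a second time is a no-op, which makes the step safe to include in provisioning scripts.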

## Uninstall

To remove the extension, execute:

```bash
pip uninstall sparkconnector
```

## Contributing

### Development install

Note: You will need NodeJS to build the extension package.

The `jlpm` command is JupyterLab's pinned version of
[yarn](https://yarnpkg.com/) that is installed with JupyterLab. You may use
`yarn` or `npm` in lieu of `jlpm` below.

```bash
# Clone the repo to your local environment
# Change directory to the sparkconnector directory
# Install package in development mode
pip install -e "."
# Link your development version of the extension with JupyterLab
jupyter labextension develop . --overwrite
# Rebuild extension TypeScript source after making changes
jlpm build
```

You can watch the source directory and run JupyterLab in separate terminals, so that changes to the extension's source are detected and the extension is rebuilt automatically.

```bash
# Watch the source directory in one terminal, automatically rebuilding when needed
jlpm watch
# Run JupyterLab in another terminal
jupyter lab
```

With the watch command running, every saved change will immediately be built locally and available in your running JupyterLab. Refresh JupyterLab to load the change in your browser (you may need to wait several seconds for the extension to be rebuilt).

By default, the `jlpm build` command generates the source maps for this extension to make it easier to debug using the browser dev tools. To also generate source maps for the JupyterLab core extensions, you can run the following command:

```bash
jupyter lab build --minimize=False
```

### Development uninstall

```bash
pip uninstall sparkconnector
```

In development mode, you will also need to remove the symlink created by the `jupyter labextension develop`
command. To find its location, run `jupyter labextension list` to see where the `labextensions`
folder is located, then remove the symlink named `@swan-cern/sparkconnector` inside that folder.
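The removal step above can be sketched as a small stdlib-only helper (hypothetical, not shipped with the package); it assumes you have already located the `labextensions` folder via `jupyter labextension list`:

```python
from pathlib import Path

def remove_dev_symlink(labextensions_dir: Path,
                       name: str = "@swan-cern/sparkconnector") -> bool:
    """Remove the dev-install symlink under labextensions_dir if present.
    Returns True if a symlink was removed, False otherwise."""
    link = labextensions_dir / name
    if link.is_symlink():
        link.unlink()  # removes only the link, not the linked source tree
        return True
    return False
```

Checking `is_symlink()` first ensures a real (non-linked) extension directory is never deleted by accident.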

### Packaging the extension

See [RELEASE](RELEASE.md)

            
