largeproductaffinity


Name: largeproductaffinity
Version: 1.1.0 (PyPI)
Summary: large_product_affinity is used to obtain product affinity from big data without a PySpark environment
Author: sarath babu
Keywords: python, apriori, product affinity, market basket analysis
Upload time: 2024-11-28 09:05:32
Requirements: No requirements were recorded.
            
# large_product_affinity

large_product_affinity is used to obtain product affinity from big data without a PySpark environment.



Features:

1. large_product_affinity has been proven to handle millions of rows of transactional data

2. It can also handle tens of thousands of products

3. large_product_affinity requires only a dataframe with two columns

4. It requires minimal pre-processing

5. No post-processing is required



Requirements for Input data:

1. Data should be a dataframe with two columns (see the sketch after this list)

2. The first column must be a transaction ID, or any field identifying the transaction

3. The second column should be the product corresponding to each transaction in the first column
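
A minimal sketch of the expected input shape, using pandas. The column names `transaction_id` and `product` are illustrative assumptions; only the two-column layout matters.

```python
import pandas as pd

# Two-column transactional data: one row per (transaction, product) pair.
# Column names are illustrative; the package only needs this layout.
df = pd.DataFrame(
    {
        "transaction_id": [1001, 1001, 1002, 1002, 1003],
        "product": ["bread", "butter", "bread", "milk", "butter"],
    }
)
```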



Input:

1. Choose an acceptable support threshold (see the note below for what the value implies)
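
For orientation, the standard market-basket meaning of support (a generic definition, not taken from this package's source): the support of an itemset is the fraction of transactions that contain it, so the chosen threshold controls how rare a product combination may be before it is discarded. A quick, illustrative check of what a threshold implies:

```python
# Illustrative arithmetic only: with 1,000,000 transactions and a minimum
# support of 0.01, an itemset must occur in at least 10,000 transactions.
n_transactions = 1_000_000
min_support = 0.01
print(int(n_transactions * min_support))  # 10000
```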



Pre-Processing:

1. The input dataframe must contain no nulls (a cleanup sketch follows below)
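
A minimal cleanup sketch with pandas, reusing the illustrative column names from the input example above:

```python
# Drop rows where either the transaction ID or the product is missing.
df = df.dropna(subset=["transaction_id", "product"])
```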



Post-Processing:

None



Output:

1. The product affinity table is sorted by confidence and then lift, both in descending order (see the sketch below)
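
For reference, the standard definitions of the reported metrics for a rule A -> B (generic market-basket formulas, not taken from this package's source): confidence = support(A and B) / support(A), and lift = confidence / support(B). A sketch of reproducing the sort order on a result dataframe, assuming output columns named `confidence` and `lift` (an assumption about the schema):

```python
# Sort an affinity table by confidence, then lift, both descending.
# Column names are assumed for illustration; adjust to the actual output.
result = result.sort_values(by=["confidence", "lift"], ascending=[False, False])
```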



Drawbacks:

1. The only drawback noted is that reading the data is limited by the capability of the user's system

2. Use Modin or Dask to read large volumes of data if pandas fails (a sketch follows below)
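
A minimal sketch of both fallbacks, assuming the transactions live in a CSV file (the path is illustrative):

```python
# Option 1: Dask reads the CSV in partitions, then materializes a pandas frame.
import dask.dataframe as dd
df = dd.read_csv("transactions.csv").compute()

# Option 2: Modin is a drop-in pandas replacement that parallelizes the read.
# import modin.pandas as mpd
# df = mpd.read_csv("transactions.csv")
```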


            
