funcStats

Name: funcStats
Version: 0.1.11
Summary: A module that helps to evaluate functions when some tests are needed
Upload time: 2023-06-13 02:52:52
Author: Elemental (Tom Neto)
Keywords: python, functions, cronometer, crono, meter, evaluate functions
Requirements: none recorded
# FuncStats

### This package helps you measure the performance of your Python functions. We hope you enjoy it!


#### To use, follow the steps below:

- First, install the package by running:

        pip install funcStats


 - To load the monitor from the funcStats package, add the following import:
    
        from funcStats import monitor

 - Create or import the function that you will be testing; this will be our test function:

        def testList(var1, var2, var3):
            currentList = []
            for var in [var1, var2, var3]:
                currentList.append(var)
            return currentList

 - After loading the monitor, create a monitor instance by passing the function to be monitored as the target argument:

        myMonitor = monitor(target=testList)

 - This gives you access to information about the target, which can be retrieved as in the following code:
    
        print(myMonitor.targetName)
        print(myMonitor.targetArgs)
        print(myMonitor.targetInfo)

 - You can also assign an output file to the monitor, which will write the report to a file:
    
        toFileMonitor = monitor(target=testList, toFile='path/to/file/testLog', console=True)

     The output file is generated with the .log extension, and the datetime of execution is appended to its name for convenience. If the target directory might not exist yet, see the sketch below.
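
Since toFile takes a file path, you may want to make sure the target directory exists before the monitor writes the report. This is plain Python housekeeping rather than part of funcStats; the directory name below is only a placeholder:

        import os

        from funcStats import monitor

        logDir = 'path/to/file'             # placeholder directory for the reports
        os.makedirs(logDir, exist_ok=True)  # create it if it does not exist yet

        toFileMonitor = monitor(target=testList, toFile=os.path.join(logDir, 'testLog'), console=True)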


### One of the great advantages of this package is the ability to run multiple tests with different loads through the meter method, so you can see how your code performs.

 - It is also possible to define multiple sets of arguments for the executions, as in the example below:
    
        multipleTestsArgs = [
            ('a', 'b', 'c'),  # here goes the arguments based on position for the first test attempt
            ('d', 'e', 'f'),  # here goes the arguments based on position for the second test attempt and so on...
            ('g', 'h', 'i')
        ]

 - After assigning the arguments, simply call the meter and pass the argument list; the target function will run once for each sample provided (a sketch for generating larger argument lists follows this example).

        toFileMonitor.meter(multipleTestsArgs)
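
If you need a heavier load, the argument list is an ordinary Python list of tuples, so it can be built programmatically. A minimal sketch follows; the data generation here is plain Python and not part of funcStats:

        import random
        import string

        # Build 100 random three-letter argument tuples for testList
        generatedArgs = [
            tuple(random.choice(string.ascii_lowercase) for _ in range(3))
            for _ in range(100)
        ]

        toFileMonitor.meter(generatedArgs)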

 
 - If you need to run the function multiple times with the same set of args, add the loops argument to the meter call (a full end-to-end sketch follows this example).

       singleTestsArgs = [('a', 'b', 'c')]  # this time we pass a single tuple of args for our target function
       toFileMonitor.meter(singleTestsArgs, loops = 10)
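
Putting the steps above together, a minimal end-to-end script might look like the sketch below. It only uses the calls already shown in this README, and the output path is a placeholder:

        from funcStats import monitor

        def testList(var1, var2, var3):
            currentList = []
            for var in [var1, var2, var3]:
                currentList.append(var)
            return currentList

        # Report both to the console and to a .log file
        toFileMonitor = monitor(target=testList, toFile='path/to/file/testLog', console=True)

        # One pass over three different argument sets...
        toFileMonitor.meter([('a', 'b', 'c'), ('d', 'e', 'f'), ('g', 'h', 'i')])

        # ...and ten repetitions of the same argument set
        toFileMonitor.meter([('a', 'b', 'c')], loops=10)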

### This gives you access to the information needed to monitor the performance of your function, without too much hassle.
Here is a sample report obtained by running the multiple-args example:

    [FuncStats Report] 2023-06-10 23:14:52.501321 - [testList] - Meter: Loading args... [('a', 'b', 'c'), ('d', 'e', 'f'), ('g', 'h', 'i')]

    [FuncStats Report] 2023-06-10 23:14:52.501480 - [testList] - Meter Execution Count: 1
    [FuncStats Report] 2023-06-10 23:14:52.501596 - [testList] - Meter Execution Starts at: 2023-06-10 23:14:52.501477
    [FuncStats Report] 2023-06-10 23:14:52.501710 - [testList] - Meter Execution Args: ('a', 'b', 'c')
    [FuncStats Report] 2023-06-10 23:14:52.501824 - [testList] - Meter Execution Returns: ['a', 'b', 'c']...
    [FuncStats Report] 2023-06-10 23:14:52.501946 - [testList] - Meter Execution Takes: 0h 0m 0s 0ms


    [FuncStats Report] 2023-06-10 23:14:52.502111 - [testList] - Meter Execution Count: 2
    [FuncStats Report] 2023-06-10 23:14:52.502224 - [testList] - Meter Execution Starts at: 2023-06-10 23:14:52.502108
    [FuncStats Report] 2023-06-10 23:14:52.502329 - [testList] - Meter Execution Args: ('d', 'e', 'f')
    [FuncStats Report] 2023-06-10 23:14:52.502437 - [testList] - Meter Execution Returns: ['d', 'e', 'f']...
    [FuncStats Report] 2023-06-10 23:14:52.502545 - [testList] - Meter Execution Takes: 0h 0m 0s 0ms


    [FuncStats Report] 2023-06-10 23:14:52.502696 - [testList] - Meter Execution Count: 3
    [FuncStats Report] 2023-06-10 23:14:52.502895 - [testList] - Meter Execution Starts at: 2023-06-10 23:14:52.502693
    [FuncStats Report] 2023-06-10 23:14:52.503016 - [testList] - Meter Execution Args: ('g', 'h', 'i')
    [FuncStats Report] 2023-06-10 23:14:52.503129 - [testList] - Meter Execution Returns: ['g', 'h', 'i']...
    [FuncStats Report] 2023-06-10 23:14:52.503238 - [testList] - Meter Execution Takes: 0h 0m 0s 0ms

Another sample, this time from the single-args example:

    [FuncStats Report] 2023-06-11 00:09:57.752710 - [testList] - Meter: Loading args... [('a', 'b', 'c')]

    [FuncStats Report] 2023-06-11 00:09:57.752747 - [testList] - Meter Execution Count: 1
    [FuncStats Report] 2023-06-11 00:09:57.752759 - [testList] - Meter Execution Starts at: 2023-06-11 00:09:57.752741
    [FuncStats Report] 2023-06-11 00:09:57.752767 - [testList] - Meter Execution Args: ('a', 'b', 'c')
    [FuncStats Report] 2023-06-11 00:09:57.752778 - [testList] - Meter Execution Returns: ['a', 'b', 'c']...
    [FuncStats Report] 2023-06-11 00:09:57.752795 - [testList] - Meter Execution Takes: 0h 0m 0s 0ms


    [FuncStats Report] 2023-06-11 00:09:57.752814 - [testList] - Meter Execution Count: 2
    [FuncStats Report] 2023-06-11 00:09:57.752825 - [testList] - Meter Execution Starts at: 2023-06-11 00:09:57.752806
    [FuncStats Report] 2023-06-11 00:09:57.752832 - [testList] - Meter Execution Args: ('a', 'b', 'c')
    [FuncStats Report] 2023-06-11 00:09:57.752842 - [testList] - Meter Execution Returns: ['a', 'b', 'c']...
    [FuncStats Report] 2023-06-11 00:09:57.752851 - [testList] - Meter Execution Takes: 0h 0m 0s 0ms


    [FuncStats Report] 2023-06-11 00:09:57.752889 - [testList] - Meter Execution Count: 3
    [FuncStats Report] 2023-06-11 00:09:57.752898 - [testList] - Meter Execution Starts at: 2023-06-11 00:09:57.752883
    [FuncStats Report] 2023-06-11 00:09:57.752907 - [testList] - Meter Execution Args: ('a', 'b', 'c')
    [FuncStats Report] 2023-06-11 00:09:57.752917 - [testList] - Meter Execution Returns: ['a', 'b', 'c']...
    [FuncStats Report] 2023-06-11 00:09:57.752927 - [testList] - Meter Execution Takes: 0h 0m 0s 0ms
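
If you want to post-process a saved report, the lines are plain text, so a little standard-library parsing goes a long way. The sketch below pulls the "Meter Execution Takes" durations out of a .log file; it assumes the lines follow exactly the format shown in the samples above, and the file path is a placeholder:

        import re

        # Matches e.g. "Meter Execution Takes: 0h 0m 0s 0ms"
        takesPattern = re.compile(r'Meter Execution Takes: (\d+)h (\d+)m (\d+)s (\d+)ms')

        durationsMs = []
        with open('path/to/file/testLog.log') as report:  # placeholder path
            for line in report:
                match = takesPattern.search(line)
                if match:
                    h, m, s, ms = (int(value) for value in match.groups())
                    durationsMs.append(((h * 60 + m) * 60 + s) * 1000 + ms)

        print(durationsMs)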

You can find the source code for this package at https://github.com/tomneto/funcStats.

            
