fst-pso

Name: fst-pso
Version: 1.8.1
Home page: https://github.com/aresio/fst-pso
Summary: Fuzzy Self-Tuning PSO global optimization library
Author: Marco S. Nobile
License: LICENSE.txt
Upload time: 2021-10-26 14:28:06
Keywords: fuzzy logic, optimization, discrete optimization, continuous optimization, particle swarm optimization, pso, fst-pso, fuzzy self-tuning pso
=====================
Fuzzy Self-Tuning PSO
=====================

*Fuzzy Self-Tuning PSO* (FST-PSO) is a swarm intelligence global optimization method [1]
based on Particle Swarm Optimization [2].

FST-PSO is designed for the minimization of real-valued, multi-dimensional, multi-modal objective functions.

FST-PSO is a settings-free version of PSO that exploits fuzzy logic to dynamically assign the functioning parameters to each particle in the swarm. Specifically, during each generation, FST-PSO determines the optimal choice of the cognitive factor, the social factor, the inertia value, the minimum velocity, and the maximum velocity for each particle. FST-PSO also uses a heuristic to choose the swarm size, so that the user does not have to specify any setting.
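
The exact fuzzy rules used by FST-PSO are described in [1]. As a purely conceptual sketch (not the library's actual rule base), the idea of deriving a per-particle setting from fuzzy memberships can be illustrated as follows, here mapping a particle's normalized fitness improvement to an inertia weight with a simple Sugeno-style defuzzification; the membership breakpoints and output values are hypothetical::

    # Conceptual sketch only: NOT FST-PSO's rule base, just an illustration of
    # fuzzy self-tuning of one setting (the inertia weight) per particle.

    def triangular(x, a, b, c):
        # Triangular membership function with support [a, c] and peak at b.
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def tune_inertia(improvement):
        # 'improvement' is the particle's normalized fitness improvement in [0, 1].
        # Particles that improved little keep a high inertia (exploration),
        # particles that improved a lot get a low inertia (exploitation).
        low    = triangular(improvement, -1.0, 0.0, 0.5)
        medium = triangular(improvement,  0.0, 0.5, 1.0)
        high   = triangular(improvement,  0.5, 1.0, 2.0)
        weights  = (low, medium, high)
        inertias = (0.9, 0.6, 0.3)   # hypothetical output value for each rule
        return sum(w * v for w, v in zip(weights, inertias)) / sum(weights)

    print(tune_inertia(0.2))   # little improvement -> relatively high inertia (0.78)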

In order to use FST-PSO, the programmer must implement a custom fitness function, and must specify the number of dimensions of the problem and the boundaries of the search space in each dimension. The maximum number of iterations can optionally be specified. When the stopping criterion is met, FST-PSO returns the best solution found, along with its fitness value.


Example
=======
FST-PSO can be used as follows::

    from fstpso import FuzzyPSO

    def example_fitness(particle):
        # Sphere function: sum of squared coordinates (global minimum 0 at the origin).
        return sum(map(lambda x: x**2, particle))

    if __name__ == '__main__':
        dims = 10

        FP = FuzzyPSO()
        FP.set_search_space([[-10, 10]] * dims)
        FP.set_fitness(example_fitness)
        result = FP.solve_with_fstpso()

        print("Best solution:", result[0])
        print("Whose fitness is:", result[1])




Further information
-------------------

FST-PSO has been created by M.S. Nobile, D. Besozzi, G. Pasi, G. Mauri,
R. Colombo (University of Milano-Bicocca, Italy), and P. Cazzaniga (University
of Bergamo, Italy). The source code was written by M.S. Nobile.

FST-PSO requires two packages: miniful and numpy. 

Further information is available on GitHub: <https://github.com/aresio/fst-pso>

[1] Nobile, Cazzaniga, Besozzi, Colombo, Mauri, Pasi, "Fuzzy Self-Tuning PSO:
A Settings-Free Algorithm for Global Optimization", Swarm and Evolutionary
Computation, 39:70-85, 2018 (doi:10.1016/j.swevo.2017.09.001)
<http://www.sciencedirect.com/science/article/pii/S2210650216303534>

[2] Kennedy, Eberhart, "Particle swarm optimization", in: Proceedings of the IEEE
International Conference on Neural Networks, Vol. 4, 1995, pp. 1942–1948
            
