
A theoretical framework and some promising findings of grey wolf optimizer, part II: global convergence analysis

Published 15 Mar 2022 in math.OC (arXiv:2203.07636v1)

Abstract: This paper proposes a theoretical framework for the grey wolf optimizer (GWO) based on several interesting theoretical findings, involving the sampling distribution, order-1 and order-2 stability, and global convergence analysis. In part II of the paper, the global convergence analysis is carried out under the well-known stagnation assumption, adopted for simplification. First, the global convergence property of the GWO under the stagnation assumption is abstracted and modelled as two propositions, corresponding to global searching ability analysis and probability-1 global convergence analysis. Then, the global searching ability analysis is carried out. Next, based on a characteristic of the central moments of the new solution of the GWO under the stagnation assumption, the probability-1 global convergence property of the GWO under the stagnation assumption is proved. Finally, all conclusions are verified by numerical simulations, and it is discussed that the global convergence property can still be guaranteed for the original GWO without the stagnation assumption.
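For context, the update rule the analysis concerns is the standard GWO (Mirjalili et al., 2014): each wolf moves to the mean of three positions steered by the alpha, beta, and delta leaders, with a control parameter `a` decaying linearly from 2 to 0. The following is a minimal sketch of that baseline algorithm, not the paper's theoretical framework; the population size, iteration count, bounds, and test function below are illustrative choices.

```python
import numpy as np

def gwo(fitness, dim, n_wolves=20, iters=100, lb=-5.0, ub=5.0, seed=0):
    """Minimal standard GWO: wolves are attracted toward the three best
    solutions (alpha, beta, delta); `a` decays linearly from 2 to 0."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (n_wolves, dim))
    for t in range(iters):
        a = 2.0 * (1 - t / iters)  # exploration -> exploitation schedule
        order = np.argsort([fitness(x) for x in X])
        alpha, beta, delta = X[order[0]], X[order[1]], X[order[2]]
        for i in range(n_wolves):
            cand = np.zeros(dim)
            for leader in (alpha, beta, delta):
                A = a * (2.0 * rng.random(dim) - 1.0)  # A ~ U(-a, a)
                C = 2.0 * rng.random(dim)              # C ~ U(0, 2)
                D = np.abs(C * leader - X[i])          # distance to leader
                cand += leader - A * D                 # step toward leader
            X[i] = np.clip(cand / 3.0, lb, ub)         # mean of the 3 pulls
    order = np.argsort([fitness(x) for x in X])
    return X[order[0]]

# Example: minimize the sphere function in 5 dimensions.
best = gwo(lambda x: float(np.sum(x**2)), dim=5)
```

The paper's "stagnation assumption" freezes the alpha, beta, and delta leaders across iterations, which simplifies the sampling distribution of `cand` enough to make the moment and convergence analysis tractable.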

Authors (2)

