Oscillatory evolution of collective behavior in evolutionary games played with reinforcement learning

Published 7 Aug 2019 in physics.soc-ph and nlin.AO (arXiv:1908.03060v1)

Abstract: Large-scale cooperation underpins the evolution of ecosystems and human society, and the collective behaviors that emerge from the self-organization of multi-agent systems are key to understanding it. As AI spreads into almost all branches of science, it is of great interest to ask what new insights into collective behavior can be obtained from a multi-agent AI system. Here, we introduce a typical reinforcement learning (RL) algorithm -- Q-learning -- into evolutionary game dynamics, where agents pursue optimal actions on the basis of introspection rather than the birth-death or imitation processes of traditional evolutionary games (EG). We numerically investigate the prevalence of cooperation for a general $2\times 2$ game setting. We find that, remarkably, cooperation in the multi-agent AI system reaches a level equal to that of the traditional EG in most cases. However, in snowdrift games with RL we also find that explosive cooperation appears in the form of periodic oscillations, and we study how the payoff structure affects its emergence. Finally, we show that periodic oscillations can also be observed in some other EGs with the RL algorithm, such as the rock-paper-scissors game. Our results offer a reference point for understanding the emergence of cooperation and oscillatory behaviors in nature and society from AI's perspective.
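The setup the abstract describes — a population of stateless Q-learning agents repeatedly paired to play a 2×2 snowdrift game — can be sketched as below. This is a minimal illustration, not the paper's implementation: the payoff values, learning parameters, random pairing scheme, and the use of a stateless Q table are all assumptions made here for concreteness.

```python
import random

# Hypothetical hyperparameters (the paper's exact settings are not given here)
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.02
B, C = 1.0, 0.8  # assumed snowdrift benefit and cost, with B > C > 0

# Snowdrift payoff to the row player: actions 0 = cooperate, 1 = defect
PAYOFF = {
    (0, 0): B - C / 2,  # both cooperators share the cost
    (0, 1): B - C,      # lone cooperator bears the full cost
    (1, 0): B,          # defector free-rides on the cooperator
    (1, 1): 0.0,        # mutual defection yields nothing
}

def choose(q):
    """Epsilon-greedy action from a 2-entry Q table."""
    if random.random() < EPS:
        return random.randrange(2)
    return 0 if q[0] >= q[1] else 1

def simulate(n_agents=100, rounds=2000, seed=0):
    """Return the cooperation fraction per round for a Q-learning population."""
    random.seed(seed)
    # One stateless Q table per agent (a simplification of the paper's setup)
    Q = [[0.0, 0.0] for _ in range(n_agents)]
    coop_fraction = []
    for _ in range(rounds):
        acts = [choose(Q[i]) for i in range(n_agents)]
        order = list(range(n_agents))
        random.shuffle(order)
        # Pair agents at random; each pair plays one snowdrift game
        for i, j in zip(order[::2], order[1::2]):
            rewards = ((i, PAYOFF[(acts[i], acts[j])]),
                       (j, PAYOFF[(acts[j], acts[i])]))
            for k, r in rewards:
                a = acts[k]
                # Stateless Q update, bootstrapping on the best current estimate
                Q[k][a] += ALPHA * (r + GAMMA * max(Q[k]) - Q[k][a])
        coop_fraction.append(acts.count(0) / n_agents)
    return coop_fraction

if __name__ == "__main__":
    traj = simulate()
    print(f"final cooperation fraction: {traj[-1]:.2f}")
```

Plotting `coop_fraction` over time is where the oscillatory behavior reported in the paper would show up, if it appears for the chosen payoff structure; with other payoff values the trajectory may instead settle near a steady level, as in the traditional EG comparison.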
