The evolution of cooperation in spatial public
goods game with tolerant punishment based on
reputation threshold
Cite as: Chaos 35, 013104 (2025); doi: 10.1063/5.0250120
Submitted: 22 November 2024 · Accepted: 12 December 2024 · Published Online: 3 January 2025
Gui Zhang,1 Yichao Yao,1 Ziyan Zeng,1 Minyu Feng,1,a) and Manuel Chica2,3
AFFILIATIONS
1The College of Artificial Intelligence, Southwest University, No. 2 Tiansheng Road, Beibei, Chongqing 400715, China
2Department of Computer Science and A.I., Andalusian Research Institute DaSCI “Data Science and Computational Intelligence,”
University of Granada, 18071 Granada, Spain
3School of Information and Physical Sciences, The University of Newcastle, Callaghan, NSW 2308, Australia
a)Author to whom correspondence should be addressed: myfeng@swu.edu.cn
ABSTRACT
Reputation and punishment are significant guidelines for regulating individual behavior in human society, and those with a good reputation are more likely to be imitated by others. In addition, society imposes varying degrees of punishment for behaviors that harm the interests of groups with different reputations. However, conventional pairwise interaction rules and punishment mechanisms overlook this aspect. Building on this observation, this paper enhances a spatial public goods game in two key ways: (1) We set a reputation threshold and use punishment to regulate the defection behavior of players in low-reputation groups, while tolerating defection in high-reputation game groups. (2) Unlike pairwise interaction rules, we combine reputation and payoff as the fitness of individuals to ensure that players with both high payoff and high reputation have a higher chance of being imitated. Through simulations, we find that a higher reputation threshold, combined with a stringent punishment environment, can substantially enhance the level of cooperation within the population. This mechanism provides deeper insight into the widespread phenomenon of cooperation that emerges among individuals.
Published under an exclusive license by AIP Publishing. https://doi.org/10.1063/5.0250120
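To make the two modifications above concrete, the following minimal Python sketch combines payoff and reputation into a single fitness value and feeds it into a Fermi-type imitation rule. The weighted-sum form, the weight alpha, and the selection noise K are illustrative assumptions; the paper's exact fitness definition and update rule may differ.

```python
import math

# Illustrative parameters (assumptions, not taken from the paper):
# ALPHA weights reputation against payoff, K is the Fermi selection noise.
ALPHA = 0.5
K = 0.1

def fitness(payoff, reputation, alpha=ALPHA):
    """Combine payoff and reputation into one fitness value.

    The paper combines reputation and payoff as fitness; a simple
    weighted sum is assumed here purely for illustration.
    """
    return (1 - alpha) * payoff + alpha * reputation

def imitation_probability(focal_fitness, neighbor_fitness, k=K):
    """Fermi-type probability that the focal player copies the neighbor's strategy."""
    return 1.0 / (1.0 + math.exp((focal_fitness - neighbor_fitness) / k))

# A neighbor with both higher payoff and higher reputation is copied
# with probability close to one.
p = imitation_probability(fitness(2.0, 0.3), fitness(3.5, 0.9))
print(f"probability of copying the neighbor: {p:.3f}")
```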
Individual behavioral choices in the real world have attracted considerable attention. In complex networks, the cooperation and defection of nodes mirror real-life behaviors, edges generate interactions, and spatial public goods games (SPGG) help to explain social choices. Because free-riding offers high rewards, individuals in real societies are prone to choose harmful behaviors, which makes reputation and punishment mechanisms crucial for behavioral regulation: a good reputation brings trust and cooperation, while a bad reputation leads to isolation. Moreover, real societies assess individual reputations, giving rise to different communities and corresponding management systems. This paper explores an evolutionary punishment mechanism based on reputation and tolerance, combining reputation and punishment to construct a community mechanism that promotes cooperation, with the aim of revealing behavioral patterns and providing guidance for social cooperation.
I. INTRODUCTION
The study of cooperation in biology, sociology, and economics is extensive.1–6 When confronted with conflicts between individual interests and collective interests, the selfish choices of individuals often clash with the public interests of the group, thereby hindering the emergence of cooperation.7–10 Consequently, evolutionary game theory has attracted significant attention and extensive research. The prisoner's dilemma game (PDG),11–14 the snowdrift game (SDG),15–18 and the public goods game (PGG)19–21 are used to address dilemmas observed in evolutionary games.22,23 The resolution of social dilemmas is noteworthy, and numerous studies have been proposed to elucidate the emergence of cooperation. For example, in 1992, Nowak and May24 were the first to show that a spatial structure can promote the development of cooperation in the PDG. Since then, substantial research on cooperative behavior has been carried out in various evolutionary game models, including those based on
... As a further step, punishment is used as an incentive mechanism, and a new classification form of punishment is introduced into the spatial public goods game,53 where the probability of punishment changes dynamically according to the number of consecutive defections in the game. Furthermore, Zhang et al.46 studied the impact of a reputation-threshold-based tolerant punishment mechanism on the evolution of cooperation in spatial public goods games. Specifically, if an individual's reputation fell below the threshold, he would be punished after choosing to defect, whereas he was not punished for adopting the defective strategy if his reputation exceeded the threshold. ...
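The quoted rule can be sketched in a few lines of Python, using the common public goods payoff convention in which each cooperator contributes a cost c and the pooled contributions are multiplied by an enhancement factor r and shared equally within the group. The values of r, c, the fine beta, and the reputation threshold below are illustrative assumptions rather than the parameters used in the cited works.

```python
# Minimal sketch of one public goods group under a reputation-threshold
# tolerant punishment rule. The enhancement factor r, contribution cost c,
# fine beta, and threshold r_threshold are illustrative parameters only.

def group_payoffs(strategies, reputations, r=4.0, c=1.0, beta=0.5, r_threshold=0.5):
    """Return the payoff of each group member.

    strategies : list of 1 (cooperate) / 0 (defect)
    reputations: list of reputation scores in the same order
    Defectors are fined beta only if their reputation is below r_threshold;
    high-reputation defectors are tolerated (no fine), as described above.
    """
    group_size = len(strategies)
    pool = r * c * sum(strategies) / group_size  # shared return per member
    payoffs = []
    for s, rep in zip(strategies, reputations):
        pay = pool - c * s                       # cooperators pay the cost c
        if s == 0 and rep < r_threshold:         # punish low-reputation defectors
            pay -= beta
        payoffs.append(pay)
    return payoffs

# Example: two cooperators, one tolerated defector, one punished defector.
print(group_payoffs([1, 1, 0, 0], [0.9, 0.8, 0.7, 0.2]))
```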
Article
Trust holds a pivotal position in contemporary society. Yet, the question of how to elevate and sustain trust among selfish individuals poses a formidable challenge. To delve into this issue, we incorporate a graded punishment strategy into a networked N-player trust game, aiming to observe the progression of trust-related behavior. Within this game framework, punishers uphold a certain degree of trust among the participants by incurring an extra expense to exclude those who betray trust. By conducting numerous Monte Carlo simulation experiments, we uncover that the graded punishment strategy can effectively curtail untrustworthy conduct to a significant degree, potentially even eliminating such behavior, thereby fostering an improvement in the overall trust level within the population. However, to effectively deploy this strategy, it is imperative to strike a balance between the penalty cost and the penalty amount, ensuring that the natural evolution of the system is not unduly disrupted. This balance is crucial for preserving the stability and sustainability of the system while safeguarding trust. Broadly speaking, our study offers fresh insights and approaches for enhancing and maintaining trust in the networked society, while also highlighting the avenues and challenges for future research, particularly in the realm of applying graded punishment strategies.
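As a rough illustration only (the actual N-player trust game payoff structure and punishment grades are not specified in this abstract), a single graded-punishment step might look like the following; all parameter names and values are hypothetical, with both the punisher's cost and the betrayer's fine scaling with the chosen grade.

```python
# Illustrative sketch of a "graded" punishment step, loosely inspired by the
# abstract above; the cost_per_grade and fine_per_grade parameters are
# assumptions, not the authors' specification.

def apply_graded_punishment(payoffs, roles, grade, cost_per_grade=0.2, fine_per_grade=0.6):
    """Punishers pay a cost growing with the punishment grade, while each
    untrustworthy player receives a fine that also grows with the grade."""
    punishers = [i for i, role in enumerate(roles) if role == "punisher"]
    betrayers = [i for i, role in enumerate(roles) if role == "untrustworthy"]
    for i in punishers:
        payoffs[i] -= cost_per_grade * grade * len(betrayers)
    for j in betrayers:
        payoffs[j] -= fine_per_grade * grade * len(punishers)
    return payoffs

print(apply_graded_punishment([3.0, 3.0, 4.0],
                              ["punisher", "trustworthy", "untrustworthy"],
                              grade=2))
```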
Article
Dynamic processes in complex networks are crucial for better understanding collective behavior in human societies, biological systems, and the Internet. In this article, we first focus on the continuous Markov-based modeling of evolving networks with the birth and death of individuals. A new individual arrives at the group via a Poisson process, while new links are established in the network through either uniform connection or preferential attachment. Moreover, an existing individual has a limited lifespan before leaving the network. We determine stationary topological properties of these networks, including their size and mean degree. To address the effect of the birth-death evolution, we further study the information dynamics in the proposed network model from the random drift and natural selection perspectives, based on assumptions of totally stochastic and fitness-driven evolution, respectively. In simulations, we analyze the fixation probability of individual information and find that the means of establishing new connections affects the random drift process but does not affect the natural selection process.
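A simple discrete-time approximation of such a birth-death evolving network can be simulated with networkx; here unit time steps stand in for the continuous-time Poisson arrivals, lifespans are drawn from an exponential distribution, and all parameter values (m, mean lifespan, number of steps) are illustrative assumptions.

```python
import random
import networkx as nx

def evolve(steps=2000, m=2, mean_lifespan=200.0, preferential=True, seed=1):
    """One run of a discrete-time birth-death network model (illustrative)."""
    rng = random.Random(seed)
    g = nx.complete_graph(m + 1)                      # small seed network
    lifespans = {v: rng.expovariate(1.0 / mean_lifespan) for v in g}
    ages = {v: 0.0 for v in g}
    next_id = g.number_of_nodes()
    for _ in range(steps):
        # Birth: a new node arrives and attaches m links.
        targets = list(g.nodes())
        if preferential:                              # degree-proportional choice
            weights = [g.degree(v) + 1 for v in targets]
            chosen = set()
            while len(chosen) < min(m, len(targets)):
                chosen.add(rng.choices(targets, weights=weights)[0])
        else:                                         # uniform choice
            chosen = set(rng.sample(targets, min(m, len(targets))))
        g.add_node(next_id)
        for v in chosen:
            g.add_edge(next_id, v)
        lifespans[next_id] = rng.expovariate(1.0 / mean_lifespan)
        ages[next_id] = 0.0
        next_id += 1
        # Death: nodes whose age exceeds their drawn lifespan leave the network.
        for v in list(g.nodes()):
            ages[v] += 1.0
            if ages[v] > lifespans[v]:
                g.remove_node(v)
                del lifespans[v], ages[v]
    mean_degree = 2 * g.number_of_edges() / max(g.number_of_nodes(), 1)
    return g.number_of_nodes(), mean_degree

print("stationary size and mean degree (one run):", evolve())
```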
Article
Asymmetry is a common phenomenon in real life due to constraints such as status, age, information, reputation, and so on. Yet it is still unclear how asymmetrical interactions driven by adaptive feedback affect the evolution of cooperation. To this end, we propose a novel asymmetric interaction model driven by strategy persistence to unravel this mystery. In particular, players whose strategy persistence is beyond the threshold β are the pearls, able to interact with all of their neighbors. Otherwise, as layfolks, they can only interact with half of their neighbors. Since the strategy persistence is constantly updated, the pearls and the layfolks may switch roles at any time, which adds uncertainty to the evolution of cooperation. Simulation results show that the asymmetrical interactions with adaptive feedback effectively alleviate social dilemmas, thus opening up a path for cooperators to survive. Moreover, with the increase of β, there is an appropriate interval resulting in the optimal evolution of cooperation. Micro-level analysis further indicates that the pearl cooperators play an irreplaceable, pivotal role in promoting the evolution of cooperation. Finally, we use other social dilemmas, network topologies, strategy update patterns, and payoff normalization to verify the applicability and robustness of the designed model.
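A minimal sketch of this asymmetric interaction rule follows; how strategy persistence is measured and normalized to [0, 1] here is an assumption made for illustration, not the authors' exact definition.

```python
import random

def interaction_partners(player, neighbors, persistence, beta=0.5, rng=random):
    """Return the neighbors the player interacts with this round.

    Players whose persistence is at least beta ("pearls") use the full
    neighborhood; the rest ("layfolks") use a random half of it.
    """
    if persistence[player] >= beta:
        return list(neighbors)
    k = max(1, len(neighbors) // 2)
    return rng.sample(list(neighbors), k)

persistence = {"a": 0.8, "b": 0.3}
print(interaction_partners("a", ["b", "c", "d", "e"], persistence))  # all four neighbors
print(interaction_partners("b", ["a", "c", "d", "e"], persistence))  # two random neighbors
```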
Article
Reputation plays a crucial role in social interactions by affecting the fitness of individuals during an evolutionary process. Previous works have extensively studied the outcome of imitation dynamics without focusing on potentially irrational choices in strategy updates. We now fill this gap and explore the consequences of such randomness, which one may interpret as autonomous thinking. In particular, we study how this extended dynamics alters the evolution of cooperation when individual reputation is directly linked to the collected payoff, hence providing a general fitness function. For a broadly valid conclusion, our spatial populations cover different types of interaction topologies, including lattices, small-world, and scale-free graphs. By means of intensive simulations, we detect a substantial increase in the cooperation level that shows reasonable stability in the presence of a notable strategy mutation.
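The randomness discussed above can be sketched as a strategy-update step in which, with a small mutation probability, the player picks a strategy at random instead of imitating; the mutation rate mu and noise K are illustrative, and the way reputation enters the fitness values passed in is left to the caller.

```python
import math
import random

def update_strategy(own_strategy, own_fitness, neighbor_strategy, neighbor_fitness,
                    mu=0.01, k=0.1, rng=random):
    """One strategy update with occasional irrational (random) choices."""
    if rng.random() < mu:                      # irrational move: random strategy
        return rng.choice(["C", "D"])
    # Otherwise imitate the neighbor with a Fermi probability based on fitness.
    p = 1.0 / (1.0 + math.exp((own_fitness - neighbor_fitness) / k))
    return neighbor_strategy if rng.random() < p else own_strategy

print(update_strategy("D", 1.0, "C", 1.8))     # usually returns "C"
```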
Article
Conditional cooperation is the tendency to cooperate if and only if others do so as well. It is the most common behavior in social dilemmas. We study how the incidence of conditional cooperation in the public goods game, the most widely studied social dilemma in experimental economics, varies with group size. In a laboratory experiment, we apply the strategy method to elicit how participants’ willingness to contribute to a public good depends on other group members’ decisions. A within-subject design allows us to evaluate and compare an individual participant's contribution behavior in different-sized groups. Two main findings emerge. First, the share of players who are conditional cooperators is consistent across group sizes. Second, the strategies chosen imply that conditional cooperators hold a (correct) belief that others are more cooperative in a larger than in a smaller group.
Article
Reputation and reciprocity are key mechanisms for cooperation in human societies, often going hand in hand to favor prosocial behavior over selfish actions. Here we review recent research at the interface of physics and evolutionary game theory that has explored these two mechanisms. We focus on image scoring as the bearer of reputation, as well as on various types of reciprocity, including direct, indirect, and network reciprocity. We review different definitions of reputation and reciprocity dynamics, and we show how these affect the evolution of cooperation in social dilemmas. We consider first-order, second-order, as well as higher-order models in well-mixed and structured populations, and we review experimental works that support and inform the results of mathematical modeling and simulations. We also provide a synthesis of the reviewed research along with an outlook in terms of six directions that seem particularly promising to explore in the future.