From an enactive approach, previous studies have demonstrated that social interaction plays a fundamental role in the dynamics of neural and behavioral complexity of embodied agents. In particular, it has been shown that agents with a limited internal structure (2-neuron brains) that evolve in interaction can overcome this limitation and exhibit chaotic neural activity, typically associated with more complex dynamical systems (at least 3-dimensional). In the present paper, we make two contributions to this line of work. First, we propose a conceptual distinction between levels of coupling between agents that could affect neural and behavioral complexity. Second, we test the generalizability of previous results by evolving agents with a richer internal structure in a richer, yet non-social, environment. We demonstrate that such agents can achieve levels of complexity comparable to those of agents that evolve in interactive settings. We discuss the significance of this result for the study of interaction.
The problem of Multi-Agent Path Finding (MAPF) calls for finding a set of conflict-free paths for a fleet of agents operating in a given environment. Arguably, the state-of-the-art approach to computing optimal solutions is Conflict-Based Search (CBS)
Socially relevant situations that involve strategic interactions are widespread among animals and humans alike. To study these situations, theoretical and experimental works have adopted a game-theoretical perspective, which has made it possible to obtain val
Measuring and promoting policy diversity is critical for solving games with strong non-transitive dynamics where strategic cycles exist, and there is no consistent winner (e.g., Rock-Paper-Scissors). With that in mind, maintaining a pool of diverse p
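The strategic cycles mentioned in this abstract can be made concrete with the Rock-Paper-Scissors payoff matrix itself: each pure strategy beats one opponent and loses to another, so no strategy dominates. A minimal sketch (the names and matrix layout here are illustrative, not from the paper):

```python
# Rock-Paper-Scissors as a non-transitive game.
# PAYOFF[i][j] is the row player's payoff: 1 = win, -1 = loss, 0 = tie.
STRATEGIES = ["rock", "paper", "scissors"]
PAYOFF = [
    [0, -1, 1],   # rock:     ties rock, loses to paper, beats scissors
    [1, 0, -1],   # paper:    beats rock, ties paper, loses to scissors
    [-1, 1, 0],   # scissors: loses to rock, beats paper, ties scissors
]

def beats(i: int, j: int) -> bool:
    return PAYOFF[i][j] > 0

# Non-transitivity: the cycle rock -> scissors -> paper -> rock exists,
# so there is no consistent winner among pure strategies.
has_cycle = beats(0, 2) and beats(2, 1) and beats(1, 0)

# No strategy is dominant: every row contains at least one losing entry.
no_dominant = all(min(PAYOFF[i]) < 0 for i in range(3))

print(has_cycle, no_dominant)
```

Because of this cycle, a single best-response policy can always be exploited, which is the motivation for maintaining a diverse pool of policies rather than one champion.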
In the process of collectively inventing new words for new concepts in a population, conflicts can quickly become numerous, in the form of synonymy and homonymy. Remembering all of them could cost too much memory, and remembering too few may slow dow
Current low-precision quantization algorithms often have the hidden cost of conversion back and forth from floating point to quantized integer values. This hidden cost limits the latency improvement realized by quantizing Neural Networks. To address
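The conversion cost this abstract refers to can be illustrated with a minimal symmetric linear-quantization round trip (a generic sketch, not the paper's method; the per-tensor scale choice below is one common convention):

```python
import numpy as np

def quantize(x: np.ndarray, scale: float) -> np.ndarray:
    """Float -> int8: the forward conversion."""
    return np.clip(np.round(x / scale), -128, 127).astype(np.int8)

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Int8 -> float: the reverse conversion whose repeated use is the hidden cost."""
    return q.astype(np.float32) * scale

x = np.array([0.1, -0.5, 0.25], dtype=np.float32)
scale = float(np.abs(x).max()) / 127.0  # symmetric per-tensor scale
x_hat = dequantize(quantize(x, scale), scale)
err = float(np.abs(x - x_hat).max())  # round-trip error, bounded by scale/2 here
```

Each inference step that mixes quantized and floating-point operators pays for a `quantize`/`dequantize` pair like the one above, which is why eliminating these conversions matters for realized latency.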