DipBlue: A Diplomacy Agent with Strategic and Trust Reasoning

Bibliographic details
Main author: André Filipe da Costa Ferreira (author)
Format: Master's thesis
Language: English
Published: 2014
Subjects:
Full text: https://hdl.handle.net/10216/72469
Country: Portugal
OAI: oai:repositorio-aberto.up.pt:10216/72469
Description
Abstract: Diplomacy is a turn-based military strategy board game, set at the turn of the 20th century, in which seven world powers fight for the domination of Europe. The game can be played by 2 to 7 players, has no random factors, and is a zero-sum game. It has a very important component when played by humans that has been put aside in games typically addressed by Artificial Intelligence techniques: before making their moves, players can negotiate among themselves and discuss issues such as alliances, move propositions, and exchanges of information, among others. Since the players act simultaneously and the number of units and possible movements is extremely large, the result is a vast game tree that cannot be searched effectively. Most existing artificial players for Diplomacy do not make use of the negotiation opportunities the game provides and instead try to solve the problem through solution search and complex heuristics. This dissertation proposes an approach to the development of an artificial player, named DipBlue, that uses negotiation to gain advantage over its opponents through peace treaties, the formation of alliances, and the suggestion of actions to allies. Trust is used as a tool to detect and react to possible betrayals by allied players. DipBlue has a flexible architecture that allows the creation of different variations of the bot, each with its own configuration and behaviour. The player was built to work with the multi-agent systems testbed DipGame and was tested against other players of the same platform and against variations of itself. The results of the experiments show that the use of negotiation increases the performance of the bots involved in alliances when all of them are trustworthy; however, when betrayed, the efficiency of the bots decreases drastically. In this scenario, the ability to perform trust reasoning proved to successfully reduce the impact of betrayals.
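
The trust mechanism summarised in the abstract can be pictured as a per-power score that is penalised when an agreed move is not honoured and slowly recovers when agreements are kept. The Java sketch below is purely illustrative and is not taken from the thesis: the class TrustTracker, its method names, and the threshold and decay values are assumptions chosen for the example, not DipBlue's actual rules.

// Illustrative sketch only (not the thesis implementation): a hypothetical trust score
// per power, multiplied down on a detected betrayal and nudged up when an agreement holds.
import java.util.HashMap;
import java.util.Map;

public class TrustTracker {
    // 1.0 is neutral trust; below HOSTILE_THRESHOLD the power is no longer treated as an ally.
    private static final double PENALTY = 0.5;           // drop applied on a broken agreement (assumed value)
    private static final double RECOVERY = 1.1;           // slow recovery on a kept agreement (assumed value)
    private static final double HOSTILE_THRESHOLD = 0.6;  // assumed cut-off for cooperation
    private final Map<String, Double> trust = new HashMap<>();

    public void recordAgreementOutcome(String power, boolean honoured) {
        double current = trust.getOrDefault(power, 1.0);
        double updated = honoured ? Math.min(1.5, current * RECOVERY) : current * PENALTY;
        trust.put(power, updated);
    }

    public boolean isTrustworthy(String power) {
        return trust.getOrDefault(power, 1.0) >= HOSTILE_THRESHOLD;
    }

    public static void main(String[] args) {
        TrustTracker tracker = new TrustTracker();
        tracker.recordAgreementOutcome("FRANCE", true);   // ally honoured the agreed move
        tracker.recordAgreementOutcome("FRANCE", false);  // betrayal detected
        System.out.println("Treat FRANCE as ally? " + tracker.isTrustworthy("FRANCE"));
    }
}

In a negotiating bot of this kind, such a score could be consulted before accepting a new alliance or when weighting an ally's move suggestions; the concrete trust model used by DipBlue is described in the full text linked above.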