Synchronization and information transmission in networks

Bibliographic details
Main author: Caneco, Acilina (author)
Other authors: Rocha, Leonel (author)
Format: article
Language: eng
Published: 2015
Subjects:
Full text: http://hdl.handle.net/10174/12342
Country: Portugal
OAI: oai:dspace.uevora.pt:10174/12342
Description
Abstract: The amount of information produced by a network may be measured by the mutual information rate. This measure, the Kolmogorov-Sinai entropy and the synchronization interval can all be expressed in terms of the transversal Lyapunov exponents, so these concepts are related, and we prove that the stronger the synchronization, the larger the rate at which information is exchanged between nodes of the network. In fact, as the coupling parameter increases, the mutual information rate increases to a maximum at the synchronization interval and then decreases. Moreover, the Kolmogorov-Sinai entropy decreases to a minimum at the synchronization interval and then increases. We present numerical simulations for two different ways of coupling two maps, a complete network and a lattice, which confirm our theoretical results.
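
As a rough illustration of how a synchronization interval can be located from the transversal Lyapunov exponent, the sketch below couples two identical logistic maps through a symmetric coupling parameter c and estimates both the transversal exponent lambda_perp(c) and the time-averaged synchronization error. The choice of the logistic map, the diffusive coupling form and the parameter grid are illustrative assumptions, not the specific systems studied in the article; synchronization is expected where lambda_perp(c) < 0.

```python
# Minimal sketch (not the authors' code): two symmetrically coupled logistic maps
#   x_{n+1} = (1-c) f(x_n) + c f(y_n),   y_{n+1} = (1-c) f(y_n) + c f(x_n).
# Linearizing the transverse direction around the synchronized orbit s_{n+1} = f(s_n)
# gives u_{n+1} = (1-2c) f'(s_n) u_n, so
#   lambda_perp(c) = lim (1/N) sum_n ln |(1-2c) f'(s_n)|,
# and the synchronization interval is where lambda_perp(c) < 0.

import numpy as np

def f(x):
    return 4.0 * x * (1.0 - x)      # fully chaotic logistic map

def df(x):
    return 4.0 - 8.0 * x            # derivative f'(x)

def transversal_lyapunov(c, n_iter=20000, n_transient=1000, s0=0.3):
    """Estimate lambda_perp(c) along the synchronized orbit of f."""
    s = s0
    for _ in range(n_transient):    # discard transient
        s = f(s)
    acc = 0.0
    for _ in range(n_iter):
        # small guard avoids log(0) when f'(s) or (1-2c) vanishes
        acc += np.log(abs((1.0 - 2.0 * c) * df(s)) + 1e-300)
        s = f(s)
    return acc / n_iter

def sync_error(c, n_iter=20000, n_transient=1000):
    """Time-averaged |x - y| of the coupled pair, as a direct synchronization check."""
    x, y = 0.3, 0.30001
    for _ in range(n_transient):
        x, y = (1 - c) * f(x) + c * f(y), (1 - c) * f(y) + c * f(x)
    err = 0.0
    for _ in range(n_iter):
        x, y = (1 - c) * f(x) + c * f(y), (1 - c) * f(y) + c * f(x)
        err += abs(x - y)
    return err / n_iter

if __name__ == "__main__":
    for c in np.linspace(0.0, 0.5, 11):
        lam = transversal_lyapunov(c)
        print(f"c = {c:.2f}  lambda_perp = {lam:+.3f}  <|x-y|> = {sync_error(c):.2e}")
```

For this particular map, lambda_perp(c) = ln 2 + ln|1 - 2c|, so the sweep should show the exponent crossing zero and the synchronization error collapsing near c = 0.25, consistent with the abstract's picture of quantities changing monotonically up to the synchronization interval.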