Synchronization and information transmission in networks

Bibliographic Details
Main Author: Caneco, Acilina (author)
Other Authors: Rocha, Leonel (author)
Format: article
Language: eng
Published: 2015
Online Access: http://hdl.handle.net/10174/12342
Country: Portugal
OAI: oai:dspace.uevora.pt:10174/12342
Description
Summary: The amount of information produced by a network may be measured by the mutual information rate. This measure, the Kolmogorov-Sinai entropy and the synchronization interval are all expressed in terms of the transversal Lyapunov exponents, so these concepts are related, and we prove that the stronger the synchronization, the larger the rate at which information is exchanged between nodes of the network. In fact, as the coupling parameter increases, the mutual information rate increases to a maximum on the synchronization interval and then decreases, while the Kolmogorov-Sinai entropy decreases to a minimum on the synchronization interval and then increases. We present numerical simulations considering two different versions of coupling two maps, a complete network and a lattice, which confirm our theoretical results.
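
The sketch below is a minimal, self-contained illustration of the role played by the transversal Lyapunov exponent, not the authors' exact setup: for two identical, symmetrically coupled maps, a perturbation transverse to the synchronization manifold x = y evolves with the factor (1 - 2c) f'(x_n), so the synchronization interval is the range of coupling strengths c where lambda_perp = ln|1 - 2c| + lambda_f < 0. The choice of the logistic map, the coupling form and all parameter values are illustrative assumptions.

```python
import numpy as np

def f(x, r=4.0):
    # Fully chaotic logistic map; its Lyapunov exponent is ln 2.
    return r * x * (1.0 - x)

def df(x, r=4.0):
    # Derivative of the logistic map.
    return r * (1.0 - 2.0 * x)

def transversal_lyapunov(c, n_iter=50_000, n_transient=1_000):
    """Estimate lambda_perp for two symmetrically coupled identical maps.

    A perturbation transverse to the synchronization manifold x = y evolves as
    w_{n+1} = (1 - 2c) * f'(x_n) * w_n, so lambda_perp = ln|1 - 2c| + lambda_f,
    averaged along the synchronized orbit.
    """
    x = 0.3
    acc = 0.0
    for n in range(n_transient + n_iter):
        if n >= n_transient:
            # Tiny offset avoids log(0) at c = 0.5, where 1 - 2c vanishes.
            acc += np.log(abs((1.0 - 2.0 * c) * df(x)) + 1e-300)
        x = f(x)
    return acc / n_iter

# Scan the coupling strength: lambda_perp < 0 marks the synchronization
# interval (analytically 0.25 < c < 0.75 for the r = 4 logistic map).
for c in np.linspace(0.0, 1.0, 11):
    lam = transversal_lyapunov(c)
    print(f"c = {c:.2f}  lambda_perp = {lam:+8.3f}  "
          f"{'synchronized' if lam < 0 else ''}")
```

The factor 1 - 2c is specific to the two-map case; for larger topologies such as a complete network or a lattice, the same idea generalizes through the eigenvalues of the coupling matrix, which determine the transversal exponents node by node.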