Abstract: In developed countries, the Internet is increasingly considered an essential and integral part of people's lives. The need to be "online" and to share and access content is a frequent routine in people's daily lives, making the Internet one of the most complex systems in operation. Most traditional communications (telephone, radio and television) are being remodelled or redefined by the Internet, giving rise to new services such as Voice over Internet Protocol (VoIP) and Internet Protocol Television (IPTV). Books, newspapers and other types of printed publications are also adapting to web technology or have been redesigned as blogs and feeds. The massification of the Internet and the constant increase in bandwidth offered to consumers have created excellent conditions for Over-The-Top (OTT) services. OTT services refer to the delivery of audio, video and other data over the Internet without the control of network operators. Although OTT delivery presents an attractive solution (and a profitable one, judging by fast-growing services such as YouTube, Skype and Netflix), it suffers from some limitations. It is necessary to maintain high levels of Quality of Experience (QoE) to continue to attract customers. To do this, a content distribution network is fundamental: one that can adapt to the speed with which contents are requested and quickly discarded, and that can accommodate all the traffic. This dissertation focuses on the distribution of OTT contents in wireless networks, in order to address the lack of research work in this area. A solution is proposed that can be integrated into network equipment so that it can predict what content connected (or nearby) consumers may request and place that content in memory before it is requested, improving consumers' perception of the service.
Given the lack of information in the literature on the management and control of proxy caches for embedded systems, the first step was to test and evaluate two different cache implementations: Nginx and Squid. The results show that there is a trade-off between cache performance and the speed of processing requests, with Nginx delivering better cache performance but worse response times. It was also found that a larger cache size does not always yield a significant improvement in results; sometimes keeping just the most popular content cached is enough. Afterwards, given the characteristics of wireless networks, two algorithms for predictive prefetching of contents in mobility scenarios were proposed and tested. Very significant performance improvements were observed, demonstrating that investment in this area is possible, although it implies an increase in the processing capacity and power consumption of the network equipment.