TITLE: Variational Problems of Information Theory and Optimal Markov Evolution

SPEAKER: Roman Belavkin (EIS, Middlesex University)

ABSTRACT: Variational problems of information theory have played an important role in defining theoretical bounds on various quantities, such as the average length of codewords, the capacity of a communication channel, and the probability of error. Here we study the relation between two such problems in the context of evolutionary dynamics. The first is the problem of minimizing the information divergence between two probability measures subject to a lower bound on expected utility (a linear functional). Solutions to this problem form a one-parameter family of probability measures corresponding to replicator dynamics with constant selection. The second is the problem of finding an optimal Markov operator maximizing the expected utility of the output measure subject to a constraint on the mutual information between the input and output. We demonstrate how the semigroup of Markov operators solving the second problem can also produce the one-parameter evolution of solutions to the first problem. Understanding this relation can facilitate the development of optimal control functions that achieve information-theoretic bounds. Optimal control of the mutation rate in a simple genetic algorithm is shown as an example demonstrating the principle.
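
The first problem mentioned in the abstract has a well-known closed form: minimizing the divergence D(p||q) subject to a lower bound on expected utility yields the exponential (Gibbs) family p_beta(x) proportional to q(x) exp(beta u(x)), and differentiating in beta gives the replicator equation dp/dbeta = p (u - E_p[u]) with constant selection. The following sketch checks this numerically on a small finite example; the reference measure q and utility u are arbitrary illustrative data, not taken from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical finite example: reference measure q and utility u on 5 states.
n = 5
q = rng.random(n); q /= q.sum()
u = rng.random(n)

def gibbs(beta):
    # Exponential-family solution: p_beta proportional to q * exp(beta * u).
    w = q * np.exp(beta * u)
    return w / w.sum()

def expected_u(p):
    return float(p @ u)

# Expected utility is non-decreasing along the one-parameter family,
# so beta traces out solutions for increasingly tight utility constraints.
betas = np.linspace(0.0, 5.0, 50)
eus = [expected_u(gibbs(b)) for b in betas]
assert all(b >= a - 1e-12 for a, b in zip(eus, eus[1:]))

# The family satisfies the replicator equation with constant selection:
# dp/dbeta = p * (u - E_p[u]); compare against a central finite difference.
beta, h = 1.0, 1e-6
p = gibbs(beta)
dp_numeric = (gibbs(beta + h) - gibbs(beta - h)) / (2 * h)
dp_replicator = p * (u - expected_u(p))
assert np.allclose(dp_numeric, dp_replicator, atol=1e-6)
```

The monotonicity check reflects why beta can serve as the single evolution parameter: moving along the family only ever raises expected utility, at the cost of divergence from q.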