We live in the “Age of Information”, yet we still do not understand well what Information is or what its properties are. Is there some kind of conservation law for Information, similar to the conservation laws for energy and matter, or does Information behave like a nuclear chain reaction, feeding itself and constantly trying to propagate further? To understand Information better, we should consider the principles that have guided the best producer and consumer of information – life, and especially its highest form – Man. Memory, which stores information about the environment, and the ability to replicate allow living things to change the thermodynamic balance of their environment: they decrease disorder (entropy) within their population while increasing it in their environment. For collecting information (and thus also increasing entropy in the environment) Man invented a totally new tool – language – which moved the process from the level of individual entities to the level of the whole of Mankind. Language is Mankind's model of the world: it reflects the structure of the world and (in the limit) converges to a similar structure, i.e. its entropy approaches the entropy of the world it describes.
Here we consider the essence of the concept “information” and the different uses of the word. Of the many kinds of information, the most important in everyday life is social, macro-level information – the secondary information created by social communication from the individual, primary information obtained by our senses through perception. The tool for creating this social information, shared by the whole of Mankind, is language.
To understand a phenomenon, we should consider why and where it appeared. Here the emergence of language is modelled in computer simulations of communication and information exchange in a community of agents. In the simulations, agents created a (new) language for exchanging their perceptions, following some very simple principles: they were eager to distribute their perceptions, inventing new signals for them; when receiving signals from others, they followed the principle of maximizing the similarity of their language with their perceptions and with the received messages, using only minimal assumptions about the meaning of the received signals; and they utilized information compression by means of names. A minimal sketch of such a communication loop is given below.
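The following Python sketch illustrates such a communication loop under the stated principles. It is an assumption-laden toy, not the paper's actual implementation: the identifiers (Agent, speak, hear), the random three-letter signals, and the unconditional adoption rule standing in for similarity maximization are all illustrative.

```python
import random

class Agent:
    def __init__(self, rng):
        self.rng = rng
        self.lexicon = {}  # percept -> signal

    def speak(self, percept):
        # Eagerness to distribute perceptions: invent a new signal
        # for any perception that has no name yet.
        if percept not in self.lexicon:
            self.lexicon[percept] = "".join(
                self.rng.choice("aeioukst") for _ in range(3))
        return self.lexicon[percept]

    def hear(self, percept, signal):
        # Crude stand-in for the similarity-maximization principle:
        # adopt the speaker's signal so the hearer's language agrees
        # with the messages it receives.
        self.lexicon[percept] = signal

rng = random.Random(0)
percepts = ["red", "round", "small", "heavy"]  # shared world of perceptions
agents = [Agent(rng) for _ in range(5)]

for _ in range(300):
    speaker, hearer = rng.sample(agents, 2)
    p = rng.choice(percepts)                   # both agents perceive p
    hearer.hear(p, speaker.speak(p))

shared = sum(a.lexicon == agents[0].lexicon for a in agents) / len(agents)
print(f"fraction of agents sharing agent 0's lexicon: {shared:.2f}")
```

Repeated pairwise exchanges of this kind drive the agents' lexicons toward a common language, even though no agent ever sees another's lexicon directly.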
To measure the process, several measures of entropy were introduced: for the world, Shannon's entropy applied to Pawlak's model of information systems, together with an object-oriented approach using the vector of differences between objects; for the entropy of languages, treated as weighted many-to-many relations between real-world objects, their attributes, and their names and denotations – words – a new formula is presented. Measurements of the entropy of language made in the simulations show that language continued to develop even after the agents already understood each other well; the entropy of languages steadily increases, while still remaining smaller than the entropy of the world that the language models.
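As an illustration of the first measure, the sketch below applies Shannon's entropy H = -Σ p(v) log₂ p(v) to the attribute columns of a small Pawlak-style information system (an objects × attributes table). Summing per-attribute entropies is one plausible reading and gives only an upper bound on the joint entropy; the paper's exact formulas, including the new language-entropy formula, are not reproduced here.

```python
from collections import Counter
from math import log2

# Objects (rows) described by attribute values (columns: colour, shape, size),
# i.e. a tiny Pawlak-style information system.
table = [
    ("red",   "round",  "small"),
    ("red",   "square", "small"),
    ("green", "round",  "large"),
    ("green", "round",  "small"),
]

def attribute_entropy(values):
    """Shannon entropy H = -sum p(v) log2 p(v) of one attribute column."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Entropy of the world taken as the sum of per-attribute entropies
# (assumes independent attributes; an upper bound on the joint entropy).
columns = list(zip(*table))
world_entropy = sum(attribute_entropy(col) for col in columns)
print(f"world entropy: {world_entropy:.3f} bits")
```

A language's entropy, computed analogously over the weighted relations between objects, attributes, and words, can then be compared against this world entropy as the simulation runs.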