Philosophy of Information is one of the most exciting and fruitful areas of philosophical research of our time. It is already affecting the overall way in which new and old philosophical problems are being addressed, bringing substantial innovation to the philosophical system. Information is conveyed by perception and retained by memory, though it is also transmitted by means of language. Information can be acquired without one necessarily grasping the proposition that embodies it; the flow of information operates at a much more basic level than the acquisition and transmission of knowledge. This represents the information turn in philosophy.
Since Anaximander of Miletus, Pre-Socratic philosophers had been preoccupied with investigating Life and Security in the cosmos (world order). Philosophers from Thales to the Stoics regarded the cosmos as a place of life and as the oikoumene of all living things. Their view of the cosmos was of a beautiful crystal embodying three cardinal concepts: Order, Harmony and Security, which can be regarded as the key concepts of ecological philosophy. Culture and nature, literature and landscape, sustainability and thought have merged together over many centuries. We read the ancient authors and hear how they spoke of their love of, and joy in, their countryside. “In all things of nature there is something of the marvellous” – Aristotle. Music, literature and the visual arts are some of the central models of contemporary ecological thinking and are ideal ways of realising the aims of the liberal arts. That literature and art can express anxiety over the natural world or reflect environmental attitudes is nothing new. Literature and art actively intervene, helping to confront pervasive issues and to define the boundaries of what constitutes the environment, indeed nature itself.
Questions related to the philosophy of information have led us naturally back to some of the profound debates in physics on the nature of the concept of entropy as it appears in the description of systems about which we have, a priori, only limited information. The Gibbs paradox, for example, centers on the question of whether entropy is subjective or objective. We have seen that while the description might have subjective components, whenever we use the concept of entropy to ask concrete physical questions, we always get objective physical answers. Similarly, when we inject intelligent actors into the story, as with Maxwell's demon, we see that the second law remains valid: it applies equally well in a universe with sentient beings.
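To make the objectivity point concrete, recall the standard textbook form of the Gibbs paradox (quoted here for orientation, not as a result from the chapters): removing a partition between two equal volumes, each containing $N$ particles of an ideal gas at the same temperature and pressure, gives a mixing entropy

$$ \Delta S_{\mathrm{mix}} = 2 N k_B \ln 2 $$

if the two gases are distinguishable, but $\Delta S = 0$ if they are identical. The entropy we assign depends on the description we adopt, yet any operational question, such as how much work the mixing can yield, receives a single objective answer.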
Questions about information theory arise at all levels of the scientific enterprise: in the analysis of measurements, in performing computer simulations, and in evaluating the quality of mathematical models and theories. The notion of entropy started in thermodynamics as a rather abstract mathematical property. With the development of statistical mechanics it emerged as a measure of disorder, though the notion of disorder referred to a very restricted context. With the passage of time the generality and power of the notion of entropy became clearer. Statistical mechanics can be reduced to an application of the maximum entropy principle, using constraints determined by the physical system.
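As a standard illustration of this reduction (a textbook sketch, assuming only a discrete set of microstates with energies $E_i$): maximizing the entropy

$$ S[p] = -k_B \sum_i p_i \ln p_i $$

subject to the constraints $\sum_i p_i = 1$ and $\sum_i p_i E_i = \langle E \rangle$ yields, via Lagrange multipliers, the canonical distribution

$$ p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_i e^{-\beta E_i}, $$

recovering equilibrium statistical mechanics as the least-biased inference compatible with the given constraints.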
Forecasting is a process whose effectiveness can be understood in terms of the information contained in measurements, and the rate at which the geometry of the underlying dynamical system used to make the forecast causes this information to be lost. It is also very important to increase our knowledge of information systems from both theoretical and practical points of view. Novel sensory measuring systems and their networks are creating new possibilities for using information in the control and management of almost all natural and human activities.
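A minimal sketch of this information-loss picture, assuming the fully chaotic logistic map as a stand-in for the underlying dynamical system (the map, tolerance and initial error below are illustrative choices, not taken from any chapter): since this map's Lyapunov exponent is ln 2, roughly one bit of initial-condition information is lost per step, which bounds the forecast horizon.

```python
# Sketch: information loss limits forecasting in a chaotic system.
# The logistic map at r = 4 is an illustrative choice; its Lyapunov
# exponent is ln 2, i.e. about one bit of initial-condition
# information is lost per iteration.
import math

def step(x):
    return 4.0 * x * (1.0 - x)  # logistic map, fully chaotic regime

x, y = 0.3, 0.3 + 1e-12         # two states differing by a tiny measurement error
for n in range(60):
    if abs(x - y) > 0.5:        # forecasts no longer agree even to one bit
        print(f"forecast horizon ~ {n} steps")
        break
    x, y = step(x), step(y)

# Theory: horizon ~ log2(tolerance / initial_error) at 1 bit lost per step
print("predicted:", math.log2(0.5 / 1e-12), "steps")
```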
Fundamental turning points in physics have always left important traces in information theory. A particularly interesting example is the development of quantum information theory, with its envisaged applications to quantum security, quantum teleportation and quantum computation. Another interesting example is the black hole information paradox, where the notions of entropy and information continue to be central players in our attempts to resolve some of the principal debates of modern theoretical physics. In a sense, our ability to construct a proper statistical mechanics is a good test of our theories. If we could formulate an underlying statistical mechanics of black holes, we might be able to resolve fundamental questions about the interface between gravity and quantum mechanics. Using nonequilibrium statistical mechanics, we see that the question of what information means and how it can be used remains vital. New entropies are being defined, and their usefulness and theoretical consistency are actively debated.
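For orientation, the benchmark that any statistical mechanics of black holes must reproduce is the Bekenstein–Hawking entropy (a standard result, quoted here rather than drawn from the chapters):

$$ S_{BH} = \frac{k_B c^3 A}{4 G \hbar}, $$

proportional to the horizon area $A$ rather than to the enclosed volume. Counting the microstates behind this area law is precisely the unresolved step that ties information to the gravity–quantum interface.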
The philosophy, nature and physics of information constitute an emerging field, one that is still very much in progress.
Recent achievements of Information Science and Technology (Informatics), which studies the structure of the representation and transformation of information by machines, together with quantum physics and modern mathematical approaches, show that information (information particles, “bits – qubits”, and the information field) has the properties of quantum matter.
The way to these solutions includes R. Clausius's observation that “the energy of the universe is constant; the entropy of the universe tends toward a maximum”, and C. Shannon's definition of the amount of information as the (negative) logarithm of the probability of its occurrence from a given source over a given channel, measured in ‘bits’, which has become a household term. This notion fits the physics tradition via one transformation: the total entropy of two independent systems is the sum of their individual entropies, while the total probability is the product of the individual probabilities.
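The transformation in question is the logarithm, which converts products into sums; in standard notation (a routine restatement of the claim above):

$$ H(X) = -\sum_x p(x) \log_2 p(x), \qquad p(x,y) = p(x)\,p(y) \;\Longrightarrow\; H(X,Y) = H(X) + H(Y). $$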
Whatever one's final verdict, it seems uncontroversial that there are three main known stances on information: the epistemic-logical (knowledge, logic, what is conveyed in informative answers); the probabilistic (information-theoretic, measured quantitatively); and the algorithmic (code compression, measured quantitatively). These three definitions must be unified, as is everything in nature. Stating technical transformations between notions of information is one thing; understanding their philosophical consequences is another.
The latter seems closer to information flow in human settings and purposeful activities. At the same time, it is a myth that algorithmic data compression might be a universal principle governing human-level information flow, a view that leads to what may be called “cognitive computationalism”.
While the preceding tandem view seems to highlight dynamic processes, it equally forces us to think more about the details of the representation of information. It is symptomatic that Kolmogorov complexity can be viewed as a theory of string entropy, with random strings as systems in thermodynamic equilibrium.
This suggests intriguing equivalence relations for translating between complexity theory and physics, for whose details we refer to the connections and overlaps between computer science and physics.
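A minimal computable sketch of strings as thermodynamic systems, using off-the-shelf zlib compression as a crude stand-in for Kolmogorov complexity (which is itself uncomputable; the proxy and string lengths are illustrative choices): an ordered string compresses dramatically, while a random, “equilibrium” string barely compresses at all.

```python
# Sketch: compressed size as a computable proxy for Kolmogorov
# complexity. Random strings behave like systems in thermodynamic
# equilibrium: they are essentially incompressible.
import os, zlib

ordered = b"ab" * 5000        # highly structured, low "string entropy"
random_ = os.urandom(10000)   # near-maximal "string entropy"

for name, s in [("ordered", ordered), ("random", random_)]:
    ratio = len(zlib.compress(s)) / len(s)
    print(f"{name}: compressed to {ratio:.2%} of original size")
```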
Information processing concerns the relation between information structure and computation, deduction, observation, learning, game playing, and evolution. We should unify the theory of information, computation and dynamic logics of epistemic update on the one side with spintronics, moletronics, memtronics and architectronics (novel physical approaches to information processing) on the other.
Nano information-communication systems and nanocomputers, which can be developed through biomimetics by examining 3D nanobiocircuits, are an excellent example of the unification of the virtual and natural worlds. It is possible that even bioelectrochemical nanocomputers will be designed to store and process information utilizing complex electrochemical interactions and changes. Bioelectrochemical nanobiocircuits that store, process and compute exist in nature in enormous variety and complexity, having developed through thousands of years of evolution. The development and prototyping of bioelectrochemical nanocomputers and three-dimensional circuits have progressed through engineering, bioinformatic and biomimetic paradigms.
Among the possible basic concepts for nanocomputers, mechanical “computers” have the richest history, traceable thousands of years back. While very creative theories and machines have been developed and demonstrated, the feasibility of mechanical nanocomputers is questioned by many researchers, owing to the number of mechanical nanocomponents to be controlled and to unsolved fabrication, assembly, packaging and other difficulties. For the design of novel computing systems, molecular and single-electron transistors, quantum dots, molecular logic and other nanoelectronic devices can be used as the basic elements. Nanoswitches (memoryless processing elements), logic gates and registers can be fabricated on the scale of a single molecule. So-called quantum dots are molecular boxes holding a discrete number of electrons, a number that can be changed by applying an electromagnetic field; the dots are arranged into quantum-dot cells.
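In quantum-dot cell arrangements, the native logic primitive is the three-input majority gate, and fixing one input to 0 or 1 specializes it to AND or OR. The following truth-table check is a hedged illustration of that well-known construction, not a device-level simulation:

```python
# Sketch: the majority gate, the native logic primitive of
# quantum-dot cellular automata. Fixing one input yields AND
# (fix to 0) or OR (fix to 1).
from itertools import product

def majority(a, b, c):
    return (a & b) | (b & c) | (a & c)

for a, b in product([0, 1], repeat=2):
    assert majority(a, b, 0) == (a & b)   # AND via majority
    assert majority(a, b, 1) == (a | b)   # OR  via majority
print("majority gate reduces to AND/OR as expected")
```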
All of the above are subjects of this book, which contains selected papers presented at the Tbilisi – Spring – 2011 Conference supported by the NATO Science for Peace and Security Programme. In the framework of the conference, titled “Philosophy and Synergy of Information: Sustainability and Security”, scientists of different disciplines and different countries, from East and West, shared their ideas, the results of their research, and experience related to Information Science and Technology, the Philosophy, Nature and Culture of Information, and the problems of the sustainable development of countries, regions and continents.
Paata Kervalishvili and Susie Michailidis