Ebook: Philosophy and Synergy of Information: Sustainability and Security
The philosophy of information is one of the most exciting and fruitful areas of current philosophical research. Not only is it affecting the way in which new and old philosophical problems are addressed, but it is also creating a substantial innovation of the philosophical system. This book contains selected papers presented at the 2011 spring conference entitled ‘Philosophy and Synergy of Information: Sustainability and Security’, held in Tbilisi, and supported by the NATO Science for Peace and Security Programme. The conference welcomed scientists from different disciplines and different countries, both eastern and western. They met to share their ideas, the results of their research and their experiences related to information science and technology, philosophy, the nature and culture of information and the problems of sustainable development for countries, regions and continents. Subjects covered include the latest innovations in informatics, forecasting, quantum information theory, nano information-communication systems and many more. Since ancient times, philosophers have been preoccupied with investigating life and security in the cosmos (world order). This book will be of interest to all those involved in that endeavor.
Philosophy of Information is one of the most exciting and fruitful areas of philosophical research of our time. It is already affecting the overall way in which new and old philosophical problems are being addressed, bringing about a substantial innovation of the philosophical system. Information is conveyed by perception, retained by memory, and also transmitted by means of language. Information can be acquired without one necessarily grasping the proposition that embodies it; the flow of information operates at a much more basic level than the acquisition and transmission of knowledge. This represents the information turn in philosophy.
Beginning with Anaximander of Miletus, the Pre-Socratic philosophers were preoccupied with investigating Life and Security in the cosmos (world order). Philosophers from Thales to the Stoics regarded the cosmos as a place of life and as the oikoumene of all living things. This view of the cosmos is that of a beautiful crystal embracing three cardinal concepts: Order, Harmony and Security, which can be regarded as the key concepts of ecological philosophy. Culture and nature, literature and landscape, sustainability and thought have merged together over many centuries. We read ancient authors and we hear how they spoke of their love and joy of their countryside. “In all things of nature there is something of the marvellous” (Aristotle). Music, literature and the visual arts are some of the central models of contemporary ecological thinking and are ideal ways of realising the aims of the liberal arts. That literature and art can express anxiety over the natural world or reflect environmental attitudes is nothing new. Literature and art actively intervene in helping to confront pervasive issues and to define the boundaries of what constitutes the environment, indeed nature itself.
Questions related to the philosophy of information have led us naturally back to some of the profound debates in physics on the nature of the concept of entropy as it appears in the description of systems about which we have, a priori, only limited information. The Gibbs paradox, for example, centers on the question of whether entropy is subjective or objective. We have seen that while the description might have subjective components, whenever we use the concept of entropy to ask concrete physical questions, we always get objective physical answers. Similarly, when we inject intelligent actors into the story, as with Maxwell's demon, we see that the second law remains valid: it applies equally well in a universe with sentient beings.
Questions about information theory arise at all levels of scientific enterprise: in the analysis of measurements, in performing computer simulations, and in evaluating the quality of mathematical models and theories. The notion of entropy started in thermodynamics as a rather abstract mathematical property. With the development of statistical mechanics it emerged as a measure of disorder, though the notion of disorder referred to a very restricted context. With the passage of time the generality and the power of the notion of entropy became clearer. Statistical mechanics is reduced to an application of the maximum entropy principle, using constraints that are determined by the physical system.
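The maximum entropy principle mentioned above can be made concrete with a small numerical sketch. Below, the entropy-maximizing distribution over the faces of a six-sided die is found under a mean constraint (the value 4.5 is an illustrative assumption, not taken from any chapter); the solution has the familiar Boltzmann form p_i proportional to exp(-λi), with λ located by bisection.

```python
import math

def maxent_die(target_mean, faces=6, tol=1e-10):
    """Maximum-entropy distribution over die faces 1..faces with a
    constrained mean. The solution is exponential: p_i ~ exp(-lam * i)."""
    def mean_for(lam):
        w = [math.exp(-lam * i) for i in range(1, faces + 1)]
        z = sum(w)
        return sum(i * wi for i, wi in zip(range(1, faces + 1), w)) / z

    lo, hi = -50.0, 50.0          # mean_for is decreasing in lam
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean_for(mid) > target_mean:
            lo = mid              # mean too high: increase lam
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(-lam * i) for i in range(1, faces + 1)]
    z = sum(w)
    return [wi / z for wi in w]

p = maxent_die(4.5)
print(p)  # probabilities increase toward face 6, since the mean exceeds 3.5
```

With no constraint beyond normalization the same principle returns the uniform distribution; adding the mean constraint tilts it exponentially, exactly the structure of the Boltzmann distribution in statistical mechanics.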
Forecasting is a process whose effectiveness can be understood in terms of the information contained in measurements, and the rate at which the geometry of the underlying dynamical system used to make the forecast causes this information to be lost. It is also important to deepen our knowledge of information systems from both theoretical and practical points of view. Novel measuring sensory systems and their networks are creating new possibilities for using information to control and manage almost all natural and human activities.
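The rate of information loss in forecasting can be made quantitative: for a chaotic system, the Lyapunov exponent measures how fast nearby trajectories diverge, i.e. how many bits of knowledge about the initial state a forecast loses per step. A minimal sketch for the logistic map at r = 4 (an illustrative textbook system, not one discussed in this book):

```python
import math

def logistic(x):
    """Logistic map at r = 4, a standard example of a chaotic system."""
    return 4.0 * x * (1.0 - x)

# The Lyapunov exponent is the long-run average of log|f'(x)| along a
# trajectory; here f'(x) = 4(1 - 2x).
x, lam_sum, n = 0.3, 0.0, 100_000
for _ in range(n):
    lam_sum += math.log(abs(4.0 * (1.0 - 2.0 * x)))
    x = logistic(x)
lam = lam_sum / n
print(lam, lam / math.log(2))
```

For this map the exponent converges to ln 2, meaning a forecast loses roughly one bit of information about the initial condition per iteration: measurement precision sets a hard horizon on predictability.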
Fundamental turning points in physics have always left important traces in information theory. A particularly interesting example is the development of quantum information theory, with its envisaged applications to quantum security, quantum teleportation and quantum computation. Another interesting example is the black hole information paradox, where the notions of entropy and information continue to be central players in our attempts to resolve some of the principal debates of modern theoretical physics. In a sense, our ability to construct a proper statistical mechanics is a good test of our theories. If we could only formulate an underlying statistical mechanics of black holes, we might be able to resolve fundamental questions about the interface between gravity and quantum mechanics. Using nonequilibrium statistical mechanics, we see that the question of what information means and how it can be used remains vital. New entropies are being defined, and their usefulness and theoretical consistency are topics that are actively debated.
The philosophy, nature and physics of information together form an emerging field, one that is still very much in progress.
Recent achievements in Information Science and Technology (Informatics), which studies the structure, representation and transformation of information by machines, together with quantum physics and modern mathematical approaches, have shown that information (information particles, “bits” and “qubits”, and the information field) has the properties of quantum matter.
The way to these solutions includes R. Clausius' observation that “The energy of the universe is constant; the entropy of the universe tends toward a maximum”, and C. Shannon's definition of the amount of information as the negative logarithm of the probability of its occurrence from a given source over a given channel, measured in ‘bits’, a term which has become a household word. This notion fits the physics tradition via one transformation: the total entropy of two independent systems is the sum of their individual entropies, while the total probability is the product of the individual probabilities, and the logarithm turns products into sums.
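The additivity just described is easy to verify numerically: for two independent sources, the joint probabilities are products, yet the Shannon entropies add. A minimal sketch (the two example distributions are arbitrary illustrative choices):

```python
import math
from itertools import product

def shannon_entropy(p):
    """Shannon entropy in bits: H = -sum p_i * log2(p_i)."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

pA = [0.5, 0.25, 0.25]          # source A
pB = [0.9, 0.1]                  # source B, independent of A
# Joint distribution of the two independent sources: probabilities multiply.
joint = [a * b for a, b in product(pA, pB)]

print(shannon_entropy(pA), shannon_entropy(pB))
print(shannon_entropy(joint))    # equals the sum of the two entropies
```

The logarithm is precisely the transformation that converts multiplicative probabilities into additive entropies, matching the physicists' requirement that entropy be extensive.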
Whatever one's final verdict, it seems uncontroversial that there are three main known stances on information: epistemic and logical, what is conveyed in informative answers; probabilistic and information-theoretic, measured quantitatively; and algorithmic, based on code compression, also measured quantitatively. These three definitions must ultimately be unified, as is everything in nature. Stating technical transformations between notions of information is one thing; understanding their philosophical consequences is another.
The latter seems closer to information flow in human settings and purposeful activities. At the same time, it remains a myth that algorithmic data compression might be a universal principle governing human-level information flow, a view that may be called “cognitive computationalism”.
While the preceding tandem view seems to highlight the dynamic processes, it equally well forces us to think more about the details of the representation of information. It is symptomatic that Kolmogorov complexity can be viewed as a theory of string entropy, with random strings playing the role of systems in thermodynamic equilibrium.
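Kolmogorov complexity itself is uncomputable, but an off-the-shelf compressor gives a practical upper bound and makes the string-entropy analogy tangible: a random string is nearly incompressible (high entropy, like a system in equilibrium), while a highly ordered string compresses to almost nothing. A minimal sketch:

```python
import random
import zlib

def compressed_size(data: bytes) -> int:
    """Length of the zlib-compressed data: a computable upper bound that
    stands in for Kolmogorov complexity, which is uncomputable."""
    return len(zlib.compress(data, 9))

random.seed(0)
random_bytes = bytes(random.randrange(256) for _ in range(10_000))
ordered_bytes = b"ab" * 5_000     # same length, but maximally regular

print(compressed_size(random_bytes))   # close to the original 10,000 bytes
print(compressed_size(ordered_bytes))  # a few dozen bytes
```

The gap between the two sizes is exactly the "entropy difference" between a disordered and an ordered string, mirroring the thermodynamic picture in the text.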
This suggests intriguing equivalence relations for translating between complexity theory and physics, for whose details we refer to the growing connection and overlap between computer science and physics.
An information process is the relation between information structure and computation, deduction, observation, learning, game playing, and evolution. We should unify the theory of information, computation and dynamic logics of epistemic update on one side, and spintronics, moletronics, memtronics and architectronics, novel physical approaches to information processing, on the other.
Nano information-communication systems and nanocomputers, which can be developed through biomimetics by examining 3D nanobiocircuits, are an excellent example of the unification of the virtual and natural worlds. It is possible that even bioelectrochemical nanocomputers will be designed to store and process information utilizing complex electrochemical interactions and changes. Bioelectrochemical nanobiocircuits that store, process and compute exist in nature in enormous variety and complexity, having developed over thousands of years of evolution. The development and prototyping of bioelectrochemical nanocomputers and three-dimensional circuits have progressed through engineering bioinformatic and biomimetic paradigms.
Possible basic concepts in the development of nanocomputers are based on mechanical “computers”, which have the richest history, traceable back thousands of years. While very creative theories and machines have been developed and demonstrated, the feasibility of mechanical nanocomputers is questioned by many researchers because of the number of controlled mechanical nanocomponents and unsolved fabrication, assembly, packaging and other difficulties. For the design of novel computing systems, molecular and single-electron transistors, quantum dots, molecular logic and other nanoelectronic devices can be used as the basic elements. Nanoswitches (memoryless processing elements), logic gates and registers can be fabricated on the scale of a single molecule. So-called quantum dots are molecular boxes holding a discrete number of electrons, which can be changed by applying an electromagnetic field; the quantum dots are arranged in quantum-dot cells.
All of the above are the subjects of this book, which contains selected papers presented at the Tbilisi Spring 2011 Conference supported by the NATO Science for Peace and Security Programme. In the framework of the conference, titled “Philosophy and Synergy of Information: Sustainability and Security”, scientists from different disciplines and different countries, from East and West, shared their ideas, the results of their research, and their experience related to Information Science and Technology, Philosophy, the Nature and Culture of Information, and the problems of sustainable development of countries, regions and continents.
Paata Kervalishvili and Susie Michailidis
Culture and nature, literature and landscape, sustainability and thought have merged together over many centuries. We read ancient authors and we hear how they spoke of their love and joy of their countryside. But why should the works of the ancients and medievals, so far as they relate to nature, be of interest and stimulation to our contemporaries? Music, literature and the visual arts are some of the central models of contemporary ecological thinking and are ideal ways of realising the aims of the liberal arts. That literature and art can express anxiety over the natural world or reflect environmental attitudes is nothing new. Literature and art actively intervene in helping to confront pervasive issues and to define the boundaries of what constitutes the environment, indeed nature itself. The history of the nature protection movement and the wider study of the relations between humanity and nature should be of interest to all who attempt to find a solution to the problems of pollution and environmental degradation. The historical past offers useful parallels with the environmental present.
Sustainable Development requires us to reconsider the essence of technology as such and to come up with a holistic vision of the entire venture, comprehending biology, esthetics, ethics and technology. For putting together such a holistic vision, the Humanities (philosophy, anthropology, arts, literature, education, etc.) can provide pertinent ideas and practices. Sustainable Development is not only about energy in the sense of electricity, heating, etc. It is also about “social energy”, the energy that moves a society or a community. Here I would like to focus on a source of a different kind: the so-called National Narratives and Collective Memory (Cultural Memory). National Narratives are cognitive instruments that grasp information together into a coherent whole; they organize texts, episodes of the past, symbols, and practices, which make up Collective Memory, into a story. Collective Memory (as a combination of texts, episodes of the past, symbols, and practices) differs from analytic History as an objective discipline. Such tools provide more flexible mediation, since they are holistic in essence, based on the synergy of tools of different kinds, and give us access to a wider and more diverse picture of the world. The fusion of renewable energy and innovative technologies with the arts and humanities can provide such a synergy, which will lay the foundation for Sustainable Development.
A system of indexes and indicators has been developed, and a gauging matrix for sustainable development processes (SDGM) in three dimensions (economic, ecological and socio-institutional) has been proposed. Using this matrix and initial data obtained from recognized international organizations, we have developed a mathematical model that makes it possible to calculate the components of human life quality and safety as components of a sustainable development index, together with the harmonization level of this development, for every country. Global modeling of sustainable development processes for a large group of countries, in terms of human life quality and safety, has been performed. The results of the modeling are explained in detail for every dimension of sustainable development.
Nowadays quantum information science is in a state of theoretical completeness; much is being invested in the search for a good physical realization of the quantum computer and its applications, so the field is led by physicists and by information science and technology professionals. The field has proved its potential, and many (including us) believe it is the right direction to pursue. In this paper we present and analyze the philosophy and nature of quantum information science and discuss the hot topics around it. Although there is plenty of room for discussion in this field, we concentrate on some important issues, such as quantum communication, algorithms and cryptography. Communication promises to be one of the directions where huge breakthroughs will be observed (entanglement as communication); algorithms have already shown exponential advantage over their classical counterparts (Shor's factoring) and a different paradigm of information processing; and cryptography utilizes general achievements in quantum information science to make encryption protocols very secure. Quantum computation can be understood as a generalization of classical computation: the possible states of a quantum computer are superpositions of bit strings, where each tensor factor is called a quantum bit, or qubit. To evaluate the computational complexity of an algorithm, either classical or quantum, it is necessary to specify a set of elementary operations, the number of which used during the computation quantifies the complexity. If we allowed arbitrary permutations classically, or arbitrary unitary transformations quantum mechanically, any state could be reached from any other state in a single step. A general quantum computer should be programmable to perform distinctively quantum operations, such as placing two bits in a superposition state.
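The state-vector picture sketched above (qubits as unit vectors, gates as unitaries, multi-qubit states as tensor products) can be illustrated with a few lines of linear algebra; this is a generic textbook sketch, not the authors' formalism:

```python
import numpy as np

# A qubit is a unit vector in C^2; gates are unitary matrices.
ket0 = np.array([1.0, 0.0], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate

plus = H @ ket0                 # equal superposition of |0> and |1>
probs = np.abs(plus) ** 2       # Born rule: measurement probabilities
print(probs)                    # [0.5 0.5]

# Two qubits: apply H to each and take the tensor (Kronecker) product.
# The result is a superposition of all four bit strings 00, 01, 10, 11.
two = np.kron(plus, plus)
print(np.abs(two) ** 2)         # [0.25 0.25 0.25 0.25]
```

The measurement step at the end is what the following paragraph refers to: the output is probabilistic, which is why superposition alone does not give a speedup.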
An appropriate measurement on the final superposed state of the computer produces probabilistic output, and because of this, quantum parallelism does not by itself reduce the average time required to complete a parallelizable computation. The development of quantum computing science and technology, and of quantum computers as thinking machines, depends heavily on achievements of modern research in fundamental and applied physics, as well as on novel mathematical methods and tools.
The type of research that is sought and carried out by most universities today is, essentially, a process of information transfer from the academic realm to the industrial and commercial realm. University administrations see this mode of activity as the only one able to guarantee financial sustainability. The smaller the possibility of reliance on income from fee-paying students undergoing tuition, the greater the degree of dependence on industrial research contracts. The question arises whether this mode is academically sustainable, and if so under what conditions. In this article it is shown that this mode is academically unsustainable and, hence, in the long term financially unsustainable as well. This is because, insidiously, this mode leads to decreasing reliability of the information and, hence, ultimately there will be no demand for it. In other words, universities must either reform, essentially reverting to their traditional role as bastions of independent, objective inquiry, or disappear.
Binary sensor systems are analog sensors of various types (optical, MEMS (Micro-Electro-Mechanical Systems), X-ray, gamma-ray, acoustic, electronic, etc.) based on a binary decision process. Typical examples of such “binary sensors” are X-ray luggage inspection systems, product quality control systems, automatic target recognition systems, numerous medical diagnostic systems, and many others. In all these systems, the binary decision process provides only two mutually exclusive responses. There are also two types of key parameters, characterizing either the system or external conditions in relation to the system, which are determined by their prior probabilities. In this paper, using a formal neuron model, we analyze the problem of threshold redundancy of binary sensors of a critical state. Three major tasks are solved, namely: implementation of the algorithm for calculating the error probability under threshold redundancy of a group of sensors; computation of the minimal upper bound for this probability in closed analytical form and determination of its link with Claude Shannon's theorem; and derivation of an expression (estimate) for the sensor “weights” for which the probability of binary system error does not exceed the specified minimal upper bound.
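The simplest instance of threshold redundancy, equal sensor weights with a majority-vote threshold, already shows how redundancy drives the error probability down; the sketch below uses illustrative parameters and is not the paper's formal neuron model:

```python
from math import comb

def majority_error(n, p):
    """Probability that a majority vote over n independent binary sensors,
    each with individual error probability p, gives the wrong answer
    (n odd, so ties cannot occur): P = sum over k > n/2 of C(n,k) p^k (1-p)^(n-k)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range((n + 1) // 2, n + 1))

# Error probability falls rapidly as redundant sensors are added.
for n in (1, 3, 5, 9):
    print(n, majority_error(n, 0.1))
```

For p = 0.1, a single sensor errs 10% of the time, three sensors under majority vote err 2.8% of the time, and nine sensors err well under 1%; optimizing unequal weights, as the paper does, can only improve on this baseline.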
Environmental safety is a central point among Human Rights, and health care is a first task for all mankind. There are serious achievements in modern science and technology; nevertheless, some threats from the external environment, particularly from high-risk industrial projects, remain a world problem. Technological monitoring should be conditioned by the large-scale, spatially distributed, homogeneous or heterogeneous environment with dynamic diffusion processes. Mobile multi-sensor systems are reconfigurable wireless networks of distributed autonomous devices that can cooperatively sense or monitor physical or environmental conditions. Intelligent sensors and sensor networks have an important impact in meeting environmental challenges. Agents interact (communicate, coordinate, negotiate) with each other and with their environment. Usually, in a multi-agent system, the interaction dynamics between an agent and its environment lead to emergent structure or emergent functionality. To be effective, multi-agent systems must yield coordinated behavior from individually autonomous actions. This paper describes a new approach to providing smart sensor system capabilities for monitoring and control within a studied area. The key technologies, based on the paradigm usually called Swarm Intelligence (SI), focus on mutually coordinated or collective behaviors and on adaptive topological self-reconfiguration of swarm sensors. The SI paradigm is based on observation of the complex behavior of many social-insect societies (ants, bees, wasps, termites, ...), in which system properties emerge from local interactions between the elementary actions of single agents. Control of reconfigurable sensor networks is a fundamentally difficult problem in which the system must balance power usage, communication versus control, and the effectiveness of adapting to the environment as well as to changing science requirements.
Introducing a new concept of entropy, defined as the unconformity between the landscape of the measured factors' frequency distribution and the sensor swarm's spatial distribution, offers a novel approach to the management and control of sensory systems. In this context, treating the sensor network's dynamic reconfiguration systemologically, with this unconformity as a fitness function, the optimization problem of swarm shaping can be resolved by the criterion of entropy minimization.
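One natural way to quantify such an "unconformity" between two distributions is the relative entropy (Kullback-Leibler divergence); this is an illustrative choice of measure, not necessarily the one used in the paper. It is zero exactly when the swarm's spatial distribution matches the measured factor's frequency landscape, so minimizing it drives the swarm toward the optimal shape:

```python
import math

def kl_divergence(p, q):
    """Relative entropy D(p || q) in nats: one candidate measure of the
    'unconformity' between a measured-factor frequency distribution p
    and a sensor-swarm spatial distribution q."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

field = [0.1, 0.4, 0.3, 0.2]          # hypothetical factor frequency per cell
uniform_swarm = [0.25, 0.25, 0.25, 0.25]
matched_swarm = [0.1, 0.4, 0.3, 0.2]   # swarm density mirrors the field

print(kl_divergence(field, uniform_swarm))  # positive: swarm mismatched
print(kl_divergence(field, matched_swarm))  # zero: distributions coincide
```

A swarm controller can therefore use the divergence as the fitness function and move sensors in the direction that reduces it, which is the entropy-minimization criterion stated above.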
Educational programs in the area of biomedical informatics cover topics from the fields of medical informatics, health informatics and bioinformatics. The conceptual roots of such programs go back more than thirty years, and the programs are well established in many countries. The leading role in promoting education in biomedical informatics has been played by the International Medical Informatics Association (IMIA), through the MEDINFO congresses, special topic conferences and the activities of the IMIA working group on Health and Medical Informatics Education. In Georgia, the training of young experts under such a program has been under way for the last two years. We have studied the experience of various European universities in teaching this discipline, and a computerized training and testing system (TTS) has been developed for this field. We present the TTS and the other interactive tools for the evaluation of targeted knowledge that have been developed. The idea of the system is based on generalized multiple-choice questions, with no prior restrictions on the number of given answers. The only restriction is that at least one answer is correct and at least one is wrong. This new idea has led to new concepts for the standardization of test results and also to new research problems in statistics. Evaluation by the TTS is performed using fixed or automated tests. A fixed test is appropriate for evaluating a group of students in a computer classroom connected to the Internet. Students can take automated tests by themselves, and the final results are displayed immediately. The displayed results also explain to students why some answers were not correct.
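Generalized multiple-choice questions, where any subset of options may be correct, require a scoring rule that rewards selecting correct options and penalizes selecting wrong ones. The sketch below shows one plausible standardized rule (a hypothetical illustration; the chapter's actual standardization formula is not reproduced here):

```python
def score(selected, correct, n_options):
    """One possible standardized score in [0, 1] for a generalized
    multiple-choice question: fraction of correct options selected,
    minus fraction of wrong options selected, floored at zero.
    This is an illustrative rule, not the authors' exact formula."""
    selected, correct = set(selected), set(correct)
    n_wrong = n_options - len(correct)   # at least one wrong option exists
    hit_rate = len(selected & correct) / len(correct)
    false_alarm_rate = len(selected - correct) / n_wrong
    return max(0.0, hit_rate - false_alarm_rate)

print(score({1, 3}, {1, 3}, 5))     # perfect answer -> 1.0
print(score({1, 2, 3}, {1, 3}, 5))  # one false alarm lowers the score
print(score({2, 4, 5}, {1, 3}, 5))  # all wrong -> 0.0
```

A rule of this shape makes blind selection of every option score zero on average, which is the kind of statistical property the standardization problem mentioned above has to guarantee.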
Achieving sustainable development is one of the most important problems of modern society. Sustainable Development must balance the needs of society, the economy, and the environment. Therefore, it is important to be able to influence the factors that determine the current level of sustainable development. The main determining factors can be divided into three groups: economic, social and environmental. The environment is attracting increasing attention from researchers. Today's understanding of SD recognizes its environmental (ecological), economic and social underpinnings (the so-called “triangle” of SD), although previously the economic and social origins were thought to be dominant. This understanding corresponds to the concept of Synergy as one of the fundamental requirements for achieving SD. From this perspective, the Republic of Georgia is of particular interest, since more than 120 million tons of toxic waste are located in a relatively small geographic area. In this manuscript, we examined the effect of microwave pretreatment on the bacterial and chemical leachability of arsenic- and manganese-containing waste, and whether pretreatment increased the effectiveness of ‘soft remediation’ in areas contaminated with arsenic and manganese. Some important physical and chemical parameters of the microwave pretreatment were determined. Three different sources for microwave and conventional treatment were used. The average power of the microwave oven was about nine times higher than that of the pulsed generator (550 W versus 60 W). The electric furnace had the same heating power as the electric oven. Microwave heating was more efficient for all samples. Taking into account the low average power of the pulsed generator, it can be concluded that heating was more efficient in the pulsed generator than in the microwave oven.
The environmental and economic impacts of implementing microwave pretreatment in remediation methods for the industrial utilization of hazardous waste in Georgia were estimated. Our study showed the possible advantages of the developed methods: the low cost of environmentally friendly methods of soft remediation; the high economic efficiency of industrial utilization of hazardous waste; and the high effectiveness of decontamination and remediation achievable by microwave-enhanced methods.
The scope of this paper is to discuss the safety and stability of the Internet system using knowledge from the Theory of Systems. This is achieved by: (a) reviewing fundamental knowledge of the Theory of Systems in order to apply it to the Internet system; (b) applying this knowledge to the case of the Internet; and (c) discussing the safety and stability of the Internet system in the light of this knowledge. By this approach we show that: (a) the Internet is a system only if it operates under rules; (b) the Internet is a global system, which means that it is a system without an environment and as such is subject only to internal disturbances, which can very easily render it unstable; (c) the Internet system can be secure only by implementing rules which prohibit the anonymity of users and oblige all to work in synergy for the achievement of a common scope, namely communication and the diffusion of information and knowledge; and (d) the Internet system can be safe only by closely monitoring its operation and by intervening to prevent the application of destabilizing disturbances by any of its elements.
Business management in any contemporary organization requires the making of decisions, the coordination of activities, the handling of people, and the evaluation of performance directed toward group objectives. This problem cannot be solved without a systems approach, drawing on synergetics, system modeling and complexity theory. The role of the system sciences is increasingly defined from the viewpoint of behavioral modeling of the most complex systems. This article discusses the management of information systems in support of businesses. In general, business intelligence systems address the needs of different types of complex organizations, including agencies of public administration and associations. Business intelligence is a complex, iterative and interactive, pyramid-like hierarchical multi-stage process. Moving up one level in the pyramid, we find optimization models that allow us to determine the best solution out of a set of alternative actions, which is usually fairly extensive and sometimes even infinite. The top of the pyramid corresponds to the choice and actual adoption of a specific decision, and in some way represents the natural conclusion of the decision-making process. In this paper, we present a new approach to the decision-making process from the viewpoint of system dynamics, agent-based modeling and simulation-based optimization, conditioned by the existence of nonlinear economic and organizational behavioral factors in human society. The originality of this work lies in the adaptability of the system model through structure reconfiguration or self-assembly, as the multi-agent organization evolves its way toward a better structure. Every simulation-based solution can be considered new knowledge. From the point of view of Artificial Intelligence, the technological problems of data mining and knowledge discovery are discussed.
Modern electronics uses about 60% of all the elements of Mendeleev's table. Many of them are quite rare, some very rare. The depletion of existing resources could become a real threat in the very near future. We cannot completely eliminate the threat of resource depletion; we can only mitigate it by recycling, by substituting for rare metals, and by increasing the lifetime of products. This requires a new way of thinking and a new way of designing products.
We present a new approach to novel, information-based control and management of commodities markets and their properties. Good management and intelligent capitalization of natural resources and the corresponding commodities on international commodity markets are the most efficient measures for the accelerated economic development of the Newly Independent States. At present, however, the black holes in the legislation and, more generally, in the philosophy of commodities are an insuperable obstacle to entering international commodity markets. Without solving the problems discussed in this article, direct access to commodity and mercantile exchanges will remain barred to national mining companies from the Newly Independent States.
Information asymmetry degrades the performance of supply chains by inducing misallocation of available and potential resources, discriminatory pricing and allocation policies (leading to winners and losers in the wealth-creation game), arbitrage, lead-time variations and unpredictability. We provide the theoretical underpinning that can reinforce truthful information sharing, that is, an incentive-compatible mechanism. We focus on scenarios for allocating the capacity of a supplier. The theoretical implementation of the mechanism is presented in the form of SSCC protocols, which comprise truthful inputs from all participants, the computation of allocations on the basis of supplier and retailer profit maximisation, and the non-disclosure of inputs and results. Simple commercial protocols for linear and proportional allocation mechanisms are presented as prototypes.
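The proportional allocation mechanism named above is simple to state: when orders exceed capacity, each retailer receives a share of capacity proportional to its order. A minimal sketch with hypothetical numbers (the SSCC protocol details are not reproduced here):

```python
def proportional_allocation(capacity, orders):
    """Proportional allocation of scarce supplier capacity: each retailer
    gets capacity * (own order / total ordered) when demand exceeds supply."""
    total = sum(orders)
    if total <= capacity:
        return list(orders)           # no scarcity: every order is filled
    return [capacity * o / total for o in orders]

# Capacity 100 against total orders of 200: everyone is scaled down by half.
print(proportional_allocation(100, [80, 60, 60]))  # [40.0, 30.0, 30.0]
```

Proportional allocation is also a textbook example of why incentive compatibility matters: a retailer can gain by inflating its order, which distorts exactly the information the mechanism relies on; the truthful mechanisms discussed in the chapter are designed to remove that incentive.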
Nowadays, quantum models have become widely applicable in the development of artificial intelligence. The mathematical basis of quantum mechanics makes it well suited to the tasks of cognitive informatics: conceptual analysis, learning and information retrieval. For the further development of concept formation, learning and information-retrieval technologies, the utilization of quantum methods (contextuality and entanglement, interference, quantum range and semantic similarity) is very important: they have introduced a new step in information science and technology. In the present paper, some methods and tools of quantum information technologies are surveyed. On the basis of quantum interactions, concept formation and concept properties within information-retrieval tasks are investigated.