Philosophy of information


The philosophy of information is a branch of philosophy that studies topics relevant to computer science, information science and information technology.
It includes:
  1. the critical investigation of the conceptual nature and basic principles of information, including its dynamics, utilisation and sciences
  2. the elaboration and application of information-theoretic and computational methodologies to philosophical problems.

History

The philosophy of information has evolved from the philosophy of artificial intelligence, logic of information, cybernetics, social theory, ethics and the study of language and information.

Logic of information

The logic of information, also known as the logical theory of information, considers the information content of logical signs and expressions along the lines initially developed by Charles Sanders Peirce.

Cybernetics

One source for the philosophy of information can be found in the technical work of Norbert Wiener, Alan Turing, William Ross Ashby, Claude Shannon, Warren Weaver, and many other scientists working on computing and information theory back in the early 1950s. See the main article on Cybernetics.
Some important work on information and communication was done by Gregory Bateson and his colleagues.

Study of language and information

Later contributions to the field were made by Fred Dretske, Jon Barwise, Brian Cantwell Smith, and others.
The Center for the Study of Language and Information was founded at Stanford University in 1983 by philosophers, computer scientists, linguists, and psychologists, under the direction of John Perry and Jon Barwise.

P.I.

More recently this field has become known as the philosophy of information. The expression was coined in the 1990s by Luciano Floridi, who has published prolifically in this area with the intention of elaborating a unified and coherent conceptual frame for the whole subject.

Definitions of "information"

The concept of information has been defined by several theorists.

Peirce

Charles S. Peirce's theory of information was embedded in his wider theory of symbolic communication, which he called semeiotic, now a major part of semiotics. For Peirce, information integrates the aspects of signs and expressions separately covered by the concepts of denotation and extension, on the one hand, and by connotation and comprehension on the other.

Shannon and Weaver

Claude E. Shannon, for his part, was very cautious: "The word 'information' has been given different meanings by various writers in the general field of information theory. It is likely that at least a number of these will prove sufficiently useful in certain applications to deserve further study and permanent recognition. It is hardly to be expected that a single concept of information would satisfactorily account for the numerous possible applications of this general field." Following Shannon, Weaver supported a tripartite analysis of information: technical problems concerning the quantification of information, dealt with by Shannon's theory; semantic problems relating to meaning and truth; and what he called "influential" problems concerning the impact and effectiveness of information on human behaviour, which he thought had to play an equally important role. These are only two early examples of the problems raised by any analysis of information.
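Shannon's technical level can be made precise through his central quantity, the entropy of a source, H(X) = -Σ pᵢ log₂ pᵢ, which measures the average information carried per symbol in bits while deliberately ignoring meaning. As a minimal illustration (a sketch of the standard textbook formula, not code drawn from Shannon's own work):

```python
import math

def shannon_entropy(probabilities):
    """Entropy H(X) = -sum(p * log2(p)), in bits.

    Terms with p == 0 are skipped, following the convention
    0 * log2(0) = 0.
    """
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(shannon_entropy([0.5, 0.5]))  # 1.0 bit: a fair coin toss
print(shannon_entropy([0.9, 0.1]))  # ~0.47 bits: a biased coin is more predictable
print(shannon_entropy([1.0]))       # 0.0 bits: a certain outcome carries no information
```

The measure depends only on the probability distribution, not on what the symbols mean, which is exactly why Weaver's semantic and "influential" levels call for a separate analysis.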

Bateson

Gregory Bateson defined information as "a difference that makes a difference", a formulation based on Donald M. MacKay's earlier one: information is a distinction that makes a difference.

Floridi

According to Luciano Floridi, four kinds of mutually compatible phenomena are commonly referred to as "information":
  1. information about something (e.g. a train timetable)
  2. information as something (e.g. DNA, or fingerprints)
  3. information for something (e.g. algorithms, or instructions)
  4. information in something (e.g. a pattern, or a constraint)
The word "information" is commonly used so metaphorically or so abstractly that its meaning is unclear.

Philosophical directions

Computing and philosophy

Recent creative advances in computing, such as the semantic web, ontology engineering, knowledge engineering, and modern artificial intelligence, provide philosophy with fertile ideas, new and evolving subject matters, methodologies, and models for philosophical inquiry. Computer science brings new opportunities and challenges to traditional philosophical studies and changes the ways philosophers understand foundational concepts; conversely, further major progress in computer science arguably depends on philosophy providing sound foundations for areas such as bioinformatics, software engineering, knowledge engineering, and ontologies.
Classical topics in philosophy, namely mind, consciousness, experience, reasoning, knowledge, truth, morality, and creativity, are rapidly becoming common concerns and foci of investigation in computer science, e.g. in areas such as agent computing, software agents, and intelligent mobile agent technologies.
According to Luciano Floridi, one can think of several ways of applying computational methods to philosophical matters:
  1. Conceptual experiments in silico: As an innovative extension of the ancient tradition of the thought experiment, a trend has begun in philosophy of applying computational modeling schemes to questions in logic, epistemology, philosophy of science, philosophy of biology, philosophy of mind, and so on (a minimal example is sketched after this list).
  2. Pancomputationalism: On this view, computational and informational concepts are considered so powerful that, given the right level of abstraction, anything in the world can be modeled and represented as a computational system, and any process can be simulated computationally. Pancomputationalists then face the hard task of providing credible answers to the following two questions:
     1. how can one avoid blurring all differences among systems?
     2. what would it mean for the system under investigation not to be an informational system?
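As a concrete instance of the first direction, formal epistemologists have run opinion dynamics as conceptual experiments in silico. The sketch below implements a minimal version of the Hegselmann-Krause bounded-confidence model, a standard example from that literature (the code itself is an illustrative assumption, not taken from Floridi):

```python
def bounded_confidence_step(opinions, epsilon):
    """One synchronous Hegselmann-Krause update: each agent adopts
    the mean opinion of all agents (itself included) whose opinions
    lie within epsilon of its own."""
    updated = []
    for x in opinions:
        neighbours = [y for y in opinions if abs(y - x) <= epsilon]
        updated.append(sum(neighbours) / len(neighbours))
    return updated

# Eleven agents spread evenly over the opinion interval [0, 1].
opinions = [i / 10 for i in range(11)]
for _ in range(20):
    opinions = bounded_confidence_step(opinions, epsilon=0.25)
print([round(x, 3) for x in opinions])  # opinions collapse into consensus clusters
```

Runs of such models show how purely local update rules can yield either consensus or stable fragmentation depending on epsilon, the kind of question about the social dynamics of knowledge that is hard to settle by armchair thought experiment alone.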

Information and society

Numerous philosophers and other thinkers have carried out philosophical studies of the social and cultural aspects of electronically mediated information.