INFORMATION & INFORMATION PROCESSING IN SCIENCE: ABSTRACTS


Biology Track

Information in evolution
Michael Lachmann

The concepts of entropy and mutual information have been central to physics, information theory, and computer science for more than 50 years. Over the last decades, these concepts have started creeping into biology – from neural networks to animal signalling and, more recently, evolutionary biology. In my talk I will outline our work on the connection between mutual information and biological fitness. I will then outline why the concept is so widely applicable – why the same equation describes both an ideal gas and evolving lineages.
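As background for the kind of result at issue (a standard sketch from the bet-hedging literature, e.g. Kelly-style models, not necessarily the talk's own formulation): the long-term growth-rate advantage an organism can gain from a cue C about the environmental state E is bounded by their mutual information,

    I(E;C) = \sum_{e,c} p(e,c) \log_2 \frac{p(e,c)}{p(e)\,p(c)}, \qquad \Delta(\text{log growth rate}) \le I(E;C).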

 

Functional Specialization, Substrate Independence, and the Explanatory Role of Information
Rosa Cao

I’m interested in two related questions:
1) How is information different from other resources for an organism?
2) How is information different from other explanatory resources for a theorist?
In the biological world, information is a distinctive resource for living creatures.  When a signal carries information for an organism, it is put to use as a guide to that which it carries information about, allowing the organism to coordinate its activities in a way appropriate to its situation. Similarly, when a signal carries information within an organism, it plays a distinctive functional role in facilitating coordination between parts of the organism separated by space or time.

Consequently, appeals to information in explanations of biological capacities gain traction when something like coordination is going on.  They are especially apt when the causal role of the signal used in that coordination process is taken to be substrate independent.
In the biological cases considered, what matters are the relations between the signal, its source, and its consumer. These relations give the informational content of the signal explanatory power over and above the particular material properties of the information vehicle – consistent with the paradigmatic feature of information that it can be abstracted away from any particular substrate.
I argue that this kind of substrate independence only emerges given a degree of functional specialization or modularity in the organization of the system.  Modularity requires the specialization of a mechanism for dealing with signals qua signals, as well as a clear separation between source, signal, and action.

 

An informational view of the evolution of living systems
Irun R. Cohen

Living systems develop, flourish and survive by virtue of the specific arrangements of their component parts. Following Shannon's insight, we can define information as a “specific arrangement”, as opposed to randomness. From this perspective on information, we can say that living systems are made of information. Living systems, however, are not static arrangements of parts; living systems emerge from the meaningful interactions of their component parts within the system and between the system and its environment. Quite simply, Shannon's concept of information is inadequate; living systems do not separate information – their specific arrangements – from their dynamic interactions – their meaning. As we shall discuss, the evolution of living systems emerges from the evolution of both information and meaning. The survival of specific arrangements and interactions, of course, is subject to the ravages of entropy. Thus, the evolution of living systems necessarily expresses their struggle with entropy – living systems must adapt to entropic selection alongside their adaptations to fit their environments. We shall discuss how entropic selection molds the fundamental strategies of living systems.
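For reference, the Shannon quantity invoked here (a textbook formula, not specific to this abstract) measures how far a source departs from randomness via its entropy,

    H(X) = -\sum_x p(x) \log_2 p(x),

which is maximal for a fully random (uniform) source and low for a highly specific arrangement.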

 

Do animal signals have derived content?
Ulrich Stegmann

Current theories of the content of animal signals tend to construe signal content either as an instance of natural information or as a kind of representational content. As a consequence, signal content is taken to be underived in the sense that what a signal means is independent of what receivers take the signal to mean (e.g. Dretske 1988). Here I explore whether signal content is better viewed as a kind of derived content. On this view, signal content depends entirely on receivers: receivers determine whether or not a behavior or structure is a signal (i.e. has content) and, if so, what it is about. One way to develop this idea is to identify signal content with the “acquisition conditions” of receiver responses, i.e. with the circumstances causing receivers to acquire (evolve or learn) certain response dispositions to a given signal. This view avoids some difficulties of alternative accounts and matches actual content attributions in animal behavior studies. More importantly, it endows such attributions with valuable referential and explanatory roles: the ethological practice of attributing content to signals can be understood as a means for picking out the acquisition conditions of receiver responses; and the practice of explaining receiver behavior by appeal to signal content can be understood as explaining the response in terms of why receivers learned or evolved to so respond in the first place.

 

Time as Information to the Cognitive and Meta-cognitive Systems
Dan Zakay

Time is a major source of information used by humans' cognitive and meta-cognitive systems. Many types of perceptual, cognitive, and meta-cognitive decisions (e.g., judgments of causality, familiarity, confidence, and others) are based on the analysis of durations. Since no perceptual process or organ exists by which time can be detected, specific heuristics dedicated to the processing of time have developed. Perceived time is intimately linked with information processing. When the amount of information processing falls below a certain idiosyncratic threshold, perceived duration increases. The feeling of boredom evolved as an alarm that warns an organism to increase its level of information processing by engaging in appropriate activities. A theoretical model that explains the link between time cognition and information processing will be presented.

 

Physics Track

Interpreting the concept of information
Olimpia Lombardi

The word ‘information’ refers to a polysemantic concept that can be associated with different phenomena. My first step, then, will be to disentangle the varied meanings of the concept: semantic information as distinct from statistical information and, within the latter, the computational context as distinct from the communicational context. The locus classicus of the communicational concept of information is the famous article by Claude Shannon, where a precise formalism is introduced. However, in spite of the agreement concerning this traditional and well-understood formalism, the interpretation of the concept of information is far from unanimous. Here I will distinguish the epistemic and the physical interpretations, considering their advantages and shortcomings. I will then critically consider a recent and widespread deflationary view. Finally, I will defend a pluralist view based on a formal interpretation of the concept.
In turn, during the last decades, new interpretive problems have arisen with the advent of quantum information theory, which combines the difficulties in understanding the concept of information with the well-known foundational puzzles of quantum mechanics itself. This research field has developed rapidly, with new formal results multiplying. In this context, the question to be answered is: are there two different kinds of information, classical and quantum? I will challenge the views that give a positive answer to this question in terms of the different ways of generating and transmitting information. My purpose is to argue that there is a single kind of information, which can be encoded by means of classical or quantum systems.
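As background to the one-kind-versus-two-kinds question (a standard formal parallel, offered only for orientation): classical and quantum theory quantify information by structurally analogous expressions, the Shannon entropy of a source and the von Neumann entropy of a quantum state,

    H(X) = -\sum_x p(x) \log_2 p(x), \qquad S(\rho) = -\mathrm{Tr}(\rho \log_2 \rho),

with S(\rho) reducing to H(X) when the encoded states are mutually orthogonal, i.e. perfectly distinguishable.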

 

Thermodynamics and the measure of entanglement
Daniel Rohrlich

A crucial step in quantum information theory was to identify entanglement as a valuable resource. We could then ask how to exploit this resource most efficiently. I apply analogues of thermodynamic concepts, including reversibility, entropy, and the distinction between intensive and extensive quantities, to show that the entropy of entanglement is the unique intensive measure of entanglement for pure states.
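For reference, the standard definition (rather than anything specific to this talk): for a bipartite pure state, the entropy of entanglement is the von Neumann entropy of either reduced state,

    E(|\psi\rangle_{AB}) = S(\rho_A) = -\mathrm{Tr}(\rho_A \log_2 \rho_A), \qquad \rho_A = \mathrm{Tr}_B\, |\psi\rangle\langle\psi|,

which vanishes on product states and equals one ebit for a maximally entangled pair of qubits.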

 

Bit by Bit: The Role of Information in Physics
Israel (Roly) Belfer

Since Claude Shannon’s work, the use of information to analyze and describe physical systems has become more and more prevalent, drawing both enthusiastic and more careful, considered uses of information processing. The classical questions arise here: when does an actual change in a science occur, and how philosophically dubious is a borrowed term – often on the verge of over-use and on the cusp of a new mode of scientific reasoning?
I will explore the historical case of Jacob Bekenstein’s formulation of black-hole thermodynamics (BHTD). Difficult to absorb yet widely influential, it is still being considered today for its ramifications. Bekenstein introduced influential concepts in this informational turn, such as the informational bound and the relation between information and surface area, leading up to the holographic principle. The claims for and against informational BHTD, as well as the results, are telling – exposing the historical development of a new physical style of reasoning.
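For orientation, the standard statements of the two results mentioned (textbook forms, not the talk's analysis): the Bekenstein-Hawking entropy ties a black hole's entropy to its horizon area A, and the Bekenstein bound caps the entropy of any system of energy E confined within radius R,

    S_{BH} = \frac{k_B c^3 A}{4 G \hbar}, \qquad S \le \frac{2\pi k_B R E}{\hbar c}.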

 

The PBR Theorem: Whose Side Is It On?
Yemima Ben Menahem

The paper examines the implications of the recent Pusey-Barrett-Rudolph (PBR) theorem for the debate on the reality of the quantum state. The theorem is generally seen as undermining the epistemic interpretation of the quantum state, an interpretation which construes quantum states as representing information about possible measurements. The theorem is therefore taken to support a realist interpretation of the quantum state. By contrast, I argue here that the theorem actually strengthens, rather than weakens, a radical epistemic interpretation of the quantum state.

 

Entropy: Information versus Probability
Orly Shenker

In statistical mechanics it is often said that entropy, understood as quantifying the amount of information one has about a system, is key to predicting the probability of the system’s future evolution. The idea is that the two are all but identified: high-entropy macrostates are equated with high-probability macrostates. In this talk I will examine the connection between information and entropy, argue that the idea that the entropy of a macrostate entails its probability is in general empirically inadequate, and then show under what conditions the two notions may be weakly linked. The discussion will be mostly classical; if time permits, I will briefly touch upon notions of quantum mechanical entropy in this context.
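The identification under examination can be stated compactly (a textbook formulation, given here only as background): the Boltzmann entropy of a macrostate M is fixed by the phase-space measure of its microstates, and the contested step takes probability to go with that measure,

    S_B(M) = k_B \log \mu(\Gamma_M), \qquad P(M) \propto \mu(\Gamma_M),

where \Gamma_M is the region of microstates realizing M and \mu the standard Liouville measure.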

 

Cognitive and Brain Sciences Track

Nonnatural Representation
Gualtiero Piccinini

Adapting Grice’s distinction between natural and nonnatural meaning, let’s posit two kinds of information: natural information that p, which entails that p (or at least raises the probability that p), and nonnatural information that p, which does not raise the probability that p. So a signal may carry nonnatural information that p even though ‘p’ is simply false. Natural representation that p arises when a state carries natural information that p because that is its function. Such a representation malfunctions, and thus misrepresents, when it fails to carry natural information that p. This is the standard account of natural representation, plus or minus asymmetric dependence or something along the same lines (Dretske, Millikan, Fodor, Ryder, etc.). This account does not work for nonnatural representation. Nonnatural representation that p arises when a state carries nonnatural information that p because that is its function. Such a representation malfunctions when it fails to carry nonnatural information that p, but the result is not a misrepresentation! That’s because nonnatural information may be false, so failing to carry nonnatural information may well result in a correct representation. Thus, nonnatural representation requires a different account. In this paper I explore what an account of nonnatural representation might look like.

 

Information and representation in probabilistic models of cognition 
Mark Sprevak

What is the relationship between information and representation in cognitive science? Dating back at least to Dretske (1981), an influential answer has been that information is a rung on the ladder that gets one to mental representation: mental representation is information plus something else. In this paper, I argue that this traditional approach dramatically oversimplifies the relationship between information and representation. If one takes current probabilistic models of cognition seriously, then information enters into cognition as a way of governing the computational relations between representations. This is a new, logically independent, way in which information and representation are intertwined.
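A generic illustration of the kind of model in question (a sketch, not Sprevak's own example): in Bayesian models of cognition, probability theory dictates how one representation (a prior over hypotheses h) combines with another (the likelihood of data d) to yield a third (the posterior),

    P(h \mid d) = \frac{P(d \mid h)\, P(h)}{\sum_{h'} P(d \mid h')\, P(h')},

so the information-theoretic apparatus here governs the computational relations between representations rather than constituting their contents.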

 

No Computation without Representation
Oron Shagrir

A semantic view of computation contends that computational states are type-individuated, at least in part, by their semantic properties. Based on the phenomenon of simultaneous implementation, I have presented an argument for the semantic view (Shagrir 2001; see also arguments advanced by Sprevak 2010, Shea 2013 and Rescorla 2013). Piccinini (2008; 2015) disputes the argument, concluding that computational individuation depends on a functional, non-semantic task rather than a semantic one. My aim in this talk is to reply to Piccinini, showing that semantic considerations override functional considerations.

 

From Information to Representation: a teleological approach
Nir Fresco & Eva Jablonka

We propose and defend a tripartite taxonomy of representation: neural, mental and symbolic. (1) Neural representation is an instance of learned functional information in which neural relations (a) correspond to external-world (including bodily) relations, and (b) persist long enough to support online updating and offline recall of the external relations. (2) Mental representation is a neural representation that also requires the learning and recalling of sensory and action patterns; such patterns are updated through hierarchical top-down and bottom-up processes between successive representations of increasing persistence and generality within the hierarchy. (3) Symbolic representation is a mental representation that also requires that a representation be part of a semantic network mapping representations to symbols. Our approach, which is both teleological and information-based, offers a coherent framework in which some of the traditional problems of representation can be resolved.

 

Temporal structure in perception and attention
Ayelet N. Landau

An obvious place to inquire about the question of information in the brain sciences is sensory systems, which are dedicated to representing the external environment that enters our brain through our senses. On the face of it, one may consider our sensory systems as simple “perception” generators. That is, once the external world reaches our senses and is transduced into neural signals, the contents of our sensory systems inevitably become the contents of our perceptions – the environment to which we can react and of which we can be aware. Such a simple model, however, discounts an important limitation of the system. The sensory environment is far too rich for our brains to simply represent it all and produce perceptions containing everything that is going on around us. In light of this limitation, a selection process is required wherein only relevant content that advances our goals reaches full sensory processing and becomes the content of our perceptions. This process of selection is called attention and is the focus of my research. In my talk I will discuss the neural fate of visual information in light of attentional selection processes. In addition, I will discuss the role of temporal structure as a neural mechanism that implements attentional selection and shapes the way information is perceived.

 

Evolving a Language of Thought
Ronald Planer

Many cognitive scientists believe that at least some facets of both human and animal cognition are most plausibly explained by the existence of a language of thought. In spite of this, little to no attention has been paid to the question of how a language of thought might evolve. In this talk, I attempt to make progress on this question by bringing to bear some recent work on the evolution of signalling systems. To this end, I will first explain how signalling systems with language-like properties can arise in between-individual cases. Then I will show how these ideas can be applied to signalling within the brain, specifically, signalling over time (i.e., memory). Finally, I will argue that the evolution of a language-like signalling system in the brain is greatly facilitated by features that are peculiar to this inter-temporal channel.
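To make the between-individual case concrete, here is a minimal sketch (assuming a two-state Lewis signalling game with Roth-Erev reinforcement learning, the standard setting in this literature, e.g. Skyrms; the code and its parameters are illustrative, not Planer's model):

    import random

    N_STATES = N_SIGNALS = N_ACTS = 2

    # Roth-Erev reinforcement: each agent keeps an "urn" of weights per situation.
    sender = [[1.0] * N_SIGNALS for _ in range(N_STATES)]   # state -> signal weights
    receiver = [[1.0] * N_ACTS for _ in range(N_SIGNALS)]   # signal -> act weights

    def choose(weights):
        # Sample an option with probability proportional to its weight.
        return random.choices(range(len(weights)), weights=weights)[0]

    def play_round():
        state = random.randrange(N_STATES)      # nature picks a state
        signal = choose(sender[state])          # sender emits a signal
        act = choose(receiver[signal])          # receiver acts on the signal
        if act == state:                        # success: reinforce both choices
            sender[state][signal] += 1.0
            receiver[signal][act] += 1.0
        return act == state

    random.seed(0)
    successes = 0
    for t in range(1, 100001):
        successes += play_round()
        if t % 20000 == 0:
            print(f"round {t}: success rate so far = {successes / t:.3f}")

Run this way, the success rate typically climbs toward 1 as sender and receiver converge on a shared signalling system, which is the phenomenon the talk extends to within-brain, inter-temporal channels.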

 

From Information to Cognition
Fred Adams

Everyone agrees that information is necessary for cognition: one can’t think that Obama is president of the U.S. without information that he is (or might be). But is information sufficient for cognition if it causes or controls significant behavioral change? Many think so, and as a consequence they believe there is cognition in plants, bacteria, and other places not usually thought to possess cognition. I shall attempt to explain why I think cognition takes more.

 

Information and Representation in Theories of Brain Sensory Processing
Haim Sompolinsky

TBC

 

What’s information? An answer from physics, biology, mind-science and philosophy
Aaron Sloman

The Meta-Morphogenesis (M-M) project was inspired by the question: “What would Turing have worked on if, instead of dying two years after publication of his morphogenesis paper (1952), he had lived several more decades?” My conjectured answer is: he would have tried to explain how a lifeless planet (or universe) could generate all the forms of life, and all the (evolved and still evolving) forms of biological information processing (including the mathematical information processing behind Euclid’s discoveries), that have existed on Earth. This includes the many forms of information processing required for evolution by natural selection or produced as side-effects, including human uses of language for communication and the much older and more widespread uses of internal languages for control, perception, learning, planning, desiring, etc. This talk will present some partial results concerning the nature and diversity of biological information and information processing. Most researchers focus on a subset of the types of information and information processing, with bad consequences for science and philosophy.

 
