Integrated Information Theory (IIT)

Integrated Information Theory (IIT) is a recently formulated theory that attempts to quantitatively measure consciousness. It was developed by the psychiatrist and neuroscientist Giulio Tononi of the University of Wisconsin–Madison.

Overview

The theory is based on two key observations. The first is that every conscious state contains a massive amount of information. A common example of this is a frame from a movie: upon seeing a single frame of a movie you have watched, you instantly associate it with a specific conscious percept. That is to say, you can discriminate a single frame of a film from any other single frame, including a blank, black screen. The mind can therefore discriminate among a massive number of possible visual states, which represents a tremendous amount of information. Compare our visual awareness with a simple photodiode, which can only discriminate the presence of light from dark. Whether the light comes from a lightbulb, a scene from Ben-Hur, or the bright light of noon on a summer day, the photodiode represents only minimal information. The hypothesis, then, is that the amount of consciousness an entity has corresponds to the amount of information it represents. This brings us to the second key observation of the theory.

The second observation is that all of the information gleaned from conscious states is highly, and innately, integrated in your mind. It is impossible for you to see the world apart from all of the information of which you are conscious. When you are looking at an orange, for example, you cannot separate the color of the fruit (orange) from its shape (round). Consciousness is "integrated": even though color processing and spatial processing are localized separately in the brain (a stroke victim can lose color perception yet maintain perfect spatial awareness, for example), conscious experiences cannot be atomized into distinct parts.

Definition of Consciousness

In this theory, consciousness arises as a property of a physical system: its 'integrated information', denoted φ (phi). Integrated information is an exact quantity that can be measured using the following equations:

Information

Given: a system (including its current probability distribution over states) and a mechanism (which specifies the probability distribution over possible next states when the current state is perturbed with all possible inputs).
You can determine: the actual distribution, i.e., the distribution over possible system states at the previous time step (t = -1).
Thus: the system and its mechanism constitute information (about the system's previous state), in the classic sense of a 'reduction of uncertainty.'

Relative Entropy/Effective Information

Effective information is the relative entropy H between the actual and potential repertoires, i.e., the Kullback–Leibler divergence between them.

Effective information is implicitly specified by the mechanism and the state, so it is an 'intrinsic' property of the system.
One can calculate the actual repertoire of states by perturbing the system in all possible ways to obtain the forward repertoire of output states and then applying Bayes' rule.
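
As a rough illustration of this procedure, the following Python sketch perturbs the previous state uniformly, weighs each possibility by the mechanism's forward probabilities, and normalizes via Bayes' rule. The names actual_repertoire and example_transition are illustrative only, and the state encoding (two bits, first element in the high bit) is an assumption made for the worked example that follows.

    import numpy as np

    def actual_repertoire(transition, x1, n_states):
        """Actual (a posteriori) repertoire p(X0 | mech, x1).

        transition(x0, x1) is assumed to return p(x1 | x0) under the system's
        mechanism. Perturbing X0 in all possible ways gives a uniform prior;
        Bayes' rule then yields the distribution over previous states that is
        compatible with observing the current state x1."""
        prior = np.full(n_states, 1.0 / n_states)
        likelihood = np.array([transition(x0, x1) for x0 in range(n_states)])
        posterior = prior * likelihood            # Bayes' rule, unnormalized
        return posterior / posterior.sum()

    # Mechanism of the example below: the first element is random, the second
    # copies the first element's previous value.
    def example_transition(x0, x1):
        a_prev = (x0 >> 1) & 1    # first element's previous value
        b_next = x1 & 1           # second element's next value
        return 0.5 if b_next == a_prev else 0.0

    print(actual_repertoire(example_transition, 0b11, 4))   # [0.  0.  0.5 0.5]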

Example:

Consider a system of two binary elements, which has four possible states (00, 01, 10, 11).

The first binary element operates randomly. The second binary element copies whatever the first element was in the previous state. Initially the system is in state (0, 0).
Potential (maximum entropy) repertoire: p = (1/4, 1/4, 1/4, 1/4)
Given that, at time t, the state is 11,
the previous state must have been 10 or 11, so the actual repertoire is p = (0, 0, 1/2, 1/2).
One bit of information has been generated, since

ei(X(mech, x1)) = H[ p(X0(mech, x1)) || p(X0(maxH)) ] = H[ (0, 0, 1/2, 1/2) || (1/4, 1/4, 1/4, 1/4) ] = 1 bit,

where X is our system, mech is that system's mechanism, x1 is a state of the system, and p(X0(maxH)) is the uniform or potential distribution.
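
As a check on the arithmetic, the short Python sketch below computes the relative entropy between the actual and potential repertoires given above; kl_divergence is an illustrative helper name, not part of any published implementation.

    import numpy as np

    def kl_divergence(p, q):
        """Relative entropy H[p || q] in bits; terms with p == 0 contribute nothing."""
        p, q = np.asarray(p, float), np.asarray(q, float)
        mask = p > 0
        return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

    potential = [0.25, 0.25, 0.25, 0.25]   # maximum-entropy repertoire
    actual    = [0.0, 0.0, 0.5, 0.5]       # previous state was 10 or 11

    print(kl_divergence(actual, potential))   # 1.0 bit of effective information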

Integration (φ)

φ(X(mech, x1)) = H[ p(X0(mech, x1)) || Π_k p(kM0(mech, mu1)) ]

for the minimum information partition (MIP), the partition of the system into parts for which this quantity is smallest,

where X is our system, mech is that system's mechanism, x1 is a state of the system, and Π_k p(kM0(mech, mu1)) is the product of the probability distributions of each part of the system in the minimum information partition.

It is clear, then, that φ will be high when a large amount of information is generated by the interactions among the parts of a system, as opposed to within the parts themselves.
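
A simplified Python sketch of this calculation is given below. It assumes the whole-system actual repertoire and the per-part repertoires are already known, and it searches the supplied candidate partitions for the one that minimizes the relative entropy. Taking each isolated part's repertoire to be uniform is a simplification specific to this two-element example, and the names phi and kl_divergence are illustrative.

    import numpy as np

    def kl_divergence(p, q):
        p, q = np.asarray(p, float), np.asarray(q, float)
        mask = p > 0
        return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

    def phi(whole_repertoire, candidate_partitions):
        """Relative entropy between the whole system's actual repertoire and the
        product of its parts' repertoires, at the partition that minimizes it
        (the minimum information partition)."""
        best = float("inf")
        for parts in candidate_partitions:
            product = parts[0]
            for part in parts[1:]:
                product = np.outer(product, part).ravel()   # independent parts
            best = min(best, kl_divergence(whole_repertoire, product))
        return best

    # Two-element example (state order 00, 01, 10, 11): the only total partition
    # splits the two elements apart, and each isolated element constrains nothing
    # about its own past here, so each part repertoire is uniform.
    whole = np.array([0.0, 0.0, 0.5, 0.5])
    partitions = [[np.array([0.5, 0.5]), np.array([0.5, 0.5])]]
    print(phi(whole, partitions))   # 1.0 bit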

Complexes

A complex is a set of elements that generates integrated information and that is not fully contained in a larger set of elements with higher φ.

This leads naturally to the notion of a main complex, which is the complex in a system that generates the largest amount of φ. Note that a main complex can partially contain complexes of lower φ within it.
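
Assuming a function phi_of that returns the integrated information of any subset of elements (its implementation is not shown here), the main complex can be located by brute-force enumeration, as in this illustrative sketch; note that the search is exponential in the number of elements.

    from itertools import combinations

    def main_complex(elements, phi_of):
        """Return the subset of elements with the highest phi, together with that
        phi value; phi_of maps a tuple of element labels to its integrated
        information."""
        best_subset, best_phi = None, float("-inf")
        for size in range(1, len(elements) + 1):
            for subset in combinations(elements, size):
                value = phi_of(subset)
                if value > best_phi:
                    best_subset, best_phi = subset, value
        return best_subset, best_phi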

Quality of consciousness

We begin by defining a multi-dimensional space called qualia space, or Q-space.
This space has an axis for every possible state of the system. A point in this space, then, has a component for every state; if we restrict the components to be numbers from 0 to 1, we can view each component as the probability that the system is in the corresponding state.
Thus a point in Q-space represents a probability distribution.
Now, again using relative entropy, we can measure the amount of information generated by a single connection c within the system with the following equation:

ei(c) = H[ p(X0(mech, x1)) || p(Y0(mech, x1)) ]

where Y is the system with that connection removed.
Thus there are points Y and X in Q-space that correspond to the probability distributions of the system without and with the connection c, respectively. We can then draw a vector from Y to X whose length is ei(c). This vector is associated with the connection c and is called a q-arrow. A q-arrow, then, is a representation of the informational relationship specified by a connection.
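
The following Python sketch (names are illustrative) packages this idea: given the repertoire of the intact system X and the repertoire of the reduced system Y, it returns the arrow's two endpoints in Q-space together with its length, the relative entropy between the two repertoires.

    import numpy as np

    def kl_divergence(p, q):
        p, q = np.asarray(p, float), np.asarray(q, float)
        mask = p > 0
        return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

    def q_arrow(repertoire_with_c, repertoire_without_c):
        """A q-arrow for connection c: the vector from point Y (connection removed)
        to point X (intact system), whose length is the relative entropy between
        the corresponding repertoires."""
        length = kl_divergence(repertoire_with_c, repertoire_without_c)
        return repertoire_without_c, repertoire_with_c, length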

Properties of q-arrows

Context dependency

Q-folds

Entanglement

External links

The source of this article is wikipedia, the free encyclopedia.  The text of this article is licensed under the GFDL.
 