
I will take the definition of "information" used in the field of Information Theory, which, as I understand it, says that information is the loss of uncertainty (e.g. while a coin is flipping I have zero information [high uncertainty], but when the coin stops flipping I gain information [loss of uncertainty]). I think that this definition of "information" applies in any context.
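As a minimal sketch of that intuition (this code is my own illustration, not part of the question; the entropy helper and the probabilities are assumed for the example), Shannon entropy quantifies the uncertainty before the outcome is known and drops to zero once the coin has landed; the difference is the information gained:

    import math

    def entropy(probabilities):
        """Shannon entropy, in bits, of a discrete distribution."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # While the fair coin is still spinning: maximal uncertainty.
    before = entropy([0.5, 0.5])   # 1.0 bit

    # After it lands (say, on heads): no uncertainty remains.
    after = entropy([1.0, 0.0])    # 0.0 bits

    print(before - after)          # 1.0 bit of information gained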

I will take any known definition of the word "intelligence" (as provided by this Wikipedia page: http://en.wikipedia.org/wiki/Intelligence; one's capacity for logic, abstract thought, understanding, self-awareness, communication, learning, emotional knowledge, memory, planning, creativity and problem solving).

One idea that I have is that the existence of intelligence (in any way you want to define it) implies the existence of information (if there is intelligence in something then that thing contains information) [i.e. the intelligent thing has something in it that is not completely uncertain].

Any other ideas on how I can relate loss of uncertainty (information) to intelligence in a general context?


3 Answers


While connecting "information" and "intelligence" is an irresistible puzzle, I suspect it is largely an apples-and-oranges update of mind-body problems.

Information is, as Shannon insists, strictly physical. It reduces "uncertainty," thus relocating a "message" from one place and medium to another, within a predetermined context of "meanings." All syntax, no semantics. It has no more special relation to "mental capacities" than, say, the theory of gravity.

Nonetheless, the allure remains, and there have been efforts to develop a physical semantics or theory of "meaning" extending the theory of information. One involves, I seem to recall, a second-tier complexity of "information" that contains keys to "self-interpretation." This may be towards the end of Gleick's book, but I have no references at hand and only a hazy recollection.

Because "intelligence" remains a "ghostly" property, you may want to consider first expanding "information" to the social operations of "communication," as in the systems theory of N. Luhmann, who defines "meaning" as an interaction between "actual" and "possible." While this clearly echoes the relation of "certainty" to "uncertainty" in information theory, it is also reminiscent of Hegel's "actualization of the rational" and "rationalization of the actual."

Sorry this is a bit vague, but I am actually just digging into this myself. Again, any neat conversion function between quantifiable "information" and whatever quality or capacity is meant by "intelligence" seems to run aground, for now, on the same old antinomies hampering computational reductions of "mind."


Intelligence is based on information. Einstein was intelligent, but he was not when he was a baby, as he did not yet know anything about the world.

However, is Wikipedia intelligent? Can someone demonstrate its creativity? Its emotions? I doubt it! So the converse seems wrong according to your definition of intelligence.

Minsky offers interesting concepts for answering your question. According to him, intelligence is the outcome of a long process.

First, sensors (the eyes, for example) extract information from the world. The information has to be sorted and categorized, following Kant. Then you can learn from "resources" inside your mind (see Minsky's "The Society of Mind"). This is where information can be lost. Finally, the organization of your resources will define your intelligence.

Consequently, there is no inclusion of intelligence in information, nor the reverse. Information is only a raw input to the mental process that creates intelligence.


The definition of information you mention is about loss of uncertainty, but more specifically it is about the amount of uncertainty lost, which corresponds to the amount of information gained. There is more uncertainty in the throw of a die than in the flip of a coin (1 out of 6 vs. 1 out of 2), hence a signal carrying the outcome of a die roll carries more information than a signal carrying the outcome of a coin toss. From Wikipedia (a short calculation sketch follows these examples):

  • On tossing a coin, the chance of 'tail' is 0.5. When it is proclaimed that indeed 'tail' occurred, this amounts to I('tail') = log2 (1/0.5) = log2 2 = 1 bit of information.
  • When throwing a fair die, the probability of 'four' is 1/6. When it is proclaimed that 'four' has been thrown, the amount of self-information is I('four') = log2 (1/(1/6)) = log2 (6) = 2.585 bits.
  • When, independently, two dice are thrown, the amount of information associated with {throw 1 = 'two' & throw 2 = 'four'} equals I('throw 1 is two & throw 2 is four') = log2 (1/P(throw 1 = 'two' & throw 2 = 'four')) = log2 (1/(1/36)) = log2 (36) = 5.170 bits. This outcome equals the sum of the individual amounts of self-information associated with {throw 1 = 'two'} and {throw 2 = 'four'}; namely 2.585 + 2.585 = 5.170 bits.
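A minimal sketch of these calculations (the helper name self_information is my own, not from the Wikipedia excerpt) reproduces the numbers above:

    import math

    def self_information(p):
        """Self-information, in bits, of an outcome with probability p."""
        return math.log2(1 / p)

    print(self_information(1/2))        # coin shows 'tail'        -> 1.0 bit
    print(self_information(1/6))        # die shows 'four'         -> ~2.585 bits
    print(self_information(1/36))       # two dice: 'two' & 'four' -> ~5.170 bits
    print(2 * self_information(1/6))    # additivity check         -> ~5.170 bits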

Signals coming from an intelligent agent will contain more information than signals coming from a non-intelligent agent.
