Information is something that imparts knowledge, something that increases what we know. It is difficult to define precisely, and a number of different definitions exist, each capturing a valid aspect of the idea.
The Business Directory [1] defines it as:
"Data that is (1) accurate and timely, (2) specific and organized for a purpose, (3) presented within a context that gives it meaning and relevance, and (4) can lead to an increase in understanding and decrease in uncertainty.
Information is valuable because it can affect behavior, a decision, or an outcome."
The Oxford English Dictionary [2] defines it as:
"The imparting of knowledge in general OR Knowledge communicated concerning some particular fact, subject, or event; that of which one is apprised or told; intelligence, news."
Data is sometimes referred to as information, but the two are not the same. Data is the raw ingredient from which information is formed; information is contrasted with data as that which is obtained by the processing of data.[2]
An example of data would be a collection of clicks to a website. Each click is a data point, but is quite meaningless on its own. Bringing those data points together enables the creation of information: for example, how many clicks were there on a given day, which pages received those clicks, and were there more or fewer clicks than on the same day last week?
There is a process for converting data into information, commonly referred to as the Information Ladder[3]. The information ladder demonstrates the conversion of raw data to increasingly useful information.
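The click example above can be sketched in a few lines of code. The raw data and page names here are hypothetical, chosen only to illustrate how aggregating individual data points yields information such as daily totals and per-page counts:

```python
from collections import Counter
from datetime import date

# Hypothetical raw data: each click is a (date, page) pair,
# meaningless in isolation.
clicks = [
    (date(2024, 5, 6), "/home"),
    (date(2024, 5, 6), "/pricing"),
    (date(2024, 5, 6), "/home"),
    (date(2024, 5, 7), "/home"),
    (date(2024, 5, 7), "/blog"),
]

# Aggregation turns the raw data points into information.
clicks_per_day = Counter(day for day, _ in clicks)
clicks_per_page = Counter(page for _, page in clicks)

print(clicks_per_day[date(2024, 5, 6)])  # 3 clicks on that day
print(clicks_per_page["/home"])          # 3 clicks on /home overall
```

Comparing `clicks_per_day` across weeks would answer the "more or fewer than last week" question in the same way: each step up the information ladder is another aggregation or comparison over the raw data.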
Information theory is the mathematical treatment of the concepts, parameters and rules governing the transmission of messages through communication systems.[4]
The objective of information theory is to determine the informative content of a given object or event and to obtain a better measure of that content. The word “information” derives from the Latin informare, “to give form to”, meaning the imposition of structure on an indefinite set of facts. Since information is neither matter nor energy, it should not be treated as an absolute quantity; like entropy, it must be considered in relative terms. The expression “quantity of information” is a metaphor without any physical property of its own. Information isolated from a system and removed from context has no meaning; it is contextualization that brings information to life. Information is transformed into knowledge only when it has practical utility; without this, it remains mere abstract data, physical or otherwise.
According to Shannon and Weaver (1949), information can be measured as negative entropy, a measure of the order in a system. Information can thus be seen as processed data concerning objects, events or persons, which has meaning for a receiver only if the resulting increase in knowledge reduces the uncertainty in decision-making processes. Langford’s equation defines information according to the following formula:
I = i (D, S, t)
In this formula, I represents the information obtained by a given interpretative mechanism i, which in turn acts on a data set D, relative to a previous knowledge state S, over a period of time t.
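The Shannon–Weaver notion of information as negative entropy can be made concrete with a short sketch. The function below computes the standard Shannon entropy H = -Σ p·log₂(p) over a probability distribution; the probabilities used in the example are illustrative, not from any source in this text:

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits.

    Lower entropy means a more ordered (more predictable) system.
    """
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin is maximally uncertain: exactly 1 bit per toss.
print(shannon_entropy([0.5, 0.5]))    # 1.0

# A heavily biased coin is far more "ordered": about 0.08 bits.
print(shannon_entropy([0.99, 0.01]))
```

The more ordered the distribution, the lower its entropy, which is the sense in which information, as order, acts as negative entropy in a system.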
Intuitively, the amount of information gained by observing a single datum is a function of the distribution of values in D: if you live in the desert and it seldom rains, then observing that the sun is shining provides you with less information than would observing that the ground is moist. This is sometimes thought of as the "surprise factor": observations that are surprising carry more information than observations that are not. To make this more concrete, consider the implications of discovering indigenous penguins in a temperate climate. We already know that penguins can be found in Antarctica and in the region of the Southern Ocean, so observing additional penguins (of any species) in that area doesn't tell us nearly as much as would the (not so theoretical) discovery of a penguin in a temperate region.
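The surprise factor corresponds to what information theory calls self-information (or surprisal), -log₂(p): the rarer an observation, the more bits it carries. The desert example above can be quantified with illustrative probabilities of my own choosing:

```python
import math

def surprisal(p):
    """Self-information -log2(p) of an event with probability p, in bits."""
    return -math.log2(p)

# Illustrative desert climate: sunshine is common, moist ground is rare.
p_sunshine = 0.95
p_moist_ground = 0.05

print(surprisal(p_sunshine))      # ~0.07 bits: barely surprising
print(surprisal(p_moist_ground))  # ~4.32 bits: highly informative
```

A certain event (p = 1) has a surprisal of zero bits: observing something you already knew would happen tells you nothing, which matches the penguin intuition above.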
Whatever the definition, information is an invisible agent that acts as a binding element in all processes of decision-making, regulation and control. In an economically dynamic and technical environment, information plays the fundamental role of supporting efficient decision making (Pinheiro, 1986). Information must be screened, condensed, stored, transmitted, received, aggregated and integrated. All these actions rest on the fact that, despite its infinite variety, information for human consumption can be organized in only a limited number of forms (Senn, 1989; Blethyn, 1990).
Organization is the result of the interaction of information with matter and energy. When information is added to matter, some kind of structure or organization always appears; that is, organization can be viewed as stored information. Beyond organizing matter and energy, information is also able to structure itself. The more organized a process or structure is, the less information is needed to fully describe it. Conversely, disorganization is always associated with an increase in system entropy (Stonier, 1990). The more disorganized a system is, the more information can potentially be extracted from it, since there is no certainty that it does not contain hidden patterns (Rodrigues, 1991).