Information

ASCII codes for each of the characters of the word "wikipedia" represented in binary, the numeral system most commonly used to encode information on computers.

Information is the name given to an organized set of processed data that constitutes a message and changes the state of knowledge of the subject or system that receives it. There are various approaches to the study of information:

  • In biology, information is considered a sensory stimulus that affects the behavior of individuals.
  • Many definitions compete when it comes to pinning down its content. According to Ivis Goñi Camejo, «information has not been defined only from a mathematical or technical point of view; its conceptualization encompasses philosophical, cybernetic and other approaches based on the so-called information sciences.»
  • In social communication and journalism, information is understood as the set of messages exchanged by the individuals of a society for specific purposes.

Sensory data, once perceived and processed, constitute information that changes the state of knowledge, which allows the individuals or systems possessing that new state of knowledge to make pertinent decisions in accordance with it.

From the point of view of computer science, information is explicit knowledge extracted by living beings or expert systems as a result of interaction with the environment or of sense perceptions of that environment. In principle, information, unlike data or sense perceptions, has a useful structure that modifies the subsequent interactions between its holder and the environment.

Etymology

The word «information» derives from the Latin noun informatio(-nis) (from the verb informare, meaning «to shape the mind», «to discipline», «to instruct», «to teach»). Already in Latin the word was used to indicate a «concept» or an «idea», but it is not clear whether it influenced the modern development of the word «information».

On the other hand, the corresponding Greek word was μορφή (morphé, from which the Latin word «forma» arose by metathesis), or else εἶδος (eîdos, from which the Latin «idea» derives), that is: «idea», «concept» or «form», «image»; the second word was famously used in a technical sense in the philosophical field by Plato and Aristotle to indicate the ideal identity or essence of something (see Theory of ideas). Eîdos can also be associated with «thought», «assertion» or «concept».

Information in society

In human societies, and in part in some animal societies, information affects the relationships between individuals. In a society, the behavior of an individual towards others can change depending on what information is available to that individual. For this reason, the social study of information concerns how behavior varies with the information one possesses.

For Gilles Deleuze, social information is a system of control, insofar as it is the propagation of slogans that we are supposed to believe or make others believe. In this sense, information is an organized set of data capable of changing the state of knowledge in the direction of the instructions transmitted.

Main characteristics of information

In general, information has an internal structure and can be classified according to several characteristics:

  • Meaning (semantics): From the meaning extracted from a piece of information, each individual (or expert system, a system capable of emulating human reasoning) evaluates the possible consequences and adjusts its attitudes and actions consistently with those consequences. The entity's expectations follow from the literal meaning of the information.
  • Importance (relative to the receiver): That is, whether it deals with an important matter. The importance of information for a receiver refers to how much it changes the receiver's attitude or behavior. In modern societies, individuals obtain a great deal of information from the mass media, a large part of which is unimportant to them because it barely alters their behavior. Importance refers to how much future quantitative expectations should be altered: sometimes it is known that a fact makes some things less likely and others more likely, and the importance has to do with how much less likely some alternatives become relative to the others.
  • Currency (in the space-time dimension): It refers to whether the information is up to date or outdated. In practice, the currency of a piece of information is difficult to assess, since accessing information does not in general immediately reveal whether it is still current or not.
  • Validity (relative to the issuer): It is assessed whether the issuer is reliable or may provide invalid (false) information. It has to do with whether the signals should be taken into account when revising expectations or should be ignored because they are not reliable indications.
  • Value (an intangible, volatile asset): The usefulness of that information for the recipient.

History of information

The history of information is associated with its production, treatment and transmission. A chronology of that detailed history can be:

  • 5th to 10th centuries - Early Middle Ages. The storage, access and limited use of information takes place in the libraries of monasteries, where texts are copied by hand (amanuensis work).
  • 12th century. The Incas (Peru) use a system of knotted strings called the quipu to record numerical information, used mainly to count livestock.
  • 15th century. Modern age. With the birth of the printing press in Europe (Gutenberg), books begin to be produced in series. The first newspapers emerge.
  • 20th century. 1926. The first television broadcasts begin to affect the handling and treatment of information, with a great impact on the methods of social communication throughout the century.
  • 20th century. 1940. Jeremy Campbell defines the term information from a scientific perspective, in the context of the era of electronic communication.
  • 20th century. 1943. The Austro-Hungarian Nikola Tesla invents the radio, although the invention is initially attributed to Guglielmo Marconi and his patent is not recognized until the 1960s.
  • 20th century. 1947. In December, John Bardeen, Walter Houser Brattain and William Bradford Shockley invent the transistor, for which they will receive the Nobel Prize in Physics in 1956. Without knowing it, they have just laid the first of the two foundations of a new technological and economic revolution, triggering an exponential increase in the capacity of microelectronic integration, in the popularization of computing and in computing power.
  • 20th century. 1948. Claude E. Shannon elaborates the mathematical basis of information theory. He has just laid the second foundation of the revolution in information and communication technologies: the application of Boolean algebra will be the mathematical basis for industrializing the processing of information. Thus computer science, or computer engineering, is born. The new economic revolution is served: humanity enters the digital era, using the transistor and binary numeration to symbolize, transmit and share information.
  • 20th century. 1948. Norbert Wiener elaborates the idea of cybernetics in his famous work Cybernetics or Control and Communication in the Animal and the Machine (1948), in which cybernetics is charged with «keeping order» in any natural or artificial information system.
  • 20th century. 1951-1953. James Watson and Francis Crick discover the principles of the DNA code, which constitutes an information system based on the DNA double helix and the way genes work.
  • 20th century. 1969. In the context of the Cold War and the countercultural movement of the 1960s, the embryonic Internet is born when the first connection of computers, known as ARPANET, is established between four American universities (three in California and one in Utah), with the initial objective of providing a bomb-proof network of military communications. Its expansion and popularization, and the democratization of knowledge it facilitates, will radically transform economic, social and cultural relations in an increasingly interdependent world.
  • At present, already in the 21st century and in a short span of time, the developed world has sought to globalize access to the enormous volumes of existing information, held in increasingly complex media, with exponentially growing storage capacities on ever smaller supports. Despite this, many sources of information remain in non-digital formats or are digitally inaccessible for various reasons. In this framework, the proliferation of data and information transmission networks and of databases accessible online via the Internet makes it possible to reach other networks and information centers of different kinds at any time and from any place. It is the result of data managed through computer applications, where data are processed and transformed into information that is subsequently handled as an integrating and characteristic sign of the economic progress of the 21st century.

Uses of information

It is considered that the generation and/or obtaining of information pursues these objectives:

  • Increase or improve the user's knowledge, or in other words reduce the existing uncertainty about the set of logically possible alternatives.
  • Provide decision makers with the basic raw material for developing solutions and making choices.
  • Provide a series of evaluation rules and decision rules for control purposes.

In relation to the third point, information, as a way of reaching knowledge, must be elaborated to make it usable or available (this empirical process is called documentation, and it has its own methods and tools); but information by itself cannot provide the individual with more knowledge, since it is the individual who must assess its significance: it is they who organize it and turn it into knowledge. Data, so to speak, is the "prefix" of information, that is, a prior element necessary for information to be obtained.

Journalistic information

A news item is the account or write-up of an informative text that one wants to make known, with its own rules of construction (enunciation), referring to a novel or atypical fact (or to the relationship between novel and/or atypical facts) that has occurred within a given community or specific area, and that makes it worth disclosing.

Information and the State

Control and manipulation of information is one of the most powerful means governments use to promote compliance with their policies. Thus, totalitarian and authoritarian states seek a monopoly on information in order to promote compliance with their policies. Information aims to make facts known in an effective and impartial way, while propaganda seeks to win followers to achieve an objective without caring about the veracity of the facts. Propaganda thus competes with law as an instrument of power.

Information theory

The information theory approach analyzes the mathematical and statistical structure of messages, independently of their meaning or other semantic aspects. The aspects of interest to information theory are the transmission capacity of channels, data compression, and the detection and correction of errors.
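As a minimal, hypothetical illustration of the error-detection side of the field (not any particular standard code), the sketch below appends an even-parity bit to a block of data bits; a single bit flipped on the channel is then detected, although not located or corrected:

    # Single-bit error detection with an even parity bit (illustrative toy).
    def add_parity(bits):
        """Append a bit so that the total number of 1s is even."""
        return bits + [sum(bits) % 2]

    def parity_ok(bits):
        """Return True if the received word still has even parity."""
        return sum(bits) % 2 == 0

    word = [1, 0, 1, 1, 0, 0, 1]       # 7 data bits
    sent = add_parity(word)            # 8 bits travel over the channel
    print(parity_ok(sent))             # True: nothing wrong detected

    received = sent.copy()
    received[3] ^= 1                   # simulate channel noise flipping one bit
    print(parity_ok(received))         # False: the single-bit error is detected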

Mathematical characterization

One way to characterize our state of knowledge of the world is through probabilities. If we know that $n$ different things $A_1,\dots,A_n$ can happen in the future, each with probability $p_1,\dots,p_n$, that set of probabilities constitutes our knowledge of the world; information should reduce our uncertainty, changing the probabilities to $\tilde{p}_1,\dots,\tilde{p}_n$. If the second state has less uncertainty, it is because some things have become more likely at the expense of other alternatives that have become less likely.

One way to "measure the information" associated with a message or observed fact is to calculate how some probabilities have increased and others have decreased. A convenient measure of calculating the "concentration" of certainty in some alternatives is the statistical entropy:

$$
\begin{aligned}
S_0 &= -\sum_{i=1}^{n} p_i \log_2 p_i && \text{(initial uncertainty)}\\
S_f &= -\sum_{i=1}^{n} \tilde{p}_i \log_2 \tilde{p}_i && \text{(final uncertainty)}\\
I &= -(S_f - S_0) \geq 0 && \text{(information)}
\end{aligned}
$$

A linguistic example illustrates this well. Suppose we are asked to guess the second letter of a Spanish word, and we are told that any letter of the alphabet could appear in the second position. The initial uncertainty is then obtained by finding the probability of occurrence of each letter and computing:

$$S_0 = -p_a\log_2 p_a - p_b\log_2 p_b - \dots - p_y\log_2 p_y - p_z\log_2 p_z \approx 4.04\ \text{bits}$$

However, if we are given the clue that «the first letter is a Z», then only A, O or U can appear in the second position (apart from a handful of exceptional cases with E and I), and therefore with this information the uncertainty is greatly reduced:

$$S_f = -p_a\log_2 p_a - p_o\log_2 p_o - p_u\log_2 p_u \approx 0.91\ \text{bits}$$

The information quantified for the clue «the first letter is a Z» turns out to be:

$$I = -(S_f - S_0) \approx 3.13\ \text{bits}$$

The units of information are bits because base-2 logarithms have been used. If the clue had been «the first letter is an M», the second letter could only have been A, E, I, O or U, which is a larger set; in that case $S_f \approx 1.54$, and the clue carries less information because it reduces the uncertainty less: repeating the previous steps, the result in this case is about 2.50 bits.
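A minimal sketch of how these quantities can be computed, assuming purely illustrative probability distributions (the 4.04 and 0.91 bits above rely on real Spanish letter frequencies, which are not reproduced here, so the numbers below differ):

    import math

    def entropy_bits(probs):
        """Shannon entropy in bits: S = -sum(p * log2(p)), ignoring zero-probability terms."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Illustrative stand-ins, not measured letter frequencies.
    p_initial = [1 / 27] * 27          # uniform over a 27-letter alphabet
    p_after_clue = [0.5, 0.3, 0.2]     # say: A, O, U after an initial «Z»

    S0 = entropy_bits(p_initial)       # ~4.75 bits under the uniform assumption
    Sf = entropy_bits(p_after_clue)    # ~1.49 bits under the assumed distribution
    I = S0 - Sf                        # information carried by the clue
    print(f"S0 = {S0:.2f} bits, Sf = {Sf:.2f} bits, I = {I:.2f} bits")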

Basic operations with information

Four basic operations on information emerge; they are the pillars on which, like the four basic operations of arithmetic (+, −, ×, ÷), several semesters of university study can be built. These are:

It is on these four operations, applied to information both at rest and in transit, that all complex operations and software developments are built. They are also the basis of any kind of computer attack or cyberattack.

Information retrieval

The amount of information and knowledge produced is apparently enormous, and its retrieval, spread across a very large (in the limit, effectively unbounded) number of supports and sites, requires a methodology; the systematic retrieval model must maximize the search so as to ensure as complete a capture as possible within this complex environment. In the case of Internet searches using two or more descriptors, the numerical results returned by search engines for documents containing the two or more terms together or very close to each other are already a measure of the amount of information obtained, expressed mathematically as the natural logarithm (ln) of the sum of the validated interactions. Values of 2 or 3 are optimal.
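Read literally, the measure described above can be sketched as follows, with purely hypothetical hit counts standing in for the «validated interactions»:

    import math

    # Hypothetical counts of documents containing the search descriptors together.
    validated_interactions = [4, 6, 10]
    measure = math.log(sum(validated_interactions))   # natural logarithm of the sum
    print(f"ln({sum(validated_interactions)}) = {measure:.2f}")  # ~3.0, within the 2-3 range the text calls optimal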

Information and Physics

In physics there is an intimate connection between entropy and information:

  • In statistical physics, a macrostate or macroscopic situation can correspond, from a microscopic point of view, to several microstates; that is, several different microstates may be perceived in general macroscopic terms as the same overall situation or macrostate. Some macrostates correspond to only a relatively small number of microstates; others, on the contrary, correspond to a larger number. Entropy is a physical magnitude that measures the number of microstates corresponding to a macrostate. The macrostates with greater entropy correspond to more microstates, and therefore, given the macrostate, there is greater uncertainty about the actual microstate of the system. Entropy thus measures the lack of knowledge of the microstate, so the information needed to know the microstate of a system whose macrostate is known coincides with the entropy. When entropy decreases, we can say that our knowledge of the microstate of the system has increased, or that there is less uncertainty about it (see the sketch after this list).
  • In the theory of relativity it is assumed that no signal carrying information can travel faster than light; otherwise the principle of causality could be violated. No examples have been found of systems that allow information to be transmitted faster than light.
  • In quantum mechanics, many authors accept the idea that the information about a physical system is not destroyed even if it can become inaccessible. When a measurement of the state of a system is made, the wave function collapses, so that successive measurements on the system do not allow the state of the system before the measurement to be recovered. However, the environment of the system and the measuring apparatus evolve into a state that does contain that information, even though it is not recoverable. In the last two decades of the 20th century there was a debate about whether or not information is lost in black holes. On one side, Stephen Hawking, Roger Penrose and others argued that it is, while others such as Gerardus 't Hooft and Leonard Susskind argued that it is not; the latter went on to formulate the idea of the holographic principle, according to which the event horizon of a black hole would preserve the information about the physical state of the matter falling into it. Furthermore, in quantum mechanics it is possible to transmit signals faster than light, as the experiments on quantum entanglement by Alain Aspect showed; however, these superluminal signals do not seem to be able to transmit information.
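A toy illustration of the first point, using coin flips instead of a physical system and measuring entropy in bits (base-2 logarithm) rather than through Boltzmann's constant:

    import math

    # Toy model: a "microstate" is the full heads/tails sequence of N coins,
    # a "macrostate" is just the number of heads. The entropy of a macrostate
    # is taken here as log2 of the number of microstates compatible with it.
    N = 10
    for heads in (0, 2, 5):
        omega = math.comb(N, heads)    # compatible microstates
        S = math.log2(omega)           # missing information about the microstate, in bits
        print(f"{heads} heads out of {N}: {omega} microstates, S = {S:.2f} bits")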

Information and negentropy

Léon Brillouin published Science et théorie de l'information in 1959 (the English version was first published in 1962), examining the relations between these two disciplines. He adopts in particular a physical point of view and draws the link between Shannon's informational entropy and Boltzmann's statistical entropy, venturing that information (and with it language) is a negentropic factor, that is, one by which entropy can be cancelled.

Analog and digital information

This section refers to the storage, encoding and transmission of information using analog and digital signals. In the case of analog encoding, the data are translated into electrical impulses, so that the flow of information is continuous and varies only in amplitude. Digitally encoded information, on the other hand, is information whose encoding is reduced to two values, 0 and 1; that is, it is translated into the binary system. All information of this kind, whether it is music, images, text, etc., has the same nature.
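As a small illustration of this reduction to two symbols, the following sketch turns the word «wikipedia» into 8-bit ASCII codes, matching the figure described at the start of the article:

    # Each character of "wikipedia" becomes its 8-bit ASCII code; whatever the
    # content (text, music, images...), digital encoding reduces it to 0s and 1s.
    text = "wikipedia"
    bits = " ".join(format(ord(ch), "08b") for ch in text)
    print(bits)
    # 01110111 01101001 01101011 01101001 01110000 01100101 01100100 01101001 01100001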

Let us look at the differences, and therefore the advantages and disadvantages, of one system or the other.

In the first place, since digital information is encoded with the same code regardless of format, it can be transmitted or copied an infinite number of times without damage or loss, because it simply has to be reconstructed by the appropriate software; it is only numbers. Analog information, on the contrary, is not as precise, since it loses quality when copied.

Another disadvantage of analog information is that it occupies physical space, is static, and can be difficult to preserve over the years. Digital information, however, occupies no physical place of its own, as it resides inside devices capable of reading it, such as a computer. In addition, it is data that can flow very easily and can be preserved for a very long time; it does not deteriorate, although the medium that stores it may fail. In fact, unless deliberately destroyed, it lasts indefinitely.

Another difference between the two types of information is how easy or hard it is to share. Analog information is more difficult to disseminate, which has the advantage of making it much easier to control. Digital information, being so easy to share today (via the Internet, for example), is very difficult to stop from circulating once it has begun to do so, so our control over it is much smaller.

Finally, digital information is processed much faster by devices such as computers or hard drives, which facilitates the processing of information. Moreover, because all formats share a common nature, text, image, video, etc. can be combined in the same file.
