
Most scientists today have abandoned attempts to give a strict definition of information and believe that information should be considered a primary, undefinable concept, like the concept of a set in mathematics. Some textbook authors offer the following definitions of information:

Information is knowledge or facts about someone or something.

Information is data that can be collected, stored, transmitted, processed, and used.

Computer science is the science of information:

the science of the structure and properties of information and of the methods of collecting, processing, and transmitting it.

Computer science studies the technology of collecting, storing, and processing information; the computer is the main tool of this technology.

The term information comes from the Latin word informatio, which means information, explanation, presentation. Currently, science is trying to find general properties and patterns inherent in the multifaceted concept of information, but so far this concept largely remains intuitive and receives different semantic content in various branches of human activity:

1. In everyday life, information is any data or knowledge that interests someone: for example, a report about some events or about someone’s activities;

2. In technology, information is understood as messages transmitted in the form of signs or signals (in this case there is a source of messages, a recipient (receiver) of messages, and a communication channel);

A signal is a means of transmitting information: a physical process that carries informational value. It can be continuous or discrete.
A signal is called discrete if it can take only a finite number of values at a finite number of moments in time.

An analog signal is a signal that changes continuously in amplitude and time.
Signals carrying textual or symbolic information are discrete.
Analog signals are used in telephone communication, radio broadcasting, and television.
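As a sketch of this distinction (the function and all values here are illustrative, not from the manual), a continuous signal can be made discrete by sampling it at a finite number of moments and rounding each sample to one of a finite number of levels:

```python
import math

def sample_and_quantize(f, t_start, t_end, n_samples, n_levels, lo, hi):
    """Turn a continuous signal f(t) into a discrete one: sample it at
    n_samples moments and round each value to one of n_levels levels."""
    step = (t_end - t_start) / (n_samples - 1)
    levels = []
    for k in range(n_samples):
        v = f(t_start + k * step)
        # map the continuous value in [lo, hi] to a level index 0..n_levels-1
        idx = round((v - lo) / (hi - lo) * (n_levels - 1))
        levels.append(min(n_levels - 1, max(0, idx)))
    return levels

# one period of a sine wave, 8 samples, 4 discrete levels
levels = sample_and_quantize(lambda t: math.sin(2 * math.pi * t),
                             0.0, 1.0, 8, 4, -1.0, 1.0)
print(levels)
```

The discrete signal takes only the finitely many values 0..3 at finitely many moments, whereas the original sine wave passes through infinitely many intermediate values.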

Talking about information in general, rather than in relation to some specific kind of it, is pointless.

Information can be classified:

By methods of perception (visual, tactile, etc.);

By presentation form (text, numeric, graphic, etc.);

By social significance (mass, special, personal).

Data is the presentation of facts and ideas in a formalized form suitable for transmission and processing in some information process.

Originally, data meant given quantities: quantities specified in advance, together with the conditions of a problem. The opposite of data is variables.

In computer science, data is the result of recording or displaying information on some material medium, that is, a representation of information registered on the medium, regardless of whether this information has reached any receiver and whether it is of interest to that receiver.

Traditionally, two ways of organizing data are used:

- Text data (a text file in a file system, a string data type in programming): a sequence of alphabetic characters represented in some encoding.

- Binary data: a sequence of bytes. "Binary" organization is not a way of organizing data per se but rather an umbrella term for formats other than text.

Binary data may include a variety of elements, such as machine (or other executable) code, numeric data, condition codes, bitmaps, locations of other data, text fragments (see above), and compressed or random data.
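A minimal illustrative sketch (the string value is hypothetical) of the relationship between text and binary data: the same information viewed as a character sequence and as the byte sequence of its encoding:

```python
# The same information as text (characters) and as binary data (bytes).
text = "data"
raw = text.encode("utf-8")     # text -> a sequence of bytes
print(list(raw))               # the numeric byte values of the encoding
print(raw.decode("utf-8"))     # bytes -> the same text again
```

The round trip works only because both sides agree on the encoding; the raw bytes alone are just binary data.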

The issues of defining the concept of “information” are considered in the following sections:

1.1. Definition of information
1.2. Quantitative measure of information (- What is the magnitude or amount of information; - Shannon's formula; - Bit and byte; - Expert methods for assessing information and the formation of new measures of information)
1.3. Classification of information (- By coding method; - By area of origin; - By method of transmission and perception; - By public purpose)
1.4. Properties of information (- Attributive properties of information; - Pragmatic properties of information; - Dynamic properties of information)
2. What is computer science
2.1. Definition of computer science
2.2. Main components (- Theoretical computer science; - Semiotics; - Cybernetics; - Analog and digital information processing)
2.3. Some definitions.

Introduction

The problem of teaching computer science at the initial stage, both in the senior classes of secondary school and in the first years of higher education, causes much controversy. Until recently, one of the main tasks was considered to be a general familiarity with computer technology and the ability to program in one of the simplest languages (usually School Algorithmic Language, BASIC, or Pascal). This orientation produced a bias toward programming: students began to associate the word “computer science” with the word “programming”. This methodological manual attempts to explain the concepts of computer science and information so that they can be used by specialists in the humanities. Students should be able to operate with information of any kind: linguistic, visual, musical. The manual will help them begin to acquire skills in processing and systematizing information and in navigating information networks.



1.1. Definition of information


The concept of “information” is widely used in the everyday life of the modern person, so everyone has an intuitive idea of what it is. But when science takes up a well-known concept, it clarifies it, adapts it to its goals, and limits the use of the term to the strict framework of a specific scientific field. This is how physics defined the concept of force, and the physical term “force” no longer means what is meant by phrases such as “willpower” or “the power of the mind”. At the same time, science, by studying a phenomenon, expands a person’s understanding of it. For a physicist, for example, the concept of force, even limited to its strict physical meaning, is much richer and more meaningful than for those ignorant of physics. Thus, the concept of information, having become the subject of study in many sciences, is specified and enriched in each of them. The concept of information is one of the basic concepts of modern science and therefore cannot be strictly defined through simpler concepts. One can only explain and illustrate its meaning by turning to various aspects of the concept. Human activity involves the processing and use of materials, energy, and information. Accordingly, scientific and technical disciplines developed that reflect the issues of materials science, energetics, and computer science. The importance of information in the life of society is growing rapidly, methods of working with information are changing, and the scope of application of new information technologies is expanding. The complexity of the phenomenon of information, its diversity, its breadth of scope, and its rapid development are reflected in the constant emergence of new interpretations of the concepts of computer science and information.
Therefore, there are many definitions of the concept of information, from the most general philosophical - “Information is a reflection of the real world” to the narrow, practical - “Information is all information that is the object of storage, transmission and transformation.”


For comparison, we also present some other definitions and characteristics:


  1. Information - the content of a message or signal; information considered in the process of its transmission or perception, allowing to expand knowledge about the object of interest.

  2. Information is one of the fundamental entities of the world around us (Academician Pospelov).

  3. Information, originally, is facts transmitted by some people to other people orally, in writing, or in some other way (TSB, the Great Soviet Encyclopedia).

  4. Information is reflected diversity, that is, a violation of monotony.

  5. Information is one of the main universal properties of matter.

By information one should understand not objects and processes themselves but their reflection or display in the form of numbers, formulas, descriptions, drawings, symbols, and images. Information itself can be assigned to the realm of abstract categories, like mathematical formulas, but working with it is always associated with the use of some material and with the expenditure of energy. Information is stored in the rock paintings of ancient people on stone, in the texts of books on paper, in paintings on canvas, in musical recordings on magnetic tape, in the data in a computer’s random access memory, in the hereditary DNA code of every living cell, and in a person’s memory in the brain. To record, store, process, and distribute it, materials are needed (stone, paper, canvas, magnetic tape, electronic storage media, and so on), as well as energy, for example to drive printing machines, to create an artificial climate for storing masterpieces of fine art, to power the electronic circuits of a calculator, and to support the operation of transmitters at radio and television stations. Success in the modern development of information technology is primarily associated with the creation of new materials that form the basis of the electronic components of computers and communication lines.


1.2. Quantitative measure of information


What is the magnitude or amount of information


A person tries to characterize each object or phenomenon by its magnitude in order to compare it with similar ones. This cannot always be done simply and unambiguously. Even the size of a physical object can be assessed in different ways: by volume, weight, mass, number of constituent elements, or cost. That is why even a simple question such as “Which is bigger, a kilogram weight or a child’s balloon?” can be answered differently. The more complex and multifaceted a phenomenon is, and the more characteristics it has, the more difficult it is to find a definition of its magnitude that satisfies everyone involved with it. Likewise, the amount of information can be measured in different ways: in the number of books, pages, characters, meters of film, or tons of archival materials, in kilobytes of computer RAM, and also assessed by a person’s emotional perception, by the benefit received from possessing the information, or by the costs of processing and systematizing it. Try to evaluate where there is more information: in Einstein’s formula E = mc², which underlies the physics of the hydrogen bomb, in Aivazovsky’s painting The Ninth Wave, or in the daily television news program. Apparently the easiest way to estimate the amount of information is by how much space is needed to store it, having chosen a single method of representing and storing information. With the development of computers, encoding information using the digits 1 and 0 became such a unified method. Coding here means rewriting information from one method of representation into another. The number of positions (called binary positions) containing only the digit 1 or 0 that are required to record a message directly is one of the criteria for the amount of information and is called the information volume in bits.
To record one character (a letter, digit, space between words, or punctuation mark) in a computer, 8 binary positions are most often used; this is called a byte. Thus, the phrase “Snow White and the Seven Dwarfs” consists of 26 letters and 5 spaces between words (31 characters without the quotation marks) and will occupy 31 bytes, or 248 bits, in computer memory. A message can also be recorded not directly but in compressed form, i.e., encoded with fewer bits. This is done through special processing and analysis of the frequency of occurrence, the location, and the number of characters in the message. In practice, a person also compresses messages based on their meaning. For example, the long 36-byte message “one thousand nine hundred ninety-six” can be compressed into four characters: “1996”. Information was first used as a scientific concept in library science and the theory of journalism. It then became the subject of the science of the optimal coding of messages and the transmission of information over technical communication channels.
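The byte counting described above can be sketched as follows (a hypothetical illustration assuming a one-byte-per-character encoding such as ASCII):

```python
# Direct (uncompressed) measure of information volume, assuming
# a one-byte-per-character encoding.
phrase = "Snow White and the Seven Dwarfs"
n_bytes = len(phrase)   # one byte per character
n_bits = n_bytes * 8
print(n_bytes, n_bits)

# Compression by meaning: the same number written out versus in digits.
long_form = "one thousand nine hundred ninety-six"
short_form = "1996"
print(len(long_form), len(short_form))
```

The written-out number takes 36 bytes while the four digits take 4, which is the kind of meaning-based compression the text describes.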


Shannon's formula


In 1948 Claude Elwood Shannon proposed information theory, which gave a probabilistic-statistical definition of the concept of the amount of information. In Shannon’s theory, each signal is assigned a probability of its occurrence. The less likely the occurrence of a particular signal, the more information it carries for the consumer. Shannon proposed the following formula for measuring the amount of information:



I = −Σ pᵢ log₂ pᵢ  (the sum is taken over i from 1 to N)



where I is the amount of information; pᵢ is the probability of occurrence of the i-th signal;


N is the number of possible signals.


The formula shows how the amount of information depends on the number of possible events and on their probabilities. The information is zero if only one event is possible; as the number of events increases, the information increases. The unit of information corresponding to I = 1 is called a “bit”; the bit is the basic unit of information.
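Shannon’s formula can be sketched in code as follows (an illustrative implementation; the function name and example probabilities are assumptions, not from the manual):

```python
import math

def shannon_entropy(probs):
    """Amount of information (in bits) by Shannon's formula
    I = -sum(p_i * log2(p_i)); terms with p_i == 0 contribute nothing."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([1.0]))        # a single certain event carries 0 bits
print(shannon_entropy([0.5, 0.5]))   # two equally likely events: 1 bit
print(shannon_entropy([0.25] * 4))   # four equally likely events: 2 bits
```

The examples confirm the statements above: one possible event gives zero information, and the amount grows with the number of equally likely events.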


Bit and byte


In technology, two outcomes are possible, coded as follows: the digit one, “1”, means “yes”, “on”, “current flows”; the digit zero, “0”, means “no”, “off”, “no current flows”. The digits 1 and 0 are the symbols of the simplest sign system. Each sign or symbol of the binary number system carries one bit of information. Of particular importance for measuring the volume of symbolic information is a special unit, the byte: 1 byte = 8 bits, which corresponds to an eight-digit binary number. Why 8? It happened that way historically. The volume of information is also measured in units derived from the byte: KB, MB, and GB. The prefixes “K”, “M”, and “G” do not mean exactly what “kilo”, “mega”, and “giga” mean in physics, although they are often called that. In physics “kilo” means 1000, while in computer science “K” means 1024, since this number is more natural for computers. Computers use the number 2 as the base of their arithmetic, just as a person uses the number 10. Therefore the numbers 10, 100, 1000, and so on are convenient for humans, while the numbers 2, 4, 8, 16, and finally 1024, obtained by multiplying two by itself ten times, are “convenient” for computers.


1 kilobyte (KB) = 1024 bytes = 8192 bits

1 megabyte (MB) = 1024 KB = 2^20 bytes = 2^23 bits

1 gigabyte (GB) = 1024 MB = 2^20 KB = 2^30 bytes = 2^33 bits.
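These power-of-two relationships can be checked with a short sketch (the variable names are illustrative):

```python
# Units of information volume built on powers of two (the computer's
# natural base), as opposed to the decimal prefixes of physics.
KB = 2 ** 10          # 1 "K" = 1024, not 1000
MB = 2 ** 20
GB = 2 ** 30
print(KB)             # 1024 bytes
print(KB * 8)         # 8192 bits in a kilobyte
print(MB // KB, GB // MB)   # each step up is another factor of 1024
```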


The concept of the amount of information introduced in this way does not coincide with the generally accepted concept of the amount of information as the importance of the information received, but it is successfully used in computing and communications.


Expert methods for assessing information and the development of new measures of information


Since information has various characteristics whose practical significance differs across the applications of computer science, there cannot be a single measure of the amount of information that is convenient in all cases. For example, one measure of the amount of information can be the complexity of computing it with some universal algorithm. It should be expected that the further penetration of computer science into areas of human activity where it is still poorly applied, including art, will lead to the development of new scientific definitions of the amount of information. Thus, the perception of a work of art that we like brings a feeling of being filled with new, previously unknown information. It is not for nothing that the effect produced on a person by a great piece of music, an artist’s painting, or sometimes simply the contemplation of nature, picturesque mountains or a deep sky, is often characterized by the word “revelation”. Characteristics of the amount of information may therefore appear that describe its aesthetic and artistic significance. Until simple, mathematically expressed definitions of the measure of a particular property of information have been created, so-called expert assessments are used, i.e., the opinions of specialists in the given field. Experts give their assessments based on personal, often very subjective, experience. Professional communication between experts and creative discussion of the subject of analysis lead to the development of more or less generally accepted evaluation criteria, which can ultimately become the basis for creating a formal measure, as unambiguous as the international standard meter. Examples of future measures of information in its various manifestations may be the following expert assessments and other indicators already in use:



points given by competition judges for artistry of performance, for example in figure skating;

reviews of films in the press, with scores assigned according to their degree of interest to the moviegoer;

the cost of paintings;

assessment of a scientist’s work based on the number of published articles;

assessment of a scientist’s work by the number of references to it in the works of other scientists (the citation index);

indices of the popularity of musical works and their performers, published in the press;

student grades given by college teachers.



In addition to measuring the amount of memory in bits and bytes, technology also uses other units of measurement that characterize work with information:



the number of operations per second, characterizing the speed of information processing by a computer;

the number of bytes or bits per second, characterizing the speed of information transfer;

the number of characters per second, characterizing the speed of reading, typing on a computer, or the speed of a printing device.



1.3. Classification of information


Information can be roughly divided into different kinds based on one or another of its properties or characteristics: by method of encoding, by area of origin, by method of transmission and perception, by public purpose, and so on.


By coding method


According to the method of signal encoding, information can be divided into analog and digital. An analog signal represents information about the value of the initial parameter being reported through the value of another parameter, which is the physical basis of the signal, its physical carrier. For example, the angles of the clock hands are the basis of the analog display of time. The height of the mercury column in a thermometer is the parameter that provides analog information about temperature: the higher the column, the higher the temperature. To display information, an analog signal uses all intermediate values of the parameter from minimum to maximum, i.e., theoretically an infinitely large number of them. A digital signal uses as the physical basis for recording and transmitting information only a minimal number of such values, most often just two. For example, the recording of information in a computer is based on two states of the physical signal carrier, electrical voltage. One state, the presence of electrical voltage, is conventionally denoted by one (1); the other, its absence, by zero (0). Therefore, to convey information about the value of the initial parameter, data must be represented as a combination of zeros and ones, i.e., in digital representation. It is interesting that at one time computers based on ternary arithmetic were developed and used, since it is natural to take as the main states of electrical voltage the following three: 1) the voltage is negative, 2) the voltage is zero, 3) the voltage is positive. Scientific papers devoted to such machines and describing the advantages of ternary arithmetic are still being published. For now, the manufacturers of binary machines have won the competition. Will it always be so? Here are some examples of consumer digital devices.
Electronic watches with digital display provide digital time information. The calculator performs calculations with digital data. A mechanical lock with a digital code can also be called a primitive digital device.


By area of ​​origin


By area of origin, information can be classified as follows. Information that arises in inanimate nature is called elementary; in the world of animals and plants, biological; in human society, social. In nature, living and inanimate, information is carried by color, light, shadow, sounds, and smells. As a result of the combination of color, light and shadow, and sounds and smells, aesthetic information arises. Along with natural aesthetic information, another kind of information arose as a result of people’s creative activity: works of art. Besides aesthetic information, semantic information is created in human society as a result of learning the laws of nature, society, and thought. The division of information into aesthetic and semantic is obviously quite conditional; one simply has to understand that in some information the semantic part may predominate and in other information the aesthetic part.


According to the method of transmission and perception


By the method of transmission and perception, information is usually classified as follows. Information transmitted in the form of visible images and symbols is called visual; transmitted by sounds, auditory; by sensations, tactile; by smells and tastes, olfactory and gustatory. Information perceived by office equipment and computers is called machine-oriented information. The amount of machine-oriented information is constantly increasing owing to the ever-growing use of new information technologies in various areas of human life.


By public purpose


By public purpose, information can be divided into mass, special, and personal. Mass information is in turn divided into socio-political, everyday, and popular-science. Special information is divided into production, technical, managerial, and scientific. Technical information has gradations such as:

Machine tool industry,

Mechanical engineering,

Instrumental...

Scientific information is divided into biological, mathematical, physical...


1.4. Information Properties


Information has the following properties:

Attributive;

Pragmatic;

Dynamic.

Attributive properties are those properties without which information does not exist. Pragmatic properties characterize the degree of usefulness of information for the user, consumer and practice. Dynamic properties characterize the change in information over time.


Attributive properties of information


Inseparability of information from the physical medium, and the linguistic nature of information


The most important attributive properties of information are the properties of the inseparability of information from the physical medium and the linguistic nature of information. One of the most important areas of computer science as a science is the study of the characteristics of various media and languages ​​of information, the development of new, more advanced and modern ones. It should be noted that although information is inseparable from the physical medium and has a linguistic nature, it is not strictly associated with either a specific language or a specific medium.


Discreteness


The next attributive property of information to which attention should be paid is discreteness. The information and knowledge contained in messages are discrete: they characterize individual factual data, patterns, and properties of the objects being studied, which are disseminated in the form of various messages consisting of lines, composite colors, letters, digits, symbols, and signs.


Continuity


Information tends to merge with what has already been recorded and accumulated earlier, thereby promoting progressive development and accumulation. This confirms another attributive property of information - continuity.


Pragmatic properties of information


Meaning and novelty


The pragmatic properties of information are manifested in the process of using information. First of all, this category of properties includes the presence of meaning and novelty of information, which characterizes the movement of information in social communications and highlights that part of it that is new to the consumer.


Utility


Useful information is information that reduces the uncertainty of knowledge about an object. Misinformation is regarded as a negative value of useful information. The term “usefulness of information” is often used to describe the impact that incoming information has on a person’s internal state: mood, well-being, and ultimately health. In this sense useful, or positive, information is that which a person receives joyfully and which helps improve his well-being, while negative information has a depressing effect on the psyche and well-being and can lead to a deterioration of health, for example to a heart attack.


Value


The next pragmatic property of information is its value. It is important to note that the value of information varies among different consumers and users.


Cumulativeness


The cumulative property characterizes the accumulation and storage of information.


Dynamic properties of information


Dynamic properties of information, as the name suggests, characterize the dynamics of information development over time.


Growth of information


First of all, it is necessary to note the property of information growth. The movement of information in information communications and its constant dissemination and growth determine the property of multiple distribution, or repeatability. Although information is inseparable from a physical medium and has a linguistic nature, it is not strictly tied to either a particular language or a particular medium. Thanks to this, information can be received and used by several consumers at once. This is the property of reusability, and a manifestation of the property of the dispersal of information across various sources.


Aging


Among the dynamic properties, it is also necessary to note the property of information aging.


2. What is computer science


2.1. Definition of computer science


Not long ago, computer science was understood as a scientific discipline that studies the structure and general properties of scientific information, as well as the patterns of all processes of scientific communication, from informal exchanges of scientific information through direct oral and written contact between scientists and specialists to formal exchanges through the scientific literature. This understanding was close to concepts such as “library science” and “book science”, and the term “documentation” sometimes served as a synonym for “computer science”. The rapid development of computer technology has changed the concept of computer science, giving it a much more computer-oriented meaning; for this reason different interpretations of the term still exist. In America, the term “computer science”, the science of computers, is used for roughly what is understood as computer science in Europe. Close to the concept of computer science is the term “systems engineering”, for which dictionaries also often give the translation “computer science”. Computer science is the science that studies all aspects of obtaining, storing, transforming, transmitting, and using information.


2.2. Main components


The components of this science are theoretical computer science, semiotics, and cybernetics. In practice, computer science is implemented in programming and computer technology.


Theoretical computer science


Theoretical computer science is the foundation on which general computer science is built. This discipline deals with the construction of models and of the discrete sets that describe them. An integral part of theoretical computer science is logic: a set of rules that govern the process of reasoning. Mathematical logic studies the logical connections and relations underlying deductive (logical) inference.


Semiotics


Semiotics studies sign systems whose components, signs, can be of a very diverse nature, as long as three components interconnected by conventional relations can be identified in them: syntax (the plan of expression), semantics (the plan of meaning), and pragmatics (the plan of use). Semiotics allows us to establish analogies in the functioning of various systems of both natural and artificial origin. Its results are used in computational linguistics, artificial intelligence, psychology, and other sciences.


Cybernetics


Cybernetics arose in the late 40s, when N. Wiener put forward the idea that the rules for controlling living, nonliving and artificial systems have many common features. The relevance of N. Wiener's conclusions was reinforced by the advent of the first computers. Cybernetics today can be considered as a branch of computer science that considers the creation and use of automated control systems of varying degrees of complexity.


Analog and digital information processing


Computer science, as the science of information processing, is realized in analog and digital information processing. Analog information processing includes direct actions with color, light, shape, line, and so on. Seeing the world through rose-colored glasses (literally) is analog processing of visual information. Analog computing devices are also possible; they were widely used earlier in engineering and automation. The simplest example of such a device is the slide rule. Schools once taught multiplication and division with it, and it was always at hand for any engineer; now it has been replaced by digital devices, calculators. Digital information processing usually means actions performed on information by digital computing technology. Traditional analog methods of recording audio and television information are currently being replaced by digital methods, though the latter have not yet become widespread. However, we increasingly use digital devices to control traditional “analog” ones: for example, the signals coming from the remote control of a television or VCR are digital, and the scales appearing in stores that show the weight and cost of a purchase on a display are digital as well. The natural ways of displaying and processing information in nature are analog. An animal’s footprint is an analog signal of the animal’s size. A cry is an analog way of conveying an internal state: the louder it is, the stronger the feeling. Physical processes perform analog signal processing in the sense organs: the focusing of an image on the retina of the eye, the spectral analysis of sounds in the cochlea. Analog signal processing systems are faster than digital ones, but they perform narrow functions and are difficult to adapt to new operations. That is why digital computers have developed so rapidly: they are universal and allow one to process not only numerical but any other information: text, graphics, sound.
Digital computers are capable of receiving information from analog sources using special devices, analog-to-digital converters. Likewise, after processing on a digital computer, information can be converted into analog form by special devices, digital-to-analog converters. Therefore modern digital computers can speak, synthesize music, draw, and control a machine or machine tool. Although it may not be as noticeable to everyone, analog information processing systems are also developing, and for some analog devices a worthy digital replacement has not been found and apparently will not be found in the near future. One such device, for example, is the camera lens. It is likely that the future of technology lies in so-called analog-digital devices that combine the advantages of both. Apparently the sense organs, the nervous system, and thinking are also built by nature on both an analog and a digital basis. When designing human-machine systems, it is important to take into account the peculiarities of human perception of each type of information. When reading texts, for example, a person perceives 16 bits per second while simultaneously holding 160 bits. A convenient design of an aircraft cockpit or of the control panel of a complex system significantly facilitates a person’s work, increases the depth of his awareness of the current state of the controlled object, and affects the speed and effectiveness of the decisions made.
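A minimal sketch of the idea behind analog-to-digital and digital-to-analog conversion (the functions and parameter values are hypothetical illustrations, not descriptions of real hardware): quantization loses a little precision, so the reconstructed value only approximates the original:

```python
def adc(value, lo, hi, bits):
    """Analog-to-digital: map a continuous value in [lo, hi]
    to an integer code of the given bit width."""
    levels = 2 ** bits
    code = int((value - lo) / (hi - lo) * (levels - 1) + 0.5)
    return max(0, min(levels - 1, code))

def dac(code, lo, hi, bits):
    """Digital-to-analog: map the integer code back to an
    approximate continuous value in [lo, hi]."""
    levels = 2 ** bits
    return lo + code / (levels - 1) * (hi - lo)

code = adc(0.7, -1.0, 1.0, 8)      # quantize an "analog" value of 0.7
approx = dac(code, -1.0, 1.0, 8)   # reconstruct an approximation of it
print(code, round(approx, 3))
```

With 8 bits there are 256 levels, so the round-trip error stays below half a quantization step; more bits would shrink it further.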


2.3. Some definitions.


Science is the social sphere of creating and using information as knowledge of the objective world.


Art is a social activity of creating and using information sources that influence first the feelings and only second the consciousness.


Creativity is the production of new information by a person. Pedagogy is the organization of an information process aimed at the maximum assimilation of information.


Training is the transfer of information for the purpose of acquiring knowledge and skills.



GONCHARENKO ELENA ALEKSANDROVNA
ZNAMENSKY VASILY SERAFIMOVICH


CBD INR
NALCHIK COLLEGE OF DESIGN
Nalchik-1996


Information and its properties

Objects of the material world are in a state of continuous change, which is accompanied by an exchange of energy. All forms of energy exchange are accompanied by the appearance of signals. When signals interact with physical bodies, certain changes of properties arise in those bodies; this phenomenon is called signal registration.

Data are recorded signals.

Information is data about the objects and phenomena of the environment, their parameters, properties, and states, which reduces the degree of uncertainty and incompleteness of knowledge about them. Data can be thought of as recorded observations that are not currently being used but are stored for later.

Properties of information that determine its quality

The quality of information is the degree to which it meets consumers' needs. The properties of information are relative, since they depend on the needs of the information consumer. The following properties characterizing the quality of information are distinguished:

  • Objectivity of information is its independence from anyone's opinion or consciousness, and from the methods of obtaining it. Information is more objective when the methods of obtaining and processing it introduce a smaller element of subjectivity.
  • Completeness. Information can be considered complete when it contains a minimum but sufficient set of indicators for making the right decision. Both incomplete and redundant information reduce the effectiveness of information-based decisions.
  • Credibility is the property of information to be perceived correctly. Objective information is always credible, but credible information can be either objective or subjective. Causes of unreliability may be:
    • deliberate distortion (disinformation);
    • unintentional distortion of a subjective nature;
    • distortion due to interference;
    • errors in recording information;

In general, information reliability is achieved:

    • indicating the time of occurrence of the events, information about which is transmitted;
    • comparison of data obtained from various sources;
    • timely detection of misinformation;
    • excluding distorted information, etc.
  • Adequacy is the degree of correspondence of information to the real, objective state of affairs.
  • Availability of information is a measure of the possibility of obtaining particular information.
  • Relevance of information is the degree to which information corresponds to the current moment in time.

The properties of information characterizing its quality can also be classified as follows [Akulov O. A., Medvedev N. V. Computer Science: Basic Course. M.: Omega-L, 2004. P. 42]:

  • Content or internal quality (quality inherent in the information itself and preserved when it is transferred from one system to another)
    • Significance (the ability to retain value for the consumer over time)
      • Completeness (a property characterized by a measure of its sufficiency for solving certain problems)
      • Identity (a property consisting in the correspondence of information to the state of an object)
    • Cumulativeness (the ability of a small volume of information to reflect reality sufficiently fully)
      • Selectivity
      • Homomorphism
  • Security or external quality (quality inherent in information located or used only in a specific system)
    • Safety
    • Credibility
    • Confidentiality

Data Operations

To improve its quality, data is converted from one form to another using processing methods. Data processing includes the following operations:

1) Data input (collection): accumulating data in order to ensure sufficient completeness for decision-making.

2) Data formalization: bringing data coming from different sources to a single form in order to increase its accessibility.

3) Data filtering: eliminating "extra" data that is not needed, in order to increase reliability and adequacy.

4) Data sorting: ordering data by a given attribute for convenience of use.

5) Archiving: organizing data storage in a convenient and easily accessible form.

6) Data protection: measures aimed at preventing the loss of data and its unauthorized reproduction or modification.

7) Data transport: receiving and transmitting data between participants in the information process.

8) Data transformation: converting data from one form to another or from one structure to another.
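Two of these operations, filtering and sorting, can be sketched in a few lines of Python. The records and field names here are invented purely for illustration.

```python
# Hypothetical measurement records; the fields are illustrative only.
records = [
    {"sensor": "A", "value": 21.5, "valid": True},
    {"sensor": "B", "value": -999.0, "valid": False},  # interference artifact
    {"sensor": "C", "value": 19.8, "valid": True},
]

# Filtering: eliminate "extra" data to increase reliability and adequacy.
clean = [r for r in records if r["valid"]]

# Sorting: order the data by a given attribute for convenience of use.
ordered = sorted(clean, key=lambda r: r["value"])

print([r["sensor"] for r in ordered])  # sensors in ascending order of value
```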

Data Encoding

To automate work with data of different types, it is necessary to unify their form of presentation, that is, to express data of one type through data of another type. The system code of computer technology is binary encoding, which represents data as a sequence of just two characters: 0 and 1. These characters are called binary digits, or bits.

One bit expresses two values: 0 or 1.

Two bits express four values: 00, 01, 10, 11.

Three bits express eight values: 000, 001, 010, 011, 100, 101, 110, 111.

Increasing the number of binary digits by one doubles the number of values that can be expressed. In general, N = 2^m, where N is the number of independent encoded values and m is the bit depth of the binary coding.
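The formula N = 2^m can be checked directly; this short Python snippet (the function name is ours) tabulates the capacity for a few bit depths:

```python
def code_capacity(m):
    """Number of distinct values expressible with m binary digits: N = 2**m."""
    return 2 ** m

# Each extra bit doubles the number of expressible values.
for m in (1, 2, 3, 8, 16, 24):
    print(m, "bits ->", code_capacity(m), "values")
```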

Encoding integers and real numbers

Algorithm for converting integer decimal numbers to binary: 1) Divide the number by 2 and record the remainder (0 or 1).

2) If the quotient is not zero, divide it by 2 again, and so on, until the quotient becomes 0. Then write down all the resulting remainders in reverse order: the first remainder becomes the rightmost digit.

To convert back, sum the powers of 2 corresponding to the nonzero digits of the binary number.
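The two algorithms just described can be written out directly; this is a minimal sketch in Python with function names of our own choosing:

```python
def to_binary(n):
    """Convert a non-negative integer to a binary string by repeatedly
    dividing by 2; the remainders, read in reverse order, give the result."""
    if n == 0:
        return "0"
    remainders = []
    while n != 0:
        remainders.append(n % 2)  # record the remainder (0 or 1)
        n //= 2                   # continue with the quotient
    return "".join(str(r) for r in reversed(remainders))

def from_binary(s):
    """Inverse conversion: sum the powers of 2 at the nonzero digits."""
    return sum(2 ** i for i, d in enumerate(reversed(s)) if d == "1")

print(to_binary(25))        # 11001
print(from_binary("11001")) # 25
```

For example, 25 gives the remainders 1, 0, 0, 1, 1, which read in reverse yield 11001; summing 2^4 + 2^3 + 2^0 = 16 + 8 + 1 recovers 25.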

To encode integers: 0 to 255 requires 8 bits (an eight-bit binary code), 0 to 65,535 requires 16 bits, and 0 to about 16.7 million (16,777,215) requires 24 bits.

Encoding text data

If each character of the alphabet is associated with a specific integer (for example, its ordinal number), then text information can be encoded with binary code. Eight bits are enough to encode 256 different characters. For the whole world to encode text data in the same way, unified encoding tables are needed, and this has not yet been fully achieved because of contradictions between the characters of national alphabets, as well as corporate contradictions.

For English, which has de facto occupied the niche of an international means of communication, these contradictions have already been resolved. The US standards institute ANSI (American National Standards Institute) introduced the ASCII coding system (American Standard Code for Information Interchange). The ASCII system has two coding tables, basic and extended. The basic table fixes code values from 0 to 127, and the extended table covers characters with codes from 128 to 255. In the USSR, the KOI-7 system (seven-bit code for information interchange) operated in this area, but the support of hardware and software manufacturers raised the American ASCII code to the level of an international standard.
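The character-to-number mapping is easy to see in practice. Python's built-in `ord` and `chr` expose the code of each character (for the Latin letters below, these codes coincide with basic ASCII):

```python
text = "Info"

# Encoding: each character is replaced by its integer code.
codes = [ord(ch) for ch in text]
print(codes)

# The same codes written as 8-bit binary bytes.
print([format(c, "08b") for c in codes])

# Decoding reverses the mapping, recovering the original text.
print("".join(chr(c) for c in codes))
```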

Subject and tasks of computer science

Computer science is a technical science that systematizes the techniques of creating, storing, reproducing, processing, and transmitting data by means of computer technology, as well as the principles of operation of these means and the methods of managing them.

The subject of computer science consists of the following concepts:

Computer hardware;

Computer software;

Means of interaction between hardware and software;

Means of human interaction with hardware and software.

Computer science pays special attention to questions of interaction; there is even a special concept for this, the interface. The methods and means of human interaction with hardware and software are called the user interface. Accordingly, there are hardware interfaces, software interfaces, and hardware-software interfaces.

Within the main task of computer science, the following areas of practical application can be distinguished today:

Architecture of computer systems (techniques and methods for constructing systems designed for automatic data processing);

Interfaces of computer systems (techniques and methods for managing hardware and software);

Programming (techniques, methods and tools for developing computer programs);

Data conversion (techniques and methods for converting data structures);

Information protection (generalization of techniques, development of methods and means of data protection);

Automation (functioning of software and hardware without human intervention);

Standardization (ensuring compatibility between hardware and software, as well as between data presentation formats related to different types of computing systems).


Wikimedia Foundation. 2010.


Every day we receive a lot of information in different ways: we see, hear, taste and smell, touch, read, communicate, think, and comprehend. Information is everywhere! It would seem simple: information is what we receive from the world around us. But if you look at the concept of information more broadly, you discover many nuances of the question that you may not have known about before. So let us look at the concept of information, its properties, and its types in science in more detail.

Definition of information

Information is a fairly broad concept that can be defined in different ways. Considered broadly, it has an abstract meaning with many senses, and its specifics can be determined only in context.

In a narrower sense, information is information (data, messages) presented in different forms that are perceived by a person or a special device.

There is also this definition: information is the data and facts that inform people about the state of affairs (the clearest example is the media: radio, print, television, cinema, the Internet).

Information can also be defined as a set of data that is stored on a tangible medium and distributed in space and time.

If we approach the concept of information from a strictly scientific point of view, then works of art will not be considered pure information. Also, depending on the specific field of science, various definitions of information can be given. For example, in philosophy information is considered in terms of cognition, reflection, and interaction; in biology, information is associated with the behavior of living organisms.

Information Features

We have figured out the definition of information; now we move on to its basic properties. There are many of them, but we will consider a few of the most significant. Information is influenced by the properties of the source data (its content) and by the properties of the methods that record it. Let's take them in order.

Objectivity

Among the properties of information, objectivity must be given first place. The objectivity of information lies in its independence from subjective human opinion; that is, objective information is the same for everyone.

For example, the statement "These trousers are expensive" cannot be recognized as information in the narrow scientific sense, because it reflects a subjective opinion: for one person they are expensive, for another quite affordable, and a third considers them cheap. But the statement "These trousers cost three thousand rubles" is objective information, since everyone perceives it the same way. The one who finds the trousers expensive, the one who can afford them, and the one who considers them inexpensive are all given the same data: the trousers cost three thousand.

Credibility

The properties of information also include reliability, which can otherwise be defined as truthfulness. Objective information is always reliable, but reliable information is not always objective; it can also be subjective.

For example: "This movie is very good!" is subjective information, because one person may like the film and another may not (in this example we consider information in the broad sense, so we allow it to be subjective). If the person who said this phrase really thought what he said, then the statement is reliable information, that is, truthful. So our example is subjective yet reliable information.

Information becomes unreliable when its meaning is distorted for various reasons: intentionally or unintentionally, through insufficient accuracy, or through the influence of various kinds of interference.

Accuracy and completeness

What other properties of information can be identified? Undoubtedly, this is the accuracy and completeness of the information. Accuracy is the degree of proximity of information to the real state of things, phenomena, processes, objects. The more accurate the information, the better it is. Also related to the concept of information accuracy is the concept of its completeness. Information is considered complete if its quantity is sufficient for correct comprehension and understanding, for correct decision-making. Incomplete information can lead to misunderstandings and incorrect conclusions.

Suppose a young man is walking down the street with his sister and meets his girlfriend along the way. Everyone stops and starts talking. If he introduces his companion inaccurately and incompletely, for example, "Meet Ira," his girlfriend may think he is seeing Ira behind her back, become jealous, and demand an explanation. If he says, "Meet Ira, my own sister; she has come to stay for a couple of days," then the girlfriend will know for certain that Irina poses no danger to their relationship and is simply a close relative. In this way she receives accurate and complete information about the new acquaintance and draws the right conclusions.

Relevance

The properties of information also include relevance: importance, significance, and timeliness at the present moment. Relevance is especially important for the news media, since news should always be fresh and relate directly to the present.

Here is an example of irrelevant information: the statement "Last year I cleared the snow near the house" is no longer relevant this winter, since new snow has fallen and needs to be cleared; no one cares about last year's snow.

Value

The main properties of information also include its value, that is, its usefulness. Value is determined by the needs of specific people: how well the information satisfies an individual's needs. For example, a man has a stomachache and searches the Internet for information about why his stomach hurts. If he finds a well-written article explaining the causes, that information will be valuable to him. If instead he finds an article explaining why the liver hurts, that information will be useless to him, since it does not interest him and is not needed at the moment.

We examined the basic, most general properties of information. But information can also have a number of additional properties. Let's briefly list them:

  • Attributive: continuity (the ability to accumulate data constantly) and discreteness (division into separate parts and signs).
  • Dynamic: copying information, transferring it from the source to the consumer, translating it into other languages, transferring it to other storage media, and the aging of information.
  • Practical: the density and volume of information.

Types of information

Information can be presented in different types, forms, storage and coding methods.

  • According to the method of perception, information can be visual (I see), auditory (I hear), tactile (I touch, feel), olfactory (I smell), gustatory (I taste).
  • By form of presentation: text (in the form of text), graphic (a drawing, diagram, photo, etc.), musical (music, sound), numerical (numbers), video (a video file), combined (combining different forms of representation; for example, a music video combines video and audio), etc.
  • By specialty: scientific, technical, production, etc. information.
  • By significance for society: mass, individual-oriented, economic, political, aesthetic, etc.

In this article we looked at the types and properties of information, as well as some definitions of information. A brief analysis of the concept of information in specific aspects has been presented here. If you want to learn more about information, then you should turn to individual sciences that study it, for example, computer science.