
MINISTRY OF TRANSPORT OF THE RUSSIAN FEDERATION

TRAFFIC DEPARTMENT

KRASNOYARSK INSTITUTE OF RAILWAY TRANSPORT - BRANCH OF GOU VPO "IRKUTSK STATE UNIVERSITY OF COMMUNICATIONS"

COURSE OF LECTURES ON INFORMATICS

Textbook for engineering students

Krasnoyarsk 2012

UDC 681.3.06 BBK 32-973-01

Egorushkin, I.O. A course of lectures on informatics. Part 1: study guide / I.O. Egorushkin. Krasnoyarsk: Krasnoyarsk Institute of Railway Transport - branch of GOU VPO "Irkutsk State University of Communications", 2012. 79 p.: ill.

A course of lectures on informatics for the 1st semester, developed on the basis of the FEPO standard, is presented; it includes the following disciplinary modules:

a) the concept of information; general characteristics of the processes of collection, transmission, processing and accumulation of information;

b) technical means for implementing information processes; computer hardware;

c) software for the implementation of information processes;

d) information technologies (technologies for processing text and tabular information).

This course of lectures is intended for studying the theoretical part of the discipline "Informatics" (the lecture course) by students of engineering specialties. The manual consists of the nine lectures provided for by the 1st-semester program, developed on the basis of the FEPO standard.

Il. 15. Bibliography: 3 titles.

Reviewers: Gaydenok N.D. – Doctor of Technical Sciences, Professor of the EZhD Department

Rogalev A.N. – Candidate of Physical and Mathematical Sciences, Associate Professor of the Department of Mathematical Modeling and Informatics, IGURE SibFU

Published by decision of the methodological council of KRIZhT

© Krasnoyarsk Institute of Railway Transport - branch of the State Educational Institution of Higher Professional Education "Irkutsk State University of Communications", 2012

© I.O. Egorushkin, 2012

CONTENTS

LECTURE 1. INFORMATION AND FORMS OF ITS PRESENTATION
1.1. Messages, data, signals
1.2. Measures and units of representation, measurement and storage of information
1.3. Types and properties of information

LECTURE 2. GENERAL CHARACTERISTICS OF THE PROCESSES OF COLLECTION, PROCESSING, TRANSFER AND ACCUMULATION OF INFORMATION
2.1. Measurement of information
2.2. Perception of information
2.3. Collection of information
2.4. Transmission of information
2.5. Processing of information
INFORMATION AND LOGICAL FOUNDATIONS OF THE COMPUTER
2.6. Number systems
2.7. Positional number systems

LECTURE 3. INFORMATION AND LOGICAL FOUNDATIONS OF THE COMPUTER
3.1. Number systems (conclusion)
3.1.1. Binary number system
3.1.2. Other positional number systems
3.1.3. Mixed number systems
INFORMATICS AS A SCIENCE
3.2. Subject area of informatics as a science
3.3. A brief history of the development of informatics
3.4. The concept of the information society
3.5. Goals and objectives of the course "Informatics"

LECTURE 4. THE COMPUTER AS A TOOL FOR INFORMATION PROCESSING
4.1. History of the development of computers
4.2. Main characteristics of the computer
4.3. Computer classification

LECTURE 5. THE COMPUTER AS A TOOL FOR INFORMATION PROCESSING (CONCLUSION)
5.1. General principles of building modern computers
5.2. Computer software and its functions
5.3. Composition and purpose of the main elements of the PC, their characteristics
5.3.1. General information about PCs and their classification
5.3.2. Block diagram of a PC
5.3.3. External PC devices
5.3.4. PC storage devices

LECTURE 6. OPERATING SYSTEMS. THE WINDOWS GRAPHICAL OPERATING ENVIRONMENT
6.1. The MS DOS operating system
6.2. The Norton Commander shell
6.3. Basic technological mechanisms of Windows
6.4. Creation of objects, management of objects, properties of objects
6.5. Navigating the file system. Operations with files. Searching for files. Configuring operating system settings
6.6. Overview of Windows applications. Application collaboration
6.7. Disk maintenance programs. Data archiving. Archiver programs
6.8. The Far Manager shell

LECTURE 7. SOFTWARE TOOLS FOR INFORMATION PROCESSING

LECTURE 8. SOFTWARE TOOLS FOR INFORMATION PROCESSING (CONCLUSION)
8.1. Application programs
8.2. Programming systems
8.3. Software classification
8.4. Problem-oriented application program packages
8.5. Integrated application program packages

LECTURE 9. BASICS OF PROCESSING TEXT AND TABULAR INFORMATION
9.1. The Microsoft Word word processor
9.1.1. Starting and shutting down Word
9.1.2. Main menu and toolbars
9.1.3. Opening and saving documents
9.1.4. Document formatting
9.1.5. Printing a document
9.2. The Microsoft Excel spreadsheet
9.2.1. Basic spreadsheet concepts
9.2.2. The MS Excel spreadsheet interface. Main differences between Word and Excel

LITERATURE

LECTURE 1. INFORMATION AND FORMS OF ITS PRESENTATION

The concept of information is a fundamental concept of informatics. Any human activity is a process of collecting and processing information, making decisions based on it and implementing them. With the advent of modern computer technology, information began to act as one of the most important resources of scientific and technological progress.

Within the framework of science, information is a primary and undefinable concept. It assumes the existence of a material carrier of information, a source of information, an information transmitter, a receiver, and a communication channel between the source and the receiver. The concept of information is used in all areas: science, technology, culture, sociology and everyday life. The specific interpretation of the elements associated with the concept of information depends on the methods of the particular science, on the purpose of the study, or simply on our everyday notions.

The term "information" comes from the Latin informatio - explanation, exposition, awareness. encyclopedic Dictionary(M.: Sov. encyclopedia, 1990) defines information in historical evolution: initially - information, transmitted by humans verbally, in writing or in any other way (with the help of conditional signals, technical means, etc.); since the middle of the twentieth century - a general scientific concept, including the exchange of information between people, a person

and automatic, the exchange of signals in the animal and flora(transfer of traits from cell to cell, from organism to organism).

A narrower definition is adopted in engineering, where this concept includes all the information that is the object of storage, transmission and transformation.

The most general definition is given in philosophy, where information is understood as a reflection of the real world. Information as a philosophical category is considered one of the attributes of matter, reflecting its structure.

In the evolutionary series matter → energy → information, each successive manifestation of matter differs from the previous one in that it was more difficult for people to recognize, isolate and use in its pure form. It is probably this difficulty of identifying the various manifestations of matter that determined the indicated sequence in mankind's cognition of nature.

1.1. Messages, data, signals

Such concepts as signal, message and data are closely related to the concept of information.

A signal (from the Latin signum - a sign) is any process that carries information.

There are two forms of information representation - continuous and discrete. Since signals are information carriers, physical processes of various nature can be used as the latter.

Information is represented (reflected) by the value of one or more parameters of the physical process, or by a combination of several parameters.

A signal is called continuous if its parameter within the given limits can take any intermediate values. A signal is called discrete if its parameter within the given limits can take on certain fixed values.

A message is information presented in a specific form and intended to be transmitted.

From a practical point of view, information is always presented as a message. An information message is associated with a message source, a message receiver and a communication channel.

The message from the source to the receiver is transmitted in material and energy form (electrical, light, sound signals, etc.). A person perceives messages through the senses. Information receivers in technology perceive messages using various measuring and recording equipment. In both cases, the reception of information is associated with a change in time of some quantity characterizing the state of the receiver. In this sense, an information message can be represented by a function x (t), characterizing the change in time of the material and energy parameters of the physical environment in which information processes are carried out.

The function x(t) takes real values and varies over time t. If the function x(t) is continuous, we speak of continuous or analog information, whose sources are usually various natural objects (for example, temperature, pressure, air humidity) or objects of technological production processes (for example, the neutron flux in the core, the pressure and temperature of the coolant in the circuits of a nuclear reactor). If the function x(t) is discrete, the information messages used by people have the character of discrete messages (for example, signals transmitted by means of light and sound alarms, language messages transmitted in writing or by means of sound signals, messages transmitted using gestures, etc.).
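To make the distinction concrete, here is a minimal illustrative sketch (not part of the original lecture): a continuous signal x(t) is modelled by an ordinary function, and a discrete message is obtained from it by sampling at fixed moments of time. The temperature curve below is an invented example.

```python
import math

# A continuous (analog) "message" x(t): a hypothetical daily temperature curve,
# temperature in degrees Celsius as a function of time t in hours.
def x(t: float) -> float:
    return 20.0 + 5.0 * math.sin(2 * math.pi * t / 24.0)

# A discrete message: the same quantity recorded only at fixed moments of time
# (every three hours), so its parameter takes only certain fixed values.
discrete_message = [round(x(t), 2) for t in range(0, 24, 3)]

print(discrete_message)
# [20.0, 23.54, 25.0, 23.54, 20.0, 16.46, 15.0, 16.46]
```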

In the modern world, information is usually processed by computers. Informatics is therefore closely related to its main tool - the computer.

A computer is a device for converting information through the execution of a program-controlled sequence of operations. A synonym for the term "computer" is "electronic computing machine" (ECM).

Data is information presented in a formalized form and intended for processing by technical means, for example, a computer.

Therefore, along with the terms information input, information processing, information storage and information retrieval, the terms data entry, data processing, data storage, etc. are also used.

1.2. Measures and units of representation, measurement and storage of information

For theoretical computer science, information plays the same role as matter does in physics. And just as a substance can be assigned a fairly large number of characteristics (mass, charge, volume, etc.), information also has a fairly representative, though smaller, set of characteristics. As with the characteristics of a substance, there are units of measurement for the characteristics of information, which makes it possible to assign numbers to a certain portion of information - its quantitative characteristics.

To date, the following methods of measuring information are most known:

volumetric; entropy; algorithmic.

Volumetric is the simplest and crudest way to measure information. It is natural to call the corresponding quantitative assessment of information the volume of information.

The volume of information in a message is the number of characters (symbols) in the message.

Since, for example, the same number can be written in many different ways (using different alphabets):

"twenty-one"   21   10101 (binary)

this method is sensitive to the form of representation (recording) of the message. In computing, all processed and stored information, regardless of its nature (numbers, text, images), is represented in binary form (using an alphabet consisting of only two characters, 0 and 1). This standardization made it possible to introduce two standard units of measure: the bit and the byte. A byte is eight bits. These units of measurement will be discussed in more detail later.
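As a hedged illustration of the volumetric measure (added here, not part of the original text), the following sketch writes the same number in three different alphabets and counts the characters in each record: the recorded volumes differ even though the number itself is the same, and it is the binary record that is counted in bits and bytes.

```python
# Illustrative sketch of the volumetric measure: the same number written in
# different alphabets has a different "volume" (number of characters),
# although the information it carries is the same.

number = 21
records = {
    "words":   "twenty-one",
    "decimal": str(number),         # "21"
    "binary":  format(number, "b")  # "10101"
}

for alphabet, text in records.items():
    print(f"{alphabet:8s} {text!r:14s} volume = {len(text)} characters")

# In a computer the binary record is stored in bits; 8 bits make up one byte.
bits = len(format(number, "b"))   # 5 bits for 10101
print(f"binary record occupies {bits} bits, i.e. {bits / 8} bytes")
```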

The amount of information is the numerical characteristic of a signal that reflects the degree of uncertainty (incompleteness of knowledge) which disappears after the message in the form of this signal is received. In information theory this measure of uncertainty is called entropy. If, as a result of receiving a message, complete clarity is achieved on some issue, it is said that complete or exhaustive information has been received and there is no need for additional information. Conversely, if after receiving the message the uncertainty remains the same, then no information has been received (zero information).

The above reasoning shows that there is a close relationship between the concepts of information, uncertainty and choice. Thus, any uncertainty implies the possibility of choice, and any information, by reducing uncertainty, reduces the possibility of choice. With complete information there is no choice. Partial information reduces the number of possible choices, thereby reducing uncertainty.

Example. A person tosses a coin and observes which side it lands on. Both sides of the coin are equal, so either side is equally likely to come up. Such a situation has an initial uncertainty characterized by two possibilities. After the coin falls, complete clarity is achieved and the uncertainty disappears (becomes equal to zero).
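The following sketch is an illustration added here, using the standard Shannon entropy formula H = -(p1*log2(p1) + p2*log2(p2) + ...), which the lecture itself does not write out: for the coin from the example the initial uncertainty equals one bit, and it drops to zero once the outcome is known for certain.

```python
import math

# Shannon entropy (in bits) of a set of mutually exclusive outcomes,
# used as a measure of the uncertainty removed when the message is received.
def entropy(probabilities):
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))  # fair coin: 1.0 bit of initial uncertainty
print(entropy([1.0]))       # outcome known in advance: 0.0 bits, no information gained
print(entropy([0.9, 0.1]))  # biased coin: about 0.47 bits, less uncertainty than a fair one
```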

In algorithmic information theory (a branch of the theory of algorithms), an algorithmic method of evaluating the information in a message is proposed. This method can be briefly characterized by the following reasoning.

Everyone will agree that the word 0101…01 is more complex than the word 00…0, and a word in which 0 and 1 are chosen by experiment - tossing a coin (where 0 is heads and 1 is tails) - is more complex than both of the previous ones.

The computer program that produces a word of all zeros is extremely simple: print the same character over and over. To get 0101…01, a slightly more complex program is needed: print the character opposite to the one just printed. A random, patternless sequence cannot be produced by any "short" program; the length of a program producing a chaotic sequence must be close to the length of the sequence itself.
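Here is a hypothetical sketch of the three "programs" just described (added for illustration, not taken from the lecture): the two regular words are produced by one-line rules, while the coin-toss word has no short generating rule and, in essence, has to be written out literally.

```python
import random

n = 16  # length of the words being compared

# "Program" 1: print the same character n times -> the word 00...0
all_zeros = "0" * n

# "Program" 2: print the character opposite to the one just printed -> 0101...01
alternating = "".join("01"[i % 2] for i in range(n))

# "Program" 3: a word obtained by tossing a coin (0 = heads, 1 = tails).
# No short rule produces it; its shortest description is essentially the word itself.
random.seed(0)  # fixed seed only so that the example is reproducible
coin_word = "".join(random.choice("01") for _ in range(n))

print(all_zeros)    # 0000000000000000 - describable by a one-line rule
print(alternating)  # 0101010101010101 - describable by a slightly longer rule
print(coin_word)    # a patternless word - its shortest description is about as long as the word
```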

The above reasoning suggests that any message can be assigned a quantitative characteristic that reflects the complexity (size) of the program that allows it to be produced.

Since there are many different computers and programming languages (that is, different ways of specifying an algorithm), for definiteness some particular computing machine is fixed, for example a Turing machine, and the proposed quantitative characteristic - the complexity of a word (message) - is defined as the minimum number of internal states of a Turing machine required to reproduce it. Algorithmic information theory also uses other ways of specifying complexity.

1.3. Types and properties of information

Let us dwell in more detail on the disclosure of the concept of information. Consider the following list:

genetic information; geological information; synoptic (weather) information; false information (disinformation); complete information; economic information; technical information, etc.

Probably everyone will agree that not all types of information are given in this list, just as everyone will agree that the list itself is of little use: it is not systematic. For a classification of types to be useful, it must be based on some system. Usually, when classifying objects of the same nature, one or another property (or a set of properties) of the objects is used as the basis for classification.

As a rule, object properties can be divided into two large classes: external and internal properties.

Internal properties are properties inherent in the object itself. They are usually "hidden" from whoever studies the object and manifest themselves indirectly, through the interaction of this object with others.

External properties are properties that characterize the behavior of an object when it interacts with other objects.

Let us explain what has been said with an example. Mass is an internal property of a substance (matter). It manifests itself in interaction or in the course of some process. Hence arise such physical concepts as gravitational mass and inertial mass, which could be called external properties of matter.

A similar division of properties can also be given for information. For any information, you can specify three objects of interaction: the source of information, the receiver of information (its consumer) and the object or phenomenon that this information reflects. Therefore, three groups of external properties can be distinguished, the most important of which are the properties of information from the point of view of the consumer.

Information quality is a generalized positive characteristic of information reflecting the degree of its usefulness to the user.

The level of quality is one of the important positive properties of information (from the consumer's point of view). Any negative property can be replaced by its inverse, positive one.

Most often, quality indicators that can be expressed numerically are considered; such indicators are quantitative characteristics of the positive properties of information.

As is clear from the above definitions, in order to define a set of key quality indicators, it is necessary to evaluate information from the point of view of its consumer.

In practice the consumer faces the following situations:

some of the information corresponds to his request and requirements; such information is called relevant, while information that does not is called irrelevant;

all of the information is relevant, but there is not enough of it for the consumer's needs; if the information received is sufficient, it is natural to call it complete;

the information received is untimely (for example, outdated);

some of the information recognized by the consumer as relevant may turn out to be unreliable, that is, to contain hidden errors (if the consumer detects some of the errors, he simply reclassifies the corrupted information as irrelevant);

the information is not available;

the information is subject to "undesirable" use and modification by other consumers;

the information has a form and volume that are inconvenient for the consumer.

A review of the above situations allows us to formulate the following list of information properties.

Relevance - the ability of information to meet the needs (requests) of the consumer.

Completeness is the property of information to exhaustively (for a given consumer) characterize the reflected object and (or) process.

Timeliness is the ability of information to meet the needs of the consumer at the right time.

Reliability is the property of information not to contain hidden errors. Availability is the property of information that characterizes the possibility of its being obtained by a given consumer.

Security is a property that characterizes the impossibility of unauthorized use or change.

Ergonomics is a property that characterizes the convenience of the form or amount of information from the point of view of a given consumer.

In addition, in terms of its use, information can be classified into the following types: political, technical, biological, chemical, etc. This is essentially a classification of information according to the consumer's needs.

Finally, when characterizing the quality of information as a whole, the definition of scientific information is often used. Note that this last definition characterizes not the relationship "information - consumer" but the relationship "information - reflected object/phenomenon"; that is, it already belongs to the group of external properties of information. Here the most important property is adequacy.

Adequacy is the property of information to correspond unambiguously to the reflected object or phenomenon. For the consumer, adequacy turns out to be an internal property of information, manifesting itself through relevance and reliability.

Among the internal properties of information, the most important are the volume (quantity) of information and its internal organization, structure. According to the method of internal organization, information is divided into two groups:

1. Data or a simple, logically unordered set of information.

2. Logically ordered, organized data sets. The ordering of data is achieved by imposing some structure on it (hence the frequently used term data structure).
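As a small illustration added here (the field names are invented for the example), the same information can be kept as a plain, unordered collection of facts or as data on which a structure has been imposed; only in the second case can individual components be addressed and processed by name.

```python
# Unordered set of information: a plain collection of facts with no imposed structure.
unordered_facts = [
    "Krasnoyarsk, 2012",
    "lecture course",
    "Informatics, part 1",
]

# The same information after imposing a structure on it
# (hence the term "data structure"): each component now has a name.
structured_record = {
    "title": "Informatics, part 1",
    "kind":  "lecture course",
    "city":  "Krasnoyarsk",
    "year":  2012,
}

# With a structure, individual components can be retrieved directly.
print(structured_record["year"])  # 2012
```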

Within the second group one can single out information organized in a special way - knowledge. Knowledge, unlike data, is information not about a single specific fact but about how all facts of a certain type are arranged.

Finally, the properties of information associated with the process of its storage have so far remained outside our field of view. Here the most important property is survivability - the ability of information to maintain its quality over time. To this one can also add the property of uniqueness: information stored in a single copy is called unique.

Thus, we have described the main properties of information, and, accordingly, have determined the basis for classifying it by type.