The logical foundations of the computer


Any device or mechanism created by people operates according to specific laws that determine its features and functionality. The need to satisfy pressing practical needs is the main stimulus for the development of new kinds of machines, technologies, and so on. This becomes possible through the accumulation of knowledge in many areas of science and technology. Applying that knowledge first allows the logical premises of a new field of technology to be formulated, for example the logical foundations of the computer, and then translated into new types of equipment. In common parlance this is called "technological progress."

The impetus for the emergence of computers came from two directions: the need to process large volumes of information, and achievements in various fields of science and technology (electricity, mathematics, the physics and technology of semiconductors and metals, and many others). The first samples of electronic computing devices confirmed the principles of the computer and opened an era of rapid development for a new class of technical objects known as "electronic computing machines."

To implement the technical idea of a computing device, the logical foundations of the computer were formulated using the algebra of logic, which defined its set of functions and its theoretical basis. The laws of the algebra of logic that underlie the computer were formulated in the 19th century by the English mathematician George Boole. In essence, this algebra is the theoretical basis of digital information processing. Its core is a set of rules for logical relationships between values: conjunction, disjunction, and others, which are closely analogous to the familiar fundamental operations of arithmetic such as multiplication and addition. Numbers in Boolean algebra have a binary representation; that is, they are written using only the digits 1 and 0, and actions on them are described by the additional symbols of the algebra of logic. These elements make it possible, by combining simple mathematical and logical laws, to describe any computing or control task with special symbols, that is, to "write a program." Using an input device, the program is "loaded" into the computer and serves as an "instruction" for it to execute.
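The two Boolean operations named above can be sketched directly. This is a minimal illustration (the function names `conjunction` and `disjunction` are chosen here for clarity, not taken from any standard library) showing how each operation maps pairs of binary digits to a binary result:

```python
# Boole's two basic operations on binary values, expressed with
# Python's built-in bitwise operators. Inputs and outputs are 0 or 1.

def conjunction(a: int, b: int) -> int:
    """Logical AND: yields 1 only when both inputs are 1."""
    return a & b

def disjunction(a: int, b: int) -> int:
    """Logical OR: yields 1 when at least one input is 1."""
    return a | b

# Print the truth table for all four input combinations.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", conjunction(a, b), "OR:", disjunction(a, b))
```

Note the analogy to arithmetic mentioned above: on the digits 0 and 1, conjunction behaves exactly like multiplication.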

The input device converts the incoming symbols into electrical signals in binary form. The operations on them, the transfers and transformations that carry out arithmetic and logic operations, are performed by electronic devices called gates, adders, flip-flops, and so on. These make up the technical "stuffing" of the computer, where they number in the tens of thousands.
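How gates compose into an adder can be shown with a simple model. The sketch below is illustrative only: it builds a one-bit half adder, the classic building block of binary addition, from an XOR gate and an AND gate:

```python
# A one-bit half adder modeled in software: two logic gates are enough
# to add a pair of binary digits, producing a sum bit and a carry bit.

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Return (sum, carry) for two one-bit inputs."""
    s = a ^ b       # XOR gate produces the sum bit
    carry = a & b   # AND gate produces the carry bit
    return s, carry

print(half_adder(1, 1))  # 1 + 1 in binary: sum 0, carry 1 -> (0, 1)
```

Chaining such stages (with an extra OR gate per stage to merge carries) yields a full adder, and a row of full adders can add numbers of any width, which is essentially what the hardware adders mentioned above do.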

The design of a computer has four main components: the control unit (CU); operational and non-volatile memory (RAM and ROM); the arithmetic logic unit (ALU); and the input/output devices. Each of them, of course, follows the logical foundations of the computer built into its design. The workflow of booting a computer includes loading into RAM, or reading from ROM, an operating program written in a special code, which may be stored on punched cards, magnetic tapes, magnetic and optical disks, and other media. This program is designed to manipulate the flow of operational information and to obtain the programmed result, such as displaying an image on a monitor or converting an audio signal to digital form. To do this, the CU performs a series of transfers of information blocks between all the devices that make up the computer.
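The interplay of these components can be sketched as a toy fetch-execute loop. Everything here is invented for illustration (the opcodes `LOAD`, `ADD`, `PRINT`, `HALT` are not a real instruction set); it only shows the control unit reading instructions from memory and routing data between memory, the ALU, and an output device:

```python
# A toy model of a stored-program computer: "memory" holds the program,
# the loop plays the control unit, arithmetic stands in for the ALU,
# and print() stands in for the output device.

def run(program):
    acc = 0   # accumulator register
    pc = 0    # program counter
    while True:
        op, arg = program[pc]   # fetch the next instruction from memory
        pc += 1
        if op == "LOAD":        # decode and execute
            acc = arg           # place a value in the accumulator
        elif op == "ADD":
            acc = acc + arg     # the ALU performs the arithmetic
        elif op == "PRINT":
            print(acc)          # hand the result to the output device
        elif op == "HALT":
            return acc

program = [
    ("LOAD", 5),
    ("ADD", 7),
    ("PRINT", None),
    ("HALT", None),
]
run(program)  # prints 12
```

Real control units do the same thing in hardware: fetch, decode, execute, repeat, moving blocks of information between the units described above.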

The main "nerve center" of the computer is the ALU, the performer of all arithmetic and logic operations. Today the ALU's function is carried out by a device called a processor, or microprocessor: a semiconductor device the size of a couple of matchboxes with an incredible number of functions. Gradually the microprocessor acquired the functions of controlling external devices such as monitors and printers. Recent developments in this area have made it possible to create microprocessors containing the full set of a computer's functional units, giving rise to single-chip computers and pocket devices with the capabilities of a full-fledged computer. What is surprising is that the logical foundations of the computer, once worked out for the first computing device, have not changed to this day.