The human brain and the computer: what do they have in common? Amazing cases revealing the unique abilities of the human brain

Processors for personal computers are classified according to several criteria:

  • manufacturer;
  • family;
  • model within the family.

Choosing a processor begins with choosing a manufacturer. There are two main processor manufacturers: Intel and AMD. The choice must be approached strategically, because the processors and, accordingly, the platforms are not mutually compatible. That is, a processor of one family cannot simply be replaced with a processor of another family, even from the same company; the entire platform would have to be changed.

Processor architecture

Competition pushed developers to search for new ways to increase performance. Two new directions emerged:

  • expansion of the bit capacity of existing 32-bit processors to 64 bits;
  • integration into the processor of two or more cores directly involved in calculations.

Processor word size – the width of the data processed simultaneously, in bits.

CPU core – a set of arithmetic logic units, control units and cache memory implemented within a single processor microarchitecture.

Operating frequency – the switching frequency of the transistors in the processor core. It is obtained by multiplying the system bus clock frequency by a multiplier set by a special processor unit.

Clock frequency – the reference frequency generated by a special system bus device; it is used to synchronize the processor and the bus.
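The relationship between the reference clock frequency and the core operating frequency is simple multiplication. A minimal sketch is shown below; the bus frequency and multiplier values are hypothetical, chosen only for illustration:

```python
# Minimal sketch: core frequency = system bus clock * multiplier.
# The numbers below are hypothetical, for illustration only.
bus_clock_mhz = 200        # reference clock generated on the system bus
multiplier = 14            # coefficient set by the processor

core_frequency_mhz = bus_clock_mhz * multiplier
print(f"Core operating frequency: {core_frequency_mhz} MHz")  # 2800 MHz
```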

Modern processors have a von Neumann architecture, which includes:

  • arithmetic logic unit;
  • control unit;
  • memory unit;
  • input-output device.

CPU core

The main element of the processor that performs data processing is the arithmetic logic unit (ALU). The processor also has special memory cells called registers. They accept and return data at tremendous speed: as processing proceeds, data is read from the registers and written back to them.

All modern microprocessors are synchronous; that is, they change the state of their elements at the moments clock pulses arrive. Each clock pulse switches certain flip-flops. For example, data is loaded into registers only on the rising edge of the pulse and read only on the falling edge. This is why the ALU can both read from and write to a register within a single cycle.
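A toy model may make this edge behaviour easier to picture. The sketch below is only an illustration of the idea; the class and method names are invented for the example and do not describe real hardware:

```python
# Toy model of an edge-triggered register: a value is latched on the
# rising edge of the clock and becomes readable on the falling edge.
class Register:
    def __init__(self):
        self._latched = 0   # value captured on the rising edge
        self._output = 0    # value visible to the ALU on the falling edge

    def rising_edge(self, data_in):
        self._latched = data_in        # load on the rising edge

    def falling_edge(self):
        self._output = self._latched   # value becomes readable
        return self._output

reg = Register()
reg.rising_edge(42)        # write happens on the rising edge
print(reg.falling_edge())  # read happens on the falling edge -> 42
```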

The control unit, ALU and cache memory form the core of the processor.

System bus

The processor receives both data and the instructions for processing it from RAM cells via the system bus. The system bus consists of a data bus, an address bus and a control bus. The data bus transfers data from memory cells to the processor registers. Using the address bus, the processor specifies the cell from which the data should be read. The control bus carries the service signals that coordinate this exchange.
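To make the division of roles concrete, here is a deliberately simplified sketch of a single read transaction; the function and variable names are invented for the example, and a real bus protocol is far more involved:

```python
# Deliberately simplified model of one read over the system bus:
# the address bus selects the cell, the control bus says "read",
# and the data bus carries the value back to the processor.
ram = {0x0040: 0xDEADBEEF, 0x0044: 0x12345678}  # toy RAM cells

def bus_read(address):
    control = "READ"            # control bus: type of operation
    if control == "READ":
        return ram[address]     # data bus: value returned to a register

register = bus_read(0x0040)     # address bus: which cell to read
print(hex(register))            # 0xdeadbeef
```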

Cache memory

Inside the processor, all operations take place tens of times faster than exchanges with RAM. This means that the less often the processor has to go to memory for data and instructions, the faster it can operate. To reduce the number of such accesses, a relatively small block of very fast memory, capable of operating at the core frequency, is built into the processor. This block is called cache memory.

When accessing RAM cells, the processor fetches not only the data it needs immediately for loading into registers, but also some extra data “in reserve.” This reserve is written to the cache. If the stored data is needed in the next cycle, the processor takes it from the cache. If other data is required, the processor accesses RAM and the contents of the cache are updated. As a rule, modern processors have two blocks of internal cache memory. The first block (Level 1 cache, L1) is typically divided into a data cache and an instruction cache. The second block (Level 2 cache, L2) is used only for data storage. Some processor models (for example, Pentium 4 Extreme Edition) also have an L3 cache.
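The hit-or-miss logic described above can be sketched in a few lines. This is only a schematic illustration; the dictionary-based cache and the names used are assumptions made for the example, not how real hardware is organised:

```python
# Schematic illustration of the cache idea: check the fast cache first,
# fall back to slow RAM on a miss and refresh the cache with the result.
ram = {addr: addr * 2 for addr in range(1024)}  # toy "slow" memory
cache = {}                                      # toy "fast" cache

def load(address):
    if address in cache:            # cache hit: no trip to RAM needed
        return cache[address]
    value = ram[address]            # cache miss: slow access to RAM
    cache[address] = value          # keep a copy "in reserve"
    return value

load(100)        # miss: read from RAM, stored in the cache
print(load(100)) # hit: served from the cache -> 200
```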

CPU socket

To connect the cache memory, I/O units, clock signals and power, the processor requires hundreds of lines. Therefore, the core and the other processor blocks are placed in a sealed package equipped with many contact pins or pads. The package is inserted into the processor socket on the system board, and from the socket the bus lines run to the other computer devices.

Processor sockets are usually labeled by the number of pins, for example Socket 775 or Socket 939. Processors of the same family and the same architecture may have different packages and different, mutually incompatible sockets. The opposite picture (one socket for processors of different architectures) is very rare.

Why does a processor socket need hundreds of pins? Mainly for power supply. The data bus in the Pentium 4 processor is 64 bits wide and requires 64 lines. The address bus occupies 36 lines. 124 lines are allocated for service needs, and 28 pins are reserved. All remaining pins are used to supply power: of the 775 pins of the Socket 775 connector, 523 are power lines.
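The arithmetic behind that figure is easy to verify; a trivial check, assuming the per-group counts quoted above:

```python
# Check of the pin budget quoted above for Socket 775.
total_pins = 775
data, address, service, reserved = 64, 36, 124, 28

power_pins = total_pins - (data + address + service + reserved)
print(power_pins)  # 523
```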

This number of power lines is explained by the features of the processor architecture. A modern processor contains over 150 million transistors. Each must be supplied with current: a tiny current, a fraction of a microampere, but there are a hundred and fifty million of them. As a result, the total current consumption of the processor reaches tens of amperes. For example, the maximum current consumption of a Pentium 4 processor with a Prescott core is 119 A. For comparison, the maximum permissible current in a household electrical circuit usually does not exceed 16 A.

Every human brain is something special, an incredibly complex miracle of nature created over millions of years of evolution. Today the brain is often called a real computer, and this comparison is not made in vain.

And today we will try to understand why scientists call the human brain a biological computer, and what interesting facts exist about it.

Why the brain is a biological computer

Scientists call the brain a biological computer for obvious reasons. Like the central processor of any computer system, the brain is responsible for the operation of all the elements and nodes of the system. Just as a processor manages RAM, the hard drive, the video card and other PC components, the human brain controls vision, breathing, memory and every other process occurring in the human body. It processes incoming data, makes decisions and performs all the intellectual work.

As for the “biological” characteristic, its presence is also quite obvious, because, unlike conventional computer technology, the human brain is of biological origin. So it turns out that the brain is a real biological computer.

Like most modern computers, the human brain has a huge number of functions and capabilities. And we offer some of the most interesting facts about them below:

  • Even at night, when our body is resting, the brain does not fall asleep, but, on the contrary, is in a more active state than during the day;
  • The exact amount of information the human brain can store is currently unknown to scientists. However, they suggest that this “biological hard drive” can hold up to 1000 terabytes of information;
  • The average weight of the brain is about one and a half kilograms, and, like a muscle, it grows with training. True, in this case training means gaining new knowledge, improving memory, and so on;
  • Despite the fact that it is the brain that reacts to any damage to the body by sending pain signals to the corresponding parts of the body, it itself does not feel pain. When we feel a headache, it is only pain in the tissues and nerves of the skull.

Now you know why the brain is called a biological computer, which means you have done a little training of your brain. Don't stop there, and systematically learn something new.

Hello, aspiring “computer geniuses”! I am writing mainly for the older generation: people who have not dealt with computers before, but who today want to understand how this strange machine works, a machine that already understands our speech and answers our questions in a pleasant voice.

Humanity has always imitated nature in creating mechanisms.

It (nature) suggested how to create the wings of airplanes and helicopters, rocket jet engines and other inventions. All of them are created in the likeness of animals, birds, insects and other creatures. The time has finally come to create a semblance of Homo sapiens, and this semblance of a reasonable person is on our table, in our pocket, in our car. All these smart devices (gadgets) have different bodies and faces, but they are designed and work according to the same rules, often copied from humans.

Computer and man – what do they have in common?

Of course, comparing a computer with a person is like comparing a bird with an airplane, but still...

The most important thing in a person is the brain (at least while the person is alive). Our brain has sections that process the images received from the eyes and the signals from the other sense organs. All this information is processed: some of it is held in temporary memory, some is recorded (remembered) in long-term memory, and some is deleted into the “recycle bin” with the possibility of later restoration.


The brain of a computer is its processor. The processor, just like the brain, reads information from video cameras, microphones, the computer mouse or voice commands, and after processing it gives us a picture on the monitor or sound through the speakers. The computer also has temporary memory (cache), random access memory and long-term memory stored on various disks and flash drives. At any time we can delete unneeded information into the trash, and later either empty it for good or restore accidentally deleted documents.

Computer and human power supply system

A human being runs on electrochemical processes. Each of us is an object controlled by weak electric fields and chemical reactions. We produce energy from biological food, and we have a complex digestive system for doing so.

A computer, as you know, runs on electricity; its power is provided by a power supply unit or batteries. The entire computer power system is connected by ultra-thin conductors; in humans this role is played by blood vessels, muscles, nerves and other connections.

Human and computer learning

Computers originated in the second half of the last century. Unlike newborn humans, the first computers occupied huge rooms. And just as a person grows smarter and larger with age, computers became smarter and smaller. At first, small calculation programs were written for computers. Over time, programmers combined ready-made programs into groups of cooperating programs. The system became a union of thousands of programs working together to solve complex problems. So humanity, through joint efforts, created powerful processors controlled by millions of programs.

Nowadays, the computer is already a fully grown young companion of earthlings. Fantastic opportunities lie ahead of it in its connection with people. I cannot say with certainty whether this is good or bad, but I am sure that the “CREATOR” of humanity will not destroy his creation. I hope the article was useful to someone.

Thanks in advance to everyone who shared information on social networks.

The last century marked a major leap in human development. Having travelled the difficult path from the ABC book to the Internet, people have still not solved the main mystery that has tormented great minds for hundreds of years: how does the human brain work, and what is it capable of?

To this day, this organ remains the most poorly studied, yet it is this organ that made man what he is now: the highest stage of evolution. The brain, continuing to keep its secrets and mysteries, determines a person's activity and consciousness at every stage of life. No modern scientist has yet been able to unravel all the possibilities of which it is capable. That is why a large number of myths and unsubstantiated hypotheses have accumulated around one of the most important organs of our body. This can only indicate that the hidden potential of the human brain has yet to be explored, and for now its abilities go beyond the boundaries of established ideas about how it works.


Photo: Pixabay/geralt

Brain structure

This organ consists of a huge number of connections that create stable interaction between cells and processes. Scientists suggest that if all these connections were laid out in a straight line, its length would be eight times the distance to the Moon.

The mass fraction of this organ in the total body mass is no more than 2%, and its weight varies between 1019 and 1960 grams. From the moment of birth to a person's last breath it works continuously, and therefore it needs to absorb about 21% of all the oxygen entering the human body. Scientists have drawn up an approximate picture of how the brain stores information: its memory can hold from 3 to 100 terabytes, while the memory of a modern computer is currently being pushed toward a volume of 20 terabytes.

The most common myths about the human biological computer

Neuronal tissue dies during the life of the body, and new tissue is not formed. This is a fallacy that Elizabeth Gould has shown to be absurd. Nervous tissue and neurons are constantly renewed, and new connections replace dead ones. Research has confirmed that in areas of cells destroyed by stroke, the human body is able to “grow” new material.

We use only 5-10% of the brain's capacity; all its other possibilities go unused. Some scientists explained this by saying that nature, having created such a complex and developed mechanism, provided it with a protective system that guards the organ from excessive stress. This is wrong. It is reliably known that the brain is engaged 100% during any human activity; it is just that, while performing any particular action, its individual areas respond one by one.

Superpowers. What can surprise the human mind?

Some people who show no outward signs of anything unusual may possess truly incredible abilities. They do not appear in everyone, but scientists say that regular, intensive brain training can develop superpowers. The secret of “selecting” the people who may earn the right to be called geniuses has not yet been revealed. Some people know how to get out of difficult situations competently, while others sense approaching danger on a subconscious level. But the following superpowers are more interesting from a scientific point of view:

  • The ability to perform mathematical operations of any complexity without the help of a calculator or calculations on paper;
  • The ability to create brilliant creations;
  • Photographic memory;
  • Speed reading;
  • Psychic abilities.

Amazing cases of revealing the unique abilities of the human brain

Over the entire history of human existence, a large number of stories have appeared confirming the fact that the human brain can have hidden abilities, adapt to changing situations and shift certain functions from the affected part to the healthy part.

Sonar vision. This ability usually develops after loss of sight. Daniel Kish managed to master the echolocation technique used by bats. The sounds he makes, such as clicking his tongue or snapping his fingers, help him walk without a cane.

Mnemonics – a technique that allows you to perceive and remember almost any amount of information, regardless of its nature. Many people master it in adulthood, but the American Kim Peek had this gift from birth.

The gift of foresight. Some people claim that they can see the future. At the moment, this fact has not been fully proven, but history knows many people whom such an ability has made famous throughout the world.

Phenomena of which the human brain is capable

Carlos Rodriguez, at the age of 14, lost more than 59% of his brain after an accident, but still lives a completely normal life.

Yakov Tsiperovich, after clinical death and a week-long stay in a comatose state, stopped sleeping, eats little and does not age. Three decades have passed since that moment, and he is still young.

Phineas Gage suffered a terrible injury in the mid-19th century. A thick iron rod passed through his head, depriving him of a good part of his brain. The medicine of those years was not sufficiently advanced, and doctors predicted his imminent death. However, the man not only did not die, but also retained his memory and clarity of mind.

The human brain, like the body, needs constant training. This can be specially designed programs, or simply reading books, solving puzzles and logic problems. At the same time, we should not forget to supply this organ with nutrients. For example, the brain activity supplement HeadBooster (http://hudeemz.com/headbooster) contains a large number of them. But still, only constant training allows the brain to keep developing and increasing its capabilities.

Despite their best efforts, neuroscientists and cognitive psychologists will never find a copy of Beethoven's Fifth Symphony in the brain, nor copies of words, pictures, grammatical rules or any other external stimuli. Of course, the human brain is not completely empty. But it does not contain most of the things people think it contains, not even simple things like “memories.”

Our misconceptions about the brain have deep historical roots, but the invention of the computer in the 1940s confused us even more. For half a century, psychologists, linguists, neuroscientists and other experts on human behavior have argued that the human brain works like a computer.

To see how frivolous this idea is, consider the brains of babies. A healthy newborn has more than ten reflexes. It turns its head in the direction its cheek is stroked and sucks whatever enters its mouth. It holds its breath when immersed in water. It grabs things in its hands so tightly that it can almost support its own weight. But perhaps most importantly, newborns have powerful learning mechanisms that allow them to change quickly so they can interact more effectively with the world around them.

Feelings, reflexes and learning mechanisms are what we have from the very beginning, and when you think about it, that's quite a lot. If we lacked any of these abilities, we would probably have difficulty surviving.

But here's what we don't have since birth: information, data, rules, knowledge, vocabulary, representations, algorithms, programs, models, memories, images, processors, subroutines, encoders, decoders, symbols and buffers - the elements that allow digital computers to behave somewhat rationally. Not only are these things not in us from birth, they do not develop in us during life.

We do not store words or the rules that tell us how to use them. We do not create representations of visual stimuli, store them in a short-term memory buffer, and then transfer them to a long-term memory device. We do not retrieve information, images or words from a memory register. Computers do all this, but living beings do not.

Computers literally process information - numbers, words, formulas, images. The information must first be translated into a format that a computer can recognize, that is, into sets of ones and zeros (“bits”) collected into small blocks (“bytes”).

Computers move these sets from place to place between various areas of physical memory, implemented as electronic components. Sometimes they copy the sets, and sometimes they transform them in various ways, say, when you correct errors in a manuscript or retouch a photograph. The rules that a computer follows when moving, copying or working with an array of information are also stored inside the computer. A set of rules is called a “program” or “algorithm.” A set of algorithms working together that we use for different purposes (for example, buying stocks or dating online) is called an “application.”
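As a toy illustration of what “translating information into bits and bytes” and “following stored rules” looks like in practice, here is a minimal sketch; the example text and the rule are, of course, invented for the purpose:

```python
# Toy illustration: a computer works on symbolic representations.
# Text is first encoded into bytes (and ultimately bits), and a stored
# rule (a tiny "algorithm") transforms that representation.
text = "brain"                        # invented example input
encoded = text.encode("utf-8")        # bytes: symbolic representation
bits = " ".join(f"{byte:08b}" for byte in encoded)

def rule_uppercase(data: bytes) -> bytes:
    """A stored rule the machine follows: transform the representation."""
    return data.upper()

print(bits)                            # e.g. 01100010 01110010 ...
print(rule_uppercase(encoded))         # b'BRAIN'
```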

These are well-known facts, but they need to be spelled out to make one thing clear: computers operate on symbolic representations of the world. They really do store and retrieve. They really do process. They really do have physical memory. They really are driven by algorithms in everything they do.

However, people don’t do anything like that. So why do so many scientists talk about our mental activity as if we were computers?

In 2015, artificial intelligence expert George Zarkadakis released a book, In Our Image, in which he describes six different concepts that people have used over the past two thousand years to describe human intelligence.

In the earliest version of the Bible, humans were created from clay or mud, which an intelligent God then imbued with his spirit. This spirit “describes” our mind - at least from a grammatical point of view.

The invention of hydraulics in the 3rd century BC led to the popularity of the hydraulic concept of human consciousness. The idea was that the flow of various fluids in the body - "bodily fluids" - accounted for both physical and spiritual functions. The hydraulic concept persisted for more than 1,600 years, all the while hampering the development of medicine.

By the 16th century, devices powered by springs and gears had appeared, which inspired René Descartes to argue that man is a complex machine. In the 17th century, British philosopher Thomas Hobbes proposed that thinking occurs through small mechanical movements in the brain. By the beginning of the 18th century, discoveries in the field of electricity and chemistry led to the emergence of a new theory of human thinking, again of a more metaphorical nature. In the mid-19th century, German physicist Hermann von Helmholtz, inspired by recent advances in communications, compared the brain to a telegraph.

Albrecht von Haller. Icones anatomicae

Each concept reflects the most advanced ideas of the era that gave birth to it. As one might expect, just a few years after the birth of computer technology in the 1940s, it was argued that the brain worked like a computer: the brain itself played the role of the physical carrier, and our thoughts acted as the software.

This view reached its zenith in the 1958 book The Computer and the Brain, in which mathematician John von Neumann stated emphatically that the function of the human nervous system is “digital in the absence of evidence to the contrary.” Although he acknowledged that very little is known about the role of the brain in the functioning of intelligence and memory, the scientist drew parallels between the components of computer machines of that time and areas of the human brain.

Image: Shutterstock

Thanks to subsequent advances in computer technology and brain research, an ambitious interdisciplinary study of human consciousness gradually developed, based on the idea that people, like computers, are information processors. This work now includes thousands of studies, receives billions of dollars in funding, and has been the subject of numerous papers. Ray Kurzweil's 2013 book How to Create a Mind: Unraveling the Mystery of Human Thinking illustrates this point, describing the brain's "algorithms", its "information processing" techniques, and even how it superficially resembles integrated circuits in its structure.

The idea of human thinking as information processing (the “IP” metaphor) currently dominates both among ordinary people and among scientists. But it is, in the end, just another metaphor: a fiction that we pass off as reality in order to explain something we do not really understand.

The flawed logic of the IP metaphor is quite easy to state. It is based on a faulty syllogism with two reasonable premises and a wrong conclusion. Reasonable premise #1: all computers are capable of intelligent behavior. Reasonable premise #2: all computers are information processors. Faulty conclusion: all objects capable of behaving intelligently are information processors.

Setting formalities aside, the idea that people must be information processors just because computers are is complete nonsense, and when the IP metaphor is finally abandoned, historians will probably view it the same way we now view the hydraulic and mechanical concepts: as nonsense.

Carry out an experiment: draw a hundred-ruble bill from memory, and then take it out of your wallet and copy it. Do you see the difference?

A drawing made in the absence of the original will certainly turn out to be terrible compared with a drawing made from life, even though you have seen this bill thousands of times.

What is the problem? Shouldn't the "image" of the banknote be "stored" in the "storage register" of our brain? Why can't we just "refer" to this "image" and depict it on paper?

Obviously not, and a thousand years of research would still not locate the image of this bill in the human brain, simply because it is not there.

The idea, promoted by some scientists, that individual memories are somehow stored in special neurons is absurd. Among other things, this theory takes the question of the structure of memory to an even more intractable level: how and where is memory stored in cells?

One of the predictions, which was expressed in one form or another by futurist Ray Kurzweil, physicist Stephen Hawking and many others, is that if human consciousness is like a program, then technologies should soon appear that will allow it to be loaded onto a computer, thereby greatly enhancing intellectual abilities and making immortality possible. This idea formed the basis of the plot of the dystopian film Transcendence (2014), in which Johnny Depp played a scientist similar to Kurzweil. He uploaded his mind to the Internet, causing devastating consequences for humanity.

Still from the film "Transcendence"

Fortunately, the IP concept has nothing even remotely to do with reality, so we do not have to worry about the human mind running amok in cyberspace, and, sadly, we will never be able to achieve immortality by downloading our souls to another medium. It is not just that the brain lacks software; the problem is even deeper. Let us call it the problem of uniqueness, and it is both fascinating and depressing.

Since our brains have neither “memory devices” nor “images” of external stimuli, and the brain changes over the course of life under the influence of external conditions, there is no reason to believe that any two people in the world will react to the same stimulus in the same way. If you and I attend the same concert, the changes that happen in your brain after listening will be different from the changes that happen in my brain. These changes depend on the unique structure of nerve cells, which was formed during the entire previous life.

This is why, as Frederick Bartlett wrote in his 1932 book Memory, two people hearing the same story will not be able to retell it exactly the same way, and over time their versions of the story will become less and less similar to each other.

"Superiority"

I think this is very inspiring, because it means that each of us is truly unique, not only in our genes but also in the way our brain changes over time. But it is also disheartening, because it makes the already difficult work of neuroscientists almost hopeless. Each change can affect thousands or millions of neurons, or even the whole brain, and the nature of these changes is unique in every case.

Worse, even if we could record the state of each of the brain's 86 billion neurons and simulate it all on a computer, this enormous model would be useless outside the body to which the brain belongs. This is perhaps the most annoying misconception about human nature that we owe to the erroneous IP concept.

Computers store exact copies of data. They can remain unchanged for a long time even when the power is turned off, while the brain supports our intelligence only as long as it remains alive. There is no switch. Either the brain will work without stopping, or we will not exist. Moreover, as neuroscientist Stephen Rose noted in 2005's The Future of the Brain, a copy of the brain's current state may be useless without knowing the full biography of its owner, even including the social context in which the person grew up.

Meanwhile, huge amounts of money are spent on brain research based on false ideas and promises that will not be fulfilled. Thus, the European Union launched a project to study the human brain worth $1.3 billion. European authorities believed the tempting promises of Henry Markram to create a working simulator of brain function based on a supercomputer by 2023, which would radically change the approach to the treatment of Alzheimer's disease and other ailments, and provided the project with almost unlimited funding. Less than two years after the project launched, it turned out to be a failure, and Markram was asked to resign.

People are living organisms, not computers. Accept it. We need to continue the hard work of understanding ourselves without dragging along unnecessary intellectual baggage. Over the half-century of its existence, the IP concept has given us only a few useful discoveries. It is time to press the Delete key.

Robert Epstein is a senior psychologist at the American Institute for Behavioral Research and Technology in California. He is the author of 15 books and the former editor-in-chief of Psychology Today.