The World Wide Web (WWW): a brief overview

What is the World Wide Web?

The web is a collection of interconnected pages carrying specific information. Each such page can contain text, images, video, audio and various other objects. In addition, web pages contain so-called hyperlinks. Each such link points to another page, which may be located on some other computer on the Internet.

These various information resources, interconnected by means of telecommunications and based on a hypertext representation of data, form the World Wide Web, or WWW for short.

Hyperlinks connect pages hosted on different computers in different parts of the world. The huge number of computers united into one network is the Internet, while the "World Wide Web" is the huge number of web pages hosted on those networked computers.

Each web page on the Internet has an address: a URL (Uniform Resource Locator). It is by this address that any page can be found.
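The parts of a URL can be seen by splitting one apart. Here is a minimal sketch using Python's standard urllib.parse module; the example address itself is made up for illustration:

```python
from urllib.parse import urlparse

# A hypothetical page address, broken into its components.
url = "http://www.example.com/articles/www-history.html"
parts = urlparse(url)

print(parts.scheme)  # -> http                       (protocol used to fetch the page)
print(parts.netloc)  # -> www.example.com            (domain name of the hosting computer)
print(parts.path)    # -> /articles/www-history.html (location of the page on that computer)
```

Every browser performs exactly this kind of decomposition before it can contact the right server and request the right document.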

How was the World Wide Web created?

On March 12, 1989, Tim Berners-Lee presented to the CERN management a project for a unified system of organizing, storing and providing shared access to information, which was meant to solve the problem of sharing knowledge and experience between the Center's employees. Berners-Lee proposed giving employees access to information on each other's computers through browser programs that query a server computer where hypertext information is stored. After the successful implementation of the project, Berners-Lee was able to convince the rest of the world to adopt common Internet communication standards based on the Hypertext Transfer Protocol (HTTP) and the HyperText Markup Language (HTML).

It should be noted that Tim Berners-Lee did not create the Internet itself. The first system of protocols ensuring data transfer between networked computers was developed by Vinton Cerf and Robert Kahn, working for the US Defense Advanced Research Projects Agency (DARPA), in the late 1960s and early 1970s. Berners-Lee proposed using the capabilities of existing computer networks to build a new system for organizing information and accessing it.

What was the prototype of the World Wide Web?

Back in the 1960s, the US Department of Defense set the task of developing a reliable information transmission system that could survive a war. The US Advanced Research Projects Agency (ARPA) proposed developing a computer network for this purpose, called ARPANET (Advanced Research Projects Agency Network). The project brought together four scientific institutions: the University of California, Los Angeles, the Stanford Research Institute, and the universities of Santa Barbara and Utah. All work was financed by the US Department of Defense.

The first data transmission over the network took place in 1969. A professor at the University of California, Los Angeles, and his students tried to log into Stanford's computer and transmit the word "login." Only the first two letters, L and O, were transmitted successfully; when they typed the letter G, the communication system failed. Nevertheless, the Internet revolution had begun.

By 1971, a network with 23 users had been created in the United States, and the first program for sending email over the network was developed. In 1973, University College London and civil services in Norway joined the network, making it international. In 1977 the number of Internet users reached 100; in 1984, 1,000; in 1986, more than 5,000; in 1989, more than 100,000. In 1991, the World Wide Web (WWW) project was implemented at CERN. By 1997 there were already 19.5 million Internet users.

Some sources date the emergence of the World Wide Web one day later, to March 13, 1989.

Structure and principles of the World Wide Web

Graphic representation of information on the World Wide Web

The World Wide Web is made up of millions of web servers located around the world. A web server is a program that runs on a networked computer: it receives requests, reads the requested resource from its hard drive, and sends it over the network to the requesting computer. More complex web servers are capable of generating resources dynamically in response to an HTTP request.

To identify resources (often files or parts of them) on the World Wide Web, Uniform Resource Identifiers (URIs) are used. To determine the location of a resource on the network, Uniform Resource Locators (URLs) are used; a URL combines URI identification with the Domain Name System (DNS), that is, with a domain name (or directly an IP address) pointing to the computer that hosts the resource.

The World Wide Web is inextricably linked with the concepts of hypertext and hyperlinks; most of the information on the web is hypertext. To facilitate the creation, storage and display of hypertext, the HyperText Markup Language (HTML) is traditionally used. The work of marking up hypertext is called layout, and a markup specialist is called a webmaster. After HTML markup, the resulting hypertext is placed in a file; such HTML files are the most common resources on the World Wide Web. Once an HTML file is made available through a web server, it is called a "web page," and a collection of web pages makes up a website. Hyperlinks embedded in the hypertext of web pages help users navigate easily between resources (files), regardless of whether those resources are located on the local computer or on a remote server. Web hyperlinks are based on URL technology. The main function of a web browser is to display hypertext.
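How hyperlinks are embedded in HTML markup can be sketched with Python's standard html.parser module. The page content below is invented for illustration; the parser simply collects the target of every link it encounters:

```python
from html.parser import HTMLParser

# A tiny invented web page: plain text plus two hyperlinks.
page = """
<html><body>
<p>See the <a href="http://example.com/history.html">history</a>
and the <a href="http://example.com/tech.html">technology</a> pages.</p>
</body></html>
"""

class LinkCollector(HTMLParser):
    """Collects the href target of every <a> tag encountered."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

collector = LinkCollector()
collector.feed(page)
print(collector.links)
# -> ['http://example.com/history.html', 'http://example.com/tech.html']
```

This is essentially what a browser (or a search-engine crawler) does first: extract the URLs from a page's hypertext so the linked resources can be fetched in turn.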

World Wide Web Technologies

In general, we can conclude that the World Wide Web rests on "three pillars": HTTP, HTML and URL. Recently, though, HTML has begun to cede some ground to more modern markup technologies such as XML (eXtensible Markup Language), which is positioned as a foundation for other markup languages. To improve the visual presentation of the web, CSS (Cascading Style Sheets) technology has come into wide use, allowing uniform design styles to be applied across many web pages. Another innovation worth noting is the Uniform Resource Name (URN) system for naming resources.
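XML's role as a base for other markup languages can be illustrated with a small invented document, parsed here with Python's standard xml.etree module; the element names are made up for the example:

```python
import xml.etree.ElementTree as ET

# An invented XML fragment describing a web resource.
doc = """
<resource>
  <title>World Wide Web</title>
  <url>http://example.com/www.html</url>
</resource>
"""

root = ET.fromstring(doc)
print(root.find("title").text)  # -> World Wide Web
print(root.find("url").text)    # -> http://example.com/www.html
```

Unlike HTML, whose tag set is fixed, XML lets each application define its own tags, which is exactly why it can serve as a foundation for other markup languages.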

A popular concept for the development of the World Wide Web is the Semantic Web. The Semantic Web is an add-on to the existing World Wide Web designed to make the information posted on the network more understandable to computers. It is the concept of a network in which every resource, alongside its human-readable content, carries a description understandable to a computer. The Semantic Web opens access to clearly structured information for any application, regardless of platform and regardless of programming language. Programs would be able to find the necessary resources themselves, process information, classify data, identify logical connections, draw conclusions and even make decisions based on those conclusions. If widely adopted and implemented wisely, the Semantic Web has the potential to spark a revolution on the Internet. To create computer-readable descriptions of resources, the Semantic Web uses the RDF (Resource Description Framework) format, which is based on XML syntax, together with RDF Schema and SPARQL (SPARQL Protocol and RDF Query Language, pronounced "sparkle"), a query language for fast access to RDF data.

History of the World Wide Web

Tim Berners-Lee and Robert Cailliau are considered the inventors of the World Wide Web. Tim Berners-Lee is the originator of the HTTP, URI/URL and HTML technologies. In 1980, while working at CERN (Conseil Européen pour la Recherche Nucléaire) in Geneva, Switzerland, he wrote for his own needs the program Enquire (which can be loosely translated as "Interrogator"); it used random associations to store data and laid the conceptual foundation for the World Wide Web.

There is also the popular concept of Web 2.0, which summarizes several directions in the development of the World Wide Web.

Methods for actively displaying information on the World Wide Web

Information on the web can be displayed either passively (the user can only read it) or actively (the user can add information and edit it). Methods for actively publishing information on the World Wide Web include, among others, guest books, forums, blogs and content management systems.

It should be noted that this division is quite arbitrary. A blog or a guest book, say, can be considered a special case of a forum, which in turn is a special case of a content management system. Usually the difference lies in the purpose, approach and positioning of a particular product.

Some information on websites can also be accessed through speech. India has already begun testing a system that makes the text content of pages accessible even to people who cannot read or write.

Organizations involved in the development of the World Wide Web and the Internet in general

Links

  • Berners-Lee's famous book Weaving the Web: The Origins and Future of the World Wide Web, available online in English



Hello, dear readers of the blog. We all live in the era of the global Internet and use the terms site, web, and WWW (World Wide Web) quite often, without really going into what they mean.

I see the same thing from other authors, and even from ordinary interlocutors. "Site," "Internet," "network" and the abbreviation "WWW" have become such everyday concepts that it doesn't even occur to us to think about their essence. Yet the first website was born only some twenty years ago. So what is the Internet?

After all, it has a rather long history; however, before the advent of the global web (WWW), 99.9% of the planet's inhabitants did not even suspect its existence, because it was the domain of specialists and enthusiasts. Now even the Eskimos know about the World Wide Web; in their language, the word has come to be identified with a shaman's ability to find answers in the layers of the universe. So let's work out for ourselves what the Internet, a website, the World Wide Web, and everything else actually are.

What is the Internet, and how does it differ from the World Wide Web? The most remarkable fact that can be stated right away is that the Internet belongs to no one. In essence, it is an association of individual local networks (thanks to common standards adopted long ago, namely the TCP/IP protocol) that is kept in working order by network providers.

It is believed that, due to ever-growing media traffic (video and other heavy content moving around the network in tons), the Internet may soon collapse because of its currently limited bandwidth. The main obstacle here is upgrading the network equipment that makes up the global web to higher-speed hardware, which is held back primarily by the additional costs involved. But I think the problem will be solved as the threat of collapse ripens, and separate segments of the network already operate at high speeds.

In general, given that the Internet essentially belongs to no one, it should be mentioned that many states, trying to introduce censorship on the global network, want to equate it (namely its currently most popular component, the WWW) with the mass media.

But there is actually no basis for this desire, because the Internet is just a means of communication, in other words a storage and transmission medium comparable to a telephone or even plain paper. Try applying sanctions to paper or to its distribution around the planet. In practice, individual states can apply sanctions only to sites (islands of information on the network) that become available to users via the World Wide Web.

The first steps toward creating the global web and the Internet were taken in... what year, do you think? Surprisingly, back in the distant days of 1957. Naturally, the military (and, naturally, the US military, where would we be without them) needed such a network for communication in the event of military operations involving nuclear weapons. It took quite a long time to create the network (about 12 years), but this can be explained by the fact that computers were then in their infancy.

Nevertheless, their power was quite enough for a communications network linking the military departments and leading US universities to be created by 1971. The email transfer protocol thus became the first way ordinary users put the Internet to work. A couple of years later, people overseas already knew what the Internet was. By the beginning of the 80s the main data transfer protocols had been standardized (mail among them), and the Usenet newsgroup protocol appeared, which was similar to mail but made it possible to organize something resembling forums.

A few years later, the idea of a domain name system (DNS, which would play a crucial role in the formation of the WWW) appeared, along with the world's first protocol for real-time communication over the Internet: IRC (in colloquial Russian, "irka"). It allowed people to chat online. Science fiction, accessible and interesting to only a very small number of the inhabitants of planet Earth. But only for the time being.

At the turn of the 80s and 90s, events took place in the history of the Internet's development so significant that they effectively predetermined its future. The spread of the global network in the minds of the planet's modern inhabitants is due almost entirely to a single person: Tim Berners-Lee.

Berners-Lee is an Englishman, born into a family of two mathematicians who dedicated their lives to creating one of the world's first computers. It is thanks to him that the world learned what the Internet, the website and email are. He originally created the World Wide Web (WWW) for the needs of nuclear research at CERN (the same place that runs the collider). The task was to conveniently place all the scientific information available to the organization on its own network.

To solve this problem, he came up with everything that is now fundamental to the WWW (which we tend to call the Internet without quite grasping its essence). As a basis he took a principle of organizing information called hypertext. What is it? The principle was invented long before him and consists of organizing text in such a way that the linearity of the narrative is replaced by the ability to move along different links (connections).

The Internet is hypertext, hyperlinks, URLs and hardware

Thanks to this, hypertext can be read in different sequences, yielding different versions of linear text (this should be clear and obvious to you now, as experienced Internet users, but at the time it was a revolution). The role of hypertext nodes was to be played by hyperlinks, which we now simply call links.

As a result, all the information that now exists on computers can be represented as one large hypertext comprising countless nodes (hyperlinks). Everything Tim Berners-Lee developed was transferred from the local CERN network to what we now call the Internet, after which the Web began to gain popularity at breakneck speed (its first fifty million users registered within just the first five years of its existence).

But to implement the principle of hypertext and hyperlinks, several things had to be created and developed from scratch at once. Firstly, a new data transfer protocol was needed, which is now known to all of you as the HTTP protocol (at the beginning of every website address you will find a mention of it, or of its secure version, HTTPS).
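What an HTTP exchange looks like "on the wire" can be sketched without any network at all. The request below is the kind of text a browser sends when you open a page; the host name is made up for the example:

```python
# A minimal HTTP/1.1 GET request, exactly as a browser would transmit it.
host = "www.example.com"
path = "/index.html"
request = (
    f"GET {path} HTTP/1.1\r\n"  # method, resource path, protocol version
    f"Host: {host}\r\n"         # which site on the server we want
    "Connection: close\r\n"     # close the connection after the reply
    "\r\n"                      # blank line ends the request headers
)
print(request)

# A typical first line of the server's reply, and its parsed status code.
status_line = "HTTP/1.1 200 OK"
version, code, reason = status_line.split(" ", 2)
print(code, reason)  # -> 200 OK
```

The protocol is plain text by design, which is part of why it spread so quickly: any program that can read and write lines over a TCP connection can speak HTTP.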

Secondly, the HTML hypertext markup language was developed from scratch; its abbreviation is now known to every webmaster in the world. So, we have tools for transferring data and for creating websites (sets of web pages, or web documents). But how can one refer to those documents?

The answer was URI and URL addresses. The URI made it possible to identify a document within a particular server (site), while the URL made it possible to combine with that identifier either a domain name (uniquely indicating that the document belongs to a site hosted on a specific server) or an IP address (the unique numeric identifier of any device on a global or local network).

There is only one step left to take for the World Wide Web to finally work and become in demand by users. Do you know which one?

Well, of course, we needed a program that could display on the user's computer the contents of any web page requested on the Internet (by its URL address). Such a program was the web browser. If we talk about today, there are not that many major players in this market, and I managed to cover all of them in a short review:

  1. Internet Explorer (IE, MSIE) - the old guard, still in service
  2. Mozilla Firefox - another veteran, not ready to give up its position without a fight
  3. Google Chrome - an ambitious newcomer that managed to take the lead in record time
  4. - a browser long beloved in RuNet, but gradually losing popularity
  5. - a browser from the Apple stable

Timothy John Berners-Lee wrote the program for the world's first Internet browser himself and called it, without further ado, WorldWideWeb. Although it was far from the height of perfection, it was from this browser that the victorious march of the WWW across the planet began.

In general, it is striking that all the tools needed for the modern Internet (meaning its most popular component) were created by just one person in such a short time. Bravo.

A little later, the first graphical browser, Mosaic, appeared, from which many modern browsers (Mozilla and Explorer) descend. Mosaic became the missing drop needed to awaken interest in the Internet (namely the World Wide Web) among ordinary residents of planet Earth. A graphical browser is a completely different matter from a text one: everyone loves looking at pictures, and only a few love to read.

What is noteworthy is that Berners-Lee never received the kind of enormous money that some later Internet entrepreneurs did, although he arguably did more for the global network than any of them.

Yes, over time, the CSS style sheet language appeared alongside the HTML developed by Berners-Lee. Thanks to it, some HTML markup operators were no longer needed, replaced by far more flexible cascading style sheet tools, which made it possible to significantly increase the attractiveness and design flexibility of today's sites. Although CSS rules are, of course, harder to learn than the markup language, beauty requires sacrifice.

How do the Internet and the global network work from the inside?

But let's see what the Web (WWW) is and how information is posted on the Internet. Here we come face to face with the phenomenon called the website (web means a net, and site means a place). So what is a "place on the network" (analogous to a place in the sun in real life), and how does one actually get it?

So what does the Internet itself consist of? It consists of channel-forming devices (routers, switches) that are invisible and of little interest to users. The WWW network (what we call the Web or World Wide Web) consists of millions of web servers: programs running on slightly modified computers that must be connected to the global network around the clock (24/7) and use the HTTP protocol for data exchange.

The web server (a program) receives a request (most often from a user's browser that followed a link or had a URL entered into its address bar) to open a document hosted on that very server. In the simplest case, the document is a physical file (with an .html extension, for example) that sits on the server's hard drive.
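In this simplest case, the server just maps the URL path onto a file on its disk. A hedged sketch of that mapping (the directory name is invented), including the normalization step real servers perform so that ".." segments cannot escape the site's root folder:

```python
import posixpath

SITE_ROOT = "/var/www/example-site"  # invented server directory for the example

def resolve(url_path):
    """Map a requested URL path to a file path under the site root."""
    # Collapse any '.' or '..' segments so a request cannot leave the root.
    clean = posixpath.normpath(posixpath.join("/", url_path))
    return SITE_ROOT + clean

print(resolve("/index.html"))        # -> /var/www/example-site/index.html
print(resolve("/../../etc/passwd"))  # normalized: stays inside the site root
```

Real servers add content-type detection, caching and access control on top, but the core idea of "URL path in, file contents out" is exactly this simple.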

In a more complex case (when server-side technologies are used), the requested document is generated programmatically on the fly.

To view the requested page, special software on the client (user) side called a browser renders the downloaded fragment of hypertext in digestible form on the display of whatever device the browser is installed on (PC, phone, tablet, etc.). In general, everything is simple, if you don't go into details.

Previously, each individual website was physically hosted on a separate computer, mainly because of the weak computing power of the PCs available at the time. In any case, a computer with a web server program and the website hosted on it must be connected to the Internet around the clock. Doing this at home is difficult and expensive, so websites are usually stored with hosting companies that specialize in this service.

The hosting service is now in considerable demand thanks to the popularity of the WWW. As the power of modern PCs has grown, hosters have gained the ability to host many websites on one physical computer (virtual hosting), while hosting a single website on a dedicated physical PC became a separate (dedicated-server) service.

With virtual hosting, all the websites hosted on one computer (the one called a server) can share a single IP address, or each can have its own. This does not change the essence and can affect a website located there only indirectly (a bad neighborhood on a shared IP can hurt: search engines sometimes tar everyone with the same brush).

Now let's talk a little about website domain names and their role on the World Wide Web. Every resource on the Internet has its own domain name. Moreover, a situation may arise where one site has several domain names (producing mirrors or aliases), or, say, the same domain name is used for many resources.

Also, some serious resources maintain mirrors. In that case the site's files may sit on different physical computers, and the resources themselves may bear different domain names. But these are nuances that only confuse novice users.

History of the creation and development of the Internet.

The Internet owes its origins to the US Department of Defense and its secret research conducted in 1969 to test methods for allowing computer networks to survive military operations by dynamically rerouting messages. The first such network was the ARPAnet, which combined three networks in California with a network in Utah under a set of rules called the Internet Protocol (IP for short).

In 1972, access was opened to universities and research organizations, as a result of which the network began to unite 50 universities and research organizations that had contracts with the US Department of Defense.

In 1973, the network grew to an international scale, combining networks located in England and Norway. A decade later, IP was expanded to include a set of communications protocols supporting both local and wide area networks. This is how TCP/IP was born. Shortly thereafter, the National Science Foundation (NSF) launched NSFnet with the goal of linking 5 supercomputing centers. Along with the introduction of the TCP/IP protocol, the new network soon replaced ARPAnet as the backbone of the Internet.

The impetus for the Internet's popularity and development, and for turning it into an environment for doing business, was the emergence of the World Wide Web (WWW), a hypertext system that made surfing the Internet fast and intuitive.

The idea of linking documents through hypertext was first proposed and promoted by Ted Nelson in the 1960s, but the level of computer technology at the time did not allow it to be realized, though who knows how things would have turned out had the idea found application then.

The foundations of what we understand today as the WWW were laid in the 1980s by Tim Berners-Lee during his work on hypertext systems at CERN, the European Laboratory for Particle Physics.

As a result of this work, in 1990 the scientific community was presented with the first text browser, which allowed viewing text files linked by hyperlinks online. The browser was made available to the general public in 1991, but its adoption outside academia was slow.

A new historical stage in the development of the Internet came with the release in 1993 of the first Unix version of the graphical browser Mosaic, developed in 1992 by Marc Andreessen, a student intern at the National Center for Supercomputing Applications (NCSA), USA.

From 1994, after the release of Mosaic versions for the Windows and Macintosh operating systems, followed soon by the Netscape Navigator and Microsoft Internet Explorer browsers, the popularity of the WWW, and with it the Internet, began to spread explosively among the general public, first in the United States and then throughout the world.

In 1995, NSF transferred responsibility for the Internet to the private sector, and since that time the Internet has existed as we know it today.


Internet services.

Services are the kinds of service provided by Internet servers.
Over the Internet's history there have been different kinds of services; some are no longer used, others are gradually losing popularity, while still others are in their heyday.
Let us list the services that remain relevant today:
- World Wide Web – a service for searching and viewing hypertext documents, including graphics, sound and video.
- E-mail – a service for transmitting electronic messages.
- Usenet (News) – teleconferences, newsgroups; a kind of online newspaper or bulletin board.
- FTP – a file transfer service.
- ICQ – a service for real-time communication using a keyboard.
- Telnet – a service for remote access to computers.
- Gopher – a service for accessing information using hierarchical directories.

Among these services we can distinguish those designed for communication, that is, for interpersonal contact and the transfer of information (E-mail, ICQ), as well as those whose purpose is to store information and provide users access to it.

Among the latter, the leading place in terms of volume of stored information is occupied by the WWW service, since it is the most convenient for users and the most technically advanced. Second place goes to the FTP service: whatever interfaces and conveniences are developed for users, information is still stored in files, and access to them is what this service provides. The Gopher and Telnet services can by now be considered "dying": almost no new information arrives on their servers, and the number of such servers and the size of their audience are barely growing.

World Wide Web - World Wide Web

World Wide Web (WWW) is a hypertext, or more precisely, hypermedia information system for searching Internet resources and accessing them.

Hypertext is an information structure that allows you to establish semantic connections between text elements on a computer screen in such a way that you can easily transition from one element to another.
In practice, in hypertext, some words are highlighted by underlining or coloring them in a different color. Highlighting a word indicates that there is a connection between this word and some document in which the topic associated with the highlighted word is discussed in more detail.
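The "easily transition from one element to another" idea can be modeled as a graph of documents. A toy sketch (the document names are invented) that follows links outward from a starting page, breadth-first:

```python
from collections import deque

# A toy hypertext: each document lists the documents it links to.
docs = {
    "home": ["history", "technology"],
    "history": ["technology"],
    "technology": [],
}

def reachable(start):
    """Return every document reachable from `start` by following links."""
    seen, queue = {start}, deque([start])
    while queue:
        page = queue.popleft()
        for target in docs[page]:
            if target not in seen:
                seen.add(target)
                queue.append(target)
    return seen

print(sorted(reachable("home")))  # -> ['history', 'home', 'technology']
```

This is the same structure a reader traverses by clicking highlighted words, and the same one a web crawler traverses automatically when indexing a site.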

Hypermedia is what results if the word "text" in the definition of hypertext is replaced with "any type of information": sound, graphics, video.
Such hypermedia links are possible because, along with text, any other binary information can be linked, for example encoded sound or graphics. Thus, if a program displays a map of the world and the user selects a continent on it with the mouse, the program can provide graphic, audio and text information about that continent.

The WWW system is built on a special data transfer protocol called the HyperText Transfer Protocol (HTTP).
All content of the WWW system consists of WWW pages.

WWW pages are the hypermedia documents of the World Wide Web. They are created using the hypertext markup language HTML (HyperText Markup Language). What is referred to as one "WWW page" is usually actually a set of hypermedia documents located on one server, interlinked and related in meaning (for example, describing one educational institution or one museum). Each document in turn can contain several screenfuls of text and illustrations. Each WWW page has its own "home page," a hypermedia document containing links to the page's main components. Home page addresses are distributed on the Internet as the addresses of the pages themselves.

A set of Web pages interconnected by links and designed to achieve a common goal is called a Web site.

Email.

Email appeared about 30 years ago. Today it is the most widespread means of exchanging information on the Internet. The ability to receive and send email can be useful not only for communicating with friends from other cities and countries, but also in a business career. For example, when applying for a job, you can quickly send out your resume using e-mail to various companies. In addition, on many sites where you need to register (on-line games, online stores, etc.) you often need to provide your e-mail. In a word, e-mail is a very useful and convenient thing.

Electronic mail (e-mail, from the English "mail") is used for transmitting text messages within the Internet, as well as between other email networks (Figure 1).

Using e-mail, you can send messages, receive them in your inbox, reply to correspondents' letters, send copies of a letter to several recipients at once, forward a received letter to another address, use logical names instead of addresses, create several mailbox subfolders for different kinds of correspondence, and include in letters various sound and graphic files as well as binary files (programs).

To use E-mail, the computer must be connected to the telephone network via a modem.
A computer connected to a network is considered a potential sender and receiver of packets. When sending a message to another node, each Internet node splits it into fixed-length packets, usually 1500 bytes in size. Each packet is supplied with the recipient's address and the sender's address. The packets prepared in this way are sent over communication channels to other nodes. When a node receives a packet, it analyzes the recipient's address: if it matches the node's own address, the packet is accepted; otherwise it is forwarded onward. Received packets belonging to the same message are accumulated; once all the packets of a message have arrived, they are concatenated and delivered to the recipient. Copies of the packets are kept on the sending nodes until the recipient node confirms successful delivery, which ensures reliability. To deliver a letter to the addressee, you need to know only his address and the coordinates of the nearest mailbox; on the way to the addressee, the letter passes through several post offices (nodes).
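The splitting and reassembly described above can be sketched in a few lines. The 1500-byte figure comes from the paragraph; everything else (the function names, the sample message) is invented for illustration:

```python
PACKET_SIZE = 1500  # fixed packet length mentioned above, in bytes

def split_message(data, size=PACKET_SIZE):
    """Split a message into (offset, chunk) packets of fixed size."""
    return [(i, data[i:i + size]) for i in range(0, len(data), size)]

def reassemble(packets):
    """Concatenate packets back into the original message, in offset order."""
    return b"".join(chunk for _, chunk in sorted(packets))

message = b"x" * 4000  # a sample 4000-byte letter
packets = split_message(message)
print(len(packets))                    # -> 3 (1500 + 1500 + 1000 bytes)
print(reassemble(packets) == message)  # -> True
```

Tagging each packet with its offset is what lets the receiving node put the message back together even if packets arrive out of order, which on a real network they routinely do.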

FTP service

The Internet service FTP takes its name from the File Transfer Protocol, but as an Internet service it is more than just a protocol: it is a service that provides access to files in file archives.

In UNIX systems, FTP is a standard program that runs over the TCP protocol and is supplied with the operating system. Its original purpose is to transfer files between computers on TCP/IP networks: a server program runs on one computer, and on another the user runs a client program that connects to the server and sends or receives files (Figure 2).

Figure 2. FTP protocol diagram

The FTP protocol is optimized for file transfer, so FTP programs have become part of a separate Internet service. An FTP server can be configured so that you can connect to it not only under a specific account name but also under the conventional name anonymous. In that case the client sees not the whole file system of the server, but a designated set of files that makes up the contents of the anonymous FTP server - a public file archive.

Today, public file archives are organized primarily as anonymous FTP servers. A huge amount of information and software is available on such servers: almost everything that can be offered to the public in the form of files is accessible from them - freeware and demo programs, multimedia, and plain texts such as laws, books, articles, and reports.

Despite its popularity, FTP has many disadvantages. FTP client programs are not always convenient or easy to use, and it is not always possible to tell whether the file in front of you is the one you are looking for. There is no simple, universal search tool for anonymous FTP servers; special programs and services exist for this purpose, but they do not always produce the desired results.

FTP servers can also provide access to files under a password - for example, only to their own clients.
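A minimal sketch of how a client approaches a public archive: first the FTP URL is split into host, user, and path (the host name below, like the one used later in the Telnet section, is a placeholder), then the standard-library `ftplib` client would log in anonymously. The connection itself is left as comments, since it requires a live server:

```python
from urllib.parse import urlparse

def parse_ftp_url(url: str) -> dict:
    """Split an ftp:// URL into the pieces an FTP client needs."""
    parts = urlparse(url)
    return {
        "host": parts.hostname,
        "user": parts.username or "anonymous",  # public archives: anonymous login
        "path": parts.path or "/",
    }

info = parse_ftp_url("ftp://ftp.example.net/pub/readme.txt")

# With the standard-library ftplib one would then connect, roughly:
#   from ftplib import FTP
#   with FTP(info["host"]) as ftp:
#       ftp.login()              # no credentials -> anonymous access
#       ftp.retrlines("LIST")    # list the public file archive
```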

TELNET service

The purpose of the TELNET protocol is to provide a fairly general, bidirectional, eight-bit byte-oriented means of communication. Its main purpose is to allow terminal devices and terminal-oriented processes to communicate with each other. The protocol can also be used for terminal-to-terminal communication ("linking") and for process-to-process communication ("distributed computation").

Figure 3. Telnet terminal window

Although a Telnet session has a client side and a server side, the protocol is actually completely symmetrical. After a transport connection (usually TCP) is established, both ends play the role of Network Virtual Terminals (NVT), exchanging two types of data:

Application data (that is, data that goes from the user to the text application on the server side and back);

Telnet protocol commands, a special case of which are options, which serve to negotiate the capabilities and preferences of the parties (Figure 3).

Although a Telnet session running over TCP is full duplex, the NVT should be considered a half-duplex device that operates in line buffered mode by default.

Application data passes through the protocol unchanged: at the output of the second virtual terminal we see exactly what was entered at the input of the first. From the protocol's point of view, the data is simply a sequence of bytes (octets), which by default belong to the ASCII set but can be arbitrary when the Binary option is enabled. Although extensions for identifying a character set have been proposed, they are not used in practice.

All application data octets except \377 (decimal 255) are transmitted as is. The \377 octet is transmitted as the two-octet sequence \377\377, because the single \377 octet is used at the transport layer to encode commands and options.

The protocol provides minimal functionality by default, plus a set of options that extend it. The principle of negotiated options requires a negotiation to take place whenever an option is enabled: one party initiates a request, and the other can either accept or reject it. If the request is accepted, the option takes effect immediately. Options are described separately from the protocol itself, and their support by software is optional. The protocol client (network terminal) is instructed to reject requests to enable unsupported and unknown options.
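The reject-unknown-options rule can be sketched as a tiny responder. This is a simplified illustration, not a full negotiator (in particular, it ignores loop-prevention state); the command codes are the standard Telnet values:

```python
# Telnet command codes; options are negotiated with these.
IAC, WILL, WONT, DO, DONT = 255, 251, 252, 253, 254

SUPPORTED_OPTIONS: set = set()  # this minimal terminal supports no options

def negotiate(command: int, option: int) -> bytes:
    """Accept requests for supported options; refuse all others."""
    if command == DO:        # peer asks us to enable an option on our side
        reply = WILL if option in SUPPORTED_OPTIONS else WONT
    elif command == WILL:    # peer offers to enable an option on its side
        reply = DO if option in SUPPORTED_OPTIONS else DONT
    else:                    # WONT / DONT: acknowledge the refusal
        reply = WONT if command == DONT else DONT
    return bytes([IAC, reply, option])
```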

Historically, Telnet was used to remotely access the command line interface of operating systems. Subsequently, it began to be used for other text interfaces, including MUD games. Theoretically, even both sides of the protocol can be not only people, but also programs.

Sometimes telnet clients are used to access other protocols based on the TCP transport, see Telnet and other protocols.

The telnet protocol is used in the FTP control connection, so connecting to a server with the command telnet ftp.example.net ftp for debugging and experimentation is not only possible but correct (unlike using telnet clients to access HTTP, IRC, and most other protocols).

The protocol provides for neither encryption nor data authentication. It is therefore vulnerable to any attack to which its transport, the TCP protocol, is vulnerable. For remote access to systems, the SSH network protocol (especially version 2) is now used instead; it was designed with security as a primary concern. So keep in mind that a Telnet session is very insecure unless it runs on a fully controlled network or with network-level protection (various VPN implementations). Because of this insecurity, Telnet was long ago abandoned as a means of managing operating systems.

Scientific and technological progress does not stand still; it is in constant development and search. Perhaps the most useful invention of mankind is the Internet. At its core, the Internet, the World Wide Web, is a unique tool for data exchange. The importance of this information space is undeniable, given the enormous communication capabilities it offers to users of all devices connected to the Network.

What is the Internet

The Internet is a virtual environment that provides access to information resources; its elements are computers combined into a single network, given unique addresses, and connected by high-speed communication lines to host computers. The Internet is a huge network of countless devices, serving to exchange the information that exists on it in various forms.

Prototype

The history of the World Wide Web is as follows. In 1969, under the auspices of the US Department of Defense, the ARPANET computer network was created. It became the first prototype of the modern Internet. Initially, it allowed several computers remote from each other to exchange simple information. In the next decade, information technology scientists developed new packet data protocols - TCP/IP - that are still in use. In 1983, ARPANET switched to these protocols, and scientists began to build an improved network - the modern Internet used worldwide.

Who created and when

In 1990, the ARPANET network ceased to exist and gave way to the Internet. The invention of the World Wide Web made the Internet what we know it as today. Many people consider the Internet and the World Wide Web to be synonymous, and not everyone knows who created the system everyone now uses. At the very beginning of the last decade of the 20th century, the scientists Tim Berners-Lee and Robert Cailliau created a distributed system for accessing information that connects millions of servers connected to the Internet.

Why is it called the World Wide Web?

The Internet is called the World Wide Web because its pages are joined by hyperlinks into a web-like structure spanning the whole world; hence the abbreviation "www". Today it represents the most popular and widely used invention in the field of information technology.

From a technical point of view, the online space is made up of countless computer devices connected to each other. Billions of PC users living in different countries communicate with each other every day, transmit and receive useful information, download digital data in the form of applications, programs, and utilities, watch videos, and listen to music.

First access

To facilitate the exchange of e-mail, the first corresponding program was introduced in 1991. However, all this time the Internet remained only a set of channels for transferring data from one computer to another, used only by leading scientists in Europe and the USA. The revolutionary development that made the Internet accessible to all computer owners was the emergence and further evolution of the WWW system.

Structure

The Internet is a global computer network consisting of all kinds of computer networks connected by standard agreements (protocols) on methods of information exchange and by a unified address system. Its basic unit is a local network connected to the wider network.

Servers

An Internet server is the hardware and software that provides the necessary Internet services: HTTP (websites), e-mail, FTP (file transfer), and so on. To place a website on the Internet, an Internet server is required.

Most often, Internet servers are powerful computers equipped with equally powerful software, including support for various programming languages and data-transfer protocols, databases, antiviruses, and other security systems.
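At its smallest, an HTTP server is just a program that answers requests with pages. A minimal sketch using Python's standard library (the handler name and port below are arbitrary choices of this example; real servers are far more capable):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# The page to serve, written in HTML.
PAGE = (b"<html><head><title>Demo</title></head>"
        b"<body><h1>It works</h1></body></html>")

class DemoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Answer every GET request with the same small HTML page.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(PAGE)

if __name__ == "__main__":
    # Serve on localhost port 8000 until interrupted (the port is arbitrary).
    HTTPServer(("127.0.0.1", 8000), DemoHandler).serve_forever()
```

Pointing a browser at http://127.0.0.1:8000/ would then display the page.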

Browsers

A browser is a program on a computer with which people access the Internet and view information there. It processes data from the global network and lets you browse pages; essentially, it is the program you work with online. Quite often the browser icon sits on the computer desktop, and clicking it opens the browser and connects you to the Internet.

Hyperlinks

Almost all movements on various sites on the Internet today occur on the basis of links. But they can bring both benefit and harm. Therefore, it is important for the user to understand how different types of links work and to be able to recognize links to clearly malicious web resources.

Hypertext

In general and in simplified form, hypertext is a set of interconnected text fragments integrated into an information system, which allows users to move from one text block to another. This design allows for non-linear reading: the user can find the necessary information much faster and choose a personal sequence of transitions from one part of the text to another.
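The idea can be sketched as a toy data structure; the page names and texts below are invented for illustration:

```python
# A toy hypertext: each page holds text plus links to other pages.
pages = {
    "home":      {"text": "Welcome.",               "links": ["history", "structure"]},
    "history":   {"text": "ARPANET came first.",    "links": ["home"]},
    "structure": {"text": "A network of networks.", "links": ["home", "history"]},
}

def follow(page: str, link_index: int) -> str:
    """Jump through the n-th link on a page - non-linear reading."""
    return pages[page]["links"][link_index]

# One possible personal reading order: home -> structure -> history.
route = ["home"]
route.append(follow(route[-1], 1))   # home's second link
route.append(follow(route[-1], 1))   # structure's second link
```

A different reader could take a different route through the same pages, which is exactly the non-linearity described above.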

HTML language

HTML is the hypertext markup language that makes a website look the way users see it. This tool makes websites look attractive and modern while providing ease of use. HTML assembles the components of a web page in a user-friendly way. Its work is comparable to that of text editors, which turn a faceless mass of letters into a document with fonts and images.

Domains

A domain is a unique combination of characters that allows a site to be found among others. In addition to letters, a domain can contain numbers and hyphens, and is from 2 to 63 characters long. A domain name can be compared to a home address: to find out where a person lives, you need to know their residential address, and the same applies to a website. Each resource on the Internet has an individual IP address, which looks like this: 195.191.24.196. Such a set of numbers is very difficult to remember, so domains were invented to stand in for the numeric addresses.
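The domain-to-IP mapping can be sketched with a toy lookup table; the table here is purely illustrative (it simply reuses the sample IP from the text above), while real programs ask the system's DNS resolver:

```python
# Illustrative only: a toy lookup table standing in for the real DNS.
TOY_DNS = {
    "example.com": "195.191.24.196",  # sample IP from the text above
}

def resolve(domain: str) -> str:
    """Translate a human-readable domain name into its numeric IP address."""
    return TOY_DNS[domain.lower()]   # domain names are case-insensitive

# In real code the system resolver does this, roughly:
#   import socket
#   socket.gethostbyname("example.com")
```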

Methods for displaying information

Information on the Internet can be presented passively (the user can only read it) or actively, in which case the user can add information and edit it.

Methods of active presentation of information on the global network include:

  1. Guest book - a simple program that allows visitors to express their thoughts about the site's pages and send them to its creators.
  2. Forum - a space for network communication among several authors, where posts are related by a common topic.
  3. Chat - a service for exchanging text messages in real time, allowing many users to communicate with each other simultaneously.
  4. Online diary (blog) - a site consisting of entries in reverse chronological order.
  5. Wiki - a hypertext environment (for example, Wikipedia) for multi-user collection and structuring of documents and other information.
  6. Content management systems - for example, the CMS UlterSuite allows you to create websites and maintain them.

Technologies

Internet technologies are the communication, information, and other technologies and services on which online work is based. Internet technology has a great influence on the intellectualization of society and the economy. All over the world, computer technology, educational programs, and multimedia files are common attributes of everyday life.

Safety

There is a possibility that a user's computer may become a victim of attackers. By visiting various sites or downloading software from unverified sources, the user can infect his personal computer with a virus, which will lead to loss of functionality of the device.

Security can be ensured as follows:

  1. Although Windows has a built-in firewall, it is recommended to install a more reliable one.
  2. The next step is to install anti-spyware and anti-virus software.
  3. You need to disable all unused services on your device; this will reduce the chances of hackers gaining access.

The user must perform a number of security operations before surfing the endless expanses of the network.

Personal data

Every user who connects to the network can become a victim of a privacy attack. The number of crimes related to the unauthorized use of personal data on the Internet is growing every year. Many users do not take any action to protect their privacy, even if they are aware of security practices.

There are reasonable ways to protect against information theft. A virtual private network (VPN) is not a universal defense against all threats, but it is an extremely useful tool for protecting personal data. Credentials are often all that is needed to obtain personal information, so passwords must be strong.

Legislation

Today, almost everyone has access to the Internet, and some cannot even imagine their life without the World Wide Web. However, few people are interested in how various resources are regulated, how prohibited content is controlled and blocked. Sometimes you might think that the World Wide Web is in complete chaos, but in reality this is not the case. In Russia, Internet laws have been developing quite rapidly in recent years. There are laws regarding remote work on the Internet and e-commerce.

Types of storage

When posting information online, most people don't really think about where it goes. It ends up in data centers. A data center stores all information posted on the network: personal photos, downloaded documents, Skype conversations, blog comments, and other important and unimportant data. In essence, a data center is a huge bank, a content repository. It is not guards who are responsible for the security of data centers, but intelligent technologies operating under video surveillance and control systems.

Development prospects

It is quite difficult to predict the specific development of something as complex and large-scale as the Internet. There is no doubt that network technologies will play a major role in the information society. The Internet is developing very quickly: roughly every one and a half to two years its main quantitative characteristics multiply - the number of users, the number of connected computers, the volume of information and traffic, and the amount of information resources created.

Memory

In the mid-1990s, Brewster Kahle decided to preserve the memory of the global network and announced the creation of an archive of Internet sites. At the moment, the archive contains more than 10 billion pages that might otherwise have been irretrievably lost. The accumulated information is updated approximately every two months; it is stored on several large servers and is important to many people. It is one of the largest databases in the world. The work costs a million dollars a year, with funds coming from sponsors.