World Wide Web
In computing, the World Wide Web (the Web) is an information system that operates over the Internet, in which documents and other resources are transmitted using the Hypertext Transfer Protocol (HTTP) and connected to one another by the hyperlinks of web pages.
Its main characteristics are:
- It is a system through which information shared over the Internet is managed.
- Using it requires an Internet connection and an application such as Chrome, Mozilla Firefox, Safari or another web browser; these tools give access to a vast amount of content that in turn links on to further pages.
- Pages are created with HTML (HyperText Markup Language), a markup language for writing documents to be shared over the Internet.
- It uses the Uniform Resource Locator (URL) system: the specific address that identifies each resource on the network, that is, a unique identifier for every page, document, or file found on the Web.
The Web was developed between March 1989 and December 1990 by Englishman Tim Berners-Lee with the help of Belgian Robert Cailliau while working at CERN in Geneva, Switzerland, and was published as a formal proposal in 1991. Since then, Berners-Lee has played an active role in guiding the development of Web standards (such as the markup languages in which Web pages are written), and in recent years has championed his vision of a Semantic Web.
Using concepts from his earlier hypertext system ENQUIRE, British computer scientist Tim Berners-Lee, at the time a CERN employee and now director of the World Wide Web Consortium (W3C), wrote a proposal in March 1989 for what would become the World Wide Web. The 1989 proposal was for a CERN communication system, but Berners-Lee eventually realized that the concept could be applied worldwide. At CERN, the European research organization near Geneva on the French-Swiss border, Berners-Lee and Belgian computer scientist Robert Cailliau proposed in 1990 to use hypertext "to link and access information of various kinds as a web of nodes in which the user can browse at will", and Berners-Lee completed the first website in December of that year. He posted the project on the newsgroup alt.hypertext on August 6, 1991.
History
The Web was developed between March 1989 and December 1990 by Englishman Tim Berners-Lee with the help of Belgian Robert Cailliau, while both were working at CERN in Geneva, Switzerland, and was announced publicly in 1991. Since then, Berners-Lee has played an active role in guiding the development of Web standards (such as the markup languages in which Web pages are written), and in recent years has championed his vision of a Semantic Web.
An early glimpse of the Web can be found in the May 1970 issue of Popular Science magazine, where Arthur C. Clarke predicted that satellites would one day "bring the accumulated knowledge of the world to your fingertips", through a console combining the functionality of a photocopier, telephone, television and a small computer, allowing data transfer and video conferencing around the world.
The underlying idea of the Web dates back to Vannevar Bush's proposal, in the 1940s, of a similar system: broadly speaking, a distributed network of information with an interface that allowed access to it, as well as to other relevant articles selected by keywords. The project was never built, remaining a theoretical design under the name Memex. In the 1960s, Ted Nelson made the first reference to a hypertext system, in which information is linked freely. But it was not until 1980, when operational technology existed for distributing information across computer networks, that Tim Berners-Lee proposed ENQUIRE at CERN (the name refers to the Victorian how-to book Enquire Within Upon Everything), where these incipient notions of the Web were first put into practice.
In March 1989, Tim Berners-Lee, by then on the staff of CERN's DD division, drafted the proposal, which referenced ENQUIRE and described a more elaborate information management system. Those initial documents never formally coined the term "web"; the word used for the purpose was "mesh". The World Wide Web, however, had already been born. With the help of Robert Cailliau, a more formal proposal for the World Wide Web was published on November 12, 1990.
Berners-Lee used a NeXTcube as the world's first web server and in 1990 also wrote the first web browser, WorldWideWeb. By Christmas of that year he had built all the tools needed for a working Web: the first web browser (which was also a web editor), the first web server, and the first web pages, which described the project itself.
On August 6, 1991, he posted a short summary of the World Wide Web project to the newsgroup alt.hypertext. This date also marks the debut of the Web as a publicly available service on the Internet.
The crucial underlying concept of hypertext has its origins in projects from the 1960s, such as Ted Nelson's Project Xanadu and Douglas Engelbart's oN-Line System (NLS). Both Nelson and Engelbart were in turn inspired by Vannevar Bush's microfilm-based "memex", mentioned above.
Berners-Lee's breakthrough was to join hypertext to the Internet. In his book Weaving the Web, he explains that he had repeatedly suggested to members of both technical communities that a marriage between the two technologies was possible, but when no one took up his invitation he finally tackled the project himself. In the process he also developed a system of globally unique identifiers for web resources: the Uniform Resource Identifier (URI).
The World Wide Web had some differences from the other hypertext systems that were available at the time:
- The WWW required only unidirectional links rather than bidirectional ones, making it possible to link to a resource without any action by that resource's owner. This significantly reduced the difficulty of implementing web servers and browsers compared with earlier systems, but introduced the chronic problem of broken links.
- Unlike predecessors such as HyperCard, the World Wide Web was non-proprietary, making it possible to develop servers and clients independently and to add extensions without licensing restrictions.
On April 30, 1993, CERN introduced the World Wide Web to the public.
The first web page was created by Tim Berners-Lee in 1990 using a NeXT computer, and its purpose, like that of every page since, was to inform: it described the World Wide Web itself. The page defines hypermedia and shows an example of a hypertext page, explains how to contribute to the Web, names the people involved in the project, describes how information is classified on the Web and the servers and software that existed, shows how to insert a bibliography, lists the file endings used by each kind of software so it can be identified more easily, records the birth of the page, mentions CERN's particle collider, and even includes a user manual for the World Wide Web together with online help for the Line Mode Browser, NeXTStep and MidasWWW software. All this information is reached through four links on the main page. The page later went offline for a long time, but it was restored on April 30, 2013, twenty years after April 30, 1993, the day CERN made the Web freely available to the public.
ViolaWWW was a fairly popular browser in the early days of the Web, based on the concepts of HyperCard, the hypertext tool for the Mac. However, researchers generally agree that the turning point for the World Wide Web was the 1993 introduction of Mosaic, a graphical browser developed by a team at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign (NCSA-UIUC) led by Marc Andreessen. Funding for Mosaic came from the High-Performance Computing and Communications Initiative, a program created by the High Performance Computing and Communication Act of 1991, also known as the Gore Bill after then-Senator Al Gore. Before Mosaic's release, web pages had little graphical richness, and the Web was less popular than older protocols already in use on the Internet, such as Gopher and WAIS. Mosaic's graphical user interface quickly made the Web the most popular Internet protocol.
Operation of the Web
The first step is to translate the hostname portion of the URL into an IP address using the Internet's distributed database known as the Domain Name System (DNS). This IP address is needed to contact the web server and send data packets to it.
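This lookup step can be sketched with Python's standard socket module (a minimal illustration; real browsers add resolver caches and retry logic on top of the same mechanism):

```python
import socket

def resolve(hostname):
    """Ask the system resolver, which queries DNS, for the host's IP addresses."""
    # getaddrinfo returns (family, type, proto, canonname, sockaddr) tuples;
    # the IP address string is the first element of sockaddr.
    infos = socket.getaddrinfo(hostname, 80, proto=socket.IPPROTO_TCP)
    return sorted({info[4][0] for info in infos})

# "localhost" resolves locally, without contacting a remote DNS server;
# public hostnames go through the DNS hierarchy the same way.
print(resolve("localhost"))
```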
The next step is to send an HTTP request to the web server asking for the resource. In the case of a typical web page, the HTML text is sent first and is immediately parsed by the browser, which then issues additional requests for the graphics and other files that make up the page. Website popularity statistics are usually based on the number of page views, or on the associated server requests (file requests) that take place.
Upon receiving the requested files from the web server, the browser renders the page as described by the HTML, CSS and other web languages. Finally, the images and other resources are incorporated to produce the page the user sees on the screen.
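The scanning step of this cycle can be illustrated offline with Python's standard html.parser module: given the fetched HTML, collect the sub-resources a browser would request next (the sample page and filenames are hypothetical):

```python
from html.parser import HTMLParser

class ResourceFinder(HTMLParser):
    """Collect the URLs of sub-resources a browser would request next."""
    def __init__(self):
        super().__init__()
        self.resources = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "src" in attrs:
            self.resources.append(attrs["src"])       # embedded image
        elif tag == "link" and attrs.get("rel") == "stylesheet":
            self.resources.append(attrs["href"])      # CSS stylesheet
        elif tag == "script" and "src" in attrs:
            self.resources.append(attrs["src"])       # external script

page = """<html><head><link rel="stylesheet" href="style.css">
<script src="app.js"></script></head>
<body><img src="logo.png"></body></html>"""

finder = ResourceFinder()
finder.feed(page)
print(finder.resources)  # ['style.css', 'app.js', 'logo.png']
```

Each collected URL would then trigger its own HTTP request, exactly as described above.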
Web standards
We highlight the following standards:
- The Uniform Resource Identifier (URI), a universal system for referencing resources on the Web, such as web pages
- The Hypertext Transfer Protocol (HTTP), which specifies how the browser and the server communicate with each other
- The HyperText Markup Language (HTML), used to define the structure and content of hypertext documents
- The Extensible Markup Language (XML), used to describe the structure of text documents
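The URI standard defines a generic syntax (scheme://authority/path?query#fragment) that software can take apart programmatically; a brief sketch using Python's urllib.parse, with a made-up example URL:

```python
from urllib.parse import urlparse

url = "https://www.example.com:443/path/page.html?lang=en#section2"
parts = urlparse(url)

print(parts.scheme)    # https       -- which protocol to use
print(parts.hostname)  # www.example.com -- what DNS must resolve
print(parts.port)      # 443         -- where to connect
print(parts.path)      # /path/page.html -- which resource to request
print(parts.query)     # lang=en     -- extra parameters for the server
print(parts.fragment)  # section2    -- handled by the browser, not sent
```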
Berners-Lee has led the World Wide Web Consortium (W3C) since its founding in 1994; the consortium develops and maintains these and other standards that allow computers on the Web to store and communicate different kinds of information effectively.
Java and JavaScript
A significant advance in web technology was Sun Microsystems' Java platform. It allows web pages to embed small programs, called applets, directly in the page. Applets run on the user's computer and provide a richer user interface than simple web pages. Client-side Java applets never gained the popularity Sun hoped for, for a number of reasons, including lack of integration with other content (applets were confined to small boxes within the rendered page) and the fact that many computers of the era were shipped to end users without a properly installed JVM, so users had to download the virtual machine before an applet would appear. Until the advent of HTML5, Adobe Flash performed many of the functions originally envisioned for applets, including video playback, animation and some advanced GUI features. Java is now used more as a platform and language for server-side and other programming.
JavaScript, on the other hand, is a scripting language that was initially developed for use within web pages; its standardized version is ECMAScript. Although the names are similar, JavaScript was developed by Netscape and has no relationship to Java beyond the fact that both take their syntax from the C family of languages. Together with a web page's Document Object Model (DOM), JavaScript has become a far more important technology than its original creators anticipated. Manipulating the DOM after the page has been sent to the client came to be called Dynamic HTML (DHTML), to emphasize the shift away from static HTML pages.
In its simplest form, all the optional information and actions available in a JavaScript-enabled web page are loaded when the page is first delivered. Ajax ("Asynchronous JavaScript And XML") is a JavaScript-based technique that has had a significant effect on Web development. Ajax provides a method by which parts of a web page, large or small, can be updated with new information obtained from the network in response to user actions. This makes pages far more responsive, interactive and interesting, without the user having to wait for the whole page to reload. Ajax is seen as an important aspect of what is often called Web 2.0. Examples of Ajax techniques in current use can be seen in Gmail, Google Maps and many other applications.
Sociological Implications
The Web, as we know it today, has enabled a flow of global communication on a scale unprecedented in human history. People separated in time and space can use the Web to exchange, or even develop together, their most intimate thoughts, or alternatively their everyday attitudes and desires. Emotional experiences, political ideas, culture, musical idioms, business, art, photographs, literature: all can be shared and disseminated digitally with minimal effort, reaching almost any corner of the planet almost instantly. Although the Web rests on physical technology, with drawbacks of its own, its information does not depend on physical resources such as libraries or the printed press, and its propagation (via the Internet) is not limited by the movement of physical volumes or by manual copying of material. Thanks to its virtual nature, information on the Web can be searched more easily and efficiently than in any physical medium, and far faster than a person could gather it by travel, mail, telephone, telegraph or any other means of communication.
The Web is the most widely distributed medium of personal exchange to appear in the History of Humanity, far ahead of the printing press. This platform has allowed users to interact with many more groups of people scattered around the planet than is possible with the limitations of physical contact or simply the limitations of all other existing means of communication combined.
The scope of the Web today is difficult to quantify. According to 2010 estimates, the total number of web pages, reachable directly by URL or through links, exceeded 27 billion; that is, roughly four pages for every person alive on the planet. At the same time, the diffusion of its content is such that, in little more than ten years, we have encoded half a billion versions of our collective history and placed them before 1.9 billion people. It amounts to the achievement of one of humanity's oldest ambitions: from the Library of Alexandria to Diderot and d'Alembert's Encyclopédie, humanity has repeatedly tried to gather in one place all the knowledge accumulated up to its moment. Hypertext has made that dream possible.
Because the Web has a global sphere of influence, its importance in fostering mutual understanding between people across physical or ideological borders has been suggested.
Publishing pages
The Web is, among the mass media, an unusually accessible publishing platform. To "publish" a web page it is not necessary to go through a publisher or any other institution, nor to have technical knowledge beyond what is needed to use a standard text editor.
Unlike books and documents, hypertext does not require a linear order from beginning to end. It does not require subdivisions into chapters, sections, subsections, etc.
Although some websites are available in several languages, many are offered only in their local language. In addition, not all software supports all special characters or right-to-left scripts. These factors are among the points still to be unified in the interest of global standardization. In general, even beyond pages in non-Latin scripts, the use of Unicode's UTF-8 format as the character encoding is becoming ever more widespread.
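UTF-8's appeal is that it can represent every Unicode character while leaving plain ASCII bytes unchanged, which is why it coexists smoothly with older ASCII-only content. A brief Python illustration:

```python
# ASCII characters keep their single-byte encoding under UTF-8 ...
assert "web".encode("utf-8") == b"web"

# ... while characters outside ASCII become multi-byte sequences.
print("ñ".encode("utf-8"))       # b'\xc3\xb1' (two bytes)
print("万维网".encode("utf-8"))   # nine bytes, three per character

# Decoding with the declared charset recovers the original text.
assert b"\xc3\xb1".decode("utf-8") == "ñ"
```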
The ease with which it is now possible to publish material on the Web is evident in the growing number of new personal pages and of pages for commercial or informative purposes, blogs, and so on. The development of free applications capable of generating web pages in a fully graphical and intuitive way, along with an emerging number of free web hosting services, has contributed to this unprecedented growth.
In many countries, published websites must respect web accessibility, a concept regulated by norms or guidelines that indicate a site's level of accessibility:
- Web Content Accessibility Guidelines (WCAG) 1.0, developed by the W3C within its Web Accessibility Initiative (WAI)
- UNE 139803
Statistics
A 2002 survey of 2,024 million web pages found that the majority of web content was in English (56.4%), followed by pages in German (7.7%), French (5.6%) and Japanese (4.95%). Another, more recent study, which sampled pages in 75 different languages, determined that there were over 11.5 billion pages on the publicly indexable Web at the end of January 2005. It should be noted, however, that these figures were extracted from Google's databases by domain name, so many of the references they point to are mere redirects to other websites.
Speed issues
Frustration with congestion in the Internet infrastructure and with the high latency of slow browsing led to an alternative, pejorative name for the World Wide Web: the World Wide Wait. Speeding up the Internet is an ongoing discussion around the use of peering and QoS technologies. Other solutions for reducing Web wait times can be found at the W3C.
Standard guidelines for ideal response times for web pages are (Nielsen 1999, page 42):
- 0.1 seconds (one tenth of a second). Ideal response time. The user perceives no interruption.
- 1 second. Highest acceptable response time. Delays approaching 1 second interrupt the user's flow, though they are tolerated.
- 10 seconds. Unacceptable response time. The user's experience is broken, and the user is likely to leave the website or system.
These times are useful for planning the capacity of web servers.
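When instrumenting a site for capacity planning, these thresholds can be encoded directly; a small sketch (the function name is ours, the threshold values are Nielsen's, above):

```python
def rate_response(seconds):
    """Classify a page response time against Nielsen's guideline thresholds."""
    if seconds <= 0.1:
        return "ideal"          # no perceptible interruption
    if seconds <= 1.0:
        return "acceptable"     # flow is interrupted, but tolerated
    if seconds < 10.0:
        return "sluggish"       # between the acceptable and unacceptable marks
    return "unacceptable"       # users are likely to abandon the page

for t in (0.05, 0.8, 4.0, 12.0):
    print(t, rate_response(t))
```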
"www." pronunciation
In English, "www." is the longest three-letter abbreviation to pronounce,[citation needed] requiring nine syllables. As Douglas Adams put it:
The World Wide Web is the only thing I know of whose shortened form takes three times longer to say than what it's short for.
— Douglas Adams, The Independent on Sunday, 1999
In Spanish, the pronunciation recommended by the RAE is "triple uve doble, punto" or "uve doble, uve doble, uve doble, punto". In some Spanish-speaking countries, such as Mexico, Colombia, Panama and the Dominican Republic, it is usually pronounced "triple doble u, punto" or "doble u, doble u, doble u, punto", while in Bolivia, Cuba, Argentina, Venezuela, Chile, Ecuador, Paraguay, Peru, Uruguay and Nicaragua the usual forms are "triple doble ve, punto" or "doble ve, doble ve, doble ve, punto".
In Chinese, the World Wide Web is usually translated as wàn wéi wǎng (万维网), which echoes the sound of "www" and literally means "network of ten thousand dimensions".
In Italian it is pronounced “vu vu vu” and in German, “ve ve ve”.
Standards
The following is a list of the documents that define the three main Web standards:
- Uniform Resource Locators (URL)
- RFC 1738, Uniform Resource Locator (URL) (December 1994)
- RFC 3986, Uniform Resource Identifier (URI): Generic Syntax (January 2005)
- Hypertext Transfer Protocol (HTTP)
- RFC 1945, HTTP/1.0 specification (May 1996)
- RFC 2616, HTTP/1.1 specification (June 1999)
- RFC 2617, HTTP Authentication
- Errata of the HTTP/1.1 specification
- HyperText Markup Language (HTML)
- Internet Draft, HTML version 1
- RFC 1866, HTML version 2.0
- HTML 3.2 reference specification
- HTML 4.01 specification
- HTML5 specification
- Extensible HTML (XHTML) 1.0 specification
- Extensible HTML (XHTML) 1.1 specification
Broken Links and Web Storage
Over time, many web resources pointed to by hyperlinks disappear, change location, or are replaced with different content. This phenomenon is known in some circles as link rot, and the hyperlinks affected by it are often called "dead links".
The ephemeral nature of the Web has prompted many efforts to archive it. The Internet Archive is one of the best-known, having been archiving the Web since 1996.
Academic Conferences
The biggest academic event related to the WWW is the series of conferences promoted by the IW3C2. There is a list with links to all the conferences in the series.
The WWW prefix in web addresses
It is very common to find the "www" prefix at the beginning of web addresses, owing to the custom of naming Internet hosts (servers) after the service they provide. Thus, for example, the hostname of a web server is usually "www"; for an FTP server, "ftp"; and for a USENET news server, "news" or "nntp" (after the NNTP news protocol). These hostnames appear as DNS subdomains, as in "www.example.com".
The use of these prefixes is not required by any standard; in fact, the first web server was at "nxoc01.cern.ch", and even today many websites exist without the "www" prefix. The prefix has no bearing on how the main website is displayed; it is simply one choice of subdomain name for the site.
Some web browsers automatically add "www." to the beginning (and possibly ".com" to the end) of typed URLs if no host is found without them. Internet Explorer, Mozilla Firefox and Opera will also add "http://www." and ".com" to the contents of the address bar when Control and ↵ Enter are pressed together. For example, typing "example" in the address bar and pressing either ↵ Enter or Control+↵ Enter will normally resolve to "http://www.example.com", depending on the exact browser version and its settings.
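This completion behavior can be sketched as a small function (illustrative only; real browsers apply more elaborate heuristics, including falling back to a web search):

```python
def complete_address(typed):
    """Mimic the Ctrl+Enter shortcut: wrap a bare name as http://www.<name>.com."""
    if "://" in typed:          # already a full URL; leave it alone
        return typed
    host = typed
    if "." not in host:         # bare word: add the conventional www. and .com
        host = f"www.{host}.com"
    return f"http://{host}"

print(complete_address("example"))      # http://www.example.com
print(complete_address("example.org"))  # http://example.org
```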
Web technologies
Web technologies comprise the set of tools that make it easier to achieve better results when developing a website.
Web typology
- Internet search
- Forum (Internet)
- Social software
- Web portal or CMS
- Weblog / Blog
- Wiki
- Web 2.0