
How to create robots.txt correctly

Nowadays, the Internet has spread around the world, and it is hard to imagine our day without access to it, whether to browse the news or to find the information we need. New sites appear, and along with them new protocols for performing certain operations. A webmaster should be familiar with the older methods of writing these protocols and also be able to master the latest programs and standards promptly.
When a search engine robot enters a portal, the first file it requests is robots.txt. This file contains the rules on which the robot's further actions depend, including which files and areas of the site are not to be indexed.
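If you want to check how a crawler is likely to interpret your rules before publishing them, Python's standard urllib.robotparser module can be used. Below is a minimal sketch; the example.com domain and the page paths are placeholders and not part of the article's own examples:

# Minimal sketch: feed the rules from the article's example to the parser
from urllib.robotparser import RobotFileParser

rules = """User-agent: *
Disallow: /adminka/
Disallow: /image/"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# can_fetch(user_agent, url) reports whether a given robot may crawl a URL
print(parser.can_fetch("*", "https://example.com/adminka/page.html"))  # False: under /adminka/
print(parser.can_fetch("*", "https://example.com/index.html"))         # True: not restricted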

Every programmer and layout designer should be able to write such a text file and create robots.txt correctly, since mistakes in it lead to a number of undesirable consequences. The main purpose of robots.txt is to prohibit indexing. It is worth noting that this document is not binding for search engines; it acts rather as a set of recommendations that robots consult when crawling the site.

The file has the .txt extension. It can be created in a standard text editor such as Notepad and is then placed in the root folder of the site, where it provides indexing instructions for the search process. It is worth noting that the indexing recommendations can be addressed to all search engines or to specific robots.
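To make the "root folder" requirement concrete: the file must be reachable directly under the domain name. The example.com domain below is only a placeholder:

Correct: https://example.com/robots.txt (robots request exactly this address)
Incorrect: https://example.com/files/robots.txt (robots do not look in subfolders)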

The programmer should be guided by the following rules when writing such a file:

First of all, the name must remain exactly “robots.txt”; it should not be changed to, for example, “robot.txt”. If the name is different, the robot will simply ignore the instructions.

The name must be written in lowercase; this requirement is also mandatory: “robots.txt”, not “ROBOTS.TXT”.

The most important thing is the location of the file: only placing it in the root folder of the site prevents unwanted errors and their consequences.

Another important point is that the syntax inside the file must be correct. If mistakes are made, part of the resource, and in some cases the entire content of the site, will end up being indexed.

The three components that make up this text file are:

User-agent directive: *

Disallow directive: /adminka/

Disallow directive: /image/

Let us consider each of the components in more detail.
The User-agent: * directive. The asterisk indicates that the instructions in the file apply to all robots entering the portal. If the rules are intended for one particular search robot, its specific name must be written instead of the asterisk.

The Disallow: /adminka/ and Disallow: /image/ directives prohibit indexing of the listed areas of the resource. It is important that each area excluded from indexing is written on its own line. Combining several areas in one line is strictly prohibited, as it violates the basic rules of the format; wrapping a single directive across several lines is also an error.
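To illustrate this rule with the same directories as above (a short sketch, not an exhaustive test):

Incorrect (two areas combined in one line):
User-agent: *
Disallow: /adminka/ /image/

Correct (each area on its own line):
User-agent: *
Disallow: /adminka/
Disallow: /image/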
The following are examples of how such a text file can be composed for typical tasks:

The goal is to prohibit indexing of the entire content of an information resource by all types of search engine robots:
User-agent: *
Disallow: /

The goal is to allow all portal content to be indexed by any search robot:
User-agent: *
Disallow:

The task is to prohibit a specific search robot (yandexbot, as an example) from indexing the contents of the portal, that is, the entire resource:
User-agent: yandexbot
Disallow: /

The task is to allow indexing to one of the robots (yandexbot, as an example) while prohibiting it to all other search robots:
User-agent: yandexbot
Disallow:

User-agent: *
Disallow: /

The task is to prohibit indexing of several directories of the information resource:
User-agent: *
Disallow: /directoria-1/
Disallow: /directoria-2/
Disallow: /hidedirectoria/

The task is to prohibit indexing of several individual files of the portal by all search robots:
User-agent: *
Disallow: /hide.php
Disallow: /secret.html

To conclude, we can summarize the set of rules to follow when creating this text file (a combined example is given after the list):

All text in the file is written in lowercase, except for the capital letter at the beginning of each line;

Each Disallow directive refers to only one section of the portal or a single file;

The order of the User-agent and Disallow directives must not be changed: User-agent comes first, followed by its Disallow lines;

The User-agent directive is mandatory.
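Putting these rules together, a complete robots.txt built from the earlier examples could look like this (yandexbot is allowed everywhere, while all other robots are kept out of the listed directories and files):

User-agent: yandexbot
Disallow:

User-agent: *
Disallow: /directoria-1/
Disallow: /directoria-2/
Disallow: /hidedirectoria/
Disallow: /hide.php
Disallow: /secret.html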
