Fatal errors
- Main page returns an error
- Failed to connect to the server due to DNS error
- Site indexing is disallowed in the robots.txt file
- Security problems or violations detected
Main page returns an error
- Check for the presence of the noindex element in the HTML code of the page. If you find it, delete it.
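One way to scan a page for such an element is to parse its HTML and look for a robots meta tag whose content includes "noindex". This is an illustrative Python sketch using only the standard library; the sample HTML is a stand-in for your own page source:

```python
from html.parser import HTMLParser

class NoindexFinder(HTMLParser):
    """Collects <meta name="robots"> tags whose content includes "noindex"."""
    def __init__(self):
        super().__init__()
        self.found = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "robots" and "noindex" in a.get("content", "").lower():
                self.found.append(a)

# Sample page source; in practice, feed in the HTML fetched from your site.
html = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
finder = NoindexFinder()
finder.feed(html)
print("noindex present:", bool(finder.found))  # noindex present: True
```

If the finder reports a match, removing that meta tag from the page template is what lifts the restriction.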
Go to the page in Yandex.Webmaster and see what response code the page returned to the Yandex robot's requests in the Currently column. If the response code differs from 200 OK, check whether the problem is current using Server response check.
When checking the server response, pay attention to the Page content section. If the message “Missing page content” is displayed, check your server settings:
- HTTP header. For example, if it contains "Content-length: 0", the robot will fail to index the page.
- The size of the page content. It must be greater than 0 and match the HTTP header.
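The two checks above can be expressed as a small diagnostic function. This is a hedged sketch, not an official tool: it takes a status code, response headers, and body that you have already obtained (for example via any HTTP client) and reports the conditions this section describes:

```python
def diagnose_response(status, headers, body):
    """Return a list of problems that would block indexing, based on
    the status code, response headers, and page body."""
    problems = []
    if status != 200:
        problems.append(f"status is {status}, not 200 OK")
    length = headers.get("Content-Length")
    if length is not None and int(length) == 0:
        problems.append("Content-Length header is 0")
    if len(body) == 0:
        problems.append("page content is empty")
    elif length is not None and int(length) != len(body):
        problems.append("Content-Length does not match body size")
    return problems

# Example: a response the robot could not index.
print(diagnose_response(200, {"Content-Length": "0"}, b""))
# → ['Content-Length header is 0', 'page content is empty']
```

An empty list means the page passed both checks described above.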
After you make changes, wait until the robot crawls the site again. You can also send the page for reindexing and the robot will recrawl it within two weeks.
If the check shows that the page responds with the 200 OK code and its content is available, the warning in Yandex.Webmaster will disappear within a few days.
Failed to connect to the server due to DNS error
Once a day, the indexing robot accesses DNS servers to determine the IP address of the server where the site is located. If DNS records are incorrectly configured on the site, the robot does not get the IP address of the site. As a result, the site cannot be indexed and added to search results.
Check that the server responds correctly to the indexing robot. If the site is still unavailable, contact your hosting provider to correct your domain's DNS records. Once the site is accessible again, the information in Yandex.Webmaster is updated within a few days.
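You can quickly verify that the domain's DNS records resolve at all with a simple lookup, which is the same step the robot performs before connecting. A minimal Python sketch ("localhost" is a placeholder for your own domain):

```python
import socket

def resolve(hostname):
    """Return the IP addresses a DNS lookup yields for hostname,
    or an empty list if resolution fails (the robot would see the same)."""
    try:
        infos = socket.getaddrinfo(hostname, None)
    except socket.gaierror:
        return []
    return sorted({info[4][0] for info in infos})

# "localhost" resolves everywhere; replace it with your own domain.
print(resolve("localhost"))
```

An empty result for your domain means the robot cannot determine the server's IP address either.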
The problem with accessing the site may be short-term. If you do not find any errors when re-checking the server response, wait for the information to be updated in Yandex.Webmaster. This should happen within a few days.
If the domain name was not renewed in time, the site becomes unavailable for indexing. Renew the domain registration. After this, the message in Yandex.Webmaster will disappear within a few days.
Site indexing is disallowed in the robots.txt file
The indexing robot requests the robots.txt file several times a day and updates information about it in its database. If the robot receives a prohibiting directive in response to a request, a warning appears in Yandex.Webmaster.
Check the robots.txt file contents. If the prohibiting directive is still present, delete it from the file. If you can't do this yourself, contact your hosting provider or domain name registrar. After the directive is removed, the data in Yandex.Webmaster is updated within a few days.
If the domain name was not renewed in time, the robots.txt file will contain prohibiting directives for the robot. Renew the domain registration. After this, the message in Yandex.Webmaster will disappear within a few days.
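You can test whether a robots.txt file blocks a crawler with Python's standard urllib.robotparser. The sample rules below are illustrative; an expired domain parked by a registrar often serves exactly this disallow-everything file:

```python
from urllib.robotparser import RobotFileParser

# Sample robots.txt that disallows all crawling for every robot.
rules = """
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())
print(parser.can_fetch("Yandex", "https://example.com/"))  # False
```

Run the same check against your live robots.txt contents: if can_fetch returns False for your pages, the prohibiting directive is still in place.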
Security problems or violations detected
- If a search rule violation is found
See the description of the violation and recommendations for correcting it.
- SEO texts
- Deception of mobile internet users (paid subscriptions)
- Useless content, spam, or excess advertising
- SEO links for promotions (Minusinsk)
- Undesirable programs and dangerous files
- Cryptocurrency mining
- SEO links on site pages
- Doorway
- Keyword stuffing
- Hidden text
- Clickjacking
- User behavior simulator
- Assistance in user behavior simulation
- Affiliate program
- Phishing
- SMS fraud
- Cloaking
List of possible violations
- If a security threat is found
Go to the page in Yandex.Webmaster and do the following:
- Click the icon to see detailed information about the infection on the site.
- Click on the name of the verdict to see its description and an example of code that is processed in the browser. List of possible verdicts.
- Review the chains of infection. They can help identify the source of infection.
- Learn how to protect your site from infection.
When you solve the problem:
- Make sure the problem is fixed. If the service re-detects a threat during verification, you will not be able to report fixes for a month. After that, this period will increase up to a maximum of three months.
- In Yandex.Webmaster, go to the page and click I fixed it. This will give an additional signal to the Yandex algorithms that the site needs to be rechecked. If the check is successful, the restrictions will be lifted over time and information about violations will no longer be displayed.