How to avoid having your site K'd (banned by the search engines)

Below is an analysis of eight reasons a site gets K'd.

1. Is the server or virtual host stable?

The stability of the server or virtual host directly affects how the search engines treat a website. When the search engine spiders crawl the site, is the server running normally, and do the web logs show a 200 status code being returned? If the return code is 200, then even pages dropping out of the index is only a temporary phenomenon; do not worry about the K'd pages, keep a level head, and once the next search engine update is released the pages and their weight will come back. So if the site has been K'd, look at the recent web logs and check whether the server or virtual host has been stable.
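
A quick way to make that check is to tally the status codes your access log records for spider visits. The sketch below is one way to do it in Python, assuming a standard combined-format log at a hypothetical path (access.log); the spider names listed are only examples, so adjust them to the crawlers you care about.

```python
# A minimal sketch, assuming a combined-format access log at the hypothetical
# path below; it tallies the status codes returned to search engine spiders.
import re
from collections import Counter

LOG_PATH = "access.log"                            # hypothetical log location
SPIDERS = ("Baiduspider", "Googlebot", "bingbot")  # example crawler names

# combined log format: ... "GET /path HTTP/1.1" 200 1234 "referer" "user-agent"
LINE_RE = re.compile(r'"[A-Z]+ \S+[^"]*" (\d{3}) \S+ "[^"]*" "([^"]*)"')

counts = {spider: Counter() for spider in SPIDERS}
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = LINE_RE.search(line)
        if not match:
            continue
        status, user_agent = match.groups()
        for spider in SPIDERS:
            if spider in user_agent:
                counts[spider][status] += 1

for spider, tally in counts.items():
    print(spider, dict(tally) or "no requests found")
```

If the spiders are mostly seeing 200s, the problem is unlikely to be the host; a run of 5xx or timeouts around the date the pages disappeared points back at the server.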

2. Is robots.txt set correctly?

The robots.txt file is the first thing a spider reads when it crawls a website, and most search engines comply with the robots protocol. If robots.txt is set incorrectly it directly affects what the search engine can crawl and index, and a serious error in robots.txt can get the site K'd. Baidu's guidance on robots.txt says: "Please note that you only need a robots.txt file if your site contains content you do not want search engines to index. If you want search engines to index everything on the site, do not create a robots.txt file." So if you do not understand robots.txt settings, simply do not set one. If the site has been K'd, check whether the robots.txt settings are correct, or simply delete robots.txt or leave it as an empty file.
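
If you are unsure what your current robots.txt actually blocks, a short script like the sketch below can test it, using Python's standard urllib.robotparser; the site address, paths, and user agents shown are placeholders for your own.

```python
# A minimal sketch for checking a live robots.txt; the site, pages, and
# user agents below are placeholders to replace with your own.
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"                    # hypothetical site
PAGES = ["/", "/products/", "/blog/post-1.html"]    # pages you expect to be crawlable

parser = RobotFileParser()
parser.set_url(SITE + "/robots.txt")
parser.read()                                       # fetch and parse the live robots.txt

for agent in ("Baiduspider", "Googlebot", "*"):
    for path in PAGES:
        allowed = parser.can_fetch(agent, SITE + path)
        print(f"{agent:<12} {path:<25} {'allowed' if allowed else 'BLOCKED'}")
```

Any page you expect to rank that comes back as BLOCKED for the spiders is worth fixing before looking at anything else.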

3. Has the site been hacked or had a Trojan planted?

Search engines generally do not act directly against a site just because it has suffered a hacker attack, cannot be opened, or has had a Trojan or virus planted on it, but such a situation damages the user experience, and if search engine users complain about the site, the search engine may apply a manual adjustment to it.

4. Are the site's redirects set correctly?

As of now, a 301 redirect is considered the safest and most reliable way to pass weight: it effectively transfers the value of the old URL. Other methods, such as JavaScript redirects or 302 redirects, are likely to hurt the trust placed in the website and can in turn lead to a penalty.
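
To confirm which kind of jump a moved page is actually returning, you can request the old URL without following redirects and look at the raw status code. The sketch below does this with Python's standard http.client; the old URL shown is a placeholder.

```python
# A minimal sketch, assuming the old URL below is a placeholder: it issues a
# single request without following redirects and reports the raw status code,
# so you can confirm the jump is a 301 rather than a 302.
import http.client
from urllib.parse import urlsplit

OLD_URL = "http://old.example.com/page.html"     # hypothetical moved address

parts = urlsplit(OLD_URL)
conn_cls = http.client.HTTPSConnection if parts.scheme == "https" else http.client.HTTPConnection
conn = conn_cls(parts.netloc, timeout=10)
conn.request("HEAD", parts.path or "/")
response = conn.getresponse()

print(OLD_URL)
print("status  :", response.status)              # expect 301 for a permanent move
print("location:", response.getheader("Location", "(none)"))
conn.close()
```

A 302 or a page that only redirects via JavaScript will show up immediately here, since http.client never follows redirects on its own.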

5. Have your neighbors been punished?

The "neighbors" here are the sites you exchange friendly links with. If a site in your link exchange is judged by the search engines to be cheating and is punished, then your site, simply by exchanging links with it, may be affected as well. A site whose friendly links are too messy also runs the risk of being K'd by the search engines.

6. Is the site over-optimized?

Check whether the website has been excessively optimized; ZAC has discussed this in his writing on SEO.
