A5 Optimization Team: robots.txt Errors That Can Get a Site Deindexed by Baidu

August 12, 2017

Most webmasters believe they understand the robots.txt file well. It is the first file a search engine requests when it visits a website, and it tells the spider which files on the server may be crawled. In the A5 optimization team's view, robots.txt plays an important role in SEO: it can keep duplicate content from being crawled and indexed by Baidu. Yet although robots.txt is so important, few webmasters configure it correctly for pseudo-static (URL-rewritten) sites; even large enterprise sites and portals often get it wrong.
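As a minimal illustration of the file's structure (the paths here are placeholders, not from any site discussed in this article), robots.txt sits at the site root and pairs a User-agent line with Allow/Disallow rules:

```
# Applies to all crawlers (Baiduspider, Googlebot, etc.)
User-agent: *
# Keep crawlers out of a private area (placeholder path)
Disallow: /admin/
# Everything else may be crawled
Allow: /
```

Rules are matched as path prefixes, so `Disallow: /admin/` covers every URL under that directory.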

For example, the well-known SEO expert ZAC built his blog on WordPress. Because robots.txt was never configured, each post ended up indexed dozens or even hundreds of times. When the A5 optimization team examined the blog, it found that every comment could cause Baidu to index the post again: a post with one hundred comments might be indexed by Baidu one hundred times. What caused such serious duplication? Almost every duplicated URL ended with a ?replytocom=**** parameter. replytocom is a WordPress comment-reply parameter, and the **** stands for a number that differs from comment to comment. Baidu's spider was crawling these "replytocom" pages, which are essentially useless duplicates. How should this kind of duplicate indexing be solved? Since the ZAC blog already uses pseudo-static URLs, the A5 optimization team believes all dynamic URLs should be blocked in robots.txt.
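A sketch of the kind of rule this advice points to, assuming the blog's canonical URLs are pseudo-static so that every query-string URL is a duplicate; note that the `*` wildcard is an extension supported by Baidu and Google rather than part of the original robots.txt standard:

```
User-agent: *
# Block WordPress comment-reply duplicates such as /post-slug/?replytocom=123
Disallow: /*?replytocom=
# More aggressively, block every dynamic (query-string) URL
Disallow: /*?*
```

The second rule implements the article's "block all dynamic URLs" suggestion; it is safe only when no crawlable content depends on query strings.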

When Comsenz released the Discuz! X1.0 forum software, many webmasters who adopted it saw the number of pages Baidu indexed rise even though the number of posts had not increased. After a thorough inspection of Discuz! X1.0, the A5 optimization team found that the same post could be reached through more than five different URLs, yet the default robots.txt did not block the duplicates. Indexed page counts ballooned, and in the end many forums running Discuz! X1.0 were mercilessly deindexed by Baidu. To help Discuz! X1.0 webmasters, the A5 optimization team was the first to publish a correct robots.txt configuration on the official Discuz! forum, and it also reported the robots.txt problem to Comsenz through official channels. Comsenz took the feedback seriously: in Discuz! X1.5 and later versions, the team found the bundled robots.txt configuration to be nearly perfect.
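One way to sanity-check a robots.txt before deploying it is Python's standard `urllib.robotparser`. The forum paths below are hypothetical stand-ins for Discuz-style dynamic versus pseudo-static URLs, not the actual Discuz! rules; this parser also does plain prefix matching and does not understand the `*` wildcard extension, so the sketch uses literal path prefixes:

```python
import urllib.robotparser

# Hypothetical rules: block the dynamic post URL and an archiver
# duplicate, while leaving pseudo-static thread pages crawlable.
rules = """\
User-agent: *
Disallow: /viewthread.php
Disallow: /archiver/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Dynamic duplicate URL: blocked by the /viewthread.php prefix rule.
print(rp.can_fetch("*", "https://example.com/viewthread.php?tid=123"))
# Pseudo-static canonical URL: still crawlable.
print(rp.can_fetch("*", "https://example.com/thread-123-1-1.html"))
```

Running a few known-duplicate and known-canonical URLs through a check like this catches exactly the kind of misconfiguration that got Discuz! X1.0 forums deindexed.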
