Google's drama of constant change continues. Over the past few days, several reports claimed that Google had made yet another algorithm change, making SEO more complex. But this time the change backfired: it produced an error that forced Google to admit that indexing its own search results was a mistake that needed to be fixed.
What’s the reality?
According to various reports, Google found itself at odds with its own webmaster guidelines. Its robots.txt file is set up to prevent its crawler, Googlebot, from crawling and indexing Google's own search result pages.
In other words, the setup is supposed to ensure that Google's results are blocked from showing up in its own search results. This time, however, the mechanism failed, and search result pages started appearing in the index.
Here is the guideline in question:
Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines.
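To see how a robots.txt rule like this works in practice, here is a minimal sketch using Python's standard-library `urllib.robotparser`. The robots.txt excerpt and the example URLs are assumptions modeled on the idea of blocking a site's own `/search` pages, not Google's actual file:

```python
from urllib import robotparser

# Hypothetical robots.txt excerpt (assumption for illustration):
# block every crawler from the site's own search result pages.
ROBOTS_TXT = """\
User-agent: *
Disallow: /search
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A search result page is blocked for crawlers:
print(rp.can_fetch("Googlebot", "https://www.google.com/search?q=seo"))  # False

# A page not covered by any Disallow rule is allowed by default:
print(rp.can_fetch("Googlebot", "https://www.google.com/maps"))  # True
```

Under this rule, a well-behaved crawler would never fetch a search results URL in the first place, which is why those pages normally stay out of the index.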
Sources report that Google responded with its characteristic humor: "Indexing the index? We must go deeper!" adding, "It's a glitch with multiple slashes in web addresses that we're working to fix now."
Other sources quoted Google as saying, "we're going to look into what happened here."
True to its habit of constant change and updating, Google has since fixed the error and is digging deeper to make its search engine more precise and valuable.
Did you find this article helpful? Let us know your feedback in the comments.