These Errors and Mistakes Could Be Blocking Google from Indexing You

23/08/2014, in Google, Web Design

The only way that your website can be discovered by the masses – short of a massive paid marketing campaign – is through search engines. When someone wants to find information or products related to what you offer, they can enter a query and find you in no time at all. To be listed in search engines, however, your site needs to be consistently indexed by Google and the others. Since the dawn of the internet, search engines have used a fairly rudimentary process involving "crawlers" and "bots" to find all the content on the web. If these crawlers and bots cannot reach your pages, those pages won't be discoverable through search engines. Below are some errors that could be preventing search engines from finding your pages.

Check Your Robots File

Every website should have a file called robots.txt in its root directory. This file can be edited to allow only certain directories to be indexed, or to block crawlers from all directories entirely. Unfortunately, if you have ever edited this file, errors in it are fairly common, and those errors may be preventing your pages from being indexed properly. A rule that excludes a directory consists of "User-agent: *" followed by "Disallow:" and the path being blocked, as in the example below. Check your robots.txt file to make sure that no directories you need listed on search engines are blocked this way. Excluding pages via this file does not make them invisible to visitors; it simply tells compliant search engines not to crawl or list them.
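Here is a minimal sketch of what such a rule looks like; the directory name is only a placeholder:

    # Hypothetical robots.txt – "/private-drafts/" stands in for whatever directory is blocked.
    # All compliant crawlers are told not to fetch anything under that path.
    User-agent: *
    Disallow: /private-drafts/

    # To let crawlers reach everything, leave the Disallow value empty:
    # User-agent: *
    # Disallow:

Be careful with stray characters when editing: a lone "Disallow: /" blocks the entire site, so re-read every line after making a change.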

Your Pages Have Passwords

If some of your pages sit behind password protection, you may not realise that Google cannot see them at all. Some site owners password-protect pages for show, or simply to test the feature out; unfortunately, this also means that no one will be able to find those pages through search. If the pages need to be found by search engines, we highly recommend removing the password protection or, alternatively, providing a public landing page that explains what the protected page is about and links to it from there.
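For context, here is a rough sketch of what server-side password protection often looks like, assuming an Apache server configured through an .htaccess file; the password file path is only a placeholder. Any request without valid credentials, including one from Googlebot, receives a 401 response, so the page's content never reaches the index:

    # Hypothetical .htaccess snippet (Apache HTTP Basic Auth).
    # The password file path below is a placeholder, not a real location.
    AuthType Basic
    AuthName "Restricted Area"
    AuthUserFile /path/to/.htpasswd
    Require valid-user

Removing rules like these from pages you want indexed, or pointing crawlers at an unprotected landing page instead, resolves the problem.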

JavaScript and Cookies

If your pages have elements that require JavaScript or cookies in order to display their content, there is a good chance that Google will not index that content. From the perspective of a crawler that only knows how to handle text and HTML, code it cannot execute is simply ignored, and a page whose content depends entirely on it may be skipped by bots altogether. Some people use JavaScript and cookies to keep content hidden from certain visitors, but you may very well end up keeping it hidden from everyone by preventing it from appearing in search results.
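As a rough illustration, assuming a page whose main content is injected by a script, a plain-HTML fallback gives text-only crawlers something to read; the element names here are only examples:

    <!-- Hypothetical markup: content normally filled in by JavaScript. -->
    <div id="product-list">
      <!-- A script populates this at runtime; a text-only crawler sees nothing here. -->
    </div>
    <noscript>
      <!-- Static fallback that crawlers and script-less visitors can read. -->
      <p>A short, plain-text summary of the page content goes here.</p>
    </noscript>

The same reasoning applies to cookies: if a page refuses to render anything until a cookie is accepted, a crawler that never accepts cookies sees an empty page.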

Conclusion

Those are three easily corrected ways in which you might be hiding content from Google without meaning to. First, check your website's robots.txt file for errors or overly broad Disallow rules. Next, verify that none of the pages you need indexed sit behind password protection. Finally, review your JavaScript usage and cookie requirements, as these may also prevent bots and crawlers from finding their way through your site.





