How to Hide Webpages from Google


With tactics such as search engine optimization and content marketing being used by millions of people worldwide, it comes as no surprise that most webmasters are consistently trying to get their websites indexed by all of the major search engines. It can be a bit perplexing, then, to hear that some people try with all of their might to keep certain webpages out of the sight of Google and other search engines. There are many reasons why you might want to hide a particular page: sensitive information, secure download links and content that might jeopardize your search engine rankings are all common reasons to exclude pages from search results. If you want to learn how you can hide your pages from Google and keep select content secure, read on for three different ways to do so.

Block Your Directories

Sometimes you may wish to make multiple pages off-limits rather than just one or two. There are plenty of ways to keep Google from indexing an individual page, but you probably do not want to spend time replicating that code for every instance. This is where alterations to your robots.txt file can be very useful. If you want to exclude an entire directory, you can insert a "Disallow: /directory/" rule under the relevant user-agent section to prevent Google and other crawlers from fetching those pages. Be sure that every page in any directory added to the file is one you actually wish to block, and keep in mind that robots.txt only asks well-behaved crawlers to stay away; it does not make the pages private. A short example is shown below.
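For illustration, here is a minimal robots.txt sketch. The directory names /private/ and /downloads/ are placeholders, so substitute the paths you actually want to keep out of search results, and place the file at the root of your domain (for example, https://www.example.com/robots.txt).

    # Applies to all crawlers, including Googlebot
    User-agent: *
    # Block everything under these (placeholder) directories
    Disallow: /private/
    Disallow: /downloads/
    # The rest of the site remains crawlable
    Allow: /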

Password Protect Pages

This may not be the best way to handle hidden web pages in every situation, but those who want to prevent subscriber material or private download links from falling into the wrong hands can use password protection to keep Google's crawlers and bots away. This is not a good idea for simply preventing the indexing of regular material, as your average readers will either a) not be able to see the material or b) not want to enter a password every time they view standard content. A basic server-level example is sketched below.
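As one possible approach, on an Apache server you can protect a directory with HTTP Basic authentication via an .htaccess file. The sketch below assumes a password file created with the htpasswd utility; the AuthUserFile path is hypothetical, and other servers and CMS plugins offer equivalent features.

    # .htaccess placed in the directory you want to protect (Apache example)
    AuthType Basic
    AuthName "Subscribers Only"
    # Hypothetical path to the password file created with the htpasswd tool
    AuthUserFile /home/example/.htpasswd
    Require valid-user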

Utilize JavaScript

The final way that any webmaster can effectively prevent search engines like Google from indexing select content is to use JavaScript to display that information. The vast majority of search engine bots and spiders do not execute JavaScript, so content that is only written to the page by a script will generally stay out of their indexes. Since JavaScript is supported by desktop and mobile browsers alike, however, your readers will still be able to enjoy any and all content without issue. A small sketch follows.
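As a minimal sketch of this idea, the snippet below loads a piece of content from a hypothetical URL (/hidden-content.html) after the page has rendered and inserts it into a placeholder element with the id hidden-area; both names are assumptions for illustration only.

    <div id="hidden-area"></div>
    <script>
    // Load the hidden content only after the page has rendered,
    // so it never appears in the static HTML that most crawlers read.
    document.addEventListener('DOMContentLoaded', function () {
      var xhr = new XMLHttpRequest();
      xhr.open('GET', '/hidden-content.html'); // hypothetical URL
      xhr.onload = function () {
        if (xhr.status === 200) {
          document.getElementById('hidden-area').innerHTML = xhr.responseText;
        }
      };
      xhr.send();
    });
    </script>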

Conclusion

Whatever your motivation for keeping certain pages out of Google's index, you have several options at your disposal. Whether you wish to block entire directories, password protect individual pages or serve information through JavaScript so that it is never indexed, you can quickly configure a solution. As Google and the other search engines hand down more and more penalties over the content they have indexed, deliberately hiding select pages is only likely to become more common.





