Search Engine Optimisation

How to Remove URLs from Google’s Search Results

Removing information that appears in search results is not as simple as deleting data from a file. However, if you restrict access to the content or remove it altogether, it will stop showing up once search engines re-index the page. This happens naturally, so it takes some time. If you are running out of patience, or if you need the content removed immediately, you can use Google's URL removal tool within your Google Webmaster Tools account to accelerate the process.

You have two options when it comes to removing content from Google's results pages. These options are discussed in detail below.

Removing a single URL

A URL can only be removed once the site owner has made it clear that the page should no longer be indexed. You can block a URL from appearing in Google search by using:

  • a robots.txt file,
  • a noindex meta tag, or
  • a 410 or 404 status code indicating that the page no longer exists

If the robots.txt file is used for this purpose, you just need to add the following to the file:

User-agent: *
Disallow: /your-file-name.html

The same thing can be done with a "noindex" meta tag, which is added to the <head> section of that particular page:

<meta name="robots" content="noindex" />
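As a rough illustration, you can check whether a page's HTML carries this tag using Python's standard library; the page source below is a hypothetical example, and in practice you would fetch the live HTML.

```python
from html.parser import HTMLParser

class NoindexChecker(HTMLParser):
    """Detects a <meta name="robots"> tag whose content includes "noindex"."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            name = (d.get("name") or "").lower()
            content = (d.get("content") or "").lower()
            if name == "robots" and "noindex" in content:
                self.noindex = True

# Hypothetical page source for demonstration only.
page = '<html><head><meta name="robots" content="noindex" /></head><body></body></html>'
checker = NoindexChecker()
checker.feed(page)
print(checker.noindex)  # True
```

`HTMLParser` lowercases tag and attribute names, so the check works regardless of how the tag is capitalised in the source.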

The same objective can be achieved by removing the page from the site and returning a 404 or 410 status code for that URL.
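A minimal sketch of how a server might choose between these status codes, assuming it keeps a set of deliberately removed paths (the path names here are hypothetical):

```python
# Paths that were removed on purpose (hypothetical example).
REMOVED_PATHS = {"/your-file-name.html"}

def status_for(path: str) -> int:
    """Return the HTTP status code to serve for a requested path.

    410 Gone signals a deliberate, permanent removal; a real server
    would return 404 Not Found for pages that simply do not exist.
    """
    return 410 if path in REMOVED_PATHS else 200

print(status_for("/your-file-name.html"))  # 410
print(status_for("/index.html"))           # 200
```

Either code tells Google the page is gone; 410 states explicitly that the removal was intentional.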

However, make sure the URL is properly blocked before submitting a removal request. Then go to http://www.google.com/webmasters/tools/removals, enter the URL you want removed from Google search, and select the "Webmaster has already blocked the page" option, as shown below.

Remove URL from Google Search Result Pages

Removing an entire directory or site

Search engines can be blocked from indexing an entire website or a complete directory by using the robots.txt file. To block the entire website, the robots.txt file should include:

User-agent: *
Disallow: /

To block a particular folder, the file should include:

User-agent: *
Disallow: /folder/

You can use the "Test robots.txt" or "Fetch as Googlebot" features, also available in Webmaster Tools, to verify that the website is blocked properly.
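You can also sketch a quick local check with Python's standard-library robots.txt parser; the rules below mirror the folder-blocking example above, and example.com is a placeholder domain.

```python
import urllib.robotparser

# Rules matching the directory-blocking example (placeholder domain).
rules = """
User-agent: *
Disallow: /folder/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# can_fetch(user_agent, url) returns True if the URL is allowed.
print(rp.can_fetch("*", "https://example.com/folder/page.html"))  # False
print(rp.can_fetch("*", "https://example.com/other.html"))        # True
```

This only confirms that your rules say what you intend; Google's own testing tools remain the authoritative check.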

Verified Owners of Google Webmaster Tools

Verified website owners can remove a site by selecting it in Webmaster Tools and going to Site configuration > Crawler access > Remove URL. Submit the URL you want to remove and confirm it. A subdirectory can be removed in the same way; just select the "Remove directory" option from the drop-down menu.

Remove Directory/Website from Google Search Result Pages

Re-indexing content

If you are the verified owner of a website that you have blocked, you can unblock it by going to Site configuration > Crawler access > Remove URL > Removed URLs in Webmaster Tools and cancelling the earlier request.
