A few days ago, Google published a reporting tool for scraper sites. If your website content has been stolen by other websites, this tool can help you. How exactly does it work, and how can you benefit from the new tool?
What are scraper sites?
Scraper sites are websites that copy content that other people have created. They republish that content on their own websites and present it as their own.
It seems logical that the original website should rank higher on Google than the website that copies the content. Unfortunately, that’s not always the case.
Scraper websites often rank higher on Google than the original websites. Sometimes, the original website isn’t displayed at all in the search results while the scraper site gets high rankings.
Google’s new approach
Until now, Google has tried to filter scraper sites out of the search result pages algorithmically. It seems that this hasn’t worked very well. Google’s Matt Cutts announced a new tool that enables webmasters to report scraper sites.
How does the new tool work?
To report a scraper website to Google, you have to do the following:
- Go to this page.
- Enter the URL of the original page and the URL of the scraper site.
- Enter the URL of the search result page that demonstrates the problem.
Before you submit the report to Google, make sure that your website follows Google’s webmaster guidelines. If your website has been penalized for spam, the scraper report tool won’t help you.
What is Google going to do with the data?
Google does not say what it will do with the data. Submitting a scraper site through the form does not mean that the scraper website will be removed from the search results. It is likely that Google will use the submitted information to improve the spam detection of its ranking algorithm.
How to improve your web pages
If your own web pages rank higher than the scraper websites, you don’t have to worry about them. Do the following to make sure that your web pages rank as highly as possible:
- Step 1: Make sure that there are no technical errors on your web pages that keep search engine robots away.
- Step 2: Optimize the content of your web pages so that search engines can see that they are relevant to a particular topic.
- Step 3: Improve the backlinks that point to your web pages to show that your website is relevant and popular.
- Step 4: Remove bad backlinks that hurt the rankings of your web pages.
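Part of Step 1 can be checked automatically. As a minimal sketch (the robots.txt content, domain, and paths below are hypothetical examples, not taken from any real site), Python’s standard `urllib.robotparser` module can verify that Googlebot is allowed to crawl the pages you want indexed:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content -- in practice, use your own site's file.
robots_txt = """
User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check whether Googlebot may crawl the pages you want to rank.
for path in ["/blog/original-article.html", "/private/draft.html"]:
    allowed = parser.can_fetch("Googlebot", "https://www.example.com" + path)
    print(path, "->", "crawlable" if allowed else "blocked")
```

If a page you want to rank shows up as “blocked”, a stray Disallow rule may be keeping search engine robots away from your content.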
SEOprofiler can help you with all of these steps. If you haven’t done so yet, try SEOprofiler now:
Article by Axandra SEO software