First, a recap of the past year. SEO in 2011: Low Quality Sites (Panda) and Communication
2011 was all about identifying low-quality sites for Google. They wanted to clean up the Web while also working to improve communication between Google and webmasters. Matt Cutts thinks they did a good job on both fronts, which allowed them to move back to focusing on webspam in 2012.
Last year in detail:
- Algo update to address sites with too many ads above the fold (aka MFA, or made-for-AdSense, sites)
- Penguin: You’re probably already familiar with this one. It got a little bit of coverage. ;)
- Spammy link network crackdown: Google recently got its first spam report from the US Senate. One of the sites highlighted in that report was a site they had caught not only via Penguin but also via a spammy link network crackdown. It made Matt feel warm and fuzzy that they caught the site both algorithmically and manually.
- Continued refinements to Panda: Matt says data refreshes continue about once a month.
- Exact-match domains
As a webmaster, you may not be so excited about the stuff on that top list. So, for you, Matt shared some Google changes that took place over the past year that everyone can get excited about.
- Googlebot got smarter about AJAX/JS
- Search query data: 90 days of queries and the top 2,000 queries, giving 98% of sites full coverage for all queries that lead to clicks
- autocompletetype: better form filling. If you lower the barrier to entry to filling out forms, you’ll get higher conversion rates.
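As an illustration of that point, Google's proposal at the time centered on an experimental `x-autocompletetype` attribute that hints to the browser what each field contains, so it can pre-fill the form. The form below is a hedged sketch; the field names and action URL are made up, and the attribute has since evolved into the standard `autocomplete` attribute:

```html
<!-- Sketch of hinted form fields for browser autofill.
     Field names and the action URL are illustrative only. -->
<form action="/checkout" method="post">
  <input type="text"  name="fname" x-autocompletetype="given-name">
  <input type="text"  name="lname" x-autocompletetype="surname">
  <input type="email" name="email" x-autocompletetype="email">
  <button type="submit">Buy</button>
</form>
```

The fewer keystrokes a visitor needs, the more likely they are to finish the form, which is where the conversion-rate gain comes from.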
- Responsive Web design suggestions
- Better multilingual support
- Webmaster Academy
- Recommendations for smartphone optimized websites
- Better crawl/site error reporting
- Emails for critical issues
- Better user permissions in console
- Messages for pretty much every manual action Google takes that will directly impact the ranking of your site
- Blog posts of algorithm changes: Helps you to see what topics they’re trying to tackle
- Video of quality launch meeting: released 8 minutes of a weekly meeting to help you look over Google's shoulder and see how they evaluate sites
- Updated Webmaster Guidelines with examples
- Message for unnatural links
With the 2012 recap out of the way, Matt had something to announce – A New & Improved Disavow Link Tool
Before Matt shared the URL for the tool, he had a few words of warning for everyone in the room.
Do not use this tool unless you know what you are doing and you are sure that you need it. Do not be the guy (or gal) who accidentally disavows every link going to your website. If you are that guy, you should not use this tool.
Matt stressed that you should still try to manually remove every link from the Web that you can before using this tool. Other search engines are looking at your link profile, and so are competitors; you don't want them to think you like spammy links either.
Okay, with that out of the way, on to the good stuff.
This is the URL for the new Google Disavow Link Tool: https://www.google.com/webmasters/tools/disavow-links-main?pli=1
Once you log into Webmaster Tools, you'll see the link to disavow links. The process itself is pretty easy: create a text file with one URL per line. You can also use the domain: prefix to disavow ALL links from a given domain. For the time being, Google is going to treat the file as a strong suggestion. All things being equal, Google will ignore the links you specify, but if they see something going wrong, they reserve the right to ignore your file. Matt says that most sites should NOT use this tool.
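Putting that format together, a minimal disavow file might look like this (the domains are made up for illustration; lines starting with # are comments):

```text
# Could not get these pages taken down despite outreach
http://spammy-directory.example.com/links/page1.html
http://link-farm.example.net/widgets.html
# Disavow every link from this entire domain
domain:paid-links.example.org
```

Per-URL lines let you keep the good links from a site while cutting the bad ones; the domain: form is the blunt instrument for sites with nothing worth keeping.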
When inside Webmaster Tools, you’ll be able to upload your text file of links for Google to ignore. There will be lots of warnings to remind you that you only want to do this if you know what you’re doing. Once you’ve chosen the file, hit the submit button and Google will tell you it has been processed.
This will not take effect immediately. If you submit your links on Monday, don't expect Google to have discounted them by Monday night. The way it works is that as Google recrawls the Web, the links you've listed get annotated, which could take a few weeks. This delay is deliberate, so webmasters don't try to turn links on and off in real time. Google is onto your games.
Webmasters will have the ability to download their text file via CSV or Google Docs so it can be re-edited and uploaded. If you delete the file, it will be like you never submitted it.
So is the disavow link tool the same as adding a nofollow? It's about 99 percent the same. With nofollow, Google will drop that link out of their analysis; it's pretty much a guaranteed thing. Here, they're being a little more cautious so people don't accidentally disavow all of the links to their own site.
If you're in the process of filing a reconsideration request, take care of your links first. Then wait a little bit before submitting the reconsideration request, and let Google know you've used this tool so they know to look for the file.
Danny Sullivan was present during Matt's talk and, somewhat frustrated, said it sucks that Google has to give us this tool at all. In the past they've always said that links pointing to our site can't hurt us, but now they seem to be going against that. If you know links are bad, why not just NOT count them?
Matt counters that the disavow tool isn't just for bad links you built yourself; it can also help with negative SEO. Instead of wasting time worrying about that, webmasters can use the tool and get back to earning links from reputable domains. The tool is also for those occasions when you can't get someone to remove a link to your website that you want pulled, because they're either ignoring your email or you just can't get a hold of them.
Didn’t exactly answer Danny’s question but…there’s always tomorrow. ;)