Danny Sullivan and Matt Cutts at SMX Advanced 2013

Photo by searchengineland.com

Here are my on-the-spot notes taken down as Danny Sullivan of Search Engine Land interviewed Google Distinguished Engineer (and head of Google’s web spam team) Matt Cutts with questions of his own as well as some from the attending live audience at SMX (Search Marketing Expo) Advanced in Seattle this week. Please understand that nothing in this post should be taken as an exact quotation; these are just my attempt to summarize what was said as well as I could.

5:03 Danny Sullivan introduces Matt Cutts.

DS: Will PRISM allow us to see not provided search terms? (Cracked Matt up)

DS: Since Google algorithm updates tend to be named after animals starting with “P,” will the next ones be Polar Bear or Pug? (Matt cracks up again.)

5:05 DS: What’s up with no more announcing of updates?

MC: There have been no updates to Panda in the last six weeks. We are getting ready to pull in a new signal that should get some people out of the grey zone. Panda is very close to a “steady state.” There’s new Panda data about 10 days out of each month, but the refreshes are much smaller than they used to be. That’s normal for updates: they have a big impact in the beginning, but each successive iteration has a reduced impact because there is less to go after, and later updates target more specific problems.

5:08 DS: Why not just announce all updates?

MC: We did that for a while. But it got tiring to have people freaking out over every announcement. Most are really quite small. It’s difficult to assess what’s worth announcing, when the SEO community tends to freak out over every announcement.

5:09 DS: Are the non-Google tools that try to measure algorithm fluctuations not accurate? Sometimes those tools show big fluctuations on days when you insist nothing big happened.

MC: Most changes are barely noticeable. When the tools are not accurate, they may be paying attention to a subset of data that is slightly skewed.

5:11 DS: Why is spam still getting through even after Panda and Penguin?

MC: No update is going to be targeted at every possible spam situation. Penguin 1.0 went mostly after home pages; 2.0 goes deeper into sites. There will be another update coming soon to go after spam from hacked sites. There are specific spam techniques which haven’t yet been targeted, but they will be. Changes coming out in the next few weeks will go after some of the specific spam types that people have been complaining Penguin didn’t get.

Matt asks how many people are in-house SEOs. (Many hands go up.)

MC: Most people really don’t want to be doing illegal or black hat stuff. You might see a big change to “payday loans” today or tomorrow! So when you see one query with lots of spam, don’t assume Google is ignoring it.

We are worried about update-name inflation. There will be some more named updates, but most won’t be named.

5:18 DS: Links are a problem. Now our Third Door Media sites are getting link removal requests, which we won’t act on. Why doesn’t Google just disavow all the bad links itself? (big applause)

MC: The whole recent process of penalties should be seen as a temporary market correction. Everyone’s getting the message. You have to go after good links now. We’re moving to a healthier world in search; spamming is getting harder and harder.

People are now much less likely to want to pay for links. We’ve seen many sites go higher after they cleaned up and went white hat. They’re finding it’s more sustainable. In the past they might have made some quick but short-term profit from black hat techniques, but now they are learning that white hat yields a steadier, more sustainable income.

5:23 DS: How can we know what works anymore? Where’s the decoder ring of what counts?

MC: I feel like our standards have been pretty consistent from the beginning.

DS: Really? People should just concentrate on building good sites with great content? Really?

MC: If you aim for fantastic user experience, you will find it easier to get traction and links and everything else.

5:26 MC: There are 500 algo changes a year. Most typically don’t generate messages to webmasters. But if Google has taken manual action, you will almost always get a message in Google Webmaster Tools (WMT). Something new today: we’re testing sending example bad URLs to webmasters when we notify them of a bad-links penalty. (applause)

5:27 DS: Penalties vary in severity, right?

MC: Yes. A totally evil site might actually have to wait until the domain expires. But the length of a penalty matches the severity of what the site was doing and how often the infractions were repeated.

Right now penalties automatically expire. But we are finding ways to take a second look manually in some instances.

5:30 MC: You really need to be thinking about mobile. We’ve noticed common problems, such as all of a site’s desktop URLs redirecting to a single mobile URL. In the future, if Google sees those kinds of errors, it might lower your site’s rankings for mobile searchers.

Also, very slow mobile sites could be penalized in rankings. These mobile search changes have been approved and may be out soon. They’ve already been implemented in Japan, but now will be rolled out more widely.
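To illustrate the redirect mistake Cutts described, here is a minimal sketch (the `m.example.com` setup and domains are hypothetical, not anything Google publishes): each desktop URL should redirect to its own mobile equivalent, preserving the path and query, rather than everything landing on one mobile URL.

```python
# Sketch of per-page mobile redirects (hypothetical m.example.com setup).
# The error described is sending every desktop URL to one mobile URL;
# the fix is to keep the path so each page maps to its mobile equivalent.
from urllib.parse import urlparse, urlunparse

MOBILE_HOST = "m.example.com"  # assumed mobile subdomain

def mobile_redirect_target(desktop_url: str) -> str:
    """Return the equivalent mobile URL for a desktop URL."""
    parts = urlparse(desktop_url)
    # Keep path and query intact; only swap the host.
    return urlunparse(parts._replace(netloc=MOBILE_HOST))

# Wrong: redirecting everything to "http://m.example.com/" loses the path.
# Right: mobile_redirect_target("http://www.example.com/products/widget?id=7")
# yields "http://m.example.com/products/widget?id=7".
```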

Audience questions:

5:33 Should I disavow links right away if I get a WMT message?

MC: It’s fair to your competitors that you do some cleanup work before you disavow. Also, just disavowing leaves a bunch of spammy links on the web. We want some help in cleaning up the web.

We are looking at ways to give more backlinks to webmasters, but that’s still a ways off. We’re concerned about black hat ways to abuse such a list.
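For context, the disavow file Google accepts is a plain UTF-8 text file with one entry per line: either a full URL or a `domain:` directive, with `#` marking comments. The domains below are placeholders, not real examples:

```
# Example disavow file (hypothetical domains)
# Disavow a single spammy page:
http://spam.example.net/paid-links.html
# Disavow everything from an entire domain:
domain:link-farm.example.org
```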

5:36 When you began “not provided” cloaking of web queries from logged in Google users, you said such not-provided data would be less than 10% for most sites. But for most it’s skyrocketed way into double digits. Why did you tell us that?

MC: I was talking then about English only and Google.com only. Google PR people went crazy when I said single digits. As they continued to roll it out and expand it, it definitely went up. I pushed encryption because of reading a book by Cory Doctorow about how easily governments could spy on people. The events of the past week [the leak about alleged NSA access to domestic phone and web search data] seem to justify that concern. Google has made continual moves to increase user privacy.

DS: Why don’t you make all my data available in WMT?

MC: We previously increased the amount available from 30 to 90 days. You should download it as you go.

DS: But you guys have all that data. Why not just make it available?

MC: That would be nice.

DS: Why is Google large brand focused?

MC: It’s not.

DS: Why do people think it is?

MC: We use all the data we have. We don’t give preference by brand.

When Google does Penguin updates, what’s the percentage of sites hit now?

MC: It depends on how much spam there is that is targeted by any particular update.

5:43 Are all affiliates seen by Google as black hats?

MC: No, of course not. But there are more bad ones than good ones. We like any site that adds value, and the degree to which we like it corresponds to how much value it adds.

5:44: Does page speed affect rankings?

MC: Slow sites rank lower. Fast sites don’t get a ranking boost; it’s more of a lowering for slower sites.

5:46 What about the data that shows that Google is using Facebook data for ranking [a reference to a study presented earlier in the day by Eric Enge that showed that a Facebook Like alone could get a new page indexed and ranked]?

MC: I disagree with Eric Enge’s conclusions from his study earlier today. I do agree with him that using Chrome or Analytics alone will not get you indexed or ranked. Google isn’t able to crawl much of Facebook. What you’re seeing is correlation between Facebook likes and other factors that cause a page to rank well. Might have been links to those pages that you didn’t find. I’ll dig into Eric’s data to see what I can turn up.

5:49 (question about today’s mobile penalty) MC: At least mobile-enable your root page. The issue is more whether you are screwing up the mobile experience, not so much how much of the site is mobile.

Geolocation is not cloaking. But don’t treat Googlebot like it’s its own country. Treat Googlebot-Mobile just like an iPhone coming in.
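A minimal sketch of that advice (the user-agent strings below are illustrative, not an official list): run the crawler through the same device detection as a real phone, with no special branch for Google. Google’s smartphone crawler identifies itself with an iPhone-like user agent, so the same check that catches a real iPhone catches it too.

```python
# Sketch: decide mobile vs. desktop from the User-Agent string, with no
# special case for Google's crawler. User-agent strings are illustrative.
MOBILE_MARKERS = ("iPhone", "Android", "Mobile")

def is_mobile(user_agent: str) -> bool:
    """True if any mobile marker appears in the user-agent string."""
    return any(marker in user_agent for marker in MOBILE_MARKERS)

real_iphone = "Mozilla/5.0 (iPhone; CPU iPhone OS 6_0 like Mac OS X) AppleWebKit/536.26"
googlebot_smartphone = ("Mozilla/5.0 (iPhone; CPU iPhone OS 6_0 like Mac OS X) "
                        "AppleWebKit/536.26 (compatible; Googlebot/2.1; "
                        "+http://www.google.com/bot.html)")
desktop = "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36"

# Both mobile user agents get the same treatment; the desktop one does not.
```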

5:52 MC: We have looked at topic-based rankings, but it’s still difficult to do well. We are doing a better job of detecting whether a site is authoritative in a category, and that can help it rank higher for that category.

DS: How many categories? MC: Lots. DS: All auto-generated? MC: Yes.

MC: We’re also rolling out a beta of a structured data dashboard that shows markup errors. You can join at http://bit.ly/sdtesters

5:55 DS: Could you not get into The Internship?

MC: I must not fit the right demographic.

How much does bounce rate affect rankings?

MC: Last year I said it was not used. The answer is the same: to the best of my knowledge, we don’t use bounce rate to affect rankings. I’m skeptical of using user behavior. We tried having some feedback buttons on search results. My wife made the happy/frowny faces for that! We found that people were getting others to click each other’s happy buttons. People tend to skew user data if they think it affects ranking.

DS: Is that why you’re not using Google +1’s for ranking?

MC: We’ll continue to analyze the data to see if it’s useful. I’ll never rule out a particular signal, but I’m skeptical of these user-action ones.

DS: Are you going to do anything about offensive search suggestions?

MC: But it’s just what people are typing in.

DS: What is the most overrated thing out there in SEO?

MC: The effect of short-term social. Authorship is the kind of thing that will come into play, because we may not have access to a lot of the kinds of small social signals that are developing. Things like Authorship are built up over a long time, whereas most social activity is very temporary and ephemeral.

DS: Biggest surprise of the last year?

MC: It’s impossible to predict what people are and aren’t going to notice or get worked up about. That’s one reason why we don’t announce every update.
