Snippets and Titles – Matt Cutts
Matt explains what a snippet is
Matt said that when a user searches for something, Google shows a title and a short description, and that both together are considered the snippet.
Google highlights some of the words in the snippet
Matt said that Google highlights some of the words in the snippet to show the user how relevant the result may be to them.
If Google showed the first 50 words it might not be useful
Matt said that if Google simply showed the first 50 words of a result, for example, the user might not be able to tell whether the result is useful or not.
It takes a lot of computing power to compute snippets
Matt said that even though it takes a lot of computing power to compute snippets this way, Google still thinks it is worth doing because the snippets help users.
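What Matt describes is usually called query-biased snippet generation: pick the passage that best covers the query, then highlight the matched terms. As a rough illustration of why this costs more compute than just showing the first 50 words, here is a toy sketch in Python; the function, the fixed word window, and the simple hit-count scoring are my own inventions for illustration, not Google's actual pipeline:

```python
def make_snippet(text, query_terms, window=30):
    """Pick the window of words covering the most query terms, then
    bold-highlight the matches. A toy query-biased snippet generator;
    real search engines do something far more elaborate."""
    words = text.split()
    terms = {t.lower() for t in query_terms}

    def is_hit(word):
        return word.lower().strip(".,!?") in terms

    # Scan every possible window and keep the one with the most query hits.
    best_start, best_hits = 0, -1
    for start in range(max(1, len(words) - window + 1)):
        hits = sum(1 for w in words[start:start + window] if is_hit(w))
        if hits > best_hits:
            best_start, best_hits = start, hits

    snippet = words[best_start:best_start + window]
    return " ".join(f"<b>{w}</b>" if is_hit(w) else w for w in snippet)

doc = ("Grand pianos need regular tuning. A qualified piano tuner "
       "adjusts string tension so every octave sounds right.")
print(make_snippet(doc, ["piano", "tuner"], window=10))
```

Even this toy version scans every candidate window per document per query, which hints at why computing snippets at Google's scale is expensive.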
Google shows the most useful title
Matt said that even if a page doesn’t have a title, Google tries to display the best title, in the same way it computes snippets. Matt said that if many of your pages share the same title, Google can compute one.
Google may try to find a better title
Matt said that Google can sometimes try to find a better title, and he gave the example of a page with a very long title. Matt said that this title will still be taken into account in score calculation (probably referring to PageRank), but it might not be displayed as is.
Google may get snippets from other places
Matt said that snippets can be taken from the Open Directory, computed from the keywords on the page, or drawn from the meta description tag.
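Matt lists several possible snippet sources. As a hedged sketch of one plausible precedence among them (the ordering, the helper name, and the length thresholds are assumptions for illustration; Google's actual choice is query-dependent and more nuanced):

```python
def choose_snippet_source(meta_description, page_text, odp_description):
    """Toy precedence among the snippet sources Matt mentions:
    prefer a substantive meta description, fall back to page text,
    then to the Open Directory entry. Purely illustrative."""
    if meta_description and len(meta_description.split()) >= 5:
        return ("meta description", meta_description)
    if page_text:
        return ("page text", page_text[:160])  # truncate to a snippet-ish length
    if odp_description:
        return ("open directory", odp_description)
    return ("none", "")

# No meta description, so the sketch falls back to the page text.
print(choose_snippet_source("", "Long body text about pianos and tuning.", ""))
```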
Google can find a better title for users
Matt said that Google can even change the title in its search results if it believes the existing title won’t help users identify the best result.
What should we expect in the next few months in terms of SEO for Google?
Matt Cutts: Hey everybody, today’s webmaster video is answering the question: “What should we expect in the next few months in terms of SEO for Google?”
Okay, so, first off, we’re taping this video in early May of 2013, so I’ll give you a little bit of an idea about what to expect as far as what Google’s working on in terms of the webspam team.
In terms of what you should be working on, we try to make sure that stays pretty constant and uniform.
- Try to make sure you make a great site that users love,
- That they’ll want to tell their friends about, bookmark, come back to, visit over and over again, ya know, all the things that make a site compelling.
We try to make sure that if that’s your goal, we’re aligned with that goal, and therefore, as long as you’re working hard for users, we’re working hard to try to show your high quality content to users as well.
But at the same time, people are always curious about, OK, what should we expect coming down the pipe in terms of what kinds of things Google’s working on. One of the reasons that we don’t usually talk that much about the kinds of things we’re working on is that the plans can change. Ya know, the timing can change, when we launch things can change. So take this with a grain of salt. This is, as of today, the things that look like they’ve gotten some approval or they look pretty promising. Okay, with all those kinds of disclaimers, let’s talk a little bit about the sort of stuff that we’re working on.
We’re relatively close to deploying the next generation of Penguin. Internally, we call it “Penguin 2.0”. And again, Penguin is a webspam change that’s dedicated to try to find blackhat webspam and try to target and address that. So this one is a little more comprehensive than Penguin 1.0 and we expect it to go a little bit deeper and have a little bit more of an impact than the original version of Penguin.
We’ve also been looking at advertorials, that is, sort of native advertising, and those sorts of things that violate our quality guidelines. So again, if someone pays for coverage or pays for an ad or something like that, those ads should not flow PageRank. We’ve seen a few sites in the US and around the world that take money and then do link to websites and pass PageRank. So we’ll be looking at some efforts to be a little bit stronger on our enforcement as far as advertorials that violate our quality guidelines. Now there’s nothing wrong inherently with advertorials or native advertising, but they should not flow PageRank, and there should be clear and conspicuous disclosure so that users realize that something is paid, not organic or editorial.
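The standard way a publisher keeps paid links from flowing PageRank is the rel="nofollow" link attribute. As an illustration, here is a small checker built on Python's stdlib html.parser that flags links inside a hypothetical sponsored section that lack nofollow; the "sponsored" class name and the page structure are assumptions, not anything Google prescribes:

```python
from html.parser import HTMLParser

class PaidLinkChecker(HTMLParser):
    """Flag <a> tags inside a (hypothetical) sponsored <div> that lack
    rel="nofollow", i.e. paid links that would pass PageRank."""
    def __init__(self):
        super().__init__()
        self.in_sponsored = False
        self.flagged = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "div" and "sponsored" in (attrs.get("class") or ""):
            self.in_sponsored = True
        if tag == "a" and self.in_sponsored:
            if "nofollow" not in (attrs.get("rel") or ""):
                self.flagged.append(attrs.get("href", ""))

    def handle_endtag(self, tag):
        if tag == "div":
            self.in_sponsored = False

page = """
<div class="sponsored">
  <a href="https://advertiser.example/">Paid link without nofollow</a>
  <a rel="nofollow" href="https://advertiser.example/ok">Properly disclosed</a>
</div>
"""
checker = PaidLinkChecker()
checker.feed(page)
print(checker.flagged)  # hrefs that should get rel="nofollow"
```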
It’s kind of interesting. We get a lot of great feedback from outside of Google. For example, there were people complaining about searches like “payday loans” on Google.co.uk. So we have two different changes that try to tackle those kinds of queries in a couple different ways. We can’t get into too much detail about exactly how they work, but I’m kind of excited that we’re going from having just general queries be a little cleaner to going after some of these areas that have traditionally been a little more spammy, including, for example, some more pornographic queries. And some of these changes might have a little bit more of an impact in those kinds of areas that are a little more contested by various spammers and that sort of thing.
We’re also looking at some ways to go upstream to deny the value to link spammers–some people who spam links in various ways. We’ve got some nice ideas on trying to make sure that that becomes less effective and so we expect that that will roll out over the next few months as well.
And in fact, we’re working on a completely different system that does more sophisticated link analysis. We’re still in the early days for that, but it’s pretty exciting. We’ve got some data now that we’re ready to start munging and see how good it looks and we’ll see whether that bears fruit or not.
We also continue to work on hacked sites in a couple different ways. Number one, trying to detect them better: we hope in the next few months to roll out a next generation of hacked site detection that is even more comprehensive. And also trying to communicate better to webmasters, because sometimes we see confusion between hacked sites and sites that serve up malware. Ideally you’d have a one-stop shop where, once someone realizes that they have been hacked, they can go to Webmaster Tools and have a single spot with a lot more info to point them in the right direction and hopefully clean up those hacked sites.
So if you’re doing high quality content whenever you’re doing SEO, this shouldn’t be some big surprise: you shouldn’t have to worry about a lot of different changes. If you’ve been hanging out on a lot of black hat forums and trading different types of spamming package tips and that sort of stuff, then it might be a more eventful summer for you.
But we have also been working on a lot of ways to help regular webmasters. We’re doing a better job of detecting when someone is sort of an authority in a specific space, could be medical or could be travel or whatever, and trying to make sure that those rank a little more highly if you’re some sort of authority or a site that according to the algorithms we think might be a little more appropriate for users.
We’ve also been looking at Panda and seeing if we can find some additional signals, and we think we’ve got some, to help refine things for the sites that are kind of in the border zone, in the grey area a little bit. So if we can soften the effect a little bit for those sites that we believe have some additional signals of quality, that will help sites that might have previously been affected to some degree by Panda.
We’ve also heard a lot of feedback from some people that if they go three pages deep they’ll see a cluster of several results all from one domain. We’ve actually made things better so that you’re less likely to see that on the first page and more likely to see it on the following pages. And we’re looking at a change which might deploy which would basically say that once you’ve seen a cluster of results from one site, you’d be less likely to see more results from that site as you go deeper into the next pages of Google search results.
And that has been good feedback that people have been sending us. We continue to refine host clustering and host crowding and all those sorts of the things. But we’ll continue to listen to feedback and see what we can do even better.
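The host-crowding behavior described above can be pictured as a simple re-ranking rule: after a couple of results from one host, further results from that host get pushed deeper. Here is a toy sketch of that flavor of change; the per-host limit and the penalty value are made-up parameters, not anything Google has published:

```python
from collections import defaultdict

def diversify(results, per_host_limit=2, demotion=100):
    """Toy host-crowding rule: once `per_host_limit` results from a host
    have appeared, demote that host's remaining results far down the
    list. `results` is a ranked list of (host, url) pairs."""
    seen = defaultdict(int)
    reranked = []
    for rank, (host, url) in enumerate(results):
        seen[host] += 1
        penalty = demotion if seen[host] > per_host_limit else 0
        reranked.append((rank + penalty, host, url))
    reranked.sort()  # stable: ties keep their original order
    return [(host, url) for _, host, url in reranked]

results = [("a.com", "a1"), ("a.com", "a2"), ("a.com", "a3"),
           ("b.com", "b1"), ("c.com", "c1")]
print(diversify(results))  # a.com's third result drops to the end
```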
And then we’re going to keep trying to figure out how to get more information to webmasters. I mentioned more information for sites that are hacked and ways they might be able to fix things; we’re also going to be looking for ways we can provide more concrete details, more example URLs, that webmasters can use to figure out where to go to diagnose their site.
That’s just a rough snapshot of how things look right now. Things can absolutely change and be in flux; we might see new attacks, we might need to move our resources around. But that’s a little bit about what to expect over the next few months, in the summer of 2013.
I think it’s going to be a lot of fun. I’m really excited about a lot of these changes, because we do see really good improvements: people who are link spamming or doing various black hat spam should be less likely to show up, I think, by the end of the summer. And at the same time we’ve got a lot of nice changes queued up that hopefully will help small/medium businesses and regular webmasters as well. So that’s just a very quick idea about what to expect in terms of SEO for the next few months as far as Google.