Google

How Do Google Rank Websites?

There are 2,925,249,355 internet users globally – but 982,003,325 websites servicing them. If you want to attract more than your fair share (a frankly pathetic 2.98 internet users per website) then may I suggest that deft SEO is a strategy you cannot afford to ignore? When Google launched in 1998 there were only 2,410,067 websites on the entire Internet. The web has grown more than 400-fold since then. Without specialist SEO knowledge it stands to reason that it's hundreds of times harder to rank on Google now than it was then.

Internet users globally

The availability & interchange of information has soared – 105,506 YouTube videos are viewed every second. 9,788 Tweets are sent every second of every day (about 1,000,000 every 1 minute and 42 seconds, around the clock). 1,830 Skype calls are made every second. 2,461 Instagram photos are uploaded every second. 2,416,569 emails are sent every second.

The thirst for this knowledge or information is equally mindbogglingly large: 50,030 Google searches are performed every second. That's over 1,000,000 Google searches every 20 seconds. Every time you perform a Google search, that query travels an average of 1,500 miles to a data center and back to present you with your search results.

The demands of this flow of internet traffic are immense – 29,019 GB of Internet traffic flow every second. Search engine optimization is the process of maximizing your website's chances of attracting traffic via organic search results.

Internet traffic per second

I view SEO or search marketing as but one cog of many in the money-printing mechanics of a successful & profitable website. SEO, though, remains my first love, passion & core competence. I have, though, come to view it as merely part of my service offering – an offering that spans from online business conceptualization, via design & development, through conversion & revenue maximization. When it comes to acquiring SEO capability you have three broad options: learn it yourself, hire a professional, or take a training course.

The first option – DIY SEO – involves an investment of time but is a good choice for anyone with little to no budget. The informational resources to learn the fundamentals of SEO are readily available online. The many & varied nuances of situational SEO best practice, though, are only attainable via experience – which can only come with time & continued, constantly re-calibrated & self-aware effort. There is nothing inherently or excessively difficult about SEO – Google go to great lengths to distribute information relating to SEO best practices as freely & widely as possible & make it as accessible as can be – but the finer aspects of the art of SEO are learnt through experience. If you have no budget for search marketing but enthusiasm for learning about it, DIY SEO is entirely achievable with effort, time, patience & realistic expectations.

The second option is a minefield: finding a competent individual SEO consultant or expert, company or agency who will faithfully implement search engine friendly changes & focus on your online success. This becomes problematic when your SEO retainer accounts for 2-5% of the SEO agency's revenue but your organic search success accounts for anything up to 100% of your website's revenue. A sustainable SEO consultant-client relationship is effectively a partnership – a symbiotic association between two profit-driven entities. If the potential upside of an online industry is substantial – & it is in many online sectors – both parties (the competent SEO practitioner & the ambitious website owner) must appreciate that the only possible lasting arrangement is one that delivers a workable share of that upside to both.

The third option – an SEO training course – again involves finding a competent individual & a substantial investment of your time. From a budgetary perspective a course offers a useful middle ground between learning SEO entirely by yourself & employing someone to do your website's SEO on your behalf. The payment involved would generally be once-off in nature. Doing an SEO course also affords you the opportunity to better gauge the enormity (or otherwise) of the search engine marketing task you are faced with. In most cases individuals will have underestimated the size of that task. In other instances individuals may realize that they are capable of taking the SEO baton onward themselves.

The goal of SEO is not one dimensional – traffic is heterogeneous, consisting of lower quality traffic & higher quality traffic. The aim should be to swell revenues & profits – not merely traffic. The solicitation of high quality traffic is inextricably linked to the existence of high quality content. If you want to be found via a search engine – make sure your online real estate possesses something of value to your target market. I call this epiphany one of my binding foundational principles of search – & the more that you take cognizance of these, the less important frankly minor algorithmic tweaks become.

The most successful websites in the world – your Wikipedias, your Facebooks &, yes, even your Googles – have achieved multi-billion dollar success (or chosen to give it away, in the case of Wikipedia) because they offer uniquely superb resources that add value to people's lives. Google themselves have on-site SEO that is well below par – but that is entirely inconsequential because their perceived value to visitors is immeasurably positive.

Wikipedia by Number of Articles Related to Specific Areas of the UK.

Adding value to visitors' lives is the key to legitimately boosting your imputed worth in the eyes of Google. A website can offer time-sensitive information (stock prices, sports scores, weather & breaking news); information that offers gain (financial or otherwise) to the visitor (job listings, discounted goods or cheap/convenient travel reservations); reference information of high utility (databases & journals); data & media of entertainment value to visitors; data-sifting tools (search engines); & existing-customer functionality for convenience (online banking & airline websites). So ask yourself – how is your website adding value to your visitors' lives? A great many websites – what I call the corporate brochure model of a website – simply do not offer any value beyond a presentable website whatsoever. This isn't intended as a slight – merely an objective assessment. Most content generators focus on building links to their content, but high value content marketing cannot help but generate high quality, trusted links almost by default. This means that links will be natural & not merely made to look natural.

If you abide by this principle – of value offering above all else – be it informational, educational, entertainment-based or some other form of utility – the nuts & bolts of SEO will materialize automatically. Additionally, this mechanism for boosting your online presence is a future-proof SEO strategy, as the worth that Google calibrates is incontrovertibly positive. There are no smoke & mirrors. There are no loopholes which you are exploiting today that will evaporate tomorrow. Google employ 24,000 of the smartest people on the planet to constantly shut down such loopholes – is that a game you want to play? Is that a game-plan you see having any longevity whatsoever?

Web content creators & online businesses should desist from their obsession with ranking in Google’s top 10, and focus instead on user experience. This will lead to a surge of organic search traffic and rankings. Google’s priority is their users and a good user experience will lead to the search engine taking a more favorable view of your website.

How To Rank on Google

Google Ranking Signals

Why Some Websites Soar & Others Sink on Google Search

Targeted Keyword in Title Tag. This is akin to saying “label your work” which, predictably, is of great help to the search engine seeking to make sense of the accumulated body of human knowledge! It is also your first step toward ranking for what you want to rank for on Google. It is important to maintain a balance between keyword rich title tags & title tags that effectively explain what your page is about whilst appealing to your target searcher.
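By way of a hedged sketch – the keyword, brand name & wording below are invented for illustration – a title tag for a page targeting [hotels london] might read:

```html
<!-- Keyword-rich yet readable: the targeted keyword leads, the rest explains
     the page & appeals to the searcher. "ExampleTravel" is a made-up brand. -->
<title>Hotels London – Compare Central London Hotels | ExampleTravel</title>
```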

Domain Authority – stronger domains boost a given page's chances of ranking for a given keyword string relative to a less authoritative domain. That is, a page on The Huffington Post, all other factors being equal (in theory), would yield a greater SEO/SERP impact & ranking than a page on a more obscure domain. Authority is conferred over time & via the accumulation of imputed worth from links to content on your website.

Domain Authority

Targeted Keyword in the URL. Another way you can make Google's life easier – by making your webpage's relevance to a given search term as obvious as possible. As an SEO Consultant I have seen hundreds of websites that under-utilize this very easy-to-remedy inefficiency. You don't need an exact match either – but having keyword-rich text in the URL will improve your chances of ranking well on Google.

Quantity of Linking Root Domains. This is a very important ranking signal. 10,000 total links from 5 websites is of less SEO potency than 500 inbound links from 20 websites. This is a form of safety in the assumption of worth based on numbers – if more independent voices "vote" for your site via an inbound link then your SEO performance is near universally bolstered. For new sites this process can take time, but an investment in creating quality, noteworthy content should see an increasing number of websites within or near your niche find your content of use & eventually link to your website in growing numbers.

Quantity of Inbound Links from C-Class IP addresses. Links from a larger amount of clearly independent domains are deemed a signal of greater imputed worth. These are viewed as a highly reliable metric for algorithmic purposes by Google.

Domain Age. This is a qualified ranking signal – older domains that do not have a history of serious transgressions against Google guidelines have an inherent SEO advantage over younger domains. The Google algorithm places some value in the longer period of guideline adherence. This has led to a market for so-called recycled domains in the lead generation business. The advantage can be a subtle one & it's possible that the majority of the reason for the higher placement in SERPs comes down to correlation with established, high quality websites that came into existence 11-12 years ago (& survived) rather than direct algorithmic causation.

Domain Age

Content offering Unique Insights will increasingly be rewarded with higher placement within Google SERPs. That is, new research, novel analysis, vibrant commentary & industry-specific information of high utility to visitors will almost always be rewarded with a strong organic footprint within Google SERPs. As the non-Google search engines' algorithms become more refined & sophisticated over time, the quality of novel content produced will come to outweigh the largely mechanical ranking signals that currently lead to high rankings on those engines.

Main Content should be immediately visible – that is, above the fold. Google's Quality Guidelines state: "The page layout on highest quality pages makes the Main Content immediately visible."

Trusted Links or TrustRank. There are "seed sites": .ac.uks, .govs, .edus, Time.com and so on. The world wide web's most trusted domains possess links directly from these websites. By acquiring inbound links from websites that themselves enjoy inbound links from seed sites you acquire a diluted yet still highly potent form of reflected trust. This is loosely how it works:

cam.ac.uk –> your website = perfect; complete trust implied.
cam.ac.uk –> another website –> your website = very, very good; implied trust is transferred.
cam.ac.uk –> another website –> another website –> another website –> another website –> another website –> another website –> another website –> your website = of very little use – though still of more use than a scenario where the University of Cambridge's seal of approval is nowhere to be found within the link chain.
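The dilution above can be sketched in a few lines of code. To be clear, the halving-per-hop decay factor is my own illustrative assumption – Google publish no such constant – but the shape of the curve matches the chain examples:

```python
# Sketch of TrustRank-style dilution: implied trust decays with each
# link "hop" away from a seed site such as cam.ac.uk.
SEED_TRUST = 1.0
DECAY_PER_HOP = 0.5  # hypothetical decay factor, not a published Google figure

def implied_trust(hops_from_seed):
    """Trust reaching a site via a chain of `hops_from_seed` links."""
    return SEED_TRUST * (DECAY_PER_HOP ** hops_from_seed)

direct = implied_trust(1)         # cam.ac.uk -> your website
one_removed = implied_trust(2)    # cam.ac.uk -> another website -> yours
seven_removed = implied_trust(8)  # the long chain in the example above
```

However small, the eight-hop value never reaches zero – which is the point of the final example: a trace of the seed's approval beats none at all.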

Targeted Keyword Appears in Domain. A quick win which, as with Exact Match Domains, is on the wane in potency yet remains a useful mild auxiliary algorithmic advantage that should be taken if available.

Targeted Keyword is First Word in Domain. This is clearly not possible in most instances but can be a very useful easy win if you are presented with the choice of purchasing keywordabc.co.uk vs. abckeyword.co.uk – the domain that places the keyword ever so slightly more prominently has been shown to rank ever so slightly, but ever so consistently, higher for [keyword] searches, all other factors being equal.

Domain registration length. A reputable business is likely to register their domain for a longer period. A fly-by-night scam operation is likely to register their domain for the shortest period possible to minimize costs (usually because they own dozens or even hundreds of shell/affiliate sites).

A natural looking Diversity of Inbound Links. A natural looking link profile includes links from a variety of domains, sources & link types. Google have armies of computer scientists, chaos theory mathematicians & their ilk establishing rigid mathematical formulas for what constitutes the extremities of what is naturally feasible in terms of a non-manipulative inbound link set. Links from industry relevant sources & appropriate usage of the disavow tool can help in this regard should you currently suffer from inappropriate external inbound links.

Diversity of Inbound Links

Inbound links from older, more established, domains assist SEO performance to a greater degree than do links from newer, less established, domains. This is related to the ingrained accumulated trust earned by long-existing domains. That's not to say that inbound links from newer websites are of low worth – far from it – merely that inbound links from long established, quality domains are in general of premium worth.

Quantity of Linking Pages. The number of distinct pages your inbound links are derived from influences SEO performance. Links from a higher quantity of different pages represent an implied quality signal to the Google algorithm.

Targeted Keyword in Subdomain. Given the option of using the formula keyword.mywebsite.com or somethingelse.mywebsite.com, it helps moderately to use the former for SEO purposes. This is a weaker version of having a near-exact match domain (which, as we've said, is not a ranking signal of the potency it once was & still relies on being a decent quality site to rank).

References and Citation of Sources. Google's co-founders (not to mention 98% of their current employees) come from a tertiary academic environment where citation of sources is part of the very fabric of their world – an esoteric but highly influential observation for SEO purposes.

A natural looking ratio of Nofollow Inbound Links to your website. Having too high a percentage of followed inbound links to your website is indicative of manipulative SEO practices by or on your behalf.

Nofollow Inbound Links

Google PageRank – derided, misunderstood – but, I maintain, still a very useful broad-strokes quality indicator. As per Brin & Page's seminal paper on the topic: "We assume page A has pages T1…Tn which point to it (i.e., are citations). The parameter d is a damping factor which can be set between 0 and 1. We usually set d to 0.85. Also C(A) is defined as the number of links going out of page A. The PageRank of a page A is given as follows:

PR(A) = (1-d) + d (PR(T1)/C(T1) + … + PR(Tn)/C(Tn))"
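The formula can be iterated to convergence on a toy link graph. The three-page graph below is invented for illustration, but the per-page update is the published formula verbatim:

```python
# Iterative PageRank over a made-up three-page link graph.
links = {        # page -> pages it links out to
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}
d = 0.85         # damping factor, per Brin & Page
pr = {page: 1.0 for page in links}  # initial PageRank of 1 per page

for _ in range(50):  # enough iterations to converge on this tiny graph
    pr = {
        page: (1 - d) + d * sum(
            pr[t] / len(outs)            # PR(T)/C(T)
            for t, outs in links.items()
            if page in outs              # T is a page citing this one
        )
        for page in links
    }
```

Page C ends up with the highest score – it is the only page cited by two others – which is PageRank's entire intuition: citations confer worth.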

A website silo structure enables Google to organize your website’s content by theme. Thus a website should be set up in a structure that allows for it as an entity to rise above an unrelated mash up of information lacking a clear central theme or themes.

Website silo structure

Inbound Links from Globally Respected Authority Sites. A DMOZ listing is the perfect example of such an inbound link.

DMOZ

Keyword Usage within Category URL String. This means utilizing keywords or "near keywords" as categories in URL strings. For example, myblog.com/hotels-london/ritz utilizes [hotels london] as URL text in all sub-pages, which helps the website rank for the term [hotels london] as well as – less directly – for search terms less immediately perceptible, such as [ritz london]. Think of the emboldened elements within a Google SERP – they give clues as to glaring ranking signals that you need to exploit as best you can.

Website Usability. A website that isn't user-friendly will result in less time on site, a higher bounce rate & other tell-tale signs of a website of little utility. This is a Google ranking signal that is already increasing in importance & will continue to do so over time. I can easily see website usability becoming a top 10 or even higher ranking signal for the Google organic search algorithm within 24 months from August 2014.

Website Usability

Frequency of website content updates. A site-wide freshness indicator. The more regularly a website’s content is updated, the more regularly it will be crawled by Google & the more the website will be viewed as a newsworthy resource within its subject niche.

Relevance of the Page that an Inbound Link is Derived from. Of especial diagnostic importance here is the title tag of the page linking to your page. If your page is seeking to rank on Google for the search term [disco cheam], a link from a webpage of that title is ideal. A link from a page with a similar variant is also a positive ranking signal.

Data Server Geographical Location. Google, generally, views a website attempting to rank for the search term [accountant london] as being more congruous if it is hosted in London, UK than were it hosted in say Beijing, China. This is one of the more commonsense elements of the search engine optimization process.

Server Geographical Location

Inbound Links to the Homepage of the Domain. A link to the homepage bolsters rankings site-wide in a small but incremental fashion. It also bolsters your website’s claims to being an authority within your content niche generally.

Absence of cloaking. Cloaking is when what a human visitor sees & what Googlebot sees are two different things – hidden links behind background-matching text, for instance. Once found to be doing this your website will be negatively impacted in search results in both the acute & chronic sense.

Domain Ownership History. This is only of importance if there is a history of either unscrupulous/penalized owners of a given domain or an unusually frequent change in ownership of a domain. Domain ownership history will have an SEO ranking impact on only a very small percentage of websites, but those that it does impact will be impacted rather significantly.

Social Shares. Although a link from Twitter, for example, is technically a nofollow inbound link, Google seek larger patterns of a domain's social media impact & reward websites with higher levels of social buzz – implied by wider uptake of user-shared content on social media networks. This is less a keyword-specific & more an overall rank-inflation benefit.

Social Shares

Proportion of Text to Code in the website's HTML. A huge array of code that effectively says very little in text form to a human is seen as a partial waste of the web crawler's time & is thus deemed to have a negative SEO impact. Conversely, text-heavy/code-light HTML is seen as an efficient way to communicate with the search engine's crawler. The magic number appears to be around 13.5% text to code (by characters) per page, or higher. My homepage, for example, offers as much as 54% text as a proportion of the size of the webpage. Google loves text content – consider that every search you conduct – bar voice or image based ones – is done via the medium of text. You can easily check that your website meets my recommended 13.5% or greater level by utilizing a text-to-code online checker.
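A rough stand-in for such an online checker, using only Python's standard library (the sample page is invented, & real tools may count characters slightly differently):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Accumulates the character count of human-visible text."""
    def __init__(self):
        super().__init__()
        self.text_chars = 0

    def handle_data(self, data):
        self.text_chars += len(data.strip())

def text_to_code_ratio(html):
    """Visible-text characters as a fraction of total page characters."""
    parser = TextExtractor()
    parser.feed(html)
    return parser.text_chars / len(html)

page = ("<html><head><title>Hotels London</title></head>"
        "<body><p>Compare the best hotels in London.</p></body></html>")
ratio = text_to_code_ratio(page)  # ~0.44 for this text-heavy sample
```

Run against your own homepage's HTML, a result above 0.135 clears the 13.5% floor recommended above.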

Use of Breadcrumb Navigation. This is a double benefit ranking signal in that it is seen as a usability/user experience advantage as well as being the perfect way to send anchor text rich links to key pages & categories.

Breadcrumb Navigation
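A minimal sketch of such a breadcrumb trail (the site & pages are invented for illustration) – each level is an anchor-text-rich internal link to a key page or category:

```html
<!-- Breadcrumb: a usability aid that doubles as keyword-rich internal linking -->
<nav class="breadcrumb">
  <a href="/">Home</a> &raquo;
  <a href="/hotels-london/">Hotels London</a> &raquo;
  <span>The Ritz</span>
</nav>
```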

Relevance of the Inbound Linking Domain. If you are looking to rank for [hotel london] you will yield greater SEO advantage from a website in the travel industry than you will from one in the automotive sales industry. Again, an observation that is entirely in keeping with Google’s reliance on commonsense.

Exact Match Domain – less of a quick gain than they once were but still useful enough if you are a quality site. If you are a low quality site & having an exact match domain is all you have going for you then you’re out of luck – Google caught up with you around 2012.

Matt Cutts tweeted about the changing value of Exact Match Domains some time ago.

Public vs. Private WhoIs. Only male impotence drug peddlers & their ilk have the need to hide their identity online. Google rewards enterprises & individual economic agents who are "real" & accountable entities, & thus I strongly recommend you list your WhoIs as Public with as much information about your company detailed as possible.

Public vs. Private WhoIs

Penalized Webmaster/WhoIs Owner. There is a legacy effect in such penalties. This is of especial importance when purchasing an existing domain – be sure to do a check of everyone who has owned it previously.

Country Domain extension. This can & should be a major consideration when purchasing a domain. If your business wishes to rank better in the United Kingdom & is largely unconcerned with ranking globally then a .co.uk is ideal. If your business needs to rank globally & there is zero premium on United Kingdom visitors – it's best to get a .com.

Use of a sitemap (in any form) allows Google to better & more efficiently index your website's pages. You can have one or both of an HTML & an XML sitemap – I usually have both: an HTML sitemap for users & an XML sitemap for Googlebot. This is a way to make sure that Google have access to every post & page on your website from one location.
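A minimal XML sitemap per the sitemaps.org protocol – the URLs & dates here are invented for illustration:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2014-08-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/hotels-london/</loc>
    <lastmod>2014-07-15</lastmod>
  </url>
</urlset>
```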

Anchor Text of Internal Links. This should be a fusion of exact & near-exact match anchor text that targets keywords, yet is not so obviously attempting to filter keyword-specific link influence down through internal pages. Internal linking is a hugely under-utilized auxiliary ranking factor, generally ignored for SEO purposes.

Installation of Google Webmaster Tools. The most direct way for a webmaster to seek information from & communicate information to Google. Indexation is potentially the greatest beneficiary.

Google Webmaster Tools

Deployment of Google Analytics. The benefits are multiform – for one, you are giving Google more hard data from which to ascertain how usable & useful your website is to visitors.

Absence of evidence of link purchases from known (to Google or Bing) Link Brokers/Merchants. Purchased links carry a risk-to-reward ratio – a real chance of being caught, with catastrophic consequences – that simply does not make sense at any level. If you want to be making revenue from your domain this time next year, do not under any circumstances deal with these nefarious characters of the world wide web.

A domain name with a hyphen – or, worse yet, more than one hyphen – has an inherent negative impact upon your rankings on Google. This is a hangover from the very early days (that is, the late 1990s), when the war on spammers was in many senses at its height (many websites of extremely low quality employed domain names such as name-of-well-known-impotence-drug-for-sale.com). One hyphen in a domain name is unfortunate – two is a tragedy.

An "appropriate amount of contact information" is a stated Google Quality Guideline. This means a "Contact Us" page is seen as borderline required for all commercial websites to communicate a website of quality to Google.

URL Length – too long is deemed counter-productive for ranking purposes – an important consideration in situations where categories or tags are used within URLs.

URL Depth – pages closer to the homepage tend to rank higher than deeper-lying pages. Thus, for SEO purposes, it is better to keep your higher quality content as shallow as possible within the website's URL structure.

Website downtime from faulty server performance negatively impacts organic search rankings & SEO generally & site-wide. This should be a serious consideration when committing to a service arrangement with a given web hosting company.

Website downtime

Title Tag Begins with Targeted Keyword. Google pays more attention to the initial characters as it reads from left to right, so having the Title Tag start with a targeted keyword is somewhat advantageous. Only do this if it fits the user experience; i.e. if it makes sense to do so.

Over optimization penalty – this results from keyword stuffing, misuse of the H1 tag & inappropriate/repetitive use of meta-tags.

Inbound Link Anchor Text. Less important than it once was due to fears of being a target of the Google webspam team – the text that a user clicks on to navigate to your webpage from another website tells Google (in theory) what your page is about & that increases your chances of ranking for that search term (if done in a natural fashion).

Targeted Keyword in Meta Description. At worst you want to include at least one reasonable variation of the targeted keyword within the meta description. At most you want to include the keyword string twice (& no more) in the meta description; a third iteration is considered spam-like.
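A hedged illustration – the wording & business are invented – with the targeted keyword string [seo consultant] used once, plus one reasonable variation, & never a third time:

```html
<!-- One exact use of the targeted keyword, one close variation, no more -->
<meta name="description" content="An experienced SEO consultant serving London
businesses. Honest, effective search engine optimization consulting." />
```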

Targeted Keyword Appears in H1 Tag.

Inbound Links derived from limited-uptake TLD Domains. Limited-uptake TLD Domains include .edu, .gov, .mil, .int & .ac.uk domains. There are around 804 functional .gov domains currently from a pool of 982,003,325 total websites in the world. As recently as 2010 there were 'just' 206,956,723 websites in the world. Having inbound links from these premium domains is a surefire way to differentiate the inferred worth of your website from your competitors in the all-seeing eyes of Google.

Targeted Keyword is Most Frequently Used Phrase on Page.

Text Content Length in Words. Generally more content in the form of text gives you a better chance of ranking for a wider range of search terms. If you need a benchmark, 350 is deemed a good bare-minimum word count per URL or page. The more, though, the better for SEO ranking purposes. Most of the articles I publish on this website have over 5,000 text words per page. By creating in-depth & high quality content that adds value to visitors, I've gained better search rankings & higher levels of quality traffic to my website.

Excess Outbound Links dilute the message or obscure the impact of the main content of a page. Limits should be informed by the volume of text content on a given page.

Excess Outbound Links

Duplicate Meta Data. If done site-wide this hurts rankings, you guessed it, site-wide.

Links from repetitive sources such as author biographical area or site-wide “blogroll” links are not as potent from an SEO perspective as are links from a contextual source – that is links from the main body text of an article, post or page.

Page Category Relevance. Every category should further enable & enrich the understanding of the content so as to be understandable & logical to a human visitor.

Usefulness of the Content – this will inflate most if not all other factors & should be viewed as SEO's new role – that is, one of encouraging the creation & integration of websites that provide utility to the accumulated body of human knowledge.

PageRank of the page that links to your domain (independent of the PageRank of the website linking to your domain).

Inbound Links from domains with a country-specific TLD (so .co.uk or .de) help your website or webpage rank better in search results made from that country on Google.

Keyword Density. The regularity with which keywords are used in the main content of a webpage is less of a factor than it once was – note I say less, not NOT a factor, merely of less importance than it once was. Keyword density best practices should still be observed – though never at the express cost of the quality of the content. Google's purchase of the Hilltop Algorithm some years ago was a clear signal that Google's long-run goal is to move beyond a rigid algorithm whose impositions reward robotic text creation & bury novel insights deeper in the SERPs. The golden band for keyword density would appear to be in the 0.9% to 3.5% range, with 2.2% perhaps representing a nominal optimum.
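A simple density check of the kind implied above – the phrase-matching approach & sample sentence are my own illustrative choices, not a Google-published method:

```python
import re

def keyword_density(text, keyword):
    """Fraction of the page's words accounted for by the keyword phrase."""
    words = re.findall(r"[\w']+", text.lower())
    kw = keyword.lower().split()
    n = len(kw)
    # count non-overlapping-start positions where the full phrase appears
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw)
    # each occurrence accounts for n of the page's words
    return (hits * n) / len(words) if words else 0.0

sample = ("London hotels are plentiful; compare London hotels "
          "by price and location.")
density = keyword_density(sample, "london hotels")
```

On this tiny sample the density is far above the 0.9%-3.5% band – a real page has far more surrounding text – but the calculation is the same at any scale.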

Proximity of related/relevant words to your targeted keyword around the anchor text of external inbound links to your webpage. This is indicative of a high quality, editorially bestowed inbound link that packs a serious SEO ranking punch.

Latent Semantic Indexing Keywords. This is a fancy way of saying Google has evolved enough to realise that certain words are synonymous with others. Thus [apartment rentals croydon] as a keyword string is largely treated the same as would [flat rentals croydon] & as such utilizing one increases your chances of ranking for the other. Additionally LSI Keywords are offered as alternatives at the foot of the SERP.

Latent Semantic Indexing Keywords

The use of ultra offensive language such as ethnic slurs or references to pornographic material (on non-pornographic websites, obviously) has a negative impact on how Google views your domain’s standing as an informational resource worthy of sending their visitors toward.

Latent Semantic Indexing Keywords in Title and Meta Description. It is preferable to mix up iterations of words that essentially describe similar concepts – that is, "consultant" & "expert" can be used near-interchangeably within a webpage's meta-data & are seen as superior for SEO ranking purposes than would be multiple iterations of either within the same meta tag element.

Page Load Speed. The faster a page loads, the less time a web user wastes, & Google takes this as a sign of functionality &/or usability. Google themselves state, "Fast and optimized pages lead to higher visitor engagement, retention, and conversions". This is in keeping with the types of webpages Google prefer to send their search engine users toward. Page speed is a sufficiently significant issue in the minds of Google that they offer 3 page speed optimization products, namely: PageSpeed SDK, PageSpeed Service & PageSpeed Module.

Page Load Speed & Conversion Rate.

Duplicate Content. This is a form of, or variation on, the cardinal sin of the academic world Google's co-founders stem from – plagiarism.

Duplicate Content

Avoid Reciprocal Link programs. Google have gone public with this issue & just because some major websites used this rather drone-like technique in 2003 doesn’t mean it will work in 2014. It won’t – & can lead to a negative SEO effect in the SERPs.

Google give SEO preference to e-commerce websites with SSL certificates.

The use of words such as “link partners”, “sponsored links” & “link to us” diminishes an inbound link’s worth to the page being linked to.

Proper Rel=Canonical Usage (avoidance of Duplicate Content Issues).
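For example (URLs invented for illustration), every variant of a page – sorted, paginated, parameterized – declares the one canonical version in its head:

```html
<!-- Both /hotels-london/?sort=price and /hotels-london/ would carry this tag,
     so Google consolidates them rather than seeing duplicate content -->
<link rel="canonical" href="https://www.example.com/hotels-london/" />
```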

Page Loading Speed on the Google Chrome web browser specifically.

Image Optimization – use of alt tags & image titles that communicate what the images are depicting.
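By way of a hedged illustration (the file name & wording are hypothetical), alt & title attributes that describe what an image actually depicts look like this:

```html
<!-- Descriptive alt text & image title; both communicate
     the image's content to Google. -->
<img src="page-load-speed-chart.png"
     alt="Chart showing conversion rate falling as page load time rises"
     title="Page load speed vs conversion rate">
```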

Recency of Content Changes

Too much cross-linking between different websites that you own on the same server will be viewed negatively by Google when done on a mass scale & especially when the cross-linked sites are of little to no relevance to each other.

Scale of Content Changes. Larger changes to content, predictably, result in more significant responses from the Google Index.

Inbound Links from User Generated Content sources. They are deemed especially valuable because (provided certain quality assurance & validity checks are met) they are links of near incontrovertible quality.

Historical Page Changes. Updating content on a regular, consistent basis over a protracted period indicates to Google that you are a legitimate informational resource of topical worth.

Targeted Keyword Being Used Early. Opinions vary on how early is early enough but you should generally aim to include your targeted keyword string (verbatim) within the first 50 words of a given page’s main body text if possible. Ideally, the earlier you are able to utilize the keyword string, the better. Never, though, compromise editorial or literary quality of a given passage to conform with this loose rule of thumb.

Return Traffic. In keeping with its mission of directing its users to the most useful online resources Google deem websites that web users visit & then return to (& in reasonable volume) as more useful than sites users visit once & then never again!

Google +1s & Google Plus footprint. Offers both an algorithmic advantage (despite Google’s rather amusing denials) & auxiliary benefits such as placement within supplementary SERP blocks.

Google, via their Webmaster Central Blog, are on record as saying that they want “websites people access from Google (to be) secure.” HTTPS – HTTP Secure/HTTP over Transport Layer Security – is, as of August 2014, a positive algorithmic ranking signal. For the time being it is deemed a minor ranking signal in the greater algorithmic scheme of things but it is one that has the potential to become more important as privacy becomes a larger & larger concern for users & thus a focus for their champion – Google!

User reviews on websites such as Yelp.com.

20% to 25% of keyword search queries are one-offs – by using more text content per page you increase the chances of ranking for one of these millions of one-off search queries.

The text that is visible to a user when they hover over a link (be it internal or external) comes from the link’s title attribute & this is a useful, if modest, little ranking signal to Google for SEO ranking purposes.

Targeted Keyword in H2, H3 Heading Tags.
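A brief sketch of this in practice, assuming a hypothetical targeted keyword string of “seo consultant london”:

```html
<!-- Targeted keyword worked naturally into subheadings. -->
<h2>What Does an SEO Consultant in London Do?</h2>
<h3>Choosing the Right SEO Consultant</h3>
```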

Targeted Keyword Order. Keyword Research should inform the order in which Keywords are used & placed in both meta data & body text.

Inbound Links from Wikipedia. Whilst these hard-to-acquire (& harder-to-keep) inbound links are technically nofollow, they impute a heightened state of authority.

Number of Words in the Page that links to your webpage. More words in the content that links to you is indicative of a stronger & more powerful link.

Compliance with the Google Panda 4.1 Update dated September 23, 2014. Google announced this update & estimated that 3-5% of all keyword search queries would be affected. This is, in SEO terms, a massive relative impact, although the lack of in-depth information available on the mechanics of this algorithmic update makes only surface-level compliance possible – & highly advised – at this juncture.

Outbound Linking to Authority Sites. When appropriate, it is deemed a slight algorithmic advantage to link to high authority domains as opposed to low authority domains. This is especially true when there are universally accepted online authority domains in a given niche.

Outbound Link Content Niche or Neighbourhood.

Correct grammar and spelling usage. This is a ranking signal that is likely to increase in impact over time as Google seeks to delineate high quality text content from less useful, lower caliber informational resources.

Too many 301 Redirects to a given page will hurt the SEO performance of that page.

Inbound Links from leading intra-industry resource pages. Being linked to from established leading resource authorities within your market sector provides a very potent SEO boost.

Other Keywords a Page Ranks for tell Google your page is a quality one (although this is a rather circular SEO argument – it usually pans out).

Usage of structured data vocabularies – be they Schema.org, Dublin Core or microformats – which afford semantic markup of information. Such pages have a mild tendency to rank higher in the SERPs & additionally seem universally to enjoy a higher click-through rate from Google & other major search engines once ranked. Microformats are data formats built on existing HTML standards & designed to be readable by human beings first & machines second.
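A hedged sketch of Schema.org markup using HTML microdata – the author name & date below are purely illustrative, the headline is this article’s own:

```html
<!-- Schema.org Article markup via microdata attributes. -->
<div itemscope itemtype="https://schema.org/Article">
  <h1 itemprop="headline">How Do Google Rank Websites?</h1>
  <span itemprop="author">Example Author</span>
  <time itemprop="datePublished" datetime="2014-10-01">1 October 2014</time>
</div>
```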

Utility of Supplementary Content. In 2014 Google re-edited & issued their Quality Rater’s Guidelines which spoke at length about Supplementary Content. Helpful Supplementary Content which improves a webpage will be rewarded & obnoxious, irrelevant ads will, generally, have a negative impact on your search engine presence. The first goal, though, should be to avoid the dreaded Low quality rating that is handed out to content poor webpages & sites.

Inbound link age. Officially, Google have stated that older links – that is links that have been pointing to your webpage for a lengthy period & consistently, are worth more than more newly gained links.

Responsive websites are needed for Mobile search engine optimization.
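The starting point of any responsive build – a minimal, standard snippet rather than a complete mobile SEO recipe – is the viewport meta tag, which tells mobile browsers to render the page at device width:

```html
<!-- Without this, mobile browsers render a zoomed-out desktop layout. -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```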

Brand Name as Inbound Link Anchor Text: Google expect a certain portion of your inbound links to come with your brand name as anchor text – anything other than that is viewed with suspicion & can negatively impact SEO performance.

Inbound Link Placement Relative to Content. A link within the body text of a page is worth more the earlier it is mentioned. This is a weak to mild differential within this wider Google SEO ranking signal.

For Local SEO purposes – compliance with the Google Pigeon update dated July 24, 2014. According to Google, Pigeon fuses elements of the local search algorithm with more core algorithmic ones.

Limited Amount of Outbound Links

Having Multimedia Content – thought to communicate a signal of overall quality.

Internal Links Pointing to a Given Page Relative to Other Pages (hierarchy of pages within the site is denoted by this)

Quality of Internal Links. The Googlebot prefers being sent – even internally (within a website) – to pages rich in text content and/or of global utility to the visitor.

Bounce Rate. Opinions may well vary but I contend that Google, on behalf of their users, utilize bounce rate data they may receive from websites with Google Analytics installed to double check usability & functionality as well as overall utility of a given webpage. What constitutes a decent bounce rate for SEO varies between market sectors but anything north of 40% should be deemed of likely mild negative algorithmic impact.

Global Authority/Popularity of social media users who share your content.

Terms & Conditions & or Privacy Policy Pages. Having these make your commercial website appear a legitimate online enterprise to the Google algorithm. Small signal with a “polishing” effect on your website’s appearance to the Google organic search algorithm.

Websites that are Google Chrome Bookmarked on a regular basis are likely to be given an algorithmic advantage over websites that are not.

Older Page that is regularly updated is the ideal scenario. Newer page that is left to rot is not.

Inbound Links from Forum Profiles & signatures (to a lesser degree) have been devalued in worth in recent times by Google & are an area Google are seen to be clamping down on in the face of widespread spamming operations active globally.

Positive Link Accumulation Velocity is when your website is accumulating more inbound links on a daily basis. This is deemed, you guessed it, a positive state of affairs for your site’s SEO performance. The opposite scenario is, of course, deemed a worrying situation & likely to hurt you algorithmically.

Time spent on website or webpage. The longer the mean visitor spends on your website the higher utility Google assumes your site offers visitors.

Compliance with the Pirate 2.0 algorithm update by Google dated October 21, 2014. A Google update to fight software and digital piracy. This update was highly selective, causing ranking losses to a proportionately tiny group of websites.

Broken Outbound Links are seen as a negative ranking signal.

Yahoo! & DMOZ Directory listings – within a relevant niche category – are sufficiently potent so as to be universally valuable for SEO purposes.

Site-wide inbound links are worth scarcely more than what a single link from that site might be worth in SEO terms.

Direct Traffic. Your website is deemed of slightly more utility if a reasonable (again this varies, but 5-10% is a decent broad estimate) percentage of the website’s traffic is direct. That is, someone typing yourwebsitename dot com & pressing enter in their web browser.

Hierarchical Page Placement within the xml Sitemap.
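Hierarchical placement can be hinted at via the sitemap protocol’s optional priority element – a hedged sketch, with hypothetical URLs & weightings:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Illustrative sitemap fragment; the homepage is weighted above
     a deeper service page to denote hierarchy. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/services/seo/</loc>
    <priority>0.8</priority>
  </url>
</urlset>
```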

Advanced Reading Level – advanced reading level is seen as a quality content signal that delineates a page of quality content from content farms.

Organic Search Click Through Rate could be a weak to mediocre organic ranking signal; although this is largely an assumption based on experience rather than a cold, hard fact.

Positive sentiment text in proximity to Inbound external link.

Limit or Counter-Balance Affiliate Links

Compete.com or Alexa.com traffic ranking – not a signal so much as indicative of a positive global trajectory.

HTML errors & W3C validation – universally deemed a weak quality ranking signal but every benefit is deemed useful in more competitive SEO market sectors.

Human Editors (it is speculated that Google use human editors to clean up SERPs in ultra popular search markets).

Inbound Links from 301 sources. These are worth less than a simple link from an existing page but worth more than nothing. That is, they are worth having if available. This is related to the concept of Link Recycling.

Quality of the text content on the webpage that links to your webpage.

Use of Bullet Points and Lists. Seen as a coincidental signifier of quality text content that caters for the visitor (very minor positive functionality or usability signal).

Quantity of webpages within the website – only useful in so much as a large website is less likely to be viewed as a content anemic affiliate website. The concept of “more pages mean more rankings & traffic” is, though, largely flawed.

YouTube videos are granted elevated rankings in Google searches. Google own Youtube – & thus trust it & its policies implicitly.

Google Sandbox: Google will temporarily suppress search engine gains for new domains that have a massive amount of inbound links – this suppression will be lifted once Google is satisfied that you are a legitimate link destination with a natural-looking inbound link accumulation velocity or uptake.

Extremely Long Title Tags are a way of telling Google you didn’t bother reading their quality guidelines. They also hide keywords you should perhaps be targeting beyond the character limit that Google displays on the SERP & itself takes note of. According to the latest research, the optimal length of a title tag is 55 characters. Changes to SERP design mean that the maximum length for title tags is now 60 characters in most cases. If your title goes beyond 60 characters it gets truncated, & when it gets truncated the click-through rate (historically) is negatively impacted.
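For illustration only (the wording is hypothetical, not a recommended template), a title tag that sits comfortably inside the roughly 55-60 character SERP limit:

```html
<!-- 48 characters: short enough to display untruncated on the SERP. -->
<title>SEO Consultant London | Organic Search Marketing</title>
```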

As of December 2011, Google publicly stated they were seeking to further bury Parked Domains.

Gross Organic Click Through Rate (site-wide click through rate).

Only install an SEO plug in if you intend to configure & monitor yourself manually – otherwise you run the risk of inadvertently hurting your search engine performance by following the assumptions made in the plugin’s default settings.

Content is Still King

Content is a mechanism to talk to your customer. Effective communication via content increases customer retention as well as brand awareness.

Long Tail Search Traffic Via SEO

Google’s #1 search result takes an average of 36.4% of all clicks (see below). This leaves a huge amount of organic search traffic still to play for via SEO. I’ve attained 62,336 visitors who converted into enquirers of my services from long-tail keyword phrases in the last 12 months alone. People who search via descriptive long-tail phrases are on average more qualified buyers of products & services than those who search via short-tail terms. The conversion rates for long-tail keyword search traffic are almost always higher than those for short-tail keyword traffic.

Long Tail Search Traffic

When deploying an SEO keyword or content strategy that improves the quality of people’s lives via access to better content & information to solve their problems, Google’s algorithm changes should be welcomed, not feared. If you are giving better solutions to searchers’ needs for information then improvements in Google’s organic search algorithm should only ever help, not hinder you.

If your content is built around long-tail keyword terms that are designed to genuinely help the visitors to your website then you simply don’t have to worry about Google Panda, Penguin or any other animal for that matter!

 

Understanding the long-tail keyword search is about understanding the searcher’s intent. The intent of the person making any particular keyword search is more important than what the keyword happens to be. Once you appreciate the searcher’s intent, quality solutions or long-tail keyword rich content will be an inevitable result. Long-tail keywords should help a searcher solve a problem or query – if your page does this & does this well, it will likely rank prominently on Google. The answer, thus, to ‘Why isn’t my website ranking?’ should often be ‘Because it isn’t offering a quality solution to a widespread or niche informational problem’.

Be ultra aware of what you choose to include in your headlines/titles. They’re the first thing search engine users will see & they can lead to a click or a user going elsewhere. Only use long-tail keywords as exact headlines if they flow naturally as headlines when you read them.

Tips to Attain Long-Tail Search Traffic

  • Keywords in Titles.
  • Intelligent usage of Internal Linking within your own website.
  • Links from other websites that are deemed authoritative or of high trustworthiness within your sector.

Internal Linking

  • Link to relevant content.
  • Add breadcrumbs.
  • Only link to quality, useful & unique content. Don’t link for the sake of linking.
  • Use a variety of anchor text to link internally (you have autonomy over this after all).
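A hedged sketch of the breadcrumb idea from the list above – the page names & paths are hypothetical – showing a simple trail that marks the current page’s place in the site hierarchy:

```html
<!-- Each ancestor page is a link; the current page is plain text. -->
<nav aria-label="Breadcrumb">
  <a href="/">Home</a> &raquo;
  <a href="/services/">Services</a> &raquo;
  <span>SEO Consulting</span>
</nav>
```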

The Attention Web

Visitors spend most of their time focusing below the fold (the part of the page beneath the visible screen area when a page first loads).

 

In one study of 2 billion visitors via both social & non-social sources the following was ascertained:

 

SEO’s Smoke & Mirrors

SEOs often claim they know secrets for improving search rankings and traffic. This is almost always a lie – in the remainder of cases it’s merely delusional thinking. I won’t say that there are no little-appreciated secrets to SEO or ranking well on Google, but whatever these may now be you can be sure that Google are actively working to close down such manipulation of their search rankings & their quality. Any gains such unethical SEO tactics might (& it’s only ever a might) attain will be temporary in nature & possibly followed by penalties for manipulative tactics down the line. It’s just not worth it.

The reason to do keyword research is to find out what your potential audience are looking for. Then you make the most useful content possible to help them satisfy their queries & problems & hey presto – you’ll be seeing increases in traffic, rankings & revenues!

Useful Supplementary Content

How The Internet Took Over The World

The Google Panda Story

Wikidata: A Free Collaborative Knowledge Base

Google Algorithm Change Timeline

How Google May Answer Questions In Queries With Rich Content Results

Moz vs Majestic SEO vs Ahrefs vs SEMrush

Man and Van Boom Time Looming in London

How Google May Identify Navigational Queries And Resources