Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages proved unreliable, however, because the webmaster's choice of keywords in the meta tag could be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engines, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]

The SEO starter guide describes much of what your SEO will do for you. You don't need to know this guide well yourself if you're hiring a professional to do the work, but it is useful to be familiar with these techniques so that you can tell when an SEO wants to use a technique that is not recommended or, worse, strongly discouraged.


So you have a new site. You fill your home page meta tags with the 20 keywords you want to rank for – hey, that's what optimisation is all about, isn't it? You've just told Google by the third line of text what to filter you for. The meta name="keywords" tag was originally intended for words that weren't on the page itself but would help classify the document.
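For illustration, this is roughly what a stuffed keywords meta tag looks like in a page's <head> (the keyword list here is invented):

    <head>
      <meta name="keywords" content="widgets, cheap widgets, buy widgets, best widgets, widget shop">
      <title>Example Widget Shop</title>
    </head>

Google has said publicly for years that it ignores the keywords meta tag for web ranking, so stuffing it gains you nothing at best.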

QUOTE: “Supplementary Content contributes to a good user experience on the page, but does not directly help the page achieve its purpose. SC is created by Webmasters and is an important part of the user experience. One common type of SC is navigation links which allow users to visit other parts of the website. Note that in some cases, content behind tabs may be considered part of the SC of the page.” Google Search Quality Evaluator Guidelines 2017
SEO is not an appropriate strategy for every website; depending on the site operator's goals, other Internet marketing strategies can be more effective, such as paid advertising through pay-per-click (PPC) campaigns. Search engine marketing (SEM) is the practice of designing, running, and optimizing search engine ad campaigns.[56] Its difference from SEO is most simply put as the difference between paid and unpaid priority ranking in search results. SEM's purpose concerns prominence more than relevance; website developers should treat SEM as critical to visibility, because most searchers go straight to the primary listings of their results.[57] A successful Internet marketing campaign may also depend on building high-quality web pages to engage and persuade, setting up analytics programs so that site owners can measure results, and improving a site's conversion rate.[58]

In November 2015, Google released a full 160-page version of its Search Quality Rating Guidelines to the public,[59] which revealed a shift in its focus towards "usefulness" and mobile search. In recent years the mobile market has exploded, overtaking desktop use: in October 2016, StatCounter analyzed 2.5 million websites and found that 51.3% of pages were loaded on a mobile device.[60] Google has capitalized on this by encouraging websites to use the Mobile-Friendly Test in Google Search Console, which lets site owners check how mobile-friendly their pages are.
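One of the most basic things the Mobile-Friendly Test checks for is a viewport declaration; as a minimal sketch, a responsive page normally carries something like this in its <head>:

    <meta name="viewport" content="width=device-width, initial-scale=1">

Without it, mobile browsers render the page at a desktop width, and the test will typically flag the page as not mobile-friendly.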
A poor 404 page, and user interaction with it, can only lead to a 'poor user experience' signal at Google's end, for a number of reasons. I will highlight a poor 404 page in my audits and actually programmatically look for signs of this issue when I scan a site. I don't know whether Google rates your site that way, e.g. algorithmically determining if you have a good 404 page, whether it is a UX factor to be taken into consideration further down the line, or whether the advice is purely to get you thinking about 404 pages in general, to help prevent Google wasting resources indexing crud pages and presenting poor results to searchers. I think, rather, that any rating would be second-order scoring built on data from user activity on the SERPs – stuff we as SEOs can't see.
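As a sketch of what a helpful 404 page might look like (the markup and links here are assumptions, not a template Google prescribes), the essentials are a genuine 404 HTTP status plus routes back to real content:

    <!-- Serve this with an HTTP 404 status code, not a 'soft 404' (200 OK) -->
    <!DOCTYPE html>
    <html>
      <head><title>Page not found</title></head>
      <body>
        <h1>Sorry, we can't find that page</h1>
        <!-- Give users (and crawlers) a way back to content that exists -->
        <p>Try <a href="/">the home page</a> or <a href="/search">search the site</a>.</p>
      </body>
    </html>

Returning 200 OK for missing URLs ('soft 404s') is exactly the sort of thing that wastes crawl resources on crud pages.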
QUOTE: “I’ve got a slide here where I show I think 8 different URLs you know every single one of these URLs could return completely different content in practice we as humans whenever we look at ‘www.example.com’ or just regular ‘example.com’ or example.com/index or example.com/home.asp we think of it as the same page and in practice it usually is the same page so technically it doesn’t have to be but almost always web servers will return the same content for like these 8 different versions of the URL so that can cause a lot of problems in search engines if rather than having your backlinks all go to one page instead it’s split between (the versions) and it’s a really big headache….how do people fix this well …. the canonical link element” Matt Cutts, Google
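In practice, the fix Cutts describes is a single line in the <head> of every duplicate version of the page, pointing at the one URL you want the links consolidated on (example.com is the placeholder from his own example):

    <link rel="canonical" href="https://www.example.com/">

Each of the URL variants he lists can carry this same element, and search engines will then treat www.example.com as the preferred version.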
SEO stands for 'Search Engine Optimization'. It's the practice of optimizing your web pages to help them reach a high position in the search results of Google or other search engines. SEO focuses on improving rankings in the organic – i.e. non-paid – search results. If you have a website and you want more traffic, SEO should be part of your marketing efforts. Here, I'll explain what SEO is and how we approach it at Yoast.