Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, webchats, and seminars. Major search engines provide information and guidelines to help with website optimization.[18][19] Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website and also provides data on Google traffic to the website.[20] Bing Webmaster Tools provides a way for webmasters to submit a sitemap and web feeds, allows users to determine the "crawl rate", and track the index status of web pages.

Users will occasionally come to a page that doesn't exist on your site, either by following a broken link or typing in the wrong URL. Having a custom 404 page that kindly guides users back to a working page on your site can greatly improve a user's experience. Your 404 page should probably have a link back to your root page and could also provide links to popular or related content on your site. You can use Google Search Console to find the sources of URLs causing "not found" errors.
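As a minimal sketch (the routes and page copy here are hypothetical), a custom 404 handler built on Python's stdlib http.server shows the key point: serve the friendly page while still returning the 404 status code, so search engines know the page doesn't exist:

```python
# Sketch of a custom 404 page that keeps the correct status code.
# Routes and HTML copy are hypothetical examples.
from http.server import BaseHTTPRequestHandler

CUSTOM_404 = b"""<html><body>
<h1>Page not found</h1>
<p>Try the <a href="/">home page</a> or our
<a href="/popular">popular articles</a>.</p>
</body></html>"""

class Handler(BaseHTTPRequestHandler):
    ROUTES = {"/": b"<html><body>Home</body></html>"}

    def do_GET(self):
        body = self.ROUTES.get(self.path)
        if body is None:
            # Friendly page, but still a real 404 -- never a "soft 404" (200).
            self.send_response(404)
            body = CUSTOM_404
        else:
            self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)
```

The important design choice is that the helpful content and the 404 status are not in tension: returning 200 with a "not found" page (a soft 404) can get error pages indexed.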
When referring to the homepage, a trailing slash after the hostname is optional since it leads to the same content ("https://example.com/" is the same as "https://example.com"). For the path and filename, a trailing slash would be seen as a different URL (signaling either a file or a directory), for example, "https://example.com/fish" is not the same as "https://example.com/fish/".
All sites have a home or "root" page, which is usually the most frequented page on the site and the starting place of navigation for many visitors. Unless your site has only a handful of pages, you should think about how visitors will go from a general page (your root page) to a page containing more specific content. Do you have enough pages around a specific topic area that it would make sense to create a page describing these related pages (for example, root page -> related topic listing -> specific topic)? Do you have hundreds of different products that need to be classified under multiple category and subcategory pages?

Tablet - We consider tablets as devices in their own class, so when we speak of mobile devices, we generally do not include tablets in the definition. Tablets tend to have larger screens, which means that, unless you offer tablet-optimized content, you can assume that users expect to see your site as it would look on a desktop browser rather than on a smartphone browser.


Since its April 2015 update, Google has taken the user's mobile experience into consideration in its search results. This means that when a user searches on a mobile device, Google's search results will favor websites that are mobile-friendly over those that aren't. If you want to capture that mobile search audience, you will need a mobile version of your website in addition to your desktop version.
In 2007, Google announced a campaign against paid links that transfer PageRank.[30] On June 15, 2009, Google disclosed that they had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Google Bot would no longer treat nofollowed links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting.[31] As a result of this change, the use of nofollow led to the evaporation of PageRank. To avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the use of iframes, Flash, and JavaScript.[32]
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and it instructs the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[47]
You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest this Webmaster Help Center guide on using robots.txt files.
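To illustrate (the disallowed paths below are hypothetical examples of cart and internal-search pages), a robots.txt like this can be checked with Python's stdlib urllib.robotparser, which models what a compliant crawler would do:

```python
# Parse a hypothetical robots.txt and check which URLs a compliant
# crawler is allowed to fetch.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /cart/
Disallow: /search
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Product pages remain crawlable; cart and internal-search URLs do not.
print(rp.can_fetch("*", "https://example.com/products/fish"))   # True
print(rp.can_fetch("*", "https://example.com/cart/checkout"))   # False
print(rp.can_fetch("*", "https://example.com/search?q=fish"))   # False
```

Note that robots.txt is advisory: it keeps well-behaved crawlers out, but it is not an access control mechanism, and blocked URLs can still appear in an index if linked from elsewhere (use the noindex meta tag for that).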
Structured data is code that you can add to your site's pages to describe your content to search engines, so they can better understand what's on your pages. Search engines can use this understanding to display your content in useful (and eye-catching!) ways in search results. That, in turn, can help you attract just the right kind of customers for your business.
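For example, a page about a local business might embed a JSON-LD block; the business details below are hypothetical, and Python's json module is used here only to produce the embeddable snippet:

```python
# Build a minimal JSON-LD structured data snippet for embedding in a page.
# The business name, URL, and phone number are hypothetical.
import json

structured_data = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Fish Shop",
    "url": "https://example.com/",
    "telephone": "+1-555-0100",
}

# JSON-LD is embedded in the page inside a script tag of this type.
snippet = (
    '<script type="application/ld+json">'
    + json.dumps(structured_data)
    + "</script>"
)
print(snippet)
```

The snippet can be placed anywhere in the page's HTML; search engines read the script block without executing it.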
There are a lot of definitions of SEO (spelled search engine optimisation in the UK, Australia and New Zealand, or search engine optimization in the United States and Canada), but organic SEO in 2019 is still mostly about getting free traffic from Google, the most popular search engine in the world (and almost the only game in town in the UK in 2019):
Google is looking for a “website that is well cared for and maintained”, so you need to keep content management systems updated and check for broken image links and HTML links. If you create a frustrating user experience through sloppy website maintenance – expect that to be reflected in some way with a lower quality rating. The October 2014 Google Panda update went after e-commerce pages that were optimised ‘the old way’ and are now classed as ‘thin content’.
Critics will point out the higher the cost of expert SEO, the more cost-effective Adwords becomes, but Adwords will only get more expensive, too. At some point, if you want to compete online, you’re going to HAVE to build a quality website, with a unique offering to satisfy returning visitors – the sooner you start, the sooner you’ll start to see results.
Creating high quality content takes a significant amount of at least one of the following: time, effort, expertise, and talent/skill. Content should be factually accurate, clearly written, and comprehensive. So, for example, if you describe your page as a recipe, provide a complete recipe that is easy to follow, rather than just a set of ingredients or a basic description of the dish.
Keyword analysis. From nomination, further identify a targeted list of keywords and phrases. Review competitive lists and other pertinent industry sources. Use your preliminary list to determine an indicative number of recent search engine queries and how many websites are competing for each keyword. Prioritize keywords and phrases, plurals, singulars and misspellings. (If search users commonly misspell a keyword, you should identify and use it). Please note that Google will try to correct the term when searching, so use this with care.
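The prioritisation step above can be sketched in a few lines; the keywords, search volumes, and competition counts below are entirely made-up illustrative values, and the demand-to-competition ratio is just one simple way to rank candidates:

```python
# Hypothetical keyword prioritisation: rank candidate terms by the ratio of
# (assumed) monthly searches to competing pages. All numbers are illustrative.
candidates = {
    # keyword: (monthly_searches, competing_pages)
    "fresh fish delivery": (1200, 40_000),
    "buy fish online":     (900, 25_000),
    "fish deliverey":      (50, 1_000),   # common misspelling, kept on purpose
}

ranked = sorted(
    candidates,
    key=lambda k: candidates[k][0] / candidates[k][1],
    reverse=True,
)
print(ranked)
# A niche term (even a misspelling) with little competition can outrank a
# high-volume term that is heavily contested.
```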
Smartphone - In this document, "mobile" or "mobile devices" refers to smartphones, such as devices running Android, iPhone, or Windows Phone. Mobile browsers are similar to desktop browsers in that they can render a broad set of the HTML5 specification, although their screen size is smaller and in almost all cases their default orientation is vertical.
Don’t break Google’s trust – if your friend betrays you, depending on what they’ve done, they’ve lost trust. Sometimes that trust has been lost altogether. If you do something Google doesn’t like, such as manipulating it in a way it doesn’t want, you will lose trust, and in some cases, lose all trust (in some areas). For instance, your pages might be able to rank, but your links might not be trusted enough to vouch for another site. DON’T FALL OUT WITH GOOGLE OVER SOMETHING STUPID
When I think ‘Google-friendly’ these days – I think a website Google will rank top, if popular and accessible enough, and won’t drop like a f*&^ing stone for no apparent reason one day, even though I followed the Google SEO starter guide to the letter… just because Google has found something it doesn’t like – or has classified my site as undesirable one day.

It is important you spread all that real ‘PageRank’ – or link equity – to your keyword- and phrase-rich sales pages, while keeping as much as possible for the rest of the site’s pages, so Google does not ‘demote’ pages into oblivion – or ‘supplemental results’, as we old-timers knew them back in the day. Again – this is slightly old school – but it gets me by, even today.


Note that Google is pretty good these days at removing any special characters you have in your page title – and I would be wary of trying to make your title or Meta Description STAND OUT using special characters. That is not what Google wants, evidently, and they do give you a further chance to make your search snippet stand out with RICH SNIPPETS and SCHEMA mark-up.
SEO is also about making your search engine result relevant to the user's search query so more people click the result when it is shown in search. In this process, snippets of text and metadata are optimized to ensure your snippet of information is appealing in the context of the search query, to obtain a high CTR (click-through rate) from search results.
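The snippet-facing elements are chiefly the page's title tag and meta description; as a small sketch (the copy is hypothetical), here is one way to assemble them, with a length check since overly long descriptions tend to be truncated in results:

```python
# Build the snippet-facing <head> elements. Title and description copy
# are hypothetical; ~160 characters is a common rule-of-thumb limit
# before search engines truncate the description.
title = "Fresh Fish Delivered Overnight | Example Fish Shop"
description = "Order sustainably caught fish before 2pm for next-day delivery."

assert len(description) <= 160, "description likely to be truncated in snippets"

head = (
    f"<title>{title}</title>\n"
    f'<meta name="description" content="{description}">'
)
print(head)
```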