QUOTE: “Sitemaps are an easy way for webmasters to inform search engines about pages on their sites that are available for crawling. In its simplest form, a Sitemap is an XML file that lists URLs for a site along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is, relative to other URLs in the site) so that search engines can more intelligently crawl the site.”
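For illustration, a minimal sitemap file listing a single URL might look something like the sketch below (the URL, date and values are placeholders – only the loc element is required by the protocol):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/sample-page/</loc>
    <lastmod>2019-01-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>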
While most of the links to your site will be added gradually, as people discover your content through search or other ways and link to it, Google understands that you'd like to let others know about the hard work you've put into your content. Effectively promoting your new content will lead to faster discovery by those who are interested in the same subject. As with most points covered in this document, taking these recommendations to an extreme could actually harm the reputation of your site.
QUOTE: “Over time, we’ve seen sites try to maximize their “search footprint” without adding clear, unique value. These doorway campaigns manifest themselves as pages on a site, as a number of domains, or a combination thereof. To improve the quality of search results for our users, we’ll soon launch a ranking adjustment to better address these types of pages. Sites with large and well-established doorway campaigns might see a broad impact from this change.” Google 2015
Google will INDEX perhaps 1000s of characters in a title… but I don’t think anyone knows exactly how many characters or words Google will count AS a TITLE TAG when determining RELEVANCE OF A DOCUMENT for ranking purposes. It is a very hard thing to try to isolate accurately with all the testing and obfuscation Google uses to hide its ‘secret sauce’. I have had ranking success with longer titles – much longer titles. Google certainly reads ALL the words in your page title (unless you are spamming it silly, of course).
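To make that concrete, here is a purely hypothetical example of the kind of longer, descriptive page title I mean – every word is readable by Google, whatever portion it ultimately displays or weights:

<title>SEO Tutorial for Beginners in 2019 – How To Optimise Page Titles, Meta Descriptions & Internal Links (Step-by-Step Guide)</title>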

The above information does not need to feature on every page – rather, it should feature on a clearly accessible page. However – with Google Quality Raters rating web pages on quality based on Expertise, Authority and Trust (see my recent making high-quality websites post) – ANY signal you can send to an algorithm or human reviewer’s eyes that you are a legitimate business is probably a sensible move at this time (if you have nothing to hide, of course).
QUOTE: “I’ve got a slide here where I show I think 8 different URLs you know every single one of these URLs could return completely different content in practice we as humans whenever we look at ‘www.example.com’ or just regular ‘example.com’ or example.com/index or example.com/home.asp we think of it as the same page and in practice it usually is the same page so technically it doesn’t have to be but almost always web servers will return the same content for like these 8 different versions of the URL so that can cause a lot of problems in search engines if rather than having your backlinks all go to one page instead it’s split between (the versions) and it’s a really big headache….how do people fix this well …. the canonical link element” Matt Cutts, Google
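The fix Matt Cutts refers to is a single line placed in the head of each duplicate version of the page, pointing at the one URL you want treated as the original – a minimal sketch using a placeholder domain:

<link rel="canonical" href="https://www.example.com/" />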
QUOTE: “The duration performance scores can be used in scoring resources and websites for search operations. The search operations may include scoring resources for search results, prioritizing the indexing of websites, suggesting resources or websites, protecting particular resources or websites from demotions, precluding particular resources or websites from promotions, or other appropriate search operations.” A Panda Patent on Website and Category Visit Durations
In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[22] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web, and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random web surfer.
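For reference, the simplified PageRank formula published in the original Brin and Page paper is shown below, where d is a damping factor (typically quoted as 0.85), T1…Tn are the pages linking to page A, and C(T) is the number of outbound links on page T – Google’s live systems have long since moved beyond this, of course:

PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )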
If you are improving user experience by focusing primarily on the quality of the MC (main content) of your pages, and avoiding – even removing – old-school SEO techniques, those certainly are positive steps towards getting more traffic from Google in 2019 – and the type of content performance Google rewards is, in the end, largely about a satisfying user experience.
QUOTE: “I think it makes sense to have unique content as much as possible on these pages, but it’s not going to, like, sink the whole website if you don’t do that – we don’t penalize a website for having this kind of deep duplicate content. Kind of going back to the first thing, though, with regards to doorway pages, that is something I definitely look into, to make sure that you’re not running into that. So in particular, if this is like all going to the same clinic, and you’re creating all of these different landing pages that are essentially just funneling everyone to the same clinic, then that could be seen as a doorway page or a set of doorway pages on our side, and it could happen that the web spam team looks at that and says this is not okay, you’re just trying to rank for all of these different variations of the keywords and the pages themselves are essentially all the same, and they might go there and say we need to take a manual action and remove all these pages from search. So that’s kind of one thing to watch out for, in the sense that if they are all going to the same clinic then probably it makes sense to create some kind of a summary page instead, whereas if these are going to two different businesses then of course that’s kind of a different situation – it’s not a doorway page situation.”
Ideally, you will have unique pages, with unique page titles and unique page meta descriptions. Google does not seem to use the meta description when ranking your page for specific keyword searches if it is not relevant, and unless you are careful you might end up just giving spammers free original text for their site and not yours, once they scrape your descriptions and put the text in the main content on their site. I don’t worry about meta keywords these days, as Google and Bing say they either ignore them or use them as spam signals.
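As a quick sketch of what ‘unique’ looks like in practice (the page and wording below are entirely made up), each page sets its own title and description in the head:

<head>
  <title>Handmade Oak Dining Tables | Example Furniture Co</title>
  <meta name="description" content="Browse handmade oak dining tables, built to order in the UK with free delivery and a 10-year guarantee.">
</head>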
SEOptimer is a free SEO Audit Tool that will perform a detailed SEO Analysis across 100 website data points, and provide clear and actionable recommendations for steps you can take to improve your online presence and ultimately rank better in Search Engine Results. SEOptimer is ideal for website owners, website designers and digital agencies who want to improve their own sites or those of their clients.
You can confer some of your site's reputation to another site when your site links to it. Sometimes users can take advantage of this by adding links to their own site in your comment sections or message boards. Or sometimes you might mention a site in a negative way and don't want to confer any of your reputation upon it. For example, imagine that you're writing a blog post on the topic of comment spamming and you want to call out a site that recently comment spammed your blog. You want to warn others of the site, so you include the link to it in your content; however, you certainly don't want to give the site some of your reputation from your link. This would be a good time to use nofollow.
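In the HTML, that simply means adding a rel attribute to the outbound link – a minimal sketch with a placeholder URL:

<a href="http://spammy-example-site.com/" rel="nofollow">this site</a>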
QUOTE: “In place of a pop-up try a full-screen inline ad. It offers the same amount of screen real estate as pop-ups without covering up any content. Fixing the problem depends on the issue you have for example if it’s a pop-up you’ll need to remove all the pop-up ads from your site but if the issue is high ad density on a page you’ll need to reduce the number of ads” Google, 2017
Creating high quality content takes a significant amount of at least one of the following: time, effort, expertise, and talent/skill. Content should be factually accurate, clearly written, and comprehensive. So, for example, if you describe your page as a recipe, provide a complete recipe that is easy to follow, rather than just a set of ingredients or a basic description of the dish.
In 2007, Google announced a campaign against paid links that transfer PageRank.[30] On June 15, 2009, Google disclosed that they had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting.[31] As a result of this change, the use of nofollow leads to the evaporation of PageRank. In order to avoid the above, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the use of iframes, Flash and JavaScript.[32]

Sometimes I think if your titles are spammy, your keywords are spammy, and your meta description is spammy, Google might stop right there – even Google probably wants to save bandwidth at some point. Putting a keyword in the description won’t take a crap site to number 1 or raise you 50 spots in a competitive niche – so why optimise for a search engine when you can optimise for a human? I think that is much more valuable, especially if you are in the mix already – that is, on page one for your keyword.
It is important you spread all that real ‘PageRank’ – or link equity – to your sales keyword / phrase rich sales pages, and that as much as possible remains with the rest of the site’s pages, so Google does not ‘demote’ pages into oblivion – or ‘supplemental results’, as we old-timers knew them back in the day. Again – this is slightly old school – but it gets me by, even today.
Domain authority is an important ranking phenomenon in Google. Nobody outside of Google knows exactly how Google calculates, ranks and rates the popularity, reputation, intent or trust of a website, but when I write about domain authority I am generally thinking of sites that are popular, reputable and trusted – all of which can be faked, of course.
Good news for web designers, content managers and search engine optimisers! Google clearly states, “If the website feels inadequately updated and inadequately maintained for its purpose, the Low rating is probably warranted”, although it does stipulate, again, that it’s horses for courses… if everybody else is crap, then you’ll still fly – but there are not many SERPs like that about these days.
Google and Bing each use a crawler (Googlebot and Bingbot respectively) that spiders the web looking for new links and pages to find. These bots might find a link to your homepage somewhere on the web and then crawl and index the pages of your site if all your pages are linked together. If your website has an XML sitemap, for instance, Google will use that to include that content in its index. An XML sitemap is INCLUSIVE, not EXCLUSIVE. Google will crawl and index every single page on your site – even pages left out of an XML sitemap.
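As an aside, a common way to point crawlers at your XML sitemap is a Sitemap line in robots.txt – a minimal sketch, assuming a sitemap at a placeholder URL:

User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml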

Google has a LONG list of technical requirements it advises you meet, on top of all the things it tells you NOT to do to optimise your website. Meeting Google’s technical guidelines is no magic bullet to success – but failing to meet them can impact your rankings in the long run – and the odd technical issue can actually severely impact your entire site if rolled out across multiple pages.


QUOTE: “The preferred domain is the one that you would like used to index your site’s pages (sometimes this is referred to as the canonical domain). Links may point to your site using both the www and non-www versions of the URL (for instance, http://www.example.com and http://example.com). The preferred domain is the version that you want used for your site in the search results.” Google, 2018
This helpful tool scans your backlink profile and turns up a list of contact information for the links and domains you'll need to reach out to for removal. Alternatively, the tool also allows you to export the list if you wish to disavow them using Google's disavow tool. (Essentially, the disavow tool tells Google not to take these links into account when assessing your site.)
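If you do go the disavow route, the file you upload to Google's disavow tool is plain text, one URL or domain per line, with # for comments – a minimal sketch using made-up domains:

# spammy links identified in the backlink audit
domain:spammy-directory-example.com
http://another-spam-example.com/links/page1.html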
I think the anchor text links in internal navigation are still valuable – but keep it natural. Google needs links to find and help categorise your pages. Don’t underestimate the value of a clever, keyword-rich internal link architecture, and be sure to understand, for instance, how many words Google counts as part of a link, but don’t overdo it. Too many links on a page could be seen as a poor user experience. Avoid lots of hidden links in your template navigation.
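A short sketch of the sort of natural, descriptive internal anchor text I mean (the URLs and labels are invented for illustration):

<nav>
  <a href="/services/seo-audit/">SEO audit services</a>
  <a href="/blog/keyword-research-guide/">Keyword research for beginners</a>
  <a href="/contact/">Contact us</a>
</nav>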
When would this be useful? If your site has a blog with public commenting turned on, links within those comments could pass your reputation to pages that you may not be comfortable vouching for. Blog comment areas on pages are highly susceptible to comment spam. Nofollowing these user-added links ensures that you're not giving your page's hard-earned reputation to a spammy site.
Our online SEO training courses teach you vital SEO skills you can apply immediately. Find out how to outrank your competition and become the best result through our training courses! Whether you’re a blogger, developer, online marketer, or own a business, big or small: we believe in SEO for everyone. We’ve got a great variety of courses, from Keyword Research, Site structure and SEO Copywriting to the more technical aspects of SEO: Structured data, multilingual SEO and Technical SEO training. There’s something for everyone, so be sure to check them out!