Many think that Google won’t allow new websites to rank well for competitive terms until the web address “ages” and acquires “trust” in Google – I think this depends on the quality of the incoming links. Sometimes your site will rank high for a while, then disappear for months. A “honeymoon period” to give you a taste of Google traffic, perhaps, or a period to better gauge your website quality from an actual user perspective.
By relying so much on factors such as keyword density which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals.[13] Since the success and popularity of a search engine is determined by its ability to produce the most relevant results to any given search, poor quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate. In 2005, an annual conference, AIRWeb (Adversarial Information Retrieval on the Web), was created to bring together practitioners and researchers concerned with search engine optimization and related topics.[14]
In addition to processing the text content on your web pages, Google will also try to figure out what your images are about as well. Alt Text is a short description that you can customize for each image to let Google know what the image is about. Setting short, descriptive Alt Texts through our site builder will help Google better associate your web pages with the search terms you're trying to target.
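In markup terms, Alt Text is the `alt` attribute on an image tag. A minimal sketch (the file name and wording here are hypothetical examples, not values from any real site):

```html
<!-- A short, descriptive alt attribute tells Google (and screen readers)
     what the image shows: -->
<img src="bike-repair.jpg" alt="Mechanic adjusting a bicycle's rear derailleur">

<!-- Avoid keyword-stuffed alt text like
     alt="bike repair bikes repairs cheap bike shop" -->
```

Keep it short and literal: describe what is actually in the image rather than listing the keywords you want to rank for.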
QUOTE: “(Google Panda) measures the quality of a site pretty much by looking at the vast majority of the pages at least. But essentially allows us to take quality of the whole site into account when ranking pages from that particular site and adjust the ranking accordingly for the pages. So essentially, if you want a blunt answer, it will not devalue, it will actually demote. Basically, we figured that site is trying to game our systems, and unfortunately, successfully. So we will adjust the rank. We will push the site back just to make sure that it’s not working anymore.”  Gary Illyes – Search Engine Land
Sometimes I think if your titles are spammy, your keywords are spammy, and your meta description is spammy, Google might stop right there – even Google probably wants to save bandwidth at some point. Putting a keyword in the description won’t take a crap site to number 1 or raise you 50 spots in a competitive niche – so why optimise for a search engine when you can optimise for a human? I think that is much more valuable, especially if you are in the mix already – that is, on page one for your keyword.
You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest this Webmaster Help Center guide on using robots.txt files13.
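A robots.txt file is a plain-text file served from the root of each (sub)domain. A minimal sketch with hypothetical paths (note that robots.txt blocks crawling, not necessarily indexing – a blocked URL can still appear in results if other sites link to it):

```text
# https://example.com/robots.txt (example.com is a placeholder domain)
User-agent: *
Disallow: /admin/
Disallow: /search-results/

# A subdomain needs its own file, e.g. https://shop.example.com/robots.txt
```

The `User-agent: *` line applies the rules to all crawlers; `Disallow` lists path prefixes they should not fetch.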

Anchor Text is the visible text on a link or button that you can click on. Google uses this text to determine what the linked page is about. For example, if you set "Learn More About Bike Repairs" as the Anchor Text for the link that goes to your repairs page, it tells Google that the linked page is most likely about bike repairs. When building your site with our site builder, make your Anchor Text as descriptive as possible and avoid using generic Anchor Text like "Click Here" as this doesn't help Google categorize your pages whatsoever.
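In HTML, the Anchor Text is simply the text between the opening and closing tags of a link. A quick illustration (the `/repairs` URL is a hypothetical example):

```html
<!-- Generic anchor text tells Google nothing about the target page: -->
<a href="/repairs">Click Here</a>

<!-- Descriptive anchor text associates the target page with its topic: -->
<a href="/repairs">Learn More About Bike Repairs</a>
```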
While Google is on record as stating these quality raters do not directly influence where you rank (without more senior analysts making a call on the quality of your website, I presume) – there are some things in this document, mostly of a user experience (UX) nature, that all search engine optimisers and webmasters of any kind should note going forward.
A navigational page is a simple page on your site that displays the structure of your website, and usually consists of a hierarchical listing of the pages on your site. Visitors may visit this page if they are having problems finding pages on your site. While search engines will also visit this page, getting good crawl coverage of the pages on your site, it's mainly aimed at human visitors.
Google and Bing use crawlers (Googlebot and Bingbot) that spider the web looking for new links to follow. These bots might find a link to your homepage somewhere on the web and then crawl and index the pages of your site if all your pages are linked together. If your website has an XML sitemap, for instance, Google will use that to include that content in its index. An XML sitemap is INCLUSIVE, not EXCLUSIVE. Google will crawl and index every single page on your site – even pages outwith an XML sitemap.
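For reference, an XML sitemap follows the sitemaps.org protocol and is just a list of URLs you want crawlers to know about. A minimal sketch (the domain and dates are placeholder examples):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2020-01-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/bike-repairs/</loc>
  </url>
</urlset>
```

`<loc>` is the only required child of each `<url>`; `<lastmod>` is optional. As noted above, listing a page here suggests it for crawling but does not stop Google crawling pages you left out.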
I do not obsess about site architecture as much as I used to… but I always ensure the pages I want indexed are all reachable from a crawl from the home page – and I still emphasise important pages by linking to them where relevant. I always aim to get THE most important exact match anchor text pointing to the page from internal links – but I avoid abusing internals and avoid overtly manipulative internal links that are not grammatically correct, for instance.

We expect advertisements to be visible. However, you should not let the advertisements distract users or prevent them from consuming the site content – for example, advertisements, supplementary content, or interstitial pages (pages displayed before or after the content you are expecting) that make it difficult to use the website. Learn more about this topic.38
What about other search engines that use them? Hang on while I submit my site to those 75,000 engines first [sarcasm!]. Yes, ten years ago, early search engines liked looking at your meta keywords. I’ve seen OPs in forums ponder which is the best way to write these tags – with commas, with spaces, limited to how many characters. Forget about meta keywords tags – they are a pointless waste of time and bandwidth.

In 2007, Google announced a campaign against paid links that transfer PageRank.[30] On June 15, 2009, Google disclosed that they had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting.[31] As a result of this change, the usage of nofollow led to evaporation of PageRank. To avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the use of iframes, Flash, and JavaScript.[32]
QUOTE: “Another problem we were having was an issue with quality and this was particularly bad (we think of it as around 2008 2009 to 2011) we were getting lots of complaints about low-quality content and they were right. We were seeing the same low-quality thing but our relevance metrics kept going up and that’s because the low-quality pages can be very relevant. This is basically the definition of a content farm in our in our vision of the world so we thought we were doing great our numbers were saying we were doing great and we were delivering a terrible user experience and turned out we weren’t measuring what we needed to so what we ended up doing was defining an explicit quality metric which got directly at the issue of quality it’s not the same as relevance …. and it enabled us to develop quality related signals separate from relevant signals and really improve them independently so when the metrics missed something what ranking engineers need to do is fix the rating guidelines… or develop new metrics.” SMX West 2016 – How Google Works: A Google Ranking Engineer’s Story (VIDEO)
Inclusion in Google's search results is free and easy; you don't even need to submit your site to Google. Google is a fully automated search engine that uses web crawlers to explore the web constantly, looking for sites to add to our index. In fact, the vast majority of sites listed in our results aren't manually submitted for inclusion, but found and added automatically when we crawl the web. Learn how Google discovers, crawls, and serves web pages.3