Google's Search Engine Optimization Ranking Factors are the elements of your web pages that Google's algorithm takes into consideration when deciding how websites rank in the search results for a given search query.
The key to success in Search Engine Optimization is patience. It is not a one-day affair; you will need to give the campaign time to grow and show its results.
Keyword selection can be the most critical step in search engine optimization. Use a range of different keywords rather than sticking to just one. A very good keyword research tool can be found at SEMScoop.
A URL (Uniform Resource Locator), more commonly known as a "web address", specifies the location of a resource (such as a web page) on the Internet. The URL also specifies how to retrieve that resource via a "protocol" such as HTTP, HTTPS, or FTP.
Keywords and phrases that appear in the page's URL or domain name help establish the relevance of a piece of content for a particular search query. Diminishing returns apparently set in as URLs become lengthier or as keywords are repeated.
A Title Tag should contain your target keyword. This tells both Google and searchers that your web page is relevant to this search query.
Title tags define the title of a document or page on your site, and often appear in the search results and as snippets for social media sites. Title tags should be no longer than 65 to 70 characters, depending on the characters used (Moz Tool). As with URLs, keywords closer to the beginning are widely theorized to carry more weight.
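For illustration, a minimal sketch of a title tag in a page's <head> (the wording is a placeholder, not a recommendation for your pages):
<head>
  <title>Blue Widget Reviews | Example Brand</title>
</head>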
Headings don't just make the content on a webpage better organized and easier to read; they are also critical to good technical SEO.
Keywords in heading tags carry strong weight in determining the relevant subject of a page. An H1 tag carries the most weight, H2 less, and so forth. Heading tags also improve accessibility for screen readers, and clear, descriptive headings reduce bounce rates according to various studies.
Use only one H1 tag per page; with more than one, Google may not give the heading tags any weight at all.
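As a hedged illustration, a simple heading hierarchy might look like this (the topics are placeholders):
<h1>Google Ranking Factors</h1>
<h2>On-Page Factors</h2>
<h3>Title Tags</h3>
<h2>Off-Page Factors</h2>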
Keyword density is a percentage calculated by dividing the number of times a keyword occurs in the content of a webpage by the total word count.
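For example, if a keyword appears 5 times in a 500-word article, the keyword density is 5 ÷ 500 = 1%.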
Keyword density isn't important for SEO, as it's no longer thought to be a ranking factor. Modern search engine algorithms are smart enough to understand what a page is about, and how well it matches search intent, in other ways. And keyword density, although referenced in Google patents, is almost certainly just a simplified version of the concept behind TF-IDF.
Google’s John Mueller confirms bolding important text in a paragraph can help your site’s SEO, as it allows Google to understand the content better.
Keywords in bold, italic, underline, or larger fonts carry more weight in determining the relevant subject of a page, but less weight than words appearing in a heading: "Matches in text that is of larger font or bolded or italicized may be weighted more than matches in normal text."
Keyword Proximity refers to the distance between two words or phrases, or how close keywords are to each other within a body of text. The closer the two keywords are to each other, the higher the weight for that phrase.
There are no official rules to follow when considering keyword proximity, but it is something you should consider when designing your on-page SEO strategy.
Also called alt tags and alt descriptions, alt text is the written copy that appears in place of an image on a webpage if the image fails to load on a user's screen.
This is a complete HTML image tag:
<img src="image.jpg" alt="image description" title="image tooltip">
The alt and title attributes of an image are commonly referred to as the 'alt tag' or 'alt text' and the 'title tag' (technically they're not tags; they're attributes). The alt attribute is used to describe an image to search engines and to users who are unable to view the image. This establishes relevance, especially for Image Search, while also improving accessibility.
Think of TF-IDF, or Term Frequency-Inverse Document Frequency, as keyword density with context. It measures how relevant a word is to a document within a larger collection (corpus). The weight increases proportionally with the number of times a word appears in the document, but is offset by how frequently that word appears across the corpus. This serves to ignore words like "the" in the computation, and establishes how many times a literate human would probably mention a phrase like "Google Ranking Factors" in a single document covering that topic.
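As a simplified, illustrative calculation (the figures are made up): if "Google Ranking Factors" appears 3 times in a 100-word page, its term frequency is 3 ÷ 100 = 0.03; if the phrase appears in only 10 of 1,000,000 indexed documents, its inverse document frequency is log(1,000,000 ÷ 10) = 5; the TF-IDF weight is then 0.03 × 5 = 0.15. A word like "the" appears in nearly every document, so its IDF, and therefore its weight, approaches zero.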
There's a natural trend in how we write English: the most important information tends to come first. This applies to sentences, paragraphs, pages, and HTML tags, and Google seems to apply it everywhere as well, giving more weight to content that appears earlier and more visibly. It's an entirely reasonable question to ask whether there's a ranking benefit to pushing more of the content above the fold; at the very least, the user experience is better when more content is above the fold. This is also a function of the Page Layout algorithm, which gives a lot of preference to what appears above the fold on your site.
A powerful ranking bonus is attributed when a keyword exactly matches a domain name.
Keywords in domain names are indeed official ranking factors. But after multiple algorithm updates, they are not as important as they used to be. They may also be dangerous. In 2012, Google started to penalize exact match domain names (or EMDs) because of their spammy intent.
This is a really common question that comes up for the new top-level domains.
In short, no. You do not get a special bonus from having a keyword in your top-level domain, such as .biz or .sports.
In Search Engine Journal's opinion, Google likes exact match domains so long as they are not spammy. With many alternative domain extensions available to buy (e.g., .business, .company, .biz), you might be tempted to grab them to protect your new investment or brand. However, few sites other than .com domains seem to rank in the SERPs for competitive keywords.
This is somewhat confusing, since a brand-new domain name may also receive a temporary boost. Older domains are given a little more trust, which Google emphasizes is pretty minor (while, in the process, acknowledging that it exists). Speculatively, this may reward sites that have had a chance to prove they are not part of short-term black-hat projects.
The ideal way to separate keywords in a URL is with hyphens. Underscores can work, but are not as reliable, as they can be confused with programming variables. Running words together in a URL is likely to cause them not to be seen as separate keywords, preventing any keyword-in-URL bonus. Aside from this, simply using hyphens will not make a site rank higher.
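For illustration (the URLs are hypothetical): example.com/blue-widget-reviews is the safest form; example.com/blue_widget_reviews may still be parsed, but less reliably; and example.com/bluewidgetreviews will likely be read as a single word.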
Google directly states in one of its patents that longer domain registration terms predict the legitimacy of a domain. Speculatively, those who engage in web spam understand that it's a short-term, high-volume game of burn/rinse/repeat and don't purchase domains for longer than they need.
But Google has always said: Make great content, don't worry nearly as much about how many years your domain is registered for.
Subdomains are seen as separate websites by Google, which has indirect implications for many other factors. It means you can build unique authority for each of the subdomains you use, and subdomains can actually be beneficial to your SEO efforts. Keywords used in subdomains go hand-in-hand with SEO: by associating your site with keywords relevant to your product or service, potential clients will be able to find you more easily when they search for those keywords.
HTTPS uses TLS (SSL) to encrypt normal HTTP requests and responses, making it safer and more secure. A website that uses HTTPS has https:// in the beginning of its URL instead of http://, like https://rshweb.com
HTTPS (SSL) was officially announced as a new positive ranking factor in 2014, regardless of whether the site processes sensitive user input such as credit cards.
Categorical information architecture has been an SEO discussion point for a long time, as Google appears to analyze topic coverage across entire sites. The exact ranking implications of this are unclear, but Google now supports marking up subdirectory paths as breadcrumb structured data and will, at the very least, use it to display breadcrumbs on the results page, surfacing more of your pages.
"Country Code Top Level Domains" (such as .uk and .br) are a strong signal of relevance for a specific, regional market. Some gTLDs (such as .io and .co) are treated like ccTLDs by Google, however, dubbed generic - "gccTLDs".
Content is a critical element in an effective SEO Strategy. No other factor will keep website visitors on your website. It is just a matter of what subject appeals to them. Most ranking signals rely on content, preferably compelling content.
This is technically "Fresh content when query deserves freshness". The term, Query Deserves Freshness (often shortened to QDF), refers to search queries that would benefit from more current content. This does not apply to every query, but it applies to quite a lot, especially those that are informational in nature. These SEO benefits are just one more reason that brand publishers tend to be very successful.
A Google webpage states: "For some queries, older documents may be more favorable than newer ones." It goes on to describe a scenario where a search result set may be re-ranked by the average age of documents in the retrieved results before being displayed. Any content, no matter how good, grows stale over time.
While spelling and grammar are not a direct ranking signal, they do play a part in your SEO. It is a trust factor. Amit Singhal stated "these are the kinds of questions we ask" regarding spelling/grammar when defining "quality content". The first Panda update made this seem to matter a lot. Directly or indirectly, dozens of content-related factors are clearly affected by spelling/grammar.
This SEO theory became widespread in 2010, suggesting that more content and less code is good. Here's what we know:
Page speed is a confirmed factor; Google has used it as a ranking factor since 2010. Google's PageSpeed Insights tool stresses that even a 5 KB reduction in payload size will help. Certain subtle code mistakes can cause devaluations and penalties. So, at minimum, there is an indirect correlation.
As keyword density is now virtually a non-factor, a basic understanding of phrase-based indexing tells us that if you cover a topic thoroughly and elaborately, you stand a far better chance of ranking than with generic content that just happens to drop a lot of keywords. One Google patent describes this as the "identification of related phrases and clusters of related phrases".
A physical address is theorized to be a mark of legitimacy in standard search rankings. This is loosely supported by the fact that Google looks at citations for local SEO (also known as Google Maps SEO) as mentions of Name, Address, and Phone (sometimes shortened to "NAP") together. "Highly satisfying contact information" is also something that Google quality control auditors are instructed to seek out.
Website accessibility is likewise theorized to be a mark of legitimacy (also see Designing Tips For Accessible Websites). This appears to have originated from, or is at least best supported by, a document called Google's Quality Rater Guidelines, in which Google asks quality control auditors to search for "highly satisfying contact information."
Rich media, on top of drawing more traffic from in-line image and video search, has long been considered a component of "high quality, unique content". Video appeared to be the deciding factor with Panda 2.5.
Mobile-friendly websites are given a significant ranking advantage. This stems from the fact that a greater percentage of searches are conducted on mobile devices and the understanding that Google aims to serve pages with the best user experience. Google uses the mobile version of the site to determine its rankings, so if your mobile site is not up to scratch or presents less content, you will have difficulty getting good rankings. If you don't yet have an adequate mobile view of your site, you had best build a fully functioning one, preferably as a responsive design. Google has a great getting-started guide to help you improve your mobile site.
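If you do go the responsive route, the standard viewport meta tag is the usual starting point, as in this minimal sketch:
<meta name="viewport" content="width=device-width, initial-scale=1">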
It is said that the Title Tag is the single most important ranking factor. This tells the search engines what each page is about and which keywords to focus on. It appears as the clickable headline for the search result and is important for user experience, SEO, and social sharing. The title tag of a web page is meant to be an accurate and concise description of a page's content.
Titles should be 60-65 characters or less, including spaces, and you should have a unique, well-crafted title tag for every page. A Title Tag Preview Tool can help you check how yours will display.
A meta description is an attribute within your meta tags that helps describe your page. This snippet of text may appear in the search engine results under your headline, though sometimes search engines will pull a snippet of text from the main body copy of the page instead. A good meta description functions as a search ad; considering how many Google Ads agencies exist almost entirely on A/B testing ad copy, the marketing value here can't be overstated. Although keywords used in meta descriptions were once widely considered a direct ranking factor, they are not anymore.
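A minimal sketch of a meta description tag (the wording is a placeholder):
<meta name="description" content="A plain-language summary of the page, written to earn the click, in roughly 150-160 characters.">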
Google never rewarded meta keywords, but other search engines, scrapers, and crawlers sometimes did. Missing from the source article: the late-2000s explosion of Reddit-like social bookmarking sites did leverage meta keywords. We do not recommend using this meta tag at all.
Schema.org is a joint project between Google, Yahoo, Bing, and Yandex to help search engines understand logical data entities rather than just keywords.
Currently, use of structured data can enhance your search listings (rich results) in a massive variety of scenarios. There are also theories that schema.org markup can improve traditional search rankings by way of clearer entity salience.
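As a hedged illustration, a minimal JSON-LD snippet for an article might look like this (the headline, author, and date are placeholders):
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Google Ranking Factors",
  "author": { "@type": "Person", "name": "Jane Example" },
  "datePublished": "2023-01-15"
}
</script>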
Although it's possible for outbound links to "leak PageRank", websites are not supposed to be dead ends. Google rewards authoritative outbound links to "good sites". To quote the source: "parts of our system encourage links to good sites."
Given that Google analyzes your inbound links for authority, relevance, and context, it seems reasonable that outbound links should be relevant as well as authoritative. This would likely relate to the Hilltop algorithm's method for identifying topical experts. Google is also issuing manual actions for the reverse scenario.
Also see the Whiteboard Friday episode on external linking, and Reboot's study on outgoing links used as ranking signals.
The anchor text of a link tells the user where that link leads. It is an important component of navigation within your website and, when not abused, helps establish the relevance of a particular piece of content, unlike vague alternatives such as "click here" (which you should never use).
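For illustration (the URL is hypothetical), compare a descriptive link, <a href="/blue-widget-reviews">blue widget reviews</a>, with a vague one, <a href="/blue-widget-reviews">click here</a>: the first tells both users and search engines what the destination covers; the second tells them nothing.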
Internal links also connect your content and give Google an idea of the structure of your website. They can establish a hierarchy on your site, and the right internal linking strategy can boost your SEO rankings. See Google's SEO Starter Guide for a few more SEO tips.
Backlinks are links pointing from other websites to your website. The overall concept is that if you can earn backlinks from authoritative websites, it tells search engines that your website is also good.
Search engines like Google see backlinks as votes of confidence. Generally speaking, the more votes your web pages have, the more likely they are to rank for relevant search queries. Studies on link-based ranking factors seem to find the same thing: the number of backlinks from unique websites (referring domains) correlates strongly with organic search traffic.
Sitemaps can be useful, though not required, for the purpose of getting more pages of your site into the Google index. Even though search engines can technically find your URLs without sitemaps, by including your pages, you are indicating that you consider them to be quality landing pages.
An XML sitemap will not improve your rankings within Google. This comes straight from Google and is confirmed by various studies.
But we feel XML sitemaps are beneficial for all websites. Every website needs Google to be able to find its essential pages easily and to know when they were last updated.
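As a minimal sketch (the URL and date are placeholders), a single-entry XML sitemap looks like this:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/blue-widget-reviews</loc>
    <lastmod>2023-01-15</lastmod>
  </url>
</urlset>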