250+ SEO Essentials Definitions

To rank at the top of search engine results, you must understand how search engines work, as well as the meaning and definition of the most useful SEO terms, phrases, and jargon in this industry. This glossary, with 250 entries and their meanings, is your new best friend.

This glossary is designed to serve one primary purpose: helping you learn SEO. But before digging deep into Search Engine Optimization, you must be familiar with the most important SEO terms and their meanings. Because SEO best practices are constantly changing and evolving, I have excluded the terms and phrases that are no longer in use. My primary focus is only on those terms that are valid at present and can help you learn SEO strategies and techniques.

To make this SEO glossary more helpful for you, I’ll update it regularly. So, I suggest bookmarking this page using the shortcut CTRL + D.


  • Above the fold

    In web design and online marketing, the term above the fold describes the area of a website that visitors can see without scrolling. It is the first picture the visitor gets of the page. Because of its high visibility, the content you put in this upper portion of the page should be the most important for achieving your business goals. It must immediately capture users’ attention and show them what they are looking for, so that they don’t have to visit another website. Too many ads here may negatively impact your ranking.
  • Accelerated Mobile Page (AMP)

    Accelerated Mobile Pages (AMP) is a Google project designed to make websites load faster on slow networks, whilst still including rich content such as animations, ads and videos. The pages are powered by the AMP HTML framework. Because it's most often used by publishers, implementing AMP for a blog or a news website is a good SEO practice.
  • Adwords

    See Google Adwords
  • Algorithm

    An algorithm is a mathematical process or formula to execute a set of functions. Search engines use algorithms to discover pages on the internet and rank them in the most relevant way for the search queries.
  • Algorithmic penalty

    An algorithmic penalty is applied when Google automatically reduces a website's ranking. It can be difficult to notice unless you are closely monitoring your rankings. If it does occur, you will need to find the root issue and resolve it.
  • Alt Attribute (alt tag)

    This is the alternative text, encoded into a page’s Hypertext Markup Language (HTML) or Extensible Hypertext Markup Language (XHTML), which is displayed if an image or another element cannot be rendered in the browser. Alt text helps search engines understand what each image means and how the information it conveys fits with the rest of the content on the page.
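    As an illustration, alt text is added directly to the img element (the file name and description below are hypothetical):

```html
<!-- The alt attribute describes the image for search engines
     and for users whose browsers cannot render it. -->
<img src="red-running-shoes.jpg"
     alt="Pair of red running shoes on a wooden floor"
     width="600" height="400">
```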
  • Analytics

    The information resulting from a systematic analysis of data or statistics, such as the number of visitors to a site, where they landed, where they originated and where and when they exited.
  • Anchor text

    The clickable part of a link, often a keyword phrase, but it can also be a Uniform Resource Locator (URL). Anchor text is also what tables of contents and menus use to link to different sections of a page. The words contained in the anchor text can influence the ranking of the target page on search engines.
  • Authority

    Authority in terms of SEO refers to the authority of a website, which is measured by a variety of metrics and different services. A website that has attributes such as good rankings, strong backlinks and popularity, would be considered to have a high authority.
  • Authority site

    An authority site is a very high-quality website that is trusted and respected by industry experts, other websites and search engines. Such websites usually have many incoming links from other hub/expert sites. Authority sites enjoy high rankings on search engines, and backlinks from authority sites also improve the ranking of a website. Wikipedia is an example of an authority site.


  • B to B

    Acronym for business-to-business. B2B SEO is a more complex and slightly longer process. The services are more in-depth and therefore more expensive. Normally the audience consists of professionals and executives.
  • B to C

    Acronym for business-to-consumer. This tends to be simpler and quicker than B2B. The services rendered and the products sold also tend to be less expensive and are purchased by individual consumers.
  • Backlink

    A backlink is a hyperlink to a page or site from any other page or site. Backlinks have a significant influence on the success of your website and are extremely important in your site’s ranking. Acquiring backlinks is usually a standard part of a full SEO campaign.
  • Baidu

    Baidu is the most popular search engine in China. It was founded in 2000 and has a Chinese market share of just over 76%. Baidu offers very similar services to Google.
  • BERT

    Acronym for Bidirectional Encoder Representations from Transformers. It’s a method of natural language processing (NLP) used by Google Search since October 2019.
  • Bing

    Bing is Microsoft's search engine and has a 6.2% US market share. It replaced Microsoft Live Search in 2009. Since 2010, Bing has been powering Yahoo Search.
  • Black hat

    Black hat refers to SEO practices which are considered unethical and try to cheat the search engine guidelines. Black hat SEO frequently uses combinations of (among other techniques) keyword stuffing, sneaky redirects, poor quality content, cloaking and bulk site creation methods to try to rank highly in the search engine results. Black hat strategies may help in short term SEO gain, but, in the long run, they drastically harm a website’s rank. At the worst, such practice can lead to manual action of removing a website from search engine’s index.

    See also Grey hat and White hat

  • Blog

    A blog is a website or part of a website that contains informational writings on a specific topic, usually in informal or personal style. It may be otherwise defined as the writing and reading space on the internet. A blogger is a writer on the internet having specialized knowledge in a certain field of interest. Blogs help in improving SEO of websites.
  • Bot

    A bot (alternately spider, crawler, etc.) is a software application that systematically browses the internet for new web pages and updates. Google's main bot for web crawling is Googlebot.
  • Bounce rate

    This metric refers to the percentage of users who visit a website and then leave without viewing any other pages. For example, if 8 in every 10 users leave a website after viewing only one page, the site has an 80% bounce rate. A high bounce rate can negatively affect a website’s ranking.
  • Branded keywords

    A branded keyword is a search phrase that includes a company's brand name, exactly or in variation. Examples: Google Search Console, BlackBerry mobile, etc. There is a growing consensus that spending advertising budget on promoting branded keywords is not worthwhile, as good SEO will rank them highly organically.
  • Breadcrumbs

    A breadcrumb is a horizontal bar that helps users understand where they are on a site and how to get back to the root areas and categories of a resource. Breadcrumbs help make a website user-friendly and also help with SEO: Google treats breadcrumbs as a search result enhancement.
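    A breadcrumb trail is typically just a list of links near the top of a page. A minimal sketch (the page names and URLs here are hypothetical):

```html
<!-- Visible breadcrumb trail showing the user's location in the site -->
<nav aria-label="Breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/blog/">Blog</a> &gt;
  <span>SEO Glossary</span>
</nav>
```

Google also supports BreadcrumbList structured data, which lets the trail appear directly in search results.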
  • Broken link

    A broken link is a link (external or internal) on a web page that doesn’t work correctly. If a user clicks it, they land on a 404 page or another type of error page. An improperly written URL, removing the destination page, or changing the destination URL without implementing a proper redirect (such as a 301 redirect) are the main causes of broken links. Broken links are detrimental to your site’s search engine ranking, so fix or remove them as soon as possible.


  • Cache

    The cache is a mechanism that temporarily stores web content, such as images, in order to reduce loading time for future visits to the same site. In SEO, a cached page is a snapshot of a webpage as it appeared when a search engine last visited it. Any update made after the search engine’s last crawl will not be visible in the cached page.
  • Canonical URL

    When a single page is accessible through several URLs, the preferred URL for search engine indexing is called the canonical URL. The rel="canonical" link element can be used to specify the canonical URL of a webpage and avoid a duplicate content penalty by search engines.
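    For example, if the same article is reachable at several URLs, the preferred one can be declared in the head of each duplicate (the URL below is hypothetical):

```html
<!-- Tells search engines which URL to index
     when several URLs serve the same content. -->
<link rel="canonical" href="https://example.com/seo-glossary/">
```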
  • Citation (local SEO)

    In simple SEO terms, a local citation is any online mention of your business, typically its name, address and phone number. Citations help internet users find local businesses and can also affect local search engine rankings, so it's important to make sure your business is listed on every website where customers search for you. That said, local citations are no longer a major ranking factor.
  • Clickbait

    This is a piece of online content with an intentionally over-promising or misleading headline, typically designed to entice people to click or visit a website in order to sell advertising. By using buzz phrases such as "can’t miss", "won’t believe what you see", or "number seven will amaze you", publishers tempt users to click on the link. Search engines and social media treat clickbait as a deceptive practice.
  • Click-through rate (CTR)

    A performance metric, expressed as a percentage, giving the ratio of the number of times a link in an organic search result, paid ad or email is clicked to the number of impressions, i.e., the number of times it is viewed. For example, if a search result is viewed 100 times and clicked 30 times, the CTR is 30%. If you are getting high impressions but a very low CTR, consider amending the Title and Description meta tags for that page.
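    The calculation above can be sketched in a few lines of Python:

```python
def click_through_rate(clicks: int, impressions: int) -> float:
    """Return CTR as a percentage: clicks divided by impressions."""
    if impressions == 0:
        return 0.0  # no impressions means no measurable CTR
    return 100.0 * clicks / impressions

# A result viewed 100 times and clicked 30 times has a CTR of 30%.
print(click_through_rate(30, 100))  # → 30.0
```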
  • Cloaking

    Cloaking is a black hat practice, used unethically to gain higher rankings, in which the content served to users is different from the content served to search engine bots. Because cloaking misleads search engine crawlers, it is regarded as a serious violation of their guidelines. Definitely not recommended, as your site will be severely penalized and may even be banned from search engine indexes.
  • Content

    In the context of SEO, content is all the information or data on a site. Blog posts, articles, white papers, videos, images, infographics and podcasts are some examples of web content. Content is meant to be consumed and distributed by an audience, and it is one of the most important search ranking factors.
  • Content Management System (CMS)

    A CMS is a web-based application that lets people create a website with little knowledge of coding. Joomla, WordPress and Drupal are some examples of CMS. For example, this website is powered by Joomla.
  • Content spinning

    Content spinning is a black hat SEO practice in which an article is rewritten using similar words and phrasing. It's considered a black hat technique because most spun content is produced through automated methods and reads as low quality to humans.
  • Conversion

    In online marketing, a conversion occurs when a user completes a desired action on a website. Examples of conversions include making a purchase, downloading an item, subscribing to an email newsletter, etc. Conversion is the ultimate goal of an SEO strategy.
  • Conversion Rate Optimization (CRO)

    Conversion Rate Optimization, usually shortened to CRO, is the practice of improving the conversion rate (i.e., the percentage of users completing a desired action) on a website. CRO runs parallel to SEO: while SEO is intended to drive users to a website, CRO is intended to turn those users into conversions.
  • Co-Occurrence

    Co-occurrence describes search terms or phrases that frequently appear together across many searches. A webpage containing none of the keywords in its content might still rank for a phrase due to the semantic proximity of its content to the search. For example, a webpage about a "Kids apparel store" might rank in Google’s index for "boutique shop" even without containing the words "boutique" and "shop", mainly because "Kids apparel" is frequently combined with "boutique" and "shop" in search queries.
  • Crawl

    The crawl is the process of gathering information, using a bot, from billions of webpages in order to update, add and organize them in a search engine’s index. Orphan pages, dead links and poor redirects slow down the crawl of a website, while a sitemap, improved site speed and properly sized videos and photos improve a website’s crawlability.
  • Curated content

    Curation is the gathering of content from various sources in order to offer users a centralized location where they can easily find it. People curate content either to establish themselves as experts or to offer a complete body of knowledge for users.
  • Customer Journey

    The customer journey (or purchase path) is the interaction between a customer and the brand or organization that has created a product or a service. It comprises all the touchpoints that a customer experiences on the website.


  • Data

    Data is the analytical information the SEO industry uses to better understand visitor behaviour. It may be demographic, geographic or other pieces of information. Search information and keyword information are important and relevant examples of data.
  • De-indexing

    De-indexing is the act of temporarily or permanently removing a website or a webpage from search engine results. De-indexing may be a penalty imposed by a search engine for violating its guidelines, or a voluntary action by the webmaster. Either way, the website or webpage no longer appears in SERPs.
  • Dead End page

    A dead end page is a webpage that has no internal or external links. A user has no option but to exit the website altogether. Once a user or a bot arrives at this page, there is no place to move forward. Dead-end pages are not good for SEO and must be fixed.
  • Deep link

    A deep link is a backlink from another website that targets a content-relevant page of a website rather than the front page. Ideally, the link will also come from a content-relevant page on the linking website. Here, "deep" refers to the depth of the target page in the hierarchical structure of the website.
  • Disavow

    Disavowing links is the process of telling Google that you don't want the links to count towards your site. For example, if you feel that some links are spammy, you can disavow them via the Disavow tool in Search Console. Utmost care is required while using the Disavow tool as it may also remove useful links.
  • Do-follow link

    A do-follow link is a link which has not had the rel="nofollow" attribute applied to it. Generally speaking, this means that it's a link which passes PageRank. "Do-follow" is the default status for all links.
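    In HTML terms, the difference is only the presence or absence of the rel attribute (the URLs here are hypothetical):

```html
<!-- Do-follow (default): passes PageRank to the target. -->
<a href="https://example.com/">A recommended resource</a>

<!-- Nofollow: asks search engines not to pass PageRank. -->
<a href="https://example.com/" rel="nofollow">A paid or untrusted link</a>
```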
  • Domain

    The domain is the address of a website or the actual characters typed in that make up a URL and typically ending in an extension like .com, or .org. For example: web-eau.net is the domain name of this website.
  • Domain Authority (DA)

    This is a metric, popularized by Moz, that refers to the overall strength of a domain and predicts how well webpages within that domain can rank. It's all about the quality links a domain has earned over time, representing a website’s overall quality profile. Domain authority uses a 0 to 100 logarithmic scale, similar to a quality scorecard for a website.
  • Doorway page

    Doorway pages are poor quality webpages that are created to manipulate search engine ranking for specific keywords, only for the purpose of redirecting users who click on that page to a different website. Google recognizes them and has been working to crack down on them.
  • Duplicate content

    Duplicate content refers to a significant amount of content that appears on the internet in more than one location, or replicates partially or fully content that already exists elsewhere on the web. Google might penalize severely for duplicate content, including de-indexing the websites.
  • Dwell time

    This is the amount of time a user spends on a website after clicking through from a SERP. The goal is a longer dwell time, because it suggests that the user found relevant or interesting content on the site. A short dwell time can signal low-quality content to search engines and is believed to affect rankings.


  • Editorial link

    This is a link, internal or external, that occurs naturally within the body content of a webpage. It is not purchased or requested, which makes it the gold standard of links. For example, the writer of an article may cite you as a source, which adds to your credibility.
  • Engagement Metrics

    Metrics or tools to determine the interactions that a user has with webpages and content. The most important engagement metrics are:
    • Total number of users
    • Click-through rate
    • Conversion rate
    • Bounce rate
    • Dwell time
    • Time on page/site
    • New vs. returning visitors
    • Frequency and recency
    • Retention
    • Screen flow
  • Entity

    An entity is a single, well-defined element of a knowledge graph, such as a person, demographic, location, device, organization, website, event or group, that search engines use to understand queries and content, and that appears as a dimension in Google’s Search Analytics to help webmasters analyze traffic.
  • Expertise, Authoritativeness, Trustworthiness (E-A-T)

    E-A-T is a metric by which Google's Quality Raters assess webpages. Webpages demonstrating a high level of expertise, authoritativeness and trustworthiness are considered high-quality pages.
    The factors that help an E-A-T score are:
    • Link building
    • Quality content
    • Optimized social media
    • Searchable content
    • Accurate local listings
  • External link

    This is an outgoing link to another website. Although the recipient is normally considered the primary beneficiary, there can be SEO benefits for the giver if the link is a "good" quality editorial link to a webpage with related content.


  • Favicon

    Favicons are small icons that usually contain a logo or a basic graphic and appear in the tab of web browsers as well as bookmarks. There is no direct SEO value to using it, so perhaps out of place in this SEO glossary, but your website really should have one.
  • Featured Snippet

    Featured Snippets are special boxes shown at the top of search results in Google which can contain short pieces of text and images. Since June 2020, these snippets now automatically make use of Scroll to Text Fragment links.
  • Feed

    A feed is content that is delivered to you via special programs such as news aggregators. For example, Google FeedBurner is a feed service.
  • Fetch as Google

    Fetch as Google was a Google Search Console tool that allowed webmasters to submit a resource and check how Google renders it, simulating a visit from Googlebot. It has since been replaced by the URL Inspection tool in the new Search Console.
  • Findability

    Findability refers to how easily the content on a website can be discovered, both internally (by users) and externally (by search engines). For better findability, a website should provide sitemaps and well-structured content. This helps it earn a better SEO rank.
  • Footer Link

    Footer links are the links that appear in the bottom section of a website. Because they are displayed on every page of the website, footer links can have an SEO impact.
  • Fresh Content

    Adding or updating content on a website helps with ranking. Freshness can be an important factor that search engines take into account when determining a page's ranking, but it’s not as important as quality and relevance. Bots treat new and recently updated content as more relevant.


  • Geo-targeting

    This is a marketing technique of displaying different content to specific groups of people based on their geographic location, such as country, state, city, postal code or IP address. Geo-targeting can be used for local searches, when your business is only interested in traffic from a particular location.
  • Ghost Blogger

    A ghost blogger is a person who writes for websites and blogs without a byline, in exchange for payment. Her/his name does not appear on the blog post or article; the credit typically goes to the person who paid for the content. Ghost writers are numerous in the SEO industry.
  • Google

    A (famous) search engine founded in September 1998 by Larry Page and Sergey Brin. They have about 90% of the search engine market share.
  • Google Adwords

    Google Ads (formerly Google AdWords) is an online advertising platform by Google on which advertisers bid to show internet users short advertisements, service offers, product listings or videos. It is based on a PPC (pay-per-click) pricing model: a company pays for its ads to appear at the top of the search results page for its target keywords, and pays every time a visitor clicks an ad. This form of online advertising lets you target audiences interested in the products or services you offer.
  • Google Analytics

    Google Analytics is a web analytics tool, available at both freemium and premium levels, that helps webmasters gather and analyze data about website usage. It shows demographics along with the locations of visitors to the site. In addition, users can see how long visitors spend on their site, the bounce rate, and where their traffic originated. A must-have.
  • Google Bomb

    This is an unethical SEO practice intended to make a website rank higher on Google for an irrelevant, comic, off-topic, unrelated or controversial search. This practice is also known as Google washing.
  • Google Bot

    Googlebot is the web crawling program developed by Google that autonomously finds new websites and webpages and adds them to Google's index. Such bots are also known as robots, crawlers or spiders.
  • Google Bowling

    Google bowling broadly refers to negative SEO. It is the unethical practice of building spammy backlinks to a competitor's website from irrelevant, spam-filled sites. In theory, the competitor's website could be penalized, which would lower its SERP ranking.
  • Google Fred

    A significant and unannounced update to Google’s algorithms which came out in March 2017. The change was intended to promote websites that prioritized user experience over financial gain via ads.
  • Google Hummingbird

    Hummingbird is a significant change (August 2013 release) to Google’s algorithms that improved search results. The main goal of Hummingbird was to provide better search results by understanding the context of the query, rather than returning results for certain keywords.
  • Google Keyword Planner

    Keyword Planner is a Google Ads feature that has dual functionality. It can be used to find keyword ideas which are based upon words or combinations of words along with a URL. Or, you can parse data from the planner that anticipate clicks and impressions for a month in advance.
  • Google Mobile First Indexing

    Since 2017, Google favors and prefers mobile versions of websites when it comes to indexing and ranking. Thus, a poorly configured mobile site might negatively impact your search engine ranking.
  • Google My Business

    Google my Business is another free tool for business owners to have some of their information visible when a user performs a query. The business can create and revise listings, which will result in a business being found on Google Maps. Thus, the information may be displayed on the right side of the SERP.
  • Google Panda

    Panda is the official name of a significant change to Google’s search algorithm, officially launched in February 2011 and followed by a series of updates. This update centered on website quality: it reduced the rankings of sites with unoriginal content, and pages with duplicate material were lowered in ranking. Panda became part of Google’s core search algorithm in 2016.
  • Google Penguin

    Penguin is the official name of a major search algorithm change in Google that was officially announced in April 2012. This release centered upon quality links. The focus was on reducing rankings of websites that use spammy links and manipulative links.
  • Google Pigeon

    Pigeon is a major algorithmic update that was launched in July 2014. Pigeon is not the official name of the algorithm, rather the name has been given by the SEO industry. This update focused on increasing rankings for local searches. It used location as one of the influencers in determining ranking.
  • Google Possum

    This September 2016 release updated Google’s algorithms to diversify local search results. This had an unintended consequence of limiting search results for businesses that shared the same address.
  • Google Rank-Brain

    RankBrain is a sophisticated artificial intelligence (AI) based search algorithm that helps Google understand search queries, officially introduced in October 2015. It is not a hand-programmed algorithm; rather, RankBrain is based on a machine learning system. This important update attempted to use the searcher's intention when delivering results, building on Google's earlier Hummingbird update.
  • Google Search Central Blog

    This is Google’s official blog for webmasters and SEO professionals. It is highly helpful, as all updates and announcements regarding Google Search are published here. If you are seriously interested in SEO, you should subscribe to this blog.
  • Google Search Console

    Search Console is the rebranding of Google Webmaster Tools, in place since May 2015. It's a completely free Google service for webmasters with several helpful features, including submitting sitemaps for indexing, fixing index issues, inspecting the index status of URLs, responding to errors and warnings, observing performance for search queries, improving search ranking, checking the mobile usability of a page, validating AMP, monitoring outbound links and more. Search Console allows users to check the status of their indexing and optimize the visibility of their websites.
  • Google Trends

    Trends is a free Google service that lets webmasters observe search trends for a topic or term over time, across countries or language-based regions, compare statistics, and choose the best-performing keywords for their websites. The graphical representation of the results allows users to adapt their strategies based on the popularity of certain keywords within the timeframe and area in which they would like to use them.
  • Google Webmaster Guidelines

    These Guidelines are meant for developing and designing websites that are friendly to Google. These guidelines help in developing quality content that can rank well in Google search and in building qualified links that can help in better ranking by Google. Clear definitions laid out by Google that must be followed if a website is to be found, crawled, and indexed.
  • Gray Hat

    Grey hat is a degree of SEO practice that sits between white hat (good SEO practices) and black hat (the worst SEO practices). These practices do not technically violate Google’s guidelines, but their ethics are questionable. Much of the SEO conducted today could arguably be said to fall within the grey hat area, but it should equally be avoided in order to achieve long-term SEO success.
  • Guest Blogging

    Guest blogging (or guest posting) refers to the practice of writing for someone else’s website in exchange for a backlink to your own website or blog. It is widely used by bloggers as an SEO practice, as well as to create brand awareness, as long as the basic subjects or topics are relevant to both sites.


  • Head

    The head of a document contains important elements such as the document’s title, metadata, scripts, styles and more. It will not contain any of the page’s content which is to be displayed.
  • Heading Tags

    Heading tags are the HTML elements of a webpage that define the title of the document or page (H1), main headings or sections (H2) and sub-headings or paragraphs (H3 to H6). In an SEO context, H1 is the most important heading tag, while H6 is the least. Because their actual SEO value is still a source of discussion, it’s better to use no more than one H1 tag on each page and go easy with the others.
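    A typical page outline using heading tags might look like this (the headings themselves are hypothetical):

```html
<h1>SEO Basics</h1>           <!-- one H1: the page title -->
  <h2>On-page SEO</h2>        <!-- main section -->
    <h3>Title tags</h3>       <!-- sub-section -->
    <h3>Meta descriptions</h3>
  <h2>Off-page SEO</h2>       <!-- next main section -->
```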
  • Hidden Text

    Text that is not visible to a user visiting a webpage. It is used to manipulate search rankings by stuffing pages with an abnormal volume of keywords. This is considered a spammy practice, violates Google’s webmaster guidelines, and should not be used.

    Also known as Content Cloaking.

  • Hilltop Algorithm

    The Hilltop algorithm is a Google search algorithm, released in 2003, that decides how relevant a document is for a certain search query. Relevance is determined on the basis of references from expert webpages to an authority webpage; an expert page contains non-affiliated links to other sites and is narrow in the scope of its focus.
  • Homepage

    The home page is the default front page of a website that loads first when an internet user enters the domain name of a website in a browser. It is the starting point for a website and the page from which all the other pages can eventually be found.
  • hreflang

    This is a mechanism that allows users to experience a webpage in their desired language. Based on the language used during the search, the corresponding version of a website is displayed. The hreflang attributes can be used to ensure the correct language and regional versions of a website are served in search results. The hreflang specification can be applied using the rel="alternate" and hreflang="x" attributes with the applicable language code inside a link element, via an HTTP header, or in a sitemap.
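    A minimal sketch of hreflang link elements in a page's head (the domain is hypothetical):

```html
<!-- Each language/region version references all the others. -->
<link rel="alternate" hreflang="en" href="https://example.com/en/">
<link rel="alternate" hreflang="fr" href="https://example.com/fr/">
<!-- x-default: the fallback page when no language matches. -->
<link rel="alternate" hreflang="x-default" href="https://example.com/">
```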
  • .htaccess

    The htaccess file is a web server configuration file containing commands to direct the server’s behavior in certain circumstances. This file is mainly used by Apache servers and can be used to rewrite and redirect URLs.
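    For example, a common SEO use of .htaccess on Apache is a permanent (301) redirect from an old URL to a new one (the paths and domain here are hypothetical):

```apache
# Redirect a moved page with a 301 so search engines
# transfer the old URL's ranking signals to the new one.
Redirect 301 /old-page.html /new-page.html

# Rewrite example: force www to non-www (requires mod_rewrite).
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]
```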
  • HTML sitemap

    An HTML sitemap is a list of pages in a website and meant for website users to help them navigate through the website. This is the hierarchy of a website, which is intended to be used by visitors as a navigation tool. It is normally expressed as a bulleted list, and it shows the relationships between pages on a site.
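    In its simplest form, an HTML sitemap is just a nested list of links (the pages here are hypothetical):

```html
<ul>
  <li><a href="/">Home</a></li>
  <li><a href="/blog/">Blog</a>
    <ul>
      <li><a href="/blog/seo-glossary/">SEO Glossary</a></li>
    </ul>
  </li>
  <li><a href="/contact/">Contact</a></li>
</ul>
```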
  • HTTP

    HTTP is the acronym for Hyper-Text Transfer Protocol. This is the protocol used across the World Wide Web that defines how data is transferred from a computer server to a web browser and what action web servers and browsers should take.

    HTTPS is the acronym for Hyper-Text Transfer Protocol Secure, which uses Secure Sockets Layer (SSL) or its successor, Transport Layer Security (TLS), to encrypt data transferred between a website and web browsers. HTTPS is a minor Google ranking factor.
  • Hyperlink

    A hyperlink is a link from one page to another, or from one place on a page to another place on the same page. Hyperlinks can be inbound or outbound; hyperlinks that start and end on the same site are called internal hyperlinks. Hyperlinks are important SEO factors.


  • Image sitemap

    Similar in purpose to an HTML sitemap, an image sitemap is an XML file listing the images on a website that you want Google's bots to crawl and index.
  • Impression

    In the context of SEO, an impression is counted each time a link to your page is displayed on a SERP, whether or not it is clicked.
  • Inbound Link

    An inbound link (also called a backlink) is a link that comes from another website to yours.
  • Index

    In the context of SEO, the index is the database search engines use to store and retrieve information about webpages, posts, and media gathered during the crawling process.
  • Indexed Page

    An indexed page is a webpage that has been discovered by a crawler, has been added to a search engine index, and is eligible to appear in search results for relevant queries.
  • Infographic

    An infographic is a graphic that presents a large amount of information, or complex information, in a simplified, highly visual format. Infographics are often used as part of a content marketing strategy.
  • Information Architecture

    This general term covers how content is designed, categorized, organized, and structured in a useful and meaningful way. Good information architecture takes into account how both humans and search engine spiders read a website.
  • Information retrieval

    Information retrieval is the process of searching for information or files (e.g., text, images, videos) in a large database and presenting the most relevant results to an end user.
  • Internal links

    These are hypertext links between two webpages of the same domain. They are a great way to help users navigate a website and move from one logical destination to the next, and they help Google's bots index the website better. Internal linking is also a useful SEO tool, as it allows specific pages to be granted higher authority.
  • International SEO

    This is the process of increasing visibility for a website internationally (vs local SEO). There are ways to set up detection of the language and the location of a searcher that will allow a website to automatically reflect the most appropriate webpage.
  • Interstitials

    These are pages displayed before or after an expected page, often used to verify age or, more commonly, to show a disclaimer. Pop-up ads are examples of interstitials. Interstitials degrade the user experience and can result in penalties if used improperly on a mobile site.
  • IP Address

    IP is the acronym for Internet Protocol. An IP address is a unique string of numbers separated by periods that identifies a device and serves as its address on the internet. An IP address serves two basic purposes: identifying the network the address belongs to and the exact location of the device within that network.


  • Keywords

    Keywords are the words that appear in the content of your webpages and are used in search queries. In the context of SEO, a keyword is a word or phrase that meaningfully tells search engines what the significant content of the webpage is.
  • Keyword Cannibalization

    Keyword cannibalization is self-competition among pages of the same website: by targeting the same keywords on multiple pages, you make your own pages compete against each other for the same search query on a SERP, splitting rankings and traffic between them. This is bad for your SEO.
  • Keyword Density

    Keyword density is the number of times a keyword appears on a webpage divided by the total number of words on the page. There is no proven ideal ratio, and many people overuse keywords on their pages.
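    The calculation above can be sketched in a few lines of Python (this handles single-word keywords only; the sample sentence is made up for illustration):

```python
def keyword_density(text: str, keyword: str) -> float:
    """Occurrences of a (single-word) keyword divided by total words, as a %."""
    words = text.lower().split()
    if not words:
        return 0.0
    return round(words.count(keyword.lower()) / len(words) * 100, 2)

# A 10-word snippet containing "seo" twice gives a density of 20.0%.
density = keyword_density("seo basics and seo tips for a faster modern website", "seo")
```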
  • Keyword Stuffing

    Keyword stuffing is a black hat SEO technique: the spam practice of inflating keyword density, adding irrelevant keywords, or repeating keywords in an unnatural way in the hope of increasing search rank. This tactic is against Google's Webmaster Guidelines and can result in a manual penalty.
  • Knowledge Graph

    The Knowledge Graph is a knowledge base created by Google, first announced in May 2012, that powers a set of richer search features. Knowledge Graph results provide structured and detailed information about topics.
  • Knowledge Panel

    This is a panel that appears on the right side of a SERP and provides basic information for a search result. It is a rich snippet that will allow a user to quickly find search results without having to open a webpage. It will likely contain facts and data, along with people and places directly related to a search.
  • KPI

    KPI is the acronym for Key Performance Indicator. This is a measurable value that indicates the effectiveness of a business operation. It may include such factors as gross profit margin, cash flow, market share, inventory turnover and more.


  • Landing Page

    A landing page is a specially designed standalone page on a website with a specific call to action. From an SEO perspective, these pages are designed to maximize conversions. Each landing page may target separate keywords.
  • Lazy Loading

    The concept is to load only the necessary parts of a website in order to optimize load times. Remaining sections or portions of websites are loaded as they are needed.
  • Lead

    A lead is a potential customer who needs or is interested in your product or service. A lead is confirmed when he or she shares contact details and other information relevant to a business deal.
  • Link

    In an SEO and web technology context, a link (short for hyperlink) is an HTML object that connects different websites, different pages within a website, and/or different sections within a page. In SEO terminology there are two primary categories: internal links and external links. Links play a critical role in how search engines evaluate and rank websites.
  • Link Bait

    Link bait is a webpage with content so valuable that it organically becomes a target for other sites to link to, because it offers information not readily found elsewhere. The objective of link bait is to improve the search rank of a website or webpage by attracting backlinks. Link bait can be misused, for example by creating intentionally provocative content. While not easy to achieve, it is extremely valuable in SEO.
  • Link Building

    Link building is the process of building links to a website or resource. Link building is a key part of most full SEO campaigns and is undertaken for the purpose of increasing the authority of a resource. It is the second most important part of SEO, after quality content.
  • Link Directory

    A web link directory is an online directory of websites, usually separated by related categories, either maintained by a human or a systematic program. Inclusion in a link directory may be free or paid. Link directories have been widely misused for search engine ranking and hence, search engines have upgraded themselves to prevent such misuse.
  • Link Juice

    Link juice (or simply juice) is the amount of trust and authority that a link passes to the page at its other end. The amount depends on the quality and ranking of the originating page or website. Concisely, this is the SEO value of a particular backlink.
  • Link Profile

    Also named backlink profile, this is the measure of the quality of the backlinks a website has. This aggregate presentation of all of a site’s inbound links presents the search engines with an image of the site’s value, as perceived by other sites.
  • Local Search

    A local search is a query whose intent is to find results in the geographical area immediately surrounding the user, or in a specific desired location.
  • Local SEO

    Local SEO is the practice of optimizing a website for local and/or regional listings in search engines, directing searchers to nearby businesses based on location.
  • Long-Tail Keyword

    A long-tail keyword is a longer, more precise and specific search phrase. Because long-tail keywords are more specific, there is often less competition for them in SEO and SEM than for shorter, generic terms, which makes it easier to rank in highly competitive niches.
  • LSI

    LSI is the acronym for Latent Semantic Indexing. LSI keywords are individual words or phrases closely related to your primary keyword. Some SEO professionals believe that Google and other search engines use LSI to rank webpages, but there is no evidence for this, and the underlying technology is outdated.


  • Manual Action

    A manual action is a manual penalty taken against a website found to be breaking Google's guidelines, as determined by a human reviewer at Google. These actions can affect a single page or be applied across the entire domain.

    Hacked sites, unnatural links, pure spam, thin content, cloaking, sneaky redirects, spammy structured markup, keyword stuffing, hidden text, and user-generated spam are the most common reasons for a manual action.

  • Metadata

    SEO metadata is data, entered in a website's code, that describes a page to search engines. The title and meta description drawn from it may be displayed in search results.
  • Meta Description

    The meta description is an HTML tag that provides a summary of a webpage, typically 150-160 characters, and can appear as the snippet in a SERP when the search phrase, fully or partially, matches the description. Though it plays no role in search engine ranking, a relevant and catchy description can help increase click-through rate (CTR).
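    As a sketch, a meta description sits in the page's head like this (the wording here is an invented example):

```html
<!-- Hypothetical example: a concise, keyword-relevant meta description -->
<meta name="description" content="Learn 250+ essential SEO terms, from above the fold to search volume, in one regularly updated beginner-friendly glossary.">
```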
  • Meta Keyword

    The meta keywords tag is an HTML tag that contains the most important keywords of a page. Though the practice of adding meta keywords to a web document persists, most search engines, including Google, ignore it, largely to prevent keyword stuffing.
  • Meta Tag

    Meta tags are tags placed in the head of a webpage's source code to describe its contents to search engines; they are not visible on the page itself. The three common types are title tags, meta description tags, and keyword tags. While the title tag is fully relevant for search engines, the meta description is partially relevant and the keywords tag has no relevance.
  • Metrics

    In the context of SEO, metrics refer to a number of methods, available on the internet, free or paid, to measure the success of SEO efforts. Common metrics followed in SEO are keyword rank, organic traffic, backlinks, bounce rate, and others.
  • MFA

    MFA is the acronym for Made For Advertisements. In the context of SEO, MFA refers to websites that are created and designed mostly for advertisement with little or thin useful content. Search engines do not give them any preference.
  • Mirror Site

    A mirror site is a duplicate site displayed at a different URL. Be careful: a mirror site is not a good option in the context of SEO, because search engines penalize duplicate content, up to and including manual action.
  • Mobile-First Indexing

    Google announced in November 2016 that they were experimenting with a mobile first index, meaning a stronger focus on mobile sites in algorithmic calculations. In March 2018 Google announced they were migrating some sites that followed best practice to mobile first indexing.

    This is an indexing approach in which bots crawl the mobile version of a website before the desktop version; the desktop version is crawled and indexed afterwards. If a page has no mobile version, its rankings can suffer.

  • Mobile Optimization

    Mobile optimization is the process of updating a website so that it is easy to use on a mobile device, adapting an existing desktop site rather than starting from the mobile version as mobile-first design does. Adjustments to layout, text, and navigation may be needed.
  • Monetize

    This refers to earning from a site, blog, or video channel by placing advertisements and affiliate links within content. Blogs with excellent, engaging content can monetize their traffic.


  • Natural Link

    A natural link is a backlink that your website gains organically from other websites or blogs: the webmaster or blogger of the other site believes your content is useful for their readers, or needs to cite a source, and adds a link to it. Search engines love natural links.
  • Natural Search

    Natural search (or organic search) refers to search positions and web traffic that originate naturally from SEO rather than from paid advertising.
  • Negative SEO

    The negative SEO is a practice of maliciously trying to alter the search engine results of a website or competitor. This black hat technique uses malicious practices such as creating artificial spammy links and duplicate content through hacking a website in order to change the content.
  • No-archive

    The no-archive tag is a meta robots tag that tells search engines not to store a cached copy of a specific page. It prevents search engines from showing a cached link for the page in SERPs.
  • No-follow

    The no-follow tag is a meta robots tag (or rel attribute value) that tells search engines not to follow a specific outbound link, either because the website doesn't want to pass authority to another page or because it is a paid link.
  • No-index

    No-index tag is a meta robots tag that can be added in a web document to tell the search engines that this page should not appear in the search engine results pages (SERPs).
  • No-opener

    Noopener is a value that can be assigned to a link's rel attribute to protect against reverse tabnabbing. It has no impact on the SEO of the link it is added to.
  • No-referrer

    Noreferrer is a value that can be assigned to a link's rel attribute; it tells the browser not to leak any HTTP referrer information and is used in conjunction with noopener to protect against reverse tabnabbing. It has no impact on the SEO of the link it is added to.
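    Combining the two attributes on an outbound link opened in a new tab might look like this (example.com is a placeholder):

```html
<!-- Hypothetical example: an external link hardened against reverse tabnabbing -->
<a href="https://example.com" target="_blank" rel="noopener noreferrer">External site</a>
```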
  • No-snippet

    The no-snippet tag is a meta robots tag that tells search engines not to show a description with the listing. In this case, though Google may still show your page in SERPs, in place of the description it will display a note such as 'No information is available for this page.'


  • Off-Page SEO

    This refers to the SEO practices that take place outside the website. The off-page SEO is the work done outside a website itself to increase SERP rankings. Items that would be included here are things like social media postings, guest blogging, and link building.
  • On-Page SEO

    On-page SEO is the complement of off-page SEO: optimizing the content of the webpages themselves to increase ranking. Applying keyword research results and updating pages with quality content are two examples of on-page SEO.
  • Open Graph Protocol

    The Open Graph Protocol is a structured data platform created by Facebook. Using Open Graph Meta tags such as og:title and og:image, it is possible to provide information about a resource to Facebook and other platforms.
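    A sketch of Open Graph tags in a page's head (the titles and URLs here are invented placeholders):

```html
<!-- Hypothetical example: Open Graph tags describing a page to social platforms -->
<meta property="og:title" content="250+ SEO Essentials Definitions">
<meta property="og:type" content="article">
<meta property="og:url" content="https://example.com/seo-glossary">
<meta property="og:image" content="https://example.com/cover.jpg">
```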
  • Opt-in

    An opt-in is permission granted by a user to receive promotional material, most often email. Such permission has become more important as laws now prohibit distributing these materials without consent; once a user opts out, the messages must stop.
  • Opt-out

    An opt-out is a user's withdrawal of permission to receive marketing materials sent directly to them. Most commonly it is done via an unsubscribe link at the bottom of an email. The law now requires that users have clear and obvious opportunities to stop receiving these materials.
  • Organic Search result

    Organic search results are the listings on SERPs (search engine results pages) that appear as a result of SEO and do not include any paid advertising.
  • Orphan Page

    An orphan page is a page that no other page on the website links to. Orphan pages have less chance of being indexed and ranking well.
  • Outbound Link

    An outbound link is a link that directs users from one page to another website, either in the same window or in a separate window.
  • Outreach marketing

    This refers to the practice of reaching out to other businesses and people in the same industry who can offer value to each other, typically with an offer to exchange links. In an SEO context, outreach marketing is used for link building.
  • Over-optimization

    The Over-optimization is an unethical SEO practice of attempting to fool a search engine. This may result in a penalty or reduced rankings in the SERPs.


  • PageRank

    PageRank is an algorithm used by Google Search to rank web pages in their search engine results. PageRank was named after Larry Page, one of the founders of Google. PageRank is a way of measuring the importance of website pages.
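    The core idea can be sketched with a toy power-iteration in Python. This is an illustration of the published PageRank concept, not Google's production algorithm; the three-page link graph is invented, and the damping factor d models a surfer who occasionally jumps to a random page:

```python
def pagerank(links, d=0.85, iterations=50):
    """Iteratively redistribute each page's score along its outbound links."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1 / n for p in pages}        # start with equal scores
    for _ in range(iterations):
        new = {}
        for p in pages:
            # Score flowing into p from every page q that links to it
            inbound = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            new[p] = (1 - d) / n + d * inbound
        rank = new
    return rank

# Hypothetical three-page site: A and C link to B, B links back to A.
ranks = pagerank({"A": ["B"], "B": ["A"], "C": ["B"]})
# B ends up with the most authority, since two pages link to it.
```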
  • Page Speed

    Page speed is the amount of time that it takes for a page to load completely. A page's loading speed is determined by several factors, including a site's server, page file size, and image compression.
  • Page-view

    A page view is counted each time a page is loaded by a visitor; a reload counts as another page view.
  • Paid Link

    A paid link is a backlink to a website that is purchased. Purchasing them is never a good idea because it violates Google’s webmaster guidelines and could result in a manual penalty.
  • Penalty

    A penalty is a demotion of the rank of a webpage or website, applied either manually by Google's webspam team or automatically by an algorithm update. The ultimate objective of a penalty is to curb web spam and black hat SEO practices.
  • Persona

    A persona is an idealized representation of a customer or user who comes to a website, an amalgamation of demographics, values, and behaviors. Creating personas helps marketers understand the user's perspective and market segments so that they can tailor strategy to market demand.
  • Piracy

    Piracy is the use of copyrighted content (texts, images, videos, etc.) without the permission of the owner or creator. In an SEO context, if Google finds pirated web content, it takes action immediately.
  • PBN

    PBN is the acronym for Private Blog Network: a set of interlinked websites owned by the same entity. PBNs are considered unethical when they exist to host content and links that manipulate the SERPs. This is a purely black hat SEO practice, and Google can take severe action against a PBN.
  • Position

    The position in SEO is exactly the rank of a webpage in SERPs for certain search query.
  • PPC

    PPC is the acronym for Pay Per Click, the advertisements that appear alongside organic results on a SERP. It is a marketing model in which adverts are shown to users with the aim of delivering traffic to a resource; advertisers are charged, roughly speaking, according to how many times users click the adverts and visit the resource.
  • Proximity

    Proximity, in a context of SEO, is a search metric that measures how close the words in a search query are to the keywords in the content to be eligible to appear on SERP for that query.


  • QDF

    QDF is the acronym for Query Deserves Freshness, which is a search algorithm by Google in which the search engine determines whether a search query is for newer or up-to-date content and ranks webpages in accordance to that.
  • Quality Content

    Quality content can be defined as unique, original, valuable and engaging content that is meant for real users and not for search engines. Quality content is a vital requirement for lasting SEO advantages.
  • Quality Link

    This refers to the link from another website which offers value to your site. In order to be a quality link, the originating website needs to be relevant, respected, and authoritative.
  • Query

    In an SEO context, a query is the specific word or group of words that a user enters into a search engine.


  • Rank

    Rank refers to the position of a website in search engine results, usually on Google, or the position of the URL on the search engine results page. The most relevant results are listed at the top of the first page of search results.
  • Rank Factor

    The rank factor is a variable that search engines use to determine the best order of relevant index results to return for search queries. It describes the criteria used by search engines when evaluating which pages and content of websites to rank in regular search results or SERPs (search engine results pages). Ranking factors may be related to website content, technical implementation, user descriptions, backlink profiles, or any other features deemed relevant by search engines.
  • Rank Brain

    RankBrain is a search engine algorithm based on machine learning and artificial intelligence that helps sort search results and answer queries. Google confirmed its existence on October 26, 2015. It aims to provide users with more relevant search results.
  • Reciprocal Link

    A reciprocal link is an exchange of links between two different websites covering similar topics in the same niche, thereby creating backlinks for both. Small and medium-sized websites often use reciprocal links to increase website traffic and link popularity.
  • Redirect

    An HTTP redirect sends visitors and search engines from one URL to another. The three most common redirects are the 301, the 302, and the meta refresh. Redirects are used when moving content to a new URL, deleting pages, merging websites or duplicate content, temporarily rerouting traffic during server maintenance, or changing domain names. Avoid redirects where possible; they are not bad for SEO in themselves, but they must be placed correctly, as a bad implementation can cause a loss of PageRank and traffic.
  • Referrer

    The referrer is the page from which a visit or backlink came: when a site includes a link to your website in its content, that site is considered a referrer. The referrer URL, also known as the HTTP referrer, is data from the HTTP header fields that identifies the link used to direct the user to your website.
  • Re-inclusion

    Re-inclusion is the process of asking search engines to re-index a website that has been penalized for black hat SEO, that is, to relist a webpage that has been penalized. Google and Yahoo provide tools that webmasters can use to submit their sites for reconsideration; the site must first remove whatever triggered the spam penalty.
  • Regional Long Trail (RLT)

    Regional Long Trail is a multi-keyword phrase containing the name of a region or city. This is especially useful for SEO in the service industry.
  • Relevance

    Relevance is a measure of whether a brand's message truly matches the needs and wishes of its potential customers. For example, when someone enters a search query and sees a company's ad on Google, they decide whether to click; if they consistently don't, the ad was likely not relevant to their original query, and over time it will be shown less and cost the business more to maintain. Search engines also use keyword relevance to determine what a page is about, which in turn helps determine which keywords a website ranks for in search results.
  • Repeat Visits

    Repeat visits is a metric counting the number of return visits during a reporting period, distinguishing new visitors from returning ones. Repeat visits can be encouraged by providing a positive user experience: a fast-loading website, high-quality content that is easy to read and scan, and easy navigation.
  • Reputation

    Online reputation is how others perceive a company when they search for or find it online, including users' perception of the brand and media reports; internet users build a brand's digital reputation from their personal experience. Online reputation management (ORM) is the monitoring and improving of a business's online appearance: identifying and analyzing what potential customers, journalists, or partners learn about your brand, your employees, or your products/services when they perform Google searches.
  • Responsive

    In SEO, a responsive website is one that reacts directly to the device used and provides the best viewing experience for it. Responsive Web Design (RWD) is the web design method that aims to create a website that dynamically changes its appearance based on the screen size and orientation of the device displaying it. By providing the best user experience across the board, responsive design lets users read and browse the website with minimal resizing and scrolling.
  • Return On Investment (ROI)

    Return on investment compares the revenue of digital marketing activities with the cost of creating and delivering them; ideally you want the highest possible ROI. In short: ROI = (net profit / total cost) * 100. It measures the profit or loss generated by digital marketing activities relative to the amount invested. By calculating the return on marketing investment, a company can measure how much its marketing efforts have contributed to overall sales growth.
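    The formula above can be worked through with a small Python helper (the campaign figures are hypothetical):

```python
def roi(net_profit: float, total_cost: float) -> float:
    """ROI = (net profit / total cost) * 100, as defined above."""
    return net_profit / total_cost * 100

# Hypothetical campaign: $500 spent, $1,200 revenue, so $700 net profit.
campaign_roi = roi(1200 - 500, 500)  # 140.0, i.e. a 140% return
```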
  • Rich Cards

    Rich cards are a search result format introduced for Google mobile search that improves on standard results by providing a more structured, visual preview of content, giving companies a new way to appear in search results. A further development of rich snippets, rich cards likewise use schema.org markup so that content can be displayed in a visual format, improving the user experience on mobile devices.
  • Rich Snippet

    A rich snippet adds useful information to a search engine result, giving users additional data for judging which result best matches what they are looking for. The snippet itself is the summary of a page's content that appears in Google search results, generated from the search query. The additional information in a rich snippet is not taken from the target site's meta description but from structured data in the page's source code: markup that the site operator adds to the existing HTML, which in turn enables search engines to better understand the information on each page.
  • Robots.txt

    A robots.txt file, also known as the robots exclusion protocol, is a simple text file hosted on a web server and an important part of a website. It instructs search engine robots which pages to crawl, defining how spiders view and interact with the site's pages and whether they may access its files; it is mainly used to keep crawler requests from overloading the site. A misconfigured robots.txt can seriously damage a site's search visibility.
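    A minimal sketch of a robots.txt file (the /admin/ path and sitemap URL are invented examples):

```text
# Hypothetical robots.txt: allow everything except /admin/, point to the sitemap
User-agent: *
Disallow: /admin/
Sitemap: https://example.com/sitemap.xml
```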
  • Robots Meta Tags

    Robots meta tags are code snippets that give crawlers instructions on how to crawl or index web content. Meta tags in general provide information ("metadata") about a page in its HTML code; although not visible on the page, they can be read by search engines and web crawlers, and tags such as the meta title and meta description play an important role in SEO because they describe the page's content. The robots meta tag specifically tells search engines what they may and may not do with the page.
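    For example, a robots meta tag combining two common directives might look like this:

```html
<!-- Hypothetical example: ask crawlers not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow">
```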
  • Root Domain

    The root domain is the highest level in a site's hierarchy: the domain name combined with its top-level domain, separated by a dot (e.g. rootdomain.com). Internet root domains (sometimes called zero-level domains) are served by domain name system root servers located in different countries and regions of the world.


  • Sandbox

    The sandbox, or Google sandbox, is an alleged filter that keeps new websites out of Google's top search results; in software generally, a sandbox is a testing environment in which programs run in isolation for evaluation, monitoring, or testing. Think of it as a trial period: SEO marketers use the term to describe the time a new website spends waiting to be treated as "grown up" by Google. The duration appears to depend on the type and niche of the website. A new site usually spends 4-6 months in the sandbox, in some cases around 2-3 months, and in more extreme cases as long as 8-9 months.
  • Schema

    Schema is the markup behind rich snippets: HTML annotations that add extra information to the text below the URL in search results. Schema markup is a way to tell search engines directly who you are, what you do, and to provide accurate information about the products, services, or content you offer. It is essentially a "common vocabulary" of terms and tags that your site can use to communicate with search engines and produce accurate search listings.
  • Scraping

    Web scraping automates data extraction: analyzing a website and importing its information into a local file or spreadsheet on your computer. In its most general form, it is a technique in which one computer program extracts data from the output of another; this form of copying collects data from the internet, usually into a central database or spreadsheet for later search or analysis. Web scraping software can access the World Wide Web directly over the Hypertext Transfer Protocol or through a web browser. Although users can scrape web pages manually, the term usually refers to an automated process carried out by bots or web crawlers.
  • Search Engine

    A search engine is a web-based tool or program that enables users to find information on the World Wide Web: it searches its databases for items matching the keywords, phrases, or characters a user specifies and returns the matching websites. Popular examples are Google, AOL, Ask.com, Baidu, Bing, DuckDuckGo, Yahoo, and MSN Search; Google is currently the most popular and best-known search engine.
  • Search Engine Friendly (SEF)

    A search-engine-friendly website ensures that each page has unique content that is genuinely relevant to what search engines look for, not mere page filler, and is optimized for SEO. This means Google and other search engines can efficiently crawl every page on the site, effectively interpret the content, and index it in their databases.
  • Search Engine Marketing (SEM)

    Search engine marketing (SEM) is a form of digital marketing that promotes a website by increasing its visibility on search engine results pages, mainly through paid advertising. SEM is also called paid search or pay-per-click (PPC) marketing: brands pay to display ads on search engine results pages (SERPs), targeting selected keywords so that when users search for those phrases, they see the brand's ads.
  • Search Engine Optimization (SEO)

    Search engine optimization is the process of improving the quality and quantity of traffic that a website or web page receives from search engines; it is a digital marketing strategy designed to make your site appear in search results. Search is one of the most important ways people find content on the Internet, so it's essential to improve your page ranking on search engines such as Google: higher rankings lead to more website traffic. Once you understand how SEO works, you can use a variety of strategies to increase your visibility (or rank higher) in search results. The goal of SEO is organic (unpaid) traffic, not direct or paid traffic.
  • Search Engine Results Pages (SERP)

    SERPs are the web pages displayed by search engines in response to user queries. Their main component is the list of results returned in response to a keyword search. In addition to regular (organic) results, search engine results pages usually contain paid search results and pay-per-click (PPC) advertising.
  • Search Volume

    Search volume is the average number of searches for a specific keyword over a specific time period: it indicates how many people are searching for a particular query in a search engine such as Google. When creating a content strategy, it's important to consider search volume because it reflects the popularity of the search query. Ideally, you want to find keywords with high search volume and low competition. Search volume figures are approximate and may be affected by seasonal and regional fluctuations.
  • Secondary Keywords

    Secondary keywords are keywords that add more detail to your primary keywords and usually relate closely to the searcher's intent. After settling on your main keyword, you should look for secondary keywords, because using them can help you outperform your competitors. Secondary keywords are closely related terms; they are useful for keyword optimization because they add context to the page and send additional signals that help search engines understand and rank it.
  • Secure Sockets Layer

    Secure Sockets Layer (SSL) is an Internet security protocol, originally developed by Netscape, that establishes an encrypted connection between a web server and a browser to secure Internet communications. SSL and its successor, Transport Layer Security (SSL/TLS), are foundational technologies for protecting web transactions and communications, though they are not infallible: research such as the Lucky Thirteen attack showed that SSL/TLS can be vulnerable to timing attacks that may leak encrypted data. SSL enables companies to encrypt data before sending it to users so that third parties cannot read it in transit.
  • Seed Keyword

    Seed keywords (also called primary keywords) are the keywords most relevant to a business. They usually consist of one or two words without modifiers, and they can be combined with modifiers to create long-tail keywords. Seed keywords tend to have high monthly search volume and high competition.
  • Server side includes

    Server Side Includes (SSI) are directives placed in an HTML page and evaluated on the server when the page is served. They let you add dynamically generated content to an existing HTML page without serving the entire page through a CGI (Common Gateway Interface) program or other dynamic technology. SSI uses the web server to perform tasks such as embedding one file inside another or dynamically displaying information such as the page URL or the current date and time. It can also be used to display shared graphics, such as logos or image maps, on multiple pages.
  • Silo Structure

    In SEO, a silo structure is a way to organize content around specific subjects so that users and search engines can browse your website in a clear, logical way. A silo is a website architecture in which content is organized into hierarchical groups of topics and subtopics; in other words, you can think of it as creating categories and subcategories for your website and writing relevant content for each.
  • Site Speed

    Site speed measures how quickly users can view and interact with your content. A site speed report measures page load times, so you can identify areas that need improvement and then track the extent of those improvements. Page speed can be measured as "page load time" (the time it takes for the content of a given page to display fully) or "time to first byte" (the time it takes for the browser to receive the first byte of information from the web server). You can use Google PageSpeed Insights to measure your page speed.
  • Site Structure

    Site structure is how you organize the content of your website into posts and pages. Categories and tags, as well as internal links, navigation, and breadcrumbs, are the tools for structuring a website.
  • Site Link

    A sitelink is a hyperlink to a specific page on a website that appears beneath certain Google results to help users navigate the site. When you look at Google search results, you may see sitelinks under some entries; clicking one takes you directly to the page it points to. Webmasters cannot add sitelinks themselves; Google adds them using its own automated algorithms.
  • Sitemap

    A sitemap is a file, usually in XML format, that provides information about the pages, videos, images, and other files on your website and the relationships between them. It lists the site's URLs and allows webmasters to attach additional information to each one: when it was last updated, how often it changes, and its importance relative to other URLs. A sitemap is the blueprint of your website: it helps search engines find, crawl, and index all of your content, and it tells them which pages are most important. (An HTML sitemap, by contrast, is a human-readable list of a site's pages, sometimes used by designers when planning a website.)
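As a sketch of what such a file contains, the following Python snippet builds a one-entry XML sitemap with the standard library. The URL and dates are hypothetical placeholders:

```python
import xml.etree.ElementTree as ET

def build_sitemap(entries):
    """entries: list of (loc, lastmod, changefreq, priority) tuples."""
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod, changefreq, priority in entries:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
        ET.SubElement(url, "changefreq").text = changefreq
        ET.SubElement(url, "priority").text = priority
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical site with a single, high-priority homepage entry.
sitemap_xml = build_sitemap([("https://example.com/", "2021-06-01", "weekly", "1.0")])
print(sitemap_xml)
```

The resulting document is what you would save as `sitemap.xml` in the site root and submit to search engines.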
  • Site-wide Links

    Site-wide links are backlinks or outbound links that appear on most or all pages of a website, usually placed in the header, footer, sidebar, or navigation menu.
  • Skyscraper Technique

    Skyscraper content is an inbound marketing technique for finding topics that already perform well on search engines and social media, then reverse engineering them to create even better content. The strategy also includes earning backlinks by reaching out to sites that link to the existing posts, or sharing the new content with people who have shown interest in the topic before.
  • Social Media Marketing (SMM)

    The term "social media marketing" (SMM) refers to the use of social media platforms and websites to promote products or services. Social media marketing enables a company to connect with existing customers, build its brand, increase sales, drive website traffic, and acquire new customers, all while communicating its culture, mission, and desired tone. For example, Buffer is a social media management platform that can help you succeed at social media marketing.
  • Social Signal

    Social signals are measures of social media activity, such as votes, shares, posts, likes, and other engagement, which search engines may consider as part of their ranking algorithms. From the search engines' perspective, social media visibility and social signals suggest that content resonates with real people, which is why they are treated as a backbone of highly effective communication.
  • Spambot

    A spambot is a computer program that harvests email addresses from the Internet to build mailing lists for unsolicited mail, also known as spam; spambots often create accounts and send spam from them. Web hosts and website owners respond by banning spammers, which has led to an ongoing battle between the two sides, with spammers constantly looking for new ways to circumvent bans and anti-spam programs.
  • Splash Page

    A splash page is an introductory page visitors see before reaching the main content of your website; it is the front door of your site. Splash pages serve numerous purposes: you can display disclaimers, promote new offers, or show warnings required by the industry or niche market your business operates in.
  • Split Testing

    Split testing, often called A/B testing, lets marketers compare two versions of a webpage, the control (original) and the variant, to see which one produces a better conversion rate. It is a way of testing multiple versions of a website (or individual elements of it) against each other by running a controlled, randomized experiment to improve a performance metric such as clicks, form submissions, or purchases.
  • SSL Certificate

    An SSL certificate is a data file hosted on a website's origin server that enables encrypted connections; it is like sealing a letter inside an envelope before mailing it. SSL certificates allow websites to move from HTTP to the more secure HTTPS. The certificate enables SSL/TLS encryption and contains the website's public key, its identity, and related information. To summarize: an SSL certificate is a small data file that digitally binds an encryption key to an organization's details; when installed on a web server, it activates the padlock icon and the HTTPS protocol and provides a secure connection to the server.
  • Status Code

    A status code is issued by a server in response to a client's request; it is the server's answer to the browser. When you visit a website, your browser sends a request to the site's server, and the server responds to that request with a three-digit code: the HTTP status code.
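The first digit of the code identifies its class (2xx success, 3xx redirection, 4xx client error, and so on). A small Python sketch using the standard library's `http.HTTPStatus` illustrates this:

```python
from http import HTTPStatus

def describe(code):
    """Map a three-digit HTTP status code to its class and standard phrase."""
    classes = {1: "informational", 2: "success", 3: "redirection",
               4: "client error", 5: "server error"}
    phrase = HTTPStatus(code).phrase  # standard reason phrase for the code
    return f"{code} {phrase} ({classes[code // 100]})"

print(describe(200))  # 200 OK (success)
print(describe(301))  # 301 Moved Permanently (redirection)
print(describe(404))  # 404 Not Found (client error)
```

Codes like 301 and 404 matter for SEO in particular, since they tell crawlers whether a page has moved or disappeared.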
  • Stop Words

    Stop words are the most commonly used words in a language; examples in English include "a", "the", "is", "with", and "are". In text mining and natural language processing (NLP), stop words are often filtered out because they carry little useful information, and search engines may ignore them in search queries and results. Stop words do not change the meaning of a query; they exist for correct sentence structure when writing content, and people often omit them from URLs to keep the URLs shorter and cleaner.
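Filtering stop words is straightforward in code. This sketch uses a tiny sample stop-word list (real NLP libraries ship much longer lists):

```python
# A tiny, illustrative stop-word list; libraries like NLTK provide full ones.
STOP_WORDS = {"a", "an", "the", "is", "are", "with", "of", "to"}

def remove_stop_words(text):
    """Lowercase the text, split on whitespace, and drop stop words."""
    return [word for word in text.lower().split() if word not in STOP_WORDS]

print(remove_stop_words("The best guide to SEO for beginners"))
# ['best', 'guide', 'seo', 'for', 'beginners']
```

Notice how "the" and "to" disappear while the meaning-bearing words survive, which is exactly what keyword analysis relies on.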
  • Structured Data

    The term structured data refers to data contained in fixed fields of a file or record, typically stored in a relational database managed by a database management system (DBMS), software designed to store, retrieve, define, manipulate, and manage data. It can consist of numbers and text, prepared automatically or manually, as long as it fits the database structure. In the most general sense, structured data is simply information that has been organized. In SEO, the term usually refers to schema markup: machine-readable data added to a page that helps search engines understand its content.
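In the SEO sense, structured data is commonly embedded as JSON-LD inside a `<script>` tag. The sketch below builds such a snippet in Python; the author name and date are hypothetical placeholders:

```python
import json

# Schema.org Article markup; author and date are made-up examples.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "250+ SEO Essentials Definitions",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "datePublished": "2021-06-01",
}

# The JSON is wrapped in a script tag and placed in the page's <head>.
snippet = '<script type="application/ld+json">' + json.dumps(article) + "</script>"
print(snippet)
```

Search engines parse this block to understand that the page is an article, who wrote it, and when, which can enable rich results.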
  • Subdomain

    As the name implies, a subdomain is an additional part of a site's main domain name: information added before the domain to target content to a specific function or section of the website. Subdomains can be created to organize and navigate different areas of the main website, and you can create as many as you need under one main domain, although they are optional. In the example [freebies.yourwebsite.com], "freebies" is the subdomain, "yourwebsite" is the main domain, and ".com" is the top-level domain (TLD).
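The example above can be taken apart mechanically. This is a deliberately naive sketch: real hostname parsing needs the Public Suffix List to handle multi-part TLDs such as .co.uk:

```python
def split_host(hostname):
    """Naively split a hostname into (subdomain, domain, tld).

    Assumes a single-label TLD; real code should consult the
    Public Suffix List for cases like example.co.uk.
    """
    parts = hostname.split(".")
    subdomain = ".".join(parts[:-2]) or None  # None when there is no subdomain
    return subdomain, parts[-2], parts[-1]

print(split_host("freebies.yourwebsite.com"))  # ('freebies', 'yourwebsite', 'com')
print(split_host("example.com"))               # (None, 'example', 'com')
```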
  • Submission

    Search engine submission is the process of asking search engines to index a website. Unlike search engine optimization, submission has no effect on a site's ranking; it simply makes search engines aware that the site exists. Submission is usually done through Google Search Console (formerly Google Webmaster Tools) or Bing Webmaster Tools, and it can sometimes take up to a month for a submitted URL to be indexed. SEO submission software promises to handle this for you by submitting your website to various search engines, special-interest sites, and Internet directories.


  • Taxonomy

    Website taxonomy is a way of classifying, organizing, and linking website content. In SEO terms, a taxonomy is a set of URLs that share common attributes and are therefore relevant to one another; the URLs do not have to follow a specific structure or sit at the same depth relative to the homepage. A good website taxonomy presents content to readers and search engines in a way that is accessible, searchable, and useful, which in turn helps pages rank.
  • Technical SEO

    Technical search engine optimization (SEO) refers to improving the technical aspects of a website so that its pages rank higher in search engines, ensuring the site meets the technical requirements of modern search engines for better organic visibility. The cornerstones of technical optimization are making your website faster and easier for search engines to crawl and understand. Technical SEO matters because it lets search engines like Google recognize that you have a valuable website. Its important elements are crawling, indexing, rendering, and website architecture.
  • Time On Page (TOP)

    Time on page is calculated from the difference between when a person arrives at a page and when they move on to the next one; clicking a link to another page on the site triggers the calculation for the previous page. Together with bounce rate, time on page lets you judge whether you are driving the right traffic to your website, and it shows whether your content is relevant to your readers.
  • Title Tag

    A title tag is an HTML element found in the head section of a page's code that assigns a title to the page; it appears in the browser's title bar or tab and as the clickable headline on search engine results pages. Title tags are important because they tell readers what the page is about, and they matter to search engines for the same reason, except that search engines also use them to judge a page's relevance to a search query.
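SEO tools routinely read this element out of a page's markup. Here is a small Python sketch that pulls the title out of an HTML document (the page content is a made-up example; a production tool would use a full HTML parser rather than a regular expression):

```python
import re

def extract_title(html):
    """Return the text of the first <title> element, or None if absent."""
    match = re.search(r"<title[^>]*>(.*?)</title>", html,
                      re.IGNORECASE | re.DOTALL)
    return match.group(1).strip() if match else None

# Hypothetical page source.
page = "<html><head><title>SEO Glossary | Example Site</title></head><body>...</body></html>"
print(extract_title(page))  # SEO Glossary | Example Site
```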
  • Top Level Domain (TLD)

    The top-level domain is the last part of a domain name: the letters that follow the final dot in an Internet address, such as .com, .org, or .net. TLDs sit at the highest level of the Internet's hierarchical Domain Name System, directly below the root. A TLD can signal something about the website it belongs to, for example its purpose, the organization behind it, or the geographic area in which it was registered.
  • Topical Trust Flow

    Topical Trust Flow, a metric from Majestic, measures the topical relevance of the inbound links pointing to a site, that is, how trustworthy and authoritative a domain is within its niche and the topics its content centers on. Trust Flow is one of the most important metrics for gauging the value, quality, and search performance of a website; it can help you assess link-building potential, measure competition, and evaluate backlinks. A site's Trust Flow is based on how closely it is linked from a seed set of trusted sites, so the more high-quality referring domains a website has, the higher and more accurate its Topical Trust Flow becomes.
  • Traffic

    SEO traffic is organic traffic from search engines; coming from people who enter keywords or search queries on Google, view search results, and click on your website. You can use tools such as Google Analytics and Google Search Console to track SEO organic traffic.
  • Trust

    Trust has to do with a website's credibility: the more trustworthy and reliable your website is, the more likely your content is to rank high in Google's search results, earning what amounts to Google's stamp of approval. A site's trust score is the combination of many factors that together determine its perceived reliability.
  • Trust Rank

    TrustRank is a link-analysis algorithm designed to separate useful websites from spam and help search engines rank pages in SERPs (search engine results pages). It evaluates content quality by analyzing a page's content and its backlinks and checking whether the content is relevant to specific user queries. Search engines use many algorithms and quality signals to measure website quality; TrustRank is one of them.


  • Uniform Resource Locator (URL)

    A URL (Uniform Resource Locator), also known as an Internet or web address, is a form of URI (Uniform Resource Identifier) used to address documents accessible over the Internet and intranets. A URL is a link to a web resource that specifies the resource's location on a computer network and the mechanism for retrieving it. A website's URL appears in the address bar at the top of the browser.
  • Unique Content

    Unique content is content that is available only on your website and is completely different from all other content on the Internet: it is original and not copied from anywhere else. It distinguishes your website from others by offering information and value that cannot be found elsewhere, and creating it is important because it improves search engine optimization (SEO).
  • Unique Visitor

    Unique visitors is a marketing-analytics term for the number of distinct people who have visited a website or set of pages at least once in a given period, with each person counted only once. If one user visits a website multiple times, they are still counted as one visitor; the metric is also called "unique users". Unique visitors is an important metric when tracking a website or selling advertising space, because it shows how many different people saw the site. While repeated exposure to potential customers can help advertising, too many impressions per person can reduce the return on investment.
  • Universal Search

    Universal search (also known as blended search) refers to a search engine, such as Google or Bing, combining results from its various specialized databases into a single results page. This lets Google display images, local business information, maps, news, and similar results directly alongside, above, or between the regular web results. In other words, the engine pulls results from multiple verticals and blends them into one SERP.
  • Unnatural Link

    Unnatural links are artificial links created mainly to manipulate page rankings. They may be purchased links or links generated by spammers; they can put your site on search engines' radar for penalties and may associate it with bad neighborhoods on the Internet.
  • URL Parameter

    URL parameters are values appended to a URL, often used to track click information or to tell search engines how to handle certain parts of your website so they can crawl it more efficiently. Each parameter consists of a key and a value separated by an equals sign (=). The first parameter always follows a question mark in the URL, and additional parameters are joined with ampersands (&); a single URL can carry multiple parameters.
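Python's standard library can pull these key-value pairs apart. The URL below is a hypothetical product page with a tracking parameter:

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical URL: two product filters plus a campaign-tracking parameter.
url = "https://example.com/shoes?color=red&size=9&utm_source=newsletter"

# parse_qs returns each key mapped to a list, since keys can repeat.
params = parse_qs(urlparse(url).query)
print(params)  # {'color': ['red'], 'size': ['9'], 'utm_source': ['newsletter']}
```

Tools that audit crawl budget or detect duplicate content use exactly this kind of parsing to group URLs that differ only in their parameters.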
  • URL Rewrite

    URL rewriting makes URLs easier for users to remember. When a rewritten URL reaches the web server, the rewriting engine changes its syntax behind the scenes so that the right page or database item can be retrieved. A URL rewriter module performs URL-manipulation tasks such as defining rules that convert complex URLs into simple, consistent web addresses, replacing web application URLs with easy-to-use, search-friendly ones.
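A rewrite rule is essentially a pattern-to-target mapping. This Python sketch mimics what a server module like Apache's mod_rewrite does; the paths and query format are hypothetical:

```python
import re

# One hypothetical rule: map the pretty URL /products/123
# to the internal query-string URL the application actually serves.
RULE = (re.compile(r"^/products/(\d+)$"), r"/index.php?page=product&id=\1")

def rewrite(path):
    """Apply the rule if it matches; otherwise leave the path unchanged."""
    pattern, target = RULE
    return pattern.sub(target, path) if pattern.match(path) else path

print(rewrite("/products/123"))  # /index.php?page=product&id=123
print(rewrite("/about"))         # /about  (no rule matched)
```

Visitors and search engines only ever see the clean `/products/123` form, while the application still receives the parameters it expects.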
  • Usability

    Usability is a measure of how well a specific user can use a product or design in a specific context to achieve a defined goal effectively, efficiently, and satisfactorily. It measures the quality of the user interface, and the term also covers the techniques used to improve a design's ease of use throughout the development process, from early prototypes to the final product.
  • User-generated content (UGC)

    UGC refers to brand-related content created by people who do not officially represent your company: customers, employees, partners, influencers, and other business stakeholders. By definition, user-generated content is any form of content (text, posts, tweets, images, audio, comments, social media updates, videos, podcasts, and so on) created by individuals rather than brands and published on social networks or elsewhere on the Internet. UGC is not paid for before being published or shared online.
  • User Agent

    A user agent is software that retrieves, presents, and facilitates interaction with web content on behalf of an end user; it is a special type of software agent. It is a set of modules for obtaining, displaying, and managing web resources while interacting with other software, and it can expose information and interface content to assistive technologies that provide additional functions, visualizations, or user interfaces.
  • User Experience (UX)

    User experience (UX) is how a person feels when interacting with a system, whether a website, web application, or desktop program; it is studied within the field of human-computer interaction (HCI). UX design is the method of creating products that provide users with meaningful and relevant experiences. It spans the entire process of acquiring and integrating a product, including branding, design, and usability, and it focuses on understanding what users need and value, their capabilities, and their limitations, while also considering the business goals and objectives of the team leading the project. UX best practices improve the experience of your products and all related services.
  • User-Friendly

    Anything user-friendly avoids interrupting or disrupting readers as they consume or navigate content on a website, while still encouraging them to act; you just need to be smart about it. Making a website user-friendly, as explained under usability above, is the art of keeping it simple and easy to use. Optimizing a site for both SEO and usability means that Google and other search engines can efficiently crawl every page, correctly interpret the content, and index it, so they can then serve users the most relevant and valuable pages for the topics they search for.


  • Vertical Search

    Vertical search, also called specialized or topical search, is search within a specific subject area or segment of general search. Vertical search engines are built for specific types of content: they index and surface content for particular websites, categories, or industries, and they differ from general Internet search engines in that they focus on a specific slice of online content. They are also known as professional or topic search engines.
  • Viral Content

    Viral content is widely recognized online content through distribution and publication on social media, news sites, email newsletters, aggregators, and search engines. Web memes are classic examples of viral content, usually in the form of videos or pictures with one or two lines of text. Viral content can be broadcast through materials such as articles, pictures, or videos.
  • Virtual Assistant

    A virtual assistant is a worker who provides administrative services to clients from a remote location, usually a home office. Typical tasks include scheduling appointments, making phone calls, organizing travel, and managing email accounts; from digital marketing tasks and event management to personal errands, a virtual assistant can handle almost anything.
  • Visibility

    SEO visibility is a metric representing the percentage of all possible SERP (search engine results page) clicks that a website receives for a specific keyword. It is an index composed of various relevant search factors, used to estimate how visible a website is in organic search results.
  • Vlog

    Vlog refers to video blogs, which is a type of blog where most or all of the content is in video format. Essentially, video blogs are a new way of sharing content.
  • Voice Search

    As the name implies, voice search refers to the use of voice recognition technology to enable users to search simply by speaking on devices such as computers, smartphones, or smart devices. Voice search, also known as voice-enabled, enables users to search the Internet, websites, or applications using voice commands. In the broadest sense, voice search includes keyword queries.


  • Webmaster

    A webmaster is the person responsible for building or maintaining a website, especially for a company or organization. A webmaster may create and manage a site's content and organization, administer the technical side of its servers and programming, or both. Webmasters control everything related to the website: they monitor its performance, functionality, speed, design, and visibility in search engines, and they even track content, events, and marketing activities.
  • Web Scraping

    Web scraping, also known as web data extraction, is the process of using bots to extract content and data from a website, that is, to capture structured web data automatically. Unlike screen scraping, which only copies the pixels displayed on screen, web scraping extracts the underlying HTML and, with it, data stored in databases; a scraper can then copy a site's entire content to another location. The legality of web scraping is not absolute: it depends on the jurisdiction, the data involved, and the website's terms of service.
  • Web Spam

    Web spam is a set of techniques that attempt to game search engine ranking algorithms and force pages to rank higher than they otherwise would. It annoys search engine users and degrades the quality of search results, which is why most commercial search engines actively combat it. Web spam is therefore defined as a deliberate attempt to manipulate search engine rankings for specific keywords or searches.
  • Webpage

    A web page is a document available on the World Wide Web, stored on a web server and viewed with a web browser; a collection of related web pages on one server is called a website, and each page is assigned a uniform resource locator (URL). Web pages are formatted and designed using systems and standards such as hypertext and HTML, and a page can reference many types of resources, such as style information that controls its appearance, scripts, and images.
  • Website Speed

    See Site Speed.
  • White Hat

    White hats are ethical hackers: computer security professionals who specialize in penetration testing and other methods of verifying the security of an organization's information systems. They use the same techniques as black hat hackers, but only with the system owner's permission, which makes the process legal; they use their skills for good rather than harm. By analogy, white hat SEO refers to optimization techniques that comply with search engine guidelines.
  • Widget

    A web widget is a piece of code added to your website that pulls content from another website and displays it on your own pages. Web widgets go by many names, including HTML embeds, embed codes, plug-ins, and gadgets. Their purpose is usually simple but important: although widgets are built for specific functions, they can also serve as promotional tools for companies and their products, supporting activities such as monitoring website performance, search engine optimization, and even backlink building.
  • Word Count

    Word count is the total number of words in a document, file, or line of text. Word processors typically count words as you type, along with pages, paragraphs, lines, and characters, and display the count in the status bar. A word count normally includes everything in the main body: titles and headings, tables, citations, quotations, lists, and so on.
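The simplest definition of a word, and the one most counters start from, is a whitespace-separated token:

```python
def word_count(text):
    """Count whitespace-separated tokens, the simplest notion of a word."""
    return len(text.split())

print(word_count("Search engine optimization takes time and patience."))  # 7
```

Real word processors refine this with rules for hyphenation, punctuation, and numbers, but the whitespace split is the core idea.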


  • Extensible Hypertext Markup Language (XHTML)

    XHTML stands for Extensible Hypertext Markup Language, the intersection of HTML and XML: it is HTML redefined as an XML application. XHTML is almost identical to HTML but stricter, and it is compatible with all major browsers. It is a markup language for creating web pages and documents on the World Wide Web, designed as a successor to the original web markup language, HTML. Although a web browser renders XHTML into a human-readable document, XHTML should not be used to control a page's appearance; that is the job of CSS.
  • Extensible Markup Language (XML)

    Extensible Markup Language (XML) is a flexible, simple text format that defines a set of rules for encoding documents so that both humans and machines can read them. Originally developed to meet the challenges of large-scale electronic publishing, it now plays an increasingly important role in data exchange on the Internet and elsewhere. XML is a markup language derived from Standard Generalized Markup Language (SGML). XML tags identify, store, and organize data rather than specifying how to display it, as HTML tags do; XML structures data for storage and transmission, using tags to describe the components of a file.
  • XML Sitemap

    An XML sitemap is a simple text file, marked up in XML, that lists the URLs of the important pages on a website so that Google can find and crawl them. The XML sitemap is usually located in the root directory of the domain, ready to be accessed by bots. It allows webmasters to attach additional information to each URL: when it was last updated, how often it changes, and its importance relative to other URLs on the site. XML sitemaps are important for SEO because they make it easier for Google to discover the pages of your website. This matters because Google ranks web pages, not just websites.
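    A minimal sitemap following the sitemaps.org protocol looks like this; the `example.com` URL and the date are placeholders, and the `lastmod`, `changefreq`, and `priority` elements correspond to the optional per-URL information mentioned above.

    ```xml
    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2021-06-01</lastmod>
        <changefreq>weekly</changefreq>
        <priority>1.0</priority>
      </url>
    </urlset>
    ```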


  • Your Money or Your Life (YMYL)

    A "Your Money or Your Life" page is one that can potentially affect a user's future happiness, health, financial stability, or safety. If this type of information is presented in an inaccurate, false, or misleading way, it can directly harm the user. In other words, the stakes of this content are high, and Google expects its quality raters to hold such pages to a higher "Page Quality" (PQ) standard than other types of websites. They are called YMYL pages because they can directly affect your money or your life, hence the name. YMYL is how Google describes websites and content that could negatively impact a user's quality of life and/or finances.
  • Yandex

    Yandex is a search engine developed specifically for the Russian market to better handle the particularities of Russian-language search. Whereas Google scores pages that match a user's exact query, Yandex can analyze the user's synonyms and intent, regardless of spelling. It is Russia's leading and largest search engine, with a 60% market share, and offers a browser, email, a news aggregator, maps, paid services, and a translator to users in Russia, Belarus, Kazakhstan, Uzbekistan, Ukraine, and Turkey. More than 57% of Russian Internet users rely on Yandex as their main search engine. It is very similar to Google and easy to use, and e-commerce sites tend to rank very well on Yandex.
  • YouTube

    YouTube is a social media platform for online video sharing. YouTube SEO, as the name suggests, is the practice of optimizing your videos to rank higher in YouTube search for specific search terms. Search results display ads at the top, followed by organic results. When deciding which videos to show for a query, YouTube evaluates various video attributes, such as the title, description, and transcript. YouTube SEO ensures these attributes are optimized so that your videos appear for relevant keywords. Unlike Google, which weighs backlinks and other factors in its rankings, YouTube SEO is about optimizing your channel, playlists, metadata, descriptions, and the videos themselves. You can optimize your videos to rank both on YouTube and elsewhere.
Daniel Dubois

Passionate about the Web since 2007, Daniel champions the underdogs of the web by building sites that respect W3C standards. Drawing on several years of experience, he shares his knowledge in an open-source spirit.
Deeply involved in the Joomla community since 2014, he is active in several projects, a conference speaker, and the founder of the JUG Breizh.