The 30 Minute Guide to SEO

It seems like a daily occurrence that someone in our circle says they want to learn SEO, but they just don’t have the time or money to take a course on it.

We’ve tried to condense SEO into a guide that is comprehensive, yet concise and accessible. And we think we’ve done it. So, without further ado, let’s dive right into a beginner’s guide to SEO…

Table of Contents

 
How Search Engines Work
Robots & The Index
Algorithms
SERPs
Advertising

Architecture
Security
Sitemap
Robots.txt
Structured Data
Page Speed
Mobile

Content
Keywords
Title
Meta Descriptions
Headings
Media

Authority
Competitive Analysis
Blogging
Guest Contribution
Infographics
Tools
Redirects
Black Hat SEO

Rising Trends
Voice
Video
Machine Learning

SEO is an acronym for search engine optimization.

SEO is the discipline of optimizing a website for visibility on search engine results pages, in an effort to acquire customers and market share through the abundant levels of traffic that top rankings in search engines can provide.

More people use search engines to find products & services than any other marketing channel — that’s right — more than TV, radio, billboards, Facebook ads, magazines, newspapers, banner ads and corporate sponsorships. With that in mind, SEO has become a vital marketing channel for anyone interested in building a brand or business.

But, how does SEO work?

To answer that question, we have to first understand how search engines themselves work.

How Search Engines Work

Right now, I want you to visit your favorite search engine.

Type your query into the search bar.

Click the “Search” button.

Voila!

In less than a second, a ton of results popped up. That was easy!

Now, have you ever stopped to think about how this technology is made possible? Did the search engine really just look up billions of webpages, find the most relevant ones and then display them in some magical order at the speed of light?

Not quite.

Let’s find out how they really work.

Robots & The Index

The internet is a gigantic place with an abundance of information to uncover. In an attempt to access and catalog all of it, search engines have created software (commonly referred to as a ‘robot’ or ‘spider’) which automatically crawls the web, finds every bit of information it possibly can, and stores it in an index for users to access at any point in time through the search engine search bar.

Every minute of every day, new websites are created and existing websites are changed, which means robots re-crawl the web on a regular basis in order to keep their index as accurate and up-to-date as possible.

This is how search engines attempt to be a reflection of what information is out there on the web.

So in reality, every time you run a query through a search engine, you are actually not searching the web, you are rather searching their index of the web.

Every search engine robot & index is programmed differently, which is one reason why when you conduct the exact same search query on two different engines, you will often get completely different results.

But there is an additional reason why results differ between search engines, which we will discover next.

Algorithms

Whenever we conduct a search, the search engine pulls data from their index, displaying what they found on the web in relation to that search term — but that raises the question: how do they decide which result to list first? Which result comes second? And third? And on and on?

Algorithms.

An algorithm is a set of rules which, in this instance, automatically arranges the search engine results page. It’s a complex formula made up of dozens of factors.

It is not surprising that many website owners (referred to as ‘webmasters’) want to discover exactly what the search engine algorithms consist of, in order to make their own websites rank at the top, but search engines do not reveal this information lest any individual or corporation take advantage of the system.

Furthermore, search engines are constantly making changes to their algorithms, tweaking and adjusting them every so often to produce the highest quality and most relevant results possible. Search engine algorithms are perpetually under construction and review, which should make one thing clear to webmasters: diverting attention away from developing a great website in order to chase factors that may or may not influence an algorithm is simply not a good use of time or energy.

SERPs

The results found on a search engine are based on a complex algorithm which tries to provide us with the best possible result first. These results are displayed on what are known as the search engine results pages, or simply ‘SERPs’.

The SERPs have evolved over the years.

What used to be a simple list of 10 text links to choose from has now transformed into multiple forms of data presented to address a search query better than ever before. These are commonly known as SERP features, and it appears they are here to stay.

While SERP features appear on almost any query searched, they are most commonly observed for searches such as:

• Direct questions e.g. “How old is the earth?”
• Movie titles e.g. “The Fast and The Furious”
• Celebrity names e.g. “Denzel Washington”
• Addresses e.g. “725 5th Ave New York, NY 10022”
• Product prices e.g. “Tesla Model X Price”
• Local businesses e.g. “Thai restaurants near me”
• Weather forecasts e.g. “Glendale CA weather”
• Unit conversions e.g. “300 lbs in kgs”
• Medical diagnosis e.g. “Tuberculosis symptoms”

As search engines get smarter, SERPs will continue to evolve to provide results with greater relevance at a quicker rate.

Search engines want to give users the answers they seek as quickly as possible, thus will do what they can to provide information in a format they believe will satisfy a query best.

Advertising

Search engines display their results in a variety of formats to best match the nature of a query. In every SERP, it is important to acknowledge a subset of results which are unlike the rest. These results are not a part of the index and not subject to the algorithm. They are, in fact, advertisements.

Search engines usually display two types of results for every query: The organic results, and the paid results.

Paid results often look very similar to organic results, but can be identified through the ‘Ad’ label or similar.

Webmasters who advertise with search engines pay for every visitor that clicks through to their website, which is known as a ‘pay per click’ or PPC model. The price for each click is determined by the keywords they want to advertise for, as well as what position they want to be placed in amongst the other advertisers. As an example, a competitive term like ‘insurance’ could cost $55 per click, whereas ‘boat shoes’ may only cost $9 per click.

Advertising is how search engines make money and support the ongoing cost of crawling and indexing the entire web, which is a big task that requires a lot of resources.

Architecture

Earlier we mentioned algorithms — sets of rules which are used to automatically arrange search engine results pages.

While most factors constituting search algorithms are unknown, search engines have acknowledged a number of overarching components to consider.

Architecture is one.

Architecture is the structure or frame of a site. It is a primary element to consider when optimizing a website for search engine visibility.

Let’s see what we can learn about building a solid site architecture.

Security

It is imperative for search engines that their users enjoy a safe web browsing experience. As a result, security is taken into consideration when determining a site’s visibility in the SERPs.

Establishing an acceptable level of website security is neither a complicated nor a difficult assignment. It can usually be achieved through the installation of a secure sockets layer (SSL) certificate, which enables SSL encryption.

Any reputable web hosting company should provide SSL encryption as an option; some may include it as a standard feature, while others may charge extra for it.

The most obvious way to tell if a website is secure is by looking at the website address in the address bar: does it begin with HTTP or HTTPS?

HTTP is an acronym for hypertext transfer protocol, which is the most commonly used protocol for communication between a server and a web browser. HTTPS is essentially the same, but the additional ‘S’ stands for secure and indicates that all visitor data is encrypted, so sensitive information cannot be intercepted in transit.


If a website address begins with HTTPS, it is secure.


Another way to determine if a website is secure is the visual cue of a lock icon in the address bar, although the exact indicator varies from browser to browser.
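Once an SSL certificate is installed, many webmasters also redirect all HTTP traffic to the HTTPS version of the site so that users and robots only ever see the secure address. As a rough sketch only, assuming an Apache server with mod_rewrite enabled (hosting platforms often handle this for you), the rules in a .htaccess file might look something like this:
# Send every HTTP request to the HTTPS version of the same URL
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]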

Sitemap

A sitemap is, somewhat literally, a map of a website. It is designed to help robots find, classify and organize pages of a site they may not otherwise have found on their own.


Sitemaps are almost always found in the form of a .xml file, listing every webpage in a sequential order for a search engine bot to crawl and understand the overall context, relevance and hierarchy of an entire site.

While the creation of a sitemap is not a critical necessity, it certainly does help.

Search engines read sitemap files to more intelligently crawl a site. A sitemap tells a robot which pages are important and also provides valuable information about these pages, such as when the page was last updated, how often the page is changed, and any alternate language versions of a page.


If a live webpage cannot be found through internal references within a site, but is included in the sitemap, search engine robots will still be able to discover and crawl that page.


A sitemap can be seen as an auxiliary method of ensuring every page that should be found and included in a search engine index has the best possible chance to do so.
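To make this concrete, a minimal sitemap.xml with a single entry might look something like the sketch below. The URL and values are placeholders, while the lastmod and changefreq fields carry the ‘last updated’ and ‘how often the page is changed’ information mentioned above:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/products/</loc>
    <lastmod>2020-01-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>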

Robots.txt

Robots.txt is a file which essentially tells search engine bots how to behave — in other words, what they should and shouldn’t crawl.

A robots.txt file is useful when webmasters don’t want search engines to crawl, and subsequently display, pages of their site that serve no commercial purpose or that could potentially expose site vulnerabilities.

Inserting a simple disallow command is sufficient to prevent a robot from crawling an unwanted area of a website, such as the admin folder, or the shopping cart pages of an eCommerce site.

Webmasters can even specify which search engine robots they want to block or allow, if they choose to do so.

Robots.txt is also helpful when directing robots to the location of the website sitemap.

A sample robots.txt may look something like this:
User-agent: *
Disallow: /cgi-bin/
Disallow: /admin/
Sitemap: http://abc.com/sitemap.xml

Structured Data

Structured data is a standardized format of providing information about a page and classifying page content.


When applicable, it is important to ensure a website contains structured data to help the search engines better understand its content.

This can be achieved with schema markup.

Schema is a collaborative structured data vocabulary which can be applied to pages that contain specific types of content. Effectively utilizing schema markup will better equip a page for rankings and may result in rich results in the SERPs. Rich results highlight certain data in a listing, which can lead to higher click-through rates and user engagement.

A small list of examples where schema can be executed include:

• Reviews
• Prices
• Articles
• Events
• Recipes
• Movies
• Books


Search engines are getting smarter and should eventually reach a point where they understand site data without the need for structured data, but for now it’s a good idea to implement schema when at all possible.


All types of structured data markup can be accessed online at schema.org.
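As an illustrative sketch only, a page featuring a recipe could describe itself to search engines with a small JSON-LD snippet placed in its HTML; every name and value below is a placeholder:
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Classic Banana Bread",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "prepTime": "PT15M",
  "cookTime": "PT1H",
  "aggregateRating": { "@type": "AggregateRating", "ratingValue": "4.8", "ratingCount": "231" }
}
</script>
Marked up this way, details such as the rating and cooking times become candidates for rich results in the SERPs.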

Page Speed

How fast a page loads has been identified by search engines as a visibility factor — it is therefore crucial webpage loading speed be lightning quick.

The quicker a page loads, the better for search engines and users alike.

According to the most recent data, 47% of consumers expect a web page to load in 2 seconds or less and over 40% of website visitors abandon a website that takes more than 3 seconds to load. And perhaps the most alarming statistic of all: Every second delay in page response results in a 7% reduction in conversions.

In other words, a slow loading website doesn’t just come at the cost of lost rankings, but customers also.


Users hate waiting for pages to load, and search engines want to ensure they have a positive browsing experience, thus they will reward a website for implementing measures to dramatically speed up their page loading time.

The simple solution is to make the website load faster, but how exactly is that done?


While every situation is unique, best practices for increasing site speed include, but are not limited to:

• Removing unnecessary page requests
• Minifying code (HTML, JavaScript, CSS)
• Compressing images
• Enabling caching
• Utilizing content delivery networks

When in doubt, speed testing tools such as Pingdom and PageSpeed Insights can help diagnose slow performance and identify solutions to explore.
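As a small illustration of the spirit of these practices (the file names are placeholders), modern HTML lets a webmaster defer non-critical scripts and lazy-load images, reducing the work a browser must do before the page first renders:
<!-- defer a non-critical script so it does not block rendering -->
<script src="/js/analytics.js" defer></script>
<!-- load the image only when it is about to scroll into view -->
<img src="/images/product-photo.jpg" alt="Product photo" loading="lazy" width="600" height="400">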

Mobile

More than half of all searches are conducted on a mobile phone.


Mobile phone usage will only continue to grow. As a result, it is vital websites display, navigate and interact well on a mobile device.


Some webmasters have attempted to optimize for mobile devices by creating a ‘mobile version’ of a site, that is, a stripped down version of the same website which lives on a separate subdomain or directory. However, this approach proves inefficient due to possible duplication and dilution issues, let alone the extra resources required for the design and maintenance of a separate entity.

Mobile responsive design, however, appears to be a superior step toward ranking well on mobile devices. Mobile responsive design simply refers to a website automatically ‘responding’ to the screen dimensions of a device it is accessed on — to put it another way, if a website is viewed on a mobile phone, the layout will adjust itself to the full dimensions of the screen, while simultaneously conforming to a laptop or desktop screen when required.
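A minimal sketch of that responsive behavior (the class name and breakpoint are illustrative only) is a viewport declaration plus a CSS media query that adapts the layout to the screen width:
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  .content { width: 960px; margin: 0 auto; }  /* laptop and desktop layout */
  @media (max-width: 768px) {
    .content { width: 100%; }                 /* fill the screen on phones */
  }
</style>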


Implementing a mobile responsive design and ensuring other elements interact well on a mobile device will help with overall user experience and engagement, subsequently leading to greater visibility.

Content

Following architecture, content would be the next pillar to establish in the endeavor to create a website that will rank well in search engines.

It is crucial not only to feature content which is beneficial and insightful, but also to ensure it is well optimized, meaning it utilizes a combination of targeted keywords in key areas of a page.

Let’s see what we can learn about creating content for SEO.

Keywords

Keywords hold great significance in the world of SEO.

Otherwise known as ‘search terms’, keywords are the phrases users type into the search bar of a search engine to find what they are looking for.

Every search query begins with a keyword.

Whether a user searches for “dog food” or “best hotel with a pool in Las Vegas NV”, there are webmasters who want to rank well for it. However, not every keyword is created equal.

Some keywords are preferable to others, based on search volume and user intent.

For example, there are many more searches for the keyword “iPhone” than for “where to buy an iPhone XR”. With that in mind, some telecom sites would prefer to rank for “iPhone” if given the choice.

Having said that, while the singular “iPhone” keyword has greater search volume, there is a good chance those users are just looking for specs on the device, comparing it to other models, writing a paper about technology, or searching for a myriad of other reasons in which the user intent is unclear. “Where to buy an iPhone XR”, on the other hand, is clear in its intent and therefore a much more profitable keyword, despite the lower search volume.


It’s necessary to conduct extensive keyword research to truly understand not only what people are looking for, but also what they are not looking for. Yes, anyone can brainstorm and make fairly safe assumptions about what people are searching for, but there are just too many variations, ideas and concepts to comprehend on one’s own. It’s always a good idea to go deep on keyword research, using a range of tools to get a broader understanding of the targeted industry.


There is an abundance of keyword research tools which gather data on what people are searching for on a regular basis.

Most of these tools are reasonably priced and tend to present reliable data, often pulling directly from search engine APIs (application programming interfaces), although it’s always best to use a combination of tools to lower the chance of acting on inaccurate data.


Knowing the right keywords to target is half the content battle.

Title

A title is the element used to explain to search engines and users what a webpage is all about.


In the SERPs, the title is what the user clicks on to visit a site, therefore it is vital it describe content accurately and feature the relevant keywords users are searching for.

For many users, the title is the first thing they notice on a search engine results page. In other words, it is their first impression of a site, and as such it is perhaps one of the most important elements to consider in an SEO campaign.

A lot of webmasters forgo the benefits of optimizing their titles by either leaving the default title as is, using the exact same title on all pages, or simply not utilizing the title to its full character limit, thus relinquishing the opportunity for greater visibility.

Writing a unique, optimized title for every page, with the appropriate keywords and full use of the character limit, is recommended to ensure a website is in the best position to rank higher for a greater number of relevant terms.


A sample title tag may take the following form:
<title>Summary of Company Products or Services | Company Name</title>

Meta Description

Meta descriptions are the sections of content found underneath a title in the SERPs.

A meta description exists to further explain exactly what a page is about and what a user can expect to find once they click through to view the page.

It is ideal to make the most of a meta description, ensuring it is succinct, persuasive and incentivizes users to visit the page. After all, it makes no sense to expend a considerable amount of effort to rank well in the search engine results, only to have people scroll right over a result due to a poorly constructed or irrelevant meta description.


Similar to a title tag, meta descriptions should be optimized for search engines by ensuring a unique description for every page, utilizing the description to the full character limits and including relevant keywords to maximize the chance of a user clicking through to the website.


A sample meta description tag may take the following form:

<meta name="description" content="We’ve been in business for 50 years. Every product comes with our 30 day money back guarantee. Guaranteed lowest prices. Visit us today!">


Headings

Headings are the headlines of content on a webpage.

Headings help search engines better evaluate the structure of text on a page, and help users navigate a page with greater clarity.


Sub headings are also used when content is broken into sub sections, and again when those sub sections are broken down further.


Both headings and sub headings have their own particular tag for search engines to understand and process, commonly known as H-tags.


While there is usually only one heading per page, multiple classes of sub headings can be used on a single page.

For each sub section of content created, an appropriate sub heading tag can be created, up to six levels down. This is usually not necessary, however it is there if need be.

The appropriate syntax for heading tags would be the following:
<h1>This is the main heading</h1>
<h2>This is a sub heading</h2>
<h3>As is this one</h3>
<h4>And another one</h4>
<h5>And one more</h5>
<h6>And the deepest level</h6>

Media

Despite advances in robot crawling technology, search engines simply cannot crawl certain elements as easily as text. This includes most forms of multimedia and any other non-text content, which is often ignored or devalued by search engine bots.

If an entire page is made up of Flash, a search engine essentially sees nothing. The same applies to video, JavaScript and other types of media.

With this in mind, many websites do not feature Flash or excessive video. Images, however, are still an important medium which not only enhances the design of a site, but is usually necessary for a large portion of websites.

Search engines have provided a specific kind of attribution webmasters can utilize for images: an attribute commonly known as the ‘ALT’ tag, which helps identify an image’s context and relevance. It is ideal for images to include these tags, as they can also assist with visibility in image search.

Favoring text over other forms of media will help search engines crawl a site more easily and ensure pages load quicker, which is ideal for both users and search engines alike. In the cases where images are absolutely necessary, however, it is imperative ALT tags be applied.

A sample ALT tag would appear as:

<img src="https://www.websitename.com/image.jpg" alt="name of image">

Authority

Architecture and content are integral elements to a search engine algorithm.


Authority is another.

Arguably the strongest factor of all, authority is what separates two very similar sites that both have solid architecture and relevant content.


Authority is attained through the accumulation of links — the anchors we see all over the web that direct us from one page to another. When one website links to another, search engines see it as a vote of authority.

Let’s see what we can learn.

Competitive Analysis

Links are commonly seen as a proverbial opinion poll of the web. In other words, links tell search engines which websites are trustworthy and relevant, since another website would not direct its users there unless it is a useful resource in that particular context.


Unless a site is in an entirely new category which no other company has ventured into, there are almost certainly existing competitors who have had the opportunity to establish market share and, subsequently, gain a considerable number of links to their websites from a variety of sources.

Competitive analysis refers to actively finding and analyzing where the competition is currently receiving their links from, and mimicking those efforts to gain links from the same sources.


Competitive analysis is usually the first tactic webmasters use to gain links relatively quickly, since it tends to be an easy and effective method with minor risk. The logic behind this strategy is that if a certain directory, blog, news outlet or resource links to much of the competition, there is a good chance it will link to them also — which is often the case.


Most webmasters use link analysis tools to find these types of links, as it is almost impossible to find them manually. It may involve an upfront or recurring cost to access such tools, but many find them to be worth their weight in SEO gold.

Blogging

A blog, short for ‘web log’, is a website which is updated regularly with chronological entries. Blogs tend to take the form of an online public journal where anyone can read and respond to an entry.

While the first blog was written in 1994, it has only been in the past few years that blogs have become a very popular medium. Their popularity is due to the low barrier to entry and the widely available blogging services which have made it possible for the general public, even those who are not very technologically savvy, to comfortably create a blog.

There are an estimated 500 million blogs across the web, with 2 million blog posts being written every day. Many blogs have gained an extraordinary level of authority through consistent, intentional, high quality content, and continue to use it as their main method of attracting authority and links.

Once a blog has reached a certain critical mass or tipping point, it will gain authority almost automatically. Every post naturally receives links from readers, industry outlets and commentators, in the same way famous celebrities receive attention for anything they do. It is for this reason blogging is regarded as a fantastic way to gain authority, albeit a long term strategy that requires extreme patience and persistence.

Guest Contribution

If writing for authority and SEO sounds like a viable option but the idea of maintaining a blog is worrisome or undesirable, guest contributions on external publications may be a feasible alternative.

Otherwise known as guest posting or guest writing, the concept of guest contribution is as follows: write a relevant, share-worthy piece of content for a particular audience, then reach out to a popular writer in that category to have them publish it on their platform, in exchange for a link back to the author’s website.

It is important to stress the quality and relevance of the proposed content, as writers receive many requests on a daily basis — there is also a level of risk for them to publish content to an audience who is only familiar with their own work.

Guest contribution remains a popular method of building authority, as there is a benefit for all parties involved: the popular writer wins because they were able to publish a post with no effort on their part; the audience wins because they consume a new & interesting piece of content, and the author wins because they were able to receive some recognition and a well earned link.

As long as people read articles and content online, guest contribution will continue to work and bring value.

Infographics

Infographics, short for information graphics, are a certain kind of image in which vast amounts of data or information sets are summarized and displayed as an easy-to-understand graphic.


Also known as data visualization, infographics are a proven way to gain authority. This is due to the share-worthy nature of the content, as many people are interested in simplified and visually appealing content, thus will subsequently link to it or share it with others.

Infographics can work for virtually any industry. A wide range of data can be applied to infographics, such as:

• Industry statistics
• Polls, surveys or lists
• Economic or political analysis
• Documentation of any processes
• Comparisons of products or services
• History of a company, person or place
• Explanations of concepts
• Facts on an object or event
• Perspectives of time or space
• Geographic norms or disparities
• Applicable tips or tricks
• Extrapolations or applications of findings
• Clinical trial discoveries

There are many ways to construct and apply information to a graphical representation.


As a strategy, some webmasters even repurpose previously successful infographics, either by applying the concept to another industry or by updating it to reflect modern times, e.g. refreshing an infographic from 2015 to what is relevant for 2020.

Infographics have endless possibilities, and are typically only limited by our own creativity.

Tools

Tools are online resources which can be accessed for free while providing genuine value for users.


The idea behind producing a tool for authority is the anticipated publicity that will come with it. If a tool yields enough value, it should attract links passively and organically without much need for manual outreach or extensive promotion.

Examples of an online tool could include a:

• Mortgage repayment calculator
• Legal contract generator
• Image resizer
• PDF to DOC file converter
• Gif maker
• File compressor
• Website SEO auditor
• Programming code compiler
• Social media analyzer
• Photo editor
• Internet speed checker
• Resume builder

Another option would be to modify a popular existing tool by making it better, faster or more user friendly.

While tools may be costly to develop, many businesses employ this strategy not only for prospective authority, but for automated lead generation also. In this instance, tools have a “double benefit” which can justify their development cost.


Similar to infographics, the possibilities for tools are only limited by our own creativity.

Redirects

A redirect is what happens when a site or page loads in place of another.


A redirect is usually implemented when a page no longer exists and there is a new, alternative destination users should visit instead. This may occur, for example, when a company is acquired or merged with another, therefore visitors should access the new company website instead.

Redirects are also used when a page is retired or made redundant and a more relevant page is suitable. In these instances, a permanent redirect (sometimes referred to as a ‘301’ redirect) is implemented to notify search engines that the page or website should be permanently replaced with the preferred URL.
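As a rough illustration only, assuming an Apache server and placeholder paths, a single permanent redirect can be declared in a .htaccess file like so:
# Permanently (301) redirect a retired page to its replacement
Redirect 301 /old-page.html https://www.example.com/new-page/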



A secondary function of redirects is that they also pass on authority.

Websites shut down all the time, including older ones that were once great resources, had gathered some authority, and had a decent number of websites pointing links to them. With that in mind, a webmaster may purchase a recently shut down domain and redirect it to their own site, prompting search engines to pass the newly acquired authority on to them and allowing the webmaster to reap the benefits of the previous site’s authority for relatively low cost and effort.


Redirects are also powerful ways to recoup any lost authority due to a site migration or redesign.

Black Hat SEO

Black hat SEO is the practice of manipulating search engine algorithms using strategies that are against search engine guidelines.


White hat SEO, of course, is the opposite of black hat. It involves remaining strictly within search engine guidelines and not engaging in any webmaster activity that could be considered questionable.


Gray hat SEO sits somewhere in the middle, where webmasters toe the line of search engine guidelines. Gray hat tactics are debatable, but not as clearly in violation of guidelines as black hat SEO.


A large number of individuals make a living off the traffic they generate through SEO. With competition being extremely high in some categories, there’s a portion of webmasters who will go to great lengths to ensure they rank exceptionally high, even engaging in ethically questionable practices which are often termed “black hat”.


Search engines are constantly tweaking their algorithm in an effort to devalue and counter black hat tactics.

The most common black hat SEO methods undertaken today include paid links, where websites are paid to point links to a certain website in order to gain authority relatively quickly. Another tactic is cloaking, which is the act of displaying different content to users and to search engines respectively.

It’s important to remember black hat techniques are high-risk in nature and participation in them is not recommended. Building a website with great architecture and optimized content that will naturally gain authority over time is a better, safer, long term strategy which will yield a better return on investment.

Rising Trends

Architecture, content and authority are major components of how search engines connect users with websites, but that won’t always be the case.

Search engines are constantly evolving.


What works today will not necessarily work tomorrow. While it’s not clear exactly what the future holds for search engines, robots and algorithms, there are some trends that are obvious and already active in some capacity, and we can expect to see more of them in the near future. Let’s explore some of them.

Voice

Voice makes our lives convenient.

Almost half of all online users perform at least one voice search every day.


Voice is both easy and quick. It has begun to penetrate several areas of our lives, with search engines being no exception.


Voice search is here, and is showing no signs of slowing down.


In fact, voice searches are up more than 3000% from 10 years ago — these days, 20% of all searches on mobile are voice searches.


With the development and widespread adoption of smart home devices, the importance of voice is only magnified.

With that in mind, it would be wise to optimize a webpage for voice search. This includes analyzing the results from a voice search to understand what data a webmaster should focus on and improve in order to stay visible and relevant.


The rise of voice also reinforces the importance of mobile search, and how crucial it is to optimize a website for mobile and voice as a combined effort.

Video

A recent report indicates 43% of consumers want to see more video content from companies.

Video consumption is at an all time high and we can safely assume that trend will continue to grow. In fact, over half of all search engine results pages include a video-based result at this point in time.



Another report estimates video will make up 80% of all online traffic by 2021. Taking that into consideration, it seems neglecting video will potentially equate to a decline in search engine traffic when video takes precedence some time in the near future.

While some may be disheartened by the introduction of yet another form of content to consider for SEO, the practical appeal of video is its adaptability to other mediums that are still effective today, including transcription to text and repurposing into images.


Video search engines could very well be the future of search, and we can see some early signs of that. As a result, video should be incorporated into any SEO strategy as soon as viably possible.

Machine Learning

Search engines want to display the very best websites for users no matter what the search query is.


Search engine algorithms, as discussed earlier, evaluate architecture, content and authority among other varying factors — but sometimes even those components are not enough to help distinguish a great result from the best result possible.

For that reason, search engines are using machine learning to analyze user behavior and determine which result deserves the most visibility, drawing on engagement attributes such as click-through rate, bounce rate, dwell time and other elements. These signals help them understand what users are really looking for, and whether they are satisfied with the results they are currently served.

When a user clicks on a result and remains on the page for a considerable amount of time, this indicates the page has satisfied the user’s request, and the webpage is seen as a better result when compared to the competing results.

Machine learning, in this sense, is a promising approach for search engines to better serve the user, promoting websites they visit and spend more time on, while simultaneously demoting the sites they don’t visit, or spend very little time on.

Final Thoughts

SEO is constantly evolving — there is no denying the time and dedication required to remain on top of it all. Having said that, the fundamentals we have covered here make future developments and maturation of search much easier to understand and accommodate.


Research extensively.

Test constantly.


And never forget the golden rule of SEO: make a webpage for users, not search engines.

Fin!

About the author

Sebastian

Sebastian is a veteran digital marketing expert with 23+ years of experience across hundreds of brands, and curates a weekly marketing newsletter.