A Complete Guide to Technical SEO


Whenever we say Technical SEO, what comes to mind? Most probably, solving technical issues. Page speed, thin content, crawling and indexing — all these terms relate to the technical side of your site. If you are a beginner who wants to learn and implement Technical SEO, I have put together a complete guide for you. It covers everything from basic to expert level. So let’s start.

Fundamentals of Technical SEO

What is Technical SEO?

Before you can improve technical SEO, you should first understand what it is.

Basically, Technical SEO is the process of ensuring that your website meets the technical requirements of modern search engines. 
We focus on factors like page speed, crawling, indexing and website architecture, with the combined goal of earning higher rankings.

Importance of Technical SEO

Let’s say you have designed the best site, done great keyword research and poured all of your effort into creating the content. But what if your technical SEO is still messed up? A little negligence here can spoil all that hard work, and as a result you will not rank. At the very first stage, Google and other search engines crawl and index your site to find and store your content.

But that’s just the initial step. Even if Google has indexed all your pages, your job is not done, because your site still needs to be fully optimized for technical SEO.

Your pages should be secure, your site should be mobile friendly, and your content should load fast and be free of duplication. Beyond these, there are many other technical aspects that need your attention. I am not saying your technical SEO has to be perfect. But it should be good enough to make it easy for Google to access your content.

The easier it is for Google to access your content, the better your chances of ranking.

How to improve Technical SEO?

Beyond crawling and indexing, technical SEO covers many other things. You have to take account of the following:

Javascript

XML sitemaps

Site architecture

URL structure

Structured Data

301 Redirects

404 Pages 

Canonical Tags 

Thin content 

Duplicate Content 

Hreflang 

I am going to cover all these things in my guide.

Site Structure and Navigation 

Your site structure is the first thing to consider on your technical SEO checklist — even before crawling and indexing.

But why?

The reason is that most crawling and indexing issues arise from a poorly designed site structure. So if you get this step right, you won’t need to worry much about indexing.

The next thing is that your site structure impacts everything else you do to optimize your site: everything from your URLs to your sitemap to using robots.txt to block search engines from certain pages. In short, a strong site structure makes every other technical SEO task much easier. Here are some steps to follow for a good site structure.

Use a Flat, Organized Site Structure

Your site structure is the way you have organized the pages on your website.

In general, you should have a site structure that is flat. In other words: your site’s pages should be interconnected and only a few links away from one another.

This is important so that Google can crawl and index 100% of your site’s pages.

Maintaining a flat structure is not so difficult for a blog or a local shop website, but it can be tricky for ecommerce sites.

Ecommerce sites have hundreds of product pages, which can be difficult to organize into a flat structure.

A messy site structure can also create orphan pages (pages with no internal links pointing to them, left “orphaned” from the rest of the site).

Orphan pages make it hard to identify and address indexing issues.
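To make click depth and orphan pages concrete, here is a minimal sketch (in Python, over a made-up mini-site) of how you could compute each page’s distance from the homepage with a breadth-first search of the internal link graph; any page the search never reaches is an orphan:

```python
from collections import deque

def click_depths(links, homepage):
    """Breadth-first search over the internal link graph. Returns each
    reachable page's click depth (number of links) from the homepage."""
    depths = {homepage: 0}
    queue = deque([homepage])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical mini-site: "/orphan" exists but nothing links to it.
links = {
    "/": ["/blog", "/shop"],
    "/blog": ["/blog/post-1"],
    "/shop": ["/shop/shoes"],
}
all_pages = ["/", "/blog", "/shop", "/blog/post-1", "/shop/shoes", "/orphan"]

depths = click_depths(links, "/")
orphans = [p for p in all_pages if p not in depths]  # pages BFS never reached
```

A crawler like Screaming Frog does essentially this at scale; the page names here are invented for illustration.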

You can use the Semrush Site audit feature to get an overview of your site structure.

It helps, but it’s not very visual. To get a more detailed look at how your site’s pages are linked together, check out Visual Site Mapper.

It’s a free tool for an interactive look at your site structure.

Consistent URL Structure

You don’t need to stress over your URL structure, especially if you run a small site like a blog.

Still, it’s worth using a logical, consistent URL structure; it helps users understand where each page sits on your site.

And putting your pages under specific categories gives Google extra context about each page.

Breadcrumb Navigation

We all know that adding breadcrumb navigation to your site is super SEO friendly. 

This is because adding breadcrumbs automatically adds internal links to categories or subpages on your site.

It helps solidify your site structure. Keep in mind that Google has turned URLs into breadcrumb-style navigation in the search results.

So try to use breadcrumbs wherever it makes sense.

Crawling, Rendering and Indexing

All of these processes determine how easily search engines can access your site.

I will discuss how to find and fix crawling errors, and how to send crawlers to the deeper pages of your site.

Identify Indexing Issues

First, find the pages on your site that search engines have trouble crawling. You can do this in three ways.

Coverage Report

You should start with the Coverage report in Google Search Console.

This report tells you which pages Google is unable to index or access.

Screaming Frog

Screaming Frog is the world’s most famous crawler for a valid reason: it’s really good at what it does.

Once you’ve worked through the Coverage report and fixed any issues, I recommend running a full crawl with Screaming Frog.

Semrush Site Audit

Semrush offers a solid Site Audit tool.

This tool is my personal favorite because it gives you an overall view of your site’s technical SEO health, including a site performance report and issues with your site’s HTML tags.

All three tools have their own advantages and disadvantages. So, if you are running a large site, I recommend trying all three.

That way, nothing gets left out.

You usually won’t have trouble getting your homepage indexed; it’s the deeper pages, several links away from the homepage, that can cause problems.

A flat structure usually prevents these issues: your deepest page should be only 4 to 5 clicks away from your homepage.

If you want a specific page indexed, just add an old-fashioned internal link to it — ideally from a page with high page and domain authority that gets crawled regularly.

Use an XML Sitemap 

Does Google still need an XML sitemap, in this age of mobile friendliness and AMP, to find your site’s pages?

The answer is yes. 

Google states that XML sitemaps are the second most important source for discovering URLs.

If XML sitemaps are second, what’s the first?

Google hasn’t said, but it’s presumably internal and external links.

If you want to double-check your sitemap, open the “Sitemaps” feature in Google Search Console.

This shows you the sitemap that Google sees.
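Under the hood, a sitemap is just an XML file listing your URLs. Here is a minimal sketch of reading one with Python’s standard library (the sitemap content and URLs are made up for illustration):

```python
import xml.etree.ElementTree as ET

# A made-up two-URL sitemap, inlined for the example.
SITEMAP = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/blog/technical-seo</loc></url>
</urlset>"""

# The sitemaps.org namespace must be given explicitly when querying.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(SITEMAP)
urls = [loc.text for loc in root.findall("sm:url/sm:loc", NS)]
```

This is the same structure Google reads: a `<urlset>` of `<url>` entries, each with a `<loc>`.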

GSC URL Inspection

What to do if a URL is not getting indexed?

Well, GSC’s URL Inspection tool can give you deeper insight.

It not only clarifies why a page isn’t indexed, but also shows how Google renders the pages that are.

Duplicate or Thin Content

If you are creating unique original content for each page of your site, you don’t need to worry. 

But duplicate content can still crop up on almost any site. So let’s discuss how to find and fix duplicate and thin content.

Site Audit to identify Duplicate content

Here are two tools that are great at finding duplicate content.

Raven Site Auditor 

It thoroughly scans your whole site and lets you know which pages have duplicate or thin content and need to be updated.

Semrush Site Auditor

Semrush’s Site Audit also has a content quality section for identifying duplicate or thin content.

It shows you whether several pages on your site share the same content.
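To see the idea behind these checks, exact-duplicate detection can be as simple as hashing each page’s normalized text and grouping identical hashes — a toy sketch with hypothetical pages:

```python
import hashlib
from collections import defaultdict

# Hypothetical page bodies, keyed by URL path.
pages = {
    "/about": "We sell handmade shoes.",
    "/about-us": "We sell handmade shoes.",   # exact duplicate of /about
    "/contact": "Get in touch with our team.",
}

def duplicate_groups(pages):
    """Group URLs whose normalized body text hashes identically."""
    groups = defaultdict(list)
    for url, text in pages.items():
        # Normalize whitespace and case so trivial differences don't hide dupes.
        digest = hashlib.sha256(" ".join(text.split()).lower().encode()).hexdigest()
        groups[digest].append(url)
    return [urls for urls in groups.values() if len(urls) > 1]

dupes = duplicate_groups(pages)
```

Real audit tools also catch near-duplicates (similar but not identical text), which needs fuzzier matching than a plain hash.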

Duplicate content also covers content copied from other sites.

I recommend Copyscape’s batch search feature to analyze URLs in bulk and find where else your content appears around the web.

If a snippet of your text appears on another site, search for that snippet in quotes on Google.

If your site shows up at the top of the results, Google considers you the original author of the content.

Note: if other people have copied your content, that’s not your problem. You only need to worry about duplicate content on your own site.

Noindex Pages with Duplicate Content

Many sites have pages with duplicate content. 

That’s OK. It only becomes a problem when those duplicate pages get indexed.

The solution is to add a noindex tag to those pages.

The noindex tag tells Google and other search engines not to index the page.
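For reference, the noindex directive is just a robots meta tag in the page’s `<head>`. Here is a rough sketch, using Python’s built-in HTML parser, of how a crawler might detect it (the sample page is made up):

```python
from html.parser import HTMLParser

class NoindexDetector(HTMLParser):
    """Flags a page that carries <meta name="robots" content="...noindex...">."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "robots":
            if "noindex" in (a.get("content") or "").lower():
                self.noindex = True

def is_noindexed(html):
    parser = NoindexDetector()
    parser.feed(html)
    return parser.noindex

page = '<html><head><meta name="robots" content="noindex, follow"></head><body>...</body></html>'
```

Note that noindex can also be sent as an `X-Robots-Tag` HTTP header, which this sketch doesn’t cover.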

To double-check that your noindex tag is set correctly, go to the “Inspect URL” feature in GSC, paste in the URL and click “Test Live URL”.

If you see a “URL is available to Google” message, your page can still be indexed, which means the noindex tag is not set properly.

But if you see an “Excluded by ‘noindex’ tag” message, your tag is doing its job.

This is one of the very few times you’ll be happy to see a red alert or error message in Google Search Console.

How many days or weeks Google takes to recrawl those pages depends on your crawl budget.

So keep checking the “Excluded” tab in the Coverage report to make sure your unwanted pages aren’t indexed.

For example, some blogs paginate the comments on a post.

Each of those comment pages repeats the original blog post, so we add a noindex tag to them to avoid any duplication issues.

Note: another option is to block search engine spiders from crawling the page altogether in your robots.txt file.

Using Canonical URLs

Most of the time, we deal with duplication by using noindex tags or by replacing the duplicate content with unique content.

But there’s also a third option: canonical URLs.

Canonical URLs are perfect for pages with similar content or with minor differences. 

For example, say you run an ecommerce site that sells shoes, and you have a product page showing only black slippers.

Every size, color and variation can end up with a different URL, depending on your site’s structure.

That’s not good practice.

Here we use a canonical tag to tell Google that the black product page is the main one, and that all the other pages are variations of it.
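As a rough illustration (with hypothetical URLs), every variation page carries the same canonical link tag in its `<head>`, pointing at the main product page:

```python
# Hypothetical product-variant URLs that all point to one canonical page.
CANONICAL = "https://example.com/shoes/black-slippers"

variants = [
    "https://example.com/shoes/black-slippers?size=9",
    "https://example.com/shoes/black-slippers?size=10&ref=sale",
    CANONICAL,  # the canonical page also references itself
]

def canonical_tag(url):
    """The <head> tag each variant page should carry."""
    return f'<link rel="canonical" href="{url}">'

# Every variant, including the canonical URL itself, emits the same tag.
tags = {v: canonical_tag(CANONICAL) for v in variants}
```

The key point the sketch shows: the tag never points at a variant, only at the one page you want Google to treat as the original.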

Page Speed

Improving your page speed is one of the few technical SEO strategies that directly impact your site’s rankings.

I am not saying that a fast-loading site will take you straight to the top of the first page — you still need backlinks. But faster load times can significantly improve your organic traffic and reduce your bounce rate (users hitting the back button right after clicking through to your site).

Reduce web page sizes 

When we talk about page speed, we often see terms like CDNs, caching, lazy loading and minifying CSS.

But very few people talk about total web page size.

In fact, when I researched page speed, I found that a page’s total size impacts load times more than any other factor.

Sometimes it’s worth accepting bigger pages and slower speeds to maintain image quality, because grainy or blurry images don’t perform nearly as well as crisp, clear ones.

You can speed a page up by compressing images and caching aggressively, but if your pages are huge, load times will still suffer.
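As a back-of-the-envelope sketch (with made-up numbers), it’s easy to see why images usually dominate a page’s total weight:

```python
# Hypothetical resource sizes for one page, in kilobytes.
resources = {
    "index.html": 45,
    "styles.css": 60,
    "app.js": 320,
    "hero.jpg": 850,
}

total_kb = sum(resources.values())              # total page weight
image_share = resources["hero.jpg"] / total_kb  # fraction taken by one image
```

Here one hero image is roughly two-thirds of the entire page weight, which is why image compression is usually the first lever to pull.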

You can check your page speed with Page Speed Insights.

Test Your Site Load Times With and Without a CDN

One of my most surprising findings was that CDNs are associated with worse load times.

That isn’t because CDNs are bad — it’s usually because they’re not set up correctly.

So if your site uses a CDN, test your site’s speed on webpagetest.org with the CDN turned on and off to get the real picture.

Elimination of 3rd Party Scripts

Each 3rd party script adds an average of 34ms to page load time.

You need some of these scripts (like Google Analytics), but it’s worth looking over your site’s scripts to see if there are any you can get rid of.

Implement hreflang tag for International Websites

If your site serves different versions of a page for different countries and languages, use the hreflang tag.

It can be a huge help. The only catch is that hreflang is tricky to set up.

But you can use Aleyda Solis’ Hreflang Generator Tool to reduce errors.

This tool makes it relatively easy to create hreflang tags for multiple countries, languages and regions.
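The logic behind a generator like that can be sketched as follows: every version of the page lists every version, itself included, plus an x-default fallback (the URLs and language codes here are invented for the example):

```python
# Hypothetical example: one page, three language/region versions.
versions = {
    "en-us": "https://example.com/us/",
    "en-gb": "https://example.com/uk/",
    "es-es": "https://example.com/es/",
}

def hreflang_tags(versions, default_url):
    """Build the <head> tags for ONE page: all versions plus x-default.
    Every version of the page must carry this same full set."""
    tags = [f'<link rel="alternate" hreflang="{code}" href="{url}">'
            for code, url in versions.items()]
    tags.append(f'<link rel="alternate" hreflang="x-default" href="{default_url}">')
    return tags

tags = hreflang_tags(versions, "https://example.com/us/")
```

The common mistake this guards against is forgetting the return links: if the US page references the UK page but not vice versa, the annotation is ignored.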

Fix Dead or Broken Links

Having a bunch of broken or dead links on your site doesn’t directly hurt your SEO.

But broken links can make it harder for search engine crawlers to find your site’s pages.

I recommend running a site audit with Semrush or Screaming Frog and fixing those links.

Set up Structured Data (Schema)

Does setting up structured data (Schema) directly help your site’s SEO?

I don’t think so. 

In fact, I found no correlation between Schema and first page rankings in my research on SEO.

That said, adding Schema markup can earn your page rich snippets when it shows up in the SERPs.

And rich snippets can dramatically improve your click-through rate.
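For example, the breadcrumb navigation discussed earlier can be marked up as a schema.org BreadcrumbList. A minimal sketch that emits the JSON-LD you’d embed in the page (the trail and URLs are hypothetical):

```python
import json

def breadcrumb_jsonld(trail):
    """trail: list of (name, url) pairs from the homepage down to this page.
    Returns the JSON-LD string for a schema.org BreadcrumbList."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {"@type": "ListItem", "position": i, "name": name, "item": url}
            for i, (name, url) in enumerate(trail, start=1)
        ],
    })

ld = breadcrumb_jsonld([
    ("Home", "https://example.com/"),
    ("Blog", "https://example.com/blog/"),
    ("Technical SEO", "https://example.com/blog/technical-seo/"),
])
```

The resulting string goes inside a `<script type="application/ld+json">` tag in the page’s HTML.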

Validate your XML Sitemap

If you are running a big site, it’s very difficult to keep track of all your pages, and I have seen many sitemaps containing pages with 404 and 301 status codes. The primary goal of a sitemap is to show search engines all of your site’s live pages, so 100% of the links in your sitemap should point to live pages.

That’s why I recommend validating your sitemap with the Map Broker XML Sitemap Validator. Just enter your sitemap to see whether any of your pages are broken or redirected.
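The core of any sitemap validation is the same simple check: every listed URL should return a 200 status, not a redirect or an error. A toy sketch with made-up crawl results:

```python
# Hypothetical crawl results: sitemap URL -> HTTP status code.
statuses = {
    "https://example.com/": 200,
    "https://example.com/old-post": 301,   # redirected — should be updated
    "https://example.com/ghost": 404,      # dead — should be removed
    "https://example.com/guide": 200,
}

def sitemap_problems(statuses):
    """Flag every sitemap URL that is not a live (200) page."""
    return sorted(url for url, code in statuses.items() if code != 200)

bad = sitemap_problems(statuses)
```

In practice a validator fetches each URL itself; here the status codes are supplied up front so the example stays self-contained.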

Noindex Tag and Category Pages

On WordPress sites, I highly recommend noindexing category and tag pages, unless those pages bring significant traffic to your site.

These pages usually provide little value to visitors and can cause duplicate or thin content issues. With Yoast, you can noindex them with a single click, without any mess.

Addressing Mobile Usability Issues

Google has rolled out its mobile-first indexing initiative, so it’s clear that your site must be mobile friendly.

You can use Google Search Console’s Mobile Usability report to keep your site mobile friendly. If Google finds that something on your site isn’t optimized or working for mobile users, it will let you know.

That way, you know exactly what to fix and how.

I hope these tips help — I have tried to cover every point in depth.

Still, if you have questions or want specific Technical SEO services for your site, we are here to help.

At Itechsole, our expert team stays on top of all these strategies and recommends the most suitable solutions to your site’s technical issues.

Try our Technical SEO services now.
