September 19, 2022

After you have written content that is valuable and is based on good keyword research, it is important to make sure that it can be read by both humans and search engines.

You don’t need to be a tech expert to understand these concepts, but it’s important to know what they do so you can have an intelligent conversation about them with developers.

Getting on the same page as your developers is essential if you want them to help you with your optimizations. They’re unlikely to prioritize your asks if they can’t understand your request or see its value.

If you gain the trust of the people you work with, you can start to break down the bureaucracy that often prevents important work from being completed.

HTML

HTML is the code that creates the structure of a web page. It stands for HyperText Markup Language. HTML defines elements like headings, paragraphs, lists, and links.
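
For example, a minimal HTML document (a sketch with placeholder values) might look like this:

    <!DOCTYPE html>
    <html>
      <head>
        <title>My Page Title</title>
      </head>
      <body>
        <h1>A Main Heading</h1>
        <p>A paragraph of content.</p>
        <ul>
          <li>A list item</li>
          <li>Another list item</li>
        </ul>
        <a href="https://example.com/">A link to another page</a>
      </body>
    </html>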

W3Schools.com is a website that offers tutorials and practice exercises for HTML, CSS, and JavaScript.

SEOs need to know about HTML because it is the code that creates webpages.

While most Content Management Systems (CMSs) don’t require users to write HTML, it is what they are editing every time they make a change to a web page, such as adding content or changing the anchor text of internal links.

Google uses these HTML elements to determine how relevant your document is to a particular query. The content of your HTML plays a significant role in your web page’s ranking in Google’s organic search results.

CSS

CSS is responsible for the fonts, colors, and layouts of web pages. CSS allowed web developers to style their HTML content for the first time. This was a major breakthrough because it gave developers much more control over how their web pages looked.

CSS allows web pages to be “beautified” without the need to manually code styles into the HTML of every page. This is especially helpful for large sites.
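
As a rough sketch, every page can reference a single shared stylesheet, and one rule in that file then styles all of them (file names and values here are placeholders):

    <!-- In the <head> of every HTML page: -->
    <link rel="stylesheet" href="/styles.css">

    /* In styles.css: one rule restyles every heading site-wide */
    h1 {
      font-family: Georgia, serif;
      color: #333333;
    }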

It wasn’t until 2014 that Google’s indexing system began to take into account more than just text when indexing web pages, making it more like an actual browser.

A black-hat SEO practice that tried to manipulate search engine rankings by hiding text and links via CSS has been rendered obsolete by Google’s updated indexing system. The “hidden text and links” practice is not allowed by Google’s quality guidelines.

Components of CSS that SEOs, in particular, should care about:

  • Because style directives can live in external stylesheet (CSS) files instead of your page’s HTML, they make your pages less code-heavy, reducing file transfer size and speeding up load times.
  • Browsers still have to download resources like your CSS file, so compressing them can make your webpages load faster, and page speed is a ranking factor.
  • Having your pages be more content-heavy than code-heavy can lead to better indexing of your site’s content.
  • Using CSS to hide links and content can get your website manually penalized and removed from Google’s index.

JavaScript

In the early days of the Internet, HTML was used to build webpages.

CSS allowed webpage content to be styled. The programming language JavaScript allowed for websites to be dynamic instead of just having structure and style.

Because JavaScript can create content dynamically, web pages no longer have to be entirely static. This has opened up a lot of opportunities for more complex and interesting web page designs.

When a user tries to access a page that has been enhanced with JavaScript, their browser will run the JavaScript code against the HTML that the server returned. This will cause the webpage to become interactive.

You’ve probably seen JavaScript in action, even if you didn’t know it! JavaScript is used so frequently because it can do almost anything to a page. It could, for example, create a pop-up or request third-party resources like ads to be displayed on your page.
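
As a small illustration (the element ID and message are placeholders), a few lines of JavaScript can change a page after it loads in the browser:

    <div id="message"></div>
    <script>
      // Runs in the visitor's browser after the HTML loads
      document.getElementById("message").textContent =
        "This text was added by JavaScript, not by the server.";
    </script>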

Client-side rendering versus server-side rendering

JavaScript can unfavorably affect SEO because search engines interpret JavaScript differently than people do. The difference comes down to client-side versus server-side rendering.

Most JavaScript is executed in a client’s browser. If you use server-side rendering, the files are processed by the server and then sent to the browser in a completed state.

Some elements that are important for SEO are loaded using JavaScript instead of being present in the HTML code, which means search engine crawlers can’t see that content until the JavaScript is rendered.
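
For example, with client-side rendering the server may return little more than an empty container, so the HTML a crawler fetches contains none of the visible content until the script runs (a sketch; the names are placeholders):

    <!-- What the server sends: -->
    <div id="app"></div>
    <script src="/app.js"></script>

    <!-- What the script then builds in the browser: -->
    <script>
      document.getElementById("app").innerHTML =
        "<h1>Page Heading</h1><p>Main content rendered client-side.</p>";
    </script>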

Google claims that, provided you do not block Googlebot from crawling your JavaScript files, it will be able to read and render your web pages the same way a browser does. Consequently, Googlebot should see the same things as a user viewing the site in their browser.

Google can miss certain elements on a webpage that are only available once JavaScript is executed.

There are also some other things that could go wrong during Googlebot’s process of rendering your web pages, which can prevent Google from understanding what’s contained in your JavaScript:

  • You’ve blocked Googlebot from JavaScript resources 
  • Your server can’t handle all the requests to crawl your content
  • The JavaScript is too complex or outdated for Googlebot to understand
  • The JavaScript “lazy loads” content into the page only after the crawler has finished with the page and moved on.

JavaScript can have positive and negative effects on your website’s SEO. If you’re not careful, JavaScript can negatively impact your website’s ranking in search engine results pages.

There is a way to check whether Google and your visitors see the same things when they visit your website. To view your page through Googlebot’s eyes, use the “URL Inspection” tool in Google Search Console.

Crawling 

Crawling is the process by which search engines grab content from pages and use the links on those pages to discover more pages. There are some methods you can use to control which parts of your site search engines crawl. Here are a few options.

Robots.txt

A robots.txt file is used to indicate to search engines which parts of a website they are allowed to access.
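
A simple robots.txt file might look like this (the paths and sitemap URL are placeholders):

    # Rules for all crawlers
    User-agent: *
    Disallow: /admin/

    # Rules for Googlebot only
    User-agent: Googlebot
    Disallow: /internal-search/

    Sitemap: https://example.com/sitemap.xml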

Crawl rate

Many web crawlers support the use of a crawl-delay directive in robots.txt. It allows you to specify how often pages can be crawled. Unfortunately, Google doesn’t respect this. To change the crawl rate for Google, you will need to go into Google Search Console.
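
For crawlers that do support it, the directive is commonly interpreted as the number of seconds to wait between requests (a sketch):

    User-agent: *
    # Wait 10 seconds between requests (ignored by Googlebot)
    Crawl-delay: 10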

Access restrictions

If you want the page to be accessible to some users but not search engines, then what you probably want is one of these three options:

  • Some kind of login system
  • HTTP authentication (where a password is required for access)
  • IP whitelisting (which only allows specific IP addresses to access the pages)

Crawling activity

The “Crawl Stats” report in Google Search Console is the easiest way to see how Google is crawling your website.

Crawl adjustments

How often Google crawls a site, as well as how much crawling the site allows, determines that site’s crawl budget.

The more popular a page is, the more often it will be crawled by a search engine. Pages that are not popular or well-linked will be crawled less often.

The next step after a page is crawled is for it to be rendered and then sent to the index. The index is the searchable database of pages that the search engine has stored. Let’s talk about the index.

Robots directives

A meta robots tag is an HTML snippet that tells search engine bots how to crawl or index a page. It’s placed into the <head> section of a webpage and looks like this:
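
    <!-- Tells search engines not to index this page or follow its links -->
    <meta name="robots" content="noindex, nofollow">

The content values shown here are just one combination; noindex and nofollow are two of the most common directives.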

Canonicalization

Google will only store one version of a page in its index, even if there are multiple versions of that page.

The process of picking a standard version is called canonicalization, and the URL chosen as the canonical is the one Google shows in search results. Google uses many different signals to select the canonical URL, including canonical tags, redirects, sitemap URLs, and internal links.
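
The most explicit of these signals is a canonical tag in the page’s <head>. For example (URLs are placeholders), a parameterized URL can point at its clean canonical version:

    <!-- On https://example.com/page?ref=newsletter -->
    <link rel="canonical" href="https://example.com/page">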

The Google Search Console URL Inspection tool allows you to see how Google has indexed a page. It will show you the Google-selected canonical URL.

Check indexing

Web pages that are not indexed can’t appear in search results, so make sure Google can index the pages you want people to find. Crawling and indexing together make up the first part of a search engine’s job, which is why the chapters so far have focused on them.

In Site Audit, the Indexability report displays pages that can’t be indexed and the reasons why. It’s free in Ahrefs Webmaster Tools.

Reclaim lost links

Websites tend to change their URLs over the years, and many old URLs still have links from other websites. If those links no longer take users to the intended pages, they are wasted and no longer help your pages.

Even though that value has been lost, it’s not too late to set up redirects and reclaim it. This may be the quickest link building you will ever do.
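
How you implement redirects depends on your server or CMS. As one sketch, on an Apache server an old URL can be permanently redirected to a new one in an .htaccess file (the paths are placeholders):

    # 301 = moved permanently; points the old URL at the new one
    Redirect 301 /old-page/ https://example.com/new-page/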

If you’re looking for ways to reestablish lost links, Ahrefs’ Site Explorer is a good place to start. Navigate to the Best by Links report for the desired domain, and select the “404 not found” HTTP response filter.

Add schema markup

Schema markup is code that makes it easier for search engines to understand your content. This code can also help your website be more visible in search results.

Google maintains a search gallery that showcases the various search features for which your site must have proper schema markup to be eligible.
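
Schema markup is usually added as JSON-LD inside a script tag. A minimal sketch for an article might look like this (the values are illustrative):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "Beginner's Guide To Technical SEO",
      "author": {
        "@type": "Person",
        "name": "Brian Richards"
      },
      "datePublished": "2022-09-19"
    }
    </script>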

The projects in this chapter will generally bring fewer benefits than those in the previous part, even if some require less effort. That doesn’t mean you shouldn’t do them; treat this as a guide to help you prioritize different projects.

Page experience signals

There are other ranking factors that are less important but that you should still take into consideration for the sake of your users. These page experience signals cover aspects of a website that impact user experience (UX).

Core Web Vitals

Core Web Vitals are a set of speed metrics Google uses as part of its page experience signals to gauge user experience. Largest Contentful Paint (LCP) measures loading performance, First Input Delay (FID) measures interactivity, and Cumulative Layout Shift (CLS) measures visual stability.

HTTPS

The HTTPS protocol protects information exchanged between a browser and the server it is communicating with from being intercepted or tampered with by attackers. It keeps that information confidential, unaltered, and authenticated. Load your pages over HTTPS rather than HTTP.

A “lock” icon in the address bar means the website is using HTTPS.

Mobile-friendliness

Mobile-friendliness measures whether webpages display correctly and are easily usable on mobile devices. How do you know how mobile-friendly your site is? Check the “Mobile Usability” report in Google Search Console.

Interstitials

Interstitials block content from being seen. Pop-ups that cover the main content and require users to take some action before they disappear are called modal dialogs.

Hreflang — For multiple languages

The HTML attribute hreflang is used to specify the language and geographical targeting of a webpage.

If you have the same page in different languages, you can use hreflang to tell Google about each version. This allows Google to serve the appropriate version to its users.
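
For example, a page available in English and German could declare each version, plus a default, in its <head> (the URLs are placeholders):

    <link rel="alternate" hreflang="en" href="https://example.com/en/page/">
    <link rel="alternate" hreflang="de" href="https://example.com/de/page/">
    <link rel="alternate" hreflang="x-default" href="https://example.com/page/">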

General maintenance/website health

Broken links

Broken links are links on your site that point to non-existent resources; they can be internal (pointing to other pages on your domain) or external (pointing to pages on other domains). Broken links frustrate your users and can negatively impact your site’s ranking in search results.

The ‘Links report’ in Site Audit will help you quickly find any broken links on your website. It’s free in Ahrefs Webmaster Tools.

Redirect chains

Redirect chains are a series of redirects that happen between the initial URL and the destination URL. To find redirect chains on your website, use the Redirects report in Site Audit. It’s free in Ahrefs Webmaster Tools.

Google Search Console

The Google Search Console is a free service from Google that helps you monitor and troubleshoot your website’s appearance in its search results.

You can use it to find out about technical errors on your website, submit sitemaps, and get information about structured data issues.

Other search engines offer their own versions of this tool. Ahrefs Webmaster Tools can also help you improve your website’s SEO performance for free. It allows you to:

  • Monitor your website’s SEO health.
  • Check for 100+ SEO issues.
  • View all your backlinks.
  • See all the keywords you rank for.
  • Find out how much traffic your pages are receiving.
  • Find internal linking opportunities.

Google’s Mobile-Friendly Test

Google has a test to see if your page is mobile-friendly. This is important because it affects how easily someone can use your page on their phone. The tool also identifies issues that make using the site on a mobile device difficult, like small text that is hard to read, incompatible plugins, and so on.

This test will show you what Google sees when it crawls your page. You can also use the Rich Results Test to check the content Google sees for desktop or mobile devices.

Chrome DevTools

Chrome DevTools is Chrome’s built-in webpage debugging tool. Use it to debug page speed issues, improve webpage rendering performance, and more.

Ahrefs’ SEO Toolbar

Ahrefs’ SEO Toolbar is a free extension for Chrome and Firefox that provides useful SEO data about the pages and websites you visit.

Its free features are:

  • On-page SEO report
  • Redirect tracer with HTTP headers
  • Broken link checker
  • Link highlighter
  • SERP positions

In addition, as an Ahrefs user, you get:

  • SEO metrics for every site and page you visit and for Google search results
  • Keyword metrics, such as search volume and Keyword Difficulty, directly in the SERP
  • SERP results export

PageSpeed Insights

PageSpeed Insights analyzes the loading speed of your webpages. The page performance score shows how quickly a page loads, and the actionable recommendations help to make pages load faster.

About the Author: Brian Richards

See Brian's Amazon Author Central profile at https://amazon.com/author/brianrichards
