Do you feel like you’ve reached a point where your organic growth has stagnated? While high-quality content and links are important for SEO, it is just as important not to overlook the power of technical SEO.

Technical SEO will be more important than ever in 2019, and mastering it starts with learning to think like Googlebot. Before we dive into the fun stuff, it’s important to understand what Googlebot is, how it works, and why we need to understand it.

Googlebot

Googlebot is a web crawler that collects data from webpages.

Googlebot is just one of many web crawlers. Every search engine has its own branded spider. In SEO, we call these branded bot names “user agents.”

We will discuss user agents later, but for now, just understand that when we refer to user agents, we are talking about a specific web crawling bot. Some of the most common user agents include:

  • Googlebot – Google
  • Bingbot – Bing
  • Slurp Bot – Yahoo
  • Alexa Crawler – Amazon Alexa
  • DuckDuckBot – DuckDuckGo

How Googlebot Reads Webpages

Google has made a lot of progress in how it renders webpages. The goal of Googlebot is to see a webpage the same way a user would see it.

You can use the Fetch and Render tool in Search Console to see how Google views your page. It shows you what Googlebot sees when it visits your website, as opposed to what a user sees, which makes it a useful way to spot rendering problems on your webpages.

Technical Ranking Factors

Improving your website’s technical SEO can have a big impact on your traffic and organic search rankings. Although there is no foolproof method to optimize your website for technical SEO, there are certain strategies you can use to improve your ranking. All 200+ ranking factors matter, but the biggest ranking factors going forward will be based on user experience.

SEO Googlebot optimization tips

1. Check How Your Site is Indexed Today

The first step to take in improving your site’s SEO is to check how it is currently being indexed. There are two simple ways to do this:

  • Use the Google search “site:” operator
    As a general rule, Google Search Operators are a great way to refine your search results. In the world of technical SEO, a search using the “site:” operator lets you return results for a specific website, subdomain, or URL path. For example, a Google search for site:mediatemple.net will return all indexed pages on mediatemple.net. Suppose we only wanted to find indexed pages in the Media Temple blog. Since the URL path for the blog is mediatemple.net/blog, a site:mediatemple.net/blog search will do the trick.
  • Use the Google Search Console
    If you’re not using Google Search Console, you should probably start. It allows you to capture a lot of information related to your website’s search rankings. For example, you can see all the pages on your site Google knows about and review a list of potential issues you can resolve for some quick technical SEO wins. 

Be on the lookout for problems such as duplicate content, pages missing from search results, results you do NOT want to appear in searches, or fewer results than expected.

Some of our other tips will address how to handle many of those issues. If you are having issues with pages not appearing in Google search results, there are a few potential solutions. Sitemaps and robots.txt files can help Google find and index your pages, and changes in the Google Search Console may also be necessary.

2. Make Sure Your Pages Load Fast

Since Google’s 2018 “Speed Update,” page load time has been a confirmed ranking factor, particularly for mobile searches. While there is no definitive answer to how quickly a page should load, most recommendations suggest 2-3 seconds, and the general consensus is that the faster the better.

Of course, there is no one-size-fits-all answer to how you can make your website load faster, but here are some resources to help you get started:

  • GTmetrix is a good tool for baselining a webpage’s performance, and it provides some actionable recommendations.

  • Google PageSpeed Insights is another solid tool for understanding how your web pages perform, and the results come straight from the horse’s mouth.
  • For a deeper dive into optimizing your site performance for improved search rankings, Nathan Reimnitz has you covered.

3. Get Your Headers Right

In general, using headers is a good way to divide the content for your readers. A “Header 1” (H1) is the main idea, an H2 is a subtopic, an H3 is a subtopic of that subtopic, and so on.

Headers are represented by specific HTML tags in technical SEO. An H1 of “Pepper and egg sandwiches are the best sandwiches ever” would look like this in HTML:

    <h1>Pepper and egg sandwiches are the best sandwiches ever</h1>

If you want to improve your website’s search engine optimization, you may need to focus on your header strategy.

John Mueller from Google Webmaster Central responded to a question about how to make sure Google understands the context of the content on a website. He said that using headers is a good way to do this.

This is good news because it means that you are likely already doing things to make your content readable for humans.

To get your headers right, follow these tips:

  • Only use 1 H1 per page
  • Use headers to break up your content for human readers
  • Make your headers descriptive enough to contextualize content
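
For illustration, here is a minimal sketch of how that hierarchy might look in a page’s HTML (the sandwich-themed headings are invented for this example):

    <h1>Pepper and egg sandwiches are the best sandwiches ever</h1>
    <h2>Choosing your peppers</h2>
    <h3>Sweet peppers vs. hot peppers</h3>
    <h2>Cooking the eggs</h2>

Note there is exactly one H1, and each lower-level header breaks the content into descriptive subtopics.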

The Yoast plugin can be a useful tool for analyzing the quality of your headers and other aspects of a page’s SEO for WordPress sites. They also have a free online content analysis tool for non-WordPress sites.

4. Understand How to Use Tags 

Header tags are just one type of HTML tag. There are plenty more that you can use to improve your website’s ranking. Here are some of the most important tags and the specific problems they help you solve (a combined example follows the list).

  • Use rel=canonical to avoid issues with duplicate content. Duplicate content is bad for SEO. However, it can be a common occurrence for content to appear in multiple places on your site. For example, if your site serves the same content at www.<yoursite>.com and <yoursite>.com (without the www.), you have duplicate content. Using the rel=canonical tag helps tell bots which page they should refer users to. It’s straightforward in some cases, but knowing when to use canonical tags vs 301 redirects can get tricky, so check out this Moz article if you need more guidance. 
  • Use a noindex meta tag to prevent a page from being indexed. Sometimes you don’t want pages to be indexed (login screens, thank you pages, logs, etc). In that case, a noindex meta tag can help.
  • Use alt tags to improve SEO for images. Images are an important part of SEO, but how can you tell the bots what your images are about? Alt tags. Given their purpose, your alt tags should literally describe your image, not supplement it in some way. 
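
To make those concrete, here is a minimal sketch of how each tag might appear in a page’s HTML (the URLs and filenames are placeholders, not from this article):

    <!-- Tell bots which version of a duplicated page is preferred -->
    <link rel="canonical" href="https://www.yoursite.com/page/" />

    <!-- Keep a thank-you page out of the index -->
    <meta name="robots" content="noindex" />

    <!-- Literally describe the image for crawlers (and screen readers) -->
    <img src="pepper-egg-sandwich.jpg" alt="Pepper and egg sandwich on a roll" />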

5. Use your robots.txt

The robots.txt file is a text file that is used to give instructions to web crawlers. Although not every bot respects these instructions, they can be useful in telling the bot what you do and don’t want it to do on your website. A robots.txt file should exist at yoursite.com/robots.txt.

For example, ours is at https://mediatemple.net/robots.txt. For WordPress users, the robots.txt file is typically located in the root directory of their site. If you want to change your robots.txt file, you can either use a plugin like Yoast or upload a new one through FTP.

There are a variety of things you can do in a robots.txt file, including allowing or disallowing specific user agents, blocking specific directories, and pointing crawlers to your sitemap.

If you are new to technical SEO, the essentials are simple: make sure that you are not preventing search engines from indexing the pages you want indexed, and ensure that they can find your sitemap.

The commands are fairly easy to understand, but if you need help, this Cloudflare article can be a good resource.
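
As a minimal sketch, a robots.txt file covering those basics might look like this (the blocked paths and sitemap URL are placeholders):

    # Apply these rules to all crawlers
    User-agent: *
    # Keep crawlers out of admin pages and internal search results
    Disallow: /wp-admin/
    Disallow: /search/
    # Point crawlers to the sitemap index
    Sitemap: https://yoursite.com/sitemap.xml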

6. Sitemap.xml

Sitemaps play an important role in how Googlebot crawls and indexes pages on your website. A sitemap is not a ranking factor in itself, but it helps Google discover and index your content faster and more completely.

Here are a few sitemap optimization tips:

  • Only have one sitemap index.
  • Separate blogs and general pages into separate sitemaps, then link to those on your sitemap index.
  • Don’t make every single page a high priority.
  • Remove 404 and 301 pages from your sitemap.
  • Submit your sitemap.xml file to Google Search Console and monitor the crawl.
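
Following the first two tips, here is a minimal sketch of a sitemap index that links to separate page and blog sitemaps (the filenames are invented for illustration):

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>https://yoursite.com/sitemap-pages.xml</loc>
      </sitemap>
      <sitemap>
        <loc>https://yoursite.com/sitemap-blog.xml</loc>
      </sitemap>
    </sitemapindex>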

7. Site Speed


The speed at which a website loads has become one of the most important ranking factors, especially for mobile devices. If your site loads too slowly, Google may rank it lower.

If you want to know if your website is loading too slowly for Googlebot, you can use one of the many free site speed testing tools available online. These tools will give you suggestions to send to your developers.

8. Schema

Including structured data on your website can help Google’s algorithms understand the content of your pages and your website as a whole. However, it’s important that you follow Google’s guidelines.

Use JSON-LD to implement your structured data markup; it is the format Google explicitly recommends.
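
As an illustration, minimal Article markup in JSON-LD might look like this (the headline and author values are placeholders):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "14 Ways To Think Like Googlebot",
      "author": {
        "@type": "Person",
        "name": "Brian Richards"
      }
    }
    </script>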

9. Canonicalization

Duplicate content can be a major problem for larger sites, especially those that sell products online. Duplicate webpages can confuse customers and cause them to go to a competitor’s site. That said, there are practical reasons to have near-duplicate webpages, such as serving the same content in different languages.

If you have more than one page with similar content, it’s important to use a canonical tag on your preferred page, and an hreflang attribute to specify the language of your content.
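
For example, here is a minimal sketch combining the two (the URLs are placeholders): the English product page declares itself canonical and points to its Spanish alternate:

    <!-- On https://yoursite.com/product/ -->
    <link rel="canonical" href="https://yoursite.com/product/" />
    <link rel="alternate" hreflang="es" href="https://yoursite.com/es/product/" />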

10. URL Taxonomy


A clean and well-defined URL structure can improve your website’s ranking and make it easier for users to find what they’re looking for. When related pages are nested under a parent page (for example, yoursite.com/blog/post-title), Googlebot can better understand how each page relates to the rest of the site.

Changing the URL of a fairly old page that is ranking well may still be beneficial, although Google’s John Mueller does not recommend it, so weigh the risk before you commit.

The taxonomy of a site’s URL should be considered from the start of the site’s development.

If you’re convinced that optimizing your URLs will do your site some good, make sure you’ve set up the right 301 redirects and updated your sitemap, as sketched below.
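
For instance, here is a minimal sketch of a 301 redirect in an Apache .htaccess file (the paths are invented for illustration; nginx and other servers have equivalents):

    # Permanently redirect the old URL to its new parent-page location
    Redirect 301 /old-post/ https://yoursite.com/blog/old-post/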

11. JavaScript Loading

While HTML pages may be easier to rank, JavaScript provides website creators with the ability to make their pages more dynamic and appealing to users. In 2018, Google improved its JavaScript rendering capabilities.

In a recent Q&A session, John Mueller stated that Google plans to continue focusing on JavaScript rendering in 2019. In other words, Google will keep working to make sure websites built with JavaScript appear correctly in search results. If your site uses a lot of JavaScript to render content, make sure your developers are following Google’s best-practice recommendations.

12. Images

In recent months, Google has been placing increasing emphasis on image optimization. It has hinted at this for a long time, but it is now giving clearer, more direct advice. If you want Google to understand the context of your images and how they relate to your content, you should optimize them.

If you’re looking for some quick wins on optimizing your images, I recommend the following (a short example follows the list):

  • Image file name: Describe what the image is with as few words as possible.
  • Image alt text: You can start from the image file name, but you have room for a fuller description here.
  • Structured Data: You can add schema markup to describe the images on the page.
  • Image Sitemap: Google recommends adding a separate image sitemap to help it crawl your images.
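
Putting the first two tips together, a minimal sketch (the filename and alt text are invented):

    <img src="pepper-egg-sandwich.jpg"
         alt="Pepper and egg sandwich served on a toasted roll" />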

13. Broken Links & Redirect Loops

Many people believe that broken links are bad for website performance, and some have said that they can use up valuable resources. John Mueller, however, has said that broken links do not reduce crawl budget.

Regardless, I believe you should clean up all broken links for the sake of both users and crawlers. The best way to find broken links on your website is to use Google Search Console or your favorite crawling tool.

Redirect loops are a common phenomenon on older sites. A redirect loop occurs when a chain of redirects eventually points back to itself, creating an endless loop.

Redirect loops can make it difficult for search engines to crawl a site, and they may eventually abandon the crawl. The most effective fix is to update each internal link on every page so that it points directly to the final destination URL.
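
To check how a given URL resolves, here is one option using curl (the URL is a placeholder, and the tool choice is mine, not one named in this article):

    # Follow up to 10 redirects, then report the final URL and the hop count
    curl -s -o /dev/null -L --max-redirs 10 \
         -w "final: %{url_effective}  hops: %{num_redirects}\n" \
         https://yoursite.com/old-page/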

14. Titles and Meta Descriptions

Although this may not be new information for many SEO professionals, optimizing titles and meta descriptions can lead to increased rankings and CTR in the SERP.

This is part of the fundamentals of SEO and Googlebot does read this. There are many theories about best practice for writing these, but my recommendations are pretty simple:

  • I prefer pipes (|) instead of hyphens (-), but Googlebot doesn’t care.
  • In your meta titles, include your brand name on your home, about, and contact pages. In most cases, the other page types don’t matter as much.
  • Don’t push it on length; if Google truncates your title or description in the SERP, it’s too long.
  • For your meta description, copy your first paragraph and edit to fit within the 150-160 character length range. If that doesn’t accurately describe your page, then you should consider working on your body content.
  • Test! See if Google keeps your preferred titles and meta descriptions.
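
As a minimal sketch of these recommendations in a page’s <head> (the brand name and copy are placeholders):

    <head>
      <!-- Pipe separator, with the brand name on the home page -->
      <title>Pepper and Egg Sandwich Recipes | YourBrand</title>
      <!-- First paragraph, edited down to roughly 150-160 characters -->
      <meta name="description" content="Learn how to build the perfect pepper
        and egg sandwich, from choosing the right peppers to toasting the roll." />
    </head>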

Conclusion

There are many things to consider when optimizing your website for Googlebot, the Google search engine crawler. Technical SEO can be a complex and time-consuming process, but it is important to make sure your website is properly optimized to ensure higher search engine rankings. Before making changes to your website, I recommend doing research and asking your colleagues about their experiences.

Trying new things is exciting, but it can sometimes lead to a decrease in your organic traffic. A good method is to test one tactic at a time and wait a few weeks between changes so you can measure the effect of each.

About the Author: Brian Richards

See Brian's Amazon Author Central profile at https://amazon.com/author/brianrichards
