
A Guide to Google Best Practices



Google's not actually conjuring up new ways to punish people for ridiculous violations... Its algorithm is designed and maintained to rank sites based on quality and accuracy.

The best way to make sure that your site is following Google's Best Practices is to be accurate with your on-page SEO and to publish quality content.

It's really just common sense, with a few added inside tips from Google on how to work best with its search engine algorithm.

Here are some tips on how to maintain On-Page Accuracy while optimizing your site for Google.


Keywords

Choose a keyword that accurately describes the content on your page; don't just pick one based on search volume and competition.


Page Titles

Make sure that your page titles include that keyword and accurately represent the content on those pages. If someone is looking for a service that you provide, and you make it clear on your website that you provide that service, they will find you. That's how it works.


Heading Tags

Use <h1>, <h2>, <h3>, and <p> tags to organize your content in a way that outlines it for your reader. This also helps search engine crawlers understand your content and rank it accurately for search engine users.
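As a sketch, a service page might be outlined like this (the business name and headings are invented for illustration):

```html
<!-- One <h1> for the page's main topic, <h2>/<h3> for subsections, <p> for body copy -->
<h1>Plumbing Services in San Diego</h1>
<p>A short introduction to the services offered...</p>

<h2>Drain Cleaning</h2>
<p>Details about drain cleaning...</p>

<h2>Water Heater Repair</h2>
<h3>Tankless Water Heaters</h3>
<p>Details specific to tankless models...</p>
```

A single, descriptive <h1> with nested <h2> and <h3> subheadings gives both readers and crawlers a clear outline of what the page covers.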

Meta Descriptions

Use meta descriptions to inform web users about what the page represents. You want to drive the right consumers to your site, and Google will reward you for accuracy. Place the proper keyword in the meta description, but don't stuff in every keyword you can think of that users "might" search for. You will lose points with Google, and useless keywords won't help drive the right kind of traffic to your site.
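For instance, a meta description might look like this (the business name and copy are made up):

```html
<head>
  <title>Drain Cleaning in San Diego | Example Plumbing Co.</title>
  <!-- One honest sentence about the page, containing its keyword, not a list of keywords -->
  <meta name="description"
        content="Same-day drain cleaning in San Diego from Example Plumbing Co. Licensed, insured, and upfront pricing.">
</head>
```

One accurate sentence that matches the page's content will serve users better than a string of stuffed keywords.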


URLs

URLs that are simple and easy to understand are best, for both users and search engine crawlers. They should contain your keyword, which, again, should accurately describe the content on your page. Avoid including session IDs and unnecessary parameters. Short and descriptive is often best.
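To illustrate (example.com is a placeholder domain):

```
Clear:   https://www.example.com/services/drain-cleaning
Unclear: https://www.example.com/index.php?id=482&sessionid=9f3a7c2
```

The first URL tells both the user and the crawler what the page is about; the second tells them nothing.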

  • Google Webmaster's SEO Starter Guide
    This document first began as an effort to help teams within Google, but we thought it'd be just as useful to webmasters that are new to the topic of search engine optimization and wish to improve their sites' interaction with users and search engines.



It is important to understand that Google uses search engine crawlers and algorithms to sort through sites, but it is all in the name of helping humanity.

Websites are made for human readers, so in all of your optimizing for search engine robots, don't forget to make your site appealing to humans. It should actually be the priority, and typically, the more optimized a site is for human interaction, the more the robots will reward you with rankings and traffic.

Here are some easy ways to optimize for humans.

Site Navigation

Keeping your site navigation organized and as simple as possible makes a huge difference in how a web user interacts with your site. It also helps Google's crawlers understand your site better and rank it accurately. Think accurate titles and a hierarchy organized around which content is most important and what will make sense to the people navigating it.

404 Page

Create a custom 404 page that tells users the link they followed is broken. Make sure it displays a way for them to navigate your site, so that they can track down the right page URL. Make it personalized to fit the theme of your company.
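A minimal sketch of such a page (the links and wording are placeholders to fill in with your own branding):

```html
<h1>Sorry, we can't find that page.</h1>
<p>The link you followed may be broken, or the page may have moved.</p>
<p>Try one of these instead:</p>
<ul>
  <li><a href="/">Home</a></li>
  <li><a href="/services">Our Services</a></li>
  <li><a href="/contact">Contact Us</a></li>
</ul>
```

Most web servers let you point 404 errors at a page like this; on Apache, for example, a directive such as "ErrorDocument 404 /404.html" in the site configuration does it.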

Helpful Blog Content

Use a blog to give consumers and other industry influencers helpful information. Don't use it to plug your services and your brand--that's what the rest of your site is for. Help your customers and others in your field, and they will be more trusting and more willing to help you out or purchase your products and services.



Every search engine has its own algorithms that determine how highly a site ranks in its SERPs (search engine results pages). Most of Google's major algorithm updates are code-named after animals.

Panda, Penguin, Hummingbird, and Possum are the most important updates to understand in order to follow Google's Best Practices for search engine optimization.



Panda

Google's Panda algorithm, first launched in February 2011, was designed to demote sites with low-quality content. The update hit content farms and content aggregators hardest.

Issues that trigger a Panda penalty:

  • Spun or automated content - posts are duplicated or incredibly similar with slight keyword variations.
  • Duplicate content - where most of a site's content is unoriginal
  • Aggregate content - where the majority of content is pulled from other sources
  • Low-quality or irrelevant content - unhelpful content that people leave quickly
  • Thin content - barely any information is provided

  • Your Google Algorithm Cheat Sheet: Panda Algorithm
    Do you have questions about the Panda algorithm? This guide explains in lay terms what each of these Google algorithm changes is about and how to improve your site so that it looks better in the eyes of the big

How Panda tracks these issues:

  • High bounce rates from your site and individual pages
  • Low time on your site and individual pages
  • Low amount of return visitors
  • Low click-through rates



Penguin

Google's Penguin algorithm, first released in 2012, deals specifically with unnatural backlink patterns. Before 2016, Penguin was updated only about once a year. That meant that if your site received a Penguin penalty for having low-quality or unnatural backlinks, it could be a year before you recovered and learned whether or not your changes made a difference.

The fourth update integrated Penguin into Google's core algorithm, which means that Penguin 4.0 now operates in real time. Every time Google crawls and caches a page, the Penguin criteria take effect.

  • Google Penguin 4.0
    The final update of Google’s Penguin Algorithm is here, and it has serious significance for you if you’re using SEO as a marketing strategy.

Issues that trigger a Penguin penalty:

  • A large number of low-quality links
  • Sneaky redirects
  • Hidden text or links
  • Doorway pages



Hummingbird

Google's Hummingbird update was not an adjustment to one particular part of the Google algorithm; it was a complete overhaul. Aside from the Panda and Penguin components, which remained in place, the entire algorithm was adjusted.

The goal of Hummingbird was to better interpret the intent behind user queries. For example, if someone searched for "best place in San Diego for pizza," the Hummingbird algorithm translated "place" and "pizza" as meaning the user was looking for a restaurant, so "restaurant" would be added to the sorting criteria.

The main reason for this update was to work with voice queries on phones, which became more standard after Siri was developed and released.

  • Search Engine Land's Guide to Google Hummingbird
    What website owners, marketers and SEO professionals need to know about the Google Hummingbird search platform, which helps Google return search results that better match query intent. Read SEO tips, tactics & How To Guides to perform better in S

There aren't really Hummingbird "penalties," as the purpose of the update wasn't to be punitive toward black-hat SEO tactics.

The best way to utilize Hummingbird in your SEO efforts is simply to write content that answers people's commonly searched questions, not just keywords. However, this is already a basic SEO practice that you should be using, so the ramifications are small compared to the Penguin and Panda updates.

How Well Do You Know Google's Animals?

For each question, choose the best answer. The answer key is below.

  1. Approximately How Many Feathers Does the Ruby-Throated Hummingbird Have?
    • 80
    • 370
    • 940
  2. What Percentage of a Panda's Diet Consists of Bamboo?
    • 28%
    • 50%
    • 87%
    • 99%
  3. What is the Average Lifespan of a Common Possum?
    • 8 Weeks
    • 16 Months
    • 2 Years
    • 10 Years
  4. How Tall is the Smallest Species of Penguin?
    • 16 inches
    • 20 inches
    • 2 feet

Answer Key

  1. 940
  2. 99%
  3. 2 Years
  4. 16 inches



Possum

Google's Possum update changed how the algorithm handles local search results. It replaced Google's previous location update, called Pigeon, which was released in 2014.

The revision radically changed how Google treats and filters out duplicate results. If there are multiple business listings across the web with slightly different names, phone numbers, hours, or email addresses, they are filtered out.

Making sure that your business's information is consistent across the web will keep you in good standing with Possum and can really boost your rankings with Google.

© 2016 Andrew Lowen
