How to Track Google My Business Traffic with Google Analytics

June 17, 2017 in Blog, Local SEO

Since Google updated the “Local Pack” to show only three results, it’s more competitive than ever to get your business listed above the rest. Whether you’re a brick-and-mortar retailer, a local restaurant, or a service-based agency trying to get in front of potential customers, the Local Pack is key.

Why is it important to track local search?

For local search, businesses need to be able to segment where their traffic is coming from in order to find and safeguard strategies that work.

By adding campaign tracking to your Google My Business page, you’ll know exactly how much traffic the Local Pack is sending you, and if you ever lose your local spot (I hope this never happens to you), you’ll be able to quickly diagnose the traffic impact.

Instead of spending time chasing down potential sources of your traffic drop, you can segment out the data using Google’s handy URL builder and Campaign performance dashboard.

1. Navigate to the Campaign URL Builder to create a Custom Campaign.

 

2. Enter your URL, campaign source, medium, and name. By using organic as your campaign medium, you’ll be better able to include GMB traffic in your organic performance reports, since GMB traffic can be considered organic. (A scripted version of this step is sketched after step 4.)

 

3. Add your tracking URL to your GMB Page.

4. Search for your business and click the ‘Website’ button. Hover over the button to make sure the URL has updated. Then verify that your new tracking URL is sending data to Google Analytics under Acquisition > Campaigns > All Campaigns.
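If you prefer to script the tagging rather than use the web form, here’s a minimal sketch of what the Campaign URL Builder produces, using only Python’s standard library. The domain and campaign name are placeholders; swap in your own.

```python
# Minimal sketch of the tracking URL the Campaign URL Builder generates.
# The base URL and campaign name below are placeholders.
from urllib.parse import urlencode

base_url = "https://www.example.com/"   # your GMB landing page
utm_params = {
    "utm_source": "google",
    "utm_medium": "organic",            # keeps GMB traffic grouped with organic
    "utm_campaign": "gmb-listing",      # any campaign name you'll recognize
}

tracking_url = f"{base_url}?{urlencode(utm_params)}"
print(tracking_url)
# https://www.example.com/?utm_source=google&utm_medium=organic&utm_campaign=gmb-listing
```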

 

Note: Before implementing the UTM parameters, verify that the page contains a canonical tag so the tagged URL doesn’t create duplicate content.
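If you want to check the canonical tag programmatically rather than by viewing source, here’s a rough sketch using Python’s standard library; the URL is a placeholder.

```python
# Rough check that the landing page declares a canonical URL before you add
# UTM parameters to it. The URL below is a placeholder.
import re
import urllib.request

url = "https://www.example.com/"
html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", errors="ignore")

match = re.search(r'<link[^>]*rel=["\']canonical["\'][^>]*>', html, re.IGNORECASE)
if match:
    print("Canonical found:", match.group(0))
else:
    print("No canonical tag found; add one before tagging the URL.")
```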

How to Diagnose Enterprise Level SEO Traffic Drops

May 29, 2017 in Enterprise SEO

With roughly 1,600 Google algorithm updates every year and numerous product managers shipping daily deploys, tracking down the cause of an enterprise-level organic traffic drop takes precious resources in already strapped environments. And if you haven’t optimized your site for internal reporting, or you don’t have a method in place for investigating traffic drops, the time it takes to identify and fix the problem can be costly.

Whether you work on an enterprise site with 10 million pages or a smaller site with 100 pages, the first step to solving your problem is narrowing your search: come up with a list of hypotheses you can start gathering data for by enlisting the help of your team.

Start by Memorizing Traffic Drop Types 

There are two main types of traffic drops:

  1. Organic traffic drops that impact a specific page type (technical or topical category)
  2. Organic traffic drops that impact the entire site

To determine which one you have on your hands, make sure you can report on each page type separately. For eCommerce sites, this means being able to quickly compare the organic traffic performance of your product listing pages (PLPs) and product detail pages (PDPs). One of the easiest ways to do this is with unique URLs or page type tracking.

Aside from the technical page type (i.e., PLP vs. PDP), you should also make it easy to report on the different topical categories on your site. For example, if you have a website selling shoes, did you lose rankings in a specific category, such as men’s or women’s? Or did you lose organic traffic across all shoe types?
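As a rough illustration of page type and category tagging, here’s a small sketch that classifies URLs by their path. The /c/ and /p/ path conventions and the shoe categories are hypothetical; adapt them to your own URL structure.

```python
# Tag URLs with a technical page type (PLP vs. PDP) and a topical category so
# organic traffic can be segmented by both. Path conventions are hypothetical.
from urllib.parse import urlparse

def classify(url):
    path = urlparse(url).path
    if path.startswith("/p/"):
        page_type = "PDP"
    elif path.startswith("/c/"):
        page_type = "PLP"
    else:
        page_type = "other"
    # Check "womens" before "mens", since "womens" contains "mens" as a substring.
    if "womens" in path:
        category = "womens"
    elif "mens" in path:
        category = "mens"
    else:
        category = "unknown"
    return page_type, category

print(classify("https://www.example.com/c/mens-shoes/"))             # ('PLP', 'mens')
print(classify("https://www.example.com/p/123-womens-trail-runner")) # ('PDP', 'womens')
```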

If you’ve looked through your data and determined that you have a sitewide traffic drop on your hands, then you can safely narrow this down to one of three culprits:

  1. Google algorithm update
  2. Broken or inaccurate tracking
  3. Major technical deploy toward the top of your site hierarchy

There are a few different steps you can take to narrow down which one it may be. 

Solving Organic Traffic Drops

For any type of traffic drop, start by finding a timeframe, roughly 1-2 days, in which the traffic fell below what’s expected. For some algorithmic traffic drops, you might also be looking at a sloping traffic trend over 4-6 weeks.
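One way to pin down that window is to compare each day against a trailing baseline and flag the days that fall well below it. A minimal sketch, using made-up session counts:

```python
# Flag days where organic sessions drop well below the trailing 7-day baseline.
# The dates and session counts below are made up for illustration.
daily_sessions = {
    "2017-05-10": 10200, "2017-05-11": 10050, "2017-05-12": 9980,
    "2017-05-13": 9900,  "2017-05-14": 6100,  "2017-05-15": 5900,
}

dates = sorted(daily_sessions)
for i, day in enumerate(dates):
    window = [daily_sessions[d] for d in dates[max(0, i - 7):i]]
    if window:
        baseline = sum(window) / len(window)
        if daily_sessions[day] < 0.8 * baseline:   # more than 20% below baseline
            print(f"{day}: {daily_sessions[day]} vs baseline {baseline:.0f} -- investigate")
```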

If you’re working on an enterprise site and there is a steep cliff within a 1-2 day timeframe, you’re likely dealing with an attribution error or a manual action. Check Google Search Console to see if you have a manual action for spammy practices like unnatural links, cloaking, thin content, hidden text, or keyword stuffing.

Contact your data analytics team to see if there have been changes or issues with tracking on your site before enlisting the help of your SEO team.

Once you have your date range and your data analytics team has verified accurate tracking, you can start by ruling out the impact of major Google algorithm updates, because they’re usually the least frequent and most easily identifiable.

Was it a Google algorithm update?

There are two types of algorithmic updates that may impact your site, those that change the search engine results pages (SERPs) themselves and those that penalize your site for violating Google’s guidelines.

Mozcast is a great place for a quick temperature check on SERP volatility, alongside reading through coverage on Search Engine Land, Search Engine Watch, and the WebmasterWorld forums to understand whether any updates could have impacted the site. If the date of your traffic drop lines up with a Google algorithm update, it’s likely that your site has been impacted.
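If you want to automate that comparison, a tiny sketch like this works; the update dates are placeholders you’d replace with dates pulled from Mozcast or industry coverage.

```python
# Check whether the start of the traffic drop overlaps a known algorithm update.
# The update dates below are placeholders, not a real update list.
from datetime import date

drop_start = date(2017, 5, 14)                         # from your analytics review
known_updates = [date(2017, 5, 13), date(2017, 3, 8)]  # hypothetical examples

for update in known_updates:
    if abs((drop_start - update).days) <= 3:           # within a few days either side
        print(f"Drop lines up with the {update} update; likely algorithmic.")
```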

If the dates don’t line up, then congratulations, you have another issue on your hands.

The second most common culprit behind traffic drops is site changes. Product managers often make site changes that hurt SEO, especially in large organizations where SEO might not be consulted before a product launch.

And in some cases, it might even be your team that accidentally did the damage.

Make sure you contact your product team and look over their sprint boards to see if you can find changes that could have impacted your traffic. The most common changes that hurt SEO include:

  1. Adding, moving or removing links from the page
  2. Adding, moving or removing text from the page
  3. Post-loading content on the page
  4. Breaking existing SEO site features (title tags, metas, canonicals, HREFLANG)
  5. Deactivating pages without properly 301 redirecting
  6. Moving content onto a subdomain
  7. Generating duplicate content
  8. Creating desktop & mobile interstitials or modals
  9. Launching product changes that hurt user metrics (conversion rate, bounce rate)

Run through this list of common mistakes on the page types that were impacted. View Source and Inspect Element are the best ways to understand how Google sees your webpage. Make sure you look at both your desktop and mobile source code when trying to uncover the technical issue to get your traffic back.
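To speed up that check, you can fetch the same page with a desktop and a mobile user agent and compare a few on-page elements. A rough sketch; the URL and user-agent strings are abbreviated placeholders:

```python
# Fetch a page as desktop and as mobile, then compare title, link count, and
# canonical presence. URL and user-agent strings are simplified placeholders.
import re
import urllib.request

def fetch(url, user_agent):
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    return urllib.request.urlopen(req, timeout=10).read().decode("utf-8", errors="ignore")

url = "https://www.example.com/c/mens-shoes/"
desktop_html = fetch(url, "Mozilla/5.0 (Windows NT 10.0)")
mobile_html = fetch(url, "Mozilla/5.0 (Linux; Android 7.0) Mobile")

for label, html in (("desktop", desktop_html), ("mobile", mobile_html)):
    title = re.search(r"<title>(.*?)</title>", html, re.S)
    links = len(re.findall(r"<a\s", html, re.I))
    has_canonical = bool(re.search(r'rel=["\']canonical["\']', html, re.I))
    print(label, "| title:", title.group(1).strip() if title else "MISSING",
          "| links:", links, "| canonical:", has_canonical)
```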

Making the Business Case

It’s likely that multiple hypotheses will sound right, but in the end only one of them (or none at all) may be. Create a list of your hypotheses and all the evidence you have to support them. If you’re unsure which hypothesis is the culprit, create multiple tickets for your engineers to implement at different times.

Unlike A/B testing, this type of testing relies on pre- and post-analysis. If you’re uncertain that a change is what actually brought your traffic back, you can always turn your fix off and then back on to be certain.

Site Change | Date | % Traffic Drop
Doubled the number of links in the global navigation. | 5/12/17 | 30%
Removed links to product pages. | 5/14/17 | 50%
Removed product descriptions from the source code. | 5/16/17 | 15%
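As a sketch of the pre/post comparison behind a table like the one above, you can average traffic on either side of a suspected change date; the dates and session counts here are made up.

```python
# Compare average daily organic sessions before and after a suspected change
# date. All numbers below are invented for illustration.
from datetime import date

change_date = date(2017, 5, 14)
daily_sessions = {
    date(2017, 5, 10): 10200, date(2017, 5, 11): 10050, date(2017, 5, 12): 9980,
    date(2017, 5, 13): 9900,  date(2017, 5, 14): 6100,  date(2017, 5, 15): 5900,
    date(2017, 5, 16): 6050,  date(2017, 5, 17): 5980,
}

pre = [v for d, v in daily_sessions.items() if d < change_date]
post = [v for d, v in daily_sessions.items() if d >= change_date]
pre_avg, post_avg = sum(pre) / len(pre), sum(post) / len(post)
print(f"Pre: {pre_avg:.0f}/day  Post: {post_avg:.0f}/day  "
      f"Change: {(post_avg - pre_avg) / pre_avg:+.0%}")
```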

Once you’ve solved the traffic drop, record the date of the drop, the % traffic loss, and the root cause so that your team has a record to learn from for future traffic drops. This will also help you when explaining SEO performance to senior leadership.

How Cleaning Up 404 Errors Can Improve Crawl Rate Limit

March 20, 2017 in Blog, Ongoing SEO, Technical Optimization

Gary Illyes published an article in January on the Google Webmaster Central Blog, “What Crawl Budget Means for Googlebot,” which brought a lot of attention to the topic of a site’s crawl budget. That same week, I started recovery work on a site whose crawl budget had been decreasing since November 2016.

Using the latest news on crawl budget and data from Google Search Console, I was able to track down the problem to an increase in 404 errors that was creating poor crawl health and dragging down the site’s crawl rate limit.

What is a Crawl Rate Limit & Why Does it Matter

Googlebot crawls the web to discover pages to store in Google’s index. If your pages aren’t crawled by Googlebot, then searchers won’t be able to find them, since only pages stored in the index are returned by a user’s query.

Every site is given a crawl rate limit, which limits the maximum fetching rate for a given site — or the amount of data that Googlebot will crawl. You want Googlebot to crawl the maximum number of pages possible, so more of your site’s content with ranking potential is indexed and can compete to rank in the SERPs.

How to Diagnose Crawl Rate Limit Problems

Crawl rates fluctuate naturally for a few different reasons:

  1. Someone manually set a crawl rate limit in Google Search Console
  2. A subfolder was added to or removed from the ROBOTS.TXT file
  3. A large number of pages were added to or removed from your site
  4. Anecdotal and often unexplained spikes in Googlebot crawls on your site

If you see a dip in your crawl rate, make sure none of the first three has happened by checking your Search Console settings, verifying the contents of your ROBOTS.TXT file, and checking for an increase in HTTP errors.

Verify Crawl Rate Limit Settings in Google Search Console

1. Select the SETTINGS GEAR in the upper right-hand corner.

2. From the dropdown menu, select SITE SETTINGS.

3. Under Crawl Rate, verify that LET GOOGLE OPTIMIZE FOR MY SITE is selected.

Common Issues with ROBOTS.TXT Files That Hurt Crawl Rate

To view the contents of your robots.txt file, navigate to your root domain with the ROBOTS.TXT filepath — MYDOMAIN.COM/ROBOTS.TXT.

A properly-structured ROBOTS.TXT file is a key component of any site, especially sites with hundreds, thousands, or even millions of pages. To prevent unwanted changes to your ROBOTS.TXT file, make sure that you limit the number of people who have access to editing the document.

If, for some reason, you or someone on your team made a change to the file that hurt your crawl rate limit, it’s easy to revert the changes. You don’t have to be an SEO expert to read a ROBOTS.TXT file; all you need to know for this exercise is that DISALLOW means a robot will not crawl the filepath and ALLOW means it will.

Look for the disallow lines in your ROBOTS.TXT file to see if your whole site or large sections of your site have been blocked from robots, which would prevent Google from crawling them and reduce the total number of pages crawled. The most important ones to watch out for:

  • Disallow: /
  • Disallow: /subfolder-name/

In the first example, someone would have blocked your whole site; in the second, an entire subfolder that could consist of a large number of pages. If either of these happens, all you have to do is remove the unwanted statement from your ROBOTS.TXT file and monitor your Search Console crawl analytics for an improvement in the number of pages and bytes crawled. Unfortunately, for the site I was working on, neither the site settings nor the ROBOTS.TXT file could explain the drop in the site’s crawl rate limit.
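For a quick sanity check of your disallow rules, you can run a handful of important URLs through Python’s built-in robots.txt parser. The domain and paths below are placeholders.

```python
# Confirm that Googlebot is still allowed to fetch your key pages according to
# your live ROBOTS.TXT file. Domain and paths are placeholders.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://www.example.com/robots.txt")
rp.read()

important_urls = [
    "https://www.example.com/",
    "https://www.example.com/subfolder-name/",
]
for url in important_urls:
    status = "allowed" if rp.can_fetch("Googlebot", url) else "BLOCKED"
    print(status, url)
```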

Case Study: How HTTP Errors Can Reduce Overall Crawl Rate Limit

The article on the Google Webmaster Central Blog confirmed that crawl health influences a site’s crawl rate limit. In diagnosing the site, it became clear to me just how much crawl health matters for all sites, even smaller ones.

In the article, Gary Illyes of Google states:

“First, we’d like to emphasize that crawl budget, as described below, is not something most publishers have to worry about. If new pages tend to be crawled the same day they’re published, crawl budget is not something webmasters need to focus on. Likewise, if a site has fewer than a few thousand URLs, most of the time it will be crawled efficiently.”

The key phrase here is most of the time, and I was looking for an explanation for the fraction of the time when this is not the case. The last place for me to look was the site’s crawl health. In the Top Questions portion of the article, we learn:

“…a significant number of 5xx errors or connection timeouts signal the opposite, and crawling slows down. We recommend paying attention to the Crawl Errors report in Search Console and keeping the number of server errors low.”

In Crawl Errors, I found an increase in 404 errors, not 500s, during late October and early November 2016, right around when the crawl budget decreased.

The site had recently been migrated to a new platform, and some of the URLs changed during the migration and were now serving 404 errors. We implemented 301 redirects to the appropriate new pages and saw the crawl rate limit increase by 107% over 4 weeks.
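A simple way to spot-check this kind of migration fallout is to request the old URLs and see whether they 404 or redirect cleanly to a 200. A sketch with placeholder URLs:

```python
# Request old URLs from the migration: report 404s that still need a 301, and
# show where existing redirects resolve. The URLs below are placeholders.
import urllib.error
import urllib.request

old_urls = [
    "https://www.example.com/old-product-page",
    "https://www.example.com/old-category/",
]

for url in old_urls:
    try:
        resp = urllib.request.urlopen(url, timeout=10)   # follows redirects
        print(url, "->", resp.geturl(), resp.getcode())
    except urllib.error.HTTPError as err:
        print(url, "returned", err.code, "-- needs a 301 redirect")
```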

At just under 500 pages, this site showed that even small sites’ crawl rates are influenced by poor crawl health. Aside from server errors, client errors like 404s also seem to have an impact on the crawl rate limit.

Conclusion

Having a process for diagnosing crawl rate limit drops makes it faster to resolve sitewide issues. By limiting who has access to both your Search Console account and your ROBOTS.TXT file, you can prevent unwanted changes or quickly isolate the changes that could have impacted your site.

Keep an eye on your Crawl Errors report by setting up a monthly process to check in on your site’s crawl health — which is great for both users and Googlebot.

10 Useful & Fun SEO Plugins for WordPress

March 8, 2017 in Blog, Ongoing SEO


When it comes to using WordPress as your CMS, there are a lot of plugins that make life easier, including ones that will help you optimize your site for search. I’m personally impressed with how far plugins have come — from content delivery networks (CDNs) to schema markup, WordPress plugins are more flexible than ever before. And plugins like GIPHYPress just make writing a lot more fun.
 
Whenever I’m looking to download a new plugin, there are three criteria that I always consider:
 
a) Overall number of ratings and reviews
b) Qualitative feedback
c) Date the plugin was last updated
 
Below, you’ll find a list of plugins that I currently couldn’t live without — because who wants to struggle to integrate their tracking code into their website or have every page inherit the same meta description?
 

Insert Headers and Footers


 
Stars: 4.3 out of 5 stars
 
Reviews: 34 reviews
 
Active Installs: 200,000+
 
Download: https://wordpress.org/plugins/insert-headers-and-footers/
 
This is a really simple plugin for adding scripts to the head of your website, which is particularly useful for Google Analytics and Search Console verification. If you ever need to add additional scripts, all you have to do is paste them into a simple text field.
 

Schema.press

 

 
Stars: 4.8 out of 5 stars
 
Reviews: 44 reviews
 
Active Installs: 10,000+
 
Download: https://wordpress.org/plugins/schema/
 
Created in July 2016, this lightweight, free WordPress plugin is perfect for adding schema.org structured data to your site. You can use the recommended JSON-LD format to apply schema to multiple product or service pages to help Google better understand the type of content on your site.
 

Ultimate Noindex Nofollow Tool II


 
Stars: 3.4 out of 5 stars
 
Reviews: 7 reviews
 
Active Installs: 10,000+
 
Download: https://wordpress.org/plugins/ultimate-noindex-nofollow-tool-ii/
 
Even though this tool hasn’t been updated in a while, I find that it’s really effective for noindexing and nofollowing pages across your site. If you use tags actively across your site, you can prevent duplicate content by blocking the tag pages with a meta noindex tag.
 

W3 Total Cache

 

 
This SEO plugin for WordPress is perfect for improving site performance through caching and content delivery network (CDN) integration. I was able to get some meaningful page speed improvements out of W3 Total Cache and have been impressed with its robust free and paid features, from minification to easy purging of the cache and CDN.
 
Stars: 4.3 out of 5 Stars
 
Reviews: 3,765 reviews
 
Active Installs: 1+ million
 
Download: https://wordpress.org/plugins/w3-total-cache/
 

Far Future Expiration Plugin

 

 
In the spirit of improving site performance, this plugin accesses your .htaccess file and inserts a snippet of code that sets an expiry time, preventing browsers from repeatedly fetching images, CSS, and JavaScript files that you haven’t changed. In other words, it makes your site faster because each page load requires fewer resources to fetch.
 
Stars: 4.2 out of 5 stars
 
Reviews: 15 reviews
 
Active Installs: 10,000+
 
Download: https://wordpress.org/plugins/far-future-expiry-header/
 

Yoast

 

 
Not a big surprise to most people who have been around the WordPress Plugin world, Yoast is one of the leading players in the SEO space. From customizing title tags and meta descriptions to submitting XML sitemaps, Yoast makes handling all the fundamental building blocks of SEO relatively easy.
 
Stars: 4.8 out of 5 stars
 
Reviews: 15 reviews
 
Active Installs: 3+ million
 
Download: https://wordpress.org/plugins/wordpress-seo/
 

WP Post to PDF

 

 
There is a good amount of keyword opportunity out there for queries that end in “PDF.” If you’ve answered the same question in your blog post as you would in a separate PDF, why spend the time creating an entirely different document? This plugin turns your blog post into a PDF so readers can download the information from your site.
 
Stars: 4.6 out of 5 stars
 
Reviews: 15 reviews
 
Active Installs: 4,000+
 
Download: https://wordpress.org/plugins/wp-post-to-pdf-enhanced/
 

WP Google Analytics Events

 

 
By creating custom class names for action items on your site, this events plugin allows you to track how many people click on your sign-up form, contact button, and much more. All you need is your Google Analytics tracking code and the ability to add custom classes to discrete elements.
 
Stars: 4.6 out of 5 stars
 
Reviews: 15 reviews
 
Active Installs: 10,000+
 
Download: https://wordpress.org/plugins/wp-google-analytics-events/
 

TinyMCE Advanced

 

 
Chances are, if you’re using WordPress, you are regularly accessing and editing the content on your site using TinyMCE. This advanced WordPress visual editor gives you more options, from setting font families and sizes to creating and editing tables.
 
Stars: 4.6 out of 5 stars
 
Reviews: 15 reviews
 
Active Installs: 2+ million
 
Download: https://wordpress.org/plugins/tinymce-advanced/
 

Ninja Forms

 

 
OK, even though a form technically won’t help you rank better in search, you’re going to need to do something with the traffic that lands on your site. Ninja Forms is my favorite plugin right now for creating and styling contact forms and managing leads. Its interface makes it easy to see which forms people use the most on your site and respond to different types of inquiries.
 
Stars: 4.5 out of 5 stars
 
Reviews: 776
 
Active Installs: 800,000+
 
Download: https://wordpress.org/plugins/ninja-forms/
 
With these plugins, you should be able to cover both beginner and advanced SEO. Are there any other plugins you’re using to optimize your site?