With 1,600 Google algorithm updates every year and numerous product managers working on daily deploys, tracking down the cause of an enterprise-level organic traffic drop takes precious resources in already strapped environments. And, if you haven’t optimized your site for internal reporting or you don’t have a method in place for investigating traffic drops, then the time it takes to identify and fix the problem can be costly.
Whether you work on an enterprise site with 10 million pages or a smaller site with 100 pages, the first step to solving your problem is narrowing your search: come up with a list of hypotheses that you can gather data for by enlisting the help of your team.
Start by Memorizing Traffic Drop Types
There are two main types of traffic drops:
- Organic traffic drops that impact a specific page type (technical or topical category)
- Organic traffic drops that impact the entire site
To determine which you have on your hands, make sure that you can report on each page type distinctly. For eCommerce sites, this means being able to quickly compare the organic traffic performance of your product listing pages (PLPs) and product detail pages (PDPs). One of the easiest ways to do this is with unique URL patterns or page type tracking.
Aside from the technical page type (e.g. PLP vs. PDP), you should also make it easy to report on the different topical categories on your site. For example, if your website sells shoes, did you lose rankings on a specific category of shoes, such as men's or women's? Or did you lose organic traffic across all shoe types?
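One lightweight way to get this kind of page type reporting is to classify URLs by path pattern before aggregating sessions. The sketch below assumes hypothetical URL conventions (`/c/` for listing pages, `/p/` for detail pages); adjust the rules to match your own site's structure.

```python
import re
from collections import Counter

# Hypothetical URL conventions -- adjust these patterns to your own site.
PAGE_TYPE_RULES = [
    (re.compile(r"^/c/"), "PLP"),  # product listing pages, e.g. /c/mens-shoes
    (re.compile(r"^/p/"), "PDP"),  # product detail pages, e.g. /p/trail-runner
]

def classify(path):
    """Map a URL path to a page type for segmented reporting."""
    for pattern, page_type in PAGE_TYPE_RULES:
        if pattern.match(path):
            return page_type
    return "other"

def sessions_by_page_type(rows):
    """Aggregate (path, sessions) rows from an analytics export by page type."""
    totals = Counter()
    for path, sessions in rows:
        totals[classify(path)] += sessions
    return dict(totals)
```

The same idea extends to topical categories: add patterns like `/c/mens-` vs. `/c/womens-` to split a category tree the same way.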
If you’ve looked through your data and determined that you have a sitewide traffic drop on your hands, then you can safely narrow this down to one of three culprits:
- Google algorithm update
- Broken or inaccurate tracking
- Major technical deploy toward the top of your site hierarchy
There are a few different steps you can take to narrow down which one it may be.
Solving Organic Traffic Drops
For any type of traffic drop, start by finding a timeframe, roughly 1-2 days, in which the traffic fell below what's expected. For some algorithmic traffic drops, you might instead be looking at a downward-sloping trend over 4-6 weeks.
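To make "below what's expected" concrete, you can compare each day's organic sessions against a trailing baseline and flag the days that fall sharply below it. A minimal sketch (the 7-day window and 20% threshold are illustrative assumptions, not fixed rules):

```python
from statistics import mean

def find_drop_days(daily_sessions, window=7, threshold=0.20):
    """Return indexes of days whose traffic fell more than `threshold`
    below the mean of the preceding `window` days."""
    drops = []
    for i in range(window, len(daily_sessions)):
        baseline = mean(daily_sessions[i - window:i])
        if daily_sessions[i] < baseline * (1 - threshold):
            drops.append(i)
    return drops

# Example: steady traffic, then a steep two-day cliff.
traffic = [100, 102, 98, 101, 99, 103, 100, 55, 52]
cliff = find_drop_days(traffic)  # flags the last two days
```

A sloping 4-6 week decline won't trip a threshold like this on any single day, which is itself a useful signal that you may be looking at an algorithmic drop rather than a steep deploy-driven cliff.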
If you’re working on an enterprise site and there is a steep cliff within a 1-2 day timeframe, you’re likely dealing with an attribution error or a manual action. Check Google Search Console to see if you have a manual action for spammy practices that violate Google’s guidelines, like cloaking, thin content, hidden text, or keyword stuffing.
Contact your data analytics team to see if there have been changes or issues with tracking on your site before enlisting the help of your SEO team.
Once you have your date range and your data analytics team has verified accurate tracking, you can start by ruling out the impact of major Google algorithm updates, because they’re usually the least frequent and most easily identifiable.
Was it a Google algorithm update?
There are two types of algorithmic updates that may impact your site: those that change the search engine results pages (SERPs) themselves and those that penalize your site for violating Google’s guidelines.
MozCast is a great place for a quick temperature check on SERP volatility; you can also read through articles on Search Engine Land, Search Engine Watch, and the WebmasterWorld forum to see whether any updates could have impacted the site. If the date of your traffic drop lines up with a Google algorithm update, then it’s likely that your site has been affected.
If the dates don’t line up, then congratulations: you have another issue on your hands.
The second most common culprit of traffic drops is site changes. Product managers often make changes that hurt SEO, especially in large organizations where the SEO team might not be consulted before a product launch.
And in some cases, it might even be your team that accidentally did the damage.
Make sure you contact your product team and look over their sprint boards to see if you can find changes that could have impacted your traffic. The most common changes that hurt SEO include:
- Adding, moving or removing links from the page
- Adding, moving or removing text from the page
- Post-loading content on the page
- Breaking existing SEO site features (title tags, meta descriptions, canonicals, hreflang)
- Deactivating pages without properly 301 redirecting
- Moving content onto a subdomain
- Generating duplicate content
- Creating desktop & mobile interstitials or modals
- Launching product changes that hurt user metrics (conversion rate, bounce rate)
Run through this list of common mistakes on the page types that were impacted. View Source and Inspect Element are the best ways to understand how Google sees your webpage. Make sure you look at both your desktop and mobile source code when trying to uncover the technical issue and get your traffic back.
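When comparing View Source output across desktop and mobile, a small script can catch missing SEO features faster than eyeballing the markup. A rough sketch using only the standard library (the set of checks here is illustrative; extend it to match your own requirements):

```python
from html.parser import HTMLParser

class SEOFeatureScan(HTMLParser):
    """Record which SEO-critical tags are present in a page's raw source."""
    def __init__(self):
        super().__init__()
        self.found = set()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.found.add("title")
        elif tag == "link" and attrs.get("rel") == "canonical":
            self.found.add("canonical")
        elif tag == "link" and attrs.get("hreflang"):
            self.found.add("hreflang")
        elif tag == "meta" and attrs.get("name") == "description":
            self.found.add("meta description")

def missing_seo_features(html_source):
    """Return the expected SEO features absent from the given HTML source."""
    scanner = SEOFeatureScan()
    scanner.feed(html_source)
    expected = {"title", "canonical", "hreflang", "meta description"}
    return sorted(expected - scanner.found)
```

Run it against both the desktop and the mobile source to spot a feature that one template renders but the other drops. Note that this only checks the raw source; content injected by JavaScript (the post-loading issue above) requires inspecting the rendered DOM instead.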
Making the Business Case
It’s likely that multiple hypotheses will sound right, but only one (or none) of them may be. Create a list of your hypotheses and all the evidence you have to support each one. If you’re unsure which hypothesis is the culprit, then create multiple tickets for your engineers to implement at different times.
Unlike A/B testing, this type of testing relies on pre- and post-analysis. If you’re uncertain whether a change is what actually brought your traffic back, you can always turn your fix off and then back on to be certain.
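A simple way to quantify a pre/post test is to compare mean daily traffic before and after the fix went live. A sketch (date handling omitted; it assumes you have already split the daily series at the deploy date):

```python
from statistics import mean

def pre_post_lift(pre_sessions, post_sessions):
    """Percentage change in mean daily sessions after the fix went live."""
    before, after = mean(pre_sessions), mean(post_sessions)
    return (after - before) / before * 100

# Example: traffic recovers after the fix is deployed.
lift = pre_post_lift([70, 72, 68], [95, 101, 99])
```

If toggling the fix off and on again produces a matching drop and recovery, that is strong evidence the fix, not an unrelated factor, caused the change.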
| Site Change | Date | % Traffic Drop |
| --- | --- | --- |
| Doubled the number of links in the global navigation. | 5/12/17 | 30% |
| Removed links to product pages. | 5/14/17 | 50% |
| Removed product descriptions from the source code. | 5/16/17 | 15% |
Once you’ve solved the traffic drop, record the date of the drop, the % traffic loss, and the root cause so that your team has a record to learn from for future traffic drops. This will also help you when explaining SEO performance to senior leadership.