By Glenn Gabe, from Search Engine Watch – http://bit.ly/Y77BQ9

For those dealing with a major drop in traffic due to an algorithm update, I’m not sure there’s a better feeling in the world than seeing recovery. The original algo hit typically feels like a punch in the gut, with serious confusion and frustration quickly following. Then come months of recovery work, with hope (and faith) that your hard work will pay off. And for some, recovery will never happen. That’s an unfortunate truth for sites that have repeatedly pushed the limits with spam over a long period of time.

So, when you finally see something like the screenshot below, it’s hard not to throw your hands up in the air and dance in the streets. The trending below is from a client that recovered during the 9/5 Panda update after a horrible hit this past spring. They had been working on recovery for more than four months and are up 81 percent since the 9/5 Panda update.

[Image: post-recovery Panda trending after the 9/5/2014 update]

That’s awesome, but that’s typically when I tell clients to stop celebrating, put down the piña coladas, and get ready to take action. You see, this is exactly the time to analyze that surge in traffic to ensure you aren’t feeding Google the same (or similar) problems that got you hit in the first place.

Just because traffic is surging doesn’t mean you can’t get hit again. Just read my post about the sinister surge in traffic before an algorithm hit to learn more about that phenomenon. And now that Panda is near-real-time, you can actually get hit again relatively quickly. I have seen a number of temporary recoveries over the past few months (especially since Panda 4.0).

[Image: temporary Panda recovery trending]

Revisiting the Gray Area of Panda (and Algorithms in General)

To better understand why and how this can happen, let’s quickly revisit the gray area of Panda (and other algorithms). It’s an important point that I think many people don’t understand. With any algorithm, there’s an inherent gray area. If you cross the algorithm’s threshold, you can get hit by the update. If you stay below that threshold, you probably won’t experience the negative drop. And as you approach that threshold, there are various shades of danger.

I’ve provided an image below that depicts the various levels of danger when it comes to algorithm updates. You have the area in white, which is the safe zone. Then there’s the gray area, with various shades of gray. The closer you get to the strike zone, the darker the gray area becomes. And then you have the impact zone, where you clearly cross the algorithmic threshold. That actually begins in the darker gray area and leads to the zone that is blacked out (which is pure algorithmic hell).

[Image: the gray area of Panda, from the safe zone to the impact zone]

The gray area is important to understand because you want to stay out of it. If you enter the gray area, either knowingly or unknowingly, you are in danger of being hit by Panda. And if you do get hit, and you remain in the gray area, you will never know how far you are from recovery. Google unfortunately doesn’t inform webmasters that they are being algorithmically demoted or how close they are to recovery. And if you don’t make enough changes, you can sit in the gray area forever. I’ve explained before that the gray area of Panda is one of the most frustrating places to sit when you are trying to recover.

Recovery-wise, if you only perform enough work to barely exit the gray area, you can recover, but only temporarily. Since you are on the fringe of the strike zone, you can easily get hit by subsequent algorithm updates. As I’ve explained before here on Search Engine Watch, SEO band-aids do not lead to long-term recovery; a site can jump in and out of the gray area, experiencing a hit, then recovery, then another hit, then another recovery, and so on. It’s a horrible place to live.

[Image: SEO band-aids and the in-and-out Panda cycle]

Post-Recovery Checks to Ensure the Surge Remains

OK, now that I’ve covered why it’s important to analyze the post-recovery surge in traffic, it’s time to take action. As a webmaster experiencing Panda recovery, you want to make sure user engagement is strong, users from Google organic are happy, and your landing pages are as high-quality as possible. You don’t want to find a bunch of low-quality landing pages, blank pages, thin content, heavy ad problems, technical glitches, etc. Just like bamboo, they can attract the mighty Panda again.

Here are five things you can do today to ensure the recovery surge is going to pay off, and not send you back to the gray area of Panda.

1. Analyze Top Landing Pages From Google Organic (via Google Analytics and Google Webmaster Tools)

When you’re hit by Panda, it’s important to analyze the top landing pages leading to the site prior to the Panda hit. That’s where Google is getting user engagement data from, and you can often find glaring issues while going through the process of checking those pages.

Well, now that Google traffic has returned post-recovery, it’s smart to analyze the top pages receiving that traffic to ensure all is OK from a content quality standpoint. The last thing you want to do is drive users from Google organic to thin pages, broken pages, pages with serious ad problems, etc.

I recommend comparing the post-recovery timeframe to the previous timeframe to determine the change in traffic per URL (from Google organic). You can do this via both Google Analytics and Google Webmaster Tools. Then dig into the content receiving the most traffic post-Panda recovery to ensure all is OK.
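
If you’re comfortable with a bit of scripting, here’s a minimal sketch of that comparison in Python. It assumes you’ve exported the Google organic landing page report for both timeframes as CSVs; the file names and column headers below are placeholders, so adjust them to match your own export.

```python
# Minimal sketch: compare landing page traffic before and after recovery.
# Assumes two CSV exports of Google organic landing pages (one per timeframe),
# each with "Landing Page" and "Sessions" columns -- adjust to your export.
import pandas as pd

before = pd.read_csv("organic_landing_pages_before.csv")
after = pd.read_csv("organic_landing_pages_after.csv")

# Join the two timeframes on landing page URL so each row has both session counts.
merged = before.merge(
    after, on="Landing Page", how="outer", suffixes=("_before", "_after")
).fillna(0)

# Calculate the absolute and percentage change per URL.
merged["change"] = merged["Sessions_after"] - merged["Sessions_before"]
merged["pct_change"] = merged["change"] / merged["Sessions_before"].replace(0, 1) * 100

# Surface the pages driving the post-recovery surge so they can be reviewed by hand.
top_gainers = merged.sort_values("change", ascending=False).head(50)
print(top_gainers[["Landing Page", "Sessions_before", "Sessions_after", "pct_change"]])
```

The pages at the top of that list are the ones to review manually for content quality, ad problems, and technical glitches.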

[Image: top landing pages from Google organic in Google Analytics]

Also keep an eye on the keywords leading to the various landing pages and make sure they line up. For example, make sure your landing pages provide rich information based on what users are searching for. If not, those users might not be very happy. And if they aren’t, Google can certainly pick that up.
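
Here’s a rough sketch of one way to spot-check that alignment at scale. It assumes you’ve built a simple CSV of query-to-landing-page pairs from your Webmaster Tools or analytics keyword data; the file name and column names are hypothetical, and the text check is intentionally crude.

```python
# Rough sketch: flag landing pages whose top queries never appear in the page text.
# Assumes a CSV with "query" and "page" columns built from your keyword data.
import csv
import re
import requests

def page_text(url):
    """Fetch a URL and return its visible text, crudely stripped of markup."""
    html = requests.get(url, timeout=10).text
    html = re.sub(r"<(script|style).*?</\1>", " ", html, flags=re.S | re.I)
    return re.sub(r"<[^>]+>", " ", html).lower()

with open("queries_to_pages.csv", newline="") as f:
    for row in csv.DictReader(f):
        text = page_text(row["page"])
        # Flag query terms that never appear on the landing page at all.
        missing = [t for t in row["query"].lower().split() if t not in text]
        if missing:
            print(f'{row["page"]}: query "{row["query"]}" terms missing: {missing}')
```

A flagged page isn’t automatically a problem, but it’s a good candidate for a manual look at whether the content really answers the query.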

2. Check Mobile Versus Desktop Traffic (From Google Organic)

In my last post, I explained how mobile traffic could be impacting your Panda situation. For example, if 50 percent of your Google organic traffic is from smartphone users, then that’s 50 percent of the data Google is going to measure when it comes to Panda.

Therefore, it’s important to understand how much mobile traffic is hitting your site from Google, and where that traffic is going. Then you should analyze that traffic via mobile devices. Yes, that means using multiple mobile devices to test the top landing pages for smartphone users.
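
To size that up quickly, here’s a small sketch that segments a Google organic export by device. It assumes a CSV with "Landing Page", "Device Category", and "Sessions" columns (for example, a Google Analytics export); adjust the names to match your data.

```python
# Small sketch: size up mobile vs. desktop Google organic traffic and list the
# top landing pages for smartphone users. Column names are assumptions.
import pandas as pd

df = pd.read_csv("organic_by_device.csv")

# What share of Google organic traffic comes from each device category?
share = df.groupby("Device Category")["Sessions"].sum()
print((share / share.sum() * 100).round(1))

# Top landing pages for smartphone users -- the pages to test on real devices.
mobile = df[df["Device Category"] == "mobile"]
top_mobile = (
    mobile.groupby("Landing Page")["Sessions"].sum().sort_values(ascending=False)
)
print(top_mobile.head(25))
```

The top mobile landing pages from that output are the ones to pull up on actual smartphones.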

[Image: mobile traffic data in Google Webmaster Tools]

I just went through this process for several clients of mine that recovered during the 9/5 Panda update. You would be surprised what you can find. During my analysis, I found technical problems, content problems, ad problems, and more. I even found problems that had been fixed after my initial audit but had since returned. And my clients had no idea that was happening.

3. Use Fetch and Render

After Panda 4.0 (coincidentally or not), Google released fetch and render in Google Webmaster Tools. The tool enables you to fetch a URL on your site as Googlebot and also view a snapshot of the rendered page. Googlebot can now fetch the necessary resources, like JavaScript and CSS, and actually render the page (much like a user’s browser would).

I highly recommend testing a number of the top landing pages post-Panda recovery via fetch and render. Similar to what I explained earlier, you might be surprised what you find. I recently found a number of problems across companies that recovered, including landing pages that wouldn’t render at all, just the template rendering without the core content, ads that were broken or wouldn’t display, and more. Without using fetch and render, you might miss serious rendering problems that Googlebot is actually running into.

[Image: fetch and render results in Google Webmaster Tools]

And by the way, you can use fetch and render as Googlebot (desktop) and Googlebot for Smartphones. This enables you to render top landing pages from desktop or mobile the way that users would see them.
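
Alongside fetch and render, you can also run a quick automated spot check of your top landing pages. The sketch below requests each page with desktop and smartphone Googlebot-style user-agent strings and confirms it returns a 200 and still contains a snippet of its core content. Note the caveats: it does not execute JavaScript (so it complements, not replaces, fetch and render), the user-agent strings are illustrative and may not match Google’s current ones, and the URLs and snippets are hypothetical placeholders.

```python
# Rough spot check: request top landing pages with Googlebot-style user agents
# and confirm a 200 response plus the presence of a core content snippet.
# This does NOT execute JavaScript, so it complements fetch and render.
import requests

USER_AGENTS = {
    "desktop": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "smartphone": (
        "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
        "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 Mobile "
        "Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
    ),
}

# Hypothetical list of top post-recovery landing pages and a phrase from each
# page's core content -- replace with your own URLs and snippets.
PAGES = {
    "https://www.example.com/guide/": "step-by-step guide",
}

for url, snippet in PAGES.items():
    for label, ua in USER_AGENTS.items():
        resp = requests.get(url, headers={"User-Agent": ua}, timeout=10)
        ok = resp.status_code == 200 and snippet.lower() in resp.text.lower()
        print(f"{url} [{label}]: status={resp.status_code}, core content found={ok}")
```

Anything that fails here, or renders oddly in fetch and render, deserves a closer look before it erodes your recovery.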

4. Utilize Human Review (Yes, Real Human Beings)

There are many Panda victims that never go through the process of having real people test their websites and provide feedback. Too many seek out simple technical problems instead of trying to understand true user engagement. Don’t get me wrong, technical problems can definitely cause content quality issues, but Panda heavily takes user happiness into account. When performing audits, I often surface serious problems that impact user engagement (and cause users to bounce off the site).

John Mueller from Google has explained this point a number of times, yet I still find Panda victims trying to hunt down the silver Panda bullet. In my experience, Panda is rarely caused by one issue. There are typically a number of problems I surface during deep Panda audits. I know that might be frustrating for some people to hear, but it’s the truth.

Here’s a graphic representing the deadly Panda cocktail. And yes, it will definitely leave you and your site with a nasty hangover.

[Image: the deadly Panda cocktail]

To better understand possible user engagement problems (even after recovery), I recommend having neutral third parties go through your website and provide objective feedback. Most business owners are way too close to their own websites to objectively review their content, user experience, ad situation, etc. And then they spin their wheels working on Panda recovery.

I recommend having a test group go through your website with a list of goals to achieve. Vary those goals and have them document everything. Make sure they understand you don’t want sugarcoated feedback. If something looks off, sounds weird, looks spammy, etc., they should document it. Have them jot down technical glitches, content issues, grammatical and spelling errors, advertising problems, usability issues, etc. Ask how they feel about the design of your website and how trustworthy the site is.

5. Perform a Crawl Analysis

In my previous posts about Panda, I explained how important a crawl analysis can be for hunting down Panda problems after getting hit by an update. That’s especially the case for larger-scale websites, such as sites with more than 500,000 pages indexed.

But crawls can help after recovery, too. If you have recently recovered from Panda, then it makes complete sense to crawl the top pages receiving traffic from Google organic post-recovery. This will enable you to gather data about those pages at scale, so you can hunt down thin content, soft 404s, URLs with poor page speed, funky redirects, duplicate content, etc.

There are times a solid crawl can lead you down interesting paths. Then it’s up to you and your SEO team to analyze the situation and determine potential Panda problems. Some findings will definitely be benign, while others will lead to serious bamboo. This is where human SEO intelligence and SEO tools combine to provide critical insights.

[Image: crawl analysis of top landing pages]

Tool-wise, there are several strong solutions for crawling websites. I like using Screaming Frog for small to medium-size websites, and DeepCrawl for larger-scale websites. And you can always use a combination of tools to slice and dice various sections of a website.

Just keep in mind that the reports won’t provide “Panda problems” on a silver platter. You’ll need to use your SEO knowledge to hunt down problems based on what the crawl surfaces.
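
As a starting point, here’s a minimal sketch for mining a crawl export once the crawl is done. It assumes a CSV export from your crawler (for example, Screaming Frog’s internal HTML report) with "Address", "Status Code", and "Word Count" columns; header names vary by tool and version, and the 250-word threshold is just an illustrative cutoff, so adjust both to fit your site.

```python
# Minimal sketch: flag potentially thin pages and non-200 URLs from a crawl export.
# Column names and the word-count threshold are assumptions -- adjust as needed.
import pandas as pd

crawl = pd.read_csv("crawl_export.csv")

# Potentially thin pages: 200 responses with very little indexable text.
thin = crawl[(crawl["Status Code"] == 200) & (crawl["Word Count"] < 250)]

# Pages that don't resolve cleanly: anything other than a 200 response.
broken = crawl[crawl["Status Code"] != 200]

print(f"Possible thin pages: {len(thin)}")
print(thin[["Address", "Word Count"]].head(25))
print(f"Non-200 URLs: {len(broken)}")
print(broken[["Address", "Status Code"]].head(25))
```

Again, nothing in that output is a verdict on its own; it simply gives you a prioritized list of URLs to investigate with your own SEO judgment.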

Summary: Post-Recovery Analysis Can Keep the Panda at Bay

Recovering from an algorithm hit is an amazing feeling. That said, you need to hold off on celebrating until you make sure the users behind your newly found Google organic traffic are happy with your content. Performing a post-recovery analysis can help identify problematic content, user engagement issues, technical problems, advertising issues, and more. And those problems can unfortunately provide a bamboo trail for Panda leading back to your website. Don’t fall victim to subsequent algorithm hits. Performing the proper analysis can keep the mighty Panda (and other algorithmic animals) at bay. And that’s exactly what you want to do.
