Fix the Panda


Yes, you can recover from Panda 2.5

If you have a site or blog that took a big hit from the latest Google algorithm update, don’t panic (Yet!).

It may take time, but it can be done. Now is the time to pay attention to basics. In some ways, if your site was one that got hit very hard, possibly even de-listed, you could actually have an advantage going forward. You can now take drastic action to fix up all the errors and problems that accumulated over the years, or even months…

Broken links and missing content no longer have any relevance, so you can get rid of bad content, plagiarised content, duplicate content, spam, and anything else that has no value, or that may be damaging the site’s reputation with the search engines.

Be Absolutely Ruthless

Look at every piece of content on your site. Be absolutely ruthless. If it lacks quality, or does not add value – DELETE it now. If you really must keep that trashy article, or near-duplicate content, then add a Robots meta tag: <meta name="robots" content="noindex, nofollow" />. These terms are used in different ways to limit what search engines do – "noindex" tells them to ignore the page, "nofollow" tells them to ignore its outgoing links. (For more information about these meta tags and others, Metatags.info has plenty of information.)
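As a minimal sketch (the page title here is only a placeholder), the tag sits inside the HEAD section of the page you want to hide:

  <head>
    <title>That trashy old article</title>
    <meta name="robots" content="noindex, nofollow" />
  </head>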

Be particularly ruthless using the “noindex” term – why have all the junk pages indexed by Google? Including “noindex” will stop Google from adding these irrelevant pages to their index database.

“Nofollow” Meta tags: Use this as well on any page (or other content TYPE) that could get your site into trouble. If you have an RSS feed, this can really cause problems (read the article Too Many Backlinks are Bad News). Placing the Robots "nofollow" meta tag in the HEAD section of the page containing the RSS feed will stop search engines from following any outgoing links from that page (CMS users: add this to feed aggregators, feed categories, etc.).
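A minimal sketch of what this looks like on a feed or aggregator page (the page itself can still be indexed – only its outgoing links are ignored):

  <head>
    <meta name="robots" content="nofollow" />
  </head>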

Drupal 7 site builders, if you haven’t already installed the “Meta Tags Quick” module, now is the time to do so, and add the Links_Nofollow module as well.

Once you have scrapped all the unwanted pages, go to Webmaster Tools and see how many broken links Google has found (wait a day or two to give the spider time to discover the missing links). Request Google to remove these links from the index! A day or two later you may find a message telling you your site has “severe health problems. Important pages are missing”. Ignore this for NOW!

CMS Websites in Particular

Among the websites in my management portfolio, CMS sites took the hardest hit! The HTML sites were relatively unscathed as far as ranking goes. (From what I can see, SERP rankings for all sites not supporting Google ads of one type or another were hammered – see Forget Google, go Yahoo and Bing – there’s nothing we can do about that right now, just be patient.)

The problem with advanced content management systems is that they are too good at automatic SEO. We all got excited when content management systems ‘created’ hundreds of pages from a handful – links from ‘tags’, different menus, and other taxonomy terms all add together to let search engines think the site is much bigger than it really is. Well, THIS NO LONGER WORKS! Google now sees this as an SEO technique aimed at obfuscating the true nature (size) of the site. Sites allowing all these alternative navigation terms to build links were hit very hard. CMS users must EXCLUDE tags, and other taxonomy, from being indexed by search engines, with “noindex” meta tags. Present Google with a single link to every item of content. No more than one link!
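As a rough sketch (the /tags/fishing path is only an example), the HEAD of every tag or category listing page should end up carrying the same instruction:

  <!-- tag listing page such as /tags/fishing – keep it out of the index -->
  <meta name="robots" content="noindex" />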

Edit and Improve Content

Now look at every item of content remaining on your site. Proofread and edit. DO NOT OVERDO KEYWORDS. Keyword stuffing is bad news. Your content must read easily with a natural flow. Overuse of keywords detracts from readability and flow. Fire your SEO company. Ignore all the SEO ‘experts’ telling you to aim for keyword density of 8 to 11 percent and other ridiculous ideas intended to lighten your wallet. 1% to 3% MAXIMUM is what you should look for. These SEOs are very happy if the work they do today is bad news in 6 months’ time – they get to charge all over again to change things to suit the latest algorithm update!

Make your content something you yourself would enjoy reading.

Correct spelling mistakes, and improve grammar. This counts.

Avoid too many short pages (or posts – those are also ‘pages’). Aim for at least 400 words as an average. If you have a lot of short articles of less than 200 words, try to rewrite them to increase the content value (but don’t pad them out).

Duplicate content is no good. If possible, try to combine duplicate material, change the wording significantly, or use code to tell Google to ignore the page! Another way to deal with duplicated content is to use rel=canonical – <link rel="canonical" href="url" /> – which tells search engines that you know the page is a duplicate, and to treat the given URL as the main page.
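As a sketch (both URLs are placeholders), the HEAD of the duplicate page points at the page you want treated as the original:

  <!-- in the HEAD of the duplicate page -->
  <link rel="canonical" href="http://www.example.com/original-article" />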

Now more than ever before ‘Content is King’.

Fix Code

Have a good look at the code your website uses. Clean up bad HTML as much as possible. Use a code validator to check every page. Some code that does not pass validation, e.g. Open Graph meta tags, can be ignored, AS LONG AS THE MAIN HTML is OK and any other code is correctly formatted (read W3C Validation – Is it still Relevant).
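For example, a perfectly harmless Open Graph line like this one (the content value is only a placeholder) will upset a strict validator, and that is fine as long as the rest of the markup is clean:

  <meta property="og:title" content="Fix the Panda" />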

Fix your Site Map

Make sure every page you want indexed (that has not been excluded from indexing) is in your site map – ONCE ONLY. Exclude tags, menus, and other taxonomy from your site map. (Part of the process described above for CMS sites.)

The simpler you can make your site structure, and your site map, the better.
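A minimal sketch of a stripped-down XML site map (the URLs are placeholders) – one entry per real page, nothing for tags, menus or other taxonomy:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>http://www.example.com/</loc>
    </url>
    <url>
      <loc>http://www.example.com/fix-the-panda</loc>
    </url>
  </urlset>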

Submit for Reconsideration

Once you are satisfied you have done everything possible to fix the wrongs, submit a request to Google to reconsider your site.

If you pay attention to these basics, your site will be set to regain its reputation. It will not happen overnight.

These steps may in some cases sound opposite to what we have become used to doing. Drastic times call for drastic solutions. Getting your website in prime working order will be good for every search engine out there, not only Google. Yahoo likes well-structured sites, and so does Bing. Trying to keep your website up to date with every one of Google’s capricious demands will prevent you from doing what is really needed – creating good original content, and looking after basic housekeeping.

Money spent trying to keep up with the daily algorithm changes is money poured down the drain. That is a never-ending exercise in futility. Good solid web structure (and content) will cost less, and work better in the long run.

Get the house in order, and Be Patient
