
Google Quality Guidelines

  • 27-09-2012 3:38pm
    #1
    Registered Users Posts: 31


    In June this year we received a message via Google Webmaster tools “We've reviewed your site and we believe that some or all of your pages violate our Quality Guidelines”. Since then the traffic from Google organic searches has been minimal and this has decimated our business.

    Online since 1995 (pre-Google), we have never knowingly engaged in so-called black-hat SEO techniques. There is no indication from Google of where we might be in breach. We have read and reread these guidelines and made many changes to our site (www.speech-writers.com). We have resubmitted our site for reconsideration four times and each time, three weeks later, we get the same bland message, which is very frustrating.

    Someone recently suggested that the fact that our homepage has 200+ links to various categories and subcategories may breach the quality guidelines. We are currently also in the process of trying to remove many links to our site.

    The cynic in me thinks that part of the reason for the constant changes in the Google search algorithm is that if they push my commercial site and others out of the organic listings then we need to pay for more Adwords clicks and Google wins.

    I would welcome any views or suggestions on what I have to do to convince this wonderful company that my site is worthy of inclusion in their organic listings. Separately, we have also spent over $500,000 on Adwords since 2002.

    Fred Crowe


Comments

  • Registered Users Posts: 106 ✭✭business bloomer


    Sign up with Google Webmaster Tools and see what it tells you. You could also have some malware on your website, or there may be some basic black-hat technique you're not aware of.


  • Registered Users Posts: 1,254 ✭✭✭blue4ever


    Because of the date it's almost certainly a Penguin hit (as opposed to any other algorithm adjustment).
    Penguin primarily targeted inbound, low-quality links. I'd look at your analytics first, and more specifically compare your traffic for the first two weeks in May vs the last two in June (for example). Within that, I'd look at the keywords that have dropped the most.
    There's a chance that you might have 'over-optimised' on some of those keywords: used them in low-quality directories and placed unnatural links there.
    That's the first place I'd start without having much more information.
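    To make that concrete, here's a rough sketch in Python of the kind of period-on-period comparison I mean. The keywords and visit counts below are made up for illustration; in practice you'd export the real numbers from your analytics package.

```python
# Compare keyword traffic between two periods and rank the biggest drops.
# The keyword/visit numbers below are made-up placeholders; in practice you'd
# export them from your analytics package for the two date ranges.

may_visits = {"wedding speech": 1200, "best man speech": 950, "retirement speech": 400}
june_visits = {"wedding speech": 300, "best man speech": 900, "retirement speech": 380}

def biggest_drops(before, after):
    """Return keywords sorted by absolute traffic loss, largest first."""
    drops = {kw: before[kw] - after.get(kw, 0) for kw in before}
    return sorted(drops.items(), key=lambda item: item[1], reverse=True)

for keyword, lost in biggest_drops(may_visits, june_visits):
    print(f"{keyword}: -{lost} visits")
```

    The keywords at the top of that list are the ones to check first for over-optimised anchor text.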


  • Registered Users Posts: 31 fc2060


    We are already signed up with Webmaster Tools; all our messages from Google come via Webmaster Tools. They would also have indicated to us if they felt there was malware on the site.

    Thanks for your input


  • Registered Users Posts: 1,254 ✭✭✭blue4ever


    I'm not talking about malware.

    I'm saying that some of your links are unnaturally repetitive and are perhaps in spammy directories.


  • Registered Users Posts: 31 fc2060


    Yes there is no doubt that the company I used for SEO for a few years went over the top in terms of their link building. They certainly did more than we were paying them for and did things that we were not paying them for, purely to improve our rankings. Our site was also built with SEO in mind and while that worked for us in the early days it now seems to be having a negative effect.

    It's very difficult to get rid of these links. We have started writing to webmasters asking them to delete the links, but not all of them have responded to our request.

    Do you think that having 200+ links on our homepage is in breach of the Google quality guidelines?


  • Registered Users Posts: 1,254 ✭✭✭blue4ever


    You have to remember that many of these updates involve 'manual' reviews done by 'real' people. They will also look at your site, the links to it, the dispersion of links to pages other than the home page, the anchor-text recurrence and how natural it is, and the relevance of the site hosting the link to your business. I think I saw a link to your site from a Russian or Slavic site! Hardly very natural.

    You can do some work and then submit a reconsideration request.


  • Registered Users Posts: 396 ✭✭M.T.D


    Hi Fred
    Your site is not loading at all this morning 03/10/12


  • Registered Users Posts: 31 fc2060


    Blue4ever
    You are right about the quality of many of the links to my site, as I indeed alluded to here. I was shocked when I saw some of them. One of the backlink checkers told me there were 15,500 backlinks, and many of these looked very dodgy. I have never paid for any of them. It seems my SEO company were a bit over the top. I have now started the process of asking the webmasters to delete them.

    I gather Microsoft/Bing have now introduced a facility in their Webmaster Tools to request that certain links from certain domains be ignored. I believe Google are about to implement this over the next few months, and that would be very useful for someone like me.


  • Registered Users Posts: 31 fc2060


    M.T.D.
    We have been paying for and using a service from www.Alertra.com. They ping our site every 9 minutes or so from various locations around the world, and they did not report a problem today or yesterday. If there is a problem I receive an e-mail and a text message.

    The service also provides excellent information regarding access times to my site. I have this information going back over 10 years.
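    For anyone curious, the basic idea behind a probe like that can be sketched in a few lines of Python. The URL and 10-second timeout below are illustrative, not Alertra's actual settings, and a real monitor would probe from multiple locations and alert on failures.

```python
# Minimal sketch of a single uptime probe: fetch the site, time the response,
# and flag anything that isn't a 2xx/3xx status or that fails outright.
import time
import urllib.error
import urllib.request

def check_site(url, timeout=10):
    """Return (is_up, status_code, seconds) for a single probe of `url`."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return (200 <= resp.status < 400, resp.status, time.monotonic() - start)
    except (urllib.error.URLError, OSError):
        # DNS failures, refused connections and timeouts all count as "down".
        return (False, None, time.monotonic() - start)

if __name__ == "__main__":
    up, status, secs = check_site("https://www.speech-writers.com")
    print(f"up={up} status={status} time={secs:.2f}s")
```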

    Thanks for your input

    Fred


  • Registered Users Posts: 380 ✭✭TsuDhoNimh


    fc2060 wrote: »
    In June this year we received a message via Google Webmaster tools “We've reviewed your site and we believe that some or all of your pages violate our Quality Guidelines”.
    This is a manual penalty handed down by the webspam team.

    The discussion so far has given good advice in terms of avoiding algorithmic penalties from the likes of Penguin in particular (i.e. do remove those spammy links; I haven't actually looked at your link profile, but from the discussion it's a fairly safe bet you would have suffered a Penguin hit even in the absence of the webspam one). But the crux of the problem won't be solved directly by that action, though sorting it out will help resolve other potential or even future issues, and it will take time.

    For the moment, stop submitting reinclusion requests. There's something there they're unhappy with (that isn't the spammy links) and until you resolve it they'll keep sending you the PFO (please f off) replies.

    The 'pages violating the quality guidelines' message means the issue is an on-page one that you're in complete control of, so that's technically good news. The question then becomes: in which areas are you at risk of breaching the guidelines? (It could be anything from over-optimised/spammy title tags to hidden text on pages, so you'll need to review the site in great detail.)

    As you're putting together your analysis of the site, document everything. Where you make changes, detail those changes giving detailed specifics (including what happened that might have put you in breach, why it happened if relevant and what you've done to rectify it).

    Go above and beyond what you think is required and do all in your power to show you're being honest, above board and doing everything you can to rectify the situation (so include details of the review you've done on your link profile and the steps you're taking to rectify that [potentially including information on why you think it happened and that you had no knowledge of it being done]). When you go to great lengths, the webspam team have been known to offer further guidance in the terms of personal replies to help identify a specific issue that may have been missed (this is the exception rather than the rule and you shouldn't be trying to depend on it and wasting your time requesting it).


  • Registered Users Posts: 31 fc2060


    Thanks TsuDhoNimh for your detailed reply.

    The most frustrating part of this exercise is the bland nature of the reply from Google. We have thousands of pages and we have indeed made many changes since our first "breach" message. We have similar sites in six other languages; all the sites were linking to each other, and we have now removed these links or added a "nofollow" attribute to some of them. We had various other links between the sites which have now been removed. We have documented all of these changes in our various reconsideration e-mails to Google, but all we get back is the same bland message.

    It is interesting that you think that the problem is within our site rather than the links to our site. We have used canonical tags to avoid duplication, but it is difficult: the text we use for a 50th birthday party speech is not going to be much different from the text used for a 60th birthday party speech, and the same goes for 80th, 90th, 100th, etc. The Google robot may deem this to be duplicate content, as indeed many of the words would be similar.
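    For context, the canonical tag we use is a single line in each page's `<head>`. The URL below is an illustrative example, not one of our actual pages:

```html
<!-- In the <head> of each near-duplicate page, point at the preferred version.
     The URL below is an illustrative example. -->
<link rel="canonical" href="https://www.speech-writers.com/birthday-speeches/" />
```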

    Can you suggest any tools which might highlight where our site may be in breach of the guidelines, such as keyword stuffing, duplicate text, etc.?

    Fred


  • Registered Users Posts: 380 ✭✭TsuDhoNimh


    fc2060 wrote: »
    The most frustrating part of this exercise is the bland nature of the reply from Google.
    They are improving the level of detail they provide to webmasters (before this year you'd have been hit with a penalty and not have had the first idea what you'd been hit for), but it will always be kept vague. By forcing webmasters to review all elements of their sites, they hope to devalue the benefits of spam techniques and improve the overall quality of the web. Or at the very least, get six issues fixed on a single site when only one caused the actual penalty.
    fc2060 wrote: »
    We have thousands of pages and we have indeed made many changes since our first "breach" message.
    Be sure to keep track of all of the changes made to the site, even if they've already been submitted in a prior reinclusion request. The more data and information you have the easier it'll be for you to get to the bottom of this issue and the easier it'll be for you to demonstrate the lengths you're going to.
    fc2060 wrote: »
    ... all the sites were linking to each other and we have now removed these links Or added a "nofollow" tag to some of the links.
    Can't think of the specific guideline off the top of my head (I believe the exact wording mentions something related to 'sister sites', though it's one I've never actually seen punished, to be honest), but that is/was a breach. Be sure to have a record of exactly which links were in place, which ones have been removed and which ones are now 'nofollowed'. It doesn't need to be complicated; a simple spreadsheet is more than enough.
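    For the record, a 'nofollowed' link just needs a `rel` attribute on the anchor; both URLs below are made-up examples:

```html
<!-- A 'nofollowed' cross-site link: the rel attribute tells search engines
     not to pass ranking credit. Both URLs here are hypothetical examples. -->
<a href="https://www.example-sister-site.com/" rel="nofollow">Our French site</a>
```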
    fc2060 wrote: »
    It is interesting that you think that the problem is within our site rather than links to our site.
    The links to your site are an issue, and one you should resolve. If you hadn't been hit by the webspam penalty, I'd suggest you probably would have been hit by a Penguin slap and had a significant drop as a result. But the message you've received (based on pages in violation, not a manual penalty related to links) shows that the manual penalty isn't based on the links (again, you possibly/probably still have an algorithmic penalty there, so it's very much in your interest to tidy up those links).

    That said, it's 100% clear from the message you've received that there's something on-page on your site that Google feels is in breach of their guidelines and is the cause of this manual penalty. That's not my opinion; that's simply the meaning of the message you received (don't take my word for it: search for the various messages WMT provides, and Google employees like Matt Cutts, John Mu et al. have given fairly detailed responses in relation to them).
    fc2060 wrote: »
    We have used canonical tags to avoid duplication but it is difficult as the text we use for a 50th birthday party speech is not going to be much different from the text used for his 60th birthday party speech and the same for 80th, 90th, 100th etc. The Google robot may deem this to be duplicate content as indeed many of the words would be similar.
    I'd very much doubt it's a duplicate-content issue you're dealing with. The use of canonical tags is more important from a crawl-efficiency, consolidation-of-authority and overall site-quality perspective than from a 'webspam' one. They'd be very unlikely to penalise a site just because the structure isn't well maintained from a technical point of view (it already gets penalised in the algorithms for those mistakes, and I've never seen or heard of it being the cause of a manual webspam penalty).

    That's speaking in general, without looking at the specific pages, so if you think some of the content is at a level where it could be causing issues... go with your gut ahead of my generalisation and assumption. More importantly, even more important than removing the Google slap, go with what you think is best for your users.

    (You previously mentioned about having +200 links on your home page. The old rule of thumb used to say <100 for a variety of technical reasons that are no longer relevant or valid. Having said that, <100 is still a very relevant figure as from a usability point of view once you start going over that figure you're making the site a bit of a mess to navigate. I'd look at your navigation options, your site architecture and your analytics and try and decide if the current setup is actually a good one for your users.

    Personally, I absolutely hate sites built in that manner and would much prefer a nice clean category structure where I can then delve into the specifics in a far more manageable way.)
    fc2060 wrote: »
    Can you suggest any tools which might highlight where our site may be in breach of the guidelines, such as keyword stuffing, duplicate text etc.
    There won't be a tool that can answer the question for you. Tools can pick up algorithmic penalties with some success, but manual ones even Google needs real people for (and if they can't automate it, it's a safe bet the tool providers can't). What tools will do is assist you in gathering the information you need to do your own review of your site from top to bottom.

    If you start with tools like Xenu Link Sleuth and/or Screaming Frog SEO Spider they'll start to collate a lot of the relevant data (e.g. all your URLs, page titles, images, related alt text, etc.), then as you start digging deeper and deeper find specific tools to do individual areas.

    Some of the bigger tool kits, Raven or SEOMoz, might be able to help save you some time. Using the free trial periods you should have more than enough time to gather all of the data you need (turning that into a working solution for the true problem is a different story, but that'll come down to your own ability to identify it when it's shown and even a bit of luck).
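    To give a feel for the kind of data those crawlers collect, here's a rough stdlib-only Python sketch that pulls a page's title and any images missing alt text. It's a toy compared to Screaming Frog or Xenu (a real crawl would also fetch and queue URLs), but it shows the shape of the audit data.

```python
# Rough sketch of the per-page data an SEO crawler collects: the page title
# and any <img> tags missing alt text. Stdlib only, so it can be run over
# saved HTML; a real crawler would also fetch pages and follow links.
from html.parser import HTMLParser

class PageAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.title = ""
        self._in_title = False
        self.images_missing_alt = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "img" and not attrs.get("alt"):
            self.images_missing_alt.append(attrs.get("src", "(no src)"))

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit_page(html):
    """Return (title, list_of_img_srcs_missing_alt) for one HTML document."""
    parser = PageAudit()
    parser.feed(html)
    return parser.title.strip(), parser.images_missing_alt
```

    Run it over every page and you quickly get a spreadsheet of titles to check for over-optimisation and images needing alt text.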


  • Registered Users Posts: 1,254 ✭✭✭blue4ever


    Sorry to dig this up again - but.....

    @fc2060 - this could be the answer you were waiting for:

    http://googlewebmastercentral.blogspot.ie/2012/10/a-new-tool-to-disavow-links.html
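    For anyone finding this later: the disavow file you upload is just plain text, one entry per line. The domains below are examples only:

```text
# Disavow file format: plain text, one entry per line.
# Lines starting with # are comments. The domains below are examples only.

# Disavow every link from an entire domain:
domain:spammy-directory.example.com

# Disavow the links from a single page:
http://low-quality-blog.example.com/links.html
```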


    C

