
geographically descriptive domains + google ranking

  • 05-11-2004 10:41pm
    #1
    Closed Accounts Posts: 606 ✭✭✭pencil


    Hey, I've a question about registering a domain & pointing it at an existing domain.

    I have an existing domain 'www.olddomain.com' and I'm hoping to increase my Google rankings by purchasing a new domain that is geographically descriptive, 'www.olddomainsDublin.com'.

    Does this work? Will it increase rankings?

    Also, what is the best way to do this (I host with hosting365.com)? A domain pointer?

    Cheers


Comments

  • Registered Users, Registered Users 2 Posts: 7,740 ✭✭✭mneylon


    Google rank is hotly debated :D
    It might help you if your PR is very very low, but it could also work against you...


  • Closed Accounts Posts: 606 ✭✭✭pencil


    Yeah, my PR is low on this site (a new one with better content is 95% done :-))

    Blacknight, as someone who provides hosting, can you answer this one?

    I have six domains hosted with 365 - they are all interlinked but I don't seem to be getting the benefit from this interlinking.

    If I do a backward link search from the site they are all pointing at, none of them show up?! They all show up individually in Google.

    Is this because they are all on the same shared IP & hosting provider?


  • Registered Users, Registered Users 2 Posts: 7,740 ✭✭✭mneylon


    Backward links between your own websites won't help you too much, unless the sites have a good PR and independent backlinks of their own.
    The fact that you are hosting them all on the same IP block etc. isn't the issue, as the sites aren't "important" as far as Google is concerned.

    I'd try to address two things:
    - are the sites cleanly coded, i.e. W3C compliant? (see the sketch below)
    - have you submitted them to relevant directories and search engines?
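
    For reference, a minimal W3C-valid XHTML 1.0 Transitional skeleton would look something like this (the title and content are just placeholders):

        <?xml version="1.0" encoding="UTF-8"?>
        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
            "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
        <html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
          <head>
            <title>Placeholder title</title>
            <meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
          </head>
          <body>
            <p>Placeholder content.</p>
          </body>
        </html>

    Running the pages through the W3C validator (validator.w3.org) will flag anything that isn't compliant.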

    Avoid FFA sites and link farms. They can damage your ranking.


  • Closed Accounts Posts: 606 ✭✭✭pencil


    My own site has a PR of 5/10; the ones that link to it & to each other are 2/10 :-(

    I just got up to speed on XHTML & CSS (XHTML 1.0 Transitional).

    All of these sites are ye olde HTML (tables et al) but otherwise are cleanly coded. I've always treated them well, never over-submitting (possibly they need to be spread around a bit more).

    Are you sure that the same IP block doesn't make a difference?


    Do you have any other comments on my first question?

    Cheers.

    You're always helpful here - must try you guys out.


  • Registered Users, Registered Users 2 Posts: 7,740 ✭✭✭mneylon


    Pencil

    I'd be more concerned about getting back links from directories and relevant sites tbh.
    There are a lot of theories about how Google ranks sites and how the listings can be affected. Only Google has the definitive answer. The rest of us are just making educated guesses.
    If you post the links or send me a PM with them I could have a quick look for you :D


  • Closed Accounts Posts: 2,161 ✭✭✭steve-hosting36


    Best idea is to get listed / linked from directories and sites with higher PR. Submit your sites to all the accepting engines, and ensure they're in DMOZ, etc. Search engines are an art, not a science. There are also paid submission services available...


  • Closed Accounts Posts: 237 ✭✭FreeHost


    Pencil,

    Ref your question about redirection: if you use a meta redirect on your new domain and point it at your old domain, Google will pick up the PageRank for the new site after a period of time (I have tested it and it works). However, because people are using redirects to hijack PR from other sites, the whole practice has become tarnished. In some cases I’ve read about, sites were actually de-indexed.
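
    For illustration, the meta redirect described above is just a refresh tag in the head of the page served on the new domain (using the example domain from the first post):

        <!-- placed in the <head> of the index page on the new domain -->
        <!-- "0" means redirect immediately; the target URL is the OP's example domain -->
        <meta http-equiv="refresh" content="0; url=http://www.olddomain.com/" />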

    With regard to each of your sites linking to each other, if the content is different on each site there should be no problem (you will drop position with duplicate content).

    Being on the same IP block, as BK just said, makes no difference to Google’s cache of each site; the IP really only comes into play when sites are categorised by country search.


  • Closed Accounts Posts: 606 ✭✭✭pencil


    Cheers guys,

    I suppose I need to work on getting back links from directories and relevant sites.

    I submitted to DMOZ two months ago and still no listing - I take it there is a wait?

    So you reckon there is no benefit in purchasing a geographic domain (say www.bedandbreakfasttownname.com) to go along with the existing domain - pointing it at the same site (both domains pointing to the same place)?

    Does anyone have faith in the paid submission services?


  • Closed Accounts Posts: 237 ✭✭FreeHost


    There’s a lot written about domain names too; it might be worth considering the domain extension and having them all mapped to the same site, e.g.

    Yourdomain.com
    Yourdomain.co.uk
    Yourdomain.ie

    Some consider that hyphenated domain names are better placed than others, e.g.
    Bed-and-breakfast.com

    But the bottom line is your content and relevant links.

    With regard to DMOZ, I read an interesting article; in a way it could be considered venting, but some good points are raised.
    http://www.goarticles.com/cgi-bin/showa.cgi?C=23907

    Paid Submission Services – hmmmmm
    The sites that claim to register your site with 100,000 search engines and directories for $19 are horseshiiit. Most of the 100,000 are FFA sites.

    If you have a budget, I would suggest putting a list of directories together (about a hundred or so) and paying somebody to submit your site manually to each one. It would be a handy nixer for someone.


  • Moderators, Society & Culture Moderators Posts: 17,643 Mod ✭✭✭✭Graham


    Pointing 2 domains to exactly the same content is only likely to result in one of the domains ever appearing in Google.

    "Duplicate content penalty" is the SEO term for it, and I guess you can see the reason for it from Google's point of view: what benefit is there to users in finding exactly the same content listed twice?

    Cross-linking domains is unlikely to get you much at all; whois data/IP blocks all tell Google you're linking to yourself.

    The best approach to long-term success in Google is:
    1. Constantly add new (original) content to your website; Google likes fresh content.
    2. Get inbound links from other (preferably related) sites.
    3. Repeat steps 1 and 2 ad infinitum.

    For a more detailed list of steps take a look here; it's one of the best guides I've seen on success with Google:

    http://www.webmasterworld.com/forum10003/2010.htm?highlight=steps+rankings+google


  • Closed Accounts Posts: 1,414 ✭✭✭tom-thebox


    blacknight wrote:
    Google rank is hotly debated :D
    It might help you if your PR is very very low, but it could also work against you...

    Got a couple of links to places where it is debated - forums, sites, etc.? I read a few decent papers on hostbidder.com but would like to learn more.

    I would be afraid, when purchasing books, that they were written before Google changed their system.


  • Closed Accounts Posts: 21 mm2


    Google's geographic targeting is improving all the time.

    Within the UK, for example, Google advertisers can now choose to target their ads to six sub-UK regions. Google AdWords is even able to offer targeting within a defined radius of a specific address!

    Google's primary way of determining geographic location is by IP address.

    If you want to target Dublin, having your site served from Dublin is a good idea.

    Other ideas (which may or may not be currently important):
    Postal address on all pages
    Phone number in standard format on all pages
    Geographic meta tags, ICBM / geotags (example below)
    .ie domain with a Dublin address in the whois.
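
    For what it's worth, the geographic meta tags mentioned above are just extra tags in the page head; the coordinates and region code below are example values for Dublin:

        <!-- ICBM / geo meta tags; coordinates and region code are example Dublin values -->
        <meta name="ICBM" content="53.3498, -6.2603" />
        <meta name="geo.position" content="53.3498;-6.2603" />
        <meta name="geo.region" content="IE-D" />
        <meta name="geo.placename" content="Dublin, Ireland" />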


  • Banned (with Prison Access) Posts: 16,659 ✭✭✭✭dahamsta


    Graham wrote:
    Cross linking domains is unlikely to get you much at all, whois data/IP Blocks all tell Google you're linking to yourself.
    I've always found it very difficult to believe that Google looks at that information. I have no way of knowing of course, I just find it difficult to believe.

    adam


  • Registered Users, Registered Users 2 Posts: 7,521 ✭✭✭jmcc


    mm2 wrote:
    Google's geographic targeting is improving all the time.
    Still the same old IP/ccTLD extension basis. The country-specific body text and keywords may give the illusion that it is improving.
    If you want to target Dublin, having your site served from Dublin is a good idea.
    Most of the main Irish hosters on Irish IP space are Dublin-based. It would be better to have 'Dublin' in the domain name or the URL. The IP whois data is often wildly inaccurate and gTLD whois data cannot always be relied upon.
    .ie domain with a Dublin address in the whois.
    The .ie whois data does not include geographic information. Addressing info is only included in gTLD (com/net/org/biz/info) whois data. Even so, I don't think that Google et al are going to trawl through the whois data for approximately 40 million domains to identify Irish domains like that. It just isn't worth the brainpower or the resources for such a little return. (At most it is approximately 250K domains, of which less than 60% will have websites.)

    Regards...jmcc


  • Registered Users, Registered Users 2 Posts: 7,521 ✭✭✭jmcc


    dahamsta wrote:
    I've always found it very difficult to believe that Google looks at that information. I have no way of knowing of course, I just find it difficult to believe.
    I'd guess that Google does look at that kind of information. It is especially useful when identifying linkswamps (fake sites that run only, for example, Dmoz data to promote sites in their group). Most search engine operators have enough problems dealing with real sites as it is and tend to get aggressive with linkswamp operators.

    When you strip away the domain names, linkswamps become highly visible. Though the software for doing this kind of thing is generally custom, it is an easy-to-implement feature of geo-location based on IPs. Indeed, part of the process of geo-location involves mapping sites to IPs to countries. This grouping allows 'coming soon' holding-site IPs to be rapidly identified (some can have 20K or more sites hosted on one IP) and removed from any spidering schedule. Doing something similar for linkswamps would be simple enough. I know for definite that one SE deep-sixed all the Irish Medical Pages (cyberwarehouse operator) websites except the main one. :)

    Regards...jmcc


  • Banned (with Prison Access) Posts: 16,659 ✭✭✭✭dahamsta


    Perhaps I should have phrased that differently. Let me put it this way: I deal with people that linkswamp, and I also have to deal with their conspiracy theories about Google. This is one issue in particular they've always had the tinfoil hat on about. However, having watched their work from afar for several years now, and in particular having seen mistakes that went unspotted for months or even years, I think that their paranoia is massively overstated. In other words, it's my experience that Google doesn't look at WHOIS or IP block information, at least not in detail. I would guess that if they look at it at all, it would be a binary IP-to-IP comparison. Personally, I doubt they even do that.

    Again though, this is just speculation, albeit based on experience.

    adam


  • Registered Users, Registered Users 2 Posts: 7,521 ✭✭✭jmcc


    dahamsta wrote:
    Perhaps I should have phrased that differently. Let me put it this way: I deal with people that linkswamp, and I also have to deal with their conspiracy theories about Google. This is one issue in particular they've always had the tinfoil hat on about, however having watched their work from afar for several years now, I think that their paranoia is massively overstated.
    The guilty often are rather paranoid. :) Google's approach tends to be more shotgun than scalpel. This may back up your theory, Adam.
    In other words, it's my experience that Google doesn't look at WHOIS or IP block information, at least not in detail. I would guess that if they look at it at all, it would be a binary IP-to-IP comparison. Personally, I doubt they even do that.
    It is probable that they have some kind of linkswamp threshold. The websites' IPs would be processed as raw numbers and perhaps coded with a country code, but that would be about it. WHOIS data is not even necessary to get to this elementary level. However, for investigating linkswamps, the WHOIS data could be helpful. The problem is that a lot of the more recent spammers have taken to covering their tracks.

    The big advantage of running an SE like Google is that Google could easily run a comparative analysis on the content of each of these linkswamp sites using the body text. What I have noticed over the past few months is that some linkswamp operators have taken to coding their linkswamp/SERPs spam as javascript, the theory being that ordinary SE spiders do not handle javascript (true to a large extent) and thus these guys get into the SERPs. I don't know if Google is cleaning its index of these types yet.
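
    To illustrate the mechanism, content written out by javascript only exists after the script runs, so a client that doesn't execute javascript never sees it. A rough example (the link is a placeholder):

        <!-- a client that executes javascript sees this link; one that doesn't sees nothing -->
        <script type="text/javascript">
          document.write('<a href="http://www.example.com/">example link<\/a>');
        </script>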

    Regards...jmcc


  • Banned (with Prison Access) Posts: 16,659 ✭✭✭✭dahamsta


    jmcc wrote:
    However for investigating linkswamps, the WHOIS data could be helpful.
    As would IP block information. I'm not denying that this information could be used in this way, and in fact if I was Google I would be. I just don't think they are. I think they have enough trouble trying to keep blatant spam out of their indexes without bringing it down to this level. Eventually they'll have to though.
    What I have noticed over the past few months is that some linkswamp operators have taken to coding their linkswamp/SERPs spam as javascript.
    This has been another topic that's come up, although it's usually me that brings it up in cases where I need to connect the sites. My opinion on this is usually that I'd eat my hat if Googlebot was running a Javascript interpreter. However I invariably end up having to write a client/server to handle it, because it's very hard to convince non-techies of this stuff. And I'm on a flat fee. And ye wonder why I hate SEO. :)

    adam

