Multinational Company that Doesn't Want to Implement International SEO
-
I've got an interesting situation: a client wants to merge two ccTLDs into one. They currently have .fi and .com, and they want to merge both sites onto the .com.
The .fi site is for Finland and the .com is for the USA.
The original plan was to merge the sites using a subfolder for each country, paired with hreflang annotations.
However, the team now wants to merge both sites with NO subfolders differentiating between Finland and the US.
My understanding of international SEO is that this goes directly against best practices, but are there any specific reasons I can give for why they shouldn't do it this way?
I'm struggling to find specific reasons I can cite to the client to argue for at least using subfolders or some other kind of international SEO strategy.
-
@webuniversalp1 Yes, hreflang tags need to be created for each page "appropriately", as covered in my previous response, to help search engines show the right version of each page to the right geo/audience.
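For illustration, a minimal sketch of what those page-level hreflang annotations could look like if the merge goes ahead with country subfolders; example.com and the /us/ and /fi/ paths are hypothetical placeholders, not the client's actual URLs:
<!-- In the <head> of both the US page and its Finnish counterpart -->
<link rel="alternate" hreflang="en-US" href="https://www.example.com/us/some-page/" />
<link rel="alternate" hreflang="fi-FI" href="https://www.example.com/fi/some-page/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/us/some-page/" />
Note that hreflang works page by page and must be reciprocal: both versions of a page need to carry the same set of annotations.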
-
The thing is that you're going to have to redirect both websites. I don't know whether you'll keep the .com content, but if not, a big audit awaits you, depending on the site. As for what the previous answers say about hreflang, I think the same.
-
@naeemgari Hello,
You need to use hreflang tags on each page; with these, Google will not penalize your content and will understand that they are two versions in different languages.
-
@naeemgari I agree with that.
-
@jkhoo For international SEO, the strongest signal for search engines is the ccTLD. The next best option is a subfolder URL structure with correct hreflang tag declarations.
For your core keywords, is there low or no search volume in Finland? From a business standpoint, managing two websites can be tedious: you need to build content and backlinks for two domains.
From an SEO standpoint, however, the preferred option would be to keep the ccTLDs. They are the best indicator of relevance for local SERPs. Think about your audience in Finland: are they more likely to click a .fi domain in the SERPs or a .com domain? Search engines also prefer showing more targeted, relevant results to users, so ccTLDs for target regions remain the best option for international SEO.
The next best route would be subfolders with appropriate hreflang tag declarations and XML sitemaps (see the sketch below).
Additionally, site mergers/migrations generally result in a loss of organic traffic and visibility that can last anywhere from a quarter to over a year.
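As a rough sketch of the sitemap option mentioned above (all URLs are hypothetical placeholders), hreflang can also be declared centrally in an XML sitemap instead of in every page's <head>:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <!-- Each URL lists itself and every alternate version -->
  <url>
    <loc>https://www.example.com/us/some-page/</loc>
    <xhtml:link rel="alternate" hreflang="en-US" href="https://www.example.com/us/some-page/" />
    <xhtml:link rel="alternate" hreflang="fi-FI" href="https://www.example.com/fi/some-page/" />
  </url>
  <url>
    <loc>https://www.example.com/fi/some-page/</loc>
    <xhtml:link rel="alternate" hreflang="en-US" href="https://www.example.com/us/some-page/" />
    <xhtml:link rel="alternate" hreflang="fi-FI" href="https://www.example.com/fi/some-page/" />
  </url>
</urlset>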
-
Related Questions
-
Advice on the right way to block country-specific users but not block Googlebot - and not be seen to be cloaking. Help please!
International SEO | MarkCanning
Hi, I am working on the SEO of an online gaming platform - a platform that can only be accessed by people in certain countries, where the games and content are legally allowed. Example: the games are not allowed in the USA, but they are allowed in Canada.
Present situation: when a user from the USA visits the site they get directed to a restricted-location page with the following message: "RESTRICTED LOCATION - Due to licensing restrictions, we can't currently offer our services in your location. We're working hard to expand our reach, so stay tuned for updates!" Because USA visitors are blocked, Google, which primarily (but not always) crawls from the USA, is also blocked, so the company's webpages are not being crawled and indexed.
Objective / what we want to achieve: the website will have multiple region and language locations. Some of these will exist as standalone websites and others will exist as folders on the domain. Examples below:
domain.com/en-ca [English Canada]
domain.com/fr-ca [French Canada]
domain.com/es-mx [Spanish Mexico]
domain.com/pt-br [Portuguese Brazil]
domain.co.in/hi [Hindi India]
If a user from the USA or another restricted location tries to access our site they should not have access, but should get a restricted-access message. However, we still want Google to be able to access, crawl and index our pages. Can I suggest how we do this without getting penalized for cloaking? Would this approach be OK? (Please see below.) We continue as in the present situation, showing visitors from the USA a restricted message. However, rather than redirecting these visitors to a restricted-location page, we just black out the page and show them a floating message as if it were a modal window, while Googlebot would be allowed to visit and crawl the website. I have also read that it would be good to put paywall schema on each webpage to let Google know that we are not cloaking and that it is a restricted paid page. All public pages are accessible, but only if the visitor is from a location that is not restricted. Any feedback and direction would be greatly appreciated, as I am new to this angle of SEO. Sincere thanks.
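For reference, a minimal sketch of the paywalled-content structured data the poster refers to, along the lines Google documents for content behind an access wall; the WebPage type and the .restricted-content selector here are hypothetical placeholders, and this markup describes paywalled content rather than geo-restriction as such, so whether it fits this case is a judgment call:
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebPage",
  "isAccessibleForFree": "False",
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": "False",
    "cssSelector": ".restricted-content"
  }
}
</script>
-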
SiteName Attribute Showing in Different Language in SERP
Technical SEO | Evan_Wright
We are currently experiencing issues with our subdomain SiteName. Our parent company's root domain is a Japanese-language site, but we have an English subdomain that serves primarily the United States, and nearly all of the rest of the world, for organic traffic. We have followed the guidelines here: https://developers.google.com/search/docs/appearance/site-names There was a large post on here with many responses, including from Googlers, about issues others were having, but it has since been removed. Here is the code in place on our homepage:
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebSite",
  "name": "Mescius Developer Tools",
  "alternateName": ["Mescius, inc.", "developer.mescius.com"],
  "url": "https://developer.mescius.com"
}
</script>
Unfortunately, what appears in the SERP is the Japanese equivalent of our parent company's name. [Screenshot: Screenshot 2024-02-23 at 3.37.55 PM.png] Even though the relationship between root and subdomain should not be causing this, something is producing this incorrect SiteName, and it is hurting CTR for the subdomain. Has anyone else experienced this and found a fix?
-
Question regarding international SEO
International SEO | Billywig
Hi there, I have a question regarding international SEO and the APAC region in particular. We currently have a website extension .com and offer our content in English. However, we notice that our website hardly ranks in Google in the APAC region, while one of the main languages in that region is also English. I figure one way would be to set up .com/sg/ (or .com/au/ or .com/nz/), but then the content would still be in English. So wouldn't that be counted as duplicate content? Does anyone have experience in improving website rankings for various English-speaking countries without creating duplicate content? Thanks in advance for your help!
-
"Duplicate without user-selected canonical” - impact to SERPs
Hello, we are facing some issues on our project and we would like to get some advice. Scenario
International SEO | | Alex_Pisa
We run several websites (www.brandName.com, www.brandName.be, www.brandName.ch, etc..) all in French language . All sites have nearly the same content & structure, only minor text (some headings and phone numbers due to different countries are different). There are many good quality pages, but again they are the same over all domains. Goal
We want local domains (be, ch, fr, etc.) to appear in SERPs and also comply with Google policy of local language variants and/or canonical links. Current solution
Currently we don’t use canonicals, instead we use rel="alternate" hreflang="x-default": <link rel="alternate" hreflang="fr-BE" href="https://www.brandName.be/" /> <link rel="alternate" hreflang="fr-CA" href="https://www.brandName.ca/" /> <link rel="alternate" hreflang="fr-CH" href="https://www.brandName.ch/" /> <link rel="alternate" hreflang="fr-FR" href="https://www.brandName.fr/" /> <link rel="alternate" hreflang="fr-LU" href="https://www.brandName.lu/" /> <link rel="alternate" hreflang="x-default" href="https://www.brandName.com/" /> Issue
After Googlebot crawled the websites we see lot of “Duplicate without user-selected canonical” in Coverage/Excluded report (Google Search Console) for most domains. When we inspect some of those URLs we can see Google has decided that canonical URL points to (example): User-declared canonical: None
Google-selected canonical: …same page, but on a different domain Strange is that even those URLs are on Google and can be found in SERPs. Obviously Google doesn’t know what to make of it. We noticed many websites in the same scenario use a self-referencing approach which is not really “kosher” - we are afraid if we use the same approach we can get penalized by Google. Question: What do you suggest to fix the “Duplicate without user-selected canonical” in our scenario? Any suggestions/ideas appreciated, thanks. Regards.0 -
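For context, a minimal sketch of the self-referencing-canonical pattern the poster describes, reusing the hypothetical brandName domains from the question: each country version canonicalizes to itself and keeps the full reciprocal hreflang set.
<!-- In the <head> of https://www.brandName.be/ -->
<link rel="canonical" href="https://www.brandName.be/" />
<link rel="alternate" hreflang="fr-BE" href="https://www.brandName.be/" />
<link rel="alternate" hreflang="fr-CA" href="https://www.brandName.ca/" />
<link rel="alternate" hreflang="fr-CH" href="https://www.brandName.ch/" />
<link rel="alternate" hreflang="fr-FR" href="https://www.brandName.fr/" />
<link rel="alternate" hreflang="fr-LU" href="https://www.brandName.lu/" />
<link rel="alternate" hreflang="x-default" href="https://www.brandName.com/" />
-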
International SEO Subfolder Structure
International SEO | SmiffysUK
Hi, could anyone offer some advice on the best way to structure subfolders on a website that we are launching worldwide? We are a UK-based business and currently run a UK site on www.website.com, and we are planning on launching into Europe using a subfolder structure. We will use /de, /fr and /es for the new countries coming on board, but the question is whether the UK site URL should be www.website.com or www.website.com/uk. As we have an established web presence in the UK, I'm thinking it should remain www.website.com, but are there any advantages or disadvantages to changing it to .com/uk? Many thanks
-
SEO for .com vs. .com.au websites
International SEO | gregelwell
I have a new client from Australia who has a website on a .com.au domain. He has the same domain name registered for .com. Example: exampledomain.com.au and exampledomain.com. He started with the .com.au site for a product he offers in Australia. He's bringing the same product to the U.S. (it's a medical device product) and wants us to build a site for it and point it to the .com. Right now, what appears to be the same site shows on the .com as on the .com.au. Both domains point to the same host, but there are separate sections or directories within the hosting account for each website - and the content is exactly the same. Would this be viewed as duplicate content by Google? What's the best way to structure or build the new site on the .com to get the best SEO in the USA, maintain the .au version, and not have the websites compete or be viewed as having duplicate content? Thanks, Greg
-
Multi-lingual SEO: Country-specific TLDs, or migration to a huge .com site?
International SEO | linklater
Dear SEOmoz team, I'm an in-house SEO looking after a number of sites in a competitive vertical. Right now we have our core example.com site translated into over thirty different languages, with each one sitting on its own country-specific TLD (so example.de, example.jp, example.es, example.co.kr, etc.). Though we're using a template system so that changes to the .com domain propagate across all languages, over the years things have become more complex in quite a few areas. For example, the level of analytics script hacks and filters we have created in order to channel users through to each language profile is now bordering on the epic. For a number of reasons we've recently been discussing the cost/benefit of migrating all of these languages into the single example.com domain. On first look this would appear to simplify things greatly; however, I'm nervous about what effect this would have on our organic SE traffic. All these separate sites have cumulatively received years of on/off-site work, and even if we went through the process of setting up page-for-page redirects to their new home on example.com, I would hate to lose all this hard work (and business) if we saw our rankings tank as a result of the move. So I guess the question is, for an international business such as ours, which is the optimal site structure in the eyes of the search engines: local sites on local TLDs, or one mammoth site with language identifiers in the URL path (or subdomains)? Is Google still so reliant on the TLD for geo-targeting search results, or is it less of a factor in today's search engine environment? Cheers!
-
Geo Targeting for Similar Sites to Specific Countries in Google's Index
International SEO | longbeachjamie
I was hoping Webmaster Tools geo-targeting would prevent this: I'm seeing several pages from our Australian website indexed in select Google searches. Both sites have unique TLDs: barraguard.com and barraguard.com.au. I've attached a screenshot as an example. The sites are both hosted here in the U.S. at our data center. Are there any other methods for preventing Google and other search engines from indexing the barraguard.com.au pages in searches that take place in the U.S.? [Screenshot: dSzoh.jpg]