Why Your Perfectly Implemented Hreflang Is Not Working

Updated November 26, 2025, to add the challenge of how the user's browser language settings affect what is presented.

Since my recent PubCon presentation, I have spoken with a dozen companies that had previously or recently implemented hreflang for their global sites and found it was not working as they hoped. They often referred to it as a "perfect implementation." They wanted to know why the wrong market pages were still ranking weeks and months after implementation. Sometimes, even perfect hreflang setups still face issues. Of the dozen sites reviewed, ten were implemented correctly, requiring me to dig deeper into one of these common reasons for hreflang not working as expected.

Google Has Not Seen the Hreflang Tags

In more than half of the correct implementations reviewed, Google had not yet seen the hreflang entries on one or both pages in a reciprocal pair within the cluster.

For example, a new English-language market website launched 30 days prior with hreflang tags implemented on the individual web pages. Nearly 60% of the URLs were flagged as Duplicate, with Google choosing an existing English-language page as the canonical. The implementation passed my quality checklist. When I crawled the new market site and the other English-language markets with Screaming Frog calling the GSC API, we immediately found the problem. The report indicated that Google had yet to recrawl 60% or more of the pages on the existing English-market sites. Due to the lack of reciprocation, the new pages appeared to be duplicates.

Problem:  With hreflang tags in pages, Google must index each new or updated page to fetch its hreflang cluster, then crawl all of the pages listed as alternates to cross-validate the reciprocation. Until both pages in a pair are validated, hreflang cannot be applied. To further complicate the problem, now that these pages have been flagged as duplicates (in one case on a first-time hreflang rollout), it will take much longer to convince Google to treat them as specific to the new market.
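The cross-validation described above amounts to a simple reciprocity check: if page A lists page B as an alternate, page B must list page A back, and Google can only confirm that once it has crawled both pages. A minimal sketch of that check in Python (the URLs and cluster data are hypothetical, and this assumes you have already extracted each page's hreflang annotations):

```python
# Minimal sketch of hreflang reciprocation checking, assuming you have
# already extracted each page's hreflang annotations into a dict of
# {page_url: {lang_code: alternate_url}}. All URLs are hypothetical.

def find_nonreciprocal(clusters):
    """Return (page, alternate) pairs where the alternate does not link back."""
    problems = []
    for page, alternates in clusters.items():
        for lang, alt_url in alternates.items():
            if alt_url == page:
                continue  # the self-referencing entry is fine
            alt_cluster = clusters.get(alt_url, {})
            # The alternate must list the original page among its own alternates
            if page not in alt_cluster.values():
                problems.append((page, alt_url))
    return problems

clusters = {
    "https://example.com/us/widget": {
        "en-us": "https://example.com/us/widget",
        "en-au": "https://example.com/au/widget",
    },
    # The AU page has not been recrawled yet, so no tags are known for it
    "https://example.com/au/widget": {},
}

print(find_nonreciprocal(clusters))
```

Until the AU page is recrawled and its return tags are seen, the pair fails the check, which mirrors why the new pages above were treated as duplicates.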

One reason the other sites were not being recrawled was their error rate, which reduced Google's appetite to visit them. For every valid page, Google had to crawl ten URLs with errors, slowing down the refresh rate.

Solutions:  

  1. Try using hreflang XML sitemap(s), especially for new same-language market launches. Google is not required to visit each page to validate reciprocation because that validation can be done from the sitemaps.  
  2. Try to minimize the number of errors wasting Google's crawl resources, which will encourage the bots to revisit. Google has said that a new hreflang cluster naturally triggers a page refresh, but I suspect that refresh is delayed when there is a large error ratio.
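For reference, an hreflang XML sitemap declares every alternate for a URL in one place, which is why reciprocation can be validated without recrawling each page. A minimal sketch of a two-market cluster (URLs are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://example.com/us/widget</loc>
    <xhtml:link rel="alternate" hreflang="en-us"
                href="https://example.com/us/widget"/>
    <xhtml:link rel="alternate" hreflang="en-au"
                href="https://example.com/au/widget"/>
  </url>
  <url>
    <loc>https://example.com/au/widget</loc>
    <xhtml:link rel="alternate" hreflang="en-us"
                href="https://example.com/us/widget"/>
    <xhtml:link rel="alternate" hreflang="en-au"
                href="https://example.com/au/widget"/>
  </url>
</urlset>
```

Note that every URL in the cluster repeats the full set of alternates, including itself, so both sides of each pair are declared in a single file.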

IP Location Detection is Blocking Google

Similar to the first reason, where Google had not seen the tags, two sites had an issue with their IP detection implementation. One was blocking Google from reaching the new Australian website, since Google crawls primarily from US IP addresses, and the other was blocking Google from requesting the hreflang XML sitemaps.

Problem:  The hreflang standard requires the search engine to cross-validate each alternate to confirm that you have set up reciprocation. To perform this action, the search engine must be able to crawl the pages to get the hreflang tags or, if you are using the hreflang XML sitemap method, to fetch the sitemaps. This may be impossible if you have overly strict IP location detection. If it cannot retrieve the hreflang elements, how can it know you implemented hreflang? Even if you allow Google to fetch the hreflang XML, ensure your IP detection does not prevent it from crawling and indexing the pages. If you block search engines from crawling outside of the market, you have a much bigger problem than SERP cannibalization.

Solutions:  

  1. Ensure that search engines can access any page they request. The first test is to submit the hreflang XML sitemaps in GSC and make sure Google can fetch them; if it cannot, review search engine exclusions with your DevOps team.
  2. Set up each market version of your site as a property in GSC; this will indicate whether Google has a problem accessing the market site(s). 
  3. You can often test this during your post-launch checks using your preferred crawling tool and a VPN outside of your corporate network to ensure you can access the pages.

Local Pages Have Little to No Added Value

Over the past eight months, we have seen an increase in hreflang deployments that are taking longer to correct cannibalization. This is often the case when a site has multiple same-language websites that are clones or near matches and either did not have hreflang or had it implemented incorrectly.

Problem: I believe that when Google's crawling and indexing process does not identify any incremental value from these pages, compared to the content it has already indexed, both from you and others, it flags them as duplicate or lesser value and significantly decreases the indexing rate. I have a forthcoming article detailing this challenge, the need for what Google refers to as "information gain," and potential ways to make adjustments.

 

This example is for a refresh and market expansion of a product line with both unique-language and same-language pages. We saw significant delays in reindexing the refreshed pages and the new same-language URLs. The new unique-language pages were completely indexed without indexing lag or duplicate flags. However, the same-language versions in English, French, and German all had 80 to 90 percent of their pages flagged as Duplicates, with Google choosing the existing language-market pages as its preference or delaying the crawl and indexing. During the audit, we found nothing different about any of these pages from what Google already had from them and other websites.

Suggestions: 

  1. To make the pages market-centric, add geolocation signals, including local currency symbols, product and organization schema, and telephone numbers and addresses, where possible. 
  2. Add unique information that would be useful to a local searcher where possible. 
  3. Generate local market links to these pages and social media mentions showcasing them in regional markets.
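As one example of the geolocation signals in point 1, local business details can be embedded as structured data. A sketch of an Organization JSON-LD block with hypothetical values, showing a local address, phone number, and market URL:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example GmbH",
  "url": "https://example.com/de/",
  "telephone": "+49-30-0000000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "Beispielstrasse 1",
    "addressLocality": "Berlin",
    "postalCode": "10115",
    "addressCountry": "DE"
  }
}
</script>
```

Pairing signals like these with local currency, spellings, and contact details gives the crawler market-specific evidence that a clone page lacks.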

Aspirational Hreflang

Like the no-information-gain problem, aspirational hreflang frequently occurs with aggressive content marketing campaigns, when your system applies the intended language and region codes to pages that are not yet (or will never be) localized for that language. Your hreflang says the page is French or German when it is actually English, which Google can easily detect.

Problem: 

The problem is that the language stated in the hreflang entries was not correct. The site had flagged nearly 2,000 pages as Greek when they were actually in English. Google may not trust or respect the hreflang, effectively guaranteeing that these pages will be viewed as duplicate English and not indexed. In this screen capture, nearly all incorrectly tagged pages were excluded from the index and flagged as duplicates, with Google selecting the US or Australian English pages, of which they were exact matches, as the canonicals.

We ran a test where we changed some markets to the correct language, and slowly, some pages were indexed. When I checked recently, many of those pages were flagged as Duplicate Canonical again, as they added no incremental value. Note: not only did they add no additional value, but they did not even reference the company in the documents, which is strange for any company to do.

Solutions:  

  1. Localize and geotarget the content to the local market.  
  2. Use the correct language code. You can set this content as English and indicate it is for Germany or France. A year ago, this would work, as it did convey the page's purpose, but as noted above, these clone pages are unlikely to be indexed without any incremental value.
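To illustrate point 2, the hreflang value should declare the page's actual language, with an optional target region. A sketch with hypothetical URLs, using the Greek example above:

```html
<!-- Wrong: the page content is English, but the tag claims Greek -->
<link rel="alternate" hreflang="el-gr" href="https://example.com/gr/widget" />

<!-- Better: declare the true language (English) targeted at Greece -->
<link rel="alternate" hreflang="en-gr" href="https://example.com/gr/widget" />
```

The corrected tag is honest about the language, though as noted, an unlocalized English clone still needs incremental value to be indexed.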

Searcher Intent Misalignment

Many SEOs have reported that Google gives higher ranking preference to content that aligns with the searcher's query intent. We are starting to see more of this in cases where only a few pages have cross-market cannibalization and we can identify a mismatch in the content type, whether informational or transactional.

Problem: 

Google may choose a different, non-market page if the searcher's intent is not aligned with the overall content type on the local website.

In one of the hreflang implementations we audited, only two of the client's 2,000 pages had non-local pages ranking instead of the desired local page. The queries with the cannibalization were product-name-specific, so we thought it was a case of the non-local page having more algorithmic or authoritative value. The data supports this, but something else stood out.

Using Ryan Jones' excellent SerpRecon competitor analysis tool to compare the pages, it classified the non-local page's searcher intent as Commercial and the local page's as Navigational (not even Informational). The other pages in the top SERP positions were also classified as having strong commercial intent. Both alternates were about the product; however, the local market page has more marketing fluff and legal information, while the non-local page lists benefits and references to where to buy, making it more transactional. It could be argued that they need to improve the relevance and authority of the local page, but would Google then rank a local navigational page for a transactional query?

Solutions:

  1. Ensure that you understand your searchers, what they are looking for, and in what context. Use a solution like SerpRecon or your preferred auditing tool to review the page’s messaging to ensure that it aligns. Yes, a page can be potentially informational and transactional, but as SerpRecon illustrates, what is it in the context of specific queries?
  2. Have more alignment in content and messaging between markets, especially where you have cannibalization.

Brand and Product Name Queries

If you believe your hreflang is broken because a ranking tool shows non-local pages appearing for your brand name, it could be a problem similar to intent misalignment, or a bigger problem where one market version of the brand is more algorithmically relevant or authoritative.

Problems:  

  1. Brand name and root domain pages – We see this most frequently for brand name queries when a multinational has a root domain like IBM.com that uses IP location detection and a 302 temporary redirect to route users to local market websites. Due to the 302 redirect, Google keeps the root page in the index, and this page often has exponentially more external and internal links, especially referencing the brand name. Google considers it the ultimate representation of the brand. Because this root page is never actually shown to users, it is often left out of the hreflang settings, making it even more of a free agent.   
  2. Keyword and phrase variations are another big problem. If a page is a legitimate alternate page with local market adaptations due to messaging, spelling, or legal requirements, it may not yield as good a result as another market. This can be a problem when using the same set of keyword phrases across markets to do your rank-checking tests. This is the case in the screen captures below.
  3. More algorithmically relevant content – Technically, hreflang should override this problem, as its goal is to swap in the local market-specific alternate page when the non-local page ranks and hreflang is in place. However, we have seen cases, especially for product names, where a more dominant market's page wins anyway.

For example, a rank-checking tool flagged that the US page for the phrase “Systane Active Ingredients” ranked in Australia despite a valid hreflang.  Upon review, Australia only listed the ingredients, whereas the US page is legally required to break out both active and inactive ingredients.  Since “active ingredients” was not used on the Australian page, Google ranked the US page as a better match for the query.  The fix in this case was simply to change the phrase checked to Systane Ingredients, and the correct page magically ranked in the report.

Solutions: 

  1. Include the root domain in your hreflang setup.  We have had the best luck setting this root domain as the X-Default. 
  2. Build relevance and authority for the local brand home pages and products. Often, a quick review of links to the root domain can identify links that you can request be updated by others and ensure there are no internal links to the root from any market pages.
  3. Ensure the content aligns with the most likely local keyword phrases and spellings.
  4. Add a coming-soon page for new product launches if the product is unavailable in local markets, to capture email addresses and notify clients. If that is not an option, ensure that hreflang is updated upon launch so Google understands the purpose of the local page and can swap it into the SERPs. Not doing it at launch may result in the local page being viewed as a duplicate.
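Point 1 above can be expressed by declaring the redirecting root URL as the x-default member of each page's hreflang cluster. A sketch with hypothetical URLs:

```html
<!-- On every page in the cluster, include the root as x-default -->
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
<link rel="alternate" hreflang="en-us" href="https://example.com/us/" />
<link rel="alternate" hreflang="en-au" href="https://example.com/au/" />
```

This brings the heavily linked root page into the cluster instead of leaving it as a free agent competing with the market pages.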

Searcher Browser Preferences

Hat tip to Jason Lax for this reminder on LinkedIn that the user's language preference may determine what they see in the search results. I had not originally included this in the set of issues, as it is more of an edge case, but he is right: it needs to be on this list. This can often explain why many users from a market with a dedicated language site generate high impressions on a global site. It may be an unrecognized challenge for markets with a significant increase in foreign tourists. I wrote an article about my experience in the US Virgin Islands, where some sites did not recognize me as being in the US due to limited IP location detection.

Problem: Jason offered this excellent statement of the problem: “If, for example, the browser language is English and they are in Italy, then that user can’t be matched on both the language and market attributes specified in the hreflang tags. So, the user will be matched to the default site. I’ve seen this at work when toggling my browser languages while in Canada and seeing the corresponding English or French site.”

Note that this is not just a search results issue, where Google may show an English-language page to the searcher because it prioritizes language, then personalization. The bigger user experience issue is what happens after the searcher clicks: if your site routes visitors based only on their IP being in Italy, they will be sent to the Italian-language site regardless of their language preference. You then need to monitor analytics to see how many users switch to a different language option.

Solutions:

  1. Create a country and language matrix to understand how you want to handle this situation.
  2. Ensure that you have a correct X-Default strategy and implementation.
  3. Review your IP location and language detection setup to see how the system handles a language other than the local market site’s.
  4. Review analytics at a language and path level to see what users do in different scenarios.

Conclusions

While it is far more likely that an incorrect hreflang implementation is the cause of your cross-market cannibalization, there are situations where a perfectly executed hreflang still does not get the results you are seeking. It often takes a team that can keep an open mind, break down the root causes, and work through a logic tree until you find the real reasons your hreflang is not working. So before you post on social media asking Google to eliminate hreflang, I suggest you follow my hreflang diagnostic process or contact us for a hreflang audit.