Detecting duplicate links on a website matters for SEO and for user experience. Here are several ways to do it:

  1. Using online tools:
    • Crawlers such as Screaming Frog or Ahrefs can crawl every link on a site and report any duplicates.
    • Screaming Frog in particular lists all internal and external links, which makes duplicates easy to spot; its crawl export can also be post-processed with a script, as in the sketch below.
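
    If you export the crawl to CSV, a short script can flag repeated URLs. This is a minimal sketch; the file name internal_links.csv and the 'Address' column are assumptions about what your export looks like:

    import csv
    from collections import Counter

    # Read the exported crawl data (assumed file and column names)
    with open('internal_links.csv', newline='', encoding='utf-8') as f:
        urls = [row['Address'] for row in csv.DictReader(f)]

    # Any URL listed more than once is a duplicate
    duplicates = [url for url, count in Counter(urls).items() if count > 1]
    print(f'Duplicate URLs: {duplicates}')
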
  2. With custom scripts:
    • You can use a Python script with the BeautifulSoup library to extract all links and then check if any are duplicated.

    Example in Python:

    import requests
    from bs4 import BeautifulSoup
    from collections import Counter

    url = 'https://yoursite.com'
    response = requests.get(url)
    soup = BeautifulSoup(response.text, 'html.parser')

    # Collect every href value from the page's <a> tags
    links = [link.get('href') for link in soup.find_all('a') if link.get('href')]

    # Any href that appears more than once is a duplicate
    duplicates = [item for item, count in Counter(links).items() if count > 1]

    print(f'Duplicate links: {duplicates}')
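
    One caveat with the script above: the same page can appear both as a relative href ('/about') and an absolute one ('https://yoursite.com/about'), which the raw count treats as different links. A small continuation of the script can normalize hrefs first (urljoin and urldefrag are from the standard library; the base url is the same placeholder as above):

    from urllib.parse import urljoin, urldefrag

    # Resolve relative hrefs against the page URL and drop #fragments,
    # so '/about', 'about' and 'https://yoursite.com/about#team' all
    # count as the same link.
    normalized = [urldefrag(urljoin(url, href))[0] for href in links]
    duplicates = [item for item, count in Counter(normalized).items() if count > 1]

    print(f'Duplicate links after normalization: {duplicates}')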

  3. Using browser extensions:
    • Some browser extensions like Check My Links for Chrome allow you to verify links on a page and detect if any are duplicated or broken.
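
    If you also want a scripted version of the broken-link part of that check, a simple request loop works as a rough test. This is a sketch that reuses the normalized list from the Python example above; some servers reject HEAD requests, so you may need to fall back to GET:

    # Flag URLs that fail or return an error status (4xx/5xx)
    broken = []
    for link in set(normalized):
        try:
            r = requests.head(link, allow_redirects=True, timeout=10)
            if r.status_code >= 400:
                broken.append((link, r.status_code))
        except requests.RequestException:
            broken.append((link, 'request failed'))

    print(f'Broken links: {broken}')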