Preventing crawlers like Googlebot from crawling parameterized duplicate content means controlling what they can access on your site through robots.txt. Crawlers check the robots.txt file before fetching pages from a site, so it's a good starting point when optimizing parameterized URLs.
Disallow: *?tag=*
This disallow rule blocks search engines from crawling any URL that contains the tag parameter. A broader pattern such as Disallow: /*?* blocks every parameterized URL; before selecting that option, make sure that no other parts of the URL structure rely on parameters, or they will be blocked as well.
You may need to crawl your site yourself to find every URL that contains a question mark (?).
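A small same-site crawler is enough for this. The sketch below is a minimal example, assuming the site lives at https://example.com (a placeholder) and that the third-party requests and beautifulsoup4 packages are installed; it has no politeness delays or robots.txt handling of its own.

# find_params.py - minimal same-site crawler that lists parameterized URLs.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START = "https://example.com/"          # placeholder start page
HOST = urlparse(START).netloc

seen, queue, parameterized = set(), [START], set()

while queue and len(seen) < 500:        # hard cap keeps the sketch bounded
    url = queue.pop()
    if url in seen:
        continue
    seen.add(url)
    try:
        resp = requests.get(url, timeout=10)
    except requests.RequestException:
        continue
    if "text/html" not in resp.headers.get("Content-Type", ""):
        continue
    for a in BeautifulSoup(resp.text, "html.parser").find_all("a", href=True):
        link = urljoin(url, a["href"]).split("#")[0]   # resolve, drop fragment
        if urlparse(link).netloc != HOST:
            continue                                   # stay on the same host
        if "?" in link:
            parameterized.add(link)                    # record parameterized URLs
        queue.append(link)

for link in sorted(parameterized):
    print(link)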
Move URL parameters to static URLs
This belongs to a broader discussion about dynamic versus static URLs. Rewriting dynamic pages as static pages can improve your site's URL structure.
However, especially if parameterized URLs are currently indexed, take the time not only to rewrite the URLs but also to redirect those pages to their corresponding new static locations, typically with permanent (301) redirects.
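As a sketch of what that redirect looks like at the application level, the Flask example below maps an old parameterized URL to its new static path; the /product.php route, /products/ paths, and the id-to-slug table are all hypothetical.

from flask import Flask, abort, redirect, request

app = Flask(__name__)

# Hypothetical mapping from old parameter values to new static slugs.
ID_TO_SLUG = {"123": "blue-widget", "124": "red-widget"}

@app.route("/product.php")
def legacy_product():
    slug = ID_TO_SLUG.get(request.args.get("id", ""))
    if slug is None:
        abort(404)
    # 301 tells crawlers the move is permanent, so the static URL
    # inherits the indexed page's signals.
    return redirect(f"/products/{slug}", code=301)

Server-level rewrite rules (in nginx or Apache, for instance) achieve the same result without application code.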
Google Developers also recommends:
Remove unnecessary parameters but keep a dynamic URL (see the sketch after this list)
Create static content that is equivalent to the original dynamic content
Limit dynamic-to-static rewrites to those that help you remove unnecessary parameters
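To illustrate the first recommendation: session and tracking parameters can be stripped while the meaningful parameters keep the URL dynamic. A minimal sketch using Python's standard library; the parameter names in DROP are assumptions for illustration.

from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

# Parameters assumed unnecessary in this example (session/tracking noise).
DROP = {"sessionid", "utm_source", "utm_medium", "utm_campaign", "ref"}

def strip_unneeded_params(url: str) -> str:
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k.lower() not in DROP]
    # Rebuild the URL: still dynamic, but without the noisy parameters.
    return urlunparse(parts._replace(query=urlencode(kept)))

print(strip_unneeded_params(
    "https://example.com/products?id=123&sessionid=abc&utm_source=mail"))
# -> https://example.com/products?id=123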
Incorporate URL parameters into your SEO strategy
Parameterized URLs make it easier to modify or track content, so it's worth incorporating them where they are needed. Let web crawlers know when to index a specific parameterized URL and when not to, and highlight the most valuable version of the page, for example with a canonical link.
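One way to highlight that version, sketched below, is to emit a canonical link tag that points every tracking-parameter variant back at the clean URL. The TRACKING parameter names are assumptions for illustration.

from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

# Parameters assumed to create duplicate variants in this example.
TRACKING = {"utm_source", "utm_medium", "utm_campaign", "ref"}

def canonical_tag(url: str) -> str:
    parts = urlparse(url)
    clean = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
             if k.lower() not in TRACKING]
    canonical = urlunparse(parts._replace(query=urlencode(clean)))
    # The variant stays crawlable, but the tag points search engines
    # at the version of the page that should be indexed.
    return f'<link rel="canonical" href="{canonical}">'

print(canonical_tag("https://example.com/products?id=123&utm_source=mail"))
# -> <link rel="canonical" href="https://example.com/products?id=123">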