# robots.txt for the deepidv website
#
# The rules below are intentionally permissive for public content but block
# common non-public or noise paths (APIs, admin sections, source folders).
# Adjust the Disallow rules to match any private/internal routes on your
# deployed site. If you publish a sitemap, keep the Sitemap line and use an
# absolute URL (the Sitemap directive requires one, e.g.
# https://example.com/sitemap.xml).

User-agent: *
# Allow everything by default
Allow: /

# Block server/API endpoints and obvious non-public folders. Tweak these
# to match your app's actual private routes. These are safe defaults for
# a static site or a typical frontend that proxies API calls.
# Disallow: /api/
# Disallow: /admin/
# Disallow: /internal/
# Disallow: /private/
# Disallow: /scripts/

# Developer/source folders that shouldn't be crawled if accidentally exposed
# Disallow: /src/
# Disallow: /node_modules/

# Point crawlers to the sitemap. Use an absolute URL on the canonical domain.
Sitemap: https://www.deepidv.com/sitemap.xml

# LLM navigation file
# See https://llmstxt.org for details
# LLMs-txt: https://www.deepidv.com/llms.txt

# Optional: Host is a non-standard directive (historically honored by Yandex)
# naming the canonical hostname; most crawlers ignore it. Update it to match
# the production hostname:
Host: https://www.deepidv.com

# End of file