Knowledge base article

How do I diagnose why Microsoft Copilot is not using pages on Squarespace?

Learn how to diagnose why Microsoft Copilot is failing to index or reference your Squarespace pages. Follow these steps to troubleshoot crawl and visibility issues.
Category: Technical Optimization · Created: 2 March 2026 · Published: 29 April 2026 · Reviewed: 29 April 2026 · Trakkr Research team

To diagnose why Microsoft Copilot is not using your Squarespace pages, start by verifying your site's robots.txt file to ensure AI crawlers are not blocked. Next, check your Squarespace SEO settings to confirm that pages are public and not hidden from search engines. Then inspect individual URLs for indexing errors with Google Search Console or Bing Webmaster Tools; Bing matters most here because Copilot draws on its index. Finally, ensure your sitemap is submitted and up to date. If issues persist, check for canonical tag conflicts or heavy JavaScript rendering that might prevent the crawler from parsing your content. Regularly auditing these technical elements keeps your site accessible to AI-driven search engines and conversational agents.

What this answer should make obvious
  • Verified indexing protocols for AI crawlers.
  • Standardized Squarespace SEO configuration steps.
  • Technical audit framework for search visibility.

Check Robots.txt and Sitemap

The first step in diagnosing crawl issues is confirming that your site's directives allow access. Microsoft Copilot draws on Bing's search index, so anything that blocks Bing's crawler or AI-specific user agents will keep your pages out of its answers.

Verify that your sitemap is correctly formatted and accessible to external crawlers. A missing or stale sitemap will not block indexing outright, but it slows discovery of new and updated pages.

  • Access your robots.txt file via yourdomain.com/robots.txt
  • Ensure no Disallow directives target AI user agents
  • Submit your sitemap URL to search console tools
  • Check for any crawl delay settings in Squarespace
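The robots.txt checks above can be sketched with Python's standard-library `urllib.robotparser`. The `ROBOTS_TXT` sample below is illustrative, not a real Squarespace export; `bingbot` is used as the user agent because Copilot relies on Bing's index, and you would substitute your own file and the user agents you care about.

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt: everyone is barred from /config/,
# but bingbot (an empty Disallow means "allow all") may crawl everything.
ROBOTS_TXT = """\
User-agent: *
Disallow: /config/

User-agent: bingbot
Disallow:
"""


def crawler_allowed(robots_txt: str, user_agent: str, path: str) -> bool:
    """Return True if the given user agent may fetch the given path."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(user_agent, path)


print(crawler_allowed(ROBOTS_TXT, "bingbot", "/config/page"))   # allowed
print(crawler_allowed(ROBOTS_TXT, "OtherBot", "/config/page"))  # blocked
```

In practice you would fetch `yourdomain.com/robots.txt` first and run the same check against each user agent you expect to serve.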

Review Squarespace SEO Settings

Squarespace has built-in settings that can inadvertently hide pages from search engines. Hiding a page from navigation does not stop crawlers, but enabling "hide from search results" adds a noindex directive, and password-protected pages are unreachable to crawlers entirely.

Reviewing these settings is critical for maintaining site visibility. Check them whenever pages stop appearing in search or AI answers, and note the state of each toggle so you can tell which change restored visibility.

  • Verify page visibility is set to Public
  • Check for 'noindex' tags in page settings
  • Ensure canonical URLs are correctly defined
  • Review site-wide SEO description and title tags
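The noindex and canonical checks above can be automated with a small parser built on Python's standard-library `html.parser`. This is a minimal sketch, and the `SAMPLE` page with its `example.com` canonical URL is hypothetical; in practice you would feed it the fetched HTML of each page you are auditing.

```python
from html.parser import HTMLParser


class IndexabilityAudit(HTMLParser):
    """Collect the robots meta directive and canonical URL from a page."""

    def __init__(self):
        super().__init__()
        self.robots_content = None
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.robots_content = attrs.get("content", "")
        if tag == "link" and attrs.get("rel", "").lower() == "canonical":
            self.canonical = attrs.get("href")


def audit_page(html: str) -> dict:
    parser = IndexabilityAudit()
    parser.feed(html)
    noindex = bool(parser.robots_content) and "noindex" in parser.robots_content.lower()
    return {"noindex": noindex, "canonical": parser.canonical}


# Hypothetical page that would be invisible to crawlers:
SAMPLE = (
    '<html><head>'
    '<meta name="robots" content="noindex, nofollow">'
    '<link rel="canonical" href="https://example.com/page">'
    '</head><body></body></html>'
)
print(audit_page(SAMPLE))
```

A page reported as `noindex: True`, or whose canonical points at a different URL than the one you expect Copilot to cite, explains a visibility gap immediately.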

Analyze Content Accessibility

Sometimes the content itself is the issue, particularly if it relies on heavy scripts. Crawlers that do not execute JavaScript see only the raw HTML, so any content injected by scripts after page load may be invisible to them.

AI crawlers prefer clean, semantic HTML. Clear headings, lists, and structured data markup make it easier for a model to extract, summarize, and cite your content accurately.

  • Minimize reliance on complex JavaScript rendering
  • Use clear headings and structured data markup
  • Check for broken links that block crawler paths
  • Ensure content is not behind a login wall
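One way to estimate what a non-JavaScript crawler sees is to strip scripts and styles from the raw HTML and count what remains. This is a rough illustrative heuristic, not a Squarespace or Copilot tool: the 50-word cutoff is an arbitrary assumption, and the `SAMPLE` page is invented for the example.

```python
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    """Count visible words and headings, ignoring script/style content."""

    SKIP = {"script", "style", "noscript"}

    def __init__(self):
        super().__init__()
        self._skip_depth = 0
        self.words = 0
        self.headings = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1
        if tag in {"h1", "h2", "h3"}:
            self.headings += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if self._skip_depth == 0:
            self.words += len(data.split())


def static_content_report(html: str) -> dict:
    parser = TextExtractor()
    parser.feed(html)
    # Assumed cutoff: under 50 static words suggests the page leans on JS rendering.
    return {
        "words": parser.words,
        "headings": parser.headings,
        "likely_js_dependent": parser.words < 50,
    }


SAMPLE = (
    "<html><body><h1>Troubleshooting</h1>"
    "<p>Check your settings carefully.</p>"
    '<script>var app = "content rendered later";</script>'
    "</body></html>"
)
print(static_content_report(SAMPLE))
```

If the static word count is far below what the rendered page shows in a browser, the content is being assembled by JavaScript and may never reach an AI crawler.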
Frequently asked questions

Why is Copilot ignoring my Squarespace site?

It is likely due to a robots.txt block or a 'noindex' tag preventing the crawler from accessing your content.

Does Squarespace automatically support AI crawlers?

Yes, Squarespace is generally compatible, but you must ensure your SEO settings do not explicitly block search engines.

How long does it take for Copilot to re-index?

Re-indexing can take anywhere from a few days to several weeks, depending on how frequently the underlying search index recrawls your site.

Should I use a custom robots.txt file?

Only if you need to restrict specific bots; otherwise, the default Squarespace configuration is usually sufficient.