How to Add a Robots.txt and Sitemap File to Your Website
If a website audit tool is reporting errors on your site, it may be because your website is missing two essential files: robots.txt and sitemap.xml. These files help search engines understand your site, and many audit tools expect to find them. Here’s how you can add them to your website:

🛠 What is a robots.txt file?
The robots.txt file tells search engines which pages they can (and cannot) crawl. It’s a simple text file that lives at the root of your domain (e.g., https://yourdomain.com/robots.txt).
How to create a robots.txt file:
- Open a plain text editor (like Notepad or VS Code).
- Add these basic lines:
```
User-agent: *
Disallow:
```
This allows all search engines to crawl all pages. If you want to block specific parts of your site, you can customize it (see the example after these steps).
- Save the file as robots.txt.
- Upload it to the root of your site using your CMS (like Shopify or WordPress) or your hosting’s file manager.
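For instance, if you wanted to keep crawlers out of an admin area or internal search results, you could add Disallow rules. A minimal sketch (the /admin/ and /search paths are hypothetical placeholders, not standard paths on every site):

```
# Apply these rules to all crawlers
User-agent: *
# Block the example admin area and internal search results pages
Disallow: /admin/
Disallow: /search
```

Everything not listed under Disallow remains crawlable.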
🗺 What is a sitemap.xml file?
A sitemap.xml file lists all the important pages on your site, helping search engines find and index them.
How to create a sitemap.xml file:
- Use an online sitemap generator (like XML-sitemaps.com) or your CMS’s built-in sitemap feature.
- Save the file as sitemap.xml.
- Upload it to your website’s root directory (e.g., https://yourdomain.com/sitemap.xml).
For example, Shopify automatically generates a sitemap at https://yourdomain.com/sitemap.xml. If you’re using WordPress, consider using an SEO plugin like Yoast SEO to create it.
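If you’d rather write the file by hand, sitemaps follow the sitemaps.org XML format. Here’s a minimal sketch with a single placeholder page (the URL and date are examples, not real values):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want indexed; yourdomain.com is a placeholder -->
  <url>
    <loc>https://yourdomain.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>
```

Once the sitemap is live, you can also point crawlers to it from robots.txt by adding a line like Sitemap: https://yourdomain.com/sitemap.xml.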
🔍 Why these files matter
Adding these files not only makes your site audit-ready but also helps search engines crawl and index it more effectively. Search engines love well-structured sites!
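Once both files are uploaded, a quick way to confirm they’re reachable (assuming you have curl available; yourdomain.com is a placeholder) is to request them directly:

```bash
# Each request should return HTTP 200 if the file is in place
curl -I https://yourdomain.com/robots.txt
curl -I https://yourdomain.com/sitemap.xml
```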