Standard Robots.txt Files for Different Sites
Wed Dec 17 2025 22:24:04 GMT+0000 (Coordinated Universal Time)
Saved by @Surftware
For a Standard WordPress Blog
User-agent: *
# Block the WordPress admin and includes directories
Disallow: /wp-admin/
Disallow: /wp-includes/
# Allow the AJAX endpoint that themes and plugins rely on
Allow: /wp-admin/admin-ajax.php
# Block internal search results
Disallow: /?s=
Disallow: /search/
Sitemap: https://yourblog.com/sitemap.xml
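To check that rules like these behave as intended before deploying them, you can test them locally. A minimal sketch using Python's standard-library `urllib.robotparser` (the domain `yourblog.com` is just the placeholder from the example above; note that this parser applies rules in file order, unlike Google's longest-match rule, so the `Allow` line is listed first here):

```python
import urllib.robotparser

# The WordPress rules from above, with Allow placed first because
# urllib.robotparser uses first-match-wins rather than longest-match.
rules = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /?s=
Disallow: /search/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://yourblog.com/wp-admin/"))       # False (blocked)
print(rp.can_fetch("*", "https://yourblog.com/?s=test"))         # False (blocked)
print(rp.can_fetch("*", "https://yourblog.com/2024/my-post/"))   # True (allowed)
```

A post URL that matches no rule is allowed by default, which is the standard robots.txt behavior.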
For an E-commerce Site (e.g., Shopify or Magento)
User-agent: *
# Block account, cart, and checkout pages
Disallow: /account/
Disallow: /cart/
Disallow: /checkout/
Disallow: /orders/
# Block internal search and filtered navigation pages
Disallow: /search
Disallow: /*?sort_by*
Disallow: /*?filter*
# Explicitly allow Googlebot to crawl everything
User-agent: Googlebot
Allow: /
# Replace with your store's sitemap URL, e.g.:
Sitemap: https://yourstore.com/sitemap.xml
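The e-commerce rules use `*` wildcards (e.g. `/*?sort_by*`), which Python's `urllib.robotparser` does not expand, so a quick way to sanity-check them is to translate each pattern to a regex. A minimal sketch, assuming Google-style wildcard semantics (`*` matches any character sequence, paths match by prefix, and a trailing `$` anchors the end):

```python
import re

def robots_pattern_to_regex(pattern: str) -> re.Pattern:
    # '*' matches any run of characters; a trailing '$' anchors the match
    # at the end of the URL path; everything else is literal.
    anchored = pattern.endswith("$")
    body = pattern[:-1] if anchored else pattern
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
    return re.compile("^" + regex + ("$" if anchored else ""))

# The Disallow patterns from the e-commerce example above.
blocked_patterns = ["/account/", "/cart/", "/checkout/", "/orders/",
                    "/search", "/*?sort_by*", "/*?filter*"]
compiled = [robots_pattern_to_regex(p) for p in blocked_patterns]

def is_blocked(path: str) -> bool:
    # A path is blocked if any Disallow pattern matches it as a prefix.
    return any(rx.match(path) for rx in compiled)

print(is_blocked("/collections/all?sort_by=price-ascending"))  # True
print(is_blocked("/cart/"))                                    # True
print(is_blocked("/products/blue-widget"))                     # False
```

This also shows why the filtered-navigation rules work: `/*?sort_by*` becomes the prefix regex `^/.*\?sort_by.*`, so any URL whose query string contains `sort_by` is excluded regardless of the path before it.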
These are standard robots.txt files for a WordPress blog and an e-commerce site. You can read the full tutorial here: https://techlyguides.com/robots-txt-best-practices/