Controlling how search engines like Google crawl and index your WordPress or WooCommerce site, such as “My Shop” (from your “Blog Page” guide), is crucial for SEO. A robots.txt file helps manage crawler access to pages like product listings (e.g., “Flying Ninja” from your “Creating Products” guide) or blog posts. This guide addresses how to create, upload, and submit an updated robots.txt file in the new Google Search Console (GSC), based on the 6fc Live Ask Google Webmasters video and enriched with current web sources. It also clarifies the transition from the old GSC, as raised by Div’s question about robots.txt submission.
Why Submit a Robots.txt File?
- Manage Crawling: Allows or blocks Googlebot from accessing specific pages (e.g., admin areas, staging sites) to optimize crawl budget.
- SEO Control: Prevents indexing of low-value pages (e.g., internal search results) while ensuring key content (e.g., product pages) is crawled.
- WooCommerce Fit: Enhances visibility for eCommerce pages or membership content (from your “Membership Plans” guide) for verified users (from your “Email Verification” guide).
- Limitations: Not all crawlers obey robots.txt, and it doesn’t guarantee non-indexing (use noindex for that); changes may take 24-72 hours to take effect.
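Because robots.txt only controls crawling, a page blocked by it can still appear in results if other sites link to it. To keep a page out of search results entirely, use a noindex directive instead, and note that the page must remain crawlable for Google to see it. A minimal sketch of the two common forms:

```html
<!-- Option 1: in the page's <head> (the page must NOT be blocked in robots.txt,
     or Google will never see this tag) -->
<meta name="robots" content="noindex">
```

Alternatively, send the equivalent HTTP response header, which also works for non-HTML files such as PDFs:

```
X-Robots-Tag: noindex
```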
Context: Old vs. New Google Search Console
- Old GSC (Pre-2019): Had a dedicated Robots.txt Tester tool for editing, testing, and submitting robots.txt files directly in GSC.
- New GSC (Post-2019): The Robots.txt Tester was sunset in November 2023, replaced by the Robots.txt Report in Settings, which doesn’t allow direct editing or submission but supports recrawl requests.
- Div’s Question (Video): As of the 2019 video, the new GSC lacked a robots.txt submission feature, requiring users to switch to the old version. Google’s plan was to rethink features, not just copy them, to address modern website needs efficiently.
- Current Status (2025): The Robots.txt Report now allows viewing and requesting recrawls, but editing and submission are handled externally (e.g., via hosting or CMS).
Step 1: Create or Edit Your Robots.txt File
- Understand Robots.txt:
- A robots.txt file lives at your site’s root (e.g., https://yoursite.com/robots.txt) and uses simple rules to allow/disallow crawler access.
- Example:

```txt
User-agent: *
Allow: /
Disallow: /wp-admin/
Sitemap: https://yoursite.com/sitemap.xml
```
- User-agent: * applies to all crawlers.
- Allow: / permits crawling the entire site.
- Disallow: /wp-admin/ blocks the admin area.
- Sitemap: points to your sitemap.
- Access Existing File: Visit https://yoursite.com/robots.txt in a browser to see what (if anything) is currently served.
- Edit or Create File: Use a plain-text editor to update the file, or create a new robots.txt if one doesn’t exist yet.
- Test Syntax:
- Use third-party tools since GSC’s Robots.txt Tester is discontinued:
- Logeix Robots.txt Tester: logeix.com/tools/robots-txt-tester
- Merkle: technicalseo.com/tools/robots-txt/
- Ryte: ryte.com
- Or, perform a GSC Live Test in URL Inspection for https://yoursite.com/robots.txt to check crawler access.
- Fix errors (e.g., incorrect syntax like Disalow instead of Disallow).
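To check rules locally before uploading, Python’s standard-library `urllib.robotparser` can evaluate a robots.txt against sample URLs. A minimal sketch using the example rules from this step (`yoursite.com` is a placeholder):

```python
from urllib.robotparser import RobotFileParser

# Rules mirroring the example file above ("yoursite.com" is a placeholder).
# Note: Python's parser applies rules in order (first match wins), so the
# Disallow line is placed before Allow here; Google itself uses the most
# specific matching rule, so order doesn't matter in your live file.
rules = """\
User-agent: *
Disallow: /wp-admin/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Product pages are crawlable; the admin area is not.
print(parser.can_fetch("Googlebot", "https://yoursite.com/product/flying-ninja/"))  # True
print(parser.can_fetch("Googlebot", "https://yoursite.com/wp-admin/"))              # False
```

This catches logic mistakes (a Disallow rule blocking a page you meant to keep crawlable) but not spelling mistakes in directive names, so still run the file through one of the validators listed above.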
Step 2: Upload Robots.txt to Your Site
- Access Root Directory:
- cPanel: Go to File Manager > public_html, upload or replace robots.txt.
- FTP: Connect via FileZilla, navigate to the root, and upload robots.txt.
- WordPress:
- Use Yoast SEO: Go to SEO > Tools > File Editor to edit/upload robots.txt.
- Or, use a plugin like All in One SEO to manage robots.txt.
- CMS (e.g., Wix, Blogger): Check for search settings to manage crawler access if direct editing isn’t available.
- Verify Upload: Visit https://yoursite.com/robots.txt in a browser to confirm the new rules are live (not a 404 or an old version).
- Subdomains: Each subdomain (e.g., shop.yoursite.com) needs its own robots.txt at its own root; the main domain’s file doesn’t apply to it.
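Verification can also be scripted. A minimal sketch that fetches the live file and checks a directive is present (`yoursite.com` and the `fetch_robots`/`has_directive` helpers are illustrative, not a standard API):

```python
import urllib.request

def fetch_robots(domain: str) -> str:
    """Fetch the live robots.txt; an HTTPError 404 means it's missing from the root."""
    with urllib.request.urlopen(f"https://{domain}/robots.txt", timeout=10) as resp:
        return resp.read().decode("utf-8")

def has_directive(robots_txt: str, directive: str) -> bool:
    """Check (case-insensitively) that an exact directive line is present."""
    return any(line.strip().lower() == directive.lower()
               for line in robots_txt.splitlines())

# Example usage (requires network access):
# text = fetch_robots("yoursite.com")
# print(has_directive(text, "Disallow: /wp-admin/"))
```

Running this after every deployment helps catch the common failure mode where a hosting migration or plugin silently replaces the file.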
Step 3: Submit Robots.txt in Google Search Console
- Access Robots.txt Report:
- Log in to GSC: search.google.com/search-console.
- Select your domain property (e.g., yoursite.com, verified per your “Verify Domain” guide).
- Go to Settings > Robots.txt Report (available since November 2023).
- Check Current File: The report lists the robots.txt files Google has found for your property, their fetch status, and when each was last crawled.
- Request Recrawl: After updating the file on your server, open the more (⋮) menu next to the file in the report and choose Request a recrawl.
- Verify Update: Reopen the report after the recrawl to confirm the “Last checked” time and content reflect your latest version.
Step 4: Verify and Troubleshoot
- Test Crawling:
- In GSC, use URL Inspection to test specific URLs (e.g., https://yoursite.com/blog/) and check whether they’re blocked by robots.txt.
- Search site:yoursite.com on Google to ensure only intended pages are indexed.
- Troubleshoot Issues:
- 404 Error (Not Fetched): Google treats a missing robots.txt as “allow all crawling”; upload the file to the site root if you want any rules applied.
- Parsing Errors: Re-run the file through a validator and fix misspelled or malformed directives.
- Blocked Critical Pages: Remove overly broad rules (e.g., Disallow: /) that block pages you want indexed.
- Old Version Cached: Google generally caches robots.txt for up to 24 hours; request a recrawl or wait for the cache to expire.
- CMS Issues: Some platforms generate robots.txt dynamically; check your plugin or platform settings before editing the file manually.
- Contact support at support.google.com/webmasters/ or your hosting provider.
- Monitor Regularly: Review the Robots.txt Report and the Page Indexing report in GSC for unexpected blocks after site or plugin changes.
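Misspelled directives like the “Disalow” example above are easy to miss by eye. A minimal sketch of a syntax check that flags unknown directive names (a quick sanity check, not a replacement for the full validators listed in Step 1):

```python
# Directive names commonly accepted by major crawlers.
KNOWN_DIRECTIVES = {"user-agent", "allow", "disallow", "sitemap", "crawl-delay"}

def find_unknown_directives(robots_txt: str) -> list:
    """Return (line_number, directive) pairs for unrecognized directive names."""
    problems = []
    for n, line in enumerate(robots_txt.splitlines(), start=1):
        line = line.split("#")[0].strip()  # drop comments and whitespace
        if not line or ":" not in line:
            continue
        directive = line.split(":", 1)[0].strip().lower()
        if directive not in KNOWN_DIRECTIVES:
            problems.append((n, directive))
    return problems

print(find_unknown_directives("User-agent: *\nDisalow: /wp-admin/"))
# -> [(2, 'disalow')]
```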
Step 5: Enhance Your Robots.txt Strategy
- GSC Features: Pair the Robots.txt Report with the Sitemaps and Page Indexing reports to confirm your rules match what Google actually crawls.
- WooCommerce Integration:
- Allow product pages (e.g., /product/flying-ninja/) for indexing.
- Block low-value pages (e.g., /cart/, /checkout/) with Disallow.
- Pair with NotificationX for sales alerts on indexed pages (from your “NotificationX” guide).
- Secure transactions with Razorpay or UPI QR Code (from your “Razorpay” or “UPI Payment” guides).
- Use FiboSearch for indexed product searches (from your “FiboSearch” guide).
- Advanced Rules: Target specific crawlers (e.g., User-agent: Googlebot-Image) or use wildcards (*) and end-of-URL anchors ($) for pattern matching.
- Alternative Blocking: For pages that must stay out of search results entirely, use a noindex meta tag or password protection instead of robots.txt.
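Putting the WooCommerce rules above together, a common starting point looks like the fragment below. The paths assume default WooCommerce permalinks and `yoursite.com` is a placeholder; adjust both to your setup.

```txt
User-agent: *
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://yoursite.com/sitemap.xml
```

The admin-ajax.php exception matters because some themes and plugins load front-end content through it, and blocking it can prevent Google from rendering pages correctly.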
Step 6: Best Practices
- File Placement: Keep robots.txt at the site root (https://yoursite.com/robots.txt); files in subdirectories are ignored by crawlers.
- Rule Simplicity: Prefer a few broad, clear rules over many overlapping ones to reduce the chance of accidental blocks.
- Testing: Re-test the file after every change before relying on it.
- Security and Compliance:
- Secure blocked pages with Wordfence (from your “Malware Removal” guide).
- Use GDPR Cookie Consent for compliant tracking (from your “Cookie Notice” guide).
Pro Tips
- Boost Engagement: Promote indexed pages via Welcome Bar or Join.chat (from your “Sticky Menu” or “Join.chat” guides).
- Ads: Run YouTube ads for crawlable content with Google Ads (from your “YouTube Ads” guide).
- Analytics: Track indexed page performance with Visualizer (from your “Visualizer” guide).
- Backup: Save robots.txt with UpdraftPlus (from your “Backup and Migration” guide).
- Styling: Align crawlable pages with Neve’s design (from your “Neve” guide).
- SEO: Pair with Search & Filter for indexed content navigation (from your “Search & Filter” guide).
Congratulations!
You’ve learned how to create, upload, and submit a robots.txt file in the new Google Search Console with 6fc Live! Your WordPress or WooCommerce site is now set up for efficient crawling and indexing. For more details, explore Google’s Robots.txt Guide or the Robots.txt Report help page (support.google.com/webmasters/answer/13085936). Combine with your other guides (e.g., “Verify Domain,” “Submit URL for Indexing,” “Search & Filter”) for a robust SEO strategy. Need help with robots.txt syntax, recrawling, or troubleshooting? Comment below or visit support.google.com/webmasters/