Step 1: It All Started with a Cron Job
The story began with a simple goal:
I wanted the W3 Total Cache preload to automatically rebuild page caches every few hours.
The preload uses a sitemap as its source, and to trigger it on a schedule I set up a cron job in cPanel:
cd /home/onlinedentist/public_html && /usr/local/bin/php /home/onlinedentist/public_html/wp-cron.php
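In cPanel the timing is set through the cron form fields; written as a plain crontab entry, an every-three-hours run would look roughly like this (the three-hour interval is only an example; adjust it to your own preload window):
# Example crontab line: trigger wp-cron.php every 3 hours (adjust the schedule to taste)
0 */3 * * * cd /home/onlinedentist/public_html && /usr/local/bin/php /home/onlinedentist/public_html/wp-cron.php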
…but soon, I noticed something strange: the cache folders were barely filling up – out of more than 300 multilingual pages, only a handful were cached.
That was the first clue something deeper was broken.
Step 2: The Real Culprit – AIOSEO, Polylang, and a Broken Sitemap
My site runs in two languages using Polylang.
At that time, I also used All in One SEO (AIOSEO), which had its own sitemap system.
When I checked the sitemap index, it claimed there were 317 URLs.
But when I clicked one of the sub-sitemaps, there were no items…
I got this message: “Didn’t expect to see this? Make sure your sitemap is enabled and your content is set to be indexed.”
So AIOSEO’s sitemap system didn’t actually work with Polylang.
The W3 Total Cache preload relied on it – and since it was broken, the cache preload had no URLs to warm up.
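Looking back, the quickest way to catch this kind of silent failure is to count how many URLs a sitemap actually exposes. A rough shell check, run against the index and against any sub-sitemap it lists (the URL is the same one W3TC uses later):
# Count the <loc> entries in the sitemap; a healthy 300+ page site should not return single digits
curl -s https://online-dentist.hu/sitemap.xml | grep -o "<loc>" | wc -l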
Step 3: Switching to Google XML Sitemaps
To fix this, I replaced AIOSEO’s sitemap module with the old but reliable Google XML Sitemaps plugin – which creates a true, working sitemap.
This plugin ignores Polylang and lists all pages from all languages, so W3TC finally had a valid source for its preload job.
After enabling it, my cache folders started filling up properly – both /hu/ and /en/ pages were cached automatically.
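To watch the progress without refreshing the admin screen, you can simply count the cached files on disk. A minimal check, assuming W3TC’s disk-enhanced page cache and its default wp-content/cache/page_enhanced directory (adjust the path if your caching method differs):
# Count the HTML files the W3TC page cache has written so far
find /home/onlinedentist/public_html/wp-content/cache/page_enhanced -type f -name "*.html" | wc -l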
Step 4: The Next Problem – Double Sitemaps
Right after fixing that, a new warning appeared in the WordPress dashboard:
“One or more plugins are affecting your site’s ability to be indexed.”
I checked and realized WordPress itself was still generating its own sitemap.
Now I had two sitemaps again:
- /sitemap.xml (from the plugin, correct)
- /wp-sitemap.xml (from WordPress core, unnecessary)
To clean things up, I decided to disable the core sitemap completely.
Step 5: Disabling the WordPress Core Sitemap
I tried the official filter first:
add_filter( 'wp_sitemaps_enabled', '__return_false' );
But it didn’t work – the sitemap stayed alive.
So I added a stronger block in my functions.php:
add_action( 'init', function() {
    // Unhook the core sitemap server before it can register its routes and rewrite rules.
    remove_action( 'init', 'wp_sitemaps_get_server' );
    // Force the enabled flag to false with a very late priority so nothing can switch it back on.
    add_filter( 'wp_sitemaps_enabled', '__return_false', 9999 );
}, 0 ); // Priority 0: run before core's own init callbacks, so the remove_action lands in time.
And to make sure it was completely gone, I placed this .htaccess rule above the WordPress section:
# --- Disable WordPress core sitemap ---
<IfModule mod_rewrite.c>
RewriteEngine On
# Match the index and every generated sub-sitemap (posts, pages, taxonomies, users)
RewriteCond %{REQUEST_URI} ^/wp-sitemap\.xml$ [OR]
RewriteCond %{REQUEST_URI} ^/wp-sitemap-[a-z0-9-]+\.xml$
RewriteRule .* - [R=404,L]
</IfModule>
After this, /wp-sitemap.xml finally returned 404 Not Found, while /sitemap.xml continued to work normally.
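A quick command-line sanity check confirms the same thing by comparing the HTTP status codes of the two endpoints:
# Expect 404 for the blocked core sitemap and 200 for the plugin sitemap
curl -s -o /dev/null -w "%{http_code}\n" https://online-dentist.hu/wp-sitemap.xml
curl -s -o /dev/null -w "%{http_code}\n" https://online-dentist.hu/sitemap.xml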
Step 6: Verifying the Robots.txt and Cache Integration
Once the sitemap situation was stable, I checked the virtual robots.txt (generated by WordPress) – it already contained the correct sitemap reference:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://online-dentist.hu/sitemap.xml
No need for a physical robots.txt file.
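Since the file only exists virtually, the simplest way to see exactly what crawlers receive is to fetch it directly:
# Print the virtual robots.txt that WordPress generates on the fly
curl -s https://online-dentist.hu/robots.txt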
Then, in W3 Total Cache → Page Cache → Cache Preload, I confirmed:
Sitemap URL: https://online-dentist.hu/sitemap.xml
Update interval: 1800 seconds
Pages per interval: 10
The next scheduled cron run rebuilt the entire cache successfully.
What I Learned
- Scheduled cache preloading only works if the sitemap works.
  - A broken or plugin-conflicted sitemap silently breaks the preload job.
- AIOSEO and Polylang don’t play well together.
  - The sitemap index looks correct, but the sub-sitemaps are often empty or invalid.
- Google XML Sitemaps still works best for multilingual static sitemaps.
  - It doesn’t overthink language folders – it just lists every URL.
- The WordPress core sitemap must be disabled when using a custom sitemap plugin, or you’ll confuse both Google and W3TC.
- Virtual files are fine.
  - You don’t need a physical robots.txt or sitemap.xml – WordPress can handle them dynamically.
Final Thoughts
This fix began as a routine optimization – a simple cron job for W3TC – but it led me deep into WordPress internals, SEO plugin conflicts, and how virtual endpoints really work.
Now my site has:
- One active sitemap (/sitemap.xml)
- One dynamic robots.txt
- A fully working cache preload system
Sometimes improving performance means stripping away the unnecessary – not adding more tools, but making sure the ones you already have actually talk to each other.
Buy me a coffee?
If you enjoyed this story, you can buy me a coffee. You don’t have to – but it means a lot and I always turn it into a new adventure.
Buy a coffee for Steve
