A while back I decided to use a .php permalink for my WordPress posts and ran into some issues. The iRobots.txt WordPress plugin I had started to use was blocking access to all my .php files, and when I tried to update it, it returned an error.
Since this is a virtual robots file, I was unable to change it manually. So the logical step was to just delete it and upload a static file to my site root. Well, this is where things got interesting. Google was not seeing the new file and was still getting the old virtual file from somewhere on my site. I could not figure out where this thing was cached, so I decided to try something new.
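One quick way to see which robots.txt crawlers are actually being served (as opposed to what is sitting on disk) is to fetch it directly. This is just an illustrative sketch; the domain below is a placeholder for your own:

```shell
# Fetch the robots.txt your server actually serves (placeholder domain).
# If the output differs from the file you uploaded, something upstream --
# WordPress's virtual robots.txt, a cache, or a CDN -- is intercepting it.
curl -s https://yoursite.example/robots.txt
```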
I exported my database and files and proceeded to transfer everything out. I rebuilt the WordPress code. When I imported the database, I made sure the plugin tables were not part of the import. After everything was restored, I made sure there were no pieces of the robots.txt plugin anywhere.
Once I did that, Google recognized the newly uploaded robots.txt and I was back in business. The moral of the story: don't use .php or anything else that collides with WordPress's own files and pages. I am sticking with .html.
This video will show you what I was seeing and why all my posts were being blocked by Google.
The steps to complete the export are not too hard. Just FTP your website folder over to your desktop and export your database through your database tools. Make sure you uncheck any tables that are related to your plugins. You want to ensure that you do not import any old plugin data.
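If your host gives you shell access, the export steps above can be sketched with standard tools. This is only an illustration: the paths, database name, credentials, and the plugin table name are all placeholders you would replace with your own, and `--ignore-table` is mysqldump's standard way to skip a table:

```shell
# Archive the WordPress files (path is a placeholder for your web root).
tar -czf site-backup.tar.gz /var/www/html

# Dump the database, skipping tables created by the robots.txt plugin.
# --ignore-table takes db_name.table_name; the table name here is an
# assumption -- list your real tables first with:
#   mysql -u db_user -p -e 'SHOW TABLES' wordpress_db
mysqldump -u db_user -p wordpress_db \
  --ignore-table=wordpress_db.wp_robotstxt \
  > wordpress-backup.sql
```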
Then it is just a matter of uninstalling the WordPress code and re-installing the base code. Import the database and FTP over any theme files.
Now you need to re-install all your plugins, except for the robots.txt plugin. Then create a simple robots.txt, upload it to your host, and within an hour or two Google will recognize the correct robots.txt instead of the virtual file WordPress generates.
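A minimal robots.txt for a WordPress site might look like the following. The sitemap URL is a placeholder, and the rules are a common starting point rather than a one-size-fits-all answer; adjust them for your own site:

```shell
# Write a minimal robots.txt (sitemap URL is a placeholder).
cat > robots.txt <<'EOF'
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://yoursite.example/sitemap.xml
EOF

# Show what will be uploaded.
cat robots.txt
```

Upload this file to the root of your site. WordPress only serves its virtual robots.txt when no physical file exists at the root, so once the static file is in place the virtual one stops being served.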