Every WordPress site should have a robots.txt file.
That file tells search engine robots which parts of your site to index and which to skip.
On a WordPress site we want all posts indexed, but we don't want to index the admin section (wp-admin) or the core files (wp-includes).
So how do you optimize your robots.txt file in WordPress?
1. Create the robots.txt file
Create it in your root installation folder (/) if you don't have it already.
2. Fill the robots.txt file with these lines
User-agent: *
Disallow: /wp-includes/
Disallow: /wp-admin/
Sitemap: https://ricard.dev/sitemap.xml.gz
These lines allow robots to index your whole site, except the wp-includes and wp-admin folders.
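If you want to double-check what those rules actually allow and disallow, here is a minimal sketch using Python's standard-library `urllib.robotparser`. The example URLs (a hypothetical post path) are illustrative, and the rules are parsed from an inline string rather than fetched so the snippet is self-contained:

```python
from urllib.robotparser import RobotFileParser

# The same rules as in the robots.txt above (Sitemap line omitted,
# since it doesn't affect allow/disallow decisions).
rules = """\
User-agent: *
Disallow: /wp-includes/
Disallow: /wp-admin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A regular post URL (hypothetical path) stays crawlable:
print(parser.can_fetch("*", "https://ricard.dev/some-post/"))   # True

# The admin and core folders are blocked:
print(parser.can_fetch("*", "https://ricard.dev/wp-admin/"))    # False
print(parser.can_fetch("*", "https://ricard.dev/wp-includes/")) # False
```

Running a quick check like this before deploying can catch a typo in a Disallow line before it hides your whole site from search engines.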
Some WordPress developers also block the wp-content folder. Whether to let robots index that folder is up to you.
Here is my robots.txt: https://ricard.dev/robots.txt
Do you have a robots.txt file?