Website owners use the robots.txt file to give instructions about their website to search engine robots. This convention is known as the robots exclusion protocol.
Robots.txt works like this: before crawling your website, a robot first checks your robots.txt file. For example, if your website is www.digitalmediarole.com, the robot first checks www.digitalmediarole.com/robots.txt.
Here is an example of robots.txt syntax:
User-agent: *
Disallow:
The “User-agent: *” means that this section applies to all robots. The empty “Disallow:” tells the robots that they may visit every page on the website.
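For contrast, a robots.txt that blocks all robots from the entire website would look like this (the “/” stands for the site root, so everything under it is off limits):
User-agent: *
Disallow: /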
There are two important things to keep in mind when using a robots.txt file:
- Robots can ignore your robots.txt. In particular, malware robots that scan the web for security vulnerabilities and email address harvesters used by spammers will pay no attention to it.
- The robots.txt file is publicly available. Anyone can see which sections of your server you do not want robots to use.
So do not try to use robots.txt to hide information.
How to create a robots.txt file
Where to place it
Place it in the top-level directory of your web server, so that it sits at the root of your domain.
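For example, using the site mentioned above, robots only look for the file at the root of the domain:
www.digitalmediarole.com/robots.txt (robots will find this)
www.digitalmediarole.com/blog/robots.txt (robots do not look in subdirectories, so this is ignored)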
What to place in it
Here is an example:
User-agent: *
Disallow: /wp-admin/
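You can also address specific robots by name. As a rough sketch (the /private/ directory here is just a placeholder), the following blocks Googlebot from one folder while leaving the rest of the site open to all other robots:
User-agent: Googlebot
Disallow: /private/
User-agent: *
Disallow: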
Conclusion
The robots.txt file is an important on-page element. It tells robots how to treat your web pages. If your site is missing a robots.txt file, create and upload one today.