
Why Is Robots.txt Important?


A robots.txt file is a plain text file that tells search engine crawlers, such as Googlebot, which parts of your site they may and may not crawl. If it is missing from your hosting file manager, you have no control over how Google crawls your pages, and unwanted sections of your site can end up in search results. To add it, you only need to complete a few steps and place the file in your hosting account (File Manager section).
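For reference, a minimal robots.txt might look like the sketch below. The paths and the sitemap URL are only placeholders; replace them with your own domain and the sections you actually want to block:

    User-agent: *
    Disallow: /search
    Allow: /

    Sitemap: https://www.example.com/sitemap.xml

Here "User-agent: *" applies the rules to all crawlers, "Disallow" blocks the listed path, "Allow" permits everything else, and the Sitemap line points crawlers to your sitemap file.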

Step 1 - Search Google for an online robots.txt generator and open one of the tools it lists.

Step 2 - Copy the generated rules, paste them into a text editor such as Notepad, and save the file with the exact name robots.txt.

Step 3 - Upload the file to the root directory of your site through the File Manager section of your hosting account.
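Note that the file must sit in the web root (usually the public_html folder on shared hosting), so that it is reachable at a URL like the one below (example.com is only a placeholder for your own domain):

    https://www.example.com/robots.txt

If the file is placed inside a subfolder instead, crawlers will not find it.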

Once you are done, wait about a week, then open Google Search Console and use the URL Inspection tool to see how your pages are being crawled. Search Console will also send updates to your email inbox within a few minutes of each check.
