How to use a robots.txt on GitHub Pages


I have a repository on GitHub with a site that is published through GitHub Pages, but I want search engines to index only the home page of the site. If it were a normal site I could add a robots.txt to the root folder, but GitHub does not give me access to the root folder of the domain, only to the repository folder. What do I do?

* I can't use meta tags, because I also want Googlebot not to see other files in the repository that are not HTML files.
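For context, a minimal robots.txt expressing this "home page only" rule might look like the sketch below. Note that the `Allow` directive and the `$` end-of-URL anchor are extensions honored by Googlebot and Bingbot; they are not part of the original robots exclusion standard, so other crawlers may ignore them:

```
# Sketch: allow only the home page, block everything else.
# "Allow" and the "$" anchor are Google/Bing extensions,
# not part of the original robots.txt standard.
User-agent: *
Allow: /$
Disallow: /
```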

asked by anonymous 21.07.2018 / 17:06

1 answer


If it's GitHub Pages, then it's a subdomain, something like silas333.github.io, so just upload a robots.txt to the root of your repository. I have a site on GitHub Pages and it works normally: link
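To check that the file is actually being served from the subdomain root, you can request it directly; silas333.github.io here is just the example subdomain from above:

```
curl https://silas333.github.io/robots.txt
```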

What you cannot create is a robots.txt for your repository's page on github.com, which is something totally different from your GitHub Pages subdomain: you have no control over the github.com domain, but you do control your subdomain at .github.io.

answered 25.07.2018 / 14:39