Adding robots rules to Hugo
robots.txt tells crawler bots which URLs they may fetch; this note adds robots rules to the site.
Hugo can generate robots.txt from a template (layouts/robots.txt) or serve it as a static file; both methods are shown below.
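For context, a robots.txt is just User-agent blocks with Allow/Disallow rules plus an optional Sitemap line. A minimal sketch (the /private/ path is only a placeholder, not a rule used on this site):

User-agent: *
Disallow: /private/
Sitemap: https://blog.mvpbang.com/sitemap.xml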
Method 1: generate from a template
1. Enable robots.txt generation in the site configuration
# hugo.toml / config.toml
enableRobotsTXT = true
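The Sitemap line in the template below is built with absURL, which resolves "sitemap.xml" against the site's baseURL, so baseURL should also be set (the value here is assumed from this blog's domain):

# hugo.toml
baseURL = "https://blog.mvpbang.com/"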
2. Add the robots.txt template
layouts/robots.txt
User-agent: *
Sitemap: {{ "sitemap.xml" | absURL }}
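If you also want to keep crawlers out of non-production builds, one common extension of this template (a sketch, assuming a Hugo version that provides hugo.IsProduction) is to branch on the build environment:

layouts/robots.txt
{{ if hugo.IsProduction }}
User-agent: *
Disallow:
{{ else }}
User-agent: *
Disallow: /
{{ end }}
Sitemap: {{ "sitemap.xml" | absURL }}

An empty Disallow: allows everything, while Disallow: / blocks the whole site for staging or development deploys.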
Method 2: serve a static file
To create a robots.txt file without using a template:
1. Set enableRobotsTXT to false in the site configuration.
2. Create a robots.txt file in the static directory, for example:
static/robots.txt
User-agent: *
Sitemap: https://blog.mvpbang.com/sitemap.xml
Result as served from the site root (https://blog.mvpbang.com/robots.txt):
User-agent: *
Sitemap: https://blog.mvpbang.com/sitemap.xml
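To verify, build the site and inspect the generated file, or fetch the deployed one (commands assume a default Hugo setup and this blog's domain):

hugo                                          # build the site into public/
cat public/robots.txt                         # inspect the generated file
curl -s https://blog.mvpbang.com/robots.txt   # check the live site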