RobotUserAgent is the user-agent value of the robot, used to find the matching policy group in a host's robots.txt file. It is not used to make the robots.txt request, only to match a policy. It should always be set to the name of your crawler application so that site owners can configure their robots.txt accordingly.
```go
// Should always set your robot name so that it looks for the most
// specific rules possible in robots.txt.
opts.RobotUserAgent = "Example"
// And reflect that in the user-agent string used to make requests,
// ideally with a link so site owners can contact you if there's an issue.
opts.UserAgent = "Mozilla/5.0 (compatible; Example/1.0; +http://example.com)"
```
```go
c.logFunc(LogTrace, "init() - seeds length: %d", l)
c.logFunc(LogTrace, "init() - host count: %d", hostCount)
c.logFunc(LogInfo, "robot user-agent: %s", c.Options.RobotUserAgent)

// Create a shiny new WaitGroup
```
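The `LogTrace` and `LogInfo` levels above are filtered against the crawler's `LogFlags` option (e.g. `LogAll`). A minimal sketch of how such bitmask-based log filtering works; the flag names mirror the ones used above, but `logFunc` here is a simplified stand-in, not the crawler's actual implementation:

```go
package main

import "fmt"

// LogFlags is a bitmask of enabled log levels.
type LogFlags uint

const (
	LogError LogFlags = 1 << iota
	LogInfo
	LogTrace
)

// LogAll enables every level.
const LogAll = LogError | LogInfo | LogTrace

// logFunc prints the message only when its level is enabled in the
// mask, and reports whether it printed.
func logFunc(enabled, level LogFlags, format string, args ...interface{}) bool {
	if enabled&level == 0 {
		return false
	}
	fmt.Printf(format+"\n", args...)
	return true
}

func main() {
	mask := LogError | LogInfo // trace messages disabled
	logFunc(mask, LogInfo, "robot user-agent: %s", "Example")
	logFunc(mask, LogTrace, "init() - seeds length: %d", 3) // suppressed
}
```

Combining flags with `|` lets callers enable exactly the verbosity they want, which is why the test options below simply pass `LogAll`.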
```go
	CrawlDelay:     DefaultTestCrawlDelay,
	LogFlags:       LogAll,
	RobotUserAgent: DefaultRobotUserAgent,
},
seeds: "http://robota/page1.html",
```