Steps to reproduce:
Attempt to extract the crawl delay for a user agent from a robots.txt file in which the Crawl-delay rule for that specific user agent appears after the rule for the wildcard agent (*). Example from https://www.archives.gov/robots.txt:
User-agent: *
Crawl-delay: 10
...
User-agent: usasearch
Crawl-delay: 2
Expected results:
> Robotex.new('usasearch').delay('https://www.archives.gov')
#=> 2
Actual results:
> Robotex.new('usasearch').delay('https://www.archives.gov')
#=> 10
See https://developers.google.com/search/reference/robots_txt#order-of-precedence-for-user-agents for the precedence rules: the record matching the most specific user agent should win, so the "usasearch" record should take precedence over the "*" record regardless of the order in which they appear in the file.
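
For illustration, here is a minimal sketch (not Robotex's actual implementation) of delay extraction that applies the expected precedence: collect the Crawl-delay for each User-agent record, then prefer an exact user-agent match over the "*" wildcard. The `crawl_delay` helper and its signature are hypothetical.

```ruby
# Sketch: parse Crawl-delay per User-agent record, then pick the most
# specific match (exact agent name before the "*" wildcard).
def crawl_delay(robots_txt, user_agent)
  delays = {}
  current_agents = []   # User-agent lines naming the current record
  seen_rule = false     # have we seen a rule line since the last User-agent?

  robots_txt.each_line do |line|
    line = line.sub(/#.*/, '').strip  # drop comments and whitespace
    case line
    when /\AUser-agent:\s*(.+)\z/i
      # A User-agent line after a rule line starts a new record.
      current_agents = [] if seen_rule
      current_agents << $1.downcase
      seen_rule = false
    when /\ACrawl-delay:\s*(\d+(?:\.\d+)?)\z/i
      current_agents.each { |agent| delays[agent] = $1.to_f }
      seen_rule = true
    when /\A\w+(-\w+)*:/
      seen_rule = true  # any other rule (Allow, Disallow, ...) ends the header
    end
  end

  # Most specific match wins, independent of record order in the file.
  delays[user_agent.downcase] || delays['*']
end

robots = <<~TXT
  User-agent: *
  Crawl-delay: 10

  User-agent: usasearch
  Crawl-delay: 2
TXT

crawl_delay(robots, 'usasearch')  #=> 2.0
crawl_delay(robots, 'googlebot')  #=> 10.0
```

With this ordering-independent lookup, the "usasearch" agent gets its own delay of 2 even though its record appears after the "*" record, which is the behavior the report above expects from Robotex.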