0.19.0
Updates
Dependency: Replaced the request library with got.
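For anyone tracking the change, here is a rough sketch of what the move looks like at a call site. The URL handling, function name, and options are illustrative, not taken from the library's source:

```typescript
import got from "got";

// Before (request, callback-based):
//   request({ url, gzip: true }, (err, res, body) => { /* ... */ });

// After (got, promise-based). fetchPage is a hypothetical name, not the library's API.
async function fetchPage(url: string): Promise<string> {
  const response = await got(url, {
    decompress: true,            // got decompresses gzip/deflate responses transparently
    timeout: { request: 10_000 } // abort the whole request after 10 seconds
  });
  return response.body;
}
```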
robots input types now search for Sitemap directives and favor those over any other information in the robots.txt file. If Sitemap directives are found, those alone are used to drive crawling of the site. If none are found, the crawler falls back to Allow directives, as in previous versions. A sketch of this selection logic follows.
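A minimal sketch of that Sitemap-over-Allow preference, assuming a hypothetical parseRobotsTxt helper and field names; the real internals differ:

```typescript
// parseRobotsTxt and the field names below are hypothetical, not the library's API.
interface RobotsDirectives {
  sitemaps: string[]; // values of "Sitemap:" lines
  allows: string[];   // values of "Allow:" lines
}

function parseRobotsTxt(body: string): RobotsDirectives {
  const sitemaps: string[] = [];
  const allows: string[] = [];
  for (const line of body.split("\n")) {
    const [field, ...rest] = line.split(":");
    const value = rest.join(":").trim();
    if (!value) continue;
    if (/^sitemap$/i.test(field.trim())) sitemaps.push(value);
    else if (/^allow$/i.test(field.trim())) allows.push(value);
  }
  return { sitemaps, allows };
}

function seedUrls(robots: RobotsDirectives, origin: string): string[] {
  // Sitemap directives, when present, drive crawling on their own;
  if (robots.sitemaps.length > 0) return robots.sitemaps;
  // otherwise fall back to Allow paths, as in previous versions.
  return robots.allows.map((path) => new URL(path, origin).href);
}
```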
A URL listed as a sitemap may actually point to a sitemap index. If a parsed sitemap turns out to be a sitemap index file, it is simply processed as one, no questions asked. This allows robots.txt to reference a mixture of sitemap and sitemap index files.
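A rough sketch of how that detection can work: a sitemap index uses a sitemapindex root element instead of urlset, so checking the root element is enough to decide how to treat the file. The regex-based check and function name here are illustrative simplifications of real XML parsing:

```typescript
// Illustrative only; a real implementation would use a proper XML parser.
function extractSitemapUrls(xml: string): { type: "sitemap" | "sitemapindex"; urls: string[] } {
  // A sitemap index uses a <sitemapindex> root instead of <urlset>.
  const isIndex = /<sitemapindex[\s>]/i.test(xml);
  const urls = Array.from(xml.matchAll(/<loc>\s*([^<]+?)\s*<\/loc>/gi), (m) => m[1]);
  // For an index, each <loc> is itself another sitemap to fetch and parse,
  // rather than a page URL to crawl.
  return { type: isIndex ? "sitemapindex" : "sitemap", urls };
}
```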