
crwlr.software logo

# Robots Exclusion Standard/Protocol Parser for Web Crawling/Scraping

Use this library within crawler/scraper programs to parse robots.txt files and check whether your crawler's user agent is allowed to load certain paths.
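
A minimal sketch of what such a check can look like is shown below. The class name `RobotsTxt`, the static `parse()` method, and the `isAllowed()` method are assumptions based on the package's naming; see the documentation at crwlr.software for the actual API.

```php
<?php

use Crwlr\RobotsTxt\RobotsTxt;

// Fetch the robots.txt file of the site you want to crawl
// (in a real crawler you'd likely use your own HTTP client).
$robotsTxtContent = file_get_contents('https://www.example.com/robots.txt');

// Parse the file contents (class and method names assumed, see docs).
$robotsTxt = RobotsTxt::parse($robotsTxtContent);

// Check if your crawler's user agent may load a certain path.
if ($robotsTxt->isAllowed('/some/path', 'MyCrawlerUserAgent')) {
    // Load the page.
}
```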

## Documentation

You can find the documentation at crwlr.software.

## Contributing

If you're considering contributing to this package, please read the contribution guide (CONTRIBUTING.md) first.