robotstxt: A 'robots.txt' Parser and 'Webbot'/'Spider'/'Crawler' Permissions Checker
Provides functions to download and parse 'robots.txt' files. Ultimately, the package makes it easy to check whether bots (spiders, crawlers, scrapers, ...) are allowed to access specific resources on a domain.
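For illustration, a minimal sketch of the typical workflow using the package's get_robotstxt() and paths_allowed() functions; the domain shown is only an example:

    library(robotstxt)

    # Download and print a domain's robots.txt (example domain).
    get_robotstxt(domain = "wikipedia.org")

    # Check whether a generic bot ("*") may fetch specific paths;
    # returns one logical value per path.
    paths_allowed(
      paths  = c("/", "/search"),
      domain = "wikipedia.org",
      bot    = "*"
    )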
Version:
0.7.15
Depends:
R (≥ 3.0.0)
Published:
2024-08-29
Author:
Pedro Baltazar [aut, cre],
Peter Meissner [aut],
Kun Ren [aut, cph] (Author and copyright holder of list_merge.R),
Oliver Keys [ctb] (original release code review),
Rich Fitz John [ctb] (original release code review)
Maintainer:
Pedro Baltazar <pedrobtz at gmail.com>
NeedsCompilation:
no
Linking:
Please use the canonical form https://CRAN.R-project.org/package=robotstxt to link to this page.