robotstxt

A 'robots.txt' Parser and 'Webbot'/'Spider'/'Crawler' Permissions Checker

Provides functions to download and parse 'robots.txt' files. Ultimately the package makes it easy to check if bots (spiders, crawlers, scrapers, ...) are allowed to access specific resources on a domain.
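For example, checking whether particular resources may be fetched can be done with the package's paths_allowed() function; the domain and paths below are purely illustrative, and the result depends on the live robots.txt at the time of the call:

library(robotstxt)

# is the generic bot "*" allowed to fetch these paths on wikipedia.org?
paths_allowed(
  paths  = c("images", "api/"),
  domain = "wikipedia.org",
  bot    = "*"
)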

Downloads

Total: 5,643
Last month: 425
Last week: 131
Average per day: 14


Description file content

Package: robotstxt
Date: 2018-02-10
Type: Package
Title: A 'robots.txt' Parser and 'Webbot'/'Spider'/'Crawler' Permissions Checker
Version: 0.6.0
Description: Provides functions to download and parse 'robots.txt' files. Ultimately the package makes it easy to check if bots (spiders, crawlers, scrapers, ...) are allowed to access specific resources on a domain.
License: MIT + file LICENSE
LazyData: TRUE
BugReports: https://github.com/ropenscilabs/robotstxt/issues
URL: https://github.com/ropenscilabs/robotstxt
Imports: stringr (>= 1.0.0), httr (>= 1.0.0), spiderbar (>= 0.2.0), future (>= 1.6.2), magrittr, utils
Suggests: knitr, rmarkdown, dplyr, testthat, covr
Depends: R (>= 3.0.0)
VignetteBuilder: knitr
RoxygenNote: 6.0.1
NeedsCompilation: no
Packaged: 2018-02-11 12:41:55 UTC; peter
Author: Peter Meissner [aut, cre], Oliver Keys [ctb], Rich Fitz John [ctb]
Maintainer: Peter Meissner
Repository: CRAN
Date/Publication: 2018-02-11 14:19:44 UTC

install.packages('robotstxt')
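Once installed, a typical session downloads and parses a domain's robots.txt once and then queries the parsed rules repeatedly. A minimal sketch using the package's robotstxt() object (domain and paths are again illustrative):

library(robotstxt)

# download and parse the domain's robots.txt once ...
rtxt <- robotstxt(domain = "wikipedia.org")

# ... then check any number of paths against the parsed rules
rtxt$check(paths = c("images", "api/"), bot = "*")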
