robotstxt

A 'robots.txt' Parser and 'Webbot'/'Spider'/'Crawler' Permissions Checker

Provides functions to download and parse 'robots.txt' files. Ultimately the package makes it easy to check whether bots (spiders, crawlers, scrapers, ...) are allowed to access specific resources on a domain.
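The permission check the description refers to boils down to a single function call. A minimal sketch, assuming the package's exported paths_allowed() helper and a live internet connection (the domain is chosen purely for illustration):

library(robotstxt)

# May a generic bot ("*") fetch these paths on the given domain?
# Returns one logical value per path.
paths_allowed(
  paths  = c("/", "/wiki/"),
  domain = "wikipedia.org",
  bot    = "*"
)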

Downloads

Total: 6,563
Last month: 322
Last week: 57
Average per day: 11

[Plots: daily downloads, total downloads]
Description file content

Package: robotstxt
Date: 2018-02-10
Type: Package
Title: A 'robots.txt' Parser and 'Webbot'/'Spider'/'Crawler' Permissions Checker
Version: 0.6.0
Description: Provides functions to download and parse 'robots.txt' files.
    Ultimately the package makes it easy to check whether bots (spiders,
    crawlers, scrapers, ...) are allowed to access specific resources on a
    domain.
License: MIT + file LICENSE
LazyData: TRUE
BugReports: https://github.com/ropenscilabs/robotstxt/issues
URL: https://github.com/ropenscilabs/robotstxt
Imports: stringr (>= 1.0.0), httr (>= 1.0.0), spiderbar (>= 0.2.0),
    future (>= 1.6.2), magrittr, utils
Suggests: knitr, rmarkdown, dplyr, testthat, covr
Depends: R (>= 3.0.0)
VignetteBuilder: knitr
RoxygenNote: 6.0.1
NeedsCompilation: no
Packaged: 2018-02-11 12:41:55 UTC; peter
Author: Peter Meissner [aut, cre], Oliver Keys [ctb], Rich Fitz John [ctb]
Maintainer: Peter Meissner
Repository: CRAN
Date/Publication: 2018-02-11 14:19:44 UTC

Installation: install.packages('robotstxt')

Latest version: 0.6.0 (published 2018-02-11)
GitHub: https://github.com/ropenscilabs/robotstxt
Maintainer: Peter Meissner
License: MIT + file LICENSE
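After installation, a typical session might look like the sketch below, using the package's robotstxt() parser object and get_robotstxt() downloader (the domain is again only an example):

library(robotstxt)

# Download the raw robots.txt text of a domain
txt <- get_robotstxt("github.com")

# Parse the file into an object bundling the rules with a check method
rtxt <- robotstxt(domain = "github.com")

# Which of these paths may a generic bot access?
rtxt$check(paths = c("/", "/search"), bot = "*")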
