# Roboto: Parse and use robots.txt files


Roboto provides a type-safe way to parse and use robots.txt files. It is based on the Robots Exclusion Protocol, which is used to control (approximately, since compliance is voluntary) the behavior of web crawlers and other web robots.

## Installation

Add this to your Cargo.toml:

```toml
[dependencies]
roboto = "0.1"
```
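Alternatively, with a recent Rust toolchain you can run `cargo add roboto` from your project root to add the dependency for you.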

## Usage

```rust
use roboto::Robots;

// Parse a robots.txt document into a typed `Robots` value.
let robots = r#"
User-agent: *
Disallow: /private
Disallow: /tmp
"#.parse::<Robots>().unwrap();

// User agents are parsed into their own type as well.
let user_agent = "googlebot".parse().unwrap();

// "/public" is not covered by any Disallow rule, so it is allowed.
assert_eq!(robots.is_allowed(&user_agent, "/public"), true);
```
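The natural counterpart is a path that a `Disallow` rule does match. The sketch below assumes, per the Robots Exclusion Protocol's prefix-matching rules, that `is_allowed` reports `false` for a path covered by a matching `Disallow` directive; it reuses only the `Robots` API shown above.

```rust
use roboto::Robots;

let robots = r#"
User-agent: *
Disallow: /private
"#.parse::<Robots>().unwrap();

let user_agent = "googlebot".parse().unwrap();

// Assumption: a path under a matching Disallow rule is reported as not allowed.
assert_eq!(robots.is_allowed(&user_agent, "/private"), false);
```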