Sunama/simply-robotstxt
Forked from colinramsay/simply-robotstxt.
robots.txt is an extremely simple format, but it still needs some parser loving. This class will normalise robots.txt entries for you.

USAGE
-----

With the following robots.txt:

    User-agent: *
    Disallow: /logs

    User-agent: Google
    Disallow: /admin

    Sitemap: http://something.com/sitemap.xml

Use it like this:

    require 'robotstxtparser'

    # Also accepts a local file
    rp = RobotsTxtParser.new()
    rp.read("http://something.com/robots.txt")

    rp.user_agent('Google')  # returns ["/logs", "/admin"]
    rp.user_agent('Autobot') # returns ["/logs"]
    rp.sitemaps              # returns ["http://something.com/sitemap.xml"]
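The behaviour above, where the wildcard (`User-agent: *`) rules are merged into every specific agent's list, can be sketched roughly as below. This is an illustration of the normalisation idea, not the gem's actual source; the class name `TinyRobotsTxt` and its internals are invented for the example.

```ruby
# A minimal sketch of robots.txt parsing with wildcard-rule merging.
# Hypothetical illustration only; not the gem's real implementation.
class TinyRobotsTxt
  attr_reader :sitemaps

  def initialize(text)
    @rules = Hash.new { |h, k| h[k] = [] } # agent name => disallowed paths
    @sitemaps = []
    agent = nil
    text.each_line do |line|
      # Split on the first ':' only, so sitemap URLs stay intact.
      field, _, value = line.strip.partition(":")
      value = value.strip
      case field.downcase
      when "user-agent" then agent = value
      when "disallow"   then @rules[agent] << value unless value.empty?
      when "sitemap"    then @sitemaps << value
      end
    end
  end

  # Rules for '*' apply to every agent, so merge them into the result.
  def user_agent(name)
    @rules["*"] + (name == "*" ? [] : @rules[name])
  end
end
```

With the robots.txt sample from the usage section, `user_agent("Google")` would yield `["/logs", "/admin"]` and `user_agent("Autobot")` just `["/logs"]`, matching the README's example.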
About
-----

Parse a robots.txt file in Ruby.
Languages
---------

- Ruby 100.0%