Metadata-Version: 2.2
Name: requests-robotstxt
Version: 0.1.0
Summary: Brings automatic support for robots.txt files in requests.
Home-page: https://github.com/ambv/requests-robotstxt
Author: Łukasz Langa
Author-email: lukasz@langa.pl
License: MIT
Platform: any
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Natural Language :: English
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 2.6
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.2
Classifier: Programming Language :: Python :: 3.3
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Requires-Dist: requests>=1.2.0
Requires-Dist: robotexclusionrulesparser
Requires-Dist: six
Dynamic: author
Dynamic: author-email
Dynamic: classifier
Dynamic: description
Dynamic: home-page
Dynamic: keywords
Dynamic: license
Dynamic: platform
Dynamic: requires-dist
Dynamic: summary

==================
requests-robotstxt
==================

.. image:: https://secure.travis-ci.org/ambv/requests-robotstxt.png
  :target: https://secure.travis-ci.org/ambv/requests-robotstxt

Currently a proof of concept, this module extends
`requests <http://pypi.python.org/pypi/requests>`_ with automatic
support for robots.txt files.

How to use
----------

Simply use ``RobotsAwareSession`` instead of the built-in ``requests.Session``.
If a resource is not allowed, a ``RobotsTxtDisallowed`` exception is raised.
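
For example (a minimal sketch; the ``requests_robotstxt`` import path is
an assumption based on the package name)::

  from requests_robotstxt import RobotsAwareSession, RobotsTxtDisallowed

  session = RobotsAwareSession()
  try:
      # The session consults the target host's robots.txt before issuing
      # the request and raises if the URL is disallowed.
      response = session.get('http://example.com/some/page')
      print(response.status_code)
  except RobotsTxtDisallowed:
      print('Blocked by robots.txt')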

How do I run the tests?
-----------------------

The easiest way is to extract the source tarball and run::

  $ python test/test_robotstxt.py

Change Log
----------

0.1.0
~~~~~

* Initial published version.

Authors
-------

Glued together by `Łukasz Langa <mailto:lukasz@langa.pl>`_.
