[Python-modules-commits] [requests] 02/08: Import requests_2.9.1.orig.tar.gz

Daniele Tricoli eriol-guest at moszumanska.debian.org
Sun Dec 27 10:09:50 UTC 2015


This is an automated email from the git hooks/post-receive script.

eriol-guest pushed a commit to branch master
in repository requests.

commit 8565e660f5f59be865744d0a65a829f61dd16b00
Author: Daniele Tricoli <eriol at mornie.org>
Date:   Sun Dec 27 10:59:26 2015 +0100

    Import requests_2.9.1.orig.tar.gz
---
 HISTORY.rst                                    |  54 ++++-
 PKG-INFO                                       |  58 ++++-
 requests.egg-info/PKG-INFO                     |  58 ++++-
 requests.egg-info/requires.txt                 |   2 +-
 requests/__init__.py                           |  12 +-
 requests/adapters.py                           |  20 +-
 requests/api.py                                |  14 +-
 requests/auth.py                               |   2 +-
 requests/cookies.py                            |  19 +-
 requests/exceptions.py                         |  15 ++
 requests/models.py                             |   9 +-
 requests/packages/urllib3/__init__.py          |  32 ++-
 requests/packages/urllib3/_collections.py      |   7 +-
 requests/packages/urllib3/connection.py        |  39 ++--
 requests/packages/urllib3/connectionpool.py    |  34 +--
 requests/packages/urllib3/contrib/appengine.py |  23 +-
 requests/packages/urllib3/contrib/ntlmpool.py  |   1 +
 requests/packages/urllib3/contrib/pyopenssl.py |  35 +--
 requests/packages/urllib3/exceptions.py        |  14 +-
 requests/packages/urllib3/fields.py            |   1 +
 requests/packages/urllib3/filepost.py          |   1 +
 requests/packages/urllib3/packages/__init__.py |   1 +
 requests/packages/urllib3/poolmanager.py       |   3 +-
 requests/packages/urllib3/request.py           |   4 +-
 requests/packages/urllib3/response.py          |  53 ++++-
 requests/packages/urllib3/util/__init__.py     |  20 ++
 requests/packages/urllib3/util/connection.py   |   1 +
 requests/packages/urllib3/util/request.py      |   1 +
 requests/packages/urllib3/util/response.py     |   3 +-
 requests/packages/urllib3/util/retry.py        |   7 +-
 requests/packages/urllib3/util/ssl_.py         |  39 +++-
 requests/packages/urllib3/util/timeout.py      |   8 +-
 requests/packages/urllib3/util/url.py          |   5 +-
 requests/sessions.py                           |  12 +-
 requests/status_codes.py                       |   1 +
 requests/utils.py                              |  47 +++-
 requirements.txt                               |   2 +
 setup.py                                       |   6 +-
 test_requests.py                               | 295 +++++++++++++------------
 39 files changed, 667 insertions(+), 291 deletions(-)

diff --git a/HISTORY.rst b/HISTORY.rst
index 02593a3..f8c1a54 100644
--- a/HISTORY.rst
+++ b/HISTORY.rst
@@ -3,6 +3,54 @@
 Release History
 ---------------
 
+2.9.1 (2015-12-21)
+++++++++++++++++++
+
+**Bugfixes**
+
+- Resolve regression introduced in 2.9.0 that made it impossible to send binary
+  strings as bodies in Python 3.
+- Fixed errors when calculating cookie expiration dates in certain locales.
+
+**Miscellaneous**
+
+- Updated bundled urllib3 to 1.13.1.
+
+2.9.0 (2015-12-15)
+++++++++++++++++++
+
+**Minor Improvements** (Backwards compatible)
+
+- The ``verify`` keyword argument now supports being passed a path to a
+  directory of CA certificates, not just a single-file bundle.
+- Warnings are now emitted when sending files opened in text mode.
+- Added the 511 Network Authentication Required status code to the status code
+  registry.
+
+**Bugfixes**
+
+- For file-like objects that are not seeked to the very beginning, we now
+  send the content length for the number of bytes we will actually read, rather
+  than the total size of the file, allowing partial file uploads.
+- When uploading file-like objects, if they are empty or have no obvious
+  content length we set ``Transfer-Encoding: chunked`` rather than
+  ``Content-Length: 0``.
+- We correctly receive the response in buffered mode when uploading chunked
+  bodies.
+- We now handle being passed a query string as a bytestring on Python 3, by
+  decoding it as UTF-8.
+- Sessions are now closed in all cases (exceptional and not) when using the
+  functional API rather than leaking and waiting for the garbage collector to
+  clean them up.
+- Correctly handle digest auth headers with a malformed ``qop`` directive that
+  contains no token, by treating it the same as if no ``qop`` directive was
+  provided at all.
+- Minor performance improvements when removing specific cookies by name.
+
+**Miscellaneous**
+
+- Updated urllib3 to 1.13.
+
 2.8.1 (2015-10-13)
 ++++++++++++++++++
 
@@ -42,7 +90,7 @@ Release History
 - The ``json`` parameter to ``post()`` and friends will now only be used if
   neither ``data`` nor ``files`` are present, consistent with the
   documentation.
-- We now ignore empty fields in the ``NO_PROXY`` enviroment variable.
+- We now ignore empty fields in the ``NO_PROXY`` environment variable.
 - Fixed problem where ``httplib.BadStatusLine`` would get raised if combining
   ``stream=True`` with ``contextlib.closing``.
 - Prevented bugs where we would attempt to return the same connection back to
@@ -525,7 +573,7 @@ This is not a backwards compatible change.
 - Improved mime-compatible JSON handling
 - Proxy fixes
 - Path hack fixes
-- Case-Insensistive Content-Encoding headers
+- Case-Insensitive Content-Encoding headers
 - Support for CJK parameters in form posts
 
 
@@ -612,7 +660,7 @@ This is not a backwards compatible change.
 +++++++++++++++++++
 
 - Removal of Requests.async in favor of `grequests <https://github.com/kennethreitz/grequests>`_
-- Allow disabling of cookie persistiance.
+- Allow disabling of cookie persistence.
 - New implementation of safe_mode
 - cookies.get now supports default argument
 - Session cookies not saved when Session.request is called with return_response=False
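The 2.9.0 changelog above notes that ``verify`` may now be a directory of CA certificates, not just a single-file bundle. The dispatch (implemented in ``requests/adapters.py`` further down in this diff) keys on ``os.path.isdir``. A minimal standalone sketch of that logic, using a plain dict as a stand-in for the real connection object:

```python
import os

def apply_verify(conn, cert_loc):
    # Mirrors the new HTTPAdapter.cert_verify logic: a directory of CA
    # certificates is passed via ca_cert_dir, a bundle file via ca_certs.
    conn['cert_reqs'] = 'CERT_REQUIRED'
    if os.path.isdir(cert_loc):
        conn['ca_cert_dir'] = cert_loc
        conn['ca_certs'] = None
    else:
        conn['ca_certs'] = cert_loc
        conn['ca_cert_dir'] = None
    return conn
```

In the real adapter the same decision feeds urllib3, whose ``ca_cert_dir`` expects an OpenSSL-style hashed certificate directory.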
diff --git a/PKG-INFO b/PKG-INFO
index 6522e96..d75ebac 100644
--- a/PKG-INFO
+++ b/PKG-INFO
@@ -1,6 +1,6 @@
 Metadata-Version: 1.1
 Name: requests
-Version: 2.8.1
+Version: 2.9.1
 Summary: Python HTTP for Humans.
 Home-page: http://python-requests.org
 Author: Kenneth Reitz
@@ -99,6 +99,54 @@ Description: Requests: HTTP for Humans
         Release History
         ---------------
         
+        2.9.1 (2015-12-21)
+        ++++++++++++++++++
+        
+        **Bugfixes**
+        
+        - Resolve regression introduced in 2.9.0 that made it impossible to send binary
+          strings as bodies in Python 3.
+        - Fixed errors when calculating cookie expiration dates in certain locales.
+        
+        **Miscellaneous**
+        
+        - Updated bundled urllib3 to 1.13.1.
+        
+        2.9.0 (2015-12-15)
+        ++++++++++++++++++
+        
+        **Minor Improvements** (Backwards compatible)
+        
+        - The ``verify`` keyword argument now supports being passed a path to a
+          directory of CA certificates, not just a single-file bundle.
+        - Warnings are now emitted when sending files opened in text mode.
+        - Added the 511 Network Authentication Required status code to the status code
+          registry.
+        
+        **Bugfixes**
+        
+        - For file-like objects that are not seeked to the very beginning, we now
+          send the content length for the number of bytes we will actually read, rather
+          than the total size of the file, allowing partial file uploads.
+        - When uploading file-like objects, if they are empty or have no obvious
+          content length we set ``Transfer-Encoding: chunked`` rather than
+          ``Content-Length: 0``.
+        - We correctly receive the response in buffered mode when uploading chunked
+          bodies.
+        - We now handle being passed a query string as a bytestring on Python 3, by
+          decoding it as UTF-8.
+        - Sessions are now closed in all cases (exceptional and not) when using the
+          functional API rather than leaking and waiting for the garbage collector to
+          clean them up.
+        - Correctly handle digest auth headers with a malformed ``qop`` directive that
+          contains no token, by treating it the same as if no ``qop`` directive was
+          provided at all.
+        - Minor performance improvements when removing specific cookies by name.
+        
+        **Miscellaneous**
+        
+        - Updated urllib3 to 1.13.
+        
         2.8.1 (2015-10-13)
         ++++++++++++++++++
         
@@ -138,7 +186,7 @@ Description: Requests: HTTP for Humans
         - The ``json`` parameter to ``post()`` and friends will now only be used if
           neither ``data`` nor ``files`` are present, consistent with the
           documentation.
-        - We now ignore empty fields in the ``NO_PROXY`` enviroment variable.
+        - We now ignore empty fields in the ``NO_PROXY`` environment variable.
         - Fixed problem where ``httplib.BadStatusLine`` would get raised if combining
           ``stream=True`` with ``contextlib.closing``.
         - Prevented bugs where we would attempt to return the same connection back to
@@ -621,7 +669,7 @@ Description: Requests: HTTP for Humans
         - Improved mime-compatible JSON handling
         - Proxy fixes
         - Path hack fixes
-        - Case-Insensistive Content-Encoding headers
+        - Case-Insensitive Content-Encoding headers
         - Support for CJK parameters in form posts
         
         
@@ -708,7 +756,7 @@ Description: Requests: HTTP for Humans
         +++++++++++++++++++
         
         - Removal of Requests.async in favor of `grequests <https://github.com/kennethreitz/grequests>`_
-        - Allow disabling of cookie persistiance.
+        - Allow disabling of cookie persistence.
         - New implementation of safe_mode
         - cookies.get now supports default argument
         - Session cookies not saved when Session.request is called with return_response=False
@@ -1183,8 +1231,8 @@ Classifier: Intended Audience :: Developers
 Classifier: Natural Language :: English
 Classifier: License :: OSI Approved :: Apache Software License
 Classifier: Programming Language :: Python
-Classifier: Programming Language :: Python :: 2.6
 Classifier: Programming Language :: Python :: 2.7
 Classifier: Programming Language :: Python :: 3
 Classifier: Programming Language :: Python :: 3.3
 Classifier: Programming Language :: Python :: 3.4
+Classifier: Programming Language :: Python :: 3.5
diff --git a/requests.egg-info/PKG-INFO b/requests.egg-info/PKG-INFO
index 6522e96..d75ebac 100644
--- a/requests.egg-info/PKG-INFO
+++ b/requests.egg-info/PKG-INFO
@@ -1,6 +1,6 @@
 Metadata-Version: 1.1
 Name: requests
-Version: 2.8.1
+Version: 2.9.1
 Summary: Python HTTP for Humans.
 Home-page: http://python-requests.org
 Author: Kenneth Reitz
@@ -99,6 +99,54 @@ Description: Requests: HTTP for Humans
         Release History
         ---------------
         
+        2.9.1 (2015-12-21)
+        ++++++++++++++++++
+        
+        **Bugfixes**
+        
+        - Resolve regression introduced in 2.9.0 that made it impossible to send binary
+          strings as bodies in Python 3.
+        - Fixed errors when calculating cookie expiration dates in certain locales.
+        
+        **Miscellaneous**
+        
+        - Updated bundled urllib3 to 1.13.1.
+        
+        2.9.0 (2015-12-15)
+        ++++++++++++++++++
+        
+        **Minor Improvements** (Backwards compatible)
+        
+        - The ``verify`` keyword argument now supports being passed a path to a
+          directory of CA certificates, not just a single-file bundle.
+        - Warnings are now emitted when sending files opened in text mode.
+        - Added the 511 Network Authentication Required status code to the status code
+          registry.
+        
+        **Bugfixes**
+        
+        - For file-like objects that are not seeked to the very beginning, we now
+          send the content length for the number of bytes we will actually read, rather
+          than the total size of the file, allowing partial file uploads.
+        - When uploading file-like objects, if they are empty or have no obvious
+          content length we set ``Transfer-Encoding: chunked`` rather than
+          ``Content-Length: 0``.
+        - We correctly receive the response in buffered mode when uploading chunked
+          bodies.
+        - We now handle being passed a query string as a bytestring on Python 3, by
+          decoding it as UTF-8.
+        - Sessions are now closed in all cases (exceptional and not) when using the
+          functional API rather than leaking and waiting for the garbage collector to
+          clean them up.
+        - Correctly handle digest auth headers with a malformed ``qop`` directive that
+          contains no token, by treating it the same as if no ``qop`` directive was
+          provided at all.
+        - Minor performance improvements when removing specific cookies by name.
+        
+        **Miscellaneous**
+        
+        - Updated urllib3 to 1.13.
+        
         2.8.1 (2015-10-13)
         ++++++++++++++++++
         
@@ -138,7 +186,7 @@ Description: Requests: HTTP for Humans
         - The ``json`` parameter to ``post()`` and friends will now only be used if
           neither ``data`` nor ``files`` are present, consistent with the
           documentation.
-        - We now ignore empty fields in the ``NO_PROXY`` enviroment variable.
+        - We now ignore empty fields in the ``NO_PROXY`` environment variable.
         - Fixed problem where ``httplib.BadStatusLine`` would get raised if combining
           ``stream=True`` with ``contextlib.closing``.
         - Prevented bugs where we would attempt to return the same connection back to
@@ -621,7 +669,7 @@ Description: Requests: HTTP for Humans
         - Improved mime-compatible JSON handling
         - Proxy fixes
         - Path hack fixes
-        - Case-Insensistive Content-Encoding headers
+        - Case-Insensitive Content-Encoding headers
         - Support for CJK parameters in form posts
         
         
@@ -708,7 +756,7 @@ Description: Requests: HTTP for Humans
         +++++++++++++++++++
         
         - Removal of Requests.async in favor of `grequests <https://github.com/kennethreitz/grequests>`_
-        - Allow disabling of cookie persistiance.
+        - Allow disabling of cookie persistence.
         - New implementation of safe_mode
         - cookies.get now supports default argument
         - Session cookies not saved when Session.request is called with return_response=False
@@ -1183,8 +1231,8 @@ Classifier: Intended Audience :: Developers
 Classifier: Natural Language :: English
 Classifier: License :: OSI Approved :: Apache Software License
 Classifier: Programming Language :: Python
-Classifier: Programming Language :: Python :: 2.6
 Classifier: Programming Language :: Python :: 2.7
 Classifier: Programming Language :: Python :: 3
 Classifier: Programming Language :: Python :: 3.3
 Classifier: Programming Language :: Python :: 3.4
+Classifier: Programming Language :: Python :: 3.5
diff --git a/requests.egg-info/requires.txt b/requests.egg-info/requires.txt
index 99755a4..34ddab5 100644
--- a/requests.egg-info/requires.txt
+++ b/requests.egg-info/requires.txt
@@ -1,5 +1,5 @@
 
 [security]
-pyOpenSSL
+pyOpenSSL>=0.13
 ndg-httpsclient
 pyasn1
diff --git a/requests/__init__.py b/requests/__init__.py
index 3d8188a..bd5b5b9 100644
--- a/requests/__init__.py
+++ b/requests/__init__.py
@@ -42,8 +42,8 @@ is at <http://python-requests.org>.
 """
 
 __title__ = 'requests'
-__version__ = '2.8.1'
-__build__ = 0x020801
+__version__ = '2.9.1'
+__build__ = 0x020901
 __author__ = 'Kenneth Reitz'
 __license__ = 'Apache 2.0'
 __copyright__ = 'Copyright 2015 Kenneth Reitz'
@@ -62,7 +62,8 @@ from .sessions import session, Session
 from .status_codes import codes
 from .exceptions import (
     RequestException, Timeout, URLRequired,
-    TooManyRedirects, HTTPError, ConnectionError
+    TooManyRedirects, HTTPError, ConnectionError,
+    FileModeWarning,
 )
 
 # Set default logging handler to avoid "No handler found" warnings.
@@ -75,3 +76,8 @@ except ImportError:
             pass
 
 logging.getLogger(__name__).addHandler(NullHandler())
+
+import warnings
+
+# FileModeWarnings go off per the default.
+warnings.simplefilter('default', FileModeWarning, append=True)
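The filter registered above uses the ``'default'`` action with ``append=True``: the rule lands at the end of the warnings filter list (so filters the application installed earlier still win), and each warning is shown only once per call site. A standalone sketch of that behaviour with a generic warning class (``DemoWarning`` is illustrative, not part of Requests):

```python
import warnings

class DemoWarning(UserWarning):
    # Illustrative stand-in; Requests registers FileModeWarning this way.
    pass

# append=True adds the rule at the END of the filter list, so any
# filters installed earlier by the application take precedence.
warnings.simplefilter('default', DemoWarning, append=True)

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter('default', DemoWarning)
    for _ in range(3):
        warnings.warn('demo', DemoWarning)  # same call site each time
```

With the ``'default'`` action only the first occurrence from a given call site is recorded, so ``caught`` holds a single entry despite three ``warn()`` calls.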
diff --git a/requests/adapters.py b/requests/adapters.py
index 7682db0..6266d5b 100644
--- a/requests/adapters.py
+++ b/requests/adapters.py
@@ -8,6 +8,7 @@ This module contains the transport adapters that Requests uses to define
 and maintain connections.
 """
 
+import os.path
 import socket
 
 from .models import Response
@@ -107,7 +108,7 @@ class HTTPAdapter(BaseAdapter):
 
     def __setstate__(self, state):
         # Can't handle by adding 'proxy_manager' to self.__attrs__ because
-        # because self.poolmanager uses a lambda function, which isn't pickleable.
+        # self.poolmanager uses a lambda function, which isn't pickleable.
         self.proxy_manager = {}
         self.config = {}
 
@@ -185,10 +186,15 @@ class HTTPAdapter(BaseAdapter):
                 raise Exception("Could not find a suitable SSL CA certificate bundle.")
 
             conn.cert_reqs = 'CERT_REQUIRED'
-            conn.ca_certs = cert_loc
+
+            if not os.path.isdir(cert_loc):
+                conn.ca_certs = cert_loc
+            else:
+                conn.ca_cert_dir = cert_loc
         else:
             conn.cert_reqs = 'CERT_NONE'
             conn.ca_certs = None
+            conn.ca_cert_dir = None
 
         if cert:
             if not isinstance(cert, basestring):
@@ -394,7 +400,15 @@ class HTTPAdapter(BaseAdapter):
                         low_conn.send(b'\r\n')
                     low_conn.send(b'0\r\n\r\n')
 
-                    r = low_conn.getresponse()
+                    # Receive the response from the server
+                    try:
+                        # For Python 2.7+ versions, use buffering of HTTP
+                        # responses
+                        r = low_conn.getresponse(buffering=True)
+                    except TypeError:
+                        # For compatibility with Python 2.6 versions and back
+                        r = low_conn.getresponse()
+
                     resp = HTTPResponse.from_httplib(
                         r,
                         pool=conn,
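The ``try``/``except TypeError`` around ``getresponse(buffering=True)`` above is a feature-detection idiom: call with the newer keyword first, and fall back when the running interpreter's ``httplib`` does not accept it. The same pattern in isolation (``new_api`` and ``old_api`` are illustrative names, not real httplib functions):

```python
def call_with_fallback(func):
    # Try the newer calling convention first; an implementation that
    # lacks the keyword raises TypeError, and we retry the old way.
    try:
        return func(buffering=True)
    except TypeError:
        return func()

def old_api():
    # Accepts no keyword arguments, like getresponse() on Python 2.6.
    return 'unbuffered'

def new_api(buffering=False):
    return 'buffered' if buffering else 'unbuffered'
```

This avoids version sniffing entirely: the capability is probed at the call site rather than inferred from ``sys.version_info``.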
diff --git a/requests/api.py b/requests/api.py
index 72a777b..b21a1a4 100644
--- a/requests/api.py
+++ b/requests/api.py
@@ -33,7 +33,7 @@ def request(method, url, **kwargs):
     :param allow_redirects: (optional) Boolean. Set to True if POST/PUT/DELETE redirect following is allowed.
     :type allow_redirects: bool
     :param proxies: (optional) Dictionary mapping protocol to the URL of the proxy.
-    :param verify: (optional) if ``True``, the SSL cert will be verified. A CA_BUNDLE path can also be provided.
+    :param verify: (optional) whether the SSL cert will be verified. A CA_BUNDLE path can also be provided. Defaults to ``True``.
     :param stream: (optional) if ``False``, the response content will be immediately downloaded.
     :param cert: (optional) if String, path to ssl client cert file (.pem). If Tuple, ('cert', 'key') pair.
     :return: :class:`Response <Response>` object
@@ -46,13 +46,11 @@ def request(method, url, **kwargs):
       <Response [200]>
     """
 
-    session = sessions.Session()
-    response = session.request(method=method, url=url, **kwargs)
-    # By explicitly closing the session, we avoid leaving sockets open which
-    # can trigger a ResourceWarning in some cases, and look like a memory leak
-    # in others.
-    session.close()
-    return response
+    # By using the 'with' statement we are sure the session is closed, thus we
+    # avoid leaving sockets open which can trigger a ResourceWarning in some
+    # cases, and look like a memory leak in others.
+    with sessions.Session() as session:
+        return session.request(method=method, url=url, **kwargs)
 
 
 def get(url, params=None, **kwargs):
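The rewrite above replaces an explicit ``session.close()`` with a ``with`` block so the session is closed on every exit path, including exceptions. A toy stand-in for ``requests.sessions.Session`` makes the guarantee easy to see without any network traffic:

```python
class Session(object):
    # Toy stand-in for requests.sessions.Session: just enough to show
    # that the 'with' block closes the session even when a request fails.
    def __init__(self):
        self.closed = False

    def __enter__(self):
        return self

    def __exit__(self, *exc_info):
        self.close()

    def close(self):
        self.closed = True

    def request(self, fail=False):
        if fail:
            raise RuntimeError('connection error')
        return 'response'

def functional_request(fail=False):
    # Shape of the new requests.api.request: the session cannot leak.
    with Session() as session:
        return session.request(fail=fail)
```

The pre-2.9.0 code only closed the session after a successful return, so an exception left the socket for the garbage collector, producing ``ResourceWarning`` noise and apparent memory leaks.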
diff --git a/requests/auth.py b/requests/auth.py
index 8c4e847..2af55fb 100644
--- a/requests/auth.py
+++ b/requests/auth.py
@@ -136,7 +136,7 @@ class HTTPDigestAuth(AuthBase):
         if _algorithm == 'MD5-SESS':
             HA1 = hash_utf8('%s:%s:%s' % (HA1, nonce, cnonce))
 
-        if qop is None:
+        if not qop:
             respdig = KD(HA1, "%s:%s" % (nonce, HA2))
         elif qop == 'auth' or 'auth' in qop.split(','):
             noncebit = "%s:%s:%s:%s:%s" % (
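The one-character change above (``if not qop`` instead of ``if qop is None``) is the digest-auth fix from the 2.9.0 changelog: a malformed ``qop`` directive with no token arrives as an empty string, which the old identity check missed. A minimal sketch of the predicate:

```python
def qop_is_absent(qop):
    # 'not qop' is true for both None (no qop directive at all) and ''
    # (a malformed directive with no token); 'qop is None' only caught
    # the first case.
    return not qop
```

Both cases now take the simpler RFC 2069-style response-digest branch.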
diff --git a/requests/cookies.py b/requests/cookies.py
index 88b478c..b85fd2b 100644
--- a/requests/cookies.py
+++ b/requests/cookies.py
@@ -8,6 +8,7 @@ requests.utils imports from here, so be careful with imports.
 
 import copy
 import time
+import calendar
 import collections
 from .compat import cookielib, urlparse, urlunparse, Morsel
 
@@ -143,10 +144,13 @@ def remove_cookie_by_name(cookiejar, name, domain=None, path=None):
     """
     clearables = []
     for cookie in cookiejar:
-        if cookie.name == name:
-            if domain is None or domain == cookie.domain:
-                if path is None or path == cookie.path:
-                    clearables.append((cookie.domain, cookie.path, cookie.name))
+        if cookie.name != name:
+            continue
+        if domain is not None and domain != cookie.domain:
+            continue
+        if path is not None and path != cookie.path:
+            continue
+        clearables.append((cookie.domain, cookie.path, cookie.name))
 
     for domain, path, name in clearables:
         cookiejar.clear(domain, path, name)
@@ -365,7 +369,7 @@ def _copy_cookie_jar(jar):
         return None
 
     if hasattr(jar, 'copy'):
-        # We're dealing with an instane of RequestsCookieJar
+        # We're dealing with an instance of RequestsCookieJar
         return jar.copy()
     # We're dealing with a generic CookieJar instance
     new_jar = copy.copy(jar)
@@ -421,8 +425,9 @@ def morsel_to_cookie(morsel):
             raise TypeError('max-age: %s must be integer' % morsel['max-age'])
     elif morsel['expires']:
         time_template = '%a, %d-%b-%Y %H:%M:%S GMT'
-        expires = int(time.mktime(
-            time.strptime(morsel['expires'], time_template)) - time.timezone)
+        expires = calendar.timegm(
+            time.strptime(morsel['expires'], time_template)
+        )
     return create_cookie(
         comment=morsel['comment'],
         comment_url=bool(morsel['comment']),
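The ``morsel_to_cookie`` change above is the locale/timezone fix from the 2.9.1 changelog. Cookie ``expires`` strings are always GMT, so ``calendar.timegm`` (which interprets the struct as UTC) is correct, whereas the old ``time.mktime(...) - time.timezone`` interpreted it in local time and went wrong under DST and in some locales. The conversion in isolation:

```python
import calendar
import time

def expires_to_epoch(expires):
    # Cookie 'expires' values are GMT, so parse the struct_time and
    # convert with calendar.timegm, which assumes UTC regardless of the
    # process's local timezone or DST state.
    time_template = '%a, %d-%b-%Y %H:%M:%S GMT'
    return calendar.timegm(time.strptime(expires, time_template))
```

The result is independent of ``TZ``, so the same expires string yields the same epoch on every machine.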
diff --git a/requests/exceptions.py b/requests/exceptions.py
index 89135a8..ba0b910 100644
--- a/requests/exceptions.py
+++ b/requests/exceptions.py
@@ -97,3 +97,18 @@ class StreamConsumedError(RequestException, TypeError):
 
 class RetryError(RequestException):
     """Custom retries logic failed"""
+
+
+# Warnings
+
+
+class RequestsWarning(Warning):
+    """Base warning for Requests."""
+    pass
+
+
+class FileModeWarning(RequestsWarning, DeprecationWarning):
+    """
+    A file was opened in text mode, but Requests determined its binary length.
+    """
+    pass
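``FileModeWarning`` above deliberately inherits from both the new ``RequestsWarning`` base and the built-in ``DeprecationWarning``, so users can filter it either as a Requests-specific warning or with whatever ``DeprecationWarning`` handling they already have. A standalone sketch of the hierarchy:

```python
class RequestsWarning(Warning):
    """Base warning for Requests (stand-in for requests.exceptions)."""

class FileModeWarning(RequestsWarning, DeprecationWarning):
    """A file was opened in text mode, but its binary length was used."""
```

Either base class works as the ``category`` argument to ``warnings.filterwarnings``, since filter matching is done with ``issubclass``.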
diff --git a/requests/models.py b/requests/models.py
index 2727bee..4bcbc54 100644
--- a/requests/models.py
+++ b/requests/models.py
@@ -324,7 +324,7 @@ class PreparedRequest(RequestEncodingMixin, RequestHooksMixin):
     def prepare_url(self, url, params):
         """Prepares the given HTTP URL."""
         #: Accept objects that have string representations.
-        #: We're unable to blindy call unicode/str functions
+        #: We're unable to blindly call unicode/str functions
         #: as this will include the bytestring indicator (b'')
         #: on python 3.x.
         #: https://github.com/kennethreitz/requests/pull/2238
@@ -385,6 +385,9 @@ class PreparedRequest(RequestEncodingMixin, RequestHooksMixin):
             if isinstance(fragment, str):
                 fragment = fragment.encode('utf-8')
 
+        if isinstance(params, (str, bytes)):
+            params = to_native_string(params)
+
         enc_params = self._encode_params(params)
         if enc_params:
             if query:
@@ -434,7 +437,7 @@ class PreparedRequest(RequestEncodingMixin, RequestHooksMixin):
             if files:
                 raise NotImplementedError('Streamed bodies and files are mutually exclusive.')
 
-            if length is not None:
+            if length:
                 self.headers['Content-Length'] = builtin_str(length)
             else:
                 self.headers['Transfer-Encoding'] = 'chunked'
@@ -631,7 +634,7 @@ class Response(object):
 
     @property
     def is_permanent_redirect(self):
-        """True if this Response one of the permanant versions of redirect"""
+        """True if this Response one of the permanent versions of redirect"""
         return ('location' in self.headers and self.status_code in (codes.moved_permanently, codes.permanent_redirect))
 
     @property
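The ``if length:`` change above implements the 2.9.0 behaviour for streamed bodies: a zero or unknown length now selects chunked transfer encoding instead of sending ``Content-Length: 0``. A minimal sketch of the header decision (the dict return is illustrative; the real code mutates ``self.headers``):

```python
def body_headers(length):
    # Mirrors the change in PreparedRequest.prepare_body: only a truthy
    # length produces Content-Length; zero or None (empty or unsized
    # file-like body) falls through to chunked transfer encoding.
    if length:
        return {'Content-Length': str(length)}
    return {'Transfer-Encoding': 'chunked'}
```

This matters because some servers treat ``Content-Length: 0`` on a streamed upload as a promise of an empty body, while chunked encoding lets the actual stream decide.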
diff --git a/requests/packages/urllib3/__init__.py b/requests/packages/urllib3/__init__.py
index 86bb71d..e43991a 100644
--- a/requests/packages/urllib3/__init__.py
+++ b/requests/packages/urllib3/__init__.py
@@ -2,10 +2,8 @@
 urllib3 - Thread-safe connection pooling and re-using.
 """
 
-__author__ = 'Andrey Petrov (andrey.petrov at shazow.net)'
-__license__ = 'MIT'
-__version__ = '1.12'
-
+from __future__ import absolute_import
+import warnings
 
 from .connectionpool import (
     HTTPConnectionPool,
@@ -32,8 +30,30 @@ except ImportError:
         def emit(self, record):
             pass
 
+__author__ = 'Andrey Petrov (andrey.petrov at shazow.net)'
+__license__ = 'MIT'
+__version__ = '1.13.1'
+
+__all__ = (
+    'HTTPConnectionPool',
+    'HTTPSConnectionPool',
+    'PoolManager',
+    'ProxyManager',
+    'HTTPResponse',
+    'Retry',
+    'Timeout',
+    'add_stderr_logger',
+    'connection_from_url',
+    'disable_warnings',
+    'encode_multipart_formdata',
+    'get_host',
+    'make_headers',
+    'proxy_from_url',
+)
+
 logging.getLogger(__name__).addHandler(NullHandler())
 
+
 def add_stderr_logger(level=logging.DEBUG):
     """
     Helper for quickly adding a StreamHandler to the logger. Useful for
@@ -55,7 +75,6 @@ def add_stderr_logger(level=logging.DEBUG):
 del NullHandler
 
 
-import warnings
 # SecurityWarning's always go off by default.
 warnings.simplefilter('always', exceptions.SecurityWarning, append=True)
 # SubjectAltNameWarning's should go off once per host
@@ -63,6 +82,9 @@ warnings.simplefilter('default', exceptions.SubjectAltNameWarning)
 # InsecurePlatformWarning's don't vary between requests, so we keep it default.
 warnings.simplefilter('default', exceptions.InsecurePlatformWarning,
                       append=True)
+# SNIMissingWarnings should go off only once.
+warnings.simplefilter('default', exceptions.SNIMissingWarning)
+
 
 def disable_warnings(category=exceptions.HTTPWarning):
     """
diff --git a/requests/packages/urllib3/_collections.py b/requests/packages/urllib3/_collections.py
index b68b9a5..67f3ce9 100644
--- a/requests/packages/urllib3/_collections.py
+++ b/requests/packages/urllib3/_collections.py
@@ -1,3 +1,4 @@
+from __future__ import absolute_import
 from collections import Mapping, MutableMapping
 try:
     from threading import RLock
@@ -167,7 +168,7 @@ class HTTPHeaderDict(MutableMapping):
     def __ne__(self, other):
         return not self.__eq__(other)
 
-    if not PY3: # Python 2
+    if not PY3:  # Python 2
         iterkeys = MutableMapping.iterkeys
         itervalues = MutableMapping.itervalues
 
@@ -234,7 +235,7 @@ class HTTPHeaderDict(MutableMapping):
         """
         if len(args) > 1:
             raise TypeError("extend() takes at most 1 positional "
-                            "arguments ({} given)".format(len(args)))
+                            "arguments ({0} given)".format(len(args)))
         other = args[0] if len(args) >= 1 else ()
 
         if isinstance(other, HTTPHeaderDict):
@@ -304,7 +305,7 @@ class HTTPHeaderDict(MutableMapping):
         return list(self.iteritems())
 
     @classmethod
-    def from_httplib(cls, message): # Python 2
+    def from_httplib(cls, message):  # Python 2
         """Read headers from a Python 2 httplib message object."""
         # python2.7 does not expose a proper API for exporting multiheaders
         # efficiently. This function re-reads raw lines from the message
diff --git a/requests/packages/urllib3/connection.py b/requests/packages/urllib3/connection.py
index 3eab1e2..1e4cd41 100644
--- a/requests/packages/urllib3/connection.py
+++ b/requests/packages/urllib3/connection.py
@@ -1,4 +1,6 @@
+from __future__ import absolute_import
 import datetime
+import os
 import sys
 import socket
 from socket import error as SocketError, timeout as SocketTimeout
@@ -6,18 +8,13 @@ import warnings
 from .packages import six
 
 try:  # Python 3
-    from http.client import HTTPConnection as _HTTPConnection, HTTPException
+    from http.client import HTTPConnection as _HTTPConnection
+    from http.client import HTTPException  # noqa: unused in this module
 except ImportError:
-    from httplib import HTTPConnection as _HTTPConnection, HTTPException
-
-
-class DummyConnection(object):
-    "Used to detect a failed ConnectionCls import."
-    pass
-
+    from httplib import HTTPConnection as _HTTPConnection
+    from httplib import HTTPException  # noqa: unused in this module
 
 try:  # Compiled with SSL?
-    HTTPSConnection = DummyConnection
     import ssl
     BaseSSLError = ssl.SSLError
 except (ImportError, AttributeError):  # Platform-specific: No SSL.
@@ -61,6 +58,11 @@ port_by_scheme = {
 RECENT_DATE = datetime.date(2014, 1, 1)
 
 
+class DummyConnection(object):
+    """Used to detect a failed ConnectionCls import."""
+    pass
+
+
 class HTTPConnection(_HTTPConnection, object):
     """
     Based on httplib.HTTPConnection but provides an extra constructor
@@ -205,10 +207,10 @@ class VerifiedHTTPSConnection(HTTPSConnection):
         self.key_file = key_file
         self.cert_file = cert_file
         self.cert_reqs = cert_reqs
-        self.ca_certs = ca_certs
-        self.ca_cert_dir = ca_cert_dir
         self.assert_hostname = assert_hostname
         self.assert_fingerprint = assert_fingerprint
+        self.ca_certs = ca_certs and os.path.expanduser(ca_certs)
+        self.ca_cert_dir = ca_cert_dir and os.path.expanduser(ca_cert_dir)
 
     def connect(self):
         # Add certificate verification
@@ -263,10 +265,19 @@ class VerifiedHTTPSConnection(HTTPSConnection):
                     'for details.)'.format(hostname)),
                     SubjectAltNameWarning
                 )
-            match_hostname(cert, self.assert_hostname or hostname)
 
-        self.is_verified = (resolved_cert_reqs == ssl.CERT_REQUIRED
-                            or self.assert_fingerprint is not None)
+            # In case the hostname is an IPv6 address, strip the square
+            # brackets from it before using it to validate. This is because
+            # a certificate with an IPv6 address in it won't have square
+            # brackets around that address. Sadly, match_hostname won't do this
+            # for us: it expects the plain host part without any extra work
+            # that might have been done to make it palatable to httplib.
+            asserted_hostname = self.assert_hostname or hostname
+            asserted_hostname = asserted_hostname.strip('[]')
+            match_hostname(cert, asserted_hostname)
+
+        self.is_verified = (resolved_cert_reqs == ssl.CERT_REQUIRED or
+                            self.assert_fingerprint is not None)
 
 
 if ssl:
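The hostname-verification change above strips square brackets from IPv6 literals before calling ``match_hostname``, because certificates carry the bare address while httplib-style host strings wrap it in brackets. The normalization step in isolation:

```python
def asserted_hostname(assert_hostname, hostname):
    # Certificates contain bare IPv6 addresses; httplib-style URLs wrap
    # them in square brackets. Strip the brackets before handing the
    # name to match_hostname, which expects the plain host part.
    return (assert_hostname or hostname).strip('[]')
```

``str.strip('[]')`` removes bracket characters only from the ends of the string, so ordinary hostnames and the colons inside the address are untouched.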
diff --git a/requests/packages/urllib3/connectionpool.py b/requests/packages/urllib3/connectionpool.py
index b38ac68..995b416 100644
--- a/requests/packages/urllib3/connectionpool.py
+++ b/requests/packages/urllib3/connectionpool.py
@@ -1,3 +1,4 @@
+from __future__ import absolute_import
 import errno
 import logging
 import sys
@@ -10,7 +11,8 @@ try:  # Python 3
     from queue import LifoQueue, Empty, Full
 except ImportError:
     from Queue import LifoQueue, Empty, Full
-    import Queue as _  # Platform-specific: Windows
+    # Queue is imported for side effects on MS Windows
+    import Queue as _unused_module_Queue  # noqa: unused
 
 
 from .exceptions import (
@@ -22,7 +24,6 @@ from .exceptions import (
     LocationValueError,
     MaxRetryError,
     ProxyError,
-    ConnectTimeoutError,
     ReadTimeoutError,
     SSLError,
     TimeoutError,
@@ -35,7 +36,7 @@ from .connection import (
     port_by_scheme,
     DummyConnection,
     HTTPConnection, HTTPSConnection, VerifiedHTTPSConnection,
-    HTTPException, BaseSSLError, ConnectionError
+    HTTPException, BaseSSLError,
 )
 from .request import RequestMethods
 from .response import HTTPResponse
@@ -54,7 +55,7 @@ log = logging.getLogger(__name__)
 _Default = object()
 
 
-## Pool objects
+# Pool objects
 class ConnectionPool(object):
     """
     Base class for all connection pools, such as
@@ -68,8 +69,7 @@ class ConnectionPool(object):
         if not host:
             raise LocationValueError("No host specified.")
 
-        # httplib doesn't like it when we include brackets in ipv6 addresses
-        self.host = host.strip('[]')
+        self.host = host
         self.port = port
 
     def __str__(self):
@@ -645,22 +645,24 @@ class HTTPConnectionPool(ConnectionPool, RequestMethods):
                 return response
 
             log.info("Redirecting %s -> %s" % (url, redirect_location))
-            return self.urlopen(method, redirect_location, body, headers,
-                    retries=retries, redirect=redirect,
-                    assert_same_host=assert_same_host,
-                    timeout=timeout, pool_timeout=pool_timeout,
-                    release_conn=release_conn, **response_kw)
+            return self.urlopen(
+                method, redirect_location, body, headers,
+                retries=retries, redirect=redirect,
+                assert_same_host=assert_same_host,
+                timeout=timeout, pool_timeout=pool_timeout,
+                release_conn=release_conn, **response_kw)
 
         # Check if we should retry the HTTP response.
         if retries.is_forced_retry(method, status_code=response.status):
             retries = retries.increment(method, url, response=response, _pool=self)
             retries.sleep()
             log.info("Forced retry: %s" % url)
-            return self.urlopen(method, url, body, headers,
-                    retries=retries, redirect=redirect,
-                    assert_same_host=assert_same_host,
-                    timeout=timeout, pool_timeout=pool_timeout,
-                    release_conn=release_conn, **response_kw)
+            return self.urlopen(
+                method, url, body, headers,
+                retries=retries, redirect=redirect,
+                assert_same_host=assert_same_host,
+                timeout=timeout, pool_timeout=pool_timeout,
+                release_conn=release_conn, **response_kw)
 
         return response
 
diff --git a/requests/packages/urllib3/contrib/appengine.py b/requests/packages/urllib3/contrib/appengine.py
index ed9d8b8..884cdb2 100644
--- a/requests/packages/urllib3/contrib/appengine.py
+++ b/requests/packages/urllib3/contrib/appengine.py
@@ -1,3 +1,4 @@
+from __future__ import absolute_import
 import logging
 import os
 import warnings
@@ -60,7 +61,7 @@ class AppEngineManager(RequestMethods):
             raise AppEnginePlatformError(
                 "URLFetch is not available in this environment.")
 
-        if is_prod_appengine_v2():
+        if is_prod_appengine_mvms():
             raise AppEnginePlatformError(
                 "Use normal urllib3.PoolManager instead of AppEngineManager"
                 "on Managed VMs, as using URLFetch is not necessary in "
@@ -108,14 +109,14 @@ class AppEngineManager(RequestMethods):
             raise TimeoutError(self, e)
 
         except urlfetch.InvalidURLError as e:
-            if 'too large' in e.message:
+            if 'too large' in str(e):
                 raise AppEnginePlatformError(
                     "URLFetch request too large, URLFetch only "
                     "supports requests up to 10mb in size.", e)
             raise ProtocolError(e)
 
         except urlfetch.DownloadError as e:
-            if 'Too many redirects' in e.message:
+            if 'Too many redirects' in str(e):
                 raise MaxRetryError(self, url, reason=e)
             raise ProtocolError(e)
 
@@ -155,7 +156,7 @@ class AppEngineManager(RequestMethods):
 
     def _urlfetch_response_to_http_response(self, urlfetch_resp, **response_kw):
 
-        if is_prod_appengine_v1():
+        if is_prod_appengine():
             # Production GAE handles deflate encoding automatically, but does
             # not remove the encoding header.
             content_encoding = urlfetch_resp.headers.get('content-encoding')
@@ -176,7 +177,7 @@ class AppEngineManager(RequestMethods):
         if timeout is Timeout.DEFAULT_TIMEOUT:
             return 5  # 5s is the default timeout for URLFetch.
         if isinstance(timeout, Timeout):
-            if not timeout.read is timeout.connect:
+            if timeout.read is not timeout.connect:
                 warnings.warn(
                     "URLFetch does not support granular timeout settings, "
                     "reverting to total timeout.", AppEnginePlatformWarning)
@@ -199,12 +200,12 @@ class AppEngineManager(RequestMethods):
 
 def is_appengine():
     return (is_local_appengine() or
-            is_prod_appengine_v1() or
-            is_prod_appengine_v2())
+            is_prod_appengine() or
+            is_prod_appengine_mvms())
 
 
 def is_appengine_sandbox():
-    return is_appengine() and not is_prod_appengine_v2()
+    return is_appengine() and not is_prod_appengine_mvms()
 
 
 def is_local_appengine():
@@ -212,11 +213,11 @@ def is_local_appengine():
             'Development/' in os.environ['SERVER_SOFTWARE'])
 
 
-def is_prod_appengine_v1():
+def is_prod_appengine():
     return ('APPENGINE_RUNTIME' in os.environ and
             'Google App Engine/' in os.environ['SERVER_SOFTWARE'] and
-            not is_prod_appengine_v2())
+            not is_prod_appengine_mvms())
 
 
-def is_prod_appengine_v2():
+def is_prod_appengine_mvms():
     return os.environ.get('GAE_VM', False) == 'true'
diff --git a/requests/packages/urllib3/contrib/ntlmpool.py b/requests/packages/urllib3/contrib/ntlmpool.py
index c6b266f..c136a23 100644
--- a/requests/packages/urllib3/contrib/ntlmpool.py
+++ b/requests/packages/urllib3/contrib/ntlmpool.py
@@ -3,6 +3,7 @@ NTLM authenticating pool, contributed by erikcederstran
 
 Issue #10, see: http://code.google.com/p/urllib3/issues/detail?id=10
 """
+from __future__ import absolute_import
 
 try:
     from http.client import HTTPSConnection
diff --git a/requests/packages/urllib3/contrib/pyopenssl.py b/requests/packages/urllib3/contrib/pyopenssl.py
index c20ae46..5996153 100644
--- a/requests/packages/urllib3/contrib/pyopenssl.py
+++ b/requests/packages/urllib3/contrib/pyopenssl.py
@@ -43,6 +43,7 @@ Module Variables
 .. _crime attack: https://en.wikipedia.org/wiki/CRIME_(security_exploit)
 
 '''
+from __future__ import absolute_import
 
 try:
     from ndg.httpsclient.ssl_peer_verification import SUBJ_ALT_NAME_SUPPORT
@@ -53,7 +54,7 @@ except SyntaxError as e:
 import OpenSSL.SSL
 from pyasn1.codec.der import decoder as der_decoder
 from pyasn1.type import univ, constraint
-from socket import _fileobject, timeout
+from socket import _fileobject, timeout, error as SocketError
 import ssl
 import select
 
@@ -71,6 +72,12 @@ _openssl_versions = {
     ssl.PROTOCOL_TLSv1: OpenSSL.SSL.TLSv1_METHOD,
 }
 
+if hasattr(ssl, 'PROTOCOL_TLSv1_1') and hasattr(OpenSSL.SSL, 'TLSv1_1_METHOD'):
+    _openssl_versions[ssl.PROTOCOL_TLSv1_1] = OpenSSL.SSL.TLSv1_1_METHOD
+
+if hasattr(ssl, 'PROTOCOL_TLSv1_2') and hasattr(OpenSSL.SSL, 'TLSv1_2_METHOD'):
... 1603 lines suppressed ...
