[med-svn] [r-cran-crul] 03/05: New upstream version 0.3.8

Andreas Tille tille at debian.org
Sun Oct 1 21:16:40 UTC 2017


This is an automated email from the git hooks/post-receive script.

tille pushed a commit to branch master
in repository r-cran-crul.

commit e0d93743ba552514c516b672cb7141b013384ba0
Author: Andreas Tille <tille at debian.org>
Date:   Sun Oct 1 23:12:24 2017 +0200

    New upstream version 0.3.8
---
 DESCRIPTION                           |  15 +-
 MD5                                   |  98 ++++---
 NAMESPACE                             |  27 +-
 NEWS.md                               | 123 +++++++-
 R/async.R                             | 124 ++++++++
 R/asyncvaried.R                       | 230 +++++++++++++++
 R/auth.R                              |  36 +++
 R/client.R                            | 170 +++++------
 R/crul-package.r                      |  24 +-
 R/curl-options.R                      |  23 +-
 R/delete-requests.R                   |  18 ++
 R/handle.R                            |   2 +-
 R/httprequest.R                       | 243 ++++++++++++++++
 R/make_url.R                          |  44 +++
 R/mocking.R                           |  36 +++
 R/onLoad.R                            |   6 +
 R/post-requests.R                     |  42 ++-
 R/proxies.R                           |  98 +++++++
 R/query.R                             |   4 +
 R/response.R                          |  50 +++-
 R/{response.R => stubbed-response.R}  |  68 +----
 R/zzz.R                               |  56 ++++
 README.md                             | 167 +++++++++--
 build/vignette.rds                    | Bin 209 -> 264 bytes
 inst/doc/async.Rmd                    | 263 +++++++++++++++++
 inst/doc/async.html                   | 356 +++++++++++++++++++++++
 inst/doc/crul_vignette.Rmd            |  95 ++++---
 inst/doc/crul_vignette.html           | 515 ++++++++++++++--------------------
 inst/doc/how-to-use-crul.Rmd          | 224 +++++++++++++++
 inst/doc/how-to-use-crul.html         | 311 ++++++++++++++++++++
 inst/vign/crul_vignette.Rmd           | 156 ----------
 inst/vign/crul_vignette.md            | 281 -------------------
 man/Async.Rd                          |  68 +++++
 man/AsyncVaried.Rd                    | 125 +++++++++
 man/HttpClient.Rd                     |  79 +++---
 man/HttpRequest.Rd                    |  91 ++++++
 man/HttpResponse.Rd                   |  55 ++--
 man/HttpStubbedResponse.Rd            |  43 +++
 man/auth.Rd                           |  44 +++
 man/crul-package.Rd                   |  25 +-
 man/curl-options.Rd                   |  25 +-
 man/delete-requests.Rd                |  23 ++
 man/handle.Rd                         |   3 +-
 man/http-headers.Rd                   |   1 -
 man/mock.Rd                           |  46 +++
 man/post-requests.Rd                  |  45 ++-
 man/proxies.Rd                        |  60 ++++
 man/url_build.Rd                      |  39 +++
 man/writing-options.Rd                |   1 -
 tests/testthat/test-async.R           | 151 ++++++++++
 tests/testthat/test-asyncvaried.R     |  59 ++++
 tests/testthat/test-auth.R            |  57 ++++
 tests/testthat/test-client.R          |  47 +++-
 tests/testthat/test-delete.R          |   2 +-
 tests/testthat/test-handle.R          |  14 +
 tests/testthat/test-headers.R         |   7 +-
 tests/testthat/test-mocking.R         |  73 +++++
 tests/testthat/test-paths.R           |  12 +-
 tests/testthat/test-proxies.R         |  56 ++++
 tests/testthat/test-put.R             |   2 +-
 tests/testthat/test-query.R           |   3 +-
 tests/testthat/test-request.R         |  71 +++++
 tests/testthat/test-response.R        |  36 +++
 tests/testthat/test-url_build_parse.R |  96 +++++++
 tests/testthat/test-utils.R           |  49 ++++
 vignettes/async.Rmd                   | 263 +++++++++++++++++
 vignettes/crul_vignette.Rmd           |  95 ++++---
 vignettes/how-to-use-crul.Rmd         | 224 +++++++++++++++
 68 files changed, 4807 insertions(+), 1188 deletions(-)

diff --git a/DESCRIPTION b/DESCRIPTION
index 902c44c..5a80054 100644
--- a/DESCRIPTION
+++ b/DESCRIPTION
@@ -6,22 +6,23 @@ Description: A simple HTTP client, with tools for making HTTP requests,
     The package name is a play on curl, the widely used command line tool
     for HTTP, and this package is built on top of the R package 'curl', an
     interface to 'libcurl' (<https://curl.haxx.se/libcurl>).
-Version: 0.2.0
+Version: 0.3.8
 License: MIT + file LICENSE
 Authors at R: c(
-    person("Scott", "Chamberlain", role = c("aut", "cre"), 
+    person("Scott", "Chamberlain", role = c("aut", "cre"),
     email = "myrmecocystus at gmail.com")
     )
 URL: https://github.com/ropensci/crul
 BugReports: https://github.com/ropensci/crul/issues
-Imports: curl (>= 2.2), R6 (>= 2.2.0), urltools (>= 1.6.0), httpcode
+Imports: curl (>= 2.6), R6 (>= 2.2.0), urltools (>= 1.6.0), httpcode
         (>= 0.2.0), mime
-Suggests: testthat, fauxpas (>= 0.1.0), knitr, covr, jsonlite
+Suggests: testthat, fauxpas (>= 0.1.0), webmockr (>= 0.1.0), knitr,
+        jsonlite
 VignetteBuilder: knitr
-RoxygenNote: 5.0.1
+RoxygenNote: 6.0.1
 NeedsCompilation: no
-Packaged: 2017-01-03 15:18:41 UTC; sacmac
+Packaged: 2017-06-13 23:08:55 UTC; sacmac
 Author: Scott Chamberlain [aut, cre]
 Maintainer: Scott Chamberlain <myrmecocystus at gmail.com>
 Repository: CRAN
-Date/Publication: 2017-01-03 16:27:27
+Date/Publication: 2017-06-15 00:24:32 UTC
diff --git a/MD5 b/MD5
index 90e85ae..7e502cf 100644
--- a/MD5
+++ b/MD5
@@ -1,48 +1,80 @@
-a6f6fbc16cd71616d681a37cb3c912e9 *DESCRIPTION
+54262102efdef0a076fc341ce3c0a9a1 *DESCRIPTION
 c5af52351472a750055a760a8924ce71 *LICENSE
-ccfd2548d37213f1c9fb4ed441c441e1 *NAMESPACE
-863807c45f19cb9c253d7a29dae2eabb *NEWS.md
+64a88c3cc94e69cba56f41c7028e35db *NAMESPACE
+ac5d5d245fe9278857e5ce36df5b63d7 *NEWS.md
+038fea3e8221beba4df24bf3a82f2143 *R/async.R
+58a23acf1db567e90b3423e90d044400 *R/asyncvaried.R
+c672a32ee8516230026a87ce3eed2d82 *R/auth.R
 c6de1547ab52ff1cd67fb5c6aeeac61d *R/body.R
-01ee24174d618c7272c081739e07c1e7 *R/client.R
-c2c1b73d8f35cf3a5d89fbd07020e28c *R/crul-package.r
-5cb4fcba89003e2e8188b5c08a7a9f04 *R/curl-options.R
+779bdad5b9684b79c430a95081e32322 *R/client.R
+a2d0cf3c78290e92312fedc3ea371c7d *R/crul-package.r
+f158cf1c763e34264ba3267a98aba3c1 *R/curl-options.R
 ce31dc346613d071e58f07d49bdb14eb *R/curl_options.R
+b077080e3625ecf58b23553f8aa33640 *R/delete-requests.R
 8f916ca18e13c4b6c544f2c284b9fabf *R/fetch.R
-2dfad3475055103724c39b36f6ac1d2a *R/handle.R
+47e71e1884900e76d9fe249b4a480fc8 *R/handle.R
 7fe1bb2e5890da2188c2f3099ef5ccc1 *R/headers.R
 0cd2754bc2668203d10a1431d2713c4e *R/http-headers.R
-9141860e68614c1ec7ff6b7381f5f808 *R/make_url.R
-3d4be1045a245a5ab31e88b3bec7fef4 *R/post-requests.R
-bbc44d0c12732851f73d1f556098317a *R/query.R
-e1c677d21d92379174d5d4e6e35895ab *R/response.R
+e011c5465b1cb7fc82f7cbd19ef9a210 *R/httprequest.R
+f48ecd3dc67376a23e154bf3c31dbc7d *R/make_url.R
+12ad585abb34c0f016f71dc2957ba36f *R/mocking.R
+c24c9f7c882823a79d1d63a0d1d7d48c *R/onLoad.R
+6aa00afab8fd6fe2a1314e9026ab275e *R/post-requests.R
+7660da3918e7f65f037f957f9f17b058 *R/proxies.R
+ff10b92e14e29e606593ff962f732b7a *R/query.R
+75ffcf8110f089b7514af5ef01d9024f *R/response.R
+c757ba50136b5b24d9d295ea82a6d3dd *R/stubbed-response.R
 14c11771d130c3cc4ba9e9f475a2951d *R/use_agent.R
 c92e901a06260e4175002099a617aa60 *R/writing-options.R
-226c3a16c73ea8dd62a69524df930908 *R/zzz.R
-f4aca5d2f32af09d109570d33de13836 *README.md
-13d67e238649588f477b9dfd996b8269 *build/vignette.rds
-02bf1952e659c0bd4a564da66c12c9db *inst/doc/crul_vignette.Rmd
-3cbc48c791a977d09be812762d3959b7 *inst/doc/crul_vignette.html
-80d1d44e24d667bc689d3ae88f6ce364 *inst/vign/crul_vignette.Rmd
-02bf1952e659c0bd4a564da66c12c9db *inst/vign/crul_vignette.md
-a10ec0e1878c2413d815012af25f9ac7 *man/HttpClient.Rd
-20a2ec2ab50ab466c0a5df70e6e737cf *man/HttpResponse.Rd
-238334b1249de8c8a108cd7eeba6bd87 *man/crul-package.Rd
-17b7aa113d4e3f2bb27181a0f30aec86 *man/curl-options.Rd
-099cc621daf091046a95667588044cb3 *man/handle.Rd
-cb453c1382bb59334acf15f7565933bc *man/http-headers.Rd
-42bfec87af6ea9ee8c034a1945b55789 *man/post-requests.Rd
-65da7a25eb2fa7bceb0ac867acbe4034 *man/writing-options.Rd
+3cf213006f0168dcc61c520a2bcaccb6 *R/zzz.R
+fd0b2c1986cb95d91bc077e22563a172 *README.md
+0ca11f56c5744e9cae3f8996635ec09a *build/vignette.rds
+92dc9cd788c0843421c367fb67d167df *inst/doc/async.Rmd
+3362460f137641def1ff6a6a46a58cf7 *inst/doc/async.html
+825e63b8f0ea15919c61fd6d482897d0 *inst/doc/crul_vignette.Rmd
+e997b5a59446531ad53550cdaf6fe689 *inst/doc/crul_vignette.html
+7b77c12dd2c3a496f35ee648c144686a *inst/doc/how-to-use-crul.Rmd
+263d062b17490370e6c1c8f5f14c72ab *inst/doc/how-to-use-crul.html
+4ce0044dbf2746868781235dce58a1d0 *man/Async.Rd
+f7d25bb9f12306064e5869c0e2bc0cd2 *man/AsyncVaried.Rd
+d21d0970e0cfdd86ce362dfd2eb937bb *man/HttpClient.Rd
+6b7e8e8b82a880dbb08cba3b1feaa4ef *man/HttpRequest.Rd
+8ceb31c33528e8d7e6ce83b7e795ced6 *man/HttpResponse.Rd
+ed69669a250cc2b376ea1d8bedf74193 *man/HttpStubbedResponse.Rd
+af2f8ff1a1d271c642ff558c0aab7ad9 *man/auth.Rd
+1493a47d3b24e85b4a93a126945f8a45 *man/crul-package.Rd
+158948dd9ddccddd9013c959df1a5a36 *man/curl-options.Rd
+74f5415c416ad5d026bb49471c579452 *man/delete-requests.Rd
+992a0e61803fff336f4ea5b01568ebbc *man/handle.Rd
+5a2be8a76e37279e6a27e43af56e79cf *man/http-headers.Rd
+424b3a1de3812859e94ff209ceddd4d0 *man/mock.Rd
+c3abfb0ae2e49b45c602fdd817078a54 *man/post-requests.Rd
+b8e980c5139ae71b16809c2f2235ec67 *man/proxies.Rd
+a1593980eb0f949136935f57df586863 *man/url_build.Rd
+8de29f9d478cf19c85cc8ae51510669e *man/writing-options.Rd
 9d086d73e3d68be9f055a6103bf7be39 *tests/test-all.R
-18d7e9ae1cd3375c8c646e6af971fae0 *tests/testthat/test-client.R
-411b885d9cffa5d35c8c76640b33dace *tests/testthat/test-delete.R
+5f166a9ddd900938232b65b17f7f492d *tests/testthat/test-async.R
+b69bc5f8b2a540e10094b75064274ffd *tests/testthat/test-asyncvaried.R
+bffa7e3d248142476657230ee4141457 *tests/testthat/test-auth.R
+c73212db04d129aa3f61d6f8cc8e967c *tests/testthat/test-client.R
+80fad867615be2365a42f29c319d3b39 *tests/testthat/test-delete.R
 81c982a6d03502e5b48ce38d581e8fe8 *tests/testthat/test-get.R
+162d7395dce79723ab9902c42196aaef *tests/testthat/test-handle.R
 f703252d231c221048dbdd8e74db7a49 *tests/testthat/test-head.R
-7302796a88a8ed8326239c4cced4726a *tests/testthat/test-headers.R
+4c95ffa8eac589fed3d1b33715d00e0d *tests/testthat/test-headers.R
+af5e1e93f7903088d1db4a4bc6fb6565 *tests/testthat/test-mocking.R
 d97a38299ad7f3501b4bfc9aeab50310 *tests/testthat/test-patch.R
-1acaf78df767322f37911a1925f1eda1 *tests/testthat/test-paths.R
+a4606f5df98bf946e95ad53f4e25737c *tests/testthat/test-paths.R
 8e092d174768345a088f0ff4388542e9 *tests/testthat/test-post.R
-2ee7f4b175d689c27227a8bb91bb2d6d *tests/testthat/test-put.R
-edd29a7dd644adb3389cf4dca2bd5f2e *tests/testthat/test-query.R
+3c4202f0d05f093d28d52228c50b9384 *tests/testthat/test-proxies.R
+da00093251bfe73573552e3fa54cc189 *tests/testthat/test-put.R
+4a636b8dcd87c9e911d1178bf93dedde *tests/testthat/test-query.R
+5d20f9e7b320e8c00d949212baedf907 *tests/testthat/test-request.R
+6ec788d9c3ad1e6aa77a4a14c169ab90 *tests/testthat/test-response.R
 ae1547b89f973f29f0d21fd526ccb7ce *tests/testthat/test-status.R
+6d6424b5f9549bb410a1e31ae9d99f67 *tests/testthat/test-url_build_parse.R
 b66e8ddf24d1ff5ffe66761e00d75a0e *tests/testthat/test-user-agent.R
-02bf1952e659c0bd4a564da66c12c9db *vignettes/crul_vignette.Rmd
+11807caff7a89ebc264d38dbdaf2cac3 *tests/testthat/test-utils.R
+92dc9cd788c0843421c367fb67d167df *vignettes/async.Rmd
+825e63b8f0ea15919c61fd6d482897d0 *vignettes/crul_vignette.Rmd
+7b77c12dd2c3a496f35ee648c144686a *vignettes/how-to-use-crul.Rmd
diff --git a/NAMESPACE b/NAMESPACE
index 73a539e..d5bb316 100644
--- a/NAMESPACE
+++ b/NAMESPACE
@@ -1,7 +1,32 @@
 # Generated by roxygen2: do not edit by hand
 
+export(Async)
+export(AsyncVaried)
 export(HttpClient)
+export(HttpRequest)
 export(HttpResponse)
+export(HttpStubbedResponse)
+export(auth)
 export(handle)
-import(curl)
+export(mock)
+export(proxy)
+export(url_build)
+export(url_parse)
 importFrom(R6,R6Class)
+importFrom(curl,curl_escape)
+importFrom(curl,curl_fetch_disk)
+importFrom(curl,curl_fetch_memory)
+importFrom(curl,curl_fetch_stream)
+importFrom(curl,curl_options)
+importFrom(curl,curl_version)
+importFrom(curl,handle_reset)
+importFrom(curl,handle_setform)
+importFrom(curl,handle_setheaders)
+importFrom(curl,handle_setopt)
+importFrom(curl,multi_add)
+importFrom(curl,multi_cancel)
+importFrom(curl,multi_list)
+importFrom(curl,multi_run)
+importFrom(curl,new_handle)
+importFrom(curl,new_pool)
+importFrom(curl,parse_headers)
diff --git a/NEWS.md b/NEWS.md
index cd6b519..d20bcf6 100644
--- a/NEWS.md
+++ b/NEWS.md
@@ -1,3 +1,92 @@
+crul 0.3.8
+==========
+
+### BUG FIXES
+
+* Fixed `AsyncVaried` to return async responses in the order that
+they were passed in. This also fixes this exact same behavior in 
+`Async` because `Async` uses `AsyncVaried` internally. (#41)
+thanks @dirkschumacher for reporting
+
+
+
+crul 0.3.6
+==========
+
+* Note: This version gains support for integration with 
+`webmockr`, which is now on CRAN.
+
+### NEW FEATURES
+
+* New function `auth()` to do simple authentication (#33)
+* New function `HttpStubbedResponse` for making a stubbed 
+response object for the `webmockr` integration (#4)
+* New function `mock()` to turn on mocking - it's off by
+default. If `webmockr` is not installed but the user attempts
+to use mocking, we error with a message to install
+`webmockr` (#4)
+
+### MINOR IMPROVEMENTS
+
+* Use `gzip-deflate` by default for each request
+to make sure gzip compression is used if the server 
+can do it (#34)
+* Change `useragent` to `User-Agent` as default user 
+agent header (#35)
+* Now we make sure that user-supplied headers override the
+default headers if they are of the same name (#36)
+
+
+
+crul 0.3.4
+==========
+
+### NEW FEATURES
+
+* New utility functions `url_build` and `url_parse` (#31)
+
+### MINOR IMPROVEMENTS
+
+* Now using markdown for documentation (#32)
+* Better documentation for `AsyncVaried` (#30)
+* New vignette on how to use `crul` in realistic
+scenarios, rather than brief examples demonstrating
+individual features (#29)
+* Better documentation for `HttpRequest` (#28)
+* Included more tests
+
+### BUG FIXES
+
+* Fixed put/patch/delete, which weren't passing the body
+correctly in `HttpClient` (#26)
+* DRY out code for preparing requests - simplify to
+use helper functions (#27)
+
+
+crul 0.3.0
+==========
+
+### NEW FEATURES
+
+* Added support for asynchronous HTTP requests, including two new
+R6 classes: `Async` and `AsyncVaried`. The former is a simpler
+interface that treats all URLs with the same options/HTTP method; the
+latter allows any type of request through the new R6 class `HttpRequest` (#8) (#24)
+* New R6 class `HttpRequest` to support `AsyncVaried` - this class
+only defines a request, but does not execute it. (#8)
+
+### MINOR IMPROVEMENTS
+
+* Added support for proxies (#22)
+
+### BUG FIXES
+
+* Fixed parsing of headers from FTP servers (#21)
+
+
+
+
+
 crul 0.2.0
 ==========
 
@@ -5,9 +94,9 @@ crul 0.2.0
 
 * Created new manual files for various tasks to document
 usage better (#19)
-* URL encode paths - should fix any bugs where spaces between words 
+* URL encode paths - should fix any bugs where spaces between words
 caused errors previously (#17)
-* URL encode query parameters - should fix any bugs where spaces between words 
+* URL encode query parameters - should fix any bugs where spaces between words
 caused errors previously (#11)
 * request headers now passed correctly to response object (#13)
 * response headers now parsed to a list for easier access (#14)
@@ -15,38 +104,44 @@ caused errors previously (#11)
 possible in last version (#15)
 
 
+
+
 crul 0.1.6
 ==========
 
 ### NEW FEATURES
 
 * Improved options for using curl options. Can manually add
-to list of curl options or pass in via `...`. And we 
-check that user doesn't pass in prohibited options 
-(`curl` package takes care of checking that options 
+to list of curl options or pass in via `...`. And we
+check that user doesn't pass in prohibited options
+(`curl` package takes care of checking that options
 are valid) (#5)
-* Incorporated `fauxpas` package for dealing with HTTP 
+* Incorporated `fauxpas` package for dealing with HTTP
 conditions. It's a Suggest, so only used if installed (#6)
-* Added support for streaming via `curl::curl_fetch_stream`. 
-`stream` param defaults to `NULL` (thus ignored), or pass in a 
-function to use streaming. Only one of memory, streaming or 
+* Added support for streaming via `curl::curl_fetch_stream`.
+`stream` param defaults to `NULL` (thus ignored), or pass in a
+function to use streaming. Only one of memory, streaming or
 disk allowed. (#9)
-* Added support for streaming via `curl::curl_fetch_disk`. 
-`disk` param defaults to `NULL` (thus ignored), or pass in a 
-path to write to disk instead of use memory. Only one of memory, 
+* Added support for writing to disk via `curl::curl_fetch_disk`.
+`disk` param defaults to `NULL` (thus ignored), or pass in a
+path to write to disk instead of use memory. Only one of memory,
 streaming or disk allowed. (#12)
 
 ### MINOR IMPROVEMENTS
 
-* Added missing `raise_for_status()` method on the 
+* Added missing `raise_for_status()` method on the
 `HttpResponse` class (#10)
 
 ### BUG FIXES
 
-* Was importing `httpcode` but wasn't using it in the package. 
+* Was importing `httpcode` but wasn't using it in the package.
 Now using the package in `HttpResponse`
 
 
+
+
+
+
 crul 0.1.0
 ==========
 
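For reference, the `url_build`/`url_parse` utilities added in 0.3.4 (see the NEWS entries above) work roughly as follows; signatures are assumed from the 0.3.8 man pages (`man/url_build.Rd` in the diffstat), and no network access is needed:

```r
library(crul)

# build a full URL from base + path + query parameters
url_build("https://httpbin.org", "get", list(a = 5))

# and parse a URL back into its components
url_parse("https://httpbin.org/get?a=5")
```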
diff --git a/R/async.R b/R/async.R
new file mode 100644
index 0000000..3b2b88e
--- /dev/null
+++ b/R/async.R
@@ -0,0 +1,124 @@
+#' Simple async client
+#'
+#' A client to work with many URLs, but all with the same HTTP method
+#'
+#' @export
+#' @param urls (character) one or more URLs (required)
+#' @family async
+#' @details
+#' **Methods**
+#'   \describe{
+#'     \item{`get(path, query, ...)`}{
+#'       make async GET requests for all URLs
+#'     }
+#'     \item{`post(path, query, body, encode, ...)`}{
+#'       make async POST requests for all URLs
+#'     }
+#'     \item{`put(path, query, body, encode, ...)`}{
+#'       make async PUT requests for all URLs
+#'     }
+#'     \item{`patch(path, query, body, encode, ...)`}{
+#'       make async PATCH requests for all URLs
+#'     }
+#'     \item{`delete(path, query, body, encode, ...)`}{
+#'       make async DELETE requests for all URLs
+#'     }
+#'     \item{`head(path, ...)`}{
+#'       make async HEAD requests for all URLs
+#'     }
+#'   }
+#'
+#' See [HttpClient()] for information on parameters.
+#'
+#' @format NULL
+#' @usage NULL
+#' @return a list, with objects of class [HttpResponse()].
+#' Responses are returned in the order they are passed in.
+#' @examples \dontrun{
+#' cc <- Async$new(
+#'   urls = c(
+#'     'https://httpbin.org/',
+#'     'https://httpbin.org/get?a=5',
+#'     'https://httpbin.org/get?foo=bar'
+#'   )
+#' )
+#' cc
+#' (res <- cc$get())
+#' res[[1]]
+#' res[[1]]$url
+#' res[[1]]$success()
+#' res[[1]]$status_http()
+#' res[[1]]$response_headers
+#' res[[1]]$method
+#' res[[1]]$content
+#' res[[1]]$parse("UTF-8")
+#'
+#' lapply(res, function(z) z$parse("UTF-8"))
+#' }
+Async <- R6::R6Class(
+  'Async',
+  public = list(
+    urls = NULL,
+
+    print = function(x, ...) {
+      cat("<crul async connection> ", sep = "\n")
+      cat("  urls: ", sep = "\n")
+      for (i in seq_along(self$urls)) {
+        cat(paste0("   ", self$urls[[i]]), sep = "\n")
+      }
+      invisible(self)
+    },
+
+    initialize = function(urls) {
+      self$urls <- urls
+    },
+
+    get = function(path = NULL, query = list(), ...) {
+      private$gen_interface(self$urls, "get", path, query, ...)
+    },
+
+    post = function(path = NULL, query = list(), body = NULL,
+                    encode = "multipart", ...) {
+      private$gen_interface(self$urls, "post", path, query, body, encode, ...)
+    },
+
+    put = function(path = NULL, query = list(), body = NULL,
+                   encode = "multipart", ...) {
+      private$gen_interface(self$urls, "put", path, query, body, encode, ...)
+    },
+
+    patch = function(path = NULL, query = list(), body = NULL,
+                     encode = "multipart", ...) {
+      private$gen_interface(self$urls, "patch", path, query, body, encode, ...)
+    },
+
+    delete = function(path = NULL, query = list(), body = NULL,
+                      encode = "multipart", ...) {
+      private$gen_interface(self$urls, "delete", path, query, body, encode, ...)
+    },
+
+    head = function(path = NULL, ...) {
+      private$gen_interface(self$urls, "head", path, ...)
+    }
+  ),
+
+  private = list(
+    gen_interface = function(x, method, ...) {
+      tmp <- AsyncVaried$new(
+        .list = lapply(x, function(z) {
+          switch(
+            method,
+            get = HttpRequest$new(url = z)$get(...),
+            post = HttpRequest$new(url = z)$post(...),
+            put = HttpRequest$new(url = z)$put(...),
+            patch = HttpRequest$new(url = z)$patch(...),
+            delete = HttpRequest$new(url = z)$delete(...),
+            head = HttpRequest$new(url = z)$head(...)
+          )
+        })
+      )
+      tmp$request()
+      tmp$responses()
+    }
+  )
+)
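As `gen_interface()` above shows, `Async` is a thin wrapper over `AsyncVaried`: one `HttpRequest` per URL, executed in a shared pool. A minimal usage sketch, adapted from the roxygen examples in this file (requires network access to httpbin.org):

```r
library(crul)

cc <- Async$new(urls = c(
  "https://httpbin.org/get?a=5",
  "https://httpbin.org/get?foo=bar"
))
res <- cc$get()  # a list of HttpResponse objects, in input order

# inspect each response
vapply(res, function(z) z$status_code, numeric(1))
lapply(res, function(z) z$parse("UTF-8"))
```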
diff --git a/R/asyncvaried.R b/R/asyncvaried.R
new file mode 100644
index 0000000..da0af83
--- /dev/null
+++ b/R/asyncvaried.R
@@ -0,0 +1,230 @@
+#' Async client for different request types
+#'
+#' @export
+#' @param ...,.list Any number of objects of class [HttpRequest()];
+#' supply inputs to one of these parameters, but not both
+#' @family async
+#' @return An object of class `AsyncVaried` with variables and methods.
+#' Responses are returned in the order they are passed in.
+#' @details
+#' **Methods**
+#'   \describe{
+#'     \item{`request()`}{
+#'       Execute asynchronous requests
+#'       - returns: nothing, responses stored inside object,
+#'       though will print messages if you choose verbose output
+#'     }
+#'     \item{`requests()`}{
+#'       list requests
+#'       - returns: a list of `HttpRequest` objects, empty list before
+#'       requests made
+#'     }
+#'     \item{`responses()`}{
+#'       list responses
+#'       - returns: a list of `HttpResponse` objects, empty list before
+#'       requests made
+#'     }
+#'     \item{`parse(encoding = "UTF-8")`}{
+#'       parse content
+#'       - returns: character vector, empty character vector before
+#'       requests made
+#'     }
+#'     \item{`status_code()`}{
+#'       (integer) HTTP status codes
+#'       - returns: numeric vector, empty numeric vector before
+#'       requests made
+#'     }
+#'     \item{`status()`}{
+#'       (list) HTTP status objects
+#'       - returns: a list of `http_code` objects, empty list before
+#'       requests made
+#'     }
+#'     \item{`content()`}{
+#'       raw content
+#'       - returns: raw list, empty list before requests made
+#'     }
+#'     \item{`times()`}{
+#'       curl request times
+#'       - returns: list of named numeric vectors, empty list before
+#'       requests made
+#'     }
+#'   }
+#'
+#' @format NULL
+#' @usage NULL
+#' @examples \dontrun{
+#' # pass in requests via ...
+#' req1 <- HttpRequest$new(
+#'   url = "https://httpbin.org/get",
+#'   opts = list(verbose = TRUE),
+#'   headers = list(foo = "bar")
+#' )$get()
+#' req2 <- HttpRequest$new(url = "https://httpbin.org/post")$post()
+#'
+#' # Create an AsyncVaried object
+#' out <- AsyncVaried$new(req1, req2)
+#'
+#' # before you make requests, the methods return empty objects
+#' out$status()
+#' out$status_code()
+#' out$content()
+#' out$times()
+#' out$parse()
+#' out$responses()
+#'
+#' # make requests
+#' out$request()
+#'
+#' # access various parts
+#' ## http status objects
+#' out$status()
+#' ## status codes
+#' out$status_code()
+#' ## content (raw data)
+#' out$content()
+#' ## times
+#' out$times()
+#' ## parsed content
+#' out$parse()
+#' ## response objects
+#' out$responses()
+#'
+#' # pass in requests in a list via .list param
+#' reqlist <- list(
+#'   HttpRequest$new(url = "https://httpbin.org/get")$get(),
+#'   HttpRequest$new(url = "https://httpbin.org/post")$post(),
+#'   HttpRequest$new(url = "https://httpbin.org/put")$put(),
+#'   HttpRequest$new(url = "https://httpbin.org/delete")$delete(),
+#'   HttpRequest$new(url = "https://httpbin.org/get?g=5")$get(),
+#'   HttpRequest$new(
+#'     url = "https://httpbin.org/post")$post(body = list(y = 9)),
+#'   HttpRequest$new(
+#'     url = "https://httpbin.org/get")$get(query = list(hello = "world"))
+#' )
+#'
+#' out <- AsyncVaried$new(.list = reqlist)
+#' out$request()
+#' out$status()
+#' out$status_code()
+#' out$content()
+#' out$times()
+#' out$parse()
+#' }
+AsyncVaried <- R6::R6Class(
+  'AsyncVaried',
+  public = list(
+    print = function(x, ...) {
+      cat("<crul async varied connection> ", sep = "\n")
+      cat("  requests: ", sep = "\n")
+      for (i in seq_along(private$reqs)) {
+        cat(sprintf("   %s: %s",
+                    private$reqs[[i]]$payload$method,
+                    private$reqs[[i]]$url), "\n")
+      }
+      invisible(self)
+    },
+
+    initialize = function(..., .list = list()) {
+      if (length(.list)) {
+        private$reqs <- .list
+      } else {
+        private$reqs <- list(...)
+      }
+      if (length(private$reqs) == 0) {
+        stop("must pass in at least one request", call. = FALSE)
+      }
+      if (
+        any(vapply(private$reqs, function(x) class(x)[1], "") != "HttpRequest")
+      ) {
+        stop("all inputs must be of class 'HttpRequest'", call. = FALSE)
+      }
+    },
+
+    request = function() {
+      private$output <- private$async_request(private$reqs)
+    },
+
+    responses = function() {
+      private$output %||% list()
+    },
+
+    requests = function() {
+      private$reqs
+    },
+
+    parse = function(encoding = "UTF-8") {
+      vapply(private$output, function(z) z$parse(encoding = encoding), "")
+    },
+
+    status_code = function() {
+      vapply(private$output, function(z) z$status_code, 1)
+    },
+
+    status = function() {
+      lapply(private$output, function(z) z$status_http())
+    },
+
+    content = function() {
+      lapply(private$output, function(z) z$content)
+    },
+
+    times = function() {
+      lapply(private$output, function(z) z$times)
+    }
+  ),
+
+  private = list(
+    reqs = NULL,
+    output = NULL,
+    reqq = NULL,
+
+    async_request = function(reqs) {
+      crulpool <- curl::new_pool()
+      multi_res <- list()
+
+      make_request <- function(i) {
+        w <- reqs[[i]]$payload
+        h <- w$url$handle
+        curl::handle_setopt(h, .list = w$options)
+        if (!is.null(w$fields)) {
+          curl::handle_setform(h, .list = w$fields)
+        }
+        curl::handle_setheaders(h, .list = w$headers)
+        curl::multi_add(handle = h,
+          done = function(res) multi_res[[i]] <<- res,
+          pool = crulpool
+        )
+      }
+
+      for (i in seq_along(reqs)) make_request(i)
+
+      # run all requests
+      curl::multi_run(pool = crulpool)
+      remain <- curl::multi_list(crulpool)
+      if (length(remain)) lapply(remain, curl::multi_cancel)
+      multi_res <- ccp(multi_res)
+
+      Map(function(z, b) {
+        HttpResponse$new(
+          method = b$payload$method,
+          url = z$url,
+          status_code = z$status_code,
+          request_headers = c(useragent = b$payload$options$useragent,
+                              b$headers),
+          response_headers = {
+            if (grepl("^ftp://", z$url)) {
+              list()
+            } else {
+              headers_parse(curl::parse_headers(rawToChar(z$headers)))
+            }
+          },
+          modified = z$modified,
+          times = z$times,
+          content = z$content,
+          handle = b$handle,
+          request = b
+        )
+      }, multi_res, reqs)
+    }
+  )
+)
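The ordering guarantee fixed in 0.3.8 (see NEWS above) is easiest to see with `AsyncVaried` directly: `responses()`, `status_code()`, and friends line up index-for-index with the requests passed in. A sketch based on the roxygen examples in this file (network access assumed):

```r
library(crul)

reqs <- list(
  HttpRequest$new(url = "https://httpbin.org/get")$get(),
  HttpRequest$new(url = "https://httpbin.org/post")$post(body = list(y = 9))
)
out <- AsyncVaried$new(.list = reqs)

out$request()             # executes all requests; results stored in the object
out$status_code()         # same order as `reqs`
out$responses()[[1]]$url  # corresponds to the first request above
```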
diff --git a/R/auth.R b/R/auth.R
new file mode 100644
index 0000000..68e7df5
--- /dev/null
+++ b/R/auth.R
@@ -0,0 +1,36 @@
+#' Authentication
+#'
+#' @export
+#' @param user (character) username, optional
+#' @param pwd (character) password, optional
+#' @param auth (character) authentication type, one of basic (default),
+#' digest, digest_ie, gssnegotiate, ntlm, or any; optional
+#' @details
+#' Only supporting simple auth for now, OAuth later.
+#' @examples
+#' auth(user = "foo", pwd = "bar", auth = "basic")
+#' auth(user = "foo", pwd = "bar", auth = "digest")
+#' auth(user = "foo", pwd = "bar", auth = "ntlm")
+#' auth(user = "foo", pwd = "bar", auth = "any")
+#'
+#' # with HttpClient
+#' (res <- HttpClient$new(
+#'   url = "https://httpbin.org/basic-auth/user/passwd",
+#'   auth = auth(user = "user", pwd = "passwd")
+#' ))
+#' res$auth
+#' x <- res$get()
+#' jsonlite::fromJSON(x$parse("UTF-8"))
+#'
+#' # with HttpRequest
+#' (res <- HttpRequest$new(
+#'   url = "https://httpbin.org/basic-auth/user/passwd",
+#'   auth = auth(user = "user", pwd = "passwd")
+#' ))
+#' res$auth
+auth <- function(user, pwd, auth = "basic") {
+  structure(ccp(list(
+    userpwd = make_up(user, pwd),
+    httpauth = auth_type(auth)
+  )), class = "auth", type = auth)
+}
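Note that `auth()` only assembles curl options (`userpwd`, `httpauth`); the client applies them at request time. A local sketch of what the returned object carries, needing no network (`make_up()` and `auth_type()` are internal helpers defined elsewhere in the package):

```r
library(crul)

a <- auth(user = "user", pwd = "passwd", auth = "basic")
class(a)         # "auth"
attr(a, "type")  # "basic"
a$userpwd        # combined user/password string handed to curl
a$httpauth       # numeric curl code for the chosen auth type
```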
diff --git a/R/client.R b/R/client.R
index 01f03a5..28cb882 100644
--- a/R/client.R
+++ b/R/client.R
@@ -1,31 +1,30 @@
 #' HTTP client
 #'
 #' @export
-#' @param url (character) A url. One of \code{url} or \code{handle} required.
-#' @param opts (list) curl options
-#' @param handle A handle
+#' @template args
 #' @details
-#' \strong{Methods}
+#' **Methods**
 #'   \describe{
-#'     \item{\code{get(path, query, disk, stream, ...)}}{
+#'     \item{`get(path, query, disk, stream, ...)`}{
 #'       Make a GET request
 #'     }
-#'     \item{\code{post(path, query, body, disk, stream, ...)}}{
+#'     \item{`post(path, query, body, disk, stream, ...)`}{
 #'       Make a POST request
 #'     }
-#'     \item{\code{put(path, query, body, disk, stream, ...)}}{
+#'     \item{`put(path, query, body, disk, stream, ...)`}{
 #'       Make a PUT request
 #'     }
-#'     \item{\code{patch(path, query, body, disk, stream, ...)}}{
+#'     \item{`patch(path, query, body, disk, stream, ...)`}{
 #'       Make a PATCH request
 #'     }
-#'     \item{\code{delete(path, query, body, disk, stream, ...)}}{
+#'     \item{`delete(path, query, body, disk, stream, ...)`}{
 #'       Make a DELETE request
 #'     }
-#'     \item{\code{head(path, disk, stream, ...)}}{
+#'     \item{`head(path, disk, stream, ...)`}{
 #'       Make a HEAD request
 #'     }
 #'   }
+#'
 #' @format NULL
 #' @usage NULL
 #' @details Possible parameters (not all are allowed in each HTTP verb):
@@ -34,16 +33,18 @@
 #'  \item query - query terms, as a list
 #'  \item body - body as an R list
 #'  \item encode - one of form, multipart, json, or raw
-#'  \item disk - a path to write to. if NULL (default), memory used
+#'  \item disk - a path to write to. if NULL (default), memory used.
+#'  See [curl::curl_fetch_disk()] for help.
 #'  \item stream - an R function to determine how to stream data. if
-#'  NULL (default), memory used
+#'  NULL (default), memory used. See [curl::curl_fetch_stream()]
+#'  for help
 #'  \item ... curl options, only those in the acceptable set from
-#'  \code{\link[curl]{curl_options}} except the following: httpget, httppost,
+#'  [curl::curl_options()] except the following: httpget, httppost,
 #'  post, postfields, postfieldsize, and customrequest
 #' }
 #'
-#' @seealso \code{\link{post-requests}}, \code{\link{http-headers}},
-#' \code{\link{writing-options}}
+#' @seealso [post-requests], [delete-requests], [http-headers],
+#' [writing-options]
 #'
 #' @examples
 #' (x <- HttpClient$new(url = "https://httpbin.org"))
@@ -79,24 +80,36 @@
 #' # query params are URL encoded for you, so DO NOT do it yourself
 #' ## if you url encode yourself, it gets double encoded, and that's bad
 #' (x <- HttpClient$new(url = "https://httpbin.org"))
-#' res <- x$get("get", query = list(a = 'hello world'), verbose = TRUE)
+#' res <- x$get("get", query = list(a = 'hello world'))
 
 HttpClient <- R6::R6Class(
   'HttpClient',
   public = list(
     url = NULL,
     opts = list(),
+    proxies = list(),
+    auth = list(),
     headers = list(),
     handle = NULL,
 
     print = function(x, ...) {
       cat("<crul connection> ", sep = "\n")
-      cat(paste0("  url: ", if (is.null(self$url)) self$handle$url else self$url), sep = "\n")
-      cat("  options: ", sep = "\n")
+      cat(paste0("  url: ",
+                 if (is.null(self$url)) self$handle$url else self$url),
+          sep = "\n")
+      cat("  curl options: ", sep = "\n")
       for (i in seq_along(self$opts)) {
         cat(sprintf("    %s: %s", names(self$opts)[i],
                     self$opts[[i]]), sep = "\n")
       }
+      cat("  proxies: ", sep = "\n")
+      if (length(self$proxies)) cat(paste("    -", purl(self$proxies)),
+                                    sep = "\n")
+      cat("  auth: ", sep = "\n")
+      if (length(self$auth$userpwd)) {
+        cat(paste("    -", self$auth$userpwd), sep = "\n")
+        cat(paste("    - type: ", self$auth$httpauth), sep = "\n")
+      }
       cat("  headers: ", sep = "\n")
       for (i in seq_along(self$headers)) {
         cat(sprintf("    %s: %s", names(self$headers)[i],
@@ -105,9 +118,16 @@ HttpClient <- R6::R6Class(
       invisible(self)
     },
 
-    initialize = function(url, opts, headers, handle) {
+    initialize = function(url, opts, proxies, auth, headers, handle) {
       if (!missing(url)) self$url <- url
       if (!missing(opts)) self$opts <- opts
+      if (!missing(proxies)) {
+        if (!inherits(proxies, "proxy")) {
+          stop("proxies input must be of class proxy", call. = FALSE)
+        }
+        self$proxies <- proxies
+      }
+      if (!missing(auth)) self$auth <- auth
       if (!missing(headers)) self$headers <- headers
       if (!missing(handle)) self$handle <- handle
       if (is.null(self$url) && is.null(self$handle)) {
@@ -123,12 +143,16 @@ HttpClient <- R6::R6Class(
         url = url,
         method = "get",
         options = list(
-          httpget = TRUE,
-          useragent = make_ua()
+          httpget = TRUE
         ),
-        headers = self$headers
+        headers = list(
+          `User-Agent` = make_ua(),
+          `Accept-Encoding` = 'gzip, deflate'
+        )
       )
-      rr$options <- utils::modifyList(rr$options, c(self$opts, ...))
+      rr$headers <- norm_headers(rr$headers, self$headers)
+      rr$options <- utils::modifyList(
+        rr$options, c(self$opts, self$proxies, self$auth, ...))
       rr$disk <- disk
       rr$stream <- stream
       private$make_request(rr)
@@ -138,98 +162,41 @@ HttpClient <- R6::R6Class(
                     stream = NULL, encode = "multipart", ...) {
       curl_opts_check(...)
       url <- make_url(self$url, self$handle, path, query)
-      # opts <- list(post = TRUE)
-      # if (is.null(body)) {
-      #   opts$postfields <- raw(0)
-      #   opts$postfieldsize <- 0
-      # }
       opts <- prep_body(body, encode)
-      rr <- list(
-        url = url,
-        method = "post",
-        options = as.list(c(
-          opts$opts,
-          useragent = make_ua()
-        )),
-        headers = c(self$headers, opts$type),
-        fields = opts$fields
-      )
-      rr$options <- utils::modifyList(rr$options, c(self$opts, ...))
+      rr <- prep_opts("post", url, self, opts, ...)
       rr$disk <- disk
       rr$stream <- stream
       private$make_request(rr)
     },
 
     put = function(path = NULL, query = list(), body = NULL, disk = NULL,
-                   stream = NULL, encode = NULL, ...) {
+                   stream = NULL, encode = "multipart", ...) {
       curl_opts_check(...)
       url <- make_url(self$url, self$handle, path, query)
-      opts <- list(customrequest = "PUT")
-      if (is.null(body)) {
-        opts$postfields <- raw(0)
-        opts$postfieldsize <- 0
-      }
-      rr <- list(
-        url = url,
-        method = "put",
-        options = c(
-          opts,
-          useragent = make_ua()
-        ),
-        headers = self$headers,
-        fields = body
-      )
-      rr$options <- utils::modifyList(rr$options, c(self$opts, ...))
+      opts <- prep_body(body, encode)
+      rr <- prep_opts("put", url, self, opts, ...)
       rr$disk <- disk
       rr$stream <- stream
       private$make_request(rr)
     },
 
     patch = function(path = NULL, query = list(), body = NULL, disk = NULL,
-                     stream = NULL, encode = NULL, ...) {
+                     stream = NULL, encode = "multipart", ...) {
       curl_opts_check(...)
       url <- make_url(self$url, self$handle, path, query)
-      opts <- list(customrequest = "PATCH")
-      if (is.null(body)) {
-        opts$postfields <- raw(0)
-        opts$postfieldsize <- 0
-      }
-      rr <- list(
-        url = url,
-        method = "patch",
-        options = c(
-          opts,
-          useragent = make_ua()
-        ),
-        headers = self$headers,
-        fields = body
-      )
-      rr$options <- utils::modifyList(rr$options, c(self$opts, ...))
+      opts <- prep_body(body, encode)
+      rr <- prep_opts("patch", url, self, opts, ...)
       rr$disk <- disk
       rr$stream <- stream
       private$make_request(rr)
     },
 
     delete = function(path = NULL, query = list(), body = NULL, disk = NULL,
-                      stream = NULL, encode = NULL, ...) {
+                      stream = NULL, encode = "multipart", ...) {
       curl_opts_check(...)
       url <- make_url(self$url, self$handle, path, query)
-      opts <- list(customrequest = "DELETE")
-      if (is.null(body)) {
-        opts$postfields <- raw(0)
-        opts$postfieldsize <- 0
-      }
-      rr <- list(
-        url = url,
-        method = "delete",
-        options = c(
-          opts,
-          useragent = make_ua()
-        ),
-        headers = self$headers,
-        fields = body
-      )
-      rr$options <- utils::modifyList(rr$options, c(self$opts, ...))
+      opts <- prep_body(body, encode)
+      rr <- prep_opts("delete", url, self, opts, ...)
       rr$disk <- disk
       rr$stream <- stream
       private$make_request(rr)
@@ -248,7 +215,8 @@ HttpClient <- R6::R6Class(
         ),
         headers = self$headers
       )
-      rr$options <- utils::modifyList(rr$options, c(self$opts, ...))
+      rr$options <- utils::modifyList(rr$options,
+                                      c(self$opts, self$proxies, ...))
       rr$disk <- disk
       rr$stream <- stream
       private$make_request(rr)
@@ -270,7 +238,14 @@ HttpClient <- R6::R6Class(
       }
       curl::handle_setheaders(opts$url$handle, .list = opts$headers)
       on.exit(curl::handle_reset(opts$url$handle), add = TRUE)
-      resp <- crul_fetch(opts)
+
+      if (crul_opts$mock) {
+        check_for_package("webmockr")
+        adap <- webmockr::CrulAdapter$new()
+        return(adap$handle_request(opts))
+      } else {
+        resp <- crul_fetch(opts)
+      }
 
       HttpResponse$new(
         method = opts$method,
@@ -278,7 +253,16 @@ HttpClient <- R6::R6Class(
         status_code = resp$status_code,
         request_headers = c(useragent = opts$options$useragent, opts$headers),
         response_headers = {
-          headers_parse(curl::parse_headers(rawToChar(resp$headers)))
+          if (grepl("^ftp://", resp$url)) {
+            list()
+          } else {
+            hh <- rawToChar(resp$headers %||% raw(0))
+            if (is.null(hh) || nchar(hh) == 0) {
+              list()
+            } else {
+              headers_parse(curl::parse_headers(hh))
+            }
+          }
         },
         modified = resp$modified,
         times = resp$times,
diff --git a/R/crul-package.r b/R/crul-package.r
index b3a6bbd..32f056a 100644
--- a/R/crul-package.r
+++ b/R/crul-package.r
@@ -1,18 +1,32 @@
-#' \strong{HTTP R client}
+#' **HTTP R client**
 #'
 #' @section Package API:
 #' \itemize{
-#'  \item \code{\link{HttpClient}} - create a connection client, set all
+#'  \item [HttpClient()] - create a connection client, set all
 #'  your http options, make http requests
-#'  \item \code{\link{HttpResponse}} - mostly for internal use, handles
+#'  \item [HttpResponse()] - mostly for internal use, handles
 #'  http responses
+#'  \item [Async()] - asynchronous requests
+#'  \item [AsyncVaried()] - varied asynchronous requests
+#'  \item [HttpRequest()] - generate an HTTP request, mostly for
+#'  use in building requests to be used in `Async` or `AsyncVaried`
+#'  \item [mock()] - Turn on/off mocking, via `webmockr`
+#'  \item [auth()] - Simple authentication helper
+#'  \item [proxy()] - Proxy helper
 #' }
 #'
 #' @section HTTP conditions:
-#' We use \code{fauxpas} if you have it installed for handling HTTP
+#' We use `fauxpas` if you have it installed for handling HTTP
 #' conditions but if it's not installed we use \pkg{httpcode}
 #'
-#' @import curl
+#' @section Mocking:
+#' Mocking HTTP requests is supported via the \pkg{webmockr}
+#' package. See [mock] for guidance.
+#'
+#' @importFrom curl curl_escape curl_fetch_disk curl_fetch_memory
+#' curl_fetch_stream curl_options curl_version handle_reset handle_setform
+#' handle_setheaders handle_setopt multi_add multi_cancel multi_list
+#' multi_run new_handle new_pool parse_headers
 #' @importFrom R6 R6Class
 #' @name crul-package
 #' @aliases crul
diff --git a/R/curl-options.R b/R/curl-options.R
index 95d5920..912ba8b 100644
--- a/R/curl-options.R
+++ b/R/curl-options.R
@@ -1,7 +1,16 @@
 #' curl options
 #'
+#' With the `opts` parameter you can pass in various
+#' curl options, including user agent string, whether to get verbose
+#' curl output or not, setting a timeout for requests, and more. See
+#' [curl::curl_options()] for all the options you can use.
+#'
+#' A progress helper is coming soon.
+#'
 #' @name curl-options
-#' @examples
+#' @aliases user-agent verbose timeout
+#'
+#' @examples \dontrun{
 #' # set curl options on client initialization
 #' (res <- HttpClient$new(
 #'   url = "https://httpbin.org",
@@ -16,6 +25,14 @@
 #' # or set curl options when performing HTTP operation
 #' (res <- HttpClient$new(url = "https://httpbin.org"))
 #' res$get('get', verbose = TRUE)
-#' \dontrun{res$get('get', stuff = "things")}
-#' \dontrun{res$get('get', httpget = TRUE)}
+#' res$get('get', stuff = "things")
+#' res$get('get', httpget = TRUE)
+#'
+#' # set a timeout
+#' (res <- HttpClient$new(
+#'   url = "https://httpbin.org",
+#'   opts = list(timeout_ms = 1)
+#' ))
+#' res$get('get')
+#' }
 NULL
diff --git a/R/delete-requests.R b/R/delete-requests.R
new file mode 100644
index 0000000..00ce790
--- /dev/null
+++ b/R/delete-requests.R
@@ -0,0 +1,18 @@
+#' HTTP DELETE requests
+#'
+#' @name delete-requests
+#' @examples
+#' (x <- HttpClient$new(url = "https://httpbin.org"))
+#'
+#' ## a list
+#' (res1 <- x$delete('delete', body = list(hello = "world"), verbose = TRUE))
+#' jsonlite::fromJSON(res1$parse("UTF-8"))
+#'
+#' ## a string
+#' (res2 <- x$delete('delete', body = "hello world", verbose = TRUE))
+#' jsonlite::fromJSON(res2$parse("UTF-8"))
+#'
+#' ## empty body request
+#' x$delete('delete', verbose = TRUE)
+#'
+NULL
diff --git a/R/handle.R b/R/handle.R
index 2f0068f..a3c8914 100644
--- a/R/handle.R
+++ b/R/handle.R
@@ -2,7 +2,7 @@
 #'
 #' @export
 #' @param url (character) A url. required.
-#' @param ... options passed on to \code{\link[curl]{new_handle}}
+#' @param ... options passed on to [curl::new_handle()]
 #' @examples
 #' handle("https://httpbin.org")
 #'
diff --git a/R/httprequest.R b/R/httprequest.R
new file mode 100644
index 0000000..176160a
--- /dev/null
+++ b/R/httprequest.R
@@ -0,0 +1,243 @@
+#' HTTP request object
+#'
+#' @export
+#' @template args
+#' @seealso [post-requests], [delete-requests],
+#' [http-headers], [writing-options]
+#'
+#' @details Unlike [HttpClient()], this R6 class does not perform actual
+#' HTTP requests - it is for building requests to be used for async HTTP
+#' requests in [AsyncVaried()].
+#'
+#' Note that you can access HTTP verbs after creating an `HttpRequest`
+#' object, just as you can with `HttpClient`. See examples for usage.
+#'
+#' Also note that when you call HTTP verbs on an `HttpRequest` object you
+#' don't need to assign the new object to a variable, as the new details
+#' you've added are added to the object itself.
+#'
+#' **Methods**
+#'   \describe{
+#'     \item{`get(path, query, disk, stream, ...)`}{
+#'       Define a GET request
+#'     }
+#'     \item{`post(path, query, body, disk, stream, ...)`}{
+#'       Define a POST request
+#'     }
+#'     \item{`put(path, query, body, disk, stream, ...)`}{
+#'       Define a PUT request
+#'     }
+#'     \item{`patch(path, query, body, disk, stream, ...)`}{
+#'       Define a PATCH request
+#'     }
+#'     \item{`delete(path, query, body, disk, stream, ...)`}{
+#'       Define a DELETE request
+#'     }
+#'     \item{`head(path, disk, stream, ...)`}{
+#'       Define a HEAD request
+#'     }
+#'     \item{`method()`}{
+#'       Get the HTTP method (if defined)
+#'       - returns character string
+#'     }
+#'   }
+#'
+#' See [HttpClient()] for information on parameters.
+#'
+#' @format NULL
+#' @usage NULL
+#'
+#' @examples
+#' x <- HttpRequest$new(url = "https://httpbin.org/get")
+#' ## note here how the HTTP method is shown on the first line to the right
+#' x$get()
+#'
+#' ## assign to a new object to keep the output
+#' z <- x$get()
+#' ### get the HTTP method
+#' z$method()
+#'
+#' (x <- HttpRequest$new(url = "https://httpbin.org/get")$get())
+#' x$url
+#' x$payload
+#'
+#' (x <- HttpRequest$new(url = "https://httpbin.org/post"))
+#' x$post(body = list(foo = "bar"))
+#'
+#' HttpRequest$new(
+#'   url = "https://httpbin.org/get",
+#'   headers = list(
+#'     `Content-Type` = "application/json"
+#'   )
+#' )
+HttpRequest <- R6::R6Class(
+  'HttpRequest',
+  public = list(
+    url = NULL,
+    opts = list(),
+    proxies = list(),
+    auth = list(),
+    headers = list(),
+    handle = NULL,
+    payload = NULL,
+
+    print = function(x, ...) {
+      cat(paste0("<crul http request> ", self$method()), sep = "\n")
+      cat(paste0("  url: ", if (is.null(self$url))
+        self$handle$url else self$url), sep = "\n")
+      cat("  curl options: ", sep = "\n")
+      for (i in seq_along(self$opts)) {
+        cat(sprintf("    %s: %s", names(self$opts)[i],
+                    self$opts[[i]]), sep = "\n")
+      }
+      cat("  proxies: ", sep = "\n")
+      if (length(self$proxies)) cat(paste("    -",
+                                          purl(self$proxies)), sep = "\n")
+      cat("  auth: ", sep = "\n")
+      if (length(self$auth$userpwd)) {
+        cat(paste("    -", self$auth$userpwd), sep = "\n")
+        cat(paste("    - type: ", self$auth$httpauth), sep = "\n")
+      }
+      cat("  headers: ", sep = "\n")
+      for (i in seq_along(self$headers)) {
+        cat(sprintf("    %s: %s", names(self$headers)[i],
+                    self$headers[[i]]), sep = "\n")
+      }
+      invisible(self)
+    },
+
+    initialize = function(url, opts, proxies, auth, headers, handle) {
+      if (!missing(url)) self$url <- url
+      if (!missing(opts)) self$opts <- opts
+      if (!missing(proxies)) {
+        if (!inherits(proxies, "proxy")) {
+          stop("proxies input must be of class proxy", call. = FALSE)
+        }
+        self$proxies <- proxies
+      }
+      if (!missing(auth)) self$auth <- auth
+      if (!missing(headers)) self$headers <- headers
+      if (!missing(handle)) self$handle <- handle
+      if (is.null(self$url) && is.null(self$handle)) {
+        stop("need one of url or handle", call. = FALSE)
+      }
+    },
+
+    get = function(path = NULL, query = list(), disk = NULL,
+                   stream = NULL, ...) {
+      curl_opts_check(...)
+      url <- make_url_async(self$url, self$handle, path, query)
+      rr <- list(
+        url = url,
+        method = "get",
+        options = list(
+          httpget = TRUE,
+          useragent = make_ua()
+        ),
+        headers = list(
+          `User-Agent` = make_ua(),
+          `Accept-Encoding` = 'gzip, deflate'
+        )
+      )
+      rr$headers <- norm_headers(rr$headers, self$headers)
+      rr$options <- utils::modifyList(
+        rr$options, c(self$opts, self$proxies, self$auth, ...))
+      rr$disk <- disk
+      rr$stream <- stream
+      self$payload <- rr
+      return(self)
+    },
+
+    post = function(path = NULL, query = list(), body = NULL, disk = NULL,
+                    stream = NULL, encode = "multipart", ...) {
+      curl_opts_check(...)
+      url <- make_url_async(self$url, self$handle, path, query)
+      opts <- prep_body(body, encode)
+      rr <- prep_opts("post", url, self, opts, ...)
+      rr$disk <- disk
+      rr$stream <- stream
+      self$payload <- rr
+      return(self)
+    },
+
+    put = function(path = NULL, query = list(), body = NULL, disk = NULL,
+                   stream = NULL, encode = "multipart", ...) {
+      curl_opts_check(...)
+      url <- make_url_async(self$url, self$handle, path, query)
+      opts <- prep_body(body, encode)
+      rr <- prep_opts("put", url, self, opts, ...)
+      rr$disk <- disk
+      rr$stream <- stream
+      self$payload <- rr
+      return(self)
+    },
+
+    patch = function(path = NULL, query = list(), body = NULL, disk = NULL,
+                     stream = NULL, encode = "multipart", ...) {
+      curl_opts_check(...)
+      url <- make_url_async(self$url, self$handle, path, query)
+      opts <- prep_body(body, encode)
+      rr <- prep_opts("patch", url, self, opts, ...)
+      rr$disk <- disk
+      rr$stream <- stream
+      self$payload <- rr
+      return(self)
+    },
+
+    delete = function(path = NULL, query = list(), body = NULL, disk = NULL,
+                      stream = NULL, encode = "multipart", ...) {
+      curl_opts_check(...)
+      url <- make_url_async(self$url, self$handle, path, query)
+      opts <- prep_body(body, encode)
+      rr <- prep_opts("delete", url, self, opts, ...)
+      rr$disk <- disk
+      rr$stream <- stream
+      self$payload <- rr
+      return(self)
+    },
+
+    head = function(path = NULL, disk = NULL, stream = NULL, ...) {
+      curl_opts_check(...)
+      url <- make_url_async(self$url, self$handle, path, NULL)
+      opts <- list(customrequest = "HEAD", nobody = TRUE)
+      rr <- list(
+        url = url,
+        method = "head",
+        options = c(
+          opts,
+          useragent = make_ua()
+        ),
+        headers = self$headers
+      )
+      rr$options <- utils::modifyList(rr$options,
+                                      c(self$opts, self$proxies, ...))
+      rr$disk <- disk
+      rr$stream <- stream
+      self$payload <- rr
+      return(self)
+    },
+
+    method = function() self$payload$method
+  )
+)
+
+make_url_async <- function(url = NULL, handle = NULL, path, query) {
+  if (!is.null(handle)) {
+    url <- handle$url
+  }
+
+  if (!is.null(path)) {
+    urltools::path(url) <- path
+  }
+
+  url <- gsub("\\s", "%20", url)
+  url <- add_query(query, url)
+
+  if (!is.null(handle)) {
+    curl::handle_setopt(handle, url = url)
+  } else {
+    handle <- curl::new_handle(url = url)
+  }
+
+  return(list(url = url, handle = handle))
+}
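
The `HttpRequest` class added above only *builds* request payloads; nothing goes over the wire until the prepared requests are handed to `AsyncVaried`. A minimal sketch of the builder side, assuming crul >= 0.3.8 is installed (no network calls are made here):

```r
library(crul)

# build two requests without performing them
req1 <- HttpRequest$new(url = "https://httpbin.org")$get("get")
req2 <- HttpRequest$new(url = "https://httpbin.org")$post(
  "post", body = list(hello = "world"))

# each object stores the prepared request in its $payload field
req1$method()         # "get"
req2$method()         # "post"
req1$payload$url$url  # the fully composed URL string
```

Because the verb methods return `self`, calls can also be chained directly off `HttpRequest$new()`, as the roxygen examples above show.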
diff --git a/R/make_url.R b/R/make_url.R
index 0a47616..0f7f834 100644
--- a/R/make_url.R
+++ b/R/make_url.R
@@ -37,3 +37,47 @@ add_query <- function(x, url) {
     return(url)
   }
 }
+
+#' Build and parse URLs
+#'
+#' @export
+#' @param url (character) a url
+#' @param path (character) a path
+#' @param query (list) a named list of query parameters
+#' @return `url_build` returns a character string URL; `url_parse`
+#' returns a list with URL components
+#' @examples
+#' url_build("https://httpbin.org")
+#' url_build("https://httpbin.org", "get")
+#' url_build("https://httpbin.org", "post")
+#' url_build("https://httpbin.org", "get", list(foo = "bar"))
+#'
+#' url_parse("httpbin.org")
+#' url_parse("http://httpbin.org")
+#' url_parse(url = "https://httpbin.org")
+#' url_parse("https://httpbin.org/get")
+#' url_parse("https://httpbin.org/get?foo=bar")
+#' url_parse("https://httpbin.org/get?foo=bar&stuff=things")
+#' url_parse("https://httpbin.org/get?foo=bar&stuff=things[]")
+url_build <- function(url, path = NULL, query = NULL) {
+  assert(url, "character")
+  assert(path, "character")
+  assert(query, "list")
+  if (!has_namez(query)) stop("all query elements must be named", call. = FALSE)
+  make_url(url, handle = NULL, path, query)$url
+}
+
+#' @export
+#' @rdname url_build
+url_parse <- function(url) {
+  tmp <- urltools::url_parse(url)
+  tmp <- as.list(tmp)
+  if (!is.na(tmp$parameter)) {
+    tmp$parameter <- unlist(
+      lapply(strsplit(tmp$parameter, "&")[[1]], function(x) {
+        z <- strsplit(x, split = "=")[[1]]
+        as.list(stats::setNames(z[2], z[1]))
+      }), FALSE)
+  }
+  return(tmp)
+}
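
As a quick sanity check on the two helpers added above: `url_build()` and `url_parse()` round-trip cleanly. A sketch, assuming crul is installed (no network involved):

```r
library(crul)

# compose a URL from base, path, and named query parameters
u <- url_build("https://httpbin.org", "get", list(foo = "bar"))
u            # "https://httpbin.org/get?foo=bar"

# decompose it again; the query string comes back as a named list
p <- url_parse(u)
p$scheme     # "https"
p$domain     # "httpbin.org"
p$parameter  # list(foo = "bar")
```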
diff --git a/R/mocking.R b/R/mocking.R
new file mode 100644
index 0000000..8db9343
--- /dev/null
+++ b/R/mocking.R
@@ -0,0 +1,36 @@
+#' Mocking HTTP requests
+#'
+#' @export
+#' @param on (logical) turn mocking on with `TRUE` or turn off with `FALSE`.
+#' Default: `FALSE`
+#' @details the \pkg{webmockr} package is required for mocking behavior
+#' @examples \dontrun{
+#' # load webmockr
+#' library(webmockr)
+#' library(crul)
+#'
+#' URL <- "https://httpbin.org"
+#'
+#' # turn on mocking
+#' crul::mock()
+#'
+#' # stub a request
+#' stub_request("get", file.path(URL, "get"))
+#' webmockr:::webmockr_stub_registry
+#'
+#' # create an HTTP client
+#' (x <- HttpClient$new(url = URL))
+#'
+#' # make a request - matches stub - no real request made
+#' x$get('get')
+#'
+#' # allow net connect
+#' webmockr::webmockr_allow_net_connect()
+#' x$get('get', query = list(foo = "bar"))
+#' webmockr::webmockr_disable_net_connect()
+#' x$get('get', query = list(foo = "bar"))
+#' }
+mock <- function(on = TRUE) {
+  check_for_package("webmockr")
+  crul_opts$mock <- on
+}
diff --git a/R/onLoad.R b/R/onLoad.R
new file mode 100644
index 0000000..2d894e7
--- /dev/null
+++ b/R/onLoad.R
@@ -0,0 +1,6 @@
+crul_opts <- NULL
+
+.onLoad <- function(libname, pkgname) {
+  crul_opts <<- new.env()
+  crul_opts$mock <- FALSE
+}
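
The `crul_opts` object above is the standard base-R pattern for package-level mutable state: environments are modified in place, so `mock()` can flip the flag later without reassigning the binding. A standalone sketch of the pattern (pure base R, no crul needed):

```r
# package-level state: an environment is mutable in place,
# unlike a list, which would be copied on modification
opts <- new.env(parent = emptyenv())
opts$mock <- FALSE

# any function in the package can toggle the flag in place
set_mock <- function(on = TRUE) opts$mock <- on

set_mock(TRUE)
opts$mock  # TRUE
```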
diff --git a/R/post-requests.R b/R/post-requests.R
index 6a43f8a..af50f0c 100644
--- a/R/post-requests.R
+++ b/R/post-requests.R
@@ -1,16 +1,21 @@
-#' HTTP POST requests
+#' HTTP POST/PUT/PATCH requests
 #'
 #' @name post-requests
 #' @examples
 #' (x <- HttpClient$new(url = "https://httpbin.org"))
 #'
-#' # post request
-#' (res_post <- x$post('post', body = list(hello = "world")))
+#' # POST requests
+#' ## a list
+#' (res_post <- x$post('post', body = list(hello = "world"), verbose = TRUE))
+#'
+#' ## a string
+#' (res_post <- x$post('post', body = "hello world", verbose = TRUE))
 #'
 #' ## empty body request
 #' x$post('post')
 #'
 #' ## form requests
+#' \dontrun{
 #' (cli <- HttpClient$new(
 #'   url = "http://apps.kew.org/wcsp/advsearch.do"
 #' ))
@@ -23,6 +28,7 @@
 #'     selectedLevel = 'cont'
 #'   )
 #' )
+#' }
 #'
 #' (x <- HttpClient$new(url = "https://httpbin.org"))
 #' res <- x$post("post",
@@ -34,7 +40,37 @@
 #' )
 #' jsonlite::fromJSON(res$parse())
 #'
+#' # PUT requests
+#' (x <- HttpClient$new(url = "https://httpbin.org"))
+#' (res <- x$put(path = "put",
+#'   encode = "json",
+#'   body = list(
+#'     genus = 'Gagea',
+#'     species = 'pratensis'
+#'   )
+#' ))
+#' jsonlite::fromJSON(res$parse("UTF-8"))
+#'
+#' res <- x$put("put", body = "foo bar")
+#' jsonlite::fromJSON(res$parse("UTF-8"))
+#'
+#'
+#' # PATCH requests
+#' (x <- HttpClient$new(url = "https://httpbin.org"))
+#' (res <- x$patch(path = "patch",
+#'   encode = "json",
+#'   body = list(
+#'     genus = 'Gagea',
+#'     species = 'pratensis'
+#'   )
+#' ))
+#' jsonlite::fromJSON(res$parse("UTF-8"))
+#'
+#' res <- x$patch("patch", body = "foo bar")
+#' jsonlite::fromJSON(res$parse("UTF-8"))
+#'
 #'
+#' # Upload files - STILL WORKING ON THIS
 #' # path <- file.path(Sys.getenv("R_DOC_DIR"), "html/logo.jpg")
 #' # (x <- HttpClient$new(url = "https://httpbin.org"))
 #' # x$post("post",
diff --git a/R/proxies.R b/R/proxies.R
new file mode 100644
index 0000000..abcf5ac
--- /dev/null
+++ b/R/proxies.R
@@ -0,0 +1,98 @@
+#' proxy options
+#'
+#' @name proxies
+#' @param url (character) URL, with scheme (http/https), domain and
+#' port (must be numeric). required.
+#' @param user (character) username, optional
+#' @param pwd (character) password, optional
+#' @param auth (character) authentication type, one of basic (default),
+#' digest, digest_ie, gssnegotiate, ntlm, or any. optional
+#'
+#' @details See <http://proxylist.hidemyass.com/> for a list of proxies you
+#' can use
+#'
+#' @examples
+#' proxy("http://97.77.104.22:3128")
+#' proxy("97.77.104.22:3128")
+#' proxy("http://97.77.104.22:3128", "foo", "bar")
+#' proxy("http://97.77.104.22:3128", "foo", "bar", auth = "digest")
+#' proxy("http://97.77.104.22:3128", "foo", "bar", auth = "ntlm")
+#'
+#' # with proxy (look at request/outgoing headers)
+#' (res <- HttpClient$new(
+#'   url = "http://www.google.com",
+#'   proxies = proxy("http://97.77.104.22:3128")
+#' ))
+#' res$proxies
+#' \dontrun{res$get(verbose = TRUE)}
+#'
+#' # vs. without proxy (look at request/outgoing headers)
+#' (res2 <- HttpClient$new(url = "http://www.google.com"))
+#' res2$get(verbose = TRUE)
+#'
+#'
+#' # Use authentication
+#' (res <- HttpClient$new(
+#'   url = "http://google.com",
+#'   proxies = proxy("http://97.77.104.22:3128", user = "foo", pwd = "bar")
+#' ))
+#'
+#' # another example
+#' (res <- HttpClient$new(
+#'   url = "http://ip.tyk.nu/",
+#'   proxies = proxy("http://200.29.191.149:3128")
+#' ))
+#' \dontrun{res$get()$parse("UTF-8")}
+NULL
+
+#' @export
+#' @rdname proxies
+proxy <- function(url, user = NULL, pwd = NULL, auth = "basic") {
+  url <- proxy_url(url)
+  structure(ccp(list(
+    proxy = url$domain,
+    proxyport = url$port,
+    proxyuserpwd = make_up(user, pwd),
+    proxyauth = auth_type(auth)
+  )), class = "proxy")
+}
+
+proxy_url <- function(x) {
+  tmp <- tryCatch(urltools::url_parse(x), error = function(e) e)
+  if (inherits(tmp, "error")) {
+    stop("proxy URL not of correct form, check your URL", call. = FALSE)
+  }
+  port <- tryCatch(as.numeric(tmp$port), warning = function(w) w)
+  if (inherits(port, "warning")) stop("port ", tmp$port, " was not numeric",
+                                      call. = FALSE)
+  tmp$port <- port
+  as.list(tmp)
+}
+
+make_up <- function(user, pwd) {
+  assert(user, "character")
+  assert(pwd, "character")
+  if (!is.null(user) || !is.null(pwd)) {
+    return(paste0(user, ":", pwd))
+  }
+  NULL
+}
+
+auth_type <- function(x) {
+  stopifnot(inherits(x, "character"))
+  switch(
+    x,
+    basic = 1,
+    digest = 2,
+    digest_ie = 16,
+    gssnegotiate = 4,
+    ntlm = 8,
+    any = -17,
+    stop("auth not in acceptable set, see ?proxies", call. = FALSE)
+  )
+}
+
+purl <- function(x) {
+  sprintf("http://%s:%s (auth: %s)",
+          x$proxy, x$proxyport, !is.null(x$proxyuserpwd))
+}
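
The `proxy()` helper added above reduces to a classed list of curl options, which `HttpClient` later merges into the request via `modifyList()`. A sketch of what it produces, assuming crul is installed (the proxy address is just a placeholder; no connection is made):

```r
library(crul)

p <- proxy("http://97.77.104.22:3128", user = "foo", pwd = "bar",
           auth = "digest")
class(p)        # "proxy"
p$proxy         # "97.77.104.22"
p$proxyport     # 3128
p$proxyuserpwd  # "foo:bar"
p$proxyauth     # 2, i.e. curl's CURLAUTH_DIGEST constant
```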
diff --git a/R/query.R b/R/query.R
index c6aa3c7..75f2f4c 100644
--- a/R/query.R
+++ b/R/query.R
@@ -5,6 +5,10 @@ encode <- function(x) {
   curl::curl_escape(x)
 }
 
+has_namez <- function(x) {
+  length(Filter(nzchar, names(x))) == length(x)
+}
+
 # adapted from https://github.com/hadley/httr
 has_name <- function(x) {
   nms <- names(x)
diff --git a/R/response.R b/R/response.R
index e1bb5a0..b9c9f70 100644
--- a/R/response.R
+++ b/R/response.R
@@ -1,26 +1,34 @@
 #' Base response object
 #'
 #' @export
-#' @param url (character) A url
+#' @param url (character) A url, required
 #' @param opts (list) curl options
 #' @param handle A handle
+#' @param method (character) HTTP method
+#' @param status_code (integer) status code
+#' @param request_headers (list) request headers, named list
+#' @param response_headers (list) response headers, named list
+#' @param modified (character) modified date
+#' @param times (vector) named vector
+#' @param content (raw) raw binary content response
+#' @param request request object, with all details
 #' @details
-#' \strong{Methods}
+#' **Methods**
 #'   \describe{
-#'     \item{\code{parse()}}{
+#'     \item{`parse()`}{
 #'       Parse the raw response content to text
 #'     }
-#'     \item{\code{success()}}{
+#'     \item{`success()`}{
 #'       Was status code less than or equal to 201.
 #'       returns boolean
 #'     }
-#'     \item{\code{status_http()}}{
+#'     \item{`status_http()`}{
 #'       Get HTTP status code, message, and explanation
 #'     }
-#'     \item{\code{raise_for_status()}}{
+#'     \item{`raise_for_status()`}{
 #'       Check HTTP status and stop with appropriate
 #'       HTTP error code and message if >= 300.
-#'       - If you have \code{fauxpas} installed we use that,
+#'       - If you have `fauxpas` installed we use that,
 #'       otherwise use \pkg{httpcode}
 #'     }
 #'   }
@@ -68,14 +76,23 @@ HttpResponse <- R6::R6Class(
     print = function(x, ...) {
       cat("<crul response> ", sep = "\n")
       cat(paste0("  url: ", self$url), sep = "\n")
+
       cat("  request_headers: ", sep = "\n")
-      for (i in seq_along(self$request_headers)) {
-        cat(sprintf("    %s: %s", names(self$request_headers)[i], self$request_headers[[i]]), sep = "\n")
+      if (length(self$request_headers)) {
+        for (i in seq_along(self$request_headers)) {
+          cat(sprintf("    %s: %s", names(self$request_headers)[i],
+                      self$request_headers[[i]]), sep = "\n")
+        }
       }
+
       cat("  response_headers: ", sep = "\n")
-      for (i in seq_along(self$response_headers)) {
-        cat(sprintf("    %s: %s", names(self$response_headers)[i], self$response_headers[[i]]), sep = "\n")
+      if (length(self$response_headers)) {
+        for (i in seq_along(self$response_headers)) {
+          cat(sprintf("    %s: %s", names(self$response_headers)[i],
+                      self$response_headers[[i]]), sep = "\n")
+        }
       }
+
       params <- parse_params(self$url)
       if (!is.null(params)) {
         cat("  params: ", sep = "\n")
@@ -83,14 +100,17 @@ HttpResponse <- R6::R6Class(
           cat(paste0("    ", sub("=", ": ", params[[i]], "=")), sep = "\n")
         }
       }
-      if (!is.null(self$status_code)) cat(paste0("  status: ", self$status_code), sep = "\n")
+      if (!is.null(self$status_code)) cat(paste0("  status: ",
+                                                 self$status_code), sep = "\n")
       invisible(self)
     },
 
-    initialize = function(method, url, opts, handle, status_code, request_headers,
-                          response_headers, modified, times, content, request) {
+    initialize = function(method, url, opts, handle, status_code,
+                          request_headers, response_headers, modified, times,
+                          content, request) {
+
       if (!missing(method)) self$method <- method
-      if (!missing(url)) self$url <- url
+      self$url <- url
       if (!missing(opts)) self$opts <- opts
       if (!missing(handle)) self$handle <- handle
       if (!missing(status_code)) self$status_code <- as.numeric(status_code)
diff --git a/R/response.R b/R/stubbed-response.R
similarity index 61%
copy from R/response.R
copy to R/stubbed-response.R
index e1bb5a0..ce90e44 100644
--- a/R/response.R
+++ b/R/stubbed-response.R
@@ -1,4 +1,4 @@
-#' Base response object
+#' stubbed response object
 #'
 #' @export
 #' @param url (character) A url
@@ -27,31 +27,11 @@
 #' @format NULL
 #' @usage NULL
 #' @examples
-#' x <- HttpResponse$new(method = "get", url = "https://httpbin.org")
+#' (x <- HttpStubbedResponse$new(method = "get", url = "https://httpbin.org"))
 #' x$url
 #' x$method
-#'
-#' x <- HttpClient$new(url = 'https://httpbin.org')
-#' (res <- x$get('get'))
-#' res$request_headers
-#' res$response_headers
-#' res$parse()
-#' res$status_code
-#' res$status_http()
-#' res$status_http()$status_code
-#' res$status_http()$message
-#' res$status_http()$explanation
-#' res$success()
-#'
-#' x <- HttpClient$new(url = 'https://httpbin.org/status/404')
-#' (res <- x$get())
-#'  \dontrun{res$raise_for_status()}
-#'
-#' x <- HttpClient$new(url = 'https://httpbin.org/status/414')
-#' (res <- x$get())
-#'  \dontrun{res$raise_for_status()}
-HttpResponse <- R6::R6Class(
-  'HttpResponse',
+HttpStubbedResponse <- R6::R6Class(
+  'HttpStubbedResponse',
   public = list(
     method = NULL,
     url = NULL,
@@ -59,23 +39,17 @@ HttpResponse <- R6::R6Class(
     handle = NULL,
     status_code = NULL,
     request_headers = NULL,
-    response_headers = NULL,
-    modified = NULL,
-    times = NULL,
     content = NULL,
     request = NULL,
 
     print = function(x, ...) {
-      cat("<crul response> ", sep = "\n")
+      cat("<crul stubbed response> ", sep = "\n")
       cat(paste0("  url: ", self$url), sep = "\n")
       cat("  request_headers: ", sep = "\n")
       for (i in seq_along(self$request_headers)) {
         cat(sprintf("    %s: %s", names(self$request_headers)[i], self$request_headers[[i]]), sep = "\n")
       }
-      cat("  response_headers: ", sep = "\n")
-      for (i in seq_along(self$response_headers)) {
-        cat(sprintf("    %s: %s", names(self$response_headers)[i], self$response_headers[[i]]), sep = "\n")
-      }
+      cat("  response_headers: NULL", sep = "\n")
       params <- parse_params(self$url)
       if (!is.null(params)) {
         cat("  params: ", sep = "\n")
@@ -88,22 +62,18 @@ HttpResponse <- R6::R6Class(
     },
 
     initialize = function(method, url, opts, handle, status_code, request_headers,
-                          response_headers, modified, times, content, request) {
+                          content, request) {
       if (!missing(method)) self$method <- method
       if (!missing(url)) self$url <- url
       if (!missing(opts)) self$opts <- opts
       if (!missing(handle)) self$handle <- handle
       if (!missing(status_code)) self$status_code <- as.numeric(status_code)
       if (!missing(request_headers)) self$request_headers <- request_headers
-      if (!missing(response_headers)) self$response_headers <- response_headers
-      if (!missing(modified)) self$modified <- modified
-      if (!missing(times)) self$times <- times
       if (!missing(content)) self$content <- content
       if (!missing(request)) self$request <- request
     },
 
     parse = function(encoding = NULL) {
-      # readBin(self$content, character())
       iconv(readBin(self$content, character()),
             from = guess_encoding(encoding),
             to = "UTF-8")
@@ -129,27 +99,3 @@ HttpResponse <- R6::R6Class(
     }
   )
 )
-
-guess_encoding <- function(encoding = NULL) {
-  if (!is.null(encoding)) {
-    return(check_encoding(encoding))
-  } else {
-    message("No encoding supplied: defaulting to UTF-8.")
-    return("UTF-8")
-  }
-}
-
-check_encoding <- function(x) {
-  if ((tolower(x) %in% tolower(iconvlist()))) return(x)
-  message("Invalid encoding ", x, ": defaulting to UTF-8.")
-  "UTF-8"
-}
-
-parse_params <- function(x) {
-  x <- urltools::parameters(x)
-  if (is.na(x)) {
-    NULL
-  } else {
-    strsplit(x, "&")[[1]]
-  }
-}
diff --git a/R/zzz.R b/R/zzz.R
index e4d2325..a9be0a5 100644
--- a/R/zzz.R
+++ b/R/zzz.R
@@ -1,3 +1,59 @@
 `%||%` <- function(x, y) if (is.null(x)) y else x
 
 ccp <- function(x) Filter(Negate(is.null), x)
+
+assert <- function(x, y) {
+  if (!is.null(x)) {
+    if (!class(x) %in% y) {
+      stop(deparse(substitute(x)), " must be of class ",
+           paste0(y, collapse = ", "), call. = FALSE)
+    }
+  }
+}
+
+prep_opts <- function(method, url, self, opts, ...) {
+  if (method != "post") {
+    opts$opts$post <- NULL
+    opts$opts$customrequest <- toupper(method)
+  }
+  if (!is.null(opts$type)) {
+    if (nchar(opts$type[[1]]) == 0) {
+      opts$type <- NULL
+    }
+  }
+  rr <- list(
+    url = url,
+    method = method,
+    options = as.list(c(
+      opts$opts
+    )),
+    headers = as.list(c(
+      opts$type,
+      `User-Agent` = make_ua(),
+      `Accept-Encoding` = 'gzip, deflate'
+    )),
+    fields = opts$fields
+  )
+  rr$headers <- norm_headers(rr$headers, self$headers)
+  rr$options <- utils::modifyList(
+    rr$options,
+    c(self$opts, self$proxies, self$auth, ...)
+  )
+  return(rr)
+}
+
+norm_headers <- function(x, y) {
+  if (length(names(y)) > 0) {
+    x <- x[!names(x) %in% names(y)]
+    x <- c(x, y)
+  }
+  return(x)
+}
+
+check_for_package <- function(x) {
+  if (!requireNamespace(x, quietly = TRUE)) {
+    stop(sprintf("Please install '%s'", x), call. = FALSE)
+  } else {
+    invisible(TRUE)
+  }
+}
diff --git a/README.md b/README.md
index 188ce5d..a49a2a2 100644
--- a/README.md
+++ b/README.md
@@ -3,12 +3,33 @@ crul
 
 
 
+[![Project Status: Active - The project has reached a stable, usable state and is being actively developed.](http://www.repostatus.org/badges/latest/active.svg)](http://www.repostatus.org/#active)
 [![Build Status](https://travis-ci.org/ropensci/crul.svg?branch=master)](https://travis-ci.org/ropensci/crul)
 [![codecov](https://codecov.io/gh/ropensci/crul/branch/master/graph/badge.svg)](https://codecov.io/gh/ropensci/crul)
 [![rstudio mirror downloads](http://cranlogs.r-pkg.org/badges/crul)](https://github.com/metacran/cranlogs.app)
-[![cran version](http://www.r-pkg.org/badges/version/crul)](https://cran.r-project.org/package=crul)
+[![cran version](https://www.r-pkg.org/badges/version/crul)](https://cran.r-project.org/package=crul)
 
-An HTTP client, taking inspiration from Rubyland's [faraday](https://rubygems.org/gems/faraday).
+An HTTP client, taking inspiration from Ruby's [faraday](https://rubygems.org/gems/faraday) and Python's [requests](http://docs.python-requests.org/en/master/).
+
+Package API:
+
+* `HttpClient` - Main interface to making HTTP requests. Synchronous requests only.
+* `HttpResponse` - HTTP response object, used for all responses across the
+different clients.
+* `Async` - Asynchronous HTTP requests - a simple interface for many URLs,
+similar to `HttpClient` - all URLs are treated the same.
+* `AsyncVaried` - Asynchronous HTTP requests - accepts any number of `HttpRequest`
+objects - its interface differs from `HttpClient`/`Async` because each request
+can use a different HTTP method, options, etc.
+* `HttpRequest` - HTTP request object, used for `AsyncVaried`
+* `mock()` - Turn on/off mocking, via `webmockr`
+* `auth()` - Simple authentication helper
+* `proxy()` - Proxy helper
+
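+A minimal sketch of the `auth()` and `proxy()` helpers (the proxy host, port,
+and credentials below are placeholders, not real endpoints):
+
+```r
+library("crul")
+
+# basic authentication on a client
+cli <- HttpClient$new(
+  url = "https://httpbin.org",
+  auth = auth(user = "foo", pwd = "bar")
+)
+
+# route requests through a proxy (hypothetical host/port)
+cli_prox <- HttpClient$new(
+  url = "https://httpbin.org",
+  proxies = proxy("http://97.77.104.22:3128")
+)
+```
+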
+Mocking:
+
+`crul` now integrates with [webmockr](https://github.com/ropensci/webmockr) to mock 
+HTTP requests.
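+
+A minimal sketch of mocking (assumes `webmockr` is installed; the stub below is
+illustrative):
+
+```r
+library("crul")
+library("webmockr")
+
+mock()  # turn on mocking for crul requests
+stub_request("get", "https://httpbin.org/get")
+
+cli <- HttpClient$new(url = "https://httpbin.org")
+res <- cli$get("get")  # served by the stub, no real network call
+mock(FALSE)  # turn mocking back off
+```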
 
 ## Installation
 
@@ -23,7 +44,6 @@ Dev version
 
 
 ```r
-install.packages("devtools")
 devtools::install_github("ropensci/crul")
 ```
 
@@ -49,8 +69,10 @@ library("crul")
 ))
 #> <crul connection> 
 #>   url: https://httpbin.org
-#>   options: 
+#>   curl options: 
 #>     timeout: 1
+#>   proxies: 
+#>   auth: 
 #>   headers: 
 #>     a: hello world
 ```
@@ -122,14 +144,15 @@ res$content
 #>  [47] 6f 20 77 6f 72 6c 64 22 2c 20 0a 20 20 20 20 22 41 63 63 65 70 74 22
 #>  [70] 3a 20 22 2a 2f 2a 22 2c 20 0a 20 20 20 20 22 41 63 63 65 70 74 2d 45
 #>  [93] 6e 63 6f 64 69 6e 67 22 3a 20 22 67 7a 69 70 2c 20 64 65 66 6c 61 74
-#> [116] 65 22 2c 20 0a 20 20 20 20 22 48 6f 73 74 22 3a 20 22 68 74 74 70 62
-#> [139] 69 6e 2e 6f 72 67 22 2c 20 0a 20 20 20 20 22 55 73 65 72 2d 41 67 65
-#> [162] 6e 74 22 3a 20 22 6c 69 62 63 75 72 6c 2f 37 2e 35 31 2e 30 20 72 2d
-#> [185] 63 75 72 6c 2f 32 2e 33 20 63 72 75 6c 2f 30 2e 32 2e 30 22 0a 20 20
-#> [208] 7d 2c 20 0a 20 20 22 6f 72 69 67 69 6e 22 3a 20 22 37 31 2e 36 33 2e
-#> [231] 32 32 33 2e 31 31 33 22 2c 20 0a 20 20 22 75 72 6c 22 3a 20 22 68 74
-#> [254] 74 70 73 3a 2f 2f 68 74 74 70 62 69 6e 2e 6f 72 67 2f 67 65 74 22 0a
-#> [277] 7d 0a
+#> [116] 65 22 2c 20 0a 20 20 20 20 22 43 6f 6e 6e 65 63 74 69 6f 6e 22 3a 20
+#> [139] 22 63 6c 6f 73 65 22 2c 20 0a 20 20 20 20 22 48 6f 73 74 22 3a 20 22
+#> [162] 68 74 74 70 62 69 6e 2e 6f 72 67 22 2c 20 0a 20 20 20 20 22 55 73 65
+#> [185] 72 2d 41 67 65 6e 74 22 3a 20 22 6c 69 62 63 75 72 6c 2f 37 2e 35 31
+#> [208] 2e 30 20 72 2d 63 75 72 6c 2f 32 2e 36 20 63 72 75 6c 2f 30 2e 33 2e
+#> [231] 38 22 0a 20 20 7d 2c 20 0a 20 20 22 6f 72 69 67 69 6e 22 3a 20 22 31
+#> [254] 35 37 2e 31 33 30 2e 31 37 39 2e 38 36 22 2c 20 0a 20 20 22 75 72 6c
+#> [277] 22 3a 20 22 68 74 74 70 73 3a 2f 2f 68 74 74 70 62 69 6e 2e 6f 72 67
+#> [300] 2f 67 65 74 22 0a 7d 0a
 ```
 
 HTTP method
@@ -145,8 +168,11 @@ Request headers
 
 ```r
 res$request_headers
-#> $useragent
-#> [1] "libcurl/7.51.0 r-curl/2.3 crul/0.2.0"
+#> $`User-Agent`
+#> [1] "libcurl/7.51.0 r-curl/2.6 crul/0.3.8"
+#> 
+#> $`Accept-Encoding`
+#> [1] "gzip, deflate"
 #> 
 #> $a
 #> [1] "hello world"
@@ -160,26 +186,35 @@ res$response_headers
 #> $status
 #> [1] "HTTP/1.1 200 OK"
 #> 
+#> $connection
+#> [1] "keep-alive"
+#> 
 #> $server
-#> [1] "nginx"
+#> [1] "meinheld/0.6.1"
 #> 
 #> $date
-#> [1] "Tue, 03 Jan 2017 05:51:33 GMT"
+#> [1] "Tue, 13 Jun 2017 22:25:04 GMT"
 #> 
 #> $`content-type`
 #> [1] "application/json"
 #> 
-#> $`content-length`
-#> [1] "278"
-#> 
-#> $connection
-#> [1] "keep-alive"
-#> 
 #> $`access-control-allow-origin`
 #> [1] "*"
 #> 
 #> $`access-control-allow-credentials`
 #> [1] "true"
+#> 
+#> $`x-powered-by`
+#> [1] "Flask"
+#> 
+#> $`x-processed-time`
+#> [1] "0.000868082046509"
+#> 
+#> $`content-length`
+#> [1] "307"
+#> 
+#> $via
+#> [1] "1.1 vegur"
 ```
 
 And you can parse the content with `parse()`
@@ -188,7 +223,7 @@ And you can parse the content with `parse()`
 ```r
 res$parse()
 #> No encoding supplied: defaulting to UTF-8.
-#> [1] "{\n  \"args\": {}, \n  \"headers\": {\n    \"A\": \"hello world\", \n    \"Accept\": \"*/*\", \n    \"Accept-Encoding\": \"gzip, deflate\", \n    \"Host\": \"httpbin.org\", \n    \"User-Agent\": \"libcurl/7.51.0 r-curl/2.3 crul/0.2.0\"\n  }, \n  \"origin\": \"71.63.223.113\", \n  \"url\": \"https://httpbin.org/get\"\n}\n"
+#> [1] "{\n  \"args\": {}, \n  \"headers\": {\n    \"A\": \"hello world\", \n    \"Accept\": \"*/*\", \n    \"Accept-Encoding\": \"gzip, deflate\", \n    \"Connection\": \"close\", \n    \"Host\": \"httpbin.org\", \n    \"User-Agent\": \"libcurl/7.51.0 r-curl/2.6 crul/0.3.8\"\n  }, \n  \"origin\": \"157.130.179.86\", \n  \"url\": \"https://httpbin.org/get\"\n}\n"
 jsonlite::fromJSON(res$parse())
 #> No encoding supplied: defaulting to UTF-8.
 #> $args
@@ -204,15 +239,18 @@ jsonlite::fromJSON(res$parse())
 #> $headers$`Accept-Encoding`
 #> [1] "gzip, deflate"
 #> 
+#> $headers$Connection
+#> [1] "close"
+#> 
 #> $headers$Host
 #> [1] "httpbin.org"
 #> 
 #> $headers$`User-Agent`
-#> [1] "libcurl/7.51.0 r-curl/2.3 crul/0.2.0"
+#> [1] "libcurl/7.51.0 r-curl/2.6 crul/0.3.8"
 #> 
 #> 
 #> $origin
-#> [1] "71.63.223.113"
+#> [1] "157.130.179.86"
 #> 
 #> $url
 #> [1] "https://httpbin.org/get"
@@ -224,16 +262,88 @@ jsonlite::fromJSON(res$parse())
 ```r
 res <- HttpClient$new(url = "http://api.gbif.org/v1/occurrence/search")
 res$get(query = list(limit = 100), timeout_ms = 100)
-#> Error in curl::curl_fetch_memory(x$url$url, handle = x$url$handle) : 
+#> Error in curl::curl_fetch_memory(x$url$url, handle = x$url$handle) :
 #>   Timeout was reached
 ```
 
+## Asynchronous requests
+
+The simpler interface allows many requests (many URLs), but they all share the
+same options, headers, etc., and use the same HTTP method:
+
+
+```r
+(cc <- Async$new(
+  urls = c(
+    'https://httpbin.org/',
+    'https://httpbin.org/get?a=5',
+    'https://httpbin.org/get?foo=bar'
+  )
+))
+#> <crul async connection> 
+#>   urls: 
+#>    https://httpbin.org/
+#>    https://httpbin.org/get?a=5
+#>    https://httpbin.org/get?foo=bar
+res <- cc$get()
+lapply(res, function(z) z$parse("UTF-8"))
+#> [[1]]
+#> [1] "<!DOCTYPE html>\n<html>\n<head>\n  <meta http-equiv='content-type' value='text/html;charset=utf8'>\n  <meta name='generator' value='Ronn/v0.7.3 (http://github.com/rtomayko/ronn/tree/0.7.3)'>\n  <title>httpbin(1): HTTP Client Testing Service</title>\n  <style type='text/css' media='all'>\n  /* style: man */\n  body#manpage {margin:0}\n  .mp {max-width:100ex;padding:0 9ex 1ex 4ex}\n  .mp p,.mp pre,.mp ul,.mp ol,.mp dl {margin:0 0 20px 0}\n  .mp h2 {margin:10px 0 0 0}\n  .mp > p,.mp [...]
+#> 
+#> [[2]]
+#> [1] "{\n  \"args\": {\n    \"a\": \"5\"\n  }, \n  \"headers\": {\n    \"Accept\": \"*/*\", \n    \"Accept-Encoding\": \"gzip, deflate\", \n    \"Connection\": \"close\", \n    \"Host\": \"httpbin.org\", \n    \"User-Agent\": \"libcurl/7.51.0 r-curl/2.6 crul/0.3.8\"\n  }, \n  \"origin\": \"157.130.179.86\", \n  \"url\": \"https://httpbin.org/get?a=5\"\n}\n"
+#> 
+#> [[3]]
+#> [1] "{\n  \"args\": {\n    \"foo\": \"bar\"\n  }, \n  \"headers\": {\n    \"Accept\": \"*/*\", \n    \"Accept-Encoding\": \"gzip, deflate\", \n    \"Connection\": \"close\", \n    \"Host\": \"httpbin.org\", \n    \"User-Agent\": \"libcurl/7.51.0 r-curl/2.6 crul/0.3.8\"\n  }, \n  \"origin\": \"157.130.179.86\", \n  \"url\": \"https://httpbin.org/get?foo=bar\"\n}\n"
+```
+
+The `AsyncVaried` interface accepts any number of `HttpRequest` objects, each of
+which can use any HTTP method with its own options and headers:
+
+
+```r
+req1 <- HttpRequest$new(
+  url = "https://httpbin.org/get",
+  opts = list(verbose = TRUE),
+  headers = list(foo = "bar")
+)$get()
+req2 <- HttpRequest$new(url = "https://httpbin.org/post")$post()
+out <- AsyncVaried$new(req1, req2)
+```
+
+Execute the requests
+
+
+```r
+out$request()
+```
+
+Then methods can be applied across all responses:
+
+
+```r
+out$status()
+#> [[1]]
+#> <Status code: 200>
+#>   Message: OK
+#>   Explanation: Request fulfilled, document follows
+#> 
+#> [[2]]
+#> <Status code: 200>
+#>   Message: OK
+#>   Explanation: Request fulfilled, document follows
+out$parse()
+#> [1] "{\n  \"args\": {}, \n  \"headers\": {\n    \"Accept\": \"*/*\", \n    \"Accept-Encoding\": \"gzip, deflate\", \n    \"Connection\": \"close\", \n    \"Foo\": \"bar\", \n    \"Host\": \"httpbin.org\", \n    \"User-Agent\": \"libcurl/7.51.0 r-curl/2.6 crul/0.3.8\"\n  }, \n  \"origin\": \"157.130.179.86\", \n  \"url\": \"https://httpbin.org/get\"\n}\n"                                                                                                                                      [...]
+#> [2] "{\n  \"args\": {}, \n  \"data\": \"\", \n  \"files\": {}, \n  \"form\": {}, \n  \"headers\": {\n    \"Accept\": \"*/*\", \n    \"Accept-Encoding\": \"gzip, deflate\", \n    \"Connection\": \"close\", \n    \"Content-Length\": \"0\", \n    \"Content-Type\": \"application/x-www-form-urlencoded\", \n    \"Host\": \"httpbin.org\", \n    \"User-Agent\": \"libcurl/7.51.0 r-curl/2.6 crul/0.3.8\"\n  }, \n  \"json\": null, \n  \"origin\": \"157.130.179.86\", \n  \"url\": \"https://httpbin [...]
+```
+
 ## TO DO
 
+### http caching
+
 Add integration for:
 
-* [webmockr](https://github.com/ropensci/webmockr)
-* [vcr](https://github.com/ropensci/vcr) 
+* [vcr](https://github.com/ropensci/vcr)
 
 for flexible and easy HTTP request caching
 
@@ -245,4 +355,3 @@ for flexible and easy HTTP request caching
 * Please note that this project is released with a [Contributor Code of Conduct](CONDUCT.md).
 By participating in this project you agree to abide by its terms.
 
-[![ropensci_footer](https://ropensci.org/public_images/github_footer.png)](https://ropensci.org)
diff --git a/build/vignette.rds b/build/vignette.rds
index c57c5ff..b778fba 100644
Binary files a/build/vignette.rds and b/build/vignette.rds differ
diff --git a/inst/doc/async.Rmd b/inst/doc/async.Rmd
new file mode 100644
index 0000000..c0adf23
--- /dev/null
+++ b/inst/doc/async.Rmd
@@ -0,0 +1,263 @@
+<!--
+%\VignetteIndexEntry{async with crul}
+%\VignetteEngine{knitr::rmarkdown}
+%\VignetteEncoding{UTF-8}
+-->
+
+async with crul
+===============
+
+
+
+Asynchronous requests with `crul`.
+
+There are two interfaces to asynchronous requests in `crul`:
+
+1. Simple async: any number of URLs, all treated with the same curl options, 
+headers, etc., and only one HTTP method type at a time.
+2. Varied request async: build any type of request and execute all asynchronously.
+
+The first option takes less thinking, less work, and is a good solution when you 
+just want to hit a bunch of URLs asynchronously.
+
+The second option is ideal when you want to set curl options/headers on each 
+request and/or use different HTTP methods on each request.
+
+One thing to think about before using async is whether the data provider is 
+okay with it. It's possible that a data provider's service may be brought down
+if you do too many async requests. 
+
+
+```r
+library("crul")
+```
+
+## simple async
+
+Build a request object with one or more URLs
+
+
+
+
+```r
+(cc <- Async$new(
+  urls = c(
+    'https://httpbin.org/get?a=5',
+    'https://httpbin.org/get?a=5&b=6',
+    'https://httpbin.org/ip'
+  )
+))
+#> <crul async connection> 
+#>   urls: 
+#>    https://httpbin.org/get?a=5
+#>    https://httpbin.org/get?a=5&b=6
+#>    https://httpbin.org/ip
+```
+
+Make a request with any HTTP method
+
+
+```r
+(res <- cc$get())
+#> [[1]]
+#> <crul response> 
+#>   url: https://httpbin.org/get?a=5
+#>   request_headers: 
+#>     useragent: libcurl/7.51.0 r-curl/2.6 crul/0.3.8
+#>   response_headers: 
+#>     status: HTTP/1.1 200 OK
+#>     connection: keep-alive
+#>     server: meinheld/0.6.1
+#>     date: Tue, 13 Jun 2017 22:26:33 GMT
+#>     content-type: application/json
+#>     access-control-allow-origin: *
+#>     access-control-allow-credentials: true
+#>     x-powered-by: Flask
+#>     x-processed-time: 0.000774145126343
+#>     content-length: 302
+#>     via: 1.1 vegur
+#>   params: 
+#>     a: 5
+#>   status: 200
+#> 
+#> [[2]]
+#> <crul response> 
+#>   url: https://httpbin.org/get?a=5&b=6
+#>   request_headers: 
+#>     useragent: libcurl/7.51.0 r-curl/2.6 crul/0.3.8
+#>   response_headers: 
+#>     status: HTTP/1.1 200 OK
+#>     connection: keep-alive
+#>     server: meinheld/0.6.1
+#>     date: Tue, 13 Jun 2017 22:26:33 GMT
+#>     content-type: application/json
+#>     access-control-allow-origin: *
+#>     access-control-allow-credentials: true
+#>     x-powered-by: Flask
+#>     x-processed-time: 0.00131487846375
+#>     content-length: 321
+#>     via: 1.1 vegur
+#>   params: 
+#>     a: 5
+#>     b: 6
+#>   status: 200
+#> 
+#> [[3]]
+#> <crul response> 
+#>   url: https://httpbin.org/ip
+#>   request_headers: 
+#>     useragent: libcurl/7.51.0 r-curl/2.6 crul/0.3.8
+#>   response_headers: 
+#>     status: HTTP/1.1 200 OK
+#>     connection: keep-alive
+#>     server: meinheld/0.6.1
+#>     date: Tue, 13 Jun 2017 22:26:33 GMT
+#>     content-type: application/json
+#>     access-control-allow-origin: *
+#>     access-control-allow-credentials: true
+#>     x-powered-by: Flask
+#>     x-processed-time: 0.000844955444336
+#>     content-length: 33
+#>     via: 1.1 vegur
+#>   status: 200
+```
+
+You get back a list whose length matches the number of input URLs.
+
+Access object variables and methods just as with `HttpClient` results, here just one at a time.
+
+
+```r
+res[[1]]$url
+#> [1] "https://httpbin.org/get?a=5"
+res[[1]]$success()
+#> [1] TRUE
+res[[1]]$parse("UTF-8")
+#> [1] "{\n  \"args\": {\n    \"a\": \"5\"\n  }, \n  \"headers\": {\n    \"Accept\": \"*/*\", \n    \"Accept-Encoding\": \"gzip, deflate\", \n    \"Connection\": \"close\", \n    \"Host\": \"httpbin.org\", \n    \"User-Agent\": \"libcurl/7.51.0 r-curl/2.6 crul/0.3.8\"\n  }, \n  \"origin\": \"157.130.179.86\", \n  \"url\": \"https://httpbin.org/get?a=5\"\n}\n"
+```
+
+Or apply access/method calls across many results, e.g., parse all results
+
+
+```r
+lapply(res, function(z) z$parse("UTF-8"))
+#> [[1]]
+#> [1] "{\n  \"args\": {\n    \"a\": \"5\"\n  }, \n  \"headers\": {\n    \"Accept\": \"*/*\", \n    \"Accept-Encoding\": \"gzip, deflate\", \n    \"Connection\": \"close\", \n    \"Host\": \"httpbin.org\", \n    \"User-Agent\": \"libcurl/7.51.0 r-curl/2.6 crul/0.3.8\"\n  }, \n  \"origin\": \"157.130.179.86\", \n  \"url\": \"https://httpbin.org/get?a=5\"\n}\n"
+#> 
+#> [[2]]
+#> [1] "{\n  \"args\": {\n    \"a\": \"5\", \n    \"b\": \"6\"\n  }, \n  \"headers\": {\n    \"Accept\": \"*/*\", \n    \"Accept-Encoding\": \"gzip, deflate\", \n    \"Connection\": \"close\", \n    \"Host\": \"httpbin.org\", \n    \"User-Agent\": \"libcurl/7.51.0 r-curl/2.6 crul/0.3.8\"\n  }, \n  \"origin\": \"157.130.179.86\", \n  \"url\": \"https://httpbin.org/get?a=5&b=6\"\n}\n"
+#> 
+#> [[3]]
+#> [1] "{\n  \"origin\": \"157.130.179.86\"\n}\n"
+```
+
+## varied request async
+
+
+```r
+req1 <- HttpRequest$new(
+  url = "https://httpbin.org/get?a=5",
+  opts = list(
+    verbose = TRUE
+  )
+)
+req1$get()
+#> <crul http request> get
+#>   url: https://httpbin.org/get?a=5
+#>   curl options: 
+#>     verbose: TRUE
+#>   proxies: 
+#>   auth: 
+#>   headers:
+
+req2 <- HttpRequest$new(
+  url = "https://httpbin.org/post?a=5&b=6"
+)
+req2$post(body = list(a = 5))
+#> <crul http request> post
+#>   url: https://httpbin.org/post?a=5&b=6
+#>   curl options: 
+#>   proxies: 
+#>   auth: 
+#>   headers:
+
+(res <- AsyncVaried$new(req1, req2))
+#> <crul async varied connection> 
+#>   requests: 
+#>    get: https://httpbin.org/get?a=5 
+#>    post: https://httpbin.org/post?a=5&b=6
+```
+
+Make requests asynchronously
+
+
+```r
+res$request()
+```
+
+Parse all results
+
+
+```r
+res$parse()
+#> [1] "{\n  \"args\": {\n    \"a\": \"5\"\n  }, \n  \"headers\": {\n    \"Accept\": \"*/*\", \n    \"Accept-Encoding\": \"gzip, deflate\", \n    \"Connection\": \"close\", \n    \"Host\": \"httpbin.org\", \n    \"User-Agent\": \"libcurl/7.51.0 r-curl/2.6 crul/0.3.8\"\n  }, \n  \"origin\": \"157.130.179.86\", \n  \"url\": \"https://httpbin.org/get?a=5\"\n}\n"                                                                                                                                    [...]
+#> [2] "{\n  \"args\": {\n    \"a\": \"5\", \n    \"b\": \"6\"\n  }, \n  \"data\": \"\", \n  \"files\": {}, \n  \"form\": {\n    \"a\": \"5\"\n  }, \n  \"headers\": {\n    \"Accept\": \"*/*\", \n    \"Accept-Encoding\": \"gzip, deflate\", \n    \"Connection\": \"close\", \n    \"Content-Length\": \"137\", \n    \"Content-Type\": \"multipart/form-data; boundary=------------------------4cd6745fc264b623\", \n    \"Expect\": \"100-continue\", \n    \"Host\": \"httpbin.org\", \n    \"User-Age [...]
+```
+
+
+```r
+lapply(res$parse(), jsonlite::prettify)
+#> [[1]]
+#> {
+#>     "args": {
+#>         "a": "5"
+#>     },
+#>     "headers": {
+#>         "Accept": "*/*",
+#>         "Accept-Encoding": "gzip, deflate",
+#>         "Connection": "close",
+#>         "Host": "httpbin.org",
+#>         "User-Agent": "libcurl/7.51.0 r-curl/2.6 crul/0.3.8"
+#>     },
+#>     "origin": "157.130.179.86",
+#>     "url": "https://httpbin.org/get?a=5"
+#> }
+#>  
+#> 
+#> [[2]]
+#> {
+#>     "args": {
+#>         "a": "5",
+#>         "b": "6"
+#>     },
+#>     "data": "",
+#>     "files": {
+#> 
+#>     },
+#>     "form": {
+#>         "a": "5"
+#>     },
+#>     "headers": {
+#>         "Accept": "*/*",
+#>         "Accept-Encoding": "gzip, deflate",
+#>         "Connection": "close",
+#>         "Content-Length": "137",
+#>         "Content-Type": "multipart/form-data; boundary=------------------------4cd6745fc264b623",
+#>         "Expect": "100-continue",
+#>         "Host": "httpbin.org",
+#>         "User-Agent": "libcurl/7.51.0 r-curl/2.6 crul/0.3.8"
+#>     },
+#>     "json": null,
+#>     "origin": "157.130.179.86",
+#>     "url": "https://httpbin.org/post?a=5&b=6"
+#> }
+#> 
+```
+
+Status codes
+
+
+```r
+res$status_code()
+#> [1] 200 200
+```
diff --git a/inst/doc/async.html b/inst/doc/async.html
new file mode 100644
index 0000000..7f2c87f
--- /dev/null
+++ b/inst/doc/async.html
@@ -0,0 +1,356 @@
+<!DOCTYPE html>
+
+<html xmlns="http://www.w3.org/1999/xhtml">
+
+<head>
+
+<meta charset="utf-8" />
+<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
+<meta name="generator" content="pandoc" />
+
+
+
+
+<title></title>
+
+<script src="data:application/x-javascript;base64,LyohIGpRdWVyeSB2MS4xMS4zIHwgKGMpIDIwMDUsIDIwMTUgalF1ZXJ5IEZvdW5kYXRpb24sIEluYy4gfCBqcXVlcnkub3JnL2xpY2Vuc2UgKi8KIWZ1bmN0aW9uKGEsYil7Im9iamVjdCI9PXR5cGVvZiBtb2R1bGUmJiJvYmplY3QiPT10eXBlb2YgbW9kdWxlLmV4cG9ydHM/bW9kdWxlLmV4cG9ydHM9YS5kb2N1bWVudD9iKGEsITApOmZ1bmN0aW9uKGEpe2lmKCFhLmRvY3VtZW50KXRocm93IG5ldyBFcnJvcigialF1ZXJ5IHJlcXVpcmVzIGEgd2luZG93IHdpdGggYSBkb2N1bWVudCIpO3JldHVybiBiKGEpfTpiKGEpfSgidW5kZWZpbmVkIiE9dHlwZW9mIHdpbmRvdz93aW5kb3c6dG [...]
+<meta name="viewport" content="width=device-width, initial-scale=1" />
+<link href="data:text/css;charset=utf-8,html%7Bfont%2Dfamily%3Asans%2Dserif%3B%2Dwebkit%2Dtext%2Dsize%2Dadjust%3A100%25%3B%2Dms%2Dtext%2Dsize%2Dadjust%3A100%25%7Dbody%7Bmargin%3A0%7Darticle%2Caside%2Cdetails%2Cfigcaption%2Cfigure%2Cfooter%2Cheader%2Chgroup%2Cmain%2Cmenu%2Cnav%2Csection%2Csummary%7Bdisplay%3Ablock%7Daudio%2Ccanvas%2Cprogress%2Cvideo%7Bdisplay%3Ainline%2Dblock%3Bvertical%2Dalign%3Abaseline%7Daudio%3Anot%28%5Bcontrols%5D%29%7Bdisplay%3Anone%3Bheight%3A0%7D%5Bhidden%5D%2Ctem [...]
+<script src="data:application/x-javascript;base64,LyohCiAqIEJvb3RzdHJhcCB2My4zLjUgKGh0dHA6Ly9nZXRib290c3RyYXAuY29tKQogKiBDb3B5cmlnaHQgMjAxMS0yMDE1IFR3aXR0ZXIsIEluYy4KICogTGljZW5zZWQgdW5kZXIgdGhlIE1JVCBsaWNlbnNlCiAqLwppZigidW5kZWZpbmVkIj09dHlwZW9mIGpRdWVyeSl0aHJvdyBuZXcgRXJyb3IoIkJvb3RzdHJhcCdzIEphdmFTY3JpcHQgcmVxdWlyZXMgalF1ZXJ5Iik7K2Z1bmN0aW9uKGEpeyJ1c2Ugc3RyaWN0Ijt2YXIgYj1hLmZuLmpxdWVyeS5zcGxpdCgiICIpWzBdLnNwbGl0KCIuIik7aWYoYlswXTwyJiZiWzFdPDl8fDE9PWJbMF0mJjk9PWJbMV0mJmJbMl08MSl0aHJvdy [...]
+<script src="data:application/x-javascript;base64,LyoqCiogQHByZXNlcnZlIEhUTUw1IFNoaXYgMy43LjIgfCBAYWZhcmthcyBAamRhbHRvbiBAam9uX25lYWwgQHJlbSB8IE1JVC9HUEwyIExpY2Vuc2VkCiovCi8vIE9ubHkgcnVuIHRoaXMgY29kZSBpbiBJRSA4CmlmICghIXdpbmRvdy5uYXZpZ2F0b3IudXNlckFnZW50Lm1hdGNoKCJNU0lFIDgiKSkgewohZnVuY3Rpb24oYSxiKXtmdW5jdGlvbiBjKGEsYil7dmFyIGM9YS5jcmVhdGVFbGVtZW50KCJwIiksZD1hLmdldEVsZW1lbnRzQnlUYWdOYW1lKCJoZWFkIilbMF18fGEuZG9jdW1lbnRFbGVtZW50O3JldHVybiBjLmlubmVySFRNTD0ieDxzdHlsZT4iK2IrIjwvc3R5bGU+IixkLm [...]
+<script src="data:application/x-javascript;base64,LyohIFJlc3BvbmQuanMgdjEuNC4yOiBtaW4vbWF4LXdpZHRoIG1lZGlhIHF1ZXJ5IHBvbHlmaWxsICogQ29weXJpZ2h0IDIwMTMgU2NvdHQgSmVobAogKiBMaWNlbnNlZCB1bmRlciBodHRwczovL2dpdGh1Yi5jb20vc2NvdHRqZWhsL1Jlc3BvbmQvYmxvYi9tYXN0ZXIvTElDRU5TRS1NSVQKICogICovCgovLyBPbmx5IHJ1biB0aGlzIGNvZGUgaW4gSUUgOAppZiAoISF3aW5kb3cubmF2aWdhdG9yLnVzZXJBZ2VudC5tYXRjaCgiTVNJRSA4IikpIHsKIWZ1bmN0aW9uKGEpeyJ1c2Ugc3RyaWN0IjthLm1hdGNoTWVkaWE9YS5tYXRjaE1lZGlhfHxmdW5jdGlvbihhKXt2YXIgYixjPWEuZG [...]
+<script src="data:application/x-javascript;base64,CgovKioKICogalF1ZXJ5IFBsdWdpbjogU3RpY2t5IFRhYnMKICoKICogQGF1dGhvciBBaWRhbiBMaXN0ZXIgPGFpZGFuQHBocC5uZXQ+CiAqIGFkYXB0ZWQgYnkgUnViZW4gQXJzbGFuIHRvIGFjdGl2YXRlIHBhcmVudCB0YWJzIHRvbwogKiBodHRwOi8vd3d3LmFpZGFubGlzdGVyLmNvbS8yMDE0LzAzL3BlcnNpc3RpbmctdGhlLXRhYi1zdGF0ZS1pbi1ib290c3RyYXAvCiAqLwooZnVuY3Rpb24oJCkgewogICJ1c2Ugc3RyaWN0IjsKICAkLmZuLnJtYXJrZG93blN0aWNreVRhYnMgPSBmdW5jdGlvbigpIHsKICAgIHZhciBjb250ZXh0ID0gdGhpczsKICAgIC8vIFNob3cgdGhlIHRhYi [...]
+<link href="data:text/css;charset=utf-8,pre%20%2Eoperator%2C%0Apre%20%2Eparen%20%7B%0Acolor%3A%20rgb%28104%2C%20118%2C%20135%29%0A%7D%0Apre%20%2Eliteral%20%7B%0Acolor%3A%20%23990073%0A%7D%0Apre%20%2Enumber%20%7B%0Acolor%3A%20%23099%3B%0A%7D%0Apre%20%2Ecomment%20%7B%0Acolor%3A%20%23998%3B%0Afont%2Dstyle%3A%20italic%0A%7D%0Apre%20%2Ekeyword%20%7B%0Acolor%3A%20%23900%3B%0Afont%2Dweight%3A%20bold%0A%7D%0Apre%20%2Eidentifier%20%7B%0Acolor%3A%20rgb%280%2C%200%2C%200%29%3B%0A%7D%0Apre%20%2Estri [...]
+<script src="data:application/x-javascript;base64,dmFyIGhsanM9bmV3IGZ1bmN0aW9uKCl7ZnVuY3Rpb24gbShwKXtyZXR1cm4gcC5yZXBsYWNlKC8mL2dtLCImYW1wOyIpLnJlcGxhY2UoLzwvZ20sIiZsdDsiKX1mdW5jdGlvbiBmKHIscSxwKXtyZXR1cm4gUmVnRXhwKHEsIm0iKyhyLmNJPyJpIjoiIikrKHA/ImciOiIiKSl9ZnVuY3Rpb24gYihyKXtmb3IodmFyIHA9MDtwPHIuY2hpbGROb2Rlcy5sZW5ndGg7cCsrKXt2YXIgcT1yLmNoaWxkTm9kZXNbcF07aWYocS5ub2RlTmFtZT09IkNPREUiKXtyZXR1cm4gcX1pZighKHEubm9kZVR5cGU9PTMmJnEubm9kZVZhbHVlLm1hdGNoKC9ccysvKSkpe2JyZWFrfX19ZnVuY3Rpb24gaCh0LH [...]
+
+<style type="text/css">code{white-space: pre;}</style>
+<style type="text/css">
+  pre:not([class]) {
+    background-color: white;
+  }
+</style>
+<script type="text/javascript">
+if (window.hljs && document.readyState && document.readyState === "complete") {
+   window.setTimeout(function() {
+      hljs.initHighlighting();
+   }, 0);
+}
+</script>
+
+
+
+<style type="text/css">
+h1 {
+  font-size: 34px;
+}
+h1.title {
+  font-size: 38px;
+}
+h2 {
+  font-size: 30px;
+}
+h3 {
+  font-size: 24px;
+}
+h4 {
+  font-size: 18px;
+}
+h5 {
+  font-size: 16px;
+}
+h6 {
+  font-size: 12px;
+}
+.table th:not([align]) {
+  text-align: left;
+}
+</style>
+
+
+</head>
+
+<body>
+
+<style type="text/css">
+.main-container {
+  max-width: 940px;
+  margin-left: auto;
+  margin-right: auto;
+}
+code {
+  color: inherit;
+  background-color: rgba(0, 0, 0, 0.04);
+}
+img {
+  max-width:100%;
+  height: auto;
+}
+.tabbed-pane {
+  padding-top: 12px;
+}
+button.code-folding-btn:focus {
+  outline: none;
+}
+</style>
+
+
+
+<div class="container-fluid main-container">
+
+<!-- tabsets -->
+<script>
+$(document).ready(function () {
+  window.buildTabsets("TOC");
+});
+</script>
+
+<!-- code folding -->
+
+
+
+
+
+
+<div class="fluid-row" id="header">
+
+
+
+
+</div>
+
+
+<!--
+%\VignetteIndexEntry{async with crul}
+%\VignetteEngine{knitr::rmarkdown}
+%\VignetteEncoding{UTF-8}
+-->
+<div id="async-with-crul" class="section level1">
+<h1>async with crul</h1>
+<p>Asynchronous requests with <code>crul</code>.</p>
+<p>There are two interfaces to asynchronous requests in <code>crul</code>:</p>
+<ol style="list-style-type: decimal">
+<li>Simple async: any number of URLs, all treated with the same curl options, headers, etc., and only one HTTP method type at a time.</li>
+<li>Varied request async: build any type of request and execute all asynchronously.</li>
+</ol>
+<p>The first option takes less thinking, less work, and is a good solution when you just want to hit a bunch of URLs asynchronously.</p>
+<p>The second option is ideal when you want to set curl options/headers on each request and/or use different HTTP methods on each request.</p>
+<p>One thing to think about before using async is whether the data provider is okay with it. It’s possible that a data provider’s service may be brought down if you do too many async requests.</p>
+<pre class="r"><code>library("crul")</code></pre>
+<div id="simple-async" class="section level2">
+<h2>simple async</h2>
+<p>Build a request object with one or more URLs</p>
+<pre class="r"><code>(cc <- Async$new(
+  urls = c(
+    'https://httpbin.org/get?a=5',
+    'https://httpbin.org/get?a=5&b=6',
+    'https://httpbin.org/ip'
+  )
+))
+#> <crul async connection> 
+#>   urls: 
+#>    https://httpbin.org/get?a=5
+#>    https://httpbin.org/get?a=5&b=6
+#>    https://httpbin.org/ip</code></pre>
+<p>Make a request with any HTTP method</p>
+<pre class="r"><code>(res <- cc$get())
+#> [[1]]
+#> <crul response> 
+#>   url: https://httpbin.org/get?a=5
+#>   request_headers: 
+#>     useragent: libcurl/7.51.0 r-curl/2.6 crul/0.3.8
+#>   response_headers: 
+#>     status: HTTP/1.1 200 OK
+#>     connection: keep-alive
+#>     server: meinheld/0.6.1
+#>     date: Tue, 13 Jun 2017 22:26:33 GMT
+#>     content-type: application/json
+#>     access-control-allow-origin: *
+#>     access-control-allow-credentials: true
+#>     x-powered-by: Flask
+#>     x-processed-time: 0.000774145126343
+#>     content-length: 302
+#>     via: 1.1 vegur
+#>   params: 
+#>     a: 5
+#>   status: 200
+#> 
+#> [[2]]
+#> <crul response> 
+#>   url: https://httpbin.org/get?a=5&b=6
+#>   request_headers: 
+#>     useragent: libcurl/7.51.0 r-curl/2.6 crul/0.3.8
+#>   response_headers: 
+#>     status: HTTP/1.1 200 OK
+#>     connection: keep-alive
+#>     server: meinheld/0.6.1
+#>     date: Tue, 13 Jun 2017 22:26:33 GMT
+#>     content-type: application/json
+#>     access-control-allow-origin: *
+#>     access-control-allow-credentials: true
+#>     x-powered-by: Flask
+#>     x-processed-time: 0.00131487846375
+#>     content-length: 321
+#>     via: 1.1 vegur
+#>   params: 
+#>     a: 5
+#>     b: 6
+#>   status: 200
+#> 
+#> [[3]]
+#> <crul response> 
+#>   url: https://httpbin.org/ip
+#>   request_headers: 
+#>     useragent: libcurl/7.51.0 r-curl/2.6 crul/0.3.8
+#>   response_headers: 
+#>     status: HTTP/1.1 200 OK
+#>     connection: keep-alive
+#>     server: meinheld/0.6.1
+#>     date: Tue, 13 Jun 2017 22:26:33 GMT
+#>     content-type: application/json
+#>     access-control-allow-origin: *
+#>     access-control-allow-credentials: true
+#>     x-powered-by: Flask
+#>     x-processed-time: 0.000844955444336
+#>     content-length: 33
+#>     via: 1.1 vegur
+#>   status: 200</code></pre>
+<p>You get back a list whose length matches the number of input URLs</p>
+<p>Access fields and methods on each result just as with <code>HttpClient</code> results; here, one at a time.</p>
+<pre class="r"><code>res[[1]]$url
+#> [1] "https://httpbin.org/get?a=5"
+res[[1]]$success()
+#> [1] TRUE
+res[[1]]$parse("UTF-8")
+#> [1] "{\n  \"args\": {\n    \"a\": \"5\"\n  }, \n  \"headers\": {\n    \"Accept\": \"*/*\", \n    \"Accept-Encoding\": \"gzip, deflate\", \n    \"Connection\": \"close\", \n    \"Host\": \"httpbin.org\", \n    \"User-Agent\": \"libcurl/7.51.0 r-curl/2.6 crul/0.3.8\"\n  }, \n  \"origin\": \"157.130.179.86\", \n  \"ur [...]
+<p>Or apply field/method calls across many results at once, e.g., parse all results</p>
+<pre class="r"><code>lapply(res, function(z) z$parse("UTF-8"))
+#> [[1]]
+#> [1] "{\n  \"args\": {\n    \"a\": \"5\"\n  }, \n  \"headers\": {\n    \"Accept\": \"*/*\", \n    \"Accept-Encoding\": \"gzip, deflate\", \n    \"Connection\": \"close\", \n    \"Host\": \"httpbin.org\", \n    \"User-Agent\": \"libcurl/7.51.0 r-curl/2.6 crul/0.3.8\"\n  }, \n  \"origin\": \"157.130.179.86\", \n  \"ur [...]
+#> 
+#> [[2]]
+#> [1] "{\n  \"args\": {\n    \"a\": \"5\", \n    \"b\": \"6\"\n  }, \n  \"headers\": {\n    \"Accept\": \"*/*\", \n    \"Accept-Encoding\": \"gzip, deflate\", \n    \"Connection\": \"close\", \n    \"Host\": \"httpbin.org\", \n    \"User-Agent\": \"libcurl/7.51.0 r-curl/2.6 crul/0.3.8\"\n  }, \n  \"origin\": \&q [...]
+#> 
+#> [[3]]
+#> [1] "{\n  \"origin\": \"157.130.179.86\"\n}\n"</code></pre>
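+<p>As a sketch beyond the vignette output, the parsed JSON strings can be turned into R lists with <code>jsonlite::fromJSON</code> (assuming <code>jsonlite</code> is installed):</p>
+<pre class="r"><code># parse every response body into an R list
+out <- lapply(res, function(z) jsonlite::fromJSON(z$parse("UTF-8")))
+# then pull out individual pieces, e.g. the 'a' query arg of the first result
+out[[1]]$args$a</code></pre>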
+</div>
+<div id="varied-request-async" class="section level2">
+<h2>varied request async</h2>
+<pre class="r"><code>req1 <- HttpRequest$new(
+  url = "https://httpbin.org/get?a=5",
+  opts = list(
+    verbose = TRUE
+  )
+)
+req1$get()
+#> <crul http request> get
+#>   url: https://httpbin.org/get?a=5
+#>   curl options: 
+#>     verbose: TRUE
+#>   proxies: 
+#>   auth: 
+#>   headers:
+
+req2 <- HttpRequest$new(
+  url = "https://httpbin.org/post?a=5&b=6"
+)
+req2$post(body = list(a = 5))
+#> <crul http request> post
+#>   url: https://httpbin.org/post?a=5&b=6
+#>   curl options: 
+#>   proxies: 
+#>   auth: 
+#>   headers:
+
+(res <- AsyncVaried$new(req1, req2))
+#> <crul async varied connection> 
+#>   requests: 
+#>    get: https://httpbin.org/get?a=5 
+#>    post: https://httpbin.org/post?a=5&b=6</code></pre>
+<p>Make requests asynchronously</p>
+<pre class="r"><code>res$request()</code></pre>
+<p>Parse all results</p>
+<pre class="r"><code>res$parse()
+#> [1] "{\n  \"args\": {\n    \"a\": \"5\"\n  }, \n  \"headers\": {\n    \"Accept\": \"*/*\", \n    \"Accept-Encoding\": \"gzip, deflate\", \n    \"Connection\": \"close\", \n    \"Host\": \"httpbin.org\", \n    \"User-Agent\": \"libcurl/7.51.0 r-curl/2.6 crul/0.3.8\"\n  }, \n  \"origin\": \"157.130.179.86\", \n  \"ur [...]
+#> [2] "{\n  \"args\": {\n    \"a\": \"5\", \n    \"b\": \"6\"\n  }, \n  \"data\": \"\", \n  \"files\": {}, \n  \"form\": {\n    \"a\": \"5\"\n  }, \n  \"headers\": {\n    \"Accept\": \"*/*\", \n    \"Accept-Encoding\": \"gzip, deflate\", \n    \"Connection\": \"close\", \n    \"Content-Length\&quo [...]
+<pre class="r"><code>lapply(res$parse(), jsonlite::prettify)
+#> [[1]]
+#> {
+#>     "args": {
+#>         "a": "5"
+#>     },
+#>     "headers": {
+#>         "Accept": "*/*",
+#>         "Accept-Encoding": "gzip, deflate",
+#>         "Connection": "close",
+#>         "Host": "httpbin.org",
+#>         "User-Agent": "libcurl/7.51.0 r-curl/2.6 crul/0.3.8"
+#>     },
+#>     "origin": "157.130.179.86",
+#>     "url": "https://httpbin.org/get?a=5"
+#> }
+#>  
+#> 
+#> [[2]]
+#> {
+#>     "args": {
+#>         "a": "5",
+#>         "b": "6"
+#>     },
+#>     "data": "",
+#>     "files": {
+#> 
+#>     },
+#>     "form": {
+#>         "a": "5"
+#>     },
+#>     "headers": {
+#>         "Accept": "*/*",
+#>         "Accept-Encoding": "gzip, deflate",
+#>         "Connection": "close",
+#>         "Content-Length": "137",
+#>         "Content-Type": "multipart/form-data; boundary=------------------------4cd6745fc264b623",
+#>         "Expect": "100-continue",
+#>         "Host": "httpbin.org",
+#>         "User-Agent": "libcurl/7.51.0 r-curl/2.6 crul/0.3.8"
+#>     },
+#>     "json": null,
+#>     "origin": "157.130.179.86",
+#>     "url": "https://httpbin.org/post?a=5&b=6"
+#> }
+#> </code></pre>
+<p>Status codes</p>
+<pre class="r"><code>res$status_code()
+#> [1] 200 200</code></pre>
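+<p>A hedged sketch: <code>AsyncVaried</code> also exposes the individual response objects via <code>$responses()</code> (assuming that method is available in this version), so each result can be inspected on its own:</p>
+<pre class="r"><code>resps <- res$responses()
+# check whether each request succeeded (2xx status)
+vapply(resps, function(z) z$success(), logical(1))</code></pre>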
+</div>
+</div>
+
+
+
+
+</div>
+
+<script>
+
+// add bootstrap table styles to pandoc tables
+function bootstrapStylePandocTables() {
+  $('tr.header').parent('thead').parent('table').addClass('table table-condensed');
+}
+$(document).ready(function () {
+  bootstrapStylePandocTables();
+});
+
+
+</script>
+
+<!-- dynamically load mathjax for compatibility with self-contained -->
+<script>
+  (function () {
+    var script = document.createElement("script");
+    script.type = "text/javascript";
+    script.src  = "https://mathjax.rstudio.com/latest/MathJax.js?config=TeX-AMS-MML_HTMLorMML";
+    document.getElementsByTagName("head")[0].appendChild(script);
+  })();
+</script>
+
+</body>
+</html>
diff --git a/inst/doc/crul_vignette.Rmd b/inst/doc/crul_vignette.Rmd
index 0c31d74..9e86d89 100644
--- a/inst/doc/crul_vignette.Rmd
+++ b/inst/doc/crul_vignette.Rmd
@@ -1,6 +1,6 @@
 <!--
-%\VignetteEngine{knitr::knitr}
 %\VignetteIndexEntry{crul introduction}
+%\VignetteEngine{knitr::rmarkdown}
 %\VignetteEncoding{UTF-8}
 -->
 
@@ -49,8 +49,10 @@ library("crul")
 ))
 #> <crul connection> 
 #>   url: https://httpbin.org
-#>   options: 
+#>   curl options: 
 #>     timeout: 1
+#>   proxies: 
+#>   auth: 
 #>   headers: 
 #>     a: hello world
 ```
@@ -109,14 +111,15 @@ res$content
 #>  [47] 6f 20 77 6f 72 6c 64 22 2c 20 0a 20 20 20 20 22 41 63 63 65 70 74 22
 #>  [70] 3a 20 22 2a 2f 2a 22 2c 20 0a 20 20 20 20 22 41 63 63 65 70 74 2d 45
 #>  [93] 6e 63 6f 64 69 6e 67 22 3a 20 22 67 7a 69 70 2c 20 64 65 66 6c 61 74
-#> [116] 65 22 2c 20 0a 20 20 20 20 22 48 6f 73 74 22 3a 20 22 68 74 74 70 62
-#> [139] 69 6e 2e 6f 72 67 22 2c 20 0a 20 20 20 20 22 55 73 65 72 2d 41 67 65
-#> [162] 6e 74 22 3a 20 22 6c 69 62 63 75 72 6c 2f 37 2e 35 31 2e 30 20 72 2d
-#> [185] 63 75 72 6c 2f 32 2e 33 20 63 72 75 6c 2f 30 2e 32 2e 30 22 0a 20 20
-#> [208] 7d 2c 20 0a 20 20 22 6f 72 69 67 69 6e 22 3a 20 22 37 31 2e 36 33 2e
-#> [231] 32 32 33 2e 31 31 33 22 2c 20 0a 20 20 22 75 72 6c 22 3a 20 22 68 74
-#> [254] 74 70 73 3a 2f 2f 68 74 74 70 62 69 6e 2e 6f 72 67 2f 67 65 74 22 0a
-#> [277] 7d 0a
+#> [116] 65 22 2c 20 0a 20 20 20 20 22 43 6f 6e 6e 65 63 74 69 6f 6e 22 3a 20
+#> [139] 22 63 6c 6f 73 65 22 2c 20 0a 20 20 20 20 22 48 6f 73 74 22 3a 20 22
+#> [162] 68 74 74 70 62 69 6e 2e 6f 72 67 22 2c 20 0a 20 20 20 20 22 55 73 65
+#> [185] 72 2d 41 67 65 6e 74 22 3a 20 22 6c 69 62 63 75 72 6c 2f 37 2e 35 31
+#> [208] 2e 30 20 72 2d 63 75 72 6c 2f 32 2e 36 20 63 72 75 6c 2f 30 2e 33 2e
+#> [231] 38 22 0a 20 20 7d 2c 20 0a 20 20 22 6f 72 69 67 69 6e 22 3a 20 22 31
+#> [254] 35 37 2e 31 33 30 2e 31 37 39 2e 38 36 22 2c 20 0a 20 20 22 75 72 6c
+#> [277] 22 3a 20 22 68 74 74 70 73 3a 2f 2f 68 74 74 70 62 69 6e 2e 6f 72 67
+#> [300] 2f 67 65 74 22 0a 7d 0a
 ```
 
 HTTP method
@@ -132,8 +135,11 @@ Request headers
 
 ```r
 res$request_headers
-#> $useragent
-#> [1] "libcurl/7.51.0 r-curl/2.3 crul/0.2.0"
+#> $`User-Agent`
+#> [1] "libcurl/7.51.0 r-curl/2.6 crul/0.3.8"
+#> 
+#> $`Accept-Encoding`
+#> [1] "gzip, deflate"
 #> 
 #> $a
 #> [1] "hello world"
@@ -147,26 +153,35 @@ res$response_headers
 #> $status
 #> [1] "HTTP/1.1 200 OK"
 #> 
+#> $connection
+#> [1] "keep-alive"
+#> 
 #> $server
-#> [1] "nginx"
+#> [1] "meinheld/0.6.1"
 #> 
 #> $date
-#> [1] "Tue, 03 Jan 2017 05:52:03 GMT"
+#> [1] "Tue, 13 Jun 2017 22:26:40 GMT"
 #> 
 #> $`content-type`
 #> [1] "application/json"
 #> 
-#> $`content-length`
-#> [1] "278"
-#> 
-#> $connection
-#> [1] "keep-alive"
-#> 
 #> $`access-control-allow-origin`
 #> [1] "*"
 #> 
 #> $`access-control-allow-credentials`
 #> [1] "true"
+#> 
+#> $`x-powered-by`
+#> [1] "Flask"
+#> 
+#> $`x-processed-time`
+#> [1] "0.000635147094727"
+#> 
+#> $`content-length`
+#> [1] "307"
+#> 
+#> $via
+#> [1] "1.1 vegur"
 ```
 
 And you can parse the content with a provided function:
@@ -174,7 +189,7 @@ And you can parse the content with a provided function:
 
 ```r
 res$parse()
-#> [1] "{\n  \"args\": {}, \n  \"headers\": {\n    \"A\": \"hello world\", \n    \"Accept\": \"*/*\", \n    \"Accept-Encoding\": \"gzip, deflate\", \n    \"Host\": \"httpbin.org\", \n    \"User-Agent\": \"libcurl/7.51.0 r-curl/2.3 crul/0.2.0\"\n  }, \n  \"origin\": \"71.63.223.113\", \n  \"url\": \"https://httpbin.org/get\"\n}\n"
+#> [1] "{\n  \"args\": {}, \n  \"headers\": {\n    \"A\": \"hello world\", \n    \"Accept\": \"*/*\", \n    \"Accept-Encoding\": \"gzip, deflate\", \n    \"Connection\": \"close\", \n    \"Host\": \"httpbin.org\", \n    \"User-Agent\": \"libcurl/7.51.0 r-curl/2.6 crul/0.3.8\"\n  }, \n  \"origin\": \"157.130.179.86\", \n  \"url\": \"https://httpbin.org/get\"\n}\n"
 jsonlite::fromJSON(res$parse())
 #> $args
 #> named list()
@@ -189,15 +204,18 @@ jsonlite::fromJSON(res$parse())
 #> $headers$`Accept-Encoding`
 #> [1] "gzip, deflate"
 #> 
+#> $headers$Connection
+#> [1] "close"
+#> 
 #> $headers$Host
 #> [1] "httpbin.org"
 #> 
 #> $headers$`User-Agent`
-#> [1] "libcurl/7.51.0 r-curl/2.3 crul/0.2.0"
+#> [1] "libcurl/7.51.0 r-curl/2.6 crul/0.3.8"
 #> 
 #> 
 #> $origin
-#> [1] "71.63.223.113"
+#> [1] "157.130.179.86"
 #> 
 #> $url
 #> [1] "https://httpbin.org/get"
@@ -213,21 +231,6 @@ x$post(
   path = "post", 
   body = list(hello = "world")
 )
-#> <crul response> 
-#>   url: https://httpbin.org/post
-#>   request_headers: 
-#>     useragent: libcurl/7.51.0 r-curl/2.3 crul/0.2.0
-#>     a: hello world
-#>   response_headers: 
-#>     status: HTTP/1.1 200 OK
-#>     server: nginx
-#>     date: Tue, 03 Jan 2017 05:52:03 GMT
-#>     content-type: application/json
-#>     content-length: 491
-#>     connection: keep-alive
-#>     access-control-allow-origin: *
-#>     access-control-allow-credentials: true
-#>   status: 200
 ```
 
 
@@ -240,7 +243,7 @@ f <- tempfile()
 res <- x$get(disk = f)
 # when using write to disk, content is a path
 res$content 
-#> [1] "/var/folders/gs/4khph0xs0436gmd2gdnwsg080000gn/T//RtmpoZ8Rrd/fileee7a81dea18"
+#> [1] "/var/folders/gs/4khph0xs0436gmd2gdnwsg080000gn/T//RtmpF4Mh1B/file6fcc2a328add"
 ```
 
 Read lines
@@ -267,14 +270,16 @@ readLines(res$content, n = 10)
 (x <- HttpClient$new(url = "https://httpbin.org"))
 #> <crul connection> 
 #>   url: https://httpbin.org
-#>   options: 
+#>   curl options: 
+#>   proxies: 
+#>   auth: 
 #>   headers:
 res <- x$get('stream/5', stream = function(x) cat(rawToChar(x)))
-#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.3 crul/0.2.0"}, "args": {}, "id": 0, "origin": "71.63.223.113"}
-#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.3 crul/0.2.0"}, "args": {}, "id": 1, "origin": "71.63.223.113"}
-#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.3 crul/0.2.0"}, "args": {}, "id": 2, "origin": "71.63.223.113"}
-#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.3 crul/0.2.0"}, "args": {}, "id": 3, "origin": "71.63.223.113"}
-#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.3 crul/0.2.0"}, "args": {}, "id": 4, "origin": "71.63.223.113"}
+#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Connection": "close", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.6 crul/0.3.8"}, "args": {}, "origin": "157.130.179.86", "id": 0}
+#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Connection": "close", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.6 crul/0.3.8"}, "args": {}, "origin": "157.130.179.86", "id": 1}
+#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Connection": "close", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.6 crul/0.3.8"}, "args": {}, "origin": "157.130.179.86", "id": 2}
+#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Connection": "close", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.6 crul/0.3.8"}, "args": {}, "origin": "157.130.179.86", "id": 3}
+#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Connection": "close", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.6 crul/0.3.8"}, "args": {}, "origin": "157.130.179.86", "id": 4}
 # when streaming, content is NULL
 res$content 
 #> NULL
diff --git a/inst/doc/crul_vignette.html b/inst/doc/crul_vignette.html
index a26d3a7..2107449 100644
--- a/inst/doc/crul_vignette.html
+++ b/inst/doc/crul_vignette.html
@@ -1,236 +1,144 @@
 <!DOCTYPE html>
-<html>
+
+<html xmlns="http://www.w3.org/1999/xhtml">
+
 <head>
-<meta http-equiv="Content-Type" content="text/html; charset=utf-8"/>
 
-<title>crul introduction</title>
+<meta charset="utf-8" />
+<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
+<meta name="generator" content="pandoc" />
 
-<script type="text/javascript">
-window.onload = function() {
-  var imgs = document.getElementsByTagName('img'), i, img;
-  for (i = 0; i < imgs.length; i++) {
-    img = imgs[i];
-    // center an image if it is the only element of its parent
-    if (img.parentElement.childElementCount === 1)
-      img.parentElement.style.textAlign = 'center';
-  }
-};
-</script>
 
-<!-- Styles for R syntax highlighter -->
-<style type="text/css">
-   pre .operator,
-   pre .paren {
-     color: rgb(104, 118, 135)
-   }
-
-   pre .literal {
-     color: #990073
-   }
-
-   pre .number {
-     color: #099;
-   }
-
-   pre .comment {
-     color: #998;
-     font-style: italic
-   }
-
-   pre .keyword {
-     color: #900;
-     font-weight: bold
-   }
-
-   pre .identifier {
-     color: rgb(0, 0, 0);
-   }
-
-   pre .string {
-     color: #d14;
-   }
-</style>
 
-<!-- R syntax highlighter -->
-<script type="text/javascript">
-var hljs=new function(){function m(p){return p.replace(/&/gm,"&").replace(/</gm,"<")}function f(r,q,p){return RegExp(q,"m"+(r.cI?"i":"")+(p?"g":""))}function b(r){for(var p=0;p<r.childNodes.length;p++){var q=r.childNodes[p];if(q.nodeName=="CODE"){return q}if(!(q.nodeType==3&&q.nodeValue.match(/\s+/))){break}}}function h(t,s){var p="";for(var r=0;r<t.childNodes.length;r++){if(t.childNodes[r].nodeType==3){var q=t.childNodes[r].nodeValue;if(s){q=q.replace(/\n/g,"")}p+=q}else{if(t.chi [...]
-hljs.initHighlightingOnLoad();
-</script>
 
+<title></title>
 
+<script src="data:application/x-javascript;base64,LyohIGpRdWVyeSB2MS4xMS4zIHwgKGMpIDIwMDUsIDIwMTUgalF1ZXJ5IEZvdW5kYXRpb24sIEluYy4gfCBqcXVlcnkub3JnL2xpY2Vuc2UgKi8KIWZ1bmN0aW9uKGEsYil7Im9iamVjdCI9PXR5cGVvZiBtb2R1bGUmJiJvYmplY3QiPT10eXBlb2YgbW9kdWxlLmV4cG9ydHM/bW9kdWxlLmV4cG9ydHM9YS5kb2N1bWVudD9iKGEsITApOmZ1bmN0aW9uKGEpe2lmKCFhLmRvY3VtZW50KXRocm93IG5ldyBFcnJvcigialF1ZXJ5IHJlcXVpcmVzIGEgd2luZG93IHdpdGggYSBkb2N1bWVudCIpO3JldHVybiBiKGEpfTpiKGEpfSgidW5kZWZpbmVkIiE9dHlwZW9mIHdpbmRvdz93aW5kb3c6dG [...]
+<meta name="viewport" content="width=device-width, initial-scale=1" />
+<link href="data:text/css;charset=utf-8,html%7Bfont%2Dfamily%3Asans%2Dserif%3B%2Dwebkit%2Dtext%2Dsize%2Dadjust%3A100%25%3B%2Dms%2Dtext%2Dsize%2Dadjust%3A100%25%7Dbody%7Bmargin%3A0%7Darticle%2Caside%2Cdetails%2Cfigcaption%2Cfigure%2Cfooter%2Cheader%2Chgroup%2Cmain%2Cmenu%2Cnav%2Csection%2Csummary%7Bdisplay%3Ablock%7Daudio%2Ccanvas%2Cprogress%2Cvideo%7Bdisplay%3Ainline%2Dblock%3Bvertical%2Dalign%3Abaseline%7Daudio%3Anot%28%5Bcontrols%5D%29%7Bdisplay%3Anone%3Bheight%3A0%7D%5Bhidden%5D%2Ctem [...]
+<script src="data:application/x-javascript;base64,LyohCiAqIEJvb3RzdHJhcCB2My4zLjUgKGh0dHA6Ly9nZXRib290c3RyYXAuY29tKQogKiBDb3B5cmlnaHQgMjAxMS0yMDE1IFR3aXR0ZXIsIEluYy4KICogTGljZW5zZWQgdW5kZXIgdGhlIE1JVCBsaWNlbnNlCiAqLwppZigidW5kZWZpbmVkIj09dHlwZW9mIGpRdWVyeSl0aHJvdyBuZXcgRXJyb3IoIkJvb3RzdHJhcCdzIEphdmFTY3JpcHQgcmVxdWlyZXMgalF1ZXJ5Iik7K2Z1bmN0aW9uKGEpeyJ1c2Ugc3RyaWN0Ijt2YXIgYj1hLmZuLmpxdWVyeS5zcGxpdCgiICIpWzBdLnNwbGl0KCIuIik7aWYoYlswXTwyJiZiWzFdPDl8fDE9PWJbMF0mJjk9PWJbMV0mJmJbMl08MSl0aHJvdy [...]
+<script src="data:application/x-javascript;base64,LyoqCiogQHByZXNlcnZlIEhUTUw1IFNoaXYgMy43LjIgfCBAYWZhcmthcyBAamRhbHRvbiBAam9uX25lYWwgQHJlbSB8IE1JVC9HUEwyIExpY2Vuc2VkCiovCi8vIE9ubHkgcnVuIHRoaXMgY29kZSBpbiBJRSA4CmlmICghIXdpbmRvdy5uYXZpZ2F0b3IudXNlckFnZW50Lm1hdGNoKCJNU0lFIDgiKSkgewohZnVuY3Rpb24oYSxiKXtmdW5jdGlvbiBjKGEsYil7dmFyIGM9YS5jcmVhdGVFbGVtZW50KCJwIiksZD1hLmdldEVsZW1lbnRzQnlUYWdOYW1lKCJoZWFkIilbMF18fGEuZG9jdW1lbnRFbGVtZW50O3JldHVybiBjLmlubmVySFRNTD0ieDxzdHlsZT4iK2IrIjwvc3R5bGU+IixkLm [...]
+<script src="data:application/x-javascript;base64,LyohIFJlc3BvbmQuanMgdjEuNC4yOiBtaW4vbWF4LXdpZHRoIG1lZGlhIHF1ZXJ5IHBvbHlmaWxsICogQ29weXJpZ2h0IDIwMTMgU2NvdHQgSmVobAogKiBMaWNlbnNlZCB1bmRlciBodHRwczovL2dpdGh1Yi5jb20vc2NvdHRqZWhsL1Jlc3BvbmQvYmxvYi9tYXN0ZXIvTElDRU5TRS1NSVQKICogICovCgovLyBPbmx5IHJ1biB0aGlzIGNvZGUgaW4gSUUgOAppZiAoISF3aW5kb3cubmF2aWdhdG9yLnVzZXJBZ2VudC5tYXRjaCgiTVNJRSA4IikpIHsKIWZ1bmN0aW9uKGEpeyJ1c2Ugc3RyaWN0IjthLm1hdGNoTWVkaWE9YS5tYXRjaE1lZGlhfHxmdW5jdGlvbihhKXt2YXIgYixjPWEuZG [...]
+<script src="data:application/x-javascript;base64,CgovKioKICogalF1ZXJ5IFBsdWdpbjogU3RpY2t5IFRhYnMKICoKICogQGF1dGhvciBBaWRhbiBMaXN0ZXIgPGFpZGFuQHBocC5uZXQ+CiAqIGFkYXB0ZWQgYnkgUnViZW4gQXJzbGFuIHRvIGFjdGl2YXRlIHBhcmVudCB0YWJzIHRvbwogKiBodHRwOi8vd3d3LmFpZGFubGlzdGVyLmNvbS8yMDE0LzAzL3BlcnNpc3RpbmctdGhlLXRhYi1zdGF0ZS1pbi1ib290c3RyYXAvCiAqLwooZnVuY3Rpb24oJCkgewogICJ1c2Ugc3RyaWN0IjsKICAkLmZuLnJtYXJrZG93blN0aWNreVRhYnMgPSBmdW5jdGlvbigpIHsKICAgIHZhciBjb250ZXh0ID0gdGhpczsKICAgIC8vIFNob3cgdGhlIHRhYi [...]
+<link href="data:text/css;charset=utf-8,pre%20%2Eoperator%2C%0Apre%20%2Eparen%20%7B%0Acolor%3A%20rgb%28104%2C%20118%2C%20135%29%0A%7D%0Apre%20%2Eliteral%20%7B%0Acolor%3A%20%23990073%0A%7D%0Apre%20%2Enumber%20%7B%0Acolor%3A%20%23099%3B%0A%7D%0Apre%20%2Ecomment%20%7B%0Acolor%3A%20%23998%3B%0Afont%2Dstyle%3A%20italic%0A%7D%0Apre%20%2Ekeyword%20%7B%0Acolor%3A%20%23900%3B%0Afont%2Dweight%3A%20bold%0A%7D%0Apre%20%2Eidentifier%20%7B%0Acolor%3A%20rgb%280%2C%200%2C%200%29%3B%0A%7D%0Apre%20%2Estri [...]
+<script src="data:application/x-javascript;base64,dmFyIGhsanM9bmV3IGZ1bmN0aW9uKCl7ZnVuY3Rpb24gbShwKXtyZXR1cm4gcC5yZXBsYWNlKC8mL2dtLCImYW1wOyIpLnJlcGxhY2UoLzwvZ20sIiZsdDsiKX1mdW5jdGlvbiBmKHIscSxwKXtyZXR1cm4gUmVnRXhwKHEsIm0iKyhyLmNJPyJpIjoiIikrKHA/ImciOiIiKSl9ZnVuY3Rpb24gYihyKXtmb3IodmFyIHA9MDtwPHIuY2hpbGROb2Rlcy5sZW5ndGg7cCsrKXt2YXIgcT1yLmNoaWxkTm9kZXNbcF07aWYocS5ub2RlTmFtZT09IkNPREUiKXtyZXR1cm4gcX1pZighKHEubm9kZVR5cGU9PTMmJnEubm9kZVZhbHVlLm1hdGNoKC9ccysvKSkpe2JyZWFrfX19ZnVuY3Rpb24gaCh0LH [...]
 
+<style type="text/css">code{white-space: pre;}</style>
 <style type="text/css">
-body, td {
-   font-family: sans-serif;
-   background-color: white;
-   font-size: 13px;
+  pre:not([class]) {
+    background-color: white;
+  }
+</style>
+<script type="text/javascript">
+if (window.hljs && document.readyState && document.readyState === "complete") {
+   window.setTimeout(function() {
+      hljs.initHighlighting();
+   }, 0);
 }
+</script>
 
-body {
-  max-width: 800px;
-  margin: auto;
-  padding: 1em;
-  line-height: 20px;
-}
 
-tt, code, pre {
-   font-family: 'DejaVu Sans Mono', 'Droid Sans Mono', 'Lucida Console', Consolas, Monaco, monospace;
-}
 
+<style type="text/css">
 h1 {
-   font-size:2.2em;
+  font-size: 34px;
+}
+h1.title {
+  font-size: 38px;
 }
-
 h2 {
-   font-size:1.8em;
+  font-size: 30px;
 }
-
 h3 {
-   font-size:1.4em;
+  font-size: 24px;
 }
-
 h4 {
-   font-size:1.0em;
+  font-size: 18px;
 }
-
 h5 {
-   font-size:0.9em;
+  font-size: 16px;
 }
-
 h6 {
-   font-size:0.8em;
+  font-size: 12px;
 }
-
-a:visited {
-   color: rgb(50%, 0%, 50%);
+.table th:not([align]) {
+  text-align: left;
 }
+</style>
 
-pre, img {
-  max-width: 100%;
+
+</head>
+
+<body>
+
+<style type="text/css">
+.main-container {
+  max-width: 940px;
+  margin-left: auto;
+  margin-right: auto;
 }
-pre {
-  overflow-x: auto;
+code {
+  color: inherit;
+  background-color: rgba(0, 0, 0, 0.04);
 }
-pre code {
-   display: block; padding: 0.5em;
+img {
+  max-width:100%;
+  height: auto;
 }
-
-code {
-  font-size: 92%;
-  border: 1px solid #ccc;
+.tabbed-pane {
+  padding-top: 12px;
 }
-
-code[class] {
-  background-color: #F8F8F8;
+button.code-folding-btn:focus {
+  outline: none;
 }
+</style>
 
-table, td, th {
-  border: none;
-}
 
-blockquote {
-   color:#666666;
-   margin:0;
-   padding-left: 1em;
-   border-left: 0.5em #EEE solid;
-}
 
-hr {
-   height: 0px;
-   border-bottom: none;
-   border-top-width: thin;
-   border-top-style: dotted;
-   border-top-color: #999999;
-}
+<div class="container-fluid main-container">
 
- at media print {
-   * {
-      background: transparent !important;
-      color: black !important;
-      filter:none !important;
-      -ms-filter: none !important;
-   }
-
-   body {
-      font-size:12pt;
-      max-width:100%;
-   }
-
-   a, a:visited {
-      text-decoration: underline;
-   }
-
-   hr {
-      visibility: hidden;
-      page-break-before: always;
-   }
-
-   pre, blockquote {
-      padding-right: 1em;
-      page-break-inside: avoid;
-   }
-
-   tr, img {
-      page-break-inside: avoid;
-   }
-
-   img {
-      max-width: 100% !important;
-   }
-
-   @page :left {
-      margin: 15mm 20mm 15mm 10mm;
-   }
-
-   @page :right {
-      margin: 15mm 10mm 15mm 20mm;
-   }
-
-   p, h2, h3 {
-      orphans: 3; widows: 3;
-   }
-
-   h2, h3 {
-      page-break-after: avoid;
-   }
-}
-</style>
+<!-- tabsets -->
+<script>
+$(document).ready(function () {
+  window.buildTabsets("TOC");
+});
+</script>
 
+<!-- code folding -->
 
 
-</head>
 
-<body>
-<!--
-%\VignetteEngine{knitr::knitr}
-%\VignetteIndexEntry{crul introduction}
-%\VignetteEncoding{UTF-8}
--->
 
-<h1>crul introduction</h1>
 
-<p><code>crul</code> is an HTTP client for R.</p>
 
-<h2>Install</h2>
+<div class="fluid-row" id="header">
 
-<p>Stable CRAN version</p>
 
-<pre><code class="r">install.packages("crul")
-</code></pre>
 
-<p>Dev version</p>
 
-<pre><code class="r">devtools::install_github("ropensci/crul")
-</code></pre>
+</div>
 
-<pre><code class="r">library("crul")
-</code></pre>
 
+<!--
+%\VignetteIndexEntry{crul introduction}
+%\VignetteEngine{knitr::rmarkdown}
+%\VignetteEncoding{UTF-8}
+-->
+<div id="crul-introduction" class="section level1">
+<h1>crul introduction</h1>
+<p><code>crul</code> is an HTTP client for R.</p>
+<div id="install" class="section level2">
+<h2>Install</h2>
+<p>Stable CRAN version</p>
+<pre class="r"><code>install.packages("crul")</code></pre>
+<p>Dev version</p>
+<pre class="r"><code>devtools::install_github("ropensci/crul")</code></pre>
+<pre class="r"><code>library("crul")</code></pre>
+</div>
+<div id="the-client" class="section level2">
 <h2>the client</h2>
-
 <p><code>HttpClient</code> is where to start</p>
-
-<pre><code class="r">(x <- HttpClient$new(
+<pre class="r"><code>(x <- HttpClient$new(
   url = "https://httpbin.org",
   opts = list(
     timeout = 1
@@ -241,114 +149,95 @@ hr {
 ))
 #> <crul connection> 
 #>   url: https://httpbin.org
-#>   options: 
+#>   curl options: 
 #>     timeout: 1
+#>   proxies: 
+#>   auth: 
 #>   headers: 
-#>     a: hello world
-</code></pre>
-
-<p>Makes a R6 class, that has all the bits and bobs you'd expect for doing HTTP
-requests. When it prints, it gives any defaults you've set. As you update
-the object you can see what's been set</p>
-
-<pre><code class="r">x$opts
+#>     a: hello world</code></pre>
+<p>Makes an R6 class that has all the bits and bobs you’d expect for doing HTTP requests. When it prints, it shows any defaults you’ve set. As you update the object you can see what’s been set</p>
+<pre class="r"><code>x$opts
 #> $timeout
-#> [1] 1
-</code></pre>
-
-<pre><code class="r">x$headers
+#> [1] 1</code></pre>
+<pre class="r"><code>x$headers
 #> $a
-#> [1] "hello world"
-</code></pre>
-
+#> [1] "hello world"</code></pre>
+</div>
+<div id="do-some-http" class="section level2">
 <h2>do some http</h2>
-
-<p>The client object created above has http methods that you can call,
-and pass paths to, as well as query parameters, body values, and any other
-curl options.</p>
-
-<p>Here, we'll do a <strong>GET</strong> request on the route <code>/get</code> on our base url
-<code>https://httpbin.org</code> (the full url is then <code>https://httpbin.org/get</code>)</p>
-
-<pre><code class="r">res <- x$get("get")
-</code></pre>
-
-<p>The response from a http request is another R6 class <code>HttpResponse</code>, which
-has slots for the outputs of the request, and some functions to deal with
-the response:</p>
-
+<p>The client object created above has HTTP methods that you can call, passing a path as well as query parameters, body values, and any other curl options.</p>
+<p>Here, we’ll do a <strong>GET</strong> request on the route <code>/get</code> on our base url <code>https://httpbin.org</code> (the full url is then <code>https://httpbin.org/get</code>)</p>
+<pre class="r"><code>res <- x$get("get")</code></pre>
+<p>The response from an HTTP request is another R6 class, <code>HttpResponse</code>, which has slots for the outputs of the request and some functions for dealing with the response:</p>
 <p>Status code</p>
-
-<pre><code class="r">res$status_code
-#> [1] 200
-</code></pre>
-
+<pre class="r"><code>res$status_code
+#> [1] 200</code></pre>
 <p>The content</p>
-
-<pre><code class="r">res$content
+<pre class="r"><code>res$content
 #>   [1] 7b 0a 20 20 22 61 72 67 73 22 3a 20 7b 7d 2c 20 0a 20 20 22 68 65 61
 #>  [24] 64 65 72 73 22 3a 20 7b 0a 20 20 20 20 22 41 22 3a 20 22 68 65 6c 6c
 #>  [47] 6f 20 77 6f 72 6c 64 22 2c 20 0a 20 20 20 20 22 41 63 63 65 70 74 22
 #>  [70] 3a 20 22 2a 2f 2a 22 2c 20 0a 20 20 20 20 22 41 63 63 65 70 74 2d 45
 #>  [93] 6e 63 6f 64 69 6e 67 22 3a 20 22 67 7a 69 70 2c 20 64 65 66 6c 61 74
-#> [116] 65 22 2c 20 0a 20 20 20 20 22 48 6f 73 74 22 3a 20 22 68 74 74 70 62
-#> [139] 69 6e 2e 6f 72 67 22 2c 20 0a 20 20 20 20 22 55 73 65 72 2d 41 67 65
-#> [162] 6e 74 22 3a 20 22 6c 69 62 63 75 72 6c 2f 37 2e 35 31 2e 30 20 72 2d
-#> [185] 63 75 72 6c 2f 32 2e 33 20 63 72 75 6c 2f 30 2e 32 2e 30 22 0a 20 20
-#> [208] 7d 2c 20 0a 20 20 22 6f 72 69 67 69 6e 22 3a 20 22 37 31 2e 36 33 2e
-#> [231] 32 32 33 2e 31 31 33 22 2c 20 0a 20 20 22 75 72 6c 22 3a 20 22 68 74
-#> [254] 74 70 73 3a 2f 2f 68 74 74 70 62 69 6e 2e 6f 72 67 2f 67 65 74 22 0a
-#> [277] 7d 0a
-</code></pre>
-
+#> [116] 65 22 2c 20 0a 20 20 20 20 22 43 6f 6e 6e 65 63 74 69 6f 6e 22 3a 20
+#> [139] 22 63 6c 6f 73 65 22 2c 20 0a 20 20 20 20 22 48 6f 73 74 22 3a 20 22
+#> [162] 68 74 74 70 62 69 6e 2e 6f 72 67 22 2c 20 0a 20 20 20 20 22 55 73 65
+#> [185] 72 2d 41 67 65 6e 74 22 3a 20 22 6c 69 62 63 75 72 6c 2f 37 2e 35 31
+#> [208] 2e 30 20 72 2d 63 75 72 6c 2f 32 2e 36 20 63 72 75 6c 2f 30 2e 33 2e
+#> [231] 38 22 0a 20 20 7d 2c 20 0a 20 20 22 6f 72 69 67 69 6e 22 3a 20 22 31
+#> [254] 35 37 2e 31 33 30 2e 31 37 39 2e 38 36 22 2c 20 0a 20 20 22 75 72 6c
+#> [277] 22 3a 20 22 68 74 74 70 73 3a 2f 2f 68 74 74 70 62 69 6e 2e 6f 72 67
+#> [300] 2f 67 65 74 22 0a 7d 0a</code></pre>
 <p>HTTP method</p>
-
-<pre><code class="r">res$method
-#> [1] "get"
-</code></pre>
-
+<pre class="r"><code>res$method
+#> [1] "get"</code></pre>
 <p>Request headers</p>
-
-<pre><code class="r">res$request_headers
-#> $useragent
-#> [1] "libcurl/7.51.0 r-curl/2.3 crul/0.2.0"
+<pre class="r"><code>res$request_headers
+#> $`User-Agent`
+#> [1] "libcurl/7.51.0 r-curl/2.6 crul/0.3.8"
+#> 
+#> $`Accept-Encoding`
+#> [1] "gzip, deflate"
 #> 
 #> $a
-#> [1] "hello world"
-</code></pre>
-
+#> [1] "hello world"</code></pre>
 <p>Response headers</p>
-
-<pre><code class="r">res$response_headers
+<pre class="r"><code>res$response_headers
 #> $status
 #> [1] "HTTP/1.1 200 OK"
 #> 
+#> $connection
+#> [1] "keep-alive"
+#> 
 #> $server
-#> [1] "nginx"
+#> [1] "meinheld/0.6.1"
 #> 
 #> $date
-#> [1] "Tue, 03 Jan 2017 05:52:03 GMT"
+#> [1] "Tue, 13 Jun 2017 22:26:40 GMT"
 #> 
 #> $`content-type`
 #> [1] "application/json"
 #> 
-#> $`content-length`
-#> [1] "278"
-#> 
-#> $connection
-#> [1] "keep-alive"
-#> 
 #> $`access-control-allow-origin`
 #> [1] "*"
 #> 
 #> $`access-control-allow-credentials`
 #> [1] "true"
-</code></pre>
-
+#> 
+#> $`x-powered-by`
+#> [1] "Flask"
+#> 
+#> $`x-processed-time`
+#> [1] "0.000635147094727"
+#> 
+#> $`content-length`
+#> [1] "307"
+#> 
+#> $via
+#> [1] "1.1 vegur"</code></pre>
 <p>And you can parse the content with a provided function:</p>
-
-<pre><code class="r">res$parse()
-#> [1] "{\n  \"args\": {}, \n  \"headers\": {\n    \"A\": \"hello world\", \n    \"Accept\": \"*/*\", \n    \"Accept-Encoding\": \"gzip, deflate\", \n    \"Host\": \"httpbin.org\", \n    \"User-Agent\": \"libcurl/7.51.0 r-curl/2.3 crul/0.2.0\"\n  }, \n  \"origin\": \"71.63.223.113\", \n  \"url\": \"https://httpbin.org/get\&quot [...]
+<pre class="r"><code>res$parse()
+#> [1] "{\n  \"args\": {}, \n  \"headers\": {\n    \"A\": \"hello world\", \n    \"Accept\": \"*/*\", \n    \"Accept-Encoding\": \"gzip, deflate\", \n    \"Connection\": \"close\", \n    \"Host\": \"httpbin.org\", \n    \"User-Agent\": \"libcurl/7.51.0 r-curl/2.6 crul/0.3.8\"\n  }, \n  \"origin\": \"157.130.179.86\", \n  \ [...]
 jsonlite::fromJSON(res$parse())
 #> $args
 #> named list()
@@ -363,88 +252,96 @@ jsonlite::fromJSON(res$parse())
 #> $headers$`Accept-Encoding`
 #> [1] "gzip, deflate"
 #> 
+#> $headers$Connection
+#> [1] "close"
+#> 
 #> $headers$Host
 #> [1] "httpbin.org"
 #> 
 #> $headers$`User-Agent`
-#> [1] "libcurl/7.51.0 r-curl/2.3 crul/0.2.0"
+#> [1] "libcurl/7.51.0 r-curl/2.6 crul/0.3.8"
 #> 
 #> 
 #> $origin
-#> [1] "71.63.223.113"
+#> [1] "157.130.179.86"
 #> 
 #> $url
-#> [1] "https://httpbin.org/get"
-</code></pre>
-
-<p>With the <code>HttpClient</code> object, which holds any configuration stuff
-we set, we can make other HTTP verb requests. For example, a <code>HEAD</code>
-request:</p>
-
-<pre><code class="r">x$post(
+#> [1] "https://httpbin.org/get"</code></pre>
+<p>With the <code>HttpClient</code> object, which holds any configuration stuff we set, we can make other HTTP verb requests. For example, a <code>POST</code> request:</p>
+<pre class="r"><code>x$post(
   path = "post", 
   body = list(hello = "world")
-)
-#> <crul response> 
-#>   url: https://httpbin.org/post
-#>   request_headers: 
-#>     useragent: libcurl/7.51.0 r-curl/2.3 crul/0.2.0
-#>     a: hello world
-#>   response_headers: 
-#>     status: HTTP/1.1 200 OK
-#>     server: nginx
-#>     date: Tue, 03 Jan 2017 05:52:03 GMT
-#>     content-type: application/json
-#>     content-length: 491
-#>     connection: keep-alive
-#>     access-control-allow-origin: *
-#>     access-control-allow-credentials: true
-#>   status: 200
-</code></pre>
-
+)</code></pre>
+</div>
+<div id="write-to-disk" class="section level2">
 <h2>write to disk</h2>
-
-<pre><code class="r">x <- HttpClient$new(url = "https://httpbin.org")
+<pre class="r"><code>x <- HttpClient$new(url = "https://httpbin.org")
 f <- tempfile()
 res <- x$get(disk = f)
 # when using write to disk, content is a path
 res$content 
-#> [1] "/var/folders/gs/4khph0xs0436gmd2gdnwsg080000gn/T//RtmpoZ8Rrd/fileee7a81dea18"
-</code></pre>
-
+#> [1] "/var/folders/gs/4khph0xs0436gmd2gdnwsg080000gn/T//RtmpF4Mh1B/file6fcc2a328add"</code></pre>
 <p>Read lines</p>
-
-<pre><code class="r">readLines(res$content, n = 10)
+<pre class="r"><code>readLines(res$content, n = 10)
 #>  [1] "<!DOCTYPE html>"                                                                           
 #>  [2] "<html>"                                                                                    
 #>  [3] "<head>"                                                                                    
-#>  [4] "  <meta http-equiv='content-type' value='text/html;charset=utf8'>"                         
-#>  [5] "  <meta name='generator' value='Ronn/v0.7.3 (http://github.com/rtomayko/ronn/tree/0.7.3)'>"
+#>  [4] "  <meta http-equiv='content-type' value='text/html;charset=utf8'>"                         
+#>  [5] "  <meta name='generator' value='Ronn/v0.7.3 (http://github.com/rtomayko/ronn/tree/0.7.3)'>"
 #>  [6] "  <title>httpbin(1): HTTP Client Testing Service</title>"                                  
-#>  [7] "  <style type='text/css' media='all'>"                                                     
+#>  [7] "  <style type='text/css' media='all'>"                                                     
 #>  [8] "  /* style: man */"                                                                        
 #>  [9] "  body#manpage {margin:0}"                                                                 
-#> [10] "  .mp {max-width:100ex;padding:0 9ex 1ex 4ex}"
-</code></pre>
-
+#> [10] "  .mp {max-width:100ex;padding:0 9ex 1ex 4ex}"</code></pre>
+</div>
+<div id="stream-data" class="section level2">
 <h2>stream data</h2>
-
-<pre><code class="r">(x <- HttpClient$new(url = "https://httpbin.org"))
+<pre class="r"><code>(x <- HttpClient$new(url = "https://httpbin.org"))
 #> <crul connection> 
 #>   url: https://httpbin.org
-#>   options: 
+#>   curl options: 
+#>   proxies: 
+#>   auth: 
 #>   headers:
-res <- x$get('stream/5', stream = function(x) cat(rawToChar(x)))
-#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.3 crul/0.2.0"}, "args": {}, "id": 0, "origin": "71.63.223.113"}
-#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.3 crul/0.2.0"}, "args": {}, "id": 1, "origin": "71.63.223.113"}
-#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.3 crul/0.2.0"}, "args": {}, "id": 2, "origin": "71.63.223.113"}
-#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.3 crul/0.2.0"}, "args": {}, "id": 3, "origin": "71.63.223.113"}
-#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.3 crul/0.2.0"}, "args": {}, "id": 4, "origin": "71.63.223.113"}
+res <- x$get('stream/5', stream = function(x) cat(rawToChar(x)))
+#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Connection": "close", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.6 crul/0.3.8"}, "args": {}, "origin": "157.130.179.86", "id": 0}
+#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Connection": "close", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.6 crul/0.3.8"}, "args": {}, "origin": "157.130.179.86", "id": 1}
+#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Connection": "close", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.6 crul/0.3.8"}, "args": {}, "origin": "157.130.179.86", "id": 2}
+#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Connection": "close", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.6 crul/0.3.8"}, "args": {}, "origin": "157.130.179.86", "id": 3}
+#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Connection": "close", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.6 crul/0.3.8"}, "args": {}, "origin": "157.130.179.86", "id": 4}
 # when streaming, content is NULL
 res$content 
-#> NULL
-</code></pre>
+#> NULL</code></pre>
+</div>
+</div>
+
 
-</body>
 
+
+</div>
+
+<script>
+
+// add bootstrap table styles to pandoc tables
+function bootstrapStylePandocTables() {
+  $('tr.header').parent('thead').parent('table').addClass('table table-condensed');
+}
+$(document).ready(function () {
+  bootstrapStylePandocTables();
+});
+
+
+</script>
+
+<!-- dynamically load mathjax for compatibility with self-contained -->
+<script>
+  (function () {
+    var script = document.createElement("script");
+    script.type = "text/javascript";
+    script.src  = "https://mathjax.rstudio.com/latest/MathJax.js?config=TeX-AMS-MML_HTMLorMML";
+    document.getElementsByTagName("head")[0].appendChild(script);
+  })();
+</script>
+
+</body>
 </html>
diff --git a/inst/doc/how-to-use-crul.Rmd b/inst/doc/how-to-use-crul.Rmd
new file mode 100644
index 0000000..fc9951e
--- /dev/null
+++ b/inst/doc/how-to-use-crul.Rmd
@@ -0,0 +1,224 @@
+<!--
+%\VignetteIndexEntry{How to use crul}
+%\VignetteEngine{knitr::rmarkdown}
+%\VignetteEncoding{UTF-8}
+-->
+
+
+
+How to use crul
+===============
+
+The following aims to help you decide how to use `crul` in different 
+scenarios.
+
+First, `crul` is aimed a bit more at developers than at the casual 
+user doing HTTP requests. That is, `crul` is probably a better fit 
+for an R package developer, mainly because it heavily uses `R6` - 
+an interface that's very unlike the interface in `httr` but very
+similar to interacting with classes in Ruby/Python.
+
+Second, there is not yet, but will be soon, the ability to mock 
+HTTP requests. We are working on that, and should have it integrated
+soon. When that feature arrives we'll update this vignette.
+
+Load the library
+
+
+```r
+library("crul")
+```
+
+## A simple HTTP request function
+
+Most likely you'll want to do a `GET` request - so let's start with that - 
+though the details are not much different for other HTTP verbs.
+
+And in most cases you'll likely not want to do asynchronous requests - though
+see below if you do.
+
+You'll probably want to write a small function, like so (annotated for 
+clarity)
+
+
+```r
+make_request <- function(url) {
+  # create an HttpClient object, defining the url
+  cli <- crul::HttpClient$new(url = url)
+  # do a GET request
+  res <- cli$get()
+  # check to see if request failed or succeeded
+  # - if succeeds this will return nothing and proceeds to next step
+  res$raise_for_status()
+  # parse response to plain text (JSON in this case) - most likely you'll 
+  # want UTF-8 encoding
+  txt <- res$parse("UTF-8")
+  # parse the JSON to an R list
+  jsonlite::fromJSON(txt)
+}
+```
+
+Use the function
+
+
+```r
+make_request("https://httpbin.org/get")
+#> $args
+#> named list()
+#> 
+#> $headers
+#> $headers$Accept
+#> [1] "*/*"
+#> 
+#> $headers$`Accept-Encoding`
+#> [1] "gzip, deflate"
+#> 
+#> $headers$Connection
+#> [1] "close"
+#> 
+#> $headers$Host
+#> [1] "httpbin.org"
+#> 
+#> $headers$`User-Agent`
+#> [1] "libcurl/7.51.0 r-curl/2.6 crul/0.3.8"
+#> 
+#> 
+#> $origin
+#> [1] "157.130.179.86"
+#> 
+#> $url
+#> [1] "https://httpbin.org/get"
+```
+
+Now you can use the `make_request` function in your script or package.
+
+## More customized function
+
+Once you get more familiar (or if you're already familiar with HTTP) you may
+want to have more control and toggle more switches.
+
+In the next function, we'll allow users to pass in curl options and use 
+a custom HTTP status checker.
+
+
+```r
+make_request2 <- function(url, ...) {
+  # create an HttpClient object, defining the url
+  cli <- crul::HttpClient$new(url = url)
+  # do a GET request, allow curl options to be passed in
+  res <- cli$get(...)
+  # check to see if request failed or succeeded
+  # - a custom approach this time combining status code, 
+  #   explanation of the code, and message from the server
+  if (res$status_code > 201) {
+    mssg <- jsonlite::fromJSON(res$parse("UTF-8"))$message$message
+    x <- res$status_http()
+    stop(
+      sprintf("HTTP (%s) - %s\n  %s", x$status_code, x$explanation, mssg),
+      call. = FALSE
+    )
+  }
+  # parse response
+  txt <- res$parse("UTF-8")
+  # parse the JSON to an R list
+  jsonlite::fromJSON(txt)
+}
+```
+
+Use the function
+
+
+```r
+make_request2("https://api.crossref.org/works?rows=0")
+#> $status
+#> [1] "ok"
+#> 
+#> $`message-type`
+#> [1] "work-list"
+#> 
+#> $`message-version`
+#> [1] "1.0.0"
+#> 
+#> $message
+#> $message$facets
+#> named list()
+#> 
+#> $message$`total-results`
+#> [1] 89155073
+#> 
+#> $message$items
+#> list()
+#> 
+#> $message$`items-per-page`
+#> [1] 0
+#> 
+#> $message$query
+#> $message$query$`start-index`
+#> [1] 0
+#> 
+#> $message$query$`search-terms`
+#> NULL
+```
+
+No different from the first function (besides the URL). However, now we can 
+pass in curl options:
+
+
+```r
+make_request2("https://api.crossref.org/works?rows=0", verbose = TRUE)
+make_request2("https://api.crossref.org/works?rows=0", timeout_ms = 1)
+```
+
+We can also pass named parameters supported in the `get` method, including
+`query`, `disk`, and `stream`.
+
+
+```r
+make_request2("https://api.crossref.org/works", query = list(rows = 0))
+#> $status
+#> [1] "ok"
+#> 
+#> $`message-type`
+#> [1] "work-list"
+#> 
+#> $`message-version`
+#> [1] "1.0.0"
+#> 
+#> $message
+#> $message$facets
+#> named list()
+#> 
+#> $message$`total-results`
+#> [1] 89155073
+#> 
+#> $message$items
+#> list()
+#> 
+#> $message$`items-per-page`
+#> [1] 0
+#> 
+#> $message$query
+#> $message$query$`start-index`
+#> [1] 0
+#> 
+#> $message$query$`search-terms`
+#> NULL
+```
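+
+The `disk` and `stream` parameters bypass the in-memory body entirely, so
+they are best used on the client directly rather than through a JSON-parsing
+wrapper like `make_request2`. A minimal sketch (assuming network access to
+httpbin.org), mirroring the write-to-disk usage elsewhere in the package docs:
+
+
+```r
+# write the response body straight to disk via the `disk` parameter
+x <- crul::HttpClient$new(url = "https://httpbin.org")
+f <- tempfile()
+res <- x$get("get", disk = f)
+res$content  # a file path, not raw bytes
+```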
+
+In addition, the failure behavior is different, and customized to the 
+specific web resource we are working with.
+
+
+```r
+make_request2("https://api.crossref.org/works?rows=asdf")
+#> Error: HTTP (400) - Bad request syntax or unsupported method
+#>   Integer specified as asdf but must be a positive integer less than or equal to 1000.
+```
+
+## Asynchronous requests
+
+You may want to use asynchronous HTTP requests when any one HTTP request 
+takes "too long". This is of course all relative. You may be dealing with a 
+server that responds very slowly, or other circumstances. 
+
+See the __async with crul__ vignette for more details on asynchronous requests.
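+
+As a teaser, this release adds an `Async` class for firing several requests
+at once. A minimal sketch (assuming network access to httpbin.org; the async
+vignette covers the full interface):
+
+
+```r
+# perform two GET requests in parallel and collect the status codes
+cc <- crul::Async$new(urls = c(
+  "https://httpbin.org/get",
+  "https://httpbin.org/get?a=5"
+))
+res <- cc$get()
+vapply(res, function(z) z$status_code, numeric(1))
+```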
diff --git a/inst/doc/how-to-use-crul.html b/inst/doc/how-to-use-crul.html
new file mode 100644
index 0000000..ffbd184
--- /dev/null
+++ b/inst/doc/how-to-use-crul.html
@@ -0,0 +1,311 @@
+<!DOCTYPE html>
+
+<html xmlns="http://www.w3.org/1999/xhtml">
+
+<head>
+
+<meta charset="utf-8" />
+<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
+<meta name="generator" content="pandoc" />
+
+
+
+
+<title></title>
+
+<script src="data:application/x-javascript;base64,LyohIGpRdWVyeSB2MS4xMS4zIHwgKGMpIDIwMDUsIDIwMTUgalF1ZXJ5IEZvdW5kYXRpb24sIEluYy4gfCBqcXVlcnkub3JnL2xpY2Vuc2UgKi8KIWZ1bmN0aW9uKGEsYil7Im9iamVjdCI9PXR5cGVvZiBtb2R1bGUmJiJvYmplY3QiPT10eXBlb2YgbW9kdWxlLmV4cG9ydHM/bW9kdWxlLmV4cG9ydHM9YS5kb2N1bWVudD9iKGEsITApOmZ1bmN0aW9uKGEpe2lmKCFhLmRvY3VtZW50KXRocm93IG5ldyBFcnJvcigialF1ZXJ5IHJlcXVpcmVzIGEgd2luZG93IHdpdGggYSBkb2N1bWVudCIpO3JldHVybiBiKGEpfTpiKGEpfSgidW5kZWZpbmVkIiE9dHlwZW9mIHdpbmRvdz93aW5kb3c6dG [...]
+<meta name="viewport" content="width=device-width, initial-scale=1" />
+<link href="data:text/css;charset=utf-8,html%7Bfont%2Dfamily%3Asans%2Dserif%3B%2Dwebkit%2Dtext%2Dsize%2Dadjust%3A100%25%3B%2Dms%2Dtext%2Dsize%2Dadjust%3A100%25%7Dbody%7Bmargin%3A0%7Darticle%2Caside%2Cdetails%2Cfigcaption%2Cfigure%2Cfooter%2Cheader%2Chgroup%2Cmain%2Cmenu%2Cnav%2Csection%2Csummary%7Bdisplay%3Ablock%7Daudio%2Ccanvas%2Cprogress%2Cvideo%7Bdisplay%3Ainline%2Dblock%3Bvertical%2Dalign%3Abaseline%7Daudio%3Anot%28%5Bcontrols%5D%29%7Bdisplay%3Anone%3Bheight%3A0%7D%5Bhidden%5D%2Ctem [...]
+<script src="data:application/x-javascript;base64,LyohCiAqIEJvb3RzdHJhcCB2My4zLjUgKGh0dHA6Ly9nZXRib290c3RyYXAuY29tKQogKiBDb3B5cmlnaHQgMjAxMS0yMDE1IFR3aXR0ZXIsIEluYy4KICogTGljZW5zZWQgdW5kZXIgdGhlIE1JVCBsaWNlbnNlCiAqLwppZigidW5kZWZpbmVkIj09dHlwZW9mIGpRdWVyeSl0aHJvdyBuZXcgRXJyb3IoIkJvb3RzdHJhcCdzIEphdmFTY3JpcHQgcmVxdWlyZXMgalF1ZXJ5Iik7K2Z1bmN0aW9uKGEpeyJ1c2Ugc3RyaWN0Ijt2YXIgYj1hLmZuLmpxdWVyeS5zcGxpdCgiICIpWzBdLnNwbGl0KCIuIik7aWYoYlswXTwyJiZiWzFdPDl8fDE9PWJbMF0mJjk9PWJbMV0mJmJbMl08MSl0aHJvdy [...]
+<script src="data:application/x-javascript;base64,LyoqCiogQHByZXNlcnZlIEhUTUw1IFNoaXYgMy43LjIgfCBAYWZhcmthcyBAamRhbHRvbiBAam9uX25lYWwgQHJlbSB8IE1JVC9HUEwyIExpY2Vuc2VkCiovCi8vIE9ubHkgcnVuIHRoaXMgY29kZSBpbiBJRSA4CmlmICghIXdpbmRvdy5uYXZpZ2F0b3IudXNlckFnZW50Lm1hdGNoKCJNU0lFIDgiKSkgewohZnVuY3Rpb24oYSxiKXtmdW5jdGlvbiBjKGEsYil7dmFyIGM9YS5jcmVhdGVFbGVtZW50KCJwIiksZD1hLmdldEVsZW1lbnRzQnlUYWdOYW1lKCJoZWFkIilbMF18fGEuZG9jdW1lbnRFbGVtZW50O3JldHVybiBjLmlubmVySFRNTD0ieDxzdHlsZT4iK2IrIjwvc3R5bGU+IixkLm [...]
+<script src="data:application/x-javascript;base64,LyohIFJlc3BvbmQuanMgdjEuNC4yOiBtaW4vbWF4LXdpZHRoIG1lZGlhIHF1ZXJ5IHBvbHlmaWxsICogQ29weXJpZ2h0IDIwMTMgU2NvdHQgSmVobAogKiBMaWNlbnNlZCB1bmRlciBodHRwczovL2dpdGh1Yi5jb20vc2NvdHRqZWhsL1Jlc3BvbmQvYmxvYi9tYXN0ZXIvTElDRU5TRS1NSVQKICogICovCgovLyBPbmx5IHJ1biB0aGlzIGNvZGUgaW4gSUUgOAppZiAoISF3aW5kb3cubmF2aWdhdG9yLnVzZXJBZ2VudC5tYXRjaCgiTVNJRSA4IikpIHsKIWZ1bmN0aW9uKGEpeyJ1c2Ugc3RyaWN0IjthLm1hdGNoTWVkaWE9YS5tYXRjaE1lZGlhfHxmdW5jdGlvbihhKXt2YXIgYixjPWEuZG [...]
+<script src="data:application/x-javascript;base64,CgovKioKICogalF1ZXJ5IFBsdWdpbjogU3RpY2t5IFRhYnMKICoKICogQGF1dGhvciBBaWRhbiBMaXN0ZXIgPGFpZGFuQHBocC5uZXQ+CiAqIGFkYXB0ZWQgYnkgUnViZW4gQXJzbGFuIHRvIGFjdGl2YXRlIHBhcmVudCB0YWJzIHRvbwogKiBodHRwOi8vd3d3LmFpZGFubGlzdGVyLmNvbS8yMDE0LzAzL3BlcnNpc3RpbmctdGhlLXRhYi1zdGF0ZS1pbi1ib290c3RyYXAvCiAqLwooZnVuY3Rpb24oJCkgewogICJ1c2Ugc3RyaWN0IjsKICAkLmZuLnJtYXJrZG93blN0aWNreVRhYnMgPSBmdW5jdGlvbigpIHsKICAgIHZhciBjb250ZXh0ID0gdGhpczsKICAgIC8vIFNob3cgdGhlIHRhYi [...]
+<link href="data:text/css;charset=utf-8,pre%20%2Eoperator%2C%0Apre%20%2Eparen%20%7B%0Acolor%3A%20rgb%28104%2C%20118%2C%20135%29%0A%7D%0Apre%20%2Eliteral%20%7B%0Acolor%3A%20%23990073%0A%7D%0Apre%20%2Enumber%20%7B%0Acolor%3A%20%23099%3B%0A%7D%0Apre%20%2Ecomment%20%7B%0Acolor%3A%20%23998%3B%0Afont%2Dstyle%3A%20italic%0A%7D%0Apre%20%2Ekeyword%20%7B%0Acolor%3A%20%23900%3B%0Afont%2Dweight%3A%20bold%0A%7D%0Apre%20%2Eidentifier%20%7B%0Acolor%3A%20rgb%280%2C%200%2C%200%29%3B%0A%7D%0Apre%20%2Estri [...]
+<script src="data:application/x-javascript;base64,dmFyIGhsanM9bmV3IGZ1bmN0aW9uKCl7ZnVuY3Rpb24gbShwKXtyZXR1cm4gcC5yZXBsYWNlKC8mL2dtLCImYW1wOyIpLnJlcGxhY2UoLzwvZ20sIiZsdDsiKX1mdW5jdGlvbiBmKHIscSxwKXtyZXR1cm4gUmVnRXhwKHEsIm0iKyhyLmNJPyJpIjoiIikrKHA/ImciOiIiKSl9ZnVuY3Rpb24gYihyKXtmb3IodmFyIHA9MDtwPHIuY2hpbGROb2Rlcy5sZW5ndGg7cCsrKXt2YXIgcT1yLmNoaWxkTm9kZXNbcF07aWYocS5ub2RlTmFtZT09IkNPREUiKXtyZXR1cm4gcX1pZighKHEubm9kZVR5cGU9PTMmJnEubm9kZVZhbHVlLm1hdGNoKC9ccysvKSkpe2JyZWFrfX19ZnVuY3Rpb24gaCh0LH [...]
+
+<style type="text/css">code{white-space: pre;}</style>
+<style type="text/css">
+  pre:not([class]) {
+    background-color: white;
+  }
+</style>
+<script type="text/javascript">
+if (window.hljs && document.readyState && document.readyState === "complete") {
+   window.setTimeout(function() {
+      hljs.initHighlighting();
+   }, 0);
+}
+</script>
+
+
+
+<style type="text/css">
+h1 {
+  font-size: 34px;
+}
+h1.title {
+  font-size: 38px;
+}
+h2 {
+  font-size: 30px;
+}
+h3 {
+  font-size: 24px;
+}
+h4 {
+  font-size: 18px;
+}
+h5 {
+  font-size: 16px;
+}
+h6 {
+  font-size: 12px;
+}
+.table th:not([align]) {
+  text-align: left;
+}
+</style>
+
+
+</head>
+
+<body>
+
+<style type="text/css">
+.main-container {
+  max-width: 940px;
+  margin-left: auto;
+  margin-right: auto;
+}
+code {
+  color: inherit;
+  background-color: rgba(0, 0, 0, 0.04);
+}
+img {
+  max-width:100%;
+  height: auto;
+}
+.tabbed-pane {
+  padding-top: 12px;
+}
+button.code-folding-btn:focus {
+  outline: none;
+}
+</style>
+
+
+
+<div class="container-fluid main-container">
+
+<!-- tabsets -->
+<script>
+$(document).ready(function () {
+  window.buildTabsets("TOC");
+});
+</script>
+
+<!-- code folding -->
+
+
+
+
+
+
+<div class="fluid-row" id="header">
+
+
+
+
+</div>
+
+
+<!--
+%\VignetteIndexEntry{How to use crul}
+%\VignetteEngine{knitr::rmarkdown}
+%\VignetteEncoding{UTF-8}
+-->
+<div id="how-to-use-crul" class="section level1">
+<h1>How to use crul</h1>
+<p>The following aims to help you decide how to use <code>crul</code> in different scenarios.</p>
+<p>First, <code>crul</code> is aimed a bit more at developers than at the casual user doing HTTP requests. That is, <code>crul</code> is probably a better fit for an R package developer, mainly because it heavily uses <code>R6</code> - an interface that’s very unlike the interface in <code>httr</code> but very similar to interacting with classes in Ruby/Python.</p>
+<p>Second, there is not yet, but will be soon, the ability to mock HTTP requests. We are working on that, and should have it integrated soon. When that feature arrives we’ll update this vignette.</p>
+<p>Load the library</p>
+<pre class="r"><code>library("crul")</code></pre>
+<div id="a-simple-http-request-function" class="section level2">
+<h2>A simple HTTP request function</h2>
+<p>Most likely you’ll want to do a <code>GET</code> request - so let’s start with that - though the details are not much different for other HTTP verbs.</p>
+<p>And in most cases you’ll likely not want to do asynchronous requests - though see below if you do.</p>
+<p>You’ll probably want to write a small function, like so (annotated for clarity)</p>
+<pre class="r"><code>make_request <- function(url) {
+  # create an HttpClient object, defining the url
+  cli <- crul::HttpClient$new(url = url)
+  # do a GET request
+  res <- cli$get()
+  # check to see if request failed or succeeded
+  # - if succeeds this will return nothing and proceeds to next step
+  res$raise_for_status()
+  # parse response to plain text (JSON in this case) - most likely you'll 
+  # want UTF-8 encoding
+  txt <- res$parse("UTF-8")
+  # parse the JSON to an R list
+  jsonlite::fromJSON(txt)
+}</code></pre>
+<p>Use the function</p>
+<pre class="r"><code>make_request("https://httpbin.org/get")
+#> $args
+#> named list()
+#> 
+#> $headers
+#> $headers$Accept
+#> [1] "*/*"
+#> 
+#> $headers$`Accept-Encoding`
+#> [1] "gzip, deflate"
+#> 
+#> $headers$Connection
+#> [1] "close"
+#> 
+#> $headers$Host
+#> [1] "httpbin.org"
+#> 
+#> $headers$`User-Agent`
+#> [1] "libcurl/7.51.0 r-curl/2.6 crul/0.3.8"
+#> 
+#> 
+#> $origin
+#> [1] "157.130.179.86"
+#> 
+#> $url
+#> [1] "https://httpbin.org/get"</code></pre>
+<p>Now you can use the <code>make_request</code> function in your script or package.</p>
+</div>
+<div id="more-customized-function" class="section level2">
+<h2>More customized function</h2>
+<p>Once you get more familiar (or if you’re already familiar with HTTP) you may want to have more control and toggle more switches.</p>
+<p>In the next function, we’ll allow users to pass in curl options and use a custom HTTP status checker.</p>
+<pre class="r"><code>make_request2 <- function(url, ...) {
+  # create an HttpClient object, defining the url
+  cli <- crul::HttpClient$new(url = url)
+  # do a GET request, allow curl options to be passed in
+  res <- cli$get(...)
+  # check to see if request failed or succeeded
+  # - a custom approach this time combining status code, 
+  #   explanation of the code, and message from the server
+  if (res$status_code > 201) {
+    mssg <- jsonlite::fromJSON(res$parse("UTF-8"))$message$message
+    x <- res$status_http()
+    stop(
+      sprintf("HTTP (%s) - %s\n  %s", x$status_code, x$explanation, mssg),
+      call. = FALSE
+    )
+  }
+  # parse response
+  txt <- res$parse("UTF-8")
+  # parse the JSON to an R list
+  jsonlite::fromJSON(txt)
+}</code></pre>
+<p>Use the function</p>
+<pre class="r"><code>make_request2("https://api.crossref.org/works?rows=0")
+#> $status
+#> [1] "ok"
+#> 
+#> $`message-type`
+#> [1] "work-list"
+#> 
+#> $`message-version`
+#> [1] "1.0.0"
+#> 
+#> $message
+#> $message$facets
+#> named list()
+#> 
+#> $message$`total-results`
+#> [1] 89155073
+#> 
+#> $message$items
+#> list()
+#> 
+#> $message$`items-per-page`
+#> [1] 0
+#> 
+#> $message$query
+#> $message$query$`start-index`
+#> [1] 0
+#> 
+#> $message$query$`search-terms`
+#> NULL</code></pre>
+<p>No different from the first function (besides the URL). However, now we can pass in curl options:</p>
+<pre class="r"><code>make_request2("https://api.crossref.org/works?rows=0", verbose = TRUE)
+make_request2("https://api.crossref.org/works?rows=0", timeout_ms = 1)</code></pre>
+<p>We can also pass named parameters supported in the <code>get</code> method, including <code>query</code>, <code>disk</code>, and <code>stream</code>.</p>
+<pre class="r"><code>make_request2("https://api.crossref.org/works", query = list(rows = 0))
+#> $status
+#> [1] "ok"
+#> 
+#> $`message-type`
+#> [1] "work-list"
+#> 
+#> $`message-version`
+#> [1] "1.0.0"
+#> 
+#> $message
+#> $message$facets
+#> named list()
+#> 
+#> $message$`total-results`
+#> [1] 89155073
+#> 
+#> $message$items
+#> list()
+#> 
+#> $message$`items-per-page`
+#> [1] 0
+#> 
+#> $message$query
+#> $message$query$`start-index`
+#> [1] 0
+#> 
+#> $message$query$`search-terms`
+#> NULL</code></pre>
+<p>In addition, the failure behavior is different, and customized to the specific web resource we are working with.</p>
+<pre class="r"><code>make_request2("https://api.crossref.org/works?rows=asdf")
+#> Error: HTTP (400) - Bad request syntax or unsupported method
+#>   Integer specified as asdf but must be a positive integer less than or equal to 1000.</code></pre>
+</div>
+<div id="asynchronous-requests" class="section level2">
+<h2>Asynchronous requests</h2>
+<p>You may want to use asynchronous HTTP requests when any one HTTP request takes “too long”. This is of course all relative. You may be dealing with a server that responds very slowly, or other circumstances.</p>
+<p>See the <strong>async with crul</strong> vignette for more details on asynchronous requests.</p>
+</div>
+</div>
+
+
+
+
+</div>
+
+<script>
+
+// add bootstrap table styles to pandoc tables
+function bootstrapStylePandocTables() {
+  $('tr.header').parent('thead').parent('table').addClass('table table-condensed');
+}
+$(document).ready(function () {
+  bootstrapStylePandocTables();
+});
+
+
+</script>
+
+<!-- dynamically load mathjax for compatibility with self-contained -->
+<script>
+  (function () {
+    var script = document.createElement("script");
+    script.type = "text/javascript";
+    script.src  = "https://mathjax.rstudio.com/latest/MathJax.js?config=TeX-AMS-MML_HTMLorMML";
+    document.getElementsByTagName("head")[0].appendChild(script);
+  })();
+</script>
+
+</body>
+</html>
diff --git a/inst/vign/crul_vignette.Rmd b/inst/vign/crul_vignette.Rmd
deleted file mode 100644
index 39ae112..0000000
--- a/inst/vign/crul_vignette.Rmd
+++ /dev/null
@@ -1,156 +0,0 @@
-<!--
-%\VignetteEngine{knitr::knitr}
-%\VignetteIndexEntry{crul introduction}
-%\VignetteEncoding{UTF-8}
--->
-
-```{r echo=FALSE}
-knitr::opts_chunk$set(
-  comment = "#>",
-  collapse = TRUE,
-  warning = FALSE,
-  message = FALSE
-)
-```
-
-crul introduction
-=================
-
-`crul` is an HTTP client for R.
-
-## Install
-
-Stable CRAN version
-
-```{r eval=FALSE}
-install.packages("crul")
-```
-
-Dev version
-
-```{r eval=FALSE}
-devtools::install_github("ropensci/crul")
-```
-
-```{r}
-library("crul")
-```
-
-## the client
-
-`HttpClient` is where to start
-
-```{r}
-(x <- HttpClient$new(
-  url = "https://httpbin.org",
-  opts = list(
-    timeout = 1
-  ),
-  headers = list(
-    a = "hello world"
-  )
-))
-```
-
-Makes a R6 class, that has all the bits and bobs you'd expect for doing HTTP
-requests. When it prints, it gives any defaults you've set. As you update
-the object you can see what's been set
-
-```{r}
-x$opts
-```
-
-```{r}
-x$headers
-```
-
-## do some http
-
-The client object created above has http methods that you can call,
-and pass paths to, as well as query parameters, body values, and any other
-curl options.
-
-Here, we'll do a __GET__ request on the route `/get` on our base url
-`https://httpbin.org` (the full url is then `https://httpbin.org/get`)
-
-```{r}
-res <- x$get("get")
-```
-
-The response from a http request is another R6 class `HttpResponse`, which
-has slots for the outputs of the request, and some functions to deal with
-the response:
-
-Status code
-
-```{r}
-res$status_code
-```
-
-The content
-
-```{r}
-res$content
-```
-
-HTTP method
-
-```{r}
-res$method
-```
-
-Request headers
-
-```{r}
-res$request_headers
-```
-
-Response headers
-
-```{r}
-res$response_headers
-```
-
-And you can parse the content with a provided function:
-
-```{r}
-res$parse()
-jsonlite::fromJSON(res$parse())
-```
-
-With the `HttpClient` object, which holds any configuration stuff
-we set, we can make other HTTP verb requests. For example, a `HEAD`
-request:
-
-```{r}
-x$post(
-  path = "post", 
-  body = list(hello = "world")
-)
-```
-
-
-## write to disk
-
-```{r}
-x <- HttpClient$new(url = "https://httpbin.org")
-f <- tempfile()
-res <- x$get(disk = f)
-# when using write to disk, content is a path
-res$content 
-```
-
-Read lines
-
-```{r}
-readLines(res$content, n = 10)
-```
-
-## stream data
-
-```{r}
-(x <- HttpClient$new(url = "https://httpbin.org"))
-res <- x$get('stream/5', stream = function(x) cat(rawToChar(x)))
-# when streaming, content is NULL
-res$content 
-```
diff --git a/inst/vign/crul_vignette.md b/inst/vign/crul_vignette.md
deleted file mode 100644
index 0c31d74..0000000
--- a/inst/vign/crul_vignette.md
+++ /dev/null
@@ -1,281 +0,0 @@
-<!--
-%\VignetteEngine{knitr::knitr}
-%\VignetteIndexEntry{crul introduction}
-%\VignetteEncoding{UTF-8}
--->
-
-
-
-crul introduction
-=================
-
-`crul` is an HTTP client for R.
-
-## Install
-
-Stable CRAN version
-
-
-```r
-install.packages("crul")
-```
-
-Dev version
-
-
-```r
-devtools::install_github("ropensci/crul")
-```
-
-
-```r
-library("crul")
-```
-
-## the client
-
-`HttpClient` is where to start
-
-
-```r
-(x <- HttpClient$new(
-  url = "https://httpbin.org",
-  opts = list(
-    timeout = 1
-  ),
-  headers = list(
-    a = "hello world"
-  )
-))
-#> <crul connection> 
-#>   url: https://httpbin.org
-#>   options: 
-#>     timeout: 1
-#>   headers: 
-#>     a: hello world
-```
-
-Makes a R6 class, that has all the bits and bobs you'd expect for doing HTTP
-requests. When it prints, it gives any defaults you've set. As you update
-the object you can see what's been set
-
-
-```r
-x$opts
-#> $timeout
-#> [1] 1
-```
-
-
-```r
-x$headers
-#> $a
-#> [1] "hello world"
-```
-
-## do some http
-
-The client object created above has http methods that you can call,
-and pass paths to, as well as query parameters, body values, and any other
-curl options.
-
-Here, we'll do a __GET__ request on the route `/get` on our base url
-`https://httpbin.org` (the full url is then `https://httpbin.org/get`)
-
-
-```r
-res <- x$get("get")
-```
-
-The response from a http request is another R6 class `HttpResponse`, which
-has slots for the outputs of the request, and some functions to deal with
-the response:
-
-Status code
-
-
-```r
-res$status_code
-#> [1] 200
-```
-
-The content
-
-
-```r
-res$content
-#>   [1] 7b 0a 20 20 22 61 72 67 73 22 3a 20 7b 7d 2c 20 0a 20 20 22 68 65 61
-#>  [24] 64 65 72 73 22 3a 20 7b 0a 20 20 20 20 22 41 22 3a 20 22 68 65 6c 6c
-#>  [47] 6f 20 77 6f 72 6c 64 22 2c 20 0a 20 20 20 20 22 41 63 63 65 70 74 22
-#>  [70] 3a 20 22 2a 2f 2a 22 2c 20 0a 20 20 20 20 22 41 63 63 65 70 74 2d 45
-#>  [93] 6e 63 6f 64 69 6e 67 22 3a 20 22 67 7a 69 70 2c 20 64 65 66 6c 61 74
-#> [116] 65 22 2c 20 0a 20 20 20 20 22 48 6f 73 74 22 3a 20 22 68 74 74 70 62
-#> [139] 69 6e 2e 6f 72 67 22 2c 20 0a 20 20 20 20 22 55 73 65 72 2d 41 67 65
-#> [162] 6e 74 22 3a 20 22 6c 69 62 63 75 72 6c 2f 37 2e 35 31 2e 30 20 72 2d
-#> [185] 63 75 72 6c 2f 32 2e 33 20 63 72 75 6c 2f 30 2e 32 2e 30 22 0a 20 20
-#> [208] 7d 2c 20 0a 20 20 22 6f 72 69 67 69 6e 22 3a 20 22 37 31 2e 36 33 2e
-#> [231] 32 32 33 2e 31 31 33 22 2c 20 0a 20 20 22 75 72 6c 22 3a 20 22 68 74
-#> [254] 74 70 73 3a 2f 2f 68 74 74 70 62 69 6e 2e 6f 72 67 2f 67 65 74 22 0a
-#> [277] 7d 0a
-```
-
-HTTP method
-
-
-```r
-res$method
-#> [1] "get"
-```
-
-Request headers
-
-
-```r
-res$request_headers
-#> $useragent
-#> [1] "libcurl/7.51.0 r-curl/2.3 crul/0.2.0"
-#> 
-#> $a
-#> [1] "hello world"
-```
-
-Response headers
-
-
-```r
-res$response_headers
-#> $status
-#> [1] "HTTP/1.1 200 OK"
-#> 
-#> $server
-#> [1] "nginx"
-#> 
-#> $date
-#> [1] "Tue, 03 Jan 2017 05:52:03 GMT"
-#> 
-#> $`content-type`
-#> [1] "application/json"
-#> 
-#> $`content-length`
-#> [1] "278"
-#> 
-#> $connection
-#> [1] "keep-alive"
-#> 
-#> $`access-control-allow-origin`
-#> [1] "*"
-#> 
-#> $`access-control-allow-credentials`
-#> [1] "true"
-```
-
-And you can parse the content with a provided function:
-
-
-```r
-res$parse()
-#> [1] "{\n  \"args\": {}, \n  \"headers\": {\n    \"A\": \"hello world\", \n    \"Accept\": \"*/*\", \n    \"Accept-Encoding\": \"gzip, deflate\", \n    \"Host\": \"httpbin.org\", \n    \"User-Agent\": \"libcurl/7.51.0 r-curl/2.3 crul/0.2.0\"\n  }, \n  \"origin\": \"71.63.223.113\", \n  \"url\": \"https://httpbin.org/get\"\n}\n"
-jsonlite::fromJSON(res$parse())
-#> $args
-#> named list()
-#> 
-#> $headers
-#> $headers$A
-#> [1] "hello world"
-#> 
-#> $headers$Accept
-#> [1] "*/*"
-#> 
-#> $headers$`Accept-Encoding`
-#> [1] "gzip, deflate"
-#> 
-#> $headers$Host
-#> [1] "httpbin.org"
-#> 
-#> $headers$`User-Agent`
-#> [1] "libcurl/7.51.0 r-curl/2.3 crul/0.2.0"
-#> 
-#> 
-#> $origin
-#> [1] "71.63.223.113"
-#> 
-#> $url
-#> [1] "https://httpbin.org/get"
-```
-
-With the `HttpClient` object, which holds any configuration stuff
-we set, we can make other HTTP verb requests. For example, a `HEAD`
-request:
-
-
-```r
-x$post(
-  path = "post", 
-  body = list(hello = "world")
-)
-#> <crul response> 
-#>   url: https://httpbin.org/post
-#>   request_headers: 
-#>     useragent: libcurl/7.51.0 r-curl/2.3 crul/0.2.0
-#>     a: hello world
-#>   response_headers: 
-#>     status: HTTP/1.1 200 OK
-#>     server: nginx
-#>     date: Tue, 03 Jan 2017 05:52:03 GMT
-#>     content-type: application/json
-#>     content-length: 491
-#>     connection: keep-alive
-#>     access-control-allow-origin: *
-#>     access-control-allow-credentials: true
-#>   status: 200
-```
-
-
-## write to disk
-
-
-```r
-x <- HttpClient$new(url = "https://httpbin.org")
-f <- tempfile()
-res <- x$get(disk = f)
-# when using write to disk, content is a path
-res$content 
-#> [1] "/var/folders/gs/4khph0xs0436gmd2gdnwsg080000gn/T//RtmpoZ8Rrd/fileee7a81dea18"
-```
-
-Read lines
-
-
-```r
-readLines(res$content, n = 10)
-#>  [1] "<!DOCTYPE html>"                                                                           
-#>  [2] "<html>"                                                                                    
-#>  [3] "<head>"                                                                                    
-#>  [4] "  <meta http-equiv='content-type' value='text/html;charset=utf8'>"                         
-#>  [5] "  <meta name='generator' value='Ronn/v0.7.3 (http://github.com/rtomayko/ronn/tree/0.7.3)'>"
-#>  [6] "  <title>httpbin(1): HTTP Client Testing Service</title>"                                  
-#>  [7] "  <style type='text/css' media='all'>"                                                     
-#>  [8] "  /* style: man */"                                                                        
-#>  [9] "  body#manpage {margin:0}"                                                                 
-#> [10] "  .mp {max-width:100ex;padding:0 9ex 1ex 4ex}"
-```
-
-## stream data
-
-
-```r
-(x <- HttpClient$new(url = "https://httpbin.org"))
-#> <crul connection> 
-#>   url: https://httpbin.org
-#>   options: 
-#>   headers:
-res <- x$get('stream/5', stream = function(x) cat(rawToChar(x)))
-#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.3 crul/0.2.0"}, "args": {}, "id": 0, "origin": "71.63.223.113"}
-#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.3 crul/0.2.0"}, "args": {}, "id": 1, "origin": "71.63.223.113"}
-#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.3 crul/0.2.0"}, "args": {}, "id": 2, "origin": "71.63.223.113"}
-#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.3 crul/0.2.0"}, "args": {}, "id": 3, "origin": "71.63.223.113"}
-#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.3 crul/0.2.0"}, "args": {}, "id": 4, "origin": "71.63.223.113"}
-# when streaming, content is NULL
-res$content 
-#> NULL
-```
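The streaming callback above just prints each chunk. A closure can instead accumulate the raw chunks for parsing after the request finishes; a minimal sketch (assumes network access to httpbin.org and the jsonlite package):

```r
library(crul)

# buffer chunks via <<- into the enclosing environment
chunks <- character()
collect <- function(x) chunks <<- c(chunks, rawToChar(x))

x <- HttpClient$new(url = "https://httpbin.org")
res <- x$get("stream/5", stream = collect)

# each streamed line is a standalone JSON object
lines <- unlist(strsplit(paste(chunks, collapse = ""), "\n"))
str(lapply(lines, jsonlite::fromJSON), max.level = 1)
```

As with the printing example, `res$content` is `NULL` here; the data lives only in `chunks`.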
diff --git a/man/Async.Rd b/man/Async.Rd
new file mode 100644
index 0000000..011bf62
--- /dev/null
+++ b/man/Async.Rd
@@ -0,0 +1,68 @@
+% Generated by roxygen2: do not edit by hand
+% Please edit documentation in R/async.R
+\docType{data}
+\name{Async}
+\alias{Async}
+\title{Simple async client}
+\arguments{
+\item{urls}{(character) one or more URLs (required)}
+}
+\value{
+a list, with objects of class \code{\link[=HttpResponse]{HttpResponse()}}.
+Responses are returned in the order they are passed in.
+}
+\description{
+A client to work with many URLs, but all with the same HTTP method
+}
+\details{
+\strong{Methods}
+\describe{
+\item{\code{get(path, query, ...)}}{
+make async GET requests for all URLs
+}
+\item{\code{post(path, query, body, encode, ...)}}{
+make async POST requests for all URLs
+}
+\item{\code{put(path, query, body, encode, ...)}}{
+make async PUT requests for all URLs
+}
+\item{\code{patch(path, query, body, encode, ...)}}{
+make async PATCH requests for all URLs
+}
+\item{\code{delete(path, query, body, encode, ...)}}{
+make async DELETE requests for all URLs
+}
+\item{\code{head(path, ...)}}{
+make async HEAD requests for all URLs
+}
+}
+
+See \code{\link[=HttpClient]{HttpClient()}} for information on parameters.
+}
+\examples{
+\dontrun{
+cc <- Async$new(
+  urls = c(
+    'https://httpbin.org/',
+    'https://httpbin.org/get?a=5',
+    'https://httpbin.org/get?foo=bar'
+  )
+)
+cc
+(res <- cc$get())
+res[[1]]
+res[[1]]$url
+res[[1]]$success()
+res[[1]]$status_http()
+res[[1]]$response_headers
+res[[1]]$method
+res[[1]]$content
+res[[1]]$parse("UTF-8")
+
+lapply(res, function(z) z$parse("UTF-8"))
+}
+}
+\seealso{
+Other async: \code{\link{AsyncVaried}}
+}
+\keyword{datasets}
diff --git a/man/AsyncVaried.Rd b/man/AsyncVaried.Rd
new file mode 100644
index 0000000..448345d
--- /dev/null
+++ b/man/AsyncVaried.Rd
@@ -0,0 +1,125 @@
+% Generated by roxygen2: do not edit by hand
+% Please edit documentation in R/asyncvaried.R
+\docType{data}
+\name{AsyncVaried}
+\alias{AsyncVaried}
+\title{Async client for different request types}
+\arguments{
+\item{..., .list}{Any number of objects of class \code{\link[=HttpRequest]{HttpRequest()}};
+you must supply input to one of these parameters, but not both}
+}
+\value{
+An object of class \code{AsyncVaried} with variables and methods.
+Responses are returned in the order they are passed in.
+}
+\description{
+Async client for different request types
+}
+\details{
+\strong{Methods}
+\describe{
+\item{\code{request()}}{
+Execute asynchronous requests
+- returns: nothing, responses stored inside object,
+though will print messages if you choose verbose output
+}
+\item{\code{requests()}}{
+list requests
+- returns: a list of \code{HttpRequest} objects, empty list before
+requests made
+}
+\item{\code{responses()}}{
+list responses
+- returns: a list of \code{HttpResponse} objects, empty list before
+requests made
+}
+\item{\code{parse(encoding = "UTF-8")}}{
+parse content
+- returns: character vector, empty character vector before
+requests made
+}
+\item{\code{status_code()}}{
+(integer) HTTP status codes
+- returns: numeric vector, empty numeric vector before
+requests made
+}
+\item{\code{status()}}{
+(list) HTTP status objects
+- returns: a list of \code{http_code} objects, empty list before
+requests made
+}
+\item{\code{content()}}{
+raw content
+- returns: raw list, empty list before requests made
+}
+\item{\code{times()}}{
+curl request times
+- returns: list of named numeric vectors, empty list before
+requests made
+}
+}
+}
+\examples{
+\dontrun{
+# pass in requests via ...
+req1 <- HttpRequest$new(
+  url = "https://httpbin.org/get",
+  opts = list(verbose = TRUE),
+  headers = list(foo = "bar")
+)$get()
+req2 <- HttpRequest$new(url = "https://httpbin.org/post")$post()
+
+# Create an AsyncVaried object
+out <- AsyncVaried$new(req1, req2)
+
+# before you make requests, the methods return empty objects
+out$status()
+out$status_code()
+out$content()
+out$times()
+out$parse()
+out$responses()
+
+# make requests
+out$request()
+
+# access various parts
+## http status objects
+out$status()
+## status codes
+out$status_code()
+## content (raw data)
+out$content()
+## times
+out$times()
+## parsed content
+out$parse()
+## response objects
+out$responses()
+
+# pass in requests in a list via .list param
+reqlist <- list(
+  HttpRequest$new(url = "https://httpbin.org/get")$get(),
+  HttpRequest$new(url = "https://httpbin.org/post")$post(),
+  HttpRequest$new(url = "https://httpbin.org/put")$put(),
+  HttpRequest$new(url = "https://httpbin.org/delete")$delete(),
+  HttpRequest$new(url = "https://httpbin.org/get?g=5")$get(),
+  HttpRequest$new(
+    url = "https://httpbin.org/post")$post(body = list(y = 9)),
+  HttpRequest$new(
+    url = "https://httpbin.org/get")$get(query = list(hello = "world"))
+)
+
+out <- AsyncVaried$new(.list = reqlist)
+out$request()
+out$status()
+out$status_code()
+out$content()
+out$times()
+out$parse()
+}
+}
+\seealso{
+Other async: \code{\link{Async}}
+}
+\keyword{datasets}
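After `request()` runs, the vectorized accessors return results in request order, so you can filter parsed bodies by status code in one step. A hedged sketch against httpbin.org (network required):

```r
library(crul)

reqs <- list(
  HttpRequest$new(url = "https://httpbin.org/get")$get(),
  HttpRequest$new(url = "https://httpbin.org/status/404")$get()
)
out <- AsyncVaried$new(.list = reqs)
out$request()

# status_code() and parse() pair up by position
codes <- out$status_code()
bodies <- out$parse()
bodies[codes < 300]  # keep only successful responses
```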
diff --git a/man/HttpClient.Rd b/man/HttpClient.Rd
index bf0d0bc..833f47b 100644
--- a/man/HttpClient.Rd
+++ b/man/HttpClient.Rd
@@ -7,48 +7,56 @@
 \arguments{
 \item{url}{(character) A url. One of \code{url} or \code{handle} required.}
 
-\item{opts}{(list) curl options}
+\item{opts}{(list) curl options, a named list. See
+\code{\link[curl]{curl_options}} for available curl options}
 
-\item{handle}{A handle}
+\item{proxies}{an object of class \code{proxy}, as returned from the
+\code{\link{proxy}} function. Supports one proxy for now}
+
+\item{headers}{(list) a named list of headers}
+
+\item{handle}{A handle, see \code{\link{handle}}}
 }
 \description{
 HTTP client
 }
 \details{
 \strong{Methods}
-  \describe{
-    \item{\code{get(path, query, disk, stream, ...)}}{
-      Make a GET request
-    }
-    \item{\code{post(path, query, body, disk, stream, ...)}}{
-      Make a POST request
-    }
-    \item{\code{put(path, query, body, disk, stream, ...)}}{
-      Make a PUT request
-    }
-    \item{\code{patch(path, query, body, disk, stream, ...)}}{
-      Make a PATCH request
-    }
-    \item{\code{delete(path, query, body, disk, stream, ...)}}{
-      Make a DELETE request
-    }
-    \item{\code{head(path, disk, stream, ...)}}{
-      Make a HEAD request
-    }
-  }
+\describe{
+\item{\code{get(path, query, disk, stream, ...)}}{
+Make a GET request
+}
+\item{\code{post(path, query, body, disk, stream, ...)}}{
+Make a POST request
+}
+\item{\code{put(path, query, body, disk, stream, ...)}}{
+Make a PUT request
+}
+\item{\code{patch(path, query, body, disk, stream, ...)}}{
+Make a PATCH request
+}
+\item{\code{delete(path, query, body, disk, stream, ...)}}{
+Make a DELETE request
+}
+\item{\code{head(path, disk, stream, ...)}}{
+Make a HEAD request
+}
+}
 
 Possible parameters (not all are allowed in each HTTP verb):
 \itemize{
- \item path - URL path, appended to the base URL
- \item query - query terms, as a list
- \item body - body as an R list
- \item encode - one of form, multipart, json, or raw
- \item disk - a path to write to. if NULL (default), memory used
- \item stream - an R function to determine how to stream data. if
- NULL (default), memory used
- \item ... curl options, only those in the acceptable set from
- \code{\link[curl]{curl_options}} except the following: httpget, httppost,
- post, postfields, postfieldsize, and customrequest
+\item path - URL path, appended to the base URL
+\item query - query terms, as a list
+\item body - body as an R list
+\item encode - one of form, multipart, json, or raw
+\item disk - a path to write to. if NULL (default), memory used.
+See \code{\link[curl:curl_fetch_disk]{curl::curl_fetch_disk()}} for help.
+\item stream - an R function to determine how to stream data. if
+NULL (default), memory used. See \code{\link[curl:curl_fetch_stream]{curl::curl_fetch_stream()}}
+for help
+\item ... curl options, only those in the acceptable set from
+\code{\link[curl:curl_options]{curl::curl_options()}} except the following: httpget, httppost,
+post, postfields, postfieldsize, and customrequest
 }
 }
 \examples{
@@ -85,11 +93,10 @@ x$post('post')
 # query params are URL encoded for you, so DO NOT do it yourself
 ## if you url encode yourself, it gets double encoded, and that's bad
 (x <- HttpClient$new(url = "https://httpbin.org"))
-res <- x$get("get", query = list(a = 'hello world'), verbose = TRUE)
+res <- x$get("get", query = list(a = 'hello world'))
 }
 \seealso{
-\code{\link{post-requests}}, \code{\link{http-headers}},
-\code{\link{writing-options}}
+\link{post-requests}, \link{delete-requests}, \link{http-headers},
+\link{writing-options}
 }
 \keyword{datasets}
-
diff --git a/man/HttpRequest.Rd b/man/HttpRequest.Rd
new file mode 100644
index 0000000..32ff2fb
--- /dev/null
+++ b/man/HttpRequest.Rd
@@ -0,0 +1,91 @@
+% Generated by roxygen2: do not edit by hand
+% Please edit documentation in R/httprequest.R
+\docType{data}
+\name{HttpRequest}
+\alias{HttpRequest}
+\title{HTTP request object}
+\arguments{
+\item{url}{(character) A url. One of \code{url} or \code{handle} required.}
+
+\item{opts}{(list) curl options, a named list. See
+\code{\link[curl]{curl_options}} for available curl options}
+
+\item{proxies}{an object of class \code{proxy}, as returned from the
+\code{\link{proxy}} function. Supports one proxy for now}
+
+\item{headers}{(list) a named list of headers}
+
+\item{handle}{A handle, see \code{\link{handle}}}
+}
+\description{
+HTTP request object
+}
+\details{
+This R6 class doesn't make actual HTTP requests as
+\code{\link[=HttpClient]{HttpClient()}} does - it is for building requests to
+use for async HTTP requests in \code{\link[=AsyncVaried]{AsyncVaried()}}
+
+Note that you can access HTTP verbs after creating an \code{HttpRequest}
+object, just as you can with \code{HttpClient}. See examples for usage.
+
+Also note that when you call HTTP verbs on an \code{HttpRequest} object you
+don't need to assign the result to a new variable, as the details
+you've added are stored in the object itself.
+
+\strong{Methods}
+\describe{
+\item{\code{get(path, query, disk, stream, ...)}}{
+Define a GET request
+}
+\item{\code{post(path, query, body, disk, stream, ...)}}{
+Define a POST request
+}
+\item{\code{put(path, query, body, disk, stream, ...)}}{
+Define a PUT request
+}
+\item{\code{patch(path, query, body, disk, stream, ...)}}{
+Define a PATCH request
+}
+\item{\code{delete(path, query, body, disk, stream, ...)}}{
+Define a DELETE request
+}
+\item{\code{head(path, disk, stream, ...)}}{
+Define a HEAD request
+}
+\item{\code{method()}}{
+Get the HTTP method (if defined)
+- returns character string
+}
+}
+
+See \code{\link[=HttpClient]{HttpClient()}} for information on parameters.
+}
+\examples{
+x <- HttpRequest$new(url = "https://httpbin.org/get")
+## note here how the HTTP method is shown on the first line to the right
+x$get()
+
+## assign to a new object to keep the output
+z <- x$get()
+### get the HTTP method
+z$method()
+
+(x <- HttpRequest$new(url = "https://httpbin.org/get")$get())
+x$url
+x$payload
+
+(x <- HttpRequest$new(url = "https://httpbin.org/post"))
+x$post(body = list(foo = "bar"))
+
+HttpRequest$new(
+  url = "https://httpbin.org/get",
+  headers = list(
+    `Content-Type` = "application/json"
+  )
+)
+}
+\seealso{
+\link{post-requests}, \link{delete-requests},
+\link{http-headers}, \link{writing-options}
+}
+\keyword{datasets}
diff --git a/man/HttpResponse.Rd b/man/HttpResponse.Rd
index 9581d71..bda0eaa 100644
--- a/man/HttpResponse.Rd
+++ b/man/HttpResponse.Rd
@@ -5,35 +5,51 @@
 \alias{HttpResponse}
 \title{Base response object}
 \arguments{
-\item{url}{(character) A url}
+\item{url}{(character) A url, required}
 
 \item{opts}{(list) curl options}
 
 \item{handle}{A handle}
+
+\item{method}{(character) HTTP method}
+
+\item{status_code}{(integer) status code}
+
+\item{request_headers}{(list) request headers, named list}
+
+\item{response_headers}{(list) response headers, named list}
+
+\item{modified}{(character) modified date}
+
+\item{times}{(vector) named vector}
+
+\item{content}{(raw) raw binary content response}
+
+\item{request}{request object, with all details}
 }
 \description{
 Base response object
 }
 \details{
 \strong{Methods}
-  \describe{
-    \item{\code{parse()}}{
-      Parse the raw response content to text
-    }
-    \item{\code{success()}}{
-      Was status code less than or equal to 201.
-      returns boolean
-    }
-    \item{\code{status_http()}}{
-      Get HTTP status code, message, and explanation
-    }
-    \item{\code{raise_for_status()}}{
-      Check HTTP status and stop with appropriate
-      HTTP error code and message if >= 300.
-      - If you have \code{fauxpas} installed we use that,
-      otherwise use \pkg{httpcode}
-    }
-  }
+\describe{
+\item{\code{parse()}}{
+Parse the raw response content to text
+}
+\item{\code{success()}}{
+Was the status code less than or equal to 201?
+Returns boolean
+}
+\item{\code{status_http()}}{
+Get HTTP status code, message, and explanation
+}
+\item{\code{raise_for_status()}}{
+Check HTTP status and stop with appropriate
+HTTP error code and message if >= 300.
+- If you have \code{fauxpas} installed we use that,
+otherwise use \pkg{httpcode}
+}
+}
 }
 \examples{
 x <- HttpResponse$new(method = "get", url = "https://httpbin.org")
@@ -61,4 +77,3 @@ x <- HttpClient$new(url = 'https://httpbin.org/status/414')
  \dontrun{res$raise_for_status()}
 }
 \keyword{datasets}
-
diff --git a/man/HttpStubbedResponse.Rd b/man/HttpStubbedResponse.Rd
new file mode 100644
index 0000000..74f0380
--- /dev/null
+++ b/man/HttpStubbedResponse.Rd
@@ -0,0 +1,43 @@
+% Generated by roxygen2: do not edit by hand
+% Please edit documentation in R/stubbed-response.R
+\docType{data}
+\name{HttpStubbedResponse}
+\alias{HttpStubbedResponse}
+\title{stubbed response object}
+\arguments{
+\item{url}{(character) A url}
+
+\item{opts}{(list) curl options}
+
+\item{handle}{A handle}
+}
+\description{
+stubbed response object
+}
+\details{
+\strong{Methods}
+\describe{
+\item{\code{parse()}}{
+Parse the raw response content to text
+}
+\item{\code{success()}}{
+Was the status code less than or equal to 201?
+Returns boolean
+}
+\item{\code{status_http()}}{
+Get HTTP status code, message, and explanation
+}
+\item{\code{raise_for_status()}}{
+Check HTTP status and stop with appropriate
+HTTP error code and message if >= 300.
+- If you have \code{fauxpas} installed we use that,
+otherwise use \pkg{httpcode}
+}
+}
+}
+\examples{
+(x <- HttpStubbedResponse$new(method = "get", url = "https://httpbin.org"))
+x$url
+x$method
+}
+\keyword{datasets}
diff --git a/man/auth.Rd b/man/auth.Rd
new file mode 100644
index 0000000..4599451
--- /dev/null
+++ b/man/auth.Rd
@@ -0,0 +1,44 @@
+% Generated by roxygen2: do not edit by hand
+% Please edit documentation in R/auth.R
+\name{auth}
+\alias{auth}
+\title{Authentication}
+\usage{
+auth(user, pwd, auth = "basic")
+}
+\arguments{
+\item{user}{(character) username, optional}
+
+\item{pwd}{(character) password, optional}
+
+\item{auth}{(character) authentication type, one of basic (default),
+digest, digest_ie, gssnegotiate, ntlm, or any. optional}
+}
+\description{
+Authentication
+}
+\details{
+Only simple auth is supported for now; OAuth support is planned.
+}
+\examples{
+auth(user = "foo", pwd = "bar", auth = "basic")
+auth(user = "foo", pwd = "bar", auth = "digest")
+auth(user = "foo", pwd = "bar", auth = "ntlm")
+auth(user = "foo", pwd = "bar", auth = "any")
+
+# with HttpClient
+(res <- HttpClient$new(
+  url = "https://httpbin.org/basic-auth/user/passwd",
+  auth = auth(user = "user", pwd = "passwd")
+))
+res$auth
+x <- res$get()
+jsonlite::fromJSON(x$parse("UTF-8"))
+
+# with HttpRequest
+(res <- HttpRequest$new(
+  url = "https://httpbin.org/basic-auth/user/passwd",
+  auth = auth(user = "user", pwd = "passwd")
+))
+res$auth
+}
diff --git a/man/crul-package.Rd b/man/crul-package.Rd
index 8ebaff3..6e88c61 100644
--- a/man/crul-package.Rd
+++ b/man/crul-package.Rd
@@ -2,8 +2,8 @@
 % Please edit documentation in R/crul-package.r
 \docType{package}
 \name{crul-package}
-\alias{crul}
 \alias{crul-package}
+\alias{crul}
 \title{\strong{HTTP R client}}
 \description{
 \strong{HTTP R client}
@@ -11,10 +11,17 @@
 \section{Package API}{
 
 \itemize{
- \item \code{\link{HttpClient}} - create a connection client, set all
- your http options, make http requests
- \item \code{\link{HttpResponse}} - mostly for internal use, handles
- http responses
+\item \code{\link[=HttpClient]{HttpClient()}} - create a connection client, set all
+your http options, make http requests
+\item \code{\link[=HttpResponse]{HttpResponse()}} - mostly for internal use, handles
+http responses
+\item \code{\link[=Async]{Async()}} - asynchronous requests
+\item \code{\link[=AsyncVaried]{AsyncVaried()}} - varied asynchronous requests
+\item \code{\link[=HttpRequest]{HttpRequest()}} - generate an HTTP request, mostly for
+use in building requests to be used in \code{Async} or \code{AsyncVaried}
+\item \code{\link[=mock]{mock()}} - Turn on/off mocking, via \code{webmockr}
+\item \code{\link[=auth]{auth()}} - Simple authentication helper
+\item \code{\link[=proxy]{proxy()}} - Proxy helper
 }
 }
 
@@ -23,7 +30,13 @@
 We use \code{fauxpas} if you have it installed for handling HTTP
 conditions but if it's not installed we use \pkg{httpcode}
 }
+
+\section{Mocking}{
+
+Mocking HTTP requests is supported via the \pkg{webmockr}
+package. See \link{mock} for guidance.
+}
+
 \author{
 Scott Chamberlain \email{myrmecocystus at gmail.com}
 }
-
diff --git a/man/curl-options.Rd b/man/curl-options.Rd
index b5a8cb3..37d6599 100644
--- a/man/curl-options.Rd
+++ b/man/curl-options.Rd
@@ -2,11 +2,21 @@
 % Please edit documentation in R/curl-options.R
 \name{curl-options}
 \alias{curl-options}
+\alias{user-agent}
+\alias{verbose}
+\alias{timeout}
 \title{curl options}
 \description{
-curl options
+With the \code{opts} parameter you can pass in various
+curl options, including user agent string, whether to get verbose
+curl output or not, setting a timeout for requests, and more. See
+\code{\link[curl:curl_options]{curl::curl_options()}} for all the options you can use.
+}
+\details{
+A progress helper will be coming soon.
 }
 \examples{
+\dontrun{
 # set curl options on client initialization
 (res <- HttpClient$new(
   url = "https://httpbin.org",
@@ -21,7 +31,14 @@ res$get('get')
 # or set curl options when performing HTTP operation
 (res <- HttpClient$new(url = "https://httpbin.org"))
 res$get('get', verbose = TRUE)
-\dontrun{res$get('get', stuff = "things")}
-\dontrun{res$get('get', httpget = TRUE)}
-}
+res$get('get', stuff = "things")
+res$get('get', httpget = TRUE)
 
+# set a timeout
+(res <- HttpClient$new(
+  url = "https://httpbin.org",
+  opts = list(timeout_ms = 1)
+))
+res$get('get')
+}
+}
diff --git a/man/delete-requests.Rd b/man/delete-requests.Rd
new file mode 100644
index 0000000..b34063a
--- /dev/null
+++ b/man/delete-requests.Rd
@@ -0,0 +1,23 @@
+% Generated by roxygen2: do not edit by hand
+% Please edit documentation in R/delete-requests.R
+\name{delete-requests}
+\alias{delete-requests}
+\title{HTTP DELETE requests}
+\description{
+HTTP DELETE requests
+}
+\examples{
+(x <- HttpClient$new(url = "https://httpbin.org"))
+
+## a list
+(res1 <- x$delete('delete', body = list(hello = "world"), verbose = TRUE))
+jsonlite::fromJSON(res1$parse("UTF-8"))
+
+## a string
+(res2 <- x$delete('delete', body = "hello world", verbose = TRUE))
+jsonlite::fromJSON(res2$parse("UTF-8"))
+
+## empty body request
+x$delete('delete', verbose = TRUE)
+
+}
diff --git a/man/handle.Rd b/man/handle.Rd
index 8850b74..2c76ac7 100644
--- a/man/handle.Rd
+++ b/man/handle.Rd
@@ -9,7 +9,7 @@ handle(url, ...)
 \arguments{
 \item{url}{(character) A url. required.}
 
-\item{...}{options passed on to \code{\link[curl]{new_handle}}}
+\item{...}{options passed on to \code{\link[curl:new_handle]{curl::new_handle()}}}
 }
 \description{
 Make a handle
@@ -22,4 +22,3 @@ h <- handle("https://httpbin.org")
 (res <- HttpClient$new(handle = h))
 out <- res$get("get")
 }
-
diff --git a/man/http-headers.Rd b/man/http-headers.Rd
index 84b8f73..9c36451 100644
--- a/man/http-headers.Rd
+++ b/man/http-headers.Rd
@@ -38,4 +38,3 @@ res$get('get')
 ))
 res$get('get')
 }
-
diff --git a/man/mock.Rd b/man/mock.Rd
new file mode 100644
index 0000000..1124039
--- /dev/null
+++ b/man/mock.Rd
@@ -0,0 +1,46 @@
+% Generated by roxygen2: do not edit by hand
+% Please edit documentation in R/mocking.R
+\name{mock}
+\alias{mock}
+\title{Mocking HTTP requests}
+\usage{
+mock(on = TRUE)
+}
+\arguments{
+\item{on}{(logical) turn mocking on with \code{TRUE} or turn off with \code{FALSE}.
+By default is \code{FALSE}}
+}
+\description{
+Mocking HTTP requests
+}
+\details{
+\code{webmockr} package required for mocking behavior
+}
+\examples{
+\dontrun{
+# load webmockr
+library(webmockr)
+library(crul)
+
+URL <- "https://httpbin.org"
+
+# turn on mocking
+crul::mock()
+
+# stub a request
+stub_request("get", file.path(URL, "get"))
+webmockr:::webmockr_stub_registry
+
+# create an HTTP client
+(x <- HttpClient$new(url = URL))
+
+# make a request - matches stub - no real request made
+x$get('get')
+
+# allow net connect
+webmockr::webmockr_allow_net_connect()
+x$get('get', query = list(foo = "bar"))
+webmockr::webmockr_disable_net_connect()
+x$get('get', query = list(foo = "bar"))
+}
+}
diff --git a/man/post-requests.Rd b/man/post-requests.Rd
index 7b85b9a..676a8bd 100644
--- a/man/post-requests.Rd
+++ b/man/post-requests.Rd
@@ -2,20 +2,25 @@
 % Please edit documentation in R/post-requests.R
 \name{post-requests}
 \alias{post-requests}
-\title{HTTP POST requests}
+\title{HTTP POST/PUT/PATCH requests}
 \description{
-HTTP POST requests
+HTTP POST/PUT/PATCH requests
 }
 \examples{
 (x <- HttpClient$new(url = "https://httpbin.org"))
 
-# post request
-(res_post <- x$post('post', body = list(hello = "world")))
+# POST requests
+## a list
+(res_post <- x$post('post', body = list(hello = "world"), verbose = TRUE))
+
+## a string
+(res_post <- x$post('post', body = "hello world", verbose = TRUE))
 
 ## empty body request
 x$post('post')
 
 ## form requests
+\dontrun{
 (cli <- HttpClient$new(
   url = "http://apps.kew.org/wcsp/advsearch.do"
 ))
@@ -28,6 +33,7 @@ cli$post(
     selectedLevel = 'cont'
   )
 )
+}
 
 (x <- HttpClient$new(url = "https://httpbin.org"))
 res <- x$post("post",
@@ -39,7 +45,37 @@ res <- x$post("post",
 )
 jsonlite::fromJSON(res$parse())
 
+# PUT requests
+(x <- HttpClient$new(url = "https://httpbin.org"))
+(res <- x$put(path = "put",
+  encode = "json",
+  body = list(
+    genus = 'Gagea',
+    species = 'pratensis'
+  )
+))
+jsonlite::fromJSON(res$parse("UTF-8"))
+
+res <- x$put("put", body = "foo bar")
+jsonlite::fromJSON(res$parse("UTF-8"))
+
 
+# PATCH requests
+(x <- HttpClient$new(url = "https://httpbin.org"))
+(res <- x$patch(path = "patch",
+  encode = "json",
+  body = list(
+    genus = 'Gagea',
+    species = 'pratensis'
+  )
+))
+jsonlite::fromJSON(res$parse("UTF-8"))
+
+res <- x$patch("patch", body = "foo bar")
+jsonlite::fromJSON(res$parse("UTF-8"))
+
+
+# Upload files - STILL WORKING ON THIS
 # path <- file.path(Sys.getenv("R_DOC_DIR"), "html/logo.jpg")
 # (x <- HttpClient$new(url = "https://httpbin.org"))
 # x$post("post",
@@ -49,4 +85,3 @@ jsonlite::fromJSON(res$parse())
 # )
 
 }
-
diff --git a/man/proxies.Rd b/man/proxies.Rd
new file mode 100644
index 0000000..fb4c01b
--- /dev/null
+++ b/man/proxies.Rd
@@ -0,0 +1,60 @@
+% Generated by roxygen2: do not edit by hand
+% Please edit documentation in R/proxies.R
+\name{proxies}
+\alias{proxies}
+\alias{proxy}
+\title{proxy options}
+\usage{
+proxy(url, user = NULL, pwd = NULL, auth = "basic")
+}
+\arguments{
+\item{url}{(character) URL, with scheme (http/https), domain and
+port (must be numeric). required.}
+
+\item{user}{(character) username, optional}
+
+\item{pwd}{(character) password, optional}
+
+\item{auth}{(character) authentication type, one of basic (default),
+digest, digest_ie, gssnegotiate, ntlm, or any. optional}
+}
+\description{
+proxy options
+}
+\details{
+See \url{http://proxylist.hidemyass.com/} for a list of proxies you
+can use
+}
+\examples{
+proxy("http://97.77.104.22:3128")
+proxy("97.77.104.22:3128")
+proxy("http://97.77.104.22:3128", "foo", "bar")
+proxy("http://97.77.104.22:3128", "foo", "bar", auth = "digest")
+proxy("http://97.77.104.22:3128", "foo", "bar", auth = "ntlm")
+
+# with proxy (look at request/outgoing headers)
+(res <- HttpClient$new(
+  url = "http://www.google.com",
+  proxies = proxy("http://97.77.104.22:3128")
+))
+res$proxies
+\dontrun{res$get(verbose = TRUE)}
+
+# vs. without proxy (look at request/outgoing headers)
+(res2 <- HttpClient$new(url = "http://www.google.com"))
+res2$get(verbose = TRUE)
+
+
+# Use authentication
+(res <- HttpClient$new(
+  url = "http://google.com",
+  proxies = proxy("http://97.77.104.22:3128", user = "foo", pwd = "bar")
+))
+
+# another example
+(res <- HttpClient$new(
+  url = "http://ip.tyk.nu/",
+  proxies = proxy("http://200.29.191.149:3128")
+))
+\dontrun{res$get()$parse("UTF-8")}
+}
diff --git a/man/url_build.Rd b/man/url_build.Rd
new file mode 100644
index 0000000..969a03b
--- /dev/null
+++ b/man/url_build.Rd
@@ -0,0 +1,39 @@
+% Generated by roxygen2: do not edit by hand
+% Please edit documentation in R/make_url.R
+\name{url_build}
+\alias{url_build}
+\alias{url_parse}
+\title{Build and parse URLs}
+\usage{
+url_build(url, path = NULL, query = NULL)
+
+url_parse(url)
+}
+\arguments{
+\item{url}{(character) a url}
+
+\item{path}{(character) a path}
+
+\item{query}{(list) a named list of query parameters}
+}
+\value{
+\code{url_build} returns a character string URL; \code{url_parse}
+returns a list with URL components
+}
+\description{
+Build and parse URLs
+}
+\examples{
+url_build("https://httpbin.org")
+url_build("https://httpbin.org", "get")
+url_build("https://httpbin.org", "post")
+url_build("https://httpbin.org", "get", list(foo = "bar"))
+
+url_parse("httpbin.org")
+url_parse("http://httpbin.org")
+url_parse(url = "https://httpbin.org")
+url_parse("https://httpbin.org/get")
+url_parse("https://httpbin.org/get?foo=bar")
+url_parse("https://httpbin.org/get?foo=bar&stuff=things")
+url_parse("https://httpbin.org/get?foo=bar&stuff=things[]")
+}
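The components returned by `url_parse` can feed back into `url_build`; the sketch below assumes the parsed list carries `scheme`, `domain`, `path`, and `parameter` entries, as in this version of crul:

```r
library(crul)

p <- url_parse("https://httpbin.org/get?foo=bar")
# rebuild an equivalent URL from the parsed pieces
url_build(paste0(p$scheme, "://", p$domain), p$path, p$parameter)
```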
diff --git a/man/writing-options.Rd b/man/writing-options.Rd
index 6dee37c..9550c05 100644
--- a/man/writing-options.Rd
+++ b/man/writing-options.Rd
@@ -21,4 +21,3 @@ readLines(res$content)
 res <- x$get('stream/50', stream = function(x) cat(rawToChar(x)))
 res$content # when streaming, content is NULL
 }
-
diff --git a/tests/testthat/test-async.R b/tests/testthat/test-async.R
new file mode 100644
index 0000000..8819ca9
--- /dev/null
+++ b/tests/testthat/test-async.R
@@ -0,0 +1,151 @@
+context("Async - General")
+
+test_that("Async works", {
+  skip_on_cran()
+
+  expect_is(Async, "R6ClassGenerator")
+
+  aa <- Async$new(urls = c('https://httpbin.org/get', 'https://google.com'))
+
+  expect_is(aa, "Async")
+  expect_null(aa$handle)
+  expect_is(aa$urls, "character")
+  expect_equal(length(aa$urls), 2)
+  expect_is(aa$.__enclos_env__$private$gen_interface, "function")
+
+  expect_is(aa$get, "function")
+  expect_is(aa$post, "function")
+  expect_is(aa$put, "function")
+  expect_is(aa$patch, "function")
+  expect_is(aa$delete, "function")
+  expect_is(aa$head, "function")
+
+  # after calling
+  res <- aa$get()
+  expect_is(res, "list")
+  expect_equal(length(res), 2)
+  expect_is(res[[1]], "HttpResponse")
+  expect_is(res[[1]]$request, "HttpRequest")
+  expect_is(res[[1]]$content, "raw")
+})
+
+test_that("Async fails well", {
+  skip_on_cran()
+
+  expect_error(Async$new(), "\"urls\" is missing, with no default")
+})
+
+
+
+context("Async - get")
+test_that("Async - get", {
+  skip_on_cran()
+
+  aa <- Async$new(urls = c('https://httpbin.org/get',
+                           'https://google.com'))
+  out <- aa$get()
+
+  expect_is(out, "list")
+  expect_is(out[[1]], "HttpResponse")
+  expect_is(out[[2]], "HttpResponse")
+  expect_equal(out[[1]]$method, "get")
+  expect_equal(out[[2]]$method, "get")
+})
+
+
+context("Async - post")
+test_that("Async - post", {
+  skip_on_cran()
+
+  aa <- Async$new(urls = c('https://httpbin.org/post',
+                           'https://httpbin.org/post'))
+  out <- aa$post()
+
+  expect_is(out, "list")
+  expect_is(out[[1]], "HttpResponse")
+  expect_equal(out[[1]]$method, "post")
+})
+
+
+context("Async - put")
+test_that("Async - put", {
+  skip_on_cran()
+
+  aa <- Async$new(urls = c('https://httpbin.org/put',
+                           'https://httpbin.org/put'))
+  out <- aa$put()
+
+  expect_is(out, "list")
+  expect_is(out[[1]], "HttpResponse")
+  expect_is(out[[2]], "HttpResponse")
+  expect_equal(out[[1]]$method, "put")
+  expect_equal(out[[2]]$method, "put")
+})
+
+
+context("Async - patch")
+test_that("Async - patch", {
+  skip_on_cran()
+
+  aa <- Async$new(urls = c('https://httpbin.org/patch',
+                           'https://httpbin.org/patch'))
+  out <- aa$patch()
+
+  expect_is(out, "list")
+  expect_is(out[[1]], "HttpResponse")
+  expect_is(out[[2]], "HttpResponse")
+  expect_equal(out[[1]]$method, "patch")
+  expect_equal(out[[2]]$method, "patch")
+})
+
+
+context("Async - delete")
+test_that("Async - delete", {
+  skip_on_cran()
+
+  aa <- Async$new(urls = c('https://httpbin.org/delete',
+                           'https://httpbin.org/delete'))
+  out <- aa$delete()
+
+  expect_is(out, "list")
+  expect_is(out[[1]], "HttpResponse")
+  expect_is(out[[2]], "HttpResponse")
+  expect_equal(out[[1]]$method, "delete")
+  expect_equal(out[[2]]$method, "delete")
+})
+
+
+context("Async - head")
+test_that("Async - head", {
+  skip_on_cran()
+
+  aa <- Async$new(urls = c('https://httpbin.org/head',
+                           'https://httpbin.org/head'))
+  out <- aa$head()
+
+  expect_is(out, "list")
+  expect_is(out[[1]], "HttpResponse")
+  expect_is(out[[2]], "HttpResponse")
+  expect_equal(out[[1]]$method, "head")
+  expect_equal(out[[2]]$method, "head")
+})
+
+
+context("Async - order of results")
+test_that("Async - order", {
+  skip_on_cran()
+
+  aa <- Async$new(urls = c('https://httpbin.org/get?a=5',
+                           'https://httpbin.org/get?b=6',
+                           'https://httpbin.org/get?c=7'))
+  out <- aa$get()
+
+  expect_is(out, "list")
+  expect_is(out[[1]], "HttpResponse")
+  expect_is(out[[2]], "HttpResponse")
+  expect_is(out[[3]], "HttpResponse")
+
+  expect_match(out[[1]]$url, "a=5")
+  expect_match(out[[2]]$url, "b=6")
+  expect_match(out[[3]]$url, "c=7")
+})
diff --git a/tests/testthat/test-asyncvaried.R b/tests/testthat/test-asyncvaried.R
new file mode 100644
index 0000000..2c466e9
--- /dev/null
+++ b/tests/testthat/test-asyncvaried.R
@@ -0,0 +1,59 @@
+context("AsyncVaried")
+
+test_that("AsyncVaried works", {
+  skip_on_cran()
+
+  expect_is(AsyncVaried, "R6ClassGenerator")
+
+  req1 <- HttpRequest$new(url = "https://httpbin.org/get")$get()
+  req2 <- HttpRequest$new(url = "https://httpbin.org/post")$post()
+
+  aa <- AsyncVaried$new(req1, req2)
+
+  expect_is(aa, "AsyncVaried")
+  expect_is(aa$.__enclos_env__$private$async_request, "function")
+  expect_is(aa$parse, "function")
+  expect_is(aa$content, "function")
+  expect_is(aa$requests, "function")
+
+  # before requests
+  expect_equal(length(aa$content()), 0)
+  expect_equal(length(aa$status()), 0)
+  expect_equal(length(aa$status_code()), 0)
+  expect_equal(length(aa$times()), 0)
+
+  # after requests
+  aa$request()
+  expect_equal(length(aa$content()), 2)
+  expect_equal(length(aa$status()), 2)
+  expect_equal(length(aa$status_code()), 2)
+  expect_equal(length(aa$times()), 2)
+})
+
+test_that("AsyncVaried fails well", {
+  skip_on_cran()
+
+  expect_error(AsyncVaried$new(), "must pass in at least one request")
+  expect_error(AsyncVaried$new(5), "all inputs must be of class 'HttpRequest'")
+})
+
+context("AsyncVaried - order of results")
+test_that("AsyncVaried - order", {
+  skip_on_cran()
+
+  req1 <- HttpRequest$new(url = "https://httpbin.org/get?a=5")$get()
+  req2 <- HttpRequest$new(url = "https://httpbin.org/get?b=6")$get()
+  req3 <- HttpRequest$new(url = "https://httpbin.org/get?c=7")$get()
+  aa <- AsyncVaried$new(req1, req2, req3)
+  aa$request()
+  out <- aa$responses()
+
+  expect_is(out, "list")
+  expect_is(out[[1]], "HttpResponse")
+  expect_is(out[[2]], "HttpResponse")
+  expect_is(out[[3]], "HttpResponse")
+
+  expect_match(out[[1]]$url, "a=5")
+  expect_match(out[[2]]$url, "b=6")
+  expect_match(out[[3]]$url, "c=7")
+})
diff --git a/tests/testthat/test-auth.R b/tests/testthat/test-auth.R
new file mode 100644
index 0000000..bccadff
--- /dev/null
+++ b/tests/testthat/test-auth.R
@@ -0,0 +1,57 @@
+context("authenticate")
+
+test_that("auth construction works", {
+  basic <- auth(user = "foo", pwd = "bar", auth = "basic")
+  digest <- auth(user = "foo", pwd = "bar", auth = "digest")
+  ntlm <- auth(user = "foo", pwd = "bar", auth = "ntlm")
+  any <- auth(user = "foo", pwd = "bar", auth = "any")
+
+  expect_is(basic, "auth")
+  expect_is(digest, "auth")
+  expect_is(ntlm, "auth")
+  expect_is(any, "auth")
+
+  expect_named(basic, c('userpwd', 'httpauth'))
+  expect_named(digest, c('userpwd', 'httpauth'))
+  expect_named(ntlm, c('userpwd', 'httpauth'))
+  expect_named(any, c('userpwd', 'httpauth'))
+
+  expect_equal(attr(basic, "type"), "basic")
+  expect_equal(attr(digest, "type"), "digest")
+  expect_equal(attr(ntlm, "type"), "ntlm")
+  expect_equal(attr(any, "type"), "any")
+})
+
+test_that("auth works with HttpClient", {
+  aa <- HttpClient$new(
+    url = "https://httpbin.org/basic-auth/user/passwd",
+    auth = auth(user = "foo", pwd = "bar")
+  )
+
+  expect_is(aa, "HttpClient")
+  expect_is(aa$auth, "auth")
+  expect_equal(aa$auth$userpwd, "foo:bar")
+  expect_equal(aa$auth$httpauth, 1)
+})
+
+test_that("auth works with HttpRequest", {
+  aa <- HttpRequest$new(
+    url = "https://httpbin.org/basic-auth/user/passwd",
+    auth = auth(user = "foo", pwd = "bar")
+  )
+
+  expect_is(aa, "HttpRequest")
+  expect_is(aa$auth, "auth")
+  expect_equal(aa$auth$userpwd, "foo:bar")
+  expect_equal(aa$auth$httpauth, 1)
+})
+
+test_that("auth fails well", {
+  expect_error(auth(), "argument \"user\" is missing")
+  expect_error(auth(user = "asdf"), "argument \"pwd\" is missing")
+  expect_error(auth(5, 5), "user must be of class character")
+  expect_error(auth("adsf", 5), "pwd must be of class character")
+  expect_error(
+    auth("asdf", "asdf", 5), "inherits\\(x, \"character\"\\) is not TRUE")
+})
+
diff --git a/tests/testthat/test-client.R b/tests/testthat/test-client.R
index 05cfa59..b4c78a4 100644
--- a/tests/testthat/test-client.R
+++ b/tests/testthat/test-client.R
@@ -1,5 +1,4 @@
 context("HttpClient")
-
 test_that("HttpClient works", {
   skip_on_cran()
 
@@ -21,3 +20,49 @@ test_that("HttpClient fails well", {
 
   expect_error(HttpClient$new(), "need one of url or handle")
 })
+
+
+context("HttpClient - disk")
+test_that("HttpClient disk works", {
+  skip_on_cran()
+
+  aa <- HttpClient$new(url = "https://httpbin.org")
+  f <- tempfile()
+  res <- aa$get("get", disk = f)
+  lns <- readLines(res$content, n = 10)
+
+  expect_is(aa, "HttpClient")
+  expect_is(res$content, "character")
+  expect_gt(length(lns), 0)
+
+  unlink(f)
+})
+
+test_that("HttpClient disk fails well", {
+  skip_on_cran()
+
+  aa <- HttpClient$new(url = "https://httpbin.org")
+  expect_error(aa$get("get", disk = 5), "invalid 'path' argument")
+})
+
+
+context("HttpClient - stream")
+test_that("HttpClient stream works", {
+  skip_on_cran()
+
+  aa <- HttpClient$new(url = "https://httpbin.org")
+  expect_output(
+    res <- aa$get('stream/50', stream = function(x) cat(rawToChar(x))),
+    "headers"
+  )
+
+  expect_is(res, "HttpResponse")
+  expect_null(res$content)
+})
+
+test_that("HttpClient stream fails well", {
+  skip_on_cran()
+
+  aa <- HttpClient$new(url = "https://httpbin.org")
+  expect_error(aa$get("get", stream = 5), "could not find function \"fun\"")
+})
diff --git a/tests/testthat/test-delete.R b/tests/testthat/test-delete.R
index bb43839..74d1d2e 100644
--- a/tests/testthat/test-delete.R
+++ b/tests/testthat/test-delete.R
@@ -30,7 +30,7 @@ test_that("delete request with body", {
   expect_is(aa$method, "character")
   expect_equal(aa$method, "delete")
   expect_is(aa$parse, "function")
-  expect_is(aa$parse(), "character")
+  expect_is(aa$parse("UTF-8"), "character")
   expect_true(aa$success())
 
   expect_named(aa$request$fields, "hello")
diff --git a/tests/testthat/test-handle.R b/tests/testthat/test-handle.R
new file mode 100644
index 0000000..90f8985
--- /dev/null
+++ b/tests/testthat/test-handle.R
@@ -0,0 +1,14 @@
+context("handle")
+
+test_that("handle - works", {
+  aa <- handle("https://httpbin.org")
+
+  expect_is(aa, "list")
+  expect_is(aa$url, "character")
+  expect_is(aa$handle, "curl_handle")
+  expect_match(aa$url, "https")
+})
+
+test_that("handle fails well", {
+  expect_error(handle(), "argument \"url\" is missing")
+})
diff --git a/tests/testthat/test-headers.R b/tests/testthat/test-headers.R
index 33b5b72..36395ca 100644
--- a/tests/testthat/test-headers.R
+++ b/tests/testthat/test-headers.R
@@ -7,7 +7,7 @@ test_that("headers work - just default headers", {
   aa <- cli$get('get')
 
   expect_is(aa, "HttpResponse")
-  expect_named(aa$request_headers, 'useragent')
+  expect_named(aa$request_headers, c('User-Agent', 'Accept-Encoding'))
 })
 
 test_that("headers work - user headers passed", {
@@ -20,6 +20,7 @@ test_that("headers work - user headers passed", {
   bb <- cli$get('get')
 
   expect_is(bb, "HttpResponse")
-  expect_named(bb$request_headers, c('useragent', 'hello'))
-  expect_true(any(grepl("Hello", names(jsonlite::fromJSON(bb$parse())$headers))))
+  expect_named(bb$request_headers, c('User-Agent', 'Accept-Encoding', 'hello'))
+  expect_true(
+    any(grepl("Hello", names(jsonlite::fromJSON(bb$parse("UTF-8"))$headers))))
 })
diff --git a/tests/testthat/test-mocking.R b/tests/testthat/test-mocking.R
new file mode 100644
index 0000000..09822e2
--- /dev/null
+++ b/tests/testthat/test-mocking.R
@@ -0,0 +1,73 @@
+context("mocking: mock function")
+test_that("crul_opts env", {
+  skip_on_cran()
+
+  expect_is(crul_opts, "environment")
+  expect_false(crul_opts$mock)
+})
+
+test_that("mock function", {
+  skip_on_cran()
+
+  expect_is(mock, "function")
+  expect_true(mock())
+  expect_true(crul_opts$mock)
+  expect_false(mock(FALSE))
+  expect_false(crul_opts$mock)
+})
+
+context("mocking: HttpClient")
+test_that("mocking with HttpClient", {
+  skip_on_cran()
+
+  library(webmockr)
+  url <- "https://httpbin.org"
+  st <- stub_request("get", file.path(url, "get"))
+  #webmockr:::webmockr_stub_registry
+
+  # webmockr IS NOT enabled
+  cli <- HttpClient$new(url = url)
+  aa <- cli$get("get")
+
+  # webmockr IS enabled
+  mock()
+  bb <- cli$get("get")
+
+  # content and times differ between the two
+  expect_is(aa, "HttpResponse")
+  expect_is(bb, "HttpResponse")
+
+  expect_is(aa$content, "raw")
+  expect_null(bb$content)
+
+  expect_is(aa$times, "numeric")
+  expect_null(bb$times)
+})
+
+context("mocking: HttpClient when not stubbed yet")
+test_that("mocking with HttpClient: ", {
+  skip_on_cran()
+
+  library(webmockr)
+  url <- "https://httpbin.org"
+  st <- stub_request("get", file.path(url, "get"))
+  #webmockr:::webmockr_stub_registry
+
+  # webmockr IS NOT enabled
+  cli <- HttpClient$new(url = url)
+  expect_error(
+    cli$post("post"),
+    "Real HTTP connections are disabled"
+  )
+  expect_error(
+    cli$post("post"),
+    "You can stub this request with the following snippet"
+  )
+  expect_error(
+    cli$post("post"),
+    "registered request stubs"
+  )
+})
+
+# turn mocking off
+mock(FALSE)
diff --git a/tests/testthat/test-paths.R b/tests/testthat/test-paths.R
index 1e9d06b..c59f779 100644
--- a/tests/testthat/test-paths.R
+++ b/tests/testthat/test-paths.R
@@ -15,23 +15,25 @@ test_that("paths work", {
 test_that("path - multiple route paths work", {
   skip_on_cran()
 
-  cli <- HttpClient$new(url = "https://api.github.com")
-  bb <- cli$get('orgs/ropenscilabs')
+  cli <- HttpClient$new(url = "https://httpbin.org")
+  bb <- cli$get('status/200')
 
   expect_is(bb, "HttpResponse")
   urlsp <- strsplit(bb$url, "/")[[1]]
-  expect_equal(urlsp[4:5], c('orgs', 'ropenscilabs'))
+  expect_equal(urlsp[4:5], c('status', '200'))
   expect_equal(bb$status_code, 200)
 })
 
 test_that("path - paths don't work if paths already on URL", {
   skip_on_cran()
 
-  cli <- HttpClient$new(url = "https://api.github.com/orgs")
-  bb <- cli$get('ropenscilabs')
+  cli <- HttpClient$new(url = "https://httpbin.org/get/adsfasdf")
+  bb <- cli$get('stuff')
 
   expect_is(bb, "HttpResponse")
   expect_equal(bb$status_code, 404)
+  expect_true(grepl("stuff", bb$url))
+  expect_false(grepl("adsfasdf", bb$url))
 })
 
 test_that("path - work with routes that have spaces", {
diff --git a/tests/testthat/test-proxies.R b/tests/testthat/test-proxies.R
new file mode 100644
index 0000000..12845ed
--- /dev/null
+++ b/tests/testthat/test-proxies.R
@@ -0,0 +1,56 @@
+context("proxies")
+
+test_that("proxy without http requests works", {
+  aa <- proxy("http://97.77.104.22:3128")
+  bb <- proxy("97.77.104.22:3128")
+  cc <- proxy("http://97.77.104.22:3128", "foo", "bar")
+  dd <- proxy("http://97.77.104.22:3128", "foo", "bar", auth = "digest")
+  ee <- proxy("http://97.77.104.22:3128", "foo", "bar", auth = "ntlm")
+
+  expect_is(aa, "proxy")
+  expect_is(unclass(aa), "list")
+  expect_is(aa$proxy, "character")
+  expect_type(aa$proxyport, "double")
+  expect_type(aa$proxyauth, "double")
+
+  expect_is(bb, "proxy")
+  expect_is(unclass(bb), "list")
+  expect_is(bb$proxy, "character")
+  expect_type(bb$proxyport, "double")
+  expect_type(bb$proxyauth, "double")
+
+  expect_is(cc, "proxy")
+  expect_is(unclass(cc), "list")
+  expect_is(cc$proxy, "character")
+  expect_type(cc$proxyport, "double")
+  expect_type(cc$proxyauth, "double")
+
+  expect_is(dd, "proxy")
+  expect_is(unclass(dd), "list")
+  expect_is(dd$proxy, "character")
+  expect_type(dd$proxyport, "double")
+  expect_type(dd$proxyauth, "double")
+
+  expect_is(ee, "proxy")
+  expect_is(unclass(ee), "list")
+  expect_is(ee$proxy, "character")
+  expect_type(ee$proxyport, "double")
+  expect_type(ee$proxyauth, "double")
+})
+
+test_that("proxy - using in HttpClient", {
+  aa <- HttpClient$new(
+    url = "http://www.google.com",
+    proxies = proxy("http://97.77.104.22:3128")
+  )
+
+  expect_is(aa, "HttpClient")
+  expect_is(aa$proxies, "proxy")
+})
+
+test_that("proxy fails well", {
+  expect_error(proxy(), "proxy URL not of correct form")
+  expect_error(proxy(user = mtcars), "proxy URL not of correct form")
+  expect_error(proxy("adff", user = 5), "user must be of class character")
+  expect_error(proxy("adff", pwd = 5), "pwd must be of class character")
+})
diff --git a/tests/testthat/test-put.R b/tests/testthat/test-put.R
index eb51ed9..f8cee1b 100644
--- a/tests/testthat/test-put.R
+++ b/tests/testthat/test-put.R
@@ -30,7 +30,7 @@ test_that("put request with body", {
   expect_is(aa$method, "character")
   expect_equal(aa$method, "put")
   expect_is(aa$parse, "function")
-  expect_is(aa$parse(), "character")
+  expect_is(aa$parse("UTF-8"), "character")
   expect_true(aa$success())
 
   expect_named(aa$request$fields, "hello")
diff --git a/tests/testthat/test-query.R b/tests/testthat/test-query.R
index 15418b9..13f1b70 100644
--- a/tests/testthat/test-query.R
+++ b/tests/testthat/test-query.R
@@ -21,7 +21,8 @@ test_that("query - multiple params of same name work", {
 
   expect_is(aa, "HttpResponse")
   expect_equal(length(gregexpr("hello", aa$url)[[1]]), 2)
-  expect_equal(length(gregexpr("hello", jsonlite::fromJSON(aa$parse())$url)[[1]]), 2)
+  expect_equal(
+    length(gregexpr("hello", jsonlite::fromJSON(aa$parse())$url)[[1]]), 2)
 })
 
 test_that("query - length 0 query list works", {
diff --git a/tests/testthat/test-request.R b/tests/testthat/test-request.R
new file mode 100644
index 0000000..debc343
--- /dev/null
+++ b/tests/testthat/test-request.R
@@ -0,0 +1,71 @@
+context("HttpRequest")
+
+test_that("HttpRequest works", {
+  expect_is(HttpRequest, "R6ClassGenerator")
+
+  aa <- HttpRequest$new(url = "https://httpbin.org")
+
+  expect_is(aa, "HttpRequest")
+  expect_null(aa$handle)
+  expect_length(aa$opts, 0)
+  expect_is(aa$url, "character")
+  expect_is(aa$headers, "list")
+  expect_is(aa$post, "function")
+  expect_is(aa$get, "function")
+
+  expect_is(aa$print, "function")
+  expect_output(aa$print(), "<crul http request> ")
+})
+
+test_that("HttpRequest - get", {
+  aa <- HttpRequest$new(url = "https://httpbin.org")$get()
+
+  expect_is(aa, "HttpRequest")
+  expect_equal(aa$method(), "get")
+  expect_equal(aa$url, "https://httpbin.org")
+})
+
+test_that("HttpRequest - post", {
+  aa <- HttpRequest$new(url = "https://httpbin.org")$post()
+
+  expect_is(aa, "HttpRequest")
+  expect_equal(aa$method(), "post")
+  expect_equal(aa$url, "https://httpbin.org")
+})
+
+test_that("HttpRequest - put", {
+  aa <- HttpRequest$new(url = "https://httpbin.org")$put()
+
+  expect_is(aa, "HttpRequest")
+  expect_equal(aa$method(), "put")
+  expect_equal(aa$url, "https://httpbin.org")
+})
+
+test_that("HttpRequest - patch", {
+  aa <- HttpRequest$new(url = "https://httpbin.org")$patch()
+
+  expect_is(aa, "HttpRequest")
+  expect_equal(aa$method(), "patch")
+  expect_equal(aa$url, "https://httpbin.org")
+})
+
+test_that("HttpRequest - delete", {
+  aa <- HttpRequest$new(url = "https://httpbin.org")$delete()
+
+  expect_is(aa, "HttpRequest")
+  expect_equal(aa$method(), "delete")
+  expect_equal(aa$url, "https://httpbin.org")
+})
+
+test_that("HttpRequest - head", {
+  aa <- HttpRequest$new(url = "https://httpbin.org")$head()
+
+  expect_is(aa, "HttpRequest")
+  expect_equal(aa$method(), "head")
+  expect_equal(aa$url, "https://httpbin.org")
+})
+
+
+test_that("HttpRequest fails well", {
+  expect_error(HttpRequest$new(), "need one of url or handle")
+})
diff --git a/tests/testthat/test-response.R b/tests/testthat/test-response.R
new file mode 100644
index 0000000..8373dd4
--- /dev/null
+++ b/tests/testthat/test-response.R
@@ -0,0 +1,36 @@
+context("HttpResponse")
+
+test_that("HttpResponse works", {
+  expect_is(HttpResponse, "R6ClassGenerator")
+
+  aa <- HttpResponse$new(
+    method = "get",
+    url = "https://httpbin.org",
+    status_code = 201,
+    request_headers = list(useragent = "foo bar"),
+    content = charToRaw("hello world"),
+    request = list()
+  )
+
+  expect_is(aa, "HttpResponse")
+  expect_null(aa$handle)
+  expect_null(aa$opts)
+  expect_is(aa$url, "character")
+  expect_is(aa$method, "character")
+  expect_is(aa$content, "raw")
+  expect_null(aa$modified)
+  expect_is(aa$parse, "function")
+  expect_is(aa$raise_for_status, "function")
+  expect_is(aa$request_headers, "list")
+  expect_null(aa$response_headers)
+  expect_equal(aa$status_code, 201)
+  expect_is(aa$status_http, "function")
+  expect_is(aa$success, "function")
+  expect_true(aa$success())
+  expect_null(aa$times)
+  expect_is(aa$request, "list")
+})
+
+test_that("HttpResponse fails well", {
+  expect_error(HttpResponse$new(), "argument \"url\" is missing")
+})
diff --git a/tests/testthat/test-url_build_parse.R b/tests/testthat/test-url_build_parse.R
new file mode 100644
index 0000000..3707d84
--- /dev/null
+++ b/tests/testthat/test-url_build_parse.R
@@ -0,0 +1,96 @@
+context("url build")
+
+test_that("url build works", {
+  skip_on_cran()
+
+  aa <- url_build("https://httpbin.org")
+  bb <- url_build("https://httpbin.org", "get")
+  cc <- url_build("https://httpbin.org", "get", list(foo = "bar"))
+
+  expect_is(aa, "character")
+  expect_match(aa, "https")
+  expect_match(aa, "httpbin.org")
+
+  expect_is(bb, "character")
+  expect_match(bb, "https")
+  expect_match(bb, "httpbin.org")
+  expect_match(bb, "get")
+
+  expect_is(cc, "character")
+  expect_match(cc, "https")
+  expect_match(cc, "httpbin.org")
+  expect_match(cc, "\\?foo=bar")
+})
+
+test_that("build fails well", {
+  skip_on_cran()
+
+  # url param required
+  expect_error(url_build(), "argument \"url\" is missing")
+
+  # wrong types
+  expect_error(url_build(5), "url must be of class character")
+  expect_error(url_build("ASDf", path = 5), "path must be of class character")
+  expect_error(url_build("adff", query = 5), "query must be of class list")
+
+  # query list is named
+  expect_error(url_build("As", query = list(4, 5)),
+               "all query elements must be named")
+})
+
+
+context("url parse")
+
+test_that("url parse works", {
+  skip_on_cran()
+
+  aa <- url_parse("https://httpbin.org")
+  bb <- url_parse("https://httpbin.org/get?foo=bar")
+  cc <- url_parse("https://httpbin.org/get?foo=bar&stuff=things")
+
+  expect_is(aa, "list")
+  expect_named(aa, c('scheme', 'domain', 'port', 'path', 'parameter',
+                     'fragment'))
+  expect_is(aa$scheme, "character")
+  expect_equal(aa$scheme, "https")
+  expect_is(aa$domain, "character")
+  expect_true(is.na(aa$path))
+  expect_true(is.na(aa$parameter))
+
+  expect_is(bb, "list")
+  expect_named(bb, c('scheme', 'domain', 'port', 'path', 'parameter',
+                     'fragment'))
+  expect_is(bb$scheme, "character")
+  expect_equal(bb$scheme, "https")
+  expect_is(bb$domain, "character")
+  expect_equal(bb$path, "get")
+  expect_is(bb$parameter, "list")
+  expect_equal(bb$parameter$foo, "bar")
+
+  expect_is(cc, "list")
+  expect_named(cc, c('scheme', 'domain', 'port', 'path', 'parameter',
+                     'fragment'))
+  expect_is(cc$scheme, "character")
+  expect_equal(cc$scheme, "https")
+  expect_is(cc$domain, "character")
+  expect_equal(cc$path, "get")
+  expect_is(cc$parameter, "list")
+  expect_equal(cc$parameter$foo, "bar")
+  expect_equal(cc$parameter$stuff, "things")
+})
+
+test_that("parse fails well", {
+  skip_on_cran()
+
+  # url param required
+  expect_error(url_build(), "argument \"url\" is missing")
+
+  # wrong types
+  expect_error(url_build(5), "url must be of class character")
+  expect_error(url_build("ASDf", path = 5), "path must be of class character")
+  expect_error(url_build("adff", query = 5), "query must be of class list")
+
+  # query list is named
+  expect_error(url_build("As", query = list(4, 5)),
+               "all query elements must be named")
+})
diff --git a/tests/testthat/test-utils.R b/tests/testthat/test-utils.R
new file mode 100644
index 0000000..b1aea35
--- /dev/null
+++ b/tests/testthat/test-utils.R
@@ -0,0 +1,49 @@
+context("encode")
+test_that("encode", {
+  aa <- encode("https://httpbin.org")
+  bb <- encode(I("https://httpbin.org"))
+
+  expect_is(aa, "character")
+  expect_is(bb, "AsIs")
+
+  expect_match(aa, "%3A")
+  expect_false(grepl("%3A", bb))
+})
+
+
+context("has_name")
+test_that("has_name", {
+  expect_false(has_name(5))
+  expect_true(all(has_name(mtcars)))
+  expect_true(has_name(list(a = 5)))
+  expect_false(has_name(list(5)))
+})
+
+
+context("has_namez")
+test_that("has_namez", {
+  expect_false(has_namez(5))
+  expect_true(has_namez(mtcars))
+  expect_true(has_namez(list(a = 5)))
+  expect_false(has_namez(list(5)))
+})
+
+
+context("make_query")
+test_that("make_query", {
+  aa <- make_query(list(foo = "hello", bar = "world"))
+
+  expect_is(aa, "character")
+  expect_match(aa, "foo")
+  expect_match(aa, "&")
+  expect_match(aa, "=")
+})
+
+
+context("curl_opts_check")
+test_that("curl_opts_check works", {
+  expect_null(curl_opts_check(verbose = TRUE))
+  expect_null(curl_opts_check(timeout_ms = 0.001))
+  expect_error(
+    curl_opts_check(httppost = 1), "the following curl options are not allowed")
+})
diff --git a/vignettes/async.Rmd b/vignettes/async.Rmd
new file mode 100644
index 0000000..c0adf23
--- /dev/null
+++ b/vignettes/async.Rmd
@@ -0,0 +1,263 @@
+<!--
+%\VignetteIndexEntry{async with crul}
+%\VignetteEngine{knitr::rmarkdown}
+%\VignetteEncoding{UTF-8}
+-->
+
+async with crul
+===============
+
+
+
+Asynchronous requests with `crul`.
+
+There are two interfaces to asynchronous requests in `crul`:
+
+1. Simple async: any number of URLs, all treated with the same curl options, 
+headers, etc., and only one HTTP method type at a time.
+2. Varied request async: build any type of request and execute all asynchronously.
+
+The first option requires less thinking and less work, and is a good solution
+when you just want to hit a bunch of URLs asynchronously.
+
+The second option is ideal when you want to set curl options/headers on each 
+request and/or want to do different types of HTTP methods on each request.
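+
+As a rough sketch of the difference between the two interfaces (both classes
+are demonstrated later in this vignette; the URLs here are placeholders):
+
+
+```r
+library("crul")
+
+# simple async: one client, same method and options for every URL
+simple <- Async$new(urls = c("https://httpbin.org/get", "https://httpbin.org/ip"))
+# res <- simple$get()  # one HttpResponse per input URL
+
+# varied async: build each request separately, then execute together
+req1 <- HttpRequest$new(url = "https://httpbin.org/get")$get()
+req2 <- HttpRequest$new(url = "https://httpbin.org/post")$post(body = list(a = 5))
+varied <- AsyncVaried$new(req1, req2)
+# varied$request(); varied$responses()
+```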
+
+One thing to consider before using async is whether the data provider is okay
+with it. Sending too many asynchronous requests at once could overwhelm, or
+even bring down, a data provider's service.
+
+
+```r
+library("crul")
+```
+
+## simple async
+
+Build a request object with one or more URLs
+
+
+
+
+```r
+(cc <- Async$new(
+  urls = c(
+    'https://httpbin.org/get?a=5',
+    'https://httpbin.org/get?a=5&b=6',
+    'https://httpbin.org/ip'
+  )
+))
+#> <crul async connection> 
+#>   urls: 
+#>    https://httpbin.org/get?a=5
+#>    https://httpbin.org/get?a=5&b=6
+#>    https://httpbin.org/ip
+```
+
+Make request with any HTTP method
+
+
+```r
+(res <- cc$get())
+#> [[1]]
+#> <crul response> 
+#>   url: https://httpbin.org/get?a=5
+#>   request_headers: 
+#>     useragent: libcurl/7.51.0 r-curl/2.6 crul/0.3.8
+#>   response_headers: 
+#>     status: HTTP/1.1 200 OK
+#>     connection: keep-alive
+#>     server: meinheld/0.6.1
+#>     date: Tue, 13 Jun 2017 22:26:33 GMT
+#>     content-type: application/json
+#>     access-control-allow-origin: *
+#>     access-control-allow-credentials: true
+#>     x-powered-by: Flask
+#>     x-processed-time: 0.000774145126343
+#>     content-length: 302
+#>     via: 1.1 vegur
+#>   params: 
+#>     a: 5
+#>   status: 200
+#> 
+#> [[2]]
+#> <crul response> 
+#>   url: https://httpbin.org/get?a=5&b=6
+#>   request_headers: 
+#>     useragent: libcurl/7.51.0 r-curl/2.6 crul/0.3.8
+#>   response_headers: 
+#>     status: HTTP/1.1 200 OK
+#>     connection: keep-alive
+#>     server: meinheld/0.6.1
+#>     date: Tue, 13 Jun 2017 22:26:33 GMT
+#>     content-type: application/json
+#>     access-control-allow-origin: *
+#>     access-control-allow-credentials: true
+#>     x-powered-by: Flask
+#>     x-processed-time: 0.00131487846375
+#>     content-length: 321
+#>     via: 1.1 vegur
+#>   params: 
+#>     a: 5
+#>     b: 6
+#>   status: 200
+#> 
+#> [[3]]
+#> <crul response> 
+#>   url: https://httpbin.org/ip
+#>   request_headers: 
+#>     useragent: libcurl/7.51.0 r-curl/2.6 crul/0.3.8
+#>   response_headers: 
+#>     status: HTTP/1.1 200 OK
+#>     connection: keep-alive
+#>     server: meinheld/0.6.1
+#>     date: Tue, 13 Jun 2017 22:26:33 GMT
+#>     content-type: application/json
+#>     access-control-allow-origin: *
+#>     access-control-allow-credentials: true
+#>     x-powered-by: Flask
+#>     x-processed-time: 0.000844955444336
+#>     content-length: 33
+#>     via: 1.1 vegur
+#>   status: 200
+```
+
+You get back a list whose length matches the number of input URLs
+
+Access fields and methods just as you would with `HttpClient` results; here, one result at a time.
+
+
+```r
+res[[1]]$url
+#> [1] "https://httpbin.org/get?a=5"
+res[[1]]$success()
+#> [1] TRUE
+res[[1]]$parse("UTF-8")
+#> [1] "{\n  \"args\": {\n    \"a\": \"5\"\n  }, \n  \"headers\": {\n    \"Accept\": \"*/*\", \n    \"Accept-Encoding\": \"gzip, deflate\", \n    \"Connection\": \"close\", \n    \"Host\": \"httpbin.org\", \n    \"User-Agent\": \"libcurl/7.51.0 r-curl/2.6 crul/0.3.8\"\n  }, \n  \"origin\": \"157.130.179.86\", \n  \"url\": \"https://httpbin.org/get?a=5\"\n}\n"
+```
+
+Or apply access/method calls across many results, e.g., parse all results
+
+
+```r
+lapply(res, function(z) z$parse("UTF-8"))
+#> [[1]]
+#> [1] "{\n  \"args\": {\n    \"a\": \"5\"\n  }, \n  \"headers\": {\n    \"Accept\": \"*/*\", \n    \"Accept-Encoding\": \"gzip, deflate\", \n    \"Connection\": \"close\", \n    \"Host\": \"httpbin.org\", \n    \"User-Agent\": \"libcurl/7.51.0 r-curl/2.6 crul/0.3.8\"\n  }, \n  \"origin\": \"157.130.179.86\", \n  \"url\": \"https://httpbin.org/get?a=5\"\n}\n"
+#> 
+#> [[2]]
+#> [1] "{\n  \"args\": {\n    \"a\": \"5\", \n    \"b\": \"6\"\n  }, \n  \"headers\": {\n    \"Accept\": \"*/*\", \n    \"Accept-Encoding\": \"gzip, deflate\", \n    \"Connection\": \"close\", \n    \"Host\": \"httpbin.org\", \n    \"User-Agent\": \"libcurl/7.51.0 r-curl/2.6 crul/0.3.8\"\n  }, \n  \"origin\": \"157.130.179.86\", \n  \"url\": \"https://httpbin.org/get?a=5&b=6\"\n}\n"
+#> 
+#> [[3]]
+#> [1] "{\n  \"origin\": \"157.130.179.86\"\n}\n"
+```
+
+## varied request async
+
+
+```r
+req1 <- HttpRequest$new(
+  url = "https://httpbin.org/get?a=5",
+  opts = list(
+    verbose = TRUE
+  )
+)
+req1$get()
+#> <crul http request> get
+#>   url: https://httpbin.org/get?a=5
+#>   curl options: 
+#>     verbose: TRUE
+#>   proxies: 
+#>   auth: 
+#>   headers:
+
+req2 <- HttpRequest$new(
+  url = "https://httpbin.org/post?a=5&b=6"
+)
+req2$post(body = list(a = 5))
+#> <crul http request> post
+#>   url: https://httpbin.org/post?a=5&b=6
+#>   curl options: 
+#>   proxies: 
+#>   auth: 
+#>   headers:
+
+(res <- AsyncVaried$new(req1, req2))
+#> <crul async varied connection> 
+#>   requests: 
+#>    get: https://httpbin.org/get?a=5 
+#>    post: https://httpbin.org/post?a=5&b=6
+```
+
+Make requests asynchronously
+
+
+```r
+res$request()
+```
+
+Parse all results
+
+
+```r
+res$parse()
+#> [1] "{\n  \"args\": {\n    \"a\": \"5\"\n  }, \n  \"headers\": {\n    \"Accept\": \"*/*\", \n    \"Accept-Encoding\": \"gzip, deflate\", \n    \"Connection\": \"close\", \n    \"Host\": \"httpbin.org\", \n    \"User-Agent\": \"libcurl/7.51.0 r-curl/2.6 crul/0.3.8\"\n  }, \n  \"origin\": \"157.130.179.86\", \n  \"url\": \"https://httpbin.org/get?a=5\"\n}\n"                                                                                                                                    [...]
+#> [2] "{\n  \"args\": {\n    \"a\": \"5\", \n    \"b\": \"6\"\n  }, \n  \"data\": \"\", \n  \"files\": {}, \n  \"form\": {\n    \"a\": \"5\"\n  }, \n  \"headers\": {\n    \"Accept\": \"*/*\", \n    \"Accept-Encoding\": \"gzip, deflate\", \n    \"Connection\": \"close\", \n    \"Content-Length\": \"137\", \n    \"Content-Type\": \"multipart/form-data; boundary=------------------------4cd6745fc264b623\", \n    \"Expect\": \"100-continue\", \n    \"Host\": \"httpbin.org\", \n    \"User-Age [...]
+```
+
+
+```r
+lapply(res$parse(), jsonlite::prettify)
+#> [[1]]
+#> {
+#>     "args": {
+#>         "a": "5"
+#>     },
+#>     "headers": {
+#>         "Accept": "*/*",
+#>         "Accept-Encoding": "gzip, deflate",
+#>         "Connection": "close",
+#>         "Host": "httpbin.org",
+#>         "User-Agent": "libcurl/7.51.0 r-curl/2.6 crul/0.3.8"
+#>     },
+#>     "origin": "157.130.179.86",
+#>     "url": "https://httpbin.org/get?a=5"
+#> }
+#>  
+#> 
+#> [[2]]
+#> {
+#>     "args": {
+#>         "a": "5",
+#>         "b": "6"
+#>     },
+#>     "data": "",
+#>     "files": {
+#> 
+#>     },
+#>     "form": {
+#>         "a": "5"
+#>     },
+#>     "headers": {
+#>         "Accept": "*/*",
+#>         "Accept-Encoding": "gzip, deflate",
+#>         "Connection": "close",
+#>         "Content-Length": "137",
+#>         "Content-Type": "multipart/form-data; boundary=------------------------4cd6745fc264b623",
+#>         "Expect": "100-continue",
+#>         "Host": "httpbin.org",
+#>         "User-Agent": "libcurl/7.51.0 r-curl/2.6 crul/0.3.8"
+#>     },
+#>     "json": null,
+#>     "origin": "157.130.179.86",
+#>     "url": "https://httpbin.org/post?a=5&b=6"
+#> }
+#> 
+```
+
+Status codes
+
+
+```r
+res$status_code()
+#> [1] 200 200
+```
diff --git a/vignettes/crul_vignette.Rmd b/vignettes/crul_vignette.Rmd
index 0c31d74..9e86d89 100644
--- a/vignettes/crul_vignette.Rmd
+++ b/vignettes/crul_vignette.Rmd
@@ -1,6 +1,6 @@
 <!--
-%\VignetteEngine{knitr::knitr}
 %\VignetteIndexEntry{crul introduction}
+%\VignetteEngine{knitr::rmarkdown}
 %\VignetteEncoding{UTF-8}
 -->
 
@@ -49,8 +49,10 @@ library("crul")
 ))
 #> <crul connection> 
 #>   url: https://httpbin.org
-#>   options: 
+#>   curl options: 
 #>     timeout: 1
+#>   proxies: 
+#>   auth: 
 #>   headers: 
 #>     a: hello world
 ```
@@ -109,14 +111,15 @@ res$content
 #>  [47] 6f 20 77 6f 72 6c 64 22 2c 20 0a 20 20 20 20 22 41 63 63 65 70 74 22
 #>  [70] 3a 20 22 2a 2f 2a 22 2c 20 0a 20 20 20 20 22 41 63 63 65 70 74 2d 45
 #>  [93] 6e 63 6f 64 69 6e 67 22 3a 20 22 67 7a 69 70 2c 20 64 65 66 6c 61 74
-#> [116] 65 22 2c 20 0a 20 20 20 20 22 48 6f 73 74 22 3a 20 22 68 74 74 70 62
-#> [139] 69 6e 2e 6f 72 67 22 2c 20 0a 20 20 20 20 22 55 73 65 72 2d 41 67 65
-#> [162] 6e 74 22 3a 20 22 6c 69 62 63 75 72 6c 2f 37 2e 35 31 2e 30 20 72 2d
-#> [185] 63 75 72 6c 2f 32 2e 33 20 63 72 75 6c 2f 30 2e 32 2e 30 22 0a 20 20
-#> [208] 7d 2c 20 0a 20 20 22 6f 72 69 67 69 6e 22 3a 20 22 37 31 2e 36 33 2e
-#> [231] 32 32 33 2e 31 31 33 22 2c 20 0a 20 20 22 75 72 6c 22 3a 20 22 68 74
-#> [254] 74 70 73 3a 2f 2f 68 74 74 70 62 69 6e 2e 6f 72 67 2f 67 65 74 22 0a
-#> [277] 7d 0a
+#> [116] 65 22 2c 20 0a 20 20 20 20 22 43 6f 6e 6e 65 63 74 69 6f 6e 22 3a 20
+#> [139] 22 63 6c 6f 73 65 22 2c 20 0a 20 20 20 20 22 48 6f 73 74 22 3a 20 22
+#> [162] 68 74 74 70 62 69 6e 2e 6f 72 67 22 2c 20 0a 20 20 20 20 22 55 73 65
+#> [185] 72 2d 41 67 65 6e 74 22 3a 20 22 6c 69 62 63 75 72 6c 2f 37 2e 35 31
+#> [208] 2e 30 20 72 2d 63 75 72 6c 2f 32 2e 36 20 63 72 75 6c 2f 30 2e 33 2e
+#> [231] 38 22 0a 20 20 7d 2c 20 0a 20 20 22 6f 72 69 67 69 6e 22 3a 20 22 31
+#> [254] 35 37 2e 31 33 30 2e 31 37 39 2e 38 36 22 2c 20 0a 20 20 22 75 72 6c
+#> [277] 22 3a 20 22 68 74 74 70 73 3a 2f 2f 68 74 74 70 62 69 6e 2e 6f 72 67
+#> [300] 2f 67 65 74 22 0a 7d 0a
 ```
 
 HTTP method
@@ -132,8 +135,11 @@ Request headers
 
 ```r
 res$request_headers
-#> $useragent
-#> [1] "libcurl/7.51.0 r-curl/2.3 crul/0.2.0"
+#> $`User-Agent`
+#> [1] "libcurl/7.51.0 r-curl/2.6 crul/0.3.8"
+#> 
+#> $`Accept-Encoding`
+#> [1] "gzip, deflate"
 #> 
 #> $a
 #> [1] "hello world"
@@ -147,26 +153,35 @@ res$response_headers
 #> $status
 #> [1] "HTTP/1.1 200 OK"
 #> 
+#> $connection
+#> [1] "keep-alive"
+#> 
 #> $server
-#> [1] "nginx"
+#> [1] "meinheld/0.6.1"
 #> 
 #> $date
-#> [1] "Tue, 03 Jan 2017 05:52:03 GMT"
+#> [1] "Tue, 13 Jun 2017 22:26:40 GMT"
 #> 
 #> $`content-type`
 #> [1] "application/json"
 #> 
-#> $`content-length`
-#> [1] "278"
-#> 
-#> $connection
-#> [1] "keep-alive"
-#> 
 #> $`access-control-allow-origin`
 #> [1] "*"
 #> 
 #> $`access-control-allow-credentials`
 #> [1] "true"
+#> 
+#> $`x-powered-by`
+#> [1] "Flask"
+#> 
+#> $`x-processed-time`
+#> [1] "0.000635147094727"
+#> 
+#> $`content-length`
+#> [1] "307"
+#> 
+#> $via
+#> [1] "1.1 vegur"
 ```
 
 And you can parse the content with a provided function:
@@ -174,7 +189,7 @@ And you can parse the content with a provided function:
 
 ```r
 res$parse()
-#> [1] "{\n  \"args\": {}, \n  \"headers\": {\n    \"A\": \"hello world\", \n    \"Accept\": \"*/*\", \n    \"Accept-Encoding\": \"gzip, deflate\", \n    \"Host\": \"httpbin.org\", \n    \"User-Agent\": \"libcurl/7.51.0 r-curl/2.3 crul/0.2.0\"\n  }, \n  \"origin\": \"71.63.223.113\", \n  \"url\": \"https://httpbin.org/get\"\n}\n"
+#> [1] "{\n  \"args\": {}, \n  \"headers\": {\n    \"A\": \"hello world\", \n    \"Accept\": \"*/*\", \n    \"Accept-Encoding\": \"gzip, deflate\", \n    \"Connection\": \"close\", \n    \"Host\": \"httpbin.org\", \n    \"User-Agent\": \"libcurl/7.51.0 r-curl/2.6 crul/0.3.8\"\n  }, \n  \"origin\": \"157.130.179.86\", \n  \"url\": \"https://httpbin.org/get\"\n}\n"
 jsonlite::fromJSON(res$parse())
 #> $args
 #> named list()
@@ -189,15 +204,18 @@ jsonlite::fromJSON(res$parse())
 #> $headers$`Accept-Encoding`
 #> [1] "gzip, deflate"
 #> 
+#> $headers$Connection
+#> [1] "close"
+#> 
 #> $headers$Host
 #> [1] "httpbin.org"
 #> 
 #> $headers$`User-Agent`
-#> [1] "libcurl/7.51.0 r-curl/2.3 crul/0.2.0"
+#> [1] "libcurl/7.51.0 r-curl/2.6 crul/0.3.8"
 #> 
 #> 
 #> $origin
-#> [1] "71.63.223.113"
+#> [1] "157.130.179.86"
 #> 
 #> $url
 #> [1] "https://httpbin.org/get"
@@ -213,21 +231,6 @@ x$post(
   path = "post", 
   body = list(hello = "world")
 )
-#> <crul response> 
-#>   url: https://httpbin.org/post
-#>   request_headers: 
-#>     useragent: libcurl/7.51.0 r-curl/2.3 crul/0.2.0
-#>     a: hello world
-#>   response_headers: 
-#>     status: HTTP/1.1 200 OK
-#>     server: nginx
-#>     date: Tue, 03 Jan 2017 05:52:03 GMT
-#>     content-type: application/json
-#>     content-length: 491
-#>     connection: keep-alive
-#>     access-control-allow-origin: *
-#>     access-control-allow-credentials: true
-#>   status: 200
 ```
 
 
@@ -240,7 +243,7 @@ f <- tempfile()
 res <- x$get(disk = f)
 # when using write to disk, content is a path
 res$content 
-#> [1] "/var/folders/gs/4khph0xs0436gmd2gdnwsg080000gn/T//RtmpoZ8Rrd/fileee7a81dea18"
+#> [1] "/var/folders/gs/4khph0xs0436gmd2gdnwsg080000gn/T//RtmpF4Mh1B/file6fcc2a328add"
 ```
 
 Read lines
@@ -267,14 +270,16 @@ readLines(res$content, n = 10)
 (x <- HttpClient$new(url = "https://httpbin.org"))
 #> <crul connection> 
 #>   url: https://httpbin.org
-#>   options: 
+#>   curl options: 
+#>   proxies: 
+#>   auth: 
 #>   headers:
 res <- x$get('stream/5', stream = function(x) cat(rawToChar(x)))
-#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.3 crul/0.2.0"}, "args": {}, "id": 0, "origin": "71.63.223.113"}
-#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.3 crul/0.2.0"}, "args": {}, "id": 1, "origin": "71.63.223.113"}
-#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.3 crul/0.2.0"}, "args": {}, "id": 2, "origin": "71.63.223.113"}
-#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.3 crul/0.2.0"}, "args": {}, "id": 3, "origin": "71.63.223.113"}
-#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.3 crul/0.2.0"}, "args": {}, "id": 4, "origin": "71.63.223.113"}
+#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Connection": "close", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.6 crul/0.3.8"}, "args": {}, "origin": "157.130.179.86", "id": 0}
+#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Connection": "close", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.6 crul/0.3.8"}, "args": {}, "origin": "157.130.179.86", "id": 1}
+#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Connection": "close", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.6 crul/0.3.8"}, "args": {}, "origin": "157.130.179.86", "id": 2}
+#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Connection": "close", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.6 crul/0.3.8"}, "args": {}, "origin": "157.130.179.86", "id": 3}
+#> {"url": "https://httpbin.org/stream/5", "headers": {"Host": "httpbin.org", "Connection": "close", "Accept-Encoding": "gzip, deflate", "Accept": "*/*", "User-Agent": "libcurl/7.51.0 r-curl/2.6 crul/0.3.8"}, "args": {}, "origin": "157.130.179.86", "id": 4}
 # when streaming, content is NULL
 res$content 
 #> NULL
diff --git a/vignettes/how-to-use-crul.Rmd b/vignettes/how-to-use-crul.Rmd
new file mode 100644
index 0000000..fc9951e
--- /dev/null
+++ b/vignettes/how-to-use-crul.Rmd
@@ -0,0 +1,224 @@
+<!--
+%\VignetteIndexEntry{How to use crul}
+%\VignetteEngine{knitr::rmarkdown}
+%\VignetteEncoding{UTF-8}
+-->
+
+
+
+How to use crul
+===============
+
+The following aims to help you decide how to use `crul` in different 
+scenarios.
+
+First, `crul` is aimed a bit more at developers than at the casual 
+user doing HTTP requests. That is, `crul` is probably a better fit 
+for an R package developer, mainly because it makes heavy use of `R6` 
+classes - an interface that's very unlike the one in `httr`, but very 
+similar to interacting with classes in Ruby or Python.
+
+Second, there is not yet the ability to mock HTTP requests, but we 
+are working on it and should have it integrated soon. When that 
+feature arrives we'll update this vignette.
+
+Load the library
+
+
+```r
+library("crul")
+```
+
+## A simple HTTP request function
+
+Most likely you'll want to do a `GET` request - so let's start with that - 
+though the details are not much different for other HTTP verbs.
+
+And in most cases you'll likely not want to do asynchronous requests - though
+see below if you do.
+
+You'll probably want to write a small function, like so (annotated for 
+clarity)
+
+
+```r
+make_request <- function(url) {
+  # create a HttpClient object, defining the url
+  cli <- crul::HttpClient$new(url = url)
+  # do a GET request
+  res <- cli$get()
+  # check to see if request failed or succeeded
+  # - if it succeeds this returns nothing and we proceed to the next step
+  res$raise_for_status()
+  # parse response to plain text (JSON in this case) - most likely you'll 
+  # want UTF-8 encoding
+  txt <- res$parse("UTF-8")
+  # parse the JSON to an R list
+  jsonlite::fromJSON(txt)
+}
+```
+
+Use the function
+
+
+```r
+make_request("https://httpbin.org/get")
+#> $args
+#> named list()
+#> 
+#> $headers
+#> $headers$Accept
+#> [1] "*/*"
+#> 
+#> $headers$`Accept-Encoding`
+#> [1] "gzip, deflate"
+#> 
+#> $headers$Connection
+#> [1] "close"
+#> 
+#> $headers$Host
+#> [1] "httpbin.org"
+#> 
+#> $headers$`User-Agent`
+#> [1] "libcurl/7.51.0 r-curl/2.6 crul/0.3.8"
+#> 
+#> 
+#> $origin
+#> [1] "157.130.179.86"
+#> 
+#> $url
+#> [1] "https://httpbin.org/get"
+```
+
+Now you can use the `make_request` function in your script or package.
+
+## More customized function
+
+Once you get more familiar (or if you're already familiar with HTTP) you may
+want to have more control, toggle more switches.
+
+In the next function, we'll allow users to pass in curl options and 
+use a custom HTTP status checker.
+
+
+```r
+make_request2 <- function(url, ...) {
+  # create a HttpClient object, defining the url
+  cli <- crul::HttpClient$new(url = url)
+  # do a GET request, allow curl options to be passed in
+  res <- cli$get(...)
+  # check to see if request failed or succeeded
+  # - a custom approach this time combining status code, 
+  #   explanation of the code, and message from the server
+  if (res$status_code > 201) {
+    mssg <- jsonlite::fromJSON(res$parse("UTF-8"))$message$message
+    x <- res$status_http()
+    stop(
+      sprintf("HTTP (%s) - %s\n  %s", x$status_code, x$explanation, mssg),
+      call. = FALSE
+    )
+  }
+  # parse response
+  txt <- res$parse("UTF-8")
+  # parse the JSON to an R list
+  jsonlite::fromJSON(txt)
+}
+```
+
+Use the function
+
+
+```r
+make_request2("https://api.crossref.org/works?rows=0")
+#> $status
+#> [1] "ok"
+#> 
+#> $`message-type`
+#> [1] "work-list"
+#> 
+#> $`message-version`
+#> [1] "1.0.0"
+#> 
+#> $message
+#> $message$facets
+#> named list()
+#> 
+#> $message$`total-results`
+#> [1] 89155073
+#> 
+#> $message$items
+#> list()
+#> 
+#> $message$`items-per-page`
+#> [1] 0
+#> 
+#> $message$query
+#> $message$query$`start-index`
+#> [1] 0
+#> 
+#> $message$query$`search-terms`
+#> NULL
+```
+
+No different from the first function (besides the URL). However, now we can 
+pass in curl options:
+
+
+```r
+make_request2("https://api.crossref.org/works?rows=0", verbose = TRUE)
+make_request2("https://api.crossref.org/works?rows=0", timeout_ms = 1)
+```
+
+We can also pass named parameters supported in the `get` method, including
+`query`, `disk`, and `stream`.
+
+
+```r
+make_request2("https://api.crossref.org/works", query = list(rows = 0))
+#> $status
+#> [1] "ok"
+#> 
+#> $`message-type`
+#> [1] "work-list"
+#> 
+#> $`message-version`
+#> [1] "1.0.0"
+#> 
+#> $message
+#> $message$facets
+#> named list()
+#> 
+#> $message$`total-results`
+#> [1] 89155073
+#> 
+#> $message$items
+#> list()
+#> 
+#> $message$`items-per-page`
+#> [1] 0
+#> 
+#> $message$query
+#> $message$query$`start-index`
+#> [1] 0
+#> 
+#> $message$query$`search-terms`
+#> NULL
+```
+
+In addition, the failure behavior is different, and customized to the 
+specific web resource we are working with:
+
+
+```r
+make_request2("https://api.crossref.org/works?rows=asdf")
+#> Error: HTTP (400) - Bad request syntax or unsupported method
+#>   Integer specified as asdf but must be a positive integer less than or equal to 1000.
+```
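If you'd rather not stop on failure, the custom error can be caught like any other R condition. A minimal sketch, assuming the `make_request2` function defined above (`safe_request` is a hypothetical wrapper, not part of `crul`):

```r
# fall back to NULL instead of stopping when the API rejects the request
safe_request <- function(url) {
  tryCatch(
    make_request2(url),
    error = function(e) {
      message("request failed: ", conditionMessage(e))
      NULL
    }
  )
}
safe_request("https://api.crossref.org/works?rows=asdf")
```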
+
+## Asynchronous requests
+
+You may want to use asynchronous HTTP requests when any one HTTP request 
+takes "too long" - which is, of course, relative. You may be dealing with 
+a server that responds very slowly, or other circumstances.
+
+See the __async with crul__ vignette for more details on asynchronous requests.
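As a minimal sketch (assuming the `Async` class added in this release; the field and method names here follow the async vignette and are not guaranteed):

```r
library("crul")
# fire off two GET requests in parallel and collect the responses
cc <- Async$new(urls = c(
  "https://httpbin.org/get?a=5",
  "https://httpbin.org/get?b=6"
))
res <- cc$get()   # returns a list of response objects
sapply(res, function(z) z$status_code)
```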

-- 
Alioth's /usr/local/bin/git-commit-notice on /srv/git.debian.org/git/debian-med/r-cran-crul.git



More information about the debian-med-commit mailing list