[lua-torch-dok] 01/01: Imported Upstream version 0~20160131-g1b36900

Zhou Mo cdluminate-guest at moszumanska.debian.org
Sat Aug 20 15:07:57 UTC 2016


This is an automated email from the git hooks/post-receive script.

cdluminate-guest pushed a commit to branch master
in repository lua-torch-dok.

commit 8a42ca2027f5da9dd755ded45ffed159aa19b5cd
Author: Zhou Mo <cdluminate at gmail.com>
Date:   Mon Aug 15 13:23:46 2016 +0000

    Imported Upstream version 0~20160131-g1b36900
---
 CMakeLists.txt                 |   8 +
 COPYRIGHT.txt                  |  36 ++
 docinstall/README.md           | 613 ++++++++++++++++++++++++++++++++++
 docinstall/blas.md             | 170 ++++++++++
 docinstall/installqtdebian.md  |  32 ++
 docinstall/installqtwindows.md |  20 ++
 doclua/README.md               |  84 +++++
 doctutorial/README.md          | 653 ++++++++++++++++++++++++++++++++++++
 init.lua                       |  10 +
 inline.lua                     | 739 +++++++++++++++++++++++++++++++++++++++++
 rocks/dok-scm-1.rockspec       |  32 ++
 11 files changed, 2397 insertions(+)

diff --git a/CMakeLists.txt b/CMakeLists.txt
new file mode 100644
index 0000000..96e49b2
--- /dev/null
+++ b/CMakeLists.txt
@@ -0,0 +1,8 @@
+CMAKE_MINIMUM_REQUIRED(VERSION 2.6 FATAL_ERROR)
+CMAKE_POLICY(VERSION 2.6)
+
+SET(luasrc 
+	   ${CMAKE_CURRENT_SOURCE_DIR}/init.lua 
+	   ${CMAKE_CURRENT_SOURCE_DIR}/inline.lua)
+
+INSTALL(FILES ${luasrc} DESTINATION ${LUADIR}/dok)
diff --git a/COPYRIGHT.txt b/COPYRIGHT.txt
new file mode 100644
index 0000000..c9cc784
--- /dev/null
+++ b/COPYRIGHT.txt
@@ -0,0 +1,36 @@
+Copyright (c) 2011-2014 Idiap Research Institute (Ronan Collobert)
+Copyright (c) 2012-2014 Deepmind Technologies (Koray Kavukcuoglu)
+Copyright (c) 2011-2012 NEC Laboratories America (Koray Kavukcuoglu)
+Copyright (c) 2011-2013 NYU (Clement Farabet)
+Copyright (c) 2006-2010 NEC Laboratories America (Ronan Collobert, Leon Bottou, Iain Melvin, Jason Weston)
+Copyright (c) 2006      Idiap Research Institute (Samy Bengio)
+Copyright (c) 2001-2004 Idiap Research Institute (Ronan Collobert, Samy Bengio, Johnny Mariethoz)
+
+All rights reserved.
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted provided that the following conditions are met:
+
+1. Redistributions of source code must retain the above copyright
+   notice, this list of conditions and the following disclaimer.
+
+2. Redistributions in binary form must reproduce the above copyright
+   notice, this list of conditions and the following disclaimer in the
+   documentation and/or other materials provided with the distribution.
+
+3. Neither the names of Deepmind Technologies, NYU, NEC Laboratories America 
+   and IDIAP Research Institute nor the names of its contributors may be 
+   used to endorse or promote products derived from this software without 
+   specific prior written permission.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
+AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
+ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
+LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
+CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
+SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
+INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
+CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
+ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
+POSSIBILITY OF SUCH DAMAGE.
diff --git a/docinstall/README.md b/docinstall/README.md
new file mode 100644
index 0000000..77a9ad4
--- /dev/null
+++ b/docinstall/README.md
@@ -0,0 +1,613 @@
+<a name="install.dok"/>
+# Torch Installation Manual #
+
+Currently, Torch7 can only be installed from the
+sources. Binary releases will be distributed soon.
+
+<a name="install.sources"/>
+# Installing from sources #
+
+`Torch7` is mainly made out of `ANSI C` and `Lua`, which makes
+it easy to compile everywhere. The graphical interface is based on QT
+and requires a `C++` compiler.
+
+The installation process is easily portable to most platforms,
+thanks to [CMake](http://www.cmake.org), a tool which replaces the
+aging `configure/automake` tools. CMake allows us to detect and
+configure Torch properly.
+
+You will find below step-by-step instructions for each system we support.
+
+You are also strongly encouraged to read the [CMake hints](#CMakeHints)
+section for more details on CMake (and before reporting a problem).
+
+If you are a programmer, you might want to produce your own
+[development package](#DevPackages).
+
+<a name="install.linux"/>
+## Linux ##
+
+### A. Requirements ###
+
+Torch compilation requires a number of standard packages described below:
+  * __Mandatory:__
+    * A `C/C++` compiler. [CLang](http://clang.llvm.org) is great. The [GNU compiler](http://gcc.gnu.org) or the Intel compiler also works fine.
+    * [CMake](http://www.cmake.org) version 2.6 or later is required.
+    * [Gnuplot](http://gnuplot.info), version `4.4` or later is recommended for best experience.
+
+  * __Recommended:__
+    * [GNU Readline](http://tiswww.case.edu/php/chet/readline/rltop.html)
+    * [Git](http://git-scm.com/) to keep up-to-date sources
+    * [QT 4.4](http://trolltech.com/products) or newer development libraries
+    * BLAS. [OpenBLAS](https://github.com/xianyi/OpenBLAS) is recommended for that purpose on Intel computers.
+    * LAPACK. [OpenBLAS](https://github.com/xianyi/OpenBLAS) is recommended for that purpose on Intel computers.
+
+The installation of most of these packages should be rather
+straightforward. On an `Ubuntu 10.04 LTS` system we use the
+`apt-get` magic:
+
+For GCC:
+```
+sudo apt-get install gcc g++
+```
+If you prefer to use CLang:
+```
+sudo apt-get install clang
+```
+
+CMake reads the CC and CXX environment variables. If you do not want to use the default compiler, just do
+```
+export CC=clang
+export CXX=clang++
+```
+
+To install the additional packages, do:
+```
+sudo apt-get install cmake
+sudo apt-get install libreadline5-dev
+sudo apt-get install git-core
+sudo apt-get install gnuplot
+```
+
+Please adapt according to your distribution.
+
+Note: the readline library is helpful for better command-line interaction,
+but it is not required. It is only used when QT is installed.
+
+We require `QT 4.4` for handling graphics (_beware_ of installing QT 4.3
+or older). If it is not found at compile time, Torch will still compile, but
+no graphics will be available. On the `Ubuntu 10.04 LTS` distribution you can
+install it with
+```
+sudo apt-get install libqt4-core libqt4-gui libqt4-dev
+```
+
+An excellent BLAS/LAPACK implementation is also recommended for speed. See
+our [BLAS recommendations](blas).
+
+<a name="install.sources"/>
+### B. Getting Torch sources ###
+
+Torch7 is being developed on [github](http://github.com).
+
+```
+git clone git://github.com/andresy/torch.git
+```
+
+
+<a name="install.config"/>
+### C. Configuring Torch ###
+
+We use `CMake` for configuring `Torch`. We _highly_ recommend creating
+a dedicated build directory first. This eases cleaning up built objects,
+and also allows you to build Torch with _various configurations_
+(e.g. Release and Debug in two different build directories).
+
+```
+cd torch
+mkdir build
+cd build
+cmake ..
+```
+
+The `..` given to `cmake` indicates the directory where the
+sources are. We chose here to have a `build` directory inside
+`torch`, but it could be anywhere else. In the latter case, go
+to your build directory instead and then do:
+
+```
+cmake /path/to/torch/sources
+```
+
+CMake detects external libraries or tools necessary for Torch, and
+produces Makefiles such that Torch is then easily compilable on your
+platform. If you prefer the GUI version of CMake, you can replace
+`cmake` with `ccmake` in the above command lines. In particular, using
+`ccmake` is _strongly encouraged_ for finer configuration
+of Torch.
+
+The most common Torch configuration step you might want to perform is
+changing the installation path. By default, Torch will be installed in
+`/usr/local`. You will need super-user rights to do that. If
+you are not root on your computer, you can instead specify an
+install directory to `CMake` on the above `cmake` command line:
+
+```
+cmake .. -DCMAKE_INSTALL_PREFIX=/my/install/path
+```
+
+Equivalently, you can set the variable `CMAKE_INSTALL_PREFIX` if you
+use the `ccmake` GUI. Please see the
+[CMake documentation](http://www.cmake.org), or _at least_
+[some of our CMake hints](#CMakeHints), for more details on configuration.
+
+<a name="install.compile"/>
+### D. Compiling and installing ###
+
+If the configuration was successful, Makefiles should have appeared in
+your build directory. Compile and install Torch with:
+```
+make install
+```
+
+This last command might need to be prefixed with `sudo` if you are
+installing Torch in `/usr/local`.
+
+<a name="install.run"/>
+### E. Running Torch ###
+
+Now Torch should be installed in `/usr/local`, or in
+`/my/install/path` if you set `CMAKE_INSTALL_PREFIX`
+when configuring with CMake. The Lua executables (`torch-lua`,
+`torch-qlua` and `torch`) are found in the `bin` sub-directory of
+the installation directory.
+
+```
+/usr/local/bin/torch-lua
+Lua 5.1.4  Copyright (C) 1994-2008 Lua.org, PUC-Rio
+> require 'torch'
+> = torch.Tensor(5):zero()
+
+0
+0
+0
+0
+0
+[torch.Tensor of dimension 5]
+
+> 
+```
+
+For convenience, you might want to add the path to the Lua
+binaries to your `PATH`. The executable `torch-lua` is a simple Lua interpreter
+(as provided on the [Lua website](http://www.lua.org)), while `torch-qlua`
+has enhanced interactivity (like completion) and is able to handle
+graphics and QT widgets.
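+
+For example, if Torch was installed under the default `/usr/local` prefix, one way to do this (for a Bourne-compatible shell; adapt to your shell of choice) is:
+```
+export PATH=/usr/local/bin:$PATH
+```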
+
+For the best experience, we suggest using the `torch` executable, which
+preloads the most commonly used libraries into the global namespace.
+
+```
+/usr/local/bin/torch
+Try the IDE: torch -ide
+Type help() for more info
+Torch 7.0  Copyright (C) 2001-2011 Idiap, NEC Labs, NYU
+Lua 5.1  Copyright (C) 1994-2008 Lua.org, PUC-Rio
+torch> =torch.randn(10,10)
+ 1.3862  1.5983 -2.0216 -0.1502  1.9467 -1.2322  0.1628 -2.6253  1.3255 -0.5784
+ 0.1363 -1.2638 -1.0661  0.0233  1.3064 -0.8817  1.1424  1.0952 -0.2147  0.7712
+ 1.1348 -0.8596 -0.6102  0.9137 -1.1582 -0.3301  0.5250  1.3631 -0.4051 -0.9549
+-0.2734 -0.0914  0.9728  1.3272 -0.4126 -0.1264 -1.2936 -0.2120  1.3040 -1.9991
+-0.9642  0.2367 -0.5246 -0.0476 -0.6586  1.8705  0.8482 -1.2768 -0.0782  0.5403
+ 0.4551 -1.4549 -0.7079 -1.6308 -0.2086 -1.7208 -1.4915  0.9703  0.3661  0.5051
+ 0.3082  0.3188 -1.1247  0.1343 -0.2671 -0.4596 -0.2618  1.7482  0.4714  0.5217
+-0.8406 -0.2372 -0.1504  0.6982 -0.5437  0.7447  0.0229 -2.4998  0.7367 -1.2721
+-0.3993  1.5155 -0.3685 -0.0534 -0.0495 -0.1002 -0.3652  0.1248 -0.2693  0.9159
+-1.5035  0.7326 -0.6262  0.2715  0.0543 -0.7419 -0.6758 -0.0221  0.5342 -0.4262
+[torch.DoubleTensor of dimension 10x10]
+
+torch> 
+
+```
+
+You can get more help about `torch`:
+
+```
+/usr/local/bin/torch -h
+Torch7 Shell
+
+Usage: torch [options] [script [args]]
+
+General options:
+  -b|-bare         start a bare environment (no libs preloaded)
+  -e string        execute string
+  -l lib           require lib
+  -i               enter interactive mode after executing script [false]
+  -v|-version      show version information [false]
+  -h|-help         this help [false]
+
+Qt options:
+  -nographics|-ng  disable all the graphical capabilities [false]
+  -ide             enable IDE (graphical console) [false]
+  -onethread       run lua in the main thread (might be safer) [false] 
+```
+
+## MacOS X ##
+
+### A. Requirements ###
+
+Torch compilation requires a number of standard packages described below:
+  * __Mandatory:__
+    * A `C/C++` compiler. [CLang](http://clang.llvm.org) is great. The [GNU compiler](http://gcc.gnu.org) or the Intel compiler also works fine.
+    * [CMake](http://www.cmake.org) version 2.6 or later is required.
+    * [Gnuplot](http://gnuplot.info), version `4.4` or later is recommended for best experience.
+
+  * __Recommended:__
+    * [GNU Readline](http://tiswww.case.edu/php/chet/readline/rltop.html)
+    * [Git](http://git-scm.com/) to keep up-to-date sources
+    * [QT 4.4](http://trolltech.com/products) or newer development libraries
+    * BLAS. [OpenBLAS](https://github.com/xianyi/OpenBLAS) is recommended for that purpose on Intel computers.
+    * LAPACK. [OpenBLAS](https://github.com/xianyi/OpenBLAS) is recommended for that purpose on Intel computers.
+
+gcc should be installed with the
+[Apple developer tools](http://developer.apple.com/tools/xcode).
+These tools should also be available on your MacOS X
+installation DVD.
+
+CMake can be retrieved from the
+[CMake website](http://www.cmake.org/HTML/Download.html) (you can
+take the __DMG__ installer). However, we found it just as simple to use
+[Homebrew](http://mxcl.github.com/homebrew/) or [MacPorts](http://www.macports.org/),
+which are necessary anyway for git and the Readline library. We recommend avoiding
+[Fink](http://finkproject.org/), which tends to always be
+outdated. Assuming you installed Homebrew, just do:
+
+```
+brew install readline
+brew install cmake
+brew install git
+brew install gnuplot
+```
+
+To install QT, one can use Homebrew, but it might take too long to
+compile. Instead, you can
+[download](http://trolltech.com/downloads/opensource/appdev/mac-os-cpp)
+the binary __DMG__ file available on the
+[Trolltech website](http://trolltech.com) and install it.
+
+An excellent BLAS/LAPACK implementation is also recommended for speed. See
+our [BLAS recommendations](blas).
+
+Last but not least, GCC >= 4.6 is *required* to enable OpenMP on MacOS X. This
+is a bit crazy, but compiling against OpenMP with previous versions of GCC
+will give you random segfaults and trap errors (a known issue on the web).
+We strongly recommend installing GCC 4.6, to fully benefit from Torch's
+fast numeric routines. A very simple way of doing so is to install the
+[GFortran](http://gcc.gnu.org/wiki/GFortranBinaries) libraries, which are
+packaged as a simple dmg, ready to install. That'll automatically install gcc
+and g++. Once this is done, set your CC and CXX before building Torch:
+
+```
+export CC=/usr/local/gfortran/bin/gcc
+export CXX=/usr/local/gfortran/bin/g++
+```
+
+### B. Getting Torch sources ###
+
+Same as [getting sources](#install.sources) for linux.
+
+### C. Configuring Torch ###
+
+Same as [configuring](#install.config) for linux.
+
+### D. Compiling and Installing ###
+
+Same as [compiling](#install.compile) for linux.
+
+### E. Running Torch ###
+
+Same as [running](#install.run) for linux.
+
+<a name="install.freebsd"/>
+## FreeBSD ##
+
+### A. Requirements ###
+
+Torch compilation requires a number of standard packages described below:
+  * __Mandatory:__
+    * A `C/C++` compiler. [CLang](http://clang.llvm.org) is great. The [GNU compiler](http://gcc.gnu.org) or the Intel compiler also works fine.
+    * [CMake](http://www.cmake.org) version 2.6 or later is required.
+    * [Gnuplot](http://gnuplot.info), version `4.4` or later is recommended for best experience.
+
+  * __Recommended:__
+    * [GNU Readline](http://tiswww.case.edu/php/chet/readline/rltop.html)
+    * [Git](http://git-scm.com/) to keep up-to-date sources
+    * [QT 4.4](http://trolltech.com/products) or newer development libraries
+    * BLAS. [OpenBLAS](https://github.com/xianyi/OpenBLAS) is recommended for that purpose on Intel computers.
+    * LAPACK. [OpenBLAS](https://github.com/xianyi/OpenBLAS) is recommended for that purpose on Intel computers.
+
+GCC and CLang come with the FreeBSD install. However, only GCC 4.2 is installed by default (for licensing reasons).
+We prefer to use CLang. If you want to stick with GCC, we recommend installing GCC 4.4 or GCC 4.6 instead of using
+GCC 4.2 (which performs poorly on recent CPUs):
+```
+pkg_add -r gcc46
+```
+
+CMake reads the CC and CXX environment variables. If you do not want to use the default compiler, just do
+```
+export CC=clang
+export CXX=clang++
+```
+
+Additional packages can be easily installed with:
+```
+pkg_add -r readline
+pkg_add -r cmake
+pkg_add -r git
+pkg_add -r gnuplot
+```
+
+Note: on FreeBSD 9.0, it seems `pdflib` (a dependency of gnuplot) is not available as a binary. Please
+install gnuplot from the ports tree instead:
+```
+cd /usr/ports/math/gnuplot
+make install clean
+```
+
+To install QT, also use `pkg_add -r qt4`, followed by `pkg_add -r qt4-XXX`, where
+XXX is one of the components (or tools) listed on the [Qt FreeBSD page](http://www.freebsd.org/doc/en/books/porters-handbook/using-qt.html).
+Be sure to install all components and tools listed there.
+
+An excellent BLAS/LAPACK implementation is also recommended for speed. See
+our [BLAS recommendations](blas).
+
+### B. Getting Torch sources ###
+
+Same as [getting sources](#install.sources) for linux.
+
+### C. Configuring Torch ###
+
+Same as [configuring](#install.config) for linux. Note that dynamic RPATH (related to `$ORIGIN`) does not work properly
+on FreeBSD 9. You can deactivate it with the `WITH_DYNAMIC_RPATH` option:
+```
+cmake .. -DCMAKE_INSTALL_PREFIX=/my/install/path -DWITH_DYNAMIC_RPATH=OFF
+```
+
+### D. Compiling and Installing ###
+
+Same as [compiling](#install.compile) for linux.
+
+### E. Running Torch ###
+
+Same as [running](#install.run) for linux.
+
+## Cygwin ##
+
+_We do not recommend_ a Cygwin installation. Cygwin is pretty slow, and we
+could not manage to make QT 4.4 work under Cygwin. Prefer a
+[native Windows](#Windows) installation instead.
+
+<a name="Windows"/>
+## Windows ##
+
+___ Torch7 is not yet Windows compatible, coming soon ___
+
+
+<a name="CMakeHints"/>
+## CMake hints ##
+
+CMake is well documented on [http://www.cmake.org](http://www.cmake.org).
+
+### CMake and CLang ###
+
+If you would like to use [CLang](http://clang.llvm.org) for compiling Torch7, assuming a proper
+CLang installation, you only have to do
+```
+export CC=clang
+export CXX=clang++
+```
+before running the `cmake` command line.
+
+### CMake GUI ###
+
+Under Windows, CMake comes with a GUI by default. Under Unix systems it is
+quite handy to use the _text GUI_ available through `ccmake`.
+`ccmake` works in the same way as `cmake`: go to your build directory and run
+```
+ccmake /path/to/torch/source
+```
+
+The Windows and Unix GUIs work in the same way: you `configure`, _possibly several times_,
+until CMake has detected everything and proposes to `generate` the configuration.
+
+After each configuration step, you can modify CMake variables to suit your needs.
+
+### CMake variables ###
+
+CMake is highly configurable thanks to _variables_ you can set when
+executing it. It is really easy to change these variables with the CMake GUI. If you want
+to stick with the command line, you can also change a variable by doing:
+```
+cmake /path/to/torch/source -DMY_VARIABLE=MY_VALUE
+```
+where `MY_VARIABLE` is the name of the variable you want to set and
+`MY_VALUE` is its corresponding value.
+
+#### Interesting standard CMake variables ####
+
+  * `CMAKE_INSTALL_PREFIX`: directory where Torch is going to be installed
+  * `CMAKE_BUILD_TYPE`: `Release` for optimized compilation, `Debug` for debug compilation.
+  * `CMAKE_C_FLAGS`: add here the flags you want to pass to the C compiler (e.g. `-Wall`)
+
+#### Notable Torch7 CMake variables ####
+
+  * `WITH_BLAS`: specify which BLAS you want to use (if you have several on your computer). Can be mkl/open/goto/acml/atlas/accelerate/veclib/generic.
+  * `WITH_LUA_JIT`: tell CMake to compile Torch7 against LuaJIT instead of Lua (default is OFF)
+  * `WITH_QTLUA`: compile QtLua if Qt is found (default is ON)
+  * `WITH_QTLUA_IDE`: compile the QtLua IDE if Qt is found (default is ON)
+  * `WITH_RPATH`: use RPATH so that you do not need to add the Torch7 install library path to LD_LIBRARY_PATH (default is ON)
+  * `WITH_DYNAMIC_RPATH`: if used together with WITH_RPATH, makes library paths relative to the Torch7 executable. If you move the install directory, things will still work. This flag does not work on FreeBSD (default is ON).
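+
+Putting several of these variables together, a configuration command might look as follows (the source path, install prefix, and BLAS choice here are illustrative; adapt them to your setup):
+```
+cmake /path/to/torch/source \
+  -DCMAKE_BUILD_TYPE=Release \
+  -DCMAKE_INSTALL_PREFIX=$HOME/torch \
+  -DWITH_BLAS=open
+```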
+
+### CMake caches everything ###
+
+As soon as CMake has performed a test to detect an external library, it saves
+the result of this test in a cache and will not run the test again.
+
+If you forget to install a library (like QT or Readline) and install it
+after having performed a CMake configuration, it will not be used by Torch
+when compiling.
+
+_When in doubt_, if you have changed, updated, or added libraries that should be used by Torch, you should
+_erase your build directory and perform the CMake configuration again_.
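+
+Concretely, assuming the `build` sub-directory layout used earlier, starting fresh is just (a sketch; adapt the paths to your tree):
+```
+cd torch
+rm -rf build     # discards CMakeCache.txt and every cached detection result
+mkdir build
+cd build
+cmake ..
+```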
+
+
+<a name="DevPackages"/>
+## Development Torch packages ##
+
+If you want to develop your own package, you can put it in the `dev`
+sub-directory. Packages in `dev` are compiled in the same way as the
+ones in the `packages` sub-directory. We keep this directory to make a
+clear distinction between official packages and development packages.
+
+Alternatively, you can use [Torch package manager](#PackageManager) 
+to build and distribute your packages.
+
+<a name="PackageManager"/>
+## The Torch Package Management System ##
+
+Torch7 has a built-in package management system that makes it very easy
+for anyone to get extra (experimental) packages, and to create and distribute
+their own.
+
+Calling `torch-pkg` without arguments will give you some help:
+
+```
+/usr/local/bin/torch-pkg
+Torch7 Package Manager
+
+Usage: torch-pkg [options] <command> [argument]
+
+Supported Commands:
+  help            print this help
+  install         install a package (download + build + deploy)
+  download        download a package locally
+  build           build a local package
+  deploy          deploy a package (build it if necessary)
+  list            list available packages
+  search          search for a package
+  add             add a server address to local config ($HOME/.torchpkg/config)
+
+Arguments for install|download:
+  <pkg-name>      a package name (to be found in one of the configured repos)
+  <pkg-repo>      the full address of a GIT repository
+  <pkg-url>       the URL of a simple package (should be a tar/tgz/tbz)
+
+Arguments for add:
+  <base-url>      a base URL where GIT repos or tars can be found
+
+Options:
+  -v|-verbose     be more verbose
+  -l|-local       local install
+  -n|-nodeps      do not install dependencies (when installing)
+  -d|-dry         dry run 
+```
+
+It's fairly self-explanatory. You can easily get a list of the available
+packages:
+
+```
+/usr/local/bin/torch-pkg list
+
+--> retrieving package lists from servers
+
+--> parallel
+    A parallel computing framework for Torch7, with serialization facilities
+    hosted at: https://github.com/clementfarabet/lua---parallel
+
+--> image
+    An image-processing toolbox for Torch7
+    hosted at: https://github.com/clementfarabet/lua---image
+
+--> optim
+    An optimization toolbox for Torch7
+    hosted at: https://github.com/koraykv/optim
+
+...
+```
+
+To install a new package, simply do:
+
+```
+/usr/local/bin/torch-pkg install pkgname
+```
+
+The sources of the packages are downloaded and kept in a hidden
+directory in your home directory:
+
+```
+torch-pkg install image
+ls ~/.torch/torch-pkg/image/
+```
+
+If you just want to get the sources of a package, without
+installing it, you can do so like this:
+
+```
+/usr/local/bin/torch-pkg download pkgname
+```
+
+And then build it and install it:
+
+```
+cd pkgname
+/usr/local/bin/torch-pkg build
+/usr/local/bin/torch-pkg deploy
+```
+
+If you need to distribute your own packages, you just have
+to create a package file, which contains one entry per package,
+and then make it available online. Users can then easily add
+that file to their repository by doing:
+
+```
+/usr/local/bin/torch-pkg add http://url/to/config
+```
+
+The config typically looks like:
+
+```
+pkg = pkg or {}
+
+pkg.image = {
+  git = 'https://github.com/clementfarabet/lua---image',
+  description = 'An image-processing toolbox for Torch7',
+  dependencies = {'sys', 'xlua'},
+  commit = 'master'
+}
+
+pkg.optim = {
+  git = 'https://github.com/koraykv/optim',
+  description = 'An optimization toolbox for Torch7',
+  dependencies = {},
+  commit = 'newpack'
+}
+
+pkg.parallel = {
+  git = 'https://github.com/clementfarabet/lua---parallel',
+  description = 'A parallel computing framework for Torch7, with serialization facilities',
+  dependencies = {'sys'},
+  commit = 'newpack'
+}
+```
+
+<a name="install.binary"/>
+# Installing from binaries #
+
+__This section is not applicable now as we have not produced binaries yet.__
+
+__Please [install from sources](#install.sources).__
+
+
+
diff --git a/docinstall/blas.md b/docinstall/blas.md
new file mode 100644
index 0000000..2d34df4
--- /dev/null
+++ b/docinstall/blas.md
@@ -0,0 +1,170 @@
+# BLAS and LAPACK #
+
+There are multiple BLAS and LAPACK libraries out there. Most Linux
+distributions come with pre-compiled BLAS or ATLAS libraries.
+__We strongly discourage you from using those libraries__. In our experience,
+these libraries are slow. Things have improved with recent ATLAS
+development versions, but they still have a hard time catching up with the Intel MKL
+or GotoBLAS/OpenBLAS implementations.
+
+We found that on Intel platforms,
+[GotoBLAS](http://www.tacc.utexas.edu/tacc-projects/gotoblas2)/[OpenBLAS](https://github.com/xianyi/OpenBLAS)
+or the [Intel MKL](http://www.intel.com/software/products/mkl) implementations were
+the fastest. The advantage of GotoBLAS and OpenBLAS is that they are
+distributed under a BSD-like license. The choice is yours.
+
+## Installing OpenBLAS ##
+
+[GotoBLAS](http://www.tacc.utexas.edu/tacc-projects/gotoblas2) has been
+extremely well hand-optimized by Kazushige Goto. The project has been
+released under a BSD-like license. Unfortunately, it is no longer maintained
+(at this time), but several forks have since been released. Our preference
+goes to [OpenBLAS](https://github.com/xianyi/OpenBLAS).
+
+We provide below simple instructions to install OpenBLAS.
+
+First get the latest OpenBLAS stable code:
+```
+git clone git://github.com/xianyi/OpenBLAS.git
+```
+
+You will need a Fortran compiler. On most Linux distributions, `gfortran` is available.
+For example, on Debian:
+```
+apt-get install gfortran
+```
+If you prefer, you can also install GCC 4.6, which supports the Fortran language as well.
+
+On FreeBSD, gfortran is not available, so please use GCC 4.6.
+```
+pkg_add -r gcc46
+```
+
+On MacOS X, you should install the gfortran package provided on
+[this GCC webpage](http://gcc.gnu.org/wiki/GFortranBinaries).
+
+You can now go into the OpenBLAS directory and just do:
+```
+make NO_AFFINITY=1 USE_OPENMP=1
+```
+OpenBLAS uses processor affinity to go faster. However, in general, on a
+computer shared between several users, this causes processes to fight for
+the same CPU. We thus disable it here with the `NO_AFFINITY` flag. We
+also use the `USE_OPENMP` flag, such that OpenBLAS uses OpenMP and not
+pthreads. This is important to avoid some confusion in the number of
+threads, as Torch7 uses OpenMP. Read the OpenBLAS manual for more details.
+
+You can use `CC` and `FC` variables to control the C and Fortran compilers.
+
+On FreeBSD, use `gmake` instead of `make`. You also have to specify the correct MD5 sum program.
+You will probably want to use the following command line:
+```
+gmake NO_AFFINITY=1 USE_OPENMP=1 CC=gcc46 FC=gcc46 MD5SUM='md5 -q'
+```
+
+On MacOS X, you will also have to specify the correct MD5SUM program:
+```
+make NO_AFFINITY=1 USE_OPENMP=1 MD5SUM='md5 -q'
+```
+
+Be sure to specify MD5SUM correctly, otherwise OpenBLAS might not compile LAPACK properly.
+
+At the end of the compilation, you might want to run
+```
+make PREFIX=/your_installation_path/ install
+```
+to install OpenBLAS at a specific location. You might also want to keep it where you compiled it.
+
+Note that on MacOS X, the generated __dynamic__ (`.dylib`) library does not contain LAPACK. Simply remove
+the dylib (keeping the archive `.a`) so that LAPACK is correctly detected.
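+
+For instance, if OpenBLAS was installed under `/your_installation_path` (an illustrative path, as above), the removal would look like:
+```
+rm /your_installation_path/lib/libopenblas*.dylib
+```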
+
+### CMake detection ###
+Make sure that CMake can find your OpenBLAS library. This can be done with
+```
+export CMAKE_LIBRARY_PATH=/your_installation_path/lib
+```
+before running the cmake command line. On some platforms, the `gfortran`
+library might also not be found. In this case, add the path to the
+`gfortran` library to `CMAKE_LIBRARY_PATH`.
+
+## Installing Intel MKL ##
+
+[Intel MKL](http://www.intel.com/software/products/mkl) is a closed-source
+library _sold_ by Intel. Follow Intel's instructions to unpack MKL. Then make
+sure the libraries relevant for your system (e.g. `em64t` if you are on a
+64-bit distribution) are available in your `LD_LIBRARY_PATH`. Both the BLAS
+and LAPACK interfaces are readily included in MKL.
+
+### CMake detection ###
+Make sure that CMake can find your libraries. This can be done with something like
+```
+export CMAKE_INCLUDE_PATH=/path/to/mkl/include
+export CMAKE_LIBRARY_PATH=/path/to/mkl/lib/intel64:/path/to/mkl/compiler/lib/intel64
+export LD_LIBRARY_PATH=$CMAKE_LIBRARY_PATH:$LD_LIBRARY_PATH
+```
+before running the cmake command line.
+
+Of course, you have to adapt `/path/to/mkl` and `/path/to/mkl/compiler` to your installation setup. In the above
+case, we also chose the `intel64` libraries, which might not be what you need.
+
+A common mistake is to forget the path to the Intel compiler libraries. CMake
+will not be able to detect the threaded libraries in that case.
+
+## CMake and BLAS/LAPACK ##
+
+As mentioned above, you should make sure CMake can find your
+libraries. Carefully watch for libraries found (or not found) in the output
+generated by cmake.
+
+For example, if you see something like:
+```
+-- Checking for [openblas - gfortran]
+--   Library openblas: /Users/ronan/open/lib/libopenblas.dylib
+--   Library gfortran: BLAS_gfortran_LIBRARY-NOTFOUND
+```
+This means CMake found the OpenBLAS library, but could not make it work
+properly because it did not find the required gfortran library. Make sure
+that CMake can find all the required libraries through CMAKE_LIBRARY_PATH.
+If your libraries are present in LD_LIBRARY_PATH, it should be fine too.
+
+The locations to search are generally as follows:
+```
+/usr/lib/gcc/x86_64-linux-gnu/
+/usr/lib/gcc/x86_64-redhat-linux/4.4.4/
+```
+These are a bit cryptic, but look around and find the path that contains libgfortran.so. Then use
+```
+export CMAKE_LIBRARY_PATH=...
+```
+before calling cmake to build Torch; this makes sure that OpenBLAS will be found.
+
+
+Note that CMake will try to detect various BLAS/LAPACK libraries. If you have several libraries
+installed on your computer (say Intel MKL and OpenBLAS), or if you want to avoid all these checks,
+you might want to select the one you want to use with:
+```
+cd torch7/build
+cmake .. -DWITH_BLAS=open
+```
+Valid options for WITH_BLAS are `mkl` (Intel MKL), `open` (OpenBLAS),
+`goto` (GotoBlas2), `acml` (AMD ACML), `atlas` (ATLAS),
+`accelerate` (Accelerate framework on MacOS X), `vecLib` (vecLib
+framework on MacOS X) or `generic`.
+
+Note again that the best choices are probably `open` or `mkl`. For
+consistency reasons, CMake will try to find the corresponding LAPACK
+package (and does not allow mixing up different BLAS/LAPACK versions).
+
+## GotoBLAS/OpenBLAS and MKL threads ##
+
+GotoBLAS/OpenBLAS and MKL are multi-threaded libraries.
+With MKL, the number of threads can be controlled by
+```
+export OMP_NUM_THREADS=N
+```
+where N is an integer.
+
+Beware that running small problems on a large number of threads reduces
+performance! Multi-threading should be enabled only for large-scale
+computations.
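+
+For example, to restrict BLAS to a single thread for a run dominated by small problems (the script name here is illustrative):
+```
+OMP_NUM_THREADS=1 torch myscript.lua
+```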
+
diff --git a/docinstall/installqtdebian.md b/docinstall/installqtdebian.md
new file mode 100644
index 0000000..ea4e6c5
--- /dev/null
+++ b/docinstall/installqtdebian.md
@@ -0,0 +1,32 @@
+# Appendix: Installing QT 4.4 #
+
+The version 4.4 of QT might not be available on old distributions. Ubuntu
+provides QT 4.4 backports for Hardy but not for Gutsy.  Debian testing and
+unstable contain QT 4.4, but Debian stable does not. If your distribution
+does not provide QT 4.4, you will have to compile it yourself. This is easily
+done on Debian and Ubuntu with the following:
+```
+# You will need these packages
+sudo apt-get install wget fakeroot dpkg-dev
+mkdir qt
+cd qt
+wget http://ml.nec-labs.com/download/qt/qt4-x11_4.4.0-4.dsc
+wget http://ml.nec-labs.com/download/qt/qt4-x11_4.4.0.orig.tar.gz
+wget http://ml.nec-labs.com/download/qt/qt4-x11_4.4.0-4.diff.gz
+dpkg-source -x qt4-x11_4.4.0-4.dsc
+cd qt4-x11-4.4.0
+dpkg-buildpackage -rfakeroot
+```
+The command `dpkg-buildpackage` might complain about some unmet
+dependencies. Install them with `apt-get install`, and then execute the
+command again. The compilation takes around two hours on a recent computer.
+You can then install all the packages it created with:
+```
+cd ..
+sudo dpkg -i *.deb
+```
+For distributions that are not Debian-based, please refer to your distribution's
+documentation for compiling packages. You might also be able to find QT 4.4 binary packages on the web.
+
+We also provide binary packages ourselves for [some architectures](http://ml.nec-labs.com/download/qt/binaries/).
+
diff --git a/docinstall/installqtwindows.md b/docinstall/installqtwindows.md
new file mode 100644
index 0000000..7217ba5
--- /dev/null
+++ b/docinstall/installqtwindows.md
@@ -0,0 +1,20 @@
+# Appendix: Install QT 4.4 under Windows #
+
+Download [QT 4.4 sources for Windows](http://trolltech.com/downloads/opensource/appdev/windows-cpp)
+on Trolltech website.
+
+Unzip it and move the resulting directory to `C:\Qt`.
+
+Set the system `PATH` so that it contains `C:\Qt\bin`. This can be done by
+opening the Control Panel, then System, then going to Advanced and then
+Environment Variables. Make sure you do this before compilation. The Torch
+configuration procedure also needs it for finding QT.
+
+Assuming you have Microsoft Visual Studio, you can then do:
+```
+cd c:\Qt
+configure -release
+nmake
+```
+Given the size of QT, allow a few hours for compilation.
+
diff --git a/doclua/README.md b/doclua/README.md
new file mode 100644
index 0000000..a2b7b96
--- /dev/null
+++ b/doclua/README.md
@@ -0,0 +1,84 @@
+# The Lua Language #
+
+`Lua` is a __powerful__, __fast__, __light-weight__, embeddable _scripting language_.
+`Lua` combines simple procedural syntax with powerful data description
+constructs based on associative arrays and extensible semantics. `Lua` is
+dynamically typed, runs by interpreting bytecode for a register-based
+virtual machine, and has automatic memory management with incremental
+garbage collection, making it ideal for configuration, scripting, and rapid
+prototyping. 'Lua' means 'moon' in Portuguese and is pronounced __LOO-ah__. 
+
+Please visit [http://www.lua.org](http://www.lua.org) for more
+information, or have a look at the [Lua Reference Manual](LuaManual).
+
+## Why choose Lua? ##
+
+### Lua is a proven and robust language ###
+
+ Lua has been used in
+[many industrial applications](http://www.lua.org/uses.html) (e.g.,
+[Adobe's Photoshop Lightroom](http://since1968.com/article/190/mark-hamburg-interview-adobe-photoshop-lightroom-part-2-of-2)),
+with an emphasis on embedded systems and games. Lua
+is currently the leading scripting language in games. Lua has a solid
+[reference manual](LuaManual) and there are
+[several books about it](http://www.lua.org/docs.html#books). Several
+[versions](http://www.lua.org/versions.html) of Lua have been released
+and used in real applications since its creation in 1993.
+
+### Lua is fast ###
+
+Lua has a deserved reputation for performance. To
+claim to be "as fast as Lua" is an aspiration of other scripting
+languages. Several benchmarks show Lua as the fastest language in the realm
+of interpreted scripting languages. Lua is fast not only in fine-tuned
+benchmark programs, but in real life too. A substantial fraction of large
+applications have been written in Lua.
+
+### Lua is portable ###
+
+Lua is [distributed](http://www.lua.org/download.html) in a small
+package that builds out-of-the-box in all platforms that have an `ANSI/ISO C`
+compiler. Lua runs on all flavors of `Unix` and `Windows`, and also on mobile
+devices (such as handheld computers and cell phones that use `BREW`, `Symbian`,
+`Pocket PC`, etc.) and embedded microprocessors (such as `ARM` and `Rabbit`) for
+applications like `Lego MindStorms`.
+
+### Lua is embeddable ###
+
+Lua is a fast language engine with a small footprint that you can embed into
+your application. Lua has a simple and well-documented `API` that allows
+strong integration with code written in other languages. It is easy to
+extend Lua with libraries written in other languages. It is also easy to
+extend programs written in other languages with Lua. Lua has been used to
+extend programs written not only in `C` and `C++`, but also in `Java`, `C#`,
+`Smalltalk`, `Fortran`, `Ada`, and even in other scripting languages,
+such as
+`Perl` and `Ruby`.
+
+### Lua is simple and powerful ###
+
+A fundamental concept in the design of Lua is to provide _meta-mechanisms_
+for implementing features, instead of providing a host of features directly
+in the language. For example, although Lua is not a pure object-oriented
+language, it does provide meta-mechanisms for implementing classes and
+inheritance. Lua's meta-mechanisms bring an economy of concepts and keep
+the language small, while allowing the semantics to be extended in
+unconventional ways.
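+
+As a small sketch of such a meta-mechanism (this example is illustrative, not
+part of the language itself), classes can be emulated with tables and the
+`__index` metamethod:
+
+```lua
+local Point = {}
+Point.__index = Point      -- failed lookups on instances fall back to Point
+
+function Point.new(x, y)
+   return setmetatable({x = x, y = y}, Point)
+end
+
+function Point:norm()      -- methods are just functions stored in the table
+   return math.sqrt(self.x * self.x + self.y * self.y)
+end
+
+local p = Point.new(3, 4)
+print(p:norm())            -- prints the norm of (3,4), i.e. 5
+```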
+
+### Lua is free ###
+
+Lua is free software, distributed under a
+[liberal license](http://www.lua.org/license.html) (the well-known `MIT`
+license). It can be used for both academic and commercial purposes at
+absolutely no cost. Just [download](http://www.lua.org/download.html) it and use it.
+
+### Where does Lua come from? ###
+
+Lua is designed and implemented by a team at
+[PUC-Rio](http://www.puc-rio.br/), the Pontifical Catholic University of
+Rio de Janeiro in Brazil. Lua was born and raised at
+[Tecgraf](http://www.tecgraf.puc-rio.br/), the Computer Graphics
+Technology Group of PUC-Rio, and is now housed at
+[Lablua](http://www.lua.inf.puc-rio.br/). Both Tecgraf and Lablua are
+laboratories of the [Department of Computer Science](http://www.inf.puc-rio.br/).
+
diff --git a/doctutorial/README.md b/doctutorial/README.md
new file mode 100644
index 0000000..e771f56
--- /dev/null
+++ b/doctutorial/README.md
@@ -0,0 +1,653 @@
+<a name="torch.tutorial"/>
+# Torch Tutorial #
+
+So you are wondering how to work with Torch?
+This is a little tutorial that should help get you started.
+
+By the end of this tutorial, you should have managed to install torch
+on your machine, and have a good understanding of how to manipulate
+vectors, matrices and tensors and how to build and train a basic
+neural network. For anything else, you should know how to access the
+HTML help and read about how to do it.
+
+## What is Torch? ##
+
+Torch7 provides a Matlab-like environment for state-of-the-art machine
+learning algorithms. It is easy to use and provides a very efficient
+implementation, thanks to an easy and fast scripting language (Lua) and
+an underlying C/C++ implementation.  You can read more about Lua
+[here](http://www.lua.org).
+
+## Installation ##
+
+First before you can do anything, you need to install Torch7 on your
+machine.  That is not described in detail here, but is instead
+described in the [installation help](..:install:index).
+
+
+## Checking your installation works and requiring packages ##
+
+If you have got this far, hopefully your Torch installation works. A simple
+way to make sure it does is to start Lua from the shell command line, 
+and then try to start Torch:
+```lua
+$ torch
+Try the IDE: torch -ide
+Type help() for more info
+Torch 7.0  Copyright (C) 2001-2011 Idiap, NEC Labs, NYU
+Lua 5.1  Copyright (C) 1994-2008 Lua.org, PUC-Rio
+t7> 
+t7> x = torch.Tensor()
+t7> print(x)
+[torch.DoubleTensor with no dimension]
+
+```
+
+You might have to specify the exact path of the `torch` executable
+if you installed Torch in a non-standard path.
+
+In this example, we checked Torch was working by creating an empty
+[Tensor](..:torch:tensor) and printing it on the screen.  The Tensor
+is the main tool in Torch, and is used to represent vectors, matrices
+or higher-dimensional objects (tensors).
+
+`torch` only preloads the basic parts of torch (including
+Tensors). To see the list of all packages distributed with Torch7,
+click [here](..:index).
+
+## Getting Help ##
+
+There are two main ways of getting help in Torch7. One way is of course
+the HTML-formatted help. However, an easier method is to use the
+inline help in the torch interpreter. The `torch` executable also
+integrates this capability. Help about any function can be accessed by
+calling the `help()` function.
+
+```lua
+
+t7> help(torch.rand)
+
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
+torch.rand( [res,] m [, n, k, ...])        
+ y=torch.rand(n) returns a one-dimensional tensor of size n filled with 
+random numbers from a uniform distribution on the interval (0,1).
+ y=torch.rand(m,n) returns a mxn tensor of random numbers from a uniform 
+distribution on the interval (0,1).
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
+
+```
+
+An even more intuitive method is to use tab completion. Whenever any
+input is entered at the `torch` prompt, one can enter two
+consecutive `TAB` characters (`double TAB`) to get syntax
+completion. Moreover, entering `double TAB` at an open parenthesis
+also causes the help for that particular function to be printed.
+
+```lua
+
+t7> torch.randn( -- enter double TAB after (
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
+torch.randn( [res,] m [, n, k, ...])       
+ y=torch.randn(n) returns a one-dimensional tensor of size n filled with 
+random numbers from a normal distribution with mean zero and variance 
+one.
+ y=torch.randn(m,n) returns a mxn tensor of random numbers from a normal 
+distribution with mean zero and variance one.
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
+
+/  \  
+t7> torch.randn(
+
+```
+
+## Lua Basics ##
+
+Torch is entirely built around [Lua](http://www.lua.org/), so the first thing to do
+is acquire some basic knowledge of the language. The [online book](http://www.lua.org/docs.html)
+is great for that.
+
+Here I'll just summarize a couple of very basic things to get you started.
+
+### Variables ###
+
+Creating variables is straightforward, as Lua is a dynamically typed language. Printing a variable from the prompt is a bit unusual: you have to add the = sign before it:
+
+```lua
+t7> a = 10
+t7> print(a)
+10
+t7> = a
+10
+t7> b = a + 1
+t7> = b
+11
+```
+
+### Lua's universal data structure: the table ###
+
+The best thing about Lua is its consistency and compactness. The whole language relies on a single data structure, the table, which allows you to construct the most complex programs, with style!
+
+The Lua table can be used as:
+  * an array (with linearly stored values, of arbitrary types)
+  * a hash-table (with hashed key/value pairs)
+  * an object (which is just a hash-table)
+  * a namespace (any package in Lua is a simple table; in fact the global namespace, _G, is also a table)
+
+You already know enough about tables, let's hack around:
+
+```lua
+t7> t = {}
+t7> =t
+{}
+t7> t = {1,2,3,4}
+t7> =t
+{[1] = 1
+ [2] = 2
+ [3] = 3
+ [4] = 4}
+t7> = {1,2,3,'mixed types',true}
+{[1] = 1
+ [2] = 2
+ [3] = 3
+ [4] = string : "mixed types"
+ [5] = true}
+t7> t =  {4,3,2,1}
+```
+
+In the example above, we've shown how to use a table as a linear array. Lua is one-based, like Matlab, so if we try to get the length of this last array created, it'll be equal to the number of elements we've put in:
+
+```lua
+t7> =#t
+4
+```
+
+Ok, let's see about hash-tables now:
+
+```lua
+t7> h = {firstname='Paul', lastname='Eluard', age='117'}
+t7> =h
+{[firstname] = string : "Paul"
+ [lastname]  = string : "Eluard"
+ [age]       = string : "117"}
+```
+
+So now mixing arrays and hash-tables is easy:
+
+```lua
+t7> h = {firstname='Paul', lastname='Eluard', age='117', 1, 2, 3}
+t7> =h
+{[1]         = 1
+ [2]         = 2
+ [3]         = 3
+ [firstname] = string : "Paul"
+ [lastname]  = string : "Eluard"
+ [age]       = string : "117"}
+t7> 
+```
+
+Easy right?
+
+So we've seen a couple of basic types already: strings, numbers, tables, booleans (true/false). There's one last type in Lua: the function. Functions
+are first-class citizens in Lua, which means that they can be treated as regular variables. This is great, because it's the reason why we can construct very powerful data structures (such as objects) with tables:
+
+```lua
+t7> h = {firstname='Paul', lastname='Eluard', age='117',
+. >      print=function(self)
+. >               print(self.firstname .. ' ' .. self.lastname 
+. >                     .. ' (age: ' .. self.age .. ')')
+. >      end
+. >     }
+
+t7> =h
+{[firstname] = string : "Paul"
+ [print]     = function: 0x7f885d00c430
+ [lastname]  = string : "Eluard"
+ [age]       = string : "117"}
+```
+
+In the example above, we're basically storing a function at the key (hash) `print`. It's fairly straightforward: note that the function takes one argument, named `self`, which is assumed to be the object itself. The function simply concatenates the fields of the table `self` and prints the whole string.
+
+One important note: accessing fields of a table is done either using square brackets `[]`, or the `.` operator. The square brackets are more general: they allow the use of arbitrary strings as keys. In the following, we now try to access the elements of `h` that we just created:
+
+```lua
+t7> h. + TAB
+h.age        h.firstname  h.lastname   h.print(     
+
+t7> = h.print
+function: 0x7f885d00ec80
+
+t7> h.print(h)
+Paul Eluard (age: 117)
+
+t7> h:print()
+Paul Eluard (age: 117)
+```
+
+On the first line we type `h.` and then use TAB to complete and automatically explore the symbols present in `h`. We then print `h.print`, and confirm that it is indeed a function.
+
+At the next line, we call the function `h.print`, and pass `h` as the argument (which becomes `self` in the body of the function). This is fairly natural, but a bit heavy for manipulating objects. Lua provides a simple shortcut, `:`, the colon, which passes the parent table as the first argument: `h:print()` is strictly equivalent to `h.print(h)`.
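+
+The colon shortcut also works when defining functions: writing `function h:f()` implicitly adds a first parameter named `self`. A small sketch:
+
+```lua
+h = {firstname='Paul', lastname='Eluard'}
+
+-- 'function h:fullname()' is sugar for 'function h.fullname(self)'
+function h:fullname()
+   return self.firstname .. ' ' .. self.lastname
+end
+
+print(h:fullname())   -- equivalent to h.fullname(h)
+```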
+
+### Functions ###
+
+A few more things about functions: functions in Lua are proper closures, so in combination with tables, you can use them to build complex and very flexible programs. An example of closure is given here:
+
+```lua
+myfuncs = {}
+for i = 1,4 do
+    local calls = 0
+    myfuncs[i] = function()
+        calls = calls + 1
+        print('this function has been called ' .. calls .. ' times')
+    end
+end
+
+t7> myfuncs[1]()
+this function has been called 1 times
+t7> myfuncs[1]()
+this function has been called 2 times
+t7> myfuncs[4]()
+this function has been called 1 times
+t7> myfuncs[4]()
+this function has been called 2 times
+t7> myfuncs[1]()
+this function has been called 3 times
+```
+
+You can use such closures to create objects on the fly, that is, tables which combine functions and data to act upon. Thanks to closure, data can live in arbitrary locations (not necessarily the object's table), and simply be bound at runtime to the function's scope.
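+
+For instance, here is a sketch of a tiny counter "object" whose state lives entirely in a shared upvalue of the closures, not in the table itself:
+
+```lua
+local function newCounter()
+   local count = 0        -- private state captured by both closures below
+   return {
+      increment = function() count = count + 1 end,
+      current   = function() return count end,
+   }
+end
+
+local c = newCounter()
+c.increment()
+c.increment()
+print(c.current())        -- prints 2
+```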
+
+## Torch Basics: Playing with Tensors ##
+
+Ok, now we are ready to actually do something in Torch.  Lets start by
+constructing a vector, say a vector with 5 elements, and filling the
+i-th element with value i. Here's how:
+
+```lua
+t7> x=torch.Tensor(5)
+t7> for i=1,5 do x[i]=i; end
+t7> print(x)
+
+ 1
+ 2
+ 3
+ 4
+ 5
+[torch.DoubleTensor of dimension 5] 
+
+t7>
+```
+
+However, making use of Lua's powerful closures and the fact that functions
+are first-class citizens of the language, the same code can be written
+in a much nicer way:
+
+```lua
+t7> x=torch.Tensor(5)
+t7> i=0;x:apply(function() i=i+1;return i; end)
+t7> =x
+ 1
+ 2
+ 3
+ 4
+ 5
+[torch.DoubleTensor of dimension 5]
+
+t7> x:apply(function(x) return x^2; end)
+t7> =x
+  1
+  4
+  9
+ 16
+ 25
+[torch.DoubleTensor of dimension 5]
+
+t7> 
+```
+
+To make a matrix (2-dimensional Tensor), one simply does something
+like `x=torch.Tensor(5,5)` instead:
+
+```lua
+x=torch.Tensor(5,5)
+for i=1,5 do 
+ for j=1,5 do 
+   x[i][j]=math.random();
+ end
+end
+```
+
+Another way to do the same thing as the code above is provided by torch:
+
+```lua
+x=torch.rand(5,5)
+```
+
+The [torch](..:torch:maths) package contains a wide variety of commands 
+for manipulating Tensors that follow rather closely the equivalent
+Matlab commands. For example one can construct Tensors using the commands
+[ones](..:torch:maths#torch.ones), 
+[zeros](..:torch:maths#torch.zeros), 
+[rand](..:torch:maths#torch.rand),
+[randn](..:torch:maths#torch.randn) and
+[eye](..:torch:maths#torch.eye), amongst others.
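+
+For example (a short sketch, assuming the `torch` package is loaded as in the previous snippets):
+
+```lua
+a = torch.zeros(3)     -- vector of 3 zeros
+b = torch.ones(2,2)    -- 2x2 matrix of ones
+c = torch.eye(3)       -- 3x3 identity matrix
+print(c)
+```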
+
+Similarly, row or column-wise operations such as 
+[sum](..:torch:maths#torch.sum) and 
+[max](..:torch:maths#torch.max) are called in the same way:
+
+```lua
+t7> x1=torch.rand(5,5)
+t7> x2=torch.sum(x1,2); 
+t7> print(x2) 
+ 2.3450
+ 2.7099
+ 2.5044
+ 3.6897
+ 2.4089
+[torch.DoubleTensor of dimension 5x1]
+
+t7>
+```
+
+Naturally, many BLAS operations like matrix-matrix and matrix-vector products
+are implemented. We suggest installing the ATLAS or MKL libraries, since
+Torch7 can optionally take advantage of these very efficient, multi-threaded
+libraries if they are found on your system. Check out
+[Mathematical operations using tensors](..:torch:maths) for details.
+
+```lua
+
+t7> a=torch.ones(5,5)
+t7> b=torch.ones(5,2)
+t7> =a
+ 1  1  1  1  1
+ 1  1  1  1  1
+ 1  1  1  1  1
+ 1  1  1  1  1
+ 1  1  1  1  1
+[torch.DoubleTensor of dimension 5x5]
+
+t7> =b
+ 1  1
+ 1  1
+ 1  1
+ 1  1
+ 1  1
+[torch.DoubleTensor of dimension 5x2]
+
+t7> =torch.mm(a,b)
+ 5  5
+ 5  5
+ 5  5
+ 5  5
+ 5  5
+[torch.DoubleTensor of dimension 5x2]
+
+```
+
+## Types in Torch7 ##
+
+In Torch7, different types of tensors can be used. By default, all
+tensors are created using `double` type. `torch.Tensor` is a
+convenience call to `torch.DoubleTensor`. One can easily switch the
+default tensor type to other types, like `float`.
+
+```lua
+t7> =torch.Tensor()
+[torch.DoubleTensor with no dimension]
+t7> torch.setdefaulttensortype('torch.FloatTensor')
+t7> =torch.Tensor()
+[torch.FloatTensor with no dimension]
+```
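+
+Tensors of a specific type can also be created explicitly, and converted afterwards (a sketch using the standard Torch7 conversion methods `float()` and `double()`):
+
+```lua
+x = torch.FloatTensor(2,2):fill(1)  -- explicitly a float tensor
+y = x:double()                      -- converted copy as a double tensor
+print(y)
+```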
+
+## Saving code to files, running files ##
+
+Before we go any further, let's just review one basic thing: saving code to files, and executing them.
+
+As Torch relies on Lua, it's best to give all your files a .lua extension. Let's generate a Lua file that contains some Lua code, and then execute it:
+
+```lua
+$ echo "print('Hello World\!')" > helloworld.lua
+...
+
+$ torch helloworld.lua
+...
+Hello World!
+
+$ torch
+...
+t7> dofile 'helloworld.lua'
+Hello World!
+```
+
+That's it: you can either run programs from your shell, or from the Torch prompt. You can also run programs from the shell, and get an interactive prompt whenever an error occurs, or when the program terminates (good for debugging):
+
+```lua
+$ torch -i helloworld.lua
+...
+Hello World!
+t7>
+```
+
+We're good with all the basic things: you now know how to run code, from files or from the prompt, and write basic Lua (which is almost all Lua is!).
+
+## Example: training a neural network ##
+
+We will show now how to train a neural network using the [nn](..:nn:index) package
+available in Torch.
+
+### Torch basics: building a dataset using Lua tables ###
+
+In general, users have the freedom to create any kind of structure they
+want for dealing with data.
+
+For example, training a neural network in Torch is achieved easily by
+performing a loop over the data, and forwarding/backwarding tensors
+through the network. Then, the way the dataset is built is left to the
+user's creativity.
+
+
+However, if you want to use some convenience classes, like
+[StochasticGradient](..:nn:index#nn.StochasticGradient), which basically
+does the training loop for you, you have to follow the dataset
+convention of these classes.  (We will discuss manual training of a
+network, where one does not use these convenience classes, in a later
+section.)
+
+StochasticGradient expects as a `dataset` an object which implements
+the operator `dataset[index]` and implements the method
+`dataset:size()`. The `size()` method returns the number of
+examples and `dataset[i]` has to return the i-th example.
+
+An `example` has to be an object which implements the operator
+`example[field]`, where `field` often takes the value `1` (for
+input features) or `2` (for corresponding labels), i.e an example is
+a pair of input and output objects.  The input is usually a Tensor
+(exception: if you use special kind of gradient modules, like
+[table layers](..:nn:index#nn.TableLayers)). The label type depends
+on the criterion. For example, the
+[MSECriterion](..:nn:index#nn.MSECriterion) expects a Tensor, but the
+[ClassNLLCriterion](..:nn:index#nn.ClassNLLCriterion) expects an
+integer (the class).
+
+Such a dataset is easily constructed using Lua tables, but it could be any object
+as long as the required operators/methods are implemented.
+
+Here is an example of making a dataset for an XOR type problem:
+```lua
+dataset={};
+function dataset:size() return 100 end -- 100 examples
+for i=1,dataset:size() do 
+	local input= torch.randn(2);     --normally distributed example in 2d
+	local output= torch.Tensor(1);
+	if input[1]*input[2]>0 then    --calculate label for XOR function
+		output[1]=-1;
+	else
+		output[1]=1;
+	end
+	dataset[i] = {input, output};
+end
+```
+
+### Torch basics: building a neural network ###
+
+To train a neural network we first need some data.  We can use the XOR data
+we just generated in the section before.  Now all that remains is to define
+our network architecture, and train it.
+
+To use Neural Networks in Torch you have to require the 
+[nn](..:nn:index) package. 
+A classical feed-forward network is created with the `Sequential` object:
+```lua
+require "nn"
+mlp=nn.Sequential();  -- make a multi-layer perceptron
+```
+
+To build the layers of the network, you simply add the Torch objects 
+corresponding to those layers to the _mlp_ variable created above.
+
+The two basic objects you might be interested in first are the 
+[Linear](..:nn:index#nn.Linear) and 
+[Tanh](..:nn:index#nn.Tanh) layers.
+The Linear layer is created with two parameters: the number of input
+dimensions, and the number of output dimensions. 
+So making a classical feed-forward neural network with one hidden layer with 
+_HUs_ hidden units is as follows:
+```lua
+require "nn"
+mlp=nn.Sequential();  -- make a multi-layer perceptron
+inputs=2; outputs=1; HUs=20;
+mlp:add(nn.Linear(inputs,HUs))
+mlp:add(nn.Tanh())
+mlp:add(nn.Linear(HUs,outputs))
+```
+
+
+### Torch basics: training a neural network ###
+
+Now we're ready to train.
+This is done with the following code:
+```lua
+criterion = nn.MSECriterion()  
+trainer = nn.StochasticGradient(mlp, criterion)
+trainer.learningRate = 0.01
+trainer:train(dataset)
+```
+
+You should see printed on the screen something like this:
+```lua
+# StochasticGradient: training
+# current error = 0.94550937745458
+# current error = 0.83996744568527
+# current error = 0.70880093908742
+# current error = 0.58663679932706
+# current error = 0.49190661630473
+[..snip..]
+# current error = 0.34533844015756
+# current error = 0.344305927029
+# current error = 0.34321901952818
+# current error = 0.34206793525954
+# StochasticGradient: you have reached the maximum number of iterations
+```
+
+Some other options of the _trainer_ you might be interested in are for example:
+```lua
+trainer.maxIteration = 10
+trainer.shuffleIndices = false
+```
+See the nn package description of the
+[StochasticGradient](..:nn:index#nn.StochasticGradient) object
+for more details.
+
+
+### Torch basics: testing your neural network ###
+
+To test your network on a single example you can do this:
+```lua
+x=torch.Tensor(2);   -- create a test example Tensor
+x[1]=0.5; x[2]=-0.5; -- set its values
+pred=mlp:forward(x)  -- get the prediction of the mlp 
+print(pred)          -- print it 
+```
+
+You should see that your network has learned XOR:
+```lua
+t7> x=torch.Tensor(2); x[1]=0.5; x[2]=0.5; print(mlp:forward(x))
+-0.5886
+[torch.DoubleTensor of dimension 1]
+
+t7> x=torch.Tensor(2); x[1]=-0.5; x[2]=0.5; print(mlp:forward(x))
+ 0.9261
+[torch.DoubleTensor of dimension 1]
+
+t7> x=torch.Tensor(2); x[1]=0.5; x[2]=-0.5; print(mlp:forward(x))
+ 0.7913
+[torch.DoubleTensor of dimension 1]
+
+t7> x=torch.Tensor(2); x[1]=-0.5; x[2]=-0.5; print(mlp:forward(x))
+-0.5576
+[torch.DoubleTensor of dimension 1]
+```
+
+### Manual Training of a Neural Network ###
+
+Instead of using the [StochasticGradient](..:nn:index#nn.StochasticGradient) class
+you can directly make the forward and backward calls on the network yourself.
+This gives you greater flexibility.
+In the following code example we create the same XOR data on the fly
+and train each example online.
+
+```lua
+criterion = nn.MSECriterion()  
+mlp=nn.Sequential();  -- make a multi-layer perceptron
+inputs=2; outputs=1; HUs=20;
+mlp:add(nn.Linear(inputs,HUs))
+mlp:add(nn.Tanh())
+mlp:add(nn.Linear(HUs,outputs))
+
+for i = 1,2500 do
+  -- random sample
+  local input= torch.randn(2);     -- normally distributed example in 2d
+  local output= torch.Tensor(1);
+  if input[1]*input[2] > 0 then  -- calculate label for XOR function
+    output[1] = -1
+  else
+    output[1] = 1
+  end
+
+  -- feed it to the neural network and the criterion
+  prediction = mlp:forward(input)
+  criterion:forward(prediction, output)
+
+  -- train over this example in 3 steps
+
+  -- (1) zero the accumulation of the gradients
+  mlp:zeroGradParameters()
+
+  -- (2) accumulate gradients
+  criterion_gradient = criterion:backward(prediction, output)
+  mlp:backward(input, criterion_gradient)
+
+  -- (3) update parameters with a 0.01 learning rate
+  mlp:updateParameters(0.01)
+end
+```
+
+Super!
+
+## Concluding remarks / going further ##
+
+That's the end of this tutorial, but not the end of what you have left
+to discover of Torch! To explore more of Torch, you should take a look
+at the [Torch package help](..:index) which has been linked to
+throughout this tutorial every time we have mentioned one of the basic
+Torch object types.  The Torch library reference manual is available
+[here](..:index) and the external torch packages installed on your
+system can be viewed [here](..:torch:index).
+
+We've also compiled a couple of demonstrations and tutorial scripts
+that show how to train more complex models, build GUI-based
+demos, and so on. All of these can be found in
+[this repo](http://github.com/andresy/torch-demos).
+
+Good luck and have fun!
+
diff --git a/init.lua b/init.lua
new file mode 100644
index 0000000..8d235e9
--- /dev/null
+++ b/init.lua
@@ -0,0 +1,10 @@
+dok = {}
+
+require 'dok.inline'
+
+local ok,sd = pcall(require, 'sundown')
+if ok then
+    dok.markdown2html = sd.render
+else
+    dok.markdown2html = function() return '<p> Error: Sundown could not be loaded </p>' end
+end
diff --git a/inline.lua b/inline.lua
new file mode 100644
index 0000000..dce34ca
--- /dev/null
+++ b/inline.lua
@@ -0,0 +1,739 @@
+--------------------------------------------------------------------------------
+-- inline help
+-- this file defines all the tools and goodies to generate inline help
+--------------------------------------------------------------------------------
+local function splitpackagepath()
+   local str = package.path
+   local t = {} 
+   local last_end = 1
+   local s, e, cap = string.find(str, "(.-);", 1)
+   while s do
+      if s ~= 1 or cap ~= "" then
+	 table.insert(t,cap)
+      end
+      last_end = e+1
+      s, e, cap = string.find(str, "(.-);", last_end)
+   end
+   if last_end <= string.len(str) then
+      cap = string.sub(str, last_end)
+      table.insert(t, cap)
+   end
+   -- get package prefixes
+   for i=1,#t do
+      local s,p = t[i]:find('?')
+      if s then t[i] = t[i]:sub(1,s-1) end
+   end
+   -- remove duplicates
+   local t2 = {}
+   for i=1,#t do
+      local exists = false;
+      for j=1,#t2 do if t[i] == t2[j] then 
+	    exists = true end; 
+      end
+      if not exists then table.insert(t2, t[i]) end
+   end
+   return t2
+end
+local mdsearchpaths = splitpackagepath()
+
+local knownpkg = {}
+
+-- Lua 5.2 compatibility
+local unpack = unpack or table.unpack
+local loadstring = loadstring or load
+
+dok.inline = {}
+
+dok.colors = {
+   none = '\27[0m',
+   black = '\27[0;30m',
+   red = '\27[0;31m',
+   green = '\27[0;32m',
+   yellow = '\27[0;33m',
+   blue = '\27[0;34m',
+   magenta = '\27[0;35m',
+   cyan = '\27[0;36m',
+   white = '\27[0;37m',
+   Black = '\27[1;30m',
+   Red = '\27[1;31m',
+   Green = '\27[1;32m',
+   Yellow = '\27[1;33m',
+   Blue = '\27[1;34m',
+   Magenta = '\27[1;35m',
+   Cyan = '\27[1;36m',
+   White = '\27[1;37m',
+   _black = '\27[40m',
+   _red = '\27[41m',
+   _green = '\27[42m',
+   _yellow = '\27[43m',
+   _blue = '\27[44m',
+   _magenta = '\27[45m',
+   _cyan = '\27[46m',
+   _white = '\27[47m'
+}
+local c = dok.colors
+
+local style = {}
+function dok.usecolors()
+   style = {
+      banner = '+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++',
+      list = c.blue .. '> ' .. c.none,
+      title = c.Magenta,
+      pre = c.cyan,
+      em = c.Black,
+      bold = c.Black,
+      img = c.red,
+      link = c.red,
+      code = c.green,
+      error = c.Red,
+      none = c.none
+   }
+end
+function dok.dontusecolors()
+   style = {
+      banner = '+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++',
+      list = '> ',
+      title = '',
+      pre = '',
+      em = '',
+      bold = '',
+      img = '',
+      link = '',
+      code = '',
+      error = '',
+      none = ''
+   }
+end
+dok.usecolors()
+
+local function uncleanText(txt)
+   txt = txt:gsub('&#39;', "'")
+   txt = txt:gsub('&#42;', '%*')
+   txt = txt:gsub('&#43;', '%+')
+   txt = txt:gsub('&lt;', '<')
+   txt = txt:gsub('&gt;', '>')
+   return txt
+end
+
+local function string2symbol(str)
+   local str  = str:gsub(':','.')
+   local ok, res = pcall(loadstring('local t = ' .. str .. '; return t'))
+   if not ok then
+      ok, res = pcall(loadstring('local t = _torchimport.' .. str .. '; return t'))
+   end
+   return res
+end
+
+local function maxcols(str, cols)
+   cols = cols or 70
+   local res = ''
+   local k = 1
+   local color = false
+   for i = 1,#str do
+      res = res .. str:sub(i,i)
+      if str:sub(i,i) == '\27' then
+         color = true
+      elseif str:sub(i,i) == 'm' then
+         color = false
+      end
+      if k == cols then
+         if str:sub(i,i) == ' ' then
+            res = res .. '\n'
+            k = 1
+         end
+      elseif not color then
+         k = k + 1
+      end
+      if str:sub(i,i) == '\n' then
+         k = 1
+      end
+   end
+   return res
+end
+
+function dok.stylize(html, package)
+   local styled = html
+   -- (0) useless white space
+   styled = styled:gsub('^%s+','')
+   -- (1) function title
+   styled = styled:gsub('<h%d>(.-)</h%d>', function(title) return style.title .. title .. style.none .. '\n' end)
+   -- (2) lists
+   styled = styled:gsub('<ul>(.-)</ul>', 
+			function(list) 
+			   return list:gsub('<li>%s*(.-)%s*</li>%s*', style.list .. '%1\n')
+			end
+   )
+   -- (3) code
+   styled = styled:gsub('%s*<code>%s*(.-)%s*</code>%s*', style.code .. ' %1 ' .. style.none)
+   styled = styled:gsub('%s*<code class%="%S-">%s*(.-)%s*</code>%s*', style.pre .. ' %1 ' .. style.none)
+
+   -- (4) pre
+   styled = styled:gsub('<pre.->(.-)</pre>', style.pre .. '%1' .. style.none)
+
+   -- (5) formatting
+   styled = styled:gsub('<em>(.-)</em>', style.em .. '%1' .. style.none)
+   styled = styled:gsub('<b>(.-)</b>', style.bold .. '%1' .. style.none)
+   styled = styled:gsub('<strong>(.-)</strong>', style.bold .. '%1' .. style.none)
+   styled = styled:gsub('//(.-)//', style.bold .. '%1' .. style.none)
+
+   -- (6) links
+   styled = styled:gsub('<a.->(.-)</a>', style.none .. '%1' .. style.none)
+   -- (7) images
+   styled = styled:gsub('<img.-src="(.-)".->%s*', 
+			style.img .. 'image: file://' 
+			   .. paths.concat(package,'%1') -- OUCH DEBUG paths.install_dokmedia,
+			   .. style.none .. '\n')
+   -- (8) remove internal anchors
+   styled = styled:gsub('<a(.-)/>', '')
+   -- (-) paragraphs
+   styled = styled:gsub('<p>', '\n')
+   styled = styled:gsub('</p>', '')
+
+   -- (-) special chars
+   styled = uncleanText(styled)
+   -- (-) max columns
+   styled = maxcols(styled)
+   -- (-) conclude
+   styled = styled:gsub('%s*$','')
+
+   return styled
+end
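The stylizer above is a chain of plain `string.gsub` calls that rewrite HTML tag pairs into terminal escape sequences. A minimal, self-contained sketch of that idea (`stylize_bold` is an illustrative name, not part of dok):

```lua
-- Sketch of the gsub-based styling used by dok.stylize: a tag pair is
-- rewritten to ANSI escapes wrapped around the captured text.
local bold, none = '\27[1m', '\27[0m'

local function stylize_bold(html)
   -- '%1' in the replacement re-inserts the text captured between the tags;
   -- the extra parentheses drop gsub's second return value (the match count)
   return (html:gsub('<b>(.-)</b>', bold .. '%1' .. none))
end

print(stylize_bold('<b>hello</b> world'))
```

The lazy `(.-)` capture is what keeps each replacement scoped to one tag pair instead of greedily spanning the whole line.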
+
+local function adddok(...)
+   local tt = {}
+   local arg = {...}
+   for i=1,#arg do
+      table.insert(tt,arg[i])
+   end
+   return table.concat(tt,'\n')
+end
+
+--[[
+   function dok.html2funcs(html, package)
+
+   How this function works:
+   It initializes the section tree with a level-0 root (as if there were an
+   <h0> heading). No HTML section can sit above this, so it acts as the root
+   node.
+
+   It first looks for an anchor, e.g.:
+      <a name="nn.dok"/>
+   When it finds one, it records the anchor's name, here nn.dok.
+
+   It also looks for headings, e.g.:
+      <h1>Neural Network Package</h1>
+   The level of the heading is given by the digit in <h[number]>.
+
+   Any plain lines parsed after the anchor (<a>) and before the next heading
+   (<h1>) are accumulated and stored in the current section (csection) as a
+   single concatenated string.
+
+   The heading level is a number. If the new level is lower than or equal to
+   the current section's level (i.e. higher up the tree, towards the root),
+   the function walks up the tree until it finds a parent exactly one level
+   above, so the new node can be inserted at the right depth of the tree.
+
+   It then creates a subsection (named after the anchor and heading) and
+   inserts it into the tree at that point.
+
+   Parsing continues until the next anchor or heading is seen; at that point
+   the subsection is considered fully traversed and sealed.
+
+   Finally, it builds a table of functions keyed by anchor name, whose values
+   are the parsed (and stylized) lines. A small check ensures that each
+   anchor name starts with the package name.
+]]--
+function dok.html2funcs(html, package)
+   local sections = {level=0}
+   local csection = sections
+   local canchor
+   local lines = {}
+   for line in html:gmatch('[^\n\r]+') do
+      local anchor = line:match('<a.-name=["\'](.-)["\']/>') 
+         or line:match('<a.-name=["\'](.-)["\']>.-</a>')
+      local level, name = line:match('<h(%d)>(.*)</h%d>')
+      if anchor then
+         canchor = anchor
+      elseif level and name then
+         if #lines > 0 then
+            table.insert(csection, table.concat(lines, '\n'))
+            lines = {}
+         end
+
+         level = tonumber(level)
+         if level <= csection.level then
+            while level <= csection.level do
+               csection = csection.parent
+            end
+         end
+
+         local subsection = {level=level, parent=csection, name=name, anchor=canchor}
+         table.insert(csection, subsection)
+         csection = subsection
+
+      elseif line:match('^%s+$') then
+      else
+         canchor = nil
+         table.insert(lines, line)
+      end
+   end
+
+   -- deal with remaining lines
+   if #lines > 0 then
+      table.insert(csection, table.concat(lines, '\n'))
+      lines = {}
+   end
+   
+   local function printsection(section, txt)
+      if section.level > 0 and section.name then
+         table.insert(txt, string.format('<h%d>%s</h%d>', section.level, section.name, section.level))
+      end
+      if section.anchor then
+      end
+      for i=1,#section do
+         if type(section[i]) == 'string' then -- plain text, NOT a subsection; sub-sections are not included here
+            table.insert(txt, section[i])
+         end
+      end
+   end
+
+   local funcs = {}
+
+   local function traversesection(section)
+      if section.anchor then
+         local txt = {}
+         local key = string.lower(section.anchor):match(package .. '%.(.*)')
+         if key then
+            printsection(section, txt)
+            txt = table.concat(txt, '\n')
+            funcs[key] = adddok(funcs[key], dok.stylize(txt, package))
+         end
+      end
+      for i=1,#section do
+         if type(section[i]) ~= 'string' then
+            traversesection(section[i])
+         end
+      end
+   end
+   traversesection(sections)
+
+   return funcs
+end
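The two Lua patterns driving html2funcs can be checked in isolation, using the examples from the comment above (one self-closing anchor and one heading; note that `match` returns the heading level as a string):

```lua
-- The anchor and heading patterns from html2funcs, applied to the
-- example lines given in its doc comment.
local line1 = '<a name="nn.dok"/>'
local line2 = '<h1>Neural Network Package</h1>'

-- ["\'] accepts either quote style; (.-) lazily captures the anchor name
local anchor = line1:match('<a.-name=["\'](.-)["\']/>')
-- <h(%d)> captures the level digit, (.*) the heading text
local level, name = line2:match('<h(%d)>(.*)</h%d>')

print(anchor)      -- nn.dok
print(level, name) -- 1	Neural Network Package
```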
+
+local function packageiterator()
+   local co = coroutine.create(
+      function()
+         local trees = mdsearchpaths
+	 for _,tree in ipairs(trees) do
+	    for file in paths.files(tree) do
+	       if file ~= '.' and file ~= '..' then
+		  coroutine.yield(file, paths.concat(tree, file))
+	       end
+	    end
+         end
+   end)
+
+   return function()
+      local code, res1, res2 = coroutine.resume(co)
+      return res1, res2
+   end
+end
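Both packageiterator() and mditerator() below use the same coroutine-as-iterator pattern: a coroutine yields values, and a closure resuming it serves as a generic-for iterator. Reduced to a self-contained example (`range` is illustrative, not part of dok):

```lua
-- The coroutine-as-iterator pattern used by packageiterator()/mditerator().
local function range(n)
   local co = coroutine.create(function()
      for i = 1, n do
         coroutine.yield(i)
      end
   end)
   -- each call resumes the coroutine and returns its next yielded value;
   -- once the coroutine finishes, resume yields no value (nil), which
   -- terminates a generic for loop
   return function()
      local ok, v = coroutine.resume(co)
      return v
   end
end

local sum = 0
for i in range(3) do
   sum = sum + i
end
print(sum) -- 6
```

Note that the iterator functions in this file discard `resume`'s status flag the same way, so an error inside the coroutine silently ends iteration rather than propagating.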
+
+local function mditerator(path)
+   local co = coroutine.create(
+      function()
+	 local function iterate(path)
+	    if path == '.' or path == '..' then
+	    elseif paths.filep(path) then
+	       if path:match('%.md$') then
+		  coroutine.yield(path)
+	       end
+	    else
+	       for file in paths.files(path) do
+		  if file ~= '.' and file ~= '..' then
+		     iterate(paths.concat(path, file))
+		  end
+	       end
+	    end
+	 end
+	 iterate(path)
+   end)
+
+   return function()
+      local code, res = coroutine.resume(co)
+      return res
+   end
+
+end
+
+function dok.refresh()
+   for pkgname, path in packageiterator() do
+      local pkgtbl = _G[pkgname] or package.loaded[pkgname]
+      if pkgtbl and not knownpkg[pkgname] then
+         knownpkg[pkgname] = true
+         for file in mditerator(path) do
+            local f = io.open(file)
+            if f then
+               local content = f:read('*all')
+               local html = dok.markdown2html(content)
+               local funcs = dok.html2funcs(html, pkgname)
+               if type(pkgtbl) ~= 'table' and _G._torchimport then 
+                  -- unsafe import, use protected import
+                  pkgtbl = _G._torchimport[pkgname]
+               end
+               if pkgtbl and type(pkgtbl) == 'table' then
+                  -- level 0: the package itself
+                  dok.inline[pkgtbl] = dok.inline[pkgtbl] or funcs['dok'] 
+		     or funcs['reference.dok'] or funcs['overview.dok']
+                  -- next levels
+                  for key,symb in pairs(pkgtbl) do
+                     -- level 1: global functions and objects
+                     local entry = (key):lower()
+                     if funcs[entry] or funcs[entry..'.dok'] then
+                        local sym = string2symbol(pkgname .. '.' .. key)
+                        dok.inline[sym] = adddok(funcs[entry..'.dok'],funcs[entry])
+                     end
+                     -- level 2: objects' methods
+                     if type(pkgtbl[key]) == 'table' then
+                        local entries = {}
+                        for k,v in pairs(pkgtbl[key]) do
+                           entries[k] = v
+                        end
+                        local mt = getmetatable(pkgtbl[key]) or {}
+                        for k,v in pairs(mt) do
+                           entries[k] = v
+                        end
+                        for subkey,subsymb in pairs(entries) do
+                           local entry = (key .. '.' .. subkey):lower()
+                           if funcs[entry] or funcs[entry..'.dok'] then
+                              local sym = string2symbol(pkgname .. '.' .. key .. '.' .. subkey)
+                              dok.inline[sym] = adddok(funcs[entry..'.dok'],funcs[entry])
+                           end
+                        end
+                     end
+                  end
+               end
+            end
+         end
+      end
+   end
+end
+
+--------------------------------------------------------------------------------
+-- help() is the main user-side function: prints help for any given
+-- symbol that has an anchor defined in a .dok file.
+--------------------------------------------------------------------------------
+function dok.help(symbol, asstring)
+   -- color detect
+   if qtide then
+      dok.dontusecolors()
+   else
+      dok.usecolors()
+   end
+   -- no symbol? global help
+   if not symbol then
+      print(style.banner)
+      print(style.title .. 'help(symbol)' .. style.none 
+	       .. '\n\nget inline help on a specific symbol\n'
+	       .. '\nto browse the complete html documentation, call: '
+	       .. style.title .. 'browse()' .. style.none)
+      print(style.banner)
+      return
+   end
+   -- always refresh (takes time, but ensures that
+   -- we generate help for all packages loaded)
+   dok.refresh()
+   if type(symbol) == 'string' then
+      symbol = string2symbol(symbol)
+   end
+   local inline = dok.inline[symbol]
+   if asstring then
+      return inline
+   else
+      if inline then
+         print(style.banner)
+         print(inline)
+         print(style.banner)
+      else
+	 print('undocumented symbol')
+      end
+   end
+end
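help() resolves a dotted symbol string to the actual Lua value via string2symbol, which compiles and evaluates the string. A minimal sketch of that resolution step (`resolve` is an illustrative name; the `loadstring or load` fallback covers both Lua 5.1 and 5.2+):

```lua
-- Sketch of the symbol resolution behind help(): a dotted string such as
-- 'math.sqrt' is compiled into a chunk and evaluated to reach the value.
local load = loadstring or load  -- loadstring on Lua 5.1, load on 5.2+

local function resolve(str)
   local chunk = load('local t = ' .. str .. '; return t')
   if not chunk then return nil end     -- string did not compile
   local ok, res = pcall(chunk)         -- evaluate safely
   if ok then return res end
end

print(resolve('math.sqrt') == math.sqrt) -- true
```

The real string2symbol additionally rewrites `:` to `.` and retries the lookup under the protected `_torchimport` table.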
+
+rawset(_G, 'help', dok.help)
+
+-- Parses a package's markdown files differently than help() does:
+-- it returns the package's markdown file names (without extension) along with their headings.
+local function package_headings(pname)
+   local out = {}
+   for pkgname, path in packageiterator() do
+      if pkgname == pname then
+	 local pkgtbl = _G[pkgname] or package.loaded[pkgname]
+	 if pkgtbl then
+	    for file in mditerator(path) do
+	       local basename = paths.basename(file)
+	       basename = basename:sub(1,#basename-3) -- remove .md
+	       local f = io.open(file)
+	       if f then
+		  local heading = 'Untitled'
+		  local content = f:read('*all')
+		  local html = dok.markdown2html(content)
+		  -- find first heading
+		  for line in html:gmatch('[^\n\r]+') do
+		     local level, name = line:match('<h(%d)>(.*)</h%d>')
+		     if level and name then
+			heading = name
+			break -- found the first heading, call it a day
+		     end
+		  end
+		  f:close()
+		  out[basename] = {heading, file}
+	       end -- if f then
+	    end -- for file in mditerator
+	 end -- if pkgtbl
+	 break;
+      end -- if pkgname == pname      
+   end -- for pkgname, path in packageiterator()
+   return out
+end
+
+--------------------------------------------------------------------------------
+-- browse() is a function that triggers a manual viewer in command-line
+--------------------------------------------------------------------------------
+function dok.browse(package_name)
+   print(style.banner)
+   print('Inline Package Manual Browser')
+   print(style.banner)
+   print('To exit at any time, type q or exit')
+   if not package_name then
+      print('Enter package name to browse (example: torch):')
+      local answer = io.read()
+      package_name = answer
+   end
+   if package_name == 'exit' then return end
+   local ok, p = pcall(function() require(package_name) end)
+   if not ok then
+      print('Package not found or failed to load:', package_name)
+      return
+   end
+
+   dok.help(package_name)
+   -- Approach 1:
+   -- show all available anchors from this dok and ask user to enter page.
+
+   -- Approach 2:
+   -- show file-by-file. After Overview, traverse Each file, and get the file heading. Print it out.
+   -- Ask user to enter next file they want to traverse-to. (in brackets you can give them hints).
+
+   -- Let's hack Approach 2.
+   local headings = package_headings(package_name)
+   
+   print('Sections:')
+   local example
+   local menu = ''
+   for k,v in pairs(headings) do
+      menu = menu .. '\t' .. v[1] .. ' [' .. k .. ']\n'
+      example = k
+   end
+   assert(example, 'No subsections found for this package')
+   while true do
+      print(menu)
+      print('Enter choice (Example: ' .. example .. ') or type exit. :')
+      local k = io.read()
+      if k == 'exit'  then return end
+      if headings[k] then
+	 local file = headings[k][2]
+	 local f = io.open(file)
+	 if f then
+	    local content = f:read('*all')
+	    local sd = require 'sundown'
+	    local str = sd.renderASCII(content)
+	    local tracker = 0
+	    f:close()
+	    local max_chars = 1000 -- now print this string conservatively, maybe 1000 characters at a time	    
+	    while tracker <= #str do
+	       print(str:sub(tracker, tracker + max_chars))
+	       print('[enter] for more or press any character to exit')
+	       local ans = io.read()
+	       if ans ~= ''  then break; end
+	       tracker = tracker + max_chars
+	    end
+	 end
+      end
+   end
+   
+end
+
+rawset(_G, 'browse', dok.browse)
+
+--------------------------------------------------------------------------------
+-- standard usage function: used to display automated help for functions
+--
+-- @param funcname     function name
+-- @param description  description of the function
+-- @param example      usage example
+-- @param ...          [optional] arguments
+--------------------------------------------------------------------------------
+function dok.usage(funcname, description, example, ...)
+   local str = ''
+
+   local help = help(string2symbol(funcname), true)
+   if help then
+      str = str .. help
+   else
+      str = str .. style.banner .. '\n'
+      str = str .. style.title .. funcname .. style.none .. '\n'
+      if description then
+         str = str .. '\n' .. description .. '\n'
+      end
+   end
+
+   str = str .. '\n' .. style.list .. 'usage:\n' .. style.pre
+
+   -- named arguments:
+   local args = {...}
+   if type(args[1]) == 'table' and args[1].tabled then
+      args = args[1].tabled
+   end
+   if type(args[1]) == 'table' and args[1].arg then
+      str = str .. funcname .. '{\n'
+      for i,param in ipairs(args) do
+         local key
+         if param.req then
+            key = '    ' .. param.arg .. ' = ' .. param.type
+         else
+            key = '    [' .. param.arg .. ' = ' .. param.type .. ']'
+         end
+         -- align:
+         while key:len() < 40 do
+            key = key .. ' '
+         end
+         str = str .. key .. '-- ' .. param.help 
+         if param.default or param.default == false then
+            str = str .. '  [default = ' .. tostring(param.default) .. ']'
+         elseif param.defaulta then
+            str = str .. '  [default == ' .. param.defaulta .. ']'
+         end
+         str = str.. '\n'
+      end
+      str = str .. '}\n'
+
+      -- unnamed args:
+   else
+      local idx = 1
+      while true do
+         local param
+         str = str .. funcname .. '(\n'
+         while true do
+            param = args[idx]
+            idx = idx + 1
+            if not param or param == '' then break end
+            local key
+            if param.req then
+               key = '    ' .. param.type
+            else
+               key = '    [' .. param.type .. ']'
+            end
+            -- align:
+            while key:len() < 40 do
+               key = key .. ' '
+            end
+            str = str .. key .. '-- ' .. param.help .. '\n'
+         end
+         str = str .. ')\n'
+         if not param then break end
+      end
+   end
+   str = str .. style.none
+
+   if example then
+      str = str .. '\n' .. style.pre .. example .. style.none .. '\n'
+   end
+
+   str = str .. style.banner
+   return str
+end
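dok.usage consumes argument-descriptor tables whose field names appear in the code above (arg, type, help, req, default, defaulta). A small sketch of the descriptor shape and the 40-column alignment it applies (`render` is an illustrative name):

```lua
-- The argument-descriptor shape consumed by dok.usage / dok.unpack.
local param = {arg = 'size', type = 'number', help = 'tensor size', req = true}

-- Rendering sketch: required args print as "name = type", optional ones in
-- square brackets; the key is padded to column 40 before the help comment.
local function render(p)
   local key
   if p.req then
      key = '    ' .. p.arg .. ' = ' .. p.type
   else
      key = '    [' .. p.arg .. ' = ' .. p.type .. ']'
   end
   while key:len() < 40 do
      key = key .. ' '
   end
   return key .. '-- ' .. p.help
end

print(render(param))
```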
+
+--------------------------------------------------------------------------------
+-- standard argument function: used to handle named arguments, and 
+-- display automated help for functions
+--------------------------------------------------------------------------------
+function dok.unpack(args, funcname, description, ...)
+   -- put args in table
+   local defs = {...}
+
+   -- generate usage string as a closure:
+   -- this way the function only gets called when an error occurs
+   local fusage = function() 
+      local example
+      if #defs > 1 then
+	 example = funcname .. '{' .. defs[2].arg .. '=' .. defs[2].type .. ', '
+	    .. defs[1].arg .. '=' .. defs[1].type .. '}\n'
+	 example = example .. funcname .. '(' .. defs[1].type .. ',' .. ' ...)'
+      end
+      return dok.usage(funcname, description, example, {tabled=defs})
+   end
+   local usage = {}
+   setmetatable(usage, {__tostring=fusage})
+
+   -- get args
+   local iargs = {}
+   if args.__printhelp then 
+      print(usage)
+      error('error')
+   elseif #args == 1 and type(args[1]) == 'table' and #args[1] == 0 
+   and not (torch and torch.typename(args[1]) ~= nil) then
+      -- named args
+      iargs = args[1]
+   else
+      -- ordered args
+      for i = 1,select('#',...) do
+         iargs[defs[i].arg] = args[i]
+      end
+   end
+
+   -- check/set arguments
+   local dargs = {}
+   for i = 1,#defs do
+      local def = defs[i]
+      -- is value requested ?
+      if def.req and iargs[def.arg] == nil then
+         print(style.error .. 'missing argument: ' .. def.arg .. style.none)
+         print(usage)
+         error('error')
+      end
+      -- get value or default
+      dargs[def.arg] = iargs[def.arg]
+      if dargs[def.arg] == nil then
+         dargs[def.arg] = def.default
+      end
+      if dargs[def.arg] == nil and def.defaulta then
+         dargs[def.arg] = dargs[def.defaulta]
+      end
+      dargs[i] = dargs[def.arg]
+   end
+
+   -- return usage too
+   dargs.usage = usage
+
+   -- stupid lua bug: we return all args by hand
+   if dargs[65] then
+      error('<dok.unpack> oops, cannot deal with more than 64 arguments :-)')
+   end
+
+   -- return modified args
+   return dargs,unpack(dargs)
+end
+
+--------------------------------------------------------------------------------
+-- prints an error with nice formatting. If domain is provided, the message
+-- is prefixed as follows: <domain> msg
+--------------------------------------------------------------------------------
+function dok.error(message, domain)
+   if domain then
+      message = '<' .. domain .. '> ' .. message
+   end
+   local col_msg = style.error .. tostring(message) .. style.none
+   error(col_msg)
+end
diff --git a/rocks/dok-scm-1.rockspec b/rocks/dok-scm-1.rockspec
new file mode 100644
index 0000000..79e68c3
--- /dev/null
+++ b/rocks/dok-scm-1.rockspec
@@ -0,0 +1,32 @@
+package = "dok"
+version = "scm-1"
+
+source = {
+   url = "git://github.com/torch/dok.git",
+}
+
+description = {
+   summary = "Support for the old torch7 dok system",
+   detailed = [[
+   ]],
+   homepage = "https://github.com/torch/dok",
+   license = "BSD"
+}
+
+dependencies = {
+   "lua >= 5.1",
+   "sundown >= 1.0"
+}
+
+build = {
+   type = "builtin",
+   modules = {
+      ["dok.init"] = "init.lua",
+      ["dok.inline"] = "inline.lua",
+   },
+   install = {
+      lua = {
+--         ["dok.README"] = "README.md"
+      }
+   }
+}
