[ic3-autosubmit] 01/06: Upstream 3.0.0rc8

Alastair McKinstry mckinstry at moszumanska.debian.org
Fri Jun 12 09:09:53 UTC 2015


This is an automated email from the git hooks/post-receive script.

mckinstry pushed a commit to branch master
in repository ic3-autosubmit.

commit 1c67380940b0105c3453c9d179e3a9d349624e7b
Author: Alastair McKinstry <mckinstry at debian.org>
Date:   Wed May 6 09:01:38 2015 +0100

    Upstream 3.0.0rc8
---
 CHANGES                                  |    2 +
 LICENSE                                  |  674 ++++++++++++++
 MANIFEST.in                              |    5 +
 PKG-INFO                                 |   12 +
 README                                   |   98 ++
 VERSION                                  |    1 +
 autosubmit.egg-info/PKG-INFO             |   12 +
 autosubmit.egg-info/SOURCES.txt          |   45 +
 autosubmit.egg-info/dependency_links.txt |    1 +
 autosubmit.egg-info/requires.txt         |    5 +
 autosubmit.egg-info/top_level.txt        |    1 +
 autosubmit/__init__.py                   |    0
 autosubmit/autosubmit.py                 | 1475 ++++++++++++++++++++++++++++++
 autosubmit/config/__init__.py            |    0
 autosubmit/config/basicConfig.py         |   86 ++
 autosubmit/config/config_common.py       |  968 ++++++++++++++++++++
 autosubmit/config/files/autosubmit.conf  |   19 +
 autosubmit/config/files/expdef.conf      |   64 ++
 autosubmit/config/files/jobs.conf        |   75 ++
 autosubmit/config/files/platforms.conf   |   29 +
 autosubmit/config/log.py                 |  245 +++++
 autosubmit/database/__init__.py          |    0
 autosubmit/database/data/autosubmit.sql  |   10 +
 autosubmit/database/db_common.py         |  423 +++++++++
 autosubmit/date/__init__.py              |    0
 autosubmit/date/chunk_date_lib.py        |  322 +++++++
 autosubmit/git/__init__.py               |    0
 autosubmit/git/git_common.py             |   86 ++
 autosubmit/job/__init__.py               |    0
 autosubmit/job/job.py                    |  597 ++++++++++++
 autosubmit/job/job_common.py             |  175 ++++
 autosubmit/job/job_list.py               |  862 +++++++++++++++++
 autosubmit/monitor/__init__.py           |    0
 autosubmit/monitor/monitor.py            |  331 +++++++
 autosubmit/platforms/__init__.py         |    0
 autosubmit/platforms/ecplatform.py       |  253 +++++
 autosubmit/platforms/hpcplatform.py      |  592 ++++++++++++
 autosubmit/platforms/localplatform.py    |  158 ++++
 autosubmit/platforms/lsfplatform.py      |  139 +++
 autosubmit/platforms/pbsplatform.py      |  244 +++++
 autosubmit/platforms/psplatform.py       |  126 +++
 autosubmit/platforms/sgeplatform.py      |  143 +++
 autosubmit/platforms/slurmplatform.py    |  136 +++
 bin/autosubmit                           |   39 +
 docs/autosubmit.pdf                      |  Bin 0 -> 1373082 bytes
 setup.cfg                                |    5 +
 setup.py                                 |   57 ++
 47 files changed, 8515 insertions(+)

diff --git a/CHANGES b/CHANGES
new file mode 100644
index 0000000..37e49b7
--- /dev/null
+++ b/CHANGES
@@ -0,0 +1,2 @@
+3.0.0
+		Restructure layout
diff --git a/LICENSE b/LICENSE
new file mode 100644
index 0000000..94a9ed0
--- /dev/null
+++ b/LICENSE
@@ -0,0 +1,674 @@
+                    GNU GENERAL PUBLIC LICENSE
+                       Version 3, 29 June 2007
+
+ Copyright (C) 2007 Free Software Foundation, Inc. <http://fsf.org/>
+ Everyone is permitted to copy and distribute verbatim copies
+ of this license document, but changing it is not allowed.
+
+                            Preamble
+
+  The GNU General Public License is a free, copyleft license for
+software and other kinds of works.
+
+  The licenses for most software and other practical works are designed
+to take away your freedom to share and change the works.  By contrast,
+the GNU General Public License is intended to guarantee your freedom to
+share and change all versions of a program--to make sure it remains free
+software for all its users.  We, the Free Software Foundation, use the
+GNU General Public License for most of our software; it applies also to
+any other work released this way by its authors.  You can apply it to
+your programs, too.
+
+  When we speak of free software, we are referring to freedom, not
+price.  Our General Public Licenses are designed to make sure that you
+have the freedom to distribute copies of free software (and charge for
+them if you wish), that you receive source code or can get it if you
+want it, that you can change the software or use pieces of it in new
+free programs, and that you know you can do these things.
+
+  To protect your rights, we need to prevent others from denying you
+these rights or asking you to surrender the rights.  Therefore, you have
+certain responsibilities if you distribute copies of the software, or if
+you modify it: responsibilities to respect the freedom of others.
+
+  For example, if you distribute copies of such a program, whether
+gratis or for a fee, you must pass on to the recipients the same
+freedoms that you received.  You must make sure that they, too, receive
+or can get the source code.  And you must show them these terms so they
+know their rights.
+
+  Developers that use the GNU GPL protect your rights with two steps:
+(1) assert copyright on the software, and (2) offer you this License
+giving you legal permission to copy, distribute and/or modify it.
+
+  For the developers' and authors' protection, the GPL clearly explains
+that there is no warranty for this free software.  For both users' and
+authors' sake, the GPL requires that modified versions be marked as
+changed, so that their problems will not be attributed erroneously to
+authors of previous versions.
+
+  Some devices are designed to deny users access to install or run
+modified versions of the software inside them, although the manufacturer
+can do so.  This is fundamentally incompatible with the aim of
+protecting users' freedom to change the software.  The systematic
+pattern of such abuse occurs in the area of products for individuals to
+use, which is precisely where it is most unacceptable.  Therefore, we
+have designed this version of the GPL to prohibit the practice for those
+products.  If such problems arise substantially in other domains, we
+stand ready to extend this provision to those domains in future versions
+of the GPL, as needed to protect the freedom of users.
+
+  Finally, every program is threatened constantly by software patents.
+States should not allow patents to restrict development and use of
+software on general-purpose computers, but in those that do, we wish to
+avoid the special danger that patents applied to a free program could
+make it effectively proprietary.  To prevent this, the GPL assures that
+patents cannot be used to render the program non-free.
+
+  The precise terms and conditions for copying, distribution and
+modification follow.
+
+                       TERMS AND CONDITIONS
+
+  0. Definitions.
+
+  "This License" refers to version 3 of the GNU General Public License.
+
+  "Copyright" also means copyright-like laws that apply to other kinds of
+works, such as semiconductor masks.
+
+  "The Program" refers to any copyrightable work licensed under this
+License.  Each licensee is addressed as "you".  "Licensees" and
+"recipients" may be individuals or organizations.
+
+  To "modify" a work means to copy from or adapt all or part of the work
+in a fashion requiring copyright permission, other than the making of an
+exact copy.  The resulting work is called a "modified version" of the
+earlier work or a work "based on" the earlier work.
+
+  A "covered work" means either the unmodified Program or a work based
+on the Program.
+
+  To "propagate" a work means to do anything with it that, without
+permission, would make you directly or secondarily liable for
+infringement under applicable copyright law, except executing it on a
+computer or modifying a private copy.  Propagation includes copying,
+distribution (with or without modification), making available to the
+public, and in some countries other activities as well.
+
+  To "convey" a work means any kind of propagation that enables other
+parties to make or receive copies.  Mere interaction with a user through
+a computer network, with no transfer of a copy, is not conveying.
+
+  An interactive user interface displays "Appropriate Legal Notices"
+to the extent that it includes a convenient and prominently visible
+feature that (1) displays an appropriate copyright notice, and (2)
+tells the user that there is no warranty for the work (except to the
+extent that warranties are provided), that licensees may convey the
+work under this License, and how to view a copy of this License.  If
+the interface presents a list of user commands or options, such as a
+menu, a prominent item in the list meets this criterion.
+
+  1. Source Code.
+
+  The "source code" for a work means the preferred form of the work
+for making modifications to it.  "Object code" means any non-source
+form of a work.
+
+  A "Standard Interface" means an interface that either is an official
+standard defined by a recognized standards body, or, in the case of
+interfaces specified for a particular programming language, one that
+is widely used among developers working in that language.
+
+  The "System Libraries" of an executable work include anything, other
+than the work as a whole, that (a) is included in the normal form of
+packaging a Major Component, but which is not part of that Major
+Component, and (b) serves only to enable use of the work with that
+Major Component, or to implement a Standard Interface for which an
+implementation is available to the public in source code form.  A
+"Major Component", in this context, means a major essential component
+(kernel, window system, and so on) of the specific operating system
+(if any) on which the executable work runs, or a compiler used to
+produce the work, or an object code interpreter used to run it.
+
+  The "Corresponding Source" for a work in object code form means all
+the source code needed to generate, install, and (for an executable
+work) run the object code and to modify the work, including scripts to
+control those activities.  However, it does not include the work's
+System Libraries, or general-purpose tools or generally available free
+programs which are used unmodified in performing those activities but
+which are not part of the work.  For example, Corresponding Source
+includes interface definition files associated with source files for
+the work, and the source code for shared libraries and dynamically
+linked subprograms that the work is specifically designed to require,
+such as by intimate data communication or control flow between those
+subprograms and other parts of the work.
+
+  The Corresponding Source need not include anything that users
+can regenerate automatically from other parts of the Corresponding
+Source.
+
+  The Corresponding Source for a work in source code form is that
+same work.
+
+  2. Basic Permissions.
+
+  All rights granted under this License are granted for the term of
+copyright on the Program, and are irrevocable provided the stated
+conditions are met.  This License explicitly affirms your unlimited
+permission to run the unmodified Program.  The output from running a
+covered work is covered by this License only if the output, given its
+content, constitutes a covered work.  This License acknowledges your
+rights of fair use or other equivalent, as provided by copyright law.
+
+  You may make, run and propagate covered works that you do not
+convey, without conditions so long as your license otherwise remains
+in force.  You may convey covered works to others for the sole purpose
+of having them make modifications exclusively for you, or provide you
+with facilities for running those works, provided that you comply with
+the terms of this License in conveying all material for which you do
+not control copyright.  Those thus making or running the covered works
+for you must do so exclusively on your behalf, under your direction
+and control, on terms that prohibit them from making any copies of
+your copyrighted material outside their relationship with you.
+
+  Conveying under any other circumstances is permitted solely under
+the conditions stated below.  Sublicensing is not allowed; section 10
+makes it unnecessary.
+
+  3. Protecting Users' Legal Rights From Anti-Circumvention Law.
+
+  No covered work shall be deemed part of an effective technological
+measure under any applicable law fulfilling obligations under article
+11 of the WIPO copyright treaty adopted on 20 December 1996, or
+similar laws prohibiting or restricting circumvention of such
+measures.
+
+  When you convey a covered work, you waive any legal power to forbid
+circumvention of technological measures to the extent such circumvention
+is effected by exercising rights under this License with respect to
+the covered work, and you disclaim any intention to limit operation or
+modification of the work as a means of enforcing, against the work's
+users, your or third parties' legal rights to forbid circumvention of
+technological measures.
+
+  4. Conveying Verbatim Copies.
+
+  You may convey verbatim copies of the Program's source code as you
+receive it, in any medium, provided that you conspicuously and
+appropriately publish on each copy an appropriate copyright notice;
+keep intact all notices stating that this License and any
+non-permissive terms added in accord with section 7 apply to the code;
+keep intact all notices of the absence of any warranty; and give all
+recipients a copy of this License along with the Program.
+
+  You may charge any price or no price for each copy that you convey,
+and you may offer support or warranty protection for a fee.
+
+  5. Conveying Modified Source Versions.
+
+  You may convey a work based on the Program, or the modifications to
+produce it from the Program, in the form of source code under the
+terms of section 4, provided that you also meet all of these conditions:
+
+    a) The work must carry prominent notices stating that you modified
+    it, and giving a relevant date.
+
+    b) The work must carry prominent notices stating that it is
+    released under this License and any conditions added under section
+    7.  This requirement modifies the requirement in section 4 to
+    "keep intact all notices".
+
+    c) You must license the entire work, as a whole, under this
+    License to anyone who comes into possession of a copy.  This
+    License will therefore apply, along with any applicable section 7
+    additional terms, to the whole of the work, and all its parts,
+    regardless of how they are packaged.  This License gives no
+    permission to license the work in any other way, but it does not
+    invalidate such permission if you have separately received it.
+
+    d) If the work has interactive user interfaces, each must display
+    Appropriate Legal Notices; however, if the Program has interactive
+    interfaces that do not display Appropriate Legal Notices, your
+    work need not make them do so.
+
+  A compilation of a covered work with other separate and independent
+works, which are not by their nature extensions of the covered work,
+and which are not combined with it such as to form a larger program,
+in or on a volume of a storage or distribution medium, is called an
+"aggregate" if the compilation and its resulting copyright are not
+used to limit the access or legal rights of the compilation's users
+beyond what the individual works permit.  Inclusion of a covered work
+in an aggregate does not cause this License to apply to the other
+parts of the aggregate.
+
+  6. Conveying Non-Source Forms.
+
+  You may convey a covered work in object code form under the terms
+of sections 4 and 5, provided that you also convey the
+machine-readable Corresponding Source under the terms of this License,
+in one of these ways:
+
+    a) Convey the object code in, or embodied in, a physical product
+    (including a physical distribution medium), accompanied by the
+    Corresponding Source fixed on a durable physical medium
+    customarily used for software interchange.
+
+    b) Convey the object code in, or embodied in, a physical product
+    (including a physical distribution medium), accompanied by a
+    written offer, valid for at least three years and valid for as
+    long as you offer spare parts or customer support for that product
+    model, to give anyone who possesses the object code either (1) a
+    copy of the Corresponding Source for all the software in the
+    product that is covered by this License, on a durable physical
+    medium customarily used for software interchange, for a price no
+    more than your reasonable cost of physically performing this
+    conveying of source, or (2) access to copy the
+    Corresponding Source from a network server at no charge.
+
+    c) Convey individual copies of the object code with a copy of the
+    written offer to provide the Corresponding Source.  This
+    alternative is allowed only occasionally and noncommercially, and
+    only if you received the object code with such an offer, in accord
+    with subsection 6b.
+
+    d) Convey the object code by offering access from a designated
+    place (gratis or for a charge), and offer equivalent access to the
+    Corresponding Source in the same way through the same place at no
+    further charge.  You need not require recipients to copy the
+    Corresponding Source along with the object code.  If the place to
+    copy the object code is a network server, the Corresponding Source
+    may be on a different server (operated by you or a third party)
+    that supports equivalent copying facilities, provided you maintain
+    clear directions next to the object code saying where to find the
+    Corresponding Source.  Regardless of what server hosts the
+    Corresponding Source, you remain obligated to ensure that it is
+    available for as long as needed to satisfy these requirements.
+
+    e) Convey the object code using peer-to-peer transmission, provided
+    you inform other peers where the object code and Corresponding
+    Source of the work are being offered to the general public at no
+    charge under subsection 6d.
+
+  A separable portion of the object code, whose source code is excluded
+from the Corresponding Source as a System Library, need not be
+included in conveying the object code work.
+
+  A "User Product" is either (1) a "consumer product", which means any
+tangible personal property which is normally used for personal, family,
+or household purposes, or (2) anything designed or sold for incorporation
+into a dwelling.  In determining whether a product is a consumer product,
+doubtful cases shall be resolved in favor of coverage.  For a particular
+product received by a particular user, "normally used" refers to a
+typical or common use of that class of product, regardless of the status
+of the particular user or of the way in which the particular user
+actually uses, or expects or is expected to use, the product.  A product
+is a consumer product regardless of whether the product has substantial
+commercial, industrial or non-consumer uses, unless such uses represent
+the only significant mode of use of the product.
+
+  "Installation Information" for a User Product means any methods,
+procedures, authorization keys, or other information required to install
+and execute modified versions of a covered work in that User Product from
+a modified version of its Corresponding Source.  The information must
+suffice to ensure that the continued functioning of the modified object
+code is in no case prevented or interfered with solely because
+modification has been made.
+
+  If you convey an object code work under this section in, or with, or
+specifically for use in, a User Product, and the conveying occurs as
+part of a transaction in which the right of possession and use of the
+User Product is transferred to the recipient in perpetuity or for a
+fixed term (regardless of how the transaction is characterized), the
+Corresponding Source conveyed under this section must be accompanied
+by the Installation Information.  But this requirement does not apply
+if neither you nor any third party retains the ability to install
+modified object code on the User Product (for example, the work has
+been installed in ROM).
+
+  The requirement to provide Installation Information does not include a
+requirement to continue to provide support service, warranty, or updates
+for a work that has been modified or installed by the recipient, or for
+the User Product in which it has been modified or installed.  Access to a
+network may be denied when the modification itself materially and
+adversely affects the operation of the network or violates the rules and
+protocols for communication across the network.
+
+  Corresponding Source conveyed, and Installation Information provided,
+in accord with this section must be in a format that is publicly
+documented (and with an implementation available to the public in
+source code form), and must require no special password or key for
+unpacking, reading or copying.
+
+  7. Additional Terms.
+
+  "Additional permissions" are terms that supplement the terms of this
+License by making exceptions from one or more of its conditions.
+Additional permissions that are applicable to the entire Program shall
+be treated as though they were included in this License, to the extent
+that they are valid under applicable law.  If additional permissions
+apply only to part of the Program, that part may be used separately
+under those permissions, but the entire Program remains governed by
+this License without regard to the additional permissions.
+
+  When you convey a copy of a covered work, you may at your option
+remove any additional permissions from that copy, or from any part of
+it.  (Additional permissions may be written to require their own
+removal in certain cases when you modify the work.)  You may place
+additional permissions on material, added by you to a covered work,
+for which you have or can give appropriate copyright permission.
+
+  Notwithstanding any other provision of this License, for material you
+add to a covered work, you may (if authorized by the copyright holders of
+that material) supplement the terms of this License with terms:
+
+    a) Disclaiming warranty or limiting liability differently from the
+    terms of sections 15 and 16 of this License; or
+
+    b) Requiring preservation of specified reasonable legal notices or
+    author attributions in that material or in the Appropriate Legal
+    Notices displayed by works containing it; or
+
+    c) Prohibiting misrepresentation of the origin of that material, or
+    requiring that modified versions of such material be marked in
+    reasonable ways as different from the original version; or
+
+    d) Limiting the use for publicity purposes of names of licensors or
+    authors of the material; or
+
+    e) Declining to grant rights under trademark law for use of some
+    trade names, trademarks, or service marks; or
+
+    f) Requiring indemnification of licensors and authors of that
+    material by anyone who conveys the material (or modified versions of
+    it) with contractual assumptions of liability to the recipient, for
+    any liability that these contractual assumptions directly impose on
+    those licensors and authors.
+
+  All other non-permissive additional terms are considered "further
+restrictions" within the meaning of section 10.  If the Program as you
+received it, or any part of it, contains a notice stating that it is
+governed by this License along with a term that is a further
+restriction, you may remove that term.  If a license document contains
+a further restriction but permits relicensing or conveying under this
+License, you may add to a covered work material governed by the terms
+of that license document, provided that the further restriction does
+not survive such relicensing or conveying.
+
+  If you add terms to a covered work in accord with this section, you
+must place, in the relevant source files, a statement of the
+additional terms that apply to those files, or a notice indicating
+where to find the applicable terms.
+
+  Additional terms, permissive or non-permissive, may be stated in the
+form of a separately written license, or stated as exceptions;
+the above requirements apply either way.
+
+  8. Termination.
+
+  You may not propagate or modify a covered work except as expressly
+provided under this License.  Any attempt otherwise to propagate or
+modify it is void, and will automatically terminate your rights under
+this License (including any patent licenses granted under the third
+paragraph of section 11).
+
+  However, if you cease all violation of this License, then your
+license from a particular copyright holder is reinstated (a)
+provisionally, unless and until the copyright holder explicitly and
+finally terminates your license, and (b) permanently, if the copyright
+holder fails to notify you of the violation by some reasonable means
+prior to 60 days after the cessation.
+
+  Moreover, your license from a particular copyright holder is
+reinstated permanently if the copyright holder notifies you of the
+violation by some reasonable means, this is the first time you have
+received notice of violation of this License (for any work) from that
+copyright holder, and you cure the violation prior to 30 days after
+your receipt of the notice.
+
+  Termination of your rights under this section does not terminate the
+licenses of parties who have received copies or rights from you under
+this License.  If your rights have been terminated and not permanently
+reinstated, you do not qualify to receive new licenses for the same
+material under section 10.
+
+  9. Acceptance Not Required for Having Copies.
+
+  You are not required to accept this License in order to receive or
+run a copy of the Program.  Ancillary propagation of a covered work
+occurring solely as a consequence of using peer-to-peer transmission
+to receive a copy likewise does not require acceptance.  However,
+nothing other than this License grants you permission to propagate or
+modify any covered work.  These actions infringe copyright if you do
+not accept this License.  Therefore, by modifying or propagating a
+covered work, you indicate your acceptance of this License to do so.
+
+  10. Automatic Licensing of Downstream Recipients.
+
+  Each time you convey a covered work, the recipient automatically
+receives a license from the original licensors, to run, modify and
+propagate that work, subject to this License.  You are not responsible
+for enforcing compliance by third parties with this License.
+
+  An "entity transaction" is a transaction transferring control of an
+organization, or substantially all assets of one, or subdividing an
+organization, or merging organizations.  If propagation of a covered
+work results from an entity transaction, each party to that
+transaction who receives a copy of the work also receives whatever
+licenses to the work the party's predecessor in interest had or could
+give under the previous paragraph, plus a right to possession of the
+Corresponding Source of the work from the predecessor in interest, if
+the predecessor has it or can get it with reasonable efforts.
+
+  You may not impose any further restrictions on the exercise of the
+rights granted or affirmed under this License.  For example, you may
+not impose a license fee, royalty, or other charge for exercise of
+rights granted under this License, and you may not initiate litigation
+(including a cross-claim or counterclaim in a lawsuit) alleging that
+any patent claim is infringed by making, using, selling, offering for
+sale, or importing the Program or any portion of it.
+
+  11. Patents.
+
+  A "contributor" is a copyright holder who authorizes use under this
+License of the Program or a work on which the Program is based.  The
+work thus licensed is called the contributor's "contributor version".
+
+  A contributor's "essential patent claims" are all patent claims
+owned or controlled by the contributor, whether already acquired or
+hereafter acquired, that would be infringed by some manner, permitted
+by this License, of making, using, or selling its contributor version,
+but do not include claims that would be infringed only as a
+consequence of further modification of the contributor version.  For
+purposes of this definition, "control" includes the right to grant
+patent sublicenses in a manner consistent with the requirements of
+this License.
+
+  Each contributor grants you a non-exclusive, worldwide, royalty-free
+patent license under the contributor's essential patent claims, to
+make, use, sell, offer for sale, import and otherwise run, modify and
+propagate the contents of its contributor version.
+
+  In the following three paragraphs, a "patent license" is any express
+agreement or commitment, however denominated, not to enforce a patent
+(such as an express permission to practice a patent or covenant not to
+sue for patent infringement).  To "grant" such a patent license to a
+party means to make such an agreement or commitment not to enforce a
+patent against the party.
+
+  If you convey a covered work, knowingly relying on a patent license,
+and the Corresponding Source of the work is not available for anyone
+to copy, free of charge and under the terms of this License, through a
+publicly available network server or other readily accessible means,
+then you must either (1) cause the Corresponding Source to be so
+available, or (2) arrange to deprive yourself of the benefit of the
+patent license for this particular work, or (3) arrange, in a manner
+consistent with the requirements of this License, to extend the patent
+license to downstream recipients.  "Knowingly relying" means you have
+actual knowledge that, but for the patent license, your conveying the
+covered work in a country, or your recipient's use of the covered work
+in a country, would infringe one or more identifiable patents in that
+country that you have reason to believe are valid.
+
+  If, pursuant to or in connection with a single transaction or
+arrangement, you convey, or propagate by procuring conveyance of, a
+covered work, and grant a patent license to some of the parties
+receiving the covered work authorizing them to use, propagate, modify
+or convey a specific copy of the covered work, then the patent license
+you grant is automatically extended to all recipients of the covered
+work and works based on it.
+
+  A patent license is "discriminatory" if it does not include within
+the scope of its coverage, prohibits the exercise of, or is
+conditioned on the non-exercise of one or more of the rights that are
+specifically granted under this License.  You may not convey a covered
+work if you are a party to an arrangement with a third party that is
+in the business of distributing software, under which you make payment
+to the third party based on the extent of your activity of conveying
+the work, and under which the third party grants, to any of the
+parties who would receive the covered work from you, a discriminatory
+patent license (a) in connection with copies of the covered work
+conveyed by you (or copies made from those copies), or (b) primarily
+for and in connection with specific products or compilations that
+contain the covered work, unless you entered into that arrangement,
+or that patent license was granted, prior to 28 March 2007.
+
+  Nothing in this License shall be construed as excluding or limiting
+any implied license or other defenses to infringement that may
+otherwise be available to you under applicable patent law.
+
+  12. No Surrender of Others' Freedom.
+
+  If conditions are imposed on you (whether by court order, agreement or
+otherwise) that contradict the conditions of this License, they do not
+excuse you from the conditions of this License.  If you cannot convey a
+covered work so as to satisfy simultaneously your obligations under this
+License and any other pertinent obligations, then as a consequence you may
+not convey it at all.  For example, if you agree to terms that obligate you
+to collect a royalty for further conveying from those to whom you convey
+the Program, the only way you could satisfy both those terms and this
+License would be to refrain entirely from conveying the Program.
+
+  13. Use with the GNU Affero General Public License.
+
+  Notwithstanding any other provision of this License, you have
+permission to link or combine any covered work with a work licensed
+under version 3 of the GNU Affero General Public License into a single
+combined work, and to convey the resulting work.  The terms of this
+License will continue to apply to the part which is the covered work,
+but the special requirements of the GNU Affero General Public License,
+section 13, concerning interaction through a network will apply to the
+combination as such.
+
+  14. Revised Versions of this License.
+
+  The Free Software Foundation may publish revised and/or new versions of
+the GNU General Public License from time to time.  Such new versions will
+be similar in spirit to the present version, but may differ in detail to
+address new problems or concerns.
+
+  Each version is given a distinguishing version number.  If the
+Program specifies that a certain numbered version of the GNU General
+Public License "or any later version" applies to it, you have the
+option of following the terms and conditions either of that numbered
+version or of any later version published by the Free Software
+Foundation.  If the Program does not specify a version number of the
+GNU General Public License, you may choose any version ever published
+by the Free Software Foundation.
+
+  If the Program specifies that a proxy can decide which future
+versions of the GNU General Public License can be used, that proxy's
+public statement of acceptance of a version permanently authorizes you
+to choose that version for the Program.
+
+  Later license versions may give you additional or different
+permissions.  However, no additional obligations are imposed on any
+author or copyright holder as a result of your choosing to follow a
+later version.
+
+  15. Disclaimer of Warranty.
+
+  THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
+APPLICABLE LAW.  EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
+HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
+OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
+THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
+PURPOSE.  THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
+IS WITH YOU.  SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
+ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
+
+  16. Limitation of Liability.
+
+  IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
+WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
+THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
+GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
+USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
+DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
+PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
+EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
+SUCH DAMAGES.
+
+  17. Interpretation of Sections 15 and 16.
+
+  If the disclaimer of warranty and limitation of liability provided
+above cannot be given local legal effect according to their terms,
+reviewing courts shall apply local law that most closely approximates
+an absolute waiver of all civil liability in connection with the
+Program, unless a warranty or assumption of liability accompanies a
+copy of the Program in return for a fee.
+
+                     END OF TERMS AND CONDITIONS
+
+            How to Apply These Terms to Your New Programs
+
+  If you develop a new program, and you want it to be of the greatest
+possible use to the public, the best way to achieve this is to make it
+free software which everyone can redistribute and change under these terms.
+
+  To do so, attach the following notices to the program.  It is safest
+to attach them to the start of each source file to most effectively
+state the exclusion of warranty; and each file should have at least
+the "copyright" line and a pointer to where the full notice is found.
+
+    <one line to give the program's name and a brief idea of what it does.>
+    Copyright (C) <year>  <name of author>
+
+    This program is free software: you can redistribute it and/or modify
+    it under the terms of the GNU General Public License as published by
+    the Free Software Foundation, either version 3 of the License, or
+    (at your option) any later version.
+
+    This program is distributed in the hope that it will be useful,
+    but WITHOUT ANY WARRANTY; without even the implied warranty of
+    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+    GNU General Public License for more details.
+
+    You should have received a copy of the GNU General Public License
+    along with this program.  If not, see <http://www.gnu.org/licenses/>.
+
+Also add information on how to contact you by electronic and paper mail.
+
+  If the program does terminal interaction, make it output a short
+notice like this when it starts in an interactive mode:
+
+    <program>  Copyright (C) <year>  <name of author>
+    This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
+    This is free software, and you are welcome to redistribute it
+    under certain conditions; type `show c' for details.
+
+The hypothetical commands `show w' and `show c' should show the appropriate
+parts of the General Public License.  Of course, your program's commands
+might be different; for a GUI interface, you would use an "about box".
+
+  You should also get your employer (if you work as a programmer) or school,
+if any, to sign a "copyright disclaimer" for the program, if necessary.
+For more information on this, and how to apply and follow the GNU GPL, see
+<http://www.gnu.org/licenses/>.
+
+  The GNU General Public License does not permit incorporating your program
+into proprietary programs.  If your program is a subroutine library, you
+may consider it more useful to permit linking proprietary applications with
+the library.  If this is what you want to do, use the GNU Lesser General
+Public License instead of this License.  But first, please read
+<http://www.gnu.org/philosophy/why-not-lgpl.html>.
diff --git a/MANIFEST.in b/MANIFEST.in
new file mode 100644
index 0000000..c2ae7ad
--- /dev/null
+++ b/MANIFEST.in
@@ -0,0 +1,5 @@
+include LICENSE
+include VERSION
+include CHANGES
+include docs/autosubmit.pdf
+global-include *.conf *.sql
diff --git a/PKG-INFO b/PKG-INFO
new file mode 100644
index 0000000..1c740d9
--- /dev/null
+++ b/PKG-INFO
@@ -0,0 +1,12 @@
+Metadata-Version: 1.1
+Name: autosubmit
+Version: 3.0.0rc8
+Summary: Autosubmit: a versatile tool to manage Weather and Climate Experiments in diverse Supercomputing Environments
+Home-page: https://autosubmit.ic3.cat
+Author: Domingo Manubens-Gil
+Author-email: domingo.manubens at ic3.cat
+License: GNU GPL v3
+Download-URL: http://ic3.cat/wikicfu/index.php/Tools/Autosubmit
+Description: UNKNOWN
+Keywords: climate,weather,workflow,HPC
+Platform: GNU/Linux Debian
diff --git a/README b/README
new file mode 100644
index 0000000..5324175
--- /dev/null
+++ b/README
@@ -0,0 +1,98 @@
+
+Autosubmit is a tool to create, manage and monitor experiments using
+computing clusters, HPCs and supercomputers remotely via ssh.
+
+
+HOW TO DEPLOY/SETUP AUTOSUBMIT FRAMEWORK
+========================================
+
+- Autosubmit has been tested:
+
+  with the following Operating Systems:
+   * Linux Debian
+
+  on the following HPCs/clusters:
+   * Ithaca (IC3 machine)
+   * Stargate (IC3 machine)
+   * MareNostrum (BSC machine)
+   * MareNostrum3 (BSC machine)
+   * HECToR (EPCC machine)
+   * Lindgren (PDC machine)
+   * C2A (ECMWF machine)
+   * CCA (ECMWF machine)
+   * ARCHER (EPCC machine)
+
+- Prerequisites: these packages (bash, python2, sqlite3, git-scm > 1.8.2, subversion) must be available on the local
+  machine, and these packages (argparse, dateutil, pyparsing, numpy, pydotplus, matplotlib, paramiko) must be available
+  to the Python runtime. The local machine must also be able to access the HPCs/clusters via password-less ssh.
+
+- Install Autosubmit
+   > pip install autosubmit
+  or download, unpack and "python setup.py install"
+
+- Create a repository for experiments: say, for example, "/cfu/autosubmit"; then run "autosubmit configure" and follow
+  the instructions
+
+- Create a blank database: "autosubmit install"
+
+HOW TO USE AUTOSUBMIT
+=====================
+> autosubmit -h
+
+> autosubmit expid --HPC ithaca --description "experiment is about..."
+
+For example, "cxxx" is a four-character expid generated automatically by the system.
+The first character is a letter; the other three alphanumeric characters uniquely identify the experiment.
+
+> vi /cfu/autosubmit/cxxx/conf/expdef_cxxx.conf
+
+> vi /cfu/autosubmit/cxxx/conf/autosubmit_cxxx.conf
+
+> vi /cfu/autosubmit/cxxx/conf/platforms_cxxx.conf
+
+> vi /cfu/autosubmit/cxxx/conf/jobs_cxxx.conf
+
+> autosubmit create cxxx
+
+> nohup autosubmit run cxxx
+
+Cautions:
+- Before launching autosubmit, check the following:
+> ssh ithaca # (for example) check every other HPC where password-less ssh is required.
+- After launching autosubmit, be aware of each HPC's login expiry limit and policy (where applicable)
+and renew login access accordingly (using a token/key etc.) before it expires.
+
+HOW TO MONITOR EXPERIMENT
+=========================
+
+> autosubmit monitor -h
+
+> autosubmit monitor cxxx
+or
+> autosubmit monitor cxxx -o png
+
+The plot generated above, stamped with date and time, can be found at:
+
+/cfu/autosubmit/cxxx/plot/cxxx_date_time.pdf
+or 
+/cfu/autosubmit/cxxx/plot/cxxx_date_time.png
+
+
+HOW TO RESTART EXPERIMENT
+=========================
+
+> autosubmit recovery -h
+
+> autosubmit recovery cxxx -all # getting/fetching completed files
+
+> autosubmit recovery cxxx -s # saving the pickle file
+
+> nohup autosubmit run cxxx
+
+
+FULL DOCUMENTATION AND HOW TOs
+==============================
+
+Check the Autosubmit documentation provided in PDF format in the docs/ folder of the package.
+Check the online documentation at the following web page: http://autosubmit.ic3.cat
+
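(An illustrative aside, not part of this commit: the password-less ssh
caution in the README above can be probed from Python with paramiko, one
of the package's declared dependencies. A minimal sketch; the hostname
"ithaca" and the default key/agent locations are assumptions.)

    import socket
    import paramiko

    def check_passwordless_ssh(hostname, username=None):
        """Return True if hostname accepts a key- or agent-based login."""
        client = paramiko.SSHClient()
        # Accept unknown host keys for this connectivity probe only.
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        try:
            # No password is supplied, so a host that demands one fails here.
            client.connect(hostname, username=username, timeout=10,
                           look_for_keys=True, allow_agent=True)
            return True
        except (paramiko.AuthenticationException, paramiko.SSHException,
                socket.error):
            return False
        finally:
            client.close()

    if __name__ == '__main__':
        print(check_passwordless_ssh('ithaca'))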
diff --git a/VERSION b/VERSION
new file mode 100644
index 0000000..398bfa1
--- /dev/null
+++ b/VERSION
@@ -0,0 +1 @@
+3.0.0rc8
diff --git a/autosubmit.egg-info/PKG-INFO b/autosubmit.egg-info/PKG-INFO
new file mode 100644
index 0000000..1c740d9
--- /dev/null
+++ b/autosubmit.egg-info/PKG-INFO
@@ -0,0 +1,12 @@
+Metadata-Version: 1.1
+Name: autosubmit
+Version: 3.0.0rc8
+Summary: Autosubmit: a versatile tool to manage Weather and Climate Experiments in diverse Supercomputing Environments
+Home-page: https://autosubmit.ic3.cat
+Author: Domingo Manubens-Gil
+Author-email: domingo.manubens at ic3.cat
+License: GNU GPL v3
+Download-URL: http://ic3.cat/wikicfu/index.php/Tools/Autosubmit
+Description: UNKNOWN
+Keywords: climate,weather,workflow,HPC
+Platform: GNU/Linux Debian
diff --git a/autosubmit.egg-info/SOURCES.txt b/autosubmit.egg-info/SOURCES.txt
new file mode 100644
index 0000000..f8dc131
--- /dev/null
+++ b/autosubmit.egg-info/SOURCES.txt
@@ -0,0 +1,45 @@
+CHANGES
+LICENSE
+MANIFEST.in
+README
+VERSION
+setup.py
+autosubmit/__init__.py
+autosubmit/autosubmit.py
+autosubmit.egg-info/PKG-INFO
+autosubmit.egg-info/SOURCES.txt
+autosubmit.egg-info/dependency_links.txt
+autosubmit.egg-info/requires.txt
+autosubmit.egg-info/top_level.txt
+autosubmit/config/__init__.py
+autosubmit/config/basicConfig.py
+autosubmit/config/config_common.py
+autosubmit/config/log.py
+autosubmit/config/files/autosubmit.conf
+autosubmit/config/files/expdef.conf
+autosubmit/config/files/jobs.conf
+autosubmit/config/files/platforms.conf
+autosubmit/database/__init__.py
+autosubmit/database/db_common.py
+autosubmit/database/data/autosubmit.sql
+autosubmit/date/__init__.py
+autosubmit/date/chunk_date_lib.py
+autosubmit/git/__init__.py
+autosubmit/git/git_common.py
+autosubmit/job/__init__.py
+autosubmit/job/job.py
+autosubmit/job/job_common.py
+autosubmit/job/job_list.py
+autosubmit/monitor/__init__.py
+autosubmit/monitor/monitor.py
+autosubmit/platforms/__init__.py
+autosubmit/platforms/ecplatform.py
+autosubmit/platforms/hpcplatform.py
+autosubmit/platforms/localplatform.py
+autosubmit/platforms/lsfplatform.py
+autosubmit/platforms/pbsplatform.py
+autosubmit/platforms/psplatform.py
+autosubmit/platforms/sgeplatform.py
+autosubmit/platforms/slurmplatform.py
+bin/autosubmit
+docs/autosubmit.pdf
\ No newline at end of file
diff --git a/autosubmit.egg-info/dependency_links.txt b/autosubmit.egg-info/dependency_links.txt
new file mode 100644
index 0000000..8b13789
--- /dev/null
+++ b/autosubmit.egg-info/dependency_links.txt
@@ -0,0 +1 @@
+
diff --git a/autosubmit.egg-info/requires.txt b/autosubmit.egg-info/requires.txt
new file mode 100644
index 0000000..4e34dcb
--- /dev/null
+++ b/autosubmit.egg-info/requires.txt
@@ -0,0 +1,5 @@
+argparse>=1.2,<2
+python-dateutil>=1,<2
+pydotplus
+pyparsing
+paramiko
\ No newline at end of file
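(An illustrative aside, not part of this commit: these pins can be
checked at runtime with pkg_resources, which autosubmit.py itself
imports. A minimal sketch, assuming the distributions were installed in
the usual way.)

    import pkg_resources

    PINS = ['argparse>=1.2,<2', 'python-dateutil>=1,<2',
            'pydotplus', 'pyparsing', 'paramiko']

    for requirement in PINS:
        try:
            # require() resolves against installed distributions and
            # raises if the pin cannot be satisfied.
            dist = pkg_resources.require(requirement)[0]
            print('%s satisfied by %s %s'
                  % (requirement, dist.project_name, dist.version))
        except (pkg_resources.DistributionNotFound,
                pkg_resources.VersionConflict) as error:
            print('%s NOT satisfied: %s' % (requirement, error))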
diff --git a/autosubmit.egg-info/top_level.txt b/autosubmit.egg-info/top_level.txt
new file mode 100644
index 0000000..a2bad6d
--- /dev/null
+++ b/autosubmit.egg-info/top_level.txt
@@ -0,0 +1 @@
+autosubmit
diff --git a/autosubmit/__init__.py b/autosubmit/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/autosubmit/autosubmit.py b/autosubmit/autosubmit.py
new file mode 100644
index 0000000..2e76ef5
--- /dev/null
+++ b/autosubmit/autosubmit.py
@@ -0,0 +1,1475 @@
+#!/usr/bin/env python
+
+# Copyright 2014 Climate Forecasting Unit, IC3
+
+# This file is part of Autosubmit.
+
+# Autosubmit is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+
+# Autosubmit is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+
+# You should have received a copy of the GNU General Public License
+# along with Autosubmit.  If not, see <http://www.gnu.org/licenses/>.
+
+"""
+Main module for autosubmit. Contains only an interface class to all functionality implemented in autosubmit.
+"""
+
+from ConfigParser import SafeConfigParser
+import argparse
+from commands import getstatusoutput
+import json
+import time
+import cPickle
+import os
+import sys
+import shutil
+import re
+import random
+from pkg_resources import require, resource_listdir, resource_exists, resource_string
+from time import strftime
+from distutils.util import strtobool
+
+from pyparsing import nestedExpr
+
+sys.path.insert(0, os.path.abspath('.'))
+
+from config.basicConfig import BasicConfig
+from config.config_common import AutosubmitConfig
+from job.job_common import Status
+from git.git_common import AutosubmitGit
+from job.job_list import JobList
+from config.log import Log
+from database.db_common import create_db
+from database.db_common import new_experiment
+from database.db_common import copy_experiment
+from database.db_common import delete_experiment
+from monitor.monitor import Monitor
+
+
+class Autosubmit:
+    """
+    Interface class for autosubmit.
+    """
+    # Get the version number from the VERSION file if present; otherwise, from the autosubmit package
+    scriptdir = os.path.abspath(os.path.dirname(sys.argv[0]))
+    version_path = os.path.join(scriptdir, '..', 'VERSION')
+    if os.path.isfile(version_path):
+        with open(version_path) as f:
+            autosubmit_version = f.read().strip()
+    else:
+        autosubmit_version = require("autosubmit")[0].version
+
+    @staticmethod
+    def parse_args():
+        """
+        Parse arguments given to an executable and start execution of command given
+        """
+        try:
+            BasicConfig.read()
+
+            parser = argparse.ArgumentParser(description='Main executable for autosubmit. ')
+            parser.add_argument('-v', '--version', action='version', version=Autosubmit.autosubmit_version,
+                                help="returns autosubmit's version number and exits")
+            parser.add_argument('-lf', '--logfile', choices=('EVERYTHING', 'DEBUG', 'INFO', 'RESULT', 'USER_WARNING',
+                                                             'WARNING', 'ERROR', 'CRITICAL', 'NO_LOG'),
+                                default='DEBUG', type=str,
+                                help="sets file's log level.")
+            parser.add_argument('-lc', '--logconsole', choices=('EVERYTHING', 'DEBUG', 'INFO', 'RESULT', 'USER_WARNING',
+                                                                'WARNING', 'ERROR', 'CRITICAL', 'NO_LOG'),
+                                default='INFO', type=str,
+                                help="sets console's log level")
+
+            subparsers = parser.add_subparsers(dest='command')
+
+            # Run
+            subparser = subparsers.add_parser('run', description="runs specified experiment")
+            subparser.add_argument('expid', help='experiment identifier')
+
+            # Expid
+            subparser = subparsers.add_parser('expid', description="Creates a new experiment")
+            group = subparser.add_mutually_exclusive_group()
+            group.add_argument('-y', '--copy', help='makes a copy of the specified experiment')
+            group.add_argument('-dm', '--dummy', action='store_true',
+                               help='creates a new experiment with default values, usually for testing')
+
+            subparser.add_argument('-H', '--HPC', required=True,
+                                   help='specifies the HPC to use for the experiment')
+            subparser.add_argument('-d', '--description', type=str, required=True,
+                                   help='sets a description for the experiment to store in the database.')
+
+            # Delete
+            subparser = subparsers.add_parser('delete', description="delete specified experiment")
+            subparser.add_argument('expid',  help='experiment identifier')
+            subparser.add_argument('-f', '--force', action='store_true', help='deletes experiment without confirmation')
+
+            # Monitor
+            subparser = subparsers.add_parser('monitor', description="plots specified experiment")
+            subparser.add_argument('expid', help='experiment identifier')
+            subparser.add_argument('-o', '--output', choices=('pdf', 'png', 'ps', 'svg'), default='pdf',
+                                   help='chooses type of output for generated plot')
+
+            # Stats
+            subparser = subparsers.add_parser('stats', description="plots statistics for specified experiment")
+            subparser.add_argument('expid', help='experiment identifier')
+            subparser.add_argument('-o', '--output', choices=('pdf', 'png', 'ps', 'svg'), default='pdf',
+                                   help='type of output for generated plot')
+
+            # Clean
+            subparser = subparsers.add_parser('clean', description="clean specified experiment")
+            subparser.add_argument('expid', help='experiment identifier')
+            subparser.add_argument('-pr', '--project', action="store_true", help='clean project')
+            subparser.add_argument('-p', '--plot', action="store_true",
+                                   help='clean plots; only the last 2 will remain')
+            subparser.add_argument('-s', '--stats', action="store_true",
+                                   help='clean stats; only the last will remain')
+
+            # Recovery
+            subparser = subparsers.add_parser('recovery', description="recover specified experiment")
+            subparser.add_argument('expid', type=str, help='experiment identifier')
+            subparser.add_argument('-all', action="store_true", default=False,
+                                   help='Get completed files to synchronize pkl')
+            subparser.add_argument('-s', '--save', action="store_true", default=False, help='Save changes to disk')
+
+            # Check
+            subparser = subparsers.add_parser('check', description="check configuration for specified experiment")
+            subparser.add_argument('expid',  help='experiment identifier')
+
+            # Create
+            subparser = subparsers.add_parser('create', description="create specified experiment joblist")
+            subparser.add_argument('expid',  help='experiment identifier')
+            subparser.add_argument('-np', '--noplot', action='store_true', default=False, help='omit plot')
+
+            # Configure
+            subparser = subparsers.add_parser('configure', description="configure database and path for autosubmit. It "
+                                                                       "can be done at machine, user or local level "
+                                                                       "(by default at machine level)")
+            subparser.add_argument('-db', '--databasepath', default=None, help='path to database. If not supplied, '
+                                                                               'it will prompt for it')
+            subparser.add_argument('-dbf', '--databasefilename', default=None, help='database filename')
+            subparser.add_argument('-lr', '--localrootpath', default=None, help='path to store experiments. If not '
+                                                                                'supplied, it will prompt for it')
+            subparser.add_argument('-qc', '--platformsconfpath', default=None,
+                                   help='path to platforms.conf file to use by default. If not supplied, it will not'
+                                   ' prompt for it')
+            subparser.add_argument('-jc', '--jobsconfpath', default=None, help='path to jobs.conf file to use by '
+                                                                               'default. If not supplied, it will not '
+                                                                               'prompt for it')
+            group = subparser.add_mutually_exclusive_group()
+            group.add_argument('-a', '--all', action="store_true", help='configure for all users')
+            group.add_argument('-l', '--local', action="store_true", help='configure only for using Autosubmit from '
+                                                                          'this path')
+
+            # Install
+            subparsers.add_parser('install', description='install database for autosubmit on the configured folder')
+
+            # Set status
+            subparser = subparsers.add_parser('setstatus', description="sets job status for an experiment")
+            subparser.add_argument('expid',  help='experiment identifier')
+            subparser.add_argument('-s', '--save', action="store_true", default=False, help='Save changes to disk')
+            subparser.add_argument('-t', '--status_final',
+                                   choices=('READY', 'COMPLETED', 'WAITING', 'SUSPENDED', 'FAILED', 'UNKNOWN',
+                                            'QUEUING', 'RUNNING'),
+                                   required=True,
+                                   help='Supply the target status')
+            group = subparser.add_mutually_exclusive_group(required=True)
+            group.add_argument('-l', '--list', type=str,
+                               help='Supply the list of job names to be changed. Default = "Any". '
+                                    'LIST = "b037_20101101_fc3_21_sim b037_20111101_fc4_26_sim"')
+            group.add_argument('-fc', '--filter_chunks', type=str,
+                               help='Supply the list of chunks to change the status. Default = "Any". '
+                                    'LIST = "[ 19601101 [ fc0 [1 2 3 4] fc1 [1] ] 19651101 [ fc0 [16-30] ] ]"')
+            group.add_argument('-fs', '--filter_status', type=str,
+                               choices=('Any', 'READY', 'COMPLETED', 'WAITING', 'SUSPENDED', 'FAILED', 'UNKNOWN'),
+                               help='Select the original status to filter the list of jobs')
+            group.add_argument('-ft', '--filter_type', type=str,
+                               help='Select the job type to filter the list of jobs')
+
+            # Test
+            subparser = subparsers.add_parser('test', description='test experiment')
+            subparser.add_argument('expid',  help='experiment identifier')
+            subparser.add_argument('-c', '--chunks', required=True, help='chunks to run')
+            subparser.add_argument('-m', '--member', help='member to run')
+            subparser.add_argument('-s', '--stardate', help='start date to run')
+            subparser.add_argument('-H', '--HPC', help='HPC to run the experiment on')
+            subparser.add_argument('-b', '--branch', help='branch of git to run (or revision from subversion)')
+
+            # Refresh
+            subparser = subparsers.add_parser('refresh', description='refresh project directory for an experiment')
+            subparser.add_argument('expid',  help='experiment identifier')
+
+            args = parser.parse_args()
+
+            Log.set_console_level(args.logconsole)
+            Log.set_file_level(args.logfile)
+
+            if args.command == 'run':
+                return Autosubmit.run_experiment(args.expid)
+            elif args.command == 'expid':
+                return Autosubmit.expid(args.HPC, args.description, args.copy, args.dummy) != ''
+            elif args.command == 'delete':
+                return Autosubmit.delete(args.expid, args.force)
+            elif args.command == 'monitor':
+                return Autosubmit.monitor(args.expid, args.output)
+            elif args.command == 'stats':
+                return Autosubmit.statistics(args.expid, args.output)
+            elif args.command == 'clean':
+                return Autosubmit.clean(args.expid, args.project, args.plot, args.stats)
+            elif args.command == 'recovery':
+                return Autosubmit.recovery(args.expid, args.save, args.all)
+            elif args.command == 'check':
+                return Autosubmit.check(args.expid)
+            elif args.command == 'create':
+                return Autosubmit.create(args.expid, args.noplot)
+            elif args.command == 'configure':
+                return Autosubmit.configure(args.databasepath, args.databasefilename, args.localrootpath,
+                                            args.platformsconfpath, args.jobsconfpath, args.all, args.local)
+            elif args.command == 'install':
+                return Autosubmit.install()
+            elif args.command == 'setstatus':
+                return Autosubmit.set_status(args.expid, args.save, args.status_final, args.list,
+                                             args.filter_chunks, args.filter_status, args.filter_type)
+            elif args.command == 'test':
+                return Autosubmit.test(args.expid, args.chunks, args.member, args.stardate, args.HPC, args.branch)
+            elif args.command == 'refresh':
+                return Autosubmit.refresh(args.expid)
+        except Exception as e:
+            from traceback import format_exc
+            Log.critical('Unhandled exception in Autosubmit: {0}\n{1}', e, format_exc(10))
+
+            return False
+
+    @staticmethod
+    def _delete_expid(expid_delete):
+        """
+        Removes an experiment from path and database
+
+        :type expid_delete: str
+        :param expid_delete: identifier of the experiment to delete
+        """
+        Log.info("Removing experiment directory...")
+        try:
+            shutil.rmtree(os.path.join(BasicConfig.LOCAL_ROOT_DIR, expid_delete))
+        except OSError as e:
+            Log.warning('Can not delete experiment folder: {0}', e)
+        Log.info("Deleting experiment from database...")
+        ret = delete_experiment(expid_delete)
+        if ret:
+            Log.result("Experiment {0} deleted".format(expid_delete))
+        return ret
+
+    @staticmethod
+    def expid(hpc, description, copy_id='', dummy=False):
+        """
+        Creates a new experiment for given HPC
+
+        :type hpc: str
+        :type description: str
+        :type copy_id: str
+        :type dummy: bool
+        :param hpc: name of the main HPC for the experiment
+        :param description: short experiment's description.
+        :param copy_id: experiment identifier of experiment to copy
+        :param dummy: if true, writes a default dummy configuration for testing
+        :return: experiment identifier. If method fails, returns ''.
+        :rtype: str
+        """
+        BasicConfig.read()
+
+        log_path = os.path.join(BasicConfig.LOCAL_ROOT_DIR, 'expid.log')
+        try:
+            Log.set_file(log_path)
+        except IOError as e:
+            Log.error("Can not create log file in path {0}: {1}".format(log_path, e.message))
+        exp_id = None
+        if description is None:
+            Log.error("Missing experiment description.")
+            return ''
+        if hpc is None:
+            Log.error("Missing HPC.")
+            return ''
+        if not copy_id:
+            exp_id = new_experiment(hpc, description, Autosubmit.autosubmit_version)
+            if exp_id == '':
+                return ''
+            try:
+                os.mkdir(os.path.join(BasicConfig.LOCAL_ROOT_DIR, exp_id))
+
+                os.mkdir(os.path.join(BasicConfig.LOCAL_ROOT_DIR, exp_id, 'conf'))
+                Log.info("Copying config files...")
+                # autosubmit config and experiment files copied from AS.
+                files = resource_listdir('autosubmit.config', 'files')
+                for filename in files:
+                    if resource_exists('autosubmit.config', 'files/' + filename):
+                        index = filename.index('.')
+                        new_filename = filename[:index] + "_" + exp_id + filename[index:]
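+                        # e.g. 'expdef.conf' becomes 'expdef_<exp_id>.conf' (illustrative filename)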
+
+                        if filename == 'platforms.conf' and BasicConfig.DEFAULT_PLATFORMS_CONF != '':
+                            content = file(os.path.join(BasicConfig.DEFAULT_PLATFORMS_CONF, filename)).read()
+                        elif filename == 'jobs.conf' and BasicConfig.DEFAULT_JOBS_CONF != '':
+                            content = file(os.path.join(BasicConfig.DEFAULT_JOBS_CONF, filename)).read()
+                        else:
+                            content = resource_string('autosubmit.config', 'files/' + filename)
+
+                        conf_new_filename = os.path.join(BasicConfig.LOCAL_ROOT_DIR, exp_id, "conf", new_filename)
+                        Log.debug(conf_new_filename)
+                        file(conf_new_filename, 'w').write(content)
+                Autosubmit._prepare_conf_files(exp_id, hpc, Autosubmit.autosubmit_version, dummy)
+            except (OSError, IOError) as e:
+                Log.error("Can not create experiment: {0}\nCleaning...".format(e))
+                Autosubmit._delete_expid(exp_id)
+                return ''
+        else:
+            try:
+                if os.path.exists(os.path.join(BasicConfig.LOCAL_ROOT_DIR, copy_id)):
+                    exp_id = copy_experiment(copy_id, hpc, description, Autosubmit.autosubmit_version)
+                    if exp_id == '':
+                        return ''
+                    dir_exp_id = os.path.join(BasicConfig.LOCAL_ROOT_DIR, exp_id)
+                    os.mkdir(dir_exp_id)
+                    os.mkdir(dir_exp_id + '/conf')
+                    Log.info("Copying previous experiment config directories")
+                    conf_copy_id = os.path.join(BasicConfig.LOCAL_ROOT_DIR, copy_id, "conf")
+                    files = os.listdir(conf_copy_id)
+                    for filename in files:
+                        if os.path.isfile(os.path.join(conf_copy_id, filename)):
+                            new_filename = filename.replace(copy_id, exp_id)
+                            content = file(os.path.join(conf_copy_id, filename), 'r').read()
+                            file(os.path.join(dir_exp_id, "conf", new_filename), 'w').write(content)
+                    Autosubmit._prepare_conf_files(exp_id, hpc, Autosubmit.autosubmit_version, dummy)
+                else:
+                    Log.critical("The previous experiment directory does not exist")
+                    return ''
+            except (OSError, IOError) as e:
+                Log.error("Can not create experiment: {0}\nCleaning...".format(e))
+                Autosubmit._delete_expid(exp_id)
+                return ''
+
+        Log.debug("Creating temporal directory...")
+        exp_id_path = os.path.join(BasicConfig.LOCAL_ROOT_DIR, exp_id)
+        os.mkdir(os.path.join(exp_id_path, "tmp"))
+
+        Log.debug("Creating pkl directory...")
+        os.mkdir(os.path.join(exp_id_path, "pkl"))
+
+        Log.debug("Creating plot directory...")
+        os.mkdir(os.path.join(exp_id_path, "plot"))
+        os.chmod(os.path.join(exp_id_path, "plot"), 0o775)
+        Log.result("Experiment registered successfully")
+        Log.user_warning("Remember to MODIFY the config files!")
+        return exp_id
+
+    @staticmethod
+    def delete(expid, force):
+        """
+        Deletes an experiment from the database and removes its folder
+
+        :type force: bool
+        :type expid: str
+        :param expid: identifier of the experiment to delete
+        :param force: if True, does not ask for confirmation
+
+        :returns: True if successful, False if not
+        :rtype: bool
+        """
+        log_path = os.path.join(BasicConfig.LOCAL_ROOT_DIR, 'delete.log')
+        try:
+            Log.set_file(log_path)
+        except IOError as e:
+            Log.error("Can not create log file in path {0}: {1}".format(log_path, e.message))
+
+        if os.path.exists(os.path.join(BasicConfig.LOCAL_ROOT_DIR, expid)):
+            if force or Autosubmit._user_yes_no_query("Do you want to delete " + expid + " ?"):
+                return Autosubmit._delete_expid(expid)
+            else:
+                Log.info("Quitting...")
+                return False
+        else:
+            Log.error("The experiment does not exist")
+            return True
+
+    @staticmethod
+    def _load_parameters(as_conf, joblist, platforms):
+        # Load parameters
+        Log.debug("Loading parameters...")
+        parameters = as_conf.load_parameters()
+        for platform_name in platforms:
+            platform = platforms[platform_name]
+            parameters['{0}_ARCH'.format(platform.name)] = platform.name
+            parameters['{0}_USER'.format(platform.name)] = platform.user
+            parameters['{0}_PROJ'.format(platform.name)] = platform.project
+            parameters['{0}_BUDG'.format(platform.name)] = platform.budget
+            parameters['{0}_TYPE'.format(platform.name)] = platform.type
+            parameters['{0}_VERSION'.format(platform.name)] = platform.version
+            parameters['{0}_SCRATCH_DIR'.format(platform.name)] = platform.scratch
+            parameters['{0}_ROOTDIR'.format(platform.name)] = platform.root_dir
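+            # e.g. a platform section named 'marenostrum' would expose marenostrum_ARCH,
+            # marenostrum_USER, ... to the templates (illustrative name; the key prefix
+            # is platform.name verbatim)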
+
+        platform = platforms[as_conf.get_platform()]
+        parameters['HPCARCH'] = platform.name
+        parameters['HPCUSER'] = platform.user
+        parameters['HPCPROJ'] = platform.project
+        parameters['HPCBUDG'] = platform.budget
+        parameters['HPCTYPE'] = platform.type
+        parameters['HPCVERSION'] = platform.version
+        parameters['SCRATCH_DIR'] = platform.scratch
+        parameters['HPCROOTDIR'] = platform.root_dir
+        Log.debug("Updating parameters...")
+        joblist.update_parameters(parameters)
+
+    @staticmethod
+    def run_experiment(expid):
+        """
+        Runs an experiment (submitting all the jobs properly and repeating their execution in case of failure).
+
+        :type expid: str
+        :param expid: identifier of experiment to be run
+        :return: True if run to the end, False otherwise
+        :rtype: bool
+        """
+        if expid is None:
+            Log.critical("Missing expid.")
+            return False
+        BasicConfig.read()
+        Log.set_file(os.path.join(BasicConfig.LOCAL_ROOT_DIR, expid, BasicConfig.LOCAL_TMP_DIR,
+                                  'run.log'))
+        os.system('clear')
+
+        as_conf = AutosubmitConfig(expid)
+        if not as_conf.check_conf_files():
+            Log.critical('Can not run with invalid configuration')
+            return False
+
+        project_type = as_conf.get_project_type()
+        if project_type != "none":
+            # Check proj configuration
+            as_conf.check_proj()
+
+        expid = as_conf.get_expid()
+        hpcarch = as_conf.get_platform()
+        max_jobs = as_conf.get_total_jobs()
+        max_waiting_jobs = as_conf.get_max_waiting_jobs()
+        safetysleeptime = as_conf.get_safetysleeptime()
+        retrials = as_conf.get_retrials()
+
+        platforms = as_conf.read_platforms_conf()
+        if platforms is None:
+            return False
+
+        Log.debug("The Experiment name is: {0}", expid)
+        Log.debug("Total jobs to submit: {0}", max_jobs)
+        Log.debug("Maximum waiting jobs in platforms: {0}", max_waiting_jobs)
+        Log.debug("Sleep: {0}", safetysleeptime)
+        Log.debug("Retrials: {0}", retrials)
+        Log.info("Starting job submission...")
+
+        # for platforms in platforms:
+        #     signal.signal(signal.SIGQUIT, platforms[platforms].smart_stop)
+        #     signal.signal(signal.SIGINT, platforms[platforms].normal_stop)
+
+        filename = os.path.join(BasicConfig.LOCAL_ROOT_DIR, expid, 'pkl', 'job_list_' + expid + '.pkl')
+        Log.debug(filename)
+
+        # the experiment should be loaded as well
+        if os.path.exists(filename):
+            joblist = cPickle.load(file(filename, 'r'))
+            Log.debug("Starting from joblist pickled in {0}", filename)
+        else:
+            Log.error("The necessary pickle file {0} does not exist.", filename)
+            return False
+
+        Log.debug("Length of joblist: {0}", len(joblist))
+
+        Autosubmit._load_parameters(as_conf, joblist, platforms)
+        # check the job list script creation
+        Log.debug("Checking experiment templates...")
+
+        platforms_to_test = set()
+        for job in joblist.get_job_list():
+            if job.platform_name is None:
+                job.platform_name = hpcarch
+            job.set_platform(platforms[job.platform_name])
+            platforms_to_test.add(platforms[job.platform_name])
+
+        joblist.check_scripts(as_conf)
+
+        # check the availability of the Queues
+        for platform in platforms_to_test:
+            platform.connect()
+            platform.check_remote_log_dir()
+
+        #########################
+        # AUTOSUBMIT - MAIN LOOP
+        #########################
+        # Main loop. Finishing when all jobs have been submitted
+        while joblist.get_active():
+            active = len(joblist.get_running())
+            waiting = len(joblist.get_submitted() + joblist.get_queuing())
+            available = max_waiting_jobs - waiting
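+            # 'available' is how many more jobs can be sent to the platforms without
+            # exceeding the configured maximum of waiting (submitted + queuing) jobs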
+
+            # reload parameters changes
+            Log.debug("Reloading parameters...")
+            as_conf.reload()
+            Autosubmit._load_parameters(as_conf, joblist, platforms)
+
+            # variables to be updated on the fly
+            max_jobs = as_conf.get_total_jobs()
+            Log.debug("Total jobs: {0}".format(max_jobs))
+            total_jobs = len(joblist.get_job_list())
+            Log.info("\n{0} of {1} jobs remaining ({2})".format(total_jobs-len(joblist.get_completed()), total_jobs,
+                                                                strftime("%H:%M")))
+            safetysleeptime = as_conf.get_safetysleeptime()
+            Log.debug("Sleep: {0}", safetysleeptime)
+            retrials = as_conf.get_retrials()
+            Log.debug("Number of retrials: {0}", retrials)
+
+            # read the FAIL_RETRIAL number; if blank at creation time, use a given default
+            # check the availability of the machine; if unavailable, retry on the next iteration after the sleep time
+            # check the availability of jobs; if no new jobs were submitted and none are available, then stop
+
+            # save the job list each iteration so the run can be recovered if autosubmit is interrupted
+            joblist.save()
+
+            Log.info("Active jobs in platforms:\t{0}", active)
+            Log.info("Waiting jobs in platforms:\t{0}", waiting)
+
+            if available == 0:
+                Log.debug("There's no room for more jobs...")
+            else:
+                Log.debug("We can safely submit {0} jobs...", available)
+
+            ######################################
+            # AUTOSUBMIT - ALREADY SUBMITTED JOBS
+            ######################################
+            # get the list of jobs currently in the Queue
+            jobinqueue = joblist.get_in_queue()
+            Log.info("Number of jobs in platforms: {0}", str(len(jobinqueue)))
+
+            for job in jobinqueue:
+
+                job.print_job()
+                Log.debug("Number of jobs in platforms: {0}", str(len(jobinqueue)))
+                # Check platforms availability
+                job_platform = job.get_platform()
+                platform_available = job_platform.check_host()
+                if not platform_available:
+                    Log.debug("There is no platforms available")
+                else:
+                    status = job_platform.check_job(job.id)
+                    if status == Status.COMPLETED:
+                        Log.debug("This job seems to have completed...checking")
+                        job_platform.get_completed_files(job.name)
+                        job.check_completion()
+                    else:
+                        job.status = status
+                    if job.status is Status.QUEUING:
+                        Log.info("Job {0} is QUEUING", job.name)
+                    elif job.status is Status.RUNNING:
+                        Log.info("Job {0} is RUNNING", job.name)
+                    elif job.status is Status.COMPLETED:
+                        Log.result("Job {0} is COMPLETED", job.name)
+                    elif job.status is Status.FAILED:
+                        Log.user_warning("Job {0} is FAILED", job.name)
+                    elif job.status is Status.UNKNOWN:
+                        Log.debug("Job {0} in UNKNOWN status. Checking completed files", job.name)
+                        job_platform.get_completed_files(job.name)
+                        job.check_completion(Status.UNKNOWN)
+                    elif job.status is Status.SUBMITTED:
+                        # after checking the jobs, no job should remain in SUBMITTED status
+                        Log.warning('Job {0} in SUBMITTED status after checking.', job.name)
+
+            # update the job list: jobs whose dependencies are satisfied become READY
+            joblist.update_list()
+
+            ##############################
+            # AUTOSUBMIT - JOBS TO SUBMIT
+            ##############################
+            # get the list of jobs READY
+            jobsavail = joblist.get_ready()
+
+            if min(available, len(jobsavail)) == 0:
+                Log.debug("There is no job READY or available")
+                Log.debug("Number of jobs ready: {0}", len(jobsavail))
+                Log.debug("Number of jobs available in platforms: {0}", available)
+            elif min(available, len(jobsavail)) > 0 and len(jobinqueue) <= max_jobs:
+                Log.info("\nStarting to submit {0} job(s)", min(available, len(jobsavail)))
+                # should sort the jobsavail by priority Clean->post->sim>ini
+                # s = sorted(jobsavail, key=lambda k:k.name.split('_')[1][:6])
+                # probably useless to sort by year before sorting by type
+                s = sorted(jobsavail, key=lambda k: k.long_name.split('_')[1][:6])
+
+                list_of_jobs_avail = sorted(s, key=lambda k: k.priority, reverse=True)
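+                # e.g. for a job named 'b037_20101101_fc3_21_sim' (illustrative) the first sort
+                # key is '201011'; the second, stable sort then orders by descending priority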
+
+                for job in list_of_jobs_avail[0:min(available, len(jobsavail), max_jobs - len(jobinqueue))]:
+                    Log.debug(job.name)
+                    scriptname = job.create_script(as_conf)
+                    Log.debug(scriptname)
+
+                    job_platform = job.get_platform()
+                    platform_available = job_platform.check_host()
+                    if not platform_available:
+                        Log.warning("Queue {0} is not available".format(job_platform.name))
+                    else:
+                        job_platform.send_script(scriptname)
+                        job.id = job_platform.submit_job(scriptname)
+                        if job.id is None:
+                            continue
+                        # set status to "submitted"
+                        job.status = Status.SUBMITTED
+                        Log.info("{0} submitted to {1}\n", job.name, job.get_platform().name)
+
+            time.sleep(safetysleeptime)
+
+        Log.info("No more jobs to run.")
+        if len(joblist.get_failed()) > 0:
+            Log.info("Some jobs have failed and reached maximun retrials")
+            return False
+        else:
+            Log.result("Run successful")
+            return True
+
+    @staticmethod
+    def monitor(expid, file_format):
+        """
+        Plots workflow graph for a given experiment with status of each job coded by node color.
+        Plot is created in experiment's plot folder with name <expid>_<date>_<time>.<file_format>
+
+        :type file_format: str
+        :type expid: str
+        :param expid: identifier of the experiment to plot
+        :param file_format: plot's file format. It can be pdf, png or ps
+        """
+        root_name = 'job_list'
+        BasicConfig.read()
+        Log.set_file(os.path.join(BasicConfig.LOCAL_ROOT_DIR, expid, BasicConfig.LOCAL_TMP_DIR, 'monitor.log'))
+        filename = os.path.join(BasicConfig.LOCAL_ROOT_DIR, expid, 'pkl',  root_name + '_' + expid + '.pkl')
+        Log.info("Getting job list...")
+        Log.debug("JobList: {0}".format(filename))
+        jobs = cPickle.load(file(filename, 'r'))
+        if not isinstance(jobs, list):
+            jobs = jobs.get_job_list()
+
+        Log.info("Plotting...")
+        monitor_exp = Monitor()
+        monitor_exp.generate_output(expid, jobs, file_format)
+        Log.result("Plot ready")
+        return True
+
+    @staticmethod
+    def statistics(expid, file_format):
+        """
+        Plots statistics graph for a given experiment.
+        Plot is created in experiment's plot folder with name <expid>_<date>_<time>.<file_format>
+
+        :type file_format: str
+        :type expid: str
+        :param expid: identifier of the experiment to plot
+        :param file_format: plot's file format. It can be pdf, png or ps
+        """
+        root_name = 'job_list'
+        BasicConfig.read()
+        Log.set_file(os.path.join(BasicConfig.LOCAL_ROOT_DIR, expid, BasicConfig.LOCAL_TMP_DIR,
+                                  'statistics.log'))
+        Log.info("Loading jobs...")
+        filename = os.path.join(BasicConfig.LOCAL_ROOT_DIR, expid, 'pkl', root_name + '_' + expid + '.pkl')
+        jobs = cPickle.load(file(filename, 'r'))
+        # if not isinstance(jobs, type([])):
+        #     jobs = [job for job in jobs.get_finished() if job.type == Type.SIMULATION]
+
+        if len(jobs.get_job_list()) > 0:
+            Log.info("Plotting stats...")
+            monitor_exp = Monitor()
+            monitor_exp.generate_output_stats(expid, jobs.get_job_list(), file_format)
+            Log.result("Stats plot ready")
+        else:
+            Log.info("There are no COMPLETED jobs...")
+        return True
+
+    @staticmethod
+    def clean(expid, project, plot, stats):
+        """
+        Cleans the experiment's directory to save storage space.
+        It removes the project directory and outdated plots or stats.
+
+        :type plot: bool
+        :type project: bool
+        :type expid: str
+        :type stats: bool
+        :param expid: identifier of experiment to clean
+        :param project: set True to delete project directory
+        :param plot: set True to delete outdated plots
+        :param stats: set True to delete outdated stats
+        """
+        BasicConfig.read()
+        Log.set_file(os.path.join(BasicConfig.LOCAL_ROOT_DIR, expid, BasicConfig.LOCAL_TMP_DIR,
+                                  'finalise_exp.log'))
+        if project:
+            autosubmit_config = AutosubmitConfig(expid)
+            if not autosubmit_config.check_conf_files():
+                Log.critical('Can not clean project with invalid configuration')
+                return False
+
+            project_type = autosubmit_config.get_project_type()
+            if project_type == "git":
+                autosubmit_config.check_proj()
+                Log.info("Registering commit SHA...")
+                autosubmit_config.set_git_project_commit()
+                autosubmit_git = AutosubmitGit(expid[0])
+                Log.info("Cleaning GIT directory...")
+                if not autosubmit_git.clean_git():
+                    return False
+            else:
+                Log.info("No project to clean...\n")
+        if plot:
+            Log.info("Cleaning plots...")
+            monitor_autosubmit = Monitor()
+            monitor_autosubmit.clean_plot(expid)
+        if stats:
+            Log.info("Cleaning stats directory...")
+            monitor_autosubmit = Monitor()
+            monitor_autosubmit.clean_stats(expid)
+        return True
+
+    @staticmethod
+    def recovery(expid, save, all_jobs):
+        """
+        Recovers the job list of an experiment: jobs with completed files on their platform are set to
+        COMPLETED, and the remaining non-suspended jobs are set back to WAITING.
+
+        :param expid: identifier of the experiment to recover
+        :type expid: str
+        :param save: If true, recovery saves changes to joblist
+        :type save: bool
+        :param all_jobs: if True, it tries to get completed files for all jobs, not only active.
+        :type all_jobs: bool
+        """
+        root_name = 'job_list'
+        BasicConfig.read()
+
+        Log.set_file(os.path.join(BasicConfig.LOCAL_ROOT_DIR, expid, BasicConfig.LOCAL_TMP_DIR,
+                                  'recovery.log'))
+
+        Log.info('Recovering experiment {0}'.format(expid))
+        path = os.path.join(BasicConfig.LOCAL_ROOT_DIR, expid, "pkl", root_name + "_" + expid + ".pkl")
+        job_list = cPickle.load(file(path, 'r'))
+
+        as_conf = AutosubmitConfig(expid)
+        if not as_conf.check_conf_files():
+            Log.critical('Can not recover with invalid configuration')
+            return False
+
+        hpcarch = as_conf.get_platform()
+
+        platforms = as_conf.read_platforms_conf()
+        if platforms is None:
+            return False
+        for platform in platforms:
+            platforms[platform].connect()
+        if all_jobs:
+            jobs_to_recover = job_list.get_job_list()
+        else:
+            jobs_to_recover = job_list.get_active()
+
+        for job in jobs_to_recover:
+            if job.platform_name is None:
+                job.platform_name = hpcarch
+            job.set_platform(platforms[job.platform_name])
+
+            if job.get_platform().get_completed_files(job.name, 0, True):
+                job.status = Status.COMPLETED
+                Log.info("CHANGED job '{0}' status to COMPLETED".format(job.name))
+            elif job.status != Status.SUSPENDED:
+                job.status = Status.WAITING
+                job.fail_count = 0
+                Log.info("CHANGED job '{0}' status to WAITING".format(job.name))
+
+        sys.setrecursionlimit(50000)
+        job_list.update_list(False)
+        job_list.update_from_file(False)
+
+        if save:
+            job_list.save()
+
+        Log.result("Recovery finalized")
+        monitor_exp = Monitor()
+        monitor_exp.generate_output(expid, job_list.get_job_list())
+        return True
+
+    @staticmethod
+    def check(expid):
+        """
+        Checks experiment configuration and warns about any detected error or inconsistency.
+
+        :param expid: experiment identifier
+        :type expid: str
+        """
+        BasicConfig.read()
+        Log.set_file(os.path.join(BasicConfig.LOCAL_ROOT_DIR, expid, BasicConfig.LOCAL_TMP_DIR, 'check_exp.log'))
+        as_conf = AutosubmitConfig(expid)
+        if not as_conf.check_conf_files():
+            return False
+        project_type = as_conf.get_project_type()
+        if project_type != "none":
+            if not as_conf.check_proj():
+                return False
+
+        platforms = as_conf.read_platforms_conf()
+        if platforms is None:
+            return False
+
+        filename = os.path.join(BasicConfig.LOCAL_ROOT_DIR, expid, 'pkl', 'job_list_' + expid + '.pkl')
+        # the experiment should be loaded as well
+        if os.path.exists(filename):
+            joblist = cPickle.load(file(filename, 'r'))
+            Log.debug("Starting from joblist pickled in {0}", filename)
+        else:
+            Log.error("The necessary pickle file {0} does not exist. Can not check templates!", filename)
+            return False
+
+        Autosubmit._load_parameters(as_conf, joblist, platforms)
+
+        hpcarch = as_conf.get_platform()
+        for job in joblist.get_job_list():
+            if job.platform_name is None:
+                job.platform_name = hpcarch
+            job.set_platform(platforms[job.platform_name])
+
+        return joblist.check_scripts(as_conf)
+
+    @staticmethod
+    def configure(database_path, database_filename, local_root_path, platforms_conf_path, jobs_conf_path,
+                  machine, local):
+        """
+        Configure several paths for autosubmit: database, local root and others. Can be configured at system,
+        user or local levels. Local level configuration precedes user level and user level precedes system
+        configuration.
+
+        :param database_path: path to autosubmit database
+        :type database_path: str
+        :param database_filename: database filename
+        :type database_filename: str
+        :param local_root_path: path to autosubmit's experiments' directory
+        :type local_root_path: str
+        :param platforms_conf_path: path to platforms conf file to be used as model for new experiments
+        :type platforms_conf_path: str
+        :param jobs_conf_path: path to jobs conf file to be used as model for new experiments
+        :type jobs_conf_path: str
+        :param machine: True if this configuration has to be stored for all users on the machine
+        :type machine: bool
+        :param local: True if this configuration has to be stored in the local path
+        :type local: bool
+        """
+        home_path = os.path.expanduser('~')
+        while database_path is None:
+            database_path = raw_input("Introduce Database path: ")
+        database_path = database_path.replace('~', home_path)
+        if not os.path.exists(database_path):
+            Log.error("Database path does not exist.")
+            return False
+
+        while local_root_path is None:
+            local_root_path = raw_input("Introduce Local Root path: ")
+        local_root_path = local_root_path.replace('~', home_path)
+        if not os.path.exists(local_root_path):
+            Log.error("Local Root path does not exist.")
+            return False
+
+        if platforms_conf_path is not None:
+            platforms_conf_path = platforms_conf_path.replace('~', home_path)
+            if not os.path.exists(platforms_conf_path):
+                Log.error("platforms.conf path does not exist.")
+                return False
+        if jobs_conf_path is not None:
+            jobs_conf_path = jobs_conf_path.replace('~', home_path)
+            if not os.path.exists(jobs_conf_path):
+                Log.error("jobs.conf path does not exist.")
+                return False
+
+        if machine:
+            path = '/etc'
+        elif local:
+            path = '.'
+        else:
+            path = home_path
+        path = os.path.join(path, '.autosubmitrc')
+
+        config_file = open(path, 'w')
+        Log.info("Writing configuration file...")
+        try:
+            parser = SafeConfigParser()
+            parser.add_section('database')
+            parser.set('database', 'path', database_path)
+            if database_filename is not None:
+                parser.set('database', 'filename', database_filename)
+            parser.add_section('local')
+            parser.set('local', 'path', local_root_path)
+            if jobs_conf_path is not None or platforms_conf_path is not None:
+                parser.add_section('conf')
+                if jobs_conf_path is not None:
+                    parser.set('conf', 'jobs', jobs_conf_path)
+                if platforms_conf_path is not None:
+                    parser.set('conf', 'queues', platforms_conf_path)
+
+            parser.write(config_file)
+            config_file.close()
+            Log.result("Configuration file written successfully")
+        except (IOError, OSError) as e:
+            Log.critical("Can not write config file: {0}".format(e.message))
+            return False
+        return True
+
+    @staticmethod
+    def install():
+        """
+        Creates a new database instance for autosubmit at the configured path
+        """
+        BasicConfig.read()
+        if not os.path.exists(BasicConfig.DB_PATH):
+            Log.info("Creating autosubmit database...")
+            qry = resource_string('autosubmit.database', 'data/autosubmit.sql')
+            if not create_db(qry):
+                Log.critical("Can not write database file")
+                return False
+            Log.result("Autosubmit database created successfully")
+        else:
+            Log.error("Database already exists.")
+            return False
+        return True
+
+    @staticmethod
+    def refresh(expid):
+        """
+        Refreshes the project folder for a given experiment
+
+        :param expid: experiment identifier
+        :type expid: str
+        """
+        BasicConfig.read()
+        Log.set_file(os.path.join(BasicConfig.LOCAL_ROOT_DIR, expid, BasicConfig.LOCAL_TMP_DIR,
+                                  'refresh.log'))
+        as_conf = AutosubmitConfig(expid)
+        if not as_conf.check_conf_files():
+            Log.critical('Can not refresh project with invalid configuration')
+            return False
+        project_type = as_conf.get_project_type()
+        if Autosubmit._copy_code(as_conf, expid, project_type, True):
+            Log.result("Project folder updated")
+        return True
+
+    @staticmethod
+    def create(expid, noplot):
+        """
+        Creates the job list for a given experiment. Configuration files must be valid before performing this process.
+
+        :param expid: experiment identifier
+        :type expid: str
+        :param noplot: if True, method omits final plotting of the joblist. Only needed on large experiments when
+                       plotting time can be much larger than creation time.
+        :type noplot: bool
+        :return: True if successful, False if not
+        :rtype: bool
+        """
+        BasicConfig.read()
+        Log.set_file(os.path.join(BasicConfig.LOCAL_ROOT_DIR, expid, BasicConfig.LOCAL_TMP_DIR,
+                                  'create_exp.log'))
+        as_conf = AutosubmitConfig(expid)
+        if not as_conf.check_conf_files():
+            Log.critical('Can not create with invalid configuration')
+            return False
+
+        project_type = as_conf.get_project_type()
+
+        if not Autosubmit._copy_code(as_conf, expid, project_type, False):
+            return False
+        if as_conf.get_project_type() != 'none':
+            destiny = as_conf.project_file
+            if not os.path.exists(destiny):
+                shutil.copyfile(os.path.join(as_conf.get_project_dir(), as_conf.get_file_project_conf()), destiny)
+
+        if project_type != "none":
+            # Check project configuration
+            as_conf.check_proj()
+
+        # Load parameters
+        Log.info("Loading parameters...")
+        parameters = as_conf.load_parameters()
+
+        date_list = as_conf.get_date_list()
+        if len(date_list) != len(set(date_list)):
+            Log.error('There are repeated start dates!')
+            return False
+        num_chunks = as_conf.get_num_chunks()
+        member_list = as_conf.get_member_list()
+        if len(member_list) != len(set(member_list)):
+            Log.error('There are repeated member names!')
+            return False
+        rerun = as_conf.get_rerun()
+
+        Log.info("\nCreating joblist...")
+        job_list = JobList(expid)
+
+        date_format = ''
+        if as_conf.get_chunk_size_unit() == 'hour':
+            date_format = 'H'
+        for date in date_list:
+            if date.hour > 1:
+                date_format = 'H'
+            if date.minute > 1:
+                date_format = 'M'
+        job_list.create(date_list, member_list,  num_chunks, parameters, date_format)
+        if rerun == "true":
+            chunk_list = Autosubmit._create_json(as_conf.get_chunk_list())
+            job_list.rerun(chunk_list)
+        else:
+            job_list.remove_rerun_only_jobs()
+
+        pltfrm = as_conf.get_platform()
+        if pltfrm == 'hector' or pltfrm == 'archer':
+            job_list.update_shortened_names()
+
+        Log.info("\nSaving joblist...")
+        job_list.save()
+        if not noplot:
+            Log.info("\nPloting joblist...")
+            monitor_exp = Monitor()
+            monitor_exp.generate_output(expid, job_list.get_job_list(), 'pdf')
+
+        Log.result("\nJob list created succesfully")
+        Log.user_warning("Remember to MODIFY the MODEL config files!")
+        return True
+
+    @staticmethod
+    def _copy_code(as_conf, expid, project_type, force):
+        """
+        Method to copy code from experiment repository to project directory.
+
+        :param as_conf: experiment configuration class
+        :type as_conf: AutosubmitConfig
+        :param expid: experiment identifier
+        :type expid: str
+        :param project_type: project type (git, svn, local)
+        :type project_type: str
+        :param force: if True, overwrites current data
+        :return: True if successful, False if not
+        :rtype: bool
+        """
+        project_destination = as_conf.get_project_destination()
+        if project_type == "git":
+            git_project_origin = as_conf.get_git_project_origin()
+            git_project_branch = as_conf.get_git_project_branch()
+            git_project_commit = as_conf.get_git_project_commit()
+            project_path = os.path.join(BasicConfig.LOCAL_ROOT_DIR, expid, BasicConfig.LOCAL_PROJ_DIR)
+            if os.path.exists(project_path):
+                Log.info("Using project folder: {0}", project_path)
+                if not force:
+                    Log.debug("The project folder exists. SKIPPING...")
+                    return True
+                else:
+                    shutil.rmtree(project_path)
+            os.mkdir(project_path)
+            Log.debug("The project folder {0} has been created.", project_path)
+
+            Log.info("Cloning {0} into {1}", git_project_branch + " " + git_project_origin, project_path)
+            (status, output) = getstatusoutput("cd " + project_path + "; git clone --recursive -b "
+                                               + git_project_branch + " " + git_project_origin + " "
+                                               + project_destination)
+            if status:
+                Log.error("Can not clone {0} into {1}", git_project_branch + " " + git_project_origin, project_path)
+                shutil.rmtree(project_path)
+                return False
+            if git_project_commit:
+                (status, output) = getstatusoutput("cd {0}; git checkout {1} ".format(project_path, git_project_commit))
+                if status:
+                    Log.error("Can not checkout commit {0}", git_project_commit)
+                    shutil.rmtree(project_path)
+                    return False
+
+            Log.debug("{0}", output)
+
+        elif project_type == "svn":
+            svn_project_url = as_conf.get_svn_project_url()
+            svn_project_revision = as_conf.get_svn_project_revision()
+            project_path = os.path.join(BasicConfig.LOCAL_ROOT_DIR, expid, BasicConfig.LOCAL_PROJ_DIR)
+            if os.path.exists(project_path):
+                Log.info("Using project folder: {0}", project_path)
+                if not force:
+                    Log.debug("The project folder exists. SKIPPING...")
+                    return True
+                else:
+                    shutil.rmtree(project_path)
+            os.mkdir(project_path)
+            Log.debug("The project folder {0} has been created.", project_path)
+            Log.info("Checking out revision {0} into {1}", svn_project_revision + " " + svn_project_url, project_path)
+            (status, output) = getstatusoutput("cd " + project_path + "; svn checkout -r " + svn_project_revision +
+                                               " " + svn_project_url + " " + project_destination)
+            if status:
+                Log.error("Can not check out revision {0} into {1}", svn_project_revision + " " + svn_project_url,
+                          project_path)
+                shutil.rmtree(project_path)
+                return False
+            Log.debug("{0}" % output)
+
+        elif project_type == "local":
+            local_project_path = as_conf.get_local_project_path()
+            project_path = os.path.join(BasicConfig.LOCAL_ROOT_DIR, expid, BasicConfig.LOCAL_PROJ_DIR,
+                                        project_destination)
+            if os.path.exists(project_path):
+                Log.info("Using project folder: {0}", project_path)
+                if not force:
+                    Log.debug("The project folder exists. SKIPPING...")
+                    return True
+                else:
+                    shutil.rmtree(project_path)
+            os.mkdir(project_path)
+            Log.debug("The project folder {0} has been created.", project_path)
+
+            Log.info("Copying {0} into {1}", local_project_path, project_path)
+            (status, output) = getstatusoutput("cp -R " + local_project_path + " " + project_path)
+            if status:
+                Log.error("Can not copy {0} into {1}. Exiting...", local_project_path, project_path)
+                shutil.rmtree(project_path)
+                return False
+            Log.debug("{0}", output)
+        return True
+
+    @staticmethod
+    def change_status(final, final_status, job):
+        job.status = final_status
+        Log.info("CHANGED: job: " + job.name + " status to: " + final)
+
+    @staticmethod
+    def set_status(expid, save, final, lst, filter_chunks, filter_status, filter_section):
+        """
+        Sets the status of the selected jobs of an experiment to a given final status. Jobs can be selected by
+        name list, by chunks, by their current status or by their section.
+
+        :param expid: experiment identifier
+        :type expid: str
+        :param save: if True, saves changes to disk
+        :type save: bool
+        :param final: target status for the selected jobs
+        :type final: str
+        :param lst: space-separated list of job names to change, or 'Any'
+        :type lst: str
+        :param filter_chunks: chunk filter expression (e.g. "[ 19601101 [ fc0 [1 2 3 4] ] ]"), or 'Any'
+        :type filter_chunks: str
+        :param filter_status: original status used to select the jobs, or 'Any'
+        :type filter_status: str
+        :param filter_section: job section used to select the jobs, or 'Any'
+        :type filter_section: str
+        """
+        root_name = 'job_list'
+        BasicConfig.read()
+
+        Log.set_file(os.path.join(BasicConfig.LOCAL_ROOT_DIR, expid, BasicConfig.LOCAL_TMP_DIR,
+                                  'change_pkl.log'))
+        Log.debug('Exp ID: {0}', expid)
+        job_list = cPickle.load(file(os.path.join(BasicConfig.LOCAL_ROOT_DIR, expid, 'pkl',  root_name + "_" + expid +
+                                     ".pkl"), 'r'))
+
+        final_status = Autosubmit._get_status(final)
+        if filter_chunks:
+            fc = filter_chunks
+            Log.debug(fc)
+
+            if fc == 'Any':
+                for job in job_list.get_job_list():
+                    Autosubmit.change_status(final, final_status, job)
+            else:
+                data = json.loads(Autosubmit._create_json(fc))
+                for datejson in data['sds']:
+                    date = datejson['sd']
+                    jobs_date = filter(lambda j: j.date == date, job_list.get_job_list())
+
+                    for job in filter(lambda j: j.member is None, jobs_date):
+                            Autosubmit.change_status(final, final_status, job)
+
+                    for memberjson in datejson['ms']:
+                        member = memberjson['m']
+                        jobs_member = filter(lambda j: j.member == member, jobs_date)
+
+                        for job in filter(lambda j: j.chunk is None, jobs_member):
+                            Autosubmit.change_status(final, final_status, job)
+
+                        for chunkjson in memberjson['cs']:
+                            chunk = int(chunkjson)
+                            for job in filter(lambda j: j.chunk == chunk, jobs_member):
+                                Autosubmit.change_status(final, final_status, job)
+
+        if filter_status:
+            Log.debug("Filtering jobs with status {0}", filter_status)
+            if filter_status == 'Any':
+                for job in job_list.get_job_list():
+                    Autosubmit.change_status(final, final_status, job)
+            else:
+                fs = Autosubmit._get_status(filter_status)
+                for job in filter(lambda j: j.status == fs, job_list.get_job_list()):
+                    Autosubmit.change_status(final, final_status, job)
+
+        if filter_section:
+            ft = filter_section
+            Log.debug(ft)
+
+            if ft == 'Any':
+                for job in job_list.get_job_list():
+                    Autosubmit.change_status(final, final_status, job)
+            else:
+                for job in job_list.get_job_list():
+                    if job.section == ft:
+                        Autosubmit.change_status(final, final_status, job)
+
+        if lst:
+            if lst == 'Any':
+                for job in job_list.get_job_list():
+                    Autosubmit.change_status(final, final_status, job)
+            else:
+                jobs = lst.split()
+                for job in job_list.get_job_list():
+                    if job.name in jobs:
+                        Autosubmit.change_status(final, final_status, job)
+
+        sys.setrecursionlimit(50000)
+
+        if save:
+            job_list.update_list()
+            path = os.path.join(BasicConfig.LOCAL_ROOT_DIR, expid, "pkl", root_name + "_" + expid + ".pkl")
+            cPickle.dump(job_list, file(path, 'w'))
+            Log.info("Saving JobList: {0}", path)
+        else:
+            job_list.update_list(False)
+            Log.warning("Changes NOT saved to the JobList!!!!:  use -s option to save")
+
+        monitor_exp = Monitor()
+        monitor_exp.generate_output(expid, job_list.get_job_list())
+        return True
+
+    @staticmethod
+    def _user_yes_no_query(question):
+        """
+        Utility function to ask the user a yes/no question
+
+        :param question: question to ask
+        :type question: str
+        :return: True if answer is yes, False if it is no
+        :rtype: bool
+        """
+        sys.stdout.write('{0} [y/n]\n'.format(question))
+        while True:
+            try:
+                return strtobool(raw_input().lower())
+            except ValueError:
+                sys.stdout.write('Please respond with \'y\' or \'n\'.\n')
+
+    @staticmethod
+    def _prepare_conf_files(exp_id, hpc, autosubmit_version, dummy):
+        """
+        Changes default configuration files to match the new experiment's values
+
+        :param exp_id: experiment identifier
+        :type exp_id: str
+        :param hpc: hpc to use
+        :type hpc: str
+        :param autosubmit_version: current autosubmit's version
+        :type autosubmit_version: str
+        :param dummy: if True, creates a dummy experiment adding some default values
+        :type dummy: bool
+        """
+        as_conf = AutosubmitConfig(exp_id)
+        as_conf.set_version(autosubmit_version)
+        as_conf.set_expid(exp_id)
+        as_conf.set_platform(hpc)
+        as_conf.set_safetysleeptime(10)
+
+        if dummy:
+            content = file(as_conf.experiment_file).read()
+
+            # Experiment
+            content = content.replace(re.search('^DATELIST =.*', content, re.MULTILINE).group(0),
+                                      "DATELIST = 20000101")
+            content = content.replace(re.search('^MEMBERS =.*', content, re.MULTILINE).group(0),
+                                      "MEMBERS = fc0")
+            content = content.replace(re.search('^CHUNKSIZE =.*', content, re.MULTILINE).group(0),
+                                      "CHUNKSIZE = 4")
+            content = content.replace(re.search('^NUMCHUNKS =.*', content, re.MULTILINE).group(0),
+                                      "NUMCHUNKS = 1")
+            content = content.replace(re.search('^PROJECT_TYPE =.*', content, re.MULTILINE).group(0),
+                                      "PROJECT_TYPE = none")
+
+            file(as_conf.experiment_file, 'w').write(content)
+
+    @staticmethod
+    def _get_status(s):
+        """
+        Convert job status from str to Status
+
+        :param s: status string
+        :type s: str
+        :return: status instance
+        :rtype: Status
+        """
+        if s == 'READY':
+            return Status.READY
+        elif s == 'COMPLETED':
+            return Status.COMPLETED
+        elif s == 'WAITING':
+            return Status.WAITING
+        elif s == 'SUSPENDED':
+            return Status.SUSPENDED
+        elif s == 'FAILED':
+            return Status.FAILED
+        elif s == 'RUNNING':
+            return Status.RUNNING
+        elif s == 'QUEUING':
+            return Status.QUEUING
+        elif s == 'UNKNOWN':
+            return Status.UNKNOWN
+
+    @staticmethod
+    def _get_members(out):
+        """
+        Function to get a list of members from json
+
+        :param out: json member definition
+        :type out: list
+        :return: list of members
+        :rtype: list
+        """
+        data = []
+        # members arrive as alternating name / chunk-list pairs
+        for count in range(0, len(out), 2):
+            ms = {'m': out[count], 'cs': Autosubmit._get_chunks(out[count + 1])}
+            data.append(ms)
+
+        return data
+
+    @staticmethod
+    def _get_chunks(out):
+        """
+        Function to get a list of chunks from json
+
+        :param out: json chunk definition
+        :type out: list
+        :return: list of chunks
+        :rtype: list
+        """
+        data = []
+        for element in out:
+            if element.find("-") != -1:
+                numbers = element.split("-")
+                for count in range(int(numbers[0]), int(numbers[1]) + 1):
+                    data.append(str(count))
+            else:
+                data.append(element)
+
+        return data
+
+    @staticmethod
+    def _create_json(text):
+        """
+        Function to parse rerun specification from json format
+
+        :param text: text to parse
+        :type text: str
+        :return: parsed output
+        """
+        data = []
+        # text = "[ 19601101 [ fc0 [1 2 3 4] fc1 [1] ] 16651101 [ fc0 [1-30 31 32] ] ]"
+
+        out = nestedExpr('[', ']').parseString(text).asList()
+
+        # start dates arrive as alternating date / member-list pairs
+        for count in range(0, len(out[0]), 2):
+            sd = {'sd': out[0][count], 'ms': Autosubmit._get_members(out[0][count + 1])}
+            data.append(sd)
+
+        sds = {'sds': data}
+        result = json.dumps(sds)
+        return result
+
+    @staticmethod
+    def test(expid, chunks, member=None, stardate=None, hpc=None, branch=None):
+        """
+        Method to conduct a test for a given experiment. It creates a copy of the experiment with a
+        given number of chunks, a random start date and a random member, to be run on a random HPC.
+
+
+        :param expid: experiment identifier
+        :type expid: str
+        :param chunks: number of chunks to be run by the experiment
+        :type chunks: str
+        :param member: member to be used by the test. If None, a random one from those defined in the
+                       experiment is used.
+        :type member: str
+        :param stardate: start date to be used by the test. If None, a random one from those defined in the
+                         experiment is used.
+        :type stardate: str
+        :param hpc: HPC to be used by the test. If None, a random one from those defined in the
+                    experiment is used.
+        :type hpc: str
+        :param branch: branch or revision to be used by the test. If None, it uses configured branch.
+        :type branch: str
+        :return: True if test was successful, False otherwise
+        :rtype: bool
+        """
+        testid = Autosubmit.expid('test', 'test experiment for {0}'.format(expid), expid, False)
+        if testid == '':
+            return False
+
+        as_conf = AutosubmitConfig(testid)
+        exp_parser = as_conf.get_parser(as_conf.experiment_file)
+        if AutosubmitConfig.get_bool_option(exp_parser, 'rerun', "RERUN", True):
+            Log.error('Can not test a RERUN experiment')
+            Autosubmit.delete(testid, True)
+            return False
+
+        content = file(as_conf.experiment_file).read()
+        if hpc is None:
+            platforms_parser = as_conf.get_parser(as_conf.platforms_file)
+            test_platforms = list()
+            for section in platforms_parser.sections():
+                if AutosubmitConfig.get_option(platforms_parser, section, 'TEST_SUITE', 'false').lower() == 'true':
+                    test_platforms.append(section)
+            if len(test_platforms) == 0:
+                Log.critical('No test HPC defined')
+                return False
+            hpc = random.choice(test_platforms)
+        if member is None:
+            member = random.choice(exp_parser.get('experiment', 'MEMBERS').split(' '))
+        if stardate is None:
+            stardate = random.choice(exp_parser.get('experiment', 'DATELIST').split(' '))
+        # Experiment
+        content = content.replace(re.search('DATELIST =.*', content).group(0),
+                                  "DATELIST = " + stardate)
+        content = content.replace(re.search('MEMBERS =.*', content).group(0),
+                                  "MEMBERS = " + member)
+        content = content.replace(re.search('NUMCHUNKS =.*', content).group(0),
+                                  "NUMCHUNKS = " + chunks)
+        content = content.replace(re.search('HPCARCH =.*', content).group(0),
+                                  "HPCARCH = " + hpc)
+        content = content.replace(re.search('EXPID =.*', content).group(0),
+                                  "EXPID = " + testid)
+        if branch is not None:
+            content = content.replace(re.search('PROJECT_BRANCH =.*', content).group(0),
+                                      "PROJECT_BRANCH = " + branch)
+            content = content.replace(re.search('PROJECT_REVISION =.*', content).group(0),
+                                      "PROJECT_REVISION = " + branch)
+
+        file(as_conf.experiment_file, 'w').write(content)
+
+        Autosubmit.create(testid, False)
+        if not Autosubmit.run_experiment(testid):
+            return False
+        return Autosubmit.delete(testid, True)
+
diff --git a/autosubmit/config/__init__.py b/autosubmit/config/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/autosubmit/config/basicConfig.py b/autosubmit/config/basicConfig.py
new file mode 100755
index 0000000..30491dd
--- /dev/null
+++ b/autosubmit/config/basicConfig.py
@@ -0,0 +1,86 @@
+#!/usr/bin/env python
+
+# Copyright 2014 Climate Forecasting Unit, IC3
+
+# This file is part of Autosubmit.
+
+# Autosubmit is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+
+# Autosubmit is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+
+# You should have received a copy of the GNU General Public License
+# along with Autosubmit.  If not, see <http://www.gnu.org/licenses/>.
+from ConfigParser import SafeConfigParser
+import os
+
+from autosubmit.config.log import Log
+
+
+class BasicConfig:
+    """
+    Class to manage configuration for autosubmit path, database and default values for new experiments
+    """
+    DB_DIR = '~/autosubmit'
+    DB_FILE = 'autosubmit.db'
+    DB_PATH = os.path.join(DB_DIR, DB_FILE)
+    LOCAL_ROOT_DIR = '~/autosubmit'
+    LOCAL_TMP_DIR = 'tmp'
+    LOCAL_PROJ_DIR = 'proj'
+    DEFAULT_PLATFORMS_CONF = ''
+    DEFAULT_JOBS_CONF = ''
+
+    @staticmethod
+    def _update_config():
+        """
+        Updates commonly used composed paths
+        """
+        # Just one needed for the moment.
+        BasicConfig.DB_PATH = os.path.join(BasicConfig.DB_DIR, BasicConfig.DB_FILE)
+
+    @staticmethod
+    def __read_file_config(file_path):
+        """
+        Reads a configuration file. If the configuration file does not exist in the given path,
+        no error is raised. Configuration options are not required to exist either
+
+        :param file_path: configuration file to read
+        :type file_path: str
+        """
+        if not os.path.isfile(file_path):
+            return
+        Log.debug('Reading config from ' + file_path)
+        parser = SafeConfigParser()
+        parser.optionxform = str
+        parser.read(file_path)
+
+        if parser.has_option('database', 'path'):
+            BasicConfig.DB_DIR = parser.get('database', 'path')
+        if parser.has_option('database', 'filename'):
+            BasicConfig.DB_FILE = parser.get('database', 'filename')
+        if parser.has_option('local', 'path'):
+            BasicConfig.LOCAL_ROOT_DIR = parser.get('local', 'path')
+        if parser.has_option('conf', 'queues'):
+            BasicConfig.DEFAULT_PLATFORMS_CONF = parser.get('conf', 'queues')
+        if parser.has_option('conf', 'jobs'):
+            BasicConfig.DEFAULT_JOBS_CONF = parser.get('conf', 'jobs')
+
+    @staticmethod
+    def read():
+        """
+        Reads configuration from .autosubmitrc files: first from /etc, then from the user's home
+        directory and finally from the current path.
+        """
+        filename = '.autosubmitrc'
+
+        BasicConfig.__read_file_config(os.path.join('/etc', filename))
+        BasicConfig.__read_file_config(os.path.join(os.path.expanduser('~'), filename))
+        BasicConfig.__read_file_config(os.path.join('.', filename))
+
+        BasicConfig._update_config()
+        return
\ No newline at end of file
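
A minimal usage sketch for BasicConfig: read() merges any .autosubmitrc files it finds, and the class attributes then hold the effective settings (Python 2 print, matching the codebase):

    from autosubmit.config.basicConfig import BasicConfig

    BasicConfig.read()             # /etc, then home directory, then current path
    print BasicConfig.DB_PATH      # recomposed from DB_DIR and DB_FILE
    print BasicConfig.LOCAL_ROOT_DIR
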
diff --git a/autosubmit/config/config_common.py b/autosubmit/config/config_common.py
new file mode 100644
index 0000000..68fe19d
--- /dev/null
+++ b/autosubmit/config/config_common.py
@@ -0,0 +1,968 @@
+#!/usr/bin/env python
+
+# Copyright 2014 Climate Forecasting Unit, IC3
+
+# This file is part of Autosubmit.
+
+# Autosubmit is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+
+# Autosubmit is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+
+# You should have received a copy of the GNU General Public License
+# along with Autosubmit.  If not, see <http://www.gnu.org/licenses/>.
+from ConfigParser import SafeConfigParser
+import os
+import platform
+from pyparsing import nestedExpr
+import re
+from os import listdir
+from commands import getstatusoutput
+
+from autosubmit.date.chunk_date_lib import parse_date
+
+from autosubmit.config.log import Log
+from autosubmit.config.basicConfig import BasicConfig
+
+from autosubmit.platforms.psplatform import PsPlatform
+from autosubmit.platforms.lsfplatform import LsfPlatform
+from autosubmit.platforms.pbsplatform import PBSPlatform
+from autosubmit.platforms.sgeplatform import SgePlatform
+from autosubmit.platforms.ecplatform import EcPlatform
+from autosubmit.platforms.slurmplatform import SlurmPlatform
+from autosubmit.platforms.localplatform import LocalPlatform
+from autosubmit.platforms.hpcplatform import HPCPlatformException
+
+
+class AutosubmitConfig:
+    """Class to handle experiment configuration coming from file or database"""
+
+    def __init__(self, expid):
+        self.expid = expid
+        self._conf_parser_file = os.path.join(BasicConfig.LOCAL_ROOT_DIR, expid, "conf",
+                                              "autosubmit_" + expid + ".conf")
+        self._exp_parser_file = os.path.join(BasicConfig.LOCAL_ROOT_DIR, expid, "conf",
+                                             "expdef_" + expid + ".conf")
+        self._platforms_parser_file = os.path.join(BasicConfig.LOCAL_ROOT_DIR, expid, "conf",
+                                                   "platforms_" + expid + ".conf")
+        self._jobs_parser_file = os.path.join(BasicConfig.LOCAL_ROOT_DIR, expid, "conf",
+                                              "jobs_" + expid + ".conf")
+        self._proj_parser_file = os.path.join(BasicConfig.LOCAL_ROOT_DIR, expid, "conf",
+                                              "proj_" + expid + ".conf")
+
+    @property
+    def experiment_file(self):
+        """
+        Returns experiment's config file name
+        """
+        return self._exp_parser_file
+
+    @property
+    def platforms_file(self):
+        """
+        Returns experiment's platforms config file name
+
+        :return: platforms config file's name
+        :rtype: str
+        """
+        return self._platforms_parser_file
+
+    @property
+    def project_file(self):
+        """
+        Returns project's config file name
+        """
+        return self._proj_parser_file
+
+    def get_project_dir(self):
+        """
+        Returns experiment's project directory
+
+        :return: experiment's project directory
+        :rtype: str
+        """
+        dir_templates = os.path.join(BasicConfig.LOCAL_ROOT_DIR, self.get_expid(), BasicConfig.LOCAL_PROJ_DIR)
+        # Getting project name for each type of project
+        if self.get_project_type().lower() == "local":
+            dir_templates = os.path.join(dir_templates, os.path.split(self.get_local_project_path())[1])
+        elif self.get_project_type().lower() == "svn":
+            dir_templates = os.path.join(dir_templates, self.get_svn_project_url().split('/')[-1])
+        elif self.get_project_type().lower() == "git":
+            dir_templates = os.path.join(dir_templates, self.get_project_destination())
+        return dir_templates
+
+    def check_conf_files(self):
+        """
+        Checks configuration files (autosubmit, experiment, jobs and platforms), looking for invalid values and
+        missing required options. Prints results to the log
+
+        :return: True if everything is correct, False if any error is found
+        :rtype: bool
+        """
+        Log.info('\nChecking configuration files...')
+        self.reload()
+        result = self._check_autosubmit_conf()
+        result = result and self._check_platforms_conf()
+        result = result and self._check_jobs_conf()
+        result = result and self._check_expdef_conf()
+        if result:
+            Log.result("Configuration files OK\n")
+        else:
+            Log.error("Configuration files invalid\n")
+        return result
+
+    def _check_autosubmit_conf(self):
+        """
+        Checks experiment's autosubmit configuration file.
+
+        :return: True if everything is correct, False if any error is found
+        :rtype: bool
+        """
+        result = True
+        result = result and AutosubmitConfig.check_exists(self._conf_parser, 'config', 'AUTOSUBMIT_VERSION')
+        result = result and AutosubmitConfig.check_is_int(self._conf_parser, 'config', 'MAXWAITINGJOBS', True)
+        result = result and AutosubmitConfig.check_is_int(self._conf_parser, 'config', 'TOTALJOBS', True)
+        result = result and AutosubmitConfig.check_is_int(self._conf_parser, 'config', 'SAFETYSLEEPTIME', True)
+        result = result and AutosubmitConfig.check_is_int(self._conf_parser, 'config', 'RETRIALS', True)
+
+        if not result:
+            Log.critical("{0} is not a valid config file".format(os.path.basename(self._conf_parser_file)))
+        else:
+            Log.info('{0} OK'.format(os.path.basename(self._conf_parser_file)))
+        return result
+
+    def _check_platforms_conf(self):
+        """
+        Checks experiment's platforms configuration file.
+
+        :return: True if everything is correct, False if any error is found
+        :rtype: bool
+        """
+        result = True
+        if len(self._platforms_parser.sections()) == 0:
+            Log.warning("No remote platforms configured")
+
+        if len(self._platforms_parser.sections()) != len(set(self._platforms_parser.sections())):
+            Log.error('There are repeated platform names')
+
+        for section in self._platforms_parser.sections():
+            result = result and AutosubmitConfig.check_is_choice(self._platforms_parser, section, 'TYPE', True,
+                                                                 ['ps', 'pbs', 'sge', 'lsf', 'ecaccess', 'slurm'])
+            platform_type = AutosubmitConfig.get_option(self._platforms_parser, section, 'TYPE', '').lower()
+            if platform_type != 'ps':
+                result = result and AutosubmitConfig.check_exists(self._platforms_parser, section, 'PROJECT')
+                result = result and AutosubmitConfig.check_exists(self._platforms_parser, section, 'USER')
+
+            if platform_type in ['pbs', 'ecaccess']:
+                result = result and AutosubmitConfig.check_exists(self._platforms_parser, section, 'VERSION')
+
+            result = result and AutosubmitConfig.check_exists(self._platforms_parser, section, 'HOST')
+            result = result and AutosubmitConfig.check_is_boolean(self._platforms_parser, section,
+                                                                  'ADD_PROJECT_TO_HOST', False)
+            result = result and AutosubmitConfig.check_is_boolean(self._platforms_parser, section, 'TEST_SUITE', False)
+
+        if not result:
+            Log.critical("{0} is not a valid config file".format(os.path.basename(self._platforms_parser_file)))
+        else:
+            Log.info('{0} OK'.format(os.path.basename(self._platforms_parser_file)))
+        return result
+
+    def _check_jobs_conf(self):
+        """
+        Checks experiment's jobs configuration file.
+
+        :return: True if everything is correct, False if any error is found
+        :rtype: bool
+        """
+        result = True
+        parser = self._jobs_parser
+        sections = parser.sections()
+        if len(sections) == 0:
+            Log.warning("No remote platforms configured")
+
+        if len(sections) != len(set(sections)):
+            Log.error('There are repeated job names')
+
+        for section in sections:
+            result = result and AutosubmitConfig.check_exists(parser, section, 'FILE')
+            result = result and AutosubmitConfig.check_is_boolean(parser, section, 'RERUN_ONLY', False)
+            if parser.has_option(section, 'DEPENDENCIES'):
+                for dependency in str(AutosubmitConfig.get_option(parser, section, 'DEPENDENCIES', '')).split(' '):
+                    if '-' in dependency:
+                        dependency = dependency.split('-')[0]
+                    if dependency not in sections:
+                        Log.error('Job {0} depends on job {1} that is not defined'.format(section, dependency))
+
+            if parser.has_option(section, 'RERUN_DEPENDENCIES'):
+                for dependency in str(AutosubmitConfig.get_option(parser, section, 'RERUN_DEPENDENCIES',
+                                                                  '')).split(' '):
+                    if '-' in dependency:
+                        dependency = dependency.split('-')[0]
+                    if dependency not in sections:
+                        Log.error('Job {0} depends on job {1} that is not defined'.format(section, dependency))
+            result = result and AutosubmitConfig.check_is_choice(parser, section, 'RUNNING', False,
+                                                                 ['once', 'date', 'member', 'chunk'])
+
+        if not result:
+            Log.critical("{0} is not a valid config file".format(os.path.basename(self._jobs_parser_file)))
+        else:
+            Log.info('{0} OK'.format(os.path.basename(self._jobs_parser_file)))
+
+        return result
+
+    def _check_expdef_conf(self):
+        """
+        Checks experiment definition configuration file.
+
+        :return: True if everything is correct, False if any error is found
+        :rtype: bool
+        """
+        result = True
+        parser = self._exp_parser
+
+        result = result and AutosubmitConfig.check_exists(parser, 'DEFAULT', 'EXPID')
+        result = result and AutosubmitConfig.check_exists(parser, 'DEFAULT', 'HPCARCH')
+
+        result = result and AutosubmitConfig.check_exists(parser, 'experiment', 'DATELIST')
+        result = result and AutosubmitConfig.check_exists(parser, 'experiment', 'MEMBERS')
+        result = result and AutosubmitConfig.check_is_choice(parser, 'experiment', 'CHUNKSIZEUNIT', True,
+                                                             ['year', 'month', 'day', 'hour'])
+        result = result and AutosubmitConfig.check_is_int(parser, 'experiment', 'CHUNKSIZE', True)
+        result = result and AutosubmitConfig.check_is_int(parser, 'experiment', 'NUMCHUNKS', True)
+        result = result and AutosubmitConfig.check_is_choice(parser, 'experiment', 'CALENDAR', True,
+                                                             ['standard', 'noleap'])
+
+        result = result and AutosubmitConfig.check_is_boolean(parser, 'rerun', 'RERUN', True)
+
+        if AutosubmitConfig.check_is_choice(parser, 'project', 'PROJECT_TYPE', True,
+                                            ['none', 'git', 'svn', 'local']):
+            project_type = AutosubmitConfig.get_option(parser, 'project', 'PROJECT_TYPE', '')
+
+            if project_type == 'git':
+                result = result and AutosubmitConfig.check_exists(parser, 'git', 'PROJECT_ORIGIN')
+                result = result and AutosubmitConfig.check_exists(parser, 'git', 'PROJECT_BRANCH')
+            elif project_type == 'svn':
+                result = result and AutosubmitConfig.check_exists(parser, 'svn', 'PROJECT_URL')
+                result = result and AutosubmitConfig.check_exists(parser, 'svn', 'PROJECT_REVISION')
+            elif project_type == 'local':
+                result = result and AutosubmitConfig.check_exists(parser, 'local', 'PROJECT_PATH')
+
+            if project_type != 'none':
+                result = result and AutosubmitConfig.check_exists(parser, 'project_files', 'FILE_PROJECT_CONF')
+        else:
+            result = False
+
+        if not result:
+            Log.critical("{0} is not a valid config file".format(os.path.basename(self._exp_parser_file)))
+        else:
+            Log.info('{0} OK'.format(os.path.basename(self._exp_parser_file)))
+        return result
+
+    def check_proj(self):
+        """
+        Checks project config file
+
+        :return: True if everything is correct, False if any error is found
+        :rtype: bool
+        """
+        try:
+            if self._proj_parser_file == '':
+                self._proj_parser = None
+            else:
+                self._proj_parser = AutosubmitConfig.get_parser(self._proj_parser_file)
+            return True
+        except Exception as e:
+            Log.error('Project conf file error: {0}', e)
+            return False
+
+    def reload(self):
+        """
+        Creates parser objects for configuration files
+        """
+        self._conf_parser = AutosubmitConfig.get_parser(self._conf_parser_file)
+        self._platforms_parser = AutosubmitConfig.get_parser(self._platforms_parser_file)
+        self._jobs_parser = AutosubmitConfig.get_parser(self._jobs_parser_file)
+        self._exp_parser = AutosubmitConfig.get_parser(self._exp_parser_file)
+
+    def load_parameters(self):
+        """
+        Loads parameters from experiment and autosubmit config files. If the project type is not none,
+        also loads parameters from the project's config file
+
+        :return: a dictionary mapping parameter names to values
+        :rtype: dict
+        """
+        parameters = dict()
+        for section in self._exp_parser.sections():
+            for option in self._exp_parser.options(section):
+                parameters[option] = self._exp_parser.get(section, option)
+        for section in self._conf_parser.sections():
+            for option in self._conf_parser.options(section):
+                parameters[option] = self._conf_parser.get(section, option)
+
+        project_type = self.get_project_type()
+        if project_type != "none" and self._proj_parser is not None:
+            # Load project parameters
+            Log.debug("Loading project parameters...")
+            parameters2 = parameters.copy()
+            parameters2.update(self.load_project_parameters())
+            parameters = parameters2
+
+        return parameters
+
+    def load_project_parameters(self):
+        """
+        Loads parameters from the project config file
+
+        :return: dictionary mapping parameter names to values
+        :rtype: dict
+        """
+        projdef = []
+        for section in self._proj_parser.sections():
+            projdef += self._proj_parser.items(section)
+
+        parameters = dict()
+        for item in projdef:
+            parameters[item[0]] = item[1]
+
+        return parameters
+
+    @staticmethod
+    def print_parameters(title, parameters):
+        """
+        Prints the parameters in tabular form
+
+        :param title: table's title
+        :type title: str
+        :param parameters: parameters to print
+        :type parameters: list
+        """
+        Log.info(title)
+        Log.info("----------------------")
+        Log.info("{0:<{col1}}| {1:<{col2}}".format("-- Parameter --", "-- Value --", col1=15, col2=15))
+        for i in parameters:
+            Log.info("{0:<{col1}}| {1:<{col2}}".format(i[0], i[1], col1=15, col2=15))
+        Log.info("")
+
+    def check_parameters(self):
+        """
+        Function to check the configuration of Autosubmit.
+
+        :return: True if all parameters have a value set; False if any parameter value is empty.
+        :rtype: bool
+        """
+        result = True
+
+        for section in self._conf_parser.sections():
+            self.print_parameters("AUTOSUBMIT PARAMETERS - " + section, self._conf_parser.items(section))
+            if "" in [item[1] for item in self._conf_parser.items(section)]:
+                result = False
+        for section in self._exp_parser.sections():
+            self.print_parameters("EXPERIMENT PARAMETERS - " + section, self._exp_parser.items(section))
+            if "" in [item[1] for item in self._exp_parser.items(section)]:
+                result = False
+
+        project_type = self.get_project_type()
+        if project_type != "none" and self._proj_parser is not None:
+            for section in self._proj_parser.sections():
+                self.print_parameters("PROJECT PARAMETERS - " + section, self._proj_parser.items(section))
+                if "" in [item[1] for item in self._proj_parser.items(section)]:
+                    result = False
+
+        return result
+
+    def get_expid(self):
+        """
+        Returns experiment identifier read from experiment's config file
+
+        :return: experiment identifier
+        :rtype: str
+        """
+        return self._exp_parser.get('DEFAULT', 'EXPID')
+
+    def set_expid(self, exp_id):
+        """
+        Set experiment identifier in autosubmit and experiment config files
+
+        :param exp_id: experiment identifier to store
+        :type exp_id: str
+        """
+        # Experiment conf
+        content = file(self._exp_parser_file).read()
+        if re.search('EXPID =.*', content):
+            content = content.replace(re.search('EXPID =.*', content).group(0), "EXPID = " + exp_id)
+        file(self._exp_parser_file, 'w').write(content)
+
+        content = file(self._conf_parser_file).read()
+        if re.search('EXPID =.*', content):
+            content = content.replace(re.search('EXPID =.*', content).group(0), "EXPID = " + exp_id)
+        file(self._conf_parser_file, 'w').write(content)
+
+    def get_project_type(self):
+        """
+        Returns project type from experiment config file
+
+        :return: project type
+        :rtype: str
+        """
+        return self._exp_parser.get('project', 'PROJECT_TYPE').lower()
+
+    def get_file_project_conf(self):
+        """
+        Returns path to project config file from experiment config file
+
+        :return: path to project config file
+        :rtype: str
+        """
+        return self._exp_parser.get('project_files', 'FILE_PROJECT_CONF')
+
+    def get_git_project_origin(self):
+        """
+        Returns git origin from experiment config file
+
+        :return: git origin
+        :rtype: str
+        """
+        return self._exp_parser.get('git', 'PROJECT_ORIGIN')
+
+    def get_git_project_branch(self):
+        """
+        Returns git branch from experiment's config file
+
+        :return: git branch
+        :rtype: str
+        """
+        return self._exp_parser.get('git', 'PROJECT_BRANCH')
+
+    def get_git_project_commit(self):
+        """
+        Returns git commit from experiment's config file
+
+        :return: git commit
+        :rtype: str
+        """
+        return self._exp_parser.get('git', 'PROJECT_COMMIT')
+
+    def get_project_destination(self):
+        """
+        Returns project destination folder name from experiment's config file
+
+        :return: project destination
+        :rtype: str
+        """
+        value = self._exp_parser.get('project', 'PROJECT_DESTINATION')
+        if not value:
+            if self.get_project_type().lower() == "local":
+                value = os.path.split(self.get_local_project_path())[1]
+            elif self.get_project_type().lower() == "svn":
+                value = self.get_svn_project_url().split('/')[-1]
+            elif self.get_project_type().lower() == "git":
+                value = self.get_git_project_origin().split('/')[-1].split('.')[-2]
+        return value
+
+    def set_git_project_commit(self):
+        """
+        Registers the commit SHA of the git project version in the experiment configuration file.
+        """
+        save = False
+        project_branch_sha = None
+        full_proj_dir = os.path.join(BasicConfig.LOCAL_ROOT_DIR, self.get_expid(), BasicConfig.LOCAL_PROJ_DIR)
+        project_name = listdir(full_proj_dir)[0]
+        full_project_path = os.path.join(full_proj_dir, project_name)
+        (status1, output) = getstatusoutput("cd " + full_project_path)
+        (status2, output) = getstatusoutput("cd " + full_project_path + "; " +
+                                            "git rev-parse --abbrev-ref HEAD")
+        if status1 == 0 and status2 == 0:
+            project_branch = output
+            Log.debug("Project branch is: " + project_branch)
+
+            (status1, output) = getstatusoutput("cd " + full_project_path)
+            (status2, output) = getstatusoutput("cd " + full_project_path + "; " +
+                                                "git rev-parse HEAD")
+            if status1 == 0 and status2 == 0:
+                project_sha = output
+                save = True
+                Log.debug("Project commit SHA is: " + project_sha)
+                project_branch_sha = project_branch + " " + project_sha
+            else:
+                Log.critical("Failed to retrieve project commit SHA...")
+
+        else:
+            Log.critical("Failed to retrieve project branch...")
+
+        # register changes
+        if save:
+            content = file(self._exp_parser_file).read()
+            if re.search('PROJECT_COMMIT =.*', content):
+                content = content.replace(re.search('PROJECT_COMMIT =.*', content).group(0),
+                                          "PROJECT_COMMIT = " + project_branch_sha)
+            file(self._exp_parser_file, 'w').write(content)
+            Log.debug("Project commit SHA succesfully registered to the configuration file.")
+            return True
+        else:
+            Log.critical("Changes NOT registered to the configuration file...")
+        return False
+
+    def get_svn_project_url(self):
+        """
+        Gets subversion project url
+
+        :return: subversion project url
+        :rtype: str
+        """
+        return self._exp_parser.get('svn', 'PROJECT_URL')
+
+    def get_svn_project_revision(self):
+        """
+        Get revision for subversion project
+
+        :return: revision for subversion project
+        :rtype: str
+        """
+        return self._exp_parser.get('svn', 'PROJECT_REVISION')
+
+    def get_local_project_path(self):
+        """
+        Gets path to origin for local project
+
+        :return: path to local project
+        :rtype: str
+        """
+        return self._exp_parser.get('local', 'PROJECT_PATH')
+
+    def get_date_list(self):
+        """
+        Returns startdates list from experiment's config file
+
+        :return: experiment's startdates
+        :rtype: list
+        """
+        date_list = list()
+        string = self._exp_parser.get('experiment', 'DATELIST')
+        if not string.startswith("["):
+            string = '[{0}]'.format(string)
+        split_string = nestedExpr('[', ']').parseString(string).asList()
+        string_date = None
+        for split in split_string[0]:
+            if type(split) is list:
+                for split_in in split:
+                    date_list.append(parse_date(string_date + split_in))
+                string_date = None
+            else:
+                if string_date is not None:
+                    date_list.append(parse_date(string_date))
+                string_date = split
+        if string_date is not None:
+            date_list.append(parse_date(string_date))
+        return date_list
+
+    def get_num_chunks(self):
+        """
+        Returns number of chunks to run for each member
+
+        :return: number of chunks
+        :rtype: int
+        """
+        return int(self._exp_parser.get('experiment', 'NUMCHUNKS'))
+
+    def get_chunk_size_unit(self):
+        """
+        Unit for the chunk length
+
+        :return: unit for the chunk length. Options: {hour, day, month, year}
+        :rtype: str
+        """
+        return self._exp_parser.get('experiment', 'CHUNKSIZEUNIT').lower()
+
+    def get_member_list(self):
+        """
+        Returns members list from experiment's config file
+
+        :return: experiment's members
+        :rtype: list
+        """
+        return self._exp_parser.get('experiment', 'MEMBERS').split(' ')
+
+    def get_rerun(self):
+        """
+        Returns rerun flag from experiment's config file
+
+        :return: rerun value
+        :rtype: str
+        """
+        return self._exp_parser.get('rerun', 'RERUN').lower()
+
+    def get_chunk_list(self):
+        """
+        Returns chunk list from experiment's config file
+
+        :return: experiment's chunks
+        :rtype: str
+        """
+        return self._exp_parser.get('rerun', 'CHUNKLIST')
+
+    def get_platform(self):
+        """
+        Returns main platform from experiment's config file
+
+        :return: main platform
+        :rtype: str
+        """
+        return self._exp_parser.get('experiment', 'HPCARCH').lower()
+
+    def set_platform(self, hpc):
+        """
+        Sets main platform in experiment's config file
+
+        :param hpc: main platform
+        :type hpc: str
+        """
+        content = file(self._exp_parser_file).read()
+        if re.search('HPCARCH =.*', content):
+            content = content.replace(re.search('HPCARCH =.*', content).group(0), "HPCARCH = " + hpc)
+        file(self._exp_parser_file, 'w').write(content)
+
+    def set_version(self, autosubmit_version):
+        """
+        Sets autosubmit's version in autosubmit's config file
+
+        :param autosubmit_version: autosubmit's version
+        :type autosubmit_version: str
+        """
+        content = file(self._conf_parser_file).read()
+        if re.search('AUTOSUBMIT_VERSION =.*', content):
+            content = content.replace(re.search('AUTOSUBMIT_VERSION =.*', content).group(0),
+                                      "AUTOSUBMIT_VERSION = " + autosubmit_version)
+        file(self._conf_parser_file, 'w').write(content)
+
+    def get_total_jobs(self):
+        """
+        Returns max number of running jobs from autosubmit's config file
+
+        :return: max number of running jobs
+        :rtype: int
+        """
+        return int(self._conf_parser.get('config', 'TOTALJOBS'))
+
+    def get_max_waiting_jobs(self):
+        """
+        Returns max number of waiting jobs from autosubmit's config file
+
+        :return: max number of waiting jobs
+        :rtype: int
+        """
+        return int(self._conf_parser.get('config', 'MAXWAITINGJOBS'))
+
+    def get_safetysleeptime(self):
+        """
+        Returns safety sleep time from autosubmit's config file
+
+        :return: safety sleep time
+        :rtype: int
+        """
+        return int(self._conf_parser.get('config', 'SAFETYSLEEPTIME'))
+
+    def set_safetysleeptime(self, sleep_time):
+        """
+        Sets safety sleep time in autosubmit's config file
+
+        :param sleep_time: value to set
+        :type sleep_time: int
+        """
+        content = file(self._conf_parser_file).read()
+        content = content.replace(re.search('SAFETYSLEEPTIME =.*', content).group(0),
+                                  "SAFETYSLEEPTIME = %d" % sleep_time)
+        file(self._conf_parser_file, 'w').write(content)
+
+    def get_retrials(self):
+        """
+        Returns max number of retrials for job from autosubmit's config file
+
+        :return: max number of retrials
+        :rtype: int
+        """
+        return int(self._conf_parser.get('config', 'RETRIALS'))
+
+    @staticmethod
+    def get_parser(file_path):
+        """
+        Gets parser for given file
+
+        :param file_path: path to file to be parsed
+        :type file_path: str
+        :return: parser
+        :rtype: SafeConfigParser
+        """
+        parser = SafeConfigParser()
+        parser.optionxform = str
+        parser.read(file_path)
+        return parser
+
+    def read_platforms_conf(self):
+        """
+        Reads platforms configuration file and creates the defined platforms. Also adds the local platform to the
+        dictionary
+
+        :return: platforms defined in the file plus the local platform. None if configuration is invalid
+        :rtype: dict
+        """
+        parser = self._platforms_parser
+
+        platforms = dict()
+        local_platform = LocalPlatform(self.expid)
+        local_platform.name = 'local'
+        local_platform.type = 'local'
+        local_platform.version = ''
+        local_platform.queue = ''
+        local_platform.set_host(platform.node())
+        local_platform.set_scratch(os.path.join(BasicConfig.LOCAL_ROOT_DIR, self.expid, BasicConfig.LOCAL_TMP_DIR))
+        local_platform.set_project(self.expid)
+        local_platform.set_budget(self.expid)
+        local_platform.set_user('')
+        local_platform.update_cmds()
+
+        platforms['local'] = local_platform
+        for section in parser.sections():
+            platform_type = AutosubmitConfig.get_option(parser, section, 'TYPE', '').lower()
+            platform_version = AutosubmitConfig.get_option(parser, section, 'VERSION', '')
+            try:
+                if platform_type == 'pbs':
+                    remote_platform = PBSPlatform(self.expid, platform_version)
+                elif platform_type == 'sge':
+                    remote_platform = SgePlatform(self.expid)
+                elif platform_type == 'ps':
+                    remote_platform = PsPlatform(self.expid)
+                elif platform_type == 'lsf':
+                    remote_platform = LsfPlatform(self.expid)
+                elif platform_type == 'ecaccess':
+                    remote_platform = EcPlatform(self.expid, platform_version)
+                elif platform_type == 'slurm':
+                    remote_platform = SlurmPlatform(self.expid)
+                elif platform_type == '':
+                    Log.error("Queue type not specified".format(platform_type))
+                    return None
+                else:
+                    Log.error("Queue type {0} not defined".format(platform_type))
+                    return None
+            except HPCPlatformException as e:
+                Log.error("Queue exception: {0}".format(e.message))
+                return None
+
+            remote_platform.type = platform_type
+            remote_platform.version = platform_version
+            if AutosubmitConfig.get_option(parser, section, 'ADD_PROJECT_TO_HOST', '').lower() == 'true':
+                host = '{0}-{1}'.format(AutosubmitConfig.get_option(parser, section, 'HOST', None),
+                                        AutosubmitConfig.get_option(parser, section, 'PROJECT', None))
+            else:
+                host = AutosubmitConfig.get_option(parser, section, 'HOST', None)
+            remote_platform.set_host(host)
+            remote_platform.set_project(AutosubmitConfig.get_option(parser, section, 'PROJECT', None))
+            remote_platform.set_budget(AutosubmitConfig.get_option(parser, section, 'BUDGET', remote_platform.project))
+            remote_platform.set_user(AutosubmitConfig.get_option(parser, section, 'USER', None))
+            remote_platform.set_scratch(AutosubmitConfig.get_option(parser, section, 'SCRATCH_DIR', None))
+            remote_platform._default_queue = AutosubmitConfig.get_option(parser, section, 'QUEUE', None)
+            remote_platform._serial_queue = AutosubmitConfig.get_option(parser, section, 'SERIAL_QUEUE', None)
+            remote_platform.name = section.lower()
+            remote_platform.update_cmds()
+            platforms[section.lower()] = remote_platform
+
+        for section in parser.sections():
+            if parser.has_option(section, 'SERIAL_PLATFORM'):
+                platforms[section.lower()].set_serial_platform(platforms[AutosubmitConfig.get_option(parser, section,
+                                                                                                     'SERIAL_PLATFORM',
+                                                                                                     None).lower()])
+
+        return platforms
+
+    @staticmethod
+    def get_option(parser, section, option, default):
+        """
+        Gets an option from given parser
+
+        :param parser: parser to use
+        :type parser: SafeConfigParser
+        :param section: section that contains the option
+        :type section: str
+        :param option: option to get
+        :type option: str
+        :param default: value to be returned if option is not present
+        :type default: object
+        :return: option value
+        :rtype: str
+        """
+        if parser.has_option(section, option):
+            return parser.get(section, option)
+        else:
+            return default
+
+    @staticmethod
+    def get_bool_option(parser, section, option, default):
+        """
+        Gets a boolean option from given parser
+
+        :param parser: parser to use
+        :type parser: SafeConfigParser
+        :param section: section that contains the option
+        :type section: str
+        :param option: option to get
+        :type option: str
+        :param default: value to be returned if option is not present
+        :type default: bool
+        :return: option value
+        :rtype: bool
+        """
+        if parser.has_option(section, option):
+            return parser.get(section, option).lower().strip() == 'true'
+        else:
+            return default
+
+    @staticmethod
+    def check_exists(parser, section, option):
+        """
+        Checks if an option exists in given parser
+
+        :param parser: parser to use
+        :type parser: SafeConfigParser
+        :param section: section that contains the option
+        :type section: str
+        :param option: option to check
+        :type option: str
+        :return: True if option exists, False otherwise
+        :rtype: bool
+        """
+        if parser.has_option(section, option):
+            return True
+        else:
+            Log.error('Option {0} in section {1} not found'.format(option, section))
+            return False
+
+    @staticmethod
+    def check_is_boolean(parser, section, option, must_exist):
+        """
+        Checks if an option is a boolean value in given parser
+
+        :param parser: parser to use
+        :type parser: SafeConfigParser
+        :param section: section that contains the option
+        :type section: str
+        :param option: option to check
+        :type option: str
+        :param must_exist: if True, option must exist
+        :type must_exist: bool
+        :return: True if option value is boolean, False otherwise
+        :rtype: bool
+        """
+        if must_exist and not AutosubmitConfig.check_exists(parser, section, option):
+            Log.error('Option {0} in section {1} must exist'.format(option, section))
+            return False
+        if AutosubmitConfig.get_option(parser, section, option, 'false').lower() not in ['false', 'true']:
+            Log.error('Option {0} in section {1} must be true or false'.format(option, section))
+            return False
+        return True
+
+    @staticmethod
+    def check_is_choice(parser, section, option, must_exist, choices):
+        """
+        Checks if an option is a valid choice in given parser
+
+        :param parser: parser to use
+        :type parser: SafeConfigParser
+        :param section: section that contains the option
+        :type section: str
+        :param option: option to check
+        :type option: str
+        :param must_exist: if True, option must exist
+        :type must_exist: bool
+        :param choices: valid choices
+        :type choices: list
+        :return: True if option value is a valid choice, False otherwise
+        :rtype: bool
+        """
+        if must_exist and not AutosubmitConfig.check_exists(parser, section, option):
+            return False
+        value = AutosubmitConfig.get_option(parser, section, option, choices[0])
+        if value.lower() not in choices:
+            Log.error('Value {2} in option {0} in section {1} is not a valid choice'.format(option, section, value))
+            return False
+        return True
+
+    @staticmethod
+    def check_is_int(parser, section, option, must_exist):
+        """
+        Checks if an option is an integer value in given parser
+
+        :param parser: parser to use
+        :type parser: SafeConfigParser
+        :param section: section that contains the option
+        :type section: str
+        :param option: option to check
+        :type option: str
+        :param must_exist: if True, option must exist
+        :type must_exist: bool
+        :return: True if option value is integer, False otherwise
+        :rtype: bool
+        """
+        if must_exist and not AutosubmitConfig.check_exists(parser, section, option):
+            return False
+        value = AutosubmitConfig.get_option(parser, section, option, '1')
+        try:
+            int(value)
+        except ValueError:
+            Log.error('Option {0} in section {1} is not a valid integer'.format(option, section))
+            return False
+        return True
+
+    @staticmethod
+    def check_regex(parser, section, option, must_exist, regex):
+        """
+        Checks if an option complies with a regular expression in given parser
+
+        :param parser: parser to use
+        :type parser: SafeConfigParser
+        :param section: section that contains the option
+        :type section: str
+        :param option: option to check
+        :type option: str
+        :param must_exist: if True, option must exist
+        :type must_exist: bool
+        :param regex: regular expression to check
+        :type regex: str
+        :return: True if option complies with regex, False otherwise
+        :rtype: bool
+        """
+        if must_exist and not AutosubmitConfig.check_exists(parser, section, option):
+            return False
+        prog = re.compile(regex)
+        value = AutosubmitConfig.get_option(parser, section, option, '1')
+        if not prog.match(value):
+            Log.error('Option {0} in section {1} is not valid: {2}'.format(option, section, value))
+            return False
+        return True
+
+    @staticmethod
+    def check_json(key, value):
+        """
+        Checks if value is a valid json
+
+        :param key: key to check
+        :type key: str
+        :param value: value
+        :type value: str
+        :return: True if value is a valid json, False otherwise
+        :rtype: bool
+        """
+        # noinspection PyBroadException
+        try:
+            nestedExpr('[', ']').parseString(value).asList()
+            return True
+        except:
+            Log.error("Invalid value {0}: {1}", key, value)
+            return False
+
+
+
+
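
The static check_* helpers above also work standalone on any parser returned by get_parser; a short sketch (the config file path is hypothetical):

    from autosubmit.config.config_common import AutosubmitConfig

    parser = AutosubmitConfig.get_parser('expdef_a000.conf')
    ok = (AutosubmitConfig.check_is_int(parser, 'experiment', 'NUMCHUNKS', True) and
          AutosubmitConfig.check_is_choice(parser, 'experiment', 'CALENDAR', True,
                                           ['standard', 'noleap']))
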
diff --git a/autosubmit/config/files/autosubmit.conf b/autosubmit/config/files/autosubmit.conf
new file mode 100644
index 0000000..cf56036
--- /dev/null
+++ b/autosubmit/config/files/autosubmit.conf
@@ -0,0 +1,19 @@
+[config]
+# Experiment identifier
+# No need to change
+EXPID =
+# No need to change.
+# Autosubmit version identifier
+AUTOSUBMIT_VERSION =
+# Maximum number of jobs to be waiting in the HPC queue
+# Default = 3
+MAXWAITINGJOBS = 3
+# Maximum number of jobs to be running at the same time at the HPC
+# Default = 6
+TOTALJOBS = 6
+# Time (seconds) between connections to the HPC queue scheduler to poll already submitted jobs status
+# Default = 10
+SAFETYSLEEPTIME = 10
+# Number of retrials if a job fails
+# Default = 4
+RETRIALS = 4
\ No newline at end of file
diff --git a/autosubmit/config/files/expdef.conf b/autosubmit/config/files/expdef.conf
new file mode 100644
index 0000000..ff21fa1
--- /dev/null
+++ b/autosubmit/config/files/expdef.conf
@@ -0,0 +1,64 @@
+[DEFAULT]
+# Experiment identifier
+# No need to change
+EXPID =
+# HPC name.
+# No need to change
+HPCARCH =
+
+[experiment]
+# Supply the list of start dates. Available formats: YYYYMMDD YYYYMMDDhh YYYYMMDDhhmm
+# You can also use an abbreviated syntax for multiple dates with common parts: 200001[01 15] <=> 20000101 20000115
+# DATELIST = 19600101 19650101 19700101
+# DATELIST = 1960[0101 0201 0301]
+DATELIST =
+# Supply the list of members. LIST = fc0 fc1 fc2 fc3 fc4
+MEMBERS =
+# Chunk size unit. STRING = hour, day, month, year
+CHUNKSIZEUNIT = month
+# Chunk size. NUMERIC = 4, 6, 12
+CHUNKSIZE =
+# Total number of chunks in experiment. NUMERIC = 30, 15, 10
+NUMCHUNKS =
+# Calendar used. LIST: standard, noleap
+CALENDAR = standard
+
+[rerun]
+# Is this a rerun or not? [Default: FALSE]. BOOLEAN = TRUE, FALSE
+RERUN = FALSE
+# If RERUN = TRUE then supply the list of chunks to rerun
+# LIST = "[ 19601101 [ fc0 [1 2 3 4] fc1 [1] ] 19651101 [ fc0 [16-30] ] ]"
+CHUNKLIST =
+
+[project]
+# Select project type. STRING = git, svn, local, none
+# If PROJECT_TYPE is set to none, Autosubmit self-contained dummy templates will be used
+PROJECT_TYPE =
+# Destination folder name for project. type = STRING, default = leave empty
+PROJECT_DESTINATION =
+
+# If PROJECT_TYPE is not git, no need to change
+[git]
+# Repository URL  STRING = 'https://github.com/torvalds/linux.git'
+PROJECT_ORIGIN =
+# Select branch or tag, STRING, default = 'master', help = {'master' (default), 'develop', 'v3.1b', ...}
+PROJECT_BRANCH =
+# type = STRING, default = leave empty, help = if model branch is a TAG leave empty
+PROJECT_COMMIT =
+
+# If PROJECT_TYPE is not svn, no need to change
+[svn]
+# type = STRING, help = 'https://svn.ec-earth.org/ecearth3'
+PROJECT_URL =
+# Select revision number. NUMERIC = 1778
+PROJECT_REVISION =
+
+# If PROJECT_TYPE is not local, no need to change
+[local]
+# type = STRING, help = /foo/bar/ecearth 
+PROJECT_PATH =
+
+# If PROJECT_TYPE is none, no need to change
+[project_files]
+# PROJECT CONFIGURATION file location, relative to the project root path
+FILE_PROJECT_CONF =
\ No newline at end of file
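
For example, assuming the expansion performed by get_date_list in config_common.py, the abbreviated syntax:

    DATELIST = 1960[0101 0201] 19700101
    # expands to the start dates 19600101 19600201 19700101
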
diff --git a/autosubmit/config/files/jobs.conf b/autosubmit/config/files/jobs.conf
new file mode 100644
index 0000000..2beaf56
--- /dev/null
+++ b/autosubmit/config/files/jobs.conf
@@ -0,0 +1,75 @@
+# Example job with all options specified
+
+## Job name
+# [JOBNAME]
+## Script to execute. If not specified, the job will be omitted from the workflow.
+## Path relative to the project directory
+# FILE =
+## Platform to execute the job. If not specified, defaults to HPCARCH in the expdef file.
+## LOCAL is always defined and refers to the current machine
+# PLATFORM =
+## Queue to add the job to. If not specified, uses the PLATFORM default.
+# QUEUE =
+## Defines dependencies of the job as a list of parent jobs separated by spaces.
+## For dependencies on jobs in a previous chunk, member or startdate, use -(DISTANCE)
+# DEPENDENCIES = INI SIM-1 CLEAN-2
+## Defines if the job runs once, once per startdate, once per member or once per chunk. Options: once, date, member, chunk.
+## If not specified, defaults to once
+# RUNNING = once
+## Specifies that the job only has to run once every X dates, members or chunks. A job will always be created for the last one.
+## If not specified, defaults to 1
+# FREQUENCY = 3
+## Defines if job is only to be executed in reruns. If not specified, defaults to false.
+# RERUN_ONLY = False
+## Defines jobs that need to be rerun if this job is going to be rerun
+# RERUN_DEPENDENCIES = RERUN INI LOCAL_SETUP REMOTE_SETUP TRANSFER
+## Wallclock to be submitted to the HPC queue in format HH:MM
+# WALLCLOCK = 00:05
+## Processors number to be submitted to the HPC. If not specified, defaults to 1.
+# PROCESSORS = 1
+## Threads number to be submitted to the HPC. If not specified, defaults to 1.
+# THREADS = 1
+## Tasks number to be submitted to the HPC. If not specified, defaults to 1.
+# TASKS = 1
+
+[LOCAL_SETUP]
+FILE = LOCAL_SETUP.sh
+PLATFORM = LOCAL
+
+[REMOTE_SETUP]
+FILE = REMOTE_SETUP.sh
+DEPENDENCIES = LOCAL_SETUP
+WALLCLOCK = 00:05
+
+[INI]
+FILE = INI.sh
+DEPENDENCIES = REMOTE_SETUP
+RUNNING = member
+WALLCLOCK = 00:05
+
+[SIM]
+FILE = SIM.sh
+DEPENDENCIES = INI SIM-1 CLEAN-2
+RUNNING = chunk
+WALLCLOCK = 00:05
+PROCESSORS = 2
+THREADS = 1
+TASKS = 1
+
+[POST]
+FILE = POST.sh
+DEPENDENCIES = SIM
+RUNNING = chunk
+WALLCLOCK = 00:05
+
+[CLEAN]
+FILE = CLEAN.sh
+DEPENDENCIES = POST
+RUNNING = chunk
+WALLCLOCK = 00:05
+
+[TRANSFER]
+FILE = TRANSFER.sh
+PLATFORM = LOCAL
+DEPENDENCIES = CLEAN
+RUNNING = member
\ No newline at end of file
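
Reading the -(DISTANCE) syntax above concretely for the [SIM] job (interpretation based on the comment at the top of this file):

    [SIM]
    DEPENDENCIES = INI SIM-1 CLEAN-2
    # chunk n of SIM waits for its member's INI, for SIM of chunk n-1
    # and for CLEAN of chunk n-2
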
diff --git a/autosubmit/config/files/platforms.conf b/autosubmit/config/files/platforms.conf
new file mode 100644
index 0000000..40c29cb
--- /dev/null
+++ b/autosubmit/config/files/platforms.conf
@@ -0,0 +1,29 @@
+# Example platform with all options specified
+
+## Platform name
+# [PLATFORM]
+## Queue type. Options: PBS, SGE, PS, LSF, ecaccess, SLURM
+# TYPE =
+## Version of queue manager to use. Needed only in PBS (options: 10, 11, 12) and ecaccess (options: pbs, loadleveler)
+# VERSION =
+## Hostname of the HPC
+# HOST =
+## Project for the machine scheduler
+# PROJECT =
+## Budget account for the machine scheduler. If omitted, takes the value defined in PROJECT
+# BUDGET =
+## Option to add project name to host. This is required for some HPCs
+# ADD_PROJECT_TO_HOST = False
+## User for the machine scheduler
+# USER =
+## Path to the scratch directory for the machine
+# SCRATCH_DIR = /scratch
+## If true, autosubmit test command can use this queue as a main queue. Defaults to false
+# TEST_SUITE = False
+## If given, autosubmit will add jobs to the given queue
+# QUEUE =
+## If specified, autosubmit will run jobs with only one processor in the specified platform.
+# SERIAL_PLATFORM = SERIAL_PLATFORM_NAME
+## If specified, autosubmit will run jobs with only one processor in the specified queue.
+## Autosubmit will ignore this configuration if SERIAL_PLATFORM is provided
+# SERIAL_QUEUE = SERIAL_QUEUE_NAME
\ No newline at end of file
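
A minimal filled-in entry satisfying the options required by the configuration checks (every value here is hypothetical):

    [EXAMPLE_LSF]
    TYPE = LSF
    HOST = login.example.org
    PROJECT = myproj
    USER = myuser
    SCRATCH_DIR = /scratch
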
diff --git a/autosubmit/config/log.py b/autosubmit/config/log.py
new file mode 100644
index 0000000..0f7fa7c
--- /dev/null
+++ b/autosubmit/config/log.py
@@ -0,0 +1,245 @@
+#!/usr/bin/env python
+
+# Copyright 2014 Climate Forecasting Unit, IC3
+
+# This file is part of Autosubmit.
+
+# Autosubmit is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+
+# Autosubmit is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+
+# You should have received a copy of the GNU General Public License
+# along with Autosubmit.  If not, see <http://www.gnu.org/licenses/>.
+
+import logging
+import os
+import sys
+from datetime import datetime
+
+
+class LogFormatter:
+    """
+    Class to format log output.
+
+    :param to_file: If True, creates a LogFormatter for files; if False, for console
+    :type to_file: bool
+    """
+    RESULT = '\033[32m'
+    WARNING = '\033[33m'
+    ERROR = '\033[31m'
+    CRITICAL = '\033[1m \033[31m'
+    DEFAULT = '\033[0m\033[39m'
+
+    def __init__(self, to_file=False):
+        """
+        Initializer for LogFormatter
+        """
+        self._file = to_file
+        if self._file:
+            self._formatter = logging.Formatter('%(asctime)s %(message)s')
+        else:
+            self._formatter = logging.Formatter('%(message)s')
+
+    def format(self, record):
+        """
+        Format log output, adding labels if needed for log level. If logging to console, it also manages font color.
+        If logging to file, it adds a timestamp
+
+        :param record: log record to format
+        :type record: LogRecord
+        :return: formatted record
+        :rtype: str
+        """
+        header = ''
+        if record.levelno == Log.RESULT:
+            if not self._file:
+                header = LogFormatter.RESULT
+        elif record.levelno == Log.USER_WARNING:
+            if not self._file:
+                header = LogFormatter.WARNING
+        elif record.levelno == Log.WARNING:
+            if not self._file:
+                header = LogFormatter.WARNING
+            header += "[WARNING] "
+        elif record.levelno == Log.ERROR:
+            if not self._file:
+                header = LogFormatter.ERROR
+            header += "[ERROR] "
+        elif record.levelno == Log.CRITICAL:
+            if not self._file:
+                header = LogFormatter.ERROR
+            header += "[CRITICAL] "
+
+        msg = self._formatter.format(record)
+        if header != '' and not self._file:
+            msg += LogFormatter.DEFAULT
+        return header + msg
+
+
+class Log:
+    """
+    Static class to manage the log for the application. Messages will be sent to console and to file if it is
+    configured. Levels can be set for each output independently. These levels are (from lower to higher priority):
+
+        - EVERYTHING : this level is just defined to show every output
+        - DEBUG
+        - INFO
+        - RESULT
+        - USER_WARNING
+        - WARNING
+        - ERROR
+        - CRITICAL
+        - NO_LOG : this level is just defined to remove every output
+
+    """
+    EVERYTHING = 0
+    DEBUG = logging.DEBUG
+    INFO = logging.INFO
+    RESULT = 25
+    USER_WARNING = 29
+    WARNING = logging.WARNING
+    ERROR = logging.ERROR
+    CRITICAL = logging.CRITICAL
+    NO_LOG = CRITICAL + 1
+
+    logging.basicConfig()
+
+    log = logging.Logger('Autosubmit', EVERYTHING)
+
+    console_handler = logging.StreamHandler(sys.stdout)
+    console_handler.setLevel(INFO)
+    console_handler.setFormatter(LogFormatter(False))
+    log.addHandler(console_handler)
+
+    file_handler = None
+    file_level = INFO
+
+    @staticmethod
+    def set_file(file_path):
+        """
+        Configure the file to store the log. If another file was specified earlier, new messages will only go to the
+        new file.
+
+        :param file_path: file to store the log
+        :type file_path: str
+        """
+        (directory, filename) = os.path.split(file_path)
+        files = [f for f in os.listdir(directory) if os.path.isfile(os.path.join(directory, f)) and
+                 f.endswith(filename)]
+        if len(files) >= 5:
+            files.sort()
+            os.remove(os.path.join(directory, files[0]))
+        file_path = os.path.join(directory, '{0:%Y%m%d_%H%M%S}_'.format(datetime.now()) + filename)
+        if Log.file_handler is not None:
+            Log.log.removeHandler(Log.file_handler)
+        Log.file_handler = logging.FileHandler(file_path, 'w')
+        Log.file_handler.setLevel(Log.file_level)
+        Log.file_handler.setFormatter(LogFormatter(True))
+        Log.log.addHandler(Log.file_handler)
+
+    @staticmethod
+    def set_console_level(level):
+        """
+        Sets log level for logging to console. Every output of level equal or higher to parameter level will be
+        printed on console
+
+        :param level: new level for console
+        :return: None
+        """
+        if type(level) is str:
+            level = getattr(Log, level)
+        Log.console_handler.level = level
+
+    @staticmethod
+    def set_file_level(level):
+        """
+        Sets log level for logging to file. Every output of level equal or higher to parameter level will be
+        added to log file
+
+        :param level: new level for log file
+        """
+        if type(level) is str:
+            level = getattr(Log, level)
+        Log.file_level = level
+        if Log.file_handler is not None:
+            Log.file_handler.level = level
+
+    @staticmethod
+    def debug(msg, *args):
+        """
+        Sends debug information to the log
+
+        :param msg: message to show
+        :param args: arguments for message formatting (it will be done using format() method on str)
+        """
+        Log.log.debug(msg.format(*args))
+
+    @staticmethod
+    def info(msg, *args):
+        """
+        Sends information to the log
+
+        :param msg: message to show
+        :param args: arguments for message formatting (it will be done using format() method on str)
+        """
+        Log.log.info(msg.format(*args))
+
+    @staticmethod
+    def result(msg, *args):
+        """
+        Sends results information to the log. It will be shown in green in the console.
+
+        :param msg: message to show
+        :param args: arguments for message formatting (it will be done using format() method on str)
+        """
+        Log.log.log(Log.RESULT, msg.format(*args))
+
+    @staticmethod
+    def user_warning(msg, *args):
+        """
+        Sends warnings for the user to the log. It will be shown in yellow in the console.
+
+        :param msg: message to show
+        :param args: arguments for message formatting (it will be done using format() method on str)
+        """
+        Log.log.log(Log.USER_WARNING, msg.format(*args))
+
+    @staticmethod
+    def warning(msg, *args):
+        """
+        Sends program warnings to the log. It will be shown in yellow in the console.
+
+        :param msg: message to show
+        :param args: arguments for message formatting (it will be done using the format() method on str)
+        """
+        Log.log.warning(msg.format(*args))
+
+    @staticmethod
+    def error(msg, *args):
+        """
+        Sends errors to the log. It will be shown in red in the console.
+
+        :param msg: message to show
+        :param args: arguments for message formatting (it will be done using the format() method on str)
+        """
+        Log.log.error(msg.format(*args))
+
+    @staticmethod
+    def critical(msg, *args):
+        """
+        Sends critical errors to the log. It will be shown in red in the console.
+
+        :param msg: message to show
+        :param args: arguments for message formatting (it will be done using the format() method on str)
+        """
+        Log.log.critical(msg.format(*args))
+
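+
+# A minimal usage sketch (illustrative; assumes a writable path and the usual
+# DEBUG/INFO level names defined on Log):
+#     Log.set_file('/tmp/autosubmit.log')   # prefixes a timestamp, prunes the oldest copy once five exist
+#     Log.set_console_level('DEBUG')        # level names or numeric values are accepted
+#     Log.info('Experiment {0} registered', 'a000')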
+
diff --git a/autosubmit/database/__init__.py b/autosubmit/database/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/autosubmit/database/data/autosubmit.sql b/autosubmit/database/data/autosubmit.sql
new file mode 100644
index 0000000..90674cd
--- /dev/null
+++ b/autosubmit/database/data/autosubmit.sql
@@ -0,0 +1,10 @@
+CREATE TABLE experiment(
+	id INTEGER NOT NULL PRIMARY KEY AUTOINCREMENT,
+	name VARCHAR NOT NULL,
+	description VARCHAR NOT NULL,
+	autosubmit_version VARCHAR
+    );
+CREATE TABLE db_version(
+    version INTEGER NOT NULL
+    );
+INSERT INTO db_version (version) VALUES (1);
\ No newline at end of file
diff --git a/autosubmit/database/db_common.py b/autosubmit/database/db_common.py
new file mode 100644
index 0000000..fc2c61d
--- /dev/null
+++ b/autosubmit/database/db_common.py
@@ -0,0 +1,423 @@
+#!/usr/bin/env python
+
+# Copyright 2014 Climate Forecasting Unit, IC3
+
+# This file is part of Autosubmit.
+
+# Autosubmit is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+
+# Autosubmit is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+
+# You should have received a copy of the GNU General Public License
+# along with Autosubmit.  If not, see <http://www.gnu.org/licenses/>.
+
+"""
+Module containing functions to manage autosubmit's database.
+"""
+import os
+import sqlite3
+import string
+
+from autosubmit.config.log import Log
+from autosubmit.config.basicConfig import BasicConfig
+
+CURRENT_DATABASE_VERSION = 1
+
+
+def create_db(qry):
+    """
+    Creates a new database for autosubmit
+
+    :param qry: query to create the new database
+    :type qry: str    """
+
+    try:
+        (conn, cursor) = open_conn(False)
+    except DbException as e:
+        Log.error('Connection to database could not be established: {0}', e.message)
+        return False
+
+    try:
+        cursor.execute(qry)
+    except sqlite3.Error:
+        close_conn(conn, cursor)
+        Log.error('The database cannot be created. DB file: ' + BasicConfig.DB_PATH)
+        return False
+
+    conn.commit()
+    close_conn(conn, cursor)
+    return True
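+
+# Hypothetical call with a single DDL statement; multi-statement scripts such as
+# data/autosubmit.sql would need cursor.executescript rather than execute:
+#     create_db('CREATE TABLE db_version(version INTEGER NOT NULL)')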
+
+
+def _set_experiment(name, description, version):
+    """
+    Stores experiment in database
+
+    :param name: experiment's name
+    :type name: str
+    :param description: experiment's description
+    :type description: str
+    :param version: autosubmit version for the experiment
+    :type version: str
+    :return: True if the experiment was stored, False otherwise
+    :rtype: bool
+    """
+    if not check_db():
+        return False
+    name = check_name(name)
+    if name == '':
+        return False
+    try:
+        (conn, cursor) = open_conn()
+    except DbException as e:
+        Log.error('Connection to database could not be established: {0}', e.message)
+        return False
+    try:
+        cursor.execute('INSERT INTO experiment (name, description, autosubmit_version) VALUES (:name, :description, '
+                       ':version)',
+                       {'name': name, 'description': description, 'version': version})
+    except sqlite3.IntegrityError as e:
+        close_conn(conn, cursor)
+        Log.error('Could not register experiment: {0}'.format(e))
+        return False
+
+    conn.commit()
+    close_conn(conn, cursor)
+    return True
+
+
+def check_experiment_exists(name, error_on_inexistence=True):
+    """
+    Checks if an experiment with the given name exists.
+
+    :param name: Experiment name
+    :type name: str
+    :param error_on_inexistence: if True, logs an error when the experiment does not exist
+    :type error_on_inexistence: bool
+    :return: True if the experiment exists, False otherwise
+    :rtype: bool
+    """
+    if not check_db():
+        return False
+    name = check_name(name)
+    if name == '':
+        return False
+
+    try:
+        (conn, cursor) = open_conn()
+    except DbException as e:
+        Log.error('Connection to database could not be established: {0}', e.message)
+        return False
+    conn.isolation_level = None
+
+    # SQLite always returns unicode objects, but we can change this
+    # behaviour with the next statement
+    conn.text_factory = str
+    cursor.execute('select name from experiment where name=:name', {'name': name})
+    row = cursor.fetchone()
+    close_conn(conn, cursor)
+    if row is None:
+        if error_on_inexistence:
+            Log.error('The experiment name "{0}" does not exist yet!!!', name)
+        return False
+    return True
+
+
+def new_experiment(hpc, description, version):
+    """
+    Stores a new experiment on the database and generates its identifier
+
+    :param hpc: name of the main HPC to be used by the experiment
+    :type hpc: str
+    :param description: experiment's description
+    :type description: str
+    :param version: autosubmit version for the experiment
+    :type version: str
+    :return: experiment id for the new experiment, empty string on failure
+    :rtype: str
+    """
+    last_exp_name = last_name_used()
+    if last_exp_name == '':
+        return ''
+    if last_exp_name == 'empty':
+        if hpc == 'test':
+            new_name = 'test0000'
+        else:
+            new_name = 'a000'
+    else:
+        new_name = _next_name(last_exp_name)
+        if new_name == '':
+            return ''
+    while check_experiment_exists(new_name, False):
+        new_name = _next_name(new_name)
+        if new_name == '':
+            return ''
+    if not _set_experiment(new_name, description, version):
+        return ''
+    Log.info('The new experiment "{0}" has been registered.', new_name)
+    return new_name
+
+
+def copy_experiment(name, hpc, description, version):
+    """
+    Creates a new experiment by copying an existing experiment
+
+    :param name: identifier of experiment to copy
+    :type name: str
+    :param hpc: name of the main HPC to be used by the experiment
+    :type hpc: str
+    :param description: experiment's description
+    :type description: str
+    :param version: autosubmit version for the experiment
+    :type version: str
+    :return: experiment id for the new experiment, empty string on failure
+    :rtype: str
+    """
+    if not check_experiment_exists(name):
+        return ''
+    new_name = new_experiment(hpc, description, version)
+    return new_name
+
+
+def base36encode(number, alphabet=string.digits + string.ascii_lowercase):
+    """
+    Convert positive integer to a base36 string.
+
+    :param number: number to convert
+    :type number: int
+    :param alphabet: set of characters to use
+    :type alphabet: str
+    :return: number's base36 string value
+    :rtype: str
+    """
+    if not isinstance(number, (int, long)):
+        raise TypeError('number must be an integer')
+
+    # Special case for zero
+    if number == 0:
+        return '0'
+
+    base36 = ''
+
+    sign = ''
+    if number < 0:
+        sign = '-'
+        number = - number
+
+    while number > 0:
+        number, i = divmod(number, len(alphabet))
+        # noinspection PyAugmentAssignment
+        base36 = alphabet[i] + base36
+
+    return sign + base36.rjust(4, '0')
+
+
+def base36decode(number):
+    """
+    Converts a base36 string to a positive integer
+
+    :param number: base36 string to convert
+    :type number: str
+    :return: number's integer value
+    :rtype: int
+    """
+    return int(number, 36)
+
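+# Illustrative round trip: experiment identifiers advance in base 36, so
+#     base36encode(base36decode('a000') + 1)   # -> 'a001'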
+
+def _next_name(name):
+    """
+    Get next experiment identifier
+
+    :param name: previous experiment identifier
+    :type name: str
+    :return: new experiment identifier
+    :rtype: str
+    """
+    name = check_name(name)
+    if name == '':
+        return ''
+    # Decode the name from base 36, add 1, and encode the result back
+    return base36encode(base36decode(name) + 1)
+
+
+def last_name_used():
+    """
+    Gets the last experiment identifier used
+
+    :return: last experiment identifier used, 'empty' if there is none
+    :rtype: str
+    """
+    if not check_db():
+        return ''
+    try:
+        (conn, cursor) = open_conn()
+    except DbException as e:
+        Log.error('Connection to database could not be established: {0}', e.message)
+        return ''
+    conn.text_factory = str
+    cursor.execute('SELECT name '
+                   'FROM experiment '
+                   'WHERE rowid=(SELECT max(rowid) FROM experiment WHERE autosubmit_version IS NOT NULL AND '
+                   'NOT (autosubmit_version LIKE "%3.0.0b%"))')
+    row = cursor.fetchone()
+    close_conn(conn, cursor)
+    if row is None:
+        return 'empty'
+
+    # If the name starts with a number, return 'empty' (during the 3.0 beta some
+    # experiments starting with numbers were created).
+    try:
+        int(row[0][0])
+        return 'empty'
+    except ValueError:
+        return row[0]
+
+
+def delete_experiment(name):
+    """
+    Removes experiment from database
+
+    :param name: experiment identifier
+    :type name: str
+    :return: True if delete is successful
+    :rtype: bool
+    """
+    if not check_db():
+        return False
+    name = check_name(name)
+    if name == '':
+        return False
+    try:
+        (conn, cursor) = open_conn()
+    except DbException as e:
+        Log.error('Connection to database could not be established: {0}', e.message)
+        return False
+    cursor.execute('DELETE FROM experiment '
+                   'WHERE name=:name', {'name': name})
+    # DELETE returns no result rows; rowcount tells how many rows were removed
+    if cursor.rowcount > 0:
+        Log.debug('The experiment {0} has been deleted!!!', name)
+    close_conn(conn, cursor)
+    return True
+
+
+def check_name(name):
+    """
+    Checks if it is a valid experiment identifier
+
+    :param name: experiment identifier to check
+    :type name: str
+    :return: name in lowercase if valid, empty string otherwise
+    :rtype: str
+    """
+    name = name.lower()
+    if len(name) < 4 or not name.isalnum():
+        Log.error("So sorry, but the name must have at least 4 alphanumeric chars!!!")
+        return ''
+    return name
+
+
+def check_db():
+    """
+    Checks if database file exist
+
+    :return: True if the database file exists, False otherwise
+    :rtype: bool
+    """
+
+    if not os.path.exists(BasicConfig.DB_PATH):
+        Log.error('Database file not found. DB file: ' + BasicConfig.DB_PATH)
+        return False
+    return True
+
+
+def open_conn(check_version=True):
+    """
+    Opens a connection to database
+
+    :param check_version: if True, check the database version and update it if needed
+    :type check_version: bool
+    :return: connection object, cursor object
+    :rtype: sqlite3.Connection, sqlite3.Cursor
+    """
+    conn = sqlite3.connect(BasicConfig.DB_PATH)
+    cursor = conn.cursor()
+
+    # Getting database version
+    if check_version:
+        try:
+            cursor.execute('SELECT version '
+                           'FROM db_version;')
+            row = cursor.fetchone()
+            version = row[0]
+        except sqlite3.OperationalError:
+            # If this exception is thrown it's because db_version does not exist.
+            # Database is from 2.x or 3.0 beta releases
+            try:
+                cursor.execute('SELECT type '
+                               'FROM experiment;')
+                # If type field exists, it's from 2.x
+                version = -1
+            except sqlite3.Error:
+                # If it raises an error, it's from a 3.0 beta release
+                version = 0
+
+        # If the database version is older than expected, update the database...
+        if version < CURRENT_DATABASE_VERSION:
+            if not _update_database(version, cursor):
+                raise DbException('Database version could not be updated')
+
+        # ... or ask for autosubmit upgrade
+        elif version > CURRENT_DATABASE_VERSION:
+            Log.critical('Database version is not compatible with this autosubmit version. Please execute pip install '
+                         'autosubmit --upgrade')
+            raise DbException('Database version not compatible')
+
+    return conn, cursor
+
+
+def close_conn(conn, cursor):
+    """
+    Commits changes and close connection to database
+
+    :param conn: connection to close
+    :type conn: sqlite3.Connection
+    :param cursor: cursor to close
+    :type cursor: sqlite3.Cursor
+    """
+    conn.commit()
+    cursor.close()
+    conn.close()
+    return
+
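+# Sketch of the pattern these helpers follow throughout this module:
+#     try:
+#         (conn, cursor) = open_conn()
+#     except DbException as e:
+#         Log.error('Connection to database could not be established: {0}', e.message)
+#         return False
+#     cursor.execute('SELECT name FROM experiment')
+#     close_conn(conn, cursor)   # commits, then closes cursor and connection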
+
+def _update_database(version, cursor):
+
+    Log.info("Autosubmit's database version is {0}. Current version is {1}. Updating...",
+             version, CURRENT_DATABASE_VERSION)
+    try:
+        # For databases from Autosubmit 2
+        if version <= -1:
+            cursor.executescript('CREATE TABLE experiment_backup(id INTEGER NOT NULL PRIMARY KEY AUTOINCREMENT, '
+                                 'name VARCHAR NOT NULL, type VARCHAR, autosubmit_version VARCHAR, '
+                                 'description VARCHAR NOT NULL, model_branch VARCHAR, template_name VARCHAR, '
+                                 'template_branch VARCHAR, ocean_diagnostics_branch VARCHAR);'
+                                 'INSERT INTO experiment_backup (name,type,description,model_branch,template_name,'
+                                 'template_branch,ocean_diagnostics_branch) SELECT name,type,description,model_branch,'
+                                 'template_name,template_branch,ocean_diagnostics_branch FROM experiment;'
+                                 'UPDATE experiment_backup SET autosubmit_version = "2";'
+                                 'DROP TABLE experiment;'
+                                 'ALTER TABLE experiment_backup RENAME TO experiment;')
+        if version <= 0:
+            # Autosubmit beta version. Create db_version table
+            cursor.executescript('CREATE TABLE db_version(version INTEGER NOT NULL);'
+                                 'INSERT INTO db_version (version) VALUES (1);'
+                                 'ALTER TABLE experiment ADD COLUMN autosubmit_version VARCHAR;'
+                                 'UPDATE experiment SET autosubmit_version = "3.0.0b" '
+                                 'WHERE autosubmit_version NOT NULL;')
+        cursor.execute('UPDATE db_version SET version={0};'.format(CURRENT_DATABASE_VERSION))
+    except sqlite3.Error as e:
+        Log.critical('Cannot update database: {0}', e)
+        return False
+    Log.info("Update completed")
+    return True
+
+
+class DbException(Exception):
+    """
+    Exception class for database errors
+    """
+    def __init__(self, message):
+        self.message = message
diff --git a/autosubmit/date/__init__.py b/autosubmit/date/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/autosubmit/date/chunk_date_lib.py b/autosubmit/date/chunk_date_lib.py
new file mode 100755
index 0000000..32a5c0e
--- /dev/null
+++ b/autosubmit/date/chunk_date_lib.py
@@ -0,0 +1,322 @@
+#!/usr/bin/env python
+
+# Copyright 2014 Climate Forecasting Unit, IC3
+
+# This file is part of Autosubmit.
+
+# Autosubmit is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+
+# Autosubmit is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+
+# You should have received a copy of the GNU General Public License
+# along with Autosubmit.  If not, see <http://www.gnu.org/licenses/>.
+"""
+Tools to manipulate dates and perform date arithmetic between them.
+"""
+
+import datetime
+import calendar
+from dateutil.relativedelta import relativedelta
+
+from autosubmit.config.log import Log
+
+
+def add_time(date, total_size, chunk_unit, cal):
+    """
+    Adds given time to a date
+
+    :param date: base date
+    :type date: datetime.datetime
+    :param total_size: time to add
+    :type total_size: int
+    :param chunk_unit: unit of time to add
+    :type chunk_unit: str
+    :param cal: calendar to use
+    :type cal: str
+    :return: result of adding time to base date
+    :rtype: datetime.datetime
+    """
+    if chunk_unit == 'year':
+        return add_years(date, total_size)
+    elif chunk_unit == 'month':
+        return add_months(date, total_size, cal)
+    elif chunk_unit == 'day':
+        return add_days(date, total_size, cal)
+    elif chunk_unit == 'hour':
+        return add_hours(date, total_size, cal)
+    else:
+        Log.critical('Chunk unit not valid: {0}'.format(chunk_unit))
+
+
+def add_years(date, number_of_years):
+    """
+    Adds years to a date
+
+    :param date: base date
+    :type date: datetime.datetime
+    :param number_of_years: number of years to add
+    :type number_of_years: int
+    :return: base date plus added years
+    :rtype: datetime.datetime
+    """
+    return date + relativedelta(years=number_of_years)
+
+
+def add_months(date, number_of_months, cal):
+    """
+    Adds months to a date
+
+    :param date: base date
+    :type date: datetime.datetime
+    :param number_of_months: number of months to add
+    :type number_of_months: int
+    :param cal: calendar to use
+    :type cal: str
+    :return: base date plus added months
+    :rtype: datetime.datetime
+    """
+    result = date + relativedelta(months=number_of_months)
+    if cal == 'noleap':
+        if result.month == 2 and result.day == 29:
+            result = result - relativedelta(days=1)
+    return result
+
+
+def add_days(date, number_of_days, cal):
+    """
+    Adds days to a date
+
+    :param date: base date
+    :type date: datetime.datetime
+    :param number_of_days: number of days to add
+    :type number_of_days: int
+    :param cal: calendar to use
+    :type cal: str
+    :return: base date plus added days
+    :rtype: date
+    """
+    result = date + relativedelta(days=number_of_days)
+    if cal == 'noleap':
+        year = date.tm_year
+        if date.tm_mon > 2:
+            year += 1
+
+        while year <= result.year:
+            if calendar.isleap(year):
+                if result.year == year and result < datetime.date(year, 2, 29):
+                    year += 1
+                    continue
+                result += relativedelta(days=1)
+            year += 1
+        if result.month == 2 and result.day == 29:
+            result += relativedelta(days=1)
+    return result
+
+
+def sub_days(date, number_of_days, cal):
+    """
+    Subtracts days from a date
+
+    :param date: base date
+    :type date: datetime.datetime
+    :param number_of_days: number of days to subtract
+    :type number_of_days: int
+    :param cal: calendar to use
+    :type cal: str
+    :return: base date minus subtracted days
+    :rtype: datetime.datetime
+    """
+    result = date - relativedelta(days=number_of_days)
+    if cal == 'noleap':
+        year = date.year
+        if date.month <= 2:
+            year -= 1
+
+        while year >= result.year:
+            if calendar.isleap(year):
+                if result.year == year and result > datetime.date(year, 2, 29):
+                    year -= 1
+                    continue
+                result -= relativedelta(days=1)
+            year -= 1
+        if result.month == 2 and result.day == 29:
+            result -= relativedelta(days=1)
+    return result
+
+
+def add_hours(date, number_of_hours, cal):
+    """
+    Adds hours to a date
+
+    :param date: base date
+    :type date: datetime.datetime
+    :param number_of_hours: number of hours to add
+    :type number_of_hours: int
+    :param cal: calendar to use
+    :type cal: str
+    :return: base date plus added hours
+    :rtype: datetime.datetime
+    """
+    result = date + relativedelta(hours=number_of_hours)
+    if cal == 'noleap':
+        year = date.year
+        if date.month > 2:
+            year += 1
+
+        while year <= result.year:
+            if calendar.isleap(year):
+                if result.year == year and result < datetime.date(year, 2, 29):
+                    year += 1
+                    continue
+                result += relativedelta(days=1)
+            year += 1
+        if result.month == 2 and result.day == 29:
+            result += relativedelta(days=1)
+    return result
+
+
+def subs_dates(start_date, end_date, cal):
+    """
+    Gets days between start_date and end_date
+
+    :param start_date: interval's start date
+    :type start_date: datetime.datetime
+    :param end_date: interval's end date
+    :type end_date: datetime.datetime
+    :param cal: calendar to use
+    :type cal: str
+    :return: interval length in days
+    :rtype: int
+    """
+    result = end_date - start_date
+    if cal == 'noleap':
+        year = start_date.year
+        if start_date.month > 2:
+            year += 1
+
+        while year <= end_date.year:
+            if calendar.isleap(year):
+                if end_date.year == year and end_date < datetime.date(year, 2, 29):
+                    year += 1
+                    continue
+                result -= datetime.timedelta(days=1)
+            year += 1
+    return result.days
+
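+# Example: the 'noleap' calendar skips Feb 29, so a span crossing 2012-02-29 is
+# one day shorter than under the standard calendar (sketch):
+#     subs_dates(datetime.datetime(2012, 1, 1), datetime.datetime(2013, 1, 1), 'standard')  # 366
+#     subs_dates(datetime.datetime(2012, 1, 1), datetime.datetime(2013, 1, 1), 'noleap')    # 365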
+
+def chunk_start_date(date, chunk, chunk_length, chunk_unit, cal):
+    """
+    Gets chunk's interval start date
+
+    :param date: start date for member
+    :type date: datetime.datetime
+    :param chunk: number of chunk
+    :type chunk: int
+    :param chunk_length: length of chunks
+    :type chunk_length: int
+    :param chunk_unit: chunk length unit
+    :type chunk_unit: str
+    :param cal: calendar to use
+    :type cal: str
+    :return: chunk's start date
+    :rtype: datetime.datetime
+    """
+    chunk_1 = chunk - 1
+    # the offset is measured in chunk_unit, not necessarily months
+    total_size = chunk_1 * chunk_length
+    result = add_time(date, total_size, chunk_unit, cal)
+    return result
+
+
+def chunk_end_date(start_date, chunk_length, chunk_unit, cal):
+    """
+    Gets chunk interval end date
+
+    :param start_date: chunk's start date
+    :type start_date: datetime.datetime
+    :param chunk_length: length of the chunks
+    :type chunk_length: int
+    :param chunk_unit: chunk length unit
+    :type chunk_unit: str
+    :param cal: calendar to use
+    :type cal: str
+    :return: chunk's end date
+    :rtype: datetime.datetime
+    """
+    return add_time(start_date, chunk_length, chunk_unit, cal)
+
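+# Sketch: with monthly chunks of length 1 starting 1990-01-01, chunk 2 runs from
+# 1990-02-01 (inclusive) to 1990-03-01 (exclusive):
+#     start = chunk_start_date(datetime.datetime(1990, 1, 1), 2, 1, 'month', 'standard')
+#     end = chunk_end_date(start, 1, 'month', 'standard')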
+
+def previous_day(date, cal):
+    """
+    Gets previous day
+
+    :param date: base date
+    :type date: datetime.datetime
+    :param cal: calendar to use
+    :type cal: str
+    :return: base date minus one day
+    :rtype: datetime.datetime
+    """
+    return sub_days(date, 1, cal)
+
+
+def parse_date(string_date):
+    """
+    Parses a string into a datetime object
+
+    :param string_date: string to parse
+    :type string_date: str
+    :rtype: datetime.datetime
+    """
+    if string_date is None:
+        return None
+    length = len(string_date)
+    if length == 6:
+        return datetime.datetime.strptime(string_date, "%Y%m")
+    elif length == 8:
+        return datetime.datetime.strptime(string_date, "%Y%m%d")
+    elif length == 10:
+        return datetime.datetime.strptime(string_date, "%Y%m%d%H")
+    elif length == 12:
+        return datetime.datetime.strptime(string_date, "%Y%m%d%H%M")
+
+
+def date2str(date, date_format=''):
+    """
+    Converts a datetime object to a str
+
+    :param date: date to convert
+    :type date: datetime.datetime
+    :param date_format: 'H' for hourly precision, 'M' for minute precision, anything else for daily
+    :type date_format: str
+    :rtype: str
+    """
+    if date is None:
+        return ''
+    if date_format == 'H':
+        return date.strftime("%Y%m%d%H")
+    elif date_format == 'M':
+        return date.strftime("%Y%m%d%H%M")
+    else:
+        return date.strftime("%Y%m%d")
+
+
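+# Round-trip sketch; the input format is inferred from the string length:
+#     parse_date('19900101')                   # -> datetime.datetime(1990, 1, 1, 0, 0)
+#     date2str(parse_date('1990010112'), 'H')  # -> '1990010112'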
+####################
+# Main Program
+####################
+def main():
+    base_date = datetime.datetime(2010, 5, 1, 12)
+    cal = 'noleap'
+    start_date = chunk_start_date(base_date, 1, 1, 'month', cal)
+    # Log methods format the message with str.format, so pass dates as arguments
+    Log.info('chunk start: {0}', start_date)
+    end_date = chunk_end_date(start_date, 1, 'month', cal)
+    Log.info('chunk end: {0}', end_date)
+    Log.info('yesterday: {0}', previous_day(base_date, cal))
+
+
+if __name__ == "__main__":
+    main()
diff --git a/autosubmit/git/__init__.py b/autosubmit/git/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/autosubmit/git/git_common.py b/autosubmit/git/git_common.py
new file mode 100644
index 0000000..b107f43
--- /dev/null
+++ b/autosubmit/git/git_common.py
@@ -0,0 +1,86 @@
+#!/usr/bin/env python
+
+# Copyright 2014 Climate Forecasting Unit, IC3
+
+# This file is part of Autosubmit.
+
+# Autosubmit is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+
+# Autosubmit is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+
+# You should have received a copy of the GNU General Public License
+# along with Autosubmit.  If not, see <http://www.gnu.org/licenses/>.
+
+from os import path
+from os import listdir
+import os
+from shutil import rmtree
+from commands import getstatusoutput
+
+from autosubmit.config.basicConfig import BasicConfig
+from autosubmit.config.log import Log
+
+
+class AutosubmitGit:
+    """
+    Class to handle experiment git repository
+
+    :param expid: experiment identifier
+    :type expid: str
+    """
+
+    def __init__(self, expid):
+        self._expid = expid
+
+    def clean_git(self):
+        """
+        Cleans the experiment's local project directory to save space, replacing clean repositories with bare clones.
+        """
+        proj_dir = os.path.join(BasicConfig.LOCAL_ROOT_DIR, self._expid, BasicConfig.LOCAL_PROJ_DIR)
+        dirs = listdir(proj_dir)
+        if dirs:
+            Log.debug("Checking git directories status...")
+            for dirname in dirs:
+                dirname_path = os.path.join(proj_dir, dirname)
+                Log.debug("Directory: " + dirname)
+                if path.isdir(dirname_path):
+                    if path.isdir(os.path.join(dirname_path, '.git')):
+                        (status, output) = getstatusoutput("cd " + dirname_path + "; " +
+                                                           "git diff-index HEAD --")
+                        if status == 0:
+                            if output:
+                                Log.info("Changes not commited detected... SKIPPING!")
+                                Log.user_warning("Commit needed!")
+                                return False
+                            else:
+                                (status, output) = getstatusoutput("cd " + dirname_path + "; " +
+                                                                   "git log --branches --not --remotes")
+                                if output:
+                                    Log.info("Changes not pushed detected... SKIPPING!")
+                                    Log.user_warning("Synchronization needed!")
+                                    return False
+                                else:
+                                    Log.debug("Ready to clean...")
+                                    Log.debug("Cloning: 'git clone --bare " + dirname + " " + dirname + ".git' ...")
+                                    # noinspection PyUnusedLocal
+                                    (status, output) = getstatusoutput("cd " + proj_dir + "; " +
+                                                                       "git clone --bare " + dirname +
+                                                                       " " + dirname + ".git")
+                                    Log.debug("Removing: " + dirname)
+                                    rmtree(dirname_path)
+                                    Log.debug(dirname + " directory clean!")
+                                    Log.user_warning("Further runs will require 'git clone {0}.git {0} '...", dirname)
+                        else:
+                            Log.error("Failed to retrieve git info...")
+                            return False
+                    else:
+                        Log.debug("Not a git repository... SKIPPING!")
+                else:
+                    Log.debug("Not a directory... SKIPPING!")
+        return True
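+
+# Hypothetical usage for an experiment 'a000'; clean_git returns False when a
+# commit or push is still pending, or when git information cannot be read:
+#     if not AutosubmitGit('a000').clean_git():
+#         Log.user_warning('Project directory was not cleaned')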
diff --git a/autosubmit/job/__init__.py b/autosubmit/job/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/autosubmit/job/job.py b/autosubmit/job/job.py
new file mode 100644
index 0000000..a33faef
--- /dev/null
+++ b/autosubmit/job/job.py
@@ -0,0 +1,597 @@
+#!/usr/bin/env python
+
+# Copyright 2014 Climate Forecasting Unit, IC3
+
+# This file is part of Autosubmit.
+
+# Autosubmit is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+
+# Autosubmit is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+
+# You should have received a copy of the GNU General Public License
+# along with Autosubmit.  If not, see <http://www.gnu.org/licenses/>.
+
+"""
+Module containing the Job class, used to handle the jobs to be run by autosubmit
+"""
+
+import os
+import re
+
+from autosubmit.job.job_common import Status
+from autosubmit.job.job_common import StatisticsSnippet
+from autosubmit.config.basicConfig import BasicConfig
+from autosubmit.config.log import Log
+from autosubmit.date.chunk_date_lib import chunk_start_date, chunk_end_date, previous_day, subs_dates, date2str
+from autosubmit.platforms.localplatform import LocalPlatform
+
+
+class Job:
+    """
+    Class to handle all the tasks with Jobs at HPC.
+    A job is created by default with a name, a jobid, a status and a type.
+    It can have children and parents. The inheritance reflects the dependency between jobs.
+    If Job2 must wait until Job1 is completed then Job2 is a child of Job1. Inversely Job1 is a parent of Job2
+
+    :param name: job's name
+    :type name: str
+    :param jobid: job's identifier
+    :type jobid: int
+    :param status: job initial status
+    :type status: Status
+    :param priority: job's priority
+    :type priority: int
+    """
+
+    def __str__(self):
+        return self.name
+
+    def __init__(self, name, jobid, status, priority):
+        self._platform = None
+        self._queue = None
+        self.platform_name = None
+        self.section = None
+        self.wallclock = None
+        self.tasks = None
+        self.threads = None
+        self.processors = None
+        self.chunk = None
+        self.member = None
+        self.date = None
+        self.name = name
+        self._long_name = None
+        self.long_name = name
+        self._short_name = None
+        self.short_name = name
+        self.date_format = ''
+
+        self.id = jobid
+        self.file = None
+        self.status = status
+        self.priority = priority
+        self._parents = set()
+        self._children = set()
+        self.fail_count = 0
+        self.expid = name.split('_')[0]
+        self._complete = True
+        self.parameters = dict()
+        self._tmp_path = os.path.join(BasicConfig.LOCAL_ROOT_DIR, self.expid, BasicConfig.LOCAL_TMP_DIR)
+        self._ancestors = None
+
+    def __getstate__(self):
+        odict = self.__dict__
+        if '_platform' in odict:
+            odict = odict.copy()    # copy the dict since we change it
+            del odict['_platform']              # the platform is not picklable, so drop it
+        return odict
+
+    def print_job(self):
+        """
+        Prints debug information about the job
+        """
+        Log.debug('NAME: {0}', self.name)
+        Log.debug('JOBID: {0}', self.id)
+        Log.debug('STATUS: {0}', self.status)
+        Log.debug('TYPE: {0}', self.priority)
+        Log.debug('PARENTS: {0}', [p.name for p in self.parents])
+        Log.debug('CHILDREN: {0}', [c.name for c in self.children])
+        Log.debug('FAIL_COUNT: {0}', self.fail_count)
+        Log.debug('EXPID: {0}', self.expid)
+
+    # Properties
+    @property
+    def parents(self):
+        """
+        Return parent jobs list
+
+        :return: parent jobs
+        :rtype: set
+        """
+        return self._parents
+
+    def get_platform(self):
+        """
+        Returns the platforms to be used by the job. Chooses between serial and parallel platforms
+
+        :return: HPCPlatform object for the job to use
+        :rtype: HPCPlatform
+        """
+        if self.processors > 1:
+            return self._platform
+        else:
+            return self._platform.get_serial_platform()
+
+    def set_platform(self, value):
+        """
+        Sets the HPC platforms to be used by the job.
+
+        :param value: platforms to set
+        :type value: HPCPlatform
+        """
+        self._platform = value
+
+    def get_queue(self):
+        """
+        Returns the queue to be used by the job. Chooses between serial and parallel platforms
+
+        :return: queue for the job to use
+        :rtype: str
+        """
+        if self._queue is not None:
+            return self._queue
+        if self.processors > 1:
+            return self._platform.get_queue()
+        else:
+            return self._platform.get_serial_platform().get_serial_queue()
+
+    def set_queue(self, value):
+        """
+        Sets the queue to be used by the job.
+
+        :param value: queue to set
+        :type value: str
+        """
+        self._queue = value
+
+    @property
+    def ancestors(self):
+        """
+        Returns all job's ancestors
+
+        :return: job ancestors
+        :rtype: set
+        """
+        if self._ancestors is None:
+            self._ancestors = set()
+            if self.has_parents():
+                for parent in self.parents:
+                    self._ancestors.add(parent)
+                    for ancestor in parent.ancestors:
+                        self._ancestors.add(ancestor)
+        return self._ancestors
+
+    @property
+    def children(self):
+        """
+        Returns a list containing all children of the job
+
+        :return: child jobs
+        :rtype: set
+        """
+        return self._children
+
+    @property
+    def long_name(self):
+        """
+        Job's long name. If not set, returns name
+
+        :return: long name
+        :rtype: str
+        """
+        if hasattr(self, '_long_name'):
+            return self._long_name
+        else:
+            return self.name
+
+    @long_name.setter
+    def long_name(self, value):
+        """
+        Sets long name for the job
+
+        :param value: long name to set
+        :type value: str
+        """
+        self._long_name = value
+
+    @property
+    def short_name(self):
+        """
+        Job short name
+
+        :return: short name
+        :rtype: str
+        """
+        return self._short_name
+
+    @short_name.setter
+    def short_name(self, value):
+        """
+        Sets short name
+
+        :param value: short name
+        :type value: str
+        """
+        n = value.split('_')
+        if len(n) == 5:
+            self._short_name = n[1][:6] + "_" + n[2][2:] + "_" + n[3] + n[4][:1]
+        elif len(n) == 4:
+            self._short_name = n[1][:6] + "_" + n[2][2:] + "_" + n[3][:1]
+        elif len(n) == 2:
+            self._short_name = n[1]
+        else:
+            self._short_name = n[0][:15]
+
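+    # Illustrative mapping, assuming the usual expid_date_member_chunk_section
+    # job naming: 'a000_19601101_fc0_1_SIM' yields short name '196011_0_1S'.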
+    def log_job(self):
+        """
+        Prints job information in log
+        """
+        Log.info("{0}\t{1}\t{2}", "Job Name", "Job Id", "Job Status")
+        Log.info("{0}\t\t{1}\t{2}", self.name, self.id, self.status)
+
+    def print_parameters(self):
+        """
+        Prints job parameters in log
+        """
+        Log.info(self.parameters)
+
+    def inc_fail_count(self):
+        """
+        Increments fail count
+        """
+        self.fail_count += 1
+
+    def add_parent(self, *new_parent):
+        """
+        Add parents for the job. It also adds current job as a child for all the new parents
+
+        :param \*new_parent: job parent
+        :type \*new_parent: Job
+        """
+        self._ancestors = None
+        for parent in new_parent:
+            self._parents.add(parent)
+            parent.__add_child(self)
+
+    def __add_child(self, new_child):
+        """
+        Adds a new child to the job
+
+        :param new_child: new child to add
+        :type new_child: Job
+        """
+        self.children.add(new_child)
+
+    def delete_parent(self, parent):
+        """
+        Remove a parent from the job
+
+        :param parent: parent to remove
+        :type parent: Job
+        """
+        self._ancestors = None
+        # careful, it is only possible to remove one parent at a time
+        self.parents.remove(parent)
+
+    def delete_child(self, child):
+        """
+        Removes a child from the job
+
+        :param child: child to remove
+        :type child: Job
+        """
+        # careful it is only possible to remove one child at a time
+        self.children.remove(child)
+
+    def has_children(self):
+        """
+        Returns true if job has any children, else return false
+
+        :return: true if job has any children, otherwise return false
+        :rtype: bool
+        """
+        return len(self.children) > 0
+
+    def has_parents(self):
+        """
+        Returns true if job has any parents, else return false
+
+        :return: true if job has any parent, otherwise return false
+        :rtype: bool
+        """
+        return len(self.parents) > 0
+
+    def compare_by_status(self, other):
+        """
+        Compare jobs by status value
+
+        :param other: job to compare
+        :type other: Job
+        :return: comparison result
+        :rtype: bool
+        """
+        return cmp(self.status, other.status)
+
+    def compare_by_id(self, other):
+        """
+        Compare jobs by ID
+
+        :param other: job to compare
+        :type other: Job
+        :return: comparison result
+        :rtype: bool
+        """
+        return cmp(self.id, other.id)
+
+    def compare_by_name(self, other):
+        """
+        Compare jobs by name
+
+        :param other: job to compare
+        :type other: Job
+        :return: comparison result
+        :rtype: bool
+        """
+        return cmp(self.name, other.name)
+
+    def _get_from_completed(self, index):
+        """
+        Returns value from given index position in completed file associated to job
+
+        :param index: position to retrieve
+        :type index: int
+        :return: value in index position
+        :rtype: str
+        """
+        logname = self._tmp_path + self.name + '_COMPLETED'
+        if os.path.exists(logname):
+            split_line = open(logname).readline().split()
+            if len(split_line) >= index + 1:
+                return split_line[index]
+            else:
+                return 0
+        else:
+            return 0
+
+    def check_end_time(self):
+        """
+        Returns end time from completed file
+
+        :return: completed date and time
+        :rtype: str
+        """
+        return self._get_from_completed(0)
+
+    def check_queued_time(self):
+        """
+        Returns job's waiting time in HPC
+
+        :return: total time waiting in HPC platforms
+        :rtype: str
+        """
+        return self._get_from_completed(1)
+
+    def check_run_time(self):
+        """
+        Returns job's running time
+
+        :return: total time running
+        :rtype: str
+        """
+        return self._get_from_completed(2)
+
+    def check_failed_times(self):
+        """
+        Returns number of failed attempts before completing the job
+
+        :return: failed attempts to run
+        :rtype: str
+        """
+        return self._get_from_completed(3)
+
+    def check_fail_queued_time(self):
+        """
+        Returns total time spent waiting for failed jobs
+
+        :return: total time waiting in HPC platforms for failed jobs
+        :rtype: str
+        """
+        return self._get_from_completed(4)
+
+    def check_fail_run_time(self):
+        """
+        Returns total time running for failed jobs
+
+        :return: total time running in HPC  for failed jobs
+        :rtype: str
+        """
+        return self._get_from_completed(5)
+
+    def check_completion(self, default_status=Status.FAILED):
+        """
+        Checks the presence of the *COMPLETED* file and touches a Checked or Failed file.
+        Changes status to COMPLETED if the *COMPLETED* file exists, to default_status otherwise.
+
+        :param default_status: status to set when the *COMPLETED* file is missing
+        :type default_status: Status
+        """
+        logname = os.path.join(self._tmp_path, self.name + '_COMPLETED')
+        if os.path.exists(logname):
+            self._complete = True
+            os.system('touch ' + os.path.join(self._tmp_path, self.name + 'Checked'))
+            self.status = Status.COMPLETED
+        else:
+            os.system('touch ' + os.path.join(self._tmp_path, self.name + 'Failed'))
+            self.status = default_status
+
+    def remove_dependencies(self):
+        """
+        Checks if the job is completed and, if so, removes it as a dependency from its children
+        """
+        if self._complete:
+            self.status = Status.COMPLETED
+            for child in self.children:
+                if self in child.parents:
+                    child.delete_parent(self)
+        else:
+            self.status = Status.FAILED
+
+    def update_parameters(self, as_conf):
+        """
+        Refresh parameters value
+
+        :param as_conf: experiment configuration
+        :type as_conf: AutosubmitConfig
+        """
+        parameters = self.parameters
+        parameters['JOBNAME'] = self.name
+        parameters['FAIL_COUNT'] = str(self.fail_count)
+
+        parameters['SDATE'] = date2str(self.date, self.date_format)
+        parameters['MEMBER'] = self.member
+        if self.date is not None:
+            if self.chunk is None:
+                chunk = 1
+            else:
+                chunk = self.chunk
+
+            parameters['CHUNK'] = chunk
+            total_chunk = int(parameters['NUMCHUNKS'])
+            chunk_length = int(parameters['CHUNKSIZE'])
+            chunk_unit = parameters['CHUNKSIZEUNIT'].lower()
+            cal = parameters['CALENDAR'].lower()
+            chunk_start = chunk_start_date(self.date, chunk, chunk_length, chunk_unit, cal)
+            chunk_end = chunk_end_date(chunk_start, chunk_length, chunk_unit, cal)
+            chunk_end_1 = previous_day(chunk_end, cal)
+            parameters['DAY_BEFORE'] = date2str(previous_day(self.date, cal), self.date_format)
+            parameters['Chunk_START_DATE'] = date2str(chunk_start, self.date_format)
+            parameters['Chunk_END_DATE'] = date2str(chunk_end_1, self.date_format)
+            parameters['RUN_DAYS'] = str(subs_dates(chunk_start, chunk_end, cal))
+            parameters['Chunk_End_IN_DAYS'] = str(subs_dates(self.date, chunk_end, cal))
+
+            parameters['Chunk_START_YEAR'] = str(self.date.year)
+            parameters['Chunk_START_MONTH'] = str(self.date.month)
+
+            parameters['PREV'] = str(subs_dates(self.date, chunk_start, cal))
+
+            if chunk == 1:
+                parameters['Chunk_FIRST'] = 'TRUE'
+            else:
+                parameters['Chunk_FIRST'] = 'FALSE'
+
+            if total_chunk == chunk:
+                parameters['Chunk_LAST'] = 'TRUE'
+            else:
+                parameters['Chunk_LAST'] = 'FALSE'
+
+        parameters['NUMPROC'] = self.processors
+        parameters['NUMTHREADS'] = self.threads
+        parameters['NUMTASK'] = self.tasks
+        parameters['WALLCLOCK'] = self.wallclock
+        parameters['TASKTYPE'] = self.section
+
+        job_platform = self.get_platform()
+        parameters['CURRENT_ARCH'] = job_platform.name
+        parameters['CURRENT_QUEUE'] = self.get_queue()
+        parameters['CURRENT_USER'] = job_platform.user
+        parameters['CURRENT_PROJ'] = job_platform.project
+        parameters['CURRENT_BUDG'] = job_platform.budget
+        parameters['CURRENT_TYPE'] = job_platform.type
+        parameters['CURRENT_VERSION'] = job_platform.version
+        parameters['CURRENT_SCRATCH_DIR'] = job_platform.scratch
+        parameters['CURRENT_ROOTDIR'] = job_platform.root_dir
+
+        parameters['ROOTDIR'] = os.path.join(BasicConfig.LOCAL_ROOT_DIR, self.expid)
+        parameters['PROJDIR'] = as_conf.get_project_dir()
+
+        self.parameters = parameters
+
+        return parameters
+
+    def update_content(self, project_dir):
+        """
+        Create the script content to be run for the job
+
+        :param project_dir: project directory
+        :type project_dir: str
+        :return: script code
+        :rtype: str
+        """
+        if self.parameters['PROJECT_TYPE'].lower() != "none":
+            template_file = file(os.path.join(project_dir, self.file), 'r')
+            template = template_file.read()
+        else:
+            template = ''
+        current_platform = self.get_platform()
+        if isinstance(current_platform, LocalPlatform):
+            stats_header = StatisticsSnippet.AS_HEADER_LOC
+            stats_tailer = StatisticsSnippet.AS_TAILER_LOC
+        else:
+            stats_header = StatisticsSnippet.AS_HEADER_REM
+            stats_tailer = StatisticsSnippet.AS_TAILER_REM
+
+        template_content = ''.join([current_platform.get_header(self),
+                                   stats_header,
+                                   template,
+                                   stats_tailer])
+
+        return template_content
+
+    def create_script(self, as_conf):
+        """
+        Creates script file to be run for the job
+
+        :param as_conf: configuration object
+        :type as_conf: AutosubmitConfig
+        :return: script's filename
+        :rtype: str
+        """
+        parameters = self.update_parameters(as_conf)
+        template_content = self.update_content(as_conf.get_project_dir())
+        # print "jobType: %s" % self._type
+        # print template_content
+
+        for key, value in parameters.items():
+            # print "%s:\t%s" % (key,parameters[key])
+            template_content = template_content.replace("%" + key + "%", str(parameters[key]))
+
+        scriptname = self.name + '.cmd'
+        file(os.path.join(self._tmp_path, scriptname), 'w').write(template_content)
+        os.chmod(os.path.join(self._tmp_path, scriptname), 0775)
+
+        return scriptname
+
+    def check_script(self, as_conf):
+        """
+        Checks if script is well formed
+
+        :param as_conf: configuration file
+        :type as_conf: AutosubmitConfig
+        :return: true if no problem has been detected, false otherwise
+        :rtype: bool
+        """
+        parameters = self.update_parameters(as_conf)
+        template_content = self.update_content(as_conf.get_project_dir())
+
+        variables = re.findall('%(\w+)%', template_content)
+        # variables += re.findall('%%'+'(.+?)'+'%%', template_content)
+        out = set(parameters).issuperset(set(variables))
+
+        if not out:
+            Log.warning("The following set of variables to be substituted in template script is not part of "
+                        "parameters set: {0}", str(set(variables) - set(parameters)))
+        else:
+            self.create_script(as_conf)
+
+        return out
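+
+# The scan above is a plain regex over %NAME% placeholders, e.g. (sketch):
+#     re.findall('%(\w+)%', 'cd %ROOTDIR%; %JOBNAME%.cmd')   # ['ROOTDIR', 'JOBNAME']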
diff --git a/autosubmit/job/job_common.py b/autosubmit/job/job_common.py
new file mode 100644
index 0000000..48095a1
--- /dev/null
+++ b/autosubmit/job/job_common.py
@@ -0,0 +1,175 @@
+#!/usr/bin/env python
+
+# Copyright 2014 Climate Forecasting Unit, IC3
+
+# This file is part of Autosubmit.
+
+# Autosubmit is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+
+# Autosubmit is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+
+# You should have received a copy of the GNU General Public License
+# along with Autosubmit.  If not, see <http://www.gnu.org/licenses/>.
+import textwrap
+
+
+class Status:
+    """
+    Class to handle the status of a job
+    """
+    WAITING = 0
+    READY = 1
+    SUBMITTED = 2
+    QUEUING = 3
+    RUNNING = 4
+    COMPLETED = 5
+    FAILED = -1
+    UNKNOWN = -2
+    SUSPENDED = -3
+
+    def retval(self, value):
+        return getattr(self, value)
+
+
+# noinspection PyPep8
+class StatisticsSnippet:
+    """
+    Class to handle the statistics snippet of a job. It contains header and tailer for
+    local and remote jobs
+    """
+
+    AS_HEADER_LOC = textwrap.dedent("""\
+
+            ###################
+            # Autosubmit header
+            ###################
+
+            set -x
+            job_name_ptrn=%ROOTDIR%/tmp/LOG_%EXPID%/%JOBNAME%
+            job_cmd_stamp=$(stat -c %Z $job_name_ptrn.cmd)
+            job_start_time=$(date +%s)
+
+            rm -f ${job_name_ptrn}_COMPLETED
+
+            ###################
+            # Autosubmit job
+            ###################
+
+            """)
+
+    # noinspection PyPep8
+    AS_TAILER_LOC = textwrap.dedent("""\
+            ###################
+            # Autosubmit tailer
+            ###################
+
+            set -x
+            job_end_time=$(date +%s)
+            job_run_time=$((job_end_time - job_start_time))
+            errfile_ptrn="\.e"
+
+            failed_jobs=$(($(ls -1 ${job_name_ptrn}* | grep $errfile_ptrn | wc -l) - 1))
+            failed_errfiles=$(ls -1 ${job_name_ptrn}* | grep $errfile_ptrn | head -n $failed_jobs)
+            failed_jobs_rt=0
+
+            for failed_errfile in $failed_errfiles; do
+                failed_errfile_stamp=$(stat -c %Z $failed_errfile)
+                failed_jobs_rt=$((failed_jobs_rt + $((failed_errfile_stamp - $(grep "job_start_time=" $failed_errfile | head -n 2 | tail -n 1 | cut -d '=' -f 2)))))
+            done
+            echo "
+            $job_end_time 0 $job_run_time $failed_jobs 0 $failed_jobs_rt" > ${job_name_ptrn}_COMPLETED
+            exit 0
+            """)
+
+    AS_HEADER_REM = textwrap.dedent("""\
+
+            ###################
+            # Autosubmit header
+            ###################
+
+            set -x
+            job_name_ptrn=%SCRATCH_DIR%/%HPCPROJ%/%HPCUSER%/%EXPID%/LOG_%EXPID%/%JOBNAME%
+            job_cmd_stamp=$(stat -c %Z $job_name_ptrn.cmd)
+            job_start_time=$(date +%s)
+            job_queue_time=$((job_start_time - job_cmd_stamp))
+
+            if [[ %HPCTYPE% == ecaccess ]]; then
+              hpcversion=%HPCVERSION%
+              if [[ -n ${hpcversion} ]]; then
+                if [[ $hpcversion == pbs ]]; then
+                  filein="$(ls -rt %SCRATCH_DIR%/%HPCPROJ%/%HPCUSER%/.ecaccess_DO_NOT_REMOVE/job.i* | xargs grep -l %JOBNAME% | tail -1)"
+                  jobid="$(echo "$filein" | cut -d. -f3 | cut -c2-)"
+                  fileout="%SCRATCH_DIR%/%HPCPROJ%/%HPCUSER%/.ecaccess_DO_NOT_REMOVE/job.o"$jobid"_0"
+                  ln -s ${fileout} ${job_name_ptrn}_${jobid}.out
+                  fileerr="%SCRATCH_DIR%/%HPCPROJ%/%HPCUSER%/.ecaccess_DO_NOT_REMOVE/job.e"$jobid"_0"
+                  ln -s ${fileerr} ${job_name_ptrn}_${jobid}.err
+                fi
+              fi
+            fi
+
+            rm -f ${job_name_ptrn}_COMPLETED
+
+            ###################
+            # Autosubmit job
+            ###################
+
+            """)
+
+    # noinspection PyPep8
+    AS_TAILER_REM = textwrap.dedent("""\
+            ###################
+            # Autosubmit tailer
+            ###################
+
+            set -x
+            job_end_time=$(date +%s)
+            job_run_time=$((job_end_time - job_start_time))
+            case %HPCTYPE% in
+             sge)       errfile_created="TRUE"; errfile_ptrn="\.e" ;;
+             lsf)       errfile_created="TRUE"; errfile_ptrn="\.err" ;;
+             ecaccess)  errfile_created="TRUE"; errfile_ptrn="\.err" ;;
+             pbs)       errfile_created="FALSE"; errfile_ptrn="\.e" ;;
+             slurm)     errfile_created="TRUE"; errfile_ptrn="\.err" ;;
+             ps)        errfile_created="TRUE"; errfile_ptrn="\.err" ;;
+             *) echo "!!! %HPCTYPE% is not valid scheduler !!!"; exit 1 ;;
+            esac
+            failed_jobs=0; failed_errfiles=""
+            set +e; ls -1 ${job_name_ptrn}* | grep $errfile_ptrn
+            if [[ $? -eq 0 ]]; then
+             case $errfile_created in 
+              TRUE)
+                failed_jobs=$(($(ls -1 ${job_name_ptrn}* | grep $errfile_ptrn | wc -l) - 1))
+                failed_errfiles=$(ls -1 ${job_name_ptrn}* | grep $errfile_ptrn | head -n $failed_jobs)
+              ;;
+              FALSE)
+                failed_jobs=$(ls -1 ${job_name_ptrn}* | grep $errfile_ptrn | wc -l)
+                failed_errfiles=$(ls -1 ${job_name_ptrn}* | grep $errfile_ptrn)
+              ;;
+              *) "!!! $errfile_created is not valid errfile_created option !!!"; exit 1 ;;
+             esac
+            fi; set -e
+            failed_jobs_qt=0; failed_jobs_rt=0
+            for failed_errfile in $failed_errfiles; do
+             failed_errfile_stamp=$(stat -c %Z $failed_errfile)
+             job_qt=$(grep "job_queue_time=" $failed_errfile | head -n 2 | tail -n 1 | cut -d '=' -f 2)
+             if [[ -z ${job_qt} ]]; then
+               job_qt=0
+             fi
+             failed_jobs_qt=$((failed_jobs_qt + job_qt))
+             job_st=$(grep "job_start_time=" $failed_errfile | head -n 2 | tail -n 1 | cut -d '=' -f 2)
+             if [[ -z ${job_st} ]]; then
+               job_st=0
+             fi
+             failed_jobs_rt=$((failed_jobs_rt + $((failed_errfile_stamp - job_st))))
+            done
+
+            echo "$job_end_time $job_queue_time $job_run_time $failed_jobs $failed_jobs_qt $failed_jobs_rt" > ${job_name_ptrn}_COMPLETED
+            exit 0
+            """)
+
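+# Both tailers write a single line of six whitespace-separated fields to the
+# <jobname>_COMPLETED file; Job._get_from_completed reads them by index:
+#     end_time(0) queue_time(1) run_time(2) failed_jobs(3) failed_queue_time(4) failed_run_time(5)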
diff --git a/autosubmit/job/job_list.py b/autosubmit/job/job_list.py
new file mode 100644
index 0000000..caffea5
--- /dev/null
+++ b/autosubmit/job/job_list.py
@@ -0,0 +1,862 @@
+#!/usr/bin/env python
+
+# Copyright 2014 Climate Forecasting Unit, IC3
+
+# This file is part of Autosubmit.
+
+# Autosubmit is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+
+# Autosubmit is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+
+# You should have received a copy of the GNU General Public License
+# along with Autosubmit.  If not, see <http://www.gnu.org/licenses/>.
+from ConfigParser import SafeConfigParser
+import json
+
+import os
+import pickle
+from time import localtime, strftime
+from sys import setrecursionlimit
+from shutil import move
+
+from autosubmit.job.job_common import Status
+from autosubmit.job.job import Job
+from autosubmit.config.basicConfig import BasicConfig
+from autosubmit.config.log import Log
+from autosubmit.date.chunk_date_lib import date2str
+
+
+class JobList:
+    """
+    Class to manage the list of jobs to be run by autosubmit
+
+    :param expid: experiment's identifier
+    :type expid: str
+    """
+    def __init__(self, expid):
+        self._pkl_path = os.path.join(BasicConfig.LOCAL_ROOT_DIR, expid, "pkl")
+        self._update_file = "updated_list_" + expid + ".txt"
+        self._failed_file = "failed_job_list_" + expid + ".pkl"
+        self._job_list_file = "job_list_" + expid + ".pkl"
+        self._job_list = list()
+        self._expid = expid
+        self._stat_val = Status()
+        self._parameters = []
+
+    @property
+    def expid(self):
+        """
+        Returns experiment identifier
+
+        :return: experiment's identifier
+        :rtype: str
+        """
+        return self._expid
+
+    def create(self, date_list, member_list, num_chunks, parameters, date_format):
+        """
+        Creates all jobs needed for the current workflow
+
+        :param date_list: start dates
+        :type date_list: list
+        :param member_list: members
+        :type member_list: list
+        :param num_chunks: number of chunks to run
+        :type num_chunks: int
+        :param parameters: parameters for the jobs
+        :type parameters: dict
+        :param date_format: option to format dates
+        :type date_format: str
+        """
+        self._parameters = parameters
+
+        parser = SafeConfigParser()
+        parser.optionxform = str
+        parser.read(os.path.join(BasicConfig.LOCAL_ROOT_DIR, self._expid, 'conf', "jobs_" + self._expid + ".conf"))
+
+        chunk_list = range(1, num_chunks+1)
+
+        self._date_list = date_list
+        self._member_list = member_list
+        self._chunk_list = chunk_list
+
+        dic_jobs = DicJobs(self, parser, date_list, member_list, chunk_list, date_format)
+        self._dic_jobs = dic_jobs
+        priority = 0
+
+        Log.info("Creating jobs...")
+        for section in parser.sections():
+            Log.debug("Creating {0} jobs".format(section))
+            dic_jobs.read_section(section, priority)
+            priority += 1
+
+        Log.info("Adding dependencies...")
+        for section in parser.sections():
+            Log.debug("Adding dependencies for {0} jobs".format(section))
+            if not parser.has_option(section, "DEPENDENCIES"):
+                continue
+            dependencies = parser.get(section, "DEPENDENCIES").split()
+            dep_section = dict()
+            dep_distance = dict()
+            dep_running = dict()
+            for dependency in dependencies:
+                if '-' in dependency:
+                    dependency_split = dependency.split('-')
+                    dep_section[dependency] = dependency_split[0]
+                    dep_distance[dependency] = int(dependency_split[1])
+                    dep_running[dependency] = dic_jobs.get_option(dependency_split[0], 'RUNNING', 'once').lower()
+                else:
+                    dep_section[dependency] = dependency
+
+            for job in dic_jobs.get_jobs(section):
+                for dependency in dependencies:
+                    chunk = job.chunk
+                    member = job.member
+                    date = job.date
+
+                    section_name = dep_section[dependency]
+
+                    if '-' in dependency:
+                        distance = dep_distance[dependency]
+                        if chunk is not None and dep_running[dependency] == 'chunk':
+                            chunk_index = chunk_list.index(chunk)
+                            if chunk_index >= distance:
+                                chunk = chunk_list[chunk_index - distance]
+                            else:
+                                continue
+                        elif member is not None and dep_running[dependency] in ['chunk', 'member']:
+                            member_index = member_list.index(member)
+                            if member_index >= distance:
+                                member = member_list[member_index - distance]
+                            else:
+                                continue
+                        elif date is not None and dep_running[dependency] in ['chunk', 'member', 'startdate']:
+                            date_index = date_list.index(date)
+                            if date_index >= distance:
+                                date = date_list[date_index - distance]
+                            else:
+                                continue
+
+                    for parent in dic_jobs.get_jobs(section_name, date, member, chunk):
+                        job.add_parent(parent)
+
+                    if job.wait and job.frequency > 1:
+                        if job.chunk is not None:
+                            max_distance = (chunk_list.index(chunk)+1) % job.frequency
+                            if max_distance == 0:
+                                max_distance = job.frequency
+                            for distance in range(1, max_distance, 1):
+                                for parent in dic_jobs.get_jobs(section_name, date, member, chunk - distance):
+                                    job.add_parent(parent)
+                        elif job.member is not None:
+                            member_index = member_list.index(job.member)
+                            max_distance = (member_index + 1) % job.frequency
+                            if max_distance == 0:
+                                max_distance = job.frequency
+                            for distance in range(1, max_distance, 1):
+                                for parent in dic_jobs.get_jobs(section_name, date,
+                                                                member_list[member_index - distance], chunk):
+                                    job.add_parent(parent)
+                        elif job.date is not None:
+                            date_index = date_list.index(job.date)
+                            max_distance = (date_index + 1) % job.frequency
+                            if max_distance == 0:
+                                max_distance = job.frequency
+                            for distance in range(1, max_distance, 1):
+                                for parent in dic_jobs.get_jobs(section_name, date_list[date_index - distance],
+                                                                member, chunk):
+                                    job.add_parent(parent)
+
+        Log.info("Removing redundant dependencies...")
+        self.update_genealogy()
+        for job in self._job_list:
+            job.parameters = parameters
+
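+    # A hypothetical jobs_<expid>.conf fragment illustrating the DEPENDENCIES
+    # syntax parsed in create(): a bare section name depends on the job with
+    # the same date, member and chunk, while SECTION-N reaches N steps back
+    # along the dependency's own RUNNING level (chunk, member or startdate):
+    #
+    #     [SIM]
+    #     FILE = sim.sh
+    #     RUNNING = chunk
+    #     DEPENDENCIES = INI SIM-1
+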
+    def __len__(self):
+        return len(self._job_list)
+
+    def get_job_list(self):
+        """
+        Get inner job list
+
+        :return: job list
+        :rtype: list
+        """
+        return self._job_list
+
+    def get_completed(self):
+        """
+        Returns a list of completed jobs
+
+        :return: completed jobs
+        :rtype: list
+        """
+        return [job for job in self._job_list if job.status == Status.COMPLETED]
+
+    def get_submitted(self):
+        """
+        Returns a list of submitted jobs
+
+        :return: submitted jobs
+        :rtype: list
+        """
+        return [job for job in self._job_list if job.status == Status.SUBMITTED]
+
+    def get_running(self):
+        """
+        Returns a list of jobs running
+
+        :return: running jobs
+        :rtype: list
+        """
+        return [job for job in self._job_list if job.status == Status.RUNNING]
+
+    def get_queuing(self):
+        """
+        Returns a list of jobs queuing
+
+        :return: queuing jobs
+        :rtype: list
+        """
+        return [job for job in self._job_list if job.status == Status.QUEUING]
+
+    def get_failed(self):
+        """
+        Returns a list of failed jobs
+
+        :return: failed jobs
+        :rtype: list
+        """
+        return [job for job in self._job_list if job.status == Status.FAILED]
+
+    def get_ready(self):
+        """
+        Returns a list of ready jobs
+
+        :return: ready jobs
+        :rtype: list
+        """
+        return [job for job in self._job_list if job.status == Status.READY]
+
+    def get_waiting(self):
+        """
+        Returns a list of jobs waiting
+
+        :return: waiting jobs
+        :rtype: list
+        """
+        return [job for job in self._job_list if job.status == Status.WAITING]
+
+    def get_unknown(self):
+        """
+        Returns a list of jobs in unknown state
+
+        :return: unknown state jobs
+        :rtype: list
+        """
+        return [job for job in self._job_list if job.status == Status.UNKNOWN]
+
+    def get_in_queue(self):
+        """
+        Returns a list of jobs in the platforms (Submitted, Running, Queuing)
+
+        :return: jobs in platforms
+        :rtype: list
+        """
+        return self.get_submitted() + self.get_running() + self.get_queuing()
+
+    def get_not_in_queue(self):
+        """
+        Returns a list of jobs NOT in the platforms (Ready, Waiting)
+
+        :return: jobs not in platforms
+        :rtype: list
+        """
+        return self.get_ready() + self.get_waiting()
+
+    def get_finished(self):
+        """
+        Returns a list of jobs finished (Completed, Failed)
+
+        :return: finished jobs
+        :rtype: list
+        """
+        return self.get_completed() + self.get_failed()
+
+    def get_active(self):
+        """
+        Returns a list of active jobs (in platforms, ready or in unknown state)
+
+        :return: active jobs
+        :rtype: list
+        """
+        return self.get_in_queue() + self.get_ready() + self.get_unknown()
+
+    def get_job_by_name(self, name):
+        """
+        Returns the job whose name matches the given name
+
+        :parameter name: name to look for
+        :type name: str
+        :return: found job
+        :rtype: job
+        """
+        for job in self._job_list:
+            if job.name == name:
+                return job
+        Log.warning("We could not find that job {0} in the list!!!!", name)
+
+    def sort_by_name(self):
+        """
+        Returns a list of jobs sorted by name
+
+        :return: jobs sorted by name
+        :rtype: list
+        """
+        return sorted(self._job_list, key=lambda k: k.name)
+
+    def sort_by_id(self):
+        """
+        Returns a list of jobs sorted by id
+
+        :return: jobs sorted by ID
+        :rtype: list
+        """
+        return sorted(self._job_list, key=lambda k: k.id)
+
+    def sort_by_type(self):
+        """
+        Returns a list of jobs sorted by type
+
+        :return: jobs sorted by type
+        :rtype: list
+        """
+        return sorted(self._job_list, key=lambda k: k.type)
+
+    def sort_by_status(self):
+        """
+        Returns a list of jobs sorted by status
+
+        :return: jobs sorted by status
+        :rtype: list
+        """
+        return sorted(self._job_list, key=lambda k: k.status)
+
+    @staticmethod
+    def load_file(filename):
+        """
+        Recreates a stored joblist from a pickle file
+
+        :param filename: pickle file to load
+        :type filename: str
+        :return: loaded joblist object
+        :rtype: JobList
+        """
+        if os.path.exists(filename):
+            return pickle.load(open(filename, 'rb'))
+        else:
+            Log.critical('File {0} does not exist'.format(filename))
+            return list()
+
+    def load(self):
+        """
+        Recreates a stored joblist from the pickle file
+
+        :return: loaded joblist object
+        :rtype: JobList
+        """
+        Log.info("Loading JobList: " + self._pkl_path + self._job_list_file)
+        return JobList.load_file(self._pkl_path + self._job_list_file)
+
+    def load_updated(self):
+        Log.info("Loading updated list: " + self._pkl_path + self._update_file)
+        return JobList.load_file(self._pkl_path + self._update_file)
+
+    def load_failed(self):
+        Log.info("Loading failed list: " + self._pkl_path + self._failed_file)
+        return JobList.load_file(self._pkl_path + self._failed_file)
+
+    def save_failed(self, failed_list):
+        # URi: should we check that the path exists?
+        Log.info("Saving failed list: " + self._pkl_path + self._failed_file)
+        pickle.dump(failed_list, file(self._pkl_path + self._failed_file, 'w'))
+
+    def save(self):
+        """
+        Stores the joblist as a pickle file
+        """
+        setrecursionlimit(50000)
+        path = os.path.join(self._pkl_path, self._job_list_file)
+        Log.debug("Saving JobList: " + path)
+        pickle.dump(self, open(path, 'wb'))
+
+    def update_from_file(self, store_change=True):
+        path = os.path.join(self._pkl_path, self._update_file)
+        if os.path.exists(path):
+            for line in open(path):
+                if line.strip() == '':
+                    continue
+                job = self.get_job_by_name(line.split()[0])
+                if job:
+                    job.status = self._stat_val.retval(line.split()[1])
+                    job.fail_count = 0
+            now = localtime()
+            output_date = strftime("%Y%m%d_%H%M", now)
+            if store_change:
+                move(path, path + "_" + output_date)
+
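+    # The updated list read above is a plain text file with one job per line:
+    # the job name followed by the new status name. A hypothetical example:
+    #
+    #     a000_19900101_fc0_1_SIM COMPLETED
+    #     a000_19900101_fc0_2_SIM READY
+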
+    def update_parameters(self, parameters):
+        self._parameters = parameters
+        for job in self._job_list:
+            job.parameters = parameters
+
+    def update_list(self, store_change=True):
+        # load updated file list
+        self.update_from_file(store_change)
+
+        # reset jobs that have failed fewer times than the retrials limit
+        if 'RETRIALS' in self._parameters:
+            retrials = int(self._parameters['RETRIALS'])
+        else:
+            retrials = 4
+
+        for job in self.get_failed():
+            job.inc_fail_count()
+            if job.fail_count < retrials:
+                job.status = Status.READY
+
+        # if a waiting job has all its parents completed, change its status to READY
+        for job in self.get_waiting():
+            tmp = [parent for parent in job.parents if parent.status == Status.COMPLETED]
+            if len(tmp) == len(job.parents):
+                job.status = Status.READY
+        if store_change:
+            self.save()
+
+    def update_shortened_names(self):
+        """
+        In some cases the scheduler can only operate with names shorter than 15 characters.
+        Update the job list replacing job names by the corresponding shortened job name
+        """
+        for job in self._job_list:
+            job.name = job.short_name
+
+    def update_genealogy(self):
+        """
+        Once the joblist has been created with every type of job, this method
+        removes the jobs that have no template and prunes redundant dependencies
+        """
+
+        # Use a copy of job_list because original is modified along iterations
+        for job in self._job_list[:]:
+            if job.file is None or job.file == '':
+                self._remove_job(job)
+
+        # Simplifying dependencies: if a parent is already an ancestor of another parent,
+        # we remove the redundant parent dependency
+        for job in self._job_list:
+            for parent in list(job.parents):
+                for ancestor in parent.ancestors:
+                    if ancestor in job.parents:
+                        job.parents.remove(ancestor)
+                        ancestor.children.remove(job)
+
+        for job in self._job_list:
+            if not job.has_parents():
+                job.status = Status.READY
+
+    def check_scripts(self, as_conf):
+        """
+        When we have created the scripts, all parameters should have been substituted.
+        No %PARAMETER% placeholders may remain
+
+        :param as_conf: experiment configuration
+        :type as_conf: AutosubmitConfig
+        """
+        Log.info("Checking scripts...")
+        out = True
+        sections_checked = set()
+        for job in self._job_list:
+            if job.section in sections_checked:
+                continue
+            if not job.check_script(as_conf):
+                out = False
+                Log.warning("Invalid parameter substitution in {0} template!!!", job.section)
+            sections_checked.add(job.section)
+        if out:
+            Log.result("Scripts OK")
+        else:
+            Log.error("Scripts check failed")
+            Log.user_warning("Running after failed scripts check is at your own risk!")
+        return out
+
+    def _remove_job(self, job):
+        """
+        Remove a job from the list
+
+        :param job: job to remove
+        :type job: Job
+        """
+        for child in job.children:
+            for parent in job.parents:
+                child.add_parent(parent)
+            child.delete_parent(job)
+
+        for parent in job.parents:
+            parent.children.remove(job)
+
+        self._job_list.remove(job)
+
+    def rerun(self, chunk_list):
+        """
+        Updates joblist to rerun the jobs specified by chunk_list
+
+        :param chunk_list: JSON string describing the startdates, members and chunks to rerun
+        :type chunk_list: str
+        :return:
+        """
+        parser = SafeConfigParser()
+        parser.optionxform = str
+        parser.read(os.path.join(BasicConfig.LOCAL_ROOT_DIR, self._expid, 'conf', "jobs_" + self._expid + ".conf"))
+
+        Log.info("Adding dependencies...")
+        dep_section = dict()
+        dep_distance = dict()
+        dependencies = dict()
+        dep_running = dict()
+        for section in parser.sections():
+            Log.debug("Reading rerun dependencies for {0} jobs".format(section))
+            if not parser.has_option(section, "RERUN_DEPENDENCIES"):
+                continue
+            dependencies[section] = parser.get(section, "RERUN_DEPENDENCIES").split()
+            dep_section[section] = dict()
+            dep_distance[section] = dict()
+            dep_running[section] = dict()
+            for dependency in dependencies[section]:
+                if '-' in dependency:
+                    dependency_split = dependency.split('-')
+                    dep_section[section][dependency] = dependency_split[0]
+                    dep_distance[section][dependency] = int(dependency_split[1])
+                    dep_running[section][dependency] = self._dic_jobs.get_option(dependency_split[0], 'RUNNING',
+                                                                                 'once').lower()
+                else:
+                    dep_section[section][dependency] = dependency
+
+        for job in self._job_list:
+            job.status = Status.COMPLETED
+
+        data = json.loads(chunk_list)
+        for d in data['sds']:
+            date = d['sd']
+            Log.debug("Date: " + date)
+            for m in d['ms']:
+                member = m['m']
+                Log.debug("Member: " + member)
+                previous_chunk = 0
+                for c in m['cs']:
+                    Log.debug("Chunk: " + c)
+                    chunk = int(c)
+                    for job in [i for i in self._job_list if i.date == date and i.member == member
+                                and i.chunk == chunk]:
+                        if not job.rerun_only or chunk != previous_chunk+1:
+                            job.status = Status.WAITING
+                            Log.debug("Job: " + job.name)
+                        section = job.section
+                        if section not in dependencies:
+                            continue
+                        for dependency in dependencies[section]:
+                            current_chunk = chunk
+                            current_member = member
+                            current_date = date
+                            if '-' in dependency:
+                                distance = dep_distance[section][dependency]
+                                running = dep_running[section][dependency]
+                                if current_chunk is not None and running == 'chunk':
+                                    chunk_index = self._chunk_list.index(current_chunk)
+                                    if chunk_index >= distance:
+                                        current_chunk = self._chunk_list[chunk_index - distance]
+                                    else:
+                                        continue
+                                elif current_member is not None and running in ['chunk', 'member']:
+                                    member_index = self._member_list.index(current_member)
+                                    if member_index >= distance:
+                                        current_member = self._member_list[member_index - distance]
+                                    else:
+                                        continue
+                                elif current_date is not None and running in ['chunk', 'member', 'startdate']:
+                                    date_index = self._date_list.index(current_date)
+                                    if date_index >= distance:
+                                        current_date = self._date_list[date_index - distance]
+                                    else:
+                                        continue
+                            section_name = dep_section[section][dependency]
+                            for parent in self._dic_jobs.get_jobs(section_name, current_date, current_member,
+                                                                  current_chunk):
+                                parent.status = Status.WAITING
+                                Log.debug("Parent: " + parent.name)
+                    previous_chunk = chunk
+
+        for job in [j for j in self._job_list if j.status == Status.COMPLETED]:
+            self._remove_job(job)
+
+        self.update_genealogy()
+
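+    # The chunk_list argument parsed above is a JSON document of startdates
+    # ('sds'), each with members ('ms') and chunks ('cs'). A hypothetical
+    # example:
+    #
+    #     {"sds": [{"sd": "19900101",
+    #               "ms": [{"m": "fc0", "cs": ["1", "2"]}]}]}
+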
+    def remove_rerun_only_jobs(self):
+        """
+        Removes all jobs that are only run in reruns
+        """
+        flag = False
+        for job in set(self._job_list):
+            if job.rerun_only:
+                self._remove_job(job)
+                flag = True
+
+        if flag:
+            self.update_genealogy()
+        del self._dic_jobs
+
+
+class DicJobs:
+    """
+    Class to create jobs from the conf file and to find jobs by startdate, member and chunk
+
+    :param joblist: joblist to use
+    :type joblist: JobList
+    :param parser: jobs conf file parser
+    :type parser: SafeConfigParser
+    :param date_list: startdates
+    :type date_list: list
+    :param member_list: members
+    :type member_list: list
+    :param chunk_list: chunks
+    :type chunk_list: list
+    :param date_format: option to format dates
+    :type date_format: str
+
+    """
+    def __init__(self, joblist, parser, date_list, member_list, chunk_list, date_format):
+        self._date_list = date_list
+        self._joblist = joblist
+        self._member_list = member_list
+        self._chunk_list = chunk_list
+        self._parser = parser
+        self._date_format = date_format
+        self._dic = dict()
+
+    def read_section(self, section, priority):
+        """
+        Read a section from jobs conf and creates all jobs for it
+
+        :param section: section to read
+        :type section: str
+        :param priority: priority for the jobs
+        :type priority: int
+        """
+        running = 'once'
+        if self._parser.has_option(section, 'RUNNING'):
+            running = self._parser.get(section, 'RUNNING').lower()
+        frequency = int(self.get_option(section, "FREQUENCY", 1))
+        if running == 'once':
+            self._create_jobs_once(section, priority)
+        elif running == 'startdate':
+            self._create_jobs_startdate(section, priority, frequency)
+        elif running == 'member':
+            self._create_jobs_member(section, priority, frequency)
+        elif running == 'chunk':
+            self._create_jobs_chunk(section, priority, frequency)
+
+    def _create_jobs_once(self, section, priority):
+        """
+        Create jobs to be run once
+
+        :param section: section to read
+        :type section: str
+        :param priority: priority for the jobs
+        :type priority: int
+        """
+        self._dic[section] = self._create_job(section, priority, None, None, None)
+
+    def _create_jobs_startdate(self, section, priority, frequency):
+        """
+        Create jobs to be run once per startdate
+
+        :param section: section to read
+        :type section: str
+        :param priority: priority for the jobs
+        :type priority: int
+        :param frequency: if greater than 1, creates only one job every 'frequency' startdates.
+                          Always creates one job for the last startdate
+        :type frequency: int
+        """
+        self._dic[section] = dict()
+        count = 0
+        for date in self._date_list:
+            count += 1
+            if count % frequency == 0 or count == len(self._date_list):
+                self._dic[section][date] = self._create_job(section, priority, date, None, None)
+
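+    # For example, with five startdates and FREQUENCY = 3 the selection rule
+    # above keeps only the third and the last one:
+    #
+    #     >>> [count for count in range(1, 6) if count % 3 == 0 or count == 5]
+    #     [3, 5]
+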
+    def _create_jobs_member(self, section, priority, frequency):
+        """
+        Create jobs to be run once per member
+
+        :param section: section to read
+        :type section: str
+        :param priority: priority for the jobs
+        :type priority: int
+        :param frequency: if greater than 1, creates only one job every 'frequency' members.
+                          Always creates one job for the last member
+        :type frequency: int
+        """
+        self._dic[section] = dict()
+        for date in self._date_list:
+            self._dic[section][date] = dict()
+            count = 0
+            for member in self._member_list:
+                count += 1
+                if count % frequency == 0 or count == len(self._member_list):
+                    self._dic[section][date][member] = self._create_job(section, priority, date, member, None)
+
+    def _create_jobs_chunk(self, section, priority, frequency):
+        """
+        Create jobs to be run once per chunk
+
+        :param section: section to read
+        :type section: str
+        :param priority: priority for the jobs
+        :type priority: int
+        :param frequency: if greater than 1, creates only one job every 'frequency' chunks.
+                          Always creates one job for the last chunk
+        :type frequency: int
+        """
+        self._dic[section] = dict()
+        for date in self._date_list:
+            self._dic[section][date] = dict()
+            for member in self._member_list:
+                self._dic[section][date][member] = dict()
+                count = 0
+                for chunk in self._chunk_list:
+                    count += 1
+                    if count % frequency == 0 or count == len(self._chunk_list):
+                        self._dic[section][date][member][chunk] = self._create_job(section, priority, date, member,
+                                                                                   chunk)
+
+    def get_jobs(self, section, date=None, member=None, chunk=None):
+        """
+        Return all the jobs matching the section, date, member and chunk provided. If any parameter
+        is None, jobs are returned regardless of that parameter's value. A job whose own parameter
+        is None is returned if all the other parameters match
+
+        :param section: section to return
+        :type section: str
+        :param date: startdate to return
+        :type date: str
+        :param member: member to return
+        :type member: str
+        :param chunk: chunk to return
+        :type chunk: int
+        :return: jobs matching parameters passed
+        :rtype: list
+        """
+        jobs = list()
+        dic = self._dic[section]
+        if type(dic) is not dict:
+            jobs.append(dic)
+        else:
+            if date is not None:
+                self._get_date(jobs, dic, date, member, chunk)
+            else:
+                for d in self._date_list:
+                    self._get_date(jobs, dic, d, member, chunk)
+        return jobs
+
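+    # For example (hypothetical calls): get_jobs('SIM', date, 'fc0', 3)
+    # returns the single SIM job for that date, member and chunk, while
+    # get_jobs('SIM', date) collects the jobs for every member and chunk
+    # of that date.
+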
+    def _get_date(self, jobs, dic, date, member, chunk):
+        if date not in dic:
+            return jobs
+        dic = dic[date]
+        if type(dic) is not dict:
+            jobs.append(dic)
+        else:
+            if member is not None:
+                self._get_member(jobs, dic, member, chunk)
+            else:
+                for m in self._member_list:
+                    self._get_member(jobs, dic, m, chunk)
+
+        return jobs
+
+    def _get_member(self, jobs, dic, member, chunk):
+        if member not in dic:
+            return jobs
+        dic = dic[member]
+        if type(dic) is not dict:
+            jobs.append(dic)
+        else:
+            if chunk is not None and chunk in dic:
+                jobs.append(dic[chunk])
+            else:
+                for c in self._chunk_list:
+                    if c not in dic:
+                        continue
+                    jobs.append(dic[c])
+        return jobs
+
+    def _create_job(self, section, priority, date, member, chunk):
+        name = self._joblist.expid
+        if date is not None:
+            name += "_" + date2str(date, self._date_format)
+        if member is not None:
+            name += "_" + member
+        if chunk is not None:
+            name += "_{0}".format(chunk)
+        name += "_" + section
+        job = Job(name, 0, Status.WAITING, priority)
+        job.section = section
+        job.date = date
+        job.member = member
+        job.chunk = chunk
+        job.date_format = self._date_format
+
+        job.frequency = int(self.get_option(section, "FREQUENCY", 1))
+        job.wait = self.get_option(section, "WAIT", 'false').lower() == 'true'
+        job.rerun_only = self.get_option(section, "RERUN_ONLY", 'false').lower() == 'true'
+
+        job.platform_name = self.get_option(section, "PLATFORM", None)
+        if job.platform_name is not None:
+            job.platform_name = job.platform_name.lower()
+        job.file = self.get_option(section, "FILE", None)
+        job.set_queue(self.get_option(section, "QUEUE", None))
+
+        job.processors = self.get_option(section, "PROCESSORS", 1)
+        job.threads = self.get_option(section, "THREADS", 1)
+        job.tasks = self.get_option(section, "TASKS", 1)
+
+        job.wallclock = self.get_option(section, "WALLCLOCK", '')
+        self._joblist.get_job_list().append(job)
+        return job
+
+    def get_option(self, section, option, default):
+        """
+        Returns value for a given option
+
+        :param section: section name
+        :type section: str
+        :param option: option to return
+        :type option: str
+        :param default: value to return if not defined in configuration file
+        :type default: object
+        """
+        if self._parser.has_option(section, option):
+            return self._parser.get(section, option)
+        else:
+            return default
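+
+# A minimal usage sketch, assuming an existing experiment configuration; the
+# expid, dates, members and parameters below are hypothetical:
+#
+#     job_list = JobList('a000')
+#     job_list.create(date_list=['19900101'], member_list=['fc0'],
+#                     num_chunks=2, parameters={'RETRIALS': '4'},
+#                     date_format='D')
+#     job_list.save()
+#     print([job.name for job in job_list.get_ready()])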
diff --git a/autosubmit/monitor/__init__.py b/autosubmit/monitor/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/autosubmit/monitor/monitor.py b/autosubmit/monitor/monitor.py
new file mode 100644
index 0000000..b6d0b14
--- /dev/null
+++ b/autosubmit/monitor/monitor.py
@@ -0,0 +1,331 @@
+#!/usr/bin/env python
+
+# Copyright 2014 Climate Forecasting Unit, IC3
+
+# This file is part of Autosubmit.
+
+# Autosubmit is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+
+# Autosubmit is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+
+# You should have received a copy of the GNU General Public License
+# along with Autosubmit.  If not, see <http://www.gnu.org/licenses/>.
+import os
+
+import time
+from os import path
+from os import chdir
+from os import listdir
+from os import remove
+
+import pydotplus
+
+
+# These packages produce errors when added to setup.
+# noinspection PyPackageRequirements
+import numpy as np
+# noinspection PyPackageRequirements
+import matplotlib.pyplot as plt
+
+from autosubmit.job.job_common import Status
+from autosubmit.config.basicConfig import BasicConfig
+from autosubmit.config.log import Log
+
+
+class Monitor:
+    """Class to handle monitoring of Jobs at HPC."""
+    _table = dict([(Status.UNKNOWN, 'white'), (Status.WAITING, 'gray'), (Status.READY, 'lightblue'),
+                   (Status.SUBMITTED, 'cyan'), (Status.QUEUING, 'lightpink'), (Status.RUNNING, 'green'),
+                   (Status.COMPLETED, 'yellow'), (Status.FAILED, 'red'), (Status.SUSPENDED, 'orange')])
+
+    @staticmethod
+    def color_status(status):
+        """
+        Return color associated to given status
+
+        :param status: status
+        :type status: Status
+        :return: color
+        :rtype: str
+        """
+        return Monitor._table.get(status, Monitor._table[Status.UNKNOWN])
+
+    def create_tree_list(self, expid, joblist):
+        """
+        Create graph from joblist
+
+        :param expid: experiment's identifier
+        :type expid: str
+        :param joblist: joblist to plot
+        :type joblist: JobList
+        :return: created graph
+        :rtype: pydotplus.Dot
+        """
+        graph = pydotplus.Dot(graph_type='digraph')
+
+        legend = pydotplus.Subgraph(graph_name='Legend', label='Legend', rank="source")
+        legend.add_node(pydotplus.Node(name='WAITING', shape='box', style="filled",
+                                       fillcolor=self._table[Status.WAITING]))
+        legend.add_node(pydotplus.Node(name='READY', shape='box', style="filled",
+                                       fillcolor=self._table[Status.READY]))
+        legend.add_node(
+            pydotplus.Node(name='SUBMITTED', shape='box', style="filled", fillcolor=self._table[Status.SUBMITTED]))
+        legend.add_node(pydotplus.Node(name='QUEUING', shape='box', style="filled",
+                                       fillcolor=self._table[Status.QUEUING]))
+        legend.add_node(pydotplus.Node(name='RUNNING', shape='box', style="filled",
+                                       fillcolor=self._table[Status.RUNNING]))
+        legend.add_node(
+            pydotplus.Node(name='COMPLETED', shape='box', style="filled", fillcolor=self._table[Status.COMPLETED]))
+        legend.add_node(pydotplus.Node(name='FAILED', shape='box', style="filled",
+                                       fillcolor=self._table[Status.FAILED]))
+        legend.add_node(
+            pydotplus.Node(name='SUSPENDED', shape='box', style="filled", fillcolor=self._table[Status.SUSPENDED]))
+        graph.add_subgraph(legend)
+
+        exp = pydotplus.Subgraph(graph_name='Experiment', label=expid)
+        self.nodes_plotted = set()
+        for job in joblist:
+            if job.has_parents():
+                continue
+
+            node_job = pydotplus.Node(job.name, shape='box', style="filled",
+                                      fillcolor=self.color_status(job.status))
+            exp.add_node(node_job)
+            self._add_children(job, exp, node_job)
+
+        graph.add_subgraph(exp)
+
+        return graph
+
+    def _add_children(self, job, exp, node_job):
+        if job in self.nodes_plotted:
+            return
+        self.nodes_plotted.add(job)
+        if job.has_children() != 0:
+            for child in sorted(job.children, key=lambda k: k.name):
+                node_child = exp.get_node(child.name)
+                if len(node_child) == 0:
+                    node_child = pydotplus.Node(child.name, shape='box', style="filled",
+                                                fillcolor=self.color_status(child.status))
+                    exp.add_node(node_child)
+                    flag = True
+                else:
+                    node_child = node_child[0]
+                    flag = False
+                exp.add_edge(pydotplus.Edge(node_job, node_child))
+                if flag:
+                    self._add_children(child, exp, node_child)
+
+    def generate_output(self, expid, joblist, output_format="pdf"):
+        """
+        Plots graph for joblist and stores it in a file
+
+        :param expid: experiment's identifier
+        :type expid: str
+        :param joblist: joblist to plot
+        :type joblist: JobList
+        :param output_format: file format for plot
+        :type output_format: str (png, pdf, ps, svg)
+        """
+        now = time.localtime()
+        output_date = time.strftime("%Y%m%d_%H%M", now)
+        output_file = os.path.join(BasicConfig.LOCAL_ROOT_DIR, expid, "plot", expid + "_" + output_date + "." +
+                                   output_format)
+
+        graph = self.create_tree_list(expid, joblist)
+
+        if output_format == "png":
+            # noinspection PyUnresolvedReferences
+            graph.write_png(output_file)
+        elif output_format == "pdf":
+            # noinspection PyUnresolvedReferences
+            graph.write_pdf(output_file)
+        elif output_format == "ps":
+            # noinspection PyUnresolvedReferences
+            graph.write_ps(output_file)
+        elif output_format == "svg":
+            # noinspection PyUnresolvedReferences
+            graph.write_svg(output_file)
+
+    def generate_output_stats(self, expid, joblist, output_format="pdf"):
+        """
+        Plots stats for joblist and stores it in a file
+
+        :param expid: experiment's identifier
+        :type expid: str
+        :param joblist: joblist to plot
+        :type joblist: JobList
+        :param output_format: file format for plot
+        :type output_format: str (png, pdf, ps, svg)
+        """
+        now = time.localtime()
+        output_date = time.strftime("%Y%m%d_%H%M", now)
+        output_file = os.path.join(BasicConfig.LOCAL_ROOT_DIR, expid, "plot",  expid + "_statistics_" + output_date +
+                                   "." + output_format)
+        self.create_bar_diagram(expid, joblist, output_file)
+
+    @staticmethod
+    def create_bar_diagram(expid, joblist, output_file):
+        """
+        Function to plot statistics
+
+        :param expid: experiment's identifier
+        :type expid: str
+        :param joblist: joblist to plot
+        :type joblist: JobList
+        :param output_file: path to create file
+        :type output_file: str
+        """
+
+        def autolabel(rects):
+            # attach text labels
+            for rect in rects:
+                height = rect.get_height()
+                if height > max_time:
+                    ax[plot - 1].text(rect.get_x() + rect.get_width() / 2., 1.05 * max_time, '%d' % int(height),
+                                      ha='center', va='bottom', rotation='vertical', fontsize=9)
+
+        def failabel(rects):
+            for rect in rects:
+                height = rect.get_height()
+                if height > 0:
+                    ax[plot - 1].text(rect.get_x() + rect.get_width() / 2., 1 + height, '%d' % int(height), ha='center',
+                                      va='bottom', fontsize=9)
+
+        average_run_time = sum([float(job.check_run_time()) / 3600 for job in joblist]) / len(joblist)
+        max_time = max(max([float(job.check_run_time()) / 3600 for job in joblist]),
+                       max([float(job.check_queued_time()) / 3600 for job in joblist]))
+        min_time = min([int(float(job.check_run_time()) / 3600 - average_run_time) for job in joblist])
+
+        # These are constants, so they need to be CAPS. Suppress PyCharm warning
+        # noinspection PyPep8Naming
+        MAX = 12.0
+        # noinspection PyPep8Naming
+        N = len(joblist)
+        num_plots = int(np.ceil(N / MAX))
+
+        ind = np.arange(int(MAX))  # the x locations for the groups
+        width = 0.16  # the width of the bars
+
+        plt.close('all')
+        fig = plt.figure(figsize=(14, 6 * num_plots))
+        # fig = plt.figure()
+        ax = []
+        lgd = None
+        for plot in range(1, num_plots + 1):
+            ax.append(fig.add_subplot(num_plots, 1, plot))
+            l1 = int((plot - 1) * MAX)
+            l2 = int(plot * MAX)
+            queued = [float(job.check_queued_time()) / 3600 for job in joblist[l1:l2]]
+            run = [float(job.check_run_time()) / 3600 for job in joblist[l1:l2]]
+            excess = [float(job.check_run_time()) / 3600 - average_run_time for job in joblist[l1:l2]]
+            failed_jobs = [int(job.check_failed_times()) for job in joblist[l1:l2]]
+            fail_queued = [float(job.check_fail_queued_time()) / 3600 for job in joblist[l1:l2]]
+            fail_run = [float(job.check_fail_run_time()) / 3600 for job in joblist[l1:l2]]
+            if plot == num_plots:
+                queued = queued + [0] * int(MAX - len(joblist[l1:l2]))
+                run = run + [0] * int(MAX - len(joblist[l1:l2]))
+                excess = excess + [0] * int(MAX - len(joblist[l1:l2]))
+                failed_jobs = failed_jobs + [0] * int(MAX - len(joblist[l1:l2]))
+                fail_queued = fail_queued + [0] * int(MAX - len(joblist[l1:l2]))
+                fail_run = fail_run + [0] * int(MAX - len(joblist[l1:l2]))
+                # ind = np.arange(len([int(job.check_queued_time())/3600 for job in joblist[l1:l2]]))
+            rects1 = ax[plot - 1].bar(ind, queued, width, color='r')
+            rects2 = ax[plot - 1].bar(ind + width, run, width, color='g')
+            rects3 = ax[plot - 1].bar(ind + width * 2, excess, width, color='b')
+            rects4 = ax[plot - 1].bar(ind + width * 3, failed_jobs, width, color='y')
+            rects5 = ax[plot - 1].bar(ind + width * 4, fail_queued, width, color='m')
+            rects6 = ax[plot - 1].bar(ind + width * 5, fail_run, width, color='c')
+            ax[plot - 1].set_ylabel('hours')
+            ax[plot - 1].set_xticks(ind + width)
+            ax[plot - 1].set_xticklabels([job.short_name for job in joblist[l1:l2]], rotation='vertical')
+            box = ax[plot - 1].get_position()
+            ax[plot - 1].set_position([box.x0, box.y0, box.width * 0.8, box.height * 0.8])
+            ax[plot - 1].set_title(expid, fontsize=20, fontweight='bold')
+            lgd = ax[plot - 1].legend((rects1[0], rects2[0], rects3[0], rects4[0], rects5[0], rects6[0]), (
+                'Queued (h)', 'Run (h)', 'Excess (h)', 'Failed jobs (#)', 'Fail Queued (h)', 'Fail Run (h)'),
+                loc="upper left", bbox_to_anchor=(1, 1))
+            autolabel(rects1)
+            autolabel(rects2)
+            failabel(rects4)
+            autolabel(rects5)
+            autolabel(rects6)
+            plt.ylim((1.15 * min_time, 1.15 * max_time))
+
+        plt.subplots_adjust(left=0.1, right=0.8, top=0.97, bottom=0.05, wspace=0.2, hspace=0.6)
+        plt.savefig(output_file, bbox_extra_artists=lgd)
+
+    @staticmethod
+    def clean_plot(expid):
+        """
+        Function to clean up space in the BasicConfig.LOCAL_ROOT_DIR/plot directory.
+        Removes all plots except last two.
+
+        :param expid: experiment's identifier
+        :type expid: str
+        """
+        search_dir = os.path.join(BasicConfig.LOCAL_ROOT_DIR, expid, "plot")
+        chdir(search_dir)
+        files = filter(path.isfile, listdir(search_dir))
+        files = [path.join(search_dir, f) for f in files if 'statistics' not in f]
+        files.sort(key=lambda x: path.getmtime(x))
+        remain = files[-2:]
+        filelist = [f for f in files if f not in remain]
+        for f in filelist:
+            remove(f)
+        Log.result("Plots cleaned!\nLast two plots remanining there.\n")
+
+    @staticmethod
+    def clean_stats(expid):
+        """
+        Function to clean up space in the BasicConfig.LOCAL_ROOT_DIR/plot directory.
+        Removes all stats' plots except the last one.
+
+        :param expid: experiment's identifier
+        :type expid: str
+        """
+        search_dir = os.path.join(BasicConfig.LOCAL_ROOT_DIR, expid, "plot")
+        chdir(search_dir)
+        files = filter(path.isfile, listdir(search_dir))
+        files = [path.join(search_dir, f) for f in files if 'statistics' in f]
+        files.sort(key=lambda x: path.getmtime(x))
+        remain = files[-1:]
+        filelist = [f for f in files if f not in remain]
+        for f in filelist:
+            remove(f)
+        Log.result("Stats cleaned!\nLast stats' plot remanining there.\n")
diff --git a/autosubmit/platforms/__init__.py b/autosubmit/platforms/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/autosubmit/platforms/ecplatform.py b/autosubmit/platforms/ecplatform.py
new file mode 100644
index 0000000..3c17c2d
--- /dev/null
+++ b/autosubmit/platforms/ecplatform.py
@@ -0,0 +1,253 @@
+#!/usr/bin/env python
+
+# Copyright 2014 Climate Forecasting Unit, IC3
+
+# This file is part of Autosubmit.
+
+# Autosubmit is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+
+# Autosubmit is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+
+# You should have received a copy of the GNU General Public License
+# along with Autosubmit.  If not, see <http://www.gnu.org/licenses/>.
+import textwrap
+import os
+from commands import getstatusoutput
+
+from autosubmit.platforms.hpcplatform import HPCPlatform, HPCPlatformException
+from autosubmit.config.log import Log
+
+
+class EcPlatform(HPCPlatform):
+    """
+    Class to manage queues with ecaccess
+
+    :param expid: experiment's identifier
+    :type expid: str
+    :param scheduler: scheduler to use
+    :type scheduler: str (pbs, loadleveler)
+    """
+
+    def __init__(self, expid, scheduler):
+        HPCPlatform.__init__(self)
+        self._host = ""
+        self.scratch = ""
+        self.project = ""
+        self.user = ""
+        if scheduler == 'pbs':
+            self._header = EcCcaHeader()
+        elif scheduler == 'loadleveler':
+            self._header = EcHeader()
+        else:
+            raise HPCPlatformException('ecaccess scheduler {0} not supported'.format(scheduler))
+        self.expid = expid
+        self.job_status = dict()
+        self.job_status['COMPLETED'] = ['DONE']
+        self.job_status['RUNNING'] = ['EXEC']
+        self.job_status['QUEUING'] = ['INIT', 'RETR', 'STDBY', 'WAIT']
+        self.job_status['FAILED'] = ['STOP']
+        self._pathdir = "\$HOME/LOG_" + self.expid
+        self.update_cmds()
+
+    def update_cmds(self):
+        """
+        Updates commands for platforms
+        """
+        self.root_dir = os.path.join(self.scratch, self.project, self.user, self.expid)
+        self.remote_log_dir = os.path.join(self.root_dir, "LOG_"+self.expid)
+        self.cancel_cmd = "eceaccess-job-delete"
+        self._checkjob_cmd = "ecaccess-job-list "
+        self._checkhost_cmd = "ecaccess-certificate-list"
+        self._submit_cmd = ("ecaccess-job-submit -distant -queueName " + self._host + " " + self._host + ":" +
+                            self.remote_log_dir + "/")
+        self.put_cmd = "ecaccess-file-put"
+        self.get_cmd = "ecaccess-file-get"
+        self.mkdir_cmd = ("ecaccess-file-mkdir " + self._host + ":" + self.scratch + "/" + self.project + "/" +
+                          self.user + "/" + self.expid + "; " + "ecaccess-file-mkdir " + self._host + ":" +
+                          self.remote_log_dir)
+
+    def get_checkhost_cmd(self):
+        return self._checkhost_cmd
+
+    def get_remote_log_dir(self):
+        return self.remote_log_dir
+
+    def get_mkdir_cmd(self):
+        return self.mkdir_cmd
+
+    def parse_job_output(self, output):
+        job_state = output.split('\n')
+        if len(job_state) > 7:
+            job_state = job_state[7].split()
+            if len(job_state) > 1:
+                return job_state[1]
+        return 'DONE'
+
+    def get_submitted_job_id(self, output):
+        return output
+
+    def jobs_in_queue(self):
+        """
+        Returns an empty list because ecaccess does not support this command
+
+        :return: empty list
+        :rtype: list
+        """
+        return []
+
+    def get_checkjob_cmd(self, job_id):
+        return self._checkjob_cmd + str(job_id)
+
+    def get_submit_cmd(self, job_script):
+        return self._submit_cmd + job_script
+
+    def connect(self):
+        """
+        In this case, it does nothing because a connection is established for each command
+
+        :return: True
+        :rtype: bool
+        """
+        return True
+
+    def send_command(self, command):
+        (status, output) = getstatusoutput(command)
+        if status != 0:
+            Log.error('Could not execute command {0} on {1}'.format(command, self._host))
+            return False
+        self._ssh_output = output
+        return True
+
+    def send_file(self, local_path, remote_path):
+        command = '{0} {1} {3}:{2}'.format(self.put_cmd, local_path, remote_path, self._host)
+        (status, output) = getstatusoutput(command)
+        if status != 0:
+            Log.error('Could not send file {0} to {1}'.format(local_path, remote_path))
+            return False
+        return True
+
+    def get_file(self, remote_path, local_path, omit_error=False):
+        command = '{0} {3}:{2} {1}'.format(self.get_cmd, local_path, remote_path, self._host)
+        (status, output) = getstatusoutput(command)
+        if status != 0:
+            if not omit_error:
+                Log.error('Could not get file {0} from {1}'.format(local_path, remote_path))
+            return False
+        return True
+
+    def get_ssh_output(self):
+        return self._ssh_output
+
+
+class EcHeader:
+    """Class to handle the ECMWF headers of a job"""
+
+    # noinspection PyMethodMayBeStatic,PyUnusedLocal
+    def get_queue_directive(self, job):
+        """
+        Returns queue directive for the specified job
+
+        :param job: job to create the queue directive for
+        :type job: Job
+        :return: queue directive
+        :rtype: str
+        """
+        # There is no queue, so directive is empty
+        return ""
+
+    SERIAL = textwrap.dedent("""\
+            #!/bin/ksh
+            ###############################################################################
+            #                   %TASKTYPE% %EXPID% EXPERIMENT
+            ###############################################################################
+            #
+            #@ shell            = /usr/bin/ksh
+            #@ class            = ns
+            #@ job_type         = serial
+            #@ job_name         = %JOBNAME%
+            #@ output           = %SCRATCH_DIR%/%HPCPROJ%/%HPCUSER%/%EXPID%/LOG_%EXPID%/$(job_name).$(jobid).out
+            #@ error            = %SCRATCH_DIR%/%HPCPROJ%/%HPCUSER%/%EXPID%/LOG_%EXPID%/$(job_name).$(jobid).err
+            #@ notification     = error
+            #@ resources        = ConsumableCpus(1) ConsumableMemory(1200mb)
+            #@ wall_clock_limit = %WALLCLOCK%:00
+            #@ platforms
+            #
+            ###############################################################################
+            """)
+
+    PARALLEL = textwrap.dedent("""\
+            #!/bin/ksh
+            ###############################################################################
+            #                   %TASKTYPE% %EXPID% EXPERIMENT
+            ###############################################################################
+            #
+            #@ shell            = /usr/bin/ksh
+            #@ class            = np
+            #@ job_type         = parallel
+            #@ job_name         = %JOBNAME%
+            #@ output           = %SCRATCH_DIR%/%HPCPROJ%/%HPCUSER%/%EXPID%/LOG_%EXPID%/$(job_name).$(jobid).out
+            #@ error            = %SCRATCH_DIR%/%HPCPROJ%/%HPCUSER%/%EXPID%/LOG_%EXPID%/$(job_name).$(jobid).err
+            #@ notification     = error
+            #@ resources        = ConsumableCpus(1) ConsumableMemory(1200mb)
+            #@ ec_smt           = no
+            #@ total_tasks      = %NUMPROC%
+            #@ wall_clock_limit = %WALLCLOCK%:00
+            #@ platforms
+            #
+            ###############################################################################
+            """)
+
+
+class EcCcaHeader:
+    """Class to handle the ECMWF headers of a job"""
+
+    # noinspection PyMethodMayBeStatic,PyUnusedLocal
+    def get_queue_directive(self, job):
+        """
+        Returns queue directive for the specified job
+
+        :param job: job to create the queue directive for
+        :type job: Job
+        :return: queue directive
+        :rtype: str
+        """
+        # There is no queue, so directive is empty
+        return ""
+
+    SERIAL = textwrap.dedent("""\
+             #!/bin/bash
+             ###############################################################################
+             #                   %TASKTYPE% %EXPID% EXPERIMENT
+             ###############################################################################
+             #
+             #PBS -N %JOBNAME%
+             #PBS -q ns
+             #PBS -l walltime=%WALLCLOCK%:00
+             #PBS -l EC_billing_account=%HPCBUDG%
+             #
+             ###############################################################################
+
+            """)
+
+    PARALLEL = textwrap.dedent("""\
+             #!/bin/bash
+             ###############################################################################
+             #                   %TASKTYPE% %EXPID% EXPERIMENT
+             ###############################################################################
+             #
+             #PBS -N %JOBNAME%
+             #PBS -q np
+             #PBS -l EC_total_tasks=%NUMPROC%
+             #PBS -l EC_threads_per_task=%NUMTHREADS%
+             #PBS -l EC_tasks_per_node=%NUMTASK%
+             #PBS -l walltime=%WALLCLOCK%:00
+             #PBS -l EC_billing_account=%HPCBUDG%
+             #
+             ###############################################################################
+            """)
\ No newline at end of file
diff --git a/autosubmit/platforms/hpcplatform.py b/autosubmit/platforms/hpcplatform.py
new file mode 100644
index 0000000..7c8dc64
--- /dev/null
+++ b/autosubmit/platforms/hpcplatform.py
@@ -0,0 +1,592 @@
+#!/usr/bin/env python
+
+# Copyright 2014 Climate Forecasting Unit, IC3
+
+# This file is part of Autosubmit.
+
+# Autosubmit is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+
+# Autosubmit is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+
+# You should have received a copy of the GNU General Public License
+# along with Autosubmit.  If not, see <http://www.gnu.org/licenses/>.
+
+
+import os
+import paramiko
+from time import sleep
+from sys import exit
+
+from autosubmit.job.job_common import Status
+from autosubmit.config.basicConfig import BasicConfig
+from autosubmit.config.log import Log
+
+
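+# Seconds to wait between queue polls while waiting for jobs to finish (see smart_stop)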
+SLEEPING_TIME = 30
+
+
+class HPCPlatform:
+    """
+    Base class to manage schedulers
+    """
+    def __init__(self):
+        self._submit_cmd = None
+        self.get_cmd = None
+        self._checkjob_cmd = None
+        self.job_status = None
+        self.expid = None
+        self.put_cmd = None
+        self.mkdir_cmd = None
+        self.cancel_cmd = None
+        self._header = None
+        self._default_queue = None
+        self._serial_platform = None
+        self._serial_queue = None
+        self._ssh_config = None
+        self._user_config_file = None
+        self._host_config = None
+        self._host_config_id = None
+        self._ssh = None
+        self._ssh_output = None
+
+    @property
+    def header(self):
+        """
+        Header to add to jobs for scheduler configuration
+
+        :return: header
+        :rtype: object
+        """
+        return self._header
+
+    def connect(self):
+        """
+        Creates ssh connection to host
+
+        :return: True if connection is created, False otherwise
+        :rtype: bool
+        """
+        try:
+            self._ssh = paramiko.SSHClient()
+            self._ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
+            self._ssh_config = paramiko.SSHConfig()
+            self._user_config_file = os.path.expanduser("~/.ssh/config")
+            if os.path.exists(self._user_config_file):
+                with open(self._user_config_file) as f:
+                    # noinspection PyTypeChecker
+                    self._ssh_config.parse(f)
+            self._host_config = self._ssh_config.lookup(self._host)
+            if 'identityfile' in self._host_config:
+                self._host_config_id = self._host_config['identityfile']
+
+            self._ssh.connect(self._host_config['hostname'], 22, username=self.user,
+                              key_filename=self._host_config_id)
+            return True
+        except BaseException as e:
+            Log.error('Can not create ssh connection to {0}: {1}', self._host, e.message)
+            return False
+
+    def send_command(self, command):
+        """
+        Sends given command to HPC
+
+        :param command: command to send
+        :type command: str
+        :return: True if executed, False if failed
+        :rtype: bool
+        """
+        if self._ssh is None:
+            if not self.connect():
+                return None
+        try:
+            stdin, stdout, stderr = self._ssh.exec_command(command)
+            stderr_readlines = stderr.readlines()
+            self._ssh_output = stdout.read().rstrip()
+            if stdout.channel.recv_exit_status() == 0:
+                if len(stderr_readlines) > 0:
+                    Log.warning('Command {0} in {1} warning: {2}', command, self._host, ''.join(stderr_readlines))
+                Log.debug('Command {0} in {1} successful with out message: {2}', command, self._host, self._ssh_output)
+                return True
+            else:
+                Log.error('Command {0} in {1} failed with error message: {2}',
+                          command, self._host, ''.join(stderr_readlines))
+                return False
+        except BaseException as e:
+            Log.error('Can not send command {0} to {1}: {2}', command, self._host, e.message)
+            return False
+
+    def send_file(self, local_path, root_path):
+        """
+        Copies file in local_path to remote_path
+
+        :param local_path: path to the local file to copy
+        :type local_path: str
+        :param root_path: path to the remote file to create
+        :type root_path: str
+        :return: True if successful, False if failed
+        :rtype: bool
+        """
+        if self._ssh is None:
+            if not self.connect():
+                return None
+
+        try:
+            ftp = self._ssh.open_sftp()
+            ftp.put(local_path, root_path)
+            ftp.close()
+            return True
+        except BaseException as e:
+            Log.error('Can not send file {0} to {1}: {2}', local_path, root_path, e.message)
+            return False
+
+    def get_file(self, remote_path, local_path, omit_error=False):
+        """
+        Copies file in remote_path to local_path
+
+        :param remote_path: path to the remote file to copy
+        :type remote_path: str
+        :param local_path: path to the local file to create
+        :type local_path: str
+        :param omit_error: if True, do not log an error when the file can not be retrieved
+        :type omit_error: bool
+        :return: True if successful, False if failed
+        :rtype: bool
+        """
+        if self._ssh is None:
+            if not self.connect():
+                return None
+
+        try:
+            ftp = self._ssh.open_sftp()
+            ftp.get(remote_path, local_path)
+            ftp.close()
+            return True
+        except BaseException as e:
+            if not omit_error:
+                Log.error('Can not get file from {0} to {1}: {2}', remote_path, local_path, e.message)
+            return False
+
+    def get_ssh_output(self):
+        """
+        Gets output from last command executed
+
+        :return: output from last command
+        :rtype: str
+        """
+        Log.debug('Output {0}', self._ssh_output)
+        return self._ssh_output
+
+    def close_connection(self):
+        """
+        Closes ssh connection to host
+        """
+        if self._ssh is None:
+            return
+        self._ssh.close()
+
+    def get_queue(self):
+        """
+        Returns default queue for current host. If not configured, returns empty string
+
+        :return: default queue for host
+        :rtype: str
+        """
+        if self._default_queue is None:
+            return ''
+        return self._default_queue
+
+    def set_queue(self, value):
+        """
+        Configures default queue for current host.
+
+        :param value: default queue for host
+        :type value: str
+        """
+        self._default_queue = value
+
+    def get_serial_queue(self):
+        """
+        Returns serial queue for current host. If not configured, returns empty string
+
+        :return: serial queue for host
+        :rtype: str
+        """
+        if self._serial_queue is None:
+            return self.get_queue()
+        return self._serial_queue
+
+    def set_serial_queue(self, value):
+        """
+        Configures serial queue for current host.
+
+        :param value: serial queue for host
+        :type value: str
+        """
+        self._serial_queue = value
+
+    def get_serial_platform(self):
+        """
+        Returns serial platform for current host. If not configured, returns self
+
+        :return: serial platform for host
+        :rtype: HPCPlatform
+        """
+        if self._serial_platform is None:
+            return self
+        else:
+            return self._serial_platform
+
+    def set_serial_platform(self, value):
+        """
+        Configures serial platform for current host.
+
+        :param value: serial platform for host
+        :type value: HPCPlatform
+        """
+        self._serial_platform = value
+
+    def cancel_job(self, job_id):
+        """
+        Cancels job
+
+        :param job_id: job to cancel
+        :type job_id: int
+        """
+        Log.debug(self.cancel_cmd + ' ' + str(job_id))
+        self.send_command(self.cancel_cmd + ' ' + str(job_id))
+
+    def check_job(self, job_id):
+        """
+        Checks job status
+
+        :param job_id: job to check
+        :type job_id: int
+        :return: current job status
+        :rtype: Status
+        """
+        job_status = Status.UNKNOWN
+
+        if type(job_id) is not int:
+            # URi: logger
+            Log.error('check_job() The job id ({0}) is not an integer.', job_id)
+            # URi: value ?
+            return job_status
+
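+        # Poll the scheduler, retrying up to 10 times at 10-second intervals
+        # before giving up on this job id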
+        retry = 10
+        while not self.send_command(self.get_checkjob_cmd(job_id)) and retry > 0:
+            retry -= 1
+            Log.warning('Retrying check job command: {0}', self.get_checkjob_cmd(job_id))
+            Log.error('Can not get job status for job id ({0}), retrying in 10 sec', job_id)
+            sleep(10)
+
+        if retry > 0:
+            Log.debug('Successful check job command: {0}', self.get_checkjob_cmd(job_id))
+            job_status = self.parse_job_output(self.get_ssh_output())
+            # URi: define status list in HPC Queue Class
+            if job_status in self.job_status['COMPLETED']:
+                job_status = Status.COMPLETED
+            elif job_status in self.job_status['RUNNING']:
+                job_status = Status.RUNNING
+            elif job_status in self.job_status['QUEUING']:
+                job_status = Status.QUEUING
+            elif job_status in self.job_status['FAILED']:
+                job_status = Status.FAILED
+            else:
+                job_status = Status.UNKNOWN
+        else:
+            # Could not get the job status after all retries; treat it as unknown
+            job_status = Status.UNKNOWN
+            Log.error('check_job() The job id ({0}) status is {1}.', job_id, job_status)
+        return job_status
+
+    def check_host(self):
+        """
+        Checks host availability
+
+        :return: True if host is available, False otherwise
+        :rtype: bool
+        """
+        checkhost_cmd = self.get_checkhost_cmd()
+        if not self.send_command(checkhost_cmd):
+            self.connect()
+            if not self.send_command(checkhost_cmd):
+                Log.debug('The host ' + self._host + ' is down')
+                return False
+        Log.debug('The host ' + self._host + ' is up')
+        return True
+
+    def check_remote_log_dir(self):
+        """
+        Creates log dir on remote host
+        """
+        if self.send_command(self.get_mkdir_cmd()):
+            Log.debug('{0} has been created on {1} .', self.remote_log_dir, self._host)
+        else:
+            Log.error('Could not create the DIR {0} on HPC {1}'.format(self.remote_log_dir, self._host))
+
+    def send_script(self, job_script):
+        """
+        Send a script to remote host
+
+        :param job_script: name of script to send
+        :type job_script: str
+        """
+        if self.send_file(os.path.join(BasicConfig.LOCAL_ROOT_DIR, self.expid, 'tmp', str(job_script)),
+                          os.path.join(self.remote_log_dir, str(job_script))):
+            Log.debug('The script {0} has been sent'.format(job_script))
+        else:
+            Log.error('The script {0} has not been sent'.format(job_script))
+
+    def get_completed_files(self, jobname, retries=1, omit_error=False):
+        """
+        Copies *COMPLETED* files from remote to local
+
+        :param jobname: name of job to check
+        :type jobname: str
+        :param retries: additional attempts to make, 5 seconds apart
+        :type retries: int
+        :param omit_error: if True, do not log an error when the file is missing
+        :type omit_error: bool
+        :return: True if successful, False otherwise
+        :rtype: bool
+        """
+        filename = jobname + '_COMPLETED'
+        completed_local_path = os.path.join(BasicConfig.LOCAL_ROOT_DIR, self.expid, BasicConfig.LOCAL_TMP_DIR, filename)
+        if os.path.exists(completed_local_path):
+            os.remove(completed_local_path)
+        completed_remote_path = os.path.join(self.remote_log_dir, filename)
+        if self.get_file(completed_remote_path, completed_local_path, omit_error):
+            Log.debug('The COMPLETED files have been transferred')
+            return True
+
+        while retries > 0:
+            # wait five seconds to check get file
+            sleep(5)
+            if self.get_file(completed_remote_path, completed_local_path, omit_error):
+                Log.debug('The COMPLETED files have been transferred')
+                return True
+            retries -= 1
+
+        Log.debug('Something did not work well when transferring the COMPLETED files')
+        return False
+
+    def submit_job(self, job_script):
+        """
+        Submits job to scheduler and returns job id
+
+        :param job_script: script path
+        :type job_script: str
+        :return: job id
+        :rtype: int
+        """
+        if self.send_command(self.get_submit_cmd(job_script)):
+            job_id = self.get_submitted_job_id(self.get_ssh_output())
+            Log.debug("Job ID: {0}", job_id)
+            return int(job_id)
+        else:
+            return None
+
+    # noinspection PyUnusedLocal
+    def normal_stop(self, arg1, arg2):
+        # Cancel every queued job and exit; the two unused arguments match the
+        # signature expected of a signal handler
+        for job_id in self.jobs_in_queue():
+            self.cancel_job(job_id)
+        exit(0)
+
+    def smart_stop(self):
+        # Wait for the queue to drain, checking every SLEEPING_TIME seconds
+        while self.jobs_in_queue():
+            sleep(SLEEPING_TIME)
+        exit(0)
+
+    def set_host(self, new_host):
+        """
+        Sets host name
+        :param new_host: host
+        :type new_host: str
+        """
+        self._host = new_host
+
+    def set_scratch(self, new_scratch):
+        """
+        Sets scratch directory name
+        :param new_scratch: scratch path
+        :type new_scratch: str
+        """
+        self.scratch = new_scratch
+
+    def set_project(self, new_project):
+        """
+        Sets project name
+        :param new_project: project
+        :type new_project: str
+        """
+        self.project = new_project
+
+    def set_budget(self, new_budget):
+        """
+        Sets budget
+        :param new_budget: budget
+        :type new_budget: str
+        """
+        self.budget = new_budget
+
+    def set_user(self, new_user):
+        """
+        Sets user name
+        :param new_user: user
+        :type new_user: str
+        """
+        self.user = new_user
+
+    def set_remote_log_dir(self, new_remote_log_dir):
+        """
+        Sets remote directory for logs
+        :param new_remote_log_dir: path to log directory
+        :type new_remote_log_dir: str
+        """
+        self.remote_log_dir = new_remote_log_dir
+
+    def get_checkhost_cmd(self):
+        """
+        Gets command to check platform availability
+
+        :return: command to check platform availability
+        :rtype: str
+        """
+        raise NotImplementedError
+
+    def get_mkdir_cmd(self):
+        """
+        Gets command to create directories on HPC
+
+        :return: command to create directories on HPC
+        :rtype: str
+        """
+        raise NotImplementedError
+
+    def parse_job_output(self, output):
+        """
+        Parses check job command output so it can be interpreted by autosubmit
+
+        :param output: output to parse
+        :type output: str
+        :return: job status
+        :rtype: str
+        """
+        raise NotImplementedError
+
+    def jobs_in_queue(self):
+        """
+        Gets ids of jobs queued on this host
+
+        :return: ids of jobs in queue
+        :rtype: list
+        """
+        raise NotImplementedError
+
+    def get_submitted_job_id(self, output):
+        """
+        Parses submit command output to extract job id
+        :param output: output to parse
+        :type output: str
+        :return: job id
+        :rtype: str
+        """
+        raise NotImplementedError
+
+    def update_cmds(self):
+        """
+        Updates commands for platforms
+        """
+        raise NotImplementedError
+
+    def get_remote_log_dir(self):
+        """
+        Gets remote directory for logs
+
+        :return: log directory path
+        :rtype: str
+        """
+        raise NotImplementedError
+
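+    # Template selection: jobs with more than one processor get the PARALLEL
+    # header, all others the SERIAL one; the %QUEUE_DIRECTIVE% placeholder is
+    # then filled in by the header's get_queue_directive() (empty on platforms
+    # without queues)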
+    # noinspection PyUnresolvedReferences
+    def get_header(self, job):
+        """
+        Gets header to be used by the job
+
+        :param job: job
+        :type job: Job
+        :return: header to use
+        :rtype: str
+        """
+        if job.processors > 1:
+            header = self.header.PARALLEL
+        else:
+            header = self.header.SERIAL
+
+        header = header.replace('%QUEUE_DIRECTIVE%', self.header.get_queue_directive(job))
+        return header
+
+    def get_checkjob_cmd(self, job_id):
+        """
+        Returns command to check job status on remote platforms
+
+        :param job_id: id of job to check
+        :type job_id: int
+        :return: command to check job status
+        :rtype: str
+        """
+        raise NotImplementedError
+
+    def get_submit_cmd(self, job_script):
+        """
+        Get command to add job to scheduler
+
+        :param job_script: path to job script
+        :type job_script: str
+        :return: command to submit job to platform
+        :rtype: str
+        """
+        raise NotImplementedError
+
+    def get_shcall(self, job_script):
+        """
+        Gets execution command for given job
+
+        :param job_script: script to run
+        :type job_script: str
+        :return: command to execute script
+        :rtype: str
+        """
+        return 'nohup bash {0} > {0}.out 2> {0}.err & echo $!'.format(os.path.join(self.remote_log_dir,
+                                                                                   job_script))
+
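+    # kill -0 sends no signal; it only tests that the process exists, so the
+    # echoed exit status is '0' while the script is still running and non-zero
+    # once it has finished (LocalPlatform maps '0' to RUNNING and '1' to COMPLETED)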
+    @staticmethod
+    def get_pscall(job_id):
+        """
+        Gets command to check if a job is running given process identifier
+
+        :param job_id: process identifier
+        :type job_id: int
+        :return: command to check job status script
+        :rtype: str
+        """
+        return 'nohup kill -0 {0}; echo $?'.format(job_id)
+
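+    # The generated one-liner echoes the qstat state column for the job (e.g.
+    # 'r' or 'qw') while it is listed, or 'c' once it has left the queue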
+    @staticmethod
+    def get_qstatjob(job_id):
+        """
+        Gets qstat command for given job id
+
+        :param job_id: job to check
+        :type job_id: int
+        :return: qstat command for job
+        :rtype: str
+        """
+        return '''if [[ $(qstat | grep {0}) != '' ]];
+        then echo $(qstat | grep {0} | awk '{{print $5}}' | head -n 1); else echo 'c'; fi'''.format(job_id)
+
+
+class HPCPlatformException(Exception):
+    """
+    Exception raised from HPC queues
+    """
+    def __init__(self, msg):
+        self.message = msg
diff --git a/autosubmit/platforms/localplatform.py b/autosubmit/platforms/localplatform.py
new file mode 100644
index 0000000..c863904
--- /dev/null
+++ b/autosubmit/platforms/localplatform.py
@@ -0,0 +1,158 @@
+#!/usr/bin/env python
+
+# Copyright 2014 Climate Forecasting Unit, IC3
+
+# This file is part of Autosubmit.
+
+# Autosubmit is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+
+# Autosubmit is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+
+# You should have received a copy of the GNU General Public License
+# along with Autosubmit.  If not, see <http://www.gnu.org/licenses/>.
+from commands import getstatusoutput
+import os
+import textwrap
+from xml.dom.minidom import parseString
+
+from autosubmit.platforms.hpcplatform import HPCPlatform
+from autosubmit.config.basicConfig import BasicConfig
+from autosubmit.config.log import Log
+
+
+class LocalPlatform(HPCPlatform):
+    """
+    Class to manage jobs to localhost
+
+    :param expid: experiment's identifier
+    :type expid: str
+    """
+    def __init__(self, expid):
+        HPCPlatform.__init__(self)
+        self._host = ""
+        self.scratch = ""
+        self.project = ""
+        self.user = ""
+        self._header = LocalHeader()
+        self.expid = expid
+        self.job_status = dict()
+        self.job_status['COMPLETED'] = ['1']
+        self.job_status['RUNNING'] = ['0']
+        self.job_status['QUEUING'] = []
+        self.job_status['FAILED'] = []
+        self.update_cmds()
+
+    def update_cmds(self):
+        self.root_dir = os.path.join(BasicConfig.LOCAL_ROOT_DIR, self.expid)
+        self.remote_log_dir = os.path.join(self.root_dir, "tmp", 'LOG_' + self.expid)
+        self.cancel_cmd = "kill -SIGINT"
+        self._checkhost_cmd = "echo 1"
+        self.put_cmd = "cp -p"
+        self.get_cmd = "cp"
+        self.mkdir_cmd = "mkdir -p " + self.remote_log_dir
+
+    def get_checkhost_cmd(self):
+        return self._checkhost_cmd
+
+    def get_remote_log_dir(self):
+        return self.remote_log_dir
+
+    def get_mkdir_cmd(self):
+        return self.mkdir_cmd
+
+    def parse_job_output(self, output):
+        return output
+
+    def get_submitted_job_id(self, output):
+        return output
+
+    def jobs_in_queue(self):
+        # There is no scheduler on localhost, so nothing is ever waiting in a
+        # queue (note that parseString('') would raise an ExpatError)
+        return []
+
+    def get_submit_cmd(self, job_script):
+        return self.get_shcall(job_script)
+
+    def get_checkjob_cmd(self, job_id):
+        return self.get_pscall(job_id)
+
+    def connect(self):
+        return True
+
+    def send_command(self, command):
+        (status, output) = getstatusoutput(command)
+        if status != 0:
+            Log.error('Could not execute command {0} on {1}'.format(command, self._host))
+            return False
+        self._ssh_output = output
+        return True
+
+    def send_file(self, local_path, remote_path):
+        command = '{0} {1} {2}'.format(self.put_cmd, local_path, remote_path)
+        (status, output) = getstatusoutput(command)
+        if status != 0:
+            Log.error('Could not send file {0} to {1}'.format(local_path, remote_path))
+            return False
+        return True
+
+    def get_file(self, remote_path, local_path, omit_error=False):
+        # Copy the remote file onto the local path
+        command = '{0} {1} {2}'.format(self.get_cmd, remote_path, local_path)
+        (status, output) = getstatusoutput(command)
+        if status != 0:
+            if not omit_error:
+                Log.error('Could not get file {0} to {1}'.format(remote_path, local_path))
+            return False
+        return True
+
+    def get_ssh_output(self):
+        return self._ssh_output
+
+
+class LocalHeader:
+    """Class to handle the Ps headers of a job"""
+
+    # noinspection PyMethodMayBeStatic,PyUnusedLocal
+    def get_queue_directive(self, job):
+        """
+        Returns queue directive for the specified job
+
+        :param job: job to create queue directive for
+        :type job: Job
+        :return: queue directive
+        :rtype: str
+        """
+        # There is no queue, so directive is empty
+        return ""
+
+    SERIAL = textwrap.dedent("""\
+            #!/bin/bash
+            ###############################################################################
+            #                   %TASKTYPE% %EXPID% EXPERIMENT
+            ###############################################################################
+            """)
+
+    PARALLEL = textwrap.dedent("""\
+            #!/bin/bash
+            ###############################################################################
+            #                   %TASKTYPE% %EXPID% EXPERIMENT
+            ###############################################################################
+            """)
+
+# def main():
+# q = PsQueue()
+#     q.check_job(1688)
+#     j = q.submit_job("/cfu/autosubmit/l002/templates/l002.sim")
+#     sleep(10)
+#     print q.check_job(j)
+#     q.cancel_job(j)
+#
+#
+# if __name__ == "__main__":
+#     main()
diff --git a/autosubmit/platforms/lsfplatform.py b/autosubmit/platforms/lsfplatform.py
new file mode 100644
index 0000000..c42c502
--- /dev/null
+++ b/autosubmit/platforms/lsfplatform.py
@@ -0,0 +1,139 @@
+#!/usr/bin/env python
+
+# Copyright 2014 Climate Forecasting Unit, IC3
+
+# This file is part of Autosubmit.
+
+# Autosubmit is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+
+# Autosubmit is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+
+# You should have received a copy of the GNU General Public License
+# along with Autosubmit.  If not, see <http://www.gnu.org/licenses/>.
+import textwrap
+import os
+
+from autosubmit.platforms.hpcplatform import HPCPlatform
+
+
+class LsfPlatform(HPCPlatform):
+    """
+    Class to manage jobs to host using LSF scheduler
+
+    :param expid: experiment's identifier
+    :type expid: str
+    """
+    def __init__(self, expid):
+        HPCPlatform.__init__(self)
+        self._host = "mn-ecm86"
+        self.scratch = ""
+        self.project = ""
+        self.user = ""
+        self._header = LsfHeader()
+        self.expid = expid
+        self.job_status = dict()
+        self.job_status['COMPLETED'] = ['DONE']
+        self.job_status['RUNNING'] = ['RUN']
+        self.job_status['QUEUING'] = ['PEND', 'FW_PEND']
+        self.job_status['FAILED'] = ['SSUSP', 'USUSP', 'EXIT']
+        self.update_cmds()
+
+    def update_cmds(self):
+        self.root_dir = os.path.join(self.scratch, self.project, self.user, self.expid)
+        self.remote_log_dir = os.path.join(self.root_dir, "LOG_"+self.expid)
+        self.cancel_cmd = "bkill"
+        self._checkjob_cmd = "bjobs "
+        self._checkhost_cmd = "echo 1"
+        self._submit_cmd = "bsub -cwd " + self.remote_log_dir + " < " + self.remote_log_dir + "/"
+        self.put_cmd = "scp"
+        self.get_cmd = "scp"
+        self.mkdir_cmd = "mkdir -p " + self.remote_log_dir
+
+    def get_checkhost_cmd(self):
+        return self._checkhost_cmd
+
+    def get_mkdir_cmd(self):
+        return self.mkdir_cmd
+
+    def get_remote_log_dir(self):
+        return self.remote_log_dir
+
+    def parse_job_output(self, output):
+        job_state = output.split('\n')
+        if len(job_state) > 1:
+            job_state = job_state[1].split()
+            if len(job_state) > 2:
+                return job_state[2]
+        # If the output can not be parsed, assume the job is done: Autosubmit
+        # will then look for the COMPLETED file and mark the job as failed if
+        # the file is not present
+        return 'DONE'
+
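+    # bsub prints e.g. 'Job <1688> is submitted to queue <ns>.'; the job id is
+    # the text between the first '<' and '>'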
+    def get_submitted_job_id(self, output):
+        return output.split('<')[1].split('>')[0]
+
+    def jobs_in_queue(self):
+        # Not implemented for LSF; report an empty queue
+        return []
+
+    def get_checkjob_cmd(self, job_id):
+        return self._checkjob_cmd + str(job_id)
+
+    def get_submit_cmd(self, job_script):
+        return self._submit_cmd + job_script
+
+
+class LsfHeader:
+    """Class to handle the MareNostrum3 headers of a job"""
+
+    # noinspection PyMethodMayBeStatic
+    def get_queue_directive(self, job):
+        """
+        Returns queue directive for the specified job
+
+        :param job: job to create queue directive for
+        :type job: Job
+        :return: queue directive
+        :rtype: str
+        """
+        # Return a queue directive only when a queue is configured for the job
+        if job.parameters['CURRENT_QUEUE'] == '':
+            return ""
+        else:
+            return "BSUB -q {0}".format(job.parameters['CURRENT_QUEUE'])
+
+    SERIAL = textwrap.dedent("""\
+            #!/bin/sh
+            ###############################################################################
+            #                   %TASKTYPE% %EXPID% EXPERIMENT
+            ###############################################################################
+            #
+            #%QUEUE_DIRECTIVE%
+            #BSUB -J %JOBNAME%
+            #BSUB -oo %SCRATCH_DIR%/%HPCPROJ%/%HPCUSER%/%EXPID%/LOG_%EXPID%/%JOBNAME%_%J.out
+            #BSUB -eo %SCRATCH_DIR%/%HPCPROJ%/%HPCUSER%/%EXPID%/LOG_%EXPID%/%JOBNAME%_%J.err
+            #BSUB -W %WALLCLOCK%
+            #BSUB -n %NUMPROC%
+            #
+            ###############################################################################
+            """)
+
+    PARALLEL = textwrap.dedent("""\
+            #!/bin/sh
+            ###############################################################################
+            #                   %TASKTYPE% %EXPID% EXPERIMENT
+            ###############################################################################
+            #
+            #%QUEUE_DIRECTIVE%
+            #BSUB -J %JOBNAME%
+            #BSUB -oo %SCRATCH_DIR%/%HPCPROJ%/%HPCUSER%/%EXPID%/LOG_%EXPID%/%JOBNAME%_%J.out
+            #BSUB -eo %SCRATCH_DIR%/%HPCPROJ%/%HPCUSER%/%EXPID%/LOG_%EXPID%/%JOBNAME%_%J.err
+            #BSUB -W %WALLCLOCK%
+            #BSUB -n %NUMPROC%
+            #BSUB -R "span[ptile=16]"
+            #
+            ###############################################################################
+            """)
\ No newline at end of file
diff --git a/autosubmit/platforms/pbsplatform.py b/autosubmit/platforms/pbsplatform.py
new file mode 100644
index 0000000..b217148
--- /dev/null
+++ b/autosubmit/platforms/pbsplatform.py
@@ -0,0 +1,244 @@
+#!/usr/bin/env python
+
+# Copyright 2014 Climate Forecasting Unit, IC3
+
+# This file is part of Autosubmit.
+
+# Autosubmit is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+
+# Autosubmit is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+
+# You should have received a copy of the GNU General Public License
+# along with Autosubmit.  If not, see <http://www.gnu.org/licenses/>.
+import textwrap
+import os
+
+from autosubmit.platforms.hpcplatform import HPCPlatform, HPCPlatformException
+from autosubmit.config.log import Log
+
+
+class PBSPlatform(HPCPlatform):
+    """
+    Class to manage jobs to host using PBS scheduler
+
+    :param expid: experiment's identifier
+    :type expid: str
+    :param version: scheduler version
+    :type version: str
+    """
+
+    def __init__(self, expid, version):
+        HPCPlatform.__init__(self)
+        self._host = ""
+        self._version = version
+        self.scratch = ""
+        self.project = ""
+        self.user = ""
+
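+        # Header directives differ between PBS releases: 10.x (Hector), 11.x
+        # (Lindgren) and 12.x (Archer) each get their own header class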
+        if version.startswith('10'):
+            self._header = Pbs10Header()
+        elif version.startswith('11'):
+            self._header = Pbs11Header()
+        elif version.startswith('12'):
+            self._header = Pbs12Header()
+        else:
+            Log.error('PBS version {0} not supported'.format(version))
+            raise HPCPlatformException('PBS version {0} not supported'.format(version))
+
+        self.expid = expid
+        self.job_status = dict()
+        self.job_status['COMPLETED'] = ['F', 'E', 'c', 'C']
+        self.job_status['RUNNING'] = ['R']
+        self.job_status['QUEUING'] = ['Q', 'H', 'S', 'T', 'W', 'U', 'M']
+        self.job_status['FAILED'] = ['Failed', 'Node_fail', 'Timeout']
+        self.update_cmds()
+
+    def update_cmds(self):
+        self.root_dir = os.path.join(self.scratch, self.project, self.user, self.expid)
+        self.remote_log_dir = os.path.join(self.root_dir, "LOG_"+self.expid)
+        self.cancel_cmd = "ssh " + self._host + " qdel"
+        self._checkhost_cmd = "ssh " + self._host + " echo 1"
+        self.put_cmd = "scp"
+        self.get_cmd = "scp"
+        self.mkdir_cmd = "ssh " + self._host + " mkdir -p " + self.remote_log_dir
+        self._submit_cmd = "ssh " + self._host + " qsub -d " + self.remote_log_dir + " " + self.remote_log_dir + "/ "
+
+        if self._version.startswith('11'):
+            self._checkjob_cmd = "ssh " + self._host + " qstat"
+
+    def get_checkhost_cmd(self):
+        return self._checkhost_cmd
+
+    def get_remote_log_dir(self):
+        return self.remote_log_dir
+
+    def get_mkdir_cmd(self):
+        return self.mkdir_cmd
+
+    def parse_job_output(self, output):
+        # job_state = output.split('\n')[2].split()[4]
+        # return job_state
+        return output
+
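+    # PBS qsub prints the full job id, e.g. '1688.sdb'; keep only the part
+    # before the first dot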
+    def get_submitted_job_id(self, output):
+        return output.split('.')[0]
+
+    def jobs_in_queue(self):
+        # Not implemented for PBS; report an empty queue
+        return []
+
+    def get_submit_cmd(self, job_script):
+        return self._submit_cmd + job_script
+
+    def get_checkjob_cmd(self, job_id):
+        if self._version.startswith('11'):
+            return self._checkjob_cmd + str(job_id)
+        else:
+            return "ssh " + self._host + " " + self.get_qstatjob(job_id)
+
+
+class Pbs12Header:
+    """Class to handle the Archer headers of a job"""
+
+    # noinspection PyMethodMayBeStatic,PyUnusedLocal
+    def get_queue_directive(self, job):
+        """
+        Returns queue directive for the specified job
+
+        :param job: job to create queue directibve for
+        :type job: Job
+        :return: queue directive
+        :rtype: str
+        """
+        # There is no queue, so directive is empty
+        return ""
+
+    SERIAL = textwrap.dedent("""\
+            #!/bin/sh
+            ###############################################################################
+            #                   %TASKTYPE% %EXPID% EXPERIMENT
+            ###############################################################################
+            #
+            #PBS -N %JOBNAME%
+            #PBS -l select=serial=true:ncpus=1
+            #PBS -l walltime=%WALLCLOCK%:00
+            #PBS -A %HPCBUDG%
+            #
+            ###############################################################################
+            """)
+
+    PARALLEL = textwrap.dedent("""\
+            #!/bin/sh
+            ###############################################################################
+            #                   %TASKTYPE% %EXPID% EXPERIMENT
+            ###############################################################################
+            #
+            #PBS -N %JOBNAME%
+            #PBS -l select=%NUMPROC%
+            #PBS -l walltime=%WALLCLOCK%:00
+            #PBS -A %HPCBUDG%
+            #
+            ###############################################################################
+            """)
+
+
+class Pbs10Header:
+    """Class to handle the Hector headers of a job"""
+
+    # noinspection PyMethodMayBeStatic,PyUnusedLocal
+    def get_queue_directive(self, job):
+        """
+        Returns queue directive for the specified job
+
+        :param job: job to create queue directive for
+        :type job: Job
+        :return: queue directive
+        :rtype: str
+        """
+        # There is no queue, so directive is empty
+        return ""
+
+    SERIAL = textwrap.dedent("""\
+            #!/bin/sh
+            ###############################################################################
+            #                   %TASKTYPE% %EXPID% EXPERIMENT
+            ###############################################################################
+            #
+            #PBS -N %JOBNAME%
+            #PBS -q serial
+            #PBS -l cput=%WALLCLOCK%:00
+            #PBS -A %HPCBUDG%
+            #
+            ###############################################################################
+            """)
+
+    PARALLEL = textwrap.dedent("""\
+            #!/bin/sh
+            ###############################################################################
+            #                   %TASKTYPE% %EXPID% EXPERIMENT
+            ###############################################################################
+            #
+            #PBS -N %JOBNAME%
+            #PBS -l mppwidth=%NUMPROC%
+            #PBS -l mppnppn=32
+            #PBS -l walltime=%WALLCLOCK%:00
+            #PBS -A %HPCBUDG%
+            #
+            ###############################################################################
+            """)
+
+
+class Pbs11Header:
+    """Class to handle the Lindgren headers of a job"""
+
+    # noinspection PyMethodMayBeStatic,PyUnusedLocal
+    def get_queue_directive(self, job):
+        """
+        Returns queue directive for the specified job
+
+        :param job: job to create queue directive for
+        :type job: Job
+        :return: queue directive
+        :rtype: str
+        """
+        # There is no queue, so directive is empty
+        return ""
+
+    SERIAL = textwrap.dedent("""\
+            #!/bin/sh
+            ###############################################################################
+            #                         %TASKTYPE% %EXPID% EXPERIMENT
+            ###############################################################################
+            #
+            #!/bin/sh --login
+            #PBS -N %JOBNAME%
+            #PBS -l mppwidth=%NUMPROC%
+            #PBS -l mppnppn=%NUMTASK%
+            #PBS -l walltime=%WALLCLOCK%
+            #PBS -e %SCRATCH_DIR%/%HPCPROJ%/%HPCUSER%/%EXPID%/LOG_%EXPID%
+            #PBS -o %SCRATCH_DIR%/%HPCPROJ%/%HPCUSER%/%EXPID%/LOG_%EXPID%
+            #
+            ###############################################################################
+            """)
+
+    PARALLEL = textwrap.dedent("""\
+            #!/bin/sh
+            ###############################################################################
+            #                         %TASKTYPE% %EXPID% EXPERIMENT
+            ###############################################################################
+            #
+            #!/bin/sh --login
+            #PBS -N %JOBNAME%
+            #PBS -l mppwidth=%NUMPROC%
+            #PBS -l mppnppn=%NUMTASK%
+            #PBS -l walltime=%WALLCLOCK%
+            #PBS -e %SCRATCH_DIR%/%HPCPROJ%/%HPCUSER%/%EXPID%/LOG_%EXPID%
+            #PBS -o %SCRATCH_DIR%/%HPCPROJ%/%HPCUSER%/%EXPID%/LOG_%EXPID%
+            #
+            ###############################################################################
+            """)
\ No newline at end of file
diff --git a/autosubmit/platforms/psplatform.py b/autosubmit/platforms/psplatform.py
new file mode 100644
index 0000000..7a9f3f6
--- /dev/null
+++ b/autosubmit/platforms/psplatform.py
@@ -0,0 +1,126 @@
+#!/usr/bin/env python
+
+# Copyright 2014 Climate Forecasting Unit, IC3
+
+# This file is part of Autosubmit.
+
+# Autosubmit is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+
+# Autosubmit is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+
+# You should have received a copy of the GNU General Public License
+# along with Autosubmit.  If not, see <http://www.gnu.org/licenses/>.
+import textwrap
+import os
+from xml.dom.minidom import parseString
+import platform
+
+from autosubmit.platforms.hpcplatform import HPCPlatform
+
+
+class PsPlatform(HPCPlatform):
+    """
+    Class to manage jobs to host not using any scheduler
+
+    :param expid: experiment's identifier
+    :type expid: str
+    """
+
+    def __init__(self, expid):
+        HPCPlatform.__init__(self)
+        self._host = platform.node()
+        self.scratch = ""
+        self.project = ""
+        self.user = ""
+        self._header = PsHeader()
+        self.expid = expid
+        self.job_status = dict()
+        self.job_status['COMPLETED'] = ['1']
+        self.job_status['RUNNING'] = ['0']
+        self.job_status['QUEUING'] = []
+        self.job_status['FAILED'] = []
+        self.update_cmds()
+
+    def update_cmds(self):
+        self.root_dir = os.path.join(self.scratch, self.project, self.user, self.expid)
+        self.remote_log_dir = os.path.join(self.root_dir, "LOG_"+self.expid)
+        self.cancel_cmd = "kill -SIGINT"
+        self._checkhost_cmd = "echo 1"
+        self.put_cmd = "scp"
+        self.get_cmd = "scp"
+        self.mkdir_cmd = "mkdir -p " + self.remote_log_dir
+
+    def get_checkhost_cmd(self):
+        return self._checkhost_cmd
+
+    def get_remote_log_dir(self):
+        return self.remote_log_dir
+
+    def get_mkdir_cmd(self):
+        return self.mkdir_cmd
+
+    def parse_job_output(self, output):
+        return output
+
+    def get_submitted_job_id(self, output):
+        return output
+
+    def jobs_in_queue(self):
+        dom = parseString('')
+        jobs_xml = dom.getElementsByTagName("JB_job_number")
+        return [int(element.firstChild.nodeValue) for element in jobs_xml]
+
+    def get_submit_cmd(self, job_script):
+        return self.get_shcall(job_script)
+
+    def get_checkjob_cmd(self, job_id):
+        return self.get_pscall(job_id)
+
+
+class PsHeader:
+    """Class to handle the Ps headers of a job"""
+
+    # noinspection PyMethodMayBeStatic,PyUnusedLocal
+    def get_queue_directive(self, job):
+        """
+        Returns queue directive for the specified job
+
+        :param job: job to create queue directive for
+        :type job: Job
+        :return: queue directive
+        :rtype: str
+        """
+        # There is no queue, so directive is empty
+        return ""
+
+    SERIAL = textwrap.dedent("""\
+            #!/bin/bash
+            ###############################################################################
+            #                   %TASKTYPE% %EXPID% EXPERIMENT
+            ###############################################################################
+            """)
+
+    PARALLEL = textwrap.dedent("""\
+            #!/bin/bash
+            ###############################################################################
+            #                   %TASKTYPE% %EXPID% EXPERIMENT
+            ###############################################################################
+            """)
+
+# def main():
+# q = PsQueue()
+#     q.check_job(1688)
+#     j = q.submit_job("/cfu/autosubmit/l002/templates/l002.sim")
+#     sleep(10)
+#     print q.check_job(j)
+#     q.cancel_job(j)
+#
+#
+# if __name__ == "__main__":
+#     main()
diff --git a/autosubmit/platforms/sgeplatform.py b/autosubmit/platforms/sgeplatform.py
new file mode 100644
index 0000000..c621f80
--- /dev/null
+++ b/autosubmit/platforms/sgeplatform.py
@@ -0,0 +1,143 @@
+#!/usr/bin/env python
+
+# Copyright 2014 Climate Forecasting Unit, IC3
+
+# This file is part of Autosubmit.
+
+# Autosubmit is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+
+# Autosubmit is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+
+# You should have received a copy of the GNU General Public License
+# along with Autosubmit.  If not, see <http://www.gnu.org/licenses/>.
+from commands import getstatusoutput
+import textwrap
+import os
+
+from xml.dom.minidom import parseString
+
+from autosubmit.platforms.hpcplatform import HPCPlatform
+
+
+class SgePlatform(HPCPlatform):
+    """
+    Class to manage jobs to host using SGE scheduler
+
+    :param expid: experiment's identifier
+    :type expid: str
+    """
+    def __init__(self, expid):
+        HPCPlatform.__init__(self)
+        self._host = ""
+        self.scratch = ""
+        self.project = ""
+        self.user = ""
+        self._header = SgeHeader()
+        self.expid = expid
+        self.job_status = dict()
+        self.job_status['COMPLETED'] = ['c']
+        self.job_status['RUNNING'] = ['r', 't', 'Rr', 'Rt']
+        self.job_status['QUEUING'] = ['qw', 'hqw', 'hRwq', 'Rs', 'Rts', 'RS', 'RtS', 'RT', 'RtT']
+        self.job_status['FAILED'] = ['Eqw', 'Ehqw', 'EhRqw', 's', 'ts', 'S', 'tS', 'T', 'tT', 'dr', 'dt', 'dRr', 'dRt',
+                                     'ds', 'dS', 'dT', 'dRs', 'dRS', 'dRT']
+        self._pathdir = "\$HOME/LOG_" + self.expid
+        self.update_cmds()
+
+    def update_cmds(self):
+        self.root_dir = os.path.join(self.scratch, self.project, self.user, self.expid)
+        self.remote_log_dir = os.path.join(self.root_dir, "LOG_"+self.expid)
+        self.cancel_cmd = "qdel"
+        self._checkhost_cmd = "echo 1"
+        self._submit_cmd = "qsub -wd " + self.remote_log_dir + " " + self.remote_log_dir + "/"
+        self.put_cmd = "scp"
+        self.get_cmd = "scp"
+        self.mkdir_cmd = "mkdir -p " + self.remote_log_dir
+
+    def get_checkhost_cmd(self):
+        return self._checkhost_cmd
+
+    def get_mkdir_cmd(self):
+        return self.mkdir_cmd
+
+    def get_remote_log_dir(self):
+        return self.remote_log_dir
+
+    def parse_job_output(self, output):
+        return output
+
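+    # SGE qsub prints e.g. 'Your job 1688 ("jobname") has been submitted';
+    # the id is the third whitespace-separated token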
+    def get_submitted_job_id(self, output):
+        return output.split(' ')[2]
+
+    def jobs_in_queue(self):
+        (status, output) = getstatusoutput('qstat -xml')
+        dom = parseString(output)
+        jobs_xml = dom.getElementsByTagName("JB_job_number")
+        return [int(element.firstChild.nodeValue) for element in jobs_xml]
+
+    def get_submit_cmd(self, job_script):
+        return self._submit_cmd + job_script
+
+    def get_checkjob_cmd(self, job_id):
+        return self.get_qstatjob(job_id)
+
+
+class SgeHeader:
+    """Class to handle the Ithaca headers of a job"""
+
+    # noinspection PyMethodMayBeStatic
+    def get_queue_directive(self, job):
+        """
+        Returns queue directive for the specified job
+
+        :param job: job to create queue directive for
+        :type job: Job
+        :return: queue directive
+        :rtype: str
+        """
+        # Return a queue directive only when a queue is configured for the job
+        if job.parameters['CURRENT_QUEUE'] == '':
+            return ""
+        else:
+            return "$ -q {0}".format(job.parameters['CURRENT_QUEUE'])
+
+    SERIAL = textwrap.dedent("""\
+            #!/bin/sh
+            ###############################################################################
+            #                   %TASKTYPE% %EXPID% EXPERIMENT
+            ###############################################################################
+            #
+            #$ -S /bin/sh
+            #$ -N %JOBNAME%
+            #$ -e %SCRATCH_DIR%/%HPCPROJ%/%HPCUSER%/%EXPID%/LOG_%EXPID%/
+            #$ -o %SCRATCH_DIR%/%HPCPROJ%/%HPCUSER%/%EXPID%/LOG_%EXPID%/
+            #$ -V
+            #$ -l h_rt=%WALLCLOCK%:00
+            #%QUEUE_DIRECTIVE%
+            #
+            ###############################################################################
+            """)
+
+    PARALLEL = textwrap.dedent("""\
+            #!/bin/sh
+            ###############################################################################
+            #                   %TASKTYPE% %EXPID% EXPERIMENT
+            ###############################################################################
+            #
+            #$ -S /bin/sh
+            #$ -N %JOBNAME%
+            #$ -e %SCRATCH_DIR%/%HPCPROJ%/%HPCUSER%/%EXPID%/LOG_%EXPID%/
+            #$ -o %SCRATCH_DIR%/%HPCPROJ%/%HPCUSER%/%EXPID%/LOG_%EXPID%/
+            #$ -V
+            #$ -l h_rt=%WALLCLOCK%:00
+            #$ -pe orte %NUMPROC%
+            #%QUEUE_DIRECTIVE%
+            #
+            ###############################################################################
+            """)
+
diff --git a/autosubmit/platforms/slurmplatform.py b/autosubmit/platforms/slurmplatform.py
new file mode 100644
index 0000000..cca2748
--- /dev/null
+++ b/autosubmit/platforms/slurmplatform.py
@@ -0,0 +1,136 @@
+#!/usr/bin/env python
+
+# Copyright 2014 Climate Forecasting Unit, IC3
+
+# This file is part of Autosubmit.
+
+# Autosubmit is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+
+# Autosubmit is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+
+# You should have received a copy of the GNU General Public License
+# along with Autosubmit.  If not, see <http://www.gnu.org/licenses/>.
+import textwrap
+import os
+
+from xml.dom.minidom import parseString
+
+from autosubmit.platforms.hpcplatform import HPCPlatform
+
+
+class SlurmPlatform(HPCPlatform):
+    """
+    Class to manage jobs to host using SLURM scheduler
+
+    :param expid: experiment's identifier
+    :type expid: str
+    """
+
+    # noinspection PyMethodMayBeStatic,PyUnusedLocal
+    def get_queue_directive(self, job):
+        """
+        Returns queue directive for the specified job
+
+        :param job: job to create queue directive for
+        :type job: Job
+        :return: queue directive
+        :rtype: str
+        """
+        # There is no queue, so directive is empty
+        return "#"
+
+    def __init__(self, expid):
+        HPCPlatform.__init__(self)
+        self._host = ""
+        self.scratch = ""
+        self.project = ""
+        self.user = ""
+        self._header = SlurmHeader()
+        self.expid = expid
+        self.job_status = dict()
+
+        self.job_status['COMPLETED'] = ['COMPLETED']
+        self.job_status['RUNNING'] = ['RUNNING']
+        self.job_status['QUEUING'] = ['PENDING', 'CONFIGURING', 'RESIZING']
+        self.job_status['FAILED'] = ['FAILED', 'CANCELLED', 'NODE_FAIL', 'PREEMPTED', 'SUSPENDED', 'TIMEOUT']
+        self._pathdir = "\$HOME/LOG_" + self.expid
+        self.update_cmds()
+
+    def update_cmds(self):
+        self.root_dir = os.path.join(self.scratch, self.project, self.user, self.expid)
+        self.remote_log_dir = os.path.join(self.root_dir, "LOG_"+self.expid)
+        self.cancel_cmd = "scancel"
+        self._checkhost_cmd = "echo 1"
+        self._submit_cmd = 'sbatch -D {0} {0}/'.format(self.remote_log_dir)
+        self.put_cmd = "scp"
+        self.get_cmd = "scp"
+        self.mkdir_cmd = "mkdir -p " + self.remote_log_dir
+
+    def get_checkhost_cmd(self):
+        return self._checkhost_cmd
+
+    def get_mkdir_cmd(self):
+        return self.mkdir_cmd
+
+    def get_remote_log_dir(self):
+        return self.remote_log_dir
+
+    def parse_job_output(self, output):
+        return output.strip().split(' ')[0].strip()
+
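+    # sbatch prints 'Submitted batch job <id>'; the id is the fourth token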
+    def get_submitted_job_id(self, output):
+        return output.split(' ')[3]
+
+    def jobs_in_queue(self):
+        # Not implemented for SLURM (a full version would query squeue);
+        # report an empty queue, as parseString('') would raise an ExpatError
+        return []
+
+    def get_submit_cmd(self, job_script):
+        return self._submit_cmd + job_script
+
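+    # sacct -n prints the state column without headers (e.g. 'COMPLETED');
+    # parse_job_output() keeps only its first word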
+    def get_checkjob_cmd(self, job_id):
+        return 'sacct -n -j {0} -o "State"'.format(job_id)
+
+
+class SlurmHeader:
+    """Class to handle the SLURM headers of a job"""
+
+    # noinspection PyMethodMayBeStatic,PyUnusedLocal
+    def get_queue_directive(self, job):
+        """
+        Returns queue directive for the specified job
+
+        :param job: job to create queue directive for
+        :type job: Job
+        :return: queue directive
+        :rtype: str
+        """
+        # There is no queue, so directive is empty
+        return ""
+
+    SERIAL = textwrap.dedent("""\
+            #!/bin/bash
+            ###############################################################################
+            #                   %TASKTYPE% %EXPID% EXPERIMENT
+            ###############################################################################
+            #
+            #SBATCH -n %NUMPROC%
+            #SBATCH -t %WALLCLOCK%:00
+            #SBATCH -J %JOBNAME%
+            #SBATCH -o %SCRATCH_DIR%/%HPCPROJ%/%HPCUSER%/%EXPID%/LOG_%EXPID%/%JOBNAME%-%j.out
+            #SBATCH -e %SCRATCH_DIR%/%HPCPROJ%/%HPCUSER%/%EXPID%/LOG_%EXPID%/%JOBNAME%-%j.err
+            #
+            ###############################################################################
+           """)
+
+    PARALLEL = textwrap.dedent("""\
+
+            """)
+
diff --git a/bin/autosubmit b/bin/autosubmit
new file mode 100755
index 0000000..df99169
--- /dev/null
+++ b/bin/autosubmit
@@ -0,0 +1,39 @@
+#!/usr/bin/env python
+
+# Copyright 2014 Climate Forecasting Unit, IC3
+
+# This file is part of Autosubmit.
+
+# Autosubmit is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+
+# Autosubmit is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+
+# You should have received a copy of the GNU General Public License
+# along with Autosubmit.  If not, see <http://www.gnu.org/licenses/>.
+
+"""Script for handling experiment monitoring"""
+import os
+import sys
+
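+# Replace the script's own directory at the head of sys.path with the package
+# root, so the autosubmit package is importable from a source checkout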
+scriptdir = os.path.abspath(os.path.dirname(sys.argv[0]))
+assert sys.path[0] == scriptdir
+sys.path[0] = os.path.normpath(os.path.join(scriptdir, os.pardir))
+
+# noinspection PyUnresolvedReferences
+from autosubmit.autosubmit import Autosubmit
+
+
+def main():
+    if not Autosubmit.parse_args():
+        exit(1)
+
+    exit(0)
+
+if __name__ == "__main__":
+    main()
diff --git a/docs/autosubmit.pdf b/docs/autosubmit.pdf
new file mode 100644
index 0000000..8d61b2b
Binary files /dev/null and b/docs/autosubmit.pdf differ
diff --git a/setup.cfg b/setup.cfg
new file mode 100644
index 0000000..861a9f5
--- /dev/null
+++ b/setup.cfg
@@ -0,0 +1,5 @@
+[egg_info]
+tag_build = 
+tag_date = 0
+tag_svn_revision = 0
+
diff --git a/setup.py b/setup.py
new file mode 100644
index 0000000..079f6b2
--- /dev/null
+++ b/setup.py
@@ -0,0 +1,57 @@
+#!/usr/bin/env python
+
+# Copyright 2014 Climate Forecasting Unit, IC3
+
+# This file is part of Autosubmit.
+
+# Autosubmit is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+
+# Autosubmit is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+
+# You should have received a copy of the GNU General Public License
+# along with Autosubmit.  If not, see <http://www.gnu.org/licenses/>.
+
+from os import path
+from setuptools import setup
+from setuptools import find_packages
+
+here = path.abspath(path.dirname(__file__))
+
+# Get the version number from the relevant file
+with open(path.join(here, 'VERSION')) as f:
+    version = f.read().strip()
+
+setup(
+    name='autosubmit',
+    license='GNU GPL v3',
+    platforms=['GNU/Linux Debian'],
+    version=version,
+    description='Autosubmit: a versatile tool to manage Weather and Climate Experiments in diverse '
+                'Supercomputing Environments',
+    author='Domingo Manubens-Gil',
+    author_email='domingo.manubens at ic3.cat',
+    url='https://autosubmit.ic3.cat',
+    download_url='http://ic3.cat/wikicfu/index.php/Tools/Autosubmit',
+    keywords=['climate', 'weather', 'workflow', 'HPC'],
+    install_requires=['argparse>=1.2,<2', 'python-dateutil>=1,<2', 'pydotplus', 'pyparsing', 'paramiko'],
+    # 'numpy','matplotlib>=1.1.1',
+    packages=find_packages(),
+    include_package_data=True,
+    package_data={'autosubmit': [
+        'autosubmit/config/files/autosubmit.conf',
+        'autosubmit/config/files/expdef.conf',
+        'autosubmit/database/data/autosubmit.sql',
+    ]
+    },
+    scripts=['bin/autosubmit'],
+    # data_files = [
+    #     ('', ['VERSION']),
+    #     ('conf', ['lib/autosubmit/config/files/autosubmit.conf', 'lib/autosubmit/config/files/expdef.conf']),
+    #     ('data', ['lib/autosubmit/database/data/autosubmit.sql'])
+    # ]
+    # entry_points = {
+    #     'console_scripts': ['check_exp = bin/check_exp.py'],
+    #     'gui_scripts': ['monitor = monitor.py']
+    # }
+)

-- 
Alioth's /usr/local/bin/git-commit-notice on /srv/git.debian.org/git/debian-science/packages/ic3-autosubmit.git


