[Git][debian-gis-team/netcdf4-python][master] 4 commits: New upstream version 1.4.2

Bas Couwenberg gitlab at salsa.debian.org
Thu Oct 25 06:29:12 BST 2018


Bas Couwenberg pushed to branch master at Debian GIS Project / netcdf4-python


Commits:
3cf86e78 by Bas Couwenberg at 2018-10-25T05:10:06Z
New upstream version 1.4.2
- - - - -
abc6a03f by Bas Couwenberg at 2018-10-25T05:10:13Z
Merge tag 'upstream/1.4.2'

Upstream version 1.4.2

- - - - -
b40041e0 by Bas Couwenberg at 2018-10-25T05:10:34Z
New upstream release.

- - - - -
1ec09955 by Bas Couwenberg at 2018-10-25T05:11:24Z
Set distribution to unstable.

- - - - -


20 changed files:

- + .appveyor.yml
- .travis.yml
- Changelog
- MANIFEST.in
- PKG-INFO
- README.md
- − appveyor.yml
- − conda.recipe/bld.bat
- − conda.recipe/meta.yaml
- − conda.recipe/run_test.py
- debian/changelog
- docs/netCDF4/index.html
- include/netCDF4.pxi
- netCDF4/_netCDF4.pyx
- setup.py
- test/tst_atts.py
- test/tst_dims.py
- test/tst_multifile.py
- test/tst_netcdftime.py
- test/tst_types.py


Changes:

=====================================
.appveyor.yml
=====================================
@@ -0,0 +1,55 @@
+environment:
+  CONDA_INSTALL_LOCN: C:\\Miniconda36-x64
+  matrix:
+    - TARGET_ARCH: x64
+      NPY: 1.15
+      PY: 3.6
+
+    - TARGET_ARCH: x64
+      NPY: 1.15
+      PY: 3.7
+
+platform:
+  - x64
+
+install:
+  # If there is a newer build queued for the same PR, cancel this one.
+  # The AppVeyor 'rollout builds' option is supposed to serve the same
+  # purpose but it is problematic because it tends to cancel builds pushed
+  # directly to master instead of just PR builds (or the converse).
+  # credits: JuliaLang developers.
+  - ps: if ($env:APPVEYOR_PULL_REQUEST_NUMBER -and $env:APPVEYOR_BUILD_NUMBER -ne ((Invoke-RestMethod `
+        https://ci.appveyor.com/api/projects/$env:APPVEYOR_ACCOUNT_NAME/$env:APPVEYOR_PROJECT_SLUG/history?recordsNumber=50).builds | `
+        Where-Object pullRequestId -eq $env:APPVEYOR_PULL_REQUEST_NUMBER)[0].buildNumber) { `
+        throw "There are newer queued builds for this pull request, failing early." }
+
+  # Add path, activate `conda` and update conda.
+  - cmd: call %CONDA_INSTALL_LOCN%\Scripts\activate.bat
+  - cmd: conda.exe config --set always_yes yes --set changeps1 no --set show_channel_urls true
+  - cmd: conda.exe update conda
+  - cmd: conda.exe config --remove channels defaults --force
+  - cmd: conda.exe config --add channels conda-forge --force
+  - cmd: set PYTHONUNBUFFERED=1
+  - cmd: conda.exe install conda-build vs2008_express_vc_python_patch
+  - cmd: call setup_x64
+
+  - cmd: conda.exe create --name TEST python=%PY% numpy=%NPY% cython pip pytest hdf5 libnetcdf cftime
+  - cmd: conda activate TEST
+
+  - cmd: conda.exe info --all
+  - cmd: conda.exe list
+
+  - cmd: echo [options] > setup.cfg
+  - cmd: echo [directories] >> setup.cfg
+  - cmd: echo HDF5_libdir = %CONDA_PREFIX%\Library\lib >> setup.cfg
+  - cmd: echo HDF5_incdir = %CONDA_PREFIX%\Library\include >> setup.cfg
+  - cmd: echo netCDF4_libdir = %CONDA_PREFIX%\Library\lib >> setup.cfg
+  - cmd: echo netCDF4_incdir = %CONDA_PREFIX%\Library\include >> setup.cfg
+
+# Skip .NET project specific build phase.
+build: off
+
+test_script:
+  - python -m pip install . --no-deps --ignore-installed --no-cache-dir -vvv
+  - set NO_NET=1
+  - cd test && python run_all.py
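
For reference, the echo commands above assemble a setup.cfg that points the
build at the conda-provided HDF5 and netCDF libraries. A minimal Python sketch
of the equivalent file generation (assuming CONDA_PREFIX is set, as it is
after `conda activate TEST`):

    import os

    # On Windows, conda keeps import libraries and headers under
    # %CONDA_PREFIX%\Library.
    prefix = os.environ["CONDA_PREFIX"]
    lib = os.path.join(prefix, "Library", "lib")
    inc = os.path.join(prefix, "Library", "include")

    with open("setup.cfg", "w") as cfg:
        cfg.write("[options]\n")
        cfg.write("[directories]\n")
        cfg.write("HDF5_libdir = %s\n" % lib)
        cfg.write("HDF5_incdir = %s\n" % inc)
        cfg.write("netCDF4_libdir = %s\n" % lib)
        cfg.write("netCDF4_incdir = %s\n" % inc)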


=====================================
.travis.yml
=====================================
@@ -1,6 +1,6 @@
 language: python
-
-sudo: false
+dist: xenial
+sudo: true
 
 addons:
   apt:
@@ -17,12 +17,13 @@ env:
 
 python:
   - "2.7"
-  - "3.5"
   - "3.6"
+  - "3.7"
+  - "3.8-dev"
 
 matrix:
   allow_failures:
-    - python: "3.7-dev"
+    - python: "3.8-dev"
   include:
     # Absolute minimum dependencies.
     - python: 2.7
@@ -30,11 +31,12 @@ matrix:
         - DEPENDS="numpy==1.9.0 cython==0.21 ordereddict==1.1 setuptools==18.0 cftime"
     # test MPI
     - python: 2.7
+      dist: trusty
       env: 
         - MPI=1
         - CC=mpicc
         - DEPENDS="numpy>=1.9.0 cython>=0.21 setuptools>=18.0 mpi4py>=1.3.1 cftime"
-        - NETCDF_VERSION=4.4.1.1
+        - NETCDF_VERSION=4.6.1
         - NETCDF_DIR=$HOME
         - PATH=${NETCDF_DIR}/bin:${PATH} # pick up nc-config here
       addons:


=====================================
Changelog
=====================================
@@ -1,3 +1,18 @@
+ version 1.4.2 (tag v1.4.2rel)
+=============================
+ * add get_dims Variable method (issue #824)
+ * make sure format keyword not ignored when mode is 'ws' (issue #827)
+ * fix numpy FutureWarning (non-tuple sequence for     
+   multidimensional indexing is deprecated), issue #833.
+ * add 'master_file' kwarg to MFDataset.__init__ (issue #835).
+ * always use nc_get_vars for strided access over OpenDAP (issue #838).
+ * raise FutureWarning when trying to set multi-dimensional array attribute
+   while still silently flattening the array (issue #841). Will change
+   to ValueError in next release (1.4.3).
+ * fix parallel writes when both nc4 parallel and pnetcdf parallel options
+   enabled in the netcdf-c library (issue #820).
+ * fix for writing masked scalar character variable (issue #850).
+
  version 1.4.1 (tag v1.4.1rel)
 =============================
  * disable workaround for slow nc_get_vars for __netcdflibversion__ >= 4.6.2,


=====================================
MANIFEST.in
=====================================
@@ -1,6 +1,5 @@
 recursive-include docs *
 recursive-include man *
-recursive-include conda.recipe *
 include MANIFEST.in
 include README.md
 include COPYING


=====================================
PKG-INFO
=====================================
@@ -1,6 +1,6 @@
 Metadata-Version: 1.1
 Name: netCDF4
-Version: 1.4.1
+Version: 1.4.2
 Author: Jeff Whitaker
 Author-email: jeffrey s whitaker at noaa gov
 Home-page: https://github.com/Unidata/netcdf4-python


=====================================
README.md
=====================================
@@ -8,6 +8,8 @@
 ## News
 For details on the latest updates, see the [Changelog](https://github.com/Unidata/netcdf4-python/blob/master/Changelog).
 
+10/26/2018: Version [1.4.2](https://pypi.python.org/pypi/netCDF4/1.4.2) released. Minor bugfixes, added `Variable.get_dims()` method and `master_file` kwarg for `MFDataset.__init__`.
+
 08/10/2018: Version [1.4.1](https://pypi.python.org/pypi/netCDF4/1.4.1) released. The old slicing behavior
 (numpy array returned unless missing values are present, otherwise masked array returned) is re-enabled
 via `set_always_mask(False)`.


=====================================
appveyor.yml deleted
=====================================
@@ -1,59 +0,0 @@
-environment:
-  matrix:
-    - TARGET_ARCH: x64
-      CONDA_NPY: 111
-      CONDA_PY: 27
-      CONDA_INSTALL_LOCN: C:\\Miniconda-x64
-
-    - TARGET_ARCH: x64
-      CONDA_NPY: 114
-      CONDA_PY: 27
-      CONDA_INSTALL_LOCN: C:\\Miniconda-x64
-
-    - TARGET_ARCH: x64
-      CONDA_NPY: 111
-      CONDA_PY: 36
-      CONDA_INSTALL_LOCN: C:\\Miniconda36-x64
-
-    - TARGET_ARCH: x64
-      CONDA_NPY: 114
-      CONDA_PY: 36
-      CONDA_INSTALL_LOCN: C:\\Miniconda36-x64
-
-platform:
-    - x64
-
-install:
-    # If there is a newer build queued for the same PR, cancel this one.
-    # The AppVeyor 'rollout builds' option is supposed to serve the same
-    # purpose but it is problematic because it tends to cancel builds pushed
-    # directly to master instead of just PR builds (or the converse).
-    # credits: JuliaLang developers.
-    - ps: if ($env:APPVEYOR_PULL_REQUEST_NUMBER -and $env:APPVEYOR_BUILD_NUMBER -ne ((Invoke-RestMethod `
-         https://ci.appveyor.com/api/projects/$env:APPVEYOR_ACCOUNT_NAME/$env:APPVEYOR_PROJECT_SLUG/history?recordsNumber=50).builds | `
-         Where-Object pullRequestId -eq $env:APPVEYOR_PULL_REQUEST_NUMBER)[0].buildNumber) { `
-           throw "There are newer queued builds for this pull request, failing early." }
-
-    # Add path, activate `conda` and update conda.
-    - cmd: call %CONDA_INSTALL_LOCN%\Scripts\activate.bat
-    - cmd: conda config --set always_yes yes --set changeps1 no --set show_channel_urls true --set auto_update_conda false
-    - cmd: conda update conda
-    # We need to pin conda until https://github.com/conda/conda/issues/6556 is fixed.
-    - cmd: conda config --system --add pinned_packages defaults::conda
-    - cmd: conda config --add channels conda-forge --force
-
-    # Install conda-build.
-    - cmd: conda install conda-build
-
-    - cmd: set PYTHONUNBUFFERED=1
-  
-    - cmd: conda install vs2008_express_vc_python_patch
-    - cmd: call setup_x64.bat
-  
-    - cmd: conda info --all
-
-# Skip .NET project specific build phase.
-build: off
-
-test_script:
-    - conda build conda.recipe


=====================================
conda.recipe/bld.bat deleted
=====================================
@@ -1,11 +0,0 @@
-set SITECFG=%SRC_DIR%/setup.cfg
-
-echo [options] > %SITECFG%
-echo [directories] >> %SITECFG%
-echo HDF5_libdir = %LIBRARY_LIB% >> %SITECFG%
-echo HDF5_incdir = %LIBRARY_INC% >> %SITECFG%
-echo netCDF4_libdir = %LIBRARY_LIB% >> %SITECFG%
-echo netCDF4_incdir = %LIBRARY_INC% >> %SITECFG%
-
-"%PYTHON%" setup.py install --single-version-externally-managed  --record record.txt
-if errorlevel 1 exit 1
\ No newline at end of file


=====================================
conda.recipe/meta.yaml deleted
=====================================
@@ -1,50 +0,0 @@
-{% set version = "dev" %}
-
-package:
-  name: netcdf4
-  version: {{ version }}
-
-source:
-  path: ../
-
-build:
-  number: 0
-  entry_points:
-    - ncinfo = netCDF4.utils:ncinfo
-    - nc4tonc3 = netCDF4.utils:nc4tonc3
-    - nc3tonc4 = netCDF4.utils:nc3tonc4
-
-requirements:
-  build:
-    - python
-    - setuptools
-    - cython
-    - numpy x.x
-    - msinttypes  # [win and py<35]
-    - hdf5 
-    - libnetcdf 
-    - cftime
-  run:
-    - python
-    - setuptools
-    - numpy x.x
-    - hdf5 
-    - libnetcdf 
-    - cython
-    - cftime
-
-test:
-  source_files:
-    - test
-  imports:
-    - netCDF4
-    - cftime
-  commands:
-    - ncinfo -h
-    - nc4tonc3 -h
-    - nc3tonc4 -h
-
-about:
-  home: http://github.com/Unidata/netcdf4-python
-  license: OSI Approved
-  summary: 'Provides an object-oriented python interface to the netCDF version 4 library..'


=====================================
conda.recipe/run_test.py deleted
=====================================
@@ -1,8 +0,0 @@
-import os
-import netCDF4
-
-# Run the unittests, skipping the opendap test.
-test_dir = os.path.join('test')
-os.chdir(test_dir)
-os.environ['NO_NET']='1'
-os.system('python run_all.py')


=====================================
debian/changelog
=====================================
@@ -1,9 +1,10 @@
-netcdf4-python (1.4.1-2) UNRELEASED; urgency=medium
+netcdf4-python (1.4.2-1) unstable; urgency=medium
 
+  * New upstream release.
   * Update watch file to limit matches to archive path.
   * Bump Standards-Version to 4.2.1, no changes.
 
- -- Bas Couwenberg <sebastic at debian.org>  Wed, 15 Aug 2018 18:54:54 +0200
+ -- Bas Couwenberg <sebastic at debian.org>  Thu, 25 Oct 2018 07:11:13 +0200
 
 netcdf4-python (1.4.1-1) unstable; urgency=medium
 


=====================================
docs/netCDF4/index.html
=====================================
The diff for this file was not included because it is too large.

=====================================
include/netCDF4.pxi
=====================================
@@ -55,8 +55,6 @@ cdef extern from "netcdf.h":
         NC_CLASSIC_MODEL # Enforce strict netcdf-3 rules. 
         # Use these 'mode' flags for both nc_create and nc_open.
         NC_SHARE # Share updates, limit cacheing 
-        NC_MPIIO
-        NC_MPIPOSIX
         # The following flag currently is ignored, but use in
         # nc_open() or nc_create() may someday support use of advisory
         # locking to prevent multiple writers from clobbering a file 
@@ -710,6 +708,7 @@ IF HAS_NC_PAR:
     cdef extern from "netcdf.h":
         cdef enum:
             NC_MPIIO
+            NC_MPIPOSIX
             NC_PNETCDF
 
 # taken from numpy.pxi in numpy 1.0rc2.


=====================================
netCDF4/_netCDF4.pyx
=====================================
@@ -1,5 +1,5 @@
 """
-Version 1.4.1
+Version 1.4.2
 -------------
 - - - 
 
@@ -31,7 +31,7 @@ Download
  - Latest bleeding-edge code from the 
    [github repository](http://github.com/Unidata/netcdf4-python).
  - Latest [releases](https://pypi.python.org/pypi/netCDF4)
-   (source code and windows installers).
+   (source code and binary installers).
 
 Requires
 ========
@@ -1104,7 +1104,7 @@ except ImportError:
     # python3: zip is already python2's itertools.izip
     pass
 
-__version__ = "1.4.1"
+__version__ = "1.4.2"
 
 # Initialize numpy
 import posixpath
@@ -1421,6 +1421,15 @@ cdef _set_att(grp, int varid, name, value,\
     attname = bytestr
     # put attribute value into a numpy array.
     value_arr = numpy.array(value)
+    if value_arr.ndim > 1: # issue #841
+        if __version__ > "1.4.2":
+            raise ValueError('multi-dimensional array attributes not supported')
+        else:
+            msg = """
+Multi-dimensional array attributes are now deprecated.
+Instead of silently flattening the array, an error will
+be raised in the next release."""
+            warnings.warn(msg,FutureWarning)
     # if array is 64 bit integers or
     # if 64-bit datatype not supported, cast to 32 bit integers.
     fmt = _get_format(grp._grpid)
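
In 1.4.2 a multi-dimensional attribute value is still silently flattened, but
the branch above now raises a FutureWarning (to become a ValueError once
__version__ > "1.4.2", per issue #841). A minimal sketch of the 1.4.2
behavior, using a hypothetical scratch file:

    import warnings
    import netCDF4

    nc = netCDF4.Dataset("attr_demo.nc", "w")  # hypothetical file name
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        nc.twodim = [[1, 2], [3, 4]]   # 2-D value: flattened with a warning
    print(caught[0].category)          # <class 'FutureWarning'>
    print(nc.twodim)                   # flattened to [1 2 3 4]
    nc.close()
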
@@ -2038,7 +2047,9 @@ references to the parent Dataset or Group.
             # for issue 170 (nc_open'ing a DAP dataset after switching
             # format to NETCDF4). This bug should be fixed in version
             # 4.3.0 of the netcdf library (add a version check here?).
-            _set_default_format(format='NETCDF3_64BIT_OFFSET')
+            # **this causes parallel mode to fail when both hdf5-parallel and
+            # pnetcdf are enabled - issue #820 **
+            #_set_default_format(format='NETCDF3_64BIT_OFFSET')
         elif mode == 'r':
             if memory is not None:
                 IF HAS_NC_OPEN_MEM:
@@ -2087,6 +2098,7 @@ references to the parent Dataset or Group.
             else:
                 ierr = nc_open(path, NC_SHARE, &grpid)
         elif mode == 'ws':
+            _set_default_format(format=format)
             if clobber:
                 if parallel:
                     # NC_SHARE ignored
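
With the _set_default_format(format=format) call added above, the format
keyword is honored when creating a file in shared-write mode ('ws') instead of
being silently ignored (issue #827). A minimal sketch, using a hypothetical
file name:

    from netCDF4 import Dataset

    nc = Dataset("shared_demo.nc", mode="ws", format="NETCDF3_CLASSIC")
    print(nc.data_model)  # NETCDF3_CLASSIC, as requested
    nc.close()
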
@@ -3648,7 +3660,9 @@ behavior is similar to Fortran or Matlab, but different than numpy.
         # avoid calling nc_get_vars for strided slices by default.
         # a fix for strided slice access using HDF5 was added
         # in 4.6.2.
-        if __netcdf4libversion__ >= "4.6.2":
+        # always use nc_get_vars for strided access with OpenDAP (issue #838).
+        if __netcdf4libversion__ >= "4.6.2" or\
+           self._grp.filepath().startswith('http'):
             self._no_get_vars = False
         else:
             self._no_get_vars = True
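
The extra condition above forces nc_get_vars for any dataset whose filepath()
starts with 'http', so strided reads from OpenDAP servers are handled
server-side rather than emulated client-side (issue #838). A hedged sketch;
the URL and variable name below are placeholders, not a real endpoint:

    from netCDF4 import Dataset

    ds = Dataset("http://example.com/thredds/dodsC/demo")  # placeholder URL
    # A strided slice now maps onto a single nc_get_vars call.
    subset = ds.variables["tas"][::10, ::4, ::4]           # hypothetical name
    ds.close()
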
@@ -4613,7 +4627,13 @@ cannot be safely cast to variable data type""" % attname
                         fillval = self._FillValue
                     else:
                         fillval = default_fillvals[self.dtype.str[1:]]
-                    data = data.filled(fill_value=fillval)
+                    # some versions of numpy have trouble handling
+                    # MaskedConstants when filling - this is
+                    # a workaround (issue #850)
+                    if data.shape == (1,) and data.mask.all():
+                        data = numpy.array([fillval],self.dtype)
+                    else:
+                        data = data.filled(fill_value=fillval)
 
         # Fill output array with data chunks.
         for (a,b,c,i) in zip(start, count, stride, put_ind):
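
The branch above (issue #850) avoids calling filled() on a size-1 fully masked
array, which some numpy versions mishandle. The user-visible case it fixes,
also exercised in tst_types.py below, is writing np.ma.masked to a scalar
character variable; a minimal sketch with a hypothetical file name:

    import numpy as np
    from netCDF4 import Dataset

    nc = Dataset("masked_scalar.nc", "w")  # hypothetical scratch file
    a = nc.createVariable("a", "c", ())    # scalar character variable
    a[:] = np.ma.masked                    # failed before this fix
    nc.close()
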
@@ -4871,7 +4891,7 @@ numpy arrays are not performed).
             raise IndexError('size of data array does not conform to slice')
         if negstride:
             # reverse data along axes with negative strides.
-            data = data[sl].copy() # make sure a copy is made.
+            data = data[tuple(sl)].copy() # make sure a copy is made.
         if self._isprimitive or self._iscompound or self._isenum:
             # primitive, enum or compound data type.
             # if data type of array doesn't match variable,
@@ -5090,7 +5110,7 @@ numpy arrays are not performed).
         free(stridep)
         if negstride:
             # reverse data along axes with negative strides.
-            data = data[sl].copy() # make a copy so data is contiguous.
+            data = data[tuple(sl)].copy() # make a copy so data is contiguous.
         # netcdf-c always returns data in native byte order,
         # regardless of variable endian-ness. Here we swap the 
         # bytes if the variable dtype is not native endian, so the
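
The two tuple(sl) changes above address numpy's deprecation of
multi-dimensional indexing with a non-tuple sequence (issue #833), which emits
a FutureWarning on numpy >= 1.15. A standalone illustration of the underlying
numpy behavior:

    import numpy as np

    data = np.arange(12).reshape(3, 4)
    sl = [slice(None, None, -1), slice(None)]  # a list of slices
    # data[sl] warns (and will eventually be an error); a tuple is the
    # supported form for multi-dimensional indexing.
    reversed_rows = data[tuple(sl)].copy()
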
@@ -5124,6 +5144,15 @@ open for parallel access.
         ELSE:
             pass # does nothing
 
+    def get_dims(self):
+        """
+**`get_dims(self)`**
+
+return a tuple of `netCDF4.Dimension` instances associated with this
+`netCDF4.Variable`.
+        """
+        return tuple(_find_dim(self._grp, dim) for dim in self.dimensions)
+
     def __reduce__(self):
         # raise error is user tries to pickle a Variable object.
         raise NotImplementedError('Variable is not picklable')
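
The new get_dims() method resolves each dimension name against the variable's
group and its parents (via _find_dim) and returns the matching
netCDF4.Dimension objects (issue #824). A minimal usage sketch with
hypothetical file and variable names:

    from netCDF4 import Dataset

    nc = Dataset("dims_demo.nc", "w")  # hypothetical scratch file
    nc.createDimension("time", None)
    nc.createDimension("lat", 73)
    v = nc.createVariable("t", "f4", ("time", "lat"))
    print([d.name for d in v.get_dims()])  # ['time', 'lat']
    nc.close()
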
@@ -5793,14 +5822,14 @@ Adapted from [pycdf](http://pysclint.sourceforge.net/pycdf) by Andre Gosselin.
 Example usage (See `netCDF4.MFDataset.__init__` for more details):
 
     :::python
-    >>> import numpy
+    >>> import numpy as np
     >>> # create a series of netCDF files with a variable sharing
     >>> # the same unlimited dimension.
     >>> for nf in range(10):
-    >>>     f = Dataset("mftest%s.nc" % nf,"w")
+    >>>     f = Dataset("mftest%s.nc" % nf,"w",format='NETCDF4_CLASSIC')
     >>>     f.createDimension("x",None)
     >>>     x = f.createVariable("x","i",("x",))
-    >>>     x[0:10] = numpy.arange(nf*10,10*(nf+1))
+    >>>     x[0:10] = np.arange(nf*10,10*(nf+1))
     >>>     f.close()
     >>> # now read all those files in at once, in one Dataset.
     >>> f = MFDataset("mftest*nc")
@@ -5811,9 +5840,11 @@ Example usage (See `netCDF4.MFDataset.__init__` for more details):
      75 76 77 78 79 80 81 82 83 84 85 86 87 88 89 90 91 92 93 94 95 96 97 98 99]
     """
 
-    def __init__(self, files, check=False, aggdim=None, exclude=[]):
+    def __init__(self, files, check=False, aggdim=None, exclude=[],
+            master_file=None):
         """
-        **`__init__(self, files, check=False, aggdim=None, exclude=[])`**
+        **`__init__(self, files, check=False, aggdim=None, exclude=[],
+        master_file=None)`**
 
         Open a Dataset spanning multiple files, making it look as if it was a
         single file. Variables in the list of files that share the same
@@ -5823,7 +5854,8 @@ Example usage (See `netCDF4.MFDataset.__init__` for more details):
         of the variables to be aggregated.
         
         **`files`**: either a sequence of netCDF files or a string with a
-        wildcard (converted to a sorted list of files using glob)  The first file
+        wildcard (converted to a sorted list of files using glob)  If
+        the `master_file` kwarg is not specified, the first file
         in the list will become the "master" file, defining all the
         variables with an aggregation dimension which may span
         subsequent files. Attribute access returns attributes only from "master"
@@ -5840,6 +5872,9 @@ Example usage (See `netCDF4.MFDataset.__init__` for more details):
         
         **`exclude`**: A list of variable names to exclude from aggregation.
         Default is an empty list.
+
+        **`master_file`**: file to use as "master file", defining all the
+        variables with an aggregation dimension and all global attributes.
        """
 
         # Open the master file in the base class, so that the CDFMF instance
@@ -5851,7 +5886,13 @@ Example usage (See `netCDF4.MFDataset.__init__` for more details):
             else:
                 files = sorted(glob(files))
 
-        master = files[0]
+        if master_file is not None:
+            if master_file not in files:
+                raise ValueError('master_file not in files list')
+            else:
+                master = master_file
+        else:
+            master = files[0]
 
         # Open the master again, this time as a classic CDF instance. This will avoid
         # calling methods of the CDFMF subclass when querying the master file.
@@ -5897,18 +5938,23 @@ Example usage (See `netCDF4.MFDataset.__init__` for more details):
         #   cdfRecVar dictionary indexed by the aggregation var names; each key holds
         #             a list of the corresponding Variable instance, one for each
         #             cdf file of the file set
-        cdf = [cdfm]
+        cdf = []
         self._cdf = cdf        # Store this now, because dim() method needs it
-        cdfVLen = [len(aggDimId)]
+        cdfVLen = []
         cdfRecVar = {}
-        for v in masterRecVar.keys():
-            cdfRecVar[v] = [cdfm.variables[v]]
 
         # Open each remaining file in read-only mode.
         # Make sure each file defines the same aggregation variables as the master
         # and that the variables are defined in the same way (name, shape and type)
-        for f in files[1:]:
-            part = Dataset(f)
+        for f in files:
+            if f == master:
+                part = cdfm
+            else:
+                part = Dataset(f)
+            if cdfRecVar == {}:
+                empty_cdfRecVar = True
+            else:
+                empty_cdfRecVar = False
             varInfo = part.variables
             for v in masterRecVar.keys():
                 if check:
@@ -5947,12 +5993,16 @@ Example usage (See `netCDF4.MFDataset.__init__` for more details):
                                        (v, master, masterType, f, extType))
 
                     # Everything ok.
-                    vInst = part.variables[v]
-                    cdfRecVar[v].append(vInst)
+                    if empty_cdfRecVar:
+                        cdfRecVar[v] = [part.variables[v]]
+                    else:
+                        cdfRecVar[v].append(part.variables[v])
                 else:
                     # No making sure of anything -- assume this is ok..
-                    vInst = part.variables[v]
-                    cdfRecVar[v].append(vInst)
+                    if empty_cdfRecVar:
+                        cdfRecVar[v] = [part.variables[v]]
+                    else:
+                        cdfRecVar[v].append(part.variables[v])
 
             cdf.append(part)
             cdfVLen.append(len(part.dimensions[aggDimName]))

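Taken together, the MFDataset changes above let the new master_file keyword
select which member file defines the aggregation variables and global
attributes, instead of always using the first; a file not in the list raises
ValueError (issue #835). A minimal sketch, reusing the mftest files created in
the docstring example:

    from netCDF4 import MFDataset

    # Use the last file as master rather than the first.
    f = MFDataset("mftest*nc", master_file="mftest9.nc", check=True)
    print(f.variables["x"][:])  # 0..99, aggregated across all ten files
    f.close()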

=====================================
setup.py
=====================================
@@ -553,7 +553,7 @@ else:
 
 setup(name="netCDF4",
       cmdclass=cmdclass,
-      version="1.4.1",
+      version="1.4.2",
       long_description="netCDF version 4 has many features not found in earlier versions of the library, such as hierarchical groups, zlib compression, multiple unlimited dimensions, and new data types.  It is implemented on top of HDF5.  This module implements most of the new features, and can read and write netCDF files compatible with older versions of the library.  The API is modelled after Scientific.IO.NetCDF, and should be familiar to users of that module.\n\nThis project is hosted on a `GitHub repository <https://github.com/Unidata/netcdf4-python>`_ where you may access the most up-to-date source.",
       author="Jeff Whitaker",
       author_email="jeffrey.s.whitaker at noaa.gov",


=====================================
test/tst_atts.py
=====================================
@@ -69,6 +69,9 @@ class VariablesTestCase(unittest.TestCase):
         g.floatatt = FLOATATT
         g.seqatt = SEQATT
         g.stringseqatt = STRINGSEQATT
+        if netCDF4.__version__ > "1.4.2":
+            with self.assertRaises(ValueError):
+                g.arrayatt = [[1, 2], [3, 4]] # issue #841
         g.setncattr_string('stringseqatt_array',STRINGSEQATT) # array of NC_STRING
         v = f.createVariable(VAR_NAME, 'f8',(DIM1_NAME,DIM2_NAME,DIM3_NAME))
         # try to set a variable attribute with one of the reserved names.


=====================================
test/tst_dims.py
=====================================
@@ -101,6 +101,13 @@ class DimensionsTestCase(unittest.TestCase):
         # check that dimension lengths are correct.
         for name,dim in g.dimensions.items():
             self.assertTrue(len(dim) == lensdict[name])
+        # check get_dims variable method
+        dim_tuple = vg.get_dims()
+        # some dimensions from parent group
+        dim_tup1 = (f.dimensions['level'],g.dimensions['lat'],\
+                    g.dimensions['lon'],f.dimensions['time'])
+        dim_tup2 = vg.get_dims()
+        assert(dim_tup1 == dim_tup2)
         # check that isunlimited() method works.
         for name,dim in g.dimensions.items():
             self.assertTrue(dim.isunlimited() == unlimdict[name])


=====================================
test/tst_multifile.py
=====================================
@@ -63,9 +63,13 @@ class VariablesTestCase(unittest.TestCase):
         assert_array_equal(varin[:],data2)
         assert getattr(varin,'nonexistantatt',None) == None
         f.close()
-
-    def test_get_by_mfdataset(self):
-        """testing multi-file get_variables_by_attributes."""
+        # test master_file kwarg (issue #835).
+        f = MFDataset(self.files,master_file=self.files[-1],check=True)
+        assert_array_equal(np.arange(0,nx),f.variables['x'][:])
+        varin = f.variables['data']
+        assert_array_equal(varin[4:-4:4,3:5,2:8],data2[4:-4:4,3:5,2:8])
+        f.close()
+        # testing multi-file get_variables_by_attributes
         f = MFDataset(self.files,check=True)
         assert f.get_variables_by_attributes(axis='T') == []
         f.get_variables_by_attributes(units='zlotys')[0] == f['x']


=====================================
test/tst_netcdftime.py
=====================================
@@ -434,7 +434,7 @@ class netcdftimeTestCase(unittest.TestCase):
             pass
         # this should not fail (year zero allowed in 'fake' calendars)
         t = date2num(datetime(1, 1, 1), units, calendar='360_day')
-        self.assertEqual(t, 360)
+        self.assertAlmostEqual(t, 360)
         d = num2date(t, units, calendar='360_day')
         self.assertEqual(d, Datetime360Day(1,1,1))
         d = num2date(0, units, calendar='360_day')


=====================================
test/tst_types.py
=====================================
@@ -2,7 +2,7 @@ import sys
 import unittest
 import os
 import tempfile
-import numpy as NP
+import numpy as np
 from numpy.testing import assert_array_equal, assert_array_almost_equal
 from numpy.random.mtrand import uniform
 import netCDF4
@@ -17,30 +17,30 @@ ranarr = 100.*uniform(size=(n1dim,n2dim))
 zlib=False;complevel=0;shuffle=0;least_significant_digit=None
 datatypes = ['f8','f4','i1','i2','i4','i8','u1','u2','u4','u8','S1']
 FillValue = 1.0
-issue273_data = NP.ma.array(['z']*10,dtype='S1',\
+issue273_data = np.ma.array(['z']*10,dtype='S1',\
 mask=[False,False,False,False,False,True,False,False,False,False])
 
 class PrimitiveTypesTestCase(unittest.TestCase):
 
     def setUp(self):
         self.file = FILE_NAME
-        file = netCDF4.Dataset(self.file,'w')
-        file.createDimension('n1', None)
-        file.createDimension('n2', n2dim)
+        f = netCDF4.Dataset(self.file,'w')
+        f.createDimension('n1', None)
+        f.createDimension('n2', n2dim)
         for typ in datatypes:
-            foo = file.createVariable('data_'+typ, typ, ('n1','n2',),zlib=zlib,complevel=complevel,shuffle=shuffle,least_significant_digit=least_significant_digit,fill_value=FillValue)
+            foo = f.createVariable('data_'+typ, typ, ('n1','n2',),zlib=zlib,complevel=complevel,shuffle=shuffle,least_significant_digit=least_significant_digit,fill_value=FillValue)
             #foo._FillValue = FillValue
             # test writing of _FillValue attribute for diff types
             # (should be cast to type of variable silently)
             foo[1:n1dim] = ranarr[1:n1dim]
-        v = file.createVariable('issue271', NP.dtype('S1'), [], fill_value=b'Z')
-        v2 = file.createVariable('issue273', NP.dtype('S1'), 'n2',\
+        v = f.createVariable('issue271', np.dtype('S1'), [], fill_value=b'Z')
+        v2 = f.createVariable('issue273', np.dtype('S1'), 'n2',\
                 fill_value='\x00')
         v2[:] = issue273_data
-        v3 = file.createVariable('issue707',NP.int8,'n2')
+        v3 = f.createVariable('issue707',np.int8,'n2')
         v3.setncattr('missing_value',255)
         v3[:]=-1
-        file.close()
+        f.close()
 
     def tearDown(self):
         # Remove the temporary files
@@ -48,9 +48,9 @@ class PrimitiveTypesTestCase(unittest.TestCase):
 
     def runTest(self):
         """testing primitive data type """
-        file = netCDF4.Dataset(self.file)
+        f = netCDF4.Dataset(self.file)
         for typ in datatypes:
-            data = file.variables['data_'+typ]
+            data = f.variables['data_'+typ]
             data.set_auto_maskandscale(False)
             datarr = data[1:n1dim]
             # fill missing data with _FillValue
@@ -65,22 +65,22 @@ class PrimitiveTypesTestCase(unittest.TestCase):
                 self.assertTrue(data.dtype.str[1:] == typ)
             # check data in variable.
             if data.dtype.str[1:] != 'S1':
-                #assert NP.allclose(datarr, ranarr[1:n1dim].astype(data.dtype))
+                #assert np.allclose(datarr, ranarr[1:n1dim].astype(data.dtype))
                 assert_array_almost_equal(datarr,ranarr[1:n1dim].astype(data.dtype))
             else:
                 assert datarr.tostring() == ranarr[1:n1dim].astype(data.dtype).tostring()
             # check that variable elements not yet written are filled
             # with the specified _FillValue.
-            assert_array_equal(datfilled,NP.asarray(data._FillValue,datfilled.dtype))
+            assert_array_equal(datfilled,np.asarray(data._FillValue,datfilled.dtype))
         # issue 271 (_FillValue should be a byte for character arrays on
         # Python 3)
-        v = file.variables['issue271']
+        v = f.variables['issue271']
         if type(v._FillValue) == bytes:
             assert(v._FillValue == b'Z') # python 3
         else:
             assert(v._FillValue == u'Z') # python 2
         # issue 273 (setting _FillValue to null byte manually)
-        v2 = file.variables['issue273']
+        v2 = f.variables['issue273']
         if type(v2._FillValue) == bytes:
             assert(v2._FillValue == b'\x00') # python 3
         else:
@@ -88,9 +88,14 @@ class PrimitiveTypesTestCase(unittest.TestCase):
         assert(str(issue273_data) == str(v2[:]))
         # issue 707 (don't apply missing_value if cast to variable type is
         # unsafe)
-        v3 = file.variables['issue707']
-        assert_array_equal(v3[:],-1*NP.ones(n2dim,v3.dtype))
-        file.close()
+        v3 = f.variables['issue707']
+        assert_array_equal(v3[:],-1*np.ones(n2dim,v3.dtype))
+        f.close()
+        # issue #850 (masked scalar char variable)
+        f = netCDF4.Dataset(self.file,'a')
+        a = f.createVariable('a', 'c', ())
+        a[:] = np.ma.masked
+        f.close()
 
 if __name__ == '__main__':
     unittest.main()



View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf4-python/compare/83c4cecef5ecf0f0c3345e83220d068ceaffc8c9...1ec09955732dd8d8ec50776d317613ec6eb4a95c
