
Conversation

@jsancho-gpl
Contributor

@jsancho-gpl jsancho-gpl commented Apr 13, 2018

release 3.4.3 : maintenance / bugfix release

TODO:

  • update RELEASE-NOTES.txt and doc/source/release_notes.rst
  • set VERSION to 3.4.3 (perhaps also in docs?)
  • run --heavy test suite on Linux (or Mac)
  • run --heavy test suite on Windows.
  • merge into master
  • tag the commit as v3.4.3
  • make sure docs are also built / updated
  • upload packages to PyPI. FOR NEXT RELEASE: This should happen after wheels are built.
  • create wheels and upload to PyPI (wheel builder at MacPython/pytables-wheels)
  • upload package and wheels to PyPI
  • update conda-forge/pytables-feedstock

After release:

  • post release actions (3c5f652) (merge master back into develop)

tomkooij and others added 30 commits April 19, 2017 20:32
[ci skip]
Implement `__dir__()` on group nodes so that the list of
autocompletions also includes those children that are named as valid
python identifiers.
This tests autocompletion of group (node) names.
+ Add regular Group-attributes in its dir() test-case.
feat(group): autocomplete valid python-identifiers
+ Docs updated.
+ Simple test-case.
+ Not suitable for PY-2 (no `__dir__()` in datamodel).
+ Symmetrical to #624 (Groups).
+ Test dir() results for attributes.
# See discussion in #265.
feat(attributes): __dir__() autocompletes also valid py-identifiers
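
A minimal sketch of the idea behind these two commits, assuming a
children mapping similar to Group's `_v_children`; the class and the
test values are illustrative, not the actual PyTables code:

import keyword

class DemoGroup:
    """Toy container whose __dir__ exposes child names to tab-completion."""

    def __init__(self, children):
        self._v_children = dict(children)  # child name -> child node

    def __dir__(self):
        # Merge the standard attribute list with those children whose
        # names are valid python identifiers; names like 'not a name'
        # are skipped so completion output stays clean.
        valid = [name for name in self._v_children
                 if name.isidentifier() and not keyword.iskeyword(name)]
        return sorted(set(super().__dir__()) | set(valid))

g = DemoGroup({"data_1": None, "not a name": None})
assert "data_1" in dir(g) and "not a name" not in dir(g)
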
The release notes for 3.3.0 -> 3.4.0 went missing.

[ci skip]
+ Add `Group.__getitem__()` change by ankostis.
Define `Group.__getitem__()` method
__item__ is not a standard dunder. Removed.
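
As a hedged sketch, the container behaviour being added reduces to
delegating to the existing child lookup; the toy class below is
illustrative, with `_f_get_child` standing in for the real PyTables
method of the same name:

class DemoGroup:
    def __init__(self, children):
        self._children = dict(children)

    def _f_get_child(self, childname):
        # Stand-in for Group._f_get_child(), which in PyTables raises
        # NoSuchNodeError when the child does not exist.
        return self._children[childname]

    def __getitem__(self, childname):
        # Container-style access, so root['some child'] also reaches
        # children whose names are not valid python identifiers.
        return self._f_get_child(childname)

root = DemoGroup({"some child": "a leaf"})
assert root["some child"] == "a leaf"
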
While trying to build PyTables using icc/icpc, the build would fail with the following error:

error: identifier "_mm256_castsi128_si256" is undefined
  _mm256_loadu2_m128i(const __m128i* const hiaddr, const __m128i* const loaddr)
  ^

test.c(5): error: expected a type specifier
  _mm256_loadu2_m128i(const __m128i* const hiaddr, const __m128i* const loaddr)
  ^

test.c(5): error: expected a type specifier
  _mm256_loadu2_m128i(const __m128i* const hiaddr, const __m128i* const loaddr)
  ^

test.c(5): error #141: unnamed prototyped parameters not allowed when body is present
  _mm256_loadu2_m128i(const __m128i* const hiaddr, const __m128i* const loaddr)
  ^

test.c(8): error: identifier "loaddr" is undefined
      _mm256_castsi128_si256(_mm_loadu_si128(loaddr)), _mm_loadu_si128(hiaddr), 1);
                                             ^

test.c(8): error: identifier "hiaddr" is undefined
      _mm256_castsi128_si256(_mm_loadu_si128(loaddr)), _mm_loadu_si128(hiaddr), 1);

It was diagnosed that immintrin.h from the include directory of icc already defines split load/store intrinsics.
Ensuring that this section of shuffle-avx2.c is not declared when building with icc/icpc allows builds to go through.
enable building PyTables with Intel compiler (icc/icpc)
PEP 519 defines a filesystem path protocol, and enhances Python builtin I/O
functions to support "path-like" objects. As part of this PEP, the `os` module
also gained a `fspath` function, which takes an arbitrary path-like object and
returns a string.

If the os.fspath function is available, use it to convert `filename` to a
string.
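
A minimal sketch of the guarded conversion described above; the helper
name `_to_filename` is an illustrative stand-in, not the actual
PyTables function:

import os
from pathlib import Path

def _to_filename(filename):
    # os.fspath() exists on Python >= 3.6 (PEP 519); it accepts str,
    # bytes and any object implementing __fspath__ (e.g. pathlib.Path)
    # and returns the underlying str/bytes path.
    if hasattr(os, "fspath"):
        filename = os.fspath(filename)
    return filename

print(_to_filename(Path("data.h5")))  # -> 'data.h5'
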
Method Node._get_obj_timestamps() returns an object with attributes
atime, mtime, ctime and btime from the object's H5O_info_t struct.
These times are integer seconds since the UNIX epoch.

HDF5 stores timestamps associated with an object by default: access
time (atime), modification time (mtime), change time (ctime), and
birth time (btime).  Of these, only ctime has been implemented as of
HDF5 version 1.8.15.
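
A hedged usage sketch of the new accessor; the file name and the node
below are assumptions made for illustration:

import datetime
import tables  # a PyTables build with this change applied

with tables.open_file("data.h5") as h5:            # hypothetical file
    ts = h5.root.some_array._get_obj_timestamps()  # hypothetical node
    # Per the commit, atime, mtime and btime currently read as 0;
    # ctime is integer seconds since the UNIX epoch.
    print(datetime.datetime.utcfromtimestamp(ts.ctime))
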
This property retrieves the "track times" creation property of a
dataset using H5Pget_obj_track_times.

Retrieval of the property works in a newly created dataset but the
property does not seem to survive closing and reopening the file
(bug?).  So, the property has dubious value.  It might be better to
just advise users to check whether the ctime for the dataset is 0.
Allow disabling object time tracking when creating Tables, Arrays,
CArrays, EArrays, or VLArrays using H5Pset_obj_track_times, via an
optional keyword argument "track_times" in constructors, default True
for backwards compatibility.

Disabling object time tracking makes it at least possible to get
bitwise reproducible output files from different runs of the same
application.

From the HDF5 FAQ:
"If you turn off the create/modify/access time tracking for
  objects created (with the H5Pset_obj_track_times() routine),
  everything should be bit-for-bit reproducible. Coincidentally, it
  makes accessing those objects faster and the size of their metadata
  smaller also. You do lose the ability to know when the object was
  created/modified/accessed."

The user intent to track times / not track times is passed in to the
corresponding functions in the C source via the attribute
._want_track_times of the Leaf class, the way filters are.  These C
functions (H5ARRAYmake, H5TBOmake_table, and H5VLARRAYmake) have a new
boolean argument, track_times, before the data pointer argument.
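
A short usage sketch of the resulting keyword, assuming a fresh output
file; track_times defaults to True as described above:

import numpy as np
import tables

with tables.open_file("repro.h5", "w") as h5:
    # With track_times=False the HDF5 object carries no timestamps,
    # so repeated runs can produce bit-for-bit identical files.
    h5.create_array(h5.root, "x", np.arange(10), track_times=False)
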
DOC small documentation fixes
@FrancescAlted
Member

Cool. There is an outstanding error at AppVeyor:

(screenshot: AppVeyor error log)

@tomkooij Any hint?

@FrancescAlted
Member

Also, I'd like to update blosc sources to latest release (1.14.3). Should be easy enough.

@tomkooij
Contributor

tomkooij commented Apr 13, 2018

About the conda error: We (also) test against HDF5 1.10.x, but that is not in conda defaults or conda-forge. This uses my version of hdf5-1.10 from my conda repos.

EDIT: It seems it was just a network error. Restarting that build should fix it. Sadly, I do not have permission to restart the AppVeyor build.

@tomkooij
Contributor

@jsancho-gpl : I have created a TODO list above. :-)

I can run the --heavy tests on Windows if you do not have a Windows machine available.

(I can/should probably help with the wheels and conda-forge as well)

@jsancho-gpl
Contributor Author

jsancho-gpl commented Apr 16, 2018

Thanks @tomkooij, I don't have a Windows machine so I'll need your help for that. I'm working on the rest of the list.

Changes from 3.4.2 to 3.4.3
===========================

Improvements
------------
 - In interactive python sessions, the group/attribute `__dir__()` method
   autocompletes children that are named as valid python identifiers.
   :issue:`624` & :issue:`625` thanks to ankostis.
 - Implement `Group.__getitem__()` to have groups act as python-containers,
   so code like this works: ``hfile.root['some child']``.
   :issue:`628` thanks to ankostis.
 - Enable building with Intel compiler (icc/icpc).
   Thanks to rohit-jamuar.
 - PEP 519 support, using new `os.fspath` method.
   Thanks to mruffalo.
 - Optionally disable recording of ctime (metadata creation time) when
   creating datasets, which makes it possible to get bitwise identical
   output from repeated runs.
   Thanks to alex-cobb.
 - Prevent reading all rows for each coordinate in a VLArray when
   indexing using a list.
   Thanks to igormq.
 - Internal Blosc version updated to 1.14.3.

Bugs fixed
----------
 - Fixed division by zero when using `_convert_time64()` with an empty
   nparr array.
   :issue:`653`. Thanks to alobbs.
 - Fixed deprecation warnings with numpy 1.14.
   Thanks to oleksandr-pavlyk.
 - Skip DLL check when running from a frozen app.
   :issue:`675`. Thanks to jwiggins.
 - Fixed behaviour with slices out of range.
   :issue:`651`. Thanks to jackdbd.
@jsancho-gpl
Contributor Author

When running the heavy test suite on Linux, I get:

/home/jsancho/.local/lib/python3.6/site-packages/tables-3.4.3-py3.6-linux-x86_64.egg/tables/nodes/filenode.py:170: UserWarning: host PyTables file is already closed!
  warnings.warn("host PyTables file is already closed!")
......
======================================================================
ERROR: None (tables.tests.test_tables.LargeRowSize)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/jsancho/.local/lib/python3.6/site-packages/tables-3.4.3-py3.6-linux-x86_64.egg/tables/tests/test_tables.py", line 5126, in test01
    self.h5file.create_table(self.h5file.root, 'largerow', r)
  File "/home/jsancho/.local/lib/python3.6/site-packages/tables-3.4.3-py3.6-linux-x86_64.egg/tables/file.py", line 1066, in create_table
    track_times=track_times)
  File "/home/jsancho/.local/lib/python3.6/site-packages/tables-3.4.3-py3.6-linux-x86_64.egg/tables/table.py", line 842, in __init__
    byteorder, _log, track_times)
  File "/home/jsancho/.local/lib/python3.6/site-packages/tables-3.4.3-py3.6-linux-x86_64.egg/tables/leaf.py", line 290, in __init__
    super(Leaf, self).__init__(parentnode, name, _log)
  File "/home/jsancho/.local/lib/python3.6/site-packages/tables-3.4.3-py3.6-linux-x86_64.egg/tables/node.py", line 266, in __init__
    self._v_objectid = self._g_create()
  File "/home/jsancho/.local/lib/python3.6/site-packages/tables-3.4.3-py3.6-linux-x86_64.egg/tables/table.py", line 1029, in _g_create
    self._v_new_title, self.filters.complib or '', obversion)
  File "tables/tableextension.pyx", line 211, in tables.tableextension.Table._create_table
tables.exceptions.HDF5ExtError: Problems creating the table

----------------------------------------------------------------------
Ran 67132 tests in 9642.535s

FAILED (errors=1, skipped=304)

Any ideas?

@avalentino
Member

All seems to work fine here:

$ python3 tables/tests/test_tables.py --heavy -v LargeRowSize
-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
PyTables version:    3.3.0
HDF5 version:        1.10.0-patch1
NumPy version:       1.12.1
Numexpr version:     2.6.2 (not using Intel's VML/MKL)
Zlib version:        1.2.11 (in Python interpreter)
LZO version:         2.08 (Jun 29 2014)
BZIP2 version:       1.0.6 (6-Sept-2010)
Blosc version:       1.11.1 (2016-09-03)
Blosc compressors:   blosclz (1.0.5), lz4 (1.7.1), lz4hc (1.7.1), snappy (unknown), zlib (1.2.11), zstd (1.2.0)
Blosc filters:       shuffle, bitshuffle
Cython version:      0.25.2
Python version:      3.6.3 (default, Oct  3 2017, 21:45:48) 
[GCC 7.2.0]
Platform:            Linux-4.13.0-38-generic-x86_64-with-Ubuntu-17.10-artful
Byte-ordering:       little
Detected cores:      4
Default encoding:    utf-8
Default FS encoding: utf-8
Default locale:      (it_IT, UTF-8)
-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
test00 (__main__.LargeRowSize)
Checking saving a Table with a moderately large rowsize ... ok
test01 (__main__.LargeRowSize)
Checking saving a Table with an extremely large rowsize ... ok

----------------------------------------------------------------------
Ran 2 tests in 0.014s

OK

@tomkooij
Contributor

I can't reproduce it: I think just ignoring the error is fine. (Or perhaps run that specific test again?)

@jsancho-gpl
Contributor Author

I've tried with Debian and Ubuntu on two different computers, but always with the same result: an error in the LargeRowSize test. I probably have something wrong in my test environment. Maybe some kernel parameter for file sizes or something like that?

@jsancho-gpl
Contributor Author

At

r = records.array([[np.arange(40000)]*4])  # 640 KB

we try to create a table with a row of 640 KB.

With numpy==1.14.2 (my version):

In [4]: np.rec.array([[np.arange(10)]*4])
Out[4]: 
rec.array([[(0, 1, 2, 3, 4, 5, 6, 7, 8, 9),
            (0, 1, 2, 3, 4, 5, 6, 7, 8, 9),
            (0, 1, 2, 3, 4, 5, 6, 7, 8, 9),
            (0, 1, 2, 3, 4, 5, 6, 7, 8, 9)]],
          dtype=[('f0', '<i8'), ('f1', '<i8'), ('f2', '<i8'), ('f3', '<i8'), ('f4', '<i8'), ('f5', '<i8'), ('f6', '<i8'), ('f7', '<i8'), ('f8', '<i8'), ('f9', '<i8')])

But with numpy==1.12.1:

In [4]: np.rec.array([[np.arange(10)]*4])
Out[4]: 
rec.array([[(0, 1, 2, 3), (0, 1, 2, 3), (0, 1, 2, 3), (0, 1, 2, 3)]],
          dtype=[('f0', '<i8'), ('f1', '<i8'), ('f2', '<i8'), ('f3', '<i8')])
In [5]: np.rec.array([[np.arange(40000)]*4])
Out[5]: 
rec.array([[(0, 1, 2, 3), (0, 1, 2, 3), (0, 1, 2, 3), (0, 1, 2, 3)]],
          dtype=[('f0', '<i8'), ('f1', '<i8'), ('f2', '<i8'), ('f3', '<i8')])

It seems to be a bug in older versions of numpy.
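
For what it's worth, a hedged sketch of a construction that pins the
dtype explicitly and therefore behaves the same across numpy versions;
the small field count is only to keep the demo readable:

import numpy as np

n = 10  # the failing test uses 40000 fields; 10 keeps the demo small
dt = np.dtype([('f%d' % i, '<i8') for i in range(n)])
r = np.rec.array([tuple(range(n))] * 4, dtype=dt)
print(r.dtype.names[:4], r.shape)  # ('f0', 'f1', 'f2', 'f3') (4,)
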

@tomkooij
Contributor

3.4.3 was released to PyPI, so we need wheels.

I'm merging this to be able to create wheels.

@tomkooij tomkooij merged commit ea7dc2b into master Apr 20, 2018
@jsancho-gpl
Contributor Author

Thanks @tomkooij, my plan was to build the wheels yesterday evening and today, but in the end I couldn't, and the source package had already been uploaded.

@tomkooij
Contributor

tomkooij commented Apr 21, 2018

@jsancho-gpl Unfortunately, wheel building was not as easy as "update version". The wheelbuilder used the external C-BLOSC because the pytables internal C-BLOSC enables AVX2 opcodes on travis builds.
C-BLOSC will no longer build in the manylinux docker containers, as it requires CMake 2.8.12, which is not supported on CentOS 5 (yes, v5).

The wheelbuilder (MacPython/pytables-wheels@abebcb2) now applies the same patch conda-forge uses to disable AVX2 in pytables setup.py and uses the internal C-Blosc.

Wheels are being built and uploaded! conda-forge packages should be available (or are in the CI queue).

@tomkooij
Contributor

Wheels are on PyPI!
conda packages are on conda-forge!
