
This repository was archived by the owner on Dec 8, 2024. It is now read-only.

Tolerance fix #104

Merged · 76 commits · Feb 5, 2019

Commits
baa57eb
Merge remote-tracking branch 'upstream/develop' into develop
SilviaAmAm Jul 30, 2018
ceb9ef5
Merge remote-tracking branch 'upstream/develop' into develop
SilviaAmAm Aug 15, 2018
d974133
SilviaAmAm Aug 24, 2018
8a3fbf3
Started updating so that model can be trained after it's been reloaded
SilviaAmAm Aug 24, 2018
7a2dc7e
SilviaAmAm Aug 25, 2018
b33a537
Merge remote-tracking branch 'upstream/develop' into develop
SilviaAmAm Aug 25, 2018
5525d3b
Updated model so one can predict from xyz and disabled shuffling in t…
SilviaAmAm Aug 26, 2018
bb28c4a
SilviaAmAm Aug 27, 2018
4b29b95
Added some tests to make sure the predictions work
SilviaAmAm Aug 27, 2018
a4dc496
SilviaAmAm Aug 27, 2018
ef5c880
SilviaAmAm Aug 28, 2018
d1e27ad
SilviaAmAm Aug 28, 2018
f3a3e9c
Modified the name of a parameter
SilviaAmAm Aug 29, 2018
c77272e
Made modifications to make the symmetry functions more numerically stable
SilviaAmAm Aug 31, 2018
3db69a3
SilviaAmAm Sep 5, 2018
acf0ee7
SilviaAmAm Sep 5, 2018
49b36c2
SilviaAmAm Sep 6, 2018
75aac67
SilviaAmAm Sep 6, 2018
1f694f8
SilviaAmAm Sep 7, 2018
e5c4787
Attempt at fixing issue #10
SilviaAmAm Sep 11, 2018
fef6fba
another attempt at fixing #10
SilviaAmAm Sep 11, 2018
61fefce
Removed a pointless line
SilviaAmAm Sep 12, 2018
386be84
Merge
SilviaAmAm Sep 13, 2018
4c050f7
set-up
SilviaAmAm Sep 13, 2018
3bdd928
SilviaAmAm Sep 13, 2018
81c9824
Modifications which prevent installation from breaking on BC4
Sep 13, 2018
f4a7424
Merge branch 'kill_gracefully' into develop
SilviaAmAm Sep 13, 2018
4a41f14
Modification to add neural networks to qmlearn
SilviaAmAm Sep 14, 2018
7d7f580
Merge remote-tracking branch 'upstream/develop' into develop_qmlearn
SilviaAmAm Sep 18, 2018
898b640
Fix for issue #8
SilviaAmAm Sep 21, 2018
f77d040
SilviaAmAm Sep 24, 2018
2d5988f
SilviaAmAm Sep 24, 2018
da8b524
SilviaAmAm Sep 25, 2018
227e2b8
SilviaAmAm Sep 25, 2018
250031d
SilviaAmAm Sep 25, 2018
687291b
uncommented examples
SilviaAmAm Sep 25, 2018
4391906
Removed unique_elements in data class
larsbratholm Sep 26, 2018
2fb8d39
Made tensorflow an optional dependency
larsbratholm Sep 26, 2018
819fa71
Made is_numeric non-private and removed legacy code
larsbratholm Sep 26, 2018
20ca689
Added 1d array util function
larsbratholm Sep 27, 2018
9ef8c10
Removed QML check and moved functions from utils to tf_utils
larsbratholm Sep 27, 2018
00a2a7c
Support for linear models (no hidden layers)
larsbratholm Sep 27, 2018
a925ee0
fixed import bug in tf_utils
larsbratholm Sep 27, 2018
c8cf008
Added text to explain that you are scoring on training set
larsbratholm Sep 27, 2018
96713c0
Restructure.
larsbratholm Sep 28, 2018
f313eff
Moved documentation from init to class
larsbratholm Sep 28, 2018
20b78b8
Constant features will now be removed at fit/predict time
larsbratholm Sep 28, 2018
4b785f1
Moved get_batch_size back into utils, since it doesn't depend on tf
larsbratholm Sep 28, 2018
8d62084
Made the NeuralNetwork class compliant with sklearn
larsbratholm Sep 28, 2018
d8447e7
Fixed tests that didn't pass
larsbratholm Sep 28, 2018
a8f6062
Fixed mistake in checks of set_classes() in ARMP
SilviaAmAm Sep 28, 2018
d4d98b6
SilviaAmAm Oct 1, 2018
9539704
Merge remote-tracking branch 'upstream/develop' into develop
SilviaAmAm Oct 1, 2018
2b22aa3
Fixed bug in padding and added examples that give low errors
SilviaAmAm Oct 1, 2018
8f082c5
Attempted fix to make representations single precision
SilviaAmAm Oct 2, 2018
21fd2fd
Hot fix for AtomScaler
SilviaAmAm Oct 2, 2018
f2f7770
Merge pull request #13 from larsbratholm/nn_qmlearn
Oct 2, 2018
999301d
Merge branch 'develop_qmlearn' into develop
SilviaAmAm Oct 2, 2018
9ce72c6
Minor bug fixes
SilviaAmAm Oct 2, 2018
70e8622
SilviaAmAm Oct 2, 2018
89dfda9
Fixed some tests that had failures
SilviaAmAm Oct 3, 2018
908da82
SilviaAmAm Oct 3, 2018
c47ad62
SilviaAmAm Oct 3, 2018
fdc485e
Readded changes to tests
SilviaAmAm Oct 3, 2018
255ea74
Modifications after code review
SilviaAmAm Oct 3, 2018
de3192f
Merge remote-tracking branch 'upstream/develop' into develop
SilviaAmAm Oct 23, 2018
e2feecf
Version with the ACSF basis functions starting at 0.8 A
Dec 11, 2018
162c811
Merge remote-tracking branch 'origin/develop' into develop
SilviaAmAm Dec 11, 2018
c8e2886
Merge remote-tracking branch 'upstream/develop' into develop
SilviaAmAm Dec 11, 2018
9584817
Updated ACSF representations so that the minimum distance at which to…
SilviaAmAm Dec 12, 2018
0639a32
SilviaAmAm Dec 12, 2018
1ee640c
Merge remote-tracking branch 'upstream/develop' into develop
SilviaAmAm Jan 17, 2019
91449e6
SilviaAmAm Jan 30, 2019
4964a7e
SilviaAmAm Jan 31, 2019
e102e2c
Merge branch 'develop' into tolerance_fix
larsbratholm Feb 5, 2019
268b69c
Merge branch 'develop' into tolerance_fix
larsbratholm Feb 5, 2019
39 changes: 39 additions & 0 deletions qml/qmlearn/preprocessing.py
@@ -206,6 +206,21 @@ def _transform(self, data, features, y):
        else:
            return delta_y

    def _revert_transform(self, data, features, y):
        """
        Reverts the work of the transform method.
        """

        full_y = y + self.model.predict(features)

        if data:
            # Force copy
            data.energies = data.energies.copy()
            data.energies[data._indices] = full_y
            return data
        else:
            return full_y

    def _check_elements(self, nuclear_charges):
        """
        Check that the elements in the given nuclear_charges was
@@ -261,3 +276,27 @@ def transform(self, X, y=None):
        features = self._featurizer(nuclear_charges)

        return self._transform(data, features, y)

    def revert_transform(self, X, y=None):
        """
        Transforms data back to what it would have been had it not been
        transformed with the fitted linear model. Supports three different
        types of input:
        1) X is a list of nuclear charges and y is the values to transform.
        2) X is an array of indices of the values to transform.
        3) X is a Data object.

        :param X: List with nuclear charges or Data object.
        :type X: list
        :param y: Values to revert to their pre-transform state
        :type y: array or None
        :return: Array of untransformed values or Data object, depending on input
        :rtype: array or Data object
        """

        data, nuclear_charges, y = self._parse_input(X, y)

        self._check_elements(nuclear_charges)

        features = self._featurizer(nuclear_charges)

        return self._revert_transform(data, features, y)
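The pair of methods above is a straight inverse: `_transform` subtracts the fitted linear model's prediction from the energies, and `_revert_transform` adds it back. A minimal standalone sketch of that round trip (the `AtomCountBaseline` class, feature matrix, and energy values below are illustrative stand-ins, not the qmlearn API):

```python
import numpy as np

class AtomCountBaseline:
    """Toy stand-in for the preprocessor's fitted linear model: energies are
    regressed on per-molecule element counts, so transform() leaves only the
    residual and revert_transform() restores the original values."""

    def fit(self, features, y):
        # Least-squares fit of energies on the element-count features
        self.coef_, *_ = np.linalg.lstsq(features, y, rcond=None)
        return self

    def predict(self, features):
        return features @ self.coef_

    def transform(self, features, y):
        # Remove the linear baseline, keeping only the residual
        return y - self.predict(features)

    def revert_transform(self, features, y):
        # Add the baseline back: exact inverse of transform()
        return y + self.predict(features)

# Element counts for four molecules (columns could be H, C, O counts)
features = np.array([[4.0, 1.0, 0.0],
                     [6.0, 2.0, 0.0],
                     [2.0, 0.0, 1.0],
                     [4.0, 2.0, 1.0]])
energies = np.array([-40.5, -79.8, -76.4, -153.8])

baseline = AtomCountBaseline().fit(features, energies)
delta = baseline.transform(features, energies)
restored = baseline.revert_transform(features, delta)
assert np.allclose(restored, energies)
```

The round trip is exact by construction, regardless of how well the linear fit matches the data, which is why `revert_transform` needs no tolerance of its own.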
12 changes: 6 additions & 6 deletions test/test_armp.py
@@ -228,7 +228,7 @@ def test_predict_fromxyz():
    pred1 = estimator.predict(idx)
    pred2 = estimator.predict_from_xyz(xyz, zs)

-    assert np.all(np.isclose(pred1, pred2, rtol=1.e-6))
+    assert np.all(np.isclose(pred1, pred2, rtol=1.e-5))

    estimator.save_nn(save_dir="temp")
@@ -243,11 +243,11 @@ def test_predict_fromxyz():
    pred3 = new_estimator.predict(idx)
    pred4 = new_estimator.predict_from_xyz(xyz, zs)

-    assert np.all(np.isclose(pred3, pred4, rtol=1.e-6))
-    assert np.all(np.isclose(pred1, pred3, rtol=1.e-6))
+    assert np.all(np.isclose(pred3, pred4, rtol=1.e-5))
+    assert np.all(np.isclose(pred1, pred3, rtol=1.e-5))

    shutil.rmtree("temp")

def test_retraining():
    xyz = np.array([[[0, 1, 0], [0, 1, 1], [1, 0, 1]],
                    [[1, 2, 2], [3, 1, 2], [1, 3, 4]],
@@ -291,8 +291,8 @@ def test_retraining():

    pred4 = new_estimator.predict(idx)

-    assert np.all(np.isclose(pred1, pred3, rtol=1.e-6))
-    assert np.all(np.isclose(pred2, pred4, rtol=1.e-6))
+    assert np.all(np.isclose(pred1, pred3, rtol=1.e-5))
+    assert np.all(np.isclose(pred2, pred4, rtol=1.e-5))

    shutil.rmtree("temp")
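The loosening from `rtol=1.e-6` to `rtol=1.e-5` throughout these tests fits the earlier commit making representations single precision: float32 carries only about 7 significant decimal digits, so two mathematically equivalent prediction paths can differ at the ~1e-6 relative level, right at the old tolerance. A small illustration of the `np.isclose` criterion `|a - b| <= atol + rtol * |b|` (the perturbation factor below is arbitrary, chosen only to mimic single-precision noise):

```python
import numpy as np

# A float32 value and a copy carrying a ~5e-6 relative perturbation,
# of the size accumulated float32 round-off can plausibly produce.
a = np.float32(1.2345678)
b = np.float32(a * np.float32(1.000005))

rel_err = abs(float(a) - float(b)) / abs(float(b))  # roughly 5e-6

# np.isclose tests |a - b| <= atol + rtol * |b| (default atol=1e-8)
print(np.isclose(a, b, rtol=1.e-5))  # True: within the loosened tolerance
print(np.isclose(a, b, rtol=1.e-6))  # False: old tolerance flags the noise
```

With `rtol=1.e-6` the allowed difference is about 1.2e-6, smaller than the 6e-6 perturbation, so the comparison fails; at `rtol=1.e-5` the same pair passes, which is the behaviour the PR's tolerance fix targets.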
