Optimize ABC caches #383
Conversation
@methane I also inlined internal helpers to speed things up.
Because abc is full of magic, I can't say LGTM. BTW, is this issue new since 3.5.3 and 3.6.0?
    self._abc_cache = self.__extra__._abc_cache
elif self.__origin__ is not None:
    self._abc_registry = self.__origin__._abc_registry
    self._abc_cache = self.__origin__._abc_cache
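The idea in the hunk above can be illustrated with a standalone sketch (the class names here are hypothetical stand-ins, not typing.py's internals): instead of letting each subscripted alias get its own WeakSet caches, the alias reuses the origin's sets by simple assignment, so the origin and every alias see one shared positive cache.

```python
from weakref import WeakSet

class Origin:
    # stand-ins for the per-class caches that abc.ABCMeta creates
    _abc_registry = WeakSet()
    _abc_cache = WeakSet()

class Alias:
    def __init__(self, origin):
        # share by simple assignment instead of allocating fresh
        # WeakSets: entries cached through any alias are visible on
        # the origin and on every other alias
        self._abc_registry = origin._abc_registry
        self._abc_cache = origin._abc_cache

class Probe:
    pass

alias = Alias(Origin)
alias._abc_cache.add(Probe)
print(Probe in Origin._abc_cache)  # -> True: the cache entry is shared
```

Since the positive cache is never invalidated (only added to), no write ever needs to be intercepted, which is why a plain assignment at instantiation time is enough here.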
How about sharing self._abc_negative_cache too?
self._abc_negative_cache is also shared, but it could be overwritten by abc.py on negative cache invalidation, which is why I made it a descriptor 7 lines below. (The positive cache is never invalidated, so sharing can be realized as a simple assignment on instantiation.)
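A minimal sketch of the descriptor approach described here (hypothetical names, not the actual typing.py code): reads of _abc_negative_cache on an alias fall through to the origin class, and the assignment abc.py makes during cache invalidation is redirected to the origin too, so no alias ever grows a private WeakSet.

```python
from weakref import WeakSet

class NegCacheShortcut:
    """Hypothetical data descriptor on a metaclass: both reads and
    writes of _abc_negative_cache are redirected to the origin class."""
    def __get__(self, cls, metacls=None):
        if cls is None:          # accessed on the metaclass itself
            return self
        origin = getattr(cls, '_origin', None) or cls
        return origin._shared_neg_cache

    def __set__(self, cls, value):
        # abc.py invalidates with ``cls._abc_negative_cache = WeakSet()``;
        # redirect that assignment to the shared cache on the origin
        origin = getattr(cls, '_origin', None) or cls
        origin._shared_neg_cache = value

class Meta(type):
    _abc_negative_cache = NegCacheShortcut()

class Base(metaclass=Meta):
    _origin = None               # Base is its own origin
    _shared_neg_cache = WeakSet()

class Alias(Base):
    _origin = Base               # alias points back at the origin

class Probe:
    pass

Alias._abc_negative_cache.add(Probe)
print(Probe in Base._abc_negative_cache)   # -> True: shared, not a copy
Alias._abc_negative_cache = WeakSet()      # simulate invalidation
print(Probe in Base._abc_negative_cache)   # -> False: cleared everywhere
```

Because the descriptor is a data descriptor (it defines both __get__ and __set__), class-attribute assignment through the metaclass is intercepted, which is exactly what an invalidating write from abc.py would hit.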
tracemalloc shows this. I think removing _abc_negative_cache at __init__ or __new__ is worthwhile.
125.34KiB / count=573
File "/Users/inada-n/local/py37/lib/python3.7/_weakrefset.py", line 48
self._iterating = set()
File "/Users/inada-n/local/py37/lib/python3.7/abc.py", line 147
cls._abc_negative_cache = WeakSet()
File "/Users/inada-n/local/py37/lib/python3.7/typing.py", line 125
return super().__new__(cls, name, bases, namespace)
123.44KiB / count=1193
File "/Users/inada-n/local/py37/lib/python3.7/_weakrefset.py", line 38
def _remove(item, selfref=ref(self)):
File "/Users/inada-n/local/py37/lib/python3.7/abc.py", line 147
cls._abc_negative_cache = WeakSet()
File "/Users/inada-n/local/py37/lib/python3.7/typing.py", line 125
return super().__new__(cls, name, bases, namespace)
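Traces like the ones above come from tracemalloc snapshots grouped by allocation traceback. A minimal sketch of how such a report can be produced (the workload below is an illustrative stand-in, not the actual typing benchmark):

```python
import tracemalloc
from weakref import WeakSet

tracemalloc.start(10)  # keep up to 10 frames per allocation

# stand-in workload: many per-class WeakSet caches, similar to what
# abc.ABCMeta creates when a large number of classes exist
caches = [WeakSet() for _ in range(1000)]

snapshot = tracemalloc.take_snapshot()
# group allocations by full traceback, largest first
for stat in snapshot.statistics('traceback')[:3]:
    print('%.2fKiB / count=%d' % (stat.size / 1024, stat.count))
    for line in stat.traceback.format():
        print(line)
```

Grouping by 'traceback' (rather than the default 'lineno') is what attributes each block of memory to a full call chain like the _weakrefset.py / abc.py / typing.py stacks shown above.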
I could not just remove them, since some other places expect their presence. However, I found a bug in the _abc_negative_cache setters, and I made most assignments to _abc_negative_cache a no-op. Could you please check the memory/speed situation once more with the newest commit in my branch?
I think it was always like this. Actually, it was maybe even worse. There were some improvements in 3.5.3 and 3.6.0 due to generic type caches (these are unrelated caches that actually help :-)).
Memory consumption goes wrong.

I can reproduce the above with the original typing. It wasn't in https://gist.github.com/methane/3c34f11fb677365a7e92afe73aca24e7 ("", line 488 usually includes memory for …).
The last trace you posted (with 996.90KiB / count=3450) seems to be unrelated to _abc_caches. This is normal creation of generic classes. It could be a problem with tracemalloc. If necessary, you could clean up the generic type caches:

for c in typing._cleanups:
    c()

I don't know how to clean up other internal caches. Could you please compare the master version and my patch, both in the same conditions "from cold start"?

If you are interested, I am running a "micro-benchmark":

import sys
from typing import *
from collections.abc import Iterable

classes = []
for i in range(1000):
    classes.append(type('C' + str(i), (), {}))
    dummy = List[classes[i]]
for i in range(1000):
    dummy = isinstance(classes[i](), Iterable)
sys._debugmallocstats()

This gives me total memory consumption 5.5MB with my patch vs 48MB for typing/master.
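A possible alternative to eyeballing sys._debugmallocstats() output: tracemalloc can report totals directly, which makes a scripted "from cold start" comparison between two branches easier. This is only a sketch; the workload below is an illustrative stand-in for the micro-benchmark, not the benchmark itself.

```python
import tracemalloc

tracemalloc.start()

# stand-in workload: create many classes, as the micro-benchmark does
classes = [type('C' + str(i), (), {}) for i in range(1000)]

# totals since start(): current live size and the peak reached
current, peak = tracemalloc.get_traced_memory()
print('current=%.1fKiB peak=%.1fKiB' % (current / 1024, peak / 1024))
tracemalloc.stop()
```

Running the same script once per branch and diffing the two totals gives a cold-start comparison without any interference from caches warmed by a previous run in the same process.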
As I said above, I can reproduce it with the default version of typing. There are no regressions.
OK, thanks! Do you think the PR is satisfactory?
It looks nice to me.
@methane

Force-pushed from 149b538 to 0ca5c0b.
I continue to lack the energy to review/approve this (I can handle only moderately complex reviews ATM). Given that the 3.6.1 candidate is expected in about two weeks (see PEP 494), maybe the best way forward is to merge this (and the coverage PR that depends on it) now, merge everything upstream, and hope for the best? The memory stats look encouraging for sure!
OK, let's do this. After merging, I will make additional checks (I do refleak checks from time to time, and run the full CPython test suite after importing typing).
Extensive use of isinstance with ABCs causes a significant increase in memory consumption by generics, see https://mail.python.org/pipermail/python-dev/2017-January/147194.html

Here I make _abc_cache and friends shortcuts to the type's __extra__ or _gorg (using descriptors where necessary). This allows reducing memory consumption up to 4x on a micro-benchmark simulating the conditions described in the python-dev thread. (There is of course a speed penalty, but it seems to be minor.)

@methane Please take a look.