diff --git a/.github/CODE_OF_CONDUCT.md b/.github/CODE_OF_CONDUCT.md new file mode 100644 index 000000000..1d8ad1833 --- /dev/null +++ b/.github/CODE_OF_CONDUCT.md @@ -0,0 +1,133 @@ + +# Contributor Covenant Code of Conduct + +## Our Pledge + +We as members, contributors, and leaders pledge to make participation in our +community a harassment-free experience for everyone, regardless of age, body +size, visible or invisible disability, ethnicity, sex characteristics, gender +identity and expression, level of experience, education, socio-economic status, +nationality, personal appearance, race, caste, color, religion, or sexual +identity and orientation. + +We pledge to act and interact in ways that contribute to an open, welcoming, +diverse, inclusive, and healthy community. + +## Our Standards + +Examples of behavior that contributes to a positive environment for our +community include: + +* Demonstrating empathy and kindness toward other people +* Being respectful of differing opinions, viewpoints, and experiences +* Giving and gracefully accepting constructive feedback +* Accepting responsibility and apologizing to those affected by our mistakes, + and learning from the experience +* Focusing on what is best not just for us as individuals, but for the overall + community + +Examples of unacceptable behavior include: + +* The use of sexualized language or imagery, and sexual attention or advances of + any kind +* Trolling, insulting or derogatory comments, and personal or political attacks +* Public or private harassment +* Publishing others' private information, such as a physical or email address, + without their explicit permission +* Other conduct which could reasonably be considered inappropriate in a + professional setting + +## Enforcement Responsibilities + +Community leaders are responsible for clarifying and enforcing our standards of +acceptable behavior and will take appropriate and fair corrective action in +response to any behavior that they deem 
inappropriate, threatening, offensive, +or harmful. + +Community leaders have the right and responsibility to remove, edit, or reject +comments, commits, code, wiki edits, issues, and other contributions that are +not aligned to this Code of Conduct, and will communicate reasons for moderation +decisions when appropriate. + +## Scope + +This Code of Conduct applies within all community spaces, and also applies when +an individual is officially representing the community in public spaces. +Examples of representing our community include using an official e-mail address, +posting via an official social media account, or acting as an appointed +representative at an online or offline event. + +## Enforcement + +Instances of abusive, harassing, or otherwise unacceptable behavior may be +reported to the community leaders responsible for enforcement at +<hs@ox.cx>. +All complaints will be reviewed and investigated promptly and fairly. + +All community leaders are obligated to respect the privacy and security of the +reporter of any incident. + +## Enforcement Guidelines + +Community leaders will follow these Community Impact Guidelines in determining +the consequences for any action they deem in violation of this Code of Conduct: + +### 1. Correction + +**Community Impact**: Use of inappropriate language or other behavior deemed +unprofessional or unwelcome in the community. + +**Consequence**: A private, written warning from community leaders, providing +clarity around the nature of the violation and an explanation of why the +behavior was inappropriate. A public apology may be requested. + +### 2. Warning + +**Community Impact**: A violation through a single incident or series of +actions. + +**Consequence**: A warning with consequences for continued behavior. No +interaction with the people involved, including unsolicited interaction with +those enforcing the Code of Conduct, for a specified period of time.
This +includes avoiding interactions in community spaces as well as external channels +like social media. Violating these terms may lead to a temporary or permanent +ban. + +### 3. Temporary Ban + +**Community Impact**: A serious violation of community standards, including +sustained inappropriate behavior. + +**Consequence**: A temporary ban from any sort of interaction or public +communication with the community for a specified period of time. No public or +private interaction with the people involved, including unsolicited interaction +with those enforcing the Code of Conduct, is allowed during this period. +Violating these terms may lead to a permanent ban. + +### 4. Permanent Ban + +**Community Impact**: Demonstrating a pattern of violation of community +standards, including sustained inappropriate behavior, harassment of an +individual, or aggression toward or disparagement of classes of individuals. + +**Consequence**: A permanent ban from any sort of public interaction within the +community. + +## Attribution + +This Code of Conduct is adapted from the [Contributor Covenant][homepage], +version 2.1, available at +[https://www.contributor-covenant.org/version/2/1/code_of_conduct.html][v2.1]. + +Community Impact Guidelines were inspired by +[Mozilla's code of conduct enforcement ladder][Mozilla CoC]. + +For answers to common questions about this code of conduct, see the FAQ at +[https://www.contributor-covenant.org/faq][FAQ]. Translations are available at +[https://www.contributor-covenant.org/translations][translations]. 
+ +[homepage]: https://www.contributor-covenant.org +[v2.1]: https://www.contributor-covenant.org/version/2/1/code_of_conduct.html +[Mozilla CoC]: https://github.com/mozilla/diversity +[FAQ]: https://www.contributor-covenant.org/faq +[translations]: https://www.contributor-covenant.org/translations diff --git a/.github/CODE_OF_CONDUCT.rst b/.github/CODE_OF_CONDUCT.rst deleted file mode 100644 index 56e8914ce..000000000 --- a/.github/CODE_OF_CONDUCT.rst +++ /dev/null @@ -1,55 +0,0 @@ -Contributor Covenant Code of Conduct -==================================== - -Our Pledge ----------- - -In the interest of fostering an open and welcoming environment, we as contributors and maintainers pledge to make participation in our project and our community a harassment-free experience for everyone, regardless of age, body size, disability, ethnicity, gender identity and expression, level of experience, nationality, personal appearance, race, religion, or sexual identity and orientation. - -Our Standards -------------- - -Examples of behavior that contributes to creating a positive environment include: - -* Using welcoming and inclusive language -* Being respectful of differing viewpoints and experiences -* Gracefully accepting constructive criticism -* Focusing on what is best for the community -* Showing empathy towards other community members - -Examples of unacceptable behavior by participants include: - -* The use of sexualized language or imagery and unwelcome sexual attention or advances -* Trolling, insulting/derogatory comments, and personal or political attacks -* Public or private harassment -* Publishing others' private information, such as a physical or electronic address, without explicit permission -* Other conduct which could reasonably be considered inappropriate in a professional setting - -Our Responsibilities --------------------- - -Project maintainers are responsible for clarifying the standards of acceptable behavior and are expected to take appropriate 
and fair corrective action in response to any instances of unacceptable behavior. - -Project maintainers have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, or to ban temporarily or permanently any contributor for other behaviors that they deem inappropriate, threatening, offensive, or harmful. - -Scope ------ - -This Code of Conduct applies both within project spaces and in public spaces when an individual is representing the project or its community. -Examples of representing a project or community include using an official project e-mail address, posting via an official social media account, or acting as an appointed representative at an online or offline event. -Representation of a project may be further defined and clarified by project maintainers. - -Enforcement ------------ - -Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting the project team at hs@ox.cx. -All complaints will be reviewed and investigated and will result in a response that is deemed necessary and appropriate to the circumstances. -The project team is obligated to maintain confidentiality with regard to the reporter of an incident. -Further details of specific enforcement policies may be posted separately. - -Project maintainers who do not follow or enforce the Code of Conduct in good faith may face temporary or permanent repercussions as determined by other members of the project's leadership. - -Attribution ------------ - -This Code of Conduct is adapted from the `Contributor Covenant `_, version 1.4, available at . diff --git a/.github/CONTRIBUTING.md b/.github/CONTRIBUTING.md new file mode 100644 index 000000000..a9cc8662e --- /dev/null +++ b/.github/CONTRIBUTING.md @@ -0,0 +1,230 @@ +# How To Contribute + +First off, thank you for considering contributing to `attrs`! 
+It's people like *you* who make it such a great tool for everyone. + +This document intends to make contribution more accessible by codifying tribal knowledge and expectations. +Don't be afraid to open half-finished PRs, and ask questions if something is unclear! + +Please note that this project is released with a Contributor [Code of Conduct](https://github.com/python-attrs/attrs/blob/main/.github/CODE_OF_CONDUCT.md). +By participating in this project you agree to abide by its terms. +Please report any harm to [Hynek Schlawack] in any way you find appropriate. + + +## Support + +In case you'd like to help out but don't want to deal with GitHub, there's a great opportunity: +help your fellow developers on [Stack Overflow](https://stackoverflow.com/questions/tagged/python-attrs)! + +The official tag is `python-attrs` and helping out in support frees us up to improve `attrs` instead! + + +## Workflow + +- No contribution is too small! + Please submit as many fixes for typos and grammar bloopers as you can! +- Try to limit each pull request to *one* change only. +- Since we squash on merge, it's up to you how you handle updates to the main branch. + Whether you prefer to rebase on main or merge main into your branch, do whatever is more comfortable for you. +- *Always* add tests and docs for your code. + This is a hard rule; patches with missing tests or documentation can't be merged. +- Make sure your changes pass our [CI]. + You won't get any feedback until it's green unless you ask for it. +- For the CI to pass, the coverage must be 100%. + If you have trouble testing something, open the PR anyway and ask for advice. + In some situations, we may agree to add an `# pragma: no cover`. +- Once you've addressed review feedback, make sure to bump the pull request with a short note, so we know you're done. +- Don’t break backwards-compatibility. + + +## Local Development Environment + +You can (and should) run our test suite using [*tox*].
+However, you’ll probably want a more traditional environment as well. +We highly recommend developing with the latest Python release because we try to take advantage of modern features whenever possible. + +First create a [virtual environment](https://virtualenv.pypa.io/) so you don't break your system-wide Python installation. +It’s out of scope for this document to list all the ways to manage virtual environments in Python, but if you don’t already have a pet way, take some time to look at tools like [*direnv*](https://hynek.me/til/python-project-local-venvs/), [*virtualfish*](https://virtualfish.readthedocs.io/), and [*virtualenvwrapper*](https://virtualenvwrapper.readthedocs.io/). + +Next, get an up-to-date checkout of the `attrs` repository: + +```console +$ git clone git@github.com:python-attrs/attrs.git +``` + +or if you prefer to use git via `https`: + +```console +$ git clone https://github.com/python-attrs/attrs.git +``` + +Change into the newly created directory and **after activating your virtual environment** install an editable version of `attrs` along with its tests and docs requirements: + +```console +$ cd attrs +$ python -m pip install --upgrade pip setuptools # PLEASE don't skip this step +$ python -m pip install -e '.[dev]' +``` + +At this point, + +```console +$ python -m pytest +``` + +should work and pass, as should: + +```console +$ cd docs +$ make html +``` + +The built documentation can then be found in `docs/_build/html/`. + +To avoid committing code that violates our style guide, we strongly advise you to install [*pre-commit*] [^dev] hooks: + +```console +$ pre-commit install +``` + +You can also run them anytime (as our *tox* does) using: + +```console +$ pre-commit run --all-files +``` + +[^dev]: *pre-commit* should have been installed into your virtualenv automatically when you ran `pip install -e '.[dev]'` above. + If *pre-commit* is missing, you probably need to run `pip install -e '.[dev]'` again.
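If you want a quick sanity check that the editable install works before running the full suite, a minimal sketch like the following (using only the public `attr` API; the `Point` class is just a throwaway example) should pass inside your activated virtual environment:

```python
import attr


@attr.s
class Point:
    """A tiny throwaway class to verify the editable install."""

    x = attr.ib()
    y = attr.ib()


# attr.s generates __init__, __repr__, and __eq__ for us.
p = Point(1, 2)
assert p == Point(1, 2)
assert attr.asdict(p) == {"x": 1, "y": 2}
```

If the import fails, the editable install probably didn't take; re-run `python -m pip install -e '.[dev]'` in the activated virtual environment.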
+ + +## Code + +- Obey [PEP 8](https://peps.python.org/pep-0008/) and [PEP 257](https://peps.python.org/pep-0257/). + We use the `"""`-on-separate-lines style for docstrings: + + ```python + def func(x): + """ + Do something. + + :param str x: A very important parameter. + + :rtype: str + """ + ``` +- If you add or change public APIs, tag the docstring using `.. versionadded:: 16.0.0 WHAT` or `.. versionchanged:: 16.2.0 WHAT`. +- We use [*isort*](https://github.com/PyCQA/isort) to sort our imports, and we use [*Black*](https://github.com/psf/black) with a line length of 79 characters to format our code. + As long as you run our full [*tox*] suite before committing, or install our [*pre-commit*] hooks (ideally you'll do both – see [*Local Development Environment*](#local-development-environment) above), you won't have to spend any time on formatting your code at all. + If you don't, [CI] will catch it for you – but that seems like a waste of your time! + + +## Tests + +- Write your asserts as `expected == actual` to line them up nicely: + + ```python + x = f() + + assert 42 == x.some_attribute + assert "foo" == x._a_private_attribute + ``` + +- To run the test suite, all you need is a recent [*tox*]. + It will ensure the test suite runs with all dependencies against all Python versions just as it will in our [CI]. + If you lack some Python versions, you can always limit the environments like `tox -e py38,py39`, or make it a non-failure using `tox --skip-missing-interpreters`. + + In that case you should look into [*asdf*](https://asdf-vm.com) or [*pyenv*](https://github.com/pyenv/pyenv), which make it very easy to install many different Python versions in parallel. +- Write [good test docstrings](https://jml.io/pages/test-docstrings.html). +- To ensure new features work well with the rest of the system, they should also be added to our [*Hypothesis*](https://hypothesis.readthedocs.io/) testing strategy, which can be found in `tests/strategies.py`.
+- If you've changed or added public APIs, please update our type stubs (files ending in `.pyi`). + + +## Documentation + +- Use [semantic newlines] in [*reStructuredText*] and [*Markdown*](https://docs.github.com/en/get-started/writing-on-github/getting-started-with-writing-and-formatting-on-github/basic-writing-and-formatting-syntax) files (files ending in `.rst` and `.md`): + + ```rst + This is a sentence. + This is another sentence. + ``` + +- If you start a new section, add two blank lines before and one blank line after the header, except if two headers follow immediately after each other: + + ```rst + Last line of previous section. + + + Header of New Top Section + ------------------------- + + Header of New Section + ^^^^^^^^^^^^^^^^^^^^^ + + First line of new section. + ``` + +- If you add a new feature, demonstrate its awesomeness on the [examples page](https://github.com/python-attrs/attrs/blob/main/docs/examples.rst)! + + +### Changelog + +If your change is noteworthy, there needs to be a changelog entry so our users can learn about it! + +To avoid merge conflicts, we use the [*towncrier*](https://pypi.org/project/towncrier) package to manage our changelog. +*towncrier* uses independent files for each pull request – so-called *news fragments* – instead of one monolithic changelog file. +On release, those news fragments are compiled into our [`CHANGELOG.rst`](https://github.com/python-attrs/attrs/blob/main/CHANGELOG.rst). + +You don't need to install *towncrier* yourself; you just have to abide by a few simple rules: + +- For each pull request, add a new file into `changelog.d` with a filename adhering to the `pr#.(change|deprecation|breaking).rst` schema: + For example, `changelog.d/42.change.rst` for a non-breaking change that is proposed in pull request #42. +- As with other docs, please use [semantic newlines] within news fragments. +- Wrap symbols like modules, functions, or classes into double backticks so they are rendered in a `monospace font`.
+- Wrap arguments into asterisks like in docstrings: + `Added new argument *an_argument*.` +- If you mention functions or other callables, add parentheses at the end of their names: + `attrs.func()` or `attrs.Class.method()`. + This makes the changelog a lot more readable. +- Prefer simple past tense or constructions with "now". + For example: + + + Added `attrs.validators.func()`. + + `attrs.func()` now doesn't crash the Large Hadron Collider anymore when passed the *foobar* argument. +- If you want to reference multiple issues, copy the news fragment to another filename. + *towncrier* will merge all news fragments with identical contents into one entry with multiple links to the respective pull requests. + +Example entries: + + ```rst + Added ``attrs.validators.func()``. + The feature really *is* awesome. + ``` + +or: + + ```rst + ``attrs.func()`` now doesn't crash the Large Hadron Collider anymore when passed the *foobar* argument. + The bug really *was* nasty. + ``` + +--- + +``tox -e changelog`` will render the current changelog to the terminal if you have any doubts. + + +## Governance + +`attrs` is maintained by a [team of volunteers](https://github.com/python-attrs) that is always open to new members who share our vision of a fast, lean, and magic-free library that empowers programmers to write better code with less effort. +If you'd like to join, just get a pull request merged and ask to be added in the very same pull request! + +**The simple rule is that everyone is welcome to review/merge pull requests of others but nobody is allowed to merge their own code.** + +[Hynek Schlawack] acts reluctantly as the [BDFL](https://en.wikipedia.org/wiki/Benevolent_dictator_for_life) and has the final say over design decisions.
+ + +[CI]: https://github.com/python-attrs/attrs/actions?query=workflow%3ACI +[Hynek Schlawack]: https://hynek.me/about/ +[*pre-commit*]: https://pre-commit.com/ +[*tox*]: https://tox.wiki/ +[semantic newlines]: https://rhodesmill.org/brandon/2012/one-sentence-per-line/ +[*reStructuredText*]: https://www.sphinx-doc.org/en/stable/usage/restructuredtext/basics.html diff --git a/.github/CONTRIBUTING.rst b/.github/CONTRIBUTING.rst deleted file mode 100644 index 231e7a7f1..000000000 --- a/.github/CONTRIBUTING.rst +++ /dev/null @@ -1,250 +0,0 @@ -How To Contribute -================= - -First off, thank you for considering contributing to ``attrs``! -It's people like *you* who make it such a great tool for everyone. - -This document intends to make contribution more accessible by codifying tribal knowledge and expectations. -Don't be afraid to open half-finished PRs, and ask questions if something is unclear! - - -Support -------- - -In case you'd like to help out but don't want to deal with GitHub, there's a great opportunity: -help your fellow developers on `StackOverflow `_! - -The official tag is ``python-attrs`` and helping out in support frees us up to improve ``attrs`` instead! - - -Workflow --------- - -- No contribution is too small! - Please submit as many fixes for typos and grammar bloopers as you can! -- Try to limit each pull request to *one* change only. -- Since we squash on merge, it's up to you how you handle updates to the master branch. - Whether you prefer to rebase on master or merge master into your branch, do whatever is more comfortable for you. -- *Always* add tests and docs for your code. - This is a hard rule; patches with missing tests or documentation can't be merged. -- Make sure your changes pass our CI_. - You won't get any feedback until it's green unless you ask for it. -- Once you've addressed review feedback, make sure to bump the pull request with a short note, so we know you're done. -- Don’t break `backward compatibility`_. 
- - -Code ----- - -- Obey `PEP 8`_ and `PEP 257`_. - We use the ``"""``\ -on-separate-lines style for docstrings: - - .. code-block:: python - - def func(x): - """ - Do something. - - :param str x: A very important parameter. - - :rtype: str - """ -- If you add or change public APIs, tag the docstring using ``.. versionadded:: 16.0.0 WHAT`` or ``.. versionchanged:: 16.2.0 WHAT``. -- We use isort_ to sort our imports, and we follow the Black_ code style with a line length of 79 characters. - As long as you run our full tox suite before committing, or install our pre-commit_ hooks (ideally you'll do both -- see below "Local Development Environment"), you won't have to spend any time on formatting your code at all. - If you don't, CI will catch it for you -- but that seems like a waste of your time! - - -Tests ------ - -- Write your asserts as ``expected == actual`` to line them up nicely: - - .. code-block:: python - - x = f() - - assert 42 == x.some_attribute - assert "foo" == x._a_private_attribute - -- To run the test suite, all you need is a recent tox_. - It will ensure the test suite runs with all dependencies against all Python versions just as it will in our CI. - If you lack some Python versions, you can can always limit the environments like ``tox -e py27,py35`` (in that case you may want to look into pyenv_, which makes it very easy to install many different Python versions in parallel). -- Write `good test docstrings`_. -- To ensure new features work well with the rest of the system, they should be also added to our `Hypothesis`_ testing strategy, which is found in ``tests/strategies.py``. -- If you've changed or added public APIs, please update our type stubs (files ending in ``.pyi``). - - -Documentation -------------- - -- Use `semantic newlines`_ in reStructuredText_ files (files ending in ``.rst``): - - .. code-block:: rst - - This is a sentence. - This is another sentence. 
- -- If you start a new section, add two blank lines before and one blank line after the header, except if two headers follow immediately after each other: - - .. code-block:: rst - - Last line of previous section. - - - Header of New Top Section - ------------------------- - - Header of New Section - ^^^^^^^^^^^^^^^^^^^^^ - - First line of new section. - -- If you add a new feature, demonstrate its awesomeness on the `examples page`_! - - -Changelog -^^^^^^^^^ - -If your change is noteworthy, there needs to be a changelog entry so our users can learn about it! - -To avoid merge conflicts, we use the towncrier_ package to manage our changelog. -``towncrier`` uses independent files for each pull request -- so called *news fragments* -- instead of one monolithic changelog file. -On release, those news fragments are compiled into our ``CHANGELOG.rst``. - -You don't need to install ``towncrier`` yourself, you just have to abide by a few simple rules: - -- For each pull request, add a new file into ``changelog.d`` with a filename adhering to the ``pr#.(change|deprecation|breaking).rst`` schema: - For example, ``changelog.d/42.change.rst`` for a non-breaking change that is proposed in pull request #42. -- As with other docs, please use `semantic newlines`_ within news fragments. -- Wrap symbols like modules, functions, or classes into double backticks so they are rendered in a ``monospace font``. -- Wrap arguments into asterisks like in docstrings: *these* or *attributes*. -- If you mention functions or other callables, add parentheses at the end of their names: ``attr.func()`` or ``attr.Class.method()``. - This makes the changelog a lot more readable. -- Prefer simple past tense or constructions with "now". - For example: - - + Added ``attr.validators.func()``. - + ``attr.func()`` now doesn't crash the Large Hadron Collider anymore when passed the *foobar* argument. -- If you want to reference multiple issues, copy the news fragment to another filename. 
- ``towncrier`` will merge all news fragments with identical contents into one entry with multiple links to the respective pull requests. - -Example entries: - - .. code-block:: rst - - Added ``attr.validators.func()``. - The feature really *is* awesome. - -or: - - .. code-block:: rst - - ``attr.func()`` now doesn't crash the Large Hadron Collider anymore when passed the *foobar* argument. - The bug really *was* nasty. - ----- - -``tox -e changelog`` will render the current changelog to the terminal if you have any doubts. - - -Local Development Environment ------------------------------ - -You can (and should) run our test suite using tox_. -However, you’ll probably want a more traditional environment as well. -We highly recommend to develop using the latest Python 3 release because ``attrs`` tries to take advantage of modern features whenever possible. - -First create a `virtual environment `_. -It’s out of scope for this document to list all the ways to manage virtual environments in Python, but if you don’t already have a pet way, take some time to look at tools like `pew `_, `virtualfish `_, and `virtualenvwrapper `_. - -Next, get an up to date checkout of the ``attrs`` repository: - -.. code-block:: bash - - $ git clone git@github.com:python-attrs/attrs.git - -or if you want to use git via ``https``: - -.. code-block:: bash - - $ git clone https://github.com/python-attrs/attrs.git - -Change into the newly created directory and **after activating your virtual environment** install an editable version of ``attrs`` along with its tests and docs requirements: - -.. code-block:: bash - - $ cd attrs - $ pip install -e '.[dev]' - -At this point, - -.. code-block:: bash - - $ python -m pytest - -should work and pass, as should: - -.. code-block:: bash - - $ cd docs - $ make html - -The built documentation can then be found in ``docs/_build/html/``. 
- -To avoid committing code that violates our style guide, we strongly advise you to install pre-commit_ [#f1]_ hooks: - -.. code-block:: bash - - $ pre-commit install - -You can also run them anytime (as our tox does) using: - -.. code-block:: bash - - $ pre-commit run --all-files - - -.. [#f1] pre-commit should have been installed into your virtualenv automatically when you ran ``pip install -e '.[dev]'`` above. If pre-commit is missing, it may be that you need to re-run ``pip install -e '.[dev]'``. - - -Governance ----------- - -``attrs`` is maintained by `team of volunteers`_ that is always open to new members that share our vision of a fast, lean, and magic-free library that empowers programmers to write better code with less effort. -If you'd like to join, just get a pull request merged and ask to be added in the very same pull request! - -**The simple rule is that everyone is welcome to review/merge pull requests of others but nobody is allowed to merge their own code.** - -`Hynek Schlawack`_ acts reluctantly as the BDFL_ and has the final say over design decisions. - - -**** - -Please note that this project is released with a Contributor `Code of Conduct`_. -By participating in this project you agree to abide by its terms. -Please report any harm to `Hynek Schlawack`_ in any way you find appropriate. - -Thank you for considering contributing to ``attrs``! - - -.. _`Hynek Schlawack`: https://hynek.me/about/ -.. _`PEP 8`: https://www.python.org/dev/peps/pep-0008/ -.. _`PEP 257`: https://www.python.org/dev/peps/pep-0257/ -.. _`good test docstrings`: https://jml.io/pages/test-docstrings.html -.. _`Code of Conduct`: https://github.com/python-attrs/attrs/blob/master/.github/CODE_OF_CONDUCT.rst -.. _changelog: https://github.com/python-attrs/attrs/blob/master/CHANGELOG.rst -.. _`backward compatibility`: https://www.attrs.org/en/latest/backward-compatibility.html -.. _tox: https://tox.readthedocs.io/ -.. _pyenv: https://github.com/pyenv/pyenv -.. 
_reStructuredText: https://www.sphinx-doc.org/en/master/usage/restructuredtext/basics.html -.. _semantic newlines: https://rhodesmill.org/brandon/2012/one-sentence-per-line/ -.. _examples page: https://github.com/python-attrs/attrs/blob/master/docs/examples.rst -.. _Hypothesis: https://hypothesis.readthedocs.io/ -.. _CI: https://github.com/python-attrs/attrs/actions?query=workflow%3ACI -.. _`team of volunteers`: https://github.com/python-attrs -.. _BDFL: https://en.wikipedia.org/wiki/Benevolent_dictator_for_life -.. _towncrier: https://pypi.org/project/towncrier -.. _black: https://github.com/psf/black -.. _pre-commit: https://pre-commit.com/ -.. _isort: https://github.com/timothycrosley/isort diff --git a/.github/PULL_REQUEST_TEMPLATE.md b/.github/PULL_REQUEST_TEMPLATE.md index 349590e1b..4133e06b1 100644 --- a/.github/PULL_REQUEST_TEMPLATE.md +++ b/.github/PULL_REQUEST_TEMPLATE.md @@ -1,18 +1,35 @@ +# Summary + + + + # Pull Request Check List -This is just a friendly reminder about the most common mistakes. Please make sure that you tick all boxes. But please read our [contribution guide](https://www.attrs.org/en/latest/contributing.html) at least once, it will save you unnecessary review cycles! + - [ ] Added **tests** for changed code. -- [ ] New features have been added to our [Hypothesis testing strategy](https://github.com/python-attrs/attrs/blob/master/tests/strategies.py). + Our CI fails if coverage is not 100%. +- [ ] New features have been added to our [Hypothesis testing strategy](https://github.com/python-attrs/attrs/blob/main/tests/strategies.py). - [ ] Changes or additions to public APIs are reflected in our type stubs (files ending in ``.pyi``). - [ ] ...and used in the stub test file `tests/typing_example.py`. + - [ ] If they've been added to `attr/__init__.pyi`, they've *also* been re-imported in `attrs/__init__.pyi`. - [ ] Updated **documentation** for changed code. - [ ] New functions/classes have to be added to `docs/api.rst` by hand. 
- [ ] Changes to the signature of `@attr.s()` have to be added by hand too. - - [ ] Changed/added classes/methods/functions have appropriate `versionadded`, `versionchanged`, or `deprecated` [directives](http://www.sphinx-doc.org/en/stable/markup/para.html#directive-versionadded). Find the appropriate next version in our [``__init__.py``](https://github.com/python-attrs/attrs/blob/master/src/attr/__init__.py) file. + - [ ] Changed/added classes/methods/functions have appropriate `versionadded`, `versionchanged`, or `deprecated` [directives](http://www.sphinx-doc.org/en/stable/markup/para.html#directive-versionadded). + Find the appropriate next version in our [``__init__.py``](https://github.com/python-attrs/attrs/blob/main/src/attr/__init__.py) file. - [ ] Documentation in `.rst` files is written using [semantic newlines](https://rhodesmill.org/brandon/2012/one-sentence-per-line/). -- [ ] Changes (and possible deprecations) have news fragments in [`changelog.d`](https://github.com/python-attrs/attrs/blob/master/changelog.d). +- [ ] Changes (and possible deprecations) have news fragments in [`changelog.d`](https://github.com/python-attrs/attrs/blob/main/changelog.d). +- [ ] Consider granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork), so maintainers can fix minor issues themselves without pestering you. -If you have *any* questions to *any* of the points above, just **submit and ask**! This checklist is here to *help* you, not to deter you from contributing! + diff --git a/.github/SECURITY.md b/.github/SECURITY.md new file mode 100644 index 000000000..e34c45d47 --- /dev/null +++ b/.github/SECURITY.md @@ -0,0 +1,12 @@ +# Security Policy + +## Supported Versions + +We are following [CalVer](https://calver.org) with generous backwards-compatibility guarantees. +Therefore we only support the latest version. 
+ + +## Reporting a Vulnerability + +To report a security vulnerability, please use the [Tidelift security contact](https://tidelift.com/security). +Tidelift will coordinate the fix and disclosure. diff --git a/.github/SECURITY.yml b/.github/SECURITY.yml deleted file mode 100644 index 5e565ec19..000000000 --- a/.github/SECURITY.yml +++ /dev/null @@ -1,2 +0,0 @@ -To report a security vulnerability, please use the [Tidelift security contact](https://tidelift.com/security). -Tidelift will coordinate the fix and disclosure. diff --git a/.github/workflows/main.yml b/.github/workflows/main.yml index 9d2536472..a4755b1c8 100644 --- a/.github/workflows/main.yml +++ b/.github/workflows/main.yml @@ -3,90 +3,112 @@ name: CI on: push: - branches: ["master"] + branches: ["main"] + tags: ["*"] pull_request: - branches: ["master"] + branches: ["main"] workflow_dispatch: +env: + FORCE_COLOR: "1" # Make tools pretty. + TOX_TESTENV_PASSENV: FORCE_COLOR + PYTHON_LATEST: "3.10" + + jobs: tests: - name: "Python ${{ matrix.python-version }}" - runs-on: "ubuntu-latest" - env: - USING_COVERAGE: "2.7,3.7,3.8" + name: tox on ${{ matrix.python-version }} + runs-on: ubuntu-latest strategy: + fail-fast: false matrix: - python-version: ["2.7", "3.5", "3.6", "3.7", "3.8", "3.9", "pypy2", "pypy3"] + python-version: ["3.5", "3.6", "3.7", "3.8", "3.9", "3.10", "3.11.0-beta - 3.11", "pypy-3.7", "pypy-3.8"] steps: - - uses: "actions/checkout@v2" - - uses: "actions/setup-python@v2" + - uses: actions/checkout@v3 + - uses: actions/setup-python@v3 with: - python-version: "${{ matrix.python-version }}" + python-version: ${{ matrix.python-version }} + - name: "Install dependencies" run: | - set -xe python -VV python -m site python -m pip install --upgrade pip setuptools wheel - python -m pip install --upgrade coverage[toml] virtualenv tox tox-gh-actions + python -m pip install --upgrade virtualenv tox tox-gh-actions + + - run: "python -m tox" + + - name: Upload coverage data + uses: 
actions/upload-artifact@v3 + with: + name: coverage-data + path: ".coverage.*" + if-no-files-found: ignore - - name: "Run tox targets for ${{ matrix.python-version }}" - run: "python -m tox" - # We always use a modern Python version for combining coverage to prevent - # parsing errors in older versions for modern code. - - uses: "actions/setup-python@v2" + coverage: + runs-on: ubuntu-latest + needs: tests + + steps: + - uses: actions/checkout@v3 + - uses: actions/setup-python@v3 with: - python-version: "3.9" + # Use latest Python, so it understands all syntax. + python-version: ${{env.PYTHON_LATEST}} - - name: "Combine coverage" + - run: python -m pip install --upgrade coverage[toml] + + - name: Download coverage data + uses: actions/download-artifact@v3 + with: + name: coverage-data + + - name: Combine coverage and fail if it's <100%. run: | - set -xe - python -m pip install coverage[toml] python -m coverage combine - python -m coverage xml - if: "contains(env.USING_COVERAGE, matrix.python-version)" - - name: "Upload coverage to Codecov" - if: "contains(env.USING_COVERAGE, matrix.python-version)" - uses: "codecov/codecov-action@v1" + python -m coverage html --skip-covered --skip-empty + python -m coverage report --fail-under=100 + + - name: Upload HTML report if check failed. + uses: actions/upload-artifact@v3 with: - fail_ci_if_error: true + name: html-report + path: htmlcov + if: ${{ failure() }} + package: - name: "Build & verify package" - runs-on: "ubuntu-latest" + name: Build & verify package + runs-on: ubuntu-latest steps: - - uses: "actions/checkout@v2" - - uses: "actions/setup-python@v2" + - uses: actions/checkout@v3 + - uses: actions/setup-python@v3 with: - python-version: "3.9" + python-version: ${{env.PYTHON_LATEST}} + + - run: python -m pip install build twine check-wheel-contents + - run: python -m build --sdist --wheel . 
+ - run: ls -l dist + - run: check-wheel-contents dist/*.whl + - name: Check long_description + run: python -m twine check dist/* - - name: "Install pep517 and twine" - run: "python -m pip install pep517 twine" - - name: "Build package" - run: "python -m pep517.build --source --binary ." - - name: "List result" - run: "ls -l dist" - - name: "Check long_description" - run: "python -m twine check dist/*" install-dev: + name: Verify dev env + runs-on: ${{ matrix.os }} strategy: matrix: - os: ["ubuntu-latest", "windows-latest", "macos-latest"] - - name: "Verify dev env" - runs-on: "${{ matrix.os }}" + os: ["ubuntu-latest", "windows-latest"] steps: - - uses: "actions/checkout@v2" - - uses: "actions/setup-python@v2" + - uses: actions/checkout@v3 + - uses: actions/setup-python@v3 with: - python-version: "3.9" - - name: "Install in dev mode" - run: "python -m pip install -e .[dev]" - - name: "Import package" - run: "python -c 'import attr; print(attr.__version__)'" + python-version: ${{env.PYTHON_LATEST}} + - run: python -m pip install -e .[dev] + - run: python -c 'import attr; print(attr.__version__)' diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml index 8aab109e8..439cfd55c 100644 --- a/.pre-commit-config.yaml +++ b/.pre-commit-config.yaml @@ -1,34 +1,48 @@ --- +ci: + autoupdate_schedule: monthly + +default_language_version: + python: python3.10 # needed for match + repos: - repo: https://github.com/psf/black - rev: 20.8b1 + rev: 22.6.0 hooks: - id: black - language_version: python3.8 - - repo: https://github.com/pre-commit/mirrors-isort - rev: v5.6.4 + - repo: https://github.com/asottile/pyupgrade + rev: v2.37.3 + hooks: + - id: pyupgrade + args: [--py3-plus, --keep-percent-format] + exclude: "tests/test_slots.py" + + - repo: https://github.com/PyCQA/isort + rev: 5.10.1 hooks: - id: isort additional_dependencies: [toml] + files: \.py$ - - repo: https://gitlab.com/pycqa/flake8 - rev: 3.8.4 + - repo: https://github.com/PyCQA/flake8 + rev: 4.0.1 hooks: - id: 
flake8 - language_version: python3.8 + language_version: python3.10 - repo: https://github.com/econchick/interrogate - rev: 1.3.1 + rev: 1.5.0 hooks: - id: interrogate args: [tests] - language_version: python3.8 + language_version: python3.10 - repo: https://github.com/pre-commit/pre-commit-hooks - rev: v3.3.0 + rev: v4.3.0 hooks: - id: trailing-whitespace - id: end-of-file-fixer - id: debug-statements - id: check-toml + - id: check-yaml diff --git a/.readthedocs.yml b/.readthedocs.yml index 511ae165f..d335c40d5 100644 --- a/.readthedocs.yml +++ b/.readthedocs.yml @@ -1,9 +1,14 @@ --- version: 2 -python: - # Keep version in sync with tox.ini (docs and gh-actions). - version: 3.7 +formats: all + +build: + os: ubuntu-20.04 + tools: + # Keep version in sync with tox.ini (docs and gh-actions). + python: "3.10" +python: install: - method: pip path: . diff --git a/AUTHORS.rst b/AUTHORS.rst index f14ef6c60..aa677e81d 100644 --- a/AUTHORS.rst +++ b/AUTHORS.rst @@ -8,4 +8,4 @@ The development is kindly supported by `Variomedia AG `_. It’s the spiritual successor of `characteristic `_ and aspires to fix some of it clunkiness and unfortunate decisions. -Both were inspired by Twisted’s `FancyEqMixin `_ but both are implemented using class decorators because `subclassing is bad for you `_, m’kay? +Both were inspired by Twisted’s `FancyEqMixin `_ but both are implemented using class decorators because `subclassing is bad for you `_, m’kay? diff --git a/CHANGELOG.rst b/CHANGELOG.rst index 11dd81a00..61ca55997 100644 --- a/CHANGELOG.rst +++ b/CHANGELOG.rst @@ -1,11 +1,248 @@ Changelog ========= -Versions follow `CalVer `_ with a strict backwards compatibility policy. -The third digit is only for regressions. +Versions follow `CalVer `_ with a strict backwards-compatibility policy. + +The **first number** of the version is the year. +The **second number** is incremented with each release, starting at 1 for each year. 
+The **third number** is when we need to start branches for older releases (only for emergencies).
+
+Put simply, you shouldn't ever be afraid to upgrade ``attrs`` if you're only using its public APIs.
+Whenever there is a need to break compatibility, it is announced here in the changelog and raises a ``DeprecationWarning`` for a year (if possible) before it's finally really broken.
+
+.. warning::
+
+   The structure of the `attrs.Attribute` class is exempt from this rule.
+   It *will* change in the future, but since it should be considered read-only, that shouldn't matter.
+
+   However, if you intend to build extensions on top of ``attrs``, you have to anticipate that.
 
 .. towncrier release notes start
 
+22.1.0 (2022-07-28)
+-------------------
+
+Backwards-incompatible Changes
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+- Python 2.7 is not supported anymore.
+
+  Dealing with Python 2.7 tooling has become too difficult for a volunteer-run project.
+
+  We have supported Python 2 more than 2 years after it was officially discontinued and feel that we have paid our dues.
+  All versions up to 21.4.0 from December 2021 remain fully functional, of course.
+  `#936 `_
+- The deprecated ``cmp`` attribute of ``attrs.Attribute`` has been removed.
+  This does not affect the *cmp* argument to ``attr.s`` that can be used as a shortcut to set *eq* and *order* at the same time.
+  `#939 `_
+
+
+Changes
+^^^^^^^
+
+- Instantiation of frozen slotted classes is now faster.
+  `#898 `_
+- If an ``eq`` key is defined, it is also used before hashing the attribute.
+  `#909 `_
+- Added ``attrs.validators.min_len()``.
+  `#916 `_
+- ``attrs.validators.deep_iterable()``'s *member_validator* argument now also accepts a list of validators and wraps them in an ``attrs.validators.and_()``.
+  `#925 `_
+- Added missing type stub re-imports for ``attrs.converters`` and ``attrs.filters``.
+  `#931 `_
+- Added missing stub for ``attr(s).cmp_using()``.
+  `#949 `_
+- ``attrs.validators.in_()``'s ``ValueError`` is not missing the attribute, expected options, and the value it got anymore.
+  `#951 `_
+- Python 3.11 is now officially supported.
+  `#969 `_
+
+
+----
+
+
+21.4.0 (2021-12-29)
+-------------------
+
+Changes
+^^^^^^^
+
+- Fixed the test suite on PyPy3.8 where ``cloudpickle`` does not work.
+  `#892 `_
+- Fixed ``coverage report`` for projects that use ``attrs`` and don't set a ``--source``.
+  `#895 `_,
+  `#896 `_
+
+
+----
+
+
+21.3.0 (2021-12-28)
+-------------------
+
+Backward-incompatible Changes
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+- When using ``@define``, converters are now run by default when setting an attribute on an instance -- additionally to validators.
+  I.e. the new default is ``on_setattr=[attrs.setters.convert, attrs.setters.validate]``.
+
+  This is unfortunately a breaking change, but it was an oversight, impossible to raise a ``DeprecationWarning`` about, and it's better to fix it now while the APIs are very fresh with few users.
+  `#835 `_,
+  `#886 `_
+- ``import attrs`` has finally landed!
+  As of this release, you can finally import ``attrs`` using its proper name.
+
+  Not all names from the ``attr`` namespace have been transferred; most notably ``attr.s`` and ``attr.ib`` are missing.
+  See ``attrs.define`` and ``attrs.field`` if you haven't seen our next-generation APIs yet.
+  A more elaborate explanation can be found `On The Core API Names `_.
+
+  This feature is at least for one release **provisional**.
+  We don't *plan* on changing anything, but such a big change is unlikely to go perfectly on the first strike.
+
+  The API docs have been mostly updated, but it will be an ongoing effort to change everything to the new APIs.
+  Please note that we have **not** moved -- or even removed -- anything from ``attr``!
+
+  Please do report any bugs or documentation inconsistencies!
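The validator additions in the 22.1.0 notes above (``min_len()``, and ``deep_iterable()`` wrapping a list of validators in ``and_()``) all follow the usual ``(instance, attribute, value)`` validator shape. A plain-Python sketch of that behavior — these functions are illustrative stand-ins, not attrs's actual implementation:

```python
# Sketch of attrs-style validators; raises ValueError on rejection,
# returns None (silently) on success.
def min_len(length):
    """Return a validator that rejects values shorter than *length*."""
    def validate(instance, attribute, value):
        if len(value) < length:
            raise ValueError(
                f"Length of {attribute!r} must be >= {length}: {len(value)}"
            )
    return validate


def and_(*validators):
    """Run several validators in order; every one of them must pass."""
    def validate(instance, attribute, value):
        for v in validators:
            v(instance, attribute, value)
    return validate
```

Passing a list of member validators to ``deep_iterable()`` then simply amounts to wrapping them in one combined validator like ``and_()`` before applying it to each member.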
+  `#887 `_
+
+
+Changes
+^^^^^^^
+
+- ``attr.asdict(retain_collection_types=False)`` (default) dumps collection-esque keys as tuples.
+  `#646 `_,
+  `#888 `_
+- ``__match_args__`` are now generated to support Python 3.10's
+  `Structural Pattern Matching `_.
+  This can be controlled by the ``match_args`` argument to the class decorators on Python 3.10 and later.
+  On older versions, it is never added and the argument is ignored.
+  `#815 `_
+- If the class-level *on_setattr* is set to ``attrs.setters.validate`` (default in ``@define`` and ``@mutable``) but no field defines a validator, pretend that it's not set.
+  `#817 `_
+- The generated ``__repr__`` is significantly faster on Pythons with f-strings.
+  `#819 `_
+- Attributes transformed via ``field_transformer`` are wrapped with ``AttrsClass`` again.
+  `#824 `_
+- Generated source code is now cached more efficiently for identical classes.
+  `#828 `_
+- Added ``attrs.converters.to_bool()``.
+  `#830 `_
+- ``attrs.resolve_types()`` now resolves types of subclasses after the parents are resolved.
+  `#842 `_,
+  `#843 `_
+- Added new validators: ``lt(val)`` (< val), ``le(val)`` (≤ val), ``ge(val)`` (≥ val), ``gt(val)`` (> val), and ``max_len(n)``.
+  `#845 `_
+- ``attrs`` classes are now fully compatible with `cloudpickle `_ (no need to disable ``repr`` anymore).
+  `#857 `_
+- Added new context manager ``attrs.validators.disabled()`` and functions ``attrs.validators.(set|get)_disabled()``.
+  They deprecate ``attrs.(set|get)_run_validators()``.
+  All functions are interoperable and modify the same internal state.
+  They are not – and never were – thread-safe, though.
+  `#859 `_
+- ``attrs.validators.matches_re()`` now accepts pre-compiled regular expressions in addition to pattern strings.
+  `#877 `_
+
+
+----
+
+
+21.2.0 (2021-05-07)
+-------------------
+
+Backward-incompatible Changes
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+- We had to revert the recursive feature for ``attr.evolve()`` because it broke some use-cases -- sorry!
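The bound validators introduced in the 21.3.0 notes above (``lt``/``le``/``ge``/``gt``) can be sketched as closures over the stdlib ``operator`` functions — a hypothetical stand-in, not attrs's real code:

```python
import operator


def _bound(op, symbol, bound):
    """Build a comparison validator in the (instance, attribute, value)
    shape used by attrs-style validators (sketch only)."""
    def validate(instance, attribute, value):
        if not op(value, bound):
            raise ValueError(f"{attribute!r} must be {symbol} {bound}: {value}")
    return validate


def lt(val):
    return _bound(operator.lt, "<", val)


def le(val):
    return _bound(operator.le, "<=", val)


def ge(val):
    return _bound(operator.ge, ">=", val)


def gt(val):
    return _bound(operator.gt, ">", val)
```

The shared factory keeps the four validators term-for-term symmetric; only the comparison operator and the symbol in the error message differ.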
+  `#806 `_
+- Python 3.4 is now blocked using packaging metadata because ``attrs`` can't be imported on it anymore.
+  To ensure that 3.4 users can keep installing ``attrs`` easily, we will `yank `_ 21.1.0 from PyPI.
+  This has **no** consequences if you pin ``attrs`` to 21.1.0.
+  `#807 `_
+
+
+----
+
+
+21.1.0 (2021-05-06)
+-------------------
+
+Deprecations
+^^^^^^^^^^^^
+
+- The long-awaited, much-talked-about, little-delivered ``import attrs`` is finally upon us!
+
+  Since the NG APIs have now been proclaimed stable, the **next** release of ``attrs`` will allow you to actually ``import attrs``.
+  We're taking this opportunity to replace some defaults in our APIs that made sense in 2015, but don't in 2021.
+
+  So please, if you have any pet peeves about defaults in ``attrs``'s APIs, *now* is the time to air your grievances in #487!
+  We're not gonna get such a chance for a second time without breaking our backward-compatibility guarantees or long deprecation cycles.
+  Therefore, speak now or forever hold your peace!
+  `#487 `_
+- The *cmp* argument to ``attr.s()`` and ``attr.ib()`` has been **undeprecated**.
+  It will continue to be supported as syntactic sugar to set *eq* and *order* in one go.
+
+  I'm terribly sorry for the hassle around this argument!
+  The reason we're bringing it back is its usefulness regarding customization of equality/ordering.
+
+  The ``cmp`` attribute and argument on ``attr.Attribute`` remains deprecated and will be removed later this year.
+  `#773 `_
+
+
+Changes
+^^^^^^^
+
+- It's now possible to customize the behavior of ``eq`` and ``order`` by passing in a callable.
+  `#435 `_,
+  `#627 `_
+- The instant favorite next-generation APIs are not provisional anymore!
+
+  They are also officially supported by Mypy as of their `0.800 release `_.
+
+  We hope the next release will already contain an (additional) importable package called ``attrs``.
+  `#668 `_,
+  `#786 `_
+- If an attribute defines a converter, the type of its parameter is used as type annotation for its corresponding ``__init__`` parameter.
+
+  If an ``attr.converters.pipe`` is used, the first converter's parameter type is used.
+  `#710 `_
+- Fixed the creation of an extra slot for an ``attr.ib`` when the parent class already has a slot with the same name.
+  `#718 `_
+- ``__attrs_init__()`` will now be injected if ``init=False``, or if ``auto_detect=True`` and a user-defined ``__init__()`` exists.
+
+  This enables users to do "pre-init" work in their ``__init__()`` (such as ``super().__init__()``).
+
+  ``__init__()`` can then delegate constructor argument processing to ``self.__attrs_init__(*args, **kwargs)``.
+  `#731 `_
+- ``bool(attr.NOTHING)`` is now ``False``.
+  `#732 `_
+- It's now possible to use ``super()`` inside of properties of slotted classes.
+  `#747 `_
+- Allow for a ``__attrs_pre_init__()`` method that -- if defined -- will get called at the beginning of the ``attrs``-generated ``__init__()`` method.
+  `#750 `_
+- Added forgotten ``attr.Attribute.evolve()`` to type stubs.
+  `#752 `_
+- ``attr.evolve()`` now works recursively with nested ``attrs`` classes.
+  `#759 `_
+- Python 3.10 is now officially supported.
+  `#763 `_
+- ``attr.resolve_types()`` now takes an optional *attrib* argument to work inside a ``field_transformer``.
+  `#774 `_
+- ``ClassVar``\ s are now also detected if they come from `typing-extensions `_.
+  `#782 `_
+- To make it easier to customize attribute comparison (#435), we have added the ``attr.cmp_using()`` helper.
+
+  See the `new docs on comparison `_ for more details.
+  `#787 `_
+- Added **provisional** support for static typing in ``pyright`` via the `dataclass_transforms specification `_.
+  Both the ``pyright`` specification and ``attrs`` implementation may change in future versions of both projects.
+
+  Your constructive feedback is welcome in both `attrs#795 `_ and `pyright#1782 `_.
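The ``__attrs_init__()`` delegation described in the 21.1.0 notes above has this shape; ``Point`` is a hand-written mock of the pattern the injected method enables, not actual attrs-generated output:

```python
class Point:
    # Stand-in for the initializer attrs injects as __attrs_init__
    # when init=False (or when auto_detect finds a user __init__).
    def __attrs_init__(self, x, y):
        self.x = x
        self.y = y

    def __init__(self, *args, **kwargs):
        # "Pre-init" work runs first (e.g. super().__init__()) ...
        self.log = ["pre-init"]
        # ... then constructor argument processing is delegated.
        self.__attrs_init__(*args, **kwargs)
```

The user keeps full control of ``__init__()`` while attrs still handles defaults, converters, and validators inside the delegated call.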
+ `#796 `_ + + +---- + + 20.3.0 (2020-11-05) ------------------- @@ -14,7 +251,7 @@ Backward-incompatible Changes - ``attr.define()``, ``attr.frozen()``, ``attr.mutable()``, and ``attr.field()`` remain **provisional**. - This release does **not** change change anything about them and they are already used widely in production though. + This release does **not** change anything about them and they are already used widely in production though. If you wish to use them together with mypy, you can simply drop `this plugin `_ into your project. @@ -115,7 +352,7 @@ Deprecations Please check out the linked issue for more details. These new APIs have been added *provisionally* as part of #666 so you can try them out today and provide feedback. - Learn more in the `API docs `_. + Learn more in the `API docs `_. `#408 `_ @@ -270,7 +507,7 @@ Changes That callable must return a string and is then used for formatting the attribute by the generated ``__repr__()`` method. `#568 `_ - Added ``attr.__version_info__`` that can be used to reliably check the version of ``attrs`` and write forward- and backward-compatible code. - Please check out the `section on deprecated APIs `_ on how to use it. + Please check out the `section on deprecated APIs `_ on how to use it. `#580 `_ .. _`#425`: https://github.com/python-attrs/attrs/issues/425 @@ -338,7 +575,7 @@ Deprecations Changes ^^^^^^^ -- ``attrs`` now ships its own `PEP 484 `_ type hints. +- ``attrs`` now ships its own `PEP 484 `_ type hints. Together with `mypy `_'s ``attrs`` plugin, you've got all you need for writing statically typed code in both Python 2 and 3! At that occasion, we've also added `narrative docs `_ about type annotations in ``attrs``. @@ -392,7 +629,7 @@ Changes `#349 `_ - The order of attributes that are passed into ``attr.make_class()`` or the *these* argument of ``@attr.s()`` is now retained if the dictionary is ordered (i.e. ``dict`` on Python 3.6 and later, ``collections.OrderedDict`` otherwise). 
- Before, the order was always determined by the order in which the attributes have been defined which may not be desirable when creating classes programatically. + Before, the order was always determined by the order in which the attributes have been defined which may not be desirable when creating classes programmatically. `#300 `_, `#339 `_, @@ -404,7 +641,7 @@ Changes - Setting the cell type is now completely best effort. This fixes ``attrs`` on Jython. - We cannot make any guarantees regarding Jython though, because our test suite cannot run due to dependency incompatabilities. + We cannot make any guarantees regarding Jython though, because our test suite cannot run due to dependency incompatibilities. `#321 `_, `#334 `_ @@ -544,7 +781,7 @@ Changes This change paves the way for automatic type checking and serialization (though as of this release ``attrs`` does not make use of it). In Python 3.6 or higher, the value of ``attr.Attribute.type`` can alternately be set using variable type annotations - (see `PEP 526 `_). + (see `PEP 526 `_). (`#151 `_, `#214 `_, `#215 `_, `#239 `_) - The combination of ``str=True`` and ``slots=True`` now works on Python 2. (`#198 `_) @@ -644,7 +881,7 @@ Changes: - Accordingly, ``attr.validators.optional()`` now can take a list of validators too. `#161 `_ - Validators can now be defined conveniently inline by using the attribute as a decorator. - Check out the `validator examples `_ to see it in action! + Check out the `validator examples `_ to see it in action! `#143 `_ - ``attr.Factory()`` now has a *takes_self* argument that makes the initializer to pass the partially initialized instance into the factory. In other words you can define attribute defaults based on other attributes. @@ -736,7 +973,7 @@ Deprecations: This will remove the confusing error message if you write your own ``__init__`` and forget to initialize some attribute. Instead you will get a straightforward ``AttributeError``. 
In other words: decorated classes will work more like plain Python classes which was always ``attrs``'s goal. -- The serious business aliases ``attr.attributes`` and ``attr.attr`` have been deprecated in favor of ``attr.attrs`` and ``attr.attrib`` which are much more consistent and frankly obvious in hindsight. +- The serious-business aliases ``attr.attributes`` and ``attr.attr`` have been deprecated in favor of ``attr.attrs`` and ``attr.attrib`` which are much more consistent and frankly obvious in hindsight. They will be purged from documentation immediately but there are no plans to actually remove them. diff --git a/LICENSE b/LICENSE index 7ae3df930..2bd6453d2 100644 --- a/LICENSE +++ b/LICENSE @@ -1,6 +1,6 @@ The MIT License (MIT) -Copyright (c) 2015 Hynek Schlawack +Copyright (c) 2015 Hynek Schlawack and the attrs contributors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal diff --git a/MANIFEST.in b/MANIFEST.in index 46c9dbb6a..3d68bf9c5 100644 --- a/MANIFEST.in +++ b/MANIFEST.in @@ -2,12 +2,13 @@ include LICENSE *.rst *.toml *.yml *.yaml *.ini graft .github # Stubs -include src/attr/py.typed recursive-include src *.pyi +recursive-include src py.typed # Tests include tox.ini conftest.py recursive-include tests *.py +recursive-include tests *.yml # Documentation include docs/Makefile docs/docutils.conf diff --git a/README.rst b/README.rst index f6595f535..9f197ea7a 100644 --- a/README.rst +++ b/README.rst @@ -1,29 +1,29 @@ -.. image:: https://www.attrs.org/en/latest/_static/attrs_logo.png - :alt: attrs Logo - -====================================== -``attrs``: Classes Without Boilerplate -====================================== - -.. image:: https://readthedocs.org/projects/attrs/badge/?version=stable - :target: https://www.attrs.org/en/stable/?badge=stable - :alt: Documentation Status - -.. 
image:: https://github.com/python-attrs/attrs/workflows/CI/badge.svg?branch=master - :target: https://github.com/python-attrs/attrs/actions?workflow=CI - :alt: CI Status - -.. image:: https://codecov.io/github/python-attrs/attrs/branch/master/graph/badge.svg - :target: https://codecov.io/github/python-attrs/attrs - :alt: Test Coverage - -.. image:: https://img.shields.io/badge/code%20style-black-000000.svg - :target: https://github.com/psf/black - :alt: Code style: black +.. raw:: html + +

+   <!-- raw HTML block (markup stripped during extraction): centered attrs
+        logo, followed by badges for Documentation, License: MIT, and
+        Downloads per month -->

.. teaser-begin -``attrs`` is the Python package that will bring back the **joy** of **writing classes** by relieving you from the drudgery of implementing object protocols (aka `dunder `_ methods). +``attrs`` is the Python package that will bring back the **joy** of **writing classes** by relieving you from the drudgery of implementing object protocols (aka `dunder methods `_). +`Trusted by NASA `_ for Mars missions since 2020! Its main goal is to help you to write **concise** and **correct** software without slowing down your code. @@ -35,12 +35,12 @@ For that, it gives you a class decorator and a way to declaratively define the a .. code-block:: pycon - >>> import attr + >>> from attrs import asdict, define, make_class, Factory - >>> @attr.s - ... class SomeClass(object): - ... a_number = attr.ib(default=42) - ... list_of_numbers = attr.ib(factory=list) + >>> @define + ... class SomeClass: + ... a_number: int = 42 + ... list_of_numbers: list[int] = Factory(list) ... ... def hard_math(self, another_number): ... return self.a_number + sum(self.list_of_numbers) * another_number @@ -57,59 +57,64 @@ For that, it gives you a class decorator and a way to declaratively define the a >>> sc != SomeClass(2, [3, 2, 1]) True - >>> attr.asdict(sc) + >>> asdict(sc) {'a_number': 1, 'list_of_numbers': [1, 2, 3]} >>> SomeClass() SomeClass(a_number=42, list_of_numbers=[]) - >>> C = attr.make_class("C", ["a", "b"]) + >>> C = make_class("C", ["a", "b"]) >>> C("foo", "bar") C(a='foo', b='bar') -After *declaring* your attributes ``attrs`` gives you: +After *declaring* your attributes, ``attrs`` gives you: - a concise and explicit overview of the class's attributes, - a nice human-readable ``__repr__``, -- a complete set of comparison methods (equality and ordering), +- equality-checking methods, - an initializer, - and much more, *without* writing dull boilerplate code again and again and *without* runtime performance penalties. 
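Roughly, the boilerplate that ``attrs`` spares you corresponds to this hand-written equivalent of the ``SomeClass`` example above (a simplified sketch, not the actual generated code):

```python
class SomeClass:
    # What @define generates for you, written out by hand:
    def __init__(self, a_number=42, list_of_numbers=None):
        self.a_number = a_number
        # Mimics Factory(list): a fresh list per instance, never a shared one.
        self.list_of_numbers = [] if list_of_numbers is None else list_of_numbers

    def __repr__(self):
        return (
            f"SomeClass(a_number={self.a_number!r}, "
            f"list_of_numbers={self.list_of_numbers!r})"
        )

    def __eq__(self, other):
        if other.__class__ is not self.__class__:
            return NotImplemented
        return (self.a_number, self.list_of_numbers) == (
            other.a_number,
            other.list_of_numbers,
        )
```

Every method above (plus ``__ne__``, ``__hash__``, ordering, and more) is derived automatically from the attribute declarations, which is exactly the drudgery the decorator removes.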
-On Python 3.6 and later, you can often even drop the calls to ``attr.ib()`` by using `type annotations `_. +**Hate type annotations**!? +No problem! +Types are entirely **optional** with ``attrs``. +Simply assign ``attrs.field()`` to the attributes instead of annotating them with types. -This gives you the power to use actual classes with actual types in your code instead of confusing ``tuple``\ s or `confusingly behaving `_ ``namedtuple``\ s. -Which in turn encourages you to write *small classes* that do `one thing well `_. -Never again violate the `single responsibility principle `_ just because implementing ``__init__`` et al is a painful drag. +---- +This example uses ``attrs``'s modern APIs that have been introduced in version 20.1.0, and the ``attrs`` package import name that has been added in version 21.3.0. +The classic APIs (``@attr.s``, ``attr.ib``, plus their serious-business aliases) and the ``attr`` package import name will remain **indefinitely**. -.. -getting-help- +Please check out `On The Core API Names `_ for a more in-depth explanation. -Getting Help -============ -Please use the ``python-attrs`` tag on `StackOverflow `_ to get help. +Data Classes +============ -Answering questions of your fellow developers is also a great way to help the project! +On the tin, ``attrs`` might remind you of ``dataclasses`` (and indeed, ``dataclasses`` `are a descendant `_ of ``attrs``). +In practice it does a lot more and is more flexible. +For instance it allows you to define `special handling of NumPy arrays for equality checks `_, or allows more ways to `plug into the initialization process `_. +For more details, please refer to our `comparison page `_. .. -project-information- Project Information =================== -``attrs`` is released under the `MIT `_ license, -its documentation lives at `Read the Docs `_, -the code on `GitHub `_, -and the latest release on `PyPI `_. -It’s rigorously tested on Python 2.7, 3.5+, and PyPy. 
- -We collect information on **third-party extensions** in our `wiki `_. -Feel free to browse and add your own! +- **License**: `MIT `_ +- **PyPI**: https://pypi.org/project/attrs/ +- **Source Code**: https://github.com/python-attrs/attrs +- **Documentation**: https://www.attrs.org/ +- **Changelog**: https://www.attrs.org/en/stable/changelog.html +- **Get Help**: please use the ``python-attrs`` tag on `StackOverflow `_ +- **Third-party Extensions**: https://github.com/python-attrs/attrs/wiki/Extensions-to-attrs +- **Supported Python Versions**: 3.5 and later (last 2.7-compatible release is `21.4.0 `_) -If you'd like to contribute to ``attrs`` you're most welcome and we've written `a little guide `_ to get you started! +If you'd like to contribute to ``attrs`` you're most welcome and we've written `a little guide `_ to get you started! ``attrs`` for Enterprise diff --git a/changelog.d/towncrier_template.rst b/changelog.d/towncrier_template.rst index 29ca74c4e..55f1eef5c 100644 --- a/changelog.d/towncrier_template.rst +++ b/changelog.d/towncrier_template.rst @@ -1,4 +1,7 @@ +{{ versiondata.version }} ({{ versiondata.date }}) +{{ top_underline * ((versiondata.version + versiondata.date)|length + 3) -}} {% for section, _ in sections.items() %} + {% set underline = underlines[0] %}{% if section %}{{section}} {{ underline * section|length }}{% set underline = underlines[1] %} diff --git a/codecov.yml b/codecov.yml deleted file mode 100644 index 60a1e5c12..000000000 --- a/codecov.yml +++ /dev/null @@ -1,10 +0,0 @@ ---- -comment: false -coverage: - status: - patch: - default: - target: "100" - project: - default: - target: "100" diff --git a/conftest.py b/conftest.py index b34f1bd4f..33cc6a6cb 100644 --- a/conftest.py +++ b/conftest.py @@ -1,9 +1,10 @@ -from __future__ import absolute_import, division, print_function +# SPDX-License-Identifier: MIT -import sys from hypothesis import HealthCheck, settings +from attr._compat import PY36, PY310 + def pytest_configure(config): 
# HealthCheck.too_slow causes more trouble than good -- especially in CIs. @@ -14,7 +15,7 @@ def pytest_configure(config): collect_ignore = [] -if sys.version_info[:2] < (3, 6): +if not PY36: collect_ignore.extend( [ "tests/test_annotations.py", @@ -23,3 +24,5 @@ def pytest_configure(config): "tests/test_next_gen.py", ] ) +if not PY310: + collect_ignore.extend(["tests/test_pattern_matching.py"]) diff --git a/docs/_static/attrs_logo.svg b/docs/_static/attrs_logo.svg index 4326ae13b..b02ae6c02 100644 --- a/docs/_static/attrs_logo.svg +++ b/docs/_static/attrs_logo.svg @@ -1 +1,10 @@ - + + + + + + + + + + diff --git a/docs/_static/attrs_logo_white.svg b/docs/_static/attrs_logo_white.svg new file mode 100644 index 000000000..daad798da --- /dev/null +++ b/docs/_static/attrs_logo_white.svg @@ -0,0 +1,10 @@ + + + + + + + + + + diff --git a/docs/api.rst b/docs/api.rst index 5cc2f2d60..a273d19c2 100644 --- a/docs/api.rst +++ b/docs/api.rst @@ -3,32 +3,128 @@ API Reference .. currentmodule:: attr -``attrs`` works by decorating a class using `attr.s` and then optionally defining attributes on the class using `attr.ib`. +``attrs`` works by decorating a class using `attrs.define` or `attr.s` and then optionally defining attributes on the class using `attrs.field`, `attr.ib`, or a type annotation. -.. note:: - - When this documentation speaks about "``attrs`` attributes" it means those attributes that are defined using `attr.ib` in the class body. +If you're confused by the many names, please check out `names` for clarification. What follows is the API explanation, if you'd like a more hands-on introduction, have a look at `examples`. +As of version 21.3.0, ``attrs`` consists of **two** top-level package names: + +- The classic ``attr`` that powered the venerable `attr.s` and `attr.ib` +- The modern ``attrs`` that only contains most modern APIs and relies on `attrs.define` and `attrs.field` to define your classes. 
+ Additionally it offers some ``attr`` APIs with nicer defaults (e.g. `attrs.asdict`). + Using this namespace requires Python 3.6 or later. + +The ``attrs`` namespace is built *on top of* ``attr`` which will *never* go away. Core ---- +.. note:: + + Please note that the ``attrs`` namespace has been added in version 21.3.0. + Most of the objects are simply re-imported from ``attr``. + Therefore if a class, method, or function claims that it has been added in an older version, it is only available in the ``attr`` namespace. + +.. autodata:: attrs.NOTHING + +.. autofunction:: attrs.define + +.. function:: attrs.mutable(same_as_define) + + Alias for `attrs.define`. + + .. versionadded:: 20.1.0 + +.. function:: attrs.frozen(same_as_define) + + Behaves the same as `attrs.define` but sets *frozen=True* and *on_setattr=None*. + + .. versionadded:: 20.1.0 + +.. autofunction:: attrs.field + +.. function:: define + + Old import path for `attrs.define`. + +.. function:: mutable + + Old import path for `attrs.mutable`. + +.. function:: frozen + + Old import path for `attrs.frozen`. + +.. function:: field + + Old import path for `attrs.field`. + +.. autoclass:: attrs.Attribute + :members: evolve + + For example: + + .. doctest:: + + >>> import attr + >>> @attr.s + ... class C: + ... x = attr.ib() + >>> attr.fields(C).x + Attribute(name='x', default=NOTHING, validator=None, repr=True, eq=True, eq_key=None, order=True, order_key=None, hash=None, init=True, metadata=mappingproxy({}), type=None, converter=None, kw_only=False, inherited=False, on_setattr=None) + + +.. autofunction:: attrs.make_class + + This is handy if you want to programmatically create classes. + + For example: + + .. doctest:: + + >>> C1 = attr.make_class("C1", ["x", "y"]) + >>> C1(1, 2) + C1(x=1, y=2) + >>> C2 = attr.make_class("C2", {"x": attr.ib(default=42), + ... "y": attr.ib(default=attr.Factory(list))}) + >>> C2() + C2(x=42, y=[]) + + +.. autoclass:: attrs.Factory + + For example: + + .. 
doctest:: + + >>> @attr.s + ... class C: + ... x = attr.ib(default=attr.Factory(list)) + ... y = attr.ib(default=attr.Factory( + ... lambda self: set(self.x), + ... takes_self=True) + ... ) + >>> C() + C(x=[], y=set()) + >>> C([1, 2, 3]) + C(x=[1, 2, 3], y={1, 2, 3}) + -.. warning:: - As of ``attrs`` 20.1.0, it also ships with a bunch of provisional APIs that are intended to become the main way of defining classes in the future. +Classic +~~~~~~~ - Please have a look at :ref:`prov`. +.. data:: attr.NOTHING -.. autodata:: attr.NOTHING + Same as `attrs.NOTHING`. -.. autofunction:: attr.s(these=None, repr_ns=None, repr=None, cmp=None, hash=None, init=None, slots=False, frozen=False, weakref_slot=True, str=False, auto_attribs=False, kw_only=False, cache_hash=False, auto_exc=False, eq=None, order=None, auto_detect=False, collect_by_mro=False, getstate_setstate=None, on_setattr=None, field_transformer=None) +.. autofunction:: attr.s(these=None, repr_ns=None, repr=None, cmp=None, hash=None, init=None, slots=False, frozen=False, weakref_slot=True, str=False, auto_attribs=False, kw_only=False, cache_hash=False, auto_exc=False, eq=None, order=None, auto_detect=False, collect_by_mro=False, getstate_setstate=None, on_setattr=None, field_transformer=None, match_args=True) .. note:: - ``attrs`` also comes with a serious business alias ``attr.attrs``. + ``attrs`` also comes with a serious-business alias ``attr.attrs``. For example: @@ -36,11 +132,11 @@ Core >>> import attr >>> @attr.s - ... class C(object): + ... class C: ... _private = attr.ib() >>> C(private=42) C(_private=42) - >>> class D(object): + >>> class D: ... def __init__(self, x): ... self.x = x >>> D(1) @@ -68,14 +164,14 @@ Core .. note:: - ``attrs`` also comes with a serious business alias ``attr.attrib``. + ``attrs`` also comes with a serious-business alias ``attr.attrib``. The object returned by `attr.ib` also allows for setting the default and the validator using decorators: .. doctest:: >>> @attr.s - ... 
class C(object): + ... class C: ... x = attr.ib() ... y = attr.ib() ... @x.validator @@ -92,67 +188,32 @@ Core ... ValueError: x must be positive -.. autoclass:: attr.Attribute - :members: evolve - - .. doctest:: - - >>> import attr - >>> @attr.s - ... class C(object): - ... x = attr.ib() - >>> attr.fields(C).x - Attribute(name='x', default=NOTHING, validator=None, repr=True, eq=True, order=True, hash=None, init=True, metadata=mappingproxy({}), type=None, converter=None, kw_only=False, inherited=False, on_setattr=None) - - -.. autofunction:: attr.make_class - - This is handy if you want to programmatically create classes. - - For example: - - .. doctest:: - - >>> C1 = attr.make_class("C1", ["x", "y"]) - >>> C1(1, 2) - C1(x=1, y=2) - >>> C2 = attr.make_class("C2", {"x": attr.ib(default=42), - ... "y": attr.ib(default=attr.Factory(list))}) - >>> C2() - C2(x=42, y=[]) - -.. autoclass:: attr.Factory - For example: +Exceptions +---------- - .. doctest:: +All exceptions are available from both ``attr.exceptions`` and ``attrs.exceptions`` and are the same thing. +That means that it doesn't matter from which namespace they've been raised and/or caught: - >>> @attr.s - ... class C(object): - ... x = attr.ib(default=attr.Factory(list)) - ... y = attr.ib(default=attr.Factory( - ... lambda self: set(self.x), - ... takes_self=True) - ... ) - >>> C() - C(x=[], y=set()) - >>> C([1, 2, 3]) - C(x=[1, 2, 3], y={1, 2, 3}) +.. doctest:: + >>> import attrs, attr + >>> try: + ... raise attrs.exceptions.FrozenError() + ... except attr.exceptions.FrozenError: + ... print("this works!") + this works! -Exceptions ----------- - -.. autoexception:: attr.exceptions.PythonTooOldError -.. autoexception:: attr.exceptions.FrozenError -.. autoexception:: attr.exceptions.FrozenInstanceError -.. autoexception:: attr.exceptions.FrozenAttributeError -.. autoexception:: attr.exceptions.AttrsAttributeNotFoundError -.. autoexception:: attr.exceptions.NotAnAttrsClassError -.. 
autoexception:: attr.exceptions.DefaultAlreadySetError -.. autoexception:: attr.exceptions.UnannotatedAttributeError -.. autoexception:: attr.exceptions.NotCallableError +.. autoexception:: attrs.exceptions.PythonTooOldError +.. autoexception:: attrs.exceptions.FrozenError +.. autoexception:: attrs.exceptions.FrozenInstanceError +.. autoexception:: attrs.exceptions.FrozenAttributeError +.. autoexception:: attrs.exceptions.AttrsAttributeNotFoundError +.. autoexception:: attrs.exceptions.NotAnAttrsClassError +.. autoexception:: attrs.exceptions.DefaultAlreadySetError +.. autoexception:: attrs.exceptions.UnannotatedAttributeError +.. autoexception:: attrs.exceptions.NotCallableError For example:: @@ -169,132 +230,167 @@ Helpers ``attrs`` comes with a bunch of helper methods that make working with it easier: -.. autofunction:: attr.fields +.. autofunction:: attrs.cmp_using +.. function:: attr.cmp_using + + Same as `attrs.cmp_using`. + +.. autofunction:: attrs.fields For example: .. doctest:: >>> @attr.s - ... class C(object): + ... class C: ... x = attr.ib() ... 
y = attr.ib() - >>> attr.fields(C) - (Attribute(name='x', default=NOTHING, validator=None, repr=True, eq=True, order=True, hash=None, init=True, metadata=mappingproxy({}), type=None, converter=None, kw_only=False, inherited=False, on_setattr=None), Attribute(name='y', default=NOTHING, validator=None, repr=True, eq=True, order=True, hash=None, init=True, metadata=mappingproxy({}), type=None, converter=None, kw_only=False, inherited=False, on_setattr=None)) - >>> attr.fields(C)[1] - Attribute(name='y', default=NOTHING, validator=None, repr=True, eq=True, order=True, hash=None, init=True, metadata=mappingproxy({}), type=None, converter=None, kw_only=False, inherited=False, on_setattr=None) - >>> attr.fields(C).y is attr.fields(C)[1] + >>> attrs.fields(C) + (Attribute(name='x', default=NOTHING, validator=None, repr=True, eq=True, eq_key=None, order=True, order_key=None, hash=None, init=True, metadata=mappingproxy({}), type=None, converter=None, kw_only=False, inherited=False, on_setattr=None), Attribute(name='y', default=NOTHING, validator=None, repr=True, eq=True, eq_key=None, order=True, order_key=None, hash=None, init=True, metadata=mappingproxy({}), type=None, converter=None, kw_only=False, inherited=False, on_setattr=None)) + >>> attrs.fields(C)[1] + Attribute(name='y', default=NOTHING, validator=None, repr=True, eq=True, eq_key=None, order=True, order_key=None, hash=None, init=True, metadata=mappingproxy({}), type=None, converter=None, kw_only=False, inherited=False, on_setattr=None) + >>> attrs.fields(C).y is attrs.fields(C)[1] True -.. autofunction:: attr.fields_dict +.. function:: attr.fields + + Same as `attrs.fields`. + +.. autofunction:: attrs.fields_dict For example: .. doctest:: >>> @attr.s - ... class C(object): + ... class C: ... x = attr.ib() ... 
y = attr.ib() - >>> attr.fields_dict(C) - {'x': Attribute(name='x', default=NOTHING, validator=None, repr=True, eq=True, order=True, hash=None, init=True, metadata=mappingproxy({}), type=None, converter=None, kw_only=False, inherited=False, on_setattr=None), 'y': Attribute(name='y', default=NOTHING, validator=None, repr=True, eq=True, order=True, hash=None, init=True, metadata=mappingproxy({}), type=None, converter=None, kw_only=False, inherited=False, on_setattr=None)} + >>> attrs.fields_dict(C) + {'x': Attribute(name='x', default=NOTHING, validator=None, repr=True, eq=True, eq_key=None, order=True, order_key=None, hash=None, init=True, metadata=mappingproxy({}), type=None, converter=None, kw_only=False, inherited=False, on_setattr=None), 'y': Attribute(name='y', default=NOTHING, validator=None, repr=True, eq=True, eq_key=None, order=True, order_key=None, hash=None, init=True, metadata=mappingproxy({}), type=None, converter=None, kw_only=False, inherited=False, on_setattr=None)} >>> attr.fields_dict(C)['y'] - Attribute(name='y', default=NOTHING, validator=None, repr=True, eq=True, order=True, hash=None, init=True, metadata=mappingproxy({}), type=None, converter=None, kw_only=False, inherited=False, on_setattr=None) - >>> attr.fields_dict(C)['y'] is attr.fields(C).y + Attribute(name='y', default=NOTHING, validator=None, repr=True, eq=True, eq_key=None, order=True, order_key=None, hash=None, init=True, metadata=mappingproxy({}), type=None, converter=None, kw_only=False, inherited=False, on_setattr=None) + >>> attrs.fields_dict(C)['y'] is attrs.fields(C).y True +.. function:: attr.fields_dict + + Same as `attrs.fields_dict`. -.. autofunction:: attr.has +.. autofunction:: attrs.has For example: .. doctest:: >>> @attr.s - ... class C(object): + ... class C: ... pass >>> attr.has(C) True >>> attr.has(object) False +.. function:: attr.has -.. autofunction:: attr.resolve_types + Same as `attrs.has`. + +.. autofunction:: attrs.resolve_types For example: .. 
doctest:: >>> import typing - >>> @attr.s(auto_attribs=True) + >>> @attrs.define ... class A: ... a: typing.List['A'] ... b: 'B' ... - >>> @attr.s(auto_attribs=True) + >>> @attrs.define ... class B: ... a: A ... - >>> attr.fields(A).a.type + >>> attrs.fields(A).a.type typing.List[ForwardRef('A')] - >>> attr.fields(A).b.type + >>> attrs.fields(A).b.type 'B' - >>> attr.resolve_types(A, globals(), locals()) + >>> attrs.resolve_types(A, globals(), locals()) - >>> attr.fields(A).a.type + >>> attrs.fields(A).a.type typing.List[A] - >>> attr.fields(A).b.type + >>> attrs.fields(A).b.type -.. autofunction:: attr.asdict +.. function:: attr.resolve_types + + Same as `attrs.resolve_types`. + +.. autofunction:: attrs.asdict For example: .. doctest:: - >>> @attr.s - ... class C(object): - ... x = attr.ib() - ... y = attr.ib() - >>> attr.asdict(C(1, C(2, 3))) + >>> @attrs.define + ... class C: + ... x: int + ... y: int + >>> attrs.asdict(C(1, C(2, 3))) {'x': 1, 'y': {'x': 2, 'y': 3}} +.. autofunction:: attr.asdict -.. autofunction:: attr.astuple +.. autofunction:: attrs.astuple For example: .. doctest:: - >>> @attr.s - ... class C(object): - ... x = attr.ib() - ... y = attr.ib() - >>> attr.astuple(C(1,2)) + >>> @attrs.define + ... class C: + ... x = attr.field() + ... y = attr.field() + >>> attrs.astuple(C(1,2)) (1, 2) -``attrs`` includes some handy helpers for filtering the attributes in `attr.asdict` and `attr.astuple`: +.. autofunction:: attr.astuple + + +``attrs`` includes some handy helpers for filtering the attributes in `attrs.asdict` and `attrs.astuple`: + +.. autofunction:: attrs.filters.include + +.. autofunction:: attrs.filters.exclude -.. autofunction:: attr.filters.include +.. function:: attr.filters.include -.. autofunction:: attr.filters.exclude + Same as `attrs.filters.include`. -See :func:`asdict` for examples. +.. function:: attr.filters.exclude -.. autofunction:: attr.evolve + Same as `attrs.filters.exclude`. + +See :func:`attrs.asdict` for examples. 
+ +All objects from ``attrs.filters`` are also available from ``attr.filters``. + +---- + +.. autofunction:: attrs.evolve For example: .. doctest:: - >>> @attr.s - ... class C(object): - ... x = attr.ib() - ... y = attr.ib() + >>> @attrs.define + ... class C: + ... x: int + ... y: int >>> i1 = C(1, 2) >>> i1 C(x=1, y=2) - >>> i2 = attr.evolve(i1, y=3) + >>> i2 = attrs.evolve(i1, y=3) >>> i2 C(x=1, y=3) >>> i1 == i2 @@ -307,22 +403,30 @@ See :func:`asdict` for examples. * attributes with ``init=False`` can't be set with ``evolve``. * the usual ``__init__`` validators will validate the new values. -.. autofunction:: validate +.. function:: attr.evolve + + Same as `attrs.evolve`. + +.. autofunction:: attrs.validate For example: .. doctest:: - >>> @attr.s - ... class C(object): - ... x = attr.ib(validator=attr.validators.instance_of(int)) + >>> @attrs.define(on_setattr=attrs.setters.NO_OP) + ... class C: + ... x = attrs.field(validator=attrs.validators.instance_of(int)) >>> i = C(1) >>> i.x = "1" - >>> attr.validate(i) + >>> attrs.validate(i) Traceback (most recent call last): ... TypeError: ("'x' must be (got '1' that is a ).", ...) +.. function:: attr.validate + + Same as `attrs.validate`. + Validators can be globally disabled if you want to run them only in development and tests but not in production because you fear their performance impact: @@ -336,19 +440,115 @@ Validators can be globally disabled if you want to run them only in development Validators ---------- -``attrs`` comes with some common validators in the ``attrs.validators`` module: +``attrs`` comes with some common validators in the ``attrs.validators`` module. +All objects from ``attrs.validators`` are also available from ``attr.validators``. + + +.. autofunction:: attrs.validators.lt + + For example: + .. doctest:: -.. autofunction:: attr.validators.instance_of + >>> @attrs.define + ... class C: + ... 
x = attrs.field(validator=attrs.validators.lt(42)) + >>> C(41) + C(x=41) + >>> C(42) + Traceback (most recent call last): + ... + ValueError: ("'x' must be < 42: 42") +.. autofunction:: attrs.validators.le For example: .. doctest:: - >>> @attr.s - ... class C(object): - ... x = attr.ib(validator=attr.validators.instance_of(int)) + >>> @attrs.define + ... class C: + ... x = attrs.field(validator=attr.validators.le(42)) + >>> C(42) + C(x=42) + >>> C(43) + Traceback (most recent call last): + ... + ValueError: ("'x' must be <= 42: 43") + +.. autofunction:: attrs.validators.ge + + For example: + + .. doctest:: + + >>> @attrs.define + ... class C: + ... x = attrs.field(validator=attrs.validators.ge(42)) + >>> C(42) + C(x=42) + >>> C(41) + Traceback (most recent call last): + ... + ValueError: ("'x' must be => 42: 41") + +.. autofunction:: attrs.validators.gt + + For example: + + .. doctest:: + + >>> @attrs.define + ... class C: + ... x = attr.field(validator=attrs.validators.gt(42)) + >>> C(43) + C(x=43) + >>> C(42) + Traceback (most recent call last): + ... + ValueError: ("'x' must be > 42: 42") + +.. autofunction:: attrs.validators.max_len + + For example: + + .. doctest:: + + >>> @attrs.define + ... class C: + ... x = attrs.field(validator=attrs.validators.max_len(4)) + >>> C("spam") + C(x='spam') + >>> C("bacon") + Traceback (most recent call last): + ... + ValueError: ("Length of 'x' must be <= 4: 5") + +.. autofunction:: attrs.validators.min_len + + For example: + + .. doctest:: + + >>> @attrs.define + ... class C: + ... x = attrs.field(validator=attrs.validators.min_len(1)) + >>> C("bacon") + C(x='bacon') + >>> C("") + Traceback (most recent call last): + ... + ValueError: ("Length of 'x' must be => 1: 0") + +.. autofunction:: attrs.validators.instance_of + + For example: + + .. doctest:: + + >>> @attrs.define + ... class C: + ... x = attrs.field(validator=attrs.validators.instance_of(int)) >>> C(42) C(x=42) >>> C("42") @@ -360,51 +560,51 @@ Validators ... 
TypeError: ("'x' must be (got None that is a ).", Attribute(name='x', default=NOTHING, validator=>, repr=True, cmp=True, hash=None, init=True, type=None, kw_only=False), , None) -.. autofunction:: attr.validators.in_ +.. autofunction:: attrs.validators.in_ For example: .. doctest:: - >>> import enum - >>> class State(enum.Enum): - ... ON = "on" - ... OFF = "off" - >>> @attr.s - ... class C(object): - ... state = attr.ib(validator=attr.validators.in_(State)) - ... val = attr.ib(validator=attr.validators.in_([1, 2, 3])) - >>> C(State.ON, 1) - C(state=, val=1) - >>> C("on", 1) - Traceback (most recent call last): - ... - ValueError: 'state' must be in (got 'on') - >>> C(State.ON, 4) - Traceback (most recent call last): - ... - ValueError: 'val' must be in [1, 2, 3] (got 4) + >>> import enum + >>> class State(enum.Enum): + ... ON = "on" + ... OFF = "off" + >>> @attrs.define + ... class C: + ... state = attrs.field(validator=attrs.validators.in_(State)) + ... val = attrs.field(validator=attrs.validators.in_([1, 2, 3])) + >>> C(State.ON, 1) + C(state=, val=1) + >>> C("on", 1) + Traceback (most recent call last): + ... + ValueError: 'state' must be in (got 'on'), Attribute(name='state', default=NOTHING, validator=>, repr=True, eq=True, eq_key=None, order=True, order_key=None, hash=None, init=True, metadata=mappingproxy({}), type=None, converter=None, kw_only=False, inherited=False, on_setattr=None), , 'on') + >>> C(State.ON, 4) + Traceback (most recent call last): + ... + ValueError: 'val' must be in [1, 2, 3] (got 4), Attribute(name='val', default=NOTHING, validator=, repr=True, eq=True, eq_key=None, order=True, order_key=None, hash=None, init=True, metadata=mappingproxy({}), type=None, converter=None, kw_only=False, inherited=False, on_setattr=None), [1, 2, 3], 4) -.. autofunction:: attr.validators.provides +.. autofunction:: attrs.validators.provides -.. autofunction:: attr.validators.and_ +.. 
autofunction:: attrs.validators.and_ - For convenience, it's also possible to pass a list to `attr.ib`'s validator argument. + For convenience, it's also possible to pass a list to `attrs.field`'s validator argument. Thus the following two statements are equivalent:: - x = attr.ib(validator=attr.validators.and_(v1, v2, v3)) - x = attr.ib(validator=[v1, v2, v3]) + x = attrs.field(validator=attrs.validators.and_(v1, v2, v3)) + x = attrs.field(validator=[v1, v2, v3]) -.. autofunction:: attr.validators.optional +.. autofunction:: attrs.validators.optional For example: .. doctest:: - >>> @attr.s - ... class C(object): - ... x = attr.ib(validator=attr.validators.optional(attr.validators.instance_of(int))) + >>> @attrs.define + ... class C: + ... x = attrs.field(validator=attrs.validators.optional(attr.validators.instance_of(int))) >>> C(42) C(x=42) >>> C("42") @@ -415,15 +615,15 @@ Validators C(x=None) -.. autofunction:: attr.validators.is_callable +.. autofunction:: attrs.validators.is_callable For example: .. doctest:: - >>> @attr.s - ... class C(object): - ... x = attr.ib(validator=attr.validators.is_callable()) + >>> @attrs.define + ... class C: + ... x = attrs.field(validator=attrs.validators.is_callable()) >>> C(isinstance) C(x=) >>> C("not a callable") @@ -432,15 +632,15 @@ Validators attr.exceptions.NotCallableError: 'x' must be callable (got 'not a callable' that is a ). -.. autofunction:: attr.validators.matches_re +.. autofunction:: attrs.validators.matches_re For example: .. doctest:: - >>> @attr.s - ... class User(object): - ... email = attr.ib(validator=attr.validators.matches_re( + >>> @attrs.define + ... class User: + ... email = attrs.field(validator=attrs.validators.matches_re( ... 
"(^[a-zA-Z0-9_.+-]+@[a-zA-Z0-9-]+\.[a-zA-Z0-9-.]+$)")) >>> User(email="user@example.com") User(email='user@example.com') @@ -450,17 +650,17 @@ Validators ValueError: ("'email' must match regex '(^[a-zA-Z0-9_.+-]+@[a-zA-Z0-9-]+\\\\.[a-zA-Z0-9-.]+$)' ('user@example.com@test.com' doesn't)", Attribute(name='email', default=NOTHING, validator=, repr=True, cmp=True, hash=None, init=True, metadata=mappingproxy({}), type=None, converter=None, kw_only=False), re.compile('(^[a-zA-Z0-9_.+-]+@[a-zA-Z0-9-]+\\.[a-zA-Z0-9-.]+$)'), 'user@example.com@test.com') -.. autofunction:: attr.validators.deep_iterable +.. autofunction:: attrs.validators.deep_iterable For example: .. doctest:: - >>> @attr.s - ... class C(object): - ... x = attr.ib(validator=attr.validators.deep_iterable( - ... member_validator=attr.validators.instance_of(int), - ... iterable_validator=attr.validators.instance_of(list) + >>> @attrs.define + ... class C: + ... x = attrs.field(validator=attrs.validators.deep_iterable( + ... member_validator=attrs.validators.instance_of(int), + ... iterable_validator=attrs.validators.instance_of(list) ... )) >>> C(x=[1, 2, 3]) C(x=[1, 2, 3]) @@ -474,18 +674,18 @@ Validators TypeError: ("'x' must be (got '3' that is a ).", Attribute(name='x', default=NOTHING, validator=> iterables of >>, repr=True, cmp=True, hash=None, init=True, metadata=mappingproxy({}), type=None, converter=None, kw_only=False), , '3') -.. autofunction:: attr.validators.deep_mapping +.. autofunction:: attrs.validators.deep_mapping For example: .. doctest:: - >>> @attr.s - ... class C(object): - ... x = attr.ib(validator=attr.validators.deep_mapping( - ... key_validator=attr.validators.instance_of(str), - ... value_validator=attr.validators.instance_of(int), - ... mapping_validator=attr.validators.instance_of(dict) + >>> @attrs.define + ... class C: + ... x = attrs.field(validator=attrs.validators.deep_mapping( + ... key_validator=attrs.validators.instance_of(str), + ... 
value_validator=attrs.validators.instance_of(int), + ... mapping_validator=attrs.validators.instance_of(dict) ... )) >>> C(x={"a": 1, "b": 2}) C(x={'a': 1, 'b': 2}) @@ -502,11 +702,21 @@ Validators ... TypeError: ("'x' must be (got 7 that is a ).", Attribute(name='x', default=NOTHING, validator=> to >>, repr=True, cmp=True, hash=None, init=True, metadata=mappingproxy({}), type=None, converter=None, kw_only=False), , 7) +Validators can be both globally and locally disabled: + +.. autofunction:: attrs.validators.set_disabled + +.. autofunction:: attrs.validators.get_disabled + +.. autofunction:: attrs.validators.disabled + Converters ---------- -.. autofunction:: attr.converters.pipe +All objects from ``attrs.converters`` are also available from ``attr.converters``. + +.. autofunction:: attrs.converters.pipe For convenience, it's also possible to pass a list to `attr.ib`'s converter argument. @@ -515,14 +725,14 @@ Converters x = attr.ib(converter=attr.converter.pipe(c1, c2, c3)) x = attr.ib(converter=[c1, c2, c3]) -.. autofunction:: attr.converters.optional +.. autofunction:: attrs.converters.optional For example: .. doctest:: >>> @attr.s - ... class C(object): + ... class C: ... x = attr.ib(converter=attr.converters.optional(int)) >>> C(None) C(x=None) @@ -530,14 +740,14 @@ Converters C(x=42) -.. autofunction:: attr.converters.default_if_none +.. autofunction:: attrs.converters.default_if_none For example: .. doctest:: >>> @attr.s - ... class C(object): + ... class C: ... x = attr.ib( ... converter=attr.converters.default_if_none("") ... ) @@ -545,27 +755,56 @@ Converters C(x='') +.. autofunction:: attrs.converters.to_bool + + For example: + + .. doctest:: + + >>> @attr.s + ... class C: + ... x = attr.ib( + ... converter=attr.converters.to_bool + ... ) + >>> C("yes") + C(x=True) + >>> C(0) + C(x=False) + >>> C("foo") + Traceback (most recent call last): + File "", line 1, in + ValueError: Cannot convert value to bool: foo + + + .. 
_api_setters: Setters ------- -These are helpers that you can use together with `attr.s`'s and `attr.ib`'s ``on_setattr`` arguments. +These are helpers that you can use together with `attrs.define`'s and `attrs.field`'s ``on_setattr`` arguments. +All setters in ``attrs.setters`` are also available from ``attr.setters``. + +.. autofunction:: attrs.setters.frozen +.. autofunction:: attrs.setters.validate +.. autofunction:: attrs.setters.convert +.. autofunction:: attrs.setters.pipe +.. data:: attrs.setters.NO_OP -.. autofunction:: attr.setters.frozen -.. autofunction:: attr.setters.validate -.. autofunction:: attr.setters.convert -.. autofunction:: attr.setters.pipe -.. autodata:: attr.setters.NO_OP + Sentinel for disabling class-wide *on_setattr* hooks for certain attributes. + + Does not work in `attrs.setters.pipe` or within lists. + + .. versionadded:: 20.1.0 For example, only ``x`` is frozen here: .. doctest:: - >>> @attr.s(on_setattr=attr.setters.frozen) - ... class C(object): - ... x = attr.ib() - ... y = attr.ib(on_setattr=attr.setters.NO_OP) + >>> @attrs.define(on_setattr=attr.setters.frozen) + ... class C: + ... x = attr.field() + ... y = attr.field(on_setattr=attr.setters.NO_OP) >>> c = C(1, 2) >>> c.y = 3 >>> c.y @@ -573,56 +812,9 @@ These are helpers that you can use together with `attr.s`'s and `attr.ib`'s ``on >>> c.x = 4 Traceback (most recent call last): ... - attr.exceptions.FrozenAttributeError: () - - N.B. Please use `attr.s`'s *frozen* argument to freeze whole classes; it is more efficient. + attrs.exceptions.FrozenAttributeError: () - -.. _prov: - -Provisional APIs ----------------- - -These are Python 3.6 and later-only, keyword-only, and **provisional** APIs that call `attr.s` with different default values. 
- -The most notable differences are: - -- automatically detect whether or not *auto_attribs* should be `True` -- *slots=True* (see :term:`slotted classes` for potentially surprising behaviors) -- *auto_exc=True* -- *auto_detect=True* -- *eq=True*, but *order=False* -- Validators run when you set an attribute (*on_setattr=attr.setters.validate*). -- Some options that aren't relevant to Python 3 have been dropped. - -Please note that these are *defaults* and you're free to override them, just like before. - ----- - -Their behavior is scheduled to become part of the upcoming ``import attrs`` that will introduce a new namespace with nicer names and nicer defaults (see `#408 `_ and `#487 `_). - -Therefore your constructive feedback in the linked issues above is strongly encouraged! - -.. note:: - Provisional doesn't mean we will remove it (although it will be deprecated once the final form is released), but that it might change if we receive relevant feedback. - - `attr.s` and `attr.ib` (and their serious business cousins) aren't going anywhere. - The new APIs build on top of them. - -.. autofunction:: attr.define -.. function:: attr.mutable(same_as_define) - - Alias for `attr.define`. - - .. versionadded:: 20.1.0 - -.. function:: attr.frozen(same_as_define) - - Behaves the same as `attr.define` but sets *frozen=True* and *on_setattr=None*. - - .. versionadded:: 20.1.0 - -.. autofunction:: attr.field + N.B. Please use `attrs.define`'s *frozen* argument (or `attrs.frozen`) to freeze whole classes; it is more efficient. Deprecated APIs @@ -644,15 +836,13 @@ It behaves similarly to `sys.version_info` and is an instance of `VersionInfo`: >>> cmp_off == {"eq": False} True >>> @attr.s(**cmp_off) - ... class C(object): + ... class C: ... pass ---- -The serious business aliases used to be called ``attr.attributes`` and ``attr.attr``. +The serious-business aliases used to be called ``attr.attributes`` and ``attr.attr``. 
There are no plans to remove them but they shouldn't be used in new code. -The ``cmp`` argument to both `attr.s` and `attr.ib` has been deprecated in 19.2 and shouldn't be used. - .. autofunction:: assoc diff --git a/docs/backward-compatibility.rst b/docs/backward-compatibility.rst deleted file mode 100644 index c1165be14..000000000 --- a/docs/backward-compatibility.rst +++ /dev/null @@ -1,19 +0,0 @@ -Backward Compatibility -====================== - -.. currentmodule:: attr - -``attrs`` has a very strong backward compatibility policy that is inspired by the policy of the `Twisted framework `_. - -Put simply, you shouldn't ever be afraid to upgrade ``attrs`` if you're only using its public APIs. -If there will ever be a need to break compatibility, it will be announced in the `changelog` and raise a ``DeprecationWarning`` for a year (if possible) before it's finally really broken. - - -.. _exemption: - -.. warning:: - - The structure of the `attr.Attribute` class is exempt from this rule. - It *will* change in the future, but since it should be considered read-only, that shouldn't matter. - - However if you intend to build extensions on top of ``attrs`` you have to anticipate that. diff --git a/docs/comparison.rst b/docs/comparison.rst new file mode 100644 index 000000000..0a2e432f3 --- /dev/null +++ b/docs/comparison.rst @@ -0,0 +1,66 @@ +Comparison +========== + +By default, two instances of ``attrs`` classes are equal if all their fields are equal. +For that, ``attrs`` writes ``__eq__`` and ``__ne__`` methods for you. + +Additionally, if you pass ``order=True`` (which is the default if you use the `attr.s` decorator), ``attrs`` will also create a full set of ordering methods that are based on the defined fields: ``__le__``, ``__lt__``, ``__ge__``, and ``__gt__``. + + +.. _custom-comparison: + +Customization +------------- + +As with other features, you can exclude fields from being involved in comparison operations: + +.. 
doctest:: + + >>> from attr import define, field + + >>> @define + ... class C: + ... x: int + ... y: int = field(eq=False) + + >>> C(1, 2) == C(1, 3) + True + +Additionally, you can pass a *callable* instead of a bool to both *eq* and *order*. +It is then used as a key function like you may know from `sorted`: + +.. doctest:: + + >>> from attr import define, field + + >>> @define + ... class S: + ... x: str = field(eq=str.lower) + + >>> S("foo") == S("FOO") + True + + >>> @define(order=True) + ... class C: + ... x: str = field(order=int) + + >>> C("10") > C("2") + True + +This is especially useful when you have fields with objects that have atypical comparison properties. +Common examples of such objects are `NumPy arrays `_. + +To save you unnecessary boilerplate, ``attrs`` comes with the `attrs.cmp_using` helper to create such functions. +For NumPy arrays it would look like this:: + + import numpy + + @define(order=False) + class C: + an_array = field(eq=attrs.cmp_using(eq=numpy.array_equal)) + + +.. warning:: + + Please note that *eq* and *order* are set *independently*, because *order* is `False` by default in `attrs.define` (but not in `attr.s`). + You can set both at once by using the *cmp* argument that we've undeprecated just for this use-case. diff --git a/docs/conf.py b/docs/conf.py index 323788c71..0cc80be6a 100644 --- a/docs/conf.py +++ b/docs/conf.py @@ -1,38 +1,20 @@ -# -*- coding: utf-8 -*- - -import codecs -import os -import re - - -def read(*parts): - """ - Build an absolute path from *parts* and and return the contents of the - resulting file. Assume UTF-8 encoding. - """ - here = os.path.abspath(os.path.dirname(__file__)) - with codecs.open(os.path.join(here, *parts), "rb", "utf-8") as f: - return f.read() - - -def find_version(*file_paths): - """ - Build a path from *file_paths* and search for a ``__version__`` - string inside. 
- """ - version_file = read(*file_paths) - version_match = re.search( - r"^__version__ = ['\"]([^'\"]*)['\"]", version_file, re.M - ) - if version_match: - return version_match.group(1) - raise RuntimeError("Unable to find version string.") +# SPDX-License-Identifier: MIT + +from importlib import metadata # -- General configuration ------------------------------------------------ +doctest_global_setup = """ +from attr import define, frozen, field, validators, Factory +""" + linkcheck_ignore = [ + # We run into GitHub's rate limits. r"https://github.com/.*/(issues|pull)/\d+", + # It never finds the anchor even though it's there. + "https://github.com/microsoft/pyright/blob/main/specs/" + "dataclass_transforms.md#attrs", ] # In nitpick mode (-n), still ignore any of the following "broken" references @@ -52,6 +34,7 @@ def find_version(*file_paths): "sphinx.ext.doctest", "sphinx.ext.intersphinx", "sphinx.ext.todo", + "notfound.extension", ] @@ -65,17 +48,18 @@ def find_version(*file_paths): master_doc = "index" # General information about the project. -project = u"attrs" -copyright = u"2015, Hynek Schlawack" +project = "attrs" +author = "Hynek Schlawack" +copyright = f"2015, {author}" # The version info for the project you're documenting, acts as replacement for # |version| and |release|, also used in various other places throughout the # built documents. -# -# The short X.Y version. -release = find_version("../src/attr/__init__.py") -version = release.rsplit(u".", 1)[0] + # The full version, including alpha/beta/rc tags. +release = metadata.version("attrs") +# The short X.Y version. +version = release.rsplit(".", 1)[0] # List of patterns, relative to source directory, that match files and # directories to ignore when looking for source files. @@ -88,9 +72,6 @@ def find_version(*file_paths): # If true, '()' will be appended to :func: etc. cross-reference text. add_function_parentheses = True -# The name of the Pygments (syntax highlighting) style to use. 
-pygments_style = "sphinx" - # -- Options for HTML output ---------------------------------------------- # The theme to use for HTML and HTML Help pages. See the documentation for @@ -99,11 +80,10 @@ def find_version(*file_paths): html_theme = "furo" html_theme_options = { "sidebar_hide_name": True, + "light_logo": "attrs_logo.svg", + "dark_logo": "attrs_logo_white.svg", } -# The name of an image file (relative to this directory) to place at the top -# of the sidebar. -html_logo = "_static/attrs_logo.svg" # The name of an image file (within the static path) to use as favicon of the # docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32 @@ -145,9 +125,7 @@ def find_version(*file_paths): # One entry per manual page. List of tuples # (source start file, name, description, authors, manual section). -man_pages = [ - ("index", "attrs", u"attrs Documentation", [u"Hynek Schlawack"], 1) -] +man_pages = [("index", "attrs", "attrs Documentation", ["Hynek Schlawack"], 1)] # -- Options for Texinfo output ------------------------------------------- @@ -159,14 +137,16 @@ def find_version(*file_paths): ( "index", "attrs", - u"attrs Documentation", - u"Hynek Schlawack", + "attrs Documentation", + "Hynek Schlawack", "attrs", - "One line description of project.", + "Python Classes Without Boilerplate", "Miscellaneous", ) ] +epub_description = "Python Classes Without Boilerplate" + intersphinx_mapping = { "https://docs.python.org/3": None, } diff --git a/docs/contributing.rst b/docs/contributing.rst deleted file mode 100644 index acb527b23..000000000 --- a/docs/contributing.rst +++ /dev/null @@ -1,5 +0,0 @@ -.. _contributing: - -.. include:: ../.github/CONTRIBUTING.rst - -.. include:: ../.github/CODE_OF_CONDUCT.rst diff --git a/docs/examples.rst b/docs/examples.rst index 0fac312a0..ae5ffa78e 100644 --- a/docs/examples.rst +++ b/docs/examples.rst @@ -9,9 +9,9 @@ The simplest possible usage is: .. doctest:: - >>> import attr - >>> @attr.s - ... 
class Empty(object): + >>> from attrs import define, field + >>> @define + ... class Empty: ... pass >>> Empty() Empty() @@ -26,10 +26,10 @@ But you'll usually want some data on your classes, so let's add some: .. doctest:: - >>> @attr.s - ... class Coordinates(object): - ... x = attr.ib() - ... y = attr.ib() + >>> @define + ... class Coordinates: + ... x: int + ... y: int By default, all features are added, so you immediately have a fully functional data class with a nice ``repr`` string and comparison methods. @@ -46,27 +46,13 @@ By default, all features are added, so you immediately have a fully functional d As shown, the generated ``__init__`` method allows for both positional and keyword arguments. -If playful naming turns you off, ``attrs`` comes with serious-business aliases: - -.. doctest:: - - >>> from attr import attrs, attrib - >>> @attrs - ... class SeriousCoordinates(object): - ... x = attrib() - ... y = attrib() - >>> SeriousCoordinates(1, 2) - SeriousCoordinates(x=1, y=2) - >>> attr.fields(Coordinates) == attr.fields(SeriousCoordinates) - True - For private attributes, ``attrs`` will strip the leading underscores for keyword arguments: .. doctest:: - >>> @attr.s - ... class C(object): - ... _x = attr.ib() + >>> @define + ... class C: + ... _x: int >>> C(x=1) C(_x=1) @@ -74,9 +60,9 @@ If you want to initialize your private attributes yourself, you can do that too: .. doctest:: - >>> @attr.s - ... class C(object): - ... _x = attr.ib(init=False, default=42) + >>> @define + ... class C: + ... _x: int = field(init=False, default=42) >>> C() C(_x=42) >>> C(23) @@ -89,12 +75,12 @@ This is useful in times when you want to enhance classes that are not yours (nic .. doctest:: - >>> class SomethingFromSomeoneElse(object): + >>> class SomethingFromSomeoneElse: ... def __init__(self, x): ... self.x = x - >>> SomethingFromSomeoneElse = attr.s( + >>> SomethingFromSomeoneElse = define( ... these={ - ... "x": attr.ib() + ... "x": field() ... 
}, init=False)(SomethingFromSomeoneElse) >>> SomethingFromSomeoneElse(1) SomethingFromSomeoneElse(x=1) @@ -104,17 +90,17 @@ This is useful in times when you want to enhance classes that are not yours (nic .. doctest:: - >>> @attr.s - ... class A(object): - ... a = attr.ib() + >>> @define(slots=False) + ... class A: + ... a: int ... def get_a(self): ... return self.a - >>> @attr.s - ... class B(object): - ... b = attr.ib() - >>> @attr.s - ... class C(A, B): - ... c = attr.ib() + >>> @define(slots=False) + ... class B: + ... b: int + >>> @define(slots=False) + ... class C(B, A): + ... c: int >>> i = C(1, 2, 3) >>> i C(a=1, b=2, c=3) @@ -123,25 +109,9 @@ This is useful in times when you want to enhance classes that are not yours (nic >>> i.get_a() 1 -The order of the attributes is defined by the `MRO `_. - -In Python 3, classes defined within other classes are `detected `_ and reflected in the ``__repr__``. -In Python 2 though, it's impossible. -Therefore ``@attr.s`` comes with the ``repr_ns`` option to set it manually: - -.. doctest:: - - >>> @attr.s - ... class C(object): - ... @attr.s(repr_ns="C") - ... class D(object): - ... pass - >>> C.D() - C.D() - -``repr_ns`` works on both Python 2 and 3. -On Python 3 it overrides the implicit detection. +:term:`Slotted classes `, which are the default for the new APIs, don't play well with multiple inheritance, so we don't use them in the example. +The order of the attributes is defined by the `MRO `_. Keyword-only Attributes ~~~~~~~~~~~~~~~~~~~~~~~ @@ -150,9 +120,9 @@ You can also add `keyword-only >> @attr.s + >>> @define ... class A: - ... a = attr.ib(kw_only=True) + ... a: int = field(kw_only=True) >>> A() Traceback (most recent call last): ... @@ -160,14 +130,14 @@ You can also add `keyword-only >> A(a=1) A(a=1) -``kw_only`` may also be specified at via ``attr.s``, and will apply to all attributes: +``kw_only`` may also be specified via ``define``, and will apply to all attributes: .. 
doctest:: - >>> @attr.s(kw_only=True) + >>> @define(kw_only=True) ... class A: - ... a = attr.ib() - ... b = attr.ib() + ... a: int + ... b: int >>> A(1, 2) Traceback (most recent call last): ... @@ -183,12 +153,12 @@ Keyword-only attributes allow subclasses to add attributes without default value .. doctest:: - >>> @attr.s + >>> @define ... class A: - ... a = attr.ib(default=0) - >>> @attr.s + ... a: int = 0 + >>> @define ... class B(A): - ... b = attr.ib(kw_only=True) + ... b: int = field(kw_only=True) >>> B(b=1) B(a=0, b=1) >>> B() @@ -196,19 +166,19 @@ Keyword-only attributes allow subclasses to add attributes without default value ... TypeError: B() missing 1 required keyword-only argument: 'b' -If you don't set ``kw_only=True``, then there's is no valid attribute ordering and you'll get an error: +If you don't set ``kw_only=True``, then there is no valid attribute ordering, and you'll get an error: .. doctest:: - >>> @attr.s + >>> @define ... class A: - ... a = attr.ib(default=0) - >>> @attr.s + ... a: int = 0 + >>> @define ... class B(A): - ... b = attr.ib() + ... b: int Traceback (most recent call last): ... - ValueError: No mandatory attributes allowed after an attribute with a default value or factory. Attribute in question: Attribute(name='b', default=NOTHING, validator=None, repr=True, cmp=True, hash=None, init=True, converter=None, metadata=mappingproxy({}), type=None, kw_only=False) + ValueError: No mandatory attributes allowed after an attribute with a default value or factory. Attribute in question: Attribute(name='b', default=NOTHING, validator=None, repr=True, cmp=True, hash=None, init=True, converter=None, metadata=mappingproxy({}), type=int, kw_only=False) .. _asdict: @@ -219,46 +189,55 @@ When you have a class with data, it often is very convenient to transform that c .. 
doctest:: - >>> attr.asdict(Coordinates(x=1, y=2)) + >>> from attrs import asdict + + >>> asdict(Coordinates(x=1, y=2)) {'x': 1, 'y': 2} Some fields cannot or should not be transformed. -For that, `attr.asdict` offers a callback that decides whether an attribute should be included: +For that, `attrs.asdict` offers a callback that decides whether an attribute should be included: .. doctest:: - >>> @attr.s - ... class UserList(object): - ... users = attr.ib() - >>> @attr.s - ... class User(object): - ... email = attr.ib() - ... password = attr.ib() - >>> attr.asdict(UserList([User("jane@doe.invalid", "s33kred"), - ... User("joe@doe.invalid", "p4ssw0rd")]), - ... filter=lambda attr, value: attr.name != "password") + >>> @define + ... class User: + ... email: str + ... password: str + + >>> @define + ... class UserList: + ... users: list[User] + + >>> asdict(UserList([User("jane@doe.invalid", "s33kred"), + ... User("joe@doe.invalid", "p4ssw0rd")]), + ... filter=lambda attr, value: attr.name != "password") {'users': [{'email': 'jane@doe.invalid'}, {'email': 'joe@doe.invalid'}]} For the common case where you want to `include ` or `exclude ` certain types or attributes, ``attrs`` ships with a few helpers: .. doctest:: - >>> @attr.s - ... class User(object): - ... login = attr.ib() - ... password = attr.ib() - ... id = attr.ib() - >>> attr.asdict( + >>> from attrs import asdict, filters, fields + + >>> @define + ... class User: + ... login: str + ... password: str + ... id: int + + >>> asdict( ... User("jane", "s33kred", 42), - ... filter=attr.filters.exclude(attr.fields(User).password, int)) + ... filter=filters.exclude(fields(User).password, int)) {'login': 'jane'} - >>> @attr.s - ... class C(object): - ... x = attr.ib() - ... y = attr.ib() - ... z = attr.ib() - >>> attr.asdict(C("foo", "2", 3), - ... filter=attr.filters.include(int, attr.fields(C).x)) + + >>> @define + ... class C: + ... x: str + ... y: str + ... z: int + + >>> asdict(C("foo", "2", 3), + ... 
filter=filters.include(int, fields(C).x)) {'x': 'foo', 'z': 3} Other times, all you want is a tuple and ``attrs`` won't let you down: @@ -266,45 +245,50 @@ Other times, all you want is a tuple and ``attrs`` won't let you down: .. doctest:: >>> import sqlite3 - >>> import attr - >>> @attr.s + >>> from attrs import astuple + + >>> @define ... class Foo: - ... a = attr.ib() - ... b = attr.ib() + ... a: int + ... b: int + >>> foo = Foo(2, 3) >>> with sqlite3.connect(":memory:") as conn: ... c = conn.cursor() ... c.execute("CREATE TABLE foo (x INTEGER PRIMARY KEY ASC, y)") #doctest: +ELLIPSIS - ... c.execute("INSERT INTO foo VALUES (?, ?)", attr.astuple(foo)) #doctest: +ELLIPSIS + ... c.execute("INSERT INTO foo VALUES (?, ?)", astuple(foo)) #doctest: +ELLIPSIS ... foo2 = Foo(*c.execute("SELECT x, y FROM foo").fetchone()) >>> foo == foo2 True +For more advanced transformations and conversions, we recommend you look at a companion library (such as `cattrs `_). Defaults -------- Sometimes you want to have default values for your initializer. -And sometimes you even want mutable objects as default values (ever used accidentally ``def f(arg=[])``?). +And sometimes you even want mutable objects as default values (ever accidentally used ``def f(arg=[])``?). ``attrs`` has you covered in both cases: .. doctest:: >>> import collections - >>> @attr.s - ... class Connection(object): - ... socket = attr.ib() + + >>> @define + ... class Connection: + ... socket: int ... @classmethod ... def connect(cls, db_string): ... # ... connect somehow to db_string ... ... return cls(socket=42) - >>> @attr.s - ... class ConnectionPool(object): - ... db_string = attr.ib() - ... pool = attr.ib(default=attr.Factory(collections.deque)) - ... debug = attr.ib(default=False) + + >>> @define + ... class ConnectionPool: + ... db_string: str + ... pool: collections.deque = Factory(collections.deque) + ... debug: bool = False ... def get_connection(self): ... try: ... 
return self.pool.pop() @@ -327,33 +311,26 @@ And sometimes you even want mutable objects as default values (ever used acciden >>> cp ConnectionPool(db_string='postgres://localhost', pool=deque([Connection(socket=42)]), debug=False) -More information on why class methods for constructing objects are awesome can be found in this insightful `blog post `_. +More information on why class methods for constructing objects are awesome can be found in this insightful `blog post `_. -Default factories can also be set using a decorator. +Default factories can also be set using the ``factory`` argument to ``field``, and using a decorator. The method receives the partially initialized instance which enables you to base a default value on other attributes: .. doctest:: - >>> @attr.s - ... class C(object): - ... x = attr.ib(default=1) - ... y = attr.ib() + >>> @define + ... class C: + ... x: int = 1 + ... y: int = field() ... @y.default ... def _any_name_except_a_name_of_an_attribute(self): ... return self.x + 1 + ... z: list = field(factory=list) >>> C() - C(x=1, y=2) - - -And since the case of ``attr.ib(default=attr.Factory(f))`` is so common, ``attrs`` also comes with syntactic sugar for it: - -.. doctest:: + C(x=1, y=2, z=[]) - >>> @attr.s - ... class C(object): - ... x = attr.ib(factory=list) - >>> C() - C(x=[]) +Please keep in mind that the decorator approach *only* works if the attribute in question has a ``field`` assigned to it. +As a result, annotating an attribute with a type is *not* enough if you use ``@default``. .. _examples_validators: @@ -368,9 +345,9 @@ You can use a decorator: .. doctest:: - >>> @attr.s - ... class C(object): - ... x = attr.ib() + >>> @define + ... class C: + ... x: int = field() ... @x.validator ... def check(self, attribute, value): ... if value > 42: @@ -386,14 +363,16 @@ You can use a decorator: .. doctest:: + >>> from attrs import validators + >>> def x_smaller_than_y(instance, attribute, value): ... if value >= instance.y: ... 
raise ValueError("'x' has to be smaller than 'y'!") - >>> @attr.s - ... class C(object): - ... x = attr.ib(validator=[attr.validators.instance_of(int), - ... x_smaller_than_y]) - ... y = attr.ib() + >>> @define + ... class C: + ... x: int = field(validator=[validators.instance_of(int), + ... x_smaller_than_y]) + ... y: int >>> C(x=3, y=4) C(x=3, y=4) >>> C(x=4, y=3) @@ -405,9 +384,9 @@ You can use a decorator: .. doctest:: - >>> @attr.s - ... class C(object): - ... x = attr.ib(validator=attr.validators.instance_of(int)) + >>> @define + ... class C: + ... x: int = field(validator=validators.instance_of(int)) ... @x.validator ... def fits_byte(self, attribute, value): ... if not 0 <= value < 256: @@ -417,22 +396,22 @@ You can use a decorator: >>> C("128") Traceback (most recent call last): ... - TypeError: ("'x' must be (got '128' that is a ).", Attribute(name='x', default=NOTHING, validator=[>, ], repr=True, cmp=True, hash=True, init=True, metadata=mappingproxy({}), type=None, converter=None, kw_only=False), , '128') + TypeError: ("'x' must be (got '128' that is a ).", Attribute(name='x', default=NOTHING, validator=[>, ], repr=True, cmp=True, hash=True, init=True, metadata=mappingproxy({}), type=int, converter=None, kw_only=False), , '128') >>> C(256) Traceback (most recent call last): ... ValueError: value out of bounds -Please note that the decorator approach only works if -- and only if! -- the attribute in question has an ``attr.ib`` assigned. -Therefore if you use ``@attr.s(auto_attribs=True)``, it is *not* enough to decorate said attribute with a type. +Please note that the decorator approach only works if -- and only if! -- the attribute in question has a ``field`` assigned. +Therefore if you use ``@validator``, it is *not* enough to annotate said attribute with a type. ``attrs`` ships with a bunch of validators, make sure to `check them out ` before writing your own: .. doctest:: - >>> @attr.s - ... class C(object): - ... 
x = attr.ib(validator=attr.validators.instance_of(int)) + >>> @define + ... class C: + ... x: int = field(validator=validators.instance_of(int)) >>> C(42) C(x=42) >>> C("42") @@ -440,6 +419,9 @@ Therefore if you use ``@attr.s(auto_attribs=True)``, it is *not* enough to decor ... TypeError: ("'x' must be (got '42' that is a ).", Attribute(name='x', default=NOTHING, factory=NOTHING, validator=>, type=None, kw_only=False), , '42') +Please note that if you use `attr.s` (and not `attrs.define`) to define your class, validators only run on initialization by default. +This behavior can be changed using the ``on_setattr`` argument. + Check out `validators` for more details. @@ -451,13 +433,15 @@ This can be useful for doing type-conversions on values that you don't want to f .. doctest:: - >>> @attr.s - ... class C(object): - ... x = attr.ib(converter=int) + >>> @define + ... class C: + ... x: int = field(converter=int) >>> o = C("1") >>> o.x 1 +Please note that converters only run on initialization. + Check out `converters` for more details. @@ -470,12 +454,14 @@ All ``attrs`` attributes may include arbitrary metadata in the form of a read-on .. doctest:: - >>> @attr.s - ... class C(object): - ... x = attr.ib(metadata={'my_metadata': 1}) - >>> attr.fields(C).x.metadata + >>> from attrs import fields + + >>> @define + ... class C: + ... x = field(metadata={'my_metadata': 1}) + >>> fields(C).x.metadata mappingproxy({'my_metadata': 1}) - >>> attr.fields(C).x.metadata['my_metadata'] + >>> fields(C).x.metadata['my_metadata'] 1 Metadata is not used by ``attrs``, and is meant to enable rich functionality in third-party libraries. 
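[Editor's sketch, not part of the diff above: the notes in this hunk state that `attr.s` validators and converters run only on initialization unless ``on_setattr`` says otherwise. The class names below are illustrative, not from the docs.]

```python
import attr

# With the old @attr.s API, validators run only inside __init__:
@attr.s
class Old:
    x = attr.ib(validator=attr.validators.instance_of(int))

o = Old(1)
o.x = "nope"  # silently accepted: validators are not re-run on assignment
assert o.x == "nope"

# Opting in via on_setattr re-runs validators on every assignment
# (attrs.define does this by default):
@attr.s(on_setattr=attr.setters.validate)
class Checked:
    x = attr.ib(validator=attr.validators.instance_of(int))

c = Checked(1)
try:
    c.x = "nope"
except TypeError:
    pass  # instance_of rejects the string on assignment
else:
    raise AssertionError("expected TypeError on assignment")
```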
@@ -487,41 +473,47 @@ If you're the author of a third-party library with ``attrs`` integration, please Types ----- -``attrs`` also allows you to associate a type with an attribute using either the *type* argument to `attr.ib` or -- as of Python 3.6 -- using `PEP 526 `_-annotations: +``attrs`` also allows you to associate a type with an attribute using either the *type* argument to `attr.ib` or -- as of Python 3.6 -- using :pep:`526`-annotations: .. doctest:: + >>> from attrs import fields + + >>> @define + ... class C: + ... x: int + >>> fields(C).x.type + + + >>> import attr >>> @attr.s ... class C: ... x = attr.ib(type=int) - ... y: int = attr.ib() - >>> attr.fields(C).x.type - - >>> attr.fields(C).y.type + >>> fields(C).x.type -If you don't mind annotating *all* attributes, you can even drop the `attr.ib` and assign default values instead: +If you don't mind annotating *all* attributes, you can even drop the `attrs.field` and assign default values instead: .. doctest:: >>> import typing - >>> @attr.s(auto_attribs=True) + >>> from attrs import fields + + >>> @define ... class AutoC: ... cls_var: typing.ClassVar[int] = 5 # this one is ignored - ... l: typing.List[int] = attr.Factory(list) + ... l: list[int] = Factory(list) ... x: int = 1 - ... foo: str = attr.ib( - ... default="every attrib needs a type if auto_attribs=True" - ... ) + ... foo: str = "every attrib needs a type if auto_attribs=True" ... 
bar: typing.Any = None - >>> attr.fields(AutoC).l.type - typing.List[int] - >>> attr.fields(AutoC).x.type + >>> fields(AutoC).l.type + list[int] + >>> fields(AutoC).x.type - >>> attr.fields(AutoC).foo.type + >>> fields(AutoC).foo.type - >>> attr.fields(AutoC).bar.type + >>> fields(AutoC).bar.type typing.Any >>> AutoC() AutoC(l=[], x=1, foo='every attrib needs a type if auto_attribs=True', bar=None) @@ -531,32 +523,38 @@ If you don't mind annotating *all* attributes, you can even drop the `attr.ib` a The generated ``__init__`` method will have an attribute called ``__annotations__`` that contains this type information. If your annotations contain strings (e.g. forward references), -you can resolve these after all references have been defined by using :func:`attr.resolve_types`. +you can resolve these after all references have been defined by using :func:`attrs.resolve_types`. This will replace the *type* attribute in the respective fields. .. doctest:: - >>> import typing - >>> @attr.s(auto_attribs=True) + >>> from attrs import fields, resolve_types + + >>> @define ... class A: - ... a: typing.List['A'] + ... a: 'list[A]' ... b: 'B' ... - >>> @attr.s(auto_attribs=True) + >>> @define ... class B: ... a: A ... - >>> attr.fields(A).a.type - typing.List[ForwardRef('A')] - >>> attr.fields(A).b.type + >>> fields(A).a.type + 'list[A]' + >>> fields(A).b.type 'B' - >>> attr.resolve_types(A, globals(), locals()) + >>> resolve_types(A, globals(), locals()) - >>> attr.fields(A).a.type - typing.List[A] - >>> attr.fields(A).b.type + >>> fields(A).a.type + list[A] + >>> fields(A).b.type +.. note:: + + If you find yourself using string type annotations to handle forward references, wrap the entire type annotation in quotes instead of only the type you need a forward reference to (so ``'list[A]'`` instead of ``list['A']``). + This is a limitation of the Python typing system. + .. warning:: ``attrs`` itself doesn't have any features that work on top of type metadata *yet*. 
@@ -567,14 +565,16 @@ Slots ----- :term:`Slotted classes ` have several advantages on CPython. -Defining ``__slots__`` by hand is tedious, in ``attrs`` it's just a matter of passing ``slots=True``: +Defining ``__slots__`` by hand is tedious; in ``attrs``, it's just a matter of using `attrs.define` or passing ``slots=True`` to `attr.s`: .. doctest:: + >>> import attr + >>> @attr.s(slots=True) - ... class Coordinates(object): - ... x = attr.ib() - ... y = attr.ib() + ... class Coordinates: + ... x: int + ... y: int Immutability @@ -585,10 +585,11 @@ Immutability is especially popular in functional programming and is generally a If you'd like to enforce it, ``attrs`` will try to help: .. doctest:: + >>> from attrs import frozen - >>> @attr.s(frozen=True) - ... class C(object): - ... x = attr.ib() + >>> @frozen + ... class C: + ... x: int >>> i = C(1) >>> i.x = 2 Traceback (most recent call last): @@ -605,14 +606,16 @@ In Clojure that function is called `assoc >> @attr.s(frozen=True) - ... class C(object): - ... x = attr.ib() - ... y = attr.ib() + >>> from attrs import evolve, frozen + + >>> @frozen + ... class C: + ... x: int + ... y: int >>> i1 = C(1, 2) >>> i1 C(x=1, y=2) - >>> i2 = attr.evolve(i1, y=3) + >>> i2 = evolve(i1, y=3) >>> i2 C(x=1, y=3) >>> i1 == i2 @@ -623,25 +626,28 @@ Other Goodies ------------- Sometimes you may want to create a class programmatically. -``attrs`` won't let you down and gives you `attr.make_class` : +``attrs`` won't let you down and gives you `attrs.make_class`: .. doctest:: - >>> @attr.s - ... class C1(object): - ... x = attr.ib() - ... y = attr.ib() - >>> C2 = attr.make_class("C2", ["x", "y"]) - >>> attr.fields(C1) == attr.fields(C2) + >>> from attrs import fields, make_class + >>> @define + ... class C1: + ... x = field() + ... 
y = field() + >>> C2 = make_class("C2", ["x", "y"]) + >>> fields(C1) == fields(C2) True -You can still have power over the attributes if you pass a dictionary of name: ``attr.ib`` mappings and can pass arguments to ``@attr.s``: +You can still have power over the attributes if you pass a dictionary of name: ``field`` mappings and can pass arguments to ``@attr.s``: .. doctest:: - >>> C = attr.make_class("C", {"x": attr.ib(default=42), - ... "y": attr.ib(default=attr.Factory(list))}, - ... repr=False) + >>> from attrs import make_class + + >>> C = make_class("C", {"x": field(default=42), + ... "y": field(default=Factory(list))}, + ... repr=False) >>> i = C() >>> i # no repr added! <__main__.C object at ...> @@ -650,30 +656,32 @@ You can still have power over the attributes if you pass a dictionary of name: ` >>> i.y [] -If you need to dynamically make a class with `attr.make_class` and it needs to be a subclass of something else than ``object``, use the ``bases`` argument: +If you need to dynamically make a class with `attrs.make_class` and it needs to be a subclass of something other than ``object``, use the ``bases`` argument: .. doctest:: - >>> class D(object): - ... def __eq__(self, other): - ... return True # arbitrary example - >>> C = attr.make_class("C", {}, bases=(D,), cmp=False) - >>> isinstance(C(), D) - True + >>> from attrs import make_class + + >>> class D: + ... def __eq__(self, other): + ... return True # arbitrary example + >>> C = make_class("C", {}, bases=(D,), cmp=False) + >>> isinstance(C(), D) + True Sometimes, you want to have your class's ``__init__`` method do more than just the initialization, validation, etc. that gets done for you automatically when -using ``@attr.s``. +using ``@define``. To do this, just define a ``__attrs_post_init__`` method in your class. It will get called at the end of the generated ``__init__`` method. .. doctest:: - >>> @attr.s - ... class C(object): - ... x = attr.ib() - ... y = attr.ib() - ... 
z = attr.ib(init=False) + >>> @define + ... class C: + ... x: int + ... y: int + ... z: int = field(init=False) ... ... def __attrs_post_init__(self): ... self.z = self.x + self.y @@ -685,10 +693,10 @@ You can exclude single attributes from certain methods: .. doctest:: - >>> @attr.s - ... class C(object): - ... user = attr.ib() - ... password = attr.ib(repr=False) + >>> @define + ... class C: + ... user: str + ... password: str = field(repr=False) >>> C("me", "s3kr3t") C(user='me') @@ -696,9 +704,9 @@ Alternatively, to influence how the generated ``__repr__()`` method formats a sp .. doctest:: - >>> @attr.s - ... class C(object): - ... user = attr.ib() - ... password = attr.ib(repr=lambda value: '***') + >>> @define + ... class C: + ... user: str + ... password: str = field(repr=lambda value: '***') >>> C("me", "s3kr3t") C(user='me', password=***) diff --git a/docs/extending.rst b/docs/extending.rst index fed39a306..d5775adcc 100644 --- a/docs/extending.rst +++ b/docs/extending.rst @@ -2,26 +2,26 @@ Extending ========= Each ``attrs``-decorated class has a ``__attrs_attrs__`` class attribute. -It is a tuple of `attr.Attribute` carrying meta-data about each attribute. +It's a tuple of `attrs.Attribute` carrying metadata about each attribute. So it is fairly simple to build your own decorators on top of ``attrs``: .. doctest:: - >>> import attr + >>> from attr import define >>> def print_attrs(cls): ... print(cls.__attrs_attrs__) ... return cls >>> @print_attrs - ... @attr.s - ... class C(object): - ... a = attr.ib() - (Attribute(name='a', default=NOTHING, validator=None, repr=True, eq=True, order=True, hash=None, init=True, metadata=mappingproxy({}), type=None, converter=None, kw_only=False, inherited=False, on_setattr=None),) + ... @define + ... class C: + ... 
a: int + (Attribute(name='a', default=NOTHING, validator=None, repr=True, eq=True, eq_key=None, order=True, order_key=None, hash=None, init=True, metadata=mappingproxy({}), type=, converter=None, kw_only=False, inherited=False, on_setattr=None),) .. warning:: - The `attr.s` decorator **must** be applied first because it puts ``__attrs_attrs__`` in place! + The `attrs.define`/`attr.s` decorator **must** be applied first because it puts ``__attrs_attrs__`` in place! That means that it has to come *after* your decorator because:: @a @@ -46,7 +46,10 @@ An example for that is the package `environ-config `_ mypy's ``attrs`` plugin. +Mypy +^^^^ + +Unfortunately, decorator wrapping currently `confuses `_ mypy's ``attrs`` plugin. At the moment, the best workaround is to hold your nose, write a fake mypy plugin, and mutate a bunch of global variables:: from mypy.plugin import Plugin @@ -86,27 +89,56 @@ Then tell mypy about your plugin using your project's ``mypy.ini``: Please note that it is currently *impossible* to let mypy know that you've changed defaults like *eq* or *order*. You can only use this trick to tell mypy that a class is actually an ``attrs`` class. +Pyright +^^^^^^^ + +Generic decorator wrapping is supported in `pyright `_ via their dataclass_transform_ specification. + +For a custom wrapping of the form:: + + def custom_define(f): + return attr.define(f) + +This is implemented via a ``__dataclass_transform__`` type decorator in the custom extension's ``.pyi`` of the form:: + + def __dataclass_transform__( + *, + eq_default: bool = True, + order_default: bool = False, + kw_only_default: bool = False, + field_descriptors: Tuple[Union[type, Callable[..., Any]], ...] = (()), + ) -> Callable[[_T], _T]: ... + + @__dataclass_transform__(field_descriptors=(attr.attrib, attr.field)) + def custom_define(f): ... + +.. warning:: + + ``dataclass_transform`` is supported **provisionally** as of ``pyright`` 1.1.135. 
+ + Both the ``pyright`` dataclass_transform_ specification and ``attrs`` implementation may change in future versions. + Types ----- ``attrs`` offers two ways of attaching type information to attributes: -- `PEP 526 `_ annotations on Python 3.6 and later, +- :pep:`526` annotations on Python 3.6 and later, - and the *type* argument to `attr.ib`. This information is available to you: .. doctest:: - >>> import attr - >>> @attr.s - ... class C(object): - ... x: int = attr.ib() - ... y = attr.ib(type=str) - >>> attr.fields(C).x.type + >>> from attr import attrib, define, field, fields + >>> @define + ... class C: + ... x: int = field() + ... y = attrib(type=str) + >>> fields(C).x.type - >>> attr.fields(C).y.type + >>> fields(C).y.type Currently, ``attrs`` doesn't do anything with this information but it's very useful if you'd like to write your own validators or serializers! @@ -128,36 +160,37 @@ Here are some tips for effective use of metadata: from mylib import MY_METADATA_KEY - @attr.s - class C(object): - x = attr.ib(metadata={MY_METADATA_KEY: 1}) + @define + class C: + x = field(metadata={MY_METADATA_KEY: 1}) Metadata should be composable, so consider supporting this approach even if you decide implementing your metadata in one of the following ways. -- Expose ``attr.ib`` wrappers for your specific metadata. +- Expose ``field`` wrappers for your specific metadata. This is a more graceful approach if your users don't require metadata from other libraries. .. doctest:: + >>> from attr import fields, NOTHING >>> MY_TYPE_METADATA = '__my_type_metadata' >>> >>> def typed( - ... cls, default=attr.NOTHING, validator=None, repr=True, - ... eq=True, order=None, hash=None, init=True, metadata={}, - ... type=None, converter=None + ... cls, default=NOTHING, validator=None, repr=True, + ... eq=True, order=None, hash=None, init=True, metadata=None, + ... converter=None ... ): - ... metadata = dict() if not metadata else metadata + ... metadata = metadata or {} ... 
metadata[MY_TYPE_METADATA] = cls - ... return attr.ib( + ... return field( ... default=default, validator=validator, repr=repr, ... eq=eq, order=order, hash=hash, init=init, - ... metadata=metadata, type=type, converter=converter + ... metadata=metadata, converter=converter ... ) >>> - >>> @attr.s - ... class C(object): - ... x = typed(int, default=1, init=False) - >>> attr.fields(C).x.metadata[MY_TYPE_METADATA] + >>> @define + ... class C: + ... x: int = typed(int, default=1, init=False) + >>> fields(C).x.metadata[MY_TYPE_METADATA] @@ -172,13 +205,13 @@ Its main purpose is to automatically add converters to attributes based on their This hook must have the following signature: -.. function:: your_hook(cls: type, fields: List[attr.Attribute]) -> List[attr.Attribute] +.. function:: your_hook(cls: type, fields: list[attrs.Attribute]) -> list[attrs.Attribute] :noindex: - *cls* is your class right *before* it is being converted into an attrs class. This means it does not yet have the ``__attrs_attrs__`` attribute. -- *fields* is a list of all :class:`attr.Attribute` instances that will later be set to ``__attrs_attrs__``. +- *fields* is a list of all `attrs.Attribute` instances that will later be set to ``__attrs_attrs__``. You can modify these attributes any way you want: You can add converters, change types, and even remove attributes completely or create new ones! @@ -189,7 +222,7 @@ For example, let's assume that you really don't like floats: >>> def drop_floats(cls, fields): ... return [f for f in fields if f.type not in {float, 'float'}] ... - >>> @attr.frozen(field_transformer=drop_floats) + >>> @frozen(field_transformer=drop_floats) ... class Data: ... a: int ... b: float @@ -217,7 +250,7 @@ A more realistic example would be to automatically convert data that you, e.g., ... results.append(field.evolve(converter=converter)) ... return results ... - >>> @attr.frozen(field_transformer=auto_convert) + >>> @frozen(field_transformer=auto_convert) ... class Data: ... 
a: int ... b: str @@ -231,19 +264,20 @@ A more realistic example would be to automatically convert data that you, e.g., Customize Value Serialization in ``asdict()`` --------------------------------------------- -``attrs`` allows you to serialize instances of ``attrs`` classes to dicts using the `attr.asdict` function. +``attrs`` allows you to serialize instances of ``attrs`` classes to dicts using the `attrs.asdict` function. However, the result can not always be serialized since most data types will remain as they are: .. doctest:: >>> import json >>> import datetime + >>> from attrs import asdict >>> - >>> @attr.frozen + >>> @frozen ... class Data: ... dt: datetime.datetime ... - >>> data = attr.asdict(Data(datetime.datetime(2020, 5, 4, 13, 37))) + >>> data = asdict(Data(datetime.datetime(2020, 5, 4, 13, 37))) >>> data {'dt': datetime.datetime(2020, 5, 4, 13, 37)} >>> json.dumps(data) @@ -254,17 +288,18 @@ However, the result can not always be serialized since most data types will rema To help you with this, `attr.asdict` allows you to pass a *value_serializer* hook. It has the signature -.. function:: your_hook(inst: type, field: attr.Attribute, value: typing.Any) -> typing.Any +.. function:: your_hook(inst: type, field: attrs.Attribute, value: typing.Any) -> typing.Any :noindex: .. doctest:: + >>> from attr import asdict >>> def serialize(inst, field, value): ... if isinstance(value, datetime.datetime): ... return value.isoformat() ... return value ... - >>> data = attr.asdict( + >>> data = asdict( ... Data(datetime.datetime(2020, 5, 4, 13, 37)), ... value_serializer=serialize, ... ) @@ -272,3 +307,7 @@ It has the signature {'dt': '2020-05-04T13:37:00'} >>> json.dumps(data) '{"dt": "2020-05-04T13:37:00"}' + +***** + +.. 
_dataclass_transform: https://github.com/microsoft/pyright/blob/master/specs/dataclass_transforms.md diff --git a/docs/glossary.rst b/docs/glossary.rst index 8bd53556b..c270a8cab 100644 --- a/docs/glossary.rst +++ b/docs/glossary.rst @@ -3,15 +3,24 @@ Glossary .. glossary:: + dunder methods + "Dunder" is a contraction of "double underscore". + + It's methods like ``__init__`` or ``__eq__`` that are sometimes also called *magic methods* or it's said that they implement an *object protocol*. + + In spoken form, you'd call ``__init__`` just "dunder init". + + Its first documented use is a `mailing list posting `_ by Mark Jackson from 2002. + dict classes A regular class whose attributes are stored in the `object.__dict__` attribute of every single instance. This is quite wasteful especially for objects with very few data attributes and the space consumption can become significant when creating large numbers of instances. - This is the type of class you get by default both with and without ``attrs`` (except with the next APIs `attr.define`, `attr.mutable`, and `attr.frozen`). + This is the type of class you get by default both with and without ``attrs`` (except with the next APIs `attrs.define()`, `attrs.mutable()`, and `attrs.frozen()`). slotted classes A class whose instances have no `object.__dict__` attribute and `define `_ their attributes in a `object.__slots__` attribute instead. - In ``attrs``, they are created by passing ``slots=True`` to ``@attr.s`` (and are on by default in `attr.define`/`attr.mutable`/`attr.frozen`). + In ``attrs``, they are created by passing ``slots=True`` to ``@attr.s`` (and are on by default in `attrs.define()`/`attrs.mutable()`/`attrs.frozen()`). Their main advantage is that they use less memory on CPython [#pypy]_ and are slightly faster. @@ -22,11 +31,11 @@ Glossary .. doctest:: - >>> import attr - >>> @attr.s(slots=True) - ... class Coordinates(object): - ... x = attr.ib() - ... 
y = attr.ib() + >>> from attr import define + >>> @define + ... class Coordinates: + ... x: int + ... y: int ... >>> c = Coordinates(x=1, y=2) >>> c.z = 3 @@ -47,9 +56,9 @@ Glossary .. doctest:: >>> import attr, unittest.mock - >>> @attr.s(slots=True) - ... class Slotted(object): - ... x = attr.ib() + >>> @define + ... class Slotted: + ... x: int ... ... def method(self): ... return self.x @@ -61,7 +70,7 @@ Glossary Traceback (most recent call last): ... AttributeError: 'Slotted' object attribute 'method' is read-only - >>> @attr.s # implies 'slots=False' + >>> @define(slots=False) ... class Dicted(Slotted): ... pass >>> d = Dicted(42) @@ -71,7 +80,7 @@ Glossary ... assert 23 == d.method() - Slotted classes must implement :meth:`__getstate__ ` and :meth:`__setstate__ ` to be serializable with `pickle` protocol 0 and 1. - Therefore, ``attrs`` creates these methods automatically for ``slots=True`` classes (Python 2 uses protocol 0 by default). + Therefore, ``attrs`` creates these methods automatically for ``slots=True`` classes. .. note:: diff --git a/docs/hashing.rst b/docs/hashing.rst index 30888f97b..4f2b868e9 100644 --- a/docs/hashing.rst +++ b/docs/hashing.rst @@ -32,7 +32,7 @@ Because according to the definition_ from the official Python docs, the returned It follows that the moment you (or ``attrs``) change the way equality is handled by implementing ``__eq__`` which is based on attribute values, this constraint is broken. For that reason Python 3 will make a class that has customized equality unhashable. Python 2 on the other hand will happily let you shoot your foot off. - Unfortunately ``attrs`` currently mimics Python 2's behavior for backward compatibility reasons if you set ``hash=False``. + Unfortunately, ``attrs`` still mimics (otherwise unsupported) Python 2's behavior for backward compatibility reasons if you set ``hash=False``. The *correct way* to achieve hashing by id is to set ``@attr.s(eq=False)``. 
Setting ``@attr.s(hash=False)`` (which implies ``eq=True``) is almost certainly a *bug*. @@ -47,14 +47,14 @@ Because according to the definition_ from the official Python docs, the returned The easiest way to reset ``__hash__`` on a class is adding ``__hash__ = object.__hash__`` in the class body. -#. If two object are not equal, their hash **should** be different. +#. If two objects are not equal, their hash **should** be different. While this isn't a requirement from a standpoint of correctness, sets and dicts become less effective if there are a lot of identical hashes. The worst case is when all objects have the same hash which turns a set into a list. #. The hash of an object **must not** change. - If you create a class with ``@attr.s(frozen=True)`` this is fullfilled by definition, therefore ``attrs`` will write a ``__hash__`` function for you automatically. + If you create a class with ``@attr.s(frozen=True)`` this is fulfilled by definition, therefore ``attrs`` will write a ``__hash__`` function for you automatically. You can also force it to write one with ``hash=True`` but then it's *your* responsibility to make sure that the object is not mutated. This point is the reason why mutable structures like lists, dictionaries, or sets aren't hashable while immutable ones like tuples or frozensets are: @@ -80,7 +80,7 @@ If such objects are to be stored in hash-based collections, it can be useful to To enable caching of hash codes, pass ``cache_hash=True`` to ``@attrs``. This may only be done if ``attrs`` is already generating a hash function for the object. -.. [#fn1] The hash is computed by hashing a tuple that consists of an unique id for the class plus all attribute values. +.. [#fn1] The hash is computed by hashing a tuple that consists of a unique id for the class plus all attribute values. .. _definition: https://docs.python.org/3/glossary.html#term-hashable .. 
_`Python Hashes and Equality`: https://hynek.me/articles/hashes-and-equality/ diff --git a/docs/how-does-it-work.rst b/docs/how-does-it-work.rst index 8519c8119..c7b408341 100644 --- a/docs/how-does-it-work.rst +++ b/docs/how-does-it-work.rst @@ -10,24 +10,26 @@ Boilerplate ``attrs`` certainly isn't the first library that aims to simplify class definition in Python. But its **declarative** approach combined with **no runtime overhead** lets it stand out. -Once you apply the ``@attr.s`` decorator to a class, ``attrs`` searches the class object for instances of ``attr.ib``\ s. +Once you apply the ``@attrs.define`` (or ``@attr.s``) decorator to a class, ``attrs`` searches the class object for instances of ``attr.ib``\ s. Internally they're a representation of the data passed into ``attr.ib`` along with a counter to preserve the order of the attributes. +Alternatively, it's possible to define them using :doc:`types`. In order to ensure that subclassing works as you'd expect it to work, ``attrs`` also walks the class hierarchy and collects the attributes of all base classes. Please note that ``attrs`` does *not* call ``super()`` *ever*. -It will write dunder methods to work on *all* of those attributes which also has performance benefits due to fewer function calls. +It will write :term:`dunder methods` to work on *all* of those attributes which also has performance benefits due to fewer function calls. -Once ``attrs`` knows what attributes it has to work on, it writes the requested dunder methods and -- depending on whether you wish to have a :term:`dict ` or :term:`slotted ` class -- creates a new class for you (``slots=True``) or attaches them to the original class (``slots=False``). 
+Once ``attrs`` knows what attributes it has to work on, it writes the requested :term:`dunder methods` and -- depending on whether you wish to have a :term:`dict ` or :term:`slotted ` class -- creates a new class for you (``slots=True``) or attaches them to the original class (``slots=False``). While creating new classes is more elegant, we've run into several edge cases surrounding metaclasses that make it impossible to go this route unconditionally. To be very clear: if you define a class with a single attribute without a default value, the generated ``__init__`` will look *exactly* how you'd expect: .. doctest:: - >>> import attr, inspect - >>> @attr.s - ... class C(object): - ... x = attr.ib() + >>> import inspect + >>> from attr import define + >>> @define + ... class C: + ... x: int >>> print(inspect.getsource(C.__init__)) def __init__(self, x): self.x = x @@ -40,7 +42,7 @@ No magic, no meta programming, no expensive introspection at runtime. Everything until this point happens exactly *once* when the class is defined. As soon as a class is done, it's done. And it's just a regular Python class like any other, except for a single ``__attrs_attrs__`` attribute that ``attrs`` uses internally. -Much of the information is accessible via `attr.fields` and other functions which can be used for introspection or for writing your own tools and decorators on top of ``attrs`` (like `attr.asdict`). +Much of the information is accessible via `attrs.fields` and other functions which can be used for introspection or for writing your own tools and decorators on top of ``attrs`` (like `attrs.asdict`). And once you start instantiating your classes, ``attrs`` is out of your way completely. 
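The introspection via `attrs.fields` mentioned in this section is easy to see in action; a minimal sketch (assuming ``attrs`` ≥ 20.1.0, where ``define`` is available):

```python
from attr import define, fields

@define
class C:
    x: int = 42

# fields() returns the tuple of Attribute objects that attrs
# collected once, at class-definition time -- no runtime
# introspection cost later.
fs = fields(C)
print([f.name for f in fs])  # ['x']
print(fs[0].default)         # 42
```

Each ``Attribute`` carries everything that was passed at definition time -- defaults, validators, metadata -- which is what helpers like ``asdict`` build on.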
@@ -52,11 +54,11 @@ This **static** approach was very much a design goal of ``attrs`` and what I str Immutability ------------ -In order to give you immutability, ``attrs`` will attach a ``__setattr__`` method to your class that raises an `attr.exceptions.FrozenInstanceError` whenever anyone tries to set an attribute. +In order to give you immutability, ``attrs`` will attach a ``__setattr__`` method to your class that raises an `attrs.exceptions.FrozenInstanceError` whenever anyone tries to set an attribute. -The same is true if you choose to freeze individual attributes using the `attr.setters.frozen` *on_setattr* hook -- except that the exception becomes `attr.exceptions.FrozenAttributeError`. +The same is true if you choose to freeze individual attributes using the `attrs.setters.frozen` *on_setattr* hook -- except that the exception becomes `attrs.exceptions.FrozenAttributeError`. -Both errors subclass `attr.exceptions.FrozenError`. +Both errors subclass `attrs.exceptions.FrozenError`. ----- @@ -85,16 +87,16 @@ This is (still) slower than a plain assignment: $ pyperf timeit --rigorous \ -s "import attr; C = attr.make_class('C', ['x', 'y', 'z'], slots=True)" \ "C(1, 2, 3)" - ........................................ - Median +- std dev: 378 ns +- 12 ns + ......................................... + Mean +- std dev: 228 ns +- 18 ns $ pyperf timeit --rigorous \ -s "import attr; C = attr.make_class('C', ['x', 'y', 'z'], slots=True, frozen=True)" \ "C(1, 2, 3)" - ........................................ - Median +- std dev: 676 ns +- 16 ns + ......................................... + Mean +- std dev: 450 ns +- 26 ns -So on a laptop computer the difference is about 300 nanoseconds (1 second is 1,000,000,000 nanoseconds). +So on a laptop computer the difference is about 230 nanoseconds (1 second is 1,000,000,000 nanoseconds). It's certainly something you'll feel in a hot loop but shouldn't matter in normal code. Pick what's more important to you. 
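The exception hierarchy from the Immutability section can be observed directly; a small sketch that freezes only a single attribute via the *on_setattr* hook (assuming ``attrs`` ≥ 20.1.0):

```python
from attr import define, field, setters
from attr.exceptions import FrozenAttributeError, FrozenError

@define
class C:
    # Only x is frozen, via the per-attribute on_setattr hook.
    x: int = field(on_setattr=setters.frozen)
    y: int = 0

c = C(1)
c.y = 2  # the unfrozen attribute stays writable

try:
    c.x = 3
except FrozenAttributeError as e:
    # FrozenAttributeError subclasses FrozenError, just as
    # FrozenInstanceError does for fully frozen classes.
    assert isinstance(e, FrozenError)
```

Note that the hook only applies *after* initialization -- ``C(1)`` above assigns ``x`` without tripping it.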
@@ -102,6 +104,6 @@ Pick what's more important to you. Summary +++++++ -You should avoid instantiating lots of frozen slotted classes (i.e. ``@attr.s(slots=True, frozen=True)``) in performance-critical code. +You should avoid instantiating lots of frozen slotted classes (i.e. ``@frozen``) in performance-critical code. Frozen dict classes have barely a performance impact, unfrozen slotted classes are even *faster* than unfrozen dict classes (i.e. regular classes). diff --git a/docs/index.rst b/docs/index.rst index 70ef6c109..b637e2bee 100644 --- a/docs/index.rst +++ b/docs/index.rst @@ -1,3 +1,6 @@ +.. module:: attr +.. module:: attrs + ====================================== ``attrs``: Classes Without Boilerplate ====================================== @@ -19,39 +22,39 @@ The recommended installation method is `pip `_-i $ python -m pip install attrs -The next three steps should bring you up and running in no time: +The next steps will get you up and running in no time: - `overview` will show you a simple example of ``attrs`` in action and introduce you to its philosophy. - Afterwards, you can start writing your own classes, understand what drives ``attrs``'s design, and know what ``@attr.s`` and ``attr.ib()`` stand for. + Afterwards, you can start writing your own classes and understand what drives ``attrs``'s design. - `examples` will give you a comprehensive tour of ``attrs``'s features. After reading, you will know about our advanced features and how to use them. -- Finally `why` gives you a rundown of potential alternatives and why we think ``attrs`` is superior. - Yes, we've heard about ``namedtuple``\ s and Data Classes! +- `why` gives you a rundown of potential alternatives and why we think ``attrs`` is still worthwhile -- depending on *your* needs even superior. - If at any point you get confused by some terminology, please check out our `glossary`. 
-If you need any help while getting started, feel free to use the ``python-attrs`` tag on `StackOverflow `_ and someone will surely help you out! +If you need any help while getting started, feel free to use the ``python-attrs`` tag on `Stack Overflow `_ and someone will surely help you out! Day-to-Day Usage ================ - `types` help you to write *correct* and *self-documenting* code. - ``attrs`` has first class support for them and even allows you to drop the calls to `attr.ib` on modern Python versions! + ``attrs`` has first class support for them, yet keeps them optional if you’re not convinced! - Instance initialization is one of ``attrs`` key feature areas. Our goal is to relieve you from writing as much code as possible. `init` gives you an overview what ``attrs`` has to offer and explains some related philosophies we believe in. +- Comparing and ordering objects is a common task. + `comparison` shows you how ``attrs`` helps you with that and how you can customize it. - If you want to put objects into sets or use them as keys in dictionaries, they have to be hashable. The simplest way to do that is to use frozen classes, but the topic is more complex than it seems and `hashing` will give you a primer on what to look out for. - Once you're comfortable with the concepts, our `api` contains all information you need to use ``attrs`` to its fullest. - ``attrs`` is built for extension from the ground up. `extending` will show you the affordances it offers and how to make it a building block of your own projects. +- Finally, if you're confused by all the ``attr.s``, ``attr.ib``, ``attrs``, ``attrib``, ``define``, ``frozen``, and ``field``, head over to `names` for a very short explanation, and optionally a quick history lesson. .. 
include:: ../README.rst - :start-after: -getting-help- - :end-before: -project-information- - + :start-after: -project-information- ---- @@ -67,28 +70,19 @@ Full Table of Contents examples types init + comparison hashing api extending how-does-it-work + names glossary - -.. include:: ../README.rst - :start-after: -project-information- - .. toctree:: :maxdepth: 1 license - backward-compatibility - python-2 - contributing changelog -Indices and tables -================== - -* `genindex` -* `search` +`Full Index ` diff --git a/docs/init.rst b/docs/init.rst index c65e545d4..487dbf2b2 100644 --- a/docs/init.rst +++ b/docs/init.rst @@ -8,7 +8,7 @@ Passing complex objects into ``__init__`` and then using them to derive data for So assuming you use an ORM and want to extract 2D points from a row object, do not write code like this:: - class Point(object): + class Point: def __init__(self, database_row): self.x = database_row.x self.y = database_row.y @@ -17,10 +17,10 @@ So assuming you use an ORM and want to extract 2D points from a row object, do n Instead, write a `classmethod` that will extract it for you:: - @attr.s - class Point(object): - x = attr.ib() - y = attr.ib() + @define + class Point: + x: float + y: float @classmethod def from_row(cls, row): @@ -34,10 +34,10 @@ For similar reasons, we strongly discourage from patterns like:: pt = Point(**row.attributes) -which couples your classes to the data model. +which couples your classes to the database data model. Try to design your classes in a way that is clean and convenient to use -- not based on your database format. The database format can change anytime and you're stuck with a bad class design that is hard to change. -Embrace classmethods as a filter between reality and what's best for you to work with. +Embrace functions and classmethods as a filter between reality and what's best for you to work with. 
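A runnable sketch of the classmethod-filter idea, with ``types.SimpleNamespace`` standing in for a hypothetical ORM row (note how the extra ``z`` column never reaches the class):

```python
from types import SimpleNamespace
from attr import define

@define
class Point:
    x: float
    y: float

    @classmethod
    def from_row(cls, row):
        # All knowledge about the row layout lives here,
        # not in the class's attributes.
        return cls(x=row.x, y=row.y)

row = SimpleNamespace(x=1.0, y=2.0, z=3.0)  # pretend database row
p = Point.from_row(row)
print(p)  # Point(x=1.0, y=2.0)
```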
If you look for object serialization, there's a bunch of projects listed on our ``attrs`` extensions `Wiki page`_. Some of them even support nested schemas. @@ -51,21 +51,22 @@ One thing people tend to find confusing is the treatment of private attributes t .. doctest:: - >>> import inspect, attr - >>> @attr.s - ... class C(object): - ... _x = attr.ib() + >>> import inspect, attr, attrs + >>> from attr import define + >>> @define + ... class C: + ... _x: int >>> inspect.signature(C.__init__) - <Signature (self, x) -> None> + <Signature (self, x: int) -> None> There really isn't a right or wrong, it's a matter of taste. But it's important to be aware of it because it can lead to surprising syntax errors: .. doctest:: - >>> @attr.s - ... class C(object): - ... _1 = attr.ib() + >>> @define + ... class C: + ... _1: int Traceback (most recent call last): ... SyntaxError: invalid syntax @@ -83,13 +84,14 @@ This is when default values come into play: .. doctest:: - >>> import attr - >>> @attr.s - ... class C(object): - ... a = attr.ib(default=42) - ... b = attr.ib(default=attr.Factory(list)) - ... c = attr.ib(factory=list) # syntactic sugar for above - ... d = attr.ib() + >>> from attr import define, field, Factory + + >>> @define + ... class C: + ... a: int = 42 + ... b: list = field(factory=list) + ... c: list = Factory(list) # syntactic sugar for above + ... d: dict = field() ... @d.default ... def _any_name_except_a_name_of_an_attribute(self): ... return {} @@ -97,15 +99,14 @@ This is when default values come into play: C(a=42, b=[], c=[], d={}) It's important that the decorated method -- or any other method or property! -- doesn't have the same name as the attribute, otherwise it would overwrite the attribute definition. -You also cannot use type annotations to elide the `attr.ib` call for ``d`` as explained in `types`. Please note that as with function and method signatures, ``default=[]`` will *not* do what you may think it might do: .. doctest:: - >>> @attr.s - ... class C(object): - ... 
x = attr.ib(default=[]) + >>> @define + ... class C: + ... x = [] >>> i = C() >>> k = C() >>> i.x.append(42) @@ -145,11 +146,13 @@ The method has to accept three arguments: #. the *attribute* that it's validating, and finally #. the *value* that is passed for it. +These values are passed as *positional arguments*, therefore their names don't matter. + If the value does not pass the validator's standards, it just raises an appropriate exception. - >>> @attr.s - ... class C(object): - ... x = attr.ib() + >>> @define + ... class C: + ... x: int = field() ... @x.validator ... def _check_x(self, attribute, value): ... if value > 42: @@ -161,28 +164,29 @@ If the value does not pass the validator's standards, it just raises an appropri ... ValueError: x must be smaller or equal to 42 -Again, it's important that the decorated method doesn't have the same name as the attribute and that you can't elide the call to `attr.ib`. +Again, it's important that the decorated method doesn't have the same name as the attribute and that the `attrs.field()` helper is used. Callables ~~~~~~~~~ -If you want to re-use your validators, you should have a look at the ``validator`` argument to `attr.ib`. +If you want to re-use your validators, you should have a look at the ``validator`` argument to `attrs.field`. It takes either a callable or a list of callables (usually functions) and treats them as validators that receive the same arguments as with the decorator approach. +Also as with the decorator approach, they are passed as *positional arguments* so you can name them however you want. -Since the validators runs *after* the instance is initialized, you can refer to other attributes while validating: +Since the validators run *after* the instance is initialized, you can refer to other attributes while validating: .. doctest:: >>> def x_smaller_than_y(instance, attribute, value): ... if value >= instance.y: ... raise ValueError("'x' has to be smaller than 'y'!") - >>> @attr.s - ... 
class C(object): - ... x = attr.ib(validator=[attr.validators.instance_of(int), - ... x_smaller_than_y]) - ... y = attr.ib() + >>> @define + ... class C: + ... x = field(validator=[attrs.validators.instance_of(int), + ... x_smaller_than_y]) + ... y = field() >>> C(x=3, y=4) C(x=3, y=4) >>> C(x=4, y=3) @@ -190,15 +194,15 @@ Since the validators runs *after* the instance is initialized, you can refer to ... ValueError: 'x' has to be smaller than 'y'! -This example also shows of some syntactic sugar for using the `attr.validators.and_` validator: if you pass a list, all validators have to pass. +This example also shows off some syntactic sugar for using the `attrs.validators.and_` validator: if you pass a list, all validators have to pass. -``attrs`` won't intercept your changes to those attributes but you can always call `attr.validate` on any instance to verify that it's still valid: +``attrs`` won't intercept your changes to those attributes but you can always call `attrs.validate` on any instance to verify that it's still valid: +When using `attrs.define` or `attrs.frozen`, ``attrs`` will run the validators even when setting the attribute. .. doctest:: >>> i = C(4, 5) - >>> i.x = 5 # works, no magic here - >>> attr.validate(i) + >>> i.x = 5 Traceback (most recent call last): ... ValueError: 'x' has to be smaller than 'y'! @@ -207,9 +211,9 @@ This example also shows of some syntactic sugar for using the `attr.validators.a .. doctest:: - >>> @attr.s - ... class C(object): - ... x = attr.ib(validator=attr.validators.instance_of(int)) + >>> @define + ... class C: + ... x = field(validator=attrs.validators.instance_of(int)) >>> C(42) C(x=42) >>> C("42") @@ -222,9 +226,9 @@ If you define validators both ways for an attribute, they are both ran: .. doctest:: - >>> @attr.s - ... class C(object): - ... x = attr.ib(validator=attr.validators.instance_of(int)) + >>> @define + ... class C: + ... x = field(validator=attrs.validators.instance_of(int)) ... @x.validator ... 
def fits_byte(self, attribute, value): ... if not 0 <= value < 256: @@ -242,10 +246,20 @@ If you define validators both ways for an attribute, they are both ran: And finally you can disable validators globally: - >>> attr.set_run_validators(False) + >>> attrs.validators.set_disabled(True) >>> C("128") C(x='128') - >>> attr.set_run_validators(True) + >>> attrs.validators.set_disabled(False) + >>> C("128") + Traceback (most recent call last): + ... + TypeError: ("'x' must be <class 'int'> (got '128' that is a <class 'str'>).", Attribute(name='x', default=NOTHING, validator=[<instance_of validator for type <class 'int'>>, <function fits_byte at 0x...>], repr=True, cmp=True, hash=True, init=True, metadata=mappingproxy({}), type=None, converter=None), <class 'int'>, '128') + +You can achieve the same by using the context manager: + + >>> with attrs.validators.disabled(): + ... C("128") + C(x='128') >>> C("128") Traceback (most recent call last): ... @@ -265,9 +279,9 @@ This can be useful for doing type-conversions on values that you don't want to f .. doctest:: - >>> @attr.s - ... class C(object): - ... x = attr.ib(converter=int) + >>> @define + ... class C: + ... x = field(converter=int) >>> o = C("1") >>> o.x 1 @@ -279,9 +293,9 @@ Converters are run *before* validators, so you can use validators to check the f >>> def validate_x(instance, attribute, value): ... if value < 0: ... raise ValueError("x must be at least 0.") - >>> @attr.s - ... class C(object): - ... x = attr.ib(converter=int, validator=validate_x) + >>> @define + ... class C: + ... x = field(converter=int, validator=validate_x) >>> o = C("0") >>> o.x 0 @@ -301,20 +315,82 @@ Arguably, you can abuse converters as one-argument validators: ValueError: invalid literal for int() with base 10: 'x' -Post-Init Hook --------------- +If a converter's first argument has a type annotation, that type will appear in the signature for ``__init__``. +A converter will override an explicit type annotation or ``type`` argument. + +.. doctest:: + + >>> def str2int(x: str) -> int: + ... return int(x) + >>> @define + ... class C: + ... 
x = field(converter=str2int) + >>> C.__init__.__annotations__ + {'return': None, 'x': <class 'str'>} + + +Hooking Yourself Into Initialization +------------------------------------ Generally speaking, the moment you think that you need finer control over how your class is instantiated than what ``attrs`` offers, it's usually best to use a classmethod factory or to apply the `builder pattern <https://en.wikipedia.org/wiki/Builder_pattern>`_. -However, sometimes you need to do that one quick thing after your class is initialized. -And for that ``attrs`` offers the ``__attrs_post_init__`` hook that is automatically detected and run after ``attrs`` is done initializing your instance: +However, sometimes you need to do that one quick thing before or after your class is initialized. +And for that ``attrs`` offers three means: + +- ``__attrs_pre_init__`` is automatically detected and run *before* ``attrs`` starts initializing. + This is useful if you need to inject a call to ``super().__init__()``. +- ``__attrs_post_init__`` is automatically detected and run *after* ``attrs`` is done initializing your instance. + This is useful if you want to derive some attribute from others or perform some kind of validation over the whole instance. +- ``__attrs_init__`` is written and attached to your class *instead* of ``__init__``, if ``attrs`` is told to not write one (i.e. ``init=False`` or a combination of ``auto_detect=True`` and a custom ``__init__``). + This is useful if you want full control over the initialization process, but don't want to set the attributes by hand. + + +Pre Init +~~~~~~~~ + +The sole reason for the existence of ``__attrs_pre_init__`` is to give users the chance to call ``super().__init__()``, because some subclassing-based APIs require that. + +.. doctest:: + + >>> @define + ... class C: + ... x: int + ... def __attrs_pre_init__(self): + ... super().__init__() + >>> C(42) + C(x=42) + +If you need more control, use the custom init approach described next. 
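To illustrate why ``__attrs_pre_init__`` exists, here is a sketch with a hypothetical non-``attrs`` base class whose ``__init__`` must run before the subclass is usable (assuming ``attrs`` ≥ 20.1.0):

```python
from attr import define

class Base:
    def __init__(self):
        # State that the (non-attrs) base API expects to exist.
        self.base_ready = True

@define
class C(Base):
    x: int

    def __attrs_pre_init__(self):
        # Runs before attrs assigns any attributes.
        super().__init__()

c = C(42)
print(c.base_ready, c.x)  # True 42
```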
+ + +Custom Init +~~~~~~~~~~~ + +If you tell ``attrs`` to not write an ``__init__``, it will write an ``__attrs_init__`` instead, with the same code that it would have used for ``__init__``. +You have full control over the initialization, but also have to type out the types of your arguments etc. +Here's an example of a manual default value: .. doctest:: - >>> @attr.s - ... class C(object): - ... x = attr.ib() - ... y = attr.ib(init=False) + >>> @define + ... class C: + ... x: int + ... + ... def __init__(self, x: int = 42): + ... self.__attrs_init__(x) + >>> C() + C(x=42) + + +Post Init +~~~~~~~~~ + +.. doctest:: + + >>> @define + ... class C: + ... x: int + ... y: int = field(init=False) ... def __attrs_post_init__(self): ... self.y = self.x + 1 >>> C(1) @@ -324,31 +400,31 @@ Please note that you can't directly set attributes on frozen classes: .. doctest:: - >>> @attr.s(frozen=True) - ... class FrozenBroken(object): - ... x = attr.ib() - ... y = attr.ib(init=False) + >>> @frozen + ... class FrozenBroken: + ... x: int + ... y: int = field(init=False) ... def __attrs_post_init__(self): ... self.y = self.x + 1 >>> FrozenBroken(1) Traceback (most recent call last): ... - attr.exceptions.FrozenInstanceError: can't set attribute + attrs.exceptions.FrozenInstanceError: can't set attribute If you need to set attributes on a frozen class, you'll have to resort to the `same trick ` as ``attrs`` and use :meth:`object.__setattr__`: .. doctest:: - >>> @attr.s(frozen=True) - ... class Frozen(object): - ... x = attr.ib() - ... y = attr.ib(init=False) + >>> @define + ... class Frozen: + ... x: int + ... y: int = field(init=False) ... def __attrs_post_init__(self): ... object.__setattr__(self, "y", self.x + 1) >>> Frozen(1) Frozen(x=1, y=2) -Note that you *must not* access the hash code of the object in ``__attrs_post__init__`` if ``cache_hash=True``. +Note that you *must not* access the hash code of the object in ``__attrs_post_init__`` if ``cache_hash=True``. 
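The initialization hooks, converters, and validators fire in a fixed order; a small sketch that records it (the ``calls`` list and the names are illustrative, assuming ``attrs`` ≥ 20.1.0):

```python
from attr import define, field

calls = []

def record_convert(value):
    calls.append("converter")
    return value

@define
class C:
    x: int = field(converter=record_convert)

    def __attrs_pre_init__(self):
        calls.append("pre_init")

    @x.validator
    def _check_x(self, attribute, value):
        calls.append("validator")

    def __attrs_post_init__(self):
        calls.append("post_init")

C(1)
print(calls)  # ['pre_init', 'converter', 'validator', 'post_init']
```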
Order of Execution @@ -356,17 +432,59 @@ Order of Execution If present, the hooks are executed in the following order: -1. For each attribute, in the order it was declared: +1. ``__attrs_pre_init__`` (if present on *current* class) +2. For each attribute, in the order it was declared: a. default factory b. converter -2. *all* validators -3. ``__attrs_post_init__`` +3. *all* validators +4. ``__attrs_post_init__`` (if present on *current* class) Notably this means, that you can access all attributes from within your validators, but your converters have to deal with invalid values and have to return a valid value. +Derived Attributes +------------------ + +One of the most common ``attrs`` questions on *Stack Overflow* is how to have attributes that depend on other attributes. +For example if you have an API token and want to instantiate a web client that uses it for authentication. +Based on the previous sections, there are two approaches. + +The simpler one is using ``__attrs_post_init__``:: + + @define + class APIClient: + token: str + client: WebClient = field(init=False) + + def __attrs_post_init__(self): + self.client = WebClient(self.token) + +The second one is using a decorator-based default:: + + @define + class APIClient: + token: str + client: WebClient = field() # needed! attr.ib works too + + @client.default + def _client_factory(self): + return WebClient(self.token) + +That said, and as pointed out in the beginning of the chapter, a better approach would be to have a factory class method:: + + @define + class APIClient: + client: WebClient + + @classmethod + def from_token(cls, token: str) -> "APIClient": + return cls(client=WebClient(token)) + +This makes the class more testable. + + .. _`Wiki page`: https://github.com/python-attrs/attrs/wiki/Extensions-to-attrs .. _`get confused`: https://github.com/python-attrs/attrs/issues/289 .. 
_`there is no such thing as a private argument`: https://github.com/hynek/characteristic/issues/6 diff --git a/docs/license.rst b/docs/license.rst index cef5f3939..a341a31eb 100644 --- a/docs/license.rst +++ b/docs/license.rst @@ -3,6 +3,6 @@ License and Credits =================== ``attrs`` is licensed under the `MIT `_ license. -The full license text can be also found in the `source code repository `_. +The full license text can be also found in the `source code repository `_. .. include:: ../AUTHORS.rst diff --git a/docs/names.rst b/docs/names.rst new file mode 100644 index 000000000..8fb59c306 --- /dev/null +++ b/docs/names.rst @@ -0,0 +1,122 @@ +On The Core API Names +===================== + +You may be surprised seeing ``attrs`` classes being created using `attrs.define` and with type annotated fields, instead of `attr.s` and `attr.ib()`. + +Or, you wonder why the web and talks are full of this weird `attr.s` and `attr.ib` -- including people having strong opinions about it and using ``attr.attrs`` and ``attr.attrib`` instead. + +And what even is ``attr.dataclass`` that's not documented but commonly used!? + + +TL;DR +----- + +We recommend our modern APIs for new code: + +- `attrs.define()` to define a new class, +- `attrs.mutable()` is an alias for `attrs.define()`, +- `attrs.frozen()` is an alias for ``define(frozen=True)`` +- and `attrs.field()` to define an attribute. + +They have been added in ``attrs`` 20.1.0, they are expressive, and they have modern defaults like slots and type annotation awareness switched on by default. +They are only available in Python 3.6 and later. +Sometimes they're referred to as *next-generation* or *NG* APIs. +As of ``attrs`` 21.3.0 you can also import them from the ``attrs`` package namespace. + +The traditional APIs `attr.s` / `attr.ib`, their serious-business aliases ``attr.attrs`` / ``attr.attrib``, and the never-documented, but popular ``attr.dataclass`` easter egg will stay **forever**. 
+ +``attrs`` will **never** force you to use type annotations. + + +A Short History Lesson +---------------------- + +At this point, ``attrs`` is an old project. +It had its first release in April 2015 -- back when most Python code was on Python 2.7 and Python 3.4 was the first Python 3 release that showed promise. +``attrs`` was always Python 3-first, but `type annotations `_ came only into Python 3.5 that was released in September 2015 and were largely ignored until years later. + +At this time, if you didn't want to implement all the :term:`dunder methods`, the most common way to create a class with some attributes on it was to subclass a `collections.namedtuple`, or one of the many hacks that allowed you to access dictionary keys using attribute lookup. + +But ``attrs`` history goes even a bit further back, to the now-forgotten `characteristic `_ that came out in May 2014 and already used a class decorator, but was overall too unergonomic. + +In the wake of all of that, `glyph `_ and `Hynek `_ came together on IRC and brainstormed how to take the good ideas of ``characteristic``, but make them easier to use and read. +At this point the plan was not to make ``attrs`` what it is now -- a flexible class building kit. +All we wanted was an ergonomic little library to succinctly define classes with attributes. + +Under the impression of the unwieldy ``characteristic`` name, we went to the other side and decided to make the package name part of the API, and keep the API functions very short. +This led to the infamous `attr.s` and `attr.ib` which some found confusing and pronounced it as "attr dot s" or used a singular ``@s`` as the decorator. +But it was really just a way to say ``attrs`` and ``attrib``\ [#attr]_. + +Some people hated this cutey API from day one, which is why we added aliases for them that we called *serious business*: ``@attr.attrs`` and ``attr.attrib()``. +Fans of them usually imported the names and didn't use the package name in the first place. 
+ +Unfortunately, the ``attr`` package name started creaking the moment we added ``attr.Factory``, since it couldn’t be morphed into something meaningful in any way. +A problem that grew worse over time, as more APIs and even modules were added. + +But overall, ``attrs`` in this shape was a **huge** success -- especially after glyph's blog post `The One Python Library Everyone Needs `_ in August 2016 and `pytest `_ adopting it. + +Being able to just write:: + + @attr.s + class Point: + x = attr.ib() + y = attr.ib() + +was a big step for those who wanted to write small, focused classes. + + +Dataclasses Enter The Arena +^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +A big change happened in May 2017 when Hynek sat down with `Guido van Rossum `_ and `Eric V. Smith `_ at PyCon US 2017. + +Type annotations for class attributes had `just landed `_ in Python 3.6 and Guido felt like it would be a good mechanic to introduce something similar to ``attrs`` to the Python standard library. +The result, of course, was :pep:`557`\ [#stdlib]_ which eventually became the `dataclasses` module in Python 3.7. + +``attrs`` at this point was lucky to have several people on board who were also very excited about type annotations and helped implement it; including a `Mypy plugin `_. +And so it happened that ``attrs`` `shipped `_ the new method of defining classes more than half a year before Python 3.7 -- and thus `dataclasses` -- were released. + +----- + +Due to backward-compatibility concerns, this feature is off by default in the `attr.s` decorator and has to be activated using ``@attr.s(auto_attribs=True)``, though. +As a little easter egg and to save ourselves some typing, we've also `added `_ an alias called ``attr.dataclass`` that just sets ``auto_attribs=True``. +It was never documented, but people found it and used it and loved it. + +Over the next months and years it became clear that type annotations have become the popular way to define classes and their attributes.
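The opt-in described above boils down to a single flag. A sketch, assuming ``attrs`` is installed (``Point`` and ``AlsoPoint`` are made-up examples):

```python
import attr


@attr.s(auto_attribs=True)  # opt in to annotation-based fields
class Point:
    x: int
    y: int = 0


# The undocumented easter egg mentioned above does the same thing:
@attr.dataclass
class AlsoPoint:
    x: int
    y: int = 0


# The annotations are recorded as type metadata on the fields.
print(attr.fields(Point).x.type)  # <class 'int'>
```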
+However, it has also become clear that some people viscerally hate type annotations. +We're determined to serve both. + + +``attrs`` TNG +^^^^^^^^^^^^^ + +Over its existence, ``attrs`` never stood still. +But since we also greatly care about backward compatibility and not breaking our users' code, many features and niceties have to be manually activated. + +That is not only annoying, it also leads to the problem that many of ``attrs``'s users don't even know what it can do for them. +We've spent years alone explaining that defining attributes using type annotations is in no way unique to `dataclasses`. + +Finally we've decided to take the `Go route `_: +instead of fiddling with the old APIs -- whose names felt anachronistic anyway -- we'd define new ones, with better defaults. +So in July 2018, we `looked for better names `_ and came up with `attr.define`, `attr.field`, and friends. +Then in January 2019, we `started looking for inconvenient defaults `_ that we now could fix without any repercussions. + +These APIs proved to be very popular, so we've finally changed the documentation to them in November of 2021. + +All of this took way too long, of course. +One reason is the COVID-19 pandemic, but also our fear to fumble this historic chance to fix our APIs. + +Finally, in December 2021, we've added the ``attrs`` package namespace. + +We hope you like the result:: + + from attrs import define + + @define + class Point: + x: int + y: int + + +.. [#attr] We considered calling the PyPI package just ``attr`` too, but the name was already taken by an *ostensibly* inactive `package on PyPI `_. +.. [#stdlib] The highly readable PEP also explains why ``attrs`` wasn't just added to the standard library. + Don't believe the myths and rumors. diff --git a/docs/overview.rst b/docs/overview.rst index 146a9284e..7df1f2476 100644 --- a/docs/overview.rst +++ b/docs/overview.rst @@ -6,7 +6,7 @@ In order to fulfill its ambitious goal of bringing back the joy to writing class .. 
include:: ../README.rst :start-after: -code-begin- - :end-before: -getting-help- + :end-before: -project-information- .. _philosophy: @@ -23,10 +23,10 @@ Philosophy The end. It doesn't add metaclasses. It doesn't add classes you've never heard of to your inheritance tree. - An ``attrs`` class in runtime is indistiguishable from a regular class: because it *is* a regular class with a few boilerplate-y methods attached. + An ``attrs`` class in runtime is indistinguishable from a regular class: because it *is* a regular class with a few boilerplate-y methods attached. **Be light on API impact.** - As convenient as it seems at first, ``attrs`` will *not* tack on any methods to your classes save the dunder ones. + As convenient as it seems at first, ``attrs`` will *not* tack on any methods to your classes except for the :term:`dunder ones `. Hence all the useful `tools ` that come with ``attrs`` live in functions that operate on top of instances. Since they take an ``attrs`` instance as their first argument, you can attach them to your classes with one line of code. @@ -49,39 +49,10 @@ What ``attrs`` Is Not All ``attrs`` does is: -1. take your declaration, -2. write dunder methods based on that information, +1. Take your declaration, +2. write :term:`dunder methods` based on that information, 3. and attach them to your class. It does *nothing* dynamic at runtime, hence zero runtime overhead. It's still *your* class. Do with it as you please. - - -On the ``attr.s`` and ``attr.ib`` Names -======================================= - -The ``attr.s`` decorator and the ``attr.ib`` function aren't any obscure abbreviations. -They are a *concise* and highly *readable* way to write ``attrs`` and ``attrib`` with an *explicit namespace*. - -At first, some people have a negative gut reaction to that; resembling the reactions to Python's significant whitespace. -And as with that, once one gets used to it, the readability and explicitness of that API prevails and delights. 
- -For those who can't swallow that API at all, ``attrs`` comes with serious business aliases: ``attr.attrs`` and ``attr.attrib``. - -Therefore, the following class definition is identical to the previous one: - -.. doctest:: - - >>> from attr import attrs, attrib, Factory - >>> @attrs - ... class SomeClass(object): - ... a_number = attrib(default=42) - ... list_of_numbers = attrib(default=Factory(list)) - ... - ... def hard_math(self, another_number): - ... return self.a_number + sum(self.list_of_numbers) * another_number - >>> SomeClass(1, [1, 2, 3]) - SomeClass(a_number=1, list_of_numbers=[1, 2, 3]) - -Use whichever variant fits your taste better. diff --git a/docs/python-2.rst b/docs/python-2.rst deleted file mode 100644 index 4fdf2c9b2..000000000 --- a/docs/python-2.rst +++ /dev/null @@ -1,25 +0,0 @@ -Python 2 Statement -================== - -While ``attrs`` has always been a Python 3-first package, we the maintainers are aware that Python 2 will not magically disappear in 2020. -We are also aware that ``attrs`` is an important building block in many people's systems and livelihoods. - -As such, we do **not** have any immediate plans to drop Python 2 support in ``attrs``. -We intend to support is as long as it will be technically feasible for us. - -Feasibility in this case means: - -1. Possibility to run the tests on our development computers, -2. and **free** CI options. - -This can mean that we will have to run our tests on PyPy, whose maintainters have unequivocally declared that they do not intend to stop the development and maintenance of their Python 2-compatible line at all. -And this can mean that at some point, a sponsor will have to step up and pay for bespoke CI setups. - -**However**: there is no promise of new features coming to ``attrs`` running under Python 2. -It is up to our discretion alone, to decide whether the introduced complexity or awkwardness are worth it, or whether we choose to make a feature available on modern platforms only. 
- - -Summary ------- - -We will do our best to support existing users, but nobody is entitled to the latest and greatest features on a platform that is officially end of life. diff --git a/docs/types.rst b/docs/types.rst index 782909754..e4f7c85db 100644 --- a/docs/types.rst +++ b/docs/types.rst @@ -3,31 +3,30 @@ Type Annotations ``attrs`` comes with first class support for type annotations for both Python 3.6 (:pep:`526`) and legacy syntax. -On Python 3.6 and later, you can even drop the `attr.ib`\ s if you're willing to annotate *all* attributes. -That means that on modern Python versions, the declaration part of the example from the README can be simplified to: - +However, they will forever remain *optional*; therefore, the example from the README could also be written as: .. doctest:: - >>> import attr - >>> import typing + >>> from attrs import define, field - >>> @attr.s(auto_attribs=True) + >>> @define ... class SomeClass: - ... a_number: int = 42 - ... list_of_numbers: typing.List[int] = attr.Factory(list) + ... a_number = field(default=42) + ... list_of_numbers = field(factory=list) >>> sc = SomeClass(1, [1, 2, 3]) >>> sc SomeClass(a_number=1, list_of_numbers=[1, 2, 3]) - >>> attr.fields(SomeClass).a_number.type - -You will still need `attr.ib` for advanced features, but not for the common cases. +You can choose freely between the approaches, but please remember that if you choose to use type annotations, you **must** annotate **all** attributes! + +---- + +Even when going all-in on type annotations, you will still need `attrs.field` for some advanced features, though. One of those features are the decorator-based features like defaults. It's important to remember that ``attrs`` doesn't do any magic behind your back. -All the decorators are implemented using an object that is returned by the call to `attr.ib`. +All the decorators are implemented using an object that is returned by the call to `attrs.field`.
Attributes that only carry a class annotation do not have that object so trying to call a method on it will inevitably fail. @@ -36,12 +35,16 @@ Attributes that only carry a class annotation do not have that object so trying Please note that types -- however added -- are *only metadata* that can be queried from the class and they aren't used for anything out of the box! Because Python does not allow references to a class object before the class is defined, -types may be defined as string literals, so-called *forward references*. -Also, starting in Python 3.10 (:pep:`526`) **all** annotations will be string literals. -When this happens, ``attrs`` will simply put these string literals into the ``type`` attributes. -If you need to resolve these to real types, you can call `attr.resolve_types` which will update the attribute in place. +types may be defined as string literals, so-called *forward references* (:pep:`526`). +You can enable this automatically for a whole module by using ``from __future__ import annotations`` (:pep:`563`) as of Python 3.7. +In this case ``attrs`` simply puts these string literals into the ``type`` attributes. +If you need to resolve these to real types, you can call `attrs.resolve_types` which will update the attribute in place. -In practice though, types show their biggest usefulness in combination with tools like mypy_ or pytype_ that both have dedicated support for ``attrs`` classes. +In practice though, types show their biggest usefulness in combination with tools like mypy_, pytype_, or pyright_ that have dedicated support for ``attrs`` classes. + +The addition of static types is certainly one of the most exciting features in the Python ecosystem and helps you write *correct* and *verified self-documenting* code. + +If you don't know where to start, Carl Meyer gave a great talk on `Type-checked Python in the Real World `_ at PyCon US 2018 that will help you to get started in no time. 
mypy @@ -65,16 +68,41 @@ To mypy, this code is equivalent to the one above: .. code-block:: python @attr.s - class SomeClass(object): + class SomeClass: a_number = attr.ib(default=42) # type: int - list_of_numbers = attr.ib(factory=list, type=list[int]) + list_of_numbers = attr.ib(factory=list, type=list[int]) -***** -The addition of static types is certainly one of the most exciting features in the Python ecosystem and helps you writing *correct* and *verified self-documenting* code. +pyright ------- -If you don't know where to start, Carl Meyer gave a great talk on `Type-checked Python in the Real World `_ at PyCon US 2018 that will help you to get started in no time. +``attrs`` provides support for pyright_ through the dataclass_transform_ specification. +This provides static type inference for a subset of ``attrs`` equivalent to standard-library ``dataclasses``, +and requires explicit type annotations using the `attrs.define` or ``@attr.s(auto_attribs=True)`` API. + +Given the following definition, ``pyright`` will generate static type signatures for ``SomeClass`` attribute access, ``__init__``, ``__eq__``, and comparison methods:: + + @attr.define + class SomeClass: + a_number: int = 42 + list_of_numbers: list[int] = attr.field(factory=list) + +.. warning:: + + The ``pyright`` inferred types are a subset of those supported by ``mypy``, including: + + - The generated ``__init__`` signature only includes the attribute type annotations. + It currently does not include attribute ``converter`` types. + + - The ``attr.frozen`` decorator is not typed with frozen attributes, which are properly typed via ``attr.define(frozen=True)``. + + A `full list `_ of limitations and incompatibilities can be found in pyright's repository. + + Your constructive feedback is welcome in both `attrs#795 `_ and `pyright#1782 `_. + Generally speaking, the decision on improving ``attrs`` support in pyright is entirely Microsoft's prerogative though. ..
_mypy: http://mypy-lang.org .. _pytype: https://google.github.io/pytype/ +.. _pyright: https://github.com/microsoft/pyright +.. _dataclass_transform: https://github.com/microsoft/pyright/blob/main/specs/dataclass_transforms.md diff --git a/docs/why.rst b/docs/why.rst index 3a3d478cb..9edae27a2 100644 --- a/docs/why.rst +++ b/docs/why.rst @@ -2,54 +2,51 @@ Why not… ======== -If you'd like third party's account why ``attrs`` is great, have a look at Glyph's `The One Python Library Everyone Needs `_! +If you'd like a third party's account of why ``attrs`` is great, have a look at Glyph's `The One Python Library Everyone Needs `_. It predates type annotations and hence Data Classes, but it masterfully illustrates the appeal of class-building packages. -…tuples? -------- - - -Readability -^^^^^^^^^^^ - -What makes more sense while debugging:: - - Point(x=1, y=2) - -or:: - - (1, 2) - -? - -Let's add even more ambiguity:: +…Data Classes? +-------------- - Customer(id=42, reseller=23, first_name="Jane", last_name="John") +:pep:`557` added Data Classes to `Python 3.7 `_ that resemble ``attrs`` in many ways. -or:: +They are the result of the Python community's `wish `_ to have an easier way to write classes in the standard library that doesn't carry the problems of ``namedtuple``\ s. +To that end, ``attrs`` and its developers were involved in the PEP process and while we may disagree with some minor decisions that have been made, it's a fine library and if it stops you from abusing ``namedtuple``\ s, it's a huge win. - (42, 23, "Jane", "John") +Nevertheless, there are still reasons to prefer ``attrs`` over Data Classes. +Whether they're relevant to *you* depends on your circumstances: -? +- Data Classes are *intentionally* less powerful than ``attrs``.
+ There is a long list of features that were sacrificed for the sake of simplicity and while the most obvious ones are validators, converters, :ref:`equality customization `, or :doc:`extensibility ` in general, it permeates throughout all APIs. -Why would you want to write ``customer[2]`` instead of ``customer.first_name``? + On the other hand, Data Classes currently do not offer any significant feature that ``attrs`` doesn't already have. +- ``attrs`` supports all mainstream Python versions including PyPy. +- ``attrs`` doesn't force type annotations on you if you don't like them. +- But since it **also** supports typing, it's the best way to embrace type hints *gradually*, too. +- While Data Classes are implementing features from ``attrs`` every now and then, their presence is dependent on the Python version, not the package version. + For example, support for ``__slots__`` has only been added in Python 3.10, but it doesn’t do cell rewriting and therefore doesn’t support bare calls to ``super()``. + This may or may not be fixed in later Python releases, but handling all these differences is especially painful for PyPI packages that support multiple Python versions. + And of course, this includes possible implementation bugs. +- ``attrs`` can and will move faster. + We are not bound to any release schedules and we have a clear deprecation policy. -Don't get me started when you add nesting. -If you've never run into mysterious tuples you had no idea what the hell they meant while debugging, you're much smarter than yours truly. + One of the `reasons `_ to not vendor ``attrs`` in the standard library was to not impede ``attrs``'s future development. -Using proper classes with names and types makes program code much more readable and comprehensible_. -Especially when trying to grok a new piece of software or returning to old code after several months. 
+One way to think about ``attrs`` vs Data Classes is that ``attrs`` is a fully-fledged toolkit to write powerful classes while Data Classes are an easy way to get a class with some attributes. +Basically what ``attrs`` was in 2015. -.. _comprehensible: https://arxiv.org/pdf/1304.5257.pdf +…pydantic? +---------- -Extendability -^^^^^^^^^^^^^ +*pydantic* is first and foremost a *data validation library*. +As such, it is a capable complement to class building libraries like ``attrs`` (or Data Classes!) for parsing and validating untrusted data. -Imagine you have a function that takes or returns a tuple. -Especially if you use tuple unpacking (eg. ``x, y = get_point()``), adding additional data means that you have to change the invocation of that function *everywhere*. +However, as convenient as it might be, using it for your business or data layer `is problematic in several ways `_: +Is it really necessary to re-validate all your objects while reading them from a trusted database? +In the parlance of `Form, Command, and Model Validation `_, *pydantic* is the right tool for *Commands*. -Adding an attribute to a class concerns only those who actually care about that attribute. +`Separation of concerns `_ feels tedious at times, but it's one of those things that you get to appreciate once you've shot your own foot often enough. …namedtuples? @@ -57,7 +54,7 @@ Adding an attribute to a class concerns only those who actually care about that `collections.namedtuple`\ s are tuples with names, not classes. [#history]_ Since writing classes is tiresome in Python, every now and then someone discovers all the typing they could save and gets really excited. -However that convenience comes at a price. +However, that convenience comes at a price. 
The most obvious difference between ``namedtuple``\ s and ``attrs``-based classes is that the latter are type-sensitive: @@ -92,7 +89,7 @@ Other often surprising behaviors include: - Iterability also implies that it's easy to accidentally unpack a ``namedtuple`` which leads to hard-to-find bugs. [#iter]_ - ``namedtuple``\ s have their methods *on your instances* whether you like it or not. [#pollution]_ - ``namedtuple``\ s are *always* immutable. - Not only does that mean that you can't decide for yourself whether your instances should be immutable or not, it also means that if you want to influence your class' initialization (validation? default values?), you have to implement :meth:`__new__() ` which is a particularly hacky and error-prone requirement for a very common problem. [#immutable]_ + Not only does that mean that you can't decide for yourself whether your instances should be immutable or not, it also means that if you want to influence your class' initialization (validation? default values?), you have to implement :meth:`__new__() ` which is a particularly hacky and error-prone requirement for a very common problem. [#immutable]_ - To attach methods to a ``namedtuple`` you have to subclass it. And if you follow the standard library documentation's recommendation of:: @@ -102,7 +99,7 @@ Other often surprising behaviors include: you end up with a class that has *two* ``Point``\ s in its :attr:`__mro__ `: ``[, , , ]``. That's not only confusing, it also has very practical consequences: - for example if you create documentation that includes class hierarchies like `Sphinx's autodoc `_ with ``show-inheritance``. + for example if you create documentation that includes class hierarchies like `Sphinx's autodoc `_ with ``show-inheritance``. Again: common problem, hacky solution with confusing fallout. All these things make ``namedtuple``\ s a particularly poor choice for public APIs because all your objects are irrevocably tainted. 
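The surprising behaviors listed above are easy to demonstrate with nothing but the standard library:

```python
from collections import namedtuple

Point = namedtuple("Point", ["x", "y"])
Color = namedtuple("Color", ["red", "green"])

# Type-insensitive: two conceptually unrelated types compare equal,
# because both are just tuples under the hood.
print(Point(1, 2) == Color(1, 2))  # True
print(Point(1, 2) == (1, 2))       # True

# Iterability invites accidental unpacking:
x, y = Point(1, 2)
print(x, y)  # 1 2
```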
@@ -133,26 +130,50 @@ With ``attrs`` your users won't notice a difference because it creates regular, .. _behaving like a tuple: https://docs.python.org/3/tutorial/datastructures.html#tuples-and-sequences -…Data Classes? --------------- +…tuples? +-------- -:pep:`557` added Data Classes to `Python 3.7 `_ that resemble ``attrs`` in many ways. +Readability +^^^^^^^^^^^ -They are the result of the Python community's `wish `_ to have an easier way to write classes in the standard library that doesn't carry the problems of ``namedtuple``\ s. -To that end, ``attrs`` and its developers were involved in the PEP process and while we may disagree with some minor decisions that have been made, it's a fine library and if it stops you from abusing ``namedtuple``\ s, they are a huge win. +What makes more sense while debugging:: -Nevertheless, there are still reasons to prefer ``attrs`` over Data Classes whose relevancy depends on your circumstances: + Point(x=1, y=2) -- ``attrs`` supports all mainstream Python versions, including CPython 2.7 and PyPy. -- Data Classes are intentionally less powerful than ``attrs``. - There is a long list of features that were sacrificed for the sake of simplicity and while the most obvious ones are validators, converters, and ``__slots__``, it permeates throughout all APIs. +or:: - On the other hand, Data Classes currently do not offer any significant feature that ``attrs`` doesn't already have. -- ``attrs`` can and will move faster. - We are not bound to any release schedules and we have a clear deprecation policy. + (1, 2) - One of the `reasons `_ to not vendor ``attrs`` in the standard library was to not impede ``attrs``'s future development. +? + +Let's add even more ambiguity:: + Customer(id=42, reseller=23, first_name="Jane", last_name="John") + +or:: + + (42, 23, "Jane", "John") + +? + +Why would you want to write ``customer[2]`` instead of ``customer.first_name``? + +Don't get me started when you add nesting. 
+If you've never run into mysterious tuples you had no idea what the hell they meant while debugging, you're much smarter than yours truly. + +Using proper classes with names and types makes program code much more readable and comprehensible_. +Especially when trying to grok a new piece of software or returning to old code after several months. + +.. _comprehensible: https://arxiv.org/pdf/1304.5257.pdf + + +Extendability +^^^^^^^^^^^^^ + +Imagine you have a function that takes or returns a tuple. +Especially if you use tuple unpacking (eg. ``x, y = get_point()``), adding additional data means that you have to change the invocation of that function *everywhere*. + +Adding an attribute to a class concerns only those who actually care about that attribute. …dicts? @@ -181,7 +202,7 @@ To bring it into perspective, the equivalent of .. doctest:: >>> @attr.s - ... class SmartClass(object): + ... class SmartClass: ... a = attr.ib() ... b = attr.ib() >>> SmartClass(1, 2) @@ -191,7 +212,7 @@ is roughly .. doctest:: - >>> class ArtisanalClass(object): + >>> class ArtisanalClass: ... def __init__(self, a, b): ... self.a = a ... self.b = b @@ -251,7 +272,7 @@ You can freely choose which features you want and disable those that you want mo .. doctest:: >>> @attr.s(repr=False) - ... class SmartClass(object): + ... class SmartClass: ... a = attr.ib() ... b = attr.ib() ... 
diff --git a/mypy.ini b/mypy.ini deleted file mode 100644 index 685c02599..000000000 --- a/mypy.ini +++ /dev/null @@ -1,3 +0,0 @@ -[mypy] -disallow_untyped_defs = True -check_untyped_defs = True diff --git a/pyproject.toml b/pyproject.toml index 14f65a366..d100c75ac 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -6,13 +6,14 @@ build-backend = "setuptools.build_meta" [tool.coverage.run] parallel = true branch = true -source = ["attr"] +source = ["attr", "attrs"] [tool.coverage.paths] source = ["src", ".tox/*/site-packages"] [tool.coverage.report] show_missing = true +skip_covered = true exclude_lines = [ "pragma: no cover", # PyPy is unacceptably slow under coverage. @@ -30,6 +31,10 @@ fail-under = 100 whitelist-regex = ["test_.*"] +[tool.check-wheel-contents] +toplevel = ["attr", "attrs"] + + [tool.isort] profile = "attrs" @@ -49,7 +54,7 @@ profile = "attrs" [[tool.towncrier.type]] directory = "breaking" - name = "Backward-incompatible Changes" + name = "Backwards-incompatible Changes" showcontent = true [[tool.towncrier.type]] @@ -61,3 +66,8 @@ profile = "attrs" directory = "change" name = "Changes" showcontent = true + + +[tool.mypy] +disallow_untyped_defs = true +check_untyped_defs = true diff --git a/setup.py b/setup.py index 64af96e07..392bc04de 100644 --- a/setup.py +++ b/setup.py @@ -1,6 +1,10 @@ +# SPDX-License-Identifier: MIT + import codecs import os +import platform import re +import sys from setuptools import find_packages, setup @@ -13,11 +17,13 @@ KEYWORDS = ["class", "attribute", "boilerplate"] PROJECT_URLS = { "Documentation": "https://www.attrs.org/", + "Changelog": "https://www.attrs.org/en/stable/changelog.html", "Bug Tracker": "https://github.com/python-attrs/attrs/issues", "Source Code": "https://github.com/python-attrs/attrs", "Funding": "https://github.com/sponsors/hynek", "Tidelift": "https://tidelift.com/subscription/pkg/pypi-attrs?" 
"utm_source=pypi-attrs&utm_medium=pypi", + "Ko-fi": "https://ko-fi.com/the_hynek", } CLASSIFIERS = [ "Development Status :: 5 - Production/Stable", @@ -26,30 +32,39 @@ "License :: OSI Approved :: MIT License", "Operating System :: OS Independent", "Programming Language :: Python", - "Programming Language :: Python :: 2", - "Programming Language :: Python :: 2.7", "Programming Language :: Python :: 3", "Programming Language :: Python :: 3.5", "Programming Language :: Python :: 3.6", "Programming Language :: Python :: 3.7", "Programming Language :: Python :: 3.8", "Programming Language :: Python :: 3.9", + "Programming Language :: Python :: 3.10", + "Programming Language :: Python :: 3.11", "Programming Language :: Python :: Implementation :: CPython", "Programming Language :: Python :: Implementation :: PyPy", "Topic :: Software Development :: Libraries :: Python Modules", ] INSTALL_REQUIRES = [] EXTRAS_REQUIRE = { - "docs": ["furo", "sphinx", "zope.interface"], + "docs": ["furo", "sphinx", "zope.interface", "sphinx-notfound-page"], "tests_no_zope": [ + # For regression test to ensure cloudpickle compat doesn't break. + 'cloudpickle; python_implementation == "CPython"', # 5.0 introduced toml; parallel was broken until 5.0.2 "coverage[toml]>=5.0.2", "hypothesis", "pympler", "pytest>=4.3.0", # 4.3.0 dropped last use of `convert` - "six", ], } +if ( + sys.version_info[:2] >= (3, 6) + and platform.python_implementation() != "PyPy" +): + EXTRAS_REQUIRE["tests_no_zope"].extend( + ["mypy>=0.900,!=0.940", "pytest-mypy-plugins"] + ) + EXTRAS_REQUIRE["tests"] = EXTRAS_REQUIRE["tests_no_zope"] + ["zope.interface"] EXTRAS_REQUIRE["dev"] = ( EXTRAS_REQUIRE["tests"] + EXTRAS_REQUIRE["docs"] + ["pre-commit"] @@ -84,10 +99,17 @@ def find_meta(meta): raise RuntimeError("Unable to find __{meta}__ string.".format(meta=meta)) +LOGO = """ +.. 
image:: https://www.attrs.org/en/stable/_static/attrs_logo.png + :alt: attrs logo + :align: center +""" # noqa + VERSION = find_meta("version") URL = find_meta("url") LONG = ( - read("README.rst") + LOGO + + read("README.rst").split(".. teaser-begin")[1] + "\n\n" + "Release Information\n" + "===================\n\n" @@ -119,7 +141,7 @@ def find_meta(meta): long_description_content_type="text/x-rst", packages=PACKAGES, package_dir={"": "src"}, - python_requires=">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*", + python_requires=">=3.5", zip_safe=False, classifiers=CLASSIFIERS, install_requires=INSTALL_REQUIRES, diff --git a/src/attr/__init__.py b/src/attr/__init__.py index bf329cad5..386305d62 100644 --- a/src/attr/__init__.py +++ b/src/attr/__init__.py @@ -1,10 +1,12 @@ -from __future__ import absolute_import, division, print_function +# SPDX-License-Identifier: MIT + import sys from functools import partial from . import converters, exceptions, filters, setters, validators +from ._cmp import cmp_using from ._config import get_run_validators, set_run_validators from ._funcs import asdict, assoc, astuple, evolve, has, resolve_types from ._make import ( @@ -21,7 +23,7 @@ from ._version_info import VersionInfo -__version__ = "20.3.0" +__version__ = "22.1.0" __version_info__ = VersionInfo._from_version_string(__version__) __title__ = "attrs" @@ -52,6 +54,7 @@ "attrib", "attributes", "attrs", + "cmp_using", "converters", "evolve", "exceptions", @@ -71,6 +74,6 @@ ] if sys.version_info[:2] >= (3, 6): - from ._next_gen import define, field, frozen, mutable + from ._next_gen import define, field, frozen, mutable # noqa: F401 - __all__.extend((define, field, frozen, mutable)) + __all__.extend(("define", "field", "frozen", "mutable")) diff --git a/src/attr/__init__.pyi b/src/attr/__init__.pyi index 442d6e77f..03cc4c82d 100644 --- a/src/attr/__init__.pyi +++ b/src/attr/__init__.pyi @@ -1,12 +1,16 @@ +import sys + from typing import ( Any, Callable, + ClassVar, Dict, Generic, List, 
+ Mapping, Optional, + Protocol, Sequence, - Mapping, Tuple, Type, TypeVar, @@ -15,12 +19,12 @@ from typing import ( ) # `import X as X` is required to make these public +from . import converters as converters from . import exceptions as exceptions from . import filters as filters -from . import converters as converters from . import setters as setters from . import validators as validators - +from ._cmp import cmp_using as cmp_using from ._version_info import VersionInfo __version__: str @@ -37,6 +41,7 @@ __copyright__: str _T = TypeVar("_T") _C = TypeVar("_C", bound=type) +_EqOrderType = Union[bool, Callable[[Any], Any]] _ValidatorType = Callable[[Any, Attribute[_T], _T], Any] _ConverterType = Callable[[Any], Any] _FilterType = Callable[[Attribute[_T], _T], bool] @@ -46,12 +51,18 @@ _OnSetAttrType = Callable[[Any, Attribute[Any], Any], Any] _OnSetAttrArgType = Union[ _OnSetAttrType, List[_OnSetAttrType], setters._NoOpType ] -_FieldTransformer = Callable[[type, List[Attribute]], List[Attribute]] +_FieldTransformer = Callable[ + [type, List[Attribute[Any]]], List[Attribute[Any]] +] # FIXME: in reality, if multiple validators are passed they must be in a list # or tuple, but those are invariant and so would prevent subtypes of # _ValidatorType from working when passed in a list or tuple. _ValidatorArgType = Union[_ValidatorType[_T], Sequence[_ValidatorType[_T]]] +# A protocol to be able to statically accept an attrs class. +class AttrsInstance(Protocol): + __attrs_attrs__: ClassVar[Any] + # _make -- NOTHING: object @@ -59,22 +70,54 @@ NOTHING: object # NOTE: Factory lies about its return type to make this possible: # `x: List[int] # = Factory(list)` # Work around mypy issue #4554 in the common case by using an overload. -@overload -def Factory(factory: Callable[[], _T]) -> _T: ... -@overload -def Factory( - factory: Union[Callable[[Any], _T], Callable[[], _T]], - takes_self: bool = ..., -) -> _T: ... 
+if sys.version_info >= (3, 8): + from typing import Literal + @overload + def Factory(factory: Callable[[], _T]) -> _T: ... + @overload + def Factory( + factory: Callable[[Any], _T], + takes_self: Literal[True], + ) -> _T: ... + @overload + def Factory( + factory: Callable[[], _T], + takes_self: Literal[False], + ) -> _T: ... + +else: + @overload + def Factory(factory: Callable[[], _T]) -> _T: ... + @overload + def Factory( + factory: Union[Callable[[Any], _T], Callable[[], _T]], + takes_self: bool = ..., + ) -> _T: ... + +# Static type inference support via __dataclass_transform__ implemented as per: +# https://github.com/microsoft/pyright/blob/1.1.135/specs/dataclass_transforms.md +# This annotation must be applied to all overloads of "define" and "attrs" +# +# NOTE: This is a typing construct and does not exist at runtime. Extensions +# wrapping attrs decorators should declare a separate __dataclass_transform__ +# signature in the extension module using the specification linked above to +# provide pyright support. +def __dataclass_transform__( + *, + eq_default: bool = True, + order_default: bool = False, + kw_only_default: bool = False, + field_descriptors: Tuple[Union[type, Callable[..., Any]], ...] = (()), +) -> Callable[[_T], _T]: ... class Attribute(Generic[_T]): name: str default: Optional[_T] validator: Optional[_ValidatorType[_T]] repr: _ReprArgType - cmp: bool - eq: bool - order: bool + cmp: _EqOrderType + eq: _EqOrderType + order: _EqOrderType hash: Optional[bool] init: bool converter: Optional[_ConverterType] @@ -82,6 +125,7 @@ class Attribute(Generic[_T]): type: Optional[Type[_T]] kw_only: bool on_setattr: _OnSetAttrType + def evolve(self, **changes: Any) -> "Attribute[Any]": ... 
# NOTE: We had several choices for the annotation to use for type arg: # 1) Type[_T] @@ -112,7 +156,7 @@ def attrib( default: None = ..., validator: None = ..., repr: _ReprArgType = ..., - cmp: Optional[bool] = ..., + cmp: Optional[_EqOrderType] = ..., hash: Optional[bool] = ..., init: bool = ..., metadata: Optional[Mapping[Any, Any]] = ..., @@ -120,8 +164,8 @@ def attrib( converter: None = ..., factory: None = ..., kw_only: bool = ..., - eq: Optional[bool] = ..., - order: Optional[bool] = ..., + eq: Optional[_EqOrderType] = ..., + order: Optional[_EqOrderType] = ..., on_setattr: Optional[_OnSetAttrArgType] = ..., ) -> Any: ... @@ -132,7 +176,7 @@ def attrib( default: None = ..., validator: Optional[_ValidatorArgType[_T]] = ..., repr: _ReprArgType = ..., - cmp: Optional[bool] = ..., + cmp: Optional[_EqOrderType] = ..., hash: Optional[bool] = ..., init: bool = ..., metadata: Optional[Mapping[Any, Any]] = ..., @@ -140,8 +184,8 @@ def attrib( converter: Optional[_ConverterType] = ..., factory: Optional[Callable[[], _T]] = ..., kw_only: bool = ..., - eq: Optional[bool] = ..., - order: Optional[bool] = ..., + eq: Optional[_EqOrderType] = ..., + order: Optional[_EqOrderType] = ..., on_setattr: Optional[_OnSetAttrArgType] = ..., ) -> _T: ... @@ -151,7 +195,7 @@ def attrib( default: _T, validator: Optional[_ValidatorArgType[_T]] = ..., repr: _ReprArgType = ..., - cmp: Optional[bool] = ..., + cmp: Optional[_EqOrderType] = ..., hash: Optional[bool] = ..., init: bool = ..., metadata: Optional[Mapping[Any, Any]] = ..., @@ -159,8 +203,8 @@ def attrib( converter: Optional[_ConverterType] = ..., factory: Optional[Callable[[], _T]] = ..., kw_only: bool = ..., - eq: Optional[bool] = ..., - order: Optional[bool] = ..., + eq: Optional[_EqOrderType] = ..., + order: Optional[_EqOrderType] = ..., on_setattr: Optional[_OnSetAttrArgType] = ..., ) -> _T: ... 
@@ -170,7 +214,7 @@ def attrib( default: Optional[_T] = ..., validator: Optional[_ValidatorArgType[_T]] = ..., repr: _ReprArgType = ..., - cmp: Optional[bool] = ..., + cmp: Optional[_EqOrderType] = ..., hash: Optional[bool] = ..., init: bool = ..., metadata: Optional[Mapping[Any, Any]] = ..., @@ -178,8 +222,8 @@ def attrib( converter: Optional[_ConverterType] = ..., factory: Optional[Callable[[], _T]] = ..., kw_only: bool = ..., - eq: Optional[bool] = ..., - order: Optional[bool] = ..., + eq: Optional[_EqOrderType] = ..., + order: Optional[_EqOrderType] = ..., on_setattr: Optional[_OnSetAttrArgType] = ..., ) -> Any: ... @overload @@ -213,8 +257,8 @@ def field( converter: Optional[_ConverterType] = ..., factory: Optional[Callable[[], _T]] = ..., kw_only: bool = ..., - eq: Optional[bool] = ..., - order: Optional[bool] = ..., + eq: Optional[_EqOrderType] = ..., + order: Optional[_EqOrderType] = ..., on_setattr: Optional[_OnSetAttrArgType] = ..., ) -> _T: ... @@ -231,8 +275,8 @@ def field( converter: Optional[_ConverterType] = ..., factory: Optional[Callable[[], _T]] = ..., kw_only: bool = ..., - eq: Optional[bool] = ..., - order: Optional[bool] = ..., + eq: Optional[_EqOrderType] = ..., + order: Optional[_EqOrderType] = ..., on_setattr: Optional[_OnSetAttrArgType] = ..., ) -> _T: ... @@ -249,17 +293,18 @@ def field( converter: Optional[_ConverterType] = ..., factory: Optional[Callable[[], _T]] = ..., kw_only: bool = ..., - eq: Optional[bool] = ..., - order: Optional[bool] = ..., + eq: Optional[_EqOrderType] = ..., + order: Optional[_EqOrderType] = ..., on_setattr: Optional[_OnSetAttrArgType] = ..., ) -> Any: ... 
@overload +@__dataclass_transform__(order_default=True, field_descriptors=(attrib, field)) def attrs( maybe_cls: _C, these: Optional[Dict[str, Any]] = ..., repr_ns: Optional[str] = ..., repr: bool = ..., - cmp: Optional[bool] = ..., + cmp: Optional[_EqOrderType] = ..., hash: Optional[bool] = ..., init: bool = ..., slots: bool = ..., @@ -270,21 +315,23 @@ def attrs( kw_only: bool = ..., cache_hash: bool = ..., auto_exc: bool = ..., - eq: Optional[bool] = ..., - order: Optional[bool] = ..., + eq: Optional[_EqOrderType] = ..., + order: Optional[_EqOrderType] = ..., auto_detect: bool = ..., collect_by_mro: bool = ..., getstate_setstate: Optional[bool] = ..., on_setattr: Optional[_OnSetAttrArgType] = ..., field_transformer: Optional[_FieldTransformer] = ..., + match_args: bool = ..., ) -> _C: ... @overload +@__dataclass_transform__(order_default=True, field_descriptors=(attrib, field)) def attrs( maybe_cls: None = ..., these: Optional[Dict[str, Any]] = ..., repr_ns: Optional[str] = ..., repr: bool = ..., - cmp: Optional[bool] = ..., + cmp: Optional[_EqOrderType] = ..., hash: Optional[bool] = ..., init: bool = ..., slots: bool = ..., @@ -295,15 +342,17 @@ def attrs( kw_only: bool = ..., cache_hash: bool = ..., auto_exc: bool = ..., - eq: Optional[bool] = ..., - order: Optional[bool] = ..., + eq: Optional[_EqOrderType] = ..., + order: Optional[_EqOrderType] = ..., auto_detect: bool = ..., collect_by_mro: bool = ..., getstate_setstate: Optional[bool] = ..., on_setattr: Optional[_OnSetAttrArgType] = ..., field_transformer: Optional[_FieldTransformer] = ..., + match_args: bool = ..., ) -> Callable[[_C], _C]: ... @overload +@__dataclass_transform__(field_descriptors=(attrib, field)) def define( maybe_cls: _C, *, @@ -325,8 +374,10 @@ def define( getstate_setstate: Optional[bool] = ..., on_setattr: Optional[_OnSetAttrArgType] = ..., field_transformer: Optional[_FieldTransformer] = ..., + match_args: bool = ..., ) -> _C: ... 
@overload +@__dataclass_transform__(field_descriptors=(attrib, field)) def define( maybe_cls: None = ..., *, @@ -348,22 +399,20 @@ def define( getstate_setstate: Optional[bool] = ..., on_setattr: Optional[_OnSetAttrArgType] = ..., field_transformer: Optional[_FieldTransformer] = ..., + match_args: bool = ..., ) -> Callable[[_C], _C]: ... mutable = define frozen = define # they differ only in their defaults -# TODO: add support for returning NamedTuple from the mypy plugin -class _Fields(Tuple[Attribute[Any], ...]): - def __getattr__(self, name: str) -> Attribute[Any]: ... - -def fields(cls: type) -> _Fields: ... -def fields_dict(cls: type) -> Dict[str, Attribute[Any]]: ... -def validate(inst: Any) -> None: ... +def fields(cls: Type[AttrsInstance]) -> Any: ... +def fields_dict(cls: Type[AttrsInstance]) -> Dict[str, Attribute[Any]]: ... +def validate(inst: AttrsInstance) -> None: ... def resolve_types( cls: _C, globalns: Optional[Dict[str, Any]] = ..., localns: Optional[Dict[str, Any]] = ..., + attribs: Optional[List[Attribute[Any]]] = ..., ) -> _C: ... # TODO: add support for returning a proper attrs class from the mypy plugin @@ -375,7 +424,7 @@ def make_class( bases: Tuple[type, ...] = ..., repr_ns: Optional[str] = ..., repr: bool = ..., - cmp: Optional[bool] = ..., + cmp: Optional[_EqOrderType] = ..., hash: Optional[bool] = ..., init: bool = ..., slots: bool = ..., @@ -386,8 +435,8 @@ def make_class( kw_only: bool = ..., cache_hash: bool = ..., auto_exc: bool = ..., - eq: Optional[bool] = ..., - order: Optional[bool] = ..., + eq: Optional[_EqOrderType] = ..., + order: Optional[_EqOrderType] = ..., collect_by_mro: bool = ..., on_setattr: Optional[_OnSetAttrArgType] = ..., field_transformer: Optional[_FieldTransformer] = ..., @@ -400,18 +449,22 @@ def make_class( # these: # https://github.com/python/mypy/issues/4236 # https://github.com/python/typing/issues/253 +# XXX: remember to fix attrs.asdict/astuple too! 
def asdict( - inst: Any, + inst: AttrsInstance, recurse: bool = ..., filter: Optional[_FilterType[Any]] = ..., dict_factory: Type[Mapping[Any, Any]] = ..., retain_collection_types: bool = ..., - value_serializer: Optional[Callable[[type, Attribute, Any], Any]] = ..., + value_serializer: Optional[ + Callable[[type, Attribute[Any], Any], Any] + ] = ..., + tuple_keys: Optional[bool] = ..., ) -> Dict[str, Any]: ... # TODO: add support for returning NamedTuple from the mypy plugin def astuple( - inst: Any, + inst: AttrsInstance, recurse: bool = ..., filter: Optional[_FilterType[Any]] = ..., tuple_factory: Type[Sequence[Any]] = ..., diff --git a/src/attr/_cmp.py b/src/attr/_cmp.py new file mode 100644 index 000000000..81b99e4c3 --- /dev/null +++ b/src/attr/_cmp.py @@ -0,0 +1,155 @@ +# SPDX-License-Identifier: MIT + + +import functools +import types + +from ._make import _make_ne + + +_operation_names = {"eq": "==", "lt": "<", "le": "<=", "gt": ">", "ge": ">="} + + +def cmp_using( + eq=None, + lt=None, + le=None, + gt=None, + ge=None, + require_same_type=True, + class_name="Comparable", +): + """ + Create a class that can be passed into `attr.ib`'s ``eq``, ``order``, and + ``cmp`` arguments to customize field comparison. + + The resulting class will have a full set of ordering methods if + at least one of ``{lt, le, gt, ge}`` and ``eq`` are provided. + + :param Optional[callable] eq: `callable` used to evaluate equality + of two objects. + :param Optional[callable] lt: `callable` used to evaluate whether + one object is less than another object. + :param Optional[callable] le: `callable` used to evaluate whether + one object is less than or equal to another object. + :param Optional[callable] gt: `callable` used to evaluate whether + one object is greater than another object. + :param Optional[callable] ge: `callable` used to evaluate whether + one object is greater than or equal to another object. 
+
+    :param bool require_same_type: When `True`, equality and ordering methods
+        will return `NotImplemented` if objects are not of the same type.
+
+    :param Optional[str] class_name: Name of class. Defaults to 'Comparable'.
+
+    See `comparison` for more details.
+
+    .. versionadded:: 21.1.0
+    """
+
+    body = {
+        "__slots__": ["value"],
+        "__init__": _make_init(),
+        "_requirements": [],
+        "_is_comparable_to": _is_comparable_to,
+    }
+
+    # Add operations.
+    num_order_functions = 0
+    has_eq_function = False
+
+    if eq is not None:
+        has_eq_function = True
+        body["__eq__"] = _make_operator("eq", eq)
+        body["__ne__"] = _make_ne()
+
+    if lt is not None:
+        num_order_functions += 1
+        body["__lt__"] = _make_operator("lt", lt)
+
+    if le is not None:
+        num_order_functions += 1
+        body["__le__"] = _make_operator("le", le)
+
+    if gt is not None:
+        num_order_functions += 1
+        body["__gt__"] = _make_operator("gt", gt)
+
+    if ge is not None:
+        num_order_functions += 1
+        body["__ge__"] = _make_operator("ge", ge)
+
+    type_ = types.new_class(
+        class_name, (object,), {}, lambda ns: ns.update(body)
+    )
+
+    # Add same type requirement.
+    if require_same_type:
+        type_._requirements.append(_check_same_type)
+
+    # Add total ordering if at least one operation was defined.
+    if 0 < num_order_functions < 4:
+        if not has_eq_function:
+            # functools.total_ordering requires __eq__ to be defined,
+            # so raise an early error here to keep a nice stack.
+            raise ValueError(
+                "eq must be defined in order to complete ordering from "
+                "lt, le, gt, ge."
+            )
+        type_ = functools.total_ordering(type_)
+
+    return type_
+
+
+def _make_init():
+    """
+    Create __init__ method.
+    """
+
+    def __init__(self, value):
+        """
+        Initialize object with *value*.
+        """
+        self.value = value
+
+    return __init__
+
+
+def _make_operator(name, func):
+    """
+    Create operator method.
+ """ + + def method(self, other): + if not self._is_comparable_to(other): + return NotImplemented + + result = func(self.value, other.value) + if result is NotImplemented: + return NotImplemented + + return result + + method.__name__ = "__%s__" % (name,) + method.__doc__ = "Return a %s b. Computed by attrs." % ( + _operation_names[name], + ) + + return method + + +def _is_comparable_to(self, other): + """ + Check whether `other` is comparable to `self`. + """ + for func in self._requirements: + if not func(self, other): + return False + return True + + +def _check_same_type(self, other): + """ + Return True if *self* and *other* are of the same type, False otherwise. + """ + return other.value.__class__ is self.value.__class__ diff --git a/src/attr/_cmp.pyi b/src/attr/_cmp.pyi new file mode 100644 index 000000000..35437eff6 --- /dev/null +++ b/src/attr/_cmp.pyi @@ -0,0 +1,13 @@ +from typing import Any, Callable, Optional, Type + +_CompareWithType = Callable[[Any, Any], bool] + +def cmp_using( + eq: Optional[_CompareWithType], + lt: Optional[_CompareWithType], + le: Optional[_CompareWithType], + gt: Optional[_CompareWithType], + ge: Optional[_CompareWithType], + require_same_type: bool, + class_name: str, +) -> Type: ... 
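For readers skimming this patch, here is a small usage sketch of the new `cmp_using` helper introduced above. It assumes an attrs release containing this change (>= 21.1.0) is installed; the class and field names are illustrative, not part of the diff:

```python
import attr


@attr.s
class CIString:
    # Passing a cmp_using-built class as `eq` makes the generated __eq__
    # wrap each field value in that class and compare via the supplied
    # callable, so equality on `val` ignores case.
    val = attr.ib(eq=attr.cmp_using(eq=lambda a, b: a.lower() == b.lower()))


assert CIString("Hello") == CIString("HELLO")
assert CIString("Hello") != CIString("World")
```

The same kind of class can be passed as `order=` when `cmp_using` is also given `lt`/`le`/`gt`/`ge`; with fewer than all four plus `eq`, the missing methods are filled in via `functools.total_ordering` as shown in `_cmp.py` above.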
diff --git a/src/attr/_compat.py b/src/attr/_compat.py index b0ead6e1c..582649325 100644 --- a/src/attr/_compat.py +++ b/src/attr/_compat.py @@ -1,16 +1,23 @@ -from __future__ import absolute_import, division, print_function +# SPDX-License-Identifier: MIT + +import inspect import platform import sys +import threading import types import warnings +from collections.abc import Mapping, Sequence # noqa + -PY2 = sys.version_info[0] == 2 PYPY = platform.python_implementation() == "PyPy" +PY36 = sys.version_info[:2] >= (3, 6) +HAS_F_STRINGS = PY36 +PY310 = sys.version_info[:2] >= (3, 10) -if PYPY or sys.version_info[:2] >= (3, 6): +if PYPY or PY36: ordered_dict = dict else: from collections import OrderedDict @@ -18,112 +25,54 @@ ordered_dict = OrderedDict -if PY2: - from collections import Mapping, Sequence - - from UserDict import IterableUserDict - - # We 'bundle' isclass instead of using inspect as importing inspect is - # fairly expensive (order of 10-15 ms for a modern machine in 2016) - def isclass(klass): - return isinstance(klass, (type, types.ClassType)) - - # TYPE is used in exceptions, repr(int) is different on Python 2 and 3. - TYPE = "type" - - def iteritems(d): - return d.iteritems() - - # Python 2 is bereft of a read-only dict proxy, so we make one! - class ReadOnlyDict(IterableUserDict): - """ - Best-effort read-only dict wrapper. - """ - - def __setitem__(self, key, val): - # We gently pretend we're a Python 3 mappingproxy. - raise TypeError( - "'mappingproxy' object does not support item assignment" - ) - - def update(self, _): - # We gently pretend we're a Python 3 mappingproxy. - raise AttributeError( - "'mappingproxy' object has no attribute 'update'" - ) - - def __delitem__(self, _): - # We gently pretend we're a Python 3 mappingproxy. - raise TypeError( - "'mappingproxy' object does not support item deletion" - ) - - def clear(self): - # We gently pretend we're a Python 3 mappingproxy. 
- raise AttributeError( - "'mappingproxy' object has no attribute 'clear'" - ) - - def pop(self, key, default=None): - # We gently pretend we're a Python 3 mappingproxy. - raise AttributeError( - "'mappingproxy' object has no attribute 'pop'" - ) +def just_warn(*args, **kw): + warnings.warn( + "Running interpreter doesn't sufficiently support code object " + "introspection. Some features like bare super() or accessing " + "__class__ will not work with slotted classes.", + RuntimeWarning, + stacklevel=2, + ) - def popitem(self): - # We gently pretend we're a Python 3 mappingproxy. - raise AttributeError( - "'mappingproxy' object has no attribute 'popitem'" - ) - def setdefault(self, key, default=None): - # We gently pretend we're a Python 3 mappingproxy. - raise AttributeError( - "'mappingproxy' object has no attribute 'setdefault'" - ) +class _AnnotationExtractor: + """ + Extract type annotations from a callable, returning None whenever there + is none. + """ - def __repr__(self): - # Override to be identical to the Python 3 version. - return "mappingproxy(" + repr(self.data) + ")" + __slots__ = ["sig"] - def metadata_proxy(d): - res = ReadOnlyDict() - res.data.update(d) # We blocked update, so we have to do it like this. - return res + def __init__(self, callable): + try: + self.sig = inspect.signature(callable) + except (ValueError, TypeError): # inspect failed + self.sig = None - def just_warn(*args, **kw): # pragma: no cover + def get_first_param_type(self): """ - We only warn on Python 3 because we are not aware of any concrete - consequences of not setting the cell on Python 2. + Return the type annotation of the first argument if it's not empty. """ + if not self.sig: + return None + params = list(self.sig.parameters.values()) + if params and params[0].annotation is not inspect.Parameter.empty: + return params[0].annotation -else: # Python 3 and later. 
- from collections.abc import Mapping, Sequence # noqa + return None - def just_warn(*args, **kw): + def get_return_type(self): """ - We only warn on Python 3 because we are not aware of any concrete - consequences of not setting the cell on Python 2. + Return the return type if it's not empty. """ - warnings.warn( - "Running interpreter doesn't sufficiently support code object " - "introspection. Some features like bare super() or accessing " - "__class__ will not work with slotted classes.", - RuntimeWarning, - stacklevel=2, - ) - - def isclass(klass): - return isinstance(klass, type) - - TYPE = "class" - - def iteritems(d): - return d.items() + if ( + self.sig + and self.sig.return_annotation is not inspect.Signature.empty + ): + return self.sig.return_annotation - def metadata_proxy(d): - return types.MappingProxyType(dict(d)) + return None def make_set_closure_cell(): @@ -155,26 +104,20 @@ def force_x_to_be_a_cell(): # pragma: no cover try: # Extract the code object and make sure our assumptions about # the closure behavior are correct. - if PY2: - co = set_first_cellvar_to.func_code - else: - co = set_first_cellvar_to.__code__ + co = set_first_cellvar_to.__code__ if co.co_cellvars != ("x",) or co.co_freevars != (): raise AssertionError # pragma: no cover # Convert this code object to a code object that sets the # function's first _freevar_ (not cellvar) to the argument. if sys.version_info >= (3, 8): - # CPython 3.8+ has an incompatible CodeType signature - # (added a posonlyargcount argument) but also added - # CodeType.replace() to do this without counting parameters. 
- set_first_freevar_code = co.replace( - co_cellvars=co.co_freevars, co_freevars=co.co_cellvars - ) + + def set_closure_cell(cell, value): + cell.cell_contents = value + else: args = [co.co_argcount] - if not PY2: - args.append(co.co_kwonlyargcount) + args.append(co.co_kwonlyargcount) args.extend( [ co.co_nlocals, @@ -195,15 +138,15 @@ def force_x_to_be_a_cell(): # pragma: no cover ) set_first_freevar_code = types.CodeType(*args) - def set_closure_cell(cell, value): - # Create a function using the set_first_freevar_code, - # whose first closure cell is `cell`. Calling it will - # change the value of that cell. - setter = types.FunctionType( - set_first_freevar_code, {}, "setter", (), (cell,) - ) - # And call it to set the cell. - setter(value) + def set_closure_cell(cell, value): + # Create a function using the set_first_freevar_code, + # whose first closure cell is `cell`. Calling it will + # change the value of that cell. + setter = types.FunctionType( + set_first_freevar_code, {}, "setter", (), (cell,) + ) + # And call it to set the cell. + setter(value) # Make sure it works on this interpreter: def make_func_with_cell(): @@ -214,10 +157,7 @@ def func(): return func - if PY2: - cell = make_func_with_cell().func_closure[0] - else: - cell = make_func_with_cell().__closure__[0] + cell = make_func_with_cell().__closure__[0] set_closure_cell(cell, 100) if cell.cell_contents != 100: raise AssertionError # pragma: no cover @@ -229,3 +169,17 @@ def func(): set_closure_cell = make_set_closure_cell() + +# Thread-local global to track attrs instances which are already being repr'd. +# This is needed because there is no other (thread-safe) way to pass info +# about the instances that are already being repr'd through the call stack +# in order to ensure we don't perform infinite recursion. +# +# For instance, if an instance contains a dict which contains that instance, +# we need to know that we're already repr'ing the outside instance from within +# the dict's repr() call. 
+# +# This lives here rather than in _make.py so that the functions in _make.py +# don't have a direct reference to the thread-local in their globals dict. +# If they have such a reference, it breaks cloudpickle. +repr_context = threading.local() diff --git a/src/attr/_config.py b/src/attr/_config.py index 8ec920962..96d420077 100644 --- a/src/attr/_config.py +++ b/src/attr/_config.py @@ -1,4 +1,4 @@ -from __future__ import absolute_import, division, print_function +# SPDX-License-Identifier: MIT __all__ = ["set_run_validators", "get_run_validators"] @@ -9,6 +9,10 @@ def set_run_validators(run): """ Set whether or not validators are run. By default, they are run. + + .. deprecated:: 21.3.0 It will not be removed, but it also will not be + moved to new ``attrs`` namespace. Use `attrs.validators.set_disabled()` + instead. """ if not isinstance(run, bool): raise TypeError("'run' must be bool.") @@ -19,5 +23,9 @@ def set_run_validators(run): def get_run_validators(): """ Return whether or not validators are run. + + .. deprecated:: 21.3.0 It will not be removed, but it also will not be + moved to new ``attrs`` namespace. Use `attrs.validators.get_disabled()` + instead. """ return _run_validators diff --git a/src/attr/_funcs.py b/src/attr/_funcs.py index e6c930cbd..a982d7cb5 100644 --- a/src/attr/_funcs.py +++ b/src/attr/_funcs.py @@ -1,8 +1,8 @@ -from __future__ import absolute_import, division, print_function +# SPDX-License-Identifier: MIT + import copy -from ._compat import iteritems from ._make import NOTHING, _obj_setattr, fields from .exceptions import AttrsAttributeNotFoundError @@ -25,7 +25,7 @@ def asdict( ``attrs``-decorated. :param callable filter: A callable whose return code determines whether an attribute or element is included (``True``) or dropped (``False``). Is - called with the `attr.Attribute` as the first argument and the + called with the `attrs.Attribute` as the first argument and the value as the second argument. 
:param callable dict_factory: A callable to produce dictionaries from. For example, to produce ordered dictionaries instead of normal Python @@ -46,6 +46,8 @@ def asdict( .. versionadded:: 16.0.0 *dict_factory* .. versionadded:: 16.1.0 *retain_collection_types* .. versionadded:: 20.3.0 *value_serializer* + .. versionadded:: 21.3.0 If a dict has a collection for a key, it is + serialized as a tuple. """ attrs = fields(inst.__class__) rv = dict_factory() @@ -61,11 +63,11 @@ def asdict( if has(v.__class__): rv[a.name] = asdict( v, - True, - filter, - dict_factory, - retain_collection_types, - value_serializer, + recurse=True, + filter=filter, + dict_factory=dict_factory, + retain_collection_types=retain_collection_types, + value_serializer=value_serializer, ) elif isinstance(v, (tuple, list, set, frozenset)): cf = v.__class__ if retain_collection_types is True else list @@ -73,10 +75,11 @@ def asdict( [ _asdict_anything( i, - filter, - dict_factory, - retain_collection_types, - value_serializer, + is_key=False, + filter=filter, + dict_factory=dict_factory, + retain_collection_types=retain_collection_types, + value_serializer=value_serializer, ) for i in v ] @@ -87,20 +90,22 @@ def asdict( ( _asdict_anything( kk, - filter, - df, - retain_collection_types, - value_serializer, + is_key=True, + filter=filter, + dict_factory=df, + retain_collection_types=retain_collection_types, + value_serializer=value_serializer, ), _asdict_anything( vv, - filter, - df, - retain_collection_types, - value_serializer, + is_key=False, + filter=filter, + dict_factory=df, + retain_collection_types=retain_collection_types, + value_serializer=value_serializer, ), ) - for kk, vv in iteritems(v) + for kk, vv in v.items() ) else: rv[a.name] = v @@ -111,6 +116,7 @@ def asdict( def _asdict_anything( val, + is_key, filter, dict_factory, retain_collection_types, @@ -123,22 +129,29 @@ def _asdict_anything( # Attrs class. 
rv = asdict( val, - True, - filter, - dict_factory, - retain_collection_types, - value_serializer, + recurse=True, + filter=filter, + dict_factory=dict_factory, + retain_collection_types=retain_collection_types, + value_serializer=value_serializer, ) elif isinstance(val, (tuple, list, set, frozenset)): - cf = val.__class__ if retain_collection_types is True else list + if retain_collection_types is True: + cf = val.__class__ + elif is_key: + cf = tuple + else: + cf = list + rv = cf( [ _asdict_anything( i, - filter, - dict_factory, - retain_collection_types, - value_serializer, + is_key=False, + filter=filter, + dict_factory=dict_factory, + retain_collection_types=retain_collection_types, + value_serializer=value_serializer, ) for i in val ] @@ -148,13 +161,23 @@ def _asdict_anything( rv = df( ( _asdict_anything( - kk, filter, df, retain_collection_types, value_serializer + kk, + is_key=True, + filter=filter, + dict_factory=df, + retain_collection_types=retain_collection_types, + value_serializer=value_serializer, ), _asdict_anything( - vv, filter, df, retain_collection_types, value_serializer + vv, + is_key=False, + filter=filter, + dict_factory=df, + retain_collection_types=retain_collection_types, + value_serializer=value_serializer, ), ) - for kk, vv in iteritems(val) + for kk, vv in val.items() ) else: rv = val @@ -181,7 +204,7 @@ def astuple( ``attrs``-decorated. :param callable filter: A callable whose return code determines whether an attribute or element is included (``True``) or dropped (``False``). Is - called with the `attr.Attribute` as the first argument and the + called with the `attrs.Attribute` as the first argument and the value as the second argument. :param callable tuple_factory: A callable to produce tuples from. For example, to produce lists instead of tuples. @@ -253,7 +276,7 @@ def astuple( if has(vv.__class__) else vv, ) - for kk, vv in iteritems(v) + for kk, vv in v.items() ) ) else: @@ -291,7 +314,9 @@ def assoc(inst, **changes): class. 
     .. deprecated:: 17.1.0
-        Use `evolve` instead.
+        Use `attrs.evolve` instead if you can.
+        This function will not be removed due to the slightly different
+        approach compared to `attrs.evolve`.
     """
     import warnings
 
@@ -302,7 +327,7 @@ def assoc(inst, **changes):
     )
     new = copy.copy(inst)
     attrs = fields(inst.__class__)
-    for k, v in iteritems(changes):
+    for k, v in changes.items():
         a = getattr(attrs, k, NOTHING)
         if a is NOTHING:
             raise AttrsAttributeNotFoundError(
@@ -343,7 +368,7 @@ def evolve(inst, **changes):
     return cls(**changes)
 
 
-def resolve_types(cls, globalns=None, localns=None):
+def resolve_types(cls, globalns=None, localns=None, attribs=None):
     """
     Resolve any strings and forward annotations in type annotations.
 
@@ -360,31 +385,36 @@ def resolve_types(cls, globalns=None, localns=None):
     :param type cls: Class to resolve.
     :param Optional[dict] globalns: Dictionary containing global variables.
     :param Optional[dict] localns: Dictionary containing local variables.
+    :param Optional[list] attribs: List of attribs for the given class.
+        This is necessary when calling from inside a ``field_transformer``
+        since *cls* is not an ``attrs`` class yet.
 
     :raise TypeError: If *cls* is not a class.
     :raise attr.exceptions.NotAnAttrsClassError: If *cls* is not an ``attrs``
-        class.
+        class and you didn't pass any attribs.
     :raise NameError: If types cannot be resolved because of missing variables.
 
     :returns: *cls* so you can use this function also as a class decorator.
-        Please note that you have to apply it **after** `attr.s`. That means
-        the decorator has to come in the line **before** `attr.s`.
+        Please note that you have to apply it **after** `attrs.define`. That
+        means the decorator has to come in the line **before** `attrs.define`.
 
     .. versionadded:: 20.1.0
+    .. versionadded:: 21.1.0 *attribs*
+
     """
-    try:
-        # Since calling get_type_hints is expensive we cache whether we've
-        # done it already.
- cls.__attrs_types_resolved__ - except AttributeError: + # Since calling get_type_hints is expensive we cache whether we've + # done it already. + if getattr(cls, "__attrs_types_resolved__", None) != cls: import typing hints = typing.get_type_hints(cls, globalns=globalns, localns=localns) - for field in fields(cls): + for field in fields(cls) if attribs is None else attribs: if field.name in hints: # Since fields have been frozen we must work around it. _obj_setattr(field, "type", hints[field.name]) - cls.__attrs_types_resolved__ = True + # We store the class we resolved so that subclasses know they haven't + # been resolved. + cls.__attrs_types_resolved__ = cls # Return the class so you can use it as a decorator too. return cls diff --git a/src/attr/_make.py b/src/attr/_make.py index 49484f935..4d1afe3fc 100644 --- a/src/attr/_make.py +++ b/src/attr/_make.py @@ -1,21 +1,21 @@ -from __future__ import absolute_import, division, print_function +# SPDX-License-Identifier: MIT import copy import linecache import sys -import threading -import uuid -import warnings +import types +import typing from operator import itemgetter -from . import _config, setters +# We need to import _compat itself in addition to the _compat members to avoid +# having the thread-local in the globals here. +from . 
import _compat, _config, setters from ._compat import ( - PY2, + HAS_F_STRINGS, + PY310, PYPY, - isclass, - iteritems, - metadata_proxy, + _AnnotationExtractor, ordered_dict, set_closure_cell, ) @@ -23,7 +23,6 @@ DefaultAlreadySetError, FrozenInstanceError, NotAnAttrsClassError, - PythonTooOldError, UnannotatedAttributeError, ) @@ -35,35 +34,47 @@ _tuple_property_pat = ( " {attr_name} = _attrs_property(_attrs_itemgetter({index}))" ) -_classvar_prefixes = ("typing.ClassVar", "t.ClassVar", "ClassVar") +_classvar_prefixes = ( + "typing.ClassVar", + "t.ClassVar", + "ClassVar", + "typing_extensions.ClassVar", +) # we don't use a double-underscore prefix because that triggers # name mangling when trying to create a slot for the field # (when slots=True) _hash_cache_field = "_attrs_cached_hash" -_empty_metadata_singleton = metadata_proxy({}) +_empty_metadata_singleton = types.MappingProxyType({}) # Unique object for unequivocal getattr() defaults. _sentinel = object() +_ng_default_on_setattr = setters.pipe(setters.convert, setters.validate) + -class _Nothing(object): +class _Nothing: """ Sentinel class to indicate the lack of a value when ``None`` is ambiguous. ``_Nothing`` is a singleton. There is only ever one of it. + + .. versionchanged:: 21.1.0 ``bool(NOTHING)`` is now False. """ _singleton = None def __new__(cls): if _Nothing._singleton is None: - _Nothing._singleton = super(_Nothing, cls).__new__(cls) + _Nothing._singleton = super().__new__(cls) return _Nothing._singleton def __repr__(self): return "NOTHING" + def __bool__(self): + return False + NOTHING = _Nothing() """ @@ -83,17 +94,8 @@ class _CacheHashWrapper(int): See GH #613 for more details. """ - if PY2: - # For some reason `type(None)` isn't callable in Python 2, but we don't - # actually need a constructor for None objects, we just need any - # available function that returns None. 
- def __reduce__(self, _none_constructor=getattr, _args=(0, "", None)): - return _none_constructor, _args - - else: - - def __reduce__(self, _none_constructor=type(None), _args=()): - return _none_constructor, _args + def __reduce__(self, _none_constructor=type(None), _args=()): + return _none_constructor, _args def attrib( @@ -124,11 +126,11 @@ def attrib( is used and no value is passed while instantiating or the attribute is excluded using ``init=False``. - If the value is an instance of `Factory`, its callable will be + If the value is an instance of `attrs.Factory`, its callable will be used to construct a new value (useful for mutable data types like lists or dicts). - If a default is not set (or set manually to `attr.NOTHING`), a value + If a default is not set (or set manually to `attrs.NOTHING`), a value *must* be supplied when instantiating; otherwise a `TypeError` will be raised. @@ -141,7 +143,7 @@ def attrib( :param validator: `callable` that is called by ``attrs``-generated ``__init__`` methods after the instance has been initialized. They - receive the initialized instance, the `Attribute`, and the + receive the initialized instance, the :func:`~attrs.Attribute`, and the passed value. The return value is *not* inspected so the validator has to throw an @@ -165,13 +167,25 @@ def attrib( as-is, i.e. it will be used directly *instead* of calling ``repr()`` (the default). :type repr: a `bool` or a `callable` to use a custom function. - :param bool eq: If ``True`` (default), include this attribute in the + + :param eq: If ``True`` (default), include this attribute in the generated ``__eq__`` and ``__ne__`` methods that check two instances - for equality. - :param bool order: If ``True`` (default), include this attributes in the + for equality. To override how the attribute value is compared, + pass a ``callable`` that takes a single value and returns the value + to be compared. + :type eq: a `bool` or a `callable`. 
+ + :param order: If ``True`` (default), include this attribute in the generated ``__lt__``, ``__le__``, ``__gt__`` and ``__ge__`` methods. - :param bool cmp: Setting to ``True`` is equivalent to setting ``eq=True, - order=True``. Deprecated in favor of *eq* and *order*. + To override how the attribute value is ordered, + pass a ``callable`` that takes a single value and returns the value + to be ordered. + :type order: a `bool` or a `callable`. + + :param cmp: Setting *cmp* is equivalent to setting *eq* and *order* to the + same value. Must not be mixed with *eq* or *order*. + :type cmp: a `bool` or a `callable`. + :param Optional[bool] hash: Include this attribute in the generated ``__hash__`` method. If ``None`` (default), mirror *eq*'s value. This is the correct behavior according to the Python spec. Setting this value @@ -189,7 +203,7 @@ components. See `extending_metadata`. :param type: The type of the attribute. In Python 3.6 or greater, the preferred method to specify the type is using a variable annotation - (see `PEP 526 `_). + (see :pep:`526`). This argument is provided for backward compatibility. Regardless of the approach used, the type will be stored on ``Attribute.type``. @@ -202,10 +216,10 @@ parameter is ignored). :param on_setattr: Allows overwriting the *on_setattr* setting from `attr.s`. If left `None`, the *on_setattr* value from `attr.s` is used. - Set to `attr.setters.NO_OP` to run **no** `setattr` hooks for this + Set to `attrs.setters.NO_OP` to run **no** `setattr` hooks for this attribute -- regardless of the setting in `attr.s`. :type on_setattr: `callable`, or a list of callables, or `None`, or - `attr.setters.NO_OP` + `attrs.setters.NO_OP` .. versionadded:: 15.2.0 *convert* .. versionadded:: 16.3.0 *metadata* @@ -219,14 +233,19 @@ .. versionadded:: 18.1.0 ``factory=f`` is syntactic sugar for ``default=attr.Factory(f)``. .. versionadded:: 18.2.0 *kw_only* - .. 
versionchanged:: 19.2.0 *convert* keyword argument removed + .. versionchanged:: 19.2.0 *convert* keyword argument removed. .. versionchanged:: 19.2.0 *repr* also accepts a custom callable. .. deprecated:: 19.2.0 *cmp* Removal on or after 2021-06-01. .. versionadded:: 19.2.0 *eq* and *order* .. versionadded:: 20.1.0 *on_setattr* .. versionchanged:: 20.3.0 *kw_only* backported to Python 2 + .. versionchanged:: 21.1.0 + *eq*, *order*, and *cmp* also accept a custom callable + .. versionchanged:: 21.1.0 *cmp* undeprecated """ - eq, order = _determine_eq_order(cmp, eq, order, True) + eq, eq_key, order, order_key = _determine_attrib_eq_order( + cmp, eq, order, True + ) if hash is not None and hash is not True and hash is not False: raise TypeError( @@ -268,11 +287,50 @@ type=type, kw_only=kw_only, eq=eq, + eq_key=eq_key, order=order, + order_key=order_key, on_setattr=on_setattr, ) +def _compile_and_eval(script, globs, locs=None, filename=""): + """ + "Exec" the script with the given global (globs) and local (locs) variables. + """ + bytecode = compile(script, filename, "exec") + eval(bytecode, globs, locs) + + +def _make_method(name, script, filename, globs): + """ + Create the method with the script given and return the method object. + """ + locs = {} + + # In order for debuggers like PDB to be able to step through the code, + # we add a fake linecache entry. + count = 1 + base_filename = filename + while True: + linecache_tuple = ( + len(script), + None, + script.splitlines(True), + filename, + ) + old_val = linecache.cache.setdefault(filename, linecache_tuple) + if old_val == linecache_tuple: + break + else: + filename = "{}-{}>".format(base_filename[:-1], count) + count += 1 + + _compile_and_eval(script, globs, locs, filename) + + return locs[name] + + def _make_attr_tuple_class(cls_name, attr_names): """ Create a tuple subclass to hold `Attribute`s for an `attrs` class. 
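The `_make_method` helper in this patch registers the generated source in `linecache` so that tracebacks and debuggers can display code that never existed on disk. Below is a minimal standalone sketch of the same technique; `make_method` and the demo filename are illustrative stand-ins (the real helper also handles filename collisions with a retry loop):

```python
import linecache


def make_method(name, script, filename):
    # Register the generated source in linecache so debuggers and
    # tracebacks can display it -- the same trick _make_method uses.
    linecache.cache[filename] = (
        len(script),
        None,
        script.splitlines(True),
        filename,
    )
    locs = {}
    # Compile and exec the script, collecting the defined function.
    eval(compile(script, filename, "exec"), {}, locs)
    return locs[name]


src = "def add_one(self, x):\n    return x + 1\n"
add_one = make_method("add_one", src, "<demo generated add_one>")
print(add_one(None, 41))  # the generated method is a normal function
print(linecache.getline("<demo generated add_one>", 1).rstrip())
```

Because the fake filename is in `linecache.cache`, `traceback` and `pdb` can fetch the source line for frames that point at it, which is what makes the generated `__init__`/`__eq__`/`__hash__` methods steppable.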
@@ -296,8 +354,7 @@ class MyClassAttributes(tuple): else: attr_class_template.append(" pass") globs = {"_attrs_itemgetter": itemgetter, "_attrs_property": property} - eval(compile("\n".join(attr_class_template), "", "exec"), globs) - + _compile_and_eval("\n".join(attr_class_template), globs) return globs[attr_class_name] @@ -324,7 +381,13 @@ def _is_class_var(annot): annotations which would put attrs-based classes at a performance disadvantage compared to plain old classes. """ - return str(annot).startswith(_classvar_prefixes) + annot = str(annot) + + # Annotation can be quoted. + if annot.startswith(("'", '"')) and annot.endswith(("'", '"')): + annot = annot[1:-1] + + return annot.startswith(_classvar_prefixes) def _has_own_attribute(cls, attrib_name): @@ -438,7 +501,7 @@ def _transform_attrs( anns = _get_annotations(cls) if these is not None: - ca_list = [(name, ca) for name, ca in iteritems(these)] + ca_list = [(name, ca) for name, ca in these.items()] if not isinstance(these, ordered_dict): ca_list.sort(key=_counter_getter) @@ -498,15 +561,11 @@ def _transform_attrs( cls, {a.name for a in own_attrs} ) - attr_names = [a.name for a in base_attrs + own_attrs] - - AttrsClass = _make_attr_tuple_class(cls.__name__, attr_names) - if kw_only: own_attrs = [a.evolve(kw_only=True) for a in own_attrs] base_attrs = [a.evolve(kw_only=True) for a in base_attrs] - attrs = AttrsClass(base_attrs + own_attrs) + attrs = base_attrs + own_attrs # Mandatory vs non-mandatory attr order only matters when they are part of # the __init__ signature and when they aren't kw_only (which are moved to @@ -525,7 +584,13 @@ def _transform_attrs( if field_transformer is not None: attrs = field_transformer(cls, attrs) - return _Attributes((attrs, base_attrs, base_attr_map)) + + # Create AttrsClass *after* applying the field_transformer since it may + # add or remove attributes! 
+ attr_names = [a.name for a in attrs] + AttrsClass = _make_attr_tuple_class(cls.__name__, attr_names) + + return _Attributes((AttrsClass(attrs), base_attrs, base_attr_map)) if PYPY: @@ -543,7 +608,6 @@ def _frozen_setattrs(self, name, value): raise FrozenInstanceError() - else: def _frozen_setattrs(self, name, value): @@ -560,7 +624,7 @@ def _frozen_delattrs(self, name): raise FrozenInstanceError() -class _ClassBuilder(object): +class _ClassBuilder: """ Iteratively build *one* class. """ @@ -575,12 +639,13 @@ class _ClassBuilder(object): "_cls_dict", "_delete_attribs", "_frozen", + "_has_pre_init", "_has_post_init", "_is_exc", "_on_setattr", "_slots", "_weakref_slot", - "_has_own_setattr", + "_wrote_own_setattr", "_has_custom_setattr", ) @@ -613,20 +678,21 @@ def __init__( self._cls = cls self._cls_dict = dict(cls.__dict__) if slots else {} self._attrs = attrs - self._base_names = set(a.name for a in base_attrs) + self._base_names = {a.name for a in base_attrs} self._base_attr_map = base_map self._attr_names = tuple(a.name for a in attrs) self._slots = slots self._frozen = frozen self._weakref_slot = weakref_slot self._cache_hash = cache_hash + self._has_pre_init = bool(getattr(cls, "__attrs_pre_init__", False)) self._has_post_init = bool(getattr(cls, "__attrs_post_init__", False)) self._delete_attribs = not bool(these) self._is_exc = is_exc self._on_setattr = on_setattr self._has_custom_setattr = has_custom_setattr - self._has_own_setattr = False + self._wrote_own_setattr = False self._cls_dict["__attrs_attrs__"] = self._attrs @@ -634,7 +700,33 @@ def __init__( self._cls_dict["__setattr__"] = _frozen_setattrs self._cls_dict["__delattr__"] = _frozen_delattrs - self._has_own_setattr = True + self._wrote_own_setattr = True + elif on_setattr in ( + _ng_default_on_setattr, + setters.validate, + setters.convert, + ): + has_validator = has_converter = False + for a in attrs: + if a.validator is not None: + has_validator = True + if a.converter is not None: + 
has_converter = True + + if has_validator and has_converter: + break + if ( + ( + on_setattr == _ng_default_on_setattr + and not (has_validator or has_converter) + ) + or (on_setattr == setters.validate and not has_validator) + or (on_setattr == setters.convert and not has_converter) + ): + # If class-level on_setattr is set to convert + validate, but + # there's no field to convert or validate, pretend like there's + # no on_setattr. + self._on_setattr = None if getstate_setstate: ( @@ -684,13 +776,13 @@ def _patch_original_class(self): # If we've inherited an attrs __setattr__ and don't write our own, # reset it to object's. - if not self._has_own_setattr and getattr( + if not self._wrote_own_setattr and getattr( cls, "__attrs_own_setattr__", False ): cls.__attrs_own_setattr__ = False if not self._has_custom_setattr: - cls.__setattr__ = object.__setattr__ + cls.__setattr__ = _obj_setattr return cls @@ -698,10 +790,9 @@ def _create_slots_class(self): """ Build and return a new class with a `__slots__` attribute. """ - base_names = self._base_names cd = { k: v - for k, v in iteritems(self._cls_dict) + for k, v in self._cls_dict.items() if k not in tuple(self._attr_names) + ("__dict__", "__weakref__") } @@ -713,21 +804,30 @@ def _create_slots_class(self): # XXX: a non-attrs class and subclass the resulting class with an attrs # XXX: class. See `test_slotted_confused` for details. For now that's # XXX: OK with us. - if not self._has_own_setattr: + if not self._wrote_own_setattr: cd["__attrs_own_setattr__"] = False if not self._has_custom_setattr: for base_cls in self._cls.__bases__: if base_cls.__dict__.get("__attrs_own_setattr__", False): - cd["__setattr__"] = object.__setattr__ + cd["__setattr__"] = _obj_setattr break - # Traverse the MRO to check for an existing __weakref__. + # Traverse the MRO to collect existing slots + # and check for an existing __weakref__. 
+ existing_slots = dict() weakref_inherited = False for base_cls in self._cls.__mro__[1:-1]: if base_cls.__dict__.get("__weakref__", None) is not None: weakref_inherited = True - break + existing_slots.update( + { + name: getattr(base_cls, name) + for name in getattr(base_cls, "__slots__", []) + } + ) + + base_names = set(self._base_names) names = self._attr_names if ( @@ -741,19 +841,28 @@ # We only add the names of attributes that aren't inherited. # Setting __slots__ to inherited attributes wastes memory. slot_names = [name for name in names if name not in base_names] + # There are slots for attributes from the current class + # that are defined in parent classes. + # As their descriptors may be overridden by a child class, + # we collect them here and update the class dict + reused_slots = { + slot: slot_descriptor + for slot, slot_descriptor in existing_slots.items() + if slot in slot_names + } + slot_names = [name for name in slot_names if name not in reused_slots] + cd.update(reused_slots) if self._cache_hash: slot_names.append(_hash_cache_field) cd["__slots__"] = tuple(slot_names) - qualname = getattr(self._cls, "__qualname__", None) - if qualname is not None: - cd["__qualname__"] = qualname + cd["__qualname__"] = self._cls.__qualname__ # Create new class based on old class and our methods. cls = type(self._cls)(self._cls.__name__, self._cls.__bases__, cd) # The following is a fix for - # https://github.com/python-attrs/attrs/issues/102. On Python 3, + # <https://github.com/python-attrs/attrs/issues/102>. On Python 3, # if a method mentions `__class__` or uses the no-arg super(), the # compiler will bake a reference to the class in the method itself # as `method.__closure__`. Since we replace the class with a @@ -763,6 +872,10 @@ # Class- and staticmethods hide their functions inside. # These might need to be rewritten as well. 
closure_cells = getattr(item.__func__, "__closure__", None) + elif isinstance(item, property): + # Workaround for property `super()` shortcut (PY3-only). + # There is no universal way for other descriptors. + closure_cells = getattr(item.fget, "__closure__", None) else: closure_cells = getattr(item, "__closure__", None) @@ -781,7 +894,7 @@ def _create_slots_class(self): def add_repr(self, ns): self._cls_dict["__repr__"] = self._add_method_dunders( - _make_repr(self._attrs, ns=ns) + _make_repr(self._attrs, ns, self._cls) ) return self @@ -853,14 +966,41 @@ def add_init(self): _make_init( self._cls, self._attrs, + self._has_pre_init, self._has_post_init, self._frozen, self._slots, self._cache_hash, self._base_attr_map, self._is_exc, - self._on_setattr is not None - and self._on_setattr is not setters.NO_OP, + self._on_setattr, + attrs_init=False, + ) + ) + + return self + + def add_match_args(self): + self._cls_dict["__match_args__"] = tuple( + field.name + for field in self._attrs + if field.init and not field.kw_only + ) + + def add_attrs_init(self): + self._cls_dict["__attrs_init__"] = self._add_method_dunders( + _make_init( + self._cls, + self._attrs, + self._has_pre_init, + self._has_post_init, + self._frozen, + self._slots, + self._cache_hash, + self._base_attr_map, + self._is_exc, + self._on_setattr, + attrs_init=True, ) ) @@ -918,7 +1058,7 @@ def __setattr__(self, name, val): self._cls_dict["__attrs_own_setattr__"] = True self._cls_dict["__setattr__"] = self._add_method_dunders(__setattr__) - self._has_own_setattr = True + self._wrote_own_setattr = True return self @@ -948,13 +1088,7 @@ def _add_method_dunders(self, method): return method -_CMP_DEPRECATION = ( - "The usage of `cmp` is deprecated and will be removed on or after " - "2021-06-01. Please use `eq` and `order` instead." -) - - -def _determine_eq_order(cmp, eq, order, default_eq): +def _determine_attrs_eq_order(cmp, eq, order, default_eq): """ Validate the combination of *cmp*, *eq*, and *order*. 
Derive the effective values of eq and order. If *eq* is None, set it to *default_eq*. @@ -964,8 +1098,6 @@ # cmp takes precedence due to bw-compatibility. if cmp is not None: - warnings.warn(_CMP_DEPRECATION, DeprecationWarning, stacklevel=3) - return cmp, cmp # If left None, equality is set to the specified default and ordering @@ -982,6 +1114,47 @@ return eq, order +def _determine_attrib_eq_order(cmp, eq, order, default_eq): + """ + Validate the combination of *cmp*, *eq*, and *order*. Derive the effective + values of eq and order. If *eq* is None, set it to *default_eq*. + """ + if cmp is not None and any((eq is not None, order is not None)): + raise ValueError("Don't mix `cmp` with `eq` and `order`.") + + def decide_callable_or_boolean(value): + """ + Decide whether a key function is used. + """ + if callable(value): + value, key = True, value + else: + key = None + return value, key + + # cmp takes precedence due to bw-compatibility. + if cmp is not None: + cmp, cmp_key = decide_callable_or_boolean(cmp) + return cmp, cmp_key, cmp, cmp_key + + # If left None, equality is set to the specified default and ordering + # mirrors equality. + if eq is None: + eq, eq_key = default_eq, None + else: + eq, eq_key = decide_callable_or_boolean(eq) + + if order is None: + order, order_key = eq, eq_key + else: + order, order_key = decide_callable_or_boolean(order) + + if eq is False and order is True: + raise ValueError("`order` can only be True if `eq` is True too.") + + return eq, eq_key, order, order_key + + def _determine_whether_to_implement( cls, flag, auto_detect, dunders, default=True ): @@ -993,8 +1166,6 @@ whose presence signals that the user has implemented it themselves. Return *default* if no reason for either for or against is found. - - auto_detect must be False on Python 2. 
""" if flag is True or flag is False: return flag @@ -1033,6 +1204,7 @@ def attrs( getstate_setstate=None, on_setattr=None, field_transformer=None, + match_args=True, ): r""" A class decorator that adds `dunder @@ -1064,7 +1236,7 @@ def attrs( inherited from some base class). So for example by implementing ``__eq__`` on a class yourself, - ``attrs`` will deduce ``eq=False`` and won't create *neither* + ``attrs`` will deduce ``eq=False`` and will create *neither* ``__eq__`` *nor* ``__ne__`` (but Python classes come with a sensible ``__ne__`` by default, so it *should* be enough to only implement ``__eq__`` in most cases). @@ -1081,7 +1253,7 @@ def attrs( *cmp*, or *hash* overrides whatever *auto_detect* would determine. *auto_detect* requires Python 3. Setting it ``True`` on Python 2 raises - a `PythonTooOldError`. + an `attrs.exceptions.PythonTooOldError`. :param bool repr: Create a ``__repr__`` method with a human readable representation of ``attrs`` attributes.. @@ -1097,10 +1269,8 @@ def attrs( ``__gt__``, and ``__ge__`` methods that behave like *eq* above and allow instances to be ordered. If ``None`` (default) mirror value of *eq*. - :param Optional[bool] cmp: Setting to ``True`` is equivalent to setting - ``eq=True, order=True``. Deprecated in favor of *eq* and *order*, has - precedence over them for backward-compatibility though. Must not be - mixed with *eq* or *order*. + :param Optional[bool] cmp: Setting *cmp* is equivalent to setting *eq* + and *order* to the same value. Must not be mixed with *eq* or *order*. :param Optional[bool] hash: If ``None`` (default), the ``__hash__`` method is generated according how *eq* and *frozen* are set. @@ -1121,9 +1291,16 @@ def attrs( behavior `_ for more details. :param bool init: Create a ``__init__`` method that initializes the - ``attrs`` attributes. Leading underscores are stripped for the - argument name. 
If a ``__attrs_post_init__`` method exists on the - class, it will be called after the class is fully initialized. + ``attrs`` attributes. Leading underscores are stripped for the argument + name. If a ``__attrs_pre_init__`` method exists on the class, it will + be called before the class is initialized. If a ``__attrs_post_init__`` + method exists on the class, it will be called after the class is fully + initialized. + + If ``init`` is ``False``, an ``__attrs_init__`` method will be + injected instead. This allows you to define a custom ``__init__`` + method that can do pre-init work such as ``super().__init__()``, + and then call ``__attrs_init__()`` and ``__attrs_post_init__()``. :param bool slots: Create a `slotted class ` that's more memory-efficient. Slotted classes are generally superior to the default dict classes, but have some gotchas you should know about, so we @@ -1152,7 +1329,7 @@ def attrs( :param bool weakref_slot: Make instances weak-referenceable. This has no effect unless ``slots`` is also enabled. - :param bool auto_attribs: If ``True``, collect `PEP 526`_-annotated + :param bool auto_attribs: If ``True``, collect :pep:`526`-annotated attributes (Python 3.6 and later only) from the class body. In this case, you **must** annotate every field. If ``attrs`` @@ -1163,13 +1340,21 @@ def attrs( If you assign a value to those attributes (e.g. ``x: int = 42``), that value becomes the default value like if it were passed using - ``attr.ib(default=42)``. Passing an instance of `Factory` also - works as expected. + ``attr.ib(default=42)``. Passing an instance of `attrs.Factory` also + works as expected in most cases (see warning below). Attributes annotated as `typing.ClassVar`, and attributes that are neither annotated nor set to an `attr.ib` are **ignored**. - .. _`PEP 526`: https://www.python.org/dev/peps/pep-0526/ + .. warning:: + For features that use the attribute name to create decorators (e.g. 
+ `validators `), you still *must* assign `attr.ib` to + them. Otherwise Python will either not find the name or try to use + the default value to call e.g. ``validator`` on it. + + These errors can be quite confusing and probably the most common bug + report on our bug tracker. + :param bool kw_only: Make all attributes keyword-only (Python 3+) in the generated ``__init__`` (if ``init`` is ``False``, this parameter is ignored). @@ -1196,7 +1381,7 @@ :param bool collect_by_mro: Setting this to `True` fixes the way ``attrs`` collects attributes from base classes. The default behavior is incorrect in certain cases of multiple inheritance. It should be on by - default but is kept off for backward-compatability. + default but is kept off for backward-compatibility. See issue `#428 <https://github.com/python-attrs/attrs/issues/428>`_ for more details. @@ -1226,7 +1411,9 @@ the callable. If a list of callables is passed, they're automatically wrapped in an - `attr.setters.pipe`. + `attrs.setters.pipe`. + :type on_setattr: `callable`, or a list of callables, or `None`, or + `attrs.setters.NO_OP` :param Optional[callable] field_transformer: A function that is called with the original class object and all @@ -1234,6 +1421,12 @@ this, e.g., to automatically add converters or validators to fields based on their types. See `transform-fields` for more details. + :param bool match_args: + If `True` (default), set ``__match_args__`` on the class to support + :pep:`634` (Structural Pattern Matching). It is a tuple of all + non-keyword-only ``__init__`` parameter names on Python 3.10 and later. + Ignored on older Python versions. + .. versionadded:: 16.0.0 *slots* .. versionadded:: 16.1.0 *frozen* .. versionadded:: 16.3.0 *str* @@ -1263,23 +1456,19 @@ .. versionadded:: 20.1.0 *getstate_setstate* .. versionadded:: 20.1.0 *on_setattr* .. versionadded:: 20.3.0 *field_transformer* + .. versionchanged:: 21.1.0 + ``init=False`` injects ``__attrs_init__`` + .. 
versionchanged:: 21.1.0 Support for ``__attrs_pre_init__`` + .. versionchanged:: 21.1.0 *cmp* undeprecated + .. versionadded:: 21.3.0 *match_args* """ - if auto_detect and PY2: - raise PythonTooOldError( - "auto_detect only works on Python 3 and later." - ) - - eq_, order_ = _determine_eq_order(cmp, eq, order, None) + eq_, order_ = _determine_attrs_eq_order(cmp, eq, order, None) hash_ = hash # work around the lack of nonlocal if isinstance(on_setattr, (list, tuple)): on_setattr = setters.pipe(*on_setattr) def wrap(cls): - - if getattr(cls, "__class__", None) is None: - raise TypeError("attrs only works with new-style classes.") - is_frozen = frozen or _has_frozen_base_class(cls) is_exc = auto_exc is True and issubclass(cls, BaseException) has_own_setattr = auto_detect and _has_own_attribute( @@ -1372,12 +1561,20 @@ def wrap(cls): ): builder.add_init() else: + builder.add_attrs_init() if cache_hash: raise TypeError( "Invalid value for cache_hash. To use hash caching," " init must be True." ) + if ( + PY310 + and match_args + and not _has_own_attribute(cls, "__match_args__") + ): + builder.add_match_args() + return builder.build_class() # maybe_cls's type depends on the usage of the decorator. It's a class @@ -1395,65 +1592,24 @@ def wrap(cls): """ -if PY2: - - def _has_frozen_base_class(cls): - """ - Check whether *cls* has a frozen ancestor by looking at its - __setattr__. - """ - return ( - getattr(cls.__setattr__, "__module__", None) - == _frozen_setattrs.__module__ - and cls.__setattr__.__name__ == _frozen_setattrs.__name__ - ) - - -else: - - def _has_frozen_base_class(cls): - """ - Check whether *cls* has a frozen ancestor by looking at its - __setattr__. - """ - return cls.__setattr__ == _frozen_setattrs - - -def _attrs_to_tuple(obj, attrs): +def _has_frozen_base_class(cls): """ - Create a tuple of all values of *obj*'s *attrs*. + Check whether *cls* has a frozen ancestor by looking at its + __setattr__. 
""" - return tuple(getattr(obj, a.name) for a in attrs) + return cls.__setattr__ is _frozen_setattrs def _generate_unique_filename(cls, func_name): """ Create a "filename" suitable for a function being generated. """ - unique_id = uuid.uuid4() - extra = "" - count = 1 - - while True: - unique_filename = "".format( - func_name, - cls.__module__, - getattr(cls, "__qualname__", cls.__name__), - extra, - ) - # To handle concurrency we essentially "reserve" our spot in - # the linecache with a dummy line. The caller can then - # set this value correctly. - cache_line = (1, None, (str(unique_id),), unique_filename) - if ( - linecache.cache.setdefault(unique_filename, cache_line) - == cache_line - ): - return unique_filename - - # Looks like this spot is taken. Try again. - count += 1 - extra = "-{0}".format(count) + unique_filename = "".format( + func_name, + cls.__module__, + getattr(cls, "__qualname__", cls.__name__), + ) + return unique_filename def _make_hash(cls, attrs, frozen, cache_hash): @@ -1465,6 +1621,8 @@ def _make_hash(cls, attrs, frozen, cache_hash): unique_filename = _generate_unique_filename(cls, "hash") type_hash = hash(unique_filename) + # If eq is custom generated, we need to include the functions in globs + globs = {} hash_def = "def __hash__(self" hash_func = "hash((" @@ -1472,8 +1630,7 @@ def _make_hash(cls, attrs, frozen, cache_hash): if not cache_hash: hash_def += "):" else: - if not PY2: - hash_def += ", *" + hash_def += ", *" hash_def += ( ", _cache_wrapper=" @@ -1499,7 +1656,14 @@ def append_hash_computation_lines(prefix, indent): ) for a in attrs: - method_lines.append(indent + " self.%s," % a.name) + if a.eq_key: + cmp_name = "_%s_key" % (a.name,) + globs[cmp_name] = a.eq_key + method_lines.append( + indent + " %s(self.%s)," % (cmp_name, a.name) + ) + else: + method_lines.append(indent + " self.%s," % a.name) method_lines.append(indent + " " + closing_braces) @@ -1519,21 +1683,7 @@ def append_hash_computation_lines(prefix, indent): 
append_hash_computation_lines("return ", tab) script = "\n".join(method_lines) - globs = {} - locs = {} - bytecode = compile(script, unique_filename, "exec") - eval(bytecode, globs, locs) - - # In order of debuggers like PDB being able to step through the code, - # we add a fake linecache entry. - linecache.cache[unique_filename] = ( - len(script), - None, - script.splitlines(True), - unique_filename, - ) - - return locs["__hash__"] + return _make_method("__hash__", script, unique_filename, globs) def _add_hash(cls, attrs): @@ -1575,34 +1725,44 @@ def _make_eq(cls, attrs): " if other.__class__ is not self.__class__:", " return NotImplemented", ] + # We can't just do a big self.x = other.x and... clause due to # irregularities like nan == nan is false but (nan,) == (nan,) is true. + globs = {} if attrs: lines.append(" return (") others = [" ) == ("] for a in attrs: - lines.append(" self.%s," % (a.name,)) - others.append(" other.%s," % (a.name,)) + if a.eq_key: + cmp_name = "_%s_key" % (a.name,) + # Add the key function to the global namespace + # of the evaluated function. + globs[cmp_name] = a.eq_key + lines.append( + " %s(self.%s)," + % ( + cmp_name, + a.name, + ) + ) + others.append( + " %s(other.%s)," + % ( + cmp_name, + a.name, + ) + ) + else: + lines.append(" self.%s," % (a.name,)) + others.append(" other.%s," % (a.name,)) lines += others + [" )"] else: lines.append(" return True") script = "\n".join(lines) - globs = {} - locs = {} - bytecode = compile(script, unique_filename, "exec") - eval(bytecode, globs, locs) - # In order of debuggers like PDB being able to step through the code, - # we add a fake linecache entry. - linecache.cache[unique_filename] = ( - len(script), - None, - script.splitlines(True), - unique_filename, - ) - return locs["__eq__"] + return _make_method("__eq__", script, unique_filename, globs) def _make_order(cls, attrs): @@ -1615,7 +1775,12 @@ def attrs_to_tuple(obj): """ Save us some typing. 
""" - return _attrs_to_tuple(obj, attrs) + return tuple( + key(value) if key else value + for value, key in ( + (getattr(obj, a.name), a.order_key) for a in attrs + ) + ) def __lt__(self, other): """ @@ -1669,66 +1834,126 @@ def _add_eq(cls, attrs=None): return cls -_already_repring = threading.local() +if HAS_F_STRINGS: + def _make_repr(attrs, ns, cls): + unique_filename = _generate_unique_filename(cls, "repr") + # Figure out which attributes to include, and which function to use to + # format them. The a.repr value can be either bool or a custom + # callable. + attr_names_with_reprs = tuple( + (a.name, (repr if a.repr is True else a.repr), a.init) + for a in attrs + if a.repr is not False + ) + globs = { + name + "_repr": r + for name, r, _ in attr_names_with_reprs + if r != repr + } + globs["_compat"] = _compat + globs["AttributeError"] = AttributeError + globs["NOTHING"] = NOTHING + attribute_fragments = [] + for name, r, i in attr_names_with_reprs: + accessor = ( + "self." + name + if i + else 'getattr(self, "' + name + '", NOTHING)' + ) + fragment = ( + "%s={%s!r}" % (name, accessor) + if r == repr + else "%s={%s_repr(%s)}" % (name, name, accessor) + ) + attribute_fragments.append(fragment) + repr_fragment = ", ".join(attribute_fragments) -def _make_repr(attrs, ns): - """ - Make a repr method that includes relevant *attrs*, adding *ns* to the full - name. 
- """ + if ns is None: + cls_name_fragment = ( + '{self.__class__.__qualname__.rsplit(">.", 1)[-1]}' + ) + else: + cls_name_fragment = ns + ".{self.__class__.__name__}" + + lines = [ + "def __repr__(self):", + " try:", + " already_repring = _compat.repr_context.already_repring", + " except AttributeError:", + " already_repring = {id(self),}", + " _compat.repr_context.already_repring = already_repring", + " else:", + " if id(self) in already_repring:", + " return '...'", + " else:", + " already_repring.add(id(self))", + " try:", + " return f'%s(%s)'" % (cls_name_fragment, repr_fragment), + " finally:", + " already_repring.remove(id(self))", + ] + + return _make_method( + "__repr__", "\n".join(lines), unique_filename, globs=globs + ) - # Figure out which attributes to include, and which function to use to - # format them. The a.repr value can be either bool or a custom callable. - attr_names_with_reprs = tuple( - (a.name, repr if a.repr is True else a.repr) - for a in attrs - if a.repr is not False - ) +else: - def __repr__(self): + def _make_repr(attrs, ns, _): """ - Automatically created by attrs. + Make a repr method that includes relevant *attrs*, adding *ns* to the + full name. """ - try: - working_set = _already_repring.working_set - except AttributeError: - working_set = set() - _already_repring.working_set = working_set - if id(self) in working_set: - return "..." - real_cls = self.__class__ - if ns is None: - qualname = getattr(real_cls, "__qualname__", None) - if qualname is not None: - class_name = qualname.rsplit(">.", 1)[-1] + # Figure out which attributes to include, and which function to use to + # format them. The a.repr value can be either bool or a custom + # callable. + attr_names_with_reprs = tuple( + (a.name, repr if a.repr is True else a.repr) + for a in attrs + if a.repr is not False + ) + + def __repr__(self): + """ + Automatically created by attrs. 
+ """ + try: + already_repring = _compat.repr_context.already_repring + except AttributeError: + already_repring = set() + _compat.repr_context.already_repring = already_repring + + if id(self) in already_repring: + return "..." + real_cls = self.__class__ + if ns is None: + class_name = real_cls.__qualname__.rsplit(">.", 1)[-1] else: - class_name = real_cls.__name__ - else: - class_name = ns + "." + real_cls.__name__ + class_name = ns + "." + real_cls.__name__ - # Since 'self' remains on the stack (i.e.: strongly referenced) for the - # duration of this call, it's safe to depend on id(...) stability, and - # not need to track the instance and therefore worry about properties - # like weakref- or hash-ability. - working_set.add(id(self)) - try: - result = [class_name, "("] - first = True - for name, attr_repr in attr_names_with_reprs: - if first: - first = False - else: - result.append(", ") - result.extend( - (name, "=", attr_repr(getattr(self, name, NOTHING))) - ) - return "".join(result) + ")" - finally: - working_set.remove(id(self)) + # Since 'self' remains on the stack (i.e.: strongly referenced) + # for the duration of this call, it's safe to depend on id(...) + # stability, and not need to track the instance and therefore + # worry about properties like weakref- or hash-ability. 
+ already_repring.add(id(self)) + try: + result = [class_name, "("] + first = True + for name, attr_repr in attr_names_with_reprs: + if first: + first = False + else: + result.append(", ") + result.extend( + (name, "=", attr_repr(getattr(self, name, NOTHING))) + ) + return "".join(result) + ")" + finally: + already_repring.remove(id(self)) - return __repr__ + return __repr__ def _add_repr(cls, ns=None, attrs=None): @@ -1738,7 +1963,7 @@ def _add_repr(cls, ns=None, attrs=None): if attrs is None: attrs = cls.__attrs_attrs__ - cls.__repr__ = _make_repr(attrs, ns) + cls.__repr__ = _make_repr(attrs, ns, cls) return cls @@ -1755,12 +1980,12 @@ def fields(cls): :raise attr.exceptions.NotAnAttrsClassError: If *cls* is not an ``attrs`` class. - :rtype: tuple (with name accessors) of `attr.Attribute` + :rtype: tuple (with name accessors) of `attrs.Attribute` .. versionchanged:: 16.2.0 Returned tuple allows accessing the fields by name. """ - if not isclass(cls): + if not isinstance(cls, type): raise TypeError("Passed object must be a class.") attrs = getattr(cls, "__attrs_attrs__", None) if attrs is None: @@ -1782,20 +2007,20 @@ def fields_dict(cls): class. :rtype: an ordered dict where keys are attribute names and values are - `attr.Attribute`\\ s. This will be a `dict` if it's + `attrs.Attribute`\\ s. This will be a `dict` if it's naturally ordered like on Python 3.6+ or an :class:`~collections.OrderedDict` otherwise. .. 
versionadded:: 18.1.0 """ - if not isclass(cls): + if not isinstance(cls, type): raise TypeError("Passed object must be a class.") attrs = getattr(cls, "__attrs_attrs__", None) if attrs is None: raise NotAnAttrsClassError( "{cls!r} is not an attrs-decorated class.".format(cls=cls) ) - return ordered_dict(((a.name, a) for a in attrs)) + return ordered_dict((a.name, a) for a in attrs) def validate(inst): @@ -1829,15 +2054,21 @@ def _is_slot_attr(a_name, base_attr_map): def _make_init( cls, attrs, + pre_init, post_init, frozen, slots, cache_hash, base_attr_map, is_exc, - has_global_on_setattr, + cls_on_setattr, + attrs_init, ): - if frozen and has_global_on_setattr: + has_cls_on_setattr = ( + cls_on_setattr is not None and cls_on_setattr is not setters.NO_OP + ) + + if frozen and has_cls_on_setattr: raise ValueError("Frozen classes can't use on_setattr.") needs_cached_setattr = cache_hash or frozen @@ -1855,9 +2086,7 @@ def _make_init( raise ValueError("Frozen classes can't use on_setattr.") needs_cached_setattr = True - elif ( - has_global_on_setattr and a.on_setattr is not setters.NO_OP - ) or _is_slot_attr(a.name, base_attr_map): + elif has_cls_on_setattr and a.on_setattr is not setters.NO_OP: needs_cached_setattr = True unique_filename = _generate_unique_filename(cls, "init") @@ -1866,44 +2095,41 @@ def _make_init( filtered_attrs, frozen, slots, + pre_init, post_init, cache_hash, base_attr_map, is_exc, - needs_cached_setattr, - has_global_on_setattr, + has_cls_on_setattr, + attrs_init, ) - locs = {} - bytecode = compile(script, unique_filename, "exec") + if cls.__module__ in sys.modules: + # This makes typing.get_type_hints(CLS.__init__) resolve string types. + globs.update(sys.modules[cls.__module__].__dict__) + globs.update({"NOTHING": NOTHING, "attr_dict": attr_dict}) if needs_cached_setattr: # Save the lookup overhead in __init__ if we need to circumvent # setattr hooks. 
- globs["_cached_setattr"] = _obj_setattr - - eval(bytecode, globs, locs) + globs["_setattr"] = _obj_setattr - # In order of debuggers like PDB being able to step through the code, - # we add a fake linecache entry. - linecache.cache[unique_filename] = ( - len(script), - None, - script.splitlines(True), + init = _make_method( + "__attrs_init__" if attrs_init else "__init__", + script, unique_filename, + globs, ) + init.__annotations__ = annotations - __init__ = locs["__init__"] - __init__.__annotations__ = annotations - - return __init__ + return init def _setattr(attr_name, value_var, has_on_setattr): """ Use the cached object.setattr to set *attr_name* to *value_var*. """ - return "_setattr('%s', %s)" % (attr_name, value_var) + return "_setattr(self, '%s', %s)" % (attr_name, value_var) def _setattr_with_converter(attr_name, value_var, has_on_setattr): @@ -1911,7 +2137,7 @@ def _setattr_with_converter(attr_name, value_var, has_on_setattr): Use the cached object.setattr to set *attr_name* to *value_var*, but run its converter first. """ - return "_setattr('%s', %s(%s))" % ( + return "_setattr(self, '%s', %s(%s))" % ( attr_name, _init_converter_pat % (attr_name,), value_var, @@ -1944,73 +2170,17 @@ def _assign_with_converter(attr_name, value_var, has_on_setattr): ) -if PY2: - - def _unpack_kw_only_py2(attr_name, default=None): - """ - Unpack *attr_name* from _kw_only dict. - """ - if default is not None: - arg_default = ", %s" % default - else: - arg_default = "" - return "%s = _kw_only.pop('%s'%s)" % ( - attr_name, - attr_name, - arg_default, - ) - - def _unpack_kw_only_lines_py2(kw_only_args): - """ - Unpack all *kw_only_args* from _kw_only dict and handle errors. - - Given a list of strings "{attr_name}" and "{attr_name}={default}" - generates list of lines of code that pop attrs from _kw_only dict and - raise TypeError similar to builtin if required attr is missing or - extra key is passed. 
- - >>> print("\n".join(_unpack_kw_only_lines_py2(["a", "b=42"]))) - try: - a = _kw_only.pop('a') - b = _kw_only.pop('b', 42) - except KeyError as _key_error: - raise TypeError( - ... - if _kw_only: - raise TypeError( - ... - """ - lines = ["try:"] - lines.extend( - " " + _unpack_kw_only_py2(*arg.split("=")) - for arg in kw_only_args - ) - lines += """\ -except KeyError as _key_error: - raise TypeError( - '__init__() missing required keyword-only argument: %s' % _key_error - ) -if _kw_only: - raise TypeError( - '__init__() got an unexpected keyword argument %r' - % next(iter(_kw_only)) - ) -""".split( - "\n" - ) - return lines - - def _attrs_to_init_script( attrs, frozen, slots, + pre_init, post_init, cache_hash, base_attr_map, is_exc, - needs_cached_setattr, - has_global_on_setattr, + has_cls_on_setattr, + attrs_init, ): """ Return a script of an initializer for *attrs* and a dict of globals. @@ -2021,13 +2191,8 @@ def _attrs_to_init_script( a cached ``object.__setattr__``. """ lines = [] - if needs_cached_setattr: - lines.append( - # Circumvent the __setattr__ descriptor to save one lookup per - # assignment. 
- # Note _setattr will be used again below if cache_hash is True - "_setattr = _cached_setattr.__get__(self, self.__class__)" - ) + if pre_init: + lines.append("self.__attrs_pre_init__()") if frozen is True: if slots is True: @@ -2080,7 +2245,7 @@ def fmt_setter_with_converter( attr_name = a.name has_on_setattr = a.on_setattr is not None or ( - a.on_setattr is not setters.NO_OP and has_global_on_setattr + a.on_setattr is not setters.NO_OP and has_cls_on_setattr ) arg_name = a.name.lstrip("_") @@ -2210,8 +2375,14 @@ def fmt_setter_with_converter( else: lines.append(fmt_setter(attr_name, arg_name, has_on_setattr)) - if a.init is True and a.converter is None and a.type is not None: - annotations[arg_name] = a.type + if a.init is True: + if a.type is not None and a.converter is None: + annotations[arg_name] = a.type + elif a.converter is not None: + # Try to get the type from the converter. + t = _AnnotationExtractor(a.converter).get_first_param_type() + if t: + annotations[arg_name] = t if attrs_to_validate: # we can skip this if there are no validators. names_for_globals["_config"] = _config @@ -2228,7 +2399,7 @@ def fmt_setter_with_converter( if post_init: lines.append("self.__attrs_post_init__()") - # because this is set only after __attrs_post_init is called, a crash + # because this is set only after __attrs_post_init__ is called, a crash # will result if post-init tries to access the hash code. 
This seemed # preferable to setting this beforehand, in which case alteration to # field values during post-init combined with post-init accessing the @@ -2237,7 +2408,7 @@ def fmt_setter_with_converter( if frozen: if slots: # if frozen and slots, then _setattr defined above - init_hash_cache = "_setattr('%s', %s)" + init_hash_cache = "_setattr(self, '%s', %s)" else: # if frozen and not slots, then _inst_dict defined above init_hash_cache = "_inst_dict['%s'] = %s" @@ -2254,49 +2425,54 @@ def fmt_setter_with_converter( args = ", ".join(args) if kw_only_args: - if PY2: - lines = _unpack_kw_only_lines_py2(kw_only_args) + lines - - args += "%s**_kw_only" % (", " if args else "",) # leading comma - else: - args += "%s*, %s" % ( - ", " if args else "", # leading comma - ", ".join(kw_only_args), # kw_only args - ) + args += "%s*, %s" % ( + ", " if args else "", # leading comma + ", ".join(kw_only_args), # kw_only args + ) return ( """\ -def __init__(self, {args}): +def {init_name}(self, {args}): {lines} """.format( - args=args, lines="\n ".join(lines) if lines else "pass" + init_name=("__attrs_init__" if attrs_init else "__init__"), + args=args, + lines="\n ".join(lines) if lines else "pass", ), names_for_globals, annotations, ) -class Attribute(object): +class Attribute: """ *Read-only* representation of an attribute. + The class has *all* arguments of `attr.ib` (except for ``factory`` + which is only syntactic sugar for ``default=Factory(...)`` plus the + following: + + - ``name`` (`str`): The name of the attribute. + - ``inherited`` (`bool`): Whether or not that attribute has been inherited + from a base class. + - ``eq_key`` and ``order_key`` (`typing.Callable` or `None`): The callables + that are used for comparing and ordering objects by this attribute, + respectively. These are set by passing a callable to `attr.ib`'s ``eq``, + ``order``, or ``cmp`` arguments. See also :ref:`comparison customization + `. 
+ Instances of this class are frequently used for introspection purposes like: - `fields` returns a tuple of them. - Validators get them passed as the first argument. - - The *field transformer* hook receives a list of them. - - :attribute name: The name of the attribute. - :attribute inherited: Whether or not that attribute has been inherited from - a base class. - - Plus *all* arguments of `attr.ib` (except for ``factory`` - which is only syntactic sugar for ``default=Factory(...)``. + - The :ref:`field transformer ` hook receives a list of + them. .. versionadded:: 20.1.0 *inherited* .. versionadded:: 20.1.0 *on_setattr* .. versionchanged:: 20.2.0 *inherited* is not taken into account for equality checks and hashing anymore. + .. versionadded:: 21.1.0 *eq_key* and *order_key* For the full version history of the fields, see `attr.ib`. """ @@ -2307,7 +2483,9 @@ class Attribute(object): "validator", "repr", "eq", + "eq_key", "order", + "order_key", "hash", "init", "metadata", @@ -2333,10 +2511,14 @@ def __init__( converter=None, kw_only=False, eq=None, + eq_key=None, order=None, + order_key=None, on_setattr=None, ): - eq, order = _determine_eq_order(cmp, eq, order, True) + eq, eq_key, order, order_key = _determine_attrib_eq_order( + cmp, eq_key or eq, order_key or order, True + ) # Cache this descriptor here to speed things up later. 
bound_setattr = _obj_setattr.__get__(self, Attribute) @@ -2348,14 +2530,16 @@ def __init__( bound_setattr("validator", validator) bound_setattr("repr", repr) bound_setattr("eq", eq) + bound_setattr("eq_key", eq_key) bound_setattr("order", order) + bound_setattr("order_key", order_key) bound_setattr("hash", hash) bound_setattr("init", init) bound_setattr("converter", converter) bound_setattr( "metadata", ( - metadata_proxy(metadata) + types.MappingProxyType(dict(metadata)) # Shallow copy if metadata else _empty_metadata_singleton ), @@ -2399,15 +2583,6 @@ def from_counting_attr(cls, name, ca, type=None): **inst_dict ) - @property - def cmp(self): - """ - Simulate the presence of a cmp attribute and warn. - """ - warnings.warn(_CMP_DEPRECATION, DeprecationWarning, stacklevel=2) - - return self.eq and self.order - # Don't use attr.evolve since fields(Attribute) doesn't work def evolve(self, **changes): """ @@ -2450,7 +2625,7 @@ def _setattrs(self, name_values_pairs): else: bound_setattr( name, - metadata_proxy(value) + types.MappingProxyType(dict(value)) if value else _empty_metadata_singleton, ) @@ -2481,7 +2656,7 @@ def _setattrs(self, name_values_pairs): ) -class _CountingAttr(object): +class _CountingAttr: """ Intermediate representation of attributes that uses a counter to preserve the order in which the attributes have been defined. 
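The `eq_key`/`order_key` plumbing added above is reachable from the public API by passing a callable to `attr.ib`'s `eq` (or `order`) argument. A minimal sketch of the resulting behavior, assuming attrs ≥ 21.1 is installed (`User` is an illustrative name, not part of the patch):

```python
import attr


@attr.s
class User:
    # A callable passed to ``eq`` is stored as ``eq_key`` on the Attribute
    # and is applied to both sides before comparing.
    email = attr.ib(eq=str.lower)


# Comparison goes through the key callable, so case differences are ignored.
assert User("Alice@example.com") == User("alice@EXAMPLE.com")

# The callable is exposed for introspection on the Attribute.
assert attr.fields(User).email.eq_key is not None
```

`order` accepts a callable the same way and lands in `order_key`, which the generated ordering methods use.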
@@ -2495,7 +2670,9 @@ class _CountingAttr(object): "_default", "repr", "eq", + "eq_key", "order", + "order_key", "hash", "init", "metadata", @@ -2516,7 +2693,9 @@ class _CountingAttr(object): init=True, kw_only=False, eq=True, + eq_key=None, order=False, + order_key=None, inherited=False, on_setattr=None, ) @@ -2541,7 +2720,9 @@ class _CountingAttr(object): init=True, kw_only=False, eq=True, + eq_key=None, order=False, + order_key=None, inherited=False, on_setattr=None, ), @@ -2553,7 +2734,7 @@ def __init__( default, validator, repr, - cmp, # XXX: unused, remove along with cmp + cmp, hash, init, converter, @@ -2561,7 +2742,9 @@ def __init__( type, kw_only, eq, + eq_key, order, + order_key, on_setattr, ): _CountingAttr.cls_counter += 1 @@ -2571,7 +2754,9 @@ def __init__( self.converter = converter self.repr = repr self.eq = eq + self.eq_key = eq_key self.order = order + self.order_key = order_key self.hash = hash self.init = init self.metadata = metadata @@ -2614,12 +2799,11 @@ def default(self, meth): _CountingAttr = _add_eq(_add_repr(_CountingAttr)) -@attrs(slots=True, init=False, hash=True) -class Factory(object): +class Factory: """ Stores a factory callable. - If passed as the default value to `attr.ib`, the factory is used to + If passed as the default value to `attrs.field`, the factory is used to generate a new value. :param callable factory: A callable that takes either none or exactly one @@ -2630,8 +2814,7 @@ class Factory(object): .. versionadded:: 17.1.0 *takes_self* """ - factory = attrib() - takes_self = attrib() + __slots__ = ("factory", "takes_self") def __init__(self, factory, takes_self=False): """ @@ -2641,6 +2824,38 @@ def __init__(self, factory, takes_self=False): self.factory = factory self.takes_self = takes_self + def __getstate__(self): + """ + Play nice with pickle. + """ + return tuple(getattr(self, name) for name in self.__slots__) + + def __setstate__(self, state): + """ + Play nice with pickle. 
+ """ + for name, value in zip(self.__slots__, state): + setattr(self, name, value) + + +_f = [ + Attribute( + name=name, + default=NOTHING, + validator=None, + repr=True, + cmp=None, + eq=True, + order=False, + hash=True, + init=True, + inherited=False, + ) + for name in Factory.__slots__ +] + +Factory = _add_hash(_add_eq(_add_repr(Factory, attrs=_f), attrs=_f), attrs=_f) + def make_class(name, attrs, bases=(object,), **attributes_arguments): """ @@ -2670,16 +2885,24 @@ def make_class(name, attrs, bases=(object,), **attributes_arguments): if isinstance(attrs, dict): cls_dict = attrs elif isinstance(attrs, (list, tuple)): - cls_dict = dict((a, attrib()) for a in attrs) + cls_dict = {a: attrib() for a in attrs} else: raise TypeError("attrs argument must be a dict or a list.") + pre_init = cls_dict.pop("__attrs_pre_init__", None) post_init = cls_dict.pop("__attrs_post_init__", None) - type_ = type( - name, - bases, - {} if post_init is None else {"__attrs_post_init__": post_init}, - ) + user_init = cls_dict.pop("__init__", None) + + body = {} + if pre_init is not None: + body["__attrs_pre_init__"] = pre_init + if post_init is not None: + body["__attrs_post_init__"] = post_init + if user_init is not None: + body["__init__"] = user_init + + type_ = types.new_class(name, bases, {}, lambda ns: ns.update(body)) + # For pickling to work, the __module__ variable needs to be set to the # frame where the class is created. 
Bypass this step in environments where # sys._getframe is not defined (Jython for example) or sys._getframe is not @@ -2696,7 +2919,7 @@ def make_class(name, attrs, bases=(object,), **attributes_arguments): ( attributes_arguments["eq"], attributes_arguments["order"], - ) = _determine_eq_order( + ) = _determine_attrs_eq_order( cmp, attributes_arguments.get("eq"), attributes_arguments.get("order"), @@ -2711,7 +2934,7 @@ def make_class(name, attrs, bases=(object,), **attributes_arguments): @attrs(slots=True, hash=True) -class _AndValidator(object): +class _AndValidator: """ Compose many validators to a single one. """ @@ -2751,6 +2974,9 @@ def pipe(*converters): When called on a value, it runs all wrapped converters, returning the *last* value. + Type annotations will be inferred from the wrapped converters', if + they have any. + :param callables converters: Arbitrary number of converters. .. versionadded:: 20.1.0 @@ -2762,4 +2988,19 @@ def pipe_converter(val): return val + if not converters: + # If the converter list is empty, pipe_converter is the identity. + A = typing.TypeVar("A") + pipe_converter.__annotations__ = {"val": A, "return": A} + else: + # Get parameter type from first converter. + t = _AnnotationExtractor(converters[0]).get_first_param_type() + if t: + pipe_converter.__annotations__["val"] = t + + # Get return type from last converter. + rt = _AnnotationExtractor(converters[-1]).get_return_type() + if rt: + pipe_converter.__annotations__["return"] = rt + return pipe_converter diff --git a/src/attr/_next_gen.py b/src/attr/_next_gen.py index 2b5565c56..5a06a7438 100644 --- a/src/attr/_next_gen.py +++ b/src/attr/_next_gen.py @@ -1,16 +1,24 @@ -""" -This is a Python 3.6 and later-only, keyword-only, and **provisional** API that -calls `attr.s` with different default values. +# SPDX-License-Identifier: MIT -Provisional APIs that shall become "import attrs" one glorious day. 
+""" +These are Python 3.6+-only and keyword-only APIs that call `attr.s` and +`attr.ib` with different default values. """ -from functools import partial -from attr.exceptions import UnannotatedAttributeError +from functools import partial from . import setters -from ._make import NOTHING, _frozen_setattrs, attrib, attrs +from ._funcs import asdict as _asdict +from ._funcs import astuple as _astuple +from ._make import ( + NOTHING, + _frozen_setattrs, + _ng_default_on_setattr, + attrib, + attrs, +) +from .exceptions import UnannotatedAttributeError def define( @@ -34,22 +42,45 @@ def define( getstate_setstate=None, on_setattr=None, field_transformer=None, + match_args=True, ): r""" - The only behavioral differences are the handling of the *auto_attribs* - option: + Define an ``attrs`` class. + + Differences to the classic `attr.s` that it uses underneath: + + - Automatically detect whether or not *auto_attribs* should be `True` (c.f. + *auto_attribs* parameter). + - If *frozen* is `False`, run converters and validators when setting an + attribute by default. + - *slots=True* + + .. caution:: + + Usually this has only upsides and few visible effects in everyday + programming. But it *can* lead to some suprising behaviors, so please + make sure to read :term:`slotted classes`. + - *auto_exc=True* + - *auto_detect=True* + - *order=False* + - Some options that were only relevant on Python 2 or were kept around for + backwards-compatibility have been removed. + + Please note that these are all defaults and you can change them as you + wish. :param Optional[bool] auto_attribs: If set to `True` or `False`, it behaves exactly like `attr.s`. If left `None`, `attr.s` will try to guess: - 1. If all attributes are annotated and no `attr.ib` is found, it assumes - *auto_attribs=True*. + 1. If any attributes are annotated and no unannotated `attrs.fields`\ s + are found, it assumes *auto_attribs=True*. 2. 
Otherwise it assumes *auto_attribs=False* and tries to collect - `attr.ib`\ s. + `attrs.fields`\ s. - and that mutable classes (``frozen=False``) validate on ``__setattr__``. + For now, please refer to `attr.s` for the rest of the parameters. .. versionadded:: 20.1.0 + .. versionchanged:: 21.3.0 Converters are also run ``on_setattr``. """ def do_it(cls, auto_attribs): @@ -74,6 +105,7 @@ def do_it(cls, auto_attribs): getstate_setstate=getstate_setstate, on_setattr=on_setattr, field_transformer=field_transformer, + match_args=match_args, ) def wrap(cls): @@ -86,9 +118,9 @@ def wrap(cls): had_on_setattr = on_setattr not in (None, setters.NO_OP) - # By default, mutable classes validate on setattr. + # By default, mutable classes convert & validate on setattr. if frozen is False and on_setattr is None: - on_setattr = setters.validate + on_setattr = _ng_default_on_setattr # However, if we subclass a frozen class, we inherit the immutability # and disable on_setattr. @@ -158,3 +190,31 @@ def field( order=order, on_setattr=on_setattr, ) + + +def asdict(inst, *, recurse=True, filter=None, value_serializer=None): + """ + Same as `attr.asdict`, except that collections types are always retained + and dict is always used as *dict_factory*. + + .. versionadded:: 21.3.0 + """ + return _asdict( + inst=inst, + recurse=recurse, + filter=filter, + value_serializer=value_serializer, + retain_collection_types=True, + ) + + +def astuple(inst, *, recurse=True, filter=None): + """ + Same as `attr.astuple`, except that collections types are always retained + and `tuple` is always used as the *tuple_factory*. + + .. 
versionadded:: 21.3.0 + """ + return _astuple( + inst=inst, recurse=recurse, filter=filter, retain_collection_types=True + ) diff --git a/src/attr/_version_info.py b/src/attr/_version_info.py index 014e78a1b..51a1312f9 100644 --- a/src/attr/_version_info.py +++ b/src/attr/_version_info.py @@ -1,4 +1,5 @@ -from __future__ import absolute_import, division, print_function +# SPDX-License-Identifier: MIT + from functools import total_ordering @@ -8,7 +9,7 @@ @total_ordering @attrs(eq=False, order=False, slots=True, frozen=True) -class VersionInfo(object): +class VersionInfo: """ A version object that can be compared to tuple of length 1--4: diff --git a/src/attr/converters.py b/src/attr/converters.py index 715ce1785..a73626c26 100644 --- a/src/attr/converters.py +++ b/src/attr/converters.py @@ -1,16 +1,21 @@ +# SPDX-License-Identifier: MIT + """ Commonly useful converters. """ -from __future__ import absolute_import, division, print_function +import typing + +from ._compat import _AnnotationExtractor from ._make import NOTHING, Factory, pipe __all__ = [ - "pipe", - "optional", "default_if_none", + "optional", + "pipe", + "to_bool", ] @@ -19,6 +24,9 @@ def optional(converter): A converter that allows an attribute to be optional. An optional attribute is one which can be set to ``None``. + Type annotations will be inferred from the wrapped converter's, if it + has any. + :param callable converter: the converter that is used for non-``None`` values. @@ -30,6 +38,16 @@ def optional_converter(val): return None return converter(val) + xtr = _AnnotationExtractor(converter) + + t = xtr.get_first_param_type() + if t: + optional_converter.__annotations__["val"] = typing.Optional[t] + + rt = xtr.get_return_type() + if rt: + optional_converter.__annotations__["return"] = typing.Optional[rt] + return optional_converter @@ -39,14 +57,14 @@ def default_if_none(default=NOTHING, factory=None): result of *factory*. :param default: Value to be used if ``None`` is passed. 
Passing an instance - of `attr.Factory` is supported, however the ``takes_self`` option + of `attrs.Factory` is supported, however the ``takes_self`` option is *not*. - :param callable factory: A callable that takes not parameters whose result + :param callable factory: A callable that takes no parameters whose result is used if ``None`` is passed. :raises TypeError: If **neither** *default* or *factory* is passed. :raises TypeError: If **both** *default* and *factory* are passed. - :raises ValueError: If an instance of `attr.Factory` is passed with + :raises ValueError: If an instance of `attrs.Factory` is passed with ``takes_self=True``. .. versionadded:: 18.2.0 @@ -83,3 +101,44 @@ def default_if_none_converter(val): return default return default_if_none_converter + + +def to_bool(val): + """ + Convert "boolean" strings (e.g., from env. vars.) to real booleans. + + Values mapping to :code:`True`: + + - :code:`True` + - :code:`"true"` / :code:`"t"` + - :code:`"yes"` / :code:`"y"` + - :code:`"on"` + - :code:`"1"` + - :code:`1` + + Values mapping to :code:`False`: + + - :code:`False` + - :code:`"false"` / :code:`"f"` + - :code:`"no"` / :code:`"n"` + - :code:`"off"` + - :code:`"0"` + - :code:`0` + + :raises ValueError: for any other value. + + .. versionadded:: 21.3.0 + """ + if isinstance(val, str): + val = val.lower() + truthy = {True, "true", "t", "yes", "y", "on", "1", 1} + falsy = {False, "false", "f", "no", "n", "off", "0", 0} + try: + if val in truthy: + return True + if val in falsy: + return False + except TypeError: + # Raised when "val" is not hashable (e.g., lists) + pass + raise ValueError("Cannot convert value to bool: {}".format(val)) diff --git a/src/attr/converters.pyi b/src/attr/converters.pyi index 7b0caa14f..0f58088a3 100644 --- a/src/attr/converters.pyi +++ b/src/attr/converters.pyi @@ -1,4 +1,5 @@ -from typing import TypeVar, Optional, Callable, overload +from typing import Callable, Optional, TypeVar, overload + from . 
import _ConverterType _T = TypeVar("_T") @@ -9,3 +10,4 @@ def optional(converter: _ConverterType) -> _ConverterType: ... def default_if_none(default: _T) -> _ConverterType: ... @overload def default_if_none(*, factory: Callable[[], _T]) -> _ConverterType: ... +def to_bool(val: str) -> bool: ... diff --git a/src/attr/exceptions.py b/src/attr/exceptions.py index fcd89106f..5dc51e0a8 100644 --- a/src/attr/exceptions.py +++ b/src/attr/exceptions.py @@ -1,9 +1,9 @@ -from __future__ import absolute_import, division, print_function +# SPDX-License-Identifier: MIT class FrozenError(AttributeError): """ - A frozen/immutable instance or attribute haave been attempted to be + A frozen/immutable instance or attribute have been attempted to be modified. It mirrors the behavior of ``namedtuples`` by using the same error message diff --git a/src/attr/filters.py b/src/attr/filters.py index dc47e8fa3..baa25e946 100644 --- a/src/attr/filters.py +++ b/src/attr/filters.py @@ -1,10 +1,9 @@ +# SPDX-License-Identifier: MIT + """ Commonly useful filters for `attr.asdict`. """ -from __future__ import absolute_import, division, print_function - -from ._compat import isclass from ._make import Attribute @@ -13,17 +12,17 @@ def _split_what(what): Returns a tuple of `frozenset`s of classes and attributes. """ return ( - frozenset(cls for cls in what if isclass(cls)), + frozenset(cls for cls in what if isinstance(cls, type)), frozenset(cls for cls in what if isinstance(cls, Attribute)), ) def include(*what): """ - Whitelist *what*. + Include *what*. - :param what: What to whitelist. - :type what: `list` of `type` or `attr.Attribute`\\ s + :param what: What to include. + :type what: `list` of `type` or `attrs.Attribute`\\ s :rtype: `callable` """ @@ -37,10 +36,10 @@ def include_(attribute, value): def exclude(*what): """ - Blacklist *what*. + Exclude *what*. - :param what: What to blacklist. - :type what: `list` of classes or `attr.Attribute`\\ s. + :param what: What to exclude. 
+ :type what: `list` of classes or `attrs.Attribute`\\ s. :rtype: `callable` """ diff --git a/src/attr/filters.pyi b/src/attr/filters.pyi index 68368fe2b..993866865 100644 --- a/src/attr/filters.pyi +++ b/src/attr/filters.pyi @@ -1,4 +1,5 @@ -from typing import Union, Any +from typing import Any, Union + from . import Attribute, _FilterType def include(*what: Union[type, Attribute[Any]]) -> _FilterType[Any]: ... diff --git a/src/attr/setters.py b/src/attr/setters.py index 240014b3c..12ed6750d 100644 --- a/src/attr/setters.py +++ b/src/attr/setters.py @@ -1,8 +1,9 @@ +# SPDX-License-Identifier: MIT + """ Commonly used hooks for on_setattr. """ -from __future__ import absolute_import, division, print_function from . import _config from .exceptions import FrozenAttributeError @@ -67,11 +68,6 @@ def convert(instance, attrib, new_value): return new_value +# Sentinel for disabling class-wide *on_setattr* hooks for certain attributes. +# autodata stopped working, so the docstring is inlined in the API docs. NO_OP = object() -""" -Sentinel for disabling class-wide *on_setattr* hooks for certain attributes. - -Does not work in `pipe` or within lists. - -.. versionadded:: 20.1.0 -""" diff --git a/src/attr/setters.pyi b/src/attr/setters.pyi index 19bc33fd1..3f5603c2b 100644 --- a/src/attr/setters.pyi +++ b/src/attr/setters.pyi @@ -1,10 +1,11 @@ -from . import _OnSetAttrType, Attribute -from typing import TypeVar, Any, NewType, NoReturn, cast +from typing import Any, NewType, NoReturn, TypeVar, cast + +from . import Attribute, _OnSetAttrType _T = TypeVar("_T") def frozen( - instance: Any, attribute: Attribute, new_value: Any + instance: Any, attribute: Attribute[Any], new_value: Any ) -> NoReturn: ... def pipe(*setters: _OnSetAttrType) -> _OnSetAttrType: ... def validate(instance: Any, attribute: Attribute[_T], new_value: _T) -> _T: ... 
diff --git a/src/attr/validators.py b/src/attr/validators.py index b9a73054e..eece517da 100644 --- a/src/attr/validators.py +++ b/src/attr/validators.py @@ -1,30 +1,98 @@ +# SPDX-License-Identifier: MIT + """ Commonly useful validators. """ -from __future__ import absolute_import, division, print_function +import operator import re +from contextlib import contextmanager + +from ._config import get_run_validators, set_run_validators from ._make import _AndValidator, and_, attrib, attrs from .exceptions import NotCallableError +try: + Pattern = re.Pattern +except AttributeError: # Python <3.7 lacks a Pattern type. + Pattern = type(re.compile("")) + + __all__ = [ "and_", "deep_iterable", "deep_mapping", + "disabled", + "ge", + "get_disabled", + "gt", "in_", "instance_of", "is_callable", + "le", + "lt", "matches_re", + "max_len", + "min_len", "optional", "provides", + "set_disabled", ] +def set_disabled(disabled): + """ + Globally disable or enable running validators. + + By default, they are run. + + :param disabled: If ``True``, disable running all validators. + :type disabled: bool + + .. warning:: + + This function is not thread-safe! + + .. versionadded:: 21.3.0 + """ + set_run_validators(not disabled) + + +def get_disabled(): + """ + Return a bool indicating whether validators are currently disabled or not. + + :return: ``True`` if validators are currently disabled. + :rtype: bool + + .. versionadded:: 21.3.0 + """ + return not get_run_validators() + + +@contextmanager +def disabled(): + """ + Context manager that disables running validators within its context. + + .. warning:: + + This context manager is not thread-safe! + + .. 
versionadded:: 21.3.0
+    """
+    set_run_validators(False)
+    try:
+        yield
+    finally:
+        set_run_validators(True)
+
+
 @attrs(repr=False, slots=True, hash=True)
-class _InstanceOfValidator(object):
+class _InstanceOfValidator:
     type = attrib()
 
     def __call__(self, inst, attr, value):
@@ -61,16 +129,15 @@ def instance_of(type):
     :type type: type or tuple of types
 
     :raises TypeError: With a human readable error message, the attribute
-        (of type `attr.Attribute`), the expected type, and the value it
+        (of type `attrs.Attribute`), the expected type, and the value it
         got.
     """
     return _InstanceOfValidator(type)
 
 
 @attrs(repr=False, frozen=True, slots=True)
-class _MatchesReValidator(object):
-    regex = attrib()
-    flags = attrib()
+class _MatchesReValidator:
+    pattern = attrib()
     match_func = attrib()
 
     def __call__(self, inst, attr, value):
@@ -79,18 +146,18 @@ def __call__(self, inst, attr, value):
         """
         if not self.match_func(value):
             raise ValueError(
-                "'{name}' must match regex {regex!r}"
+                "'{name}' must match regex {pattern!r}"
                 " ({value!r} doesn't)".format(
-                    name=attr.name, regex=self.regex.pattern, value=value
+                    name=attr.name, pattern=self.pattern.pattern, value=value
                ),
                 attr,
-                self.regex,
+                self.pattern,
                 value,
             )
 
     def __repr__(self):
-        return "<matches_re validator for pattern {regex!r}>".format(
-            regex=self.regex
+        return "<matches_re validator for pattern {pattern!r}>".format(
+            pattern=self.pattern
         )
For performance reasons, they won't be used directly - but on a pre-`re.compile`\ ed pattern. + :param callable func: which underlying `re` function to call. Valid options + are `re.fullmatch`, `re.search`, and `re.match`; the default ``None`` + means `re.fullmatch`. For performance reasons, the pattern is always + precompiled using `re.compile`. .. versionadded:: 19.2.0 + .. versionchanged:: 21.3.0 *regex* can be a pre-compiled pattern. """ - fullmatch = getattr(re, "fullmatch", None) - valid_funcs = (fullmatch, None, re.search, re.match) + valid_funcs = (re.fullmatch, None, re.search, re.match) if func not in valid_funcs: raise ValueError( - "'func' must be one of %s." - % ( + "'func' must be one of {}.".format( ", ".join( sorted( e and e.__name__ or "None" for e in set(valid_funcs) ) - ), + ) ) ) - pattern = re.compile(regex, flags) + if isinstance(regex, Pattern): + if flags: + raise TypeError( + "'flags' can only be used with a string pattern; " + "pass flags to re.compile() instead" + ) + pattern = regex + else: + pattern = re.compile(regex, flags) + if func is re.match: match_func = pattern.match elif func is re.search: match_func = pattern.search else: - if fullmatch: - match_func = pattern.fullmatch - else: - pattern = re.compile(r"(?:{})\Z".format(regex), flags) - match_func = pattern.match + match_func = pattern.fullmatch - return _MatchesReValidator(pattern, flags, match_func) + return _MatchesReValidator(pattern, match_func) @attrs(repr=False, slots=True, hash=True) -class _ProvidesValidator(object): +class _ProvidesValidator: interface = attrib() def __call__(self, inst, attr, value): @@ -175,14 +245,14 @@ def provides(interface): :type interface: ``zope.interface.Interface`` :raises TypeError: With a human readable error message, the attribute - (of type `attr.Attribute`), the expected interface, and the + (of type `attrs.Attribute`), the expected interface, and the value it got. 
""" return _ProvidesValidator(interface) @attrs(repr=False, slots=True, hash=True) -class _OptionalValidator(object): +class _OptionalValidator: validator = attrib() def __call__(self, inst, attr, value): @@ -216,7 +286,7 @@ def optional(validator): @attrs(repr=False, slots=True, hash=True) -class _InValidator(object): +class _InValidator: options = attrib() def __call__(self, inst, attr, value): @@ -229,7 +299,10 @@ def __call__(self, inst, attr, value): raise ValueError( "'{name}' must be in {options!r} (got {value!r})".format( name=attr.name, options=self.options, value=value - ) + ), + attr, + self.options, + value, ) def __repr__(self): @@ -248,16 +321,20 @@ def in_(options): :type options: list, tuple, `enum.Enum`, ... :raises ValueError: With a human readable error message, the attribute (of - type `attr.Attribute`), the expected options, and the value it + type `attrs.Attribute`), the expected options, and the value it got. .. versionadded:: 17.1.0 + .. versionchanged:: 22.1.0 + The ValueError was incomplete until now and only contained the human + readable error message. Now it contains all the information that has + been promised since 17.1.0. """ return _InValidator(options) @attrs(repr=False, slots=False, hash=True) -class _IsCallableValidator(object): +class _IsCallableValidator: def __call__(self, inst, attr, value): """ We use a callable class to be able to change the ``__repr__``. @@ -287,14 +364,14 @@ def is_callable(): .. versionadded:: 19.1.0 :raises `attr.exceptions.NotCallableError`: With a human readable error - message containing the attribute (`attr.Attribute`) name, + message containing the attribute (`attrs.Attribute`) name, and the value it got. 
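The richer `ValueError` promised by the `in_` change above can be inspected like this (a sketch; on attrs < 22.1 only the message is present, so the extra args are checked defensively):

```python
import attr
from attr.validators import in_


@attr.s
class Door:
    state = attr.ib(validator=in_(["open", "closed"]))


try:
    Door("ajar")
except ValueError as e:
    err = e

# The first arg is always the human readable message; since 22.1.0 the
# attribute, the allowed options, and the offending value follow it.
assert "ajar" in err.args[0]
if len(err.args) == 4:
    _msg, _attrib, options, value = err.args
    assert options == ["open", "closed"] and value == "ajar"
```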
""" return _IsCallableValidator() @attrs(repr=False, slots=True, hash=True) -class _DeepIterable(object): +class _DeepIterable: member_validator = attrib(validator=is_callable()) iterable_validator = attrib( default=None, validator=optional(is_callable()) @@ -329,7 +406,7 @@ def deep_iterable(member_validator, iterable_validator=None): """ A validator that performs deep validation of an iterable. - :param member_validator: Validator to apply to iterable members + :param member_validator: Validator(s) to apply to iterable members :param iterable_validator: Validator to apply to iterable itself (optional) @@ -337,11 +414,13 @@ def deep_iterable(member_validator, iterable_validator=None): :raises TypeError: if any sub-validators fail """ + if isinstance(member_validator, (list, tuple)): + member_validator = and_(*member_validator) return _DeepIterable(member_validator, iterable_validator) @attrs(repr=False, slots=True, hash=True) -class _DeepMapping(object): +class _DeepMapping: key_validator = attrib(validator=is_callable()) value_validator = attrib(validator=is_callable()) mapping_validator = attrib(default=None, validator=optional(is_callable())) @@ -377,3 +456,139 @@ def deep_mapping(key_validator, value_validator, mapping_validator=None): :raises TypeError: if any sub-validators fail """ return _DeepMapping(key_validator, value_validator, mapping_validator) + + +@attrs(repr=False, frozen=True, slots=True) +class _NumberValidator: + bound = attrib() + compare_op = attrib() + compare_func = attrib() + + def __call__(self, inst, attr, value): + """ + We use a callable class to be able to change the ``__repr__``. 
+        """
+        if not self.compare_func(value, self.bound):
+            raise ValueError(
+                "'{name}' must be {op} {bound}: {value}".format(
+                    name=attr.name,
+                    op=self.compare_op,
+                    bound=self.bound,
+                    value=value,
+                )
+            )
+
+    def __repr__(self):
+        return "<Validator for x {op} {bound}>".format(
+            op=self.compare_op, bound=self.bound
+        )
+
+
+def lt(val):
+    """
+    A validator that raises `ValueError` if the initializer is called
+    with a number larger than or equal to *val*.
+
+    :param val: Exclusive upper bound for values
+
+    .. versionadded:: 21.3.0
+    """
+    return _NumberValidator(val, "<", operator.lt)
+
+
+def le(val):
+    """
+    A validator that raises `ValueError` if the initializer is called
+    with a number greater than *val*.
+
+    :param val: Inclusive upper bound for values
+
+    .. versionadded:: 21.3.0
+    """
+    return _NumberValidator(val, "<=", operator.le)
+
+
+def ge(val):
+    """
+    A validator that raises `ValueError` if the initializer is called
+    with a number smaller than *val*.
+
+    :param val: Inclusive lower bound for values
+
+    .. versionadded:: 21.3.0
+    """
+    return _NumberValidator(val, ">=", operator.ge)
+
+
+def gt(val):
+    """
+    A validator that raises `ValueError` if the initializer is called
+    with a number smaller than or equal to *val*.
+
+    :param val: Exclusive lower bound for values
+
+    .. versionadded:: 21.3.0
+    """
+    return _NumberValidator(val, ">", operator.gt)
+
+
+@attrs(repr=False, frozen=True, slots=True)
+class _MaxLengthValidator:
+    max_length = attrib()
+
+    def __call__(self, inst, attr, value):
+        """
+        We use a callable class to be able to change the ``__repr__``.
+        """
+        if len(value) > self.max_length:
+            raise ValueError(
+                "Length of '{name}' must be <= {max}: {len}".format(
+                    name=attr.name, max=self.max_length, len=len(value)
+                )
+            )
+
+    def __repr__(self):
+        return "<max_len validator for {max}>".format(max=self.max_length)
+
+
+def max_len(length):
+    """
+    A validator that raises `ValueError` if the initializer is called
+    with a string or iterable that is longer than *length*.
+
+    :param int length: Maximum length of the string or iterable
+
+    .. versionadded:: 21.3.0
+    """
+    return _MaxLengthValidator(length)
+
+
+@attrs(repr=False, frozen=True, slots=True)
+class _MinLengthValidator:
+    min_length = attrib()
+
+    def __call__(self, inst, attr, value):
+        """
+        We use a callable class to be able to change the ``__repr__``.
+        """
+        if len(value) < self.min_length:
+            raise ValueError(
+                "Length of '{name}' must be >= {min}: {len}".format(
+                    name=attr.name, min=self.min_length, len=len(value)
+                )
+            )
+
+    def __repr__(self):
+        return "<min_len validator for {min}>".format(min=self.min_length)
+
+
+def min_len(length):
+    """
+    A validator that raises `ValueError` if the initializer is called
+    with a string or iterable that is shorter than *length*.
+
+    :param int length: Minimum length of the string or iterable
+
+    .. versionadded:: 22.1.0
+    """
+    return _MinLengthValidator(length)
diff --git a/src/attr/validators.pyi b/src/attr/validators.pyi
index 9a22abb19..54b9dba24 100644
--- a/src/attr/validators.pyi
+++ b/src/attr/validators.pyi
@@ -1,20 +1,24 @@
 from typing import (
-    Container,
-    List,
-    Union,
-    TypeVar,
-    Type,
     Any,
-    Optional,
-    Tuple,
+    AnyStr,
+    Callable,
+    Container,
+    ContextManager,
     Iterable,
+    List,
     Mapping,
-    Callable,
     Match,
-    AnyStr,
+    Optional,
+    Pattern,
+    Tuple,
+    Type,
+    TypeVar,
+    Union,
     overload,
 )
+
 from . import _ValidatorType
+from . import _ValidatorArgType
 
 _T = TypeVar("_T")
 _T1 = TypeVar("_T1")
@@ -25,6 +29,10 @@ _K = TypeVar("_K")
 _V = TypeVar("_V")
 _M = TypeVar("_M", bound=Mapping)
 
+def set_disabled(run: bool) -> None: ...
+def get_disabled() -> bool: ...
+def disabled() -> ContextManager[None]: ...
+
 # To be more precise on instance_of use some overloads.
 # If there are more than 3 items in the tuple then we fall back to Any
 @overload
@@ -48,14 +56,14 @@ def optional(
 def in_(options: Container[_T]) -> _ValidatorType[_T]: ...
 def and_(*validators: _ValidatorType[_T]) -> _ValidatorType[_T]: ...
def matches_re( - regex: AnyStr, + regex: Union[Pattern[AnyStr], AnyStr], flags: int = ..., func: Optional[ Callable[[AnyStr, AnyStr, int], Optional[Match[AnyStr]]] ] = ..., ) -> _ValidatorType[AnyStr]: ... def deep_iterable( - member_validator: _ValidatorType[_T], + member_validator: _ValidatorArgType[_T], iterable_validator: Optional[_ValidatorType[_I]] = ..., ) -> _ValidatorType[_I]: ... def deep_mapping( @@ -64,3 +72,9 @@ def deep_mapping( mapping_validator: Optional[_ValidatorType[_M]] = ..., ) -> _ValidatorType[_M]: ... def is_callable() -> _ValidatorType[_T]: ... +def lt(val: _T) -> _ValidatorType[_T]: ... +def le(val: _T) -> _ValidatorType[_T]: ... +def ge(val: _T) -> _ValidatorType[_T]: ... +def gt(val: _T) -> _ValidatorType[_T]: ... +def max_len(length: int) -> _ValidatorType[_T]: ... +def min_len(length: int) -> _ValidatorType[_T]: ... diff --git a/src/attrs/__init__.py b/src/attrs/__init__.py new file mode 100644 index 000000000..a704b8b56 --- /dev/null +++ b/src/attrs/__init__.py @@ -0,0 +1,70 @@ +# SPDX-License-Identifier: MIT + +from attr import ( + NOTHING, + Attribute, + Factory, + __author__, + __copyright__, + __description__, + __doc__, + __email__, + __license__, + __title__, + __url__, + __version__, + __version_info__, + assoc, + cmp_using, + define, + evolve, + field, + fields, + fields_dict, + frozen, + has, + make_class, + mutable, + resolve_types, + validate, +) +from attr._next_gen import asdict, astuple + +from . 
import converters, exceptions, filters, setters, validators + + +__all__ = [ + "__author__", + "__copyright__", + "__description__", + "__doc__", + "__email__", + "__license__", + "__title__", + "__url__", + "__version__", + "__version_info__", + "asdict", + "assoc", + "astuple", + "Attribute", + "cmp_using", + "converters", + "define", + "evolve", + "exceptions", + "Factory", + "field", + "fields_dict", + "fields", + "filters", + "frozen", + "has", + "make_class", + "mutable", + "NOTHING", + "resolve_types", + "setters", + "validate", + "validators", +] diff --git a/src/attrs/__init__.pyi b/src/attrs/__init__.pyi new file mode 100644 index 000000000..fc44de46a --- /dev/null +++ b/src/attrs/__init__.pyi @@ -0,0 +1,66 @@ +from typing import ( + Any, + Callable, + Dict, + Mapping, + Optional, + Sequence, + Tuple, + Type, +) + +# Because we need to type our own stuff, we have to make everything from +# attr explicitly public too. +from attr import __author__ as __author__ +from attr import __copyright__ as __copyright__ +from attr import __description__ as __description__ +from attr import __email__ as __email__ +from attr import __license__ as __license__ +from attr import __title__ as __title__ +from attr import __url__ as __url__ +from attr import __version__ as __version__ +from attr import __version_info__ as __version_info__ +from attr import _FilterType +from attr import assoc as assoc +from attr import Attribute as Attribute +from attr import cmp_using as cmp_using +from attr import converters as converters +from attr import define as define +from attr import evolve as evolve +from attr import exceptions as exceptions +from attr import Factory as Factory +from attr import field as field +from attr import fields as fields +from attr import fields_dict as fields_dict +from attr import filters as filters +from attr import frozen as frozen +from attr import has as has +from attr import make_class as make_class +from attr import mutable as mutable +from attr import 
NOTHING as NOTHING +from attr import resolve_types as resolve_types +from attr import setters as setters +from attr import validate as validate +from attr import validators as validators + +# TODO: see definition of attr.asdict/astuple +def asdict( + inst: Any, + recurse: bool = ..., + filter: Optional[_FilterType[Any]] = ..., + dict_factory: Type[Mapping[Any, Any]] = ..., + retain_collection_types: bool = ..., + value_serializer: Optional[ + Callable[[type, Attribute[Any], Any], Any] + ] = ..., + tuple_keys: bool = ..., +) -> Dict[str, Any]: ... + +# TODO: add support for returning NamedTuple from the mypy plugin +def astuple( + inst: Any, + recurse: bool = ..., + filter: Optional[_FilterType[Any]] = ..., + tuple_factory: Type[Sequence[Any]] = ..., + retain_collection_types: bool = ..., +) -> Tuple[Any, ...]: ... diff --git a/src/attrs/converters.py b/src/attrs/converters.py new file mode 100644 index 000000000..edfa8d3c1 --- /dev/null +++ b/src/attrs/converters.py @@ -0,0 +1,3 @@ +# SPDX-License-Identifier: MIT + +from attr.converters import * # noqa diff --git a/src/attrs/exceptions.py b/src/attrs/exceptions.py new file mode 100644 index 000000000..bd9efed20 --- /dev/null +++ b/src/attrs/exceptions.py @@ -0,0 +1,3 @@ +# SPDX-License-Identifier: MIT + +from attr.exceptions import * # noqa diff --git a/src/attrs/filters.py b/src/attrs/filters.py new file mode 100644 index 000000000..52959005b --- /dev/null +++ b/src/attrs/filters.py @@ -0,0 +1,3 @@ +# SPDX-License-Identifier: MIT + +from attr.filters import * # noqa diff --git a/src/attrs/py.typed b/src/attrs/py.typed new file mode 100644 index 000000000..e69de29bb diff --git a/src/attrs/setters.py b/src/attrs/setters.py new file mode 100644 index 000000000..9b5077080 --- /dev/null +++ b/src/attrs/setters.py @@ -0,0 +1,3 @@ +# SPDX-License-Identifier: MIT + +from attr.setters import * # noqa diff --git a/src/attrs/validators.py b/src/attrs/validators.py new file mode 100644 index 000000000..ab2c9b302 --- /dev/null 
+++ b/src/attrs/validators.py @@ -0,0 +1,3 @@ +# SPDX-License-Identifier: MIT + +from attr.validators import * # noqa diff --git a/tests/__init__.py b/tests/__init__.py index e69de29bb..548d2d447 100644 --- a/tests/__init__.py +++ b/tests/__init__.py @@ -0,0 +1 @@ +# SPDX-License-Identifier: MIT diff --git a/tests/attr_import_star.py b/tests/attr_import_star.py new file mode 100644 index 000000000..636545268 --- /dev/null +++ b/tests/attr_import_star.py @@ -0,0 +1,9 @@ +# SPDX-License-Identifier: MIT + + +from attr import * # noqa: F401,F403 + + +# This is imported by test_import::test_from_attr_import_star; this must +# be done indirectly because importing * is only allowed on module level, +# so can't be done inside a test. diff --git a/tests/dataclass_transform_example.py b/tests/dataclass_transform_example.py new file mode 100644 index 000000000..49e09061a --- /dev/null +++ b/tests/dataclass_transform_example.py @@ -0,0 +1,45 @@ +# SPDX-License-Identifier: MIT + +import attr + + +@attr.define() +class Define: + a: str + b: int + + +reveal_type(Define.__init__) # noqa + + +@attr.define() +class DefineConverter: + # mypy plugin adapts the "int" method signature, pyright does not + with_converter: int = attr.field(converter=int) + + +reveal_type(DefineConverter.__init__) # noqa + + +# mypy plugin supports attr.frozen, pyright does not +@attr.frozen() +class Frozen: + a: str + + +d = Frozen("a") +d.a = "new" + +reveal_type(d.a) # noqa + + +# but pyright supports attr.define(frozen) +@attr.define(frozen=True) +class FrozenDefine: + a: str + + +d2 = FrozenDefine("a") +d2.a = "new" + +reveal_type(d2.a) # noqa diff --git a/tests/strategies.py b/tests/strategies.py index a9bc26408..f630b228a 100644 --- a/tests/strategies.py +++ b/tests/strategies.py @@ -1,3 +1,5 @@ +# SPDX-License-Identifier: MIT + """ Testing strategies for Hypothesis-based tests. """ @@ -26,8 +28,7 @@ def gen_attr_names(): Some short strings (such as 'as') are keywords, so we skip them. 
""" lc = string.ascii_lowercase - for c in lc: - yield c + yield from lc for outer in lc: for inner in lc: res = outer + inner @@ -153,7 +154,17 @@ class HypClass: attr_names = gen_attr_names() cls_dict = dict(zip(attr_names, attrs)) + pre_init_flag = draw(st.booleans()) post_init_flag = draw(st.booleans()) + init_flag = draw(st.booleans()) + + if pre_init_flag: + + def pre_init(self): + pass + + cls_dict["__attrs_pre_init__"] = pre_init + if post_init_flag: def post_init(self): @@ -161,12 +172,20 @@ def post_init(self): cls_dict["__attrs_post_init__"] = post_init + if not init_flag: + + def init(self, *args, **kwargs): + self.__attrs_init__(*args, **kwargs) + + cls_dict["__init__"] = init + return make_class( "HypClass", cls_dict, slots=slots_flag if slots is None else slots, frozen=frozen_flag if frozen is None else frozen, weakref_slot=weakref_flag if weakref_slot is None else weakref_slot, + init=init_flag, ) diff --git a/tests/test_3rd_party.py b/tests/test_3rd_party.py new file mode 100644 index 000000000..0707b2cd2 --- /dev/null +++ b/tests/test_3rd_party.py @@ -0,0 +1,31 @@ +# SPDX-License-Identifier: MIT + +""" +Tests for compatibility against other Python modules. +""" + +import pytest + +from hypothesis import given + +from .strategies import simple_classes + + +cloudpickle = pytest.importorskip("cloudpickle") + + +class TestCloudpickleCompat: + """ + Tests for compatibility with ``cloudpickle``. + """ + + @given(simple_classes()) + def test_repr(self, cls): + """ + attrs instances can be pickled and un-pickled with cloudpickle. + """ + inst = cls() + # Exact values aren't a concern so long as neither direction + # raises an exception. + pkl = cloudpickle.dumps(inst) + cloudpickle.loads(pkl) diff --git a/tests/test_annotations.py b/tests/test_annotations.py index f85b5d926..18f0d21cf 100644 --- a/tests/test_annotations.py +++ b/tests/test_annotations.py @@ -1,9 +1,12 @@ +# SPDX-License-Identifier: MIT + """ Tests for PEP-526 type annotations. 
Python 3.6+ only. """ +import sys import types import typing @@ -11,10 +14,21 @@ import attr -from attr._make import _classvar_prefixes +from attr._make import _is_class_var from attr.exceptions import UnannotatedAttributeError +def assert_init_annotations(cls, **annotations): + """ + Assert cls.__init__ has the correct annotations. + """ + __tracebackhide__ = True + + annotations["return"] = type(None) + + assert annotations == typing.get_type_hints(cls.__init__) + + class TestAnnotations: """ Tests for types derived from variable annotations (PEP-526). @@ -25,6 +39,7 @@ def test_basic_annotations(self): Sets the `Attribute.type` attr from basic type annotations. """ + @attr.resolve_types @attr.s class C: x: int = attr.ib() @@ -34,11 +49,7 @@ class C: assert int is attr.fields(C).x.type assert str is attr.fields(C).y.type assert None is attr.fields(C).z.type - assert C.__init__.__annotations__ == { - "x": int, - "y": str, - "return": None, - } + assert_init_annotations(C, x=int, y=str) def test_catches_basic_type_conflict(self): """ @@ -59,6 +70,7 @@ def test_typing_annotations(self): Sets the `Attribute.type` attr from typing annotations. """ + @attr.resolve_types @attr.s class C: x: typing.List[int] = attr.ib() @@ -66,28 +78,26 @@ class C: assert typing.List[int] is attr.fields(C).x.type assert typing.Optional[str] is attr.fields(C).y.type - assert C.__init__.__annotations__ == { - "x": typing.List[int], - "y": typing.Optional[str], - "return": None, - } + assert_init_annotations(C, x=typing.List[int], y=typing.Optional[str]) def test_only_attrs_annotations_collected(self): """ Annotations that aren't set to an attr.ib are ignored. 
""" + @attr.resolve_types @attr.s class C: x: typing.List[int] = attr.ib() y: int assert 1 == len(attr.fields(C)) - assert C.__init__.__annotations__ == { - "x": typing.List[int], - "return": None, - } + assert_init_annotations(C, x=typing.List[int]) + @pytest.mark.skipif( + sys.version_info[:2] < (3, 11), + reason="Incompatible behavior on older Pythons", + ) @pytest.mark.parametrize("slots", [True, False]) def test_auto_attribs(self, slots): """ @@ -107,10 +117,12 @@ class C: i = C(42) assert "C(a=42, x=[], y=2, z=3, foo=None)" == repr(i) - attr_names = set(a.name for a in C.__attrs_attrs__) + attr_names = {a.name for a in C.__attrs_attrs__} assert "a" in attr_names # just double check that the set works assert "cls_var" not in attr_names + attr.resolve_types(C) + assert int == attr.fields(C).a.type assert attr.Factory(list) == attr.fields(C).x.default @@ -135,14 +147,14 @@ class C: i.y = 23 assert 23 == i.y - assert C.__init__.__annotations__ == { - "a": int, - "x": typing.List[int], - "y": int, - "z": int, - "foo": typing.Any, - "return": None, - } + assert_init_annotations( + C, + a=int, + x=typing.List[int], + y=int, + z=int, + foo=typing.Any, + ) @pytest.mark.parametrize("slots", [True, False]) def test_auto_attribs_unannotated(self, slots): @@ -171,64 +183,259 @@ def test_auto_attribs_subclassing(self, slots): Ref #291 """ + @attr.resolve_types @attr.s(slots=slots, auto_attribs=True) class A: a: int = 1 + @attr.resolve_types @attr.s(slots=slots, auto_attribs=True) class B(A): b: int = 2 + @attr.resolve_types @attr.s(slots=slots, auto_attribs=True) class C(A): pass assert "B(a=1, b=2)" == repr(B()) assert "C(a=1)" == repr(C()) - - assert A.__init__.__annotations__ == {"a": int, "return": None} - assert B.__init__.__annotations__ == { - "a": int, - "b": int, - "return": None, - } - assert C.__init__.__annotations__ == {"a": int, "return": None} + assert_init_annotations(A, a=int) + assert_init_annotations(B, a=int, b=int) + assert_init_annotations(C, a=int) 
def test_converter_annotations(self): """ - Attributes with converters don't have annotations. + An unannotated attribute with an annotated converter gets its + annotation from the converter. """ - @attr.s(auto_attribs=True) + def int2str(x: int) -> str: + return str(x) + + @attr.s + class A: + a = attr.ib(converter=int2str) + + assert_init_annotations(A, a=int) + + def int2str_(x: int, y: str = ""): + return str(x) + + @attr.s + class A: + a = attr.ib(converter=int2str_) + + assert_init_annotations(A, a=int) + + def test_converter_attrib_annotations(self): + """ + If a converter is provided, an explicit type annotation has no + effect on an attribute's type annotation. + """ + + def int2str(x: int) -> str: + return str(x) + + @attr.s class A: - a: int = attr.ib(converter=int) + a: str = attr.ib(converter=int2str) + b = attr.ib(converter=int2str, type=str) + + assert_init_annotations(A, a=int, b=int) + + def test_non_introspectable_converter(self): + """ + A non-introspectable converter doesn't cause a crash. + """ + + @attr.s + class A: + a = attr.ib(converter=print) + + def test_nullary_converter(self): + """ + A converter with no arguments doesn't cause a crash. + """ + + def noop(): + pass + + @attr.s + class A: + a = attr.ib(converter=noop) assert A.__init__.__annotations__ == {"return": None} + def test_pipe(self): + """ + pipe() uses the input annotation of its first argument and the + output annotation of its last argument. 
+ """ + + def int2str(x: int) -> str: + return str(x) + + def strlen(y: str) -> int: + return len(y) + + def identity(z): + return z + + assert attr.converters.pipe(int2str).__annotations__ == { + "val": int, + "return": str, + } + assert attr.converters.pipe(int2str, strlen).__annotations__ == { + "val": int, + "return": int, + } + assert attr.converters.pipe(identity, strlen).__annotations__ == { + "return": int + } + assert attr.converters.pipe(int2str, identity).__annotations__ == { + "val": int + } + + def int2str_(x: int, y: int = 0) -> str: + return str(x) + + assert attr.converters.pipe(int2str_).__annotations__ == { + "val": int, + "return": str, + } + + def test_pipe_empty(self): + """ + pipe() with no converters is annotated like the identity. + """ + + p = attr.converters.pipe() + assert "val" in p.__annotations__ + t = p.__annotations__["val"] + assert isinstance(t, typing.TypeVar) + assert p.__annotations__ == {"val": t, "return": t} + + def test_pipe_non_introspectable(self): + """ + pipe() doesn't crash when passed a non-introspectable converter. + """ + + assert attr.converters.pipe(print).__annotations__ == {} + + def test_pipe_nullary(self): + """ + pipe() doesn't crash when passed a nullary converter. + """ + + def noop(): + pass + + assert attr.converters.pipe(noop).__annotations__ == {} + + def test_optional(self): + """ + optional() uses the annotations of the converter it wraps. 
+ """ + + def int2str(x: int) -> str: + return str(x) + + def int_identity(x: int): + return x + + def strify(x) -> str: + return str(x) + + def identity(x): + return x + + assert attr.converters.optional(int2str).__annotations__ == { + "val": typing.Optional[int], + "return": typing.Optional[str], + } + assert attr.converters.optional(int_identity).__annotations__ == { + "val": typing.Optional[int] + } + assert attr.converters.optional(strify).__annotations__ == { + "return": typing.Optional[str] + } + assert attr.converters.optional(identity).__annotations__ == {} + + def int2str_(x: int, y: int = 0) -> str: + return str(x) + + assert attr.converters.optional(int2str_).__annotations__ == { + "val": typing.Optional[int], + "return": typing.Optional[str], + } + + def test_optional_non_introspectable(self): + """ + optional() doesn't crash when passed a non-introspectable + converter. + """ + + assert attr.converters.optional(print).__annotations__ == {} + + def test_optional_nullary(self): + """ + optional() doesn't crash when passed a nullary converter. + """ + + def noop(): + pass + + assert attr.converters.optional(noop).__annotations__ == {} + + @pytest.mark.skipif( + sys.version_info[:2] < (3, 11), + reason="Incompatible behavior on older Pythons", + ) @pytest.mark.parametrize("slots", [True, False]) - @pytest.mark.parametrize("classvar", _classvar_prefixes) - def test_annotations_strings(self, slots, classvar): + def test_annotations_strings(self, slots): """ String annotations are passed into __init__ as is. + + It fails on 3.6 due to a bug in Python. 
""" + import typing as t + + from typing import ClassVar @attr.s(auto_attribs=True, slots=slots) class C: - cls_var: classvar + "[int]" = 23 + cls_var1: "typing.ClassVar[int]" = 23 + cls_var2: "ClassVar[int]" = 23 + cls_var3: "t.ClassVar[int]" = 23 a: "int" x: "typing.List[int]" = attr.Factory(list) y: "int" = 2 z: "int" = attr.ib(default=3) foo: "typing.Any" = None - assert C.__init__.__annotations__ == { - "a": "int", - "x": "typing.List[int]", - "y": "int", - "z": "int", - "foo": "typing.Any", - "return": None, - } + attr.resolve_types(C, locals(), globals()) + + assert_init_annotations( + C, + a=int, + x=typing.List[int], + y=int, + z=int, + foo=typing.Any, + ) + + @pytest.mark.parametrize("slots", [True, False]) + def test_typing_extensions_classvar(self, slots): + """ + If ClassVar is coming from typing_extensions, it is recognized too. + """ + + @attr.s(auto_attribs=True, slots=slots) + class C: + cls_var: "typing_extensions.ClassVar" = 23 # noqa + + assert_init_annotations(C) def test_keyword_only_auto_attribs(self): """ @@ -310,10 +517,6 @@ class C: y = attr.ib(type=str) z = attr.ib() - assert "int" == attr.fields(C).x.type - assert str is attr.fields(C).y.type - assert None is attr.fields(C).z.type - attr.resolve_types(C) assert int is attr.fields(C).x.type @@ -332,10 +535,6 @@ class A: b: typing.List["int"] c: "typing.List[int]" - assert typing.List[int] == attr.fields(A).a.type - assert typing.List["int"] == attr.fields(A).b.type - assert "typing.List[int]" == attr.fields(A).c.type - # Note: I don't have to pass globals and locals here because # int is a builtin and will be available in any scope. 
attr.resolve_types(A) @@ -372,9 +571,6 @@ class A: a: "A" b: typing.Optional["A"] # noqa: will resolve below - assert "A" == attr.fields(A).a.type - assert typing.Optional["A"] == attr.fields(A).b.type - attr.resolve_types(A, globals(), locals()) assert A == attr.fields(A).a.type @@ -394,10 +590,87 @@ class A: class B: a: A - assert typing.List["B"] == attr.fields(A).a.type - assert A == attr.fields(B).a.type - attr.resolve_types(A, globals(), locals()) + attr.resolve_types(B, globals(), locals()) + + assert typing.List[B] == attr.fields(A).a.type + assert A == attr.fields(B).a.type assert typing.List[B] == attr.fields(A).a.type assert A == attr.fields(B).a.type + + def test_init_type_hints(self): + """ + Forward references in __init__ can be automatically resolved. + """ + + @attr.s + class C: + x = attr.ib(type="typing.List[int]") + + assert_init_annotations(C, x=typing.List[int]) + + def test_init_type_hints_fake_module(self): + """ + If you somehow set the __module__ to something that doesn't exist + you'll lose __init__ resolution. + """ + + class C: + x = attr.ib(type="typing.List[int]") + + C.__module__ = "totally fake" + C = attr.s(C) + + with pytest.raises(NameError): + typing.get_type_hints(C.__init__) + + def test_inheritance(self): + """ + Subclasses can be resolved after the parent is resolved. + """ + + @attr.define() + class A: + n: "int" + + @attr.define() + class B(A): + pass + + attr.resolve_types(A) + attr.resolve_types(B) + + assert int == attr.fields(A).n.type + assert int == attr.fields(B).n.type + + def test_resolve_twice(self): + """ + You can call resolve_types as many times as you like. + This test is here mostly for coverage. 
+ """ + + @attr.define() + class A: + n: "int" + + attr.resolve_types(A) + assert int == attr.fields(A).n.type + attr.resolve_types(A) + assert int == attr.fields(A).n.type + + +@pytest.mark.parametrize( + "annot", + [ + typing.ClassVar, + "typing.ClassVar", + "'typing.ClassVar[dict]'", + "t.ClassVar[int]", + ], +) +def test_is_class_var(annot): + """ + ClassVars are detected, even if they're a string or quoted. + """ + assert _is_class_var(annot) diff --git a/tests/test_cmp.py b/tests/test_cmp.py new file mode 100644 index 000000000..b9383871e --- /dev/null +++ b/tests/test_cmp.py @@ -0,0 +1,488 @@ +# SPDX-License-Identifier: MIT + +""" +Tests for methods from `attrib._cmp`. +""" + + +import pytest + +from attr._cmp import cmp_using + + +# Test parameters. +EqCSameType = cmp_using(eq=lambda a, b: a == b, class_name="EqCSameType") +PartialOrderCSameType = cmp_using( + eq=lambda a, b: a == b, + lt=lambda a, b: a < b, + class_name="PartialOrderCSameType", +) +FullOrderCSameType = cmp_using( + eq=lambda a, b: a == b, + lt=lambda a, b: a < b, + le=lambda a, b: a <= b, + gt=lambda a, b: a > b, + ge=lambda a, b: a >= b, + class_name="FullOrderCSameType", +) + +EqCAnyType = cmp_using( + eq=lambda a, b: a == b, require_same_type=False, class_name="EqCAnyType" +) +PartialOrderCAnyType = cmp_using( + eq=lambda a, b: a == b, + lt=lambda a, b: a < b, + require_same_type=False, + class_name="PartialOrderCAnyType", +) + + +eq_data = [ + (EqCSameType, True), + (EqCAnyType, False), +] + +order_data = [ + (PartialOrderCSameType, True), + (PartialOrderCAnyType, False), + (FullOrderCSameType, True), +] + +eq_ids = [c[0].__name__ for c in eq_data] +order_ids = [c[0].__name__ for c in order_data] + +cmp_data = eq_data + order_data +cmp_ids = eq_ids + order_ids + + +class TestEqOrder: + """ + Tests for eq and order related methods. 
+ """ + + ######### + # eq + ######### + @pytest.mark.parametrize("cls, requires_same_type", cmp_data, ids=cmp_ids) + def test_equal_same_type(self, cls, requires_same_type): + """ + Equal objects are detected as equal. + """ + assert cls(1) == cls(1) + assert not (cls(1) != cls(1)) + + @pytest.mark.parametrize("cls, requires_same_type", cmp_data, ids=cmp_ids) + def test_unequal_same_type(self, cls, requires_same_type): + """ + Unequal objects of correct type are detected as unequal. + """ + assert cls(1) != cls(2) + assert not (cls(1) == cls(2)) + + @pytest.mark.parametrize("cls, requires_same_type", cmp_data, ids=cmp_ids) + def test_equal_different_type(self, cls, requires_same_type): + """ + Equal values of different types are detected appropriately. + """ + assert (cls(1) == cls(1.0)) == (not requires_same_type) + assert not (cls(1) != cls(1.0)) == (not requires_same_type) + + ######### + # lt + ######### + @pytest.mark.parametrize("cls, requires_same_type", eq_data, ids=eq_ids) + def test_lt_unorderable(self, cls, requires_same_type): + """ + TypeError is raised if class does not implement __lt__. + """ + with pytest.raises(TypeError): + cls(1) < cls(2) + + @pytest.mark.parametrize( + "cls, requires_same_type", order_data, ids=order_ids + ) + def test_lt_same_type(self, cls, requires_same_type): + """ + Less-than objects are detected appropriately. + """ + assert cls(1) < cls(2) + assert not (cls(2) < cls(1)) + + @pytest.mark.parametrize( + "cls, requires_same_type", order_data, ids=order_ids + ) + def test_not_lt_same_type(self, cls, requires_same_type): + """ + Not less-than objects are detected appropriately. + """ + assert cls(2) >= cls(1) + assert not (cls(1) >= cls(2)) + + @pytest.mark.parametrize( + "cls, requires_same_type", order_data, ids=order_ids + ) + def test_lt_different_type(self, cls, requires_same_type): + """ + Less-than values of different types are detected appropriately. 
+ """ + if requires_same_type: + # Unlike __eq__, NotImplemented will cause an exception to be + # raised from __lt__. + with pytest.raises(TypeError): + cls(1) < cls(2.0) + else: + assert cls(1) < cls(2.0) + assert not (cls(2) < cls(1.0)) + + ######### + # le + ######### + @pytest.mark.parametrize("cls, requires_same_type", eq_data, ids=eq_ids) + def test_le_unorderable(self, cls, requires_same_type): + """ + TypeError is raised if class does not implement __le__. + """ + with pytest.raises(TypeError): + cls(1) <= cls(2) + + @pytest.mark.parametrize( + "cls, requires_same_type", order_data, ids=order_ids + ) + def test_le_same_type(self, cls, requires_same_type): + """ + Less-than-or-equal objects are detected appropriately. + """ + assert cls(1) <= cls(1) + assert cls(1) <= cls(2) + assert not (cls(2) <= cls(1)) + + @pytest.mark.parametrize( + "cls, requires_same_type", order_data, ids=order_ids + ) + def test_not_le_same_type(self, cls, requires_same_type): + """ + Not less-than-or-equal objects are detected appropriately. + """ + assert cls(2) > cls(1) + assert not (cls(1) > cls(1)) + assert not (cls(1) > cls(2)) + + @pytest.mark.parametrize( + "cls, requires_same_type", order_data, ids=order_ids + ) + def test_le_different_type(self, cls, requires_same_type): + """ + Less-than-or-equal values of diff. types are detected appropriately. + """ + if requires_same_type: + # Unlike __eq__, NotImplemented will cause an exception to be + # raised from __le__. + with pytest.raises(TypeError): + cls(1) <= cls(2.0) + else: + assert cls(1) <= cls(2.0) + assert cls(1) <= cls(1.0) + assert not (cls(2) <= cls(1.0)) + + ######### + # gt + ######### + @pytest.mark.parametrize("cls, requires_same_type", eq_data, ids=eq_ids) + def test_gt_unorderable(self, cls, requires_same_type): + """ + TypeError is raised if class does not implement __gt__. 
+ """ + with pytest.raises(TypeError): + cls(2) > cls(1) + + @pytest.mark.parametrize( + "cls, requires_same_type", order_data, ids=order_ids + ) + def test_gt_same_type(self, cls, requires_same_type): + """ + Greater-than objects are detected appropriately. + """ + assert cls(2) > cls(1) + assert not (cls(1) > cls(2)) + + @pytest.mark.parametrize( + "cls, requires_same_type", order_data, ids=order_ids + ) + def test_not_gt_same_type(self, cls, requires_same_type): + """ + Not greater-than objects are detected appropriately. + """ + assert cls(1) <= cls(2) + assert not (cls(2) <= cls(1)) + + @pytest.mark.parametrize( + "cls, requires_same_type", order_data, ids=order_ids + ) + def test_gt_different_type(self, cls, requires_same_type): + """ + Greater-than values of different types are detected appropriately. + """ + if requires_same_type: + # Unlike __eq__, NotImplemented will cause an exception to be + # raised from __gt__. + with pytest.raises(TypeError): + cls(2) > cls(1.0) + else: + assert cls(2) > cls(1.0) + assert not (cls(1) > cls(2.0)) + + ######### + # ge + ######### + @pytest.mark.parametrize("cls, requires_same_type", eq_data, ids=eq_ids) + def test_ge_unorderable(self, cls, requires_same_type): + """ + TypeError is raised if class does not implement __ge__. + """ + with pytest.raises(TypeError): + cls(2) >= cls(1) + + @pytest.mark.parametrize( + "cls, requires_same_type", order_data, ids=order_ids + ) + def test_ge_same_type(self, cls, requires_same_type): + """ + Greater-than-or-equal objects are detected appropriately. + """ + assert cls(1) >= cls(1) + assert cls(2) >= cls(1) + assert not (cls(1) >= cls(2)) + + @pytest.mark.parametrize( + "cls, requires_same_type", order_data, ids=order_ids + ) + def test_not_ge_same_type(self, cls, requires_same_type): + """ + Not greater-than-or-equal objects are detected appropriately. 
+ """ + assert cls(1) < cls(2) + assert not (cls(1) < cls(1)) + assert not (cls(2) < cls(1)) + + @pytest.mark.parametrize( + "cls, requires_same_type", order_data, ids=order_ids + ) + def test_ge_different_type(self, cls, requires_same_type): + """ + Greater-than-or-equal values of diff. types are detected appropriately. + """ + if requires_same_type: + # Unlike __eq__, NotImplemented will cause an exception to be + # raised from __ge__. + with pytest.raises(TypeError): + cls(2) >= cls(1.0) + else: + assert cls(2) >= cls(2.0) + assert cls(2) >= cls(1.0) + assert not (cls(1) >= cls(2.0)) + + +class TestDundersUnnamedClass: + """ + Tests for dunder attributes of unnamed classes. + """ + + cls = cmp_using(eq=lambda a, b: a == b) + + def test_class(self): + """ + Class name and qualified name should be well behaved. + """ + assert self.cls.__name__ == "Comparable" + assert self.cls.__qualname__ == "Comparable" + + def test_eq(self): + """ + __eq__ docstring and qualified name should be well behaved. + """ + method = self.cls.__eq__ + assert method.__doc__.strip() == "Return a == b. Computed by attrs." + assert method.__name__ == "__eq__" + + def test_ne(self): + """ + __ne__ docstring and qualified name should be well behaved. + """ + method = self.cls.__ne__ + assert method.__doc__.strip() == ( + "Check equality and either forward a NotImplemented or\n" + " return the result negated." + ) + assert method.__name__ == "__ne__" + + +class TestTotalOrderingException: + """ + Test for exceptions related to total ordering. + """ + + def test_eq_must_specified(self): + """ + `total_ordering` requires `__eq__` to be specified. + """ + with pytest.raises(ValueError) as ei: + cmp_using(lt=lambda a, b: a < b) + + assert ei.value.args[0] == ( + "eq must be define is order to complete ordering from " + "lt, le, gt, ge." + ) + + +class TestNotImplementedIsPropagated: + """ + Test related to functions that return NotImplemented. 
+ """ + + def test_not_implemented_is_propagated(self): + """ + If the comparison function returns NotImplemented, + the dunder method should too. + """ + C = cmp_using(eq=lambda a, b: NotImplemented if a == 1 else a == b) + + assert C(2) == C(2) + assert C(1) != C(1) + + +class TestDundersPartialOrdering: + """ + Tests for dunder attributes of classes with partial ordering. + """ + + cls = PartialOrderCSameType + + def test_class(self): + """ + Class name and qualified name should be well behaved. + """ + assert self.cls.__name__ == "PartialOrderCSameType" + assert self.cls.__qualname__ == "PartialOrderCSameType" + + def test_eq(self): + """ + __eq__ docstring and qualified name should be well behaved. + """ + method = self.cls.__eq__ + assert method.__doc__.strip() == "Return a == b. Computed by attrs." + assert method.__name__ == "__eq__" + + def test_ne(self): + """ + __ne__ docstring and qualified name should be well behaved. + """ + method = self.cls.__ne__ + assert method.__doc__.strip() == ( + "Check equality and either forward a NotImplemented or\n" + " return the result negated." + ) + assert method.__name__ == "__ne__" + + def test_lt(self): + """ + __lt__ docstring and qualified name should be well behaved. + """ + method = self.cls.__lt__ + assert method.__doc__.strip() == "Return a < b. Computed by attrs." + assert method.__name__ == "__lt__" + + def test_le(self): + """ + __le__ docstring and qualified name should be well behaved. + """ + method = self.cls.__le__ + assert method.__doc__.strip().startswith( + "Return a <= b. Computed by @total_ordering from" + ) + assert method.__name__ == "__le__" + + def test_gt(self): + """ + __gt__ docstring and qualified name should be well behaved. + """ + method = self.cls.__gt__ + assert method.__doc__.strip().startswith( + "Return a > b. Computed by @total_ordering from" + ) + assert method.__name__ == "__gt__" + + def test_ge(self): + """ + __ge__ docstring and qualified name should be well behaved. 
+ """ + method = self.cls.__ge__ + assert method.__doc__.strip().startswith( + "Return a >= b. Computed by @total_ordering from" + ) + assert method.__name__ == "__ge__" + + +class TestDundersFullOrdering: + """ + Tests for dunder attributes of classes with full ordering. + """ + + cls = FullOrderCSameType + + def test_class(self): + """ + Class name and qualified name should be well behaved. + """ + assert self.cls.__name__ == "FullOrderCSameType" + assert self.cls.__qualname__ == "FullOrderCSameType" + + def test_eq(self): + """ + __eq__ docstring and qualified name should be well behaved. + """ + method = self.cls.__eq__ + assert method.__doc__.strip() == "Return a == b. Computed by attrs." + assert method.__name__ == "__eq__" + + def test_ne(self): + """ + __ne__ docstring and qualified name should be well behaved. + """ + method = self.cls.__ne__ + assert method.__doc__.strip() == ( + "Check equality and either forward a NotImplemented or\n" + " return the result negated." + ) + assert method.__name__ == "__ne__" + + def test_lt(self): + """ + __lt__ docstring and qualified name should be well behaved. + """ + method = self.cls.__lt__ + assert method.__doc__.strip() == "Return a < b. Computed by attrs." + assert method.__name__ == "__lt__" + + def test_le(self): + """ + __le__ docstring and qualified name should be well behaved. + """ + method = self.cls.__le__ + assert method.__doc__.strip() == "Return a <= b. Computed by attrs." + assert method.__name__ == "__le__" + + def test_gt(self): + """ + __gt__ docstring and qualified name should be well behaved. + """ + method = self.cls.__gt__ + assert method.__doc__.strip() == "Return a > b. Computed by attrs." + assert method.__name__ == "__gt__" + + def test_ge(self): + """ + __ge__ docstring and qualified name should be well behaved. + """ + method = self.cls.__ge__ + assert method.__doc__.strip() == "Return a >= b. Computed by attrs." 
+ assert method.__name__ == "__ge__" diff --git a/tests/test_compat.py b/tests/test_compat.py new file mode 100644 index 000000000..4a156c4d4 --- /dev/null +++ b/tests/test_compat.py @@ -0,0 +1,52 @@ +# SPDX-License-Identifier: MIT + +import types + +import pytest + + +@pytest.fixture(name="mp") +def _mp(): + return types.MappingProxyType({"x": 42, "y": "foo"}) + + +class TestMetadataProxy: + """ + Ensure properties of metadata proxy independently of hypothesis strategies. + """ + + def test_repr(self, mp): + """ + repr makes sense and is consistent across Python versions. + """ + assert any( + [ + "mappingproxy({'x': 42, 'y': 'foo'})" == repr(mp), + "mappingproxy({'y': 'foo', 'x': 42})" == repr(mp), + ] + ) + + def test_immutable(self, mp): + """ + All mutating methods raise errors. + """ + with pytest.raises(TypeError, match="not support item assignment"): + mp["z"] = 23 + + with pytest.raises(TypeError, match="not support item deletion"): + del mp["x"] + + with pytest.raises(AttributeError, match="no attribute 'update'"): + mp.update({}) + + with pytest.raises(AttributeError, match="no attribute 'clear'"): + mp.clear() + + with pytest.raises(AttributeError, match="no attribute 'pop'"): + mp.pop("x") + + with pytest.raises(AttributeError, match="no attribute 'popitem'"): + mp.popitem() + + with pytest.raises(AttributeError, match="no attribute 'setdefault'"): + mp.setdefault("x") diff --git a/tests/test_config.py b/tests/test_config.py index 287be03a5..6c78fd295 100644 --- a/tests/test_config.py +++ b/tests/test_config.py @@ -1,15 +1,16 @@ +# SPDX-License-Identifier: MIT + """ Tests for `attr._config`. """ -from __future__ import absolute_import, division, print_function import pytest from attr import _config -class TestConfig(object): +class TestConfig: def test_default(self): """ Run validators by default. 
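The metadata-proxy behavior these tests pin down comes straight from the stdlib's `types.MappingProxyType`; a quick sketch:

```python
import types

mp = types.MappingProxyType({"x": 42, "y": "foo"})

# Reads behave like a plain mapping.
assert mp["x"] == 42

# Item assignment is rejected with TypeError...
try:
    mp["z"] = 23
    raised = False
except TypeError:
    raised = True
assert raised

# ...and the mutating methods simply don't exist on the proxy.
assert not hasattr(mp, "update")
```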
diff --git a/tests/test_converters.py b/tests/test_converters.py index f86e07e29..7607e5550 100644 --- a/tests/test_converters.py +++ b/tests/test_converters.py @@ -1,20 +1,19 @@ +# SPDX-License-Identifier: MIT + """ Tests for `attr.converters`. """ -from __future__ import absolute_import - -from distutils.util import strtobool import pytest import attr from attr import Factory, attrib -from attr.converters import default_if_none, optional, pipe +from attr.converters import default_if_none, optional, pipe, to_bool -class TestOptional(object): +class TestOptional: """ Tests for `optional`. """ @@ -45,7 +44,7 @@ def test_fail(self): c("not_an_int") -class TestDefaultIfNone(object): +class TestDefaultIfNone: def test_missing_default(self): """ Raises TypeError if neither default nor factory have been passed. @@ -101,12 +100,12 @@ def test_none_factory(self): assert [] == c(None) -class TestPipe(object): +class TestPipe: def test_success(self): """ Succeeds if all wrapped converters succeed. """ - c = pipe(str, strtobool, bool) + c = pipe(str, to_bool, bool) assert True is c("True") is c(True) @@ -114,7 +113,7 @@ def test_fail(self): """ Fails if any wrapped converter fails. """ - c = pipe(str, strtobool) + c = pipe(str, to_bool) # First wrapped converter fails: with pytest.raises(ValueError): @@ -130,9 +129,42 @@ def test_sugar(self): """ @attr.s - class C(object): - a1 = attrib(default="True", converter=pipe(str, strtobool, bool)) - a2 = attrib(default=True, converter=[str, strtobool, bool]) + class C: + a1 = attrib(default="True", converter=pipe(str, to_bool, bool)) + a2 = attrib(default=True, converter=[str, to_bool, bool]) c = C() assert True is c.a1 is c.a2 + + def test_empty(self): + """ + Empty pipe returns same value. + """ + o = object() + + assert o is pipe()(o) + + +class TestToBool: + def test_unhashable(self): + """ + Fails if value is unhashable. 
+ """ + with pytest.raises(ValueError, match="Cannot convert value to bool"): + to_bool([]) + + def test_truthy(self): + """ + Fails if truthy values are incorrectly converted. + """ + assert to_bool("t") + assert to_bool("yes") + assert to_bool("on") + + def test_falsy(self): + """ + Fails if falsy values are incorrectly converted. + """ + assert not to_bool("f") + assert not to_bool("no") + assert not to_bool("off") diff --git a/tests/test_dunders.py b/tests/test_dunders.py index 2f1ebabdd..03644f8b2 100644 --- a/tests/test_dunders.py +++ b/tests/test_dunders.py @@ -1,8 +1,9 @@ +# SPDX-License-Identifier: MIT + """ Tests for dunder methods from `attrib._make`. """ -from __future__ import absolute_import, division, print_function import copy import pickle @@ -36,6 +37,31 @@ ReprC = simple_class(repr=True) ReprCSlots = simple_class(repr=True, slots=True) + +@attr.s(eq=True) +class EqCallableC: + a = attr.ib(eq=str.lower, order=False) + b = attr.ib(eq=True) + + +@attr.s(eq=True, slots=True) +class EqCallableCSlots: + a = attr.ib(eq=str.lower, order=False) + b = attr.ib(eq=True) + + +@attr.s(order=True) +class OrderCallableC: + a = attr.ib(eq=True, order=str.lower) + b = attr.ib(order=True) + + +@attr.s(order=True, slots=True) +class OrderCallableCSlots: + a = attr.ib(eq=True, order=str.lower) + b = attr.ib(order=True) + + # HashC is hashable by explicit definition while HashCSlots is hashable # implicitly. 
The "Cached" versions are the same, except with hash code # caching enabled @@ -62,25 +88,27 @@ def _add_init(cls, frozen): cls.__init__ = _make_init( cls, cls.__attrs_attrs__, + getattr(cls, "__attrs_pre_init__", False), getattr(cls, "__attrs_post_init__", False), frozen, _is_slot_cls(cls), cache_hash=False, base_attr_map={}, is_exc=False, - has_global_on_setattr=False, + cls_on_setattr=None, + attrs_init=False, ) return cls -class InitC(object): +class InitC: __attrs_attrs__ = [simple_attr("a"), simple_attr("b")] InitC = _add_init(InitC, False) -class TestEqOrder(object): +class TestEqOrder: """ Tests for eq and order related methods. """ @@ -104,6 +132,16 @@ def test_equal(self, cls): assert cls(1, 2) == cls(1, 2) assert not (cls(1, 2) != cls(1, 2)) + @pytest.mark.parametrize("cls", [EqCallableC, EqCallableCSlots]) + def test_equal_callable(self, cls): + """ + Equal objects are detected as equal. + """ + assert cls("Test", 1) == cls("test", 1) + assert cls("Test", 1) != cls("test", 2) + assert not (cls("Test", 1) != cls("test", 1)) + assert not (cls("Test", 1) == cls("test", 2)) + @pytest.mark.parametrize("cls", [EqC, EqCSlots]) def test_unequal_same_class(self, cls): """ @@ -112,14 +150,24 @@ def test_unequal_same_class(self, cls): assert cls(1, 2) != cls(2, 1) assert not (cls(1, 2) == cls(2, 1)) - @pytest.mark.parametrize("cls", [EqC, EqCSlots]) + @pytest.mark.parametrize("cls", [EqCallableC, EqCallableCSlots]) + def test_unequal_same_class_callable(self, cls): + """ + Unequal objects of correct type are detected as unequal. + """ + assert cls("Test", 1) != cls("foo", 2) + assert not (cls("Test", 1) == cls("foo", 2)) + + @pytest.mark.parametrize( + "cls", [EqC, EqCSlots, EqCallableC, EqCallableCSlots] + ) def test_unequal_different_class(self, cls): """ Unequal objects of different type are detected even if their attributes match. 
""" - class NotEqC(object): + class NotEqC: a = 1 b = 2 @@ -138,7 +186,21 @@ def test_lt(self, cls): ]: assert cls(*a) < cls(*b) - @pytest.mark.parametrize("cls", [OrderC, OrderCSlots]) + @pytest.mark.parametrize("cls", [OrderCallableC, OrderCallableCSlots]) + def test_lt_callable(self, cls): + """ + __lt__ compares objects as tuples of attribute values. + """ + # Note: "A" < "a" + for a, b in [ + (("test1", 1), ("Test1", 2)), + (("test0", 1), ("Test1", 1)), + ]: + assert cls(*a) < cls(*b) + + @pytest.mark.parametrize( + "cls", [OrderC, OrderCSlots, OrderCallableC, OrderCallableCSlots] + ) def test_lt_unordable(self, cls): """ __lt__ returns NotImplemented if classes differ. @@ -159,7 +221,23 @@ def test_le(self, cls): ]: assert cls(*a) <= cls(*b) - @pytest.mark.parametrize("cls", [OrderC, OrderCSlots]) + @pytest.mark.parametrize("cls", [OrderCallableC, OrderCallableCSlots]) + def test_le_callable(self, cls): + """ + __le__ compares objects as tuples of attribute values. + """ + # Note: "A" < "a" + for a, b in [ + (("test1", 1), ("Test1", 1)), + (("test1", 1), ("Test1", 2)), + (("test0", 1), ("Test1", 1)), + (("test0", 2), ("Test1", 1)), + ]: + assert cls(*a) <= cls(*b) + + @pytest.mark.parametrize( + "cls", [OrderC, OrderCSlots, OrderCallableC, OrderCallableCSlots] + ) def test_le_unordable(self, cls): """ __le__ returns NotImplemented if classes differ. @@ -178,7 +256,21 @@ def test_gt(self, cls): ]: assert cls(*a) > cls(*b) - @pytest.mark.parametrize("cls", [OrderC, OrderCSlots]) + @pytest.mark.parametrize("cls", [OrderCallableC, OrderCallableCSlots]) + def test_gt_callable(self, cls): + """ + __gt__ compares objects as tuples of attribute values. 
+ """ + # Note: "A" < "a" + for a, b in [ + (("Test1", 2), ("test1", 1)), + (("Test1", 1), ("test0", 1)), + ]: + assert cls(*a) > cls(*b) + + @pytest.mark.parametrize( + "cls", [OrderC, OrderCSlots, OrderCallableC, OrderCallableCSlots] + ) def test_gt_unordable(self, cls): """ __gt__ returns NotImplemented if classes differ. @@ -199,7 +291,23 @@ def test_ge(self, cls): ]: assert cls(*a) >= cls(*b) - @pytest.mark.parametrize("cls", [OrderC, OrderCSlots]) + @pytest.mark.parametrize("cls", [OrderCallableC, OrderCallableCSlots]) + def test_ge_callable(self, cls): + """ + __ge__ compares objects as tuples of attribute values. + """ + # Note: "A" < "a" + for a, b in [ + (("Test1", 1), ("test1", 1)), + (("Test1", 2), ("test1", 1)), + (("Test1", 1), ("test0", 1)), + (("Test1", 1), ("test0", 2)), + ]: + assert cls(*a) >= cls(*b) + + @pytest.mark.parametrize( + "cls", [OrderC, OrderCSlots, OrderCallableC, OrderCallableCSlots] + ) def test_ge_unordable(self, cls): """ __ge__ returns NotImplemented if classes differ. @@ -207,7 +315,7 @@ def test_ge_unordable(self, cls): assert NotImplemented == (cls(1, 2).__ge__(42)) -class TestAddRepr(object): +class TestAddRepr: """ Tests for `_add_repr`. """ @@ -240,7 +348,7 @@ def custom_repr(value): return "foo:" + str(value) @attr.s - class C(object): + class C: a = attr.ib(repr=custom_repr) assert "C(a=foo:1)" == repr(C(1)) @@ -252,7 +360,7 @@ def test_infinite_recursion(self): """ @attr.s - class Cycle(object): + class Cycle: value = attr.ib(default=7) cycle = attr.ib(default=None) @@ -260,12 +368,29 @@ class Cycle(object): cycle.cycle = cycle assert "Cycle(value=7, cycle=...)" == repr(cycle) + def test_infinite_recursion_long_cycle(self): + """ + A cyclic graph can pass through other non-attrs objects, and repr will + still emit an ellipsis and not raise an exception. 
+ """ + + @attr.s + class LongCycle: + value = attr.ib(default=14) + cycle = attr.ib(default=None) + + cycle = LongCycle() + # Ensure that the reference cycle passes through a non-attrs object. + # This demonstrates the need for a thread-local "global" ID tracker. + cycle.cycle = {"cycle": [cycle]} + assert "LongCycle(value=14, cycle={'cycle': [...]})" == repr(cycle) + def test_underscores(self): """ repr does not strip underscores. """ - class C(object): + class C: __attrs_attrs__ = [simple_attr("_x")] C = _add_repr(C) @@ -314,21 +439,21 @@ def test_str_no_repr(self): # these are for use in TestAddHash.test_cache_hash_serialization # they need to be out here so they can be un-pickled @attr.attrs(hash=True, cache_hash=False) -class HashCacheSerializationTestUncached(object): +class HashCacheSerializationTestUncached: foo_value = attr.ib() @attr.attrs(hash=True, cache_hash=True) -class HashCacheSerializationTestCached(object): +class HashCacheSerializationTestCached: foo_value = attr.ib() @attr.attrs(slots=True, hash=True, cache_hash=True) -class HashCacheSerializationTestCachedSlots(object): +class HashCacheSerializationTestCachedSlots: foo_value = attr.ib() -class IncrementingHasher(object): +class IncrementingHasher: def __init__(self): self.hash_value = 100 @@ -338,7 +463,7 @@ def __hash__(self): return rv -class TestAddHash(object): +class TestAddHash: """ Tests for `_add_hash`. """ @@ -535,7 +660,7 @@ def test_copy_hash_cleared(self, cache_hash, frozen, slots): kwargs["hash"] = True @attr.s(**kwargs) - class C(object): + class C: x = attr.ib() a = C(IncrementingHasher()) @@ -585,7 +710,7 @@ def test_copy_two_arg_reduce(self, frozen): """ @attr.s(frozen=frozen, cache_hash=True, hash=True) - class C(object): + class C: x = attr.ib() def __getstate__(self): @@ -603,7 +728,7 @@ def _roundtrip_pickle(self, obj): return pickle.loads(pickle_str) -class TestAddInit(object): +class TestAddInit: """ Tests for `_add_init`. 
""" @@ -622,9 +747,8 @@ def test_init(self, slots, frozen): with pytest.raises(TypeError) as e: C(a=1, b=2) - assert ( + assert e.value.args[0].endswith( "__init__() got an unexpected keyword argument 'a'" - == e.value.args[0] ) @given(booleans(), booleans()) @@ -677,7 +801,7 @@ def test_default(self): If a default value is present, it's used as fallback. """ - class C(object): + class C: __attrs_attrs__ = [ simple_attr(name="a", default=2), simple_attr(name="b", default="hallo"), @@ -695,10 +819,10 @@ def test_factory(self): If a default factory is present, it's used as fallback. """ - class D(object): + class D: pass - class C(object): + class C: __attrs_attrs__ = [ simple_attr(name="a", default=Factory(list)), simple_attr(name="b", default=Factory(D)), @@ -773,7 +897,7 @@ def test_underscores(self): underscores. """ - class C(object): + class C: __attrs_attrs__ = [simple_attr("_private")] C = _add_init(C, False) @@ -781,7 +905,7 @@ class C(object): assert 42 == i._private -class TestNothing(object): +class TestNothing: """ Tests for `_Nothing`. """ @@ -808,9 +932,16 @@ def test_eq(self): assert not (_Nothing() != _Nothing()) assert 1 != _Nothing() + def test_false(self): + """ + NOTHING evaluates as falsey. + """ + assert not NOTHING + assert False is bool(NOTHING) + @attr.s(hash=True, order=True) -class C(object): +class C: pass @@ -819,11 +950,21 @@ class C(object): @attr.s(hash=True, order=True) -class C(object): +class C: pass -class TestFilenames(object): +CopyC = C + + +@attr.s(hash=True, order=True) +class C: + """A different class, to generate different methods.""" + + a = attr.ib() + + +class TestFilenames: def test_filenames(self): """ The created dunder methods have a "consistent" filename. 
@@ -840,15 +981,27 @@ def test_filenames(self): OriginalC.__hash__.__code__.co_filename == "" ) + assert ( + CopyC.__init__.__code__.co_filename + == "" + ) + assert ( + CopyC.__eq__.__code__.co_filename + == "" + ) + assert ( + CopyC.__hash__.__code__.co_filename + == "" + ) assert ( C.__init__.__code__.co_filename - == "" + == "" ) assert ( C.__eq__.__code__.co_filename - == "" + == "" ) assert ( C.__hash__.__code__.co_filename - == "" + == "" ) diff --git a/tests/test_filters.py b/tests/test_filters.py index 7a1a41895..6945bd2d5 100644 --- a/tests/test_filters.py +++ b/tests/test_filters.py @@ -1,8 +1,9 @@ +# SPDX-License-Identifier: MIT + """ Tests for `attr.filters`. """ -from __future__ import absolute_import, division, print_function import pytest @@ -13,12 +14,12 @@ @attr.s -class C(object): +class C: a = attr.ib() b = attr.ib() -class TestSplitWhat(object): +class TestSplitWhat: """ Tests for `_split_what`. """ @@ -33,7 +34,7 @@ def test_splits(self): ) == _split_what((str, fields(C).a, int)) -class TestInclude(object): +class TestInclude: """ Tests for `include`. """ @@ -49,7 +50,7 @@ class TestInclude(object): ) def test_allow(self, incl, value): """ - Return True if a class or attribute is whitelisted. + Return True if a class or attribute is included. """ i = include(*incl) assert i(fields(C).a, value) is True @@ -65,13 +66,13 @@ def test_allow(self, incl, value): ) def test_drop_class(self, incl, value): """ - Return False on non-whitelisted classes and attributes. + Return False on non-included classes and attributes. """ i = include(*incl) assert i(fields(C).a, value) is False -class TestExclude(object): +class TestExclude: """ Tests for `exclude`. """ @@ -87,7 +88,7 @@ class TestExclude(object): ) def test_allow(self, excl, value): """ - Return True if class or attribute is not blacklisted. + Return True if class or attribute is not excluded. 
""" e = exclude(*excl) assert e(fields(C).a, value) is True @@ -103,7 +104,7 @@ def test_allow(self, excl, value): ) def test_drop_class(self, excl, value): """ - Return True on non-blacklisted classes and attributes. + Return True on non-excluded classes and attributes. """ e = exclude(*excl) assert e(fields(C).a, value) is False diff --git a/tests/test_funcs.py b/tests/test_funcs.py index 20e2747e6..d73d94c51 100644 --- a/tests/test_funcs.py +++ b/tests/test_funcs.py @@ -1,8 +1,9 @@ +# SPDX-License-Identifier: MIT + """ Tests for `attr._funcs`. """ -from __future__ import absolute_import, division, print_function from collections import OrderedDict @@ -14,7 +15,7 @@ import attr from attr import asdict, assoc, astuple, evolve, fields, has -from attr._compat import TYPE, Mapping, Sequence, ordered_dict +from attr._compat import Mapping, Sequence, ordered_dict from attr.exceptions import AttrsAttributeNotFoundError from attr.validators import instance_of @@ -26,21 +27,21 @@ @pytest.fixture(scope="session", name="C") -def fixture_C(): +def _C(): """ Return a simple but fully featured attrs class with an x and a y attribute. """ import attr @attr.s - class C(object): + class C: x = attr.ib() y = attr.ib() return C -class TestAsDict(object): +class TestAsDict: """ Tests for `asdict`. """ @@ -199,8 +200,39 @@ def test_asdict_preserve_order(self, cls): assert [a.name for a in fields(cls)] == list(dict_instance.keys()) + def test_retain_keys_are_tuples(self): + """ + retain_collect_types also retains keys. + """ + + @attr.s + class A: + a = attr.ib() + + instance = A({(1,): 1}) + + assert {"a": {(1,): 1}} == attr.asdict( + instance, retain_collection_types=True + ) + + def test_tuple_keys(self): + """ + If a key is collection type, retain_collection_types is False, + the key is serialized as a tuple. 
-class TestAsTuple(object): + See #646 + """ + + @attr.s + class A: + a = attr.ib() + + instance = A({(1,): 1}) + + assert {"a": {(1,): 1}} == attr.asdict(instance) + + +class TestAsTuple: """ Tests for `astuple`. """ @@ -358,7 +390,7 @@ def test_sets_no_retain(self, C, set_type): assert (1, [1, 2, 3]) == d -class TestHas(object): +class TestHas: """ Tests for `has`. """ @@ -375,7 +407,7 @@ def test_positive_empty(self): """ @attr.s - class D(object): + class D: pass assert has(D) @@ -387,7 +419,7 @@ def test_negative(self): assert not has(object) -class TestAssoc(object): +class TestAssoc: """ Tests for `assoc`. """ @@ -399,7 +431,7 @@ def test_empty(self, slots, frozen): """ @attr.s(slots=slots, frozen=frozen) - class C(object): + class C: pass i1 = C() @@ -461,7 +493,7 @@ def test_frozen(self): """ @attr.s(frozen=True) - class C(object): + class C: x = attr.ib() y = attr.ib() @@ -474,7 +506,7 @@ def test_warning(self): """ @attr.s - class C(object): + class C: x = attr.ib() with pytest.warns(DeprecationWarning) as wi: @@ -483,7 +515,7 @@ class C(object): assert __file__ == wi.list[0].filename -class TestEvolve(object): +class TestEvolve: """ Tests for `evolve`. """ @@ -495,7 +527,7 @@ def test_empty(self, slots, frozen): """ @attr.s(slots=slots, frozen=frozen) - class C(object): + class C: pass i1 = C() @@ -544,8 +576,15 @@ def test_unknown(self, C): # No generated class will have a four letter attribute. 
with pytest.raises(TypeError) as e: evolve(C(), aaaa=2) - expected = "__init__() got an unexpected keyword argument 'aaaa'" - assert (expected,) == e.value.args + + if hasattr(C, "__attrs_init__"): + expected = ( + "__attrs_init__() got an unexpected keyword argument 'aaaa'" + ) + else: + expected = "__init__() got an unexpected keyword argument 'aaaa'" + + assert e.value.args[0].endswith(expected) def test_validator_failure(self): """ @@ -553,14 +592,14 @@ def test_validator_failure(self): """ @attr.s - class C(object): + class C: a = attr.ib(validator=instance_of(int)) with pytest.raises(TypeError) as e: evolve(C(a=1), a="some string") m = e.value.args[0] - assert m.startswith("'a' must be <{type} 'int'>".format(type=TYPE)) + assert m.startswith("'a' must be ") def test_private(self): """ @@ -568,7 +607,7 @@ def test_private(self): """ @attr.s - class C(object): + class C: _a = attr.ib() assert evolve(C(1), a=2)._a == 2 @@ -585,8 +624,56 @@ def test_non_init_attrs(self): """ @attr.s - class C(object): + class C: a = attr.ib() b = attr.ib(init=False, default=0) assert evolve(C(1), a=2).a == 2 + + def test_regression_attrs_classes(self): + """ + evolve() can evolve fields that are instances of attrs classes. + + Regression test for #804 + """ + + @attr.s + class Cls1: + param1 = attr.ib() + + @attr.s + class Cls2: + param2 = attr.ib() + + obj2a = Cls2(param2="a") + obj2b = Cls2(param2="b") + + obj1a = Cls1(param1=obj2a) + + assert Cls1(param1=Cls2(param2="b")) == attr.evolve( + obj1a, param1=obj2b + ) + + def test_dicts(self): + """ + evolve() can replace an attrs class instance with a dict. 
+ + See #806 + """ + + @attr.s + class Cls1: + param1 = attr.ib() + + @attr.s + class Cls2: + param2 = attr.ib() + + obj2a = Cls2(param2="a") + obj2b = {"foo": 42, "param2": 42} + + obj1a = Cls1(param1=obj2a) + + assert Cls1({"foo": 42, "param2": 42}) == attr.evolve( + obj1a, param1=obj2b + ) diff --git a/tests/test_functional.py b/tests/test_functional.py index 3878d6868..09f504802 100644 --- a/tests/test_functional.py +++ b/tests/test_functional.py @@ -1,36 +1,35 @@ +# SPDX-License-Identifier: MIT + """ End-to-end tests. """ -from __future__ import absolute_import, division, print_function +import inspect import pickle from copy import deepcopy import pytest -import six -from hypothesis import assume, given +from hypothesis import given from hypothesis.strategies import booleans import attr -from attr._compat import PY2, TYPE +from attr._compat import PY36 from attr._make import NOTHING, Attribute from attr.exceptions import FrozenInstanceError -from .strategies import optional_bool - @attr.s -class C1(object): +class C1: x = attr.ib(validator=attr.validators.instance_of(int)) y = attr.ib() @attr.s(slots=True) -class C1Slots(object): +class C1Slots: x = attr.ib(validator=attr.validators.instance_of(int)) y = attr.ib() @@ -39,19 +38,19 @@ class C1Slots(object): @attr.s() -class C2(object): +class C2: x = attr.ib(default=foo) y = attr.ib(default=attr.Factory(list)) @attr.s(slots=True) -class C2Slots(object): +class C2Slots: x = attr.ib(default=foo) y = attr.ib(default=attr.Factory(list)) @attr.s -class Base(object): +class Base: x = attr.ib() def meth(self): @@ -59,7 +58,7 @@ def meth(self): @attr.s(slots=True) -class BaseSlots(object): +class BaseSlots: x = attr.ib() def meth(self): @@ -77,7 +76,7 @@ class SubSlots(BaseSlots): @attr.s(frozen=True, slots=True) -class Frozen(object): +class Frozen: x = attr.ib() @@ -87,7 +86,7 @@ class SubFrozen(Frozen): @attr.s(frozen=True, slots=False) -class FrozenNoSlots(object): +class FrozenNoSlots: x = attr.ib() @@ -96,21 
+95,19 @@ class Meta(type): @attr.s -@six.add_metaclass(Meta) -class WithMeta(object): +class WithMeta(metaclass=Meta): pass @attr.s(slots=True) -@six.add_metaclass(Meta) -class WithMetaSlots(object): +class WithMetaSlots(metaclass=Meta): pass FromMakeClass = attr.make_class("FromMakeClass", ["x"]) -class TestFunctional(object): +class TestFunctional: """ Functional tests. """ @@ -164,8 +161,7 @@ def test_validator(self, cls): # Using C1 explicitly, since slotted classes don't support this. assert ( - "'x' must be <{type} 'int'> (got '1' that is a <{type} " - "'str'>).".format(type=TYPE), + "'x' must be (got '1' that is a ).", attr.fields(C1).x, int, "1", @@ -178,7 +174,7 @@ def test_renaming(self, slots): """ @attr.s(slots=slots) - class C3(object): + class C3: _x = attr.ib() assert "C3(_x=1)" == repr(C3(x=1)) @@ -348,7 +344,7 @@ def test_default_decorator(self): """ @attr.s - class C(object): + class C: x = attr.ib(default=1) y = attr.ib() @@ -377,7 +373,7 @@ def test_dict_patch_class(self): dict-classes are never replaced. 
""" - class C(object): + class C: x = attr.ib() C_new = attr.s(C) @@ -392,7 +388,7 @@ def test_hash_by_id(self): """ @attr.s(hash=False) - class HashByIDBackwardCompat(object): + class HashByIDBackwardCompat: x = attr.ib() assert hash(HashByIDBackwardCompat(1)) != hash( @@ -400,13 +396,13 @@ class HashByIDBackwardCompat(object): ) @attr.s(hash=False, eq=False) - class HashByID(object): + class HashByID: x = attr.ib() assert hash(HashByID(1)) != hash(HashByID(1)) @attr.s(hash=True) - class HashByValues(object): + class HashByValues: x = attr.ib() assert hash(HashByValues(1)) == hash(HashByValues(1)) @@ -417,11 +413,11 @@ def test_handles_different_defaults(self): """ @attr.s - class Unhashable(object): + class Unhashable: pass @attr.s - class C(object): + class C: x = attr.ib(default=Unhashable()) @attr.s @@ -435,7 +431,7 @@ def test_hash_false_eq_false(self, slots): """ @attr.s(hash=False, eq=False, slots=slots) - class C(object): + class C: pass assert hash(C()) != hash(C()) @@ -447,7 +443,7 @@ def test_eq_false(self, slots): """ @attr.s(eq=False, slots=slots) - class C(object): + class C: pass # Ensure both objects live long enough such that their ids/hashes @@ -465,7 +461,7 @@ def test_overwrite_base(self): """ @attr.s - class C(object): + class C: c = attr.ib(default=100) x = attr.ib(default=1) b = attr.ib(default=23) @@ -512,7 +508,7 @@ def test_frozen_slots_combo( slots=base_slots, weakref_slot=base_weakref_slot, ) - class Base(object): + class Base: a = attr.ib(converter=int if base_converter else None) @attr.s( @@ -539,7 +535,7 @@ def test_tuple_class_aliasing(self): """ @attr.s - class C(object): + class C: property = attr.ib() itemgetter = attr.ib() x = attr.ib() @@ -630,64 +626,127 @@ def test_eq_only(self, slots, frozen): """ @attr.s(eq=True, order=False, slots=slots, frozen=frozen) - class C(object): + class C: x = attr.ib() - if not PY2: - possible_errors = ( - "unorderable types: C() < C()", - "'<' not supported between instances of 'C' and 'C'", - 
"unorderable types: C < C", # old PyPy 3 - ) + possible_errors = ( + "unorderable types: C() < C()", + "'<' not supported between instances of 'C' and 'C'", + "unorderable types: C < C", # old PyPy 3 + ) - with pytest.raises(TypeError) as ei: - C(5) < C(6) + with pytest.raises(TypeError) as ei: + C(5) < C(6) - assert ei.value.args[0] in possible_errors - else: - i = C(42) - for m in ("lt", "le", "gt", "ge"): - assert None is getattr(i, "__%s__" % (m,), None) - - @given(cmp=optional_bool, eq=optional_bool, order=optional_bool) - def test_cmp_deprecated_attribute(self, cmp, eq, order): - """ - Accessing Attribute.cmp raises a deprecation warning but returns True - if cmp is True, or eq and order are *both* effectively True. - """ - # These cases are invalid and raise a ValueError. - assume(cmp is None or (eq is None and order is None)) - assume(not (eq is False and order is True)) - - if cmp is not None: - rv = cmp - elif eq is True or eq is None: - rv = order is None or order is True - elif cmp is None and eq is None and order is None: - rv = True - elif cmp is None or eq is None: - rv = False - else: - pytest.fail( - "Unexpected state: cmp=%r eq=%r order=%r" % (cmp, eq, order) - ) + assert ei.value.args[0] in possible_errors - with pytest.deprecated_call() as dc: + @pytest.mark.parametrize("slots", [True, False]) + @pytest.mark.parametrize("cmp", [True, False]) + def test_attrib_cmp_shortcut(self, slots, cmp): + """ + Setting cmp on `attr.ib`s sets both eq and order. + """ - @attr.s - class C(object): - x = attr.ib(cmp=cmp, eq=eq, order=order) + @attr.s(slots=slots) + class C: + x = attr.ib(cmp=cmp) - assert rv == attr.fields(C).x.cmp + assert cmp is attr.fields(C).x.eq + assert cmp is attr.fields(C).x.order - if cmp is not None: - # Remove warning from creating the attribute if cmp is not None. 
- dc.pop() + @pytest.mark.parametrize("slots", [True, False]) + def test_no_setattr_if_validate_without_validators(self, slots): + """ + If a class has on_setattr=attr.setters.validate (former default in NG + APIs) but sets no validators, don't use the (slower) setattr in + __init__. - (w,) = dc.list + Regression test for #816. + """ - assert ( - "The usage of `cmp` is deprecated and will be removed on or after " - "2021-06-01. Please use `eq` and `order` instead." - == w.message.args[0] - ) + @attr.s(on_setattr=attr.setters.validate) + class C: + x = attr.ib() + + @attr.s(on_setattr=attr.setters.validate) + class D(C): + y = attr.ib() + + src = inspect.getsource(D.__init__) + + assert "setattr" not in src + assert "self.x = x" in src + assert "self.y = y" in src + assert object.__setattr__ == D.__setattr__ + + @pytest.mark.parametrize("slots", [True, False]) + def test_no_setattr_if_convert_without_converters(self, slots): + """ + If a class has on_setattr=attr.setters.convert but sets no validators, + don't use the (slower) setattr in __init__. + """ + + @attr.s(on_setattr=attr.setters.convert) + class C: + x = attr.ib() + + @attr.s(on_setattr=attr.setters.convert) + class D(C): + y = attr.ib() + + src = inspect.getsource(D.__init__) + + assert "setattr" not in src + assert "self.x = x" in src + assert "self.y = y" in src + assert object.__setattr__ == D.__setattr__ + + @pytest.mark.skipif(not PY36, reason="NG APIs are 3.6+") + @pytest.mark.parametrize("slots", [True, False]) + def test_no_setattr_with_ng_defaults(self, slots): + """ + If a class has the NG default on_setattr=[convert, validate] but sets + no validators or converters, don't use the (slower) setattr in + __init__. 
+ """ + + @attr.define + class C: + x = attr.ib() + + src = inspect.getsource(C.__init__) + + assert "setattr" not in src + assert "self.x = x" in src + assert object.__setattr__ == C.__setattr__ + + @attr.define + class D(C): + y = attr.ib() + + src = inspect.getsource(D.__init__) + + assert "setattr" not in src + assert "self.x = x" in src + assert "self.y = y" in src + assert object.__setattr__ == D.__setattr__ + + def test_on_setattr_detect_inherited_validators(self): + """ + _make_init detects the presence of a validator even if the field is + inherited. + """ + + @attr.s(on_setattr=attr.setters.validate) + class C: + x = attr.ib(validator=42) + + @attr.s(on_setattr=attr.setters.validate) + class D(C): + y = attr.ib() + + src = inspect.getsource(D.__init__) + + assert "_setattr(self, 'x', x)" in src + assert "_setattr(self, 'y', y)" in src + assert object.__setattr__ != D.__setattr__ diff --git a/tests/test_hooks.py b/tests/test_hooks.py index 1f58eef30..92fc2dcaa 100644 --- a/tests/test_hooks.py +++ b/tests/test_hooks.py @@ -1,3 +1,5 @@ +# SPDX-License-Identifier: MIT + from datetime import datetime from typing import Dict, List @@ -17,6 +19,7 @@ def test_hook_applied(self): results = [] def hook(cls, attribs): + attr.resolve_types(cls, attribs=attribs) results[:] = [(a.name, a.type) for a in attribs] return attribs @@ -36,6 +39,7 @@ def test_hook_applied_auto_attrib(self): results = [] def hook(cls, attribs): + attr.resolve_types(cls, attribs=attribs) results[:] = [(a.name, a.type) for a in attribs] return attribs @@ -52,6 +56,7 @@ def test_hook_applied_modify_attrib(self): """ def hook(cls, attribs): + attr.resolve_types(cls, attribs=attribs) return [a.evolve(converter=a.type) for a in attribs] @attr.s(auto_attribs=True, field_transformer=hook) @@ -68,6 +73,7 @@ def test_hook_remove_field(self): """ def hook(cls, attribs): + attr.resolve_types(cls, attribs=attribs) return [a for a in attribs if a.type is not int] @attr.s(auto_attribs=True, 
field_transformer=hook) @@ -113,6 +119,22 @@ class Sub(Base): assert attr.asdict(Sub(2)) == {"y": 2} + def test_attrs_attrclass(self): + """ + The list of attrs returned by a field_transformer is converted to + "AttrsClass" again. + + Regression test for #821. + """ + + @attr.s(auto_attribs=True, field_transformer=lambda c, a: list(a)) + class C: + x: int + + fields_type = type(attr.fields(C)) + assert fields_type.__name__ == "CAttributes" + assert issubclass(fields_type, tuple) + class TestAsDictHook: def test_asdict(self): diff --git a/tests/test_import.py b/tests/test_import.py new file mode 100644 index 000000000..9e90a5c11 --- /dev/null +++ b/tests/test_import.py @@ -0,0 +1,11 @@ +# SPDX-License-Identifier: MIT + + +class TestImportStar: + def test_from_attr_import_star(self): + """ + import * from attr + """ + # attr_import_star contains `from attr import *`, which cannot + # be done here because *-imports are only allowed on module level. + from . import attr_import_star # noqa: F401 diff --git a/tests/test_init_subclass.py b/tests/test_init_subclass.py index 2748655a0..863e79437 100644 --- a/tests/test_init_subclass.py +++ b/tests/test_init_subclass.py @@ -1,3 +1,5 @@ +# SPDX-License-Identifier: MIT + """ Tests for `__init_subclass__` related tests. diff --git a/tests/test_make.py b/tests/test_make.py index fad4ec7e2..96e07f333 100644 --- a/tests/test_make.py +++ b/tests/test_make.py @@ -1,8 +1,9 @@ +# SPDX-License-Identifier: MIT + """ Tests for `attr._make`. 
""" -from __future__ import absolute_import, division, print_function import copy import functools @@ -21,7 +22,7 @@ import attr from attr import _config -from attr._compat import PY2, ordered_dict +from attr._compat import PY310, ordered_dict from attr._make import ( Attribute, Factory, @@ -29,7 +30,8 @@ _Attributes, _ClassBuilder, _CountingAttr, - _determine_eq_order, + _determine_attrib_eq_order, + _determine_attrs_eq_order, _determine_whether_to_implement, _transform_attrs, and_, @@ -38,11 +40,7 @@ make_class, validate, ) -from attr.exceptions import ( - DefaultAlreadySetError, - NotAnAttrsClassError, - PythonTooOldError, -) +from attr.exceptions import DefaultAlreadySetError, NotAnAttrsClassError from .strategies import ( gen_attr_names, @@ -59,7 +57,7 @@ attrs_st = simple_attrs.map(lambda c: Attribute.from_counting_attr("name", c)) -class TestCountingAttr(object): +class TestCountingAttr: """ Tests for `attr`. """ @@ -148,7 +146,7 @@ def f(self): def make_tc(): - class TransformC(object): + class TransformC: z = attr.ib() y = attr.ib() x = attr.ib() @@ -157,7 +155,7 @@ class TransformC(object): return TransformC -class TestTransformAttrs(object): +class TestTransformAttrs: """ Tests for `_transform_attrs`. """ @@ -186,7 +184,7 @@ def test_empty(self): """ @attr.s - class C(object): + class C: pass assert _Attributes(((), [], {})) == _transform_attrs( @@ -212,7 +210,7 @@ def test_conflicting_defaults(self): mandatory attributes. """ - class C(object): + class C: x = attr.ib(default=None) y = attr.ib() @@ -222,7 +220,8 @@ class C(object): "No mandatory attributes allowed after an attribute with a " "default value or factory. 
Attribute in question: Attribute" "(name='y', default=NOTHING, validator=None, repr=True, " - "eq=True, order=True, hash=None, init=True, " + "eq=True, eq_key=None, order=True, order_key=None, " + "hash=None, init=True, " "metadata=mappingproxy({}), type=None, converter=None, " "kw_only=False, inherited=False, on_setattr=None)", ) == e.value.args @@ -231,13 +230,13 @@ def test_kw_only(self): """ Converts all attributes, including base class' attributes, if `kw_only` is provided. Therefore, `kw_only` allows attributes with defaults to - preceed mandatory attributes. + precede mandatory attributes. Updates in the subclass *don't* affect the base class attributes. """ @attr.s - class B(object): + class B: b = attr.ib() for b_a in B.__attrs_attrs__: @@ -265,7 +264,7 @@ def test_these(self): If these is passed, use it and ignore body and base classes. """ - class Base(object): + class Base: z = attr.ib() class C(Base): @@ -284,7 +283,7 @@ def test_these_leave_body(self): """ @attr.s(init=False, these={"x": attr.ib()}) - class C(object): + class C: x = 5 assert 5 == C().x @@ -299,20 +298,20 @@ def test_these_ordered(self): a = attr.ib(default=1) @attr.s(these=ordered_dict([("a", a), ("b", b)])) - class C(object): + class C: pass assert "C(a=1, b=2)" == repr(C()) def test_multiple_inheritance_old(self): """ - Old multiple inheritance attributre collection behavior is retained. + Old multiple inheritance attribute collection behavior is retained. 
See #285 """ @attr.s - class A(object): + class A: a1 = attr.ib(default="a1") a2 = attr.ib(default="a2") @@ -347,7 +346,7 @@ def test_overwrite_proper_mro(self): """ @attr.s(collect_by_mro=True) - class C(object): + class C: x = attr.ib(default=1) @attr.s(collect_by_mro=True) @@ -364,7 +363,7 @@ def test_multiple_inheritance_proper_mro(self): """ @attr.s - class A(object): + class A: a1 = attr.ib(default="a1") a2 = attr.ib(default="a2") @@ -400,19 +399,19 @@ def test_mro(self): See #428 """ - @attr.s - class A(object): + @attr.s(collect_by_mro=True) + class A: x = attr.ib(10) def xx(self): return 10 - @attr.s + @attr.s(collect_by_mro=True) class B(A): y = attr.ib(20) - @attr.s + @attr.s(collect_by_mro=True) class C(A): x = attr.ib(50) @@ -433,7 +432,7 @@ def test_inherited(self): """ @attr.s - class A(object): + class A: a = attr.ib() @attr.s @@ -457,31 +456,18 @@ class C(B): assert False is f(C).c.inherited -class TestAttributes(object): +class TestAttributes: """ Tests for the `attrs`/`attr.s` class decorator. """ - @pytest.mark.skipif(not PY2, reason="No old-style classes in Py3") - def test_catches_old_style(self): - """ - Raises TypeError on old-style classes. - """ - with pytest.raises(TypeError) as e: - - @attr.s - class C: - pass - - assert ("attrs only works with new-style classes.",) == e.value.args - def test_sets_attrs(self): """ Sets the `__attrs_attrs__` class attribute with a list of `Attribute`s. """ @attr.s - class C(object): + class C: x = attr.ib() assert "x" == C.__attrs_attrs__[0].name @@ -493,7 +479,7 @@ def test_empty(self): """ @attr.s - class C3(object): + class C3: pass assert "C3()" == repr(C3()) @@ -519,7 +505,7 @@ def test_adds_all_by_default(self, method_name): # overwritten afterwards. 
sentinel = object() - class C(object): + class C: x = attr.ib() setattr(C, method_name, sentinel) @@ -560,7 +546,7 @@ def test_respects_add_arguments(self, arg_name, method_name): if arg_name == "eq": am_args["order"] = False - class C(object): + class C: x = attr.ib() setattr(C, method_name, sentinel) @@ -569,7 +555,19 @@ class C(object): assert sentinel == getattr(C, method_name) - @pytest.mark.skipif(PY2, reason="__qualname__ is PY3-only.") + @pytest.mark.parametrize("init", [True, False]) + def test_respects_init_attrs_init(self, init): + """ + If init=False, adds __attrs_init__ to the class. + Otherwise, it does not. + """ + + class C: + x = attr.ib() + + C = attr.s(init=init)(C) + assert hasattr(C, "__attrs_init__") != init + @given(slots_outer=booleans(), slots_inner=booleans()) def test_repr_qualname(self, slots_outer, slots_inner): """ @@ -577,9 +575,9 @@ def test_repr_qualname(self, slots_outer, slots_inner): """ @attr.s(slots=slots_outer) - class C(object): + class C: @attr.s(slots=slots_inner) - class D(object): + class D: pass assert "C.D()" == repr(C.D()) @@ -592,14 +590,13 @@ def test_repr_fake_qualname(self, slots_outer, slots_inner): """ @attr.s(slots=slots_outer) - class C(object): + class C: @attr.s(repr_ns="C", slots=slots_inner) - class D(object): + class D: pass assert "C.D()" == repr(C.D()) - @pytest.mark.skipif(PY2, reason="__qualname__ is PY3-only.") @given(slots_outer=booleans(), slots_inner=booleans()) def test_name_not_overridden(self, slots_outer, slots_inner): """ @@ -607,14 +604,30 @@ def test_name_not_overridden(self, slots_outer, slots_inner): """ @attr.s(slots=slots_outer) - class C(object): + class C: @attr.s(slots=slots_inner) - class D(object): + class D: pass assert C.D.__name__ == "D" assert C.D.__qualname__ == C.__qualname__ + ".D" + @pytest.mark.parametrize("with_validation", [True, False]) + def test_pre_init(self, with_validation, monkeypatch): + """ + Verify that __attrs_pre_init__ gets called if defined. 
+        """
+        monkeypatch.setattr(_config, "_run_validators", with_validation)
+
+        @attr.s
+        class C:
+            def __attrs_pre_init__(self2):
+                self2.z = 30
+
+        c = C()
+
+        assert 30 == getattr(c, "z", None)
+
    @pytest.mark.parametrize("with_validation", [True, False])
    def test_post_init(self, with_validation, monkeypatch):
        """
@@ -623,7 +636,7 @@
        monkeypatch.setattr(_config, "_run_validators", with_validation)

        @attr.s
-        class C(object):
+        class C:
            x = attr.ib()
            y = attr.ib()

@@ -634,13 +647,34 @@ def __attrs_post_init__(self2):

        assert 30 == getattr(c, "z", None)

+    @pytest.mark.parametrize("with_validation", [True, False])
+    def test_pre_post_init_order(self, with_validation, monkeypatch):
+        """
+        Verify that __attrs_pre_init__ and __attrs_post_init__ get called in
+        the expected order.
+        """
+        monkeypatch.setattr(_config, "_run_validators", with_validation)
+
+        @attr.s
+        class C:
+            x = attr.ib()
+
+            def __attrs_pre_init__(self2):
+                self2.z = 30
+
+            def __attrs_post_init__(self2):
+                self2.z += self2.x
+
+        c = C(x=10)
+
+        assert 40 == getattr(c, "z", None)
+
    def test_types(self):
        """
        Sets the `Attribute.type` attr from type argument.
""" @attr.s - class C(object): + class C: x = attr.ib(type=int) y = attr.ib(type=str) z = attr.ib() @@ -656,7 +690,7 @@ def test_clean_class(self, slots): """ @attr.s(slots=slots) - class C(object): + class C: x = attr.ib() x = getattr(C, "x", None) @@ -669,7 +703,7 @@ def test_factory_sugar(self): """ @attr.s - class C(object): + class C: x = attr.ib(factory=list) assert Factory(list) == attr.fields(C).x.default @@ -681,7 +715,7 @@ def test_sugar_factory_mutex(self): with pytest.raises(ValueError, match="mutually exclusive"): @attr.s - class C(object): + class C: x = attr.ib(factory=list, default=Factory(list)) def test_sugar_callable(self): @@ -692,7 +726,7 @@ def test_sugar_callable(self): with pytest.raises(ValueError, match="must be a callable"): @attr.s - class C(object): + class C: x = attr.ib(factory=Factory(list)) def test_inherited_does_not_affect_hashing_and_equality(self): @@ -702,7 +736,7 @@ def test_inherited_does_not_affect_hashing_and_equality(self): """ @attr.s - class BaseClass(object): + class BaseClass: x = attr.ib() @attr.s @@ -716,7 +750,7 @@ class SubClass(BaseClass): assert hash(ba) == hash(sa) -class TestKeywordOnlyAttributes(object): +class TestKeywordOnlyAttributes: """ Tests for keyword-only attributes. 
""" @@ -727,7 +761,7 @@ def test_adds_keyword_only_arguments(self): """ @attr.s - class C(object): + class C: a = attr.ib() b = attr.ib(default=2, kw_only=True) c = attr.ib(kw_only=True) @@ -746,7 +780,7 @@ def test_ignores_kw_only_when_init_is_false(self): """ @attr.s - class C(object): + class C: x = attr.ib(init=False, default=0, kw_only=True) y = attr.ib() @@ -762,20 +796,15 @@ def test_keyword_only_attributes_presence(self): """ @attr.s - class C(object): + class C: x = attr.ib(kw_only=True) with pytest.raises(TypeError) as e: C() - if PY2: - assert ( - "missing required keyword-only argument: 'x'" - ) in e.value.args[0] - else: - assert ( - "missing 1 required keyword-only argument: 'x'" - ) in e.value.args[0] + assert ( + "missing 1 required keyword-only argument: 'x'" + ) in e.value.args[0] def test_keyword_only_attributes_unexpected(self): """ @@ -783,7 +812,7 @@ def test_keyword_only_attributes_unexpected(self): """ @attr.s - class C(object): + class C: x = attr.ib(kw_only=True) with pytest.raises(TypeError) as e: @@ -800,7 +829,7 @@ def test_keyword_only_attributes_can_come_in_any_order(self): """ @attr.s - class C(object): + class C: a = attr.ib(kw_only=True) b = attr.ib(kw_only=True, default="b") c = attr.ib(kw_only=True) @@ -829,7 +858,7 @@ def test_keyword_only_attributes_allow_subclassing(self): """ @attr.s - class Base(object): + class Base: x = attr.ib(default=0) @attr.s @@ -848,7 +877,7 @@ def test_keyword_only_class_level(self): """ @attr.s(kw_only=True) - class C(object): + class C: x = attr.ib() y = attr.ib(kw_only=True) @@ -867,7 +896,7 @@ def test_keyword_only_class_level_subclassing(self): """ @attr.s - class Base(object): + class Base: x = attr.ib(default=0) @attr.s(kw_only=True) @@ -890,7 +919,7 @@ def test_init_false_attribute_after_keyword_attribute(self): """ @attr.s - class KwArgBeforeInitFalse(object): + class KwArgBeforeInitFalse: kwarg = attr.ib(kw_only=True) non_init_function_default = attr.ib(init=False) non_init_keyword_default 
= attr.ib( @@ -918,7 +947,7 @@ def test_init_false_attribute_after_keyword_attribute_with_inheritance( """ @attr.s - class KwArgBeforeInitFalseParent(object): + class KwArgBeforeInitFalseParent: kwarg = attr.ib(kw_only=True) @attr.s @@ -939,34 +968,14 @@ def _init_to_init(self): assert c.non_init_keyword_default == "default-by-keyword" -@pytest.mark.skipif(not PY2, reason="PY2-specific keyword-only error behavior") -class TestKeywordOnlyAttributesOnPy2(object): - """ - Tests for keyword-only attribute behavior on py2. - """ - - def test_no_init(self): - """ - Keyworld-only is a no-op, not any error, if ``init=false``. - """ - - @attr.s(kw_only=True, init=False) - class ClassLevel(object): - a = attr.ib() - - @attr.s(init=False) - class AttrLevel(object): - a = attr.ib(kw_only=True) - - @attr.s -class GC(object): +class GC: @attr.s - class D(object): + class D: pass -class TestMakeClass(object): +class TestMakeClass: """ Tests for `make_class`. """ @@ -979,7 +988,7 @@ def test_simple(self, ls): C1 = make_class("C1", ls(["a", "b"])) @attr.s - class C2(object): + class C2: a = attr.ib() b = attr.ib() @@ -994,7 +1003,7 @@ def test_dict(self): ) @attr.s - class C2(object): + class C2: a = attr.ib(default=42) b = attr.ib(default=None) @@ -1022,7 +1031,7 @@ def test_bases(self): Parameter bases default to (object,) and subclasses correctly """ - class D(object): + class D: pass cls = make_class("C", {}) @@ -1066,8 +1075,23 @@ def test_make_class_ordered(self): assert "C(a=1, b=2)" == repr(C()) + def test_generic_dynamic_class(self): + """ + make_class can create generic dynamic classes. 
+ + https://github.com/python-attrs/attrs/issues/756 + https://bugs.python.org/issue33188 + """ + from types import new_class + from typing import Generic, TypeVar + + MyTypeVar = TypeVar("MyTypeVar") + MyParent = new_class("MyParent", (Generic[MyTypeVar],), {}) + + attr.make_class("test", {"id": attr.ib(type=str)}, (MyParent[int],)) + -class TestFields(object): +class TestFields: """ Tests for `fields`. """ @@ -1109,7 +1133,7 @@ def test_fields_properties(self, C): assert getattr(fields(C), attribute.name) is attribute -class TestFieldsDict(object): +class TestFieldsDict: """ Tests for `fields_dict`. """ @@ -1147,7 +1171,7 @@ def test_fields_dict(self, C): assert [a.name for a in fields(C)] == [field_name for field_name in d] -class TestConverter(object): +class TestConverter: """ Tests for attribute conversion. """ @@ -1260,7 +1284,7 @@ def test_frozen(self): C("1") -class TestValidate(object): +class TestValidate: """ Tests for `validate`. """ @@ -1358,7 +1382,7 @@ def test_multiple_empty(self): ) -class TestMetadata(object): +class TestMetadata: """ Tests for metadata handling. """ @@ -1453,7 +1477,7 @@ def test_metadata(self): assert md is a.metadata -class TestClassBuilder(object): +class TestClassBuilder: """ Tests for `_ClassBuilder`. """ @@ -1475,7 +1499,7 @@ def test_repr(self): repr of builder itself makes sense. """ - class C(object): + class C: pass b = _ClassBuilder( @@ -1502,7 +1526,7 @@ def test_returns_self(self): All methods return the builder for chaining. 
""" - class C(object): + class C: x = attr.ib() b = _ClassBuilder( @@ -1527,6 +1551,7 @@ class C(object): .add_order() .add_hash() .add_init() + .add_attrs_init() .add_repr("ns") .add_str() .build_class() @@ -1556,12 +1581,12 @@ def test_attaches_meta_dunders(self, meth_name): """ @attr.s(hash=True, str=True) - class C(object): + class C: def organic(self): pass @attr.s(hash=True, str=True) - class D(object): + class D: pass meth_C = getattr(C, meth_name) @@ -1569,11 +1594,10 @@ class D(object): assert meth_name == meth_C.__name__ == meth_D.__name__ assert C.organic.__module__ == meth_C.__module__ == meth_D.__module__ - if not PY2: - # This is assertion that would fail if a single __ne__ instance - # was reused across multiple _make_eq calls. - organic_prefix = C.organic.__qualname__.rsplit(".", 1)[0] - assert organic_prefix + "." + meth_name == meth_C.__qualname__ + # This is assertion that would fail if a single __ne__ instance + # was reused across multiple _make_eq calls. + organic_prefix = C.organic.__qualname__.rsplit(".", 1)[0] + assert organic_prefix + "." + meth_name == meth_C.__qualname__ def test_handles_missing_meta_on_class(self): """ @@ -1581,7 +1605,7 @@ def test_handles_missing_meta_on_class(self): either. 
""" - class C(object): + class C: pass b = _ClassBuilder( @@ -1620,7 +1644,7 @@ def test_weakref_setstate(self): """ @attr.s(slots=True) - class C(object): + class C: __weakref__ = attr.ib( init=False, hash=False, repr=False, eq=False, order=False ) @@ -1634,7 +1658,7 @@ def test_no_references_to_original(self): """ @attr.s(slots=True) - class C(object): + class C: pass @attr.s(slots=True) @@ -1677,7 +1701,7 @@ def test_copy(self, kwargs): """ @attr.s(eq=True, **kwargs) - class C(object): + class C: x = attr.ib() a = C(1) @@ -1692,7 +1716,7 @@ def test_copy_custom_setstate(self, kwargs): """ @attr.s(eq=True, **kwargs) - class C(object): + class C: x = attr.ib() def __getstate__(self): @@ -1721,7 +1745,7 @@ def test_subclasses_cannot_be_compared(self): """ @attr.s - class A(object): + class A: a = attr.ib() @attr.s @@ -1744,33 +1768,32 @@ class B(A): == a.__ge__(b) ) - if not PY2: - with pytest.raises(TypeError): - a <= b + with pytest.raises(TypeError): + a <= b - with pytest.raises(TypeError): - a >= b + with pytest.raises(TypeError): + a >= b - with pytest.raises(TypeError): - a < b + with pytest.raises(TypeError): + a < b - with pytest.raises(TypeError): - a > b + with pytest.raises(TypeError): + a > b -class TestDetermineEqOrder(object): +class TestDetermineAttrsEqOrder: def test_default(self): """ If all are set to None, set both eq and order to the passed default. """ - assert (42, 42) == _determine_eq_order(None, None, None, 42) + assert (42, 42) == _determine_attrs_eq_order(None, None, None, 42) @pytest.mark.parametrize("eq", [True, False]) def test_order_mirrors_eq_by_default(self, eq): """ If order is None, it mirrors eq. """ - assert (eq, eq) == _determine_eq_order(None, eq, None, True) + assert (eq, eq) == _determine_attrs_eq_order(None, eq, None, True) def test_order_without_eq(self): """ @@ -1779,7 +1802,7 @@ def test_order_without_eq(self): with pytest.raises( ValueError, match="`order` can only be True if `eq` is True too." 
        ):
-            _determine_eq_order(None, False, True, True)
+            _determine_attrs_eq_order(None, False, True, True)

    @given(cmp=booleans(), eq=optional_bool, order=optional_bool)
    def test_mix(self, cmp, eq, order):
@@ -1791,26 +1814,75 @@
        with pytest.raises(
            ValueError, match="Don't mix `cmp` with `eq' and `order`."
        ):
-            _determine_eq_order(cmp, eq, order, True)
+            _determine_attrs_eq_order(cmp, eq, order, True)
+

-    def test_cmp_deprecated(self):
+class TestDetermineAttribEqOrder:
+    def test_default(self):
        """
-        Passing a cmp that is not None raises a DeprecationWarning.
+        If all are set to None, set both eq and order to the passed default.
        """
-        with pytest.deprecated_call() as dc:
+        assert (42, None, 42, None) == _determine_attrib_eq_order(
+            None, None, None, 42
+        )

-            @attr.s(cmp=True)
-            class C(object):
-                pass
+    def test_eq_callable_order_boolean(self):
+        """
+        eq=callable or order=callable need to be transformed into eq/eq_key
+        or order/order_key.
+        """
+        assert (True, str.lower, False, None) == _determine_attrib_eq_order(
+            None, str.lower, False, True
+        )

-        (w,) = dc.list
+    def test_eq_callable_order_callable(self):
+        """
+        eq=callable or order=callable need to be transformed into eq/eq_key
+        or order/order_key.
+        """
+        assert (True, str.lower, True, abs) == _determine_attrib_eq_order(
+            None, str.lower, abs, True
+        )

-        assert (
-            "The usage of `cmp` is deprecated and will be removed on or after "
-            "2021-06-01. Please use `eq` and `order` instead."
-            == w.message.args[0]
+    def test_eq_boolean_order_callable(self):
+        """
+        eq=callable or order=callable need to be transformed into eq/eq_key
+        or order/order_key.
+        """
+        assert (True, None, True, str.lower) == _determine_attrib_eq_order(
+            None, True, str.lower, True
        )
+
+    @pytest.mark.parametrize("eq", [True, False])
+    def test_order_mirrors_eq_by_default(self, eq):
+        """
+        If order is None, it mirrors eq.
+ """ + assert (eq, None, eq, None) == _determine_attrib_eq_order( + None, eq, None, True ) + def test_order_without_eq(self): + """ + eq=False, order=True raises a meaningful ValueError. + """ + with pytest.raises( + ValueError, match="`order` can only be True if `eq` is True too." + ): + _determine_attrib_eq_order(None, False, True, True) + + @given(cmp=booleans(), eq=optional_bool, order=optional_bool) + def test_mix(self, cmp, eq, order): + """ + If cmp is not None, eq and order must be None and vice versa. + """ + assume(eq is not None or order is not None) + + with pytest.raises( + ValueError, match="Don't mix `cmp` with `eq' and `order`." + ): + _determine_attrib_eq_order(cmp, eq, order, True) + class TestDocs: @pytest.mark.parametrize( @@ -1833,7 +1905,7 @@ def test_docs(self, meth_name): """ @attr.s - class A(object): + class A: pass if hasattr(A, "__qualname__"): @@ -1844,24 +1916,14 @@ class A(object): assert expected == method.__doc__ -@pytest.mark.skipif(not PY2, reason="Needs to be only caught on Python 2.") -def test_auto_detect_raises_on_py2(): - """ - Trying to pass auto_detect=True to attr.s raises PythonTooOldError. 
- """ - with pytest.raises(PythonTooOldError): - attr.s(auto_detect=True) - - -class BareC(object): +class BareC: pass -class BareSlottedC(object): +class BareSlottedC: __slots__ = () -@pytest.mark.skipif(PY2, reason="Auto-detection is Python 3-only.") class TestAutoDetect: @pytest.mark.parametrize("C", (BareC, BareSlottedC)) def test_determine_detects_non_presence_correctly(self, C): @@ -1890,7 +1952,7 @@ def test_make_all_by_default(self, slots, frozen): """ @attr.s(auto_detect=True, slots=slots, frozen=frozen) - class C(object): + class C: x = attr.ib() i = C(1) @@ -1913,7 +1975,7 @@ def test_detect_auto_init(self, slots, frozen): """ @attr.s(auto_detect=True, slots=slots, frozen=frozen) - class CI(object): + class CI: x = attr.ib() def __init__(self): @@ -1929,7 +1991,7 @@ def test_detect_auto_repr(self, slots, frozen): """ @attr.s(auto_detect=True, slots=slots, frozen=frozen) - class C(object): + class C: x = attr.ib() def __repr__(self): @@ -1937,6 +1999,27 @@ def __repr__(self): assert "hi" == repr(C(42)) + @pytest.mark.parametrize("slots", [True, False]) + @pytest.mark.parametrize("frozen", [True, False]) + def test_hash_uses_eq(self, slots, frozen): + """ + If eq is passed in, then __hash__ should use the eq callable + to generate the hash code. + """ + + @attr.s(slots=slots, frozen=frozen, hash=True) + class C: + x = attr.ib(eq=str) + + @attr.s(slots=slots, frozen=frozen, hash=True) + class D: + x = attr.ib() + + # These hashes should be the same because 1 is turned into + # string before hashing. 
+ assert hash(C("1")) == hash(C(1)) + assert hash(D("1")) != hash(D(1)) + @pytest.mark.parametrize("slots", [True, False]) @pytest.mark.parametrize("frozen", [True, False]) def test_detect_auto_hash(self, slots, frozen): @@ -1945,7 +2028,7 @@ def test_detect_auto_hash(self, slots, frozen): """ @attr.s(auto_detect=True, slots=slots, frozen=frozen) - class C(object): + class C: x = attr.ib() def __hash__(self): @@ -1961,7 +2044,7 @@ def test_detect_auto_eq(self, slots, frozen): """ @attr.s(auto_detect=True, slots=slots, frozen=frozen) - class C(object): + class C: x = attr.ib() def __eq__(self, o): @@ -1971,7 +2054,7 @@ def __eq__(self, o): C(1) == C(1) @attr.s(auto_detect=True, slots=slots, frozen=frozen) - class D(object): + class D: x = attr.ib() def __ne__(self, o): @@ -2007,19 +2090,19 @@ def assert_none_set(cls, ex): assert_not_set(cls, ex, "__" + m + "__") @attr.s(auto_detect=True, slots=slots, frozen=frozen) - class LE(object): + class LE: __le__ = 42 @attr.s(auto_detect=True, slots=slots, frozen=frozen) - class LT(object): + class LT: __lt__ = 42 @attr.s(auto_detect=True, slots=slots, frozen=frozen) - class GE(object): + class GE: __ge__ = 42 @attr.s(auto_detect=True, slots=slots, frozen=frozen) - class GT(object): + class GT: __gt__ = 42 assert_none_set(LE, "__le__") @@ -2035,7 +2118,7 @@ def test_override_init(self, slots, frozen): """ @attr.s(init=True, auto_detect=True, slots=slots, frozen=frozen) - class C(object): + class C: x = attr.ib() def __init__(self): @@ -2051,7 +2134,7 @@ def test_override_repr(self, slots, frozen): """ @attr.s(repr=True, auto_detect=True, slots=slots, frozen=frozen) - class C(object): + class C: x = attr.ib() def __repr__(self): @@ -2067,7 +2150,7 @@ def test_override_hash(self, slots, frozen): """ @attr.s(hash=True, auto_detect=True, slots=slots, frozen=frozen) - class C(object): + class C: x = attr.ib() def __hash__(self): @@ -2083,7 +2166,7 @@ def test_override_eq(self, slots, frozen): """ @attr.s(eq=True, auto_detect=True, 
slots=slots, frozen=frozen) - class C(object): + class C: x = attr.ib() def __eq__(self, o): @@ -2105,7 +2188,7 @@ def __ne__(self, o): (None, None, True), ], ) - def test_override_order(self, slots, frozen, eq, order, cmp, recwarn): + def test_override_order(self, slots, frozen, eq, order, cmp): """ If order=True is passed, ignore __le__, __lt__, __gt__, __ge__. @@ -2123,7 +2206,7 @@ def meth(self, o): slots=slots, frozen=frozen, ) - class C(object): + class C: x = attr.ib() __le__ = __lt__ = __gt__ = __ge__ = meth @@ -2132,11 +2215,6 @@ class C(object): assert C(2) > C(1) assert C(2) >= C(1) - if cmp: - assert 1 == len(recwarn.list) - else: - assert 0 == len(recwarn.list) - @pytest.mark.parametrize("slots", [True, False]) @pytest.mark.parametrize("first", [True, False]) def test_total_ordering(self, slots, first): @@ -2147,7 +2225,7 @@ def test_total_ordering(self, slots, first): Ensure the order doesn't matter. """ - class C(object): + class C: x = attr.ib() own_eq_called = attr.ib(default=False) own_le_called = attr.ib(default=False) @@ -2193,14 +2271,16 @@ def test_detects_setstate_getstate(self, slots): """ @attr.s(slots=slots, auto_detect=True) - class C(object): + class C: def __getstate__(self): return ("hi",) - assert None is getattr(C(), "__setstate__", None) + assert getattr(object, "__setstate__", None) is getattr( + C, "__setstate__", None + ) @attr.s(slots=slots, auto_detect=True) - class C(object): + class C: called = attr.ib(False) def __setstate__(self, state): @@ -2213,4 +2293,137 @@ def __setstate__(self, state): i.__setstate__(()) assert True is i.called - assert None is getattr(C(), "__getstate__", None) + assert getattr(object, "__getstate__", None) is getattr( + C, "__getstate__", None + ) + + @pytest.mark.skipif(PY310, reason="Pre-3.10 only.") + def test_match_args_pre_310(self): + """ + __match_args__ is not created on Python versions older than 3.10. 
+ """ + + @attr.s + class C: + a = attr.ib() + + assert None is getattr(C, "__match_args__", None) + + +@pytest.mark.skipif(not PY310, reason="Structural pattern matching is 3.10+") +class TestMatchArgs: + """ + Tests for match_args and __match_args__ generation. + """ + + def test_match_args(self): + """ + __match_args__ is created by default on Python 3.10. + """ + + @attr.define + class C: + a = attr.field() + + assert ("a",) == C.__match_args__ + + def test_explicit_match_args(self): + """ + A custom __match_args__ set is not overwritten. + """ + + ma = () + + @attr.define + class C: + a = attr.field() + __match_args__ = ma + + assert C(42).__match_args__ is ma + + @pytest.mark.parametrize("match_args", [True, False]) + def test_match_args_attr_set(self, match_args): + """ + __match_args__ is set depending on match_args. + """ + + @attr.define(match_args=match_args) + class C: + a = attr.field() + + if match_args: + assert hasattr(C, "__match_args__") + else: + assert not hasattr(C, "__match_args__") + + def test_match_args_kw_only(self): + """ + kw_only classes don't generate __match_args__. + kw_only fields are not included in __match_args__. + """ + + @attr.define + class C: + a = attr.field(kw_only=True) + b = attr.field() + + assert C.__match_args__ == ("b",) + + @attr.define(kw_only=True) + class C: + a = attr.field() + b = attr.field() + + assert C.__match_args__ == () + + def test_match_args_argument(self): + """ + match_args being False with inheritance. 
+ """ + + @attr.define(match_args=False) + class X: + a = attr.field() + + assert "__match_args__" not in X.__dict__ + + @attr.define(match_args=False) + class Y: + a = attr.field() + __match_args__ = ("b",) + + assert Y.__match_args__ == ("b",) + + @attr.define(match_args=False) + class Z(Y): + z = attr.field() + + assert Z.__match_args__ == ("b",) + + @attr.define + class A: + a = attr.field() + z = attr.field() + + @attr.define(match_args=False) + class B(A): + b = attr.field() + + assert B.__match_args__ == ("a", "z") + + def test_make_class(self): + """ + match_args generation with make_class. + """ + + C1 = make_class("C1", ["a", "b"]) + assert ("a", "b") == C1.__match_args__ + + C1 = make_class("C1", ["a", "b"], match_args=False) + assert not hasattr(C1, "__match_args__") + + C1 = make_class("C1", ["a", "b"], kw_only=True) + assert () == C1.__match_args__ + + C1 = make_class("C1", {"a": attr.ib(kw_only=True), "b": attr.ib()}) + assert ("b",) == C1.__match_args__ diff --git a/tests/test_mypy.yml b/tests/test_mypy.yml new file mode 100644 index 000000000..6759dc1a8 --- /dev/null +++ b/tests/test_mypy.yml @@ -0,0 +1,1373 @@ +- case: attr_s_with_type_argument + parametrized: + - val: "a = attr.ib(type=int)" + - val: "a: int = attr.ib()" + main: | + import attr + @attr.s + class C: + {{ val }} + C() # E: Missing positional argument "a" in call to "C" + C(1) + C(a=1) + C(a="hi") # E: Argument "a" to "C" has incompatible type "str"; expected "int" +- case: attr_s_with_type_annotations + main: | + import attr + @attr.s + class C: + a: int = attr.ib() + C() # E: Missing positional argument "a" in call to "C" + C(1) + C(a=1) + C(a="hi") # E: Argument "a" to "C" has incompatible type "str"; expected "int" + +- case: testAttrsSimple + main: | + import attr + @attr.s + class A: + a = attr.ib() + _b = attr.ib() + c = attr.ib(18) + _d = attr.ib(validator=None, default=18) + E = 18 + + def foo(self): + return self.a + reveal_type(A) # N: Revealed type is "def (a: Any, b: 
Any, c: Any =, d: Any =) -> main.A" + A(1, [2]) + A(1, [2], '3', 4) + A(1, 2, 3, 4) + A(1, [2], '3', 4, 5) # E: Too many arguments for "A" + +- case: testAttrsAnnotated + main: | + import attr + from typing import List, ClassVar + @attr.s + class A: + a: int = attr.ib() + _b: List[int] = attr.ib() + c: str = attr.ib('18') + _d: int = attr.ib(validator=None, default=18) + E = 7 + F: ClassVar[int] = 22 + reveal_type(A) # N: Revealed type is "def (a: builtins.int, b: builtins.list[builtins.int], c: builtins.str =, d: builtins.int =) -> main.A" + A(1, [2]) + A(1, [2], '3', 4) + A(1, 2, 3, 4) # E: Argument 2 to "A" has incompatible type "int"; expected "List[int]" # E: Argument 3 to "A" has incompatible type "int"; expected "str" + A(1, [2], '3', 4, 5) # E: Too many arguments for "A" + +- case: testAttrsPython2Annotations + main: | + import attr + from typing import List, ClassVar + @attr.s + class A: + a = attr.ib() # type: int + _b = attr.ib() # type: List[int] + c = attr.ib('18') # type: str + _d = attr.ib(validator=None, default=18) # type: int + E = 7 + F: ClassVar[int] = 22 + reveal_type(A) # N: Revealed type is "def (a: builtins.int, b: builtins.list[builtins.int], c: builtins.str =, d: builtins.int =) -> main.A" + A(1, [2]) + A(1, [2], '3', 4) + A(1, 2, 3, 4) # E: Argument 2 to "A" has incompatible type "int"; expected "List[int]" # E: Argument 3 to "A" has incompatible type "int"; expected "str" + A(1, [2], '3', 4, 5) # E: Too many arguments for "A" + +- case: testAttrsAutoAttribs + main: | + import attr + from typing import List, ClassVar + @attr.s(auto_attribs=True) + class A: + a: int + _b: List[int] + c: str = '18' + _d: int = attr.ib(validator=None, default=18) + E = 7 + F: ClassVar[int] = 22 + reveal_type(A) # N: Revealed type is "def (a: builtins.int, b: builtins.list[builtins.int], c: builtins.str =, d: builtins.int =) -> main.A" + A(1, [2]) + A(1, [2], '3', 4) + A(1, 2, 3, 4) # E: Argument 2 to "A" has incompatible type "int"; expected "List[int]" # E: 
Argument 3 to "A" has incompatible type "int"; expected "str" + A(1, [2], '3', 4, 5) # E: Too many arguments for "A" + +- case: testAttrsUntypedNoUntypedDefs + mypy_config: | + disallow_untyped_defs = True + main: | + import attr + @attr.s + class A: + a = attr.ib() # E: Need type annotation for "a" + _b = attr.ib() # E: Need type annotation for "_b" + c = attr.ib(18) # E: Need type annotation for "c" + _d = attr.ib(validator=None, default=18) # E: Need type annotation for "_d" + E = 18 + +- case: testAttrsWrongReturnValue + main: | + import attr + @attr.s + class A: + x: int = attr.ib(8) + def foo(self) -> str: + return self.x # E: Incompatible return value type (got "int", expected "str") + @attr.s + class B: + x = attr.ib(8) # type: int + def foo(self) -> str: + return self.x # E: Incompatible return value type (got "int", expected "str") + @attr.dataclass + class C: + x: int = 8 + def foo(self) -> str: + return self.x # E: Incompatible return value type (got "int", expected "str") + @attr.s + class D: + x = attr.ib(8, type=int) + def foo(self) -> str: + return self.x # E: Incompatible return value type (got "int", expected "str") + +- case: testAttrsSeriousNames + main: | + from attr import attrib, attrs + from typing import List + @attrs(init=True) + class A: + a = attrib() + _b: List[int] = attrib() + c = attrib(18) + _d = attrib(validator=None, default=18) + CLASS_VAR = 18 + reveal_type(A) # N: Revealed type is "def (a: Any, b: builtins.list[builtins.int], c: Any =, d: Any =) -> main.A" + A(1, [2]) + A(1, [2], '3', 4) + A(1, 2, 3, 4) # E: Argument 2 to "A" has incompatible type "int"; expected "List[int]" + A(1, [2], '3', 4, 5) # E: Too many arguments for "A" + +- case: testAttrsDefaultErrors + main: | + import attr + @attr.s + class A: + x = attr.ib(default=17) + y = attr.ib() # E: Non-default attributes not allowed after default attributes. 
+ @attr.s(auto_attribs=True) + class B: + x: int = 17 + y: int # E: Non-default attributes not allowed after default attributes. + @attr.s(auto_attribs=True) + class C: + x: int = attr.ib(default=17) + y: int # E: Non-default attributes not allowed after default attributes. + @attr.s + class D: + x = attr.ib() + y = attr.ib() # E: Non-default attributes not allowed after default attributes. + + @x.default + def foo(self): + return 17 + +- case: testAttrsNotBooleans + main: | + import attr + x = True + @attr.s(cmp=x) # E: "cmp" argument must be True or False. + class A: + a = attr.ib(init=x) # E: "init" argument must be True or False. + +- case: testAttrsInitFalse + main: | + from attr import attrib, attrs + @attrs(auto_attribs=True, init=False) + class A: + a: int + _b: int + c: int = 18 + _d: int = attrib(validator=None, default=18) + reveal_type(A) # N: Revealed type is "def () -> main.A" + A() + A(1, [2]) # E: Too many arguments for "A" + A(1, [2], '3', 4) # E: Too many arguments for "A" + +- case: testAttrsInitAttribFalse + main: | + from attr import attrib, attrs + @attrs + class A: + a = attrib(init=False) + b = attrib() + reveal_type(A) # N: Revealed type is "def (b: Any) -> main.A" + +- case: testAttrsCmpTrue + main: | + from attr import attrib, attrs + @attrs(auto_attribs=True) + class A: + a: int + reveal_type(A) # N: Revealed type is "def (a: builtins.int) -> main.A" + reveal_type(A.__lt__) # N: Revealed type is "def [_AT] (self: _AT`-1, other: _AT`-1) -> builtins.bool" + reveal_type(A.__le__) # N: Revealed type is "def [_AT] (self: _AT`-1, other: _AT`-1) -> builtins.bool" + reveal_type(A.__gt__) # N: Revealed type is "def [_AT] (self: _AT`-1, other: _AT`-1) -> builtins.bool" + reveal_type(A.__ge__) # N: Revealed type is "def [_AT] (self: _AT`-1, other: _AT`-1) -> builtins.bool" + + A(1) < A(2) + A(1) <= A(2) + A(1) > A(2) + A(1) >= A(2) + A(1) == A(2) + A(1) != A(2) + + A(1) < 1 # E: Unsupported operand types for < ("A" and "int") + A(1) <= 1 # E: 
Unsupported operand types for <= ("A" and "int") + A(1) > 1 # E: Unsupported operand types for > ("A" and "int") + A(1) >= 1 # E: Unsupported operand types for >= ("A" and "int") + A(1) == 1 + A(1) != 1 + + 1 < A(1) # E: Unsupported operand types for < ("int" and "A") + 1 <= A(1) # E: Unsupported operand types for <= ("int" and "A") + 1 > A(1) # E: Unsupported operand types for > ("int" and "A") + 1 >= A(1) # E: Unsupported operand types for >= ("int" and "A") + 1 == A(1) + 1 != A(1) + +- case: testAttrsEqFalse + main: | + from attr import attrib, attrs + @attrs(auto_attribs=True, eq=False) + class A: + a: int + reveal_type(A) # N: Revealed type is "def (a: builtins.int) -> main.A" + reveal_type(A.__eq__) # N: Revealed type is "def (builtins.object, builtins.object) -> builtins.bool" + reveal_type(A.__ne__) # N: Revealed type is "def (builtins.object, builtins.object) -> builtins.bool" + + A(1) < A(2) # E: Unsupported left operand type for < ("A") + A(1) <= A(2) # E: Unsupported left operand type for <= ("A") + A(1) > A(2) # E: Unsupported left operand type for > ("A") + A(1) >= A(2) # E: Unsupported left operand type for >= ("A") + A(1) == A(2) + A(1) != A(2) + + A(1) < 1 # E: Unsupported operand types for > ("int" and "A") + A(1) <= 1 # E: Unsupported operand types for >= ("int" and "A") + A(1) > 1 # E: Unsupported operand types for < ("int" and "A") + A(1) >= 1 # E: Unsupported operand types for <= ("int" and "A") + A(1) == 1 + A(1) != 1 + + 1 < A(1) # E: Unsupported operand types for < ("int" and "A") + 1 <= A(1) # E: Unsupported operand types for <= ("int" and "A") + 1 > A(1) # E: Unsupported operand types for > ("int" and "A") + 1 >= A(1) # E: Unsupported operand types for >= ("int" and "A") + 1 == A(1) + 1 != A(1) + +- case: testAttrsOrderFalse + main: | + from attr import attrib, attrs + @attrs(auto_attribs=True, order=False) + class A: + a: int + reveal_type(A) # N: Revealed type is "def (a: builtins.int) -> main.A" + + A(1) < A(2) # E: Unsupported left 
operand type for < ("A") + A(1) <= A(2) # E: Unsupported left operand type for <= ("A") + A(1) > A(2) # E: Unsupported left operand type for > ("A") + A(1) >= A(2) # E: Unsupported left operand type for >= ("A") + A(1) == A(2) + A(1) != A(2) + + A(1) < 1 # E: Unsupported operand types for > ("int" and "A") + A(1) <= 1 # E: Unsupported operand types for >= ("int" and "A") + A(1) > 1 # E: Unsupported operand types for < ("int" and "A") + A(1) >= 1 # E: Unsupported operand types for <= ("int" and "A") + A(1) == 1 + A(1) != 1 + + 1 < A(1) # E: Unsupported operand types for < ("int" and "A") + 1 <= A(1) # E: Unsupported operand types for <= ("int" and "A") + 1 > A(1) # E: Unsupported operand types for > ("int" and "A") + 1 >= A(1) # E: Unsupported operand types for >= ("int" and "A") + 1 == A(1) + 1 != A(1) + +- case: testAttrsCmpEqOrderValues + main: | + from attr import attrib, attrs + @attrs(cmp=True) + class DeprecatedTrue: + ... + + @attrs(cmp=False) + class DeprecatedFalse: + ... + + @attrs(cmp=False, eq=True) # E: Don't mix "cmp" with "eq" and "order" + class Mixed: + ... + + @attrs(order=True, eq=False) # E: eq must be True if order is True + class Confused: + ... 
+ +- case: testAttrsInheritance + main: | + import attr + @attr.s + class A: + a: int = attr.ib() + @attr.s + class B: + b: str = attr.ib() + @attr.s + class C(A, B): + c: bool = attr.ib() + reveal_type(C) # N: Revealed type is "def (a: builtins.int, b: builtins.str, c: builtins.bool) -> main.C" + +- case: testAttrsNestedInClasses + main: | + import attr + @attr.s + class C: + y = attr.ib() + @attr.s + class D: + x: int = attr.ib() + reveal_type(C) # N: Revealed type is "def (y: Any) -> main.C" + reveal_type(C.D) # N: Revealed type is "def (x: builtins.int) -> main.C.D" + +- case: testAttrsInheritanceOverride + main: | + import attr + + @attr.s + class A: + a: int = attr.ib() + x: int = attr.ib() + + @attr.s + class B(A): + b: str = attr.ib() + x: int = attr.ib(default=22) + + @attr.s + class C(B): + c: bool = attr.ib() # No error here because the x below overwrites the x above. + x: int = attr.ib() + + reveal_type(A) # N: Revealed type is "def (a: builtins.int, x: builtins.int) -> main.A" + reveal_type(B) # N: Revealed type is "def (a: builtins.int, b: builtins.str, x: builtins.int =) -> main.B" + reveal_type(C) # N: Revealed type is "def (a: builtins.int, b: builtins.str, c: builtins.bool, x: builtins.int) -> main.C" + +- case: testAttrsTypeEquals + main: | + import attr + + @attr.s + class A: + a = attr.ib(type=int) + b = attr.ib(18, type=int) + reveal_type(A) # N: Revealed type is "def (a: builtins.int, b: builtins.int =) -> main.A" + +- case: testAttrsFrozen + main: | + import attr + + @attr.s(frozen=True) + class A: + a = attr.ib() + + a = A(5) + a.a = 16 # E: Property "a" defined in "A" is read-only +- case: testAttrsNextGenFrozen + main: | + from attr import frozen, field + + @frozen + class A: + a = field() + + a = A(5) + a.a = 16 # E: Property "a" defined in "A" is read-only + +- case: testAttrsNextGenDetect + main: | + from attr import define, field + + @define + class A: + a = field() + + @define + class B: + a: int + + @define + class C: + a: int = 
field() + b = field() + + @define + class D: + a: int + b = field() + + reveal_type(A) # N: Revealed type is "def (a: Any) -> main.A" + reveal_type(B) # N: Revealed type is "def (a: builtins.int) -> main.B" + reveal_type(C) # N: Revealed type is "def (a: builtins.int, b: Any) -> main.C" + reveal_type(D) # N: Revealed type is "def (b: Any) -> main.D" + +- case: testAttrsDataClass + main: | + import attr + from typing import List, ClassVar + @attr.dataclass + class A: + a: int + _b: List[str] + c: str = '18' + _d: int = attr.ib(validator=None, default=18) + E = 7 + F: ClassVar[int] = 22 + reveal_type(A) # N: Revealed type is "def (a: builtins.int, b: builtins.list[builtins.str], c: builtins.str =, d: builtins.int =) -> main.A" + A(1, ['2']) + +- case: testAttrsTypeAlias + main: | + from typing import List + import attr + Alias = List[int] + @attr.s(auto_attribs=True) + class A: + Alias2 = List[str] + x: Alias + y: Alias2 = attr.ib() + reveal_type(A) # N: Revealed type is "def (x: builtins.list[builtins.int], y: builtins.list[builtins.str]) -> main.A" + +- case: testAttrsGeneric + main: | + from typing import TypeVar, Generic, List + import attr + T = TypeVar('T') + @attr.s(auto_attribs=True) + class A(Generic[T]): + x: List[T] + y: T = attr.ib() + def foo(self) -> List[T]: + return [self.y] + def bar(self) -> T: + return self.x[0] + def problem(self) -> T: + return self.x # E: Incompatible return value type (got "List[T]", expected "T") + reveal_type(A) # N: Revealed type is "def [T] (x: builtins.list[T`1], y: T`1) -> main.A[T`1]" + a = A([1], 2) + reveal_type(a) # N: Revealed type is "main.A[builtins.int]" + reveal_type(a.x) # N: Revealed type is "builtins.list[builtins.int]" + reveal_type(a.y) # N: Revealed type is "builtins.int" + + A(['str'], 7) # E: Cannot infer type argument 1 of "A" + A([1], '2') # E: Cannot infer type argument 1 of "A" + +- case: testAttrsUntypedGenericInheritance + main: | + from typing import Generic, TypeVar + import attr + + T = 
TypeVar("T") + + @attr.s(auto_attribs=True) + class Base(Generic[T]): + attr: T + + @attr.s(auto_attribs=True) + class Sub(Base): + pass + + sub = Sub(attr=1) + reveal_type(sub) # N: Revealed type is "main.Sub" + reveal_type(sub.attr) # N: Revealed type is "Any" + skip: True # Need to investigate why this is broken + +- case: testAttrsGenericInheritance + main: | + from typing import Generic, TypeVar + import attr + + S = TypeVar("S") + T = TypeVar("T") + + @attr.s(auto_attribs=True) + class Base(Generic[T]): + attr: T + + @attr.s(auto_attribs=True) + class Sub(Base[S]): + pass + + sub_int = Sub[int](attr=1) + reveal_type(sub_int) # N: Revealed type is "main.Sub[builtins.int]" + reveal_type(sub_int.attr) # N: Revealed type is "builtins.int" + + sub_str = Sub[str](attr='ok') + reveal_type(sub_str) # N: Revealed type is "main.Sub[builtins.str]" + reveal_type(sub_str.attr) # N: Revealed type is "builtins.str" + +- case: testAttrsGenericInheritance2 + main: | + from typing import Generic, TypeVar + import attr + + T1 = TypeVar("T1") + T2 = TypeVar("T2") + T3 = TypeVar("T3") + + @attr.s(auto_attribs=True) + class Base(Generic[T1, T2, T3]): + one: T1 + two: T2 + three: T3 + + @attr.s(auto_attribs=True) + class Sub(Base[int, str, float]): + pass + + sub = Sub(one=1, two='ok', three=3.14) + reveal_type(sub) # N: Revealed type is "main.Sub" + reveal_type(sub.one) # N: Revealed type is "builtins.int*" + reveal_type(sub.two) # N: Revealed type is "builtins.str*" + reveal_type(sub.three) # N: Revealed type is "builtins.float*" + skip: True # Need to investigate why this is broken + +- case: testAttrsMultiGenericInheritance + main: | + from typing import Generic, TypeVar + import attr + + T = TypeVar("T") + + @attr.s(auto_attribs=True, eq=False) + class Base(Generic[T]): + base_attr: T + + S = TypeVar("S") + + @attr.s(auto_attribs=True, eq=False) + class Middle(Base[int], Generic[S]): + middle_attr: S + + @attr.s(auto_attribs=True, eq=False) + class Sub(Middle[str]): + pass + + 
reveal_type(Sub.__init__) + + sub = Sub(base_attr=1, middle_attr='ok') + reveal_type(sub) # N: Revealed type is "main.Sub" + reveal_type(sub.base_attr) # N: Revealed type is "builtins.int*" + reveal_type(sub.middle_attr) # N: Revealed type is "builtins.str*" + skip: True # Need to investigate why this is broken + +- case: testAttrsGenericClassmethod + main: | + from typing import TypeVar, Generic, Optional + import attr + T = TypeVar('T') + @attr.s(auto_attribs=True) + class A(Generic[T]): + x: Optional[T] + @classmethod + def clsmeth(cls) -> None: + reveal_type(cls) # N: Revealed type is "Type[main.A[T`1]]" + +- case: testAttrsForwardReference + main: | + from typing import Optional + import attr + @attr.s(auto_attribs=True) + class A: + parent: 'B' + + @attr.s(auto_attribs=True) + class B: + parent: Optional[A] + + reveal_type(A) # N: Revealed type is "def (parent: main.B) -> main.A" + reveal_type(B) # N: Revealed type is "def (parent: Union[main.A, None]) -> main.B" + A(B(None)) + +- case: testAttrsForwardReferenceInClass + main: | + from typing import Optional + import attr + @attr.s(auto_attribs=True) + class A: + parent: A.B + + @attr.s(auto_attribs=True) + class B: + parent: Optional[A] + + reveal_type(A) # N: Revealed type is "def (parent: main.A.B) -> main.A" + reveal_type(A.B) # N: Revealed type is "def (parent: Union[main.A, None]) -> main.A.B" + A(A.B(None)) + +- case: testAttrsImporting + main: | + from helper import A + reveal_type(A) # N: Revealed type is "def (a: builtins.int, b: builtins.str) -> helper.A" + files: + - path: helper.py + content: | + import attr + @attr.s(auto_attribs=True) + class A: + a: int + b: str = attr.ib() + +- case: testAttrsOtherMethods + main: | + import attr + @attr.s(auto_attribs=True) + class A: + a: int + b: str = attr.ib() + @classmethod + def new(cls) -> A: + reveal_type(cls) # N: Revealed type is "Type[main.A]" + return cls(6, 'hello') + @classmethod + def bad(cls) -> A: + return cls(17) # E: Missing positional 
argument "b" in call to "A" + def foo(self) -> int: + return self.a + reveal_type(A) # N: Revealed type is "def (a: builtins.int, b: builtins.str) -> main.A" + a = A.new() + reveal_type(a.foo) # N: Revealed type is "def () -> builtins.int" + +- case: testAttrsOtherOverloads + main: | + import attr + from typing import overload, Union + + @attr.s + class A: + a = attr.ib() + b = attr.ib(default=3) + + @classmethod + def other(cls) -> str: + return "..." + + @overload + @classmethod + def foo(cls, x: int) -> int: ... + + @overload + @classmethod + def foo(cls, x: str) -> str: ... + + @classmethod + def foo(cls, x: Union[int, str]) -> Union[int, str]: + reveal_type(cls) # N: Revealed type is "Type[main.A]" + reveal_type(cls.other()) # N: Revealed type is "builtins.str" + return x + + reveal_type(A.foo(3)) # N: Revealed type is "builtins.int" + reveal_type(A.foo("foo")) # N: Revealed type is "builtins.str" + +- case: testAttrsDefaultDecorator + main: | + import attr + @attr.s + class C: + x: int = attr.ib(default=1) + y: int = attr.ib() + @y.default + def name_does_not_matter(self): + return self.x + 1 + C() + +- case: testAttrsValidatorDecorator + main: | + import attr + @attr.s + class C: + x = attr.ib() + @x.validator + def check(self, attribute, value): + if value > 42: + raise ValueError("x must be smaller or equal to 42") + C(42) + C(43) + +- case: testAttrsLocalVariablesInClassMethod + main: | + import attr + @attr.s(auto_attribs=True) + class A: + a: int + b: int = attr.ib() + @classmethod + def new(cls, foo: int) -> A: + a = foo + b = a + return cls(a, b) + +- case: testAttrsUnionForward + main: | + import attr + from typing import Union, List + + @attr.s(auto_attribs=True) + class A: + frob: List['AOrB'] + + class B: + pass + + AOrB = Union[A, B] + + reveal_type(A) # N: Revealed type is "def (frob: builtins.list[Union[main.A, main.B]]) -> main.A" + reveal_type(B) # N: Revealed type is "def () -> main.B" + + A([B()]) + +- case: testAttrsUsingConverter + main: 
| + import attr + import helper + + def converter2(s:int) -> str: + return 'hello' + + @attr.s + class C: + x: str = attr.ib(converter=helper.converter) + y: str = attr.ib(converter=converter2) + + # Because of the converter the __init__ takes an int, but the variable is a str. + reveal_type(C) # N: Revealed type is "def (x: builtins.int, y: builtins.int) -> main.C" + reveal_type(C(15, 16).x) # N: Revealed type is "builtins.str" + files: + - path: helper.py + content: | + def converter(s:int) -> str: + return 'hello' + +- case: testAttrsUsingBadConverter + mypy_config: strict_optional = False + main: | + import attr + from typing import overload + @overload + def bad_overloaded_converter(x: int, y: int) -> int: + ... + @overload + def bad_overloaded_converter(x: str, y: str) -> str: + ... + def bad_overloaded_converter(x, y=7): + return x + def bad_converter() -> str: + return '' + @attr.dataclass + class A: + bad: str = attr.ib(converter=bad_converter) + bad_overloaded: int = attr.ib(converter=bad_overloaded_converter) + reveal_type(A) + out: | + main:15: error: Cannot determine __init__ type from converter + main:15: error: Argument "converter" has incompatible type "Callable[[], str]"; expected "Callable[[Any], Any]" + main:16: error: Cannot determine __init__ type from converter + main:16: error: Argument "converter" has incompatible type overloaded function; expected "Callable[[Any], Any]" + main:17: note: Revealed type is "def (bad: Any, bad_overloaded: Any) -> main.A" + +- case: testAttrsUsingBadConverterReprocess + mypy_config: strict_optional = False + main: | + import attr + from typing import overload + forward: 'A' + @overload + def bad_overloaded_converter(x: int, y: int) -> int: + ... + @overload + def bad_overloaded_converter(x: str, y: str) -> str: + ... 
+ def bad_overloaded_converter(x, y=7): + return x + def bad_converter() -> str: + return '' + @attr.dataclass + class A: + bad: str = attr.ib(converter=bad_converter) + bad_overloaded: int = attr.ib(converter=bad_overloaded_converter) + reveal_type(A) + out: | + main:16: error: Cannot determine __init__ type from converter + main:16: error: Argument "converter" has incompatible type "Callable[[], str]"; expected "Callable[[Any], Any]" + main:17: error: Cannot determine __init__ type from converter + main:17: error: Argument "converter" has incompatible type overloaded function; expected "Callable[[Any], Any]" + main:18: note: Revealed type is "def (bad: Any, bad_overloaded: Any) -> main.A" + +- case: testAttrsUsingUnsupportedConverter + main: | + import attr + class Thing: + def do_it(self, int) -> str: + ... + thing = Thing() + def factory(default: int): + ... + @attr.s + class C: + x: str = attr.ib(converter=thing.do_it) # E: Unsupported converter, only named functions, types and lambdas are currently supported + y: str = attr.ib(converter=lambda x: x) + z: str = attr.ib(converter=factory(8)) # E: Unsupported converter, only named functions, types and lambdas are currently supported + reveal_type(C) # N: Revealed type is "def (x: Any, y: Any, z: Any) -> main.C" + +- case: testAttrsUsingConverterAndSubclass + main: | + import attr + + def converter(s:int) -> str: + return 'hello' + + @attr.s + class C: + x: str = attr.ib(converter=converter) + + @attr.s + class A(C): + pass + + # Because of the converter the __init__ takes an int, but the variable is a str.
+ reveal_type(A) # N: Revealed type is "def (x: builtins.int) -> main.A" + reveal_type(A(15).x) # N: Revealed type is "builtins.str" + +- case: testAttrsUsingConverterWithTypes + main: | + from typing import overload + import attr + + @attr.dataclass + class A: + x: str + + @attr.s + class C: + x: complex = attr.ib(converter=complex) + y: int = attr.ib(converter=int) + z: A = attr.ib(converter=A) + + o = C("1", "2", "3") + o = C(1, 2, "3") + +- case: testAttrsCmpWithSubclasses + main: | + import attr + @attr.s + class A: pass + @attr.s + class B: pass + @attr.s + class C(A, B): pass + @attr.s + class D(A): pass + + reveal_type(A.__lt__) # N: Revealed type is "def [_AT] (self: _AT`-1, other: _AT`-1) -> builtins.bool" + reveal_type(B.__lt__) # N: Revealed type is "def [_AT] (self: _AT`-1, other: _AT`-1) -> builtins.bool" + reveal_type(C.__lt__) # N: Revealed type is "def [_AT] (self: _AT`-1, other: _AT`-1) -> builtins.bool" + reveal_type(D.__lt__) # N: Revealed type is "def [_AT] (self: _AT`-1, other: _AT`-1) -> builtins.bool" + + A() < A() + B() < B() + A() < B() # E: Unsupported operand types for < ("A" and "B") + + C() > A() + C() > B() + C() > C() + C() > D() # E: Unsupported operand types for > ("C" and "D") + + D() >= A() + D() >= B() # E: Unsupported operand types for >= ("D" and "B") + D() >= C() # E: Unsupported operand types for >= ("D" and "C") + D() >= D() + + A() <= 1 # E: Unsupported operand types for <= ("A" and "int") + B() <= 1 # E: Unsupported operand types for <= ("B" and "int") + C() <= 1 # E: Unsupported operand types for <= ("C" and "int") + D() <= 1 # E: Unsupported operand types for <= ("D" and "int") + +- case: testAttrsComplexSuperclass + main: | + import attr + @attr.s + class C: + x: int = attr.ib(default=1) + y: int = attr.ib() + @y.default + def name_does_not_matter(self): + return self.x + 1 + @attr.s + class A(C): + z: int = attr.ib(default=18) + reveal_type(C) # N: Revealed type is "def (x: builtins.int =, y: builtins.int =) -> 
main.C" + reveal_type(A) # N: Revealed type is "def (x: builtins.int =, y: builtins.int =, z: builtins.int =) -> main.A" + +- case: testAttrsMultiAssign + main: | + import attr + @attr.s + class A: + x, y, z = attr.ib(), attr.ib(type=int), attr.ib(default=17) + reveal_type(A) # N: Revealed type is "def (x: Any, y: builtins.int, z: Any =) -> main.A" + +- case: testAttrsMultiAssign2 + main: | + import attr + @attr.s + class A: + x = y = z = attr.ib() # E: Too many names for one attribute + +- case: testAttrsPrivateInit + main: | + import attr + @attr.s + class C: + _x = attr.ib(init=False, default=42) + C() + C(_x=42) # E: Unexpected keyword argument "_x" for "C" + +- case: testAttrsAutoMustBeAll + main: | + import attr + @attr.s(auto_attribs=True) + class A: + a: int + b = 17 + # The following forms are not allowed with auto_attribs=True + c = attr.ib() # E: Need type annotation for "c" + d, e = attr.ib(), attr.ib() # E: Need type annotation for "d" # E: Need type annotation for "e" + f = g = attr.ib() # E: Need type annotation for "f" # E: Need type annotation for "g" + +- case: testAttrsRepeatedName + main: | + import attr + @attr.s + class A: + a = attr.ib(default=8) + b = attr.ib() + a = attr.ib() + reveal_type(A) # N: Revealed type is "def (b: Any, a: Any) -> main.A" + @attr.s + class B: + a: int = attr.ib(default=8) + b: int = attr.ib() + a: int = attr.ib() # E: Name "a" already defined on line 10 + reveal_type(B) # N: Revealed type is "def (b: builtins.int, a: builtins.int) -> main.B" + @attr.s(auto_attribs=True) + class C: + a: int = 8 + b: int + a: int = attr.ib() # E: Name "a" already defined on line 16 + reveal_type(C) # N: Revealed type is "def (a: builtins.int, b: builtins.int) -> main.C" + +- case: testAttrsFrozenSubclass + main: | + import attr + + @attr.dataclass + class NonFrozenBase: + a: int + + @attr.dataclass(frozen=True) + class FrozenBase: + a: int + + @attr.dataclass(frozen=True) + class FrozenNonFrozen(NonFrozenBase): + b: int + + 
@attr.dataclass(frozen=True) + class FrozenFrozen(FrozenBase): + b: int + + @attr.dataclass + class NonFrozenFrozen(FrozenBase): + b: int + + # Make sure these are untouched + non_frozen_base = NonFrozenBase(1) + non_frozen_base.a = 17 + frozen_base = FrozenBase(1) + frozen_base.a = 17 # E: Property "a" defined in "FrozenBase" is read-only + + a = FrozenNonFrozen(1, 2) + a.a = 17 # E: Property "a" defined in "FrozenNonFrozen" is read-only + a.b = 17 # E: Property "b" defined in "FrozenNonFrozen" is read-only + + b = FrozenFrozen(1, 2) + b.a = 17 # E: Property "a" defined in "FrozenFrozen" is read-only + b.b = 17 # E: Property "b" defined in "FrozenFrozen" is read-only + + c = NonFrozenFrozen(1, 2) + c.a = 17 # E: Property "a" defined in "NonFrozenFrozen" is read-only + c.b = 17 # E: Property "b" defined in "NonFrozenFrozen" is read-only +- case: testAttrsCallableAttributes + main: | + from typing import Callable + import attr + def blah(a: int, b: int) -> bool: + return True + + @attr.s(auto_attribs=True) + class F: + _cb: Callable[[int, int], bool] = blah + def foo(self) -> bool: + return self._cb(5, 6) + + @attr.s + class G: + _cb: Callable[[int, int], bool] = attr.ib(blah) + def foo(self) -> bool: + return self._cb(5, 6) + + @attr.s(auto_attribs=True, frozen=True) + class FFrozen(F): + def bar(self) -> bool: + return self._cb(5, 6) + +- case: testAttrsWithFactory + main: | + from typing import List + import attr + def my_factory() -> int: + return 7 + @attr.s + class A: + x: List[int] = attr.ib(factory=list) + y: int = attr.ib(factory=my_factory) + A() + +- case: testAttrsFactoryAndDefault + main: | + import attr + @attr.s + class A: + x: int = attr.ib(factory=int, default=7) # E: Can't pass both "default" and "factory". 
+ +- case: testAttrsFactoryBadReturn + main: | + import attr + def my_factory() -> int: + return 7 + @attr.s + class A: + x: int = attr.ib(factory=list) # E: Incompatible types in assignment (expression has type "List[_T]", variable has type "int") + y: str = attr.ib(factory=my_factory) # E: Incompatible types in assignment (expression has type "int", variable has type "str") + +- case: testAttrsDefaultAndInit + main: | + import attr + + @attr.s + class C: + a = attr.ib(init=False, default=42) + b = attr.ib() # Ok because previous attribute is init=False + c = attr.ib(default=44) + d = attr.ib(init=False) # Ok because this attribute is init=False + e = attr.ib() # E: Non-default attributes not allowed after default attributes. + +- case: testAttrsOptionalConverter + main: | + # flags: --strict-optional + import attr + from attr.converters import optional + from typing import Optional + + def converter(s:int) -> str: + return 'hello' + + + @attr.s + class A: + y: Optional[int] = attr.ib(converter=optional(int)) + z: Optional[str] = attr.ib(converter=optional(converter)) + + + A(None, None) + +- case: testAttrsTypeVarNoCollision + main: | + from typing import TypeVar, Generic + import attr + + T = TypeVar("T", bytes, str) + + # Make sure the generated __le__ (and friends) don't use T for their arguments. 
+ @attr.s(auto_attribs=True) + class A(Generic[T]): + v: T + +- case: testAttrsKwOnlyAttrib + main: | + import attr + @attr.s + class A: + a = attr.ib(kw_only=True) + A() # E: Missing named argument "a" for "A" + A(15) # E: Too many positional arguments for "A" + A(a=15) + +- case: testAttrsKwOnlyClass + main: | + import attr + @attr.s(kw_only=True, auto_attribs=True) + class A: + a: int + b: bool + A() # E: Missing named argument "a" for "A" # E: Missing named argument "b" for "A" + A(b=True, a=15) + +- case: testAttrsKwOnlyClassNoInit + main: | + import attr + @attr.s(kw_only=True) + class B: + a = attr.ib(init=False) + b = attr.ib() + B(b=True) + +- case: testAttrsKwOnlyWithDefault + main: | + import attr + @attr.s + class C: + a = attr.ib(0) + b = attr.ib(kw_only=True) + c = attr.ib(16, kw_only=True) + C(b=17) + +- case: testAttrsKwOnlyClassWithMixedDefaults + main: | + import attr + @attr.s(kw_only=True) + class D: + a = attr.ib(10) + b = attr.ib() + c = attr.ib(15) + D(b=17) + +- case: testAttrsKwOnlySubclass + main: | + import attr + @attr.s + class A2: + a = attr.ib(default=0) + @attr.s + class B2(A2): + b = attr.ib(kw_only=True) + B2(b=1) + +- case: testAttrsNonKwOnlyAfterKwOnly + main: | + import attr + @attr.s(kw_only=True) + class A: + a = attr.ib(default=0) + @attr.s + class B(A): + b = attr.ib() + @attr.s + class C: + a = attr.ib(kw_only=True) + b = attr.ib(15) + +- case: testAttrsDisallowUntypedWorksForward + main: | + # flags: --disallow-untyped-defs + import attr + from typing import List + + @attr.s + class B: + x: C = attr.ib() + + class C(List[C]): + pass + + reveal_type(B) # N: Revealed type is "def (x: main.C) -> main.B" + +- case: testDisallowUntypedWorksForwardBad + mypy_config: disallow_untyped_defs = True + main: | + import attr + + @attr.s + class B: + x = attr.ib() # E: Need type annotation for "x" + + reveal_type(B) # N: Revealed type is "def (x: Any) -> main.B" + +- case: testAttrsDefaultDecoratorDeferred + main: | + defer: Yes + + 
import attr + @attr.s + class C: + x: int = attr.ib(default=1) + y: int = attr.ib() + @y.default + def inc(self): + return self.x + 1 + + class Yes: ... + +- case: testAttrsValidatorDecoratorDeferred + main: | + defer: Yes + + import attr + @attr.s + class C: + x = attr.ib() + @x.validator + def check(self, attribute, value): + if value > 42: + raise ValueError("x must be smaller or equal to 42") + C(42) + C(43) + + class Yes: ... + +- case: testTypeInAttrUndefined + main: | + import attr + + @attr.s + class C: + total = attr.ib(type=Bad) # E: Name "Bad" is not defined + +- case: testTypeInAttrForwardInRuntime + main: | + import attr + + @attr.s + class C: + total = attr.ib(type=Forward) + + reveal_type(C.total) # N: Revealed type is "main.Forward" + C('no') # E: Argument 1 to "C" has incompatible type "str"; expected "Forward" + class Forward: ... + +- case: testDefaultInAttrForward + main: | + import attr + + @attr.s + class C: + total = attr.ib(default=func()) + + def func() -> int: ... + + C() + C(1) + C(1, 2) # E: Too many arguments for "C" + +- case: testTypeInAttrUndefinedFrozen + main: | + import attr + + @attr.s(frozen=True) + class C: + total = attr.ib(type=Bad) # E: Name "Bad" is not defined + + C(0).total = 1 # E: Property "total" defined in "C" is read-only + +- case: testTypeInAttrDeferredStar + main: | + import lib + files: + - path: lib.py + content: | + import attr + MYPY = False + if MYPY: # Force deferral + from other import * + + @attr.s + class C: + total = attr.ib(type=int) + + C() # E: Missing positional argument "total" in call to "C" + C('no') # E: Argument 1 to "C" has incompatible type "str"; expected "int" + - path: other.py + content: | + import lib + +- case: testAttrsDefaultsMroOtherFile + main: | + import a + files: + - path: a.py + content: | + import attr + from b import A1, A2 + + @attr.s + class Asdf(A1, A2): # E: Non-default attributes not allowed after default attributes. 
+ pass + - path: b.py + content: | + import attr + + @attr.s + class A1: + a: str = attr.ib('test') + + @attr.s + class A2: + b: int = attr.ib() + +- case: testAttrsInheritanceNoAnnotation + main: | + import attr + + @attr.s + class A: + foo = attr.ib() # type: int + + x = 0 + @attr.s + class B(A): + foo = x + + reveal_type(B) # N: Revealed type is "def (foo: builtins.int) -> main.B" + +- case: testFields + main: | + from attrs import define, fields + + @define + class A: + a: int + b: str + + reveal_type(fields(A)) # N: Revealed type is "Any" + +- case: testFieldsError + main: | + from attrs import fields + + class A: + a: int + b: str + + fields(A) # E: Argument 1 to "fields" has incompatible type "Type[A]"; expected "Type[AttrsInstance]" diff --git a/tests/test_next_gen.py b/tests/test_next_gen.py index 0ebad8d2f..1f13de0aa 100644 --- a/tests/test_next_gen.py +++ b/tests/test_next_gen.py @@ -1,5 +1,7 @@ +# SPDX-License-Identifier: MIT + """ -Python 3-only integration tests for provisional next generation APIs. +Python 3-only integration tests for provisional next-generation APIs. """ import re @@ -8,10 +10,11 @@ import pytest -import attr +import attr as _attr # don't use it by accident +import attrs -@attr.define +@attrs.define class C: x: str y: int @@ -29,7 +32,7 @@ def test_no_slots(self): slots can be deactivated. """ - @attr.define(slots=False) + @attrs.define(slots=False) class NoSlots: x: int @@ -42,9 +45,9 @@ def test_validates(self): Validators at __init__ and __setattr__ work. 
""" - @attr.define + @attrs.define class Validated: - x: int = attr.field(validator=attr.validators.instance_of(int)) + x: int = attrs.field(validator=attrs.validators.instance_of(int)) v = Validated(1) @@ -61,7 +64,7 @@ def test_no_order(self): with pytest.raises(TypeError): C("1", 2) < C("2", 3) - @attr.define(order=True) + @attrs.define(order=True) class Ordered: x: int @@ -71,23 +74,23 @@ def test_override_auto_attribs_true(self): """ Don't guess if auto_attrib is set explicitly. - Having an unannotated attr.ib/attr.field fails. + Having an unannotated attrs.ib/attrs.field fails. """ - with pytest.raises(attr.exceptions.UnannotatedAttributeError): + with pytest.raises(attrs.exceptions.UnannotatedAttributeError): - @attr.define(auto_attribs=True) + @attrs.define(auto_attribs=True) class ThisFails: - x = attr.field() + x = attrs.field() y: int def test_override_auto_attribs_false(self): """ Don't guess if auto_attrib is set explicitly. - Annotated fields that don't carry an attr.ib are ignored. + Annotated fields that don't carry an attrs.ib are ignored. """ - @attr.define(auto_attribs=False) + @attrs.define(auto_attribs=False) class NoFields: x: int y: int @@ -99,32 +102,71 @@ def test_auto_attribs_detect(self): define correctly detects if a class lacks type annotations. 
""" - @attr.define + @attrs.define class OldSchool: - x = attr.field() + x = attrs.field() assert OldSchool(1) == OldSchool(1) # Test with maybe_cls = None - @attr.define() + @attrs.define() class OldSchool2: - x = attr.field() + x = attrs.field() assert OldSchool2(1) == OldSchool2(1) + def test_auto_attribs_detect_fields_and_annotations(self): + """ + define infers auto_attribs=True if fields have type annotations + """ + + @attrs.define + class NewSchool: + x: int + y: list = attrs.field() + + @y.validator + def _validate_y(self, attribute, value): + if value < 0: + raise ValueError("y must be positive") + + assert NewSchool(1, 1) == NewSchool(1, 1) + with pytest.raises(ValueError): + NewSchool(1, -1) + assert list(attrs.fields_dict(NewSchool).keys()) == ["x", "y"] + + def test_auto_attribs_partially_annotated(self): + """ + define infers auto_attribs=True if any type annotations are found + """ + + @attrs.define + class NewSchool: + x: int + y: list + z = 10 + + # fields are defined for any annotated attributes + assert NewSchool(1, []) == NewSchool(1, []) + assert list(attrs.fields_dict(NewSchool).keys()) == ["x", "y"] + + # while the unannotated attributes are left as class vars + assert NewSchool.z == 10 + assert "z" in NewSchool.__dict__ + def test_auto_attribs_detect_annotations(self): """ define correctly detects if a class has type annotations. """ - @attr.define + @attrs.define class NewSchool: x: int assert NewSchool(1) == NewSchool(1) # Test with maybe_cls = None - @attr.define() + @attrs.define() class NewSchool2: x: int @@ -135,7 +177,7 @@ def test_exception(self): Exceptions are detected and correctly handled. """ - @attr.define + @attrs.define class E(Exception): msg: str other: int @@ -151,16 +193,16 @@ class E(Exception): def test_frozen(self): """ - attr.frozen freezes classes. + attrs.frozen freezes classes. 
""" - @attr.frozen + @attrs.frozen class F: x: str f = F(1) - with pytest.raises(attr.exceptions.FrozenInstanceError): + with pytest.raises(attrs.exceptions.FrozenInstanceError): f.x = 2 def test_auto_detect_eq(self): @@ -170,7 +212,7 @@ def test_auto_detect_eq(self): Regression test for #670. """ - @attr.define + @attrs.define class C: def __eq__(self, o): raise ValueError() @@ -180,35 +222,35 @@ def __eq__(self, o): def test_subclass_frozen(self): """ - It's possible to subclass an `attr.frozen` class and the frozen-ness is - inherited. + It's possible to subclass an `attrs.frozen` class and the frozen-ness + is inherited. """ - @attr.frozen + @attrs.frozen class A: a: int - @attr.frozen + @attrs.frozen class B(A): b: int - @attr.define(on_setattr=attr.setters.NO_OP) + @attrs.define(on_setattr=attrs.setters.NO_OP) class C(B): c: int assert B(1, 2) == B(1, 2) assert C(1, 2, 3) == C(1, 2, 3) - with pytest.raises(attr.exceptions.FrozenInstanceError): + with pytest.raises(attrs.exceptions.FrozenInstanceError): A(1).a = 1 - with pytest.raises(attr.exceptions.FrozenInstanceError): + with pytest.raises(attrs.exceptions.FrozenInstanceError): B(1, 2).a = 1 - with pytest.raises(attr.exceptions.FrozenInstanceError): + with pytest.raises(attrs.exceptions.FrozenInstanceError): B(1, 2).b = 2 - with pytest.raises(attr.exceptions.FrozenInstanceError): + with pytest.raises(attrs.exceptions.FrozenInstanceError): C(1, 2, 3).c = 3 def test_catches_frozen_on_setattr(self): @@ -217,7 +259,7 @@ def test_catches_frozen_on_setattr(self): immutability is inherited. """ - @attr.define(frozen=True) + @attrs.define(frozen=True) class A: pass @@ -225,7 +267,7 @@ class A: ValueError, match="Frozen classes can't use on_setattr." 
): - @attr.define(frozen=True, on_setattr=attr.setters.validate) + @attrs.define(frozen=True, on_setattr=attrs.setters.validate) class B: pass @@ -237,17 +279,17 @@ class B: ), ): - @attr.define(on_setattr=attr.setters.validate) + @attrs.define(on_setattr=attrs.setters.validate) class C(A): pass @pytest.mark.parametrize( "decorator", [ - partial(attr.s, frozen=True, slots=True, auto_exc=True), - attr.frozen, - attr.define, - attr.mutable, + partial(_attr.s, frozen=True, slots=True, auto_exc=True), + attrs.frozen, + attrs.define, + attrs.mutable, ], ) def test_discard_context(self, decorator): @@ -259,7 +301,7 @@ def test_discard_context(self, decorator): @decorator class MyException(Exception): - x: str = attr.ib() + x: str = attrs.field() with pytest.raises(MyException) as ei: try: @@ -269,3 +311,130 @@ class MyException(Exception): assert "foo" == ei.value.x assert ei.value.__cause__ is None + + def test_converts_and_validates_by_default(self): + """ + If no on_setattr is set, assume setters.convert, setters.validate. + """ + + @attrs.define + class C: + x: int = attrs.field(converter=int) + + @x.validator + def _v(self, _, value): + if value < 10: + raise ValueError("must be >=10") + + inst = C(10) + + # Converts + inst.x = "11" + + assert 11 == inst.x + + # Validates + with pytest.raises(ValueError, match="must be >=10"): + inst.x = "9" + + def test_mro_ng(self): + """ + Attributes and methods are looked up the same way in NG by default. + + See #428 + """ + + @attrs.define + class A: + + x: int = 10 + + def xx(self): + return 10 + + @attrs.define + class B(A): + y: int = 20 + + @attrs.define + class C(A): + x: int = 50 + + def xx(self): + return 50 + + @attrs.define + class D(B, C): + pass + + d = D() + + assert d.x == d.xx() + + +class TestAsTuple: + def test_smoke(self): + """ + `attrs.astuple` only changes defaults, so we just call it and compare. 
+ """ + inst = C("foo", 42) + + assert attrs.astuple(inst) == _attr.astuple(inst) + + +class TestAsDict: + def test_smoke(self): + """ + `attrs.asdict` only changes defaults, so we just call it and compare. + """ + inst = C("foo", {(1,): 42}) + + assert attrs.asdict(inst) == _attr.asdict( + inst, retain_collection_types=True + ) + + +class TestImports: + """ + Verify our re-imports and mirroring works. + """ + + def test_converters(self): + """ + Importing from attrs.converters works. + """ + from attrs.converters import optional + + assert optional is _attr.converters.optional + + def test_exceptions(self): + """ + Importing from attrs.exceptions works. + """ + from attrs.exceptions import FrozenError + + assert FrozenError is _attr.exceptions.FrozenError + + def test_filters(self): + """ + Importing from attrs.filters works. + """ + from attrs.filters import include + + assert include is _attr.filters.include + + def test_setters(self): + """ + Importing from attrs.setters works. + """ + from attrs.setters import pipe + + assert pipe is _attr.setters.pipe + + def test_validators(self): + """ + Importing from attrs.validators works. + """ + from attrs.validators import and_ + + assert and_ is _attr.validators.and_ diff --git a/tests/test_pattern_matching.py b/tests/test_pattern_matching.py new file mode 100644 index 000000000..3855d6a37 --- /dev/null +++ b/tests/test_pattern_matching.py @@ -0,0 +1,101 @@ +# SPDX-License-Identifier: MIT + +# Keep this file SHORT, until Black can handle it. +import pytest + +import attr + + +class TestPatternMatching: + """ + Pattern matching syntax test cases. + """ + + @pytest.mark.parametrize("dec", [attr.s, attr.define, attr.frozen]) + def test_simple_match_case(self, dec): + """ + Simple match case statement works as expected with all class + decorators. 
+ """ + + @dec + class C: + a = attr.ib() + + assert ("a",) == C.__match_args__ + + matched = False + c = C(a=1) + match c: + case C(a): + matched = True + + assert matched + assert 1 == a + + def test_explicit_match_args(self): + """ + Does not overwrite a manually set empty __match_args__. + """ + + ma = () + + @attr.define + class C: + a = attr.field() + __match_args__ = ma + + c = C(a=1) + + msg = r"C\(\) accepts 0 positional sub-patterns \(1 given\)" + with pytest.raises(TypeError, match=msg): + match c: + case C(_): + pass + + def test_match_args_kw_only(self): + """ + kw_only classes don't generate __match_args__. + kw_only fields are not included in __match_args__. + """ + + @attr.define + class C: + a = attr.field(kw_only=True) + b = attr.field() + + assert ("b",) == C.__match_args__ + + c = C(a=1, b=1) + msg = r"C\(\) accepts 1 positional sub-pattern \(2 given\)" + with pytest.raises(TypeError, match=msg): + match c: + case C(a, b): + pass + + found = False + match c: + case C(b, a=a): + found = True + + assert found + + @attr.define(kw_only=True) + class C: + a = attr.field() + b = attr.field() + + c = C(a=1, b=1) + msg = r"C\(\) accepts 0 positional sub-patterns \(2 given\)" + with pytest.raises(TypeError, match=msg): + match c: + case C(a, b): + pass + + found = False + match c: + case C(a=a, b=b): + found = True + + assert found + assert (1, 1) == (a, b) diff --git a/tests/test_pyright.py b/tests/test_pyright.py new file mode 100644 index 000000000..e055ebb8c --- /dev/null +++ b/tests/test_pyright.py @@ -0,0 +1,71 @@ +# SPDX-License-Identifier: MIT + +import json +import os.path +import shutil +import subprocess +import sys + +import pytest + +import attr + + +if sys.version_info < (3, 6): + _found_pyright = False +else: + _found_pyright = shutil.which("pyright") + + +@attr.s(frozen=True) +class PyrightDiagnostic: + severity = attr.ib() + message = attr.ib() + + +@pytest.mark.skipif(not _found_pyright, reason="Requires pyright.") +def 
test_pyright_baseline(): + """The __dataclass_transform__ decorator allows pyright to determine + attrs decorated class types. + """ + + test_file = os.path.dirname(__file__) + "/dataclass_transform_example.py" + + pyright = subprocess.run( + ["pyright", "--outputjson", str(test_file)], capture_output=True + ) + pyright_result = json.loads(pyright.stdout) + + diagnostics = { + PyrightDiagnostic(d["severity"], d["message"]) + for d in pyright_result["generalDiagnostics"] + } + + # Expected diagnostics as per pyright 1.1.135 + expected_diagnostics = { + PyrightDiagnostic( + severity="information", + message='Type of "Define.__init__" is' + ' "(self: Define, a: str, b: int) -> None"', + ), + PyrightDiagnostic( + severity="information", + message='Type of "DefineConverter.__init__" is ' + '"(self: DefineConverter, with_converter: int) -> None"', + ), + PyrightDiagnostic( + severity="information", + message='Type of "d.a" is "Literal[\'new\']"', + ), + PyrightDiagnostic( + severity="error", + message='Cannot assign member "a" for type ' + '"FrozenDefine"\n\xa0\xa0"FrozenDefine" is frozen', + ), + PyrightDiagnostic( + severity="information", + message='Type of "d2.a" is "Literal[\'new\']"', + ), + } + + assert diagnostics == expected_diagnostics diff --git a/tests/test_setattr.py b/tests/test_setattr.py index 8e55da2d1..38fcf347d 100644 --- a/tests/test_setattr.py +++ b/tests/test_setattr.py @@ -1,4 +1,5 @@ -from __future__ import absolute_import, division, print_function +# SPDX-License-Identifier: MIT + import pickle @@ -7,22 +8,21 @@ import attr from attr import setters -from attr._compat import PY2 from attr.exceptions import FrozenAttributeError from attr.validators import instance_of, matches_re @attr.s(frozen=True) -class Frozen(object): +class Frozen: x = attr.ib() @attr.s -class WithOnSetAttrHook(object): +class WithOnSetAttrHook: x = attr.ib(on_setattr=lambda *args: None) -class TestSetAttr(object): +class TestSetAttr: def test_change(self): """ The return 
value of a hook overwrites the value. But they are not run @@ -33,7 +33,7 @@ def hook(*a, **kw): return "hooked!" @attr.s - class Hooked(object): + class Hooked: x = attr.ib(on_setattr=hook) y = attr.ib() @@ -54,7 +54,7 @@ def test_frozen_attribute(self): """ @attr.s - class PartiallyFrozen(object): + class PartiallyFrozen: x = attr.ib(on_setattr=setters.frozen) y = attr.ib() @@ -79,7 +79,7 @@ def test_validator(self, on_setattr): """ @attr.s(on_setattr=on_setattr) - class ValidatedAttribute(object): + class ValidatedAttribute: x = attr.ib() y = attr.ib(validator=[instance_of(str), matches_re("foo.*qux")]) @@ -113,7 +113,7 @@ def test_pipe(self): s = [setters.convert, lambda _, __, nv: nv + 1] @attr.s - class Piped(object): + class Piped: x1 = attr.ib(converter=int, on_setattr=setters.pipe(*s)) x2 = attr.ib(converter=int, on_setattr=s) @@ -145,7 +145,7 @@ def test_no_validator_no_converter(self): """ @attr.s(on_setattr=[setters.convert, setters.validate]) - class C(object): + class C: x = attr.ib() c = C(1) @@ -160,7 +160,7 @@ def test_validate_respects_run_validators_config(self): """ @attr.s(on_setattr=setters.validate) - class C(object): + class C: x = attr.ib(validator=attr.validators.instance_of(int)) c = C(1) @@ -185,7 +185,7 @@ def test_frozen_on_setattr_class_is_caught(self): with pytest.raises(ValueError) as ei: @attr.s(frozen=True, on_setattr=setters.validate) - class C(object): + class C: x = attr.ib() assert "Frozen classes can't use on_setattr." == ei.value.args[0] @@ -198,7 +198,7 @@ def test_frozen_on_setattr_attribute_is_caught(self): with pytest.raises(ValueError) as ei: @attr.s(frozen=True) - class C(object): + class C: x = attr.ib(on_setattr=setters.validate) assert "Frozen classes can't use on_setattr." 
== ei.value.args[0] @@ -216,16 +216,14 @@ def boom(*args): pytest.fail("Must not be called.") @attr.s - class Hooked(object): + class Hooked: x = attr.ib(on_setattr=boom) @attr.s(slots=slots) class NoHook(WithOnSetAttrHook): x = attr.ib() - if not PY2: - assert NoHook.__setattr__ == object.__setattr__ - + assert NoHook.__setattr__ == object.__setattr__ assert 1 == NoHook(1).x assert Hooked.__attrs_own_setattr__ assert not NoHook.__attrs_own_setattr__ @@ -238,7 +236,7 @@ def test_setattr_inherited_do_not_reset(self, slots): not reset it unless necessary. """ - class A(object): + class A: """ Not an attrs class on purpose to prevent accidental resets that would render the asserts meaningless. @@ -286,7 +284,7 @@ def test_slotted_class_can_have_custom_setattr(self): """ @attr.s(slots=True) - class A(object): + class A: def __setattr__(self, key, value): raise SystemError @@ -304,7 +302,7 @@ def test_slotted_confused(self): """ @attr.s(slots=True) - class A(object): + class A: x = attr.ib(on_setattr=setters.frozen) class B(A): @@ -316,13 +314,6 @@ class C(B): C(1).x = 2 - -@pytest.mark.skipif(PY2, reason="Python 3-only.") -class TestSetAttrNoPy2(object): - """ - __setattr__ tests for Py3+ to avoid the skip repetition. 
- """ - @pytest.mark.parametrize("slots", [True, False]) def test_setattr_auto_detect_if_no_custom_setattr(self, slots): """ @@ -383,7 +374,7 @@ def test_setattr_auto_detect_on_setattr(self, slots): ): @attr.s(auto_detect=True, slots=slots) - class HookAndCustomSetAttr(object): + class HookAndCustomSetAttr: x = attr.ib(on_setattr=lambda *args: None) def __setattr__(self, _, __): @@ -404,7 +395,7 @@ def test_setattr_inherited_do_not_reset_intermediate( """ @attr.s(slots=a_slots) - class A(object): + class A: x = attr.ib(on_setattr=setters.frozen) @attr.s(slots=b_slots, auto_detect=True) diff --git a/tests/test_slots.py b/tests/test_slots.py index b57fc639d..de4e90e0b 100644 --- a/tests/test_slots.py +++ b/tests/test_slots.py @@ -1,3 +1,5 @@ +# SPDX-License-Identifier: MIT + """ Unit tests for slots-related functionality. """ @@ -11,7 +13,7 @@ import attr -from attr._compat import PY2, PYPY, just_warn, make_set_closure_cell +from attr._compat import PYPY, just_warn, make_set_closure_cell # Pympler doesn't work on PyPy. 
@@ -24,7 +26,7 @@ @attr.s -class C1(object): +class C1: x = attr.ib(validator=attr.validators.instance_of(int)) y = attr.ib() @@ -39,18 +41,16 @@ def classmethod(cls): def staticmethod(): return "staticmethod" - if not PY2: - - def my_class(self): - return __class__ + def my_class(self): + return __class__ - def my_super(self): - """Just to test out the no-arg super.""" - return super().__repr__() + def my_super(self): + """Just to test out the no-arg super.""" + return super().__repr__() @attr.s(slots=True, hash=True) -class C1Slots(object): +class C1Slots: x = attr.ib(validator=attr.validators.instance_of(int)) y = attr.ib() @@ -65,14 +65,12 @@ def classmethod(cls): def staticmethod(): return "staticmethod" - if not PY2: - - def my_class(self): - return __class__ + def my_class(self): + return __class__ - def my_super(self): - """Just to test out the no-arg super.""" - return super().__repr__() + def my_super(self): + """Just to test out the no-arg super.""" + return super().__repr__() def test_slots_being_used(): @@ -88,7 +86,7 @@ def test_slots_being_used(): assert "__dict__" in dir(non_slot_instance) assert "__slots__" not in dir(non_slot_instance) - assert set(["__weakref__", "x", "y"]) == set(slot_instance.__slots__) + assert {"__weakref__", "x", "y"} == set(slot_instance.__slots__) if has_pympler: assert asizeof(slot_instance) < asizeof(non_slot_instance) @@ -152,7 +150,7 @@ class C2Slots(C1): assert "clsmethod" == c2.classmethod() assert "staticmethod" == c2.staticmethod() - assert set(["z"]) == set(C2Slots.__slots__) + assert {"z"} == set(C2Slots.__slots__) c3 = C2Slots(x=1, y=3, z="test") @@ -176,7 +174,7 @@ def test_nonslots_these(): This will actually *replace* the class with another one, using slots. 
""" - class SimpleOrdinaryClass(object): + class SimpleOrdinaryClass: def __init__(self, x, y, z): self.x = x self.y = y @@ -211,7 +209,7 @@ def staticmethod(): assert "clsmethod" == c2.classmethod() assert "staticmethod" == c2.staticmethod() - assert set(["__weakref__", "x", "y", "z"]) == set(C2Slots.__slots__) + assert {"__weakref__", "x", "y", "z"} == set(C2Slots.__slots__) c3 = C2Slots(x=1, y=3, z="test") assert c3 > c2 @@ -243,7 +241,7 @@ class C2(C1): assert 2 == c2.y assert "test" == c2.z - assert set(["z"]) == set(C2Slots.__slots__) + assert {"z"} == set(C2Slots.__slots__) assert 1 == c2.method() assert "clsmethod" == c2.classmethod() @@ -268,6 +266,70 @@ class C2(C1): assert {"x": 1, "y": 2, "z": "test"} == attr.asdict(c2) +def test_inheritance_from_slots_with_attribute_override(): + """ + Inheriting from a slotted class doesn't re-create existing slots + """ + + class HasXSlot: + __slots__ = ("x",) + + @attr.s(slots=True, hash=True) + class C2Slots(C1Slots): + # y re-defined here but it shouldn't get a slot + y = attr.ib() + z = attr.ib() + + @attr.s(slots=True, hash=True) + class NonAttrsChild(HasXSlot): + # Parent class has slot for "x" already, so we skip it + x = attr.ib() + y = attr.ib() + z = attr.ib() + + c2 = C2Slots(1, 2, "test") + assert 1 == c2.x + assert 2 == c2.y + assert "test" == c2.z + + assert {"z"} == set(C2Slots.__slots__) + + na = NonAttrsChild(1, 2, "test") + assert 1 == na.x + assert 2 == na.y + assert "test" == na.z + + assert {"__weakref__", "y", "z"} == set(NonAttrsChild.__slots__) + + +def test_inherited_slot_reuses_slot_descriptor(): + """ + We reuse slot descriptor for an attr.ib defined in a slotted attr.s + """ + + class HasXSlot: + __slots__ = ("x",) + + class OverridesX(HasXSlot): + @property + def x(self): + return None + + @attr.s(slots=True) + class Child(OverridesX): + x = attr.ib() + + assert Child.x is not OverridesX.x + assert Child.x is HasXSlot.x + + c = Child(1) + assert 1 == c.x + assert set() == 
set(Child.__slots__) + + ox = OverridesX() + assert ox.x is None + + def test_bare_inheritance_from_slots(): """ Inheriting from a bare attrs slotted class works. @@ -276,7 +338,7 @@ def test_bare_inheritance_from_slots(): @attr.s( init=False, eq=False, order=False, hash=False, repr=False, slots=True ) - class C1BareSlots(object): + class C1BareSlots: x = attr.ib(validator=attr.validators.instance_of(int)) y = attr.ib() @@ -292,7 +354,7 @@ def staticmethod(): return "staticmethod" @attr.s(init=False, eq=False, order=False, hash=False, repr=False) - class C1Bare(object): + class C1Bare: x = attr.ib(validator=attr.validators.instance_of(int)) y = attr.ib() @@ -343,8 +405,7 @@ class C2(C1Bare): assert {"x": 1, "y": 2, "z": "test"} == attr.asdict(c2) -@pytest.mark.skipif(PY2, reason="closure cell rewriting is PY3-only.") -class TestClosureCellRewriting(object): +class TestClosureCellRewriting: def test_closure_cell_rewriting(self): """ Slotted classes support proper closure cell rewriting. @@ -424,7 +485,7 @@ def statmethod(): def test_code_hack_failure(self, monkeypatch): """ Keeps working if function/code object introspection doesn't work - on this (nonstandard) interpeter. + on this (nonstandard) interpreter. A warning is emitted that points to the actual code. 
""" @@ -454,7 +515,7 @@ def test_not_weakrefable(): """ @attr.s(slots=True, weakref_slot=False) - class C(object): + class C: pass c = C() @@ -472,7 +533,7 @@ def test_implicitly_weakrefable(): """ @attr.s(slots=True, weakref_slot=False) - class C(object): + class C: pass c = C() @@ -487,7 +548,7 @@ def test_weakrefable(): """ @attr.s(slots=True, weakref_slot=True) - class C(object): + class C: pass c = C() @@ -502,7 +563,7 @@ def test_weakref_does_not_add_a_field(): """ @attr.s(slots=True, weakref_slot=True) - class C(object): + class C: field = attr.ib() assert [f.name for f in attr.fields(C)] == ["field"] @@ -515,7 +576,7 @@ def tests_weakref_does_not_add_when_inheriting_with_weakref(): """ @attr.s(slots=True, weakref_slot=True) - class C(object): + class C: pass @attr.s(slots=True, weakref_slot=True) @@ -535,7 +596,7 @@ def tests_weakref_does_not_add_with_weakref_attribute(): """ @attr.s(slots=True, weakref_slot=True) - class C(object): + class C: __weakref__ = attr.ib( init=False, hash=False, repr=False, eq=False, order=False ) @@ -562,7 +623,7 @@ def test_slots_empty_cell(): """ @attr.s(slots=True) - class C(object): + class C: field = attr.ib() def f(self, a): @@ -572,16 +633,16 @@ def f(self, a): @attr.s(getstate_setstate=True) -class C2(object): +class C2: x = attr.ib() @attr.s(slots=True, getstate_setstate=True) -class C2Slots(object): +class C2Slots: x = attr.ib() -class TestPickle(object): +class TestPickle: @pytest.mark.parametrize("protocol", range(pickle.HIGHEST_PROTOCOL)) def test_pickleable_by_default(self, protocol): """ @@ -599,10 +660,12 @@ def test_no_getstate_setstate_for_dict_classes(self): As long as getstate_setstate is None, nothing is done to dict classes. 
""" - i = C1(1, 2) - - assert None is getattr(i, "__getstate__", None) - assert None is getattr(i, "__setstate__", None) + assert getattr(object, "__getstate__", None) is getattr( + C1, "__getstate__", None + ) + assert getattr(object, "__setstate__", None) is getattr( + C1, "__setstate__", None + ) def test_no_getstate_setstate_if_option_false(self): """ @@ -610,13 +673,15 @@ def test_no_getstate_setstate_if_option_false(self): """ @attr.s(slots=True, getstate_setstate=False) - class C(object): + class C: x = attr.ib() - i = C(42) - - assert None is getattr(i, "__getstate__", None) - assert None is getattr(i, "__setstate__", None) + assert getattr(object, "__getstate__", None) is getattr( + C, "__getstate__", None + ) + assert getattr(object, "__setstate__", None) is getattr( + C, "__setstate__", None + ) @pytest.mark.parametrize("cls", [C2(1), C2Slots(1)]) def test_getstate_set_state_force_true(self, cls): @@ -625,3 +690,57 @@ def test_getstate_set_state_force_true(self, cls): """ assert None is not getattr(cls, "__getstate__", None) assert None is not getattr(cls, "__setstate__", None) + + +def test_slots_super_property_get(): + """ + Both `super()` and `super(self.__class__, self)` work. + """ + + @attr.s(slots=True) + class A: + x = attr.ib() + + @property + def f(self): + return self.x + + @attr.s(slots=True) + class B(A): + @property + def f(self): + return super().f ** 2 + + @attr.s(slots=True) + class C(A): + @property + def f(self): + return super(C, self).f ** 2 + + assert B(11).f == 121 + assert B(17).f == 289 + assert C(11).f == 121 + assert C(17).f == 289 + + +def test_slots_super_property_get_shortcut(): + """ + On Python 3, the `super()` shortcut is allowed. 
+ """ + + @attr.s(slots=True) + class A: + x = attr.ib() + + @property + def f(self): + return self.x + + @attr.s(slots=True) + class B(A): + @property + def f(self): + return super().f ** 2 + + assert B(11).f == 121 + assert B(17).f == 289 diff --git a/tests/test_validators.py b/tests/test_validators.py index 4aeec9990..51fe2f41e 100644 --- a/tests/test_validators.py +++ b/tests/test_validators.py @@ -1,8 +1,9 @@ +# SPDX-License-Identifier: MIT + """ Tests for `attr.validators`. """ -from __future__ import absolute_import, division, print_function import re @@ -10,17 +11,22 @@ import attr -from attr import has +from attr import _config, fields, has from attr import validators as validator_module -from attr._compat import PY2, TYPE from attr.validators import ( and_, deep_iterable, deep_mapping, + ge, + gt, in_, instance_of, is_callable, + le, + lt, matches_re, + max_len, + min_len, optional, provides, ) @@ -41,7 +47,67 @@ def zope_interface(): return zope.interface -class TestInstanceOf(object): +class TestDisableValidators: + @pytest.fixture(autouse=True) + def reset_default(self): + """ + Make sure validators are always enabled after a test. + """ + yield + _config._run_validators = True + + def test_default(self): + """ + Run validators by default. + """ + assert _config._run_validators is True + + @pytest.mark.parametrize("value, expected", [(True, False), (False, True)]) + def test_set_validators_disabled(self, value, expected): + """ + Sets `_run_validators`. + """ + validator_module.set_disabled(value) + + assert _config._run_validators is expected + + @pytest.mark.parametrize("value, expected", [(True, False), (False, True)]) + def test_disabled(self, value, expected): + """ + Returns `_run_validators`. + """ + _config._run_validators = value + + assert validator_module.get_disabled() is expected + + def test_disabled_ctx(self): + """ + The `disabled` context manager disables running validators, + but only within its context. 
+ """ + assert _config._run_validators is True + + with validator_module.disabled(): + assert _config._run_validators is False + + assert _config._run_validators is True + + def test_disabled_ctx_with_errors(self): + """ + Running validators is re-enabled even if an error is raised. + """ + assert _config._run_validators is True + + with pytest.raises(ValueError): + with validator_module.disabled(): + assert _config._run_validators is False + + raise ValueError("haha!") + + assert _config._run_validators is True + + +class TestInstanceOf: """ Tests for `instance_of`. """ @@ -76,8 +142,7 @@ def test_fail(self): with pytest.raises(TypeError) as e: v(None, a, "42") assert ( - "'test' must be <{type} 'int'> (got '42' that is a <{type} " - "'str'>).".format(type=TYPE), + "'test' must be (got '42' that is a ).", a, int, "42", @@ -88,12 +153,10 @@ def test_repr(self): Returned validator has a useful `__repr__`. """ v = instance_of(int) - assert ( - ">".format(type=TYPE) - ) == repr(v) + assert (">") == repr(v) -class TestMatchesRe(object): +class TestMatchesRe: """ Tests for `matches_re`. """ @@ -110,10 +173,10 @@ def test_match(self): """ @attr.s - class ReTester(object): - str_match = attr.ib(validator=matches_re("a")) + class ReTester: + str_match = attr.ib(validator=matches_re("a|ab")) - ReTester("a") # shouldn't raise exceptions + ReTester("ab") # shouldn't raise exceptions with pytest.raises(TypeError): ReTester(1) with pytest.raises(ValueError): @@ -127,18 +190,41 @@ def test_flags(self): """ @attr.s - class MatchTester(object): + class MatchTester: val = attr.ib(validator=matches_re("a", re.IGNORECASE, re.match)) MatchTester("A1") # test flags and using re.match + def test_precompiled_pattern(self): + """ + Pre-compiled patterns are accepted. 
+ """ + pattern = re.compile("a") + + @attr.s + class RePatternTester: + val = attr.ib(validator=matches_re(pattern)) + + RePatternTester("a") + + def test_precompiled_pattern_no_flags(self): + """ + A pre-compiled pattern cannot be combined with a 'flags' argument. + """ + pattern = re.compile("") + + with pytest.raises( + TypeError, match="can only be used with a string pattern" + ): + matches_re(pattern, flags=re.IGNORECASE) + def test_different_func(self): """ Changing the match functions works. """ @attr.s - class SearchTester(object): + class SearchTester: val = attr.ib(validator=matches_re("a", 0, re.search)) SearchTester("bab") # re.search will match @@ -150,16 +236,10 @@ def test_catches_invalid_func(self): with pytest.raises(ValueError) as ei: matches_re("a", 0, lambda: None) - if not PY2: - assert ( - "'func' must be one of None, fullmatch, match, search." - == ei.value.args[0] - ) - else: - assert ( - "'func' must be one of None, match, search." - == ei.value.args[0] - ) + assert ( + "'func' must be one of None, fullmatch, match, search." + == ei.value.args[0] + ) @pytest.mark.parametrize( "func", [None, getattr(re, "fullmatch", None), re.match, re.search] @@ -192,7 +272,7 @@ def always_fail(_, __, ___): 0 / 0 -class TestAnd(object): +class TestAnd: def test_in_all(self): """ Verify that this validator is in ``__all__``. @@ -222,7 +302,7 @@ def test_sugar(self): """ @attr.s - class C(object): + class C: a1 = attr.ib("a1", validator=and_(instance_of(int))) a2 = attr.ib("a2", validator=[instance_of(int)]) @@ -246,7 +326,7 @@ def f(): return IFoo -class TestProvides(object): +class TestProvides: """ Tests for `provides`. 
""" @@ -263,7 +343,7 @@ def test_success(self, zope_interface, ifoo): """ @zope_interface.implementer(ifoo) - class C(object): + class C: def f(self): pass @@ -304,7 +384,7 @@ def test_repr(self, ifoo): @pytest.mark.parametrize( "validator", [instance_of(int), [always_pass, instance_of(int)]] ) -class TestOptional(object): +class TestOptional: """ Tests for `optional`. """ @@ -338,8 +418,7 @@ def test_fail(self, validator): with pytest.raises(TypeError) as e: v(None, a, "42") assert ( - "'test' must be <{type} 'int'> (got '42' that is a <{type} " - "'str'>).".format(type=TYPE), + "'test' must be (got '42' that is a ).", a, int, "42", @@ -354,18 +433,18 @@ def test_repr(self, validator): if isinstance(validator, list): repr_s = ( ">]) or None>" - ).format(func=repr(always_pass), type=TYPE) + ">]) or None>" + ).format(func=repr(always_pass)) else: repr_s = ( "> or None>" - ).format(type=TYPE) + "> or None>" + ) assert repr_s == repr(v) -class TestIn_(object): +class TestIn_: """ Tests for `in_`. """ @@ -390,9 +469,16 @@ def test_fail(self): """ v = in_([1, 2, 3]) a = simple_attr("test") + with pytest.raises(ValueError) as e: v(None, a, None) - assert ("'test' must be in [1, 2, 3] (got None)",) == e.value.args + + assert ( + "'test' must be in [1, 2, 3] (got None)", + a, + [1, 2, 3], + None, + ) == e.value.args def test_fail_with_string(self): """ @@ -403,17 +489,38 @@ def test_fail_with_string(self): a = simple_attr("test") with pytest.raises(ValueError) as e: v(None, a, None) - assert ("'test' must be in 'abc' (got None)",) == e.value.args + assert ( + "'test' must be in 'abc' (got None)", + a, + "abc", + None, + ) == e.value.args def test_repr(self): """ Returned validator has a useful `__repr__`. 
""" v = in_([3, 4, 5]) - assert (("")) == repr(v) + assert ("") == repr(v) -class TestDeepIterable(object): +@pytest.fixture( + name="member_validator", + params=( + instance_of(int), + [always_pass, instance_of(int)], + (always_pass, instance_of(int)), + ), + scope="module", +) +def _member_validator(request): + """ + Provides sample `member_validator`s for some tests in `TestDeepIterable` + """ + return request.param + + +class TestDeepIterable: """ Tests for `deep_iterable`. """ @@ -424,21 +531,19 @@ def test_in_all(self): """ assert deep_iterable.__name__ in validator_module.__all__ - def test_success_member_only(self): + def test_success_member_only(self, member_validator): """ If the member validator succeeds and the iterable validator is not set, nothing happens. """ - member_validator = instance_of(int) v = deep_iterable(member_validator) a = simple_attr("test") v(None, a, [42]) - def test_success_member_and_iterable(self): + def test_success_member_and_iterable(self, member_validator): """ If both the member and iterable validators succeed, nothing happens. """ - member_validator = instance_of(int) iterable_validator = instance_of(list) v = deep_iterable(member_validator, iterable_validator) a = simple_attr("test") @@ -451,6 +556,8 @@ def test_success_member_and_iterable(self): (42, instance_of(list)), (42, 42), (42, None), + ([instance_of(int), 42], 42), + ([42, instance_of(int)], 42), ), ) def test_noncallable_validators( @@ -471,17 +578,16 @@ def test_noncallable_validators( assert message in e.value.msg assert value == e.value.value - def test_fail_invalid_member(self): + def test_fail_invalid_member(self, member_validator): """ Raise member validator error if an invalid member is found. 
""" - member_validator = instance_of(int) v = deep_iterable(member_validator) a = simple_attr("test") with pytest.raises(TypeError): v(None, a, [42, "42"]) - def test_fail_invalid_iterable(self): + def test_fail_invalid_iterable(self, member_validator): """ Raise iterable validator error if an invalid iterable is found. """ @@ -492,12 +598,11 @@ def test_fail_invalid_iterable(self): with pytest.raises(TypeError): v(None, a, [42]) - def test_fail_invalid_member_and_iterable(self): + def test_fail_invalid_member_and_iterable(self, member_validator): """ Raise iterable validator error if both the iterable and a member are invalid. """ - member_validator = instance_of(int) iterable_validator = instance_of(tuple) v = deep_iterable(member_validator, iterable_validator) a = simple_attr("test") @@ -510,14 +615,29 @@ def test_repr_member_only(self): when only member validator is set. """ member_validator = instance_of(int) - member_repr = ">".format( - type=TYPE - ) + member_repr = ">" + v = deep_iterable(member_validator) + expected_repr = ( + "" + ).format(member_repr=member_repr) + assert expected_repr == repr(v) + + def test_repr_member_only_sequence(self): + """ + Returned validator has a useful `__repr__` + when only member validator is set and the member validator is a list of + validators + """ + member_validator = [always_pass, instance_of(int)] + member_repr = ( + "_AndValidator(_validators=({func}, " + ">))" + ).format(func=repr(always_pass)) v = deep_iterable(member_validator) expected_repr = ( "" ).format(member_repr=member_repr) - assert ((expected_repr)) == repr(v) + assert expected_repr == repr(v) def test_repr_member_and_iterable(self): """ @@ -525,13 +645,9 @@ def test_repr_member_and_iterable(self): and iterable validators are set. 
""" member_validator = instance_of(int) - member_repr = ">".format( - type=TYPE - ) + member_repr = ">" iterable_validator = instance_of(list) - iterable_repr = ( - ">" - ).format(type=TYPE) + iterable_repr = ">" v = deep_iterable(member_validator, iterable_validator) expected_repr = ( ">))" + ).format(func=repr(always_pass)) + iterable_validator = instance_of(list) + iterable_repr = ">" + v = deep_iterable(member_validator, iterable_validator) + expected_repr = ( + "" + ).format(iterable_repr=iterable_repr, member_repr=member_repr) + + assert expected_repr == repr(v) -class TestDeepMapping(object): + +class TestDeepMapping: """ Tests for `deep_mapping`. """ @@ -629,13 +766,9 @@ def test_repr(self): Returned validator has a useful `__repr__`. """ key_validator = instance_of(str) - key_repr = ">".format( - type=TYPE - ) + key_repr = ">" value_validator = instance_of(int) - value_repr = ">".format( - type=TYPE - ) + value_repr = ">" v = deep_mapping(key_validator, value_validator) expected_repr = ( "".format( + op=nv.compare_op, bound=23 + ) + + +class TestMaxLen: + """ + Tests for `max_len`. + """ + + MAX_LENGTH = 4 + + def test_in_all(self): + """ + validator is in ``__all__``. + """ + assert max_len.__name__ in validator_module.__all__ + + def test_retrieve_max_len(self): + """ + The configured max. length can be extracted from the Attribute + """ + + @attr.s + class Tester: + value = attr.ib(validator=max_len(self.MAX_LENGTH)) + + assert fields(Tester).value.validator.max_length == self.MAX_LENGTH + + @pytest.mark.parametrize( + "value", + [ + "", + "foo", + "spam", + [], + list(range(MAX_LENGTH)), + {"spam": 3, "eggs": 4}, + ], + ) + def test_check_valid(self, value): + """ + Silent if len(value) <= max_len. + Values can be strings and other iterables. 
+ """ + + @attr.s + class Tester: + value = attr.ib(validator=max_len(self.MAX_LENGTH)) + + Tester(value) # shouldn't raise exceptions + + @pytest.mark.parametrize( + "value", + [ + "bacon", + list(range(6)), + ], + ) + def test_check_invalid(self, value): + """ + Raise ValueError if len(value) > max_len. + """ + + @attr.s + class Tester: + value = attr.ib(validator=max_len(self.MAX_LENGTH)) + + with pytest.raises(ValueError): + Tester(value) + + def test_repr(self): + """ + __repr__ is meaningful. + """ + assert repr(max_len(23)) == "" + + +class TestMinLen: + """ + Tests for `min_len`. + """ + + MIN_LENGTH = 2 + + def test_in_all(self): + """ + validator is in ``__all__``. + """ + assert min_len.__name__ in validator_module.__all__ + + def test_retrieve_min_len(self): + """ + The configured min. length can be extracted from the Attribute + """ + + @attr.s + class Tester: + value = attr.ib(validator=min_len(self.MIN_LENGTH)) + + assert fields(Tester).value.validator.min_length == self.MIN_LENGTH + + @pytest.mark.parametrize( + "value", + [ + "foo", + "spam", + list(range(MIN_LENGTH)), + {"spam": 3, "eggs": 4}, + ], + ) + def test_check_valid(self, value): + """ + Silent if len(value) => min_len. + Values can be strings and other iterables. + """ + + @attr.s + class Tester: + value = attr.ib(validator=min_len(self.MIN_LENGTH)) + + Tester(value) # shouldn't raise exceptions + + @pytest.mark.parametrize( + "value", + [ + "", + list(range(1)), + ], + ) + def test_check_invalid(self, value): + """ + Raise ValueError if len(value) < min_len. + """ + + @attr.s + class Tester: + value = attr.ib(validator=min_len(self.MIN_LENGTH)) + + with pytest.raises(ValueError): + Tester(value) + + def test_repr(self): + """ + __repr__ is meaningful. 
+ """ + assert repr(min_len(23)) == "" diff --git a/tests/test_version_info.py b/tests/test_version_info.py index db4053f94..5bd101bcc 100644 --- a/tests/test_version_info.py +++ b/tests/test_version_info.py @@ -1,9 +1,9 @@ -from __future__ import absolute_import, division, print_function +# SPDX-License-Identifier: MIT + import pytest from attr import VersionInfo -from attr._compat import PY2 @pytest.fixture(name="vi") @@ -27,9 +27,6 @@ def test_suffix_is_preserved(self): == VersionInfo._from_version_string("19.2.0.dev0").releaselevel ) - @pytest.mark.skipif( - PY2, reason="Python 2 is too YOLO to care about comparability." - ) @pytest.mark.parametrize("other", [(), (19, 2, 0, "final", "garbage")]) def test_wrong_len(self, vi, other): """ diff --git a/tests/typing_example.py b/tests/typing_example.py index eb86c8f8c..1c6691f75 100644 --- a/tests/typing_example.py +++ b/tests/typing_example.py @@ -1,8 +1,11 @@ +# SPDX-License-Identifier: MIT + import re from typing import Any, Dict, List, Tuple, Union import attr +import attrs # Typing via "type" Argument --- @@ -59,6 +62,14 @@ class FF: z: Any = attr.ib() +@attrs.define +class FFF: + z: int + + +FFF(1) + + # Inheritance -- @@ -96,6 +107,19 @@ class Error(Exception): str(e) +@attrs.define +class Error2(Exception): + x: int + + +try: + raise Error2(1) +except Error as e: + e.x + e.args + str(e) + + # Converters # XXX: Currently converters can only be functions so none of this works # although the stubs should be correct. 
@@ -118,6 +142,20 @@ class Error(Exception): # ConvCDefaultIfNone(None) +# @attr.s +# class ConvCToBool: +# x: int = attr.ib(converter=attr.converters.to_bool) + + +# ConvCToBool(1) +# ConvCToBool(True) +# ConvCToBool("on") +# ConvCToBool("yes") +# ConvCToBool(0) +# ConvCToBool(False) +# ConvCToBool("n") + + # Validators @attr.s class Validated: @@ -153,7 +191,7 @@ class Validated: attr.validators.instance_of(C), attr.validators.instance_of(D) ), ) - e: str = attr.ib(validator=attr.validators.matches_re(r"foo")) + e: str = attr.ib(validator=attr.validators.matches_re(re.compile(r"foo"))) f: str = attr.ib( validator=attr.validators.matches_re(r"foo", flags=42, func=re.search) ) @@ -165,10 +203,33 @@ class Validated: validator=attr.validators.instance_of((int, str)) ) k: Union[int, str, C] = attr.ib( - validator=attr.validators.instance_of((int, C, str)) + validator=attrs.validators.instance_of((int, C, str)) ) +@attr.define +class Validated2: + num: int = attr.field(validator=attr.validators.ge(0)) + + +@attrs.define +class Validated3: + num: int = attr.field(validator=attr.validators.ge(0)) + + +with attr.validators.disabled(): + Validated2(num=-1) + +with attrs.validators.disabled(): + Validated3(num=-1) + +try: + attr.validators.set_disabled(True) + Validated2(num=-1) +finally: + attr.validators.set_disabled(False) + + # Custom repr() @attr.s class WithCustomRepr: @@ -178,6 +239,14 @@ class WithCustomRepr: d: bool = attr.ib(repr=str) +@attrs.define +class WithCustomRepr2: + a: int = attrs.field(repr=True) + b: str = attrs.field(repr=False) + c: str = attrs.field(repr=lambda value: "c is for cookie") + d: bool = attrs.field(repr=str) + + # Check some of our own types @attr.s(eq=True, order=False) class OrderFlags: @@ -199,24 +268,51 @@ class ValidatedSetter: ) +@attrs.define(on_setattr=attr.setters.validate) +class ValidatedSetter2: + a: int + b: str = attrs.field(on_setattr=attrs.setters.NO_OP) + c: bool = attrs.field(on_setattr=attrs.setters.frozen) + d: int = 
attrs.field( + on_setattr=[attrs.setters.convert, attrs.setters.validate] + ) + e: bool = attrs.field( + on_setattr=attrs.setters.pipe( + attrs.setters.convert, attrs.setters.validate + ) + ) + + # field_transformer def ft_hook(cls: type, attribs: List[attr.Attribute]) -> List[attr.Attribute]: return attribs +# field_transformer +def ft_hook2( + cls: type, attribs: List[attrs.Attribute] +) -> List[attrs.Attribute]: + return attribs + + @attr.s(field_transformer=ft_hook) class TransformedAttrs: x: int +@attrs.define(field_transformer=ft_hook2) +class TransformedAttrs2: + x: int + + # Auto-detect -# XXX: needs support in mypy -# @attr.s(auto_detect=True) -# class AutoDetect: -# x: int +@attr.s(auto_detect=True) +class AutoDetect: + x: int + + def __init__(self, x: int): + self.x = x -# def __init__(self, x: int): -# self.x = x # Provisional APIs @attr.define(order=True) @@ -224,8 +320,7 @@ class NGClass: x: int = attr.field(default=42) -# XXX: needs support in mypy -# ngc = NGClass(1) +ngc = NGClass(1) @attr.mutable(slots=False) @@ -233,8 +328,7 @@ class NGClass2: x: int -# XXX: needs support in mypy -# ngc2 = NGClass2(1) +ngc2 = NGClass2(1) @attr.frozen(str=True) @@ -242,10 +336,83 @@ class NGFrozen: x: int -# XXX: needs support in mypy -# ngf = NGFrozen(1) +ngf = NGFrozen(1) + +attr.fields(NGFrozen).x.evolve(eq=False) +a = attr.fields(NGFrozen).x +a.evolve(repr=False) + + +attrs.fields(NGFrozen).x.evolve(eq=False) +a = attrs.fields(NGFrozen).x +a.evolve(repr=False) @attr.s(collect_by_mro=True) class MRO: pass + + +@attr.s +class FactoryTest: + a: List[int] = attr.ib(default=attr.Factory(list)) + b: List[Any] = attr.ib(default=attr.Factory(list, False)) + c: List[int] = attr.ib(default=attr.Factory((lambda s: s.a), True)) + + +@attrs.define +class FactoryTest2: + a: List[int] = attrs.field(default=attrs.Factory(list)) + b: List[Any] = attrs.field(default=attrs.Factory(list, False)) + c: List[int] = attrs.field(default=attrs.Factory((lambda s: s.a), True)) + + 
+attrs.asdict(FactoryTest2()) +attr.asdict(FactoryTest(), tuple_keys=True) + + +# Check match_args stub +@attr.s(match_args=False) +class MatchArgs: + a: int = attr.ib() + b: int = attr.ib() + + +attr.asdict(FactoryTest()) +attr.asdict(FactoryTest(), retain_collection_types=False) + + +# Check match_args stub +@attrs.define(match_args=False) +class MatchArgs2: + a: int + b: int + + +# NG versions of asdict/astuple +attrs.asdict(MatchArgs2(1, 2)) +attrs.astuple(MatchArgs2(1, 2)) + + +def accessing_from_attr() -> None: + """ + Use a function to keep the ns clean. + """ + attr.converters.optional + attr.exceptions.FrozenError + attr.filters.include + attr.setters.frozen + attr.validators.and_ + attr.cmp_using + + +def accessing_from_attrs() -> None: + """ + Use a function to keep the ns clean. + """ + attrs.converters.optional + attrs.exceptions.FrozenError + attrs.filters.include + attrs.setters.frozen + attrs.validators.and_ + attrs.cmp_using diff --git a/tests/utils.py b/tests/utils.py index ad3fb578a..3d10621da 100644 --- a/tests/utils.py +++ b/tests/utils.py @@ -1,8 +1,9 @@ +# SPDX-License-Identifier: MIT + """ Common helper functions for tests. """ -from __future__ import absolute_import, division, print_function from attr import Attribute from attr._make import NOTHING, make_class @@ -66,7 +67,7 @@ def simple_attr( ) -class TestSimpleClass(object): +class TestSimpleClass: """ Tests for the testing helper function `make_class`. """ diff --git a/tox.ini b/tox.ini index 0508fdaa8..f93fa449a 100644 --- a/tox.ini +++ b/tox.ini @@ -10,67 +10,69 @@ filterwarnings = # Keep docs in sync with docs env and .readthedocs.yml. 
[gh-actions] python = - 2.7: py27 3.5: py35 3.6: py36 - 3.7: py37, docs - 3.8: py38, lint, manifest, typing, changelog - 3.9: py39 - pypy2: pypy2 - pypy3: pypy3 + 3.7: py37 + 3.8: py38, changelog + 3.9: py39, pyright + 3.10: py310, manifest, typing, docs + 3.11: py311 + pypy-3: pypy3 [tox] -envlist = typing,lint,py27,py35,py36,py37,py38,py39,pypy,pypy3,manifest,docs,pypi-description,changelog,coverage-report +envlist = typing,pre-commit,py35,py36,py37,py38,py39,py310,py311,pypy3,pyright,manifest,docs,pypi-description,changelog,coverage-report isolated_build = True +[testenv:docs] +# Keep basepython in sync with gh-actions and .readthedocs.yml. +basepython = python3.10 +extras = docs +commands = + sphinx-build -n -T -W -b html -d {envtmpdir}/doctrees docs docs/_build/html + sphinx-build -n -T -W -b doctest -d {envtmpdir}/doctrees docs docs/_build/html + python -m doctest README.rst + + [testenv] -# Prevent random setuptools/pip breakages like -# https://github.com/pypa/setuptools/issues/1042 from breaking our builds. -setenv = - VIRTUALENV_NO_DOWNLOAD=1 -extras = {env:TOX_AP_TEST_EXTRAS:tests} +extras = tests commands = python -m pytest {posargs} -[testenv:py27] -extras = {env:TOX_AP_TEST_EXTRAS:tests} +[testenv:py35] +extras = tests commands = coverage run -m pytest {posargs} [testenv:py37] -# Python 3.6+ has a number of compile-time warnings on invalid string escapes. -# PYTHONWARNINGS=d and --no-compile below make them visible during the Tox run. -install_command = pip install --no-compile {opts} {packages} -setenv = - PYTHONWARNINGS=d -extras = {env:TOX_AP_TEST_EXTRAS:tests} +extras = tests commands = coverage run -m pytest {posargs} -[testenv:py38] +[testenv:py310] # Python 3.6+ has a number of compile-time warnings on invalid string escapes. # PYTHONWARNINGS=d and --no-compile below make them visible during the Tox run. 
-basepython = python3.8 +basepython = python3.10 install_command = pip install --no-compile {opts} {packages} setenv = PYTHONWARNINGS=d -extras = {env:TOX_AP_TEST_EXTRAS:tests} +extras = tests commands = coverage run -m pytest {posargs} [testenv:coverage-report] -basepython = python3.7 +basepython = python3.10 +depends = py35,py37,py310 skip_install = true -deps = coverage[toml]>=5.0.2 +deps = coverage[toml]>=5.4 commands = coverage combine coverage report -[testenv:lint] -basepython = python3.8 +[testenv:pre-commit] +basepython = python3.10 skip_install = true deps = pre-commit @@ -79,18 +81,8 @@ commands = pre-commit run --all-files -[testenv:docs] -# Keep basepython in sync with gh-actions and .readthedocs.yml. -basepython = python3.7 -extras = docs -commands = - sphinx-build -n -T -W -b html -d {envtmpdir}/doctrees docs docs/_build/html - sphinx-build -n -T -W -b doctest -d {envtmpdir}/doctrees docs docs/_build/html - python -m doctest README.rst - - [testenv:manifest] -basepython = python3.8 +basepython = python3.10 deps = check-manifest skip_install = true commands = check-manifest @@ -115,8 +107,22 @@ commands = towncrier --draft [testenv:typing] -basepython = python3.8 -deps = mypy +basepython = python3.10 +deps = mypy>=0.902 commands = - mypy src/attr/__init__.pyi src/attr/_version_info.pyi src/attr/converters.pyi src/attr/exceptions.pyi src/attr/filters.pyi src/attr/setters.pyi src/attr/validators.pyi + mypy src/attrs/__init__.pyi src/attr/__init__.pyi src/attr/_version_info.pyi src/attr/converters.pyi src/attr/exceptions.pyi src/attr/filters.pyi src/attr/setters.pyi src/attr/validators.pyi mypy tests/typing_example.py + + +[testenv:pyright] +# Install and configure node and pyright +# This *could* be folded into a custom install_command +# Use nodeenv to configure node in the running tox virtual environment +# Seeing errors using "nodeenv -p" +# Use npm install -g to install "globally" into the virtual environment +basepython = python3.9 +deps = nodeenv 
+commands = + nodeenv --prebuilt --node=lts --force {envdir} + npm install -g --no-package-lock --no-save pyright + pytest tests/test_pyright.py -vv
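The new `ge`/`le`/`min_len`/`max_len` tests above all exercise the same factory pattern: a function takes a bound, returns a validator, and the validator raises `ValueError` with a descriptive message when the constraint is violated. A stdlib-only sketch of that pattern (the helper names below are illustrative, not the attrs implementation — attrs' real validators live in `attr.validators` and are called with `(instance, attribute, value)`):

```python
# Sketch of the bound/length validator-factory pattern used by attrs'
# ge/le and min_len/max_len validators. Illustrative only, not the attrs API.
import operator


def make_bound_validator(bound, compare_op, compare_func):
    """Build a validator enforcing `compare_func(value, bound)`, like ge/le."""

    def validate(value):
        if not compare_func(value, bound):
            raise ValueError(
                "value must be {} {}: got {!r}".format(compare_op, bound, value)
            )
        return value

    return validate


def make_max_len_validator(length):
    """Build a validator raising ValueError when len(value) > length."""

    def validate(value):
        if len(value) > length:
            raise ValueError(
                "length of value must be <= {}: got {!r}".format(length, value)
            )
        return value

    return validate


# Mirrors the shapes tested above: ge(0) and max_len(4).
ge_zero = make_bound_validator(0, ">=", operator.ge)
at_most_4 = make_max_len_validator(4)

ge_zero(42)        # passes
at_most_4("spam")  # passes; "bacon" (len 5) would raise ValueError
```

The factory closure is what lets the tests retrieve the configured bound afterwards (`fields(Tester).value.validator.max_length`); attrs stores it as an attribute on the validator object rather than in a closure, which is why that introspection works there.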