
symfony/console 5.4.12 make db Too many connection errors #47566


Closed
Kleinast opened this issue Sep 13, 2022 · 15 comments

@Kleinast

Symfony version(s) affected

5.4.12

Description

I upgraded symfony/console to the latest version (5.4.12) and many of my functional tests no longer pass because of errors like:

1) App\Tests\Infra\Review\ReviewBoosterSenderTest::****Send with data set "sms" (1, 2)
Doctrine\DBAL\Exception\DriverException: An exception occurred in the driver: SQLSTATE[08004] [1040] Too many connections

/root/project/vendor/doctrine/dbal/src/Driver/API/MySQL/ExceptionConverter.php:117
/root/project/vendor/doctrine/dbal/src/Connection.php:1818
/root/project/vendor/doctrine/dbal/src/Connection.php:1767
/root/project/vendor/doctrine/dbal/src/Connection.php:328
/root/project/vendor/doctrine/dbal/src/Connection.php:1529
/root/project/vendor/doctrine/dbal/src/Connection.php:1010
/root/project/vendor/doctrine/orm/lib/Doctrine/ORM/Persisters/Entity/BasicEntityPersister.php:750
/root/project/vendor/doctrine/orm/lib/Doctrine/ORM/Persisters/Entity/BasicEntityPersister.php:768
/root/project/vendor/doctrine/orm/lib/Doctrine/ORM/EntityManager.php:521
/root/project/var/cache/****/ContainerRdlthlA/EntityManager_9a5be93.php:95
/root/project/vendor/doctrine/orm/lib/Doctrine/ORM/EntityRepository.php:199
/root/project/src/Infra/Shop/Repository/ShopRepository.php:42
/root/project/src/Infra/Review/ReviewBoosterSender.php:47
/root/project/****s/Infra/Review/ReviewBoosterSenderTest.php:27
/root/project/vendor/phpunit/phpunit/src/Framework/TestResult.php:726
/root/project/vendor/phpunit/phpunit/src/Framework/TestSuite.php:672
/root/project/vendor/phpunit/phpunit/src/Framework/TestSuite.php:672
/root/project/vendor/phpunit/phpunit/src/Framework/TestSuite.php:672
/root/project/vendor/phpunit/phpunit/src/Framework/TestSuite.php:672
/root/project/vendor/phpunit/phpunit/src/TextUI/TestRunner.php:673

Caused by
Doctrine\DBAL\Driver\PDO\Exception: SQLSTATE[08004] [1040] Too many connections

/root/project/vendor/doctrine/dbal/src/Driver/PDO/Exception.php:28
/root/project/vendor/doctrine/dbal/src/Driver/PDO/MySQL/Driver.php:34
/root/project/vendor/doctrine/dbal/src/Connection.php:326
/root/project/vendor/doctrine/dbal/src/Connection.php:1529
/root/project/vendor/doctrine/dbal/src/Connection.php:1010
/root/project/vendor/doctrine/orm/lib/Doctrine/ORM/Persisters/Entity/BasicEntityPersister.php:750
/root/project/vendor/doctrine/orm/lib/Doctrine/ORM/Persisters/Entity/BasicEntityPersister.php:768
/root/project/vendor/doctrine/orm/lib/Doctrine/ORM/EntityManager.php:521
/root/project/var/cache/****/ContainerRdlthlA/EntityManager_9a5be93.php:95
/root/project/vendor/doctrine/orm/lib/Doctrine/ORM/EntityRepository.php:199
/root/project/src/Infra/Shop/Repository/ShopRepository.php:42
/root/project/src/Infra/Review/ReviewBoosterSender.php:47
/root/project/****s/Infra/Review/ReviewBoosterSenderTest.php:27
/root/project/vendor/phpunit/phpunit/src/Framework/TestResult.php:726
/root/project/vendor/phpunit/phpunit/src/Framework/TestSuite.php:672
/root/project/vendor/phpunit/phpunit/src/Framework/TestSuite.php:672
/root/project/vendor/phpunit/phpunit/src/Framework/TestSuite.php:672
/root/project/vendor/phpunit/phpunit/src/Framework/TestSuite.php:672
/root/project/vendor/phpunit/phpunit/src/TextUI/TestRunner.php:673

Caused by
PDOException: SQLSTATE[08004] [1040] Too many connections

/root/project/vendor/doctrine/dbal/src/Driver/PDO/MySQL/Driver.php:28
/root/project/vendor/doctrine/dbal/src/Connection.php:326
/root/project/vendor/doctrine/dbal/src/Connection.php:1529
/root/project/vendor/doctrine/dbal/src/Connection.php:1010
/root/project/vendor/doctrine/orm/lib/Doctrine/ORM/Persisters/Entity/BasicEntityPersister.php:750
/root/project/vendor/doctrine/orm/lib/Doctrine/ORM/Persisters/Entity/BasicEntityPersister.php:768
/root/project/vendor/doctrine/orm/lib/Doctrine/ORM/EntityManager.php:521
/root/project/var/cache/****/ContainerRdlthlA/EntityManager_9a5be93.php:95
/root/project/vendor/doctrine/orm/lib/Doctrine/ORM/EntityRepository.php:199
/root/project/src/Infra/Shop/Repository/ShopRepository.php:42
/root/project/src/Infra/Review/ReviewBoosterSender.php:47
/root/project/****s/Infra/Review/ReviewBoosterSenderTest.php:27
/root/project/vendor/phpunit/phpunit/src/Framework/TestResult.php:726
/root/project/vendor/phpunit/phpunit/src/Framework/TestSuite.php:672
/root/project/vendor/phpunit/phpunit/src/Framework/TestSuite.php:672
/root/project/vendor/phpunit/phpunit/src/Framework/TestSuite.php:672
/root/project/vendor/phpunit/phpunit/src/Framework/TestSuite.php:672
/root/project/vendor/phpunit/phpunit/src/TextUI/TestRunner.php:673

Really weird problem, but I don't have it with symfony/console 5.4.11.
Is it possible there is a regression in symfony/console@v5.4.11...v5.4.12?

How to reproduce

Run a lot of functional tests on a project with symfony/console 5.4.12 installed.

Possible Solution

No response

Additional Context

It does not happen when only one test is run.

@xabbuh
Member

xabbuh commented Sep 13, 2022

Can you create a small example application that allows us to reproduce your issue?

@Kleinast
Author

I think that is going to be hard to create; even in my project it only happens in a specific context.
I don't have the errors when I run the tests on my local machine, which runs them slowly.
I do have the errors when the tests run on CircleCI (local and CircleCI use the same MariaDB Docker container), and only when CircleCI runs more than one test.
I dug a little on CircleCI in SSH mode. I don't know if it's a good lead, but it seems the MySQL processes stack up until they reach the max connections with symfony/console 5.4.12:

MariaDB [(none)]> SHOW FULL PROCESSLIST;
+------+-------------+------------------+--------+---------+------+--------------------------+-----------------------+----------+
| Id   | User        | Host             | db     | Command | Time | State                    | Info                  | Progress |
+------+-------------+------------------+--------+---------+------+--------------------------+-----------------------+----------+
| 1    | system user |                  | NULL   | Daemon  | NULL | InnoDB purge coordinator | NULL                  | 0.000    |
| 2    | system user |                  | NULL   | Daemon  | NULL | InnoDB purge worker      | NULL                  | 0.000    |
| 3    | system user |                  | NULL   | Daemon  | NULL | InnoDB purge worker      | NULL                  | 0.000    |
| 4    | system user |                  | NULL   | Daemon  | NULL | InnoDB purge worker      | NULL                  | 0.000    |
| 5    | system user |                  | NULL   | Daemon  | NULL | InnoDB shutdown handler  | NULL                  | 0.000    |
| 912  | root        | 172.**.*.3:59738 | legacy | Sleep   | 147  |                          | NULL                  | 0.000    |
| 915  | root        | 172.**.*.3:59744 | db     | Sleep   | 146  |                          | NULL                  | 0.000    |
| 943  | root        | 172.**.*.3:59800 | legacy | Sleep   | 138  |                          | NULL                  | 0.000    |
| 946  | root        | 172.**.*.3:59806 | db     | Sleep   | 138  |                          | NULL                  | 0.000    |
| 951  | root        | 172.**.*.3:59818 | legacy | Sleep   | 135  |                          | NULL                  | 0.000    |
| 954  | root        | 172.**.*.3:59824 | db     | Sleep   | 134  |                          | NULL                  | 0.000    |
| 959  | root        | 172.**.*.3:59836 | legacy | Sleep   | 132  |                          | NULL                  | 0.000    |
| 962  | root        | 172.**.*.3:59842 | db     | Sleep   | 131  |                          | NULL                  | 0.000    |
| 965  | root        | 172.**.*.3:59848 | legacy | Sleep   | 127  |                          | NULL                  | 0.000    |
| 968  | root        | 172.**.*.3:59854 | db     | Sleep   | 127  |                          | NULL                  | 0.000    |
| 971  | root        | 172.**.*.3:59860 | legacy | Sleep   | 124  |                          | NULL                  | 0.000    |
| 974  | root        | 172.**.*.3:59866 | db     | Sleep   | 124  |                          | NULL                  | 0.000    |
| 977  | root        | 172.**.*.3:59872 | legacy | Sleep   | 121  |                          | NULL                  | 0.000    |
| 980  | root        | 172.**.*.3:59878 | db     | Sleep   | 121  |                          | NULL                  | 0.000    |
| 988  | root        | 172.**.*.3:59896 | legacy | Sleep   | 109  |                          | NULL                  | 0.000    |
| 991  | root        | 172.**.*.3:59906 | db     | Sleep   | 109  |                          | NULL                  | 0.000    |
| 995  | root        | 172.**.*.3:59914 | legacy | Sleep   | 105  |                          | NULL                  | 0.000    |
| 998  | root        | 172.**.*.3:59922 | db     | Sleep   | 105  |                          | NULL                  | 0.000    |
| 1002 | root        | 172.**.*.3:59932 | legacy | Sleep   | 101  |                          | NULL                  | 0.000    |
| 1005 | root        | 172.**.*.3:59938 | db     | Sleep   | 101  |                          | NULL                  | 0.000    |
| 1009 | root        | 172.**.*.3:59946 | legacy | Sleep   | 97   |                          | NULL                  | 0.000    |
| 1012 | root        | 172.**.*.3:59952 | db     | Sleep   | 97   |                          | NULL                  | 0.000    |
| 1016 | root        | 172.**.*.3:59964 | legacy | Sleep   | 94   |                          | NULL                  | 0.000    |
| 1019 | root        | 172.**.*.3:59970 | db     | Sleep   | 93   |                          | NULL                  | 0.000    |
| 1023 | root        | 172.**.*.3:59978 | legacy | Sleep   | 90   |                          | NULL                  | 0.000    |
| 1026 | root        | 172.**.*.3:59986 | db     | Sleep   | 90   |                          | NULL                  | 0.000    |
| 1030 | root        | 172.**.*.3:59996 | legacy | Sleep   | 86   |                          | NULL                  | 0.000    |
| 1033 | root        | 172.**.*.3:60002 | db     | Sleep   | 86   |                          | NULL                  | 0.000    |
| 1036 | root        | 172.**.*.3:60008 | legacy | Sleep   | 83   |                          | NULL                  | 0.000    |
| 1039 | root        | 172.**.*.3:60018 | db     | Sleep   | 82   |                          | NULL                  | 0.000    |
| 1042 | root        | 172.**.*.3:60024 | legacy | Sleep   | 79   |                          | NULL                  | 0.000    |
| 1045 | root        | 172.**.*.3:60030 | db     | Sleep   | 79   |                          | NULL                  | 0.000    |
| 1053 | root        | 172.**.*.3:60046 | legacy | Sleep   | 74   |                          | NULL                  | 0.000    |
| 1056 | root        | 172.**.*.3:60052 | db     | Sleep   | 73   |                          | NULL                  | 0.000    |
| 1060 | root        | 172.**.*.3:60062 | legacy | Sleep   | 70   |                          | NULL                  | 0.000    |
| 1063 | root        | 172.**.*.3:60070 | db     | Sleep   | 69   |                          | NULL                  | 0.000    |
| 1068 | root        | 172.**.*.3:60080 | legacy | Sleep   | 66   |                          | NULL                  | 0.000    |
| 1071 | root        | 172.**.*.3:60086 | db     | Sleep   | 66   |                          | NULL                  | 0.000    |
| 1081 | root        | 172.**.*.3:60106 | legacy | Sleep   | 62   |                          | NULL                  | 0.000    |
| 1084 | root        | 172.**.*.3:60114 | db     | Sleep   | 61   |                          | NULL                  | 0.000    |
| 1085 | root        | 172.**.*.3:60116 | NULL   | Query   | 0    | init                     | SHOW FULL PROCESSLIST | 0.000    |
| 1088 | root        | 172.**.*.3:60124 | legacy | Sleep   | 58   |                          | NULL                  | 0.000    |
| 1091 | root        | 172.**.*.3:60130 | db     | Sleep   | 58   |                          | NULL                  | 0.000    |
| 1094 | root        | 172.**.*.3:60136 | legacy | Sleep   | 55   |                          | NULL                  | 0.000    |
| 1097 | root        | 172.**.*.3:60142 | db     | Sleep   | 55   |                          | NULL                  | 0.000    |
| 1102 | root        | 172.**.*.3:60152 | legacy | Sleep   | 52   |                          | NULL                  | 0.000    |
| 1105 | root        | 172.**.*.3:60160 | db     | Sleep   | 51   |                          | NULL                  | 0.000    |
| 1108 | root        | 172.**.*.3:60168 | legacy | Sleep   | 48   |                          | NULL                  | 0.000    |
| 1111 | root        | 172.**.*.3:60174 | db     | Sleep   | 48   |                          | NULL                  | 0.000    |
| 1114 | root        | 172.**.*.3:60180 | legacy | Sleep   | 45   |                          | NULL                  | 0.000    |
| 1117 | root        | 172.**.*.3:60186 | db     | Sleep   | 44   |                          | NULL                  | 0.000    |
| 1125 | root        | 172.**.*.3:60204 | legacy | Sleep   | 39   |                          | NULL                  | 0.000    |
| 1128 | root        | 172.**.*.3:60210 | db     | Sleep   | 39   |                          | NULL                  | 0.000    |
| 1133 | root        | 172.**.*.3:60220 | legacy | Sleep   | 35   |                          | NULL                  | 0.000    |
| 1136 | root        | 172.**.*.3:60226 | db     | Sleep   | 35   |                          | NULL                  | 0.000    |
| 1139 | root        | 172.**.*.3:60232 | legacy | Sleep   | 31   |                          | NULL                  | 0.000    |
| 1142 | root        | 172.**.*.3:60238 | db     | Sleep   | 31   |                          | NULL                  | 0.000    |
| 1145 | root        | 172.**.*.3:60244 | legacy | Sleep   | 27   |                          | NULL                  | 0.000    |
| 1148 | root        | 172.**.*.3:60252 | db     | Sleep   | 27   |                          | NULL                  | 0.000    |
| 1153 | root        | 172.**.*.3:60262 | legacy | Sleep   | 23   |                          | NULL                  | 0.000    |
| 1156 | root        | 172.**.*.3:60268 | db     | Sleep   | 23   |                          | NULL                  | 0.000    |
| 1159 | root        | 172.**.*.3:60274 | legacy | Sleep   | 17   |                          | NULL                  | 0.000    |
| 1162 | root        | 172.**.*.3:60280 | db     | Sleep   | 17   |                          | NULL                  | 0.000    |
| 1165 | root        | 172.**.*.3:60288 | legacy | Sleep   | 14   |                          | NULL                  | 0.000    |
| 1168 | root        | 172.**.*.3:60298 | db     | Sleep   | 14   |                          | NULL                  | 0.000    |
| 1171 | root        | 172.**.*.3:60304 | legacy | Sleep   | 9    |                          | NULL                  | 0.000    |
| 1174 | root        | 172.**.*.3:60310 | db     | Sleep   | 9    |                          | NULL                  | 0.000    |
| 1177 | root        | 172.**.*.3:60316 | legacy | Sleep   | 5    |                          | NULL                  | 0.000    |
| 1180 | root        | 172.**.*.3:60322 | db     | Sleep   | 5    |                          | NULL                  | 0.000    |
| 1183 | root        | 172.**.*.3:60328 | legacy | Sleep   | 1    |                          | NULL                  | 0.000    |
| 1186 | root        | 172.**.*.3:60334 | db     | Sleep   | 1    |                          | NULL                  | 0.000    |
+------+-------------+------------------+--------+---------+------+--------------------------+-----------------------+----------+
76 rows in set (0.000 sec)

But with symfony/console 5.4.11, it looks more like this:
MariaDB [(none)]> SHOW FULL PROCESSLIST;
+------+-------------+------------------+--------+---------+------+--------------------------+-----------------------+----------+
| Id   | User        | Host             | db     | Command | Time | State                    | Info                  | Progress |
+------+-------------+------------------+--------+---------+------+--------------------------+-----------------------+----------+
| 1    | system user |                  | NULL   | Daemon  | NULL | InnoDB purge coordinator | NULL                  | 0.000    |
| 2    | system user |                  | NULL   | Daemon  | NULL | InnoDB purge worker      | NULL                  | 0.000    |
| 3    | system user |                  | NULL   | Daemon  | NULL | InnoDB purge worker      | NULL                  | 0.000    |
| 4    | system user |                  | NULL   | Daemon  | NULL | InnoDB purge worker      | NULL                  | 0.000    |
| 5    | system user |                  | NULL   | Daemon  | NULL | InnoDB shutdown handler  | NULL                  | 0.000    |
| 1085 | root        | 172.**.*.3:60116 | NULL   | Query   | 0    | init                     | SHOW FULL PROCESSLIST | 0.000    |
| 1726 | root        | 172.**.*.3:33572 | legacy | Sleep   | 3    |                          | NULL                  | 0.000    |
| 1729 | root        | 172.**.*.3:33578 | db     | Sleep   | 3    |                          | NULL                  | 0.000    |
+------+-------------+------------------+--------+---------+------+--------------------------+-----------------------+----------+
8 rows in set (0.000 sec)

The max_connections setting for the database is 100, so that would mean it is being reached.

I think it happens during our tests because some of them reset the database and fixtures by running a console command. But I think it could also happen in production on a project that runs Symfony console commands often.

@xabbuh
Member

xabbuh commented Sep 14, 2022

I am afraid that without being able to debug this ourselves we won't be able to help here; you would need to try to find the root cause yourself (and maybe submit a PR based on your findings).

@Kleinast
Author

I tried to dig further into my problem. I'm not very comfortable with the deep internals of the component, but here are my deductions.

The MySQL process "leak" seems to come from the PHP function pcntl_signal(): https://github.com/symfony/console/blob/c072aa8f724c3af64e2c7a96b796a4863d24dba1/SignalRegistry/SignalRegistry.php#L37

Before 5.4.12, $this->signalRegistry->register() was apparently never called because of the condition if ($command instanceof SignalableCommandInterface).
I assume that was unintended and was the purpose of the fix for bug #47218.
Now that signalRegistry->register() is called, it seems that pcntl_signal() is what causes my process leak.
Indeed, I don't have the problem when I comment out the line https://github.com/symfony/console/blob/c072aa8f724c3af64e2c7a96b796a4863d24dba1/SignalRegistry/SignalRegistry.php#L37 in the component.

I don't fully understand the behavior of pcntl_signal(), but my current lead is that this function misbehaves when multiple Application->run() calls happen in close succession.
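
To make that hypothesis concrete, here is a minimal, hypothetical sketch (it requires ext-pcntl and is not Symfony's actual code; LeakyRegistry and FakeConnection are made up for illustration) of how a registry that appends signal handlers can keep every captured object alive for the lifetime of the process:

    <?php
    // Hypothetical illustration of the suspected mechanism: a registry that
    // *appends* handlers keeps a reference to every closure it ever received,
    // so anything those closures capture (e.g. a DB connection wrapper) is
    // never released while the PHP process lives.

    pcntl_async_signals(true);

    final class LeakyRegistry
    {
        /** @var array<int, callable[]> */
        private array $handlers = [];

        public function register(int $signal, callable $handler): void
        {
            $this->handlers[$signal][] = $handler; // reference kept until process exit

            pcntl_signal($signal, function (int $sig): void {
                foreach ($this->handlers[$sig] ?? [] as $registered) {
                    $registered($sig);
                }
            });
        }
    }

    final class FakeConnection
    {
        public function __destruct() { echo "connection closed\n"; } // only runs at shutdown
    }

    $registry = new LeakyRegistry();

    for ($run = 1; $run <= 3; $run++) {
        $connection = new FakeConnection(); // imagine: a PDO connection to MariaDB
        $registry->register(SIGTERM, static function () use ($connection): void {
            // cleanup on SIGTERM would go here
        });
        // $connection goes out of scope here, but the closure stored in the
        // registry still references it, so __destruct() does not run and the
        // underlying server-side connection would stay open.
    }

    echo "end of runs\n"; // "connection closed" is printed 3 times only after this line

Whether this is exactly what happens inside the component is not proven here; the sketch only shows why handlers registered on every run are a plausible way for idle connections to pile up.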

In my context, our functional tests need to reset the database fixtures after data inserts/updates/deletes, so we have something like:

    // Reset every test database: drop, recreate, migrate, then load fixtures.
    // Each entry maps a connection name to its migration config and fixture path.
    foreach ($dbs as $connection => $data) {
        $input = new ArrayInput([
            'command' => 'doctrine:database:drop',
            '--force' => true,
            '--if-exists' => true,
            '--connection' => $connection,
            '-q' => true,
        ]);
        $application->run($input, $output);

        $input = new ArrayInput([
            'command' => 'doctrine:database:create',
            '--connection' => $connection,
            '-q' => true,
        ]);
        $application->run($input, $output);

        $input = new ArrayInput(array_merge(
            [
                'command' => 'doctrine:migrations:migrate',
                '--allow-no-migration' => true,
                '-n' => true,
                '-q' => true,
            ],
            !empty($data['config']) ? ['--configuration' => $data['config']] : [],
        ));
        $application->run($input, $output);

        $input = new ArrayInput([
            'command' => 'dbal:fixture:load',
            'db' => $connection,
            'path' => $data['fixtures'],
            '-q' => true,
        ]);
        $application->run($input, $output);
    }

We have 3 different databases, and it seems the MySQL process is properly terminated only for the last database we run the commands on; the two others stack up a new process each time a test reloads the fixtures.

I'm aware our use case may be very specific, but I think it can be a problem for other cases too.
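
As a possible workaround (a sketch only, assuming the test bootstrap has access to a $container variable exposing Doctrine's usual "doctrine" ManagerRegistry service), the connections could be closed explicitly after each reset cycle so dropped databases do not leave sleeping server-side processes behind:

    <?php
    // Workaround sketch, not a confirmed fix: explicitly close every Doctrine
    // connection after the reset commands have run. Assumes $container is the
    // (test) service container and the "doctrine" ManagerRegistry is available.

    use Doctrine\Persistence\ManagerRegistry;

    /** @var ManagerRegistry $doctrine */
    $doctrine = $container->get('doctrine');

    foreach ($doctrine->getConnections() as $name => $connection) {
        // Doctrine\DBAL\Connection::close() releases the underlying driver
        // connection; the next query on it transparently reconnects.
        $connection->close();
    }

    foreach ($doctrine->getManagers() as $manager) {
        // Detach managed entities that were loaded through the closed connections.
        $manager->clear();
    }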

@kfrohwein

kfrohwein commented Sep 26, 2022

We are using Spryker and do a nightly import of around 250,000 items. v5.4.12 seems to have the same issue, so we downgraded to v5.4.10.

We don't get any errors. Our staging system now runs 10 minutes slower, but the production system has about 3 times more load and fails in the end.

@carsonbot

Hey, thanks for your report!
There has not been a lot of activity here for a while. Is this bug still relevant? Have you managed to find a workaround?

@benmeynell

There has not been a lot of activity here for a while. Is this bug still relevant? Have you managed to find a workaround?

@Kleinast ping?

@carsonbot carsonbot removed the Stalled label Jun 26, 2023
@Kleinast
Author

In the end we refactored this part of our code to call the db reset commands drastically less often, so we don't have this error anymore; that doesn't mean the problem isn't potentially still there, though. I can't investigate it further right now, so I think you can close this issue.
Thanks for your time.

@kfrohwein

Just to note that we didn't find a solution for Spryker and pinned the package to 5.4.11. We hope it goes away once Spryker updates to Symfony 6.

@benmeynell

benmeynell commented Jun 29, 2023

@kfrohwein To be clear from your comments: both v5.4.10 and v5.4.11 of the Console component do not exhibit the defect, yet you DO definitively and repeatedly experience the defect in v5.4.12 -- full stop? Second question: have you tried any patches beyond v5.4.12 (specifically, v5.4.[13-24])? Let us know! Super useful!

@Kleinast You experienced the defect at patch v5.4.12? Specifically and unequivocally that patch? If yes, that corroborates @kfrohwein's report to a tee. I know a project that is running v5.4.24 and is experiencing the "Too many connections" error. So there are potentially three affected people/orgs here who took the time to document it... one can only imagine how many more are out there struggling with this seemingly phantom edge case!

Genesis:

Breaking Change (Per Anecdotal Reports / Not Confirmed):

Authors are @xabbuh and @GwendolenLynch. Change was committed by @nicolas-grekas. Any keen insight(s), ponderings, brain dumps, from anyone in that trio? 🥺

@kfrohwein @Kleinast It was emphasized to you by @xabbuh above, and I'll re-emphasize what I believe the sentiment of that message to be:

This issue will not be resolved as-is. We need more contextual information -- even just a little -- to bring us closer to solving this! Anything is better than nothing and is super appreciated! If you don't have the time to create a failing test case, as @xabbuh requested above, could you maybe add some "low-hanging fruit" (high-value, low-effort info) to this issue? Examples:

  1. Stacktrace(s)?
  2. Database config(s), or even just vendor and version (are you both running MariaDB, for example, or is someone also running MySQL? What version?). Doctrine config (bin/console debug:config ...)?
  3. PHP Version?
  4. Server information (if Linux, something like neofetch --stdout would be great to see).
  5. Is the error ALWAYS "Too many connections ..." (fundamentally stemming from surpassing that corresponding configuration option inside of MariaDB and MySQL)?
  6. Active PHP Modules (php -m) on your system? php -i would be great, too. To boot, phpinfo() from the web too 🏆
  7. PHP Context: Does the error occur running via CLI only, via HTTP/Web only, or in both contexts?
  8. Long(er)-Running Process(es): Do you only ever experience the error after PHP runs "for a while"?
  9. Messenger Component: Are you running this component and do you experience this error within the messenger workflow? I am interested in exploring the angle of running bin/console messenger:consume "for a long time", both as to gather more anecdotal information as well as an easy path to reproducing the issue.
    9a. How are you running your worker/consumer process(es)? More specifically, what command options and/or arguments are you using (time limit and/or max memory and/or etc., ...)? If you have the exact command, that'd be epic! Are you leveraging a process manager (i.e., supervisord), or is execution occurring without that added layer -- for example, as a Docker Container (Service)? Please let us know! ... And if the former, what's your manager (i.e. supervisor) config? If the latter, may we kindly see your docker compose config construct for your worker definition(s)? Something as simple as docker compose config | yq would be grand!

Thanks, y'all! 🥇

P.S. @weaverryan big fan of SymfonyCasts bro, have you heard (whispers?) of this issue at all???

@GwendolenLynch
Contributor

For context, when I hit the problems in #45332, I was working on commands doing large, long-running PostgreSQL jobs. It was too long ago now, but I seem to remember hitting something similar when writing my tests: something to do with @runInSeparateProcess and not closing the connection in the test class's tearDown() method.
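
For reference, a minimal tearDown() sketch along those lines (assuming a KernelTestCase-based test and DoctrineBundle's default "doctrine.dbal.default_connection" service; the class name is made up) could look like this:

    <?php

    use Symfony\Bundle\FrameworkBundle\Test\KernelTestCase;

    final class SomeDoctrineAwareTest extends KernelTestCase
    {
        protected function tearDown(): void
        {
            // Close the DBAL connection so tests annotated with
            // @runInSeparateProcess (or simply many tests in a row) do not
            // pile up server-side sessions.
            if (static::$booted) {
                static::getContainer()
                    ->get('doctrine.dbal.default_connection')
                    ->close();
            }

            parent::tearDown(); // shuts the kernel down
        }
    }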

@kfrohwein

Took a look at our git history.

It's currently pinned to "symfony/console": "v5.4.10",

and the highest version we used was "symfony/console": "^v5.4.17".

As far as I remember, 5.4.11 should work, since that release only changed comments. From 5.4.12 on, our nightly import is about 20% slower, but it is unknown why. If I understand the changes correctly, you changed how the signals are registered, but I don't see any reason why that should slow anything down.
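
One cheap way to check whether stacked signal handlers are involved at all (a diagnostic sketch, assuming ext-pcntl is loaded on the host running the import) is to dump the installed handlers for the common termination signals from inside the long-running command:

    <?php
    // Diagnostic sketch only: report which handlers are currently installed.
    // SIG_DFL means "no custom handler"; a callable means something (for
    // example the console SignalRegistry) has registered one.

    foreach (['SIGINT' => SIGINT, 'SIGTERM' => SIGTERM, 'SIGUSR1' => SIGUSR1, 'SIGUSR2' => SIGUSR2] as $name => $signal) {
        $handler = pcntl_signal_get_handler($signal);

        if ($handler === SIG_DFL) {
            echo $name.": default\n";
        } elseif (is_callable($handler)) {
            echo $name.": custom callable registered\n";
        } else {
            echo $name.': '.var_export($handler, true)."\n";
        }
    }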

@nicolas-grekas nicolas-grekas reopened this Jul 3, 2023
@carsonbot

Hey, thanks for your report!
There has not been a lot of activity here for a while. Is this bug still relevant? Have you managed to find a workaround?

@carsonbot

Hello? This issue is about to be closed if nobody replies.

@carsonbot

Hey,

I didn't hear anything so I'm going to close it. Feel free to comment if this is still relevant, I can always reopen!
