[Process] add a process manager allowing to run commands in parallel (queueing up if needed) #8454
Comments
Note (more details on the rationale): to achieve parallelism some developers rely on a master script using curl to execute worker scripts via HTTP calls to localhost. This is IMHO bad practice, as it forces the php.ini used for web requests to allow exceedingly high memory/timeout limits. It also keeps web worker processes occupied for a very long time.
👍
👍
👍
Well, time to get back from the grave I guess, 3 plusses mean that I have to turn this into a full-fledged Pull Request...
@gggeek maybe you should wait a bit more. Votes from the community are important, but the decisive votes are only the ones from the Symfony Core. So far none of them have voted on this.
@javiereguiluz ok then
I think that's a good idea. Having such a manager would let you get rid of a lot of boilerplate code you would otherwise have to write in your own application.
You mean this: https://github.com/kriswallsmith/spork?
@mvrhov this class is not about managing Process instances
maybe this https://github.com/liuggio/fastest/tree/master/src/Process will help you?
About spork: it seems that it can indeed be used to achieve the same goal, or almost.
spork uses the fork model (which means that only PHP processes can be parallelized), while symfony/process can achieve pretty much the same goals with just proc_open. I am currently working on a PoC implementation of a process manager for symfony/process with some IPC support via pipes (to add concurrency to Behat).
It would be great to have multiple strategies to achieve the goal, like using pthreads, fork, pipes, msg, whatever is available in the environment. And if none of those are available, fall back to running things in sequence, using the same interface.
@keradus the thing is that pthreads, fork and proc_open serve different goals. For example, pthreads + coroutines can be used to achieve micro-thread functionality, fork can be used to speed up worker startup, and proc_open covers everything else. Currently there is no way to use symfony/process with long-lived processes. See #14482 for details. After a process is started, the only way to pass messages to it is via selectable streams. I think what should be done is a simple process manager that allows starting multiple processes via proc_open. For example, I have use cases along these lines.
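A minimal sketch of the message-passing idea described above (not the commenter's actual PoC), assuming a recent symfony/process where the InputStream helper is available and a hypothetical worker.php that reads job lines from stdin:

```php
<?php

require __DIR__.'/vendor/autoload.php';

use Symfony\Component\Process\InputStream;
use Symfony\Component\Process\Process;

// Hypothetical long-lived worker that loops over job lines read from stdin.
$input = new InputStream();

$worker = new Process(['php', 'worker.php']);
$worker->setInput($input);
$worker->start();

// Pass messages to the already-started process through its stdin pipe.
$input->write("resize image1.jpg\n");
$input->write("resize image2.jpg\n");
$input->close(); // end of input, lets the worker exit its read loop

$worker->wait();
echo $worker->getOutput();
```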
It is tricky, but a workers+jobs strategy can be done with threads and with processes too.
I think this doesn't bring many benefits.
See #18513, finally
[Process] Turn getIterator() args to flags & add ITER_SKIP_OUT/ERR modes (nicolas-grekas)

This PR was merged into the 3.1-dev branch.

Discussion
----------
[Process] Turn getIterator() args to flags & add ITER_SKIP_OUT/ERR modes

| Q             | A
| ------------- | ---
| Branch?       | master
| Bug fix?      | no
| New feature?  | yes
| BC breaks?    | no
| Deprecations? | no
| Tests pass?   | yes
| Fixed tickets | #8454
| License       | MIT
| Doc PR        | -

Targeted at 3.1

Commits
-------
428f12e [Process] Turn getIterator() args to flags & add ITER_SKIP_OUT/ERR modes
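For reference, a brief usage sketch of what that PR added, based on the documented symfony/process iterator flags (assuming a recent version of the component):

```php
<?php

require __DIR__.'/vendor/autoload.php';

use Symfony\Component\Process\Process;

$process = new Process(['ls', '-lsa']);
$process->start();

// Stream stdout as it arrives, skipping stderr chunks entirely.
foreach ($process->getIterator(Process::ITER_SKIP_ERR | Process::ITER_KEEP_OUTPUT) as $data) {
    echo $data;
}
```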
Since this was scheduled for 3.1…
@martinsik tbh I am not sure. I looked at #18513, and all I can see is changes in the way process IO is handled, which allows setting up chained processes. OTOH, I have found this extremely simple process manager that seems to fit the bill for my original use case: https://github.com/jagandecapri/symfony-parallel-process
@gggeek Do you want to provide a PR for this?
To anyone wondering about this request, don't miss the comments in #23596
Thank you for this suggestion.
I'm going to close this issue as it has had no activity for 3 years now.
A common pattern when writing batch scripts that import a huge amount of existing data into an app is to write the script so that it can execute its work in parallel. Running the script with e.g. 8 instances in parallel allows the import task to finish in a fraction of the time.
Depending on the platform in use, the developer might use forking, threading, pcntl and a myriad of other techniques to achieve parallelism. There's no rocket science in that code, but it is still quite a chore and bug-prone.
I suggest that the Sf Process component be extended with a process manager class, which can be used to execute multiple processes in parallel.
Sample code - not based on Sf currently - is available at https://gist.github.com/gggeek/5956177 for discussion.
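To make the idea concrete, here is a minimal, hypothetical sketch of such a manager built on symfony/process (the function name, parameters and queueing policy are illustrative only, not the gist's code or a proposed API):

```php
<?php

require __DIR__.'/vendor/autoload.php';

use Symfony\Component\Process\Process;

/**
 * Runs the given commands with at most $maxParallel running at once,
 * starting queued commands as running ones finish.
 */
function runParallel(array $commands, int $maxParallel = 8): void
{
    $queue = [];
    foreach ($commands as $command) {
        $queue[] = new Process($command);
    }

    $running = [];
    while ($queue || $running) {
        // Top up the pool of running processes from the queue.
        while ($queue && count($running) < $maxParallel) {
            $process = array_shift($queue);
            $process->start();
            $running[] = $process;
        }

        // Reap finished processes and surface their output.
        foreach ($running as $i => $process) {
            if (!$process->isRunning()) {
                echo $process->getOutput();
                unset($running[$i]);
            }
        }

        usleep(100000); // avoid busy-waiting
    }
}

// E.g. split a big import into chunks handled by parallel workers.
runParallel([
    ['php', 'import.php', '--chunk=1'],
    ['php', 'import.php', '--chunk=2'],
    ['php', 'import.php', '--chunk=3'],
], 2);
```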