Improve documentation on multiprocessing synchronization-between-processes #116526
To be honest this is a pretty bad example to begin with, because you won't notice any difference if you simply remove the lock, so it's pretty confusing when users try it out. Stdout is buffered in most environments and won't flush until it sees a newline. We should probably change this to something more obvious: multiple prints, maybe (building a triangle?). The exception mentioned in this issue can be reproduced with spawn, but might be hard to reproduce with fork, as fork is too fast.
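To illustrate the "multiple prints" idea mentioned above, here is a hedged sketch (the function name `print_block`, the process count, and the triangle shape are my own choices, not from the thread): each process prints a growing triangle of lines while holding the lock, so each triangle comes out intact; remove the `with lock:` and the lines from different processes interleave visibly. `flush=True` sidesteps the stdout-buffering issue.

```python
from multiprocessing import Process, Lock

def print_block(lock, i):
    # With the lock held, each process's three lines stay together;
    # without it, lines from different processes interleave.
    with lock:
        for n in range(1, 4):
            print('*' * n, f'from process {i}', flush=True)

if __name__ == '__main__':
    lock = Lock()
    procs = [Process(target=print_block, args=(lock, i)) for i in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```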
Hey @gaogaotiantian, thanks for your reply. I think your point makes sense; it confuses me when reading it as well. With or without the lock, the example doesn't seem to emphasize what difference the lock makes. Tbh, my original intent was just to complete the example to remove some exceptions; now I think maybe a new example is needed.
One quick example I could think of is as below: we increment a value 3 times in each process.

```python
import multiprocessing

# Function to be executed by each process
def increment_counter(counter, lock, process_id):
    lock.acquire()
    for _ in range(3):
        counter.value += 1
        print(f"Counter value: {counter.value} incremented by the {process_id} process")
    lock.release()

if __name__ == "__main__":
    # Shared counter value
    counter = multiprocessing.Value('i', 0)
    # Creating a Lock
    lock = multiprocessing.Lock()
    # Creating processes
    processes = []
    for j in range(4):  # Creating 4 processes
        p = multiprocessing.Process(target=increment_counter, args=(counter, lock, j))
        processes.append(p)
        p.start()
    # Waiting for all processes to finish
    for p in processes:
        p.join()
    # Output the final value of the counter
    print("Final counter value:", counter.value)
```

With the lock, the value is incremented consecutively by the same process: each process prints its three increments together, and the final counter value is 12.
If we remove the lock-related code:

```python
import multiprocessing

# Function to be executed by each process
def increment_counter(counter, process_id):
    for _ in range(3):
        counter.value += 1
        print(f"Counter value: {counter.value} incremented by the {process_id} process")

if __name__ == "__main__":
    # Shared counter value
    counter = multiprocessing.Value('i', 0)
    # Creating processes
    processes = []
    for j in range(4):  # Creating 4 processes
        p = multiprocessing.Process(target=increment_counter, args=(counter, j))
        processes.append(p)
        p.start()
    # Waiting for all processes to finish
    for p in processes:
        p.join()
    # Output the final value of the counter
    print("Final counter value:", counter.value)
```

it will randomly print duplicate counter values, and the final value may not even reach 12.
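As a side note, not something proposed in the thread: the duplicates appear because `counter.value += 1` is a read-modify-write that is not atomic, and a `multiprocessing.Value` already carries its own lock, accessible via `get_lock()`, so the increments can also be protected without creating a separate `Lock`. A minimal sketch:

```python
import multiprocessing

def increment_counter(counter, process_id):
    # get_lock() returns the lock that guards this shared Value;
    # holding it makes each read-modify-write of counter.value safe.
    for _ in range(3):
        with counter.get_lock():
            counter.value += 1

if __name__ == "__main__":
    counter = multiprocessing.Value('i', 0)
    processes = [multiprocessing.Process(target=increment_counter, args=(counter, j))
                 for j in range(4)]
    for p in processes:
        p.start()
    for p in processes:
        p.join()
    # With every increment done under the lock, no updates are lost.
    print("Final counter value:", counter.value)
```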
wdyt?
Documentation
https://docs.python.org/3/library/multiprocessing.html#synchronization-between-processes
When running the provided example locally, unlike the other examples on this page, which print valid stdout, the code in this section only prints the error below, which is caused by not calling Process.join() at the end.
An example patch could be
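A sketch of what such a patch might look like, assuming the documentation example at that anchor is the "hello world" lock demo (the `procs` list name is my own): the processes are collected into a list and joined, so the parent stays alive until every child has finished.

```python
from multiprocessing import Process, Lock

def f(l, i):
    l.acquire()
    try:
        print('hello world', i)
    finally:
        l.release()

if __name__ == '__main__':
    lock = Lock()
    procs = []
    for num in range(10):
        p = Process(target=f, args=(lock, num))
        procs.append(p)
        p.start()
    # Joining keeps the parent alive until every child has finished,
    # avoiding the errors seen when the parent exits first.
    for p in procs:
        p.join()
```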
Linked PRs