
First of all, let me explain the structure of my scripts:

Script1 calls a function in Script2, and that function in turn launches 12-15 functions from different scripts via multiprocessing. Some of those functions contain infinite loops, and some use threading; the threaded functions themselves call further functions with infinite loops in them.

The multiprocessing code looks like this:

# Script1.py
def func1():
    # some infinite functionality
    ...

# Script2.py
def func2():
    # some infinite functionality
    ...

# Script3.py
def func3():
    # spawns threads
    ...
# ..
# ..
# and so on...

# main script
from multiprocessing import Process
from Script1 import func1
from Script2 import func2
from Script3 import func3
# ... similar imports for func4, func5, func6, etc.

process = []
process.append(Process(target=func1))
process.append(Process(target=func2))
process.append(Process(target=func3))
process.append(Process(target=func4))
process.append(Process(target=func5))
process.append(Process(target=func6))
# ..
# ..
# and so on.

for p in process:
    p.start()
    print("process started:", p)

for p in process:
    p.join()
    print("process joined:", p)
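A minimal runnable sketch of the same pattern, with a trivial stand-in loop in place of my real functions (the `worker` function and the timeouts are illustrative, not my actual code):

```python
import time
from multiprocessing import Process

def worker(n):
    # stand-in for func1/func2/...: loops forever like the real functions
    while True:
        time.sleep(n)

if __name__ == "__main__":
    processes = [Process(target=worker, args=(0.05,)) for _ in range(3)]
    for p in processes:
        p.start()
        print("process started:", p.pid)

    # join(timeout) returns after the timeout even if the child is still
    # running, so this loop always completes; a bare join() would block
    # forever on an infinite-loop child
    for p in processes:
        p.join(timeout=1)
        print("alive after join timeout:", p.is_alive())

    for p in processes:
        p.terminate()
        p.join()  # reaping the terminated child prevents a zombie
```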

Now the issues I faced are:

  1. Only the first "process joined" message is printed, although all processes start successfully (i.e. join() never returns for the remaining processes).
  2. Some processes become zombies when run under multiprocessing, yet the same functions work perfectly well when run standalone (so I know the functions themselves are fine and the problem is that join() cannot work properly for some reason).
  3. In my situation, should I use multiprocessing.Pool, or will multiprocessing.Process work fine for me? Any suggestions?
  4. I would like to know if there are any alternatives to this besides multithreading; multiprocessing.Pool, maybe?
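Regarding point 3, my understanding is that a Pool only fits when the worker functions eventually return a result, which my infinite-loop functions do not. A hypothetical sketch with a bounded stand-in worker (names are illustrative, not from my scripts):

```python
from multiprocessing import Pool

def bounded_worker(n):
    # unlike my real functions, this returns, which is what Pool expects
    total = 0
    for i in range(n):
        total += i
    return total

if __name__ == "__main__":
    # Pool distributes the inputs across worker processes and collects
    # the return values in input order
    with Pool(processes=4) as pool:
        results = pool.map(bounded_worker, [10, 100, 1000])
    print(results)  # [45, 4950, 499500]
```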

I also tried something different, but it did not work:

multiprocessing.set_start_method('spawn')

Because, to my knowledge, 'spawn' runs every process in a fresh interpreter, much like launching each script externally through Popen.
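For reference, set_start_method has to be called at most once, under the `__main__` guard, before any process is created (a minimal sketch; the guard is required because 'spawn' re-imports the main module in each child):

```python
import multiprocessing
from multiprocessing import Process

def child():
    print("child running")

if __name__ == "__main__":
    # must run before the first Process is created, and only once
    multiprocessing.set_start_method('spawn')
    p = Process(target=child)
    p.start()
    p.join()
    print("exit code:", p.exitcode)
```

An alternative that avoids the once-per-program restriction is multiprocessing.get_context('spawn'), which returns a context object whose Process class uses that start method without changing the global default.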

Note:
I am using Ubuntu 16.04. In my scenario I can use multiprocessing.Process, multiprocessing.Pool, or anything else that works like multiprocessing, but I cannot use multithreading here.

  • Actually, I did not get any response on Stack Overflow; that's why I am posting it again here. Commented Jan 5, 2020 at 18:03
  • Stack Overflow has the more knowledgeable Python users. You've most likely received no response because you haven't given enough information. Try to give an SSCCE that shows your problem. My suspicion is that the issue is in one (or more) of your scripts. Without seeing them, or being able to run one, I don't think it will be possible to help much. So try to cut down to the minimum code that shows your problem. Commented Jan 5, 2020 at 18:24
  • A "zombie" process is a process whose parent process has not collected its exit status. See man wait, or see if Python can harvest the exit status. Commented Jan 5, 2020 at 21:17
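Following up on the last comment: in Python the parent collects a child's exit status with join() (or implicitly via multiprocessing.active_children()), which is what reaps a zombie. A minimal sketch (the short-lived task is a stand-in):

```python
import time
from multiprocessing import Process

def short_task():
    # stand-in child that exits quickly and cleanly
    time.sleep(0.1)

if __name__ == "__main__":
    p = Process(target=short_task)
    p.start()
    time.sleep(0.5)
    # by now the child has exited, but it remains a zombie until the
    # parent collects its status; join() harvests it
    p.join()
    print("exit code:", p.exitcode)  # 0 on clean exit
```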
