Up to Python 3.13 (the current release) there is no way in Python to pause a given thread, unless one's own code has "slots" that check for a condition and perform the pause themselves.
Within the same process, the most natural way to do that is asyncio programming - every await is a "slot" where one can insert arbitrary code, and a synchronous sleep there would pause the entire event loop. But again, the running code has to be an async function, with at least one await expression in its loop - not unlike having to check a Queue or Event object from time to time.
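Just as an illustration - a minimal sketch of that asyncio pattern, where an asyncio.Event works as the pause/resume switch and each await on it is the "slot" mentioned above (the names pausable_counter and gate are made up for this example):

import asyncio

async def pausable_counter(n, gate):
    for i in range(n):
        # this await is the cooperative "slot": while the event is set
        # it returns at once; once it is cleared, the coroutine parks here
        await gate.wait()
        await asyncio.sleep(1)
        print(i)

async def main():
    gate = asyncio.Event()
    gate.set()                                    # start unpaused
    task = asyncio.create_task(pausable_counter(10, gate))
    await asyncio.sleep(3)
    gate.clear()                                  # pause at the next check
    await asyncio.sleep(3)
    gate.set()                                    # resume
    await task

asyncio.run(main())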
Subprocess approach
@Suramuthu's approach of running the target code in a subprocess can work - but not in the way (currently) described in that answer: the subprocess can have a controller thread picking up signals (through a multiprocessing.Queue or otherwise) and starting the user code, but it still can't pause the running code with time.sleep: one would be in exactly the same situation as when running the code in another thread of the first process.
However, it is possible to send OS-level signals to the subprocess - and those, unlike anything available for threads, can be driven by externally running code.
The signal.SIGSTOP and signal.SIGCONT signals can be used to pause and resume the whole process, and they are easy to use. Note that these are not available under Windows, though - there, I believe one could register a handler for an ordinary signal (SIGTERM, SIGBREAK or another) to "fake" a pause until a second signal is sent. On Linux and Mac, SIGSTOP and SIGCONT make things a lot easier, as the OS itself handles them.
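To illustrate just the signal part in isolation (before bringing in the executor), a minimal POSIX-only sketch with a bare multiprocessing.Process - the counter function is similar to the toy one used in the REPL session below:

import os, signal, time
from multiprocessing import Process

def counter(n):
    for i in range(n):
        time.sleep(1)
        print(i)

if __name__ == "__main__":
    proc = Process(target=counter, args=(10,))
    proc.start()
    time.sleep(3)
    os.kill(proc.pid, signal.SIGSTOP)   # the whole child process freezes here
    time.sleep(3)
    os.kill(proc.pid, signal.SIGCONT)   # ...and resumes exactly where it was
    proc.join()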
Even if you want to run just one process/thread at a time, it may be worth using concurrent.futures.ProcessPoolExecutor, as it handles the communication (queues), return values and subprocess lifetime for you, including all the edge cases, with minimal fuss.
(/me goes to the REPL for a PoC)
Ok - it worked as planned, and I got a beautiful glitch-related ASCII art worth posting as part of the research:
In [1]: import time, os, signal; from concurrent.futures import ProcessPoolExecutor
In [4]: def counter(n):
   ...:     for i in range(n):
   ...:         time.sleep(1)
   ...:         print(i)
   ...:     return i
   ...:
# ... setup executor with 1 worker:
In [5]: t = executor.submit(counter, 15)
# retrieve its worker PID:
In [15]: pid, proc = next(iter(executor._processes.items()))
# submit a call to counter up to 15
In [17]: t = executor.submit(counter, 15)
# type the pause sequence while counter is running - the fun part:
# ( I type os.kill(os.getpid(), signal.SIGSTOP) - but this is
# my terminal :-D :
In [18]: o
s1
.k2
ill3
(4
pi5
d, 6
signal7
.SI8
GST9
(P )10
In [18]: os.kill(pid, signal.SIGSTOP)
# counter function is now paused - I have the terminal for me!
In [19]: os.kill(pid, signal.SIGCONT)
11
In [20]: 12
13
14
In [20]: t.result()
Out[20]: 14
In [21]:
Now, going back to your snippet and applying this:
import os, signal, time
from concurrent.futures import ProcessPoolExecutor
from threading import Timer

MSEC = "MSEC"

def _warmerfunc():
    time.sleep(0)

class Thread:
    def __init__(self, callback, *args):
        self.timer = None
        self.executor = ProcessPoolExecutor(1)
        # start the executor with a "nop" call so the worker
        # process is created:
        self.executor.submit(_warmerfunc)
        # retrieve the dict key for the only worker - which is its PID
        # (although this is a private implementation detail
        # of ProcessPoolExecutor, it is not likely to change anytime
        # soon. Might not work on pypy, though)
        self.pid = next(iter(self.executor._processes))
        self.future = self.executor.submit(callback, *args)

    def stop(self):
        os.kill(self.pid, signal.SIGSTOP)

    def resume(self):
        self.timer = None
        os.kill(self.pid, signal.SIGCONT)

    def sleep_for(self, duration, units=MSEC):
        # normalize duration to seconds (threading.Timer takes seconds);
        # only the MSEC case is handled here - add other units as needed
        if units == MSEC:
            duration = duration / 1000
        self.stop()
        self.timer = Timer(duration, self.resume)
        self.timer.start()

    def done(self):
        return self.future.done()

    def exception(self):
        return self.future.exception()

    def result(self):
        return self.future.result()

    def __del__(self):
        self.executor.shutdown()
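And a hedged usage sketch for the wrapper above - slow_work and the timings are invented for the example, and SIGSTOP/SIGCONT still restrict this to Linux/Mac:

def slow_work(n):
    for i in range(n):
        time.sleep(1)
        print("tick", i)
    return n

if __name__ == "__main__":
    t = Thread(slow_work, 10)
    time.sleep(3)
    t.sleep_for(2000)       # pause the worker for ~2 seconds, then auto-resume
    print(t.result())       # blocks until the callback finishes; prints 10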
Possible same-process approach from Python 3.14 on:
With Python 3.14 (in beta, to be released in October 2025) it may become possible to pause a thread in the same process, using the PEP 768 mechanism to insert a function call into another thread of a running program. It will still involve a helper subprocess to trigger the injection, but the injected code runs in the same process as the code being paused - therefore some limits of the subprocess approach (such as non-serializable objects) can be overcome.
The main limitation is that although it becomes possible to attach an arbitrary call to a running thread with no stop or check points, it only works with the main thread - which means that you have to run your code, including the caller code, in a secondary thread, and run the "pausable" code in the main thread - that is no trivial exercise.
Maybe it is possible to use, from ctypes, the same hooks that sys.remote_exec uses in the Python interpreter and have this working for other threads - but it would be complex.
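A rough, untested sketch of what that 3.14 route could look like, assuming sys.remote_exec keeps the (pid, script_path) interface described in PEP 768 - the file names and PID plumbing here are purely illustrative:

# pause_script.py - executed by the *target* process, in its main thread,
# at the next safe checkpoint of the eval loop; a plain blocking sleep
# "pauses" whatever the main thread was doing for that long.
import time
time.sleep(10)

# controller.py - run as a separate process, Python 3.14+
import sys

target_pid = int(sys.argv[1])                    # PID of the process to pause
sys.remote_exec(target_pid, "pause_script.py")   # inject the pause script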