1 vote · 1 answer · 59 views
I have a task where I have to fetch a lot of files from a database (I/O-bound) and process each of them (CPU-bound). I thought of using a producer-consumer pattern, where fetch workers (producers) ...
asked by squinterodlr

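A minimal sketch of the split this first question describes: threads for the I/O-bound fetch, processes for the CPU-bound work. `fetch()`, `process()`, and the pool sizes are hypothetical stand-ins for the asker's database fetch and per-file processing, not their actual code.

```python
# Sketch: I/O-bound producers run in threads, CPU-bound consumers in processes.
from concurrent.futures import ProcessPoolExecutor, ThreadPoolExecutor

def fetch(file_id):
    # I/O-bound producer: pretend to pull one file from the database
    return f"contents-of-{file_id}"

def process(payload):
    # CPU-bound consumer: pretend to do heavy work on the payload
    return payload.upper()

def run_pipeline(file_ids, n_fetchers=4, n_workers=2):
    with ThreadPoolExecutor(n_fetchers) as tpool, \
         ProcessPoolExecutor(n_workers) as ppool:
        payloads = tpool.map(fetch, file_ids)       # producers (threads)
        return list(ppool.map(process, payloads))   # consumers (processes)

if __name__ == "__main__":
    print(run_pipeline(range(3)))
```

Because `process` must be pickled to the worker processes, it has to live at module top level; the `__main__` guard keeps the pools from being re-created when the module is re-imported under the spawn start method.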
3 votes · 1 answer · 82 views
I am building an autonomous AI Agent (managing training workflows) that automatically generates PyTorch/OpenMMLab training scripts and executes them in a background subprocess. One of the common ...
asked by user32496676

0 votes · 1 answer · 56 views
We recently upgraded from Python 3.9 to 3.14, and it appears that the default start method for multiprocessing has switched from 'fork' to 'spawn', which is causing some tests to fail, trying to ...
asked by rpmcnally • 269

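For upgrades like the one above, the usual fix is to stop relying on the platform default and pin the start method explicitly. A sketch, with `square()` as a made-up worker function:

```python
# Sketch: pin the multiprocessing start method per pool via a context instead
# of relying on the interpreter's default, which changed in Python 3.14.
import multiprocessing as mp

def square(x):
    return x * x

def run_with(method, values):
    ctx = mp.get_context(method)        # "fork"/"forkserver" are POSIX only
    with ctx.Pool(2) as pool:
        return pool.map(square, values)

if __name__ == "__main__":
    print(run_with("spawn", [1, 2, 3]))   # [1, 4, 9]
    # Process-wide alternative (call once, early in the entry point):
    # mp.set_start_method("fork")
```

`get_context()` is preferable in test suites because it does not mutate global state, so individual tests can opt back into 'fork' without affecting each other.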
0 votes · 0 answers · 28 views
I'm writing a program that requires transferring GPU tensors across processes in a pipeline manner. I knew that using torch.multiprocessing would automatically get the CUDA IPC memory handle for me and send ...
asked by LongTran

Advice · 0 votes · 4 replies · 80 views
I've been going a bit down a rabbit hole that started when I tried to get a real-world performance comparison of 2 SSD drives, under Linux. I discovered KDiskMark, which tries very hard to emulate ...
asked by RJVB • 860

Best practices · 2 votes · 7 replies · 154 views
This is related to this question. Basically, I need to add many fractions together. Seems simple enough? Except it isn't simple. I can't use the fractions.Fraction class; that thing stupidly does fraction ...
asked by Ξένη Γήινος

0 votes · 0 answers · 41 views
I have a Dataset that is based on IterableDataset, looking like this: class MyDataSet(torch.utils.data.IterableDataset): def __init__(self): # doing init stuff here def __iter__(self): ...
asked by RaJa • 1,597

0 votes · 1 answer · 129 views
I have a multi-process application (one orchestrator and multiple workers) where all processes write to the same log file using NLog 6. The log file is configured to be archived: daily (ArchiveEvery=...
asked by Christopher Fontaine

14 votes · 3 answers · 717 views
Consider the following executable Python script mtmp.py: import numpy as np import os from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor # `x` is referenced `n_loop` times. def ...
asked by user2961927 • 1,962

2 votes · 0 answers · 92 views
I am trying to learn some operating system engineering, so I came across MIT's operating system engineering course, which has been wonderful so far. The problem I'm having is with the second problem read-...
asked by 아이스크림은메로나

1 vote · 1 answer · 79 views
I am hoping to use Celery to manage the task queue for my application, and am wondering if it is capable of managing tasks that themselves use multiprocessing, called from an external library. For ...
asked by kehunter

2 votes · 1 answer · 114 views
I'm experimenting with mixing asyncio and multiprocessing in Python 3.12. On Linux, the following code works as expected — the event loop stays responsive and the child process prints normally. On ...
asked by Kritika Arora

1 vote · 0 answers · 98 views
I'm trying to run calculations using multiple cores in Python on multiple platforms (Linux, macOS, Windows). I need to pass a large CustomClass Object and a dict (both readonly) to all workers. So far ...
asked by polyte • 459

2 votes · 2 answers · 118 views
Overview: I am trying to use a Pool internally in a module that is not __main__ and make it invisible to main that this pool exists. Because of this, if __name__ == "__main__": protection is ...
asked by blizzdex

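One common way to hide a Pool inside a library module, as the question above wants, is to force the 'fork' start method for that pool only. A sketch under that assumption; `_work` and `parallel_double` are invented names:

```python
# Sketch: a Pool hidden inside a library function. With the "fork" start
# method (POSIX only), workers inherit the parent's state by forking, so the
# *calling* script does not need an if __name__ == "__main__" guard.
import multiprocessing as mp

def _work(x):
    return x * 2

def parallel_double(items):
    ctx = mp.get_context("fork")  # "spawn"/"forkserver" would re-import the caller
    with ctx.Pool(2) as pool:
        return pool.map(_work, items)

if __name__ == "__main__":
    print(parallel_double([1, 2, 3]))
```

On Windows only 'spawn' exists, so there a guard in the entry-point script remains unavoidable.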
4 votes · 1 answer · 170 views
I am running into a FAILED_PRECONDITION: DNN library initialization failed error when trying to parallelize a JAX function using either Python's multiprocessing library or joblib. The strange part is ...
asked by PowerPoint Trenton

0 votes · 1 answer · 193 views
I have a file enwiktionary_namespace_0.tar.gz that contains 86 .ndjson files enwiktionary_namespace_0_0.ndjson enwiktionary_namespace_0_1.ndjson enwiktionary_namespace_0_2.ndjson ... ...
asked by Akira • 2,858

1 vote · 0 answers · 92 views
I'm experiencing an issue where the parent Python process terminates unexpectedly when debugging, but only when a child process is sent SIGTERM. The same code works perfectly when run normally (...
asked by Skyman2413

2 votes · 1 answer · 133 views
I tried to run bw2io.import_ecoinvent_release() to import ecoinvent as I used to, but the function is now stuck in a loop and returns error messages indefinitely (see below). What I run: import bw2data ...
asked by Victor Maneval

0 votes · 1 answer · 100 views
I am trying to set up a multiprocessing Python task on Windows 10, Python 3.13. I have a "main.py" module, containing the main entry, and an "orchestration.py" module with worker ...
asked by PChemGuy • 1,895

0 votes · 1 answer · 43 views
In my scenario I use multiple DataLoaders with multiple Datasets to evaluate models against each other (I want to test models with multiple resolutions, which means each dataset has a distinct ...
asked by Yuval • 3,608

0 votes · 1 answer · 349 views
I have a simple app with a main controller process, and a child process that handles API calls. They communicate using Python queues. The app looks (something) like this: import multiprocessing as mp ...
asked by BHK • 29

0 votes · 2 answers · 119 views
I scan for a trigger and when I get it, load a .npy file and process it. It started to take almost 2 seconds to load the NumPy file from within the process but when I tried to load the same file from ...
asked by user2037777

3 votes · 1 answer · 135 views
I am trying to use a custom multiprocessing manager, mostly following the example from the docs. The main difference is that my class updates internal state. It looks like this: class IdIndex: def ...
asked by Achim • 15.7k

1 vote · 1 answer · 93 views
I have a class with methods to simulate sources across 16 detectors using the Gelsa package. In my main script, I call the method generate.sources. I am trying to use multiprocessing to speed up the ...
asked by Nicolò Fiaba

4 votes · 2 answers · 306 views
I've been trying to parallelize some code that I wrote in Python. The actual work is embarrassingly parallel, but I don't have much experience with multiprocessing in Python. The actual code I'm ...
asked by user2506833

0 votes · 0 answers · 122 views
I ran into a problem when trying to plot figures with matplotlib in separate processes using multiprocessing. I tried it on OpenBSD 7.7 with Python 3.12.11 and on Debian 12 with Python 3.11.2, and ...
asked by Tandoori

0 votes · 1 answer · 75 views
I have a Python script that pings another device every so often. PING_SIZE = "1" PING_RETRYS = "1" while True: try: result = subprocess.run( ["ping", ...
asked by Thewafflication

2 votes · 2 answers · 226 views
I'm working on an asynchronous server using asyncio, but I need to monitor a multiprocessing.Event (used to signal termination from another process) inside my async event loop. Here's the simplified ...
asked by UnemployedBrat

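A common pattern for the question above is to push the blocking `Event.wait()` into a thread via `run_in_executor`, so the event loop stays responsive. A minimal sketch; the function names are invented, and the setter is simulated with a timer where real code would call `stop.set()` from another process:

```python
# Sketch: await a multiprocessing.Event without blocking the asyncio loop.
import asyncio
import multiprocessing as mp

async def watch_shutdown(stop_event):
    # Event.wait() blocks, so run it in a worker thread, not on the loop.
    loop = asyncio.get_running_loop()
    await loop.run_in_executor(None, stop_event.wait)
    return "shutting down"

async def main():
    stop = mp.Event()
    # In real code another *process* calls stop.set(); simulate with a timer.
    asyncio.get_running_loop().call_later(0.1, stop.set)
    return await watch_shutdown(stop)

if __name__ == "__main__":
    print(asyncio.run(main()))   # "shutting down" after ~0.1 s
```

The thread is only parked on the event, so the cost is one idle executor thread per watched event.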
3 votes · 0 answers · 144 views
I'm working on several projects that are targeting bare-metal multi-core microprocessors, with hardware provisions for shared memory. In other words, the device has several separate CPU cores (of ...
asked by Fake Name • 5,953

3 votes · 1 answer · 95 views
Suppose I create a shared memory object: from multiprocessing import shared_memory shm_a = shared_memory.SharedMemory(create=True, size=1024) buffer = shm_a.buf and put a generic object of a generic ...
asked by Geremia • 5,906

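SharedMemory holds raw bytes, so putting a generic object into the buffer, as the question above asks, means serializing it. A sketch using pickle with an 8-byte length prefix; `put`/`get` are invented helpers, and `shm_a` matches the snippet in the question:

```python
# Sketch: serialize an arbitrary object into a SharedMemory buffer with
# pickle; a length prefix tells the reader how many bytes to unpickle.
import pickle
import struct
from multiprocessing import shared_memory

def put(shm, obj):
    data = pickle.dumps(obj)
    shm.buf[:8] = struct.pack("<Q", len(data))       # 8-byte length prefix
    shm.buf[8:8 + len(data)] = data

def get(shm):
    (length,) = struct.unpack("<Q", bytes(shm.buf[:8]))
    return pickle.loads(bytes(shm.buf[8:8 + length]))

if __name__ == "__main__":
    shm_a = shared_memory.SharedMemory(create=True, size=1024)
    try:
        put(shm_a, {"answer": 42, "items": [1, 2, 3]})
        print(get(shm_a))        # {'answer': 42, 'items': [1, 2, 3]}
    finally:
        shm_a.close()
        shm_a.unlink()
```

Note that the reader gets a deserialized *copy*, not the shared object itself; true in-place sharing only works for flat buffer types (e.g. NumPy arrays over `shm.buf`).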
3 votes · 4 answers · 185 views
I want to distribute a Python application across several cores, and from the documentation, I understand that a pool is the way to do this. My problem can be reproduced by the following code: #!/usr/...
asked by tommiport5

1 vote · 1 answer · 81 views
For some reason, the connection timeout does not seem to work when connecting to a Firebird database. For example, if an incorrect IP address is specified, the script hangs and waits for a long time. #!...
asked by Сер За

2 votes · 2 answers · 185 views
I have a large array and an object I'd like to call multiple times with multiprocessing. Neither the data nor the object internals get modified. This works: import numpy as np from multiprocessing ...
asked by I.P. Freeley

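For read-only data like the array above, one option is to back it with SharedMemory so workers attach to the same pages instead of receiving pickled copies. A sketch; the shapes, the row-sum "work", and the `worker` signature are all illustrative assumptions:

```python
# Sketch: expose one large read-only NumPy array to Pool workers through
# shared_memory instead of pickling a copy to each worker.
import numpy as np
from multiprocessing import Pool, shared_memory

def worker(args):
    shm_name, shape, dtype, row = args
    shm = shared_memory.SharedMemory(name=shm_name)   # attach, no copy
    arr = np.ndarray(shape, dtype=dtype, buffer=shm.buf)
    total = float(arr[row].sum())
    del arr        # drop the view before closing, or close() raises BufferError
    shm.close()
    return total

if __name__ == "__main__":
    big = np.arange(12, dtype=np.float64).reshape(3, 4)
    shm = shared_memory.SharedMemory(create=True, size=big.nbytes)
    view = np.ndarray(big.shape, dtype=big.dtype, buffer=shm.buf)
    view[:] = big          # copy the data in once
    del view
    with Pool(2) as pool:
        args = [(shm.name, big.shape, "float64", i) for i in range(3)]
        print(pool.map(worker, args))   # [6.0, 22.0, 38.0]
    shm.close()
    shm.unlink()
```

Only the segment name, shape, and dtype cross the process boundary, so the per-task pickling cost is constant regardless of array size.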
2 votes · 1 answer · 235 views
On Windows, running the below code with Python or pytest makes it print out Windows fatal exception: access violation (but the script will continue with no issue). Reproduction: import multiprocessing ...
asked by kevinlinxc

0 votes · 1 answer · 90 views
In the project I am currently working on, I am parsing a JSON file and, according to a client list, I need to start processes in parallel. However, there might be a case where, inside a client list, I ...
asked by Berat Yilmaz

0 votes · 0 answers · 112 views
I have a FastAPI application where one of the API endpoints needs to parse PDF files using unstructured.partition_pdf. Since the parsing is CPU-heavy, I want to run it in the background so that the ...
asked by Inderjeet Singh

1 vote · 0 answers · 141 views
I'm having trouble getting this example working. In serial this function works just fine, but when I attempt to run it in a multiprocessing.Pool it locks up and will not return a simple random integer. ...
asked by RobSmith

0 votes · 0 answers · 58 views
I am very new to Python. My current project is to create a basic Tkinter window to start and stop a tcpdump Ethernet recording using a couple of buttons. I have been able to start a recording using a ...
asked by SteveL • 1

0 votes · 1 answer · 85 views
I am trying to measure the processing time, or CPU time, of a CPU-intensive computation that has been parallelized with multiprocessing. However, simply bookending the parallelization of the ...
asked by SapereAude

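The catch described above is that `time.process_time()` only counts the current process, so the workers' CPU time vanishes from the measurement. On Unix, `os.times()` also reports the CPU time of children that have been waited for. A sketch with a made-up `burn()` workload:

```python
# Sketch: separate wall-clock time from CPU time spent in worker processes.
import os
import time
from multiprocessing import get_context

def burn(n):
    s = 0
    for i in range(n):
        s += i * i
    return s

if __name__ == "__main__":
    wall0 = time.perf_counter()
    t0 = os.times()
    with get_context("fork").Pool(2) as pool:   # "fork": POSIX only
        pool.map(burn, [200_000] * 4)
    t1 = os.times()            # children are reaped once the pool exits
    wall = time.perf_counter() - wall0
    child_cpu = (t1.children_user - t0.children_user
                 + t1.children_system - t0.children_system)
    print(f"wall={wall:.3f}s child_cpu={child_cpu:.3f}s")
```

The children's CPU time only shows up after the pool has been closed and joined, which the `with` block takes care of.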
2 votes · 0 answers · 72 views
The problem: I am running machine learning models in parallel using multiprocessing. When using models with parameters stating the number of threads used (num_threads, num_jobs, etc.), the code works ...
asked by Ottpocket • 353

1 vote · 0 answers · 151 views
I wasn't able to find anything regarding this on the internet: I am using multiprocessing (concurrent.futures.ProcessPoolExecutor(max_workers=(...)) as executor) to execute several DRL training ...
asked by Sum • 11

1 vote · 2 answers · 91 views
I can successfully create NumPy arrays with simple dtypes (int32 etc). But when I try to use something like: b_shared_memory = shared_memory.SharedMemory(create=True, name="test235", size =...
asked by Janso • 19

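Structured (record) dtypes overlay on SharedMemory the same way simple ones do, as long as the segment is sized from the dtype's `itemsize`. A sketch; the field names, sizes, and the `demo` wrapper are invented for illustration (the question's `name="test235"` is omitted to avoid name collisions):

```python
# Sketch: a NumPy structured dtype backed by a SharedMemory segment.
import numpy as np
from multiprocessing import shared_memory

def demo(n=4):
    rec = np.dtype([("id", np.int32), ("score", np.float64)])
    shm = shared_memory.SharedMemory(create=True, size=n * rec.itemsize)
    arr = np.ndarray((n,), dtype=rec, buffer=shm.buf)
    arr["id"] = np.arange(n)
    arr["score"] = np.arange(n) + 0.5
    out = (int(arr["id"][2]), float(arr["score"][2]))
    del arr          # release the view before closing the segment
    shm.close()
    shm.unlink()
    return out

if __name__ == "__main__":
    print(demo())    # (2, 2.5)
```

Deleting the ndarray view before `close()` matters: while any view still exports the buffer, closing the segment raises BufferError.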
0 votes · 2 answers · 69 views
I have a Python function decorated with a 1-minute timeout. Inside it I spawn one or more worker threads that each run a 2-minute sleep via a dynamically executed script (using exec() in a custom ...
asked by Vikash Rajput

2 votes · 1 answer · 53 views
I'm trying to use multiprocessing in a LibreOffice Macro in Python 3.8. When I create a Macro it is part of the 'ooo_script_framework' module. So, when the process is pickled it is in fact part of an ...
asked by Hiul_Dragonfel

0 votes · 2 answers · 128 views
I am trying to make a program that has equivalents for another module's code. The module has its own thread class, with the commands 'stop' and 'sleep_for'. For example, using this module, you could ...
asked by YouEatTomatoes

2 votes · 0 answers · 84 views
I'm training deep learning models using TensorFlow (with GPU support) on my local machine. I noticed a surprising behavior: When I train just one model (in a single terminal), it runs slower. But ...
asked by Palantir

0 votes · 1 answer · 140 views
I am trying to write a Python script to run another potentially multithreaded script in a way to further parallelise them. Consider a (C++) executable which can be run as run_task <sample_number>...
asked by Lost_Soul

0 votes · 0 answers · 84 views
This question is related to the distributed hyperparameter tuning strategy of Keras Tuner to get the possible hyperparameters for my model. The search space is huge. The worker thread is as in the ...
asked by Don Woodward

0 votes · 1 answer · 212 views
I use Python to read a large CSV file, transform the data within (mostly string operations), and then write the results to a single Parquet file. In the transformation process, rows are independent. 1 ...
asked by norcalpedaler

0 votes · 0 answers · 69 views
I have created a colab (link: https://colab.research.google.com/drive/1gg57PS7KMLKvvx9wgDKLMDiyjhRICplp#scrollTo=zG2D7JO2OdEC) to play with the gpt2 fine-tuning. And I was trying to practice the DDP ...
asked by novakwang
