Took a fun couple of hours creating a function that turns his slowest step into a service (not doing his work for him--we realized this needs to exist outside of his work), and then also making a parallelized helper for it.
Yep, 10-15x speedup using 16 cores. I think the operational machines have 32 or more cores, so this is great. Gets our final runtimes down to <1s (easy case) and <10s (hard case).
#python #multiprocessing #threads #software #space #orbitalmechanics
@the_curiostech I don't have a particular go-to. My frequent #multiprocessing use-case is embarrassingly parallel so I use #python mp.Pool.map().
No rpc, no queues or locks or anything. Just "please blast this code across 10000 items and give me the results".
I find if I write "base code" any more complicated than that the bugs I encounter are too hard for my tiny brain.
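A minimal sketch of that pattern, with process_item as a hypothetical stand-in for the real per-item work:

```python
import multiprocessing as mp

def process_item(item):
    # Stand-in for expensive, fully independent per-item work.
    return item * item

if __name__ == "__main__":
    items = range(10_000)
    # Default Pool size is one worker per core; map() splits the
    # items across workers and returns results in input order.
    with mp.Pool() as pool:
        results = pool.map(process_item, items)
    print(len(results))
```

No RPC, no queues, no locks: the only shared state is the list of inputs going in and the list of results coming out.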
🚀 Supercharge your #Python projects with Aiomultiprocess! Easily integrate #multiprocessing & #asyncio with this powerful library. Learn how through a real-world web scraping example. 🔥
📖 Read more:
https://qtalen.medium.com/aiomultiprocess-super-easy-integrate-multiprocessing-asyncio-in-python-2e883b65ba46
#aiomultiprocess #webdevelopment #webdev #programming #coding
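Not from the article itself, but the library's core pattern is roughly this, assuming aiohttp for the fetches and placeholder URLs:

```python
import asyncio
from aiohttp import ClientSession
from aiomultiprocess import Pool

async def fetch(url):
    # Each worker process runs its own event loop, so coroutines
    # can be mapped across processes like ordinary functions.
    async with ClientSession() as session:
        async with session.get(url) as response:
            return response.status

async def main():
    urls = ["https://example.com"] * 10  # placeholder targets
    async with Pool() as pool:
        results = await pool.map(fetch, urls)
    print(results)

if __name__ == "__main__":
    asyncio.run(main())
```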
@jannem in my course, we use both #numba and #multiprocessing. It's amazing how often you can get a far bigger speedup just by throwing a JIT-compile decorator on something than by rewriting your code to run multiprocessing. It easily makes multiprocessing seem not worth the work.
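For comparison, the decorator route is roughly this; sum_of_squares is a toy stand-in for the hot loop:

```python
import numpy as np
from numba import njit

@njit  # compiles to machine code on the first call
def sum_of_squares(a):
    total = 0.0
    for x in a:
        total += x * x
    return total

a = np.random.rand(10_000_000)
sum_of_squares(a)  # first call pays the compile cost
print(sum_of_squares(a))  # later calls run at native speed
```

One decorator, no restructuring, and no pickling or process start-up overhead.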
If you use multiprocessing.Pool, you should also use
set_start_method('spawn')
Why, and where the problem lies, is explained here:
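A minimal sketch of where that call goes; work() is a placeholder:

```python
import multiprocessing as mp

def work(x):
    return x + 1

if __name__ == "__main__":
    # 'spawn' starts each worker as a fresh interpreter instead of
    # fork()ing the parent, so workers don't inherit the parent's
    # threads, locks, or other state mid-flight.
    mp.set_start_method("spawn")
    with mp.Pool(4) as pool:
        print(pool.map(work, range(8)))
```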
@joxean Can you instead collect the status/results from each worker as it finishes the tasks? That's the usual way to do it.
If you want realtime communication from a worker then you can use a Queue or a Pipe, depending on whether you need one way or two way messaging.
You set the Queue or Pipe object as part of the initialisation of the worker, with one end in the controller script and the other in the worker, and it abstracts away locking for you.
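A minimal sketch of that setup with a Queue for one-way messaging (worker() is a placeholder):

```python
import multiprocessing as mp

def worker(task_id, queue):
    # Worker end: report progress in real time; the Queue
    # handles the locking internally.
    queue.put((task_id, "done"))

if __name__ == "__main__":
    queue = mp.Queue()
    procs = [mp.Process(target=worker, args=(i, queue)) for i in range(4)]
    for p in procs:
        p.start()
    for _ in procs:
        print(queue.get())  # controller end: messages as they arrive
    for p in procs:
        p.join()
```

For two-way messaging, mp.Pipe() returns a pair of connection objects; hand one end to the worker and keep the other in the controller.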
@joxean what state do you need to share across workers with a Pool?
Can you just do the thing needing synchronization at the controller process, instead of in the workers? e.g. do expensive work in the workers, and print results as they come back, instead of in each worker.
Sometimes when I think I need to sync my workers I realise I can dedupe the workload upfront.
Can you share the code or problem to understand why you need locking?
#multiprocessing #python #concurrency
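A sketch of that shape, assuming expensive_work stands in for the real computation: workers only compute, and all printing happens in the controller as results come back.

```python
import multiprocessing as mp

def expensive_work(item):
    # No printing and no locks here: workers just compute and return.
    return item, item ** 2

if __name__ == "__main__":
    items = set(range(100))  # dedupe the workload upfront
    with mp.Pool() as pool:
        # imap_unordered yields results as workers finish, so the
        # controller can print them without any synchronization.
        for item, result in pool.imap_unordered(expensive_work, items):
            print(item, result)
```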
Another very long article on a topic I don't use very often. However, I learned something new and hope to remember it if I ever need it.
The author gives some easy-to-follow examples of how to screw things up with multiprocessing, and lays out strategies for fixing the race conditions shown.
https://superfastpython.com/multiprocessing-race-condition-python
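The canonical example in that genre is a shared counter; this sketch (not the article's code) shows the fixed version, and dropping the with-block reproduces the race:

```python
import multiprocessing as mp

def increment(counter):
    for _ in range(100_000):
        # Read-modify-write on shared memory: without the lock, two
        # processes can interleave here and silently lose updates.
        with counter.get_lock():
            counter.value += 1

if __name__ == "__main__":
    counter = mp.Value("i", 0)  # synchronized wrapper with its own lock
    procs = [mp.Process(target=increment, args=(counter,)) for _ in range(2)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    print(counter.value)  # 200000 with the lock; usually less without
```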
This year's programming revelation for me was the #multiprocessing library #ray. Now that I've discovered how easily you can adapt existing code to make it scale up to any number of nodes, I can never go back to Python's built-in multiprocessing, especially considering all the limitations that have suddenly evaporated in a ray :mind_blown: Also, the interface is so dead simple, I can't believe it. #python #programming #DataScience #bigdata #scalability #parallelization #interfaces #ux
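The adaptation really is about this small; slow_task is a placeholder for existing code:

```python
import ray

ray.init()  # starts a local cluster, or connects to an existing one

@ray.remote  # the only change to the existing function
def slow_task(x):
    return x * x

# .remote() returns futures immediately; ray.get() gathers the
# results, with ray scheduling tasks across all available nodes.
futures = [slow_task.remote(i) for i in range(1000)]
print(sum(ray.get(futures)))
```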
#PythonForSciComp schedule update:
- Now (xx:10), Parallel in #python. #multiprocessing is useful for many people; #MPI is more of a demo. A good 50-min intro if you want to know what's available.
#PythonForSciComp is going well. Coming up:
- Now: #scipy library ecosystem
- xx:15 or so: getting data from web APIs
- In about an hour: #parallel (#multiprocessing #MPI in #Python)
#PythonForSciComp #livestream resumes tomorrow morning 9:50 EET / 8:50 CET, with these four lessons:
- scripts (moving from #jupyter notebooks to reusable interfaces with #CommandLine)
- T+~1h: library ecosystem (#scipy ecosystem)
- T+~1h: Getting data from web APIs with #requests
- T+~2h: Parallel code in Python (#multiprocessing, #MPI, and a bit more)
All are at a relatively basic level, designed to introduce new programmers to the topics.
Fork and Run: The Definitive Guide to Getting Started With Multiprocessing - Since the early 2000s, the CPU industry has shifted from raw clock speed to core c... - https://hackaday.com/2022/09/15/fork-and-run-the-definitive-guide-to-getting-started-with-multiprocessing/ #softwaredevelopment #multiprocessing #multicore #featured #openmp #thread #fork
#RFC: I know that this already exists, but I am bothered by the lack of a useful #FOSS library.
So, I developed a #CLI tool that walks through a website, checks the reachability of every linked page, and records its latency.
This was the first time that I properly deployed parallel processing in Python with multiprocessing.
https://codeberg.org/developers/maintain-website-tool/src/branch/main
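Not the tool's actual code, but the core pattern of that kind of parallel link check looks roughly like this (URLs are placeholders):

```python
import time
import multiprocessing as mp
from urllib.request import urlopen

def check(url):
    # Measure reachability and latency of one linked page.
    start = time.perf_counter()
    try:
        with urlopen(url, timeout=10) as response:
            status = response.status
    except OSError:
        status = None  # unreachable
    return url, status, time.perf_counter() - start

if __name__ == "__main__":
    urls = ["https://example.com", "https://example.org"]  # placeholders
    with mp.Pool(8) as pool:
        for url, status, latency in pool.map(check, urls):
            print(f"{url}: status={status}, latency={latency:.3f}s")
```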
Going from 45 minutes of processing down to 4 minutes, thank you
#python #multiprocessing 😍