This article shows several ways to run two or more tasks at the same time in Python:

- Thread
- Thread pool
- Process pool
- Event loop (coroutine)
Threads let you run multiple functions at the same time. Pass a function as `target` to the `threading.Thread` class and start it with `start()`.
```python
import time
import threading

def func1():
    while True:
        print("func1")
        time.sleep(1)

def func2():
    while True:
        print("func2")
        time.sleep(1)

if __name__ == "__main__":
    thread_1 = threading.Thread(target=func1)
    thread_2 = threading.Thread(target=func2)
    thread_1.start()
    thread_2.start()
```
Execution result:

```
func1
func2
func2
func1
func1
func2
func2
func1
```
It gets even better with the `concurrent.futures` package, available in Python 3.2 and later: use its `ThreadPoolExecutor` class. You set the maximum number of concurrent workers up front with `max_workers`, and the pool reuses its threads, which makes it smarter than the plain threads introduced above. If you are on a new enough Python, it is worth using actively.
```python
import time
import concurrent.futures

def func1():
    while True:
        print("func1")
        time.sleep(1)

def func2():
    while True:
        print("func2")
        time.sleep(1)

if __name__ == "__main__":
    executor = concurrent.futures.ThreadPoolExecutor(max_workers=2)
    executor.submit(func1)
    executor.submit(func2)
```
Execution result:

```
func1
func2
func1
func2
func1
func2
func1
func2
```
The same `concurrent.futures` package also provides a **process pool** alongside the thread pool. Because the work is divided into processes rather than threads, it is not subject to the Global Interpreter Lock (GIL), **so it can run on multiple cores.** However, a process is heavier-weight than a thread, so other restrictions may come into play. Caution! Usage is easy: just change the `ThreadPoolExecutor` introduced above to `ProcessPoolExecutor`.
```python
import time
import concurrent.futures

def func1():
    while True:
        print("func1")
        time.sleep(1)

def func2():
    while True:
        print("func2")
        time.sleep(1)

if __name__ == "__main__":
    executor = concurrent.futures.ProcessPoolExecutor(max_workers=2)
    executor.submit(func1)
    executor.submit(func2)
```
Execution result:

```
func1
func2
func1
func2
func1
func2
func1
func2
```
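Because each worker is a separate process with its own interpreter, CPU-bound functions can genuinely run on multiple cores. A sketch using `executor.map` (the `cpu_heavy` function is illustrative; note that workers must be defined at module top level so they can be pickled):

```python
import concurrent.futures

def cpu_heavy(n):
    # Pure-Python arithmetic: a thread would hold the GIL here,
    # but each worker process has its own interpreter and GIL.
    return sum(i * i for i in range(n))

def run_pool():
    with concurrent.futures.ProcessPoolExecutor(max_workers=2) as executor:
        # map distributes the inputs across the worker processes
        return list(executor.map(cpu_heavy, [10, 100]))

if __name__ == "__main__":
    print(run_pool())  # [285, 328350]
```

The `if __name__ == "__main__"` guard is required here: on platforms that spawn worker processes, each worker re-imports the module, and the guard keeps it from re-launching the pool.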
There is also a way to run multiple tasks in a single thread: the **event loop**. In Python 3.4 or later, it is provided by the `asyncio` module. To understand how it differs from multithreading and when to use it, the article "Asynchronous processing in Python: asyncio reverse lookup reference" is a good read. For asynchronous I/O such as network communication and file access, it is far more efficient than adding threads, but the concepts take some getting used to. The sample code looks quite different from the thread versions: we wait with `asyncio.sleep` instead of `time.sleep`, because calling `asyncio.sleep` hands control to another concurrent task while it waits. That is exactly what a coroutine does.
```python
import asyncio

@asyncio.coroutine
def func1():
    while True:
        print("func1")
        yield from asyncio.sleep(1)

@asyncio.coroutine
def func2():
    while True:
        print("func2")
        yield from asyncio.sleep(1)

if __name__ == "__main__":
    loop = asyncio.get_event_loop()
    tasks = asyncio.wait([func1(), func2()])
    loop.run_until_complete(tasks)
```
Execution result:

```
func2
func1
func2
func1
func2
func1
func2
func1
```
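Note that the `@asyncio.coroutine` / `yield from` style above is the original Python 3.4 syntax; it was deprecated later and removed in Python 3.11. On Python 3.5+ the same idea is written with `async def` / `await` (this modern version is a sketch, not from the original article; the finite loop and `log` list are added so the program terminates):

```python
import asyncio

async def tick(name, repeat, log):
    # Each await hands control back to the event loop,
    # letting the other coroutine run while this one waits.
    for _ in range(repeat):
        log.append(name)
        await asyncio.sleep(0.01)

async def main():
    log = []
    # gather runs both coroutines concurrently on a single thread
    await asyncio.gather(tick("func1", 2, log), tick("func2", 2, log))
    return log

if __name__ == "__main__":
    log = asyncio.run(main())  # Python 3.7+
    print(log)  # the two coroutines interleave, e.g. ['func1', 'func2', ...]
```

`asyncio.run` creates the event loop, runs `main()` to completion, and closes the loop, replacing the manual `get_event_loop()` / `run_until_complete()` dance.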