Python + Multi-Threading

Multi-threading - running multiple threads of execution concurrently within a single process, with all threads sharing the same memory space.

Background:

While developing an integration between two APIs, I needed to implement multi-threading within my Python/Flask application so it could return a response to an incoming HTTP POST request while allowing further data processing to continue in the background.

Challenge:

The request-sending API has a maximum response wait time of 10 seconds, and the request-receiving API takes approximately 6-7 seconds to complete a cloning action. On top of that, 5-6 other sub-actions must be executed *after* the cloning is complete.

Solution:

Multi-Threading!

In Python, threads can't achieve true parallelism because the Global Interpreter Lock (GIL) allows only one thread to execute Python bytecode at a time. Concurrent execution with worker threads still adds real efficiency, though, when a script spends much of its time waiting on I/O-bound operations, because the GIL is released while a thread blocks on I/O.
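As a minimal sketch of that point: the `fetch` function below simulates an I/O-bound call with a sleep (the URL list and function name are illustrative, not from the original integration). Six sequential 0.2-second waits would take about 1.2 seconds; spreading them across three worker threads finishes in roughly a third of that, because the threads wait in parallel.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fetch(url):
    # Stand-in for an I/O-bound call (e.g. an HTTP request):
    # while this thread sleeps, the GIL is free for other threads.
    time.sleep(0.2)
    return f"fetched {url}"

urls = [f"https://example.com/{i}" for i in range(6)]

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=3) as pool:
    # pool.map fans the calls out across the worker threads and
    # yields results in input order.
    results = list(pool.map(fetch, urls))
elapsed = time.perf_counter() - start
```

Run sequentially, the same six calls would block one after another; with three workers they complete in two overlapping batches.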

In my case, the main cloning action takes 6-7 seconds before returning a success response, and the 5-6 subsequent sub-actions all depend on a UID returned within that response. By using a thread pool and running the sub-actions concurrently across three worker threads, I can return a response to the requesting API before the 10-second timeout is reached, while handing the UID off to the sub-actions running in threads outside the main request flow. Threads are lighter than processes and share the same memory space, which makes passing the UID along trivial.
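The hand-off pattern above can be sketched as follows. This is a simplified stand-in, not the author's actual handler: `handle_request`, `run_sub_action`, the sub-action names, and the UID value are all hypothetical, and the Flask routing layer is omitted so the pattern itself is visible.

```python
import threading
import time
from concurrent.futures import ThreadPoolExecutor

# Three worker threads for the follow-up work, mirroring the setup
# described above.
pool = ThreadPoolExecutor(max_workers=3)

SUB_ACTIONS = [f"sub_action_{i}" for i in range(5)]  # hypothetical names
completed = []
all_done = threading.Event()

def run_sub_action(name, uid):
    # Stand-in for one of the real follow-up calls that needs the UID.
    time.sleep(0.1)
    completed.append((name, uid))
    if len(completed) == len(SUB_ACTIONS):
        all_done.set()

def handle_request():
    uid = "abc-123"  # UID returned by the cloning action (illustrative)
    # Fan the sub-actions out to the worker threads, then return
    # immediately: the caller gets its response well inside the
    # 10-second window while the sub-actions finish in the background.
    for name in SUB_ACTIONS:
        pool.submit(run_sub_action, name, uid)
    return {"status": "accepted", "uid": uid}

response = handle_request()
```

In a real Flask view, `handle_request` would be the route function and the response dict would be serialized to JSON; the key point is that `pool.submit` does not block, so the response goes out before the sub-actions complete.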

For my purposes, Python's standard-library concurrent.futures module was enough; for heavier workloads there are more advanced third-party options such as Celery, a distributed task queue. I plan on writing about more advanced queue management tools, so check for another post and subscribe below.