Is there a multithreaded map() function?

0 votes
asked Apr 1, 2010 by sandro

I have a function that is side-effect free. I would like to run it for every element in an array and return an array with all of the results.

Does Python have something to generate all of the values?

7 Answers

0 votes
answered Jan 1, 2010 by thomas

This functionality is not built in. However, someone has already implemented it.

0 votes
answered Jan 1, 2010 by adam-nelson

Maybe try the Unladen Swallow Python 3 implementation? That might be a major project, and not guaranteed to be stable, but if you're inclined it could work. Then list or set comprehensions seem like the proper functional structure to use.

0 votes
answered Apr 1, 2010 by samtregar

Try the Pool.map function from multiprocessing:

It's not multithreaded per se, but that's actually good, since multithreading is severely crippled in Python by the GIL.
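A minimal sketch of what that looks like (the worker function `square` is illustrative):

```python
from multiprocessing import Pool

def square(x):
    # Worker functions must be defined at module level so the
    # child processes can import (pickle) them.
    return x * x

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        # Pool.map splits the iterable across worker processes
        # and returns the results in input order.
        results = pool.map(square, range(10))
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```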

0 votes
answered Apr 1, 2010 by braincore

You can use the multiprocessing python package. The cloud python package, available from PiCloud, offers a multi-processing map() function as well, which can offload your map to the cloud.

0 votes
answered Apr 1, 2010 by danben

I would think there would be no reason to have such a function. Because of the GIL, only one Python thread can execute bytecode at a time. Assuming your map function is CPU-bound with no I/O component, you would not see any speedup in processing (and would probably see a slowdown due to context switching).

Other posters have mentioned multiprocessing - that is probably a better idea.

0 votes
answered Jan 4, 2015 by maximilian

Python now has the concurrent.futures module, which is the simplest way of getting map to work with either multiple threads or multiple processes.
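For example, a thread-based map with concurrent.futures might look like this (the worker function `fetch_length` is a stand-in for an I/O-bound, side-effect-free function):

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_length(s):
    # Stand-in for a side-effect-free function; in practice this
    # would be something I/O-bound, like an HTTP request.
    return len(s)

words = ["python", "map", "threads"]
with ThreadPoolExecutor(max_workers=4) as pool:
    # Executor.map works like the built-in map(), but runs the
    # calls on a thread pool and yields results in input order.
    results = list(pool.map(fetch_length, words))
print(results)  # [6, 3, 7]
```

Swapping `ThreadPoolExecutor` for `ProcessPoolExecutor` gives the multi-process variant with the same interface.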

0 votes
answered Sep 15, 2017 by speedplane

Below is my map_parallel function. It works just like map, except it can process each element in parallel, each in a separate thread (but see the note below). This answer builds upon another SO answer.

import sys
import threading
import logging
import traceback

def map_parallel(f, iter, max_parallel = 10):
    """Just like map(f, iter) but each is done in a separate thread."""
    # Put all of the items in the queue, keep track of order.
    from queue import Queue, Empty
    total_items = 0
    queue = Queue()
    for i, arg in enumerate(iter):
        queue.put((i, arg))
        total_items += 1
    # No point in creating more thread objects than necessary.
    if max_parallel > total_items:
        max_parallel = total_items

    # The worker thread.
    res = {}
    errors = {}
    class Worker(threading.Thread):
        def run(self):
            while not errors:
                try:
                    num, arg = queue.get(block = False)
                    try:
                        res[num] = f(arg)
                    except Exception:
                        errors[num] = sys.exc_info()
                except Empty:
                    break

    # Create the threads.
    threads = [Worker() for _ in range(max_parallel)]
    # Start the threads.
    [t.start() for t in threads]
    # Wait for the threads to finish.
    [t.join() for t in threads]

    if errors:
        if len(errors) > 1:
            logging.warning("map_parallel multiple errors: %d:\n%s"%(
                len(errors), errors))
        # Just raise the first one.
        item_i = min(errors.keys())
        type, value, tb = errors[item_i]
        # Log the original traceback.
        logging.error("map_parallel exception on item %s/%s:\n%s"%(
            item_i, total_items, "\n".join(traceback.format_tb(tb))))
        raise value
    return [res[i] for i in range(len(res))]

NOTE: One thing to be careful of is exceptions. Like normal map, the above function raises an exception if one of its sub-threads raises an exception, and will stop iteration. However, due to the parallel nature, there's no guarantee that the earliest element will raise the first exception.
