Trying to use multiprocessing to fill an array in Python


I have this code:

import numpy as np
from multiprocessing import Process, Pool

x = 3; y = 3; z = 10
ar = np.zeros((x,y,z))

para = []
process = []

def local_func(section):
    print "section %s" % str(section)
    ar[2,2,section] = 255
    print "value set %d" % ar[2,2,section]

pool = Pool(1)

run_list = range(0,10)
list_of_results = pool.map(local_func, run_list)

print ar

The values in ar are not changed by the multiprocessing code. What might be wrong?

Thanks.

You're using multiple processes here, not multiple threads. Because of that, each instance of local_func gets its own separate copy of ar. You can use a custom manager to create a shared numpy array, which you can pass to each child process to get the results you expect:

import numpy as np
from functools import partial
from multiprocessing import Process, Pool
import multiprocessing.managers

x = 3; y = 3; z = 10

class MyManager(multiprocessing.managers.BaseManager):
    pass

# the array is created in the manager's server process; workers only get a proxy to it
MyManager.register('np_zeros', np.zeros, multiprocessing.managers.ArrayProxy)

para = []
process = []

def local_func(ar, section):
    print "section %s" % str(section)
    ar[2,2,section] = 255
    print "value set %d" % ar[2,2,section]

if __name__ == "__main__":
    m = MyManager()
    m.start()
    ar = m.np_zeros((x,y,z))

    pool = Pool(1)

    run_list = range(0,10)
    func = partial(local_func, ar)   # bind the shared array as the first argument
    list_of_results = pool.map(func, run_list)

    print ar
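As a side note, if the workers don't actually need to mutate the array in place, a simpler pattern is to have each worker return its value and let the parent process, which owns ar, write the results itself. A minimal sketch of that alternative, assuming the same 3x3x10 array and a made-up compute_section helper:

import numpy as np
from multiprocessing import Pool

x = 3; y = 3; z = 10

def compute_section(section):
    # each worker computes and returns a value; nothing is shared between processes
    return 255

if __name__ == "__main__":
    ar = np.zeros((x,y,z))
    pool = Pool(1)
    results = pool.map(compute_section, range(z))
    # the parent process owns ar, so these writes show up in the final print
    for section, value in enumerate(results):
        ar[2,2,section] = value
    print ar

This trades the per-element proxy calls of the manager version for a single copy of the results back to the parent.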
