
how to share ctable object in share memory with multi process #412

WoolenWang opened this issue Jan 17, 2021 · 2 comments



WoolenWang commented Jan 17, 2021

import bcolz
import numpy as np
from multiprocessing import Process
from multiprocessing.sharedctypes import RawArray

# decompress the whole ctable into a plain ndarray
the_ctable = bcolz.open('test_ctable.bcolz', mode='r')
table_ndarray = the_ctable[:]

# allocate shared memory and copy the (now uncompressed) data into it
the_memory = RawArray('b', table_ndarray.nbytes)
dst_data = np.ndarray(shape=table_ndarray.shape, dtype=table_ndarray.dtype, buffer=the_memory)
np.copyto(dst_data, table_ndarray)

# hand the shared buffer to a worker process
proc = Process(target=other_process, args=(the_memory,))
proc.start()

Sharing the ctable data this way works, since the ndarray is visible to both processes, but the data is decompressed and consumes far too much memory. Is there any way to share the ctable object itself, still compressed, in shared memory?
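
For reference, the receiving side has to wrap the buffer back into an ndarray itself; a minimal sketch of what other_process could look like, assuming the dtype and shape are passed along as well (they are not carried by the RawArray):

import numpy as np

def other_process(shared_memory, dtype, shape):
    # wrap the shared buffer in an ndarray view; no copy is made, so
    # both processes read the same (uncompressed) bytes
    data = np.ndarray(shape=shape, dtype=dtype, buffer=shared_memory)
    print(data[:5])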

@kaybinwong

You can take a shot at shared-array, good luck.

Do you know how to read data from a carray?
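
For anyone trying this suggestion, a minimal sketch, assuming the SharedArray package from PyPI (the segment name shm://ctable_demo is just an example):

import numpy as np
import SharedArray as sa

# create a named shared-memory array, visible to other processes by name
src = sa.create("shm://ctable_demo", 10, dtype=np.float64)
src[:] = np.arange(10)

# any process can attach to the same name; no copy is involved
view = sa.attach("shm://ctable_demo")
print(view.sum())

# free the segment when done
sa.delete("shm://ctable_demo")

Note that this still shares a plain, uncompressed ndarray, which is exactly the limitation raised in the next comment.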

@WoolenWang

> You can take a shot at shared-array, good luck.
>
> Do you know how to read data from a carray?

The most important thing is that a numpy ndarray is not compressed, while a bcolz ctable is compressed in memory, so the memory usage is significantly different. Doesn't the shared-array package still just share an uncompressed numpy ndarray?
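
One possible workaround, a sketch rather than a tested answer: share the compressed bytes instead of the decompressed array. Compress with blosc (the codec bcolz uses internally), put the compressed blob into shared memory, and decompress in the worker only when the data is needed. The helper names below are made up for the example:

import blosc
import numpy as np
from multiprocessing import Process
from multiprocessing.sharedctypes import RawArray

def make_shared_compressed(arr):
    # compress the raw ndarray bytes with blosc
    packed = blosc.compress(arr.tobytes(), typesize=arr.dtype.itemsize)
    shm = RawArray('b', len(packed))
    memoryview(shm).cast('B')[:] = packed
    return shm, arr.dtype, arr.shape

def worker(shm, dtype, shape):
    # decompress on demand inside the worker
    raw = blosc.decompress(bytes(shm))
    data = np.frombuffer(raw, dtype=dtype).reshape(shape)
    print(data.shape)

if __name__ == '__main__':
    arr = np.arange(1_000_000, dtype=np.float64)
    shm, dtype, shape = make_shared_compressed(arr)
    proc = Process(target=worker, args=(shm, dtype, shape))
    proc.start()
    proc.join()

The trade-off is that each worker pays the decompression cost and the peak memory for whatever it materializes, but only one compressed copy lives in shared memory.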
