bsb_hdf5 package¶
Submodules¶
bsb_hdf5.chunks module¶
The chunks module provides the tools the HDF5 engine uses to store the chunked placement data received from the placement module in separate datasets, so that scaffold models can be parallelized and scaled arbitrarily.
The module provides the ChunkLoader mixin for
Resource objects (e.g. PlacementSet,
ConnectivitySet) to organize ChunkedProperty and ChunkedCollection
objects within them.
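To make the layout concrete, here is a stdlib-only sketch of how a per-chunk dataset path could be composed. The helper name, the "chunks" group name, and the chunk id formatting are illustrative assumptions, not the engine's verbatim schema; only the set roots (/placement/<tag>, /connectivity/<tag>) are documented below.

```python
# Illustrative sketch: one HDF5 group per chunk, one dataset per chunked
# property. The "chunks" group name and chunk id formatting are
# assumptions for illustration, not the engine's exact schema.
def chunk_dataset_path(set_path, chunk, prop):
    chunk_id = ".".join(str(c) for c in chunk)
    return f"{set_path}/chunks/{chunk_id}/{prop}"

path = chunk_dataset_path("/placement/my_cells", (0, 0, 0), "position")
print(path)  # /placement/my_cells/chunks/0.0.0/position
```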
- class bsb_hdf5.chunks.ChunkLoader¶
Resource mixin to organize chunked properties and collections within itself.
- Parameters:
properties (Iterable) – An iterable of functions that construct ChunkedProperty objects.
collections (Iterable) – An iterable of names for constructing ChunkedCollection objects.
- chunk_context(chunks)¶
- clear_chunk_filter()¶
- exclude_chunk(chunk)¶
Exclude a chunk from the data when loading properties/collections.
- get_all_chunks(handle=None)¶
- get_chunk_path(chunk=None, collection=None, key=None)¶
Return the full HDF5 path of a chunk.
- Parameters:
chunk (bsb.storage._chunks.Chunk) – Chunk
- Returns:
HDF5 path
- Return type:
- get_loaded_chunks()¶
- include_chunk(chunk)¶
Include a chunk in the data when loading properties/collections.
- require_chunk(chunk, handle=None)¶
Create a chunk if it doesn’t exist yet, or do nothing.
- set_chunk_filter(chunks)¶
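The filter methods above gate which chunks contribute data when properties and collections are loaded. As a behavioral sketch only (a pure-Python mock of the assumed semantics, not the real ChunkLoader), filtering could work like this:

```python
# Mock of ChunkLoader-style chunk filtering semantics (an assumption
# about behavior, not the real implementation): loading only considers
# chunks that pass the active filter; no filter means "all chunks".
class MockChunkLoader:
    def __init__(self, data):
        self._data = data          # {chunk: values}
        self._filter = None        # None means no filter is active

    def set_chunk_filter(self, chunks):
        self._filter = set(chunks)

    def include_chunk(self, chunk):
        if self._filter is not None:
            self._filter.add(chunk)

    def exclude_chunk(self, chunk):
        if self._filter is None:
            self._filter = set(self._data)
        self._filter.discard(chunk)

    def clear_chunk_filter(self):
        self._filter = None

    def get_loaded_chunks(self):
        if self._filter is None:
            return sorted(self._data)
        return sorted(c for c in self._data if c in self._filter)

loader = MockChunkLoader({(0, 0, 0): [1], (1, 0, 0): [2]})
loader.exclude_chunk((1, 0, 0))
print(loader.get_loaded_chunks())  # [(0, 0, 0)]
```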
- class bsb_hdf5.chunks.ChunkedCollection(loader, collection, shape, dtype, insert=None, extract=None)¶
Chunked collections are stored inside the chunks group of the ChunkLoader they belong to.
Inside the chunks group, another group is created per chunk, inside which a group exists per collection. Arbitrarily named datasets can be stored inside of this collection.
- append(chunk, key, data, handle=None, **kwargs)¶
Append data to a collection chunk. Will create it if it doesn’t exist.
- Parameters:
data – Data to append to the chunked collection.
chunk (bsb.storage._chunks.Chunk) – Chunk
- clear(chunk, handle=None)¶
- keys(handle=None)¶
- load(key, handle=None, **kwargs)¶
- load_all(handle=None, **kwargs)¶
- overwrite(chunk, data, key, handle=None, **kwargs)¶
- class bsb_hdf5.chunks.ChunkedProperty(loader, property, shape, dtype, insert=None, extract=None, collection=None)¶
Chunked properties are stored inside the chunks group of the ChunkLoader they belong to.
Inside the chunks group, another group is created per chunk, inside which a dataset exists per property.
- append(chunk, data, key=None, handle=None)¶
Append data to a property chunk. Will create it if it doesn’t exist.
- Parameters:
data – Data to append to the chunked property.
chunk (bsb.storage._chunks.Chunk) – Chunk
- clear(chunk, key=None, handle=None)¶
- get_chunk_reader(handle, raw, key=None, pad_by=None)¶
Create a chunk reader that either returns the raw data or extracts it.
- load(raw=False, key=None, pad_by=None, handle=None)¶
- overwrite(chunk, data, key=None, handle=None)¶
bsb_hdf5.connectivity_set module¶
- class bsb_hdf5.connectivity_set.CSIterator(cs, direction=None, local_=None, global_=None)¶
- get_global_iter(direction, local_, global_)¶
- get_local_iter(direction, local_)¶
- class bsb_hdf5.connectivity_set.ConnectivitySet(engine, tag, handle=None)¶
Fetches connectivity data from storage.
Note
Use Scaffold.get_connectivity_set to correctly obtain a ConnectivitySet.
- chunk_connect(src_chunk, dst_chunk, src_locs, dst_locs, handle=None)¶
Must connect the src_locs to the dst_locs, interpreting the cell ids (first column of the locs) as the cell rank in the chunk.
- clear(handle=None)¶
Must clear (some chunks of) the connectivity set.
- connect(pre_set, post_set, src_locs, dest_locs, handle=None)¶
Must connect the src_locs to the dest_locs, interpreting the cell ids (first column of the locs) as the cell rank in the placement set.
- classmethod create(engine, pre_type, post_type, tag=None, handle=None)¶
Create the structure for this connectivity set in the HDF5 file.
Connectivity sets are stored under /connectivity/<tag>.
- static exists(engine, tag, handle=None)¶
Checks whether a ConnectivitySet with the given tag exists.
- Parameters:
engine (HDF5Engine) – Engine to use for the lookup.
tag (str) – Tag of the set to look for.
handle (h5py.File) – An open handle to use instead of opening one.
- Returns:
Whether the tag exists.
- Return type:
- flat_iter_connections(direction=None, local_=None, global_=None)¶
Iterates over the connectivity data.
```python
for dir, lchunk, gchunk, data in self.flat_iter_connections():
    print(f"Flat {dir} block between {lchunk} and {gchunk}")
```
If a keyword argument is given, that axis is not iterated over, and the value is fixed in each iteration.
- Parameters:
direction (str) – When omitted, iterates inc and out. When given, it restricts the iteration to the given value.
local (Union[Chunk, list[Chunk]]) – When omitted, iterates over all local chunks in the set. When given, it restricts the iteration to the given value(s).
global (Union[Chunk, list[Chunk]]) – When omitted, iterates over all global chunks in the set. When given, it restricts the iteration to the given value(s).
- Returns:
Yields the direction, local chunk, global chunk, and data. The data is a tuple of the local and global connection locations.
- Return type:
tuple[str, Chunk, Chunk, tuple[numpy.ndarray, numpy.ndarray]]
- get_chunk_stats(handle=None)¶
- get_global_chunks(direction, local_, handle=None)¶
Must list all the global chunks that contain data coming from a local chunk in the given direction.
- get_local_chunks(direction, handle=None)¶
Must list all the local chunks that contain data in the given direction ("inc" or "out").
- classmethod get_tags(engine, handle=None)¶
Returns all the connectivity tags in the network.
- load_block_connections(direction, local_, global_, handle=None)¶
Load the connection block with given direction between the given local and global chunk.
- Parameters:
- Returns:
The local and global connections locations
- Return type:
Tuple[numpy.ndarray, numpy.ndarray]
- load_local_connections(direction, local_, handle=None)¶
Load all the connections of the given local chunk.
- Parameters:
- Returns:
The local connection locations, a vector of the global connection chunks (1 chunk id per connection) and the global connections locations. To identify a cell in the global connections, use the corresponding chunk id from the second return value.
- Return type:
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray]
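The chunk-id vector in the second return value maps each connection row to the global chunk it points into, so the flat output can be regrouped into per-chunk blocks. A small stdlib sketch with hypothetical location data:

```python
from collections import defaultdict

# Sketch: regroup flat "one global chunk id per connection" output into
# per-chunk blocks. All arrays here are hypothetical example data.
local_locs = [(0, -1, -1), (1, -1, -1), (2, -1, -1)]   # cell, branch, point
global_chunk_ids = [7, 7, 9]                            # chunk id per connection
global_locs = [(4, -1, -1), (5, -1, -1), (0, -1, -1)]

blocks = defaultdict(list)
for lloc, cid, gloc in zip(local_locs, global_chunk_ids, global_locs):
    blocks[cid].append((lloc, gloc))

print(sorted(blocks))  # [7, 9]
```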
- nested_iter_connections(direction=None, local_=None, global_=None)¶
Iterates over the connectivity data, leaving room for the end-user to set up nested for loops:
```python
for dir, local_itr in self.nested_iter_connections():
    for lchunk, global_itr in local_itr:
        print("I can do something at the start of a new local chunk")
        for gchunk, data in global_itr:
            print(f"Nested {dir} block between {lchunk} and {gchunk}")
        print("Or right before we move to the next local chunk")
```
If a keyword argument is given, that axis is not iterated over, and the amount of nested loops is reduced.
- Parameters:
direction (str) – When omitted, iterates inc and out. When given, pins the iteration to the given value.
local (Union[Chunk, list[Chunk]]) – When omitted, iterates over all local chunks in the set. When given, it restricts the iteration to the given value(s).
global (Union[Chunk, list[Chunk]]) – When omitted, iterates over all global chunks in the set. When given, it restricts the iteration to the given value(s).
- Returns:
An iterator that produces the next unrestricted iteration values, or the connection dataset that matches the iteration combination.
- classmethod require(engine, pre_type, post_type, tag=None, handle=None)¶
Get or create a ConnectivitySet.
- Parameters:
engine (HDF5Engine) – Engine to fetch/write the data.
pre_type (CellType) – Presynaptic cell type.
post_type (CellType) – Postsynaptic cell type.
tag (str) – Tag to store the set under. Defaults to {pre_type.name}_to_{post_type.name}.
- Returns:
Existing or new connectivity set.
- Return type:
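The default tag documented above is derived from the two cell type names. A tiny sketch (the CellType class here is a hypothetical stub standing in for bsb's):

```python
# Sketch of the documented default tag, {pre_type.name}_to_{post_type.name}.
# CellType is a stub for illustration, not bsb's real class.
class CellType:
    def __init__(self, name):
        self.name = name

def default_tag(pre_type, post_type):
    return f"{pre_type.name}_to_{post_type.name}"

tag = default_tag(CellType("granule_cell"), CellType("purkinje_cell"))
print(tag)  # granule_cell_to_purkinje_cell
```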
- exception bsb_hdf5.connectivity_set.LocationOutOfBoundsError¶
- bsb_hdf5.connectivity_set.get_dir_iter(direction)¶
bsb_hdf5.file_store module¶
bsb_hdf5.morphology_repository module¶
- class bsb_hdf5.morphology_repository.MetaEncoder(*, skipkeys=False, ensure_ascii=True, check_circular=True, allow_nan=True, sort_keys=False, indent=None, separators=None, default=None)¶
Encodes morphology metadata to JSON.
Constructor for JSONEncoder, with sensible defaults.
If skipkeys is false, then it is a TypeError to attempt encoding of keys that are not str, int, float or None. If skipkeys is True, such items are simply skipped.
If ensure_ascii is true, the output is guaranteed to be str objects with all incoming non-ASCII characters escaped. If ensure_ascii is false, the output can contain non-ASCII characters.
If check_circular is true, then lists, dicts, and custom encoded objects will be checked for circular references during encoding to prevent an infinite recursion (which would cause a RecursionError). Otherwise, no such check takes place.
If allow_nan is true, then NaN, Infinity, and -Infinity will be encoded as such. This behavior is not JSON specification compliant, but is consistent with most JavaScript based encoders and decoders. Otherwise, it will be a ValueError to encode such floats.
If sort_keys is true, then the output of dictionaries will be sorted by key; this is useful for regression tests to ensure that JSON serializations can be compared on a day-to-day basis.
If indent is a non-negative integer, then JSON array elements and object members will be pretty-printed with that indent level. An indent level of 0 will only insert newlines. None is the most compact representation.
If specified, separators should be an (item_separator, key_separator) tuple. The default is (', ', ': ') if indent is None and (',', ': ') otherwise. To get the most compact JSON representation, you should specify (',', ':') to eliminate whitespace.
If specified, default is a function that gets called for objects that can’t otherwise be serialized. It should return a JSON encodable version of the object or raise a TypeError.
- default(o)¶
Implement this method in a subclass such that it returns a serializable object for o, or calls the base implementation (to raise a TypeError).
For example, to support arbitrary iterators, you could implement default like this:

```python
def default(self, o):
    try:
        iterable = iter(o)
    except TypeError:
        pass
    else:
        return list(iterable)
    # Let the base class default method raise the TypeError
    return super().default(o)
```
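Such a subclassed encoder is then passed to json.dumps via the cls argument. A stdlib-only usage sketch (the IterEncoder name is ours, not part of bsb_hdf5):

```python
import json

# Encoder that serializes any iterable (here, a set) as a JSON list,
# following the default() pattern shown above.
class IterEncoder(json.JSONEncoder):
    def default(self, o):
        try:
            iterable = iter(o)
        except TypeError:
            pass
        else:
            return list(iterable)
        return super().default(o)

out = json.dumps({"tags": {"axon"}}, cls=IterEncoder)
print(out)  # {"tags": ["axon"]}
```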
- class bsb_hdf5.morphology_repository.MorphologyRepository(engine)¶
- all(handle=None)¶
Fetch all the stored morphologies.
- Returns:
List of the stored morphologies.
- Return type:
- get_all_meta(handle=None)¶
Get the metadata of all stored morphologies.
- Returns:
Metadata dictionary
- Return type:
- get_meta(name, handle=None)¶
Get the metadata of a stored morphology.
- has(name, handle=None)¶
Check whether a morphology under the given name exists.
- load(name, preloaded_meta=None, handle=None)¶
Load a stored morphology as a constructed morphology object.
- Parameters:
name (str) – Key of the stored morphology.
- Returns:
A morphology
- Return type:
- preload(name, meta=None, handle=None)¶
Load a stored morphology as a morphology loader.
- Parameters:
name (str) – Key of the stored morphology.
- Returns:
The stored morphology
- Return type:
- remove(name, handle=None)¶
- save(name, morphology, overwrite=False, update_meta=True, handle=None)¶
Store a morphology.
- Parameters:
name (str) – Key to store the morphology under.
morphology (bsb.morphologies.Morphology) – Morphology to store
overwrite (bool) – Overwrite any stored morphology that already exists under that name
- Returns:
The stored morphology
- Return type:
- select(*selectors)¶
Select stored morphologies.
- Parameters:
selectors (list[bsb.morphologies.selector.MorphologySelector]) – Any number of morphology selectors.
- Returns:
All stored morphologies that match at least one selector.
- Return type:
- bsb_hdf5.morphology_repository.meta_object_hook(obj)¶
bsb_hdf5.placement_set module¶
- class bsb_hdf5.placement_set.PlacementSet(engine, cell_type)¶
Fetches placement data from storage.
Note
Use Scaffold.get_placement_set to correctly obtain a PlacementSet.
- append_additional(name, chunk, data)¶
Append arbitrary user data to the placement set. The length of the data must match that of the placement set, and must be storable by the engine.
- Parameters:
name
chunk (Chunk) – The chunk to store data in.
data (numpy.ndarray) – Arbitrary user data. You decide ❤️
- append_data(chunk, positions=None, morphologies=None, rotations=None, additional=None, count=None, handle=None)¶
Append data to the placement set.
- Parameters:
chunk (Chunk) – The chunk to store data in.
positions (numpy.ndarray) – Cell positions
rotations (RotationSet) – Cell rotations
morphologies (MorphologySet) – Cell morphologies
additional (dict) – Additional data to attach to chunk
count (int) – Amount of entities to place. Excludes the use of any positional, rotational or morphological data.
handle (h5py.Group) – HDF5 file handle
- append_entities(chunk, count, additional=None)¶
Append entities to the placement set.
- clear(chunks=None, handle=None)¶
Clear (some chunks of) the placement set.
- Parameters:
chunks (list[bsb.storage._chunks.Chunk]) – If given, the specific chunks to clear.
- convert_to_local(ids, handle=None)¶
Convert a list of global ids to local ids. If the PlacementSet is not separated into chunks, the ids are checked against a range spanning the full size of the placement set.
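Assuming global ids run contiguously over the chunks in iteration order (an assumption for illustration, not a statement about the engine's actual id scheme), the conversion could be sketched as:

```python
import bisect

# Sketch of global-to-local id conversion. chunk_counts holds the number
# of cells placed per chunk, in chunk iteration order; the contiguous-id
# assumption is ours, for illustration only.
def convert_to_local(ids, chunk_counts):
    offsets = [0]
    for n in chunk_counts:
        offsets.append(offsets[-1] + n)
    out = []
    for gid in ids:
        if not 0 <= gid < offsets[-1]:
            raise IndexError(f"global id {gid} out of range")
        chunk = bisect.bisect_right(offsets, gid) - 1
        out.append((chunk, gid - offsets[chunk]))
    return out

print(convert_to_local([0, 4, 5], [3, 2, 4]))  # [(0, 0), (1, 1), (2, 0)]
```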
- classmethod create(engine, cell_type, handle=None)¶
Create the structure for this placement set in the HDF5 file.
Placement sets are stored under /placement/<tag>.
- static exists(engine, cell_type, handle=None)¶
Check existence of a placement set.
- Parameters:
engine (bsb.storage.interfaces.Engine) – The engine that governs the existence check.
cell_type (bsb.cell_types.CellType) – The cell type to look for.
- Returns:
Whether the placement set exists.
- Return type:
- get_chunk_stats(handle=None)¶
Should return how many cells were placed in each chunk.
- get_label_mask(labels=None, handle=None)¶
Should return a mask that fits the placement set for the cells with the given labels. To filter non-labelled cells, set labels to an empty list.
- Parameters:
- Return type:
- get_labelled(labels=None, handle=None)¶
Should return the ids of the cells labelled with the given labels. To filter non-labelled cells, set labels to an empty list.
- Parameters:
- Return type:
- get_unique_labels(handle=None)¶
Should return the unique labels assigned to the cells.
- label(labels, cells, handle=None)¶
Should label the cells with given labels.
- label_by_mask(labels, mask, handle=None)¶
Should label the masked cells with the given labels.
- load_additional(key=None, handle=None)¶
- load_ids(handle=None)¶
- load_morphologies(handle=None, allow_empty=False)¶
Preload the cell morphologies.
- Parameters:
- Returns:
MorphologySet object containing the loader of all morphologies
- Return type:
- Raises:
DatasetNotFoundError when the morphology data is not found.
- load_positions(handle=None)¶
Load the cell positions.
- Raises:
DatasetNotFoundError when there is no position information for this cell type.
- load_rotations(handle=None)¶
Load the cell rotations.
- Raises:
DatasetNotFoundError when there is no rotation information for this cell type.
- remove_labels(labels, cells, handle=None)¶
Should remove the provided labels assigned to the cells.
- remove_labels_by_mask(labels, mask, handle=None)¶
Should remove the provided labels assigned to the masked cells.
- classmethod require(engine, cell_type, handle=None)¶
Return a placement set, creating it if it didn’t exist before.
The default implementation uses the exists() and create() methods.
- Parameters:
engine (bsb.storage.interfaces.Engine) – The engine that governs this PlacementSet.
cell_type (bsb.cell_types.CellType) – The cell type whose data is stored in the placement set.
- Returns:
A placement set
- Return type:
- bsb_hdf5.placement_set.encode_labels(data, ds)¶
bsb_hdf5.resource module¶
- class bsb_hdf5.resource.Resource(engine: HDF5Engine, path: str)¶
- Parameters:
engine (HDF5Engine)
path (str)
- append(new_data, dtype=<class 'float'>)¶
- property attributes¶
- create(data, *args, **kwargs)¶
- exists()¶
- get_attribute(name)¶
- get_dataset(selector=())¶
- keys()¶
- remove()¶
- require(handle)¶
- property shape¶
- unmap(selector=(), mapping=<function Resource.<lambda>>, data=None)¶
- unmap_one(data, mapping=None)¶
- bsb_hdf5.resource.handles_class_handles(handle_type)¶
Decorator for class methods to lock and open HDF5 files.
The Engine handler is expected to be the second argument of the decorated function.
Module contents¶
HDF5 storage engine for the BSB framework.
- class bsb_hdf5.ConnectivitySet(engine, tag, handle=None)¶
Fetches connectivity data from storage.
Note
Use Scaffold.get_connectivity_set to correctly obtain a ConnectivitySet.
- chunk_connect(src_chunk, dst_chunk, src_locs, dst_locs, handle=None)¶
Must connect the src_locs to the dst_locs, interpreting the cell ids (first column of the locs) as the cell rank in the chunk.
- clear(handle=None)¶
Must clear (some chunks of) the connectivity set.
- connect(pre_set, post_set, src_locs, dest_locs, handle=None)¶
Must connect the src_locs to the dest_locs, interpreting the cell ids (first column of the locs) as the cell rank in the placement set.
- classmethod create(engine, pre_type, post_type, tag=None, handle=None)¶
Create the structure for this connectivity set in the HDF5 file.
Connectivity sets are stored under /connectivity/<tag>.
- static exists(engine, tag, handle=None)¶
Checks whether a ConnectivitySet with the given tag exists.
- Parameters:
engine (HDF5Engine) – Engine to use for the lookup.
tag (str) – Tag of the set to look for.
handle (h5py.File) – An open handle to use instead of opening one.
- Returns:
Whether the tag exists.
- Return type:
- flat_iter_connections(direction=None, local_=None, global_=None)¶
Iterates over the connectivity data.
```python
for dir, lchunk, gchunk, data in self.flat_iter_connections():
    print(f"Flat {dir} block between {lchunk} and {gchunk}")
```
If a keyword argument is given, that axis is not iterated over, and the value is fixed in each iteration.
- Parameters:
direction (str) – When omitted, iterates inc and out. When given, it restricts the iteration to the given value.
local (Union[Chunk, list[Chunk]]) – When omitted, iterates over all local chunks in the set. When given, it restricts the iteration to the given value(s).
global (Union[Chunk, list[Chunk]]) – When omitted, iterates over all global chunks in the set. When given, it restricts the iteration to the given value(s).
- Returns:
Yields the direction, local chunk, global chunk, and data. The data is a tuple of the local and global connection locations.
- Return type:
tuple[str, Chunk, Chunk, tuple[numpy.ndarray, numpy.ndarray]]
- get_chunk_stats(handle=None)¶
- get_global_chunks(direction, local_, handle=None)¶
Must list all the global chunks that contain data coming from a local chunk in the given direction.
- get_local_chunks(direction, handle=None)¶
Must list all the local chunks that contain data in the given direction ("inc" or "out").
- classmethod get_tags(engine, handle=None)¶
Returns all the connectivity tags in the network.
- load_block_connections(direction, local_, global_, handle=None)¶
Load the connection block with given direction between the given local and global chunk.
- Parameters:
- Returns:
The local and global connections locations
- Return type:
Tuple[numpy.ndarray, numpy.ndarray]
- load_local_connections(direction, local_, handle=None)¶
Load all the connections of the given local chunk.
- Parameters:
- Returns:
The local connection locations, a vector of the global connection chunks (1 chunk id per connection) and the global connections locations. To identify a cell in the global connections, use the corresponding chunk id from the second return value.
- Return type:
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray]
- nested_iter_connections(direction=None, local_=None, global_=None)¶
Iterates over the connectivity data, leaving room for the end-user to set up nested for loops:
```python
for dir, local_itr in self.nested_iter_connections():
    for lchunk, global_itr in local_itr:
        print("I can do something at the start of a new local chunk")
        for gchunk, data in global_itr:
            print(f"Nested {dir} block between {lchunk} and {gchunk}")
        print("Or right before we move to the next local chunk")
```
If a keyword argument is given, that axis is not iterated over, and the amount of nested loops is reduced.
- Parameters:
direction (str) – When omitted, iterates inc and out. When given, pins the iteration to the given value.
local (Union[Chunk, list[Chunk]]) – When omitted, iterates over all local chunks in the set. When given, it restricts the iteration to the given value(s).
global (Union[Chunk, list[Chunk]]) – When omitted, iterates over all global chunks in the set. When given, it restricts the iteration to the given value(s).
- Returns:
An iterator that produces the next unrestricted iteration values, or the connection dataset that matches the iteration combination.
- classmethod require(engine, pre_type, post_type, tag=None, handle=None)¶
Get or create a ConnectivitySet.
- Parameters:
engine (HDF5Engine) – Engine to fetch/write the data.
pre_type (CellType) – Presynaptic cell type.
post_type (CellType) – Postsynaptic cell type.
tag (str) – Tag to store the set under. Defaults to {pre_type.name}_to_{post_type.name}.
- Returns:
Existing or new connectivity set.
- Return type:
- class bsb_hdf5.FileStore(engine)¶
- all()¶
Return all ids and associated metadata in the file store.
- get_encoding(id)¶
Must return the encoding of the file with the given id, or None if it is unspecified binary data.
- get_meta(id)¶
Must return the metadata of the given id.
- get_mtime(id)¶
Must return the last modified timestamp of the file with the given id.
- has(id)¶
Must return whether the file store has a file with the given id.
- load(id)¶
Load the content of an object in the file store.
- Parameters:
id (str) – id of the content to be loaded.
- Returns:
The content of the stored object
- Return type:
- Raises:
FileNotFoundError – The given id doesn’t exist in the file store.
- load_active_config()¶
Load the active configuration stored inside the storage.
- Returns:
The active configuration that is loaded when this storage object is.
- Return type:
- remove(id)¶
Remove the content of an object in the file store.
- Parameters:
id (str) – id of the content to be removed.
- Raises:
FileNotFoundError – The given id doesn’t exist in the file store.
- store(content, meta=None, id=None, encoding=None, overwrite=False)¶
Store content in the file store. Should also store the current timestamp as mtime meta.
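The store() contract above (content plus a current-timestamp mtime meta entry, and FileNotFoundError on missing ids) can be sketched with a purely in-memory stand-in; the id generation via uuid is our assumption, not the engine's documented behavior:

```python
import time
import uuid

# In-memory sketch of the FileStore contract: store() records the
# content plus an "mtime" meta entry. Not the HDF5-backed implementation.
class MemoryFileStore:
    def __init__(self):
        self._files = {}

    def store(self, content, meta=None, id=None, encoding=None, overwrite=False):
        id = id or str(uuid.uuid4())  # assumed id scheme, for illustration
        if id in self._files and not overwrite:
            raise FileExistsError(id)
        meta = dict(meta or {})
        meta["mtime"] = time.time()
        self._files[id] = (content, meta, encoding)
        return id

    def load(self, id):
        if id not in self._files:
            raise FileNotFoundError(id)
        content, meta, _ = self._files[id]
        return content, meta

    def get_mtime(self, id):
        return self._files[id][1]["mtime"]

store = MemoryFileStore()
fid = store.store(b"config contents", meta={"path": "network.json"})
content, meta = store.load(fid)
print(content)  # b'config contents'
```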
- store_active_config(config)¶
Set the active configuration for this network.
- Parameters:
config (Configuration) – The active configuration that will be loaded when this storage object is.
- class bsb_hdf5.HDF5Engine(root, comm)¶
- clear_connectivity(*args, **kwargs)¶
collective Must clear existing connectivity data.
- clear_placement(*args, **kwargs)¶
collective Must clear existing placement data.
- copy(*args, **kwargs)¶
collective Must copy the storage object to the new root.
- create(*args, **kwargs)¶
collective Must create the storage engine.
- exists()¶
Must check existence of the storage object.
- get_chunk_stats(*args, **kwargs)¶
readonly Must return a dictionary with all chunk statistics.
- move(*args, **kwargs)¶
collective Must move the storage object to the new root.
- static recognizes(root, comm)¶
Must return whether the given root argument is recognized as a valid storage object.
- Parameters:
root – The unique identifier for the storage
comm (mpi4py.MPI.Comm) – MPI communicator that shares control over the Storage.
- remove(*args, **kwargs)¶
collective Must remove the storage object.
- require_placement_set(*args, **kwargs)¶
- property root_slug¶
Must return a pathlike unique identifier for the root of the storage object.
- versions(*args, **kwargs)¶
Must return a dictionary containing the versions of the engine package and the bsb package that last wrote to this storage object.
- exception bsb_hdf5.HDF5SlowLockingWarning¶
- class bsb_hdf5.MorphologyRepository(engine)¶
- all(handle=None)¶
Fetch all the stored morphologies.
- Returns:
List of the stored morphologies.
- Return type:
- get_all_meta(handle=None)¶
Get the metadata of all stored morphologies.
- Returns:
Metadata dictionary
- Return type:
- get_meta(name, handle=None)¶
Get the metadata of a stored morphology.
- has(name, handle=None)¶
Check whether a morphology under the given name exists.
- load(name, preloaded_meta=None, handle=None)¶
Load a stored morphology as a constructed morphology object.
- Parameters:
name (str) – Key of the stored morphology.
- Returns:
A morphology
- Return type:
- preload(name, meta=None, handle=None)¶
Load a stored morphology as a morphology loader.
- Parameters:
name (str) – Key of the stored morphology.
- Returns:
The stored morphology
- Return type:
- remove(name, handle=None)¶
- save(name, morphology, overwrite=False, update_meta=True, handle=None)¶
Store a morphology.
- Parameters:
name (str) – Key to store the morphology under.
morphology (bsb.morphologies.Morphology) – Morphology to store
overwrite (bool) – Overwrite any stored morphology that already exists under that name
- Returns:
The stored morphology
- Return type:
- select(*selectors)¶
Select stored morphologies.
- Parameters:
selectors (list[bsb.morphologies.selector.MorphologySelector]) – Any number of morphology selectors.
- Returns:
All stored morphologies that match at least one selector.
- Return type:
- class bsb_hdf5.PlacementSet(engine, cell_type)¶
Fetches placement data from storage.
Note
Use Scaffold.get_placement_set to correctly obtain a PlacementSet.
- append_additional(name, chunk, data)¶
Append arbitrary user data to the placement set. The length of the data must match that of the placement set, and must be storable by the engine.
- Parameters:
name
chunk (Chunk) – The chunk to store data in.
data (numpy.ndarray) – Arbitrary user data. You decide ❤️
- append_data(chunk, positions=None, morphologies=None, rotations=None, additional=None, count=None, handle=None)¶
Append data to the placement set.
- Parameters:
chunk (Chunk) – The chunk to store data in.
positions (numpy.ndarray) – Cell positions
rotations (RotationSet) – Cell rotations
morphologies (MorphologySet) – Cell morphologies
additional (dict) – Additional data to attach to chunk
count (int) – Amount of entities to place. Excludes the use of any positional, rotational or morphological data.
handle (h5py.Group) – HDF5 file handle
- append_entities(chunk, count, additional=None)¶
Append entities to the placement set.
- clear(chunks=None, handle=None)¶
Clear (some chunks of) the placement set.
- Parameters:
chunks (list[bsb.storage._chunks.Chunk]) – If given, the specific chunks to clear.
- convert_to_local(ids, handle=None)¶
Convert a list of global ids to local ids. If the PlacementSet is not separated into chunks, the ids are checked against a range spanning the full size of the placement set.
- classmethod create(engine, cell_type, handle=None)¶
Create the structure for this placement set in the HDF5 file.
Placement sets are stored under /placement/<tag>.
- static exists(engine, cell_type, handle=None)¶
Check existence of a placement set.
- Parameters:
engine (bsb.storage.interfaces.Engine) – The engine that governs the existence check.
cell_type (bsb.cell_types.CellType) – The cell type to look for.
- Returns:
Whether the placement set exists.
- Return type:
- get_chunk_stats(handle=None)¶
Should return how many cells were placed in each chunk.
- get_label_mask(labels=None, handle=None)¶
Should return a mask that fits the placement set for the cells with the given labels. To filter non-labelled cells, set labels to an empty list.
- Parameters:
- Return type:
- get_labelled(labels=None, handle=None)¶
Should return the ids of the cells labelled with the given labels. To filter non-labelled cells, set labels to an empty list.
- Parameters:
- Return type:
- get_unique_labels(handle=None)¶
Should return the unique labels assigned to the cells.
- label(labels, cells, handle=None)¶
Should label the cells with given labels.
- label_by_mask(labels, mask, handle=None)¶
Should label the masked cells with the given labels.
- load_additional(key=None, handle=None)¶
- load_ids(handle=None)¶
- load_morphologies(handle=None, allow_empty=False)¶
Preload the cell morphologies.
- Parameters:
- Returns:
MorphologySet object containing the loader of all morphologies
- Return type:
- Raises:
DatasetNotFoundError when the morphology data is not found.
- load_positions(handle=None)¶
Load the cell positions.
- Raises:
DatasetNotFoundError when there is no position information for this cell type.
- load_rotations(handle=None)¶
Load the cell rotations.
- Raises:
DatasetNotFoundError when there is no rotation information for this cell type.
- remove_labels(labels, cells, handle=None)¶
Should remove the provided labels assigned to the cells.
- remove_labels_by_mask(labels, mask, handle=None)¶
Should remove the provided labels assigned to the masked cells.
- classmethod require(engine, cell_type, handle=None)¶
Return a placement set, creating it if it didn’t exist before.
The default implementation uses the exists() and create() methods.
- Parameters:
engine (bsb.storage.interfaces.Engine) – The engine that governs this PlacementSet.
cell_type (bsb.cell_types.CellType) – The cell type whose data is stored in the placement set.
- Returns:
A placement set
- Return type: